WorldWideScience

Sample records for sample survey design

  1. Sample design for the residential energy consumption survey

    Energy Technology Data Exchange (ETDEWEB)

    1994-08-01

    The purpose of this report is to provide detailed information about the multistage area-probability sample design used for the Residential Energy Consumption Survey (RECS). It is intended as a technical report, for use by statisticians, to better understand the theory and procedures followed in the creation of the RECS sample frame. For a more cursory overview of the RECS sample design, refer to the appendix entitled "How the Survey was Conducted," which is included in the statistical reports produced for each RECS survey year.

  2. Sample design considerations of indoor air exposure surveys

    International Nuclear Information System (INIS)

    Cox, B.G.; Mage, D.T.; Immerman, F.W.

    1988-01-01

    Concern about the potential for indoor air pollution has prompted recent surveys of radon and NO2 concentrations in homes and personal exposure studies of volatile organics, carbon monoxide and pesticides, to name a few. The statistical problems in designing sample surveys that measure the physical environment are diverse and more complicated than those encountered in traditional surveys of human attitudes and attributes. This paper addresses issues encountered when designing indoor air quality (IAQ) studies. General statistical concepts related to target population definition, frame creation, and sample selection for area household surveys and telephone surveys are presented. The implications of different measurement approaches are discussed, and response rate considerations are described.

  3. Choosing a Cluster Sampling Design for Lot Quality Assurance Sampling Surveys

    OpenAIRE

    Hund, Lauren; Bedrick, Edward J.; Pagano, Marcello

    2015-01-01

    Lot quality assurance sampling (LQAS) surveys are commonly used for monitoring and evaluation in resource-limited settings. Recently several methods have been proposed to combine LQAS with cluster sampling for more timely and cost-effective data collection. For some of these methods, the standard binomial model can be used for constructing decision rules as the clustering can be ignored. For other designs, considered here, clustering is accommodated in the design phase. In this paper, we comp...

  4. Choosing a Cluster Sampling Design for Lot Quality Assurance Sampling Surveys.

    Directory of Open Access Journals (Sweden)

    Lauren Hund

    Lot quality assurance sampling (LQAS) surveys are commonly used for monitoring and evaluation in resource-limited settings. Recently several methods have been proposed to combine LQAS with cluster sampling for more timely and cost-effective data collection. For some of these methods, the standard binomial model can be used for constructing decision rules as the clustering can be ignored. For other designs, considered here, clustering is accommodated in the design phase. In this paper, we compare these latter cluster LQAS methodologies and provide recommendations for choosing a cluster LQAS design. We compare technical differences in the three methods and determine situations in which the choice of method results in a substantively different design. We consider two different aspects of the methods: the distributional assumptions and the clustering parameterization. Further, we provide software tools for implementing each method and clarify misconceptions about these designs in the literature. We illustrate the differences in these methods using vaccination and nutrition cluster LQAS surveys as example designs. The cluster methods are not sensitive to the distributional assumptions but can result in substantially different designs (sample sizes) depending on the clustering parameterization. However, none of the clustering parameterizations used in the existing methods appears to be consistent with the observed data, and, consequently, choice between the cluster LQAS methods is not straightforward. Further research should attempt to characterize clustering patterns in specific applications and provide suggestions for best-practice cluster LQAS designs on a setting-specific basis.

  5. Choosing a Cluster Sampling Design for Lot Quality Assurance Sampling Surveys.

    Science.gov (United States)

    Hund, Lauren; Bedrick, Edward J; Pagano, Marcello

    2015-01-01

    Lot quality assurance sampling (LQAS) surveys are commonly used for monitoring and evaluation in resource-limited settings. Recently several methods have been proposed to combine LQAS with cluster sampling for more timely and cost-effective data collection. For some of these methods, the standard binomial model can be used for constructing decision rules as the clustering can be ignored. For other designs, considered here, clustering is accommodated in the design phase. In this paper, we compare these latter cluster LQAS methodologies and provide recommendations for choosing a cluster LQAS design. We compare technical differences in the three methods and determine situations in which the choice of method results in a substantively different design. We consider two different aspects of the methods: the distributional assumptions and the clustering parameterization. Further, we provide software tools for implementing each method and clarify misconceptions about these designs in the literature. We illustrate the differences in these methods using vaccination and nutrition cluster LQAS surveys as example designs. The cluster methods are not sensitive to the distributional assumptions but can result in substantially different designs (sample sizes) depending on the clustering parameterization. However, none of the clustering parameterizations used in the existing methods appears to be consistent with the observed data, and, consequently, choice between the cluster LQAS methods is not straightforward. Further research should attempt to characterize clustering patterns in specific applications and provide suggestions for best-practice cluster LQAS designs on a setting-specific basis.

  6. Optimal sampling designs for large-scale fishery sample surveys in Greece

    Directory of Open Access Journals (Sweden)

    G. BAZIGOS

    2007-12-01

    The paper deals with the optimization of the following three large-scale sample surveys: the biological sample survey of commercial landings (BSCL), the experimental fishing sample survey (EFSS), and the commercial landings and effort sample survey (CLES).

  7. Probability sampling design in ethnobotanical surveys of medicinal plants

    Directory of Open Access Journals (Sweden)

    Mariano Martinez Espinosa

    2012-07-01

    Non-probability sampling designs can be used in ethnobotanical surveys of medicinal plants. However, this method does not allow statistical inferences to be made from the data generated. The aim of this paper is to present a probability sampling design that is applicable in ethnobotanical studies of medicinal plants. The sampling design employed in the research titled "Ethnobotanical knowledge of medicinal plants used by traditional communities of Nossa Senhora Aparecida do Chumbo district (NSACD), Poconé, Mato Grosso, Brazil" was used as a case study. Probability sampling methods (simple random and stratified sampling) were used in this study. In order to determine the sample size, the following data were considered: population size (N) of 1179 families; confidence coefficient, 95%; sampling error (d), 0.05; and a proportion (p), 0.5. The application of this sampling method resulted in a sample size (n) of at least 290 families in the district. The present study concludes that probability sampling methods necessarily have to be employed in ethnobotanical studies of medicinal plants, particularly where statistical inferences have to be made using the data obtained. This can be achieved by applying different existing probability sampling methods, or better still, a combination of such methods.
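
The sample size reported above is consistent with Cochran's formula for a proportion with finite population correction. A minimal sketch, assuming this is the formula used (the abstract's figures agree with it):

```python
import math

def sample_size(N, p=0.5, d=0.05, z=1.96):
    """Sample size for estimating a proportion in a finite population
    (Cochran's formula with finite population correction)."""
    num = N * z**2 * p * (1 - p)
    den = (N - 1) * d**2 + z**2 * p * (1 - p)
    return math.ceil(num / den)

# N = 1179 families, 95% confidence (z = 1.96), d = 0.05, p = 0.5
print(sample_size(1179))  # -> 290, matching "at least 290 families"
```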

  8. Issues in environmental survey design

    International Nuclear Information System (INIS)

    Iachan, R.

    1989-01-01

    Several environmental survey design issues are discussed and illustrated with surveys designed by Research Triangle Institute statisticians. Issues related to sampling and nonsampling errors are illustrated for indoor air quality surveys, radon surveys, pesticide surveys, and occupational and personal exposure surveys. Sample design issues include the use of auxiliary information (e.g. for stratification), and sampling in time. We also discuss the reduction and estimation of nonsampling errors, including nonresponse and measurement bias.

  9. A Bayesian Justification for Random Sampling in Sample Survey

    Directory of Open Access Journals (Sweden)

    Glen Meeden

    2012-07-01

    In the usual Bayesian approach to survey sampling, the sampling design plays a minimal role at best. Although a close relationship between exchangeable prior distributions and simple random sampling has been noted, how to formally integrate simple random sampling into the Bayesian paradigm is not clear. Recently it has been argued that the sampling design can be thought of as part of a Bayesian's prior distribution. We show here that under this scenario simple random sampling can be given a Bayesian justification in survey sampling.

  10. Dealing with trade-offs in destructive sampling designs for occupancy surveys.

    Directory of Open Access Journals (Sweden)

    Stefano Canessa

    Occupancy surveys should be designed to minimise false absences. This is commonly achieved by increasing replication or increasing the efficiency of surveys. In the case of destructive sampling designs, in which searches of individual microhabitats represent the repeat surveys, minimising false absences leads to an inherent trade-off. Surveyors can sample more low quality microhabitats, bearing the resultant financial costs and producing wider-spread impacts, or they can target high quality microhabitats where the focal species is more likely to be found and risk more severe impacts on local habitat quality. We show how this trade-off can be solved with a decision-theoretic approach, using the Millewa Skink Hemiergis millewae from southern Australia as a case study. Hemiergis millewae is an endangered reptile that is best detected using destructive sampling of grass hummocks. Within sites that were known to be occupied by H. millewae, logistic regression modelling revealed that lizards were more frequently detected in large hummocks. If this model is an accurate representation of the detection process, searching large hummocks is more efficient and requires less replication, but this strategy also entails destruction of the best microhabitats for the species. We developed an optimisation tool to calculate the minimum combination of the number and size of hummocks to search to achieve a given cumulative probability of detecting the species at a site, incorporating weights to reflect the sensitivity of the results to a surveyor's priorities. The optimisation showed that placing high weight on minimising volume necessitates impractical replication, whereas placing high weight on minimising replication requires searching very large hummocks which are less common and may be vital for H. millewae. While destructive sampling methods are sometimes necessary, surveyors must be conscious of the ecological impacts of these methods. This study provides a
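
The replication trade-off described above follows from the cumulative detection probability 1 - (1 - p)^n. A sketch using an entirely hypothetical logistic detection model (the study's fitted coefficients are not given in the abstract):

```python
import math

def detection_prob(volume, b0=-2.0, b1=0.8):
    """Hypothetical logistic model: per-hummock detection probability
    as a function of hummock volume (coefficients are illustrative only)."""
    return 1 / (1 + math.exp(-(b0 + b1 * volume)))

def replicates_needed(p, target=0.95):
    """Minimum number of hummocks to search so the cumulative probability
    of at least one detection, 1 - (1 - p)**n, reaches the target."""
    return math.ceil(math.log(1 - target) / math.log(1 - p))

for vol in (1.0, 3.0, 5.0):
    p = detection_prob(vol)
    print(vol, round(p, 2), replicates_needed(p))
```

Larger hummocks give higher per-search detection probability and so need fewer replicates, which is exactly the replication-versus-impact trade-off the optimisation tool balances.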

  11. Creel survey sampling designs for estimating effort in short-duration Chinook salmon fisheries

    Science.gov (United States)

    McCormick, Joshua L.; Quist, Michael C.; Schill, Daniel J.

    2013-01-01

    Chinook Salmon Oncorhynchus tshawytscha sport fisheries in the Columbia River basin are commonly monitored using roving creel survey designs and require precise, unbiased catch estimates. The objective of this study was to examine the relative bias and precision of total catch estimates using various sampling designs to estimate angling effort under the assumption that mean catch rate was known. We obtained information on angling populations based on direct visual observations of portions of Chinook Salmon fisheries in three Idaho river systems over a 23-d period. Based on the angling population, Monte Carlo simulations were used to evaluate the properties of effort and catch estimates for each sampling design. All sampling designs evaluated were relatively unbiased. Systematic random sampling (SYS) resulted in the most precise estimates. The SYS and simple random sampling designs had mean square error (MSE) estimates that were generally half of those observed with cluster sampling designs. The SYS design was more efficient (i.e., higher accuracy per unit cost) than a two-cluster design. Increasing the number of clusters available for sampling within a day decreased the MSE of estimates of daily angling effort, but the MSE of total catch estimates was variable depending on the fishery. The results of our simulations provide guidelines on the relative influence of sample sizes and sampling designs on parameters of interest in short-duration Chinook Salmon fisheries.
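
A Monte Carlo comparison of systematic versus simple random sampling of count periods, in the spirit of the simulations described above (the angler counts and design parameters are invented for illustration):

```python
import random
import statistics

random.seed(1)

# Hypothetical "true" angler counts for 48 half-hour periods in one day,
# with a within-day trend (effort builds over the day).
truth = [i + random.randint(0, 10) for i in range(48)]
total_effort = sum(truth)

def srs_estimate(k):
    """Expand a simple random sample of k periods to a daily total."""
    idx = random.sample(range(48), k)
    return 48 / k * sum(truth[i] for i in idx)

def sys_estimate(k):
    """Expand a systematic sample (random start, fixed interval)."""
    step = 48 // k
    start = random.randrange(step)
    return 48 / k * sum(truth[i] for i in range(start, 48, step))

def mse(estimator, k, reps=5000):
    """Monte Carlo mean square error of a design's daily-effort estimate."""
    return statistics.fmean((estimator(k) - total_effort) ** 2
                            for _ in range(reps))

print(mse(srs_estimate, 8), mse(sys_estimate, 8))
```

With a within-day trend the systematic design typically shows the lower MSE, consistent with the study's finding that SYS gave the most precise estimates.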

  12. Effects of sample survey design on the accuracy of classification tree models in species distribution models

    Science.gov (United States)

    Thomas C. Edwards; D. Richard Cutler; Niklaus E. Zimmermann; Linda Geiser; Gretchen G. Moisen

    2006-01-01

    We evaluated the effects of probabilistic (hereafter DESIGN) and non-probabilistic (PURPOSIVE) sample surveys on resultant classification tree models for predicting the presence of four lichen species in the Pacific Northwest, USA. Models derived from both survey forms were assessed using an independent data set (EVALUATION). Measures of accuracy as gauged by...

  13. Sampling effects on the identification of roadkill hotspots: Implications for survey design.

    Science.gov (United States)

    Santos, Sara M; Marques, J Tiago; Lourenço, André; Medinas, Denis; Barbosa, A Márcia; Beja, Pedro; Mira, António

    2015-10-01

    Although locating wildlife roadkill hotspots is essential to mitigate road impacts, the influence of study design on hotspot identification remains uncertain. We evaluated how sampling frequency affects the accuracy of hotspot identification, using a dataset of vertebrate roadkills (n = 4427) recorded over a year of daily surveys along 37 km of roads. "True" hotspots were identified using this baseline dataset, as the 500-m segments where the number of road-killed vertebrates exceeded the upper 95% confidence limit of the mean, assuming a Poisson distribution of roadkills per segment. "Estimated" hotspots were identified likewise, using datasets representing progressively lower sampling frequencies, which were produced by extracting data from the baseline dataset at appropriate time intervals (1-30 days). Overall, 24.3% of segments were "true" hotspots, concentrating 40.4% of roadkills. For different groups, "true" hotspots accounted for between 6.8% (bats) and 29.7% (small birds) of road segments, concentrating up to 60% (lizards, lagomorphs, carnivores) of roadkills. Spatial congruence between "true" and "estimated" hotspots declined rapidly with increasing time interval between surveys, due primarily to increasing false negatives (i.e., missing "true" hotspots). There were also false positives (i.e., wrong "estimated" hotspots), particularly at low sampling frequencies. The decay in spatial accuracy with increasing time interval between surveys was greater for smaller-bodied (amphibians, reptiles, small birds, small mammals) than for larger-bodied species (birds of prey, hedgehogs, lagomorphs, carnivores). Results suggest that widely used surveys at weekly or longer intervals may produce poor estimates of roadkill hotspots, particularly for small-bodied species. Surveying daily or at two-day intervals may be required to achieve high accuracy in hotspot identification for multiple species. Copyright © 2015 Elsevier Ltd. All rights reserved.
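
The hotspot criterion, flagging segments whose counts exceed an upper confidence limit of the mean under a Poisson assumption, can be sketched as follows (toy counts; the upper limit is approximated here by the one-sided 95th percentile of a Poisson distribution with the observed mean):

```python
import math

def poisson_upper(lam, conf=0.95):
    """Smallest k with P(X <= k) >= conf for X ~ Poisson(lam),
    computed by accumulating the Poisson pmf term by term."""
    k, term = 0, math.exp(-lam)
    cdf = term
    while cdf < conf:
        k += 1
        term *= lam / k
        cdf += term
    return k

counts = [2, 0, 15, 3, 1, 9, 4, 0, 22, 5]   # roadkills per 500-m segment (toy data)
lam = sum(counts) / len(counts)              # mean roadkills per segment
threshold = poisson_upper(lam)
hotspots = [i for i, c in enumerate(counts) if c > threshold]
print(threshold, hotspots)  # -> 10 [2, 8]
```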

  14. Watershed-based survey designs

    Science.gov (United States)

    Detenbeck, N.E.; Cincotta, D.; Denver, J.M.; Greenlee, S.K.; Olsen, A.R.; Pitchford, A.M.

    2005-01-01

    Watershed-based sampling design and assessment tools help serve the multiple goals for water quality monitoring required under the Clean Water Act, including assessment of regional conditions to meet Section 305(b), identification of impaired water bodies or watersheds to meet Section 303(d), and development of empirical relationships between causes or sources of impairment and biological responses. Creation of GIS databases for hydrography, hydrologically corrected digital elevation models, and hydrologic derivatives such as watershed boundaries and upstream–downstream topology of subcatchments would provide a consistent seamless nationwide framework for these designs. The elements of a watershed-based sample framework can be represented either as a continuous infinite set defined by points along a linear stream network, or as a discrete set of watershed polygons. Watershed-based designs can be developed with existing probabilistic survey methods, including the use of unequal probability weighting, stratification, and two-stage frames for sampling. Case studies for monitoring of Atlantic Coastal Plain streams, West Virginia wadeable streams, and coastal Oregon streams illustrate three different approaches for selecting sites for watershed-based survey designs.

  15. Extending cluster lot quality assurance sampling designs for surveillance programs.

    Science.gov (United States)

    Hund, Lauren; Pagano, Marcello

    2014-07-20

    Lot quality assurance sampling (LQAS) has a long history of applications in industrial quality control. LQAS is frequently used for rapid surveillance in global health settings, with areas classified as poor or acceptable performance on the basis of the binary classification of an indicator. Historically, LQAS surveys have relied on simple random samples from the population; however, implementing two-stage cluster designs for surveillance sampling is often more cost-effective than simple random sampling. By applying survey sampling results to the binary classification procedure, we develop a simple and flexible nonparametric procedure to incorporate clustering effects into the LQAS sample design to appropriately inflate the sample size, accommodating finite numbers of clusters in the population when relevant. We use this framework to then discuss principled selection of survey design parameters in longitudinal surveillance programs. We apply this framework to design surveys to detect rises in malnutrition prevalence in nutrition surveillance programs in Kenya and South Sudan, accounting for clustering within villages. By combining historical information with data from previous surveys, we design surveys to detect spikes in the childhood malnutrition rate. Copyright © 2014 John Wiley & Sons, Ltd.
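
The core idea of inflating an SRS-based LQAS sample size for clustering is the standard design effect. A minimal sketch (the paper's actual procedure is nonparametric and accommodates finite numbers of clusters, which this simple formula ignores):

```python
import math

def inflate_for_clustering(n_srs, cluster_size, icc):
    """Inflate a simple-random-sampling sample size by the design effect
    DEFF = 1 + (m - 1) * ICC for two-stage cluster sampling."""
    deff = 1 + (cluster_size - 1) * icc
    return math.ceil(n_srs * deff)

# Illustrative: an LQAS design needing n = 50 under SRS, 10 children
# sampled per village, intracluster correlation 0.1 (invented values)
print(inflate_for_clustering(50, 10, 0.1))  # -> 95
```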

  16. Air sampling system for airborne surveys

    International Nuclear Information System (INIS)

    Jupiter, C.; Tipton, W.J.

    1975-01-01

    An air sampling system has been designed for installation on the Beechcraft King Air A-100 aircraft as a part of the Aerial Radiological Measuring System (ARMS). It is intended for both particle and whole gas sampling. The sampling probe is designed for isokinetic sampling and is mounted on a removable modified escape hatch cover, behind the co-pilot's seat, and extends about two feet forward of the hatch cover in the air stream lines. Directly behind the sampling probe inside the modified hatch cover is an expansion chamber, space for a 5-inch diameter filter paper cassette, and an optional four-stage cascade impactor for particle size distribution measurements. A pair of motors and blower pumps provide the necessary 0.5 atmosphere pressure across the type MSA 1106 B glass fiber filter paper to allow a flow rate of 50 cfm. The MSA 1106 B filter paper is designed to trap sub-micrometer particles with a high efficiency; it was chosen to enable a quantitative measurement of airborne radon daughters, one of the principal sources of background signals when radiological surveys are being performed. A venturi section and pressure gauges allow air flow rate measurements so that airborne contaminant concentrations may be quantified. A whole gas sampler capable of sampling a cubic meter of air is mounted inside the aircraft cabin. A nuclear counting system on board the aircraft provides capability for α, β and γ counting of filter paper samples. Design data are presented and types of survey missions which may be served by this system are described.

  17. Using Linked Survey Paradata to Improve Sampling Strategies in the Medical Expenditure Panel Survey

    Directory of Open Access Journals (Sweden)

    Mirel Lisa B.

    2017-06-01

    Using paradata from a prior survey that is linked to a new survey can help a survey organization develop more effective sampling strategies. One example of this type of linkage or subsampling is between the National Health Interview Survey (NHIS) and the Medical Expenditure Panel Survey (MEPS). MEPS is a nationally representative sample of the U.S. civilian, noninstitutionalized population based on a complex multi-stage sample design. Each year a new sample is drawn as a subsample of households from the prior year’s NHIS. The main objective of this article is to examine how paradata from a prior survey can be used in developing a sampling scheme in a subsequent survey. A framework for optimal allocation of the sample in substrata formed for this purpose is presented and evaluated for the relative effectiveness of alternative substratification schemes. The framework is applied, using real MEPS data, to illustrate how utilizing paradata from the linked survey offers the possibility of making improvements to the sampling scheme for the subsequent survey. The improvements aim to reduce the data collection costs while maintaining or increasing effective responding sample sizes and response rates for a harder to reach population.
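
Optimal allocation across substrata of the kind discussed above is classically done with Neyman allocation, where n_h is proportional to N_h * S_h. A sketch with invented stratum sizes and standard deviations (the article's framework additionally weighs costs and response rates):

```python
def neyman_allocation(n_total, stratum_sizes, stratum_sds):
    """Neyman (optimal) allocation: sample size in stratum h is
    proportional to N_h * S_h."""
    weights = [N * S for N, S in zip(stratum_sizes, stratum_sds)]
    total = sum(weights)
    return [round(n_total * w / total) for w in weights]

# Hypothetical substrata built from linked-survey paradata
# (stratum sizes and standard deviations are illustrative)
print(neyman_allocation(1000, [5000, 3000, 2000], [1.2, 0.8, 0.5]))
# -> [638, 255, 106]
```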

  18. [Sampling and measurement methods of the protocol design of the China Nine-Province Survey for blindness, visual impairment and cataract surgery].

    Science.gov (United States)

    Zhao, Jia-liang; Wang, Yu; Gao, Xue-cheng; Ellwein, Leon B; Liu, Hu

    2011-09-01

    To describe the design of the protocol for the China nine-province survey of blindness, visual impairment and cataract surgery, which evaluated the prevalence and main causes of blindness and visual impairment, and the prevalence and outcomes of cataract surgery. Protocol design began in November 2005, after the Department of Medicine, Ministry of Health, China assigned the task of conducting the national survey of blindness, visual impairment and cataract surgery. The protocols of the Beijing Shunyi Eye Study (1996) and the Guangdong Doumen County Eye Study (1997), both supported by the World Health Organization, were taken as the basis for the protocol design. Relevant experts were invited to discuss and refine the draft protocol, and an international advisory committee was established to examine and approve it. Finally, the survey protocol was reviewed and approved by the Department of Medicine, Ministry of Health, China and the Prevention of Blindness and Deafness programme, WHO. The survey protocol was designed according to the characteristics and scale of the survey. Its contents included determination of the target population and survey sites, calculation of the sample size, design of the random sampling, composition and organization of the survey teams, determination of the examinees, the flowchart of the field work, survey items and methods, diagnostic criteria for blindness and for moderate and severe visual impairment, quality-control measures, and data-management methods. The designed protocol became the standard, practical protocol for the survey to evaluate the prevalence and main causes of blindness and visual impairment, and the prevalence and outcomes of cataract surgery.

  19. New tools for evaluating LQAS survey designs

    OpenAIRE

    Hund, Lauren

    2014-01-01

    Lot Quality Assurance Sampling (LQAS) surveys have become increasingly popular in global health care applications. Incorporating Bayesian ideas into LQAS survey design, such as using reasonable prior beliefs about the distribution of an indicator, can improve the selection of design parameters and decision rules. In this paper, a joint frequentist and Bayesian framework is proposed for evaluating LQAS classification accuracy and informing survey design parameters. Simple software tools are pr...

  20. Statistical properties of mean stand biomass estimators in a LIDAR-based double sampling forest survey design.

    Science.gov (United States)

    H.E. Anderson; J. Breidenbach

    2007-01-01

    Airborne laser scanning (LIDAR) can be a valuable tool in double-sampling forest survey designs. LIDAR-derived forest structure metrics are often highly correlated with important forest inventory variables, such as mean stand biomass, and LIDAR-based synthetic regression estimators have the potential to be highly efficient compared to single-stage estimators, which...

  1. Sample Size Calculations for Population Size Estimation Studies Using Multiplier Methods With Respondent-Driven Sampling Surveys.

    Science.gov (United States)

    Fearon, Elizabeth; Chabata, Sungai T; Thompson, Jennifer A; Cowan, Frances M; Hargreaves, James R

    2017-09-14

    While guidance exists for obtaining population size estimates using multiplier methods with respondent-driven sampling surveys, specific guidance for making sample size decisions is lacking. Our objective was to guide the design of multiplier method population size estimation studies using respondent-driven sampling surveys so as to reduce the random error around the estimate obtained. The population size estimate is obtained by dividing the number of individuals receiving a service or the number of unique objects distributed (M) by the proportion of individuals in a representative survey who report receipt of the service or object (P). We have developed an approach to sample size calculation, interpreting methods to estimate the variance around estimates obtained using multiplier methods in conjunction with research into design effects and respondent-driven sampling. We describe an application to estimate the number of female sex workers in Harare, Zimbabwe. There is high variance in estimates. Random error around the size estimate reflects uncertainty from M and P, particularly when the estimate of P in the respondent-driven sampling survey is low. As expected, sample size requirements are higher when the design effect of the survey is assumed to be greater. We suggest a method for investigating the effects of sample size on the precision of a population size estimate obtained using multiplier methods and respondent-driven sampling. Uncertainty in the size estimate is high, particularly when P is small, so balancing against other potential sources of bias, we advise researchers to consider longer service attendance reference periods and to distribute more unique objects, which is likely to result in a higher estimate of P in the respondent-driven sampling survey. ©Elizabeth Fearon, Sungai T Chabata, Jennifer A Thompson, Frances M Cowan, James R Hargreaves. Originally published in JMIR Public Health and Surveillance (http://publichealth.jmir.org), 14.09.2017.
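
The multiplier estimator N = M / P and its precision can be sketched as follows, treating M as fixed and inflating the binomial variance of P by an assumed design effect (all numbers are illustrative; the paper's variance formulas account for further sources of uncertainty):

```python
import math

def multiplier_estimate(M, p_hat, n, deff=1.0, z=1.96):
    """Population size N = M / P with a delta-method confidence interval.
    Var(p_hat) is the binomial variance inflated by the RDS design effect."""
    N = M / p_hat
    se_p = math.sqrt(deff * p_hat * (1 - p_hat) / n)
    # delta method: se(N) ~= M * se_p / p_hat**2
    se_N = M * se_p / p_hat**2
    return N, (N - z * se_N, N + z * se_N)

# Illustrative: M = 1200 unique objects distributed; 30% of an RDS sample
# of n = 400 report receiving one; assumed design effect of 2
N, ci = multiplier_estimate(1200, 0.30, 400, deff=2.0)
print(round(N), [round(x) for x in ci])  # -> 4000 [3153, 4847]
```

A smaller P widens the interval sharply (P appears squared in the denominator of se(N)), matching the paper's advice to push P upward by distributing more unique objects or lengthening the reference period.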

  2. Analyzing Repeated Measures Marginal Models on Sample Surveys with Resampling Methods

    Directory of Open Access Journals (Sweden)

    James D. Knoke

    2005-12-01

    Packaged statistical software for analyzing categorical, repeated measures marginal models on sample survey data with binary covariates does not appear to be available. Consequently, this report describes a customized SAS program that accomplishes such an analysis on survey data with jackknifed replicate weights for which the primary sampling unit information has been suppressed for respondent confidentiality. First, the program employs the Macro Language and the Output Delivery System (ODS) to estimate the means and covariances of indicator variables for the response variables, taking the design into account. Then, it uses PROC CATMOD and ODS, ignoring the survey design, to obtain the design matrix and hypothesis test specifications. Finally, it enters these results into another run of CATMOD, which performs automated direct input of the survey design specifications and accomplishes the appropriate analysis. This customized SAS program can be employed, with minor editing, to analyze general categorical, repeated measures marginal models on sample surveys with replicate weights. Finally, the results of our analysis accounting for the survey design are compared to the results of two alternate analyses of the same data. This comparison confirms that such alternate analyses, which do not properly account for the design, do not produce useful results.

  3. New tools for evaluating LQAS survey designs.

    Science.gov (United States)

    Hund, Lauren

    2014-02-15

    Lot Quality Assurance Sampling (LQAS) surveys have become increasingly popular in global health care applications. Incorporating Bayesian ideas into LQAS survey design, such as using reasonable prior beliefs about the distribution of an indicator, can improve the selection of design parameters and decision rules. In this paper, a joint frequentist and Bayesian framework is proposed for evaluating LQAS classification accuracy and informing survey design parameters. Simple software tools are provided for calculating the positive and negative predictive value of a design with respect to an underlying coverage distribution and the selected design parameters. These tools are illustrated using a data example from two consecutive LQAS surveys measuring Oral Rehydration Solution (ORS) preparation. Using the survey tools, the dependence of classification accuracy on benchmark selection and the width of the 'grey region' are clarified in the context of ORS preparation across seven supervision areas. Following the completion of an LQAS survey, estimation of the distribution of coverage across areas facilitates quantifying classification accuracy and can help guide intervention decisions.
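
The positive predictive value of an LQAS design with respect to an underlying coverage distribution can be approximated by simulation. A sketch (the decision rule, prior, and benchmark below are all invented for illustration, not taken from the paper):

```python
import random

random.seed(0)

def lqas_ppv(n, d, a, b, benchmark, sims=100_000):
    """Monte Carlo PPV of the LQAS rule 'classify as acceptable if at
    least d of n sampled individuals are covered', when true coverage
    p ~ Beta(a, b): returns P(p >= benchmark | classified acceptable)."""
    accept = accept_and_true = 0
    for _ in range(sims):
        p = random.betavariate(a, b)
        successes = sum(random.random() < p for _ in range(n))
        if successes >= d:
            accept += 1
            accept_and_true += p >= benchmark
    return accept_and_true / accept

# Illustrative: n = 19, decision rule d = 13, coverage prior Beta(6, 4)
# (mean 60%), benchmark coverage 60%
print(round(lqas_ppv(19, 13, 6, 4, 0.60), 2))
```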

  4. Emigration Rates From Sample Surveys: An Application to Senegal.

    Science.gov (United States)

    Willekens, Frans; Zinn, Sabine; Leuchter, Matthias

    2017-12-01

    What is the emigration rate of a country, and how reliable is that figure? Answering these questions is not at all straightforward. Most data on international migration are census data on foreign-born population. These migrant stock data describe the immigrant population in destination countries but offer limited information on the rate at which people leave their country of origin. The emigration rate depends on the number leaving in a given period and the population at risk of leaving, weighted by the duration at risk. Emigration surveys provide a useful data source for estimating emigration rates, provided that the estimation method accounts for sample design. In this study, emigration rates and confidence intervals are estimated from a sample survey of households in the Dakar region in Senegal, which was part of the Migration between Africa and Europe survey. The sample was a stratified two-stage sample with oversampling of households with members abroad or return migrants. A combination of methods of survival analysis (time-to-event data) and replication variance estimation (bootstrapping) yields emigration rates and design-consistent confidence intervals that are representative for the study population.
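
Estimating an emigration rate as weighted events per weighted person-years, with a bootstrap confidence interval, can be sketched like this (toy data and a naive record-level bootstrap; the study's replication scheme resamples in a design-consistent way, respecting strata and sampling stages):

```python
import random

random.seed(42)

# Toy household survey records: (person_years_observed, emigrated?, sampling_weight)
records = [(random.uniform(1.0, 10.0), random.random() < 0.08,
            random.choice([1.0, 2.5])) for _ in range(500)]

def weighted_rate(recs):
    """Design-weighted emigration rate: weighted events per weighted person-year."""
    events = sum(w * e for py, e, w in recs)
    pyears = sum(w * py for py, e, w in recs)
    return events / pyears

def bootstrap_ci(recs, reps=2000, alpha=0.05):
    """Percentile bootstrap CI from record-level resampling with replacement."""
    rates = sorted(weighted_rate(random.choices(recs, k=len(recs)))
                   for _ in range(reps))
    return rates[int(reps * alpha / 2)], rates[int(reps * (1 - alpha / 2)) - 1]

rate = weighted_rate(records)
print(rate, bootstrap_ci(records))
```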

  5. A two-phase sampling design for increasing detections of rare species in occupancy surveys

    Science.gov (United States)

    Pacifici, Krishna; Dorazio, Robert M.; Dorazio, Michael J.

    2012-01-01

1. Occupancy estimation is a commonly used tool in ecological studies owing to the ease with which data can be collected and the large spatial extent that can be covered. One major obstacle to using an occupancy-based approach is the complications associated with designing and implementing an efficient survey. These logistical challenges become magnified when working with rare species, for which effort can be wasted in areas with few or no individuals. 2. Here, we develop a two-phase sampling approach that mitigates these problems by using a design that places more effort in areas with higher predicted probability of occurrence. We compare our new sampling design to traditional single-season occupancy estimation under a range of conditions and population characteristics. We develop an intuitive measure of predictive error to compare the two approaches and use simulations to assess the relative accuracy of each approach. 3. Our two-phase approach exhibited lower predictive error rates than the traditional single-season approach in highly spatially correlated environments. The difference was greatest when detection probability was high (0·75) regardless of the habitat or sample size. When the true occupancy rate was below 0·4 (0·05-0·4), we found that allocating 25% of the sample to the first phase resulted in the lowest error rates. 4. In the majority of scenarios, the two-phase approach showed lower error rates than the traditional single-season approach, suggesting our new approach is fairly robust to a broad range of conditions and design factors and merits use under a wide variety of settings. 5. Synthesis and applications. Conservation and management of rare species are a challenging task facing natural resource managers. It is critical for studies involving rare species to efficiently allocate effort and resources as they are usually of a finite nature. We believe our approach provides a framework for optimal allocation of effort while
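The core idea, spend a fraction of the budget learning where the species is likely to occur, then concentrate the rest there, can be shown in a toy simulation. This is not the authors' estimator: the landscape, the logistic occupancy curve, and the use of a raw covariate in place of a fitted phase-1 model are all illustrative assumptions.

```python
import math
import random

random.seed(3)

# Hypothetical landscape of 200 sites: occupancy probability psi rises
# steeply with a habitat covariate x (all numbers illustrative).
sites = [{"x": random.random()} for _ in range(200)]
for s in sites:
    psi = 1 / (1 + math.exp(-(8 * s["x"] - 4)))
    s["occupied"] = random.random() < psi

budget = 80

# Phase 1: spend 25% of the survey budget on randomly chosen sites.
phase1 = random.sample(sites, budget // 4)

# Phase 2: rank the unsurveyed sites by the covariate (a stand-in for a
# fitted occupancy model) and spend the remaining budget where predicted
# occurrence is highest.
remaining = [s for s in sites if s not in phase1]
remaining.sort(key=lambda s: s["x"], reverse=True)
phase2 = remaining[: budget - len(phase1)]

two_phase_hits = sum(s["occupied"] for s in phase1 + phase2)
single_phase_hits = sum(s["occupied"] for s in random.sample(sites, budget))
print(two_phase_hits, single_phase_hits)
```

With the same total budget, the targeted second phase encounters the species far more often, which is where the gain in predictive accuracy for rare species comes from.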

  6. Enhancing sampling design in mist-net bat surveys by accounting for sample size optimization

    OpenAIRE

    Trevelin, Leonardo Carreira; Novaes, Roberto Leonan Morim; Colas-Rosas, Paul François; Benathar, Thayse Cristhina Melo; Peres, Carlos A.

    2017-01-01

    The advantages of mist-netting, the main technique used in Neotropical bat community studies to date, include logistical implementation, standardization and sampling representativeness. Nonetheless, study designs still have to deal with issues of detectability related to how different species behave and use the environment. Yet there is considerable sampling heterogeneity across available studies in the literature. Here, we approach the problem of sample size optimization. We evaluated the co...

  7. Off-road sampling reveals a different grassland bird community than roadside sampling: implications for survey design and estimates to guide conservation

    Directory of Open Access Journals (Sweden)

    Troy I. Wellicome

    2014-06-01

    concern. Our results highlight the need to develop appropriate corrections for bias in estimates derived from roadside sampling, and the need to design surveys that sample bird communities across a more representative cross-section of the landscape, both near and far from roads.

  8. Conditional estimation of exponential random graph models from snowball sampling designs

    NARCIS (Netherlands)

    Pattison, Philippa E.; Robins, Garry L.; Snijders, Tom A. B.; Wang, Peng

    2013-01-01

    A complete survey of a network in a large population may be prohibitively difficult and costly. So it is important to estimate models for networks using data from various network sampling designs, such as link-tracing designs. We focus here on snowball sampling designs, designs in which the members

  9. National Sample Survey of Registered Nurses II. Status of Nurses: November 1980.

    Science.gov (United States)

    Bentley, Barbara S.; And Others

    This report provides data describing the nursing population as determined by the second national sample survey of registered nurses. A brief introduction is followed by a chapter that presents an overview of the survey methodology, including details on the sampling design, the response rate, and the statistical reliability. Chapter 3 provides a…

  10. [Saarland Growth Study: sampling design].

    Science.gov (United States)

    Danker-Hopfe, H; Zabransky, S

    2000-01-01

The use of reference data to evaluate the physical development of children and adolescents is part of the daily routine in the paediatric outpatient clinic. The construction of such references requires the collection of extensive data. There are different kinds of reference data: cross-sectional references, which are based on data collected from a large representative cross-sectional sample of the population; longitudinal references, which are based on follow-up surveys of usually smaller samples of individuals from birth to maturity; and mixed longitudinal references, which combine longitudinal and cross-sectional reference data. The advantages and disadvantages of the different methods of data collection and the resulting reference data are discussed. The Saarland Growth Study was conducted for several reasons: growth processes are subject to secular changes, there are no specific reference data for children and adolescents from this part of the country, and the growth charts in use in paediatric practice may no longer be appropriate. The Saarland Growth Study therefore served two purposes: a) to create up-to-date regional reference data and b) to create a database for future studies on secular trends in growth processes of children and adolescents from Saarland. The present contribution focusses on general remarks on the sampling design of (cross-sectional) growth surveys and its implications for the design of the present study.

  11. Methodological design of the National Health and Nutrition Survey 2016

    OpenAIRE

    Martín Romero-Martínez; Teresa Shamah-Levy; Lucia Cuevas-Nasu; Ignacio Méndez Gómez-Humarán; Elsa Berenice Gaona-Pineda; Luz María Gómez-Acosta; Juan Ángel Rivera-Dommarco; Mauricio Hernández-Ávila

    2017-01-01

Objective. Describe the design methodology of the halfway health and nutrition national survey (Ensanut-MC) 2016. Materials and methods. The Ensanut-MC is a national probabilistic survey whose target population is the inhabitants of private households in Mexico. The sample size was determined to make inferences on urban and rural areas in four regions. The main design elements are described: target population, topics of study, sampling procedure, measurement procedure and logistics organizat...

  12. Inclusion of mobile phone numbers into an ongoing population health survey in New South Wales, Australia: design, methods, call outcomes, costs and sample representativeness.

    Science.gov (United States)

    Barr, Margo L; van Ritten, Jason J; Steel, David G; Thackway, Sarah V

    2012-11-22

In Australia, telephone surveys have been the method of choice for ongoing jurisdictional population health surveys. Although it was estimated in 2011 that nearly 20% of the Australian population were mobile-only phone users, the inclusion of mobile phone numbers into these existing landline population health surveys has not occurred. This paper describes the methods used for the inclusion of mobile phone numbers into an existing ongoing landline random digit dialling (RDD) health survey in an Australian state, the New South Wales Population Health Survey (NSWPHS). This paper also compares the call outcomes, costs and the representativeness of the resultant sample to that of the previous landline sample. After examining several mobile phone pilot studies conducted in Australia and possible sample designs (screening dual-frame and overlapping dual-frame), mobile phone numbers were included in the NSWPHS using an overlapping dual-frame design. Data collection was consistent, where possible, with the previous years' landline RDD phone surveys and between frames. Survey operational data for the frames were compared and combined. Demographic information from the interview data for mobile-only phone users, dual users, and the total sample was compared to the landline frame using χ2 tests. Demographic information for each frame, landline and the mobile-only (equivalent to a screening dual frame design), and the frames combined (with appropriate overlap adjustment) was compared to the NSW demographic profile from the 2011 census using χ2 tests. In the first quarter of 2012, 3395 interviews were completed with 2171 respondents (63.9%) from the landline frame (17.6% landline only) and 1224 (36.1%) from the mobile frame (25.8% mobile only). Overall combined response, contact and cooperation rates were 33.1%, 65.1% and 72.2% respectively. As expected from previous research, the demographic profile of the mobile-only phone respondents differed most (more that were young, males, Aboriginal
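The "appropriate overlap adjustment" for an overlapping dual-frame design can be sketched with a composite-weighting rule: dual users are reachable through both frames, so their design weights are down-scaled to avoid double-counting. This is one standard textbook approach (a Hartley-style composite estimator), not necessarily the exact adjustment used in the NSWPHS; the mixing factor and weights below are hypothetical.

```python
# Overlapping dual-frame adjustment sketch: phone-only users keep their full
# design weight; dual users sampled from the landline frame are scaled by
# LAMBDA and those from the mobile frame by 1 - LAMBDA.
LAMBDA = 0.5  # assumed mixing factor; in practice chosen to reduce variance

respondents = [
    # (frame sampled from, phone status, design weight)
    ("landline", "landline_only", 1200.0),
    ("landline", "dual",           900.0),
    ("mobile",   "mobile_only",   1500.0),
    ("mobile",   "dual",           800.0),
]

def composite_weight(frame, status, w, lam=LAMBDA):
    if status == "dual":
        return w * (lam if frame == "landline" else 1 - lam)
    return w

adjusted = [composite_weight(f, s, w) for f, s, w in respondents]
print(adjusted)  # [1200.0, 450.0, 1500.0, 400.0]
```

Summing the adjusted weights then estimates the population total once, even though dual users appear in both sampling frames.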

  13. Web-Face-to-Face Mixed-Mode Design in a Longitudinal Survey: Effects on Participation Rates, Sample Composition, and Costs

    Directory of Open Access Journals (Sweden)

    Bianchi Annamaria

    2017-06-01

Sequential mixed-mode designs are increasingly considered as an alternative to interviewer-administered data collection, allowing researchers to take advantage of the benefits of each mode. We assess the effects of the introduction of a sequential web-face-to-face mixed-mode design over three waves of a longitudinal survey in which members were previously interviewed face-to-face. Findings are reported from a large-scale randomised experiment carried out on the UK Household Longitudinal Study. No differences are found between the mixed-mode design and face-to-face design in terms of cumulative response rates and only minimal differences in terms of sample composition. On the other hand, potential cost savings are evident.

  14. Robustness of Adaptive Survey Designs to Inaccuracy of Design Parameters

    Directory of Open Access Journals (Sweden)

    Burger Joep

    2017-09-01

Adaptive survey designs (ASDs) optimize design features, given (1) the interactions between the design features and characteristics of sampling units and (2) a set of constraints, such as a budget and a minimum number of respondents. Estimation of the interactions is subject to both random and systematic error. In this article, we propose and evaluate four viewpoints to assess robustness of ASDs to inaccuracy of design parameter estimates: the effect of both imprecision and bias on both ASD structure and ASD performance. We additionally propose three distance measures to compare the structure of ASDs. The methodology is illustrated using a simple simulation study and a more complex but realistic case study on the Dutch Travel Survey. The proposed methodology can be applied to other ASD optimization problems. In our simulation study and case study, the ASD was fairly robust to imprecision, but not to realistic dynamics in the design parameters. To deal with the sensitivity of ASDs to changing design parameters, we recommend learning and updating the design parameters.

  15. Galaxy redshift surveys with sparse sampling

    International Nuclear Information System (INIS)

    Chiang, Chi-Ting; Wullstein, Philipp; Komatsu, Eiichiro; Jee, Inh; Jeong, Donghui; Blanc, Guillermo A.; Ciardullo, Robin; Gronwall, Caryl; Hagen, Alex; Schneider, Donald P.; Drory, Niv; Fabricius, Maximilian; Landriau, Martin; Finkelstein, Steven; Jogee, Shardha; Cooper, Erin Mentuch; Tuttle, Sarah; Gebhardt, Karl; Hill, Gary J.

    2013-01-01

Survey observations of the three-dimensional locations of galaxies are a powerful approach to measure the distribution of matter in the universe, which can be used to learn about the nature of dark energy, physics of inflation, neutrino masses, etc. A competitive survey, however, requires a large volume (e.g., V_survey ∼ 10 Gpc³) to be covered, and thus tends to be expensive. A "sparse sampling" method offers a more affordable solution to this problem: within a survey footprint covering a given survey volume, V_survey, we observe only a fraction of the volume. The distribution of observed regions should be chosen such that their separation is smaller than the length scale corresponding to the wavenumber of interest. Then one can recover the power spectrum of galaxies with precision expected for a survey covering a volume of V_survey (rather than the volume of the sum of observed regions) with the number density of galaxies given by the total number of observed galaxies divided by V_survey (rather than the number density of galaxies within an observed region). We find that regularly-spaced sampling yields an unbiased power spectrum with no window function effect, and deviations from regularly-spaced sampling, which are unavoidable in realistic surveys, introduce calculable window function effects and increase the uncertainties of the recovered power spectrum. On the other hand, we show that the two-point correlation function (pair counting) is not affected by sparse sampling. While we discuss the sparse sampling method within the context of the forthcoming Hobby-Eberly Telescope Dark Energy Experiment, the method is general and can be applied to other galaxy surveys.
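The key point, that a regularly spaced sub-sample recovers modes below its own Nyquist limit without bias, has a one-dimensional toy analogue. The grid size, mode number, and sub-sampling factor below are all illustrative, and a direct DFT stands in for the survey power-spectrum estimator.

```python
import cmath
import math

# 1-D toy version of sparse sampling: a density field with a single Fourier
# mode (k = 3) is defined on 64 regular grid points, but only every 4th point
# is "observed".  The 16 observed points are still regularly spaced, and
# k = 3 lies below their Nyquist limit (16/2 = 8), so the mode is recovered
# without aliasing or window-function distortion.
N, k_true = 64, 3
field = [math.sin(2 * math.pi * k_true * i / N) for i in range(N)]

def power_spectrum(samples):
    """Direct DFT power estimate, |F(k)|^2 / n, for bins k = 0 .. n//2 - 1."""
    n = len(samples)
    return [abs(sum(s * cmath.exp(-2j * math.pi * k * i / n)
                    for i, s in enumerate(samples))) ** 2 / n
            for k in range(n // 2)]

sparse = field[::4]                    # regularly spaced sparse sample
spec = power_spectrum(sparse)
peak = max(range(len(spec)), key=lambda k: spec[k])
print(peak)  # 3
```

Irregular gaps in `sparse` would instead smear power across neighbouring bins, the window-function effect the abstract describes as calculable but uncertainty-increasing.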

  16. [Methodological design of the National Health and Nutrition Survey 2016].

    Science.gov (United States)

    Romero-Martínez, Martín; Shamah-Levy, Teresa; Cuevas-Nasu, Lucía; Gómez-Humarán, Ignacio Méndez; Gaona-Pineda, Elsa Berenice; Gómez-Acosta, Luz María; Rivera-Dommarco, Juan Ángel; Hernández-Ávila, Mauricio

    2017-01-01

Describe the design methodology of the halfway health and nutrition national survey (Ensanut-MC) 2016. The Ensanut-MC is a national probabilistic survey whose target population is the inhabitants of private households in Mexico. The sample size was determined to make inferences on urban and rural areas in four regions. The main design elements are described: target population, topics of study, sampling procedure, measurement procedure and logistics organization. The final sample comprised 9 479 completed household interviews and 16 591 individual interviews. The response rate for households was 77.9%, and the response rate for individuals was 91.9%. The Ensanut-MC probabilistic design allows valid statistical inferences about parameters of interest for Mexico's public health and nutrition, specifically overweight, obesity and diabetes mellitus. Updated information also supports the monitoring, updating and formulation of new policies and priority programs.

  17. THE FMOS-COSMOS SURVEY OF STAR-FORMING GALAXIES AT z ∼ 1.6. III. SURVEY DESIGN, PERFORMANCE, AND SAMPLE CHARACTERISTICS

    Energy Technology Data Exchange (ETDEWEB)

    Silverman, J. D.; Sugiyama, N. [Kavli Institute for the Physics and Mathematics of the Universe (WPI), The University of Tokyo Institutes for Advanced Study, The University of Tokyo, Kashiwa, 277-8583 (Japan); Kashino, D. [Division of Particle and Astrophysical Science, Graduate School of Science, Nagoya University, Nagoya, 464-8602 (Japan); Sanders, D.; Zahid, J.; Kewley, L. J.; Chu, J.; Hasinger, G. [Institute for Astronomy, University of Hawaii, 2680 Woodlawn Drive, Honolulu, HI, 96822 (United States); Kartaltepe, J. S. [National Optical Astronomy Observatory, 950 N. Cherry Ave., Tucson, AZ, 85719 (United States); Arimoto, N. [Subaru Telescope, 650 North A’ohoku Place, Hilo, Hawaii, 96720 (United States); Renzini, A. [Instituto Nazionale de Astrofisica, Osservatorio Astronomico di Padova, vicolo dell’Osservatorio 5, I-35122, Padova, Italy, EU (Italy); Rodighiero, G.; Baronchelli, I. [Dipartimento di Fisica e Astronomia, Universita di Padova, vicolo Osservatorio, 3, I-35122, Padova (Italy); Daddi, E.; Juneau, S. [Laboratoire AIM, CEA/DSM-CNRS-Universite Paris Diderot, Irfu/Service d’Astrophysique, CEA Saclay (France); Nagao, T. [Graduate School of Science and Engineering, Ehime University, 2-5 Bunkyo-cho, Matsuyama 790-8577 (Japan); Lilly, S. J.; Carollo, C. M. [Institute of Astronomy, ETH Zürich, CH-8093, Zürich (Switzerland); Capak, P. [Spitzer Science Center, California Institute of Technology, Pasadena, CA 91125 (United States); Ilbert, O., E-mail: john.silverman@ipmu.jp [Aix Marseille Université, CNRS, LAM (Laboratoire d’Astrophysique de Marseille) UMR 7326, F-13388, Marseille (France); and others

    2015-09-15

We present a spectroscopic survey of galaxies in the COSMOS field using the Fiber Multi-object Spectrograph (FMOS), a near-infrared instrument on the Subaru Telescope. Our survey is specifically designed to detect the Hα emission line that falls within the H-band (1.6–1.8 μm) spectroscopic window from star-forming galaxies with 1.4 < z < 1.7 and M_stellar ≳ 10^10 M_⊙. With the high multiplex capability of FMOS, it is now feasible to construct samples of over 1000 galaxies having spectroscopic redshifts at epochs that were previously challenging. The high-resolution mode (R ∼ 2600) effectively separates Hα and [N ii]λ6585, thus enabling studies of the gas-phase metallicity and photoionization state of the interstellar medium. The primary aim of our program is to establish how star formation depends on stellar mass and environment, both recognized as drivers of galaxy evolution at lower redshifts. In addition to the main galaxy sample, our target selection places priority on those detected in the far-infrared by Herschel/PACS to assess the level of obscured star formation and investigate, in detail, outliers from the star formation rate (SFR)–stellar mass relation. Galaxies with Hα detections are followed up with FMOS observations at shorter wavelengths using the J-long (1.11–1.35 μm) grating to detect Hβ and [O iii]λ5008 which provides an assessment of the extinction required to measure SFRs not hampered by dust, and an indication of embedded active galactic nuclei. With 460 redshifts measured from 1153 spectra, we assess the performance of the instrument with respect to achieving our goals, discuss inherent biases in the sample, and detail the emission-line properties. Our higher-level data products, including catalogs and spectra, are available to the community.

  18. Visual Design, Order Effects, and Respondent Characteristics in a Self-Administered Survey

    Directory of Open Access Journals (Sweden)

    Michael Stern

    2007-12-01

Recent survey design research has shown that small changes in the structure and visual layout of questions can affect respondents' answers. While the findings have provided strong evidence of such effects, they are limited by the homogeneity of their samples, in that many of these studies have used random samples of college students. In this paper, we examine the effects of seven experimental alterations in question format and visual design using data from a general population survey that allows us to examine the effects of demographic differences among respondents. Results from a 2005 random sample mail survey of 1,315 households in a small metropolitan region of the United States suggest that the visual layout of survey questions affects different demographic groups in similar ways.

  19. Methodological design of the National Health and Nutrition Survey 2016

    Directory of Open Access Journals (Sweden)

    Martín Romero-Martínez

    2017-05-01

Objective. Describe the design methodology of the halfway health and nutrition national survey (Ensanut-MC) 2016. Materials and methods. The Ensanut-MC is a national probabilistic survey whose target population is the inhabitants of private households in Mexico. The sample size was determined to make inferences on urban and rural areas in four regions. The main design elements are described: target population, topics of study, sampling procedure, measurement procedure and logistics organization. Results. The final sample comprised 9 479 completed household interviews and 16 591 individual interviews. The response rate for households was 77.9%, and the response rate for individuals was 91.9%. Conclusions. The Ensanut-MC probabilistic design allows valid statistical inferences about parameters of interest for Mexico's public health and nutrition, specifically overweight, obesity and diabetes mellitus. Updated information also supports the monitoring, updating and formulation of new policies and priority programs.

  20. Multiagency radiation survey and site investigation manual (MARSSIM): Survey design

    International Nuclear Information System (INIS)

    Abelquist, E.W.; Berger, J.D.

    1996-01-01

This paper describes the MultiAgency Radiation Survey and Site Investigation Manual (MARSSIM) strategy for designing a final status survey. The purpose of the final status survey is to demonstrate that release criteria established by the regulatory agency have been met. Survey design begins with identification of the contaminants and determination of whether the radionuclides of concern exist in background. The decommissioned site is segregated into Class 1, Class 2, and Class 3 areas, based on contamination potential, and each area is further divided into survey units. Appropriate reference areas for indoor and outdoor background measurements are selected. Survey instrumentation and techniques are selected to assure that the instrumentation is capable of detecting the contamination at the derived concentration guideline level (DCGL). Survey reference systems are established and the number of survey data points is determined, with the required number of data points distributed on a triangular grid pattern. Two statistical tests are used to evaluate data from final status surveys. For contaminants that are present in background, the Wilcoxon Rank Sum test is used; for contaminants that are not present in background, the Wilcoxon Signed Rank (or Sign) test is used. The number of data points needed to satisfy these nonparametric tests is based on the contaminant DCGL value, the expected standard deviation of the contaminant in background and in the survey unit, and the acceptable probability of making Type I and Type II decision errors. The MARSSIM also requires a reasonable level of assurance that any small areas of elevated residual radioactivity that could be significant relative to regulatory limits are not missed during the final status survey. Measurements and sampling on a specified grid size are used to obtain an adequate assurance level that small locations of elevated radioactivity will still satisfy DCGLs applicable to small areas.
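A minimal rank-sum comparison of a survey unit against a reference (background) area can be sketched with the standard normal approximation. This is a simplified illustration, not the full MARSSIM procedure (which, for example, shifts the reference measurements by the DCGL before ranking); the concentration data are fabricated.

```python
import math

# Hypothetical concentration measurements (e.g., pCi/g) from a reference
# (background) area and a survey unit.
reference   = [1.1, 0.9, 1.3, 1.0, 1.2, 0.8, 1.4, 1.0, 1.1, 0.9]
survey_unit = [1.6, 1.8, 1.5, 2.0, 1.7, 1.9, 2.1, 1.6, 1.8, 1.7]

def ranks(values):
    """Ranks 1..n with tied values receiving their average rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1          # average of ranks i+1 .. j+1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

combined = reference + survey_unit
r = ranks(combined)
W = sum(r[len(reference):])            # rank sum of survey-unit measurements

n, m = len(survey_unit), len(reference)
mean = n * (n + m + 1) / 2             # null mean of W
sd = math.sqrt(n * m * (n + m + 1) / 12)
z = (W - mean) / sd                    # normal approximation
print(round(z, 2))
```

Here every survey-unit value exceeds every reference value, so W takes its maximum (155) and z ≈ 3.78, clear evidence that the unit is elevated relative to background.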

  1. Design and operation of the national home health aide survey: 2007-2008.

    Science.gov (United States)

Bercovitz, Anita; Moss, Abigail J; Sengupta, Manisha; Harris-Kojetin, Lauren D; Squillace, Marie R; Rosenoff, Emily; Branden, Laura

    2010-03-01

    This report provides an overview of the National Home Health Aide Survey (NHHAS), the first national probability survey of home health aides. NHHAS was designed to provide national estimates of home health aides who provided assistance in activities of daily living (ADLs) and were directly employed by agencies that provide home health and/or hospice care. This report discusses the need for and objectives of the survey, the design process, the survey methods, and data availability. METHODS NHHAS, a multistage probability sample survey, was conducted as a supplement to the 2007 National Home and Hospice Care Survey (NHHCS). Agencies providing home health and/or hospice care were sampled, and then aides employed by these agencies were sampled and interviewed by telephone. Survey topics included recruitment, training, job history, family life, client relations, work-related injuries, and demographics. NHHAS was virtually identical to the 2004 National Nursing Assistant Survey of certified nursing assistants employed in sampled nursing homes with minor changes to account for differences in workplace environment and responsibilities. RESULTS From September 2007 to April 2008, interviews were completed with 3,416 aides. A public-use data file that contains the interview responses, sampling weights, and design variables is available. The NHHAS overall response rate weighted by the inverse of the probability of selection was 41 percent. This rate is the product of the weighted first-stage agency response rate of 57 percent (i.e., weighted response rate of 59 percent for agency participation in NHHCS times the weighted response rate of 97 percent for agencies participating in NHHCS that also participated in NHHAS) and the weighted second-stage aide response rate of 72 percent to NHHAS.
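The composition of the overall weighted response rate described above is simple arithmetic, and checking it makes the stage structure explicit: the first-stage agency rate is the product of the two agency-level rates, and the overall rate is that product times the aide rate.

```python
# Stage-wise weighted response rates reported for NHHAS.
agency_in_nhhcs = 0.59           # agency participation in NHHCS
nhhas_given_nhhcs = 0.97         # NHHCS agencies that also joined NHHAS
aide = 0.72                      # second-stage aide response to NHHAS

first_stage = agency_in_nhhcs * nhhas_given_nhhcs
overall = first_stage * aide
print(round(first_stage, 2), round(overall, 2))  # 0.57 0.41
```

The rounded products reproduce the 57 percent first-stage and 41 percent overall rates quoted in the report.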

  2. Designing and conducting survey research a comprehensive guide

    CERN Document Server

    Rea, Louis M

    2014-01-01

The industry standard guide, updated with new ideas and SPSS analysis techniques Designing and Conducting Survey Research: A Comprehensive Guide Fourth Edition is the industry standard resource that covers all major components of the survey process, updated to include new data analysis techniques and SPSS procedures with sample data sets online. The book offers practical, actionable guidance on constructing the instrument, administering the process, and analyzing and reporting the results, providing extensive examples and worksheets that demonstrate the appropriate use of survey and data tech

  3. National Sample Survey of Registered Nurses

    Data.gov (United States)

    U.S. Department of Health & Human Services — The National Sample Survey of Registered Nurses (NSSRN) Download makes data from the survey readily available to users in a one-stop download. The Survey has been...

  4. Mahalanobis' Contributions to Sample Surveys

    Indian Academy of Sciences (India)

    Sample Survey started its operations in October 1950 under the ... and adopted random cuts for estimating the acreage under jute ... demographic factors relating to indebtedness, unemployment, ... traffic surveys, demand for currency coins and average life of .... Mahalanobis derived the optimum allocation in stratified.

  5. The 2003 Australian Breast Health Survey: survey design and preliminary results

    Directory of Open Access Journals (Sweden)

    Favelle Simone

    2008-01-01

Background The Breast Health Surveys, conducted by the National Breast Cancer Centre (NBCC) in 1996 and 2003, are designed to gain insight into the knowledge, attitudes and behaviours of a nationally representative sample of Australian women on issues relevant to breast cancer. In this article, we focus on major aspects of the design and present results on respondents' knowledge about mammographic screening. Methods The 2003 BHS surveyed English-speaking Australian women aged 30–69 without a history of breast cancer using computer-assisted telephone interviewing. Questions covered the following themes: knowledge and perceptions about incidence, mortality and risk; knowledge and behaviour regarding early detection, symptoms and diagnosis; mammographic screening; treatment; and accessibility and availability of information and services. Respondents were selected using a complex sample design involving stratification. Sample weights against Australian population benchmarks were used in all statistical analyses. Means and proportions for the entire population and by age group and area of residence were calculated. Statistical tests were conducted using a level of significance of 0.01. Results Of the 3,144 respondents who consented to being interviewed, 138 (4.4%) had a previous diagnosis of breast cancer and were excluded leaving 3,006 completed interviews eligible for analysis. A majority of respondents (61.1%) reported ever having had a mammogram and 29.1% identified mammography as being the best way of finding breast cancer. A majority of women (85.9%) had heard of the BreastScreen Australia (BSA) program, the national mammographic screening program providing free biennial screening mammograms, with 94.5% believing that BSA attendance was available regardless of the presence or absence of symptoms. There have been substantial gains in women's knowledge about mammographic screening over the seven years between the two surveys. 
Conclusion The

  6. The 2003 Australian Breast Health Survey: survey design and preliminary results.

    Science.gov (United States)

    Villanueva, Elmer V; Jones, Sandra; Nehill, Caroline; Favelle, Simone; Steel, David; Iverson, Donald; Zorbas, Helen

    2008-01-14

    The Breast Health Surveys, conducted by the National Breast Cancer Centre (NBCC) in 1996 and 2003, are designed to gain insight into the knowledge, attitudes and behaviours of a nationally representative sample of Australian women on issues relevant to breast cancer. In this article, we focus on major aspects of the design and present results on respondents' knowledge about mammographic screening. The 2003 BHS surveyed English-speaking Australian women aged 30-69 without a history of breast cancer using computer-assisted telephone interviewing. Questions covered the following themes: knowledge and perceptions about incidence, mortality and risk; knowledge and behaviour regarding early detection, symptoms and diagnosis; mammographic screening; treatment; and accessibility and availability of information and services. Respondents were selected using a complex sample design involving stratification. Sample weights against Australian population benchmarks were used in all statistical analyses. Means and proportions for the entire population and by age group and area of residence were calculated. Statistical tests were conducted using a level of significance of 0.01. Of the 3,144 respondents who consented to being interviewed, 138 (4.4%) had a previous diagnosis of breast cancer and were excluded leaving 3,006 completed interviews eligible for analysis. A majority of respondents (61.1%) reported ever having had a mammogram and 29.1% identified mammography as being the best way of finding breast cancer. A majority of women (85.9%) had heard of the BreastScreen Australia (BSA) program, the national mammographic screening program providing free biennial screening mammograms, with 94.5% believing that BSA attendance was available regardless of the presence or absence of symptoms. There have been substantial gains in women's knowledge about mammographic screening over the seven years between the two surveys. The NBCC Breast Health Surveys provide a valuable picture of the

  7. The rise of survey sampling

    NARCIS (Netherlands)

    Bethlehem, J.

    2009-01-01

This paper is about the history of survey sampling. It describes how sampling became an accepted scientific method. From the first ideas in 1895, it took some 50 years before the principles of probability sampling were widely accepted. This paper focuses on developments in official statistics in

  8. Optimising cluster survey design for planning schistosomiasis preventive chemotherapy.

    Directory of Open Access Journals (Sweden)

    Sarah C L Knowles

    2017-05-01

The cornerstone of current schistosomiasis control programmes is delivery of praziquantel to at-risk populations. Such preventive chemotherapy requires accurate information on the geographic distribution of infection, yet the performance of alternative survey designs for estimating prevalence and converting this into treatment decisions has not been thoroughly evaluated. We used baseline schistosomiasis mapping surveys from three countries (Malawi, Côte d'Ivoire and Liberia) to generate spatially realistic gold standard datasets, against which we tested alternative two-stage cluster survey designs. We assessed how sampling different numbers of schools per district (2-20) and children per school (10-50) influences the accuracy of prevalence estimates and treatment class assignment, and we compared survey cost-efficiency using data from Malawi. Due to the focal nature of schistosomiasis, up to 53% of simulated surveys involving 2-5 schools per district failed to detect schistosomiasis in low endemicity areas (1-10% prevalence). Increasing the number of schools surveyed per district improved treatment class assignment far more than increasing the number of children sampled per school. For Malawi, surveys of 15 schools per district and 20-30 children per school reliably detected endemic schistosomiasis and maximised cost-efficiency. In sensitivity analyses where treatment costs and the country considered were varied, optimal survey size was remarkably consistent, with cost-efficiency maximised at 15-20 schools per district. Among two-stage cluster surveys for schistosomiasis, our simulations indicated that surveying 15-20 schools per district and 20-30 children per school optimised cost-efficiency and minimised the risk of under-treatment, with surveys involving more schools becoming more cost-efficient as treatment costs rose.
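Why adding schools beats adding children per school for a focal infection can be seen in a toy two-stage simulation. The district structure below (10% of schools infected at 30% prevalence, the rest uninfected) is an illustrative assumption, not the paper's calibrated model.

```python
import random

random.seed(11)

# Toy focally endemic district: 10% of 100 schools carry infection at 30%
# within-school prevalence; the remaining schools are uninfected.
N_SCHOOLS, INFECTED_FRAC, PREV_IN_FOCUS, CHILDREN = 100, 0.10, 0.30, 30

def detects(n_schools):
    """Does a survey of n_schools (30 children each) find any infection?"""
    schools = [random.random() < INFECTED_FRAC for _ in range(N_SCHOOLS)]
    for infected in random.sample(schools, n_schools):
        if infected and any(random.random() < PREV_IN_FOCUS
                            for _ in range(CHILDREN)):
            return True
    return False

reps = 400
hit2 = sum(detects(2) for _ in range(reps))
hit15 = sum(detects(15) for _ in range(reps))
print(f"detection: 2 schools {hit2 / reps:.2f}, 15 schools {hit15 / reps:.2f}")
```

With only 2 schools, most simulated surveys miss the infection entirely (roughly the failure mode the paper reports for 2-5 schools in low-endemicity areas), while 15 schools detect it in the large majority of replicates.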

  9. Extending cluster Lot Quality Assurance Sampling designs for surveillance programs

    OpenAIRE

    Hund, Lauren; Pagano, Marcello

    2014-01-01

    Lot quality assurance sampling (LQAS) has a long history of applications in industrial quality control. LQAS is frequently used for rapid surveillance in global health settings, with areas classified as poor or acceptable performance based on the binary classification of an indicator. Historically, LQAS surveys have relied on simple random samples from the population; however, implementing two-stage cluster designs for surveillance sampling is often more cost-effective than ...
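In its classical form, an LQAS decision rule (n, d) classifies an area from a simple random sample of n individuals: performance is judged acceptable when more than d of them have the indicator, and the binomial tails give the two misclassification risks. A minimal stdlib sketch; the 19-sample rule and the 50%/80% thresholds are standard textbook values, not taken from this paper:

```python
from math import comb

def binom_cdf(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

def lqas_risks(n, d, p_low, p_high):
    """For the rule 'classify as acceptable if successes > d':
    alpha = P(truly high-performing area is classified poor),
    beta  = P(truly poor area is classified acceptable)."""
    alpha = binom_cdf(d, n, p_high)   # high area yields <= d successes
    beta = 1 - binom_cdf(d, n, p_low)  # poor area yields > d successes
    return alpha, beta

# Classic 19-sample rule: acceptable if at least 13 successes (d = 12).
alpha, beta = lqas_risks(19, 12, p_low=0.5, p_high=0.8)
print(round(alpha, 3), round(beta, 3))
```

Both risks come out below 10% for these thresholds, which is why the 19-sample design is so widely used in field surveillance.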

  10. Sampling design for long-term regional trends in marine rocky intertidal communities

    Science.gov (United States)

    Irvine, Gail V.; Shelley, Alice

    2013-01-01

    Probability-based designs reduce bias and allow inference of results to the pool of sites from which they were chosen. We developed and tested probability-based designs for monitoring marine rocky intertidal assemblages at Glacier Bay National Park and Preserve (GLBA), Alaska. A multilevel design was used that varied in scale and inference. The levels included aerial surveys, extensive sampling of 25 sites, and more intensive sampling of 6 sites. Aerial surveys of a subset of intertidal habitat indicated that the original target habitat of bedrock-dominated sites with slope ≤30° was rare. This unexpected finding illustrated one value of probability-based surveys and led to a shift in the target habitat type to include steeper, more mixed rocky habitat. Subsequently, we evaluated the statistical power of different sampling methods and sampling strategies to detect changes in the abundances of the predominant sessile intertidal taxa: barnacles Balanomorpha, the mussel Mytilus trossulus, and the rockweed Fucus distichus subsp. evanescens. There was greatest power to detect trends in Mytilus and lesser power for barnacles and Fucus. Because of its greater power, the extensive, coarse-grained sampling scheme was adopted in subsequent years over the intensive, fine-grained scheme. The sampling attributes that had the largest effects on power included sampling of “vertical” line transects (vs. horizontal line transects or quadrats) and increasing the number of sites. We also evaluated the power of several management-set parameters. Given equal sampling effort, sampling more sites fewer times had greater power. The information gained through intertidal monitoring is likely to be useful in assessing changes due to climate, including ocean acidification; invasive species; trampling effects; and oil spills.
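Power comparisons like those described above (greater power for Mytilus than for barnacles or Fucus) are typically made by simulation: impose a linear trend on annual abundances, add sampling noise, fit a regression, and count how often the slope is statistically distinguishable from zero. A stdlib sketch, with the noise level standing in for the sampling-design choices compared in the study (this is a generic illustration, not the authors' analysis):

```python
import random
import statistics

def trend_power(years, trend, sd, n_sims=2000, t_crit=2.0, seed=1):
    """Approximate power to detect a linear trend: simulate series of
    annual means with Gaussian sampling error, fit OLS, and count how
    often |slope / SE(slope)| exceeds t_crit."""
    rng = random.Random(seed)
    xs = list(range(years))
    xbar = statistics.mean(xs)
    sxx = sum((x - xbar) ** 2 for x in xs)
    hits = 0
    for _ in range(n_sims):
        ys = [trend * x + rng.gauss(0, sd) for x in xs]
        ybar = statistics.mean(ys)
        slope = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / sxx
        resid = [y - ybar - slope * (x - xbar) for x, y in zip(xs, ys)]
        se = (sum(r * r for r in resid) / (years - 2) / sxx) ** 0.5
        if se > 0 and abs(slope / se) > t_crit:
            hits += 1
    return hits / n_sims

# More sampling noise (e.g. fewer sites) -> less power for the same trend.
print(trend_power(10, trend=1.0, sd=2.0), trend_power(10, trend=1.0, sd=6.0))
```

The same loop, run with variance estimates from pilot data for each taxon and design, reproduces the kind of ranking reported above.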

  11. A SUCCESSFUL BROADBAND SURVEY FOR GIANT Lyα NEBULAE. I. SURVEY DESIGN AND CANDIDATE SELECTION

    International Nuclear Information System (INIS)

    Prescott, Moire K. M.; Dey, Arjun; Jannuzi, Buell T.

    2012-01-01

    Giant Lyα nebulae (or Lyα 'blobs') are likely sites of ongoing massive galaxy formation, but the rarity of these powerful sources has made it difficult to form a coherent picture of their properties, ionization mechanisms, and space density. Systematic narrowband Lyα nebula surveys are ongoing, but the small redshift range covered and the observational expense limit the comoving volume that can be probed by even the largest of these surveys and pose a significant problem when searching for such rare sources. We have developed a systematic search technique designed to find large Lyα nebulae at 2 ≲ z ≲ 3 within deep broadband imaging and have carried out a survey of the 9.4 deg² NOAO Deep Wide-Field Survey Boötes field. With a total survey comoving volume of ≈10⁸ h₇₀⁻³ Mpc³, this is the largest volume survey for Lyα nebulae ever undertaken. In this first paper in the series, we present the details of the survey design and a systematically selected sample of 79 candidates, which includes one previously discovered Lyα nebula.

  12. Modern survey sampling

    CERN Document Server

    Chaudhuri, Arijit

    2014-01-01

    Exposure to Sampling (Abstract; Introduction; Concepts of Population, Sample, and Sampling). Initial Ramifications (Abstract; Introduction; Sampling Design, Sampling Scheme; Random Numbers and Their Uses in Simple Random Sampling (SRS); Drawing Simple Random Samples with and without Replacement; Estimation of Mean, Total, Ratio of Totals/Means: Variance and Variance Estimation; Determination of Sample Sizes; Appendix to Chapter 2: More on Equal Probability Sampling; Horvitz-Thompson Estimator; Sufficiency; Likelihood; Non-Existence Theorem). More Intricacies (Abstract; Introduction; Unequal Probability Sampling Strategies; PPS Sampling). Exploring Improved Ways (Abstract; Introduction; Stratified Sampling; Cluster Sampling; Multi-Stage Sampling; Multi-Phase Sampling: Ratio and Regression Estimation; Controlled Sampling). Modeling (Introduction; Super-Population Modeling; Prediction Approach; Model-Assisted Approach; Bayesian Methods; Spatial Smoothing). Sampling on Successive Occasions: Panel Rotation. Non-Response and Not-at-Homes: Weighting Adj...

  13. A SUCCESSFUL BROADBAND SURVEY FOR GIANT Lyα NEBULAE. I. SURVEY DESIGN AND CANDIDATE SELECTION

    Energy Technology Data Exchange (ETDEWEB)

    Prescott, Moire K. M. [Department of Physics, Broida Hall, Mail Code 9530, University of California, Santa Barbara, CA 93106 (United States); Dey, Arjun; Jannuzi, Buell T., E-mail: mkpresco@physics.ucsb.edu [National Optical Astronomy Observatory, 950 North Cherry Avenue, Tucson, AZ 85719 (United States)

    2012-04-01

    Giant Lyα nebulae (or Lyα 'blobs') are likely sites of ongoing massive galaxy formation, but the rarity of these powerful sources has made it difficult to form a coherent picture of their properties, ionization mechanisms, and space density. Systematic narrowband Lyα nebula surveys are ongoing, but the small redshift range covered and the observational expense limit the comoving volume that can be probed by even the largest of these surveys and pose a significant problem when searching for such rare sources. We have developed a systematic search technique designed to find large Lyα nebulae at 2 ≲ z ≲ 3 within deep broadband imaging and have carried out a survey of the 9.4 deg² NOAO Deep Wide-Field Survey Boötes field. With a total survey comoving volume of ≈10⁸ h₇₀⁻³ Mpc³, this is the largest volume survey for Lyα nebulae ever undertaken. In this first paper in the series, we present the details of the survey design and a systematically selected sample of 79 candidates, which includes one previously discovered Lyα nebula.

  14. Improved sampling for airborne surveys to estimate wildlife population parameters in the African Savannah

    NARCIS (Netherlands)

    Khaemba, W.; Stein, A.

    2002-01-01

    Parameter estimates, obtained from airborne surveys of wildlife populations, often have large bias and large standard errors. Sampling error is one of the major causes of this imprecision and the occurrence of many animals in herds violates the common assumptions in traditional sampling designs like ...

  15. Sample size methods for estimating HIV incidence from cross-sectional surveys.

    Science.gov (United States)

    Konikoff, Jacob; Brookmeyer, Ron

    2015-12-01

    Understanding HIV incidence, the rate at which new infections occur in populations, is critical for tracking and surveillance of the epidemic. In this article, we derive methods for determining sample sizes for cross-sectional surveys to estimate incidence with sufficient precision. We further show how to specify sample sizes for two successive cross-sectional surveys to detect changes in incidence with adequate power. In these surveys biomarkers such as CD4 cell count, viral load, and recently developed serological assays are used to determine which individuals are in an early disease stage of infection. The total number of individuals in this stage, divided by the number of people who are uninfected, is used to approximate the incidence rate. Our methods account for uncertainty in the durations of time spent in the biomarker defined early disease stage. We find that failure to account for this uncertainty when designing surveys can lead to imprecise estimates of incidence and underpowered studies. We evaluated our sample size methods in simulations and found that they performed well in a variety of underlying epidemics. Code for implementing our methods in R is available with this article at the Biometrics website on Wiley Online Library. © 2015, The International Biometric Society.
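The abstract's estimator (the fraction of the sample in the biomarker-defined early stage, scaled by the mean duration of that stage) implies a simple back-of-envelope sample size via the normal approximation for a proportion. The sketch below deliberately ignores uncertainty in the mean duration, which the authors show should not be neglected, so treat it as a lower bound; the numeric inputs are assumptions for the example:

```python
def required_n(incidence, mean_duration_years, prev_infected, cv_target):
    """Back-of-envelope sample size for estimating incidence from a
    cross-sectional survey via a 'recent infection' biomarker.
    Ignores uncertainty in the early-stage mean duration, which the
    paper shows should not be neglected (so this is optimistic)."""
    # Expected fraction of the whole sample found in the early stage:
    p_early = incidence * mean_duration_years * (1 - prev_infected)
    # Normal approximation: CV^2 of a sample proportion is (1-p)/(n*p),
    # so solve for n at the target coefficient of variation.
    return (1 - p_early) / (p_early * cv_target**2)

# Example: 1%/yr incidence, 6-month mean window, 15% prevalence, 25% CV.
n = required_n(0.01, 0.5, 0.15, 0.25)
print(round(n))
```

Because the early-stage fraction is tiny, the required n is large; tightening the target CV inflates it by 1/CV².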

  16. Statistical literacy and sample survey results

    Science.gov (United States)

    McAlevey, Lynn; Sullivan, Charles

    2010-10-01

    Sample surveys are widely used in the social sciences and business. The news media almost daily quote from them, yet they are widely misused. Using students with prior managerial experience embarking on an MBA course, we show that common sample survey results are misunderstood even by those managers who have previously done a statistics course. In general, they fare no better than managers who have never studied statistics. There are implications for teaching, especially in business schools, as well as for consulting.

  17. Designing an Effective Survey

    National Research Council Canada - National Science Library

    Kasunic, Mark

    2005-01-01

    ... of them. However, to protect the validity of conclusions drawn from a survey, certain procedures must be followed throughout the process of designing, developing, and distributing the survey questionnaire...

  18. Adaptive geostatistical sampling enables efficient identification of malaria hotspots in repeated cross-sectional surveys in rural Malawi.

    Directory of Open Access Journals (Sweden)

    Alinune N Kabaghe

    Full Text Available In the context of malaria elimination, interventions will need to target high-burden areas to further reduce transmission. Current tools to monitor and report disease burden lack the capacity to continuously detect fine-scale spatial and temporal variations of disease distribution exhibited by malaria. These tools use random sampling techniques that are inefficient for capturing underlying heterogeneity, while health facility data in resource-limited settings are inaccurate. Continuous community surveys of malaria burden provide real-time results of local spatio-temporal variation. Adaptive geostatistical design (AGD) improves prediction of the outcome of interest compared to current random sampling techniques. We present findings of continuous malaria prevalence surveys using an adaptive sampling design. We conducted repeated cross-sectional surveys guided by an adaptive sampling design to monitor the prevalence of malaria parasitaemia and anaemia in children below five years old in the communities living around Majete Wildlife Reserve in Chikwawa district, Southern Malawi. AGD sampling uses previously collected data to sample new locations of high prediction variance or where prediction exceeds a set threshold. We fitted a geostatistical model to predict malaria prevalence in the area. We conducted five rounds of sampling, and tested 876 children aged 6-59 months from 1377 households over a 12-month period. Malaria prevalence prediction maps showed spatial heterogeneity and the presence of hotspots, where predicted malaria prevalence was above 30%; predictors of malaria included age, socio-economic status and ownership of insecticide-treated mosquito nets. Continuous malaria prevalence surveys using adaptive sampling increased malaria prevalence prediction accuracy. Results from the surveys were readily available after data collection. The tool can assist local managers to target malaria control interventions in areas with the greatest health impact and is ...
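The adaptive rule described, sample new locations of high prediction variance or where predicted prevalence exceeds a threshold, can be illustrated with a toy kernel smoother standing in for a full geostatistical model. The predictor, the variance proxy, and the 30% threshold below are assumptions for illustration, not the study's fitted model:

```python
import math

def kernel_predict(x, y, data, h=1.0):
    """Nadaraya-Watson smoother standing in for a geostatistical model:
    returns (predicted prevalence, local information weight at (x, y))."""
    wsum = psum = 0.0
    for (sx, sy, pos, n) in data:
        w = n * math.exp(-((x - sx) ** 2 + (y - sy) ** 2) / (2 * h * h))
        wsum += w
        psum += w * (pos / n)
    return psum / wsum, wsum

def next_location(candidates, data, threshold=0.3):
    """Adaptive rule in the spirit of AGD: prefer candidate locations
    where predicted prevalence exceeds the threshold; otherwise pick
    the one with the least local information (a variance proxy)."""
    hot = [c for c in candidates if kernel_predict(*c, data)[0] > threshold]
    if hot:
        return hot[0]
    return min(candidates, key=lambda c: kernel_predict(*c, data)[1])

data = [(0, 0, 8, 20), (1, 0, 2, 20), (5, 5, 1, 20)]  # (x, y, positives, tested)
print(next_location([(0.5, 0.0), (9.0, 9.0)], data))
```

Each survey round appends its results to `data` and the rule is re-applied, which is what lets the design concentrate effort on hotspots and poorly mapped areas.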

  19. Simulating future uncertainty to guide the selection of survey designs for long-term monitoring

    Science.gov (United States)

    Garman, Steven L.; Schweiger, E. William; Manier, Daniel J.; Gitzen, Robert A.; Millspaugh, Joshua J.; Cooper, Andrew B.; Licht, Daniel S.

    2012-01-01

    A goal of environmental monitoring is to provide sound information on the status and trends of natural resources (Messer et al. 1991, Theobald et al. 2007, Fancy et al. 2009). When monitoring observations are acquired by measuring a subset of the population of interest, probability sampling as part of a well-constructed survey design provides the most reliable and legally defensible approach to achieve this goal (Cochran 1977, Olsen et al. 1999, Schreuder et al. 2004; see Chapters 2, 5, 6, 7). Previous works have described the fundamentals of sample surveys (e.g. Hansen et al. 1953, Kish 1965). Interest in survey designs and monitoring over the past 15 years has led to extensive evaluations and new developments of sample selection methods (Stevens and Olsen 2004), of strategies for allocating sample units in space and time (Urquhart et al. 1993, Overton and Stehman 1996, Urquhart and Kincaid 1999), and of estimation (Lesser and Overton 1994, Overton and Stehman 1995) and variance properties (Larsen et al. 1995, Stevens and Olsen 2003) of survey designs. Carefully planned, “scientific” (Chapter 5) survey designs have become a standard in contemporary monitoring of natural resources. Based on our experience with the long-term monitoring program of the US National Park Service (NPS; Fancy et al. 2009; Chapters 16, 22), operational survey designs tend to be selected using the following procedures. For a monitoring indicator (i.e. variable or response), a minimum detectable trend requirement is specified, based on the minimum level of change that would result in meaningful change (e.g. degradation). A probability of detecting this trend (statistical power) and an acceptable level of uncertainty (Type I error; see Chapter 2) within a specified time frame (e.g. 10 years) are specified to ensure timely detection. Explicit statements of the minimum detectable trend, the time frame for detecting the minimum trend, power, and acceptable probability of Type I error ...

  20. On a Modular Approach to the Design of Integrated Social Surveys

    Directory of Open Access Journals (Sweden)

    Ioannidis Evangelos

    2016-06-01

    Full Text Available This article considers a modular approach to the design of integrated social surveys. The approach consists of grouping variables into ‘modules’, each of which is then allocated to one or more ‘instruments’. Each instrument is then administered to a random sample of population units, and each sample unit responds to all modules of the instrument. This approach offers a way of designing a system of integrated social surveys that balances the need to limit the cost and the need to obtain sufficient information. The allocation of the modules to instruments draws on the methodology of split questionnaire designs. The composition of the instruments, that is, how the modules are allocated to instruments, and the corresponding sample sizes are obtained as a solution to an optimisation problem. This optimisation involves minimisation of respondent burden and data collection cost, while respecting certain design constraints usually encountered in practice. These constraints may include, for example, the level of precision required and dependencies between the variables. We propose using a random search algorithm to find approximate optimal solutions to this problem. The algorithm is proved to fulfil conditions that ensure convergence to the global optimum and can also produce an efficient design for a split questionnaire.
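The allocation problem described above can be sketched as a random search over module-to-instrument assignments. The cost model and the coverage constraint below are simplified stand-ins for the respondent-burden and precision constraints in the article, and the convergence proof there is not reproduced here:

```python
import random

def random_search(n_modules, n_instruments, cost, min_coverage,
                  iters=5000, seed=0):
    """Random search over module-to-instrument allocations, minimising
    total respondent burden subject to each module appearing in at
    least `min_coverage` instruments (a stand-in for the article's
    precision constraints)."""
    rng = random.Random(seed)
    best, best_cost = None, float("inf")
    for _ in range(iters):
        # alloc[m] = set of instruments carrying module m
        alloc = [
            set(rng.sample(range(n_instruments),
                           rng.randint(min_coverage, n_instruments)))
            for _ in range(n_modules)
        ]
        total = sum(cost[m] * len(ins) for m, ins in enumerate(alloc))
        if total < best_cost:
            best, best_cost = alloc, total
    return best, best_cost

cost = [5, 3, 8, 2]  # per-respondent burden of each module, in minutes
alloc, total = random_search(n_modules=4, n_instruments=3,
                             cost=cost, min_coverage=1)
print(total)
```

With `min_coverage=1` the optimum assigns each module to exactly one instrument, so the search should settle at a total burden of 5 + 3 + 8 + 2 = 18; realistic precision constraints would force some modules onto several instruments.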

  1. The quality of sample surveys in a developing nation

    Directory of Open Access Journals (Sweden)

    Paul A Bourne

    2010-08-01

    Full Text Available Paul A Bourne1, Christopher AD Charles2,3, Neva South-Bourne4, Chloe Morris1, Denise Eldemire-Shearer1, Maureen D Kerr-Campbell5; 1Department of Community Health and Psychiatry, Faculty of Medical Sciences, University of the West Indies, Mona, Kingston, Jamaica; 2King Graduate School, Monroe College, Bronx, New York, USA; 3Center for Victim Support, Harlem Hospital Center, New York, USA; 4Research assistant for Paul A Bourne; 5Systems Development Unit, Main Library, Faculty of Humanities and Education, University of the West Indies, Mona, Kingston, Jamaica. Background: In Jamaica, population census began in 1844, and many intercensal ratios obtained from the census data showed a generally high degree of accuracy of the data. However, statistics from the Jamaican Ministry of Health showed that there are inaccuracies in health data collected from males using sample surveys. Objectives: The objectives of the present research are to (1) investigate the accuracy of a national sample survey, (2) explore the feasibility and quality of using a subnational sample survey to represent a national survey, (3) aid other scholars in understanding the probability of using national sample surveys and subnational sample surveys, (4) assess older men's evaluation of their health status, and (5) determine whether dichotomization changes self-evaluated health status. Methods: For the current study, the data used in the analysis were originally collected from 2 different sources: (1) the Jamaica Survey of Living Conditions (JSLC) and (2) the Survey of Older Men (SOM). Cross-validation of self-evaluated data of men in Jamaica was done with comparable samples of the complete JSLC data and the SOM data, where men older than 55 years were selected from each sample. Results: In study 1, 50.2% of respondents indicated at least good self-evaluated health status compared with 74.0% in study 2. Statistical associations were found between health status and survey sample (Χ2 [df = 5 ...

  2. The Study on Mental Health at Work: Design and sampling.

    Science.gov (United States)

    Rose, Uwe; Schiel, Stefan; Schröder, Helmut; Kleudgen, Martin; Tophoven, Silke; Rauch, Angela; Freude, Gabriele; Müller, Grit

    2017-08-01

    The Study on Mental Health at Work (S-MGA) generates the first nationwide representative survey enabling the exploration of the relationship between working conditions, mental health and functioning. This paper describes the study design, sampling procedures and data collection, and presents a summary of the sample characteristics. S-MGA is a representative study of German employees aged 31-60 years subject to social security contributions. The sample was drawn from the employment register based on a two-stage cluster sampling procedure. Firstly, 206 municipalities were randomly selected from a pool of 12,227 municipalities in Germany. Secondly, 13,590 addresses were drawn from the selected municipalities for the purpose of conducting 4500 face-to-face interviews. The questionnaire covers psychosocial working and employment conditions, measures of mental health, work ability and functioning. Data from personal interviews were combined with employment histories from register data. Descriptive statistics of socio-demographic characteristics and logistic regression analyses were used for comparing population, gross sample and respondents. In total, 4511 face-to-face interviews were conducted. A test for sampling bias revealed that individuals in older cohorts participated more often, while individuals with an unknown educational level, residing in major cities or with a non-German ethnic background were slightly underrepresented. There is no indication of major deviations in characteristics between the basic population and the sample of respondents. Hence, S-MGA provides representative data for research on work and health, designed as a cohort study with plans to rerun the survey 5 years after the first assessment.
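The two-stage cluster selection used here (municipalities first, then addresses within the selected municipalities) can be sketched directly; the toy sampling frame below is an assumption for the example, not the German employment register:

```python
import random

def two_stage_sample(clusters, n_clusters, n_units, seed=0):
    """Two-stage cluster sampling: first draw clusters (e.g.
    municipalities), then draw units (e.g. addresses) within each.
    `clusters` maps cluster id -> list of unit ids."""
    rng = random.Random(seed)
    stage1 = rng.sample(sorted(clusters), n_clusters)
    return {c: rng.sample(clusters[c], min(n_units, len(clusters[c])))
            for c in stage1}

# Toy frame: 10 municipalities with 50-185 addresses each.
frame = {f"muni{i}": [f"addr{i}-{j}" for j in range(50 + 15 * i)]
         for i in range(10)}
sample = two_stage_sample(frame, n_clusters=3, n_units=20)
print(sum(len(v) for v in sample.values()))
```

In production designs the first stage is usually drawn with probability proportional to cluster size, and the design weights (inverse inclusion probabilities) are carried through to the analysis, as in the weighting described in the abstract.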

  3. The Study on Mental Health at Work: Design and sampling

    Science.gov (United States)

    Rose, Uwe; Schiel, Stefan; Schröder, Helmut; Kleudgen, Martin; Tophoven, Silke; Rauch, Angela; Freude, Gabriele; Müller, Grit

    2017-01-01

    Aims: The Study on Mental Health at Work (S-MGA) generates the first nationwide representative survey enabling the exploration of the relationship between working conditions, mental health and functioning. This paper describes the study design, sampling procedures and data collection, and presents a summary of the sample characteristics. Methods: S-MGA is a representative study of German employees aged 31–60 years subject to social security contributions. The sample was drawn from the employment register based on a two-stage cluster sampling procedure. Firstly, 206 municipalities were randomly selected from a pool of 12,227 municipalities in Germany. Secondly, 13,590 addresses were drawn from the selected municipalities for the purpose of conducting 4500 face-to-face interviews. The questionnaire covers psychosocial working and employment conditions, measures of mental health, work ability and functioning. Data from personal interviews were combined with employment histories from register data. Descriptive statistics of socio-demographic characteristics and logistic regression analyses were used for comparing population, gross sample and respondents. Results: In total, 4511 face-to-face interviews were conducted. A test for sampling bias revealed that individuals in older cohorts participated more often, while individuals with an unknown educational level, residing in major cities or with a non-German ethnic background were slightly underrepresented. Conclusions: There is no indication of major deviations in characteristics between the basic population and the sample of respondents. Hence, S-MGA provides representative data for research on work and health, designed as a cohort study with plans to rerun the survey 5 years after the first assessment. PMID:28673202

  4. Relative Efficiencies of a Three-Stage Versus a Two-Stage Sample Design For a New NLS Cohort Study. 22U-884-38.

    Science.gov (United States)

    Folsom, R. E.; Weber, J. H.

    Two sampling designs were compared for the planned 1978 national longitudinal survey of high school seniors with respect to statistical efficiency and cost. The 1972 survey used a stratified two-stage sample of high schools and seniors within schools. In order to minimize interviewer travel costs, an alternate sampling design was proposed,…

  5. SDSS-IV MaNGA IFS GALAXY SURVEY—SURVEY DESIGN, EXECUTION, AND INITIAL DATA QUALITY

    International Nuclear Information System (INIS)

    Yan, Renbin; Zhang, Kai; Bundy, Kevin; Law, David R.; Bershady, Matthew A.; Diamond-Stanic, Aleksandar M.; Andrews, Brett; Cherinka, Brian; Drory, Niv; MacDonald, Nicholas; Sánchez-Gallego, José R.; Thomas, Daniel; Westfall, Kyle B.; Wake, David A.; Weijmans, Anne-Marie; Aragón-Salamanca, Alfonso; Belfiore, Francesco

    2016-01-01

    The MaNGA Survey (Mapping Nearby Galaxies at Apache Point Observatory) is one of three core programs in the Sloan Digital Sky Survey IV. It is obtaining integral field spectroscopy for 10,000 nearby galaxies at a spectral resolution of R ∼ 2000 from 3622 to 10354 Å. The design of the survey is driven by a set of science requirements on the precision of estimates of the following properties: star formation rate surface density, gas metallicity, stellar population age, metallicity, and abundance ratio, and their gradients; stellar and gas kinematics; and enclosed gravitational mass as a function of radius. We describe how these science requirements set the depth of the observations and dictate sample selection. The majority of targeted galaxies are selected to ensure uniform spatial coverage in units of effective radius (Re) while maximizing spatial resolution. About two-thirds of the sample is covered out to 1.5 Re (Primary sample), and one-third of the sample is covered to 2.5 Re (Secondary sample). We describe the survey execution with details that would be useful in the design of similar future surveys. We also present statistics on the achieved data quality, specifically the point-spread function, sampling uniformity, spectral resolution, sky subtraction, and flux calibration. For our Primary sample, the median r-band signal-to-noise ratio is ∼70 per 1.4 Å pixel for spectra stacked between 1 Re and 1.5 Re. Measurements of various galaxy properties from the first-year data show that we are meeting or exceeding the defined requirements for the majority of our science goals.

  6. SDSS-IV MaNGA IFS Galaxy Survey—Survey Design, Execution, and Initial Data Quality

    Science.gov (United States)

    Yan, Renbin; Bundy, Kevin; Law, David R.; Bershady, Matthew A.; Andrews, Brett; Cherinka, Brian; Diamond-Stanic, Aleksandar M.; Drory, Niv; MacDonald, Nicholas; Sánchez-Gallego, José R.; Thomas, Daniel; Wake, David A.; Weijmans, Anne-Marie; Westfall, Kyle B.; Zhang, Kai; Aragón-Salamanca, Alfonso; Belfiore, Francesco; Bizyaev, Dmitry; Blanc, Guillermo A.; Blanton, Michael R.; Brownstein, Joel; Cappellari, Michele; D'Souza, Richard; Emsellem, Eric; Fu, Hai; Gaulme, Patrick; Graham, Mark T.; Goddard, Daniel; Gunn, James E.; Harding, Paul; Jones, Amy; Kinemuchi, Karen; Li, Cheng; Li, Hongyu; Maiolino, Roberto; Mao, Shude; Maraston, Claudia; Masters, Karen; Merrifield, Michael R.; Oravetz, Daniel; Pan, Kaike; Parejko, John K.; Sanchez, Sebastian F.; Schlegel, David; Simmons, Audrey; Thanjavur, Karun; Tinker, Jeremy; Tremonti, Christy; van den Bosch, Remco; Zheng, Zheng

    2016-12-01

    The MaNGA Survey (Mapping Nearby Galaxies at Apache Point Observatory) is one of three core programs in the Sloan Digital Sky Survey IV. It is obtaining integral field spectroscopy for 10,000 nearby galaxies at a spectral resolution of R ˜ 2000 from 3622 to 10354 Å. The design of the survey is driven by a set of science requirements on the precision of estimates of the following properties: star formation rate surface density, gas metallicity, stellar population age, metallicity, and abundance ratio, and their gradients; stellar and gas kinematics; and enclosed gravitational mass as a function of radius. We describe how these science requirements set the depth of the observations and dictate sample selection. The majority of targeted galaxies are selected to ensure uniform spatial coverage in units of effective radius (Re) while maximizing spatial resolution. About two-thirds of the sample is covered out to 1.5 Re (Primary sample), and one-third of the sample is covered to 2.5 Re (Secondary sample). We describe the survey execution with details that would be useful in the design of similar future surveys. We also present statistics on the achieved data quality, specifically the point-spread function, sampling uniformity, spectral resolution, sky subtraction, and flux calibration. For our Primary sample, the median r-band signal-to-noise ratio is ˜70 per 1.4 Å pixel for spectra stacked between 1 Re and 1.5 Re. Measurements of various galaxy properties from the first-year data show that we are meeting or exceeding the defined requirements for the majority of our science goals.

  7. Iranian mental health survey: design and field procedures.

    Directory of Open Access Journals (Sweden)

    Afarin Rahimi-Movaghar

    2014-06-01

    Full Text Available The Iranian Mental Health Survey (IranMHS) was conducted to assess the twelve-month prevalence and severity of psychiatric disorders in the Iranian adult population and to determine the pattern of health care utilization and cost of services. IranMHS is a cross-sectional national household survey with face-to-face interviews as the main data collection method. The study was carried out between January and June 2011. A three-stage probability sampling was applied for the selection of a representative sample from the non-institutionalized population aged 15 to 64. The primary instrument utilized for assessing the prevalence of mental disorders was the Persian version of the Composite International Diagnostic Interview, version 2.1. The instruments for assessing service utilization and the cost of mental illness were developed by the research team. The response rate was 86.2%, and a total of 7886 individuals participated in the study. Sampling weights were the joint product of the inverse probability of unit selection, non-response weights and post-stratification weights. This paper presents an overview of the study design, fieldwork organization and procedures, weightings and analysis. The strengths and limitations of the study are also discussed.

  8. Can Weighting Compensate for Sampling Issues in Internet Surveys?

    NARCIS (Netherlands)

    Vaske, J.J.; Jacobs, M.H.; Sijtsma, M.T.J.; Beaman, J.

    2011-01-01

    While Internet surveys have increased in popularity, results may not be representative of target populations. Weighting is commonly used to compensate for sampling issues. This article compared two surveys conducted in the Netherlands—a random mail survey (n = 353) and a convenience Internet survey ...

  9. The Design and Implementation of the 2016 National Survey of Children's Health.

    Science.gov (United States)

    Ghandour, Reem M; Jones, Jessica R; Lebrun-Harris, Lydie A; Minnaert, Jessica; Blumberg, Stephen J; Fields, Jason; Bethell, Christina; Kogan, Michael D

    2018-05-09

    Introduction Since 2001, the Health Resources and Services Administration's Maternal and Child Health Bureau (HRSA MCHB) has funded and directed the National Survey of Children's Health (NSCH) and the National Survey of Children with Special Health Care Needs (NS-CSHCN), unique sources of national and state-level data on child health and health care. Between 2012 and 2015, HRSA MCHB redesigned the surveys, combining content into a single survey, and shifting from a periodic interviewer-assisted telephone survey to an annual self-administered web/paper-based survey utilizing an address-based sampling frame. Methods The U.S. Census Bureau fielded the redesigned NSCH using a random sample of addresses drawn from the Census Master Address File, supplemented with a unique administrative flag to identify households most likely to include children. Data were collected June 2016-February 2017 using a multi-mode design, encouraging web-based responses while allowing for paper mail-in responses. A parent/caregiver knowledgeable about the child's health completed an age-appropriate questionnaire. Experiments on incentives, branding, and contact strategies were conducted. Results Data were released in September 2017. The final sample size was 50,212 children; the overall weighted response rate was 40.7%. Comparison of 2016 estimates to those from previous survey iterations is not appropriate due to sampling and mode changes. Discussion The NSCH remains an invaluable data source for key measures of child health and attendant health care system, family, and community factors. The redesigned survey extended the utility of this resource while seeking a balance between previous strengths and innovations now possible.

  10. The Foundation Supernova Survey: motivation, design, implementation, and first data release

    Science.gov (United States)

    Foley, Ryan J.; Scolnic, Daniel; Rest, Armin; Jha, S. W.; Pan, Y.-C.; Riess, A. G.; Challis, P.; Chambers, K. C.; Coulter, D. A.; Dettman, K. G.; Foley, M. M.; Fox, O. D.; Huber, M. E.; Jones, D. O.; Kilpatrick, C. D.; Kirshner, R. P.; Schultz, A. S. B.; Siebert, M. R.; Flewelling, H. A.; Gibson, B.; Magnier, E. A.; Miller, J. A.; Primak, N.; Smartt, S. J.; Smith, K. W.; Wainscoat, R. J.; Waters, C.; Willman, M.

    2018-03-01

    The Foundation Supernova Survey aims to provide a large, high-fidelity, homogeneous, and precisely calibrated low-redshift Type Ia supernova (SN Ia) sample for cosmology. The calibration of the current low-redshift SN sample is the largest component of systematic uncertainties for SN cosmology, and new data are necessary to make progress. We present the motivation, survey design, observation strategy, implementation, and first results for the Foundation Supernova Survey. We are using the Pan-STARRS telescope to obtain photometry for up to 800 SNe Ia at z ≲ 0.1. This strategy has several unique advantages: (1) the Pan-STARRS system is a superbly calibrated telescopic system, (2) Pan-STARRS has observed 3/4 of the sky in grizyP1 making future template observations unnecessary, (3) we have a well-tested data-reduction pipeline, and (4) we have observed ˜3000 high-redshift SNe Ia on this system. Here, we present our initial sample of 225 SN Ia grizP1 light curves, of which 180 pass all criteria for inclusion in a cosmological sample. The Foundation Supernova Survey already contains more cosmologically useful SNe Ia than all other published low-redshift SN Ia samples combined. We expect that the systematic uncertainties for the Foundation Supernova Sample will be two to three times smaller than other low-redshift samples. We find that our cosmologically useful sample has an intrinsic scatter of 0.111 mag, smaller than other low-redshift samples. We perform detailed simulations showing that simply replacing the current low-redshift SN Ia sample with an equally sized Foundation sample will improve the precision on the dark energy equation-of-state parameter by 35 per cent, and the dark energy figure of merit by 72 per cent.

  11. Manual for the Portable Handheld Neutron Counter (PHNC) for Neutron Survey and the Measurement of Plutonium Samples

    International Nuclear Information System (INIS)

    Menlove, H.O.

    2005-01-01

    We have designed a portable neutron detector for passive neutron scanning measurements and coincidence counting of bulk samples of plutonium. The counter will be used for neutron survey applications as well as for the measurement of plutonium samples in portable applications. The detector uses advanced-design 3He tubes for increased efficiency and battery-operated shift-register electronics. This report describes the hardware, performance, and calibration of the system.

  12. SDSS-IV MaNGA IFS GALAXY SURVEY—SURVEY DESIGN, EXECUTION, AND INITIAL DATA QUALITY

    Energy Technology Data Exchange (ETDEWEB)

    Yan, Renbin; Zhang, Kai [Department of Physics and Astronomy, University of Kentucky, 505 Rose Street, Lexington, KY 40506-0057 (United States); Bundy, Kevin [Kavli IPMU (WPI), UTIAS, The University of Tokyo, Kashiwa, Chiba 277-8583 (Japan); Law, David R. [Space Telescope Science Institute, 3700 San Martin Drive, Baltimore, MD 21218 (United States); Bershady, Matthew A.; Diamond-Stanic, Aleksandar M. [Department of Astronomy, University of Wisconsin-Madison, 475 N. Charter Street, Madison, WI 53706-1582 (United States); Andrews, Brett [Department of Physics and Astronomy and Pittsburgh Particle Physics, Astrophysics and Cosmology Center (PITT PACC), University of Pittsburgh, 3941 O'Hara Street, Pittsburgh, PA 15260 (United States); Cherinka, Brian [Department of Physics and Astronomy, Johns Hopkins University, Bloomberg Center, 3400 N. Charles Street, Baltimore, MD 21218 (United States); Drory, Niv [McDonald Observatory, University of Texas at Austin, 1 University Station, Austin, TX 78712-0259 (United States); MacDonald, Nicholas; Sánchez-Gallego, José R. [Department of Astronomy, Box 351580, University of Washington, Seattle, WA 98195 (United States); Thomas, Daniel; Westfall, Kyle B. [Institute of Cosmology and Gravitation, University of Portsmouth, Portsmouth (United Kingdom); Wake, David A. [Department of Physical Sciences, The Open University, Milton Keynes, MK7 6AA (United Kingdom); Weijmans, Anne-Marie [School of Physics and Astronomy, University of St Andrews, North Haugh, St Andrews KY16 9SS (United Kingdom); Aragón-Salamanca, Alfonso [School of Physics and Astronomy, University of Nottingham, University Park, Nottingham NG7 2RD (United Kingdom); Belfiore, Francesco, E-mail: yanrenbin@uky.edu [Cavendish Laboratory, University of Cambridge, 19 J. J. Thomson Avenue, Cambridge CB3 0HE (United Kingdom); and others

    2016-12-01

    The MaNGA Survey (Mapping Nearby Galaxies at Apache Point Observatory) is one of three core programs in the Sloan Digital Sky Survey IV. It is obtaining integral field spectroscopy for 10,000 nearby galaxies at a spectral resolution of R  ∼ 2000 from 3622 to 10354 Å. The design of the survey is driven by a set of science requirements on the precision of estimates of the following properties: star formation rate surface density, gas metallicity, stellar population age, metallicity, and abundance ratio, and their gradients; stellar and gas kinematics; and enclosed gravitational mass as a function of radius. We describe how these science requirements set the depth of the observations and dictate sample selection. The majority of targeted galaxies are selected to ensure uniform spatial coverage in units of effective radius (R{sub e}) while maximizing spatial resolution. About two-thirds of the sample is covered out to 1.5 R{sub e} (Primary sample), and one-third of the sample is covered to 2.5 R{sub e} (Secondary sample). We describe the survey execution with details that would be useful in the design of similar future surveys. We also present statistics on the achieved data quality, specifically the point-spread function, sampling uniformity, spectral resolution, sky subtraction, and flux calibration. For our Primary sample, the median r -band signal-to-noise ratio is ∼70 per 1.4 Å pixel for spectra stacked between 1 R{sub e} and 1.5 R{sub e}. Measurements of various galaxy properties from the first-year data show that we are meeting or exceeding the defined requirements for the majority of our science goals.

  13. Survey research with a random digit dial national mobile phone sample in Ghana: Methods and sample quality

    Science.gov (United States)

    Sefa, Eunice; Adimazoya, Edward Akolgo; Yartey, Emmanuel; Lenzi, Rachel; Tarpo, Cindy; Heward-Mills, Nii Lante; Lew, Katherine; Ampeh, Yvonne

    2018-01-01

    Introduction Generating a nationally representative sample in low and middle income countries typically requires resource-intensive household level sampling with door-to-door data collection. High mobile phone penetration rates in developing countries provide new opportunities for alternative sampling and data collection methods, but there is limited information about response rates and sample biases in coverage and nonresponse using these methods. We utilized data from an interactive voice response, random-digit dial, national mobile phone survey in Ghana to calculate standardized response rates and assess representativeness of the obtained sample. Materials and methods The survey methodology was piloted in two rounds of data collection. The final survey included 18 demographic, media exposure, and health behavior questions. Call outcomes and response rates were calculated according to the American Association of Public Opinion Research guidelines. Sample characteristics, productivity, and costs per interview were calculated. Representativeness was assessed by comparing data to the Ghana Demographic and Health Survey and the National Population and Housing Census. Results The survey was fielded during a 27-day period in February-March 2017. There were 9,469 completed interviews and 3,547 partial interviews. Response, cooperation, refusal, and contact rates were 31%, 81%, 7%, and 39% respectively. Twenty-three calls were dialed to produce an eligible contact: nonresponse was substantial due to the automated calling system and dialing of many unassigned or non-working numbers. Younger, urban, better educated, and male respondents were overrepresented in the sample. Conclusions The innovative mobile phone data collection methodology yielded a large sample in a relatively short period. Response rates were comparable to other surveys, although substantial coverage bias resulted from fewer women, rural, and older residents completing the mobile phone survey in comparison to household surveys.

  14. Survey research with a random digit dial national mobile phone sample in Ghana: Methods and sample quality.

    Science.gov (United States)

    L'Engle, Kelly; Sefa, Eunice; Adimazoya, Edward Akolgo; Yartey, Emmanuel; Lenzi, Rachel; Tarpo, Cindy; Heward-Mills, Nii Lante; Lew, Katherine; Ampeh, Yvonne

    2018-01-01

    Generating a nationally representative sample in low and middle income countries typically requires resource-intensive household level sampling with door-to-door data collection. High mobile phone penetration rates in developing countries provide new opportunities for alternative sampling and data collection methods, but there is limited information about response rates and sample biases in coverage and nonresponse using these methods. We utilized data from an interactive voice response, random-digit dial, national mobile phone survey in Ghana to calculate standardized response rates and assess representativeness of the obtained sample. The survey methodology was piloted in two rounds of data collection. The final survey included 18 demographic, media exposure, and health behavior questions. Call outcomes and response rates were calculated according to the American Association of Public Opinion Research guidelines. Sample characteristics, productivity, and costs per interview were calculated. Representativeness was assessed by comparing data to the Ghana Demographic and Health Survey and the National Population and Housing Census. The survey was fielded during a 27-day period in February-March 2017. There were 9,469 completed interviews and 3,547 partial interviews. Response, cooperation, refusal, and contact rates were 31%, 81%, 7%, and 39% respectively. Twenty-three calls were dialed to produce an eligible contact: nonresponse was substantial due to the automated calling system and dialing of many unassigned or non-working numbers. Younger, urban, better educated, and male respondents were overrepresented in the sample. The innovative mobile phone data collection methodology yielded a large sample in a relatively short period. Response rates were comparable to other surveys, although substantial coverage bias resulted from fewer women, rural, and older residents completing the mobile phone survey in comparison to household surveys. 

  15. Survey research with a random digit dial national mobile phone sample in Ghana: Methods and sample quality.

    Directory of Open Access Journals (Sweden)

    Kelly L'Engle

    Full Text Available Generating a nationally representative sample in low and middle income countries typically requires resource-intensive household level sampling with door-to-door data collection. High mobile phone penetration rates in developing countries provide new opportunities for alternative sampling and data collection methods, but there is limited information about response rates and sample biases in coverage and nonresponse using these methods. We utilized data from an interactive voice response, random-digit dial, national mobile phone survey in Ghana to calculate standardized response rates and assess representativeness of the obtained sample. The survey methodology was piloted in two rounds of data collection. The final survey included 18 demographic, media exposure, and health behavior questions. Call outcomes and response rates were calculated according to the American Association of Public Opinion Research guidelines. Sample characteristics, productivity, and costs per interview were calculated. Representativeness was assessed by comparing data to the Ghana Demographic and Health Survey and the National Population and Housing Census. The survey was fielded during a 27-day period in February-March 2017. There were 9,469 completed interviews and 3,547 partial interviews. Response, cooperation, refusal, and contact rates were 31%, 81%, 7%, and 39% respectively. Twenty-three calls were dialed to produce an eligible contact: nonresponse was substantial due to the automated calling system and dialing of many unassigned or non-working numbers. Younger, urban, better educated, and male respondents were overrepresented in the sample. The innovative mobile phone data collection methodology yielded a large sample in a relatively short period. Response rates were comparable to other surveys, although substantial coverage bias resulted from fewer women, rural, and older residents completing the mobile phone survey in comparison to household surveys.
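    The outcome rates reported in the Ghana records above follow the AAPOR standard definitions. As a rough illustration of how such rates are derived from tallied call outcomes, here is a minimal Python sketch of simplified AAPOR-style formulas. It collapses the many AAPOR outcome codes into a few buckets, and the call counts in the usage example are invented for illustration, not the survey's actual data.

    ```python
    def aapor_rates(completes, partials, refusals, noncontacts, other, unknown, e=1.0):
        """Simplified AAPOR-style outcome rates from tallied call dispositions.

        e is the estimated share of unknown-eligibility cases that are truly
        eligible (AAPOR's 'e' factor); e=1.0 is the conservative choice.
        """
        denom = completes + partials + refusals + noncontacts + other + e * unknown
        contacted = completes + partials + refusals + other
        response_rate = (completes + partials) / denom         # roughly AAPOR RR2
        cooperation_rate = (completes + partials) / contacted  # roughly AAPOR COOP2
        refusal_rate = refusals / denom                        # roughly AAPOR REF2
        contact_rate = contacted / denom                       # roughly AAPOR CON2
        return response_rate, cooperation_rate, refusal_rate, contact_rate

    # Hypothetical call-outcome tallies (not the Ghana survey's data):
    rr, coop, ref, con = aapor_rates(
        completes=500, partials=100, refusals=60, noncontacts=300, other=40, unknown=0)
    ```

    With these invented tallies the response rate is 0.60 and the contact rate 0.70; the real AAPOR definitions distinguish finer disposition codes, but the arithmetic has this shape.
    
    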

  16. AN EVALUATION OF PRIMARY DATA-COLLECTION MODES IN AN ADDRESS-BASED SAMPLING DESIGN.

    Science.gov (United States)

    Amaya, Ashley; Leclere, Felicia; Carris, Kari; Liao, Youlian

    2015-01-01

    As address-based sampling becomes increasingly popular for multimode surveys, researchers continue to refine data-collection best practices. While much work has been conducted to improve efficiency within a given mode, additional research is needed on how multimode designs can be optimized across modes. Previous research has not evaluated the consequences of mode sequencing on multimode mail and phone surveys, nor has significant research been conducted to evaluate mode sequencing on a variety of indicators beyond response rates. We conducted an experiment within the Racial and Ethnic Approaches to Community Health across the U.S. Risk Factor Survey (REACH U.S.) to evaluate two multimode case-flow designs: (1) phone followed by mail (phone-first) and (2) mail followed by phone (mail-first). We compared response rates, cost, timeliness, and data quality to identify differences across case-flow design. Because surveys often differ on the rarity of the target population, we also examined whether changes in the eligibility rate altered the choice of optimal case flow. Our results suggested that, on most metrics, the mail-first design was superior to the phone-first design. Compared with phone-first, mail-first achieved a higher yield rate at a lower cost with equivalent data quality. While the phone-first design initially achieved more interviews compared to the mail-first design, over time the mail-first design surpassed it and obtained the greatest number of interviews.

  17. The Danish National Health Survey 2010. Study design and respondent characteristics

    DEFF Research Database (Denmark)

    Christensen, Anne Illemann; Ekholm, Ola; Glümer, Charlotte

    2012-01-01

    In 2010 the five Danish regions and the National Institute of Public Health at the University of Southern Denmark conducted a national representative health survey among the adult population in Denmark. This paper describes the study design and the sample and study population as well as the conte...

  18. Human-Robot Site Survey and Sampling for Space Exploration

    Science.gov (United States)

    Fong, Terrence; Bualat, Maria; Edwards, Laurence; Flueckiger, Lorenzo; Kunz, Clayton; Lee, Susan Y.; Park, Eric; To, Vinh; Utz, Hans; Ackner, Nir

    2006-01-01

    NASA is planning to send humans and robots back to the Moon before 2020. In order for extended missions to be productive, high quality maps of lunar terrain and resources are required. Although orbital images can provide much information, many features (local topography, resources, etc) will have to be characterized directly on the surface. To address this need, we are developing a system to perform site survey and sampling. The system includes multiple robots and humans operating in a variety of team configurations, coordinated via peer-to-peer human-robot interaction. In this paper, we present our system design and describe planned field tests.

  19. Design, analysis, and interpretation of field quality-control data for water-sampling projects

    Science.gov (United States)

    Mueller, David K.; Schertz, Terry L.; Martin, Jeffrey D.; Sandstrom, Mark W.

    2015-01-01

    The process of obtaining and analyzing water samples from the environment includes a number of steps that can affect the reported result. The equipment used to collect and filter samples, the bottles used for specific subsamples, any added preservatives, sample storage in the field, and shipment to the laboratory have the potential to affect how accurately samples represent the environment from which they were collected. During the early 1990s, the U.S. Geological Survey implemented policies to include the routine collection of quality-control samples in order to evaluate these effects and to ensure that water-quality data were adequately representing environmental conditions. Since that time, the U.S. Geological Survey Office of Water Quality has provided training in how to design effective field quality-control sampling programs and how to evaluate the resultant quality-control data. This report documents that training material and provides a reference for methods used to analyze quality-control data.

  20. Planetary Sample Caching System Design Options

    Science.gov (United States)

    Collins, Curtis; Younse, Paulo; Backes, Paul

    2009-01-01

    Potential Mars Sample Return missions would aspire to collect small core and regolith samples using a rover with a sample acquisition tool and sample caching system. Samples would need to be stored in individual sealed tubes in a canister that could be transferred to a Mars ascent vehicle and returned to Earth. A sample handling, encapsulation, and containerization system (SHEC) has been developed as part of an integrated system for acquiring and storing core samples for application to future potential MSR and other potential sample return missions. Requirements and design options for the SHEC system were studied and a recommended design concept developed. Two families of solutions were explored: (1) transfer of a raw sample from the tool to the SHEC subsystem and (2) transfer of a tube containing the sample to the SHEC subsystem. The recommended design utilizes sample tool bit changeout as the mechanism for transferring tubes to, and samples in tubes from, the tool. The SHEC subsystem design, called the Bit Changeout Caching (BiCC) design, is intended for operations on a MER-class rover.

  1. What Are Probability Surveys used by the National Aquatic Resource Surveys?

    Science.gov (United States)

    The National Aquatic Resource Surveys (NARS) use probability-survey designs to assess the condition of the nation’s waters. In probability surveys (also known as sample-surveys or statistical surveys), sampling sites are selected randomly.

  2. Northern Marshall Islands radiological survey: sampling and analysis summary

    Energy Technology Data Exchange (ETDEWEB)

    Robison, W.L.; Conrado, C.L.; Eagle, R.J.; Stuart, M.L.

    1981-07-23

    A radiological survey was conducted in the Northern Marshall Islands to document remaining external gamma exposures from nuclear tests conducted at Enewetak and Bikini Atolls. An additional program was later included to obtain terrestrial and marine samples for radiological dose assessment for current or potential atoll inhabitants. This report is the first of a series summarizing the results from the terrestrial and marine surveys. The sample collection and processing procedures and the general survey methodology are discussed; a summary of the collected samples and radionuclide analyses is presented. Over 5400 samples were collected from the 12 atolls and 2 islands and prepared for analysis, including 3093 soil, 961 vegetation, 153 animal, 965 fish composite samples (average of 30 fish per sample), 101 clam, 50 lagoon water, 15 cistern water, 17 groundwater, and 85 lagoon sediment samples. A complete breakdown by sample type, atoll, and island is given here. The total number of analyses by radionuclide is 8840 for 241Am, 6569 for 137Cs, 4535 for 239+240Pu, 4431 for 90Sr, 1146 for 238Pu, 269 for 241Pu, and 114 each for 239Pu and 240Pu. A complete breakdown by sample category, atoll or island, and radionuclide is also included.

  3. Northern Marshall Islands radiological survey: sampling and analysis summary

    International Nuclear Information System (INIS)

    Robison, W.L.; Conrado, C.L.; Eagle, R.J.; Stuart, M.L.

    1981-01-01

    A radiological survey was conducted in the Northern Marshall Islands to document remaining external gamma exposures from nuclear tests conducted at Enewetak and Bikini Atolls. An additional program was later included to obtain terrestrial and marine samples for radiological dose assessment for current or potential atoll inhabitants. This report is the first of a series summarizing the results from the terrestrial and marine surveys. The sample collection and processing procedures and the general survey methodology are discussed; a summary of the collected samples and radionuclide analyses is presented. Over 5400 samples were collected from the 12 atolls and 2 islands and prepared for analysis, including 3093 soil, 961 vegetation, 153 animal, 965 fish composite samples (average of 30 fish per sample), 101 clam, 50 lagoon water, 15 cistern water, 17 groundwater, and 85 lagoon sediment samples. A complete breakdown by sample type, atoll, and island is given here. The total number of analyses by radionuclide is 8840 for 241Am, 6569 for 137Cs, 4535 for 239+240Pu, 4431 for 90Sr, 1146 for 238Pu, 269 for 241Pu, and 114 each for 239Pu and 240Pu. A complete breakdown by sample category, atoll or island, and radionuclide is also included.

  4. Optimizing sampling design to deal with mist-net avoidance in Amazonian birds and bats.

    Directory of Open Access Journals (Sweden)

    João Tiago Marques

    Full Text Available Mist netting is a widely used technique to sample bird and bat assemblages. However, captures often decline with time because animals learn and avoid the locations of nets. This avoidance or net shyness can substantially decrease sampling efficiency. We quantified the day-to-day decline in captures of Amazonian birds and bats with mist nets set at the same location for four consecutive days. We also evaluated how net avoidance influences the efficiency of surveys under different logistic scenarios using re-sampling techniques. Net avoidance caused substantial declines in bird and bat captures, although more accentuated in the latter. Most of the decline occurred between the first and second days of netting: 28% in birds and 47% in bats. Captures of commoner species were more affected. The numbers of species detected also declined. Moving nets daily to minimize the avoidance effect increased captures by 30% in birds and 70% in bats. However, moving the location of nets may cause a reduction in netting time and captures. When moving the nets caused the loss of one netting day, it was no longer advantageous to move the nets frequently; in bird surveys that could even decrease the number of individuals captured and species detected. Net avoidance can greatly affect sampling efficiency, but adjustments in survey design can minimize this. Whenever nets can be moved without losing netting time and the objective is to capture many individuals, they should be moved daily. If the main objective is to survey the species present, then nets should still be moved for bats, but not for birds. However, if relocating nets causes a significant loss of netting time, moving them to reduce effects of shyness will not improve sampling efficiency in either group. Overall, our findings can improve the design of mist-netting sampling strategies in other tropical areas.

  5. Non-response weighting adjustment approach in survey sampling ...

    African Journals Online (AJOL)

    Hence the discussion is illustrated with real examples from surveys (in particular 2003 KDHS) conducted by Central Bureau of Statistics (CBS) - Kenya. Some suggestions are made for improving the quality of non-response weighting. Keywords: Survey non-response; non-response adjustment factors; weighting; sampling ...

  6. A Survey of Archaeological Samples Dated in 1985

    DEFF Research Database (Denmark)

    Mejdahl, Vagn

    1986-01-01

    A survey is given of archaeological samples received for dating in 1985 at the Nordic Laboratory for Thermoluminescence Dating. A total of 66 samples were dated, 42 of which were burnt stones. All results were corrected for short-term fading as measured for samples stored at room temperature...

  7. Surveying immigrants without sampling frames - evaluating the success of alternative field methods.

    Science.gov (United States)

    Reichel, David; Morales, Laura

    2017-01-01

    This paper evaluates the sampling methods of an international survey, the Immigrant Citizens Survey, which aimed at surveying immigrants from outside the European Union (EU) in 15 cities in seven EU countries. In five countries, no sample frame was available for the target population. Consequently, alternative ways to obtain a representative sample had to be found. In three countries 'location sampling' was employed, while in two countries traditional methods were used with adaptations to reach the target population. The paper assesses the main methodological challenges of carrying out a survey among a group of immigrants for whom no sampling frame exists. The samples of the survey in these five countries are compared to results of official statistics in order to assess the accuracy of the samples obtained through the different sampling methods. It can be shown that alternative sampling methods can provide meaningful results in terms of core demographic characteristics although some estimates differ to some extent from the census results.

  8. A Survey of Archaeological Samples Dated in 1984

    DEFF Research Database (Denmark)

    Mejdahl, Vagn

    A survey is given of archaeological samples dated in 1984 at the Nordic Laboratory for Thermoluminescence Dating. A total of 79 samples were dated, 49 of which were burnt stones. All results were corrected for fading as measured for samples stored for four weeks at room temperature. The alpha dose...

  9. An assessment of Lot Quality Assurance Sampling to evaluate malaria outcome indicators: extending malaria indicator surveys.

    Science.gov (United States)

    Biedron, Caitlin; Pagano, Marcello; Hedt, Bethany L; Kilian, Albert; Ratcliffe, Amy; Mabunda, Samuel; Valadez, Joseph J

    2010-02-01

    Large investments and increased global prioritization of malaria prevention and treatment have resulted in greater emphasis on programme monitoring and evaluation (M&E) in many countries. Many countries currently use large multistage cluster sample surveys to monitor malaria outcome indicators on a regional and national level. However, these surveys often mask local-level variability important to programme management. Lot Quality Assurance Sampling (LQAS) has played a valuable role for local-level programme M&E. If incorporated into these larger surveys, it would provide a comprehensive M&E plan at little, if any, extra cost. The Mozambique Ministry of Health conducted a Malaria Indicator Survey (MIS) in June and July 2007. We applied LQAS classification rules to the 345 sampled enumeration areas to demonstrate identifying high- and low-performing areas with respect to two malaria programme indicators: 'household possession of any bednet' and 'household possession of any insecticide-treated bednet (ITN)'. As shown by the MIS, no province in Mozambique achieved the 70% coverage target for household possession of bednets or ITNs. By applying LQAS classification rules to the data, we identified 266 of the 345 enumeration areas as having bednet coverage severely below the 70% target. An additional 73 were identified with low ITN coverage. This article demonstrates the feasibility of integrating LQAS into multistage cluster sampling surveys and using these results to support a comprehensive national, regional, and local programme M&E system. Furthermore, in the recommendations we outline how to integrate the Large Country-LQAS design into macro-surveys while still obtaining results available through current sampling practices.
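    LQAS classification rules of the kind applied here are typically derived from the binomial distribution: an area is classified as reaching the coverage target if at least d of n sampled households have the indicator, with n and d chosen so that both misclassification risks are acceptably small. The sketch below uses the classic n = 19 design with illustrative upper/lower coverage thresholds of 80% and 50% (these are textbook example values, not the thresholds used in the Mozambique analysis).

    ```python
    from math import comb

    def binom_cdf(k, n, p):
        """P(X <= k) for X ~ Binomial(n, p), computed exactly."""
        return sum(comb(n, i) * p**i * (1 - p) ** (n - i) for i in range(k + 1))

    def lqas_risks(n, d, p_low, p_high):
        """Risks of the rule: classify an area as 'reaching target' if >= d successes.

        alpha: P(classify as low | true coverage = p_high)  -- misses a good area
        beta:  P(classify as high | true coverage = p_low)  -- passes a bad area
        """
        alpha = binom_cdf(d - 1, n, p_high)
        beta = 1.0 - binom_cdf(d - 1, n, p_low)
        return alpha, beta

    # Classic n=19 LQAS design with decision rule d=13, thresholds 80% vs 50%:
    alpha, beta = lqas_risks(n=19, d=13, p_low=0.5, p_high=0.8)
    ```

    For this standard design both risks come out below 10%, which is why the n = 19 sample is so widely used for local-level programme classification.
    
    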

  10. The Italian national survey on radon indoors run by several different regional laboratories: Sampling strategy, realization and follow-up

    International Nuclear Information System (INIS)

    Bochicchio, F.; Risica, S.; Piermattei, S.

    1993-01-01

    The paper outlines the criteria and organization adopted by the Italian National Institutions in carrying out a representative national survey to evaluate the distribution of radon concentration and the exposure of the Italian population to natural radiation indoors. The main items of the survey - i.e. sampling design, choice of the sample size (5000 dwellings), organization, analysis of the actual sample structure, questionnaire to collect data about families and their dwellings, experimental set up and communication with the public - are discussed. Some results, concerning a first fraction of the total sample, are also presented. (author). 13 refs, 2 figs, 2 tabs

  11. Stratified sampling design based on data mining.

    Science.gov (United States)

    Kim, Yeonkook J; Oh, Yoonhwan; Park, Sunghoon; Cho, Sungzoon; Park, Hayoung

    2013-09-01

    To explore classification rules based on data mining methodologies which are to be used in defining strata in stratified sampling of healthcare providers with improved sampling efficiency. We performed k-means clustering to group providers with similar characteristics, then, constructed decision trees on cluster labels to generate stratification rules. We assessed the variance explained by the stratification proposed in this study and by conventional stratification to evaluate the performance of the sampling design. We constructed a study database from health insurance claims data and providers' profile data made available to this study by the Health Insurance Review and Assessment Service of South Korea, and population data from Statistics Korea. From our database, we used the data for single specialty clinics or hospitals in two specialties, general surgery and ophthalmology, for the year 2011 in this study. Data mining resulted in five strata in general surgery with two stratification variables, the number of inpatients per specialist and population density of provider location, and five strata in ophthalmology with two stratification variables, the number of inpatients per specialist and number of beds. The percentages of variance in annual changes in the productivity of specialists explained by the stratification in general surgery and ophthalmology were 22% and 8%, respectively, whereas conventional stratification by the type of provider location and number of beds explained 2% and 0.2% of variance, respectively. This study demonstrated that data mining methods can be used in designing efficient stratified sampling with variables readily available to the insurer and government; it offers an alternative to the existing stratification method that is widely used in healthcare provider surveys in South Korea.
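    The approach described (cluster providers on a few variables, then treat cluster membership as strata) can be illustrated with a self-contained toy sketch: a minimal k-means on synthetic two-variable provider data, followed by proportional allocation of a sample across the resulting strata. The variable names and data below are invented for illustration; the study itself used k-means plus decision trees on real claims and profile data.

    ```python
    import random

    def kmeans(points, k, iters=100, seed=0):
        """Minimal k-means on a list of (x, y) tuples; returns a label per point."""
        rng = random.Random(seed)
        centers = rng.sample(points, k)
        labels = [0] * len(points)
        for _ in range(iters):
            # Assign each point to its nearest center (squared Euclidean distance).
            for i, p in enumerate(points):
                labels[i] = min(
                    range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centers[c])),
                )
            # Recompute centers; keep the old center if a cluster empties out.
            for c in range(k):
                members = [p for p, lab in zip(points, labels) if lab == c]
                if members:
                    centers[c] = tuple(sum(v) / len(members) for v in zip(*members))
        return labels

    def proportional_allocation(labels, n_total):
        """Allocate a total sample size across strata in proportion to stratum size."""
        n = len(labels)
        counts = {c: labels.count(c) for c in set(labels)}
        return {c: round(n_total * cnt / n) for c, cnt in counts.items()}

    # Synthetic providers: (inpatients per specialist, beds) -- invented numbers
    # forming two well-separated groups of 50 and 30 providers.
    rng = random.Random(42)
    providers = [(rng.gauss(20, 2), rng.gauss(30, 3)) for _ in range(50)] + \
                [(rng.gauss(80, 2), rng.gauss(200, 5)) for _ in range(30)]
    labels = kmeans(providers, k=2)
    alloc = proportional_allocation(labels, n_total=16)
    ```

    Replacing the hand-picked strata of conventional designs with data-driven clusters like these is the core idea; the study additionally fit decision trees on the cluster labels so the strata could be expressed as simple rules on the stratification variables.
    
    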

  12. Optimal Allocation of Sampling Effort in Depletion Surveys

    Science.gov (United States)

    We consider the problem of designing a depletion or removal survey as part of estimating animal abundance for populations with imperfect capture or detection rates. In a depletion survey, animals are captured from a given area, counted, and withheld from the population. This proc...

  13. Guide to the design and application of online questionnaire surveys.

    Science.gov (United States)

    Regmi, Pramod R; Waithaka, Elizabeth; Paudyal, Anjana; Simkhada, Padam; van Teijlingen, Edwin

    2016-12-01

    Collecting research data through traditional approaches (face-to-face, postal, or telephone surveys) can be costly and time consuming. The emerging data collection approach based on internet/e-based technologies (e.g. online platforms and email) is a relatively cost-effective survey alternative. These novel data collection strategies can collect large amounts of data from participants in a short time frame. Similarly, they also seem to be feasible and effective in collecting data on sensitive issues or from samples that are generally hard to reach, for example, men who have sex with men (MSM) or migrants. As a significant proportion of the world's population is now digitally connected, the shift from postal (paper-pencil) or telephone towards online surveys is in the interests of researchers in academia as well as in the commercial world. However, compared to designing and executing a paper version of a questionnaire, there is limited literature to help a starting researcher with the design and use of online questionnaires. This short paper highlights issues around: a) methodological aspects of online questionnaire surveys; b) online survey planning and management; and c) ethical concerns that may arise while using this option. We believe that this paper will be useful for researchers who want to gain knowledge of or apply this approach in their research.

  14. Deep Extragalactic VIsible Legacy Survey (DEVILS): Motivation, Design and Target Catalogue

    Science.gov (United States)

    Davies, L. J. M.; Robotham, A. S. G.; Driver, S. P.; Lagos, C. P.; Cortese, L.; Mannering, E.; Foster, C.; Lidman, C.; Hashemizadeh, A.; Koushan, S.; O'Toole, S.; Baldry, I. K.; Bilicki, M.; Bland-Hawthorn, J.; Bremer, M. N.; Brown, M. J. I.; Bryant, J. J.; Catinella, B.; Croom, S. M.; Grootes, M. W.; Holwerda, B. W.; Jarvis, M. J.; Maddox, N.; Meyer, M.; Moffett, A. J.; Phillipps, S.; Taylor, E. N.; Windhorst, R. A.; Wolf, C.

    2018-06-01

    The Deep Extragalactic VIsible Legacy Survey (DEVILS) is a large spectroscopic campaign at the Anglo-Australian Telescope (AAT) aimed at bridging the near and distant Universe by producing the highest completeness survey of galaxies and groups at intermediate redshifts (0.3 < z < 1.0). Our sample consists of ˜60,000 galaxies to Y<21.2 mag, over ˜6 deg2 in three well-studied deep extragalactic fields (Cosmic Origins Survey field, COSMOS, Extended Chandra Deep Field South, ECDFS and the X-ray Multi-Mirror Mission Large-Scale Structure region, XMM-LSS - all Large Synoptic Survey Telescope deep-drill fields). This paper presents the broad experimental design of DEVILS. Our target sample has been selected from deep Visible and Infrared Survey Telescope for Astronomy (VISTA) Y-band imaging (VISTA Deep Extragalactic Observations, VIDEO and UltraVISTA), with photometry measured by PROFOUND. Photometric star/galaxy separation is done on the basis of NIR colours, and has been validated by visual inspection. To maximise our observing efficiency for faint targets we employ a redshift feedback strategy, which continually updates our target lists, feeding back the results from the previous night's observations. We also present an overview of the initial spectroscopic observations undertaken in late 2017 and early 2018.

  15. Recommended survey designs for occupancy modelling using motion-activated cameras: insights from empirical wildlife data

    Directory of Open Access Journals (Sweden)

    Graeme Shannon

    2014-08-01

Full Text Available Motion-activated cameras are a versatile tool that wildlife biologists can use for sampling wild animal populations to estimate species occurrence. Occupancy modelling provides a flexible framework for the analysis of these data; it explicitly recognizes that, given a species occupies an area, the probability of detecting it is often less than one. Despite the number of studies using camera data in an occupancy framework, there is only limited guidance from the scientific literature about survey design trade-offs when using motion-activated cameras. A fuller understanding of these trade-offs will allow researchers to maximise available resources and determine whether the objectives of a monitoring program or research study are achievable. We use an empirical dataset collected from 40 cameras deployed across 160 km2 of the Western Slope of Colorado, USA to explore how survey effort (number of cameras deployed and the length of the sampling period) affects the accuracy and precision (i.e., error) of the occupancy estimate for ten mammal and three virtual species. We do this using a simulation approach where species occupancy and detection parameters were informed by empirical data from motion-activated cameras. A total of 54 survey designs were considered by varying combinations of sites (10–120 cameras) and occasions (20–120 survey days). Our findings demonstrate that increasing total sampling effort generally decreases error associated with the occupancy estimate, but changing the number of sites or sampling duration can have very different results, depending on whether a species is spatially common or rare (occupancy = ψ) and easy or hard to detect when available (detection probability = p). For rare species with a low probability of detection (i.e., raccoon and spotted skunk) the required survey effort includes maximizing the number of sites and the number of survey days, often to a level that may be logistically unrealistic for many studies. For common
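The occupancy/detection trade-off described above can be sketched numerically: for an occupied site surveyed on k independent occasions with per-occasion detection probability p, the chance of at least one detection is 1 − (1 − p)^k. A minimal sketch with illustrative parameter values (not the paper's estimates):

```python
def p_detect_at_least_once(p, k):
    """P(at least one detection | site occupied) over k survey occasions."""
    return 1.0 - (1.0 - p) ** k

# A hard-to-detect species (p = 0.05) needs far more survey days than an
# easy one (p = 0.4) to reach the same certainty of detection.
for p in (0.05, 0.4):
    for k in (20, 60, 120):
        print(f"p={p}, k={k:3d}: {p_detect_at_least_once(p, k):.3f}")
```

This is why the abstract notes that rare, hard-to-detect species may demand logistically unrealistic effort: the detection probability compounds slowly when p is small.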

  16. A comprehensive survey of brain interface technology designs.

    Science.gov (United States)

    Mason, S G; Bashashati, A; Fatourechi, M; Navarro, K F; Birch, G E

    2007-02-01

    In this work we present the first comprehensive survey of Brain Interface (BI) technology designs published prior to January 2006. Detailed results from this survey, which was based on the Brain Interface Design Framework proposed by Mason and Birch, are presented and discussed to address the following research questions: (1) which BI technologies are directly comparable, (2) what technology designs exist, (3) which application areas (users, activities and environments) have been targeted in these designs, (4) which design approaches have received little or no research and are possible opportunities for new technology, and (5) how well are designs reported. The results of this work demonstrate that meta-analysis of high-level BI design attributes is possible and informative. The survey also produced a valuable, historical cross-reference where BI technology designers can identify what types of technology have been proposed and by whom.

  17. Analysing designed experiments in distance sampling

    Science.gov (United States)

    Stephen T. Buckland; Robin E. Russell; Brett G. Dickson; Victoria A. Saab; Donal N. Gorman; William M. Block

    2009-01-01

    Distance sampling is a survey technique for estimating the abundance or density of wild animal populations. Detection probabilities of animals inherently differ by species, age class, habitats, or sex. By incorporating the change in an observer's ability to detect a particular class of animals as a function of distance, distance sampling leads to density estimates...

  18. Mixing modes in a population-based interview survey: comparison of a sequential and a concurrent mixed-mode design for public health research.

    Science.gov (United States)

    Mauz, Elvira; von der Lippe, Elena; Allen, Jennifer; Schilling, Ralph; Müters, Stephan; Hoebel, Jens; Schmich, Patrick; Wetzstein, Matthias; Kamtsiuris, Panagiotis; Lange, Cornelia

    2018-01-01

Population-based surveys currently face the problem of decreasing response rates. Mixed-mode designs are now being implemented more often to account for this, to improve sample composition and to reduce overall costs. This study examines whether a concurrent or sequential mixed-mode design achieves better results on a number of indicators of survey quality. Data were obtained from a population-based health interview survey of adults in Germany that was conducted as a methodological pilot study as part of the German Health Update (GEDA). Participants were randomly allocated to one of two surveys; each of the surveys had a different design. In the concurrent mixed-mode design (n = 617) two types of self-administered questionnaires (SAQ-Web and SAQ-Paper) and computer-assisted telephone interviewing were offered simultaneously to the respondents along with the invitation to participate. In the sequential mixed-mode design (n = 561), SAQ-Web was initially provided, followed by SAQ-Paper, with an option for a telephone interview being sent out together with the reminders at a later date. Finally, this study compared the response rates, sample composition, health indicators, item non-response, the scope of fieldwork and the costs of both designs. No systematic differences were identified between the two mixed-mode designs in terms of response rates, the socio-demographic characteristics of the achieved samples, or the prevalence rates of the health indicators under study. The sequential design gained a higher rate of online respondents. Very few telephone interviews were conducted for either design. With regard to data quality, the sequential design (which had more online respondents) showed less item non-response. There were minor differences between the designs in terms of their costs. Postage and printing costs were lower in the concurrent design, but labour costs were lower in the sequential design. No differences in health indicators were found between

  19. A Survey of Blue-Noise Sampling and Its Applications

    KAUST Repository

    Yan, Dongming; Guo, Jian-Wei; Wang, Bin; Zhang, Xiao-Peng; Wonka, Peter

    2015-01-01

    In this paper, we survey recent approaches to blue-noise sampling and discuss their beneficial applications. We discuss the sampling algorithms that use points as sampling primitives and classify the sampling algorithms based on various aspects, e.g., the sampling domain and the type of algorithm. We demonstrate several well-known applications that can be improved by recent blue-noise sampling techniques, as well as some new applications such as dynamic sampling and blue-noise remeshing.

  20. A Survey of Blue-Noise Sampling and Its Applications

    KAUST Repository

    Yan, Dongming

    2015-05-05

    In this paper, we survey recent approaches to blue-noise sampling and discuss their beneficial applications. We discuss the sampling algorithms that use points as sampling primitives and classify the sampling algorithms based on various aspects, e.g., the sampling domain and the type of algorithm. We demonstrate several well-known applications that can be improved by recent blue-noise sampling techniques, as well as some new applications such as dynamic sampling and blue-noise remeshing.

  1. A Study on the Representative Sampling Survey for Radionuclide Analysis of RI Waste

    Energy Technology Data Exchange (ETDEWEB)

    Jee, K. Y. [KAERI, Daejeon (Korea, Republic of); Kim, Juyoul; Jung, Gunhyo [FNC Tech. Co., Daejeon (Korea, Republic of)

    2007-07-15

We developed a quantitative method for obtaining a representative sample during a sampling survey of RI waste. Considering the source, process, and type of RI waste, the method computes the number of samples, the confidence interval, the variance, and the coefficient of variation. We also systematized the sampling survey method logically and quantitatively. The results of this study can be applied to sampling surveys of low- and intermediate-level waste generated from nuclear power plants during the transfer process to a disposal facility.
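The quantities the paper mentions (number of samples, confidence interval, coefficient of variation) are linked by a standard sample-size calculation. A sketch under the normal-approximation assumption, with a finite population correction for a bounded lot; the values and the lot size are illustrative, not the KAERI procedure itself:

```python
import math

def required_sample_size(cv, rel_error, z=1.96, population=None):
    """Samples needed so the relative margin of error of the mean is at
    most rel_error at the given z (1.96 ~ 95% confidence); cv is the
    anticipated coefficient of variation (s / mean).  Applies a finite
    population correction when a lot size is given."""
    n0 = (z * cv / rel_error) ** 2
    if population is not None:
        n0 = n0 / (1.0 + (n0 - 1.0) / population)
    return math.ceil(n0)

# e.g. anticipated CV of 30%, target 10% relative error:
print(required_sample_size(0.30, 0.10))                  # 35
print(required_sample_size(0.30, 0.10, population=200))  # 30 (200-drum lot)
```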

  2. Optical Design for a Survey X-Ray Telescope

    Science.gov (United States)

    Saha, Timo T.; Zhang, William W.; McClelland, Ryan S.

    2014-01-01

Optical design trades are underway at the Goddard Space Flight Center to define a telescope for an x-ray survey mission. Top-level science objectives of the mission include the study of x-ray transients, surveying and long-term monitoring of compact objects in nearby galaxies, as well as both deep and wide-field x-ray surveys. In this paper we consider Wolter, Wolter-Schwarzschild, and modified Wolter-Schwarzschild telescope designs as basic building blocks for the tightly nested survey telescope. Design principles and dominating aberrations of individual telescopes and nested telescopes are discussed, and we compare the off-axis optical performance at 1.0 keV and 4.0 keV across a 1.0-degree full field of view.

  3. The SDSS-IV MaNGA Sample: Design, Optimization, and Usage Considerations

    Science.gov (United States)

    Wake, David A.; Bundy, Kevin; Diamond-Stanic, Aleksandar M.; Yan, Renbin; Blanton, Michael R.; Bershady, Matthew A.; Sánchez-Gallego, José R.; Drory, Niv; Jones, Amy; Kauffmann, Guinevere; Law, David R.; Li, Cheng; MacDonald, Nicholas; Masters, Karen; Thomas, Daniel; Tinker, Jeremy; Weijmans, Anne-Marie; Brownstein, Joel R.

    2017-09-01

We describe the sample design for the SDSS-IV MaNGA survey and present the final properties of the main samples along with important considerations for using these samples for science. Our target selection criteria were developed while simultaneously optimizing the size distribution of the MaNGA integral field units (IFUs), the IFU allocation strategy, and the target density to produce a survey defined in terms of maximizing signal-to-noise ratio, spatial resolution, and sample size. Our selection strategy makes use of redshift limits that only depend on I-band absolute magnitude (M_I), or, for a small subset of our sample, M_I and color (NUV - I). Such a strategy ensures that all galaxies span the same range in angular size irrespective of luminosity and are therefore covered evenly by the adopted range of IFU sizes. We define three samples: the Primary and Secondary samples are selected to have a flat number density with respect to M_I and are targeted to have spectroscopic coverage to 1.5 and 2.5 effective radii (R_e), respectively. The Color-Enhanced supplement increases the number of galaxies in the low-density regions of color-magnitude space by extending the redshift limits of the Primary sample in the appropriate color bins. The samples cover the stellar mass range 5 × 10⁸ ≤ M* ≤ 3 × 10¹¹ M⊙ h⁻² and are sampled at median physical resolutions of 1.37 and 2.5 kpc for the Primary and Secondary samples, respectively. We provide weights that will statistically correct for our luminosity and color-dependent selection function and IFU allocation strategy, thus correcting the observed sample to a volume-limited sample.

  4. Amostras complexas em inquéritos populacionais: planejamento e implicações na análise estatística dos dados Complex Sampling Design in Population Surveys: Planning and effects on statistical data analysis

    Directory of Open Access Journals (Sweden)

    Célia Landmann Szwarcwald

    2008-05-01

health status of the population and satisfaction with healthcare from the user's point of view. Most national health surveys do not use simple random sampling, either because of budget restrictions or because of time constraints associated with data collection. In general, a combination of several probabilistic sampling methods is used to select a representative sample of the population; this is called a complex sampling design. Among the several sampling techniques, the most frequently used are simple random sampling, stratified sampling, and cluster sampling. As a result of this process, the next concern is the statistical analysis of data from complex samples. This paper deals with issues related to the analysis of data obtained from surveys using complex sampling designs. It discusses the problems that arise when the statistical analysis does not incorporate the sampling design. When the design is neglected, traditional statistical analysis, based on the assumption of simple random sampling, can produce improper results not only for the mean estimates but also for the standard errors, thus compromising results, hypothesis testing, and survey conclusions. The World Health Survey (WHS), carried out in Brazil in 2003, is used to exemplify complex sampling methods.
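A toy example of the problem the abstract describes: when strata are sampled at different rates, an unweighted estimate is biased, while weighting each respondent by the inverse of their selection probability recovers the population value. The numbers below are synthetic, not WHS data:

```python
# Stratum A: 9,000 people, true prevalence 10%; 100 sampled (weight 90 each).
# Stratum B: 1,000 people, true prevalence 50%; 100 sampled (weight 10 each).
y = [1] * 10 + [0] * 90 + [1] * 50 + [0] * 50   # A's 100 cases, then B's 100
w = [90] * 100 + [10] * 100

unweighted = sum(y) / len(y)
weighted = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
print(unweighted)  # 0.3  -- biased: treats the sample as simple random
print(weighted)    # 0.14 -- recovers the true population prevalence
```

The same logic applies to standard errors: design-based variance estimators must account for the weights, stratification, and clustering, which is what survey-analysis software does internally.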

  5. A survey of archaeological samples dated in 1984

    International Nuclear Information System (INIS)

    Mejdahl, V.

    1985-10-01

A survey is given of archaeological samples dated in 1984 at the Nordic Laboratory for Thermoluminescence Dating. A total of 79 samples were dated, 49 of which were burnt stones. All results were corrected for fading as measured for samples stored for four weeks at room temperature. The alpha dose contribution from uranium in the quartz and feldspar grains was included assuming an alpha efficiency factor of 0.1 for quartz and 0.2 for feldspars. (author)

  6. Balanced sampling

    NARCIS (Netherlands)

    Brus, D.J.

    2015-01-01

    In balanced sampling a linear relation between the soil property of interest and one or more covariates with known means is exploited in selecting the sampling locations. Recent developments make this sampling design attractive for statistical soil surveys. This paper introduces balanced sampling
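The balancing idea can be sketched in miniature: choose the sample so that the sample mean of a known covariate matches the population mean. Real balanced sampling designs (e.g. the cube method) achieve this by randomized selection under fixed inclusion probabilities; the exhaustive search below, over made-up covariate values, is purely illustrative:

```python
import itertools

population = [2.0, 3.5, 5.0, 6.5, 8.0, 9.5, 11.0, 12.5]   # known covariate
pop_mean = sum(population) / len(population)

# Exhaustively pick the size-4 sample whose covariate mean best matches
# the known population mean (feasible only for tiny populations).
best = min(itertools.combinations(range(len(population)), 4),
           key=lambda s: abs(sum(population[i] for i in s) / 4 - pop_mean))

sample_mean = sum(population[i] for i in best) / 4
print(pop_mean, sample_mean)  # both 7.25: this sample is exactly balanced
```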

  7. Precision, time, and cost: a comparison of three sampling designs in an emergency setting

    Science.gov (United States)

    Deitchler, Megan; Deconinck, Hedwig; Bergeron, Gilles

    2008-01-01

    The conventional method to collect data on the health, nutrition, and food security status of a population affected by an emergency is a 30 × 30 cluster survey. This sampling method can be time and resource intensive and, accordingly, may not be the most appropriate one when data are needed rapidly for decision making. In this study, we compare the precision, time and cost of the 30 × 30 cluster survey with two alternative sampling designs: a 33 × 6 cluster design (33 clusters, 6 observations per cluster) and a 67 × 3 cluster design (67 clusters, 3 observations per cluster). Data for each sampling design were collected concurrently in West Darfur, Sudan in September-October 2005 in an emergency setting. Results of the study show the 30 × 30 design to provide more precise results (i.e. narrower 95% confidence intervals) than the 33 × 6 and 67 × 3 design for most child-level indicators. Exceptions are indicators of immunization and vitamin A capsule supplementation coverage which show a high intra-cluster correlation. Although the 33 × 6 and 67 × 3 designs provide wider confidence intervals than the 30 × 30 design for child anthropometric indicators, the 33 × 6 and 67 × 3 designs provide the opportunity to conduct a LQAS hypothesis test to detect whether or not a critical threshold of global acute malnutrition prevalence has been exceeded, whereas the 30 × 30 design does not. For the household-level indicators tested in this study, the 67 × 3 design provides the most precise results. However, our results show that neither the 33 × 6 nor the 67 × 3 design are appropriate for assessing indicators of mortality. In this field application, data collection for the 33 × 6 and 67 × 3 designs required substantially less time and cost than that required for the 30 × 30 design. The findings of this study suggest the 33 × 6 and 67 × 3 designs can provide useful time- and resource-saving alternatives to the 30 × 30 method of data collection in emergency
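The precision differences among the three designs follow from the Kish design effect, DEFF = 1 + (m − 1)ρ, where m is the number of observations per cluster and ρ the intra-cluster correlation. A sketch comparing effective sample sizes for the three designs under a hypothetical ρ (not estimated from the Darfur data; indicators such as immunization coverage can have a much higher ρ):

```python
def design_effect(m, icc):
    """Kish design effect for equal cluster size m: 1 + (m - 1) * icc."""
    return 1.0 + (m - 1.0) * icc

def effective_sample_size(clusters, per_cluster, icc):
    """Nominal n deflated by the design effect."""
    return clusters * per_cluster / design_effect(per_cluster, icc)

for c, m in ((30, 30), (33, 6), (67, 3)):
    n = c * m
    eff = effective_sample_size(c, m, 0.10)
    print(f"{c} x {m}: n = {n}, effective n = {eff:.0f}")
```

With ρ = 0.10, the 30 × 30 design retains the largest effective sample but pays a steep penalty for its big clusters (DEFF = 3.9), which is consistent with the paper's finding that smaller-cluster designs trade modest precision losses for large savings in time and cost.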

  8. Precision, time, and cost: a comparison of three sampling designs in an emergency setting

    Directory of Open Access Journals (Sweden)

    Deconinck Hedwig

    2008-05-01

Full Text Available Abstract The conventional method to collect data on the health, nutrition, and food security status of a population affected by an emergency is a 30 × 30 cluster survey. This sampling method can be time and resource intensive and, accordingly, may not be the most appropriate one when data are needed rapidly for decision making. In this study, we compare the precision, time and cost of the 30 × 30 cluster survey with two alternative sampling designs: a 33 × 6 cluster design (33 clusters, 6 observations per cluster) and a 67 × 3 cluster design (67 clusters, 3 observations per cluster). Data for each sampling design were collected concurrently in West Darfur, Sudan in September-October 2005 in an emergency setting. Results of the study show the 30 × 30 design to provide more precise results (i.e. narrower 95% confidence intervals) than the 33 × 6 and 67 × 3 design for most child-level indicators. Exceptions are indicators of immunization and vitamin A capsule supplementation coverage which show a high intra-cluster correlation. Although the 33 × 6 and 67 × 3 designs provide wider confidence intervals than the 30 × 30 design for child anthropometric indicators, the 33 × 6 and 67 × 3 designs provide the opportunity to conduct a LQAS hypothesis test to detect whether or not a critical threshold of global acute malnutrition prevalence has been exceeded, whereas the 30 × 30 design does not. For the household-level indicators tested in this study, the 67 × 3 design provides the most precise results. However, our results show that neither the 33 × 6 nor the 67 × 3 design are appropriate for assessing indicators of mortality. In this field application, data collection for the 33 × 6 and 67 × 3 designs required substantially less time and cost than that required for the 30 × 30 design. The findings of this study suggest the 33 × 6 and 67 × 3 designs can provide useful time- and resource-saving alternatives to the 30 × 30 method of data

  9. A two-phase sampling survey for nonresponse and its paradata to correct nonresponse bias in a health surveillance survey.

    Science.gov (United States)

    Santin, G; Bénézet, L; Geoffroy-Perez, B; Bouyer, J; Guéguen, A

    2017-02-01

The decline in participation rates in surveys, including epidemiological surveillance surveys, has become a real concern, since it may increase nonresponse bias. The aim of this study was to estimate the contribution of a complementary survey among a subsample of nonrespondents, and the additional contribution of paradata, in correcting for nonresponse bias in an occupational health surveillance survey. In 2010, 10,000 workers were randomly selected and sent a postal questionnaire. Sociodemographic data were available for the whole sample. After data collection of the questionnaires, a complementary survey among a random subsample of 500 nonrespondents was performed using a questionnaire administered by an interviewer. Paradata were collected for the complete subsample of the complementary survey. Nonresponse bias in the initial sample and in the combined samples was assessed using variables from administrative databases available for the whole sample, which are not subject to differential measurement errors. Corrected prevalences were estimated by a reweighting technique, first using the initial survey alone and then the initial and complementary surveys combined, under several assumptions regarding the missing-data process. Results were compared by computing relative errors. The response rates of the initial and complementary surveys were 23.6% and 62.6%, respectively. For both the initial and the combined surveys, the relative errors decreased after correction for nonresponse on sociodemographic variables. For the combined surveys without paradata, relative errors decreased compared with the initial survey. The contribution of the paradata was weak. When a complex descriptive survey has a low response rate, a short complementary survey among nonrespondents, with a protocol that aims to maximize the response rate, is useful. The contribution of sociodemographic variables in correcting for nonresponse bias is important, whereas the additional contribution of paradata in
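Weighting-class adjustment is one standard reweighting technique of the kind the study applies: within each class defined by auxiliary variables, respondents' base weights are inflated by the inverse of the class's response rate. A minimal sketch with hypothetical classes and weights (the study's actual adjustment model is richer):

```python
from collections import defaultdict

def nonresponse_adjusted_weights(records):
    """records: (weighting_class, base_weight, responded) triples.
    Within each class, inflate respondents' base weights by the inverse
    of the class's weighted response rate; nonrespondents get 0."""
    total = defaultdict(float)
    responded = defaultdict(float)
    for cls, w, r in records:
        total[cls] += w
        if r:
            responded[cls] += w
    return [w * total[cls] / responded[cls] if r else 0.0
            for cls, w, r in records]

# Hypothetical weighting classes (e.g. occupational groups):
recs = [("manual", 1.0, True), ("manual", 1.0, False),
        ("office", 1.0, True), ("office", 1.0, True)]
print(nonresponse_adjusted_weights(recs))  # [2.0, 0.0, 1.0, 1.0]
```

The single responding "manual" worker now represents both members of that class, so the weighted totals per class are preserved.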

  10. Responsive survey design, demographic data collection, and models of demographic behavior.

    Science.gov (United States)

    Axinn, William G; Link, Cynthia F; Groves, Robert M

    2011-08-01

To address declining response rates and rising data-collection costs, survey methodologists have devised new techniques for using process data ("paradata") to address nonresponse by altering the survey design dynamically during data collection. We investigate the substantive consequences of responsive survey design: tools that use paradata to improve the representative qualities of surveys and control costs. By improving representation of reluctant respondents, responsive design can change our understanding of the topic being studied. Using the National Survey of Family Growth Cycle 6, we illustrate how responsive survey design can shape both demographic estimates and models of demographic behaviors based on survey data. By juxtaposing measures from regular and responsive data-collection phases, we document how special efforts to interview reluctant respondents may affect demographic estimates. Results demonstrate the potential of responsive survey design to change the quality of demographic research based on survey data.

  11. [New design of the Health Survey of Catalonia (Spain, 2010-2014): a step forward in health planning and evaluation].

    Science.gov (United States)

    Alcañiz-Zanón, Manuela; Mompart-Penina, Anna; Guillén-Estany, Montserrat; Medina-Bustos, Antonia; Aragay-Barbany, Josep M; Brugulat-Guiteras, Pilar; Tresserras-Gaju, Ricard

    2014-01-01

    This article presents the genesis of the Health Survey of Catalonia (Spain, 2010-2014) with its semiannual subsamples and explains the basic characteristics of its multistage sampling design. In comparison with previous surveys, the organizational advantages of this new statistical operation include rapid data availability and the ability to continuously monitor the population. The main benefits are timeliness in the production of indicators and the possibility of introducing new topics through the supplemental questionnaire as a function of needs. Limitations consist of the complexity of the sample design and the lack of longitudinal follow-up of the sample. Suitable sampling weights for each specific subsample are necessary for any statistical analysis of micro-data. Accuracy in the analysis of territorial disaggregation or population subgroups increases if annual samples are accumulated. Copyright © 2013 SESPAS. Published by Elsevier Espana. All rights reserved.

  12. Sample Archaeological Survey of Public Use Areas, Milford Lake, Kansas

    Science.gov (United States)

    1982-09-01

[Scanned report; only fragments of the record are legible.] Sample Archaeological Survey of Public Use Areas, Milford Lake, Kansas. By Laura S. Schwiekhard. Legible fragments of the abstract mention ceramics; Middle Mississippian, Middle Woodland, and Central Plains archaeology; and the engineering and building technology of the Maya.

  13. Designing web surveys for the multi-device internet

    NARCIS (Netherlands)

    de Bruijne, M.A.

    2015-01-01

    The rise of the mobile internet has rapidly changed the landscape for fielding web surveys. The devices that respondents use to take a web survey vary greatly in size and user interface. This diversity in the interaction between survey and respondent makes it challenging to design a web survey for

  14. Like a virgin (mother): analysis of data from a longitudinal, US population representative sample survey

    OpenAIRE

    Herring, Amy H; Attard, Samantha M; Gordon-Larsen, Penny; Joyner, William H; Halpern, Carolyn T

    2013-01-01

    Objective To estimate the incidence of self report of pregnancy without sexual intercourse (virgin pregnancy) and factors related to such reporting, in a population representative group of US adolescents and young adults. Design Longitudinal, population representative sample survey. Setting Nationally representative, multiethnic National Longitudinal Study of Adolescent Health, United States. Participants 7870 women enrolled at wave I (1995) and completing the most recent wave of data collect...

  15. A survey of archaeological samples dated in 1987

    International Nuclear Information System (INIS)

    Mejdahl, V.

    1988-10-01

    A survey is given of archaeological samples dated in 1987 at the Nordic Laboratory for Thermoluminescence Dating. A total of 74 samples were dated. The results were corrected for short-term fading of feldspars as measured for samples stored at room temperature for four weeks or at 100 deg. C for two weeks. The beta dose from potassium and rubidium in feldspar, and the alpha dose from uranium and thorium in quartz and feldspar were included, assuming alpha efficiency factors of 0.1 for quartz and 0.2 for feldspar. (author) 20 tabs., 29 refs

  16. A survey of archaeological samples dated in 1985

    International Nuclear Information System (INIS)

    Mejdahl, V.

    1986-11-01

A survey is given of archaeological samples received for dating in 1985 at the Nordic Laboratory for Thermoluminescence Dating. A total of 66 samples were dated, 42 of which were burnt stones. All results were corrected for short-term fading as measured for samples stored at room temperature for four weeks. The beta dose from potassium and rubidium in feldspar and the alpha dose from uranium and thorium in quartz and feldspar were included, assuming alpha efficiency factors of 0.1 for quartz and 0.2 for feldspar. (author)

  17. A survey of archaeological samples dated in 1986

    International Nuclear Information System (INIS)

    Mejdahl, V.

    1987-10-01

A survey is given of archaeological samples dated in 1986 at the Nordic Laboratory for Thermoluminescence Dating. A total of 56 samples were dated. The results were corrected for short-term fading as measured for samples stored for four weeks at room temperature or at 100 deg. C. The beta dose from potassium and rubidium in feldspar and the alpha dose from uranium and thorium in quartz and feldspar grains were included, assuming alpha efficiency factors of 0.1 and 0.2 for quartz and feldspar, respectively. 21 refs. (author)

  18. Tackling the Survey: A Learning-by-Induction Design

    Science.gov (United States)

    Witte, Anne E.

    2017-01-01

    Free online survey tools provide a practical learning-by-induction platform for business communication instructors interested in trying out an advanced multidisciplinary survey activity coupled with an innovative teaching design. More than just building skills in marketing, survey projects marshal a wider set of thinking and doing activities that…

  19. Are quantitative trait-dependent sampling designs cost-effective for analysis of rare and common variants?

    Science.gov (United States)

    Yilmaz, Yildiz E; Bull, Shelley B

    2011-11-29

    Use of trait-dependent sampling designs in whole-genome association studies of sequence data can reduce total sequencing costs with modest losses of statistical efficiency. In a quantitative trait (QT) analysis of data from the Genetic Analysis Workshop 17 mini-exome for unrelated individuals in the Asian subpopulation, we investigate alternative designs that sequence only 50% of the entire cohort. In addition to a simple random sampling design, we consider extreme-phenotype designs that are of increasing interest in genetic association analysis of QTs, especially in studies concerned with the detection of rare genetic variants. We also evaluate a novel sampling design in which all individuals have a nonzero probability of being selected into the sample but in which individuals with extreme phenotypes have a proportionately larger probability. We take differential sampling of individuals with informative trait values into account by inverse probability weighting using standard survey methods which thus generalizes to the source population. In replicate 1 data, we applied the designs in association analysis of Q1 with both rare and common variants in the FLT1 gene, based on knowledge of the generating model. Using all 200 replicate data sets, we similarly analyzed Q1 and Q4 (which is known to be free of association with FLT1) to evaluate relative efficiency, type I error, and power. Simulation study results suggest that the QT-dependent selection designs generally yield greater than 50% relative efficiency compared to using the entire cohort, implying cost-effectiveness of 50% sample selection and worthwhile reduction of sequencing costs.
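The inverse probability weighting step the abstract describes can be illustrated with a toy realized sample: each sequenced individual carries the inclusion probability under which they were drawn, and a Horvitz-Thompson-style weighted mean downweights the deliberately oversampled extremes. All values below are hypothetical:

```python
# Realized sample: (phenotype value, inclusion probability it was drawn with).
# Extreme-phenotype individuals were sampled with probability 0.9, the
# rest with probability 0.4 (made-up numbers for illustration).
sample = [(2.1, 0.9), (1.8, 0.9), (-0.2, 0.4), (0.1, 0.4), (-0.4, 0.4)]

ht_mean = sum(y / p for y, p in sample) / sum(1.0 / p for _, p in sample)
naive_mean = sum(y for y, _ in sample) / len(sample)

print(round(naive_mean, 3))  # 0.68  -- inflated by the oversampled extremes
print(round(ht_mean, 3))     # 0.317 -- each unit weighted by 1/probability
```

Because every individual has a nonzero inclusion probability, the weighted estimate generalizes to the source cohort, which is the property the authors exploit.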

  20. Sample design effects in landscape genetics

    Science.gov (United States)

    Oyler-McCance, Sara J.; Fedy, Bradley C.; Landguth, Erin L.

    2012-01-01

An important research gap in landscape genetics is the impact of different field sampling designs on the ability to detect the effects of landscape pattern on gene flow. We evaluated how five different sampling regimes (random, linear, systematic, cluster, and single study site) affected the probability of correctly identifying the generating landscape process of population structure. Sampling regimes were chosen to represent a suite of designs common in field studies. We used genetic data generated from a spatially explicit, individual-based program and simulated gene flow in a continuous population across a landscape with gradual spatial changes in resistance to movement. Additionally, we evaluated the sampling regimes using realistic and obtainable numbers of loci (10 and 20), numbers of alleles per locus (5 and 10), numbers of individuals sampled (10-300), and generational times after the landscape was introduced (20 and 400). For a simulated continuously distributed species, we found that the random, linear, and systematic sampling regimes performed well with high sample sizes (>200), levels of polymorphism (10 alleles per locus), and numbers of molecular markers (20). The cluster and single study site sampling regimes were not able to correctly identify the generating process under any conditions and thus are not advisable strategies for scenarios similar to our simulations. Our research emphasizes the importance of sampling data at ecologically appropriate spatial and temporal scales and suggests careful consideration of sampling near landscape components that are likely to most influence the genetic structure of the species. In addition, simulating sampling designs a priori could help guide field data collection efforts.
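The sampling regimes compared above can be generated as coordinate sets for this kind of a priori simulation. A sketch of four of them over a hypothetical square landscape (the single-study-site regime is simply a cluster design with one cluster); all dimensions and spread parameters are illustrative:

```python
import math
import random

random.seed(0)

def random_design(n, size=100.0):
    """n points placed uniformly at random over a size x size landscape."""
    return [(random.uniform(0, size), random.uniform(0, size)) for _ in range(n)]

def linear_design(n, size=100.0):
    """n points evenly spaced along a transect crossing the landscape."""
    step = size / (n - 1)
    return [(i * step, size / 2.0) for i in range(n)]

def systematic_design(n, size=100.0):
    """First n points of a regular grid covering the landscape."""
    side = math.ceil(math.sqrt(n))
    step = size / side
    return [(step / 2 + i * step, step / 2 + j * step)
            for i in range(side) for j in range(side)][:n]

def cluster_design(n, n_clusters=5, spread=3.0, size=100.0):
    """n points scattered (Gaussian) around a few random cluster centres."""
    centres = random_design(n_clusters, size)
    return [(cx + random.gauss(0, spread), cy + random.gauss(0, spread))
            for k in range(n)
            for cx, cy in (centres[k % n_clusters],)]
```

Feeding each coordinate set into a gene-flow simulator and an assignment analysis would reproduce the kind of regime comparison the paper performs.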

  1. Experimental and Sampling Design for the INL-2 Sample Collection Operational Test

    Energy Technology Data Exchange (ETDEWEB)

    Piepel, Gregory F.; Amidan, Brett G.; Matzke, Brett D.

    2009-02-16

    This report describes the experimental and sampling design developed to assess sampling approaches and methods for detecting contamination in a building and clearing the building for use after decontamination. An Idaho National Laboratory (INL) building will be contaminated with BG (Bacillus globigii, renamed Bacillus atrophaeus), a simulant for Bacillus anthracis (BA). The contamination, sampling, decontamination, and re-sampling will occur per the experimental and sampling design. This INL-2 Sample Collection Operational Test is being planned by the Validated Sampling Plan Working Group (VSPWG). The primary objectives are: 1) Evaluate judgmental and probabilistic sampling for characterization as well as probabilistic and combined (judgment and probabilistic) sampling approaches for clearance, 2) Conduct these evaluations for gradient contamination (from low or moderate down to absent or undetectable) for different initial concentrations of the contaminant, 3) Explore judgment composite sampling approaches to reduce sample numbers, 4) Collect baseline data to serve as an indication of the actual levels of contamination in the tests. A combined judgmental and random (CJR) approach uses Bayesian methodology to combine judgmental and probabilistic samples to make clearance statements of the form "X% confidence that at least Y% of an area does not contain detectable contamination" (X%/Y% clearance statements). The INL-2 experimental design has five test events, which 1) vary the floor of the INL building on which the contaminant will be released, 2) provide for varying the amount of contaminant released to obtain desired concentration gradients, and 3) investigate overt as well as covert release of contaminants. Desirable contaminant gradients would have moderate to low concentrations of contaminant in rooms near the release point, with concentrations down to zero in other rooms. Such gradients would provide a range of contamination levels to challenge the sampling
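    The X%/Y% form of a clearance statement can be illustrated for the probabilistic samples alone. The sketch below assumes a uniform Beta(1, 1) prior on the contaminated fraction and that all random samples test clean; the actual CJR method additionally folds judgmental samples into a fuller Bayesian model, which this simplification omits.

```python
def clearance_confidence(n_clean, coverage=0.99):
    """Confidence that at least `coverage` of the area is free of detectable
    contamination after n_clean random samples all test clean. With a
    uniform prior on the contaminated fraction theta, the posterior is
    Beta(1, n_clean + 1), whose CDF at t is 1 - (1 - t)**(n_clean + 1)."""
    t = 1.0 - coverage  # largest contaminated fraction still acceptable
    return 1.0 - (1.0 - t) ** (n_clean + 1)

def samples_needed(confidence=0.95, coverage=0.99):
    """Smallest number of all-clean random samples supporting the
    X%/Y% clearance statement (here X = confidence, Y = coverage)."""
    n = 0
    while clearance_confidence(n, coverage) < confidence:
        n += 1
    return n
```

    For a "95% confidence that at least 99% of the area is clean" statement, this gives 298 clean samples, in line with the classic rule-of-three scaling for rare-event bounds.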

  2. A survey of archaeological samples dated in 1988

    International Nuclear Information System (INIS)

    Mejdahl, V.

    1989-08-01

    A survey is given of archaeological samples dated in 1988 at the Nordic Laboratory for Thermoluminescence Dating. A total of 67 samples were dated. The results were corrected for short-term fading of feldspars as measured for samples stored at room temperature for four weeks or at 100 deg. C for two weeks. The beta dose from potassium and rubidium in feldspar, and the alpha dose from uranium and thorium in quartz and feldspar were included assuming alpha efficiency factors of 0.1 for quartz and 0.2 for feldspar. (author) 22 tabs., 1 ill., 14 refs

  3. Novel platform for ocean survey and autonomous sampling using multi-agent system

    OpenAIRE

    Taher, Tawfiq; Weymouth, G.D.; Varghese, Tony

    2013-01-01

    In-situ surveying and sampling of ocean environments provides critical data for laboratory work and oceanographic research. However, sampling a time-varying ocean field is often time- and resource-limited, meaning that samples often miss the features of interest. This paper presents a modular autonomous multi-agent robotic system which has been developed to accommodate a variety of research activities. This paper demonstrates the complementary capabilities of the agents by simultaneously survey...

  4. The ESO Diffuse Interstellar Bands Large Exploration Survey (EDIBLES) . I. Project description, survey sample, and quality assessment

    Science.gov (United States)

    Cox, Nick L. J.; Cami, Jan; Farhang, Amin; Smoker, Jonathan; Monreal-Ibero, Ana; Lallement, Rosine; Sarre, Peter J.; Marshall, Charlotte C. M.; Smith, Keith T.; Evans, Christopher J.; Royer, Pierre; Linnartz, Harold; Cordiner, Martin A.; Joblin, Christine; van Loon, Jacco Th.; Foing, Bernard H.; Bhatt, Neil H.; Bron, Emeric; Elyajouri, Meriem; de Koter, Alex; Ehrenfreund, Pascale; Javadi, Atefeh; Kaper, Lex; Khosroshadi, Habib G.; Laverick, Mike; Le Petit, Franck; Mulas, Giacomo; Roueff, Evelyne; Salama, Farid; Spaans, Marco

    2017-10-01

    The carriers of the diffuse interstellar bands (DIBs) are largely unidentified molecules ubiquitously present in the interstellar medium (ISM). After decades of study, two strong and possibly three weak near-infrared DIBs have recently been attributed to the C60^+ fullerene based on observational and laboratory measurements. There is great promise for the identification of the over 400 other known DIBs, as this result could provide chemical hints towards other possible carriers. In an effort to systematically study the properties of the DIB carriers, we have initiated a new large-scale observational survey: the ESO Diffuse Interstellar Bands Large Exploration Survey (EDIBLES). The main objective is to build on and extend existing DIB surveys to make a major step forward in characterising the physical and chemical conditions for a statistically significant sample of interstellar lines-of-sight, with the goal to reverse-engineer key molecular properties of the DIB carriers. EDIBLES is a filler Large Programme using the Ultraviolet and Visual Echelle Spectrograph at the Very Large Telescope at Paranal, Chile. It is designed to provide an observationally unbiased view of the presence and behaviour of the DIBs towards early-spectral-type stars whose lines-of-sight probe the diffuse-to-translucent ISM. Such a complete dataset will provide a deep census of the atomic and molecular content, physical conditions, chemical abundances and elemental depletion levels for each sightline. Achieving these goals requires a homogeneous set of high-quality data in terms of resolution (R 70 000-100 000), sensitivity (S/N up to 1000 per resolution element), and spectral coverage (305-1042 nm), as well as a large sample size (100+ sightlines). In this first paper the goals, objectives and methodology of the EDIBLES programme are described and an initial assessment of the data is provided.

  5. Pairing call-response surveys and distance sampling for a mammalian carnivore

    Science.gov (United States)

    Hansen, Sara J. K.; Frair, Jacqueline L.; Underwood, Harold B.; Gibbs, James P.

    2015-01-01

    Density estimates accounting for differential animal detectability are difficult to acquire for wide-ranging and elusive species such as mammalian carnivores. Pairing distance sampling with call-response surveys may provide an efficient means of tracking changes in populations of coyotes (Canis latrans), a species of particular interest in the eastern United States. Blind field trials in rural New York State indicated 119-m linear error for triangulated coyote calls, and a 1.8-km distance threshold for call detectability, which was sufficient to estimate a detection function with precision using distance sampling. We conducted statewide road-based surveys with sampling locations spaced ≥6 km apart from June to August 2010. Each detected call (be it a single or group) counted as a single object, representing 1 territorial pair, because of uncertainty in the number of vocalizing animals. From 524 survey points and 75 detections, we estimated the probability of detecting a calling coyote to be 0.17 ± 0.02 SE, yielding a detection-corrected index of 0.75 pairs/10 km2 (95% CI: 0.52–1.1, 18.5% CV) for a minimum of 8,133 pairs across rural New York State. Importantly, we consider this an index rather than true estimate of abundance given the unknown probability of coyote availability for detection during our surveys. Even so, pairing distance sampling with call-response surveys provided a novel, efficient, and noninvasive means of monitoring populations of wide-ranging and elusive, albeit reliably vocal, mammalian carnivores. Our approach offers an effective new means of tracking species like coyotes, one that is readily extendable to other species and geographic extents, provided key assumptions of distance sampling are met.
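    The structure of the detection-corrected index can be shown with the canonical point-count distance sampling estimator, D = n / (k · π · w² · p̂), using the figures reported above (524 points, 75 detections, 1.8 km truncation distance, detection probability 0.17). The study's own analysis fitted a detection function, so this simplified calculation only approximates, rather than reproduces, the reported 0.75 pairs/10 km².

```python
import math

def point_transect_density(n_detections, n_points, w, p_hat):
    """Detection-corrected density for point-count distance sampling:
    D = n / (k * pi * w**2 * p_hat), with truncation radius w in km,
    giving a density per square kilometre."""
    area_surveyed = n_points * math.pi * w ** 2
    return n_detections / (area_surveyed * p_hat)

# Figures from the coyote survey: 524 points, 75 detections,
# w = 1.8 km, estimated detection probability 0.17.
d = point_transect_density(75, 524, 1.8, 0.17)
```

    This yields roughly 0.83 pairs per 10 km², the same order as the published index.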

  6. The Large Area Radio Galaxy Evolution Spectroscopic Survey (LARGESS): survey design, data catalogue and GAMA/WiggleZ spectroscopy

    Science.gov (United States)

    Ching, John H. Y.; Sadler, Elaine M.; Croom, Scott M.; Johnston, Helen M.; Pracy, Michael B.; Couch, Warrick J.; Hopkins, A. M.; Jurek, Russell J.; Pimbblet, K. A.

    2017-01-01

    We present the Large Area Radio Galaxy Evolution Spectroscopic Survey (LARGESS), a spectroscopic catalogue of radio sources designed to include the full range of radio AGN populations out to redshift z ˜ 0.8. The catalogue covers ˜800 deg2 of sky, and provides optical identifications for 19 179 radio sources from the 1.4 GHz Faint Images of the Radio Sky at Twenty-cm (FIRST) survey down to an optical magnitude limit of Imod. Point-like objects are included, and no colour cuts are applied. In collaboration with the WiggleZ and Galaxy And Mass Assembly (GAMA) spectroscopic survey teams, we have obtained new spectra for over 5000 objects in the LARGESS sample. Combining these new spectra with data from earlier surveys provides spectroscopic data for 12 329 radio sources in the survey area, of which 10 856 have reliable redshifts. 85 per cent of the LARGESS spectroscopic sample are radio AGN (median redshift z = 0.44), and 15 per cent are nearby star-forming galaxies (median z = 0.08). Low-excitation radio galaxies (LERGs) comprise the majority (83 per cent) of LARGESS radio AGN at z < 0.8, with 12 per cent being high-excitation radio galaxies (HERGs) and 5 per cent radio-loud QSOs. Unlike the more homogeneous LERG and QSO sub-populations, HERGs are a heterogeneous class of objects with relatively blue optical colours and a wide dispersion in mid-infrared colours. This is consistent with a picture in which most HERGs are hosted by galaxies with recent or ongoing star formation as well as a classical accretion disc.

  7. Design of a statewide radiation survey

    International Nuclear Information System (INIS)

    Nagda, N.L.; Koontz, M.D.; Rector, H.E.; Nifong, G.D.

    1989-01-01

    The Florida Institute of Phosphate Research (FIPR) recently sponsored a statewide survey to identify all significant land areas in Florida where the state's environmental radiation rule should be applied. Under this rule, newly constructed buildings must be tested for radiation levels unless approved construction techniques are used. Two parallel surveys - a land-based survey and a population-based survey - were designed and conducted to address the objective. Each survey included measurements in more than 3000 residences throughout the state. Other information sources that existed at the outset of the study, such as geologic profiles mapped by previous investigators and terrestrial uranium levels characterized through aerial gamma radiation surveys, were also examined. Initial data analysis efforts focused on determining the extent of evidence of radon potential for each of 67 counties in the state. Within the 18 counties that were determined to have definite evidence of elevated radon potential, more detailed spatial analyses were conducted to identify areas to which the rule should apply. A total of 74 quadrangles delineated by the U.S. Geological Survey, representing about 7% of those constituting the state, were identified as having elevated radon potential and being subject to the rule.

  8. Lot quality assurance sampling techniques in health surveys in developing countries: advantages and current constraints.

    Science.gov (United States)

    Lanata, C F; Black, R E

    1991-01-01

    Traditional survey methods, which are generally costly and time-consuming, usually provide information at the regional or national level only. The utilization of lot quality assurance sampling (LQAS) methodology, developed in industry for quality control, makes it possible to use small sample sizes when conducting surveys in small geographical or population-based areas (lots). This article describes the practical use of LQAS for conducting health surveys to monitor health programmes in developing countries. Following a brief description of the method, the article explains how to build a sample frame and conduct the sampling to apply LQAS under field conditions. A detailed description of the procedure for selecting a sampling unit to monitor the health programme and a sample size is given. The sampling schemes utilizing LQAS applicable to health surveys, such as simple- and double-sampling schemes, are discussed. The interpretation of the survey results and the planning of subsequent rounds of LQAS surveys are also discussed. When describing the applicability of LQAS in health surveys in developing countries, the article considers current limitations for its use by health planners in charge of health programmes, and suggests ways to overcome these limitations through future research. It is hoped that with increasing attention being given to industrial sampling plans in general, and LQAS in particular, their utilization to monitor health programmes will provide health planners in developing countries with powerful techniques to help them achieve their health programme targets.
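    A simple-sampling LQAS plan of the kind described here reduces to binomial tail probabilities: accept the lot if at most d failures appear among n sampled individuals. The sketch below computes the two classification risks for an illustrative plan; the thresholds and plan parameters are examples, not figures from the article.

```python
from math import comb

def binom_cdf(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p ** i * (1 - p) ** (n - i) for i in range(k + 1))

def lqas_errors(n, d, p_high, p_low):
    """For the LQAS plan 'accept the lot if at most d failures in n samples':
    alpha = risk of rejecting a good lot (true failure rate p_low),
    beta  = risk of accepting a bad lot (true failure rate p_high)."""
    alpha = 1 - binom_cdf(d, n, p_low)
    beta = binom_cdf(d, n, p_high)
    return alpha, beta
```

    For example, the plan n = 19, d = 6 with an upper failure threshold of 50% and a lower threshold of 20% keeps both misclassification risks below 10%, which is why very small per-lot sample sizes can still support programme decisions.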

  9. On the problems of PPS sampling in multi-character surveys ...

    African Journals Online (AJOL)

    This paper, which is on the problems of PPS sampling in multi-character surveys, compares the efficiency of some estimators used in PPSWR sampling for multiple characteristics. From a superpopulation model, we computed the expected variances of the different estimators for each of the first two finite populations ...

  10. Mode Equivalence of Health Indicators Between Data Collection Modes and Mixed-Mode Survey Designs in Population-Based Health Interview Surveys for Children and Adolescents: Methodological Study

    Science.gov (United States)

    Hoffmann, Robert; Houben, Robin; Krause, Laura; Kamtsiuris, Panagiotis; Gößwald, Antje

    2018-01-01

    Background The implementation of an Internet option in an existing public health interview survey using a mixed-mode design is attractive because of lower costs and faster data availability. Additionally, mixed-mode surveys can increase response rates and improve sample composition. However, mixed-mode designs can increase the risk of measurement error (mode effects). Objective This study aimed to determine whether the prevalence rates or mean values of self- and parent-reported health indicators for children and adolescents aged 0-17 years differ between self-administered paper-based questionnaires (SAQ-paper) and self-administered Web-based questionnaires (SAQ-Web), as well as between a single-mode control group and different mixed-mode groups. Methods Data were collected for a methodological pilot of the third wave of the "German Health Interview and Examination Survey for Children and Adolescents". Questionnaires were completed by parents or adolescents. A population-based sample of 11,140 children and adolescents aged 0-17 years was randomly allocated to 4 survey designs—a single-mode control group with paper-and-pencil questionnaires only (n=970 parents, n=343 adolescents)—and 3 mixed-mode designs, all of which offered Web-based questionnaire options. In the concurrent mixed-mode design, both questionnaires were offered at the same time (n=946 parents, n=290 adolescents); in the sequential mixed-mode design, the SAQ-Web was sent first, followed by the paper questionnaire along with a reminder (n=854 parents, n=269 adolescents); and in the preselect mixed-mode design, both options were offered and the respondents were asked to request the desired type of questionnaire (n=698 parents, n=292 adolescents). In total, 3468 questionnaires of parents of children aged 0-17 years (SAQ-Web: n=708; SAQ-paper: n=2760) and 1194 questionnaires of adolescents aged 11-17 years (SAQ-Web: n=299; SAQ-paper: n=895) were analyzed. Sociodemographic characteristics and a broad

  11. Understanding Sample Surveys: Selective Learning about Social Science Research Methods

    Science.gov (United States)

    Currin-Percival, Mary; Johnson, Martin

    2010-01-01

    We investigate differences in what students learn about survey methodology in a class on public opinion presented in two critically different ways: with the inclusion or exclusion of an original research project using a random-digit-dial telephone survey. Using a quasi-experimental design and data obtained from pretests and posttests in two public…

  12. Tobacco smoking surveillance: is quota sampling an efficient tool for monitoring national trends? A comparison with a random cross-sectional survey.

    Directory of Open Access Journals (Sweden)

    Romain Guignard

    Full Text Available OBJECTIVES: It is crucial for policy makers to monitor the evolution of tobacco smoking prevalence. In France, this monitoring is based on a series of cross-sectional general population surveys, the Health Barometers, conducted every five years and based on random samples. A methodological study has been carried out to assess the reliability of a monitoring system based on regular quota sampling surveys for smoking prevalence. DESIGN / OUTCOME MEASURES: In 2010, current and daily tobacco smoking prevalences obtained in a quota survey on 8,018 people were compared with those of the 2010 Health Barometer carried out on 27,653 people. Prevalences were assessed separately according to the telephone equipment of the interviewee (landline phone owner vs "mobile-only"), and logistic regressions were conducted in the pooled database to assess the impact of the telephone equipment and of the survey mode on the prevalences found. Finally, logistic regressions adjusted for sociodemographic characteristics were conducted in the random sample in order to determine the impact of the number of calls needed to interview "hard-to-reach" people on the prevalence found. RESULTS: Current and daily prevalences were higher in the random sample (respectively 33.9% and 27.5% among 15-75 year-olds) than in the quota sample (respectively 30.2% and 25.3%). In both surveys, current and daily prevalences were lower among landline phone owners (respectively 31.8% and 25.5% in the random sample and 28.9% and 24.0% in the quota survey). The required number of calls was slightly related to the smoking status after adjustment for sociodemographic characteristics. CONCLUSION: Random sampling appears to be more effective than quota sampling, mainly by making it possible to interview hard-to-reach populations.

  13. Using Electronic Surveys: Advice from Survey Professionals.

    Directory of Open Access Journals (Sweden)

    David M. Shannon

    2002-01-01

    Full Text Available The study reports the perceptions and recommendations of sixty-two experienced survey researchers from the American Educational Research Association regarding the use of electronic surveys. The most positive aspects cited for the use of electronic surveys were reduction of costs (i.e., postage, phone charges), the use of electronic mail for pre-notification or follow-up purposes, and the compatibility of data with existing software programs. These professionals cited limitations of electronic surveys pertaining to the limited sampling frame as well as issues of confidentiality, privacy, and the credibility of the sample. They advised that electronic surveys be designed with the varied technological backgrounds and capabilities of respondents in mind, follow sound principles of survey construction, and be administered to pre-notified, targeted populations with published email addresses.

  14. 30 CFR 71.208 - Bimonthly sampling; designated work positions.

    Science.gov (United States)

    2010-07-01

    ... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Bimonthly sampling; designated work positions... UNDERGROUND COAL MINES Sampling Procedures § 71.208 Bimonthly sampling; designated work positions. (a) Each... standard when quartz is present), respirable dust sampling of designated work positions shall begin on the...

  15. Design of the South East Asian Nutrition Survey (SEANUTS): a four-country multistage cluster design study.

    Science.gov (United States)

    Schaafsma, Anne; Deurenberg, Paul; Calame, Wim; van den Heuvel, Ellen G H M; van Beusekom, Christien; Hautvast, Jo; Sandjaja; Bee Koon, Poh; Rojroongwasinkul, Nipa; Le Nguyen, Bao Khanh; Parikh, Panam; Khouw, Ilse

    2013-09-01

    Nutrition is a well-known factor in the growth, health and development of children. It is also acknowledged that worldwide many people have dietary imbalances resulting in over- or undernutrition. In 2009, the multinational food company FrieslandCampina initiated the South East Asian Nutrition Survey (SEANUTS), a combination of surveys carried out in Indonesia, Malaysia, Thailand and Vietnam, to get a better insight into these imbalances. The present study describes the general study design and methodology, as well as some problems and pitfalls encountered. In each of these countries, participants in the age range of 0·5-12 years were recruited according to a multistage cluster randomised or stratified random sampling methodology. Field teams took care of recruitment and data collection. For the health status of children, growth and body composition, physical activity, bone density, and development and cognition were measured. For nutrition, food intake and food habits were assessed by questionnaires, whereas in subpopulations blood and urine samples were collected to measure the biochemical status parameters of Fe, vitamins A and D, and DHA. In Thailand, the researchers additionally studied the lipid profile in blood, whereas in Indonesia iodine excretion in urine was analysed. Biochemical data were analysed in certified laboratories. Study protocols and methodology were aligned where practically possible. In December 2011, data collection was finalised. In total, 16,744 children participated in the present study. Information that will be very relevant for formulating nutritional health policies, as well as for designing innovative food and nutrition research and development programmes, has become available.

  16. A Systematic Review of Published Respondent-Driven Sampling Surveys Collecting Behavioral and Biologic Data.

    Science.gov (United States)

    Johnston, Lisa G; Hakim, Avi J; Dittrich, Samantha; Burnett, Janet; Kim, Evelyn; White, Richard G

    2016-08-01

    Reporting key details of respondent-driven sampling (RDS) survey implementation and analysis is essential for assessing the quality of RDS surveys. RDS is both a recruitment and an analytic method and, as such, it is important to adequately describe both aspects in publications. We extracted data from peer-reviewed literature published through September 2013 that reported collecting biological specimens using RDS. We identified 151 eligible peer-reviewed articles describing 222 surveys conducted in seven regions throughout the world. Most published surveys reported basic implementation information such as survey city, country, year, population sampled, interview method, and final sample size. However, many surveys did not report essential methodological and analytical information for assessing RDS survey quality, including number of recruitment sites, seeds at start and end, maximum number of waves, and whether data were adjusted for network size. Understanding the quality of data collection and analysis in RDS is useful for effectively planning public health service delivery and funding priorities.

  17. Optimum strata boundaries and sample sizes in health surveys using auxiliary variables.

    Science.gov (United States)

    Reddy, Karuna Garan; Khan, Mohammad G M; Khan, Sabiha

    2018-01-01

    Stratifying by convenient criteria such as geographical region or other natural characteristics like age and gender does not generally maximize the precision of the estimates of the variables of interest. Thus, one has to look for an efficient stratification design that divides the whole population into homogeneous strata and achieves higher precision in the estimation. In this paper, a procedure for determining Optimum Stratum Boundaries (OSB) and Optimum Sample Sizes (OSS) for each stratum of a variable of interest in health surveys is developed. The determination of OSB and OSS based on the study variable is not feasible in practice since the study variable is not available prior to the survey. Since many variables in health surveys are generally skewed, the proposed technique considers the readily-available auxiliary variables to determine the OSB and OSS. This stratification problem is formulated into a Mathematical Programming Problem (MPP) that seeks minimization of the variance of the estimated population parameter under Neyman allocation. It is then solved for the OSB by using a dynamic programming (DP) technique. A numerical example with a real data set of a population, aiming to estimate the Haemoglobin content in women in a national Iron Deficiency Anaemia survey, is presented to illustrate the procedure developed in this paper. Upon comparisons with other methods available in literature, results reveal that the proposed approach yields a substantial gain in efficiency over the other methods. A simulation study also reveals similar results.
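    The Neyman allocation underlying the MPP has a simple closed form once the strata are fixed: allocate sample proportionally to N_h · S_h. A minimal sketch follows (the dynamic-programming search for the boundaries themselves is considerably more involved and is not shown; the stratum figures below are invented for illustration).

```python
def neyman_allocation(n_total, strata):
    """Neyman allocation: n_h proportional to N_h * S_h, which minimises
    the variance of the stratified mean for a fixed total sample size.
    `strata` is a list of (N_h, S_h) pairs (stratum size, stratum SD).
    Simple rounding is used, so allocations may not sum exactly to n_total."""
    weights = [N * S for N, S in strata]
    total = sum(weights)
    return [round(n_total * w / total) for w in weights]
```

    Note how the allocation concentrates sample in the most variable strata: a stratum twice as variable receives twice the sample of an equally sized, less variable one.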

  18. THE DEEP2 GALAXY REDSHIFT SURVEY: DESIGN, OBSERVATIONS, DATA REDUCTION, AND REDSHIFTS

    International Nuclear Information System (INIS)

    Newman, Jeffrey A.; Cooper, Michael C.; Davis, Marc; Faber, S. M.; Guhathakurta, Puragra; Koo, David C.; Phillips, Andrew C.; Conroy, Charlie; Harker, Justin J.; Lai, Kamson; Coil, Alison L.; Dutton, Aaron A.; Finkbeiner, Douglas P.; Gerke, Brian F.; Rosario, David J.; Weiner, Benjamin J.; Willmer, C. N. A.; Yan Renbin; Kassin, Susan A.; Konidaris, N. P.

    2013-01-01

    We describe the design and data analysis of the DEEP2 Galaxy Redshift Survey, the densest and largest high-precision redshift survey of galaxies at z ∼ 1 completed to date. The survey was designed to conduct a comprehensive census of massive galaxies, their properties, environments, and large-scale structure down to absolute magnitude M_B = –20 at z ∼ 1 via ∼90 nights of observation on the Keck telescope. The survey covers an area of 2.8 deg^2 divided into four separate fields observed to a limiting apparent magnitude of R_AB = 24.1. Colour pre-selection allows objects with z ∼ 0.7 and above to be targeted ∼2.5 times more efficiently than in a purely magnitude-limited sample. Approximately 60% of eligible targets are chosen for spectroscopy, yielding nearly 53,000 spectra and more than 38,000 reliable redshift measurements. Most of the targets that fail to yield secure redshifts are blue objects that lie beyond z ∼ 1.45, where the [O II] 3727 Å doublet lies in the infrared. The DEIMOS 1200 line mm^-1 grating used for the survey delivers high spectral resolution (R ∼ 6000), accurate and secure redshifts, and unique internal kinematic information. Extensive ancillary data are available in the DEEP2 fields, particularly in the Extended Groth Strip, which has evolved into one of the richest multiwavelength regions on the sky. This paper is intended as a handbook for users of the DEEP2 Data Release 4, which includes all DEEP2 spectra and redshifts, as well as for the DEEP2 DEIMOS data reduction pipelines. Extensive details are provided on object selection, mask design, biases in target selection and redshift measurements, the spec2d two-dimensional data-reduction pipeline, the spec1d automated redshift pipeline, and the zspec visual redshift verification process, along with examples of instrumental signatures or other artifacts that in some cases remain after data reduction. Redshift errors and catastrophic failure rates are assessed through more than 2000 objects with duplicate

  19. Complex surveys analysis of categorical data

    CERN Document Server

    Mukhopadhyay, Parimal

    2016-01-01

    The primary objective of this book is to study some of the research topics in the area of analysis of complex surveys which have not been covered in any book yet. It discusses the analysis of categorical data using three models: a full model, a log-linear model and a logistic regression model. It is a valuable resource for survey statisticians and practitioners in the field of sociology, biology, economics, psychology and other areas who have to use these procedures in their day-to-day work. It is also useful for courses on sampling and complex surveys at the upper-undergraduate and graduate levels. The importance of sample surveys today cannot be overstated. From voters’ behaviour to fields such as industry, agriculture, economics, sociology, psychology, investigators generally resort to survey sampling to obtain an assessment of the behaviour of the population they are interested in. Many large-scale sample surveys collect data using complex survey designs like multistage stratified cluster designs. The o...

  20. Sample size methodology

    CERN Document Server

    Desu, M M

    2012-01-01

    One of the most important problems in designing an experiment or a survey is sample size determination and this book presents the currently available methodology. It includes both random sampling from standard probability distributions and from finite populations. Also discussed is sample size determination for estimating parameters in a Bayesian setting by considering the posterior distribution of the parameter and specifying the necessary requirements. The determination of the sample size is considered for ranking and selection problems as well as for the design of clinical trials. Appropria
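    As a concrete instance of survey sample-size determination, the standard formula for estimating a proportion, with an optional finite-population correction, can be sketched as follows. This is the textbook calculation, not a method specific to this book; parameter defaults (p = 0.5, 5% margin, z = 1.96 for 95% confidence) are conventional worst-case choices.

```python
import math

def sample_size_proportion(p=0.5, margin=0.05, z=1.96, population=None):
    """Sample size to estimate a proportion p within +/- margin at the
    confidence level implied by z (1.96 for 95%). Applies the finite-
    population correction when a population size is given."""
    n0 = z ** 2 * p * (1 - p) / margin ** 2
    if population is not None:
        n0 = n0 / (1 + (n0 - 1) / population)
    return math.ceil(n0)
```

    With the defaults this gives the familiar n = 385; for a finite population of 1,000 units the correction reduces it to 278.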

  1. Performance of small cluster surveys and the clustered LQAS design to estimate local-level vaccination coverage in Mali.

    Science.gov (United States)

    Minetti, Andrea; Riera-Montes, Margarita; Nackers, Fabienne; Roederer, Thomas; Koudika, Marie Hortense; Sekkenes, Johanne; Taconet, Aurore; Fermon, Florence; Touré, Albouhary; Grais, Rebecca F; Checchi, Francesco

    2012-10-12

    Estimation of vaccination coverage (VC) at the local level is essential to identify communities that may require additional support. Cluster surveys can be used in resource-poor settings, when population figures are inaccurate. To be feasible, cluster samples need to be small, without losing robustness of results. The clustered LQAS (CLQAS) approach has been proposed as an alternative, as smaller sample sizes are required. We explored (i) the efficiency of cluster surveys of decreasing sample size through bootstrapping analysis and (ii) the performance of CLQAS under three alternative sampling plans to classify local VC, using data from a survey carried out in Mali after mass vaccination against meningococcal meningitis group A. VC estimates provided by a 10 × 15 cluster survey design were reasonably robust. We used them to classify health areas in three categories and guide mop-up activities: i) health areas not requiring supplemental activities; ii) health areas requiring additional vaccination; iii) health areas requiring further evaluation. As sample size decreased (from 10 × 15 to 10 × 3), standard error of VC and ICC estimates were increasingly unstable. Results of CLQAS simulations were not accurate for most health areas, with an overall risk of misclassification greater than 0.25 in one health area out of three. It was greater than 0.50 in one health area out of two under two of the three sampling plans. Small sample cluster surveys (10 × 15) are acceptably robust for classification of VC at local level. We do not recommend the CLQAS method as currently formulated for evaluating vaccination programmes.
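    The bootstrapping analysis of cluster-survey stability can be mimicked in miniature by resampling whole clusters with replacement and recomputing coverage each time. The data and function below are invented for illustration, at a much smaller scale than the Mali survey.

```python
import random

def bootstrap_coverage_se(clusters, n_boot=2000, seed=0):
    """Cluster bootstrap for a vaccination-coverage survey: resample whole
    clusters with replacement and recompute coverage, so the standard error
    reflects between-cluster variation rather than assuming simple random
    sampling. `clusters` is a list of (vaccinated, examined) pairs."""
    rng = random.Random(seed)
    estimates = []
    for _ in range(n_boot):
        resample = [rng.choice(clusters) for _ in range(len(clusters))]
        vaccinated = sum(v for v, _ in resample)
        examined = sum(t for _, t in resample)
        estimates.append(vaccinated / examined)
    mean = sum(estimates) / n_boot
    var = sum((e - mean) ** 2 for e in estimates) / (n_boot - 1)
    return mean, var ** 0.5
```

    Rerunning this while dropping clusters (e.g. from 10 per area down to 3) shows directly how the standard error destabilises as the design shrinks, which is the pattern the study reports.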

  2. Design and Validation of the Quantum Mechanics Conceptual Survey

    Science.gov (United States)

    McKagan, S. B.; Perkins, K. K.; Wieman, C. E.

    2010-01-01

    The Quantum Mechanics Conceptual Survey (QMCS) is a 12-question survey of students' conceptual understanding of quantum mechanics. It is intended to be used to measure the relative effectiveness of different instructional methods in modern physics courses. In this paper, we describe the design and validation of the survey, a process that included…

  3. Semi-automatic surface sediment sampling system - A prototype to be implemented in bivalve fishing surveys

    Science.gov (United States)

    Rufino, Marta M.; Baptista, Paulo; Pereira, Fábio; Gaspar, Miguel B.

    2018-01-01

In the current work we propose a new method to sample surface sediment during bivalve fishing surveys. Fishing institutes all around the world carry out regular surveys with the aim of monitoring the stocks of commercial species. These surveys often comprise more than one hundred sampling stations and cover large geographical areas. Although superficial sediment grain size is among the main drivers of benthic communities and provides crucial information for studies of coastal dynamics, there is overall a strong lack of this type of data, possibly because traditional surface sediment sampling methods use grabs, which require considerable time and effort to deploy on a regular basis or over large areas. In view of these aspects, we developed an easy and inexpensive method to sample superficial sediments during bivalve fisheries monitoring surveys without increasing survey time or human resources. The method was successfully evaluated and validated during a typical bivalve survey carried out on the northwest coast of Portugal, confirming that it did not interfere with the survey objectives. Furthermore, the method was validated against samples collected with a traditional Van Veen grab (traditional method) at the same localities, which showed a grain size composition similar to that of the samples collected by the new method. We recommend that the procedure be implemented in regular bivalve fishing surveys, together with an image analysis system to analyse the collected samples. The new method will provide a substantial quantity of surface sediment data for coastal areas in an inexpensive and efficient manner, with high potential for application in different fields of research.

  4. Survey of statistical and sampling needs for environmental monitoring of commercial low-level radioactive waste disposal facilities

    International Nuclear Information System (INIS)

    Eberhardt, L.L.; Thomas, J.M.

    1986-07-01

    This project was designed to develop guidance for implementing 10 CFR Part 61 and to determine the overall needs for sampling and statistical work in characterizing, surveying, monitoring, and closing commercial low-level waste sites. When cost-effectiveness and statistical reliability are of prime importance, then double sampling, compositing, and stratification (with optimal allocation) are identified as key issues. If the principal concern is avoiding questionable statistical practice, then the applicability of kriging (for assessing spatial pattern), methods for routine monitoring, and use of standard textbook formulae in reporting monitoring results should be reevaluated. Other important issues identified include sampling for estimating model parameters and the use of data from left-censored (less than detectable limits) distributions

  5. Mode Equivalence of Health Indicators Between Data Collection Modes and Mixed-Mode Survey Designs in Population-Based Health Interview Surveys for Children and Adolescents: Methodological Study.

    Science.gov (United States)

    Mauz, Elvira; Hoffmann, Robert; Houben, Robin; Krause, Laura; Kamtsiuris, Panagiotis; Gößwald, Antje

    2018-03-05

The implementation of an Internet option in an existing public health interview survey using a mixed-mode design is attractive because of lower costs and faster data availability. Additionally, mixed-mode surveys can increase response rates and improve sample composition. However, mixed-mode designs can increase the risk of measurement error (mode effects). This study aimed to determine whether the prevalence rates or mean values of self- and parent-reported health indicators for children and adolescents aged 0-17 years differ between self-administered paper-based questionnaires (SAQ-paper) and self-administered Web-based questionnaires (SAQ-Web), as well as between a single-mode control group and different mixed-mode groups. Data were collected for a methodological pilot of the third wave of the "German Health Interview and Examination Survey for Children and Adolescents". Questionnaires were completed by parents or adolescents. A population-based sample of 11,140 children and adolescents aged 0-17 years was randomly allocated to 4 survey designs: a single-mode control group with paper-and-pencil questionnaires only (n=970 parents, n=343 adolescents) and 3 mixed-mode designs, all of which offered Web-based questionnaire options. In the concurrent mixed-mode design, both questionnaires were offered at the same time (n=946 parents, n=290 adolescents); in the sequential mixed-mode design, the SAQ-Web was sent first, followed by the paper questionnaire along with a reminder (n=854 parents, n=269 adolescents); and in the preselect mixed-mode design, both options were offered and the respondents were asked to request the desired type of questionnaire (n=698 parents, n=292 adolescents). In total, 3468 questionnaires of parents of children aged 0-17 years (SAQ-Web: n=708; SAQ-paper: n=2760) and 1194 questionnaires of adolescents aged 11-17 years (SAQ-Web: n=299; SAQ-paper: n=895) were analyzed. Sociodemographic characteristics and a broad range of health indicators for

  6. Design and methodology of a mixed methods follow-up study to the 2014 Ghana Demographic and Health Survey.

    Science.gov (United States)

    Staveteig, Sarah; Aryeetey, Richmond; Anie-Ansah, Michael; Ahiadeke, Clement; Ortiz, Ladys

    2017-01-01

The intended meaning behind responses to standard questions posed in large-scale health surveys is not always well understood. Systematic follow-up studies, particularly those which pose a few repeated questions followed by open-ended discussions, are well positioned to gauge the stability and consistency of data and to shed light on the intended meaning behind survey responses. Such follow-up studies require extensive coordination and face challenges in protecting respondent confidentiality during the process of recontacting and reinterviewing participants. We describe practical field strategies for undertaking a mixed methods follow-up study during a large-scale health survey. The study was designed as a mixed methods follow-up study embedded within the 2014 Ghana Demographic and Health Survey (GDHS). The study was implemented in 13 clusters. Android tablets were used to import reference data from the parent survey and to administer the questionnaire, which asked a mixture of closed- and open-ended questions on reproductive intentions, decision-making, and family planning. Despite a number of obstacles related to recontacting respondents and concern about respondent fatigue, over 92 percent of the selected sub-sample were successfully recontacted and reinterviewed; all consented to audio recording. A confidential linkage between GDHS data, follow-up tablet data, and audio transcripts was successfully created for the purpose of analysis. We summarize the challenges in follow-up study design, including ethical considerations, sample size, auditing, filtering, and successful use of tablets, and share lessons learned for future follow-up surveys.

  7. Sampling guidelines for oral fluid-based surveys of group-housed animals.

    Science.gov (United States)

    Rotolo, Marisa L; Sun, Yaxuan; Wang, Chong; Giménez-Lirola, Luis; Baum, David H; Gauger, Phillip C; Harmon, Karen M; Hoogland, Marlin; Main, Rodger; Zimmerman, Jeffrey J

    2017-09-01

    Formulas and software for calculating sample size for surveys based on individual animal samples are readily available. However, sample size formulas are not available for oral fluids and other aggregate samples that are increasingly used in production settings. Therefore, the objective of this study was to develop sampling guidelines for oral fluid-based porcine reproductive and respiratory syndrome virus (PRRSV) surveys in commercial swine farms. Oral fluid samples were collected in 9 weekly samplings from all pens in 3 barns on one production site beginning shortly after placement of weaned pigs. Samples (n=972) were tested by real-time reverse-transcription PCR (RT-rtPCR) and the binary results analyzed using a piecewise exponential survival model for interval-censored, time-to-event data with misclassification. Thereafter, simulation studies were used to study the barn-level probability of PRRSV detection as a function of sample size, sample allocation (simple random sampling vs fixed spatial sampling), assay diagnostic sensitivity and specificity, and pen-level prevalence. These studies provided estimates of the probability of detection by sample size and within-barn prevalence. Detection using fixed spatial sampling was as good as, or better than, simple random sampling. Sampling multiple barns on a site increased the probability of detection with the number of barns sampled. These results are relevant to PRRSV control or elimination projects at the herd, regional, or national levels, but the results are also broadly applicable to contagious pathogens of swine for which oral fluid tests of equivalent performance are available. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
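The simulation logic described above (barn-level probability of detection as a function of sample size, prevalence, and test characteristics) can be sketched as follows. The pen count, prevalence, and test parameters below are illustrative assumptions, not the study's values:

```python
import random

def detection_probability(n_pens, prevalence, sample_size, sensitivity=0.95,
                          specificity=1.0, n_sim=10_000, seed=1):
    """Monte Carlo estimate of the barn-level probability that at least one
    sampled pen tests positive, under simple random sampling of pens.
    All parameter values are hypothetical, for illustration only."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_sim):
        # Mark each pen as truly positive according to the within-barn prevalence.
        pens = [rng.random() < prevalence for _ in range(n_pens)]
        # Draw pens without replacement and apply imperfect test characteristics.
        sampled = rng.sample(pens, sample_size)
        positive = any(
            (rng.random() < sensitivity) if truly_pos else (rng.random() > specificity)
            for truly_pos in sampled
        )
        hits += positive
    return hits / n_sim

p6 = detection_probability(n_pens=40, prevalence=0.10, sample_size=6)
p12 = detection_probability(n_pens=40, prevalence=0.10, sample_size=12)
```

As the abstract reports, the probability of detection rises with the number of samples taken; comparing `p6` and `p12` reproduces that qualitative behaviour.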

  8. Samples and data accessibility in research biobanks: an explorative survey

    Directory of Open Access Journals (Sweden)

    Marco Capocasa

    2016-02-01

Full Text Available Biobanks, which contain human biological samples and/or data, provide a crucial contribution to the progress of biomedical research. However, the effective and efficient use of biobank resources depends on their accessibility. In fact, making bio-resources promptly accessible to everybody may increase the benefits for society. Furthermore, optimizing their use and ensuring their quality will promote scientific creativity and, in general, contribute to the progress of biomedical research. Although this has become a rather common belief, several laboratories are still secretive and continue to withhold samples and data. In this study, we conducted a questionnaire-based survey in order to investigate sample and data accessibility in research biobanks operating all over the world. The survey involved a total of 46 biobanks. Most of them gave permission to access their samples (95.7%) and data (85.4%), but free and unconditioned accessibility seemed not to be common practice. The analysis of the guidelines regarding accessibility to the resources of the biobanks that responded to the survey highlights three issues: (i) the request for applicants to explain what they would like to do with the resources requested; (ii) the role of funding, public or private, in the establishment of fruitful collaborations between biobanks and research labs; (iii) the request for co-authorship in return for access to data. These results suggest that economic and academic aspects are involved in determining the extent of sharing of the samples and data stored in biobanks. As a second step of this study, we investigated the reasons behind the high diversity of requirements to access biobank resources. The analysis of informative answers suggested that the different modalities of resource accessibility seem to be largely influenced by both the social context and the legislation of the countries where the biobanks operate.

  9. Performance of small cluster surveys and the clustered LQAS design to estimate local-level vaccination coverage in Mali

    Directory of Open Access Journals (Sweden)

    Minetti Andrea

    2012-10-01

Full Text Available Abstract Background Estimation of vaccination coverage at the local level is essential to identify communities that may require additional support. Cluster surveys can be used in resource-poor settings where population figures are inaccurate. To be feasible, cluster samples need to be small without losing robustness of results. The clustered LQAS (CLQAS) approach has been proposed as an alternative, as it requires smaller sample sizes. Methods We explored (i) the efficiency of cluster surveys of decreasing sample size through bootstrapping analysis and (ii) the performance of CLQAS under three alternative sampling plans to classify local VC, using data from a survey carried out in Mali after mass vaccination against meningococcal meningitis group A. Results VC estimates provided by a 10 × 15 cluster survey design were reasonably robust. We used them to classify health areas into three categories and guide mop-up activities: (i) health areas not requiring supplemental activities; (ii) health areas requiring additional vaccination; (iii) health areas requiring further evaluation. As sample size decreased (from 10 × 15 to 10 × 3), the standard errors of the VC and ICC estimates became increasingly unstable. Results of CLQAS simulations were not accurate for most health areas, with an overall risk of misclassification greater than 0.25 in one health area out of three. It was greater than 0.50 in one health area out of two under two of the three sampling plans. Conclusions Small sample cluster surveys (10 × 15) are acceptably robust for classification of VC at the local level. We do not recommend the CLQAS method as currently formulated for evaluating vaccination programmes.

  10. Musculoskeletal impairment survey in Rwanda: Design of survey tool, survey methodology, and results of the pilot study (a cross sectional survey

    Directory of Open Access Journals (Sweden)

    Simms Victoria

    2007-03-01

Full Text Available Abstract Background Musculoskeletal impairment (MSI) is an important cause of morbidity and mortality worldwide, especially in developing countries. Prevalence studies of MSI in the developing world have used varying methodologies and are seldom directly comparable. This study aimed to develop a new tool to screen for and diagnose MSI and to pilot test the methodology for a national survey in Rwanda. Methods A 7-question screening tool to identify cases of MSI was developed through literature review and discussions with healthcare professionals. To validate the tool, trained rehabilitation technicians screened 93 previously identified gold-standard 'cases' and 86 'non-cases'. Sensitivity, specificity and positive predictive value were calculated. A standardised examination protocol was developed to determine the aetiology and diagnosis of MSI for those who fail the screening test. For the national survey in Rwanda, multistage cluster random sampling, with probability-proportional-to-size procedures, will be used to select a cross-sectional, nationally representative sample of the population. Households to be surveyed will be chosen through compact segment sampling and all individuals within chosen households will be screened. A pilot survey of 680 individuals was conducted using the protocol. Results The screening tool demonstrated 99% sensitivity and 97% specificity for MSI, and a positive predictive value of 98%. During the pilot study, 468 of 680 eligible subjects (69%) were screened. 45 diagnoses were identified in 38 persons who were cases of MSI. The subjects were grouped into categories based on diagnostic subgroups of congenital (1), traumatic (17), infective (2), neurological (6) and other acquired (19) conditions. They were also separated into mild (42.1%), moderate (42.1%) and severe (15.8%) cases, using an operational definition derived from the World Health Organisation's International Classification of Functioning, Disability and Health

  11. Evaluating the quality of sampling frames used in European cross-national surveys

    NARCIS (Netherlands)

    Maineri, A.M.; Scherpenzeel, A.; Bristle, Johanna; Pflüger, Senta-Melissa; Butt, Sarah; Zins, Stefan; Emery, Tom; Luijkx, R.

    This report addresses the quality of the population registers which are currently being used as sampling frames in countries participating in the four cross-European surveys cooperating in SERISS: the European Social Survey (ESS), the European Values Study (EVS), the Gender and Generations Program

  12. Single-Phase Mail Survey Design for Rare Population Subgroups

    Science.gov (United States)

    Brick, J. Michael; Andrews, William R.; Mathiowetz, Nancy A.

    2016-01-01

    Although using random digit dialing (RDD) telephone samples was the preferred method for conducting surveys of households for many years, declining response and coverage rates have led researchers to explore alternative approaches. The use of address-based sampling (ABS) has been examined for sampling the general population and subgroups, most…

  13. On efficiency of some ratio estimators in double sampling design ...

    African Journals Online (AJOL)

In this paper, three ratio estimators for the double sampling design are proposed, with the intention of finding an alternative to the conventional ratio estimator in double sampling design discussed by Cochran (1997), Okafor (2002), Raj (1972), and Raj and Chandhok (1999).

  14. [Comparison study on sampling methods of Oncomelania hupensis snail survey in marshland schistosomiasis epidemic areas in China].

    Science.gov (United States)

    An, Zhao; Wen-Xin, Zhang; Zhong, Yao; Yu-Kuan, Ma; Qing, Liu; Hou-Lang, Duan; Yi-di, Shang

    2016-06-29

To optimize and simplify the survey method for Oncomelania hupensis snails in marshland regions endemic for schistosomiasis, and to increase the precision, efficiency and economy of the snail survey, a 50 m × 50 m experimental quadrat was selected in the Chayegang marshland near Henghu farm in the Poyang Lake region and surveyed for snails with full coverage. Simple random sampling, systematic sampling and stratified random sampling methods were then applied to calculate the minimum sample size, relative sampling error and absolute sampling error. The minimum sample sizes for simple random sampling, systematic sampling and stratified random sampling were 300, 300 and 225, respectively. The relative sampling errors of the three methods were all less than 15%. The absolute sampling errors were 0.2217, 0.3024 and 0.0478, respectively. Spatial stratified sampling with altitude as the stratum variable is an efficient approach for the snail survey, with lower cost and higher precision.
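The minimum-sample-size comparison above rests on the standard formula for simple random sampling at a target relative error; a generic sketch follows (the coefficient of variation and the number of quadrat frames are hypothetical, not the study's figures):

```python
import math

def srs_min_sample_size(cv, rel_error=0.15, z=1.96, population=None):
    """Minimum simple-random-sample size to estimate a mean within a given
    relative error at ~95% confidence; cv is the coefficient of variation of
    the surveyed quantity (e.g. snail density per frame). Illustrative only."""
    n0 = (z * cv / rel_error) ** 2
    if population is not None:
        n0 = n0 / (1 + n0 / population)  # finite-population correction
    return math.ceil(n0)

# e.g. a coefficient of variation of 1.3 across 2500 quadrat frames:
n = srs_min_sample_size(cv=1.3, rel_error=0.15, population=2500)
```

Stratified designs beat this bound because the between-stratum component of variance is removed from the error, which is why the stratified minimum (225) falls below the simple-random minimum (300) in the study.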

  15. Does self-selection affect samples' representativeness in online surveys? An investigation in online video game research.

    Science.gov (United States)

    Khazaal, Yasser; van Singer, Mathias; Chatton, Anne; Achab, Sophia; Zullino, Daniele; Rothen, Stephane; Khan, Riaz; Billieux, Joel; Thorens, Gabriel

    2014-07-07

The number of medical studies performed through online surveys has increased dramatically in recent years. Despite their numerous advantages (eg, sample size, facilitated access to individuals presenting stigmatizing issues), selection bias may exist in online surveys. However, evidence on the representativeness of self-selected samples in online studies is patchy. Our objective was to explore the representativeness of a self-selected sample of online gamers using online players' virtual characters (avatars). All avatars belonged to individuals playing World of Warcraft (WoW), currently the most widely used online game. Avatars' characteristics were defined using various game scores reported on WoW's official website, and two self-selected samples from previous studies were compared with a randomly selected sample of avatars. We used scores linked to 1240 avatars (762 from the self-selected samples and 478 from the random sample). The two self-selected samples of avatars had higher scores on most of the assessed variables (except for guild membership and exploration). Furthermore, some guilds were overrepresented in the self-selected samples. Our results suggest that more proficient players, or players more involved in the game, may be more likely to participate in online surveys. Caution is needed in the interpretation of studies based on online surveys that used a self-selection recruitment procedure. Epidemiological evidence on the reduced representativeness of samples in online surveys is warranted.

  16. Sampling strategy for a large scale indoor radiation survey - a pilot project

    International Nuclear Information System (INIS)

    Strand, T.; Stranden, E.

    1986-01-01

    Optimisation of a stratified random sampling strategy for large scale indoor radiation surveys is discussed. It is based on the results from a small scale pilot project where variances in dose rates within different categories of houses were assessed. By selecting a predetermined precision level for the mean dose rate in a given region, the number of measurements needed can be optimised. The results of a pilot project in Norway are presented together with the development of the final sampling strategy for a planned large scale survey. (author)
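The optimisation described above (choosing a total sample size from pilot-study variances within house categories so that the mean dose rate reaches a predetermined precision) corresponds to the textbook Neyman (optimal) allocation for stratified sampling. A sketch follows; the housing strata, counts, and dose-rate standard deviations are made up for illustration:

```python
import math

def neyman_allocation(strata, target_se):
    """Total sample size and per-stratum allocation (Neyman/optimal) needed to
    estimate a population mean with standard error `target_se`.
    `strata` maps stratum name -> (number of houses N_h, dose-rate std dev S_h)."""
    N = sum(n for n, _ in strata.values())
    weighted = {k: n * s for k, (n, s) in strata.items()}
    total_ns = sum(weighted.values())
    # Neyman: n_total = (sum N_h S_h)^2 / (N^2 V + sum N_h S_h^2), with V = target_se^2
    n_total = total_ns ** 2 / (N ** 2 * target_se ** 2 +
                               sum(n * s ** 2 for n, s in strata.values()))
    # Allocate proportionally to N_h S_h (more measurements in variable strata).
    alloc = {k: math.ceil(n_total * w / total_ns) for k, w in weighted.items()}
    return math.ceil(n_total), alloc

# Hypothetical house categories: (count, std dev of dose rate, nGy/h)
strata = {"wood": (60_000, 40.0), "concrete": (25_000, 55.0), "brick": (15_000, 30.0)}
n_total, alloc = neyman_allocation(strata, target_se=2.0)
```

Tightening `target_se` increases `n_total` quadratically, which is the trade-off a pilot project of this kind is designed to quantify.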

  17. A sample of galaxy pairs identified from the LAMOST spectral survey and the Sloan Digital Sky Survey

    International Nuclear Information System (INIS)

    Shen, Shi-Yin; Argudo-Fernández, Maria; Chen, Li; Feng, Shuai; Hou, Jin-Liang; Shao, Zheng-Yi; Chen, Xiao-Yan; Luo, A-Li; Wu, Hong; Yang, Hai-Feng; Yang, Ming; Hou, Yong-Hui; Wang, Yue-Fei; Jiang, Peng; Wang, Ting-Gui; Jing, Yi-Peng; Kong, Xu; Wang, Wen-Ting; Luo, Zhi-Jian; Wu, Xue-Bing

    2016-01-01

A small fraction (< 10%) of the SDSS main galaxy (MG) sample has not been targeted with spectroscopy due to the effect of fiber collisions. These galaxies have been compiled into the input catalog of the LAMOST ExtraGAlactic Surveys and named the complementary galaxy sample. In this paper, we introduce this project and the status of the spectroscopy of the complementary galaxies in the first two years of the LAMOST spectral survey (until Sep. 2014). Moreover, we present a sample of 1102 galaxy pairs identified from the LAMOST complementary galaxies and SDSS MGs, which are defined as two members that have a projected distance smaller than 100 h₇₀⁻¹ kpc and a recessional velocity difference smaller than 500 km s⁻¹. Compared with galaxy pairs selected only from SDSS, the LAMOST-SDSS pairs have the advantage of not being biased toward large separations and therefore act as a useful supplement in statistical studies of galaxy interaction and galaxy merging.

  18. Design compliance matrix waste sample container filling system for nested, fixed-depth sampling system

    International Nuclear Information System (INIS)

    BOGER, R.M.

    1999-01-01

    This design compliance matrix document provides specific design related functional characteristics, constraints, and requirements for the container filling system that is part of the nested, fixed-depth sampling system. This document addresses performance, external interfaces, ALARA, Authorization Basis, environmental and design code requirements for the container filling system. The container filling system will interface with the waste stream from the fluidic pumping channels of the nested, fixed-depth sampling system and will fill containers with waste that meet the Resource Conservation and Recovery Act (RCRA) criteria for waste that contains volatile and semi-volatile organic materials. The specifications for the nested, fixed-depth sampling system are described in a Level 2 Specification document (HNF-3483, Rev. 1). The basis for this design compliance matrix document is the Tank Waste Remediation System (TWRS) desk instructions for design Compliance matrix documents (PI-CP-008-00, Rev. 0)

  19. Sample survey methods as a quality assurance tool in a general practice immunisation audit.

    Science.gov (United States)

    Cullen, R

    1994-04-27

In a multidoctor family practice there are often just too many sets of patient records to make it practical to repeat an audit by census, even of one age band of the practice, on a regular basis. This paper attempts to demonstrate how sample survey methodology can be incorporated into the quality assurance cycle. A simple random sample (with replacement) of 120 of the 580 children with permanent records who were aged between 6 weeks and 2 years old was drawn from an Auckland general practice, with the sample size selected to give a predetermined precision. The survey was then repeated after 4 weeks. Both surveys were completed within the course of a normal working day. An unexpectedly low proportion of under-2-year-olds was recorded as not overdue for any immunisations (22.5%), with only a modest improvement after a standard telephone/letter catch-up campaign. Seventy-two percent of the sample held a group one community services card. The advantages of properly conducted sample surveys in producing useful estimates of known precision, without disrupting office routines excessively, were demonstrated. With some attention to methodology, the trauma of a practice census can be avoided.
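The "sample size selected to give a predetermined precision" step can be illustrated with the standard formula for estimating a proportion; the 8-percentage-point margin below is an assumption chosen for illustration, not taken from the paper:

```python
import math

def sample_size_for_proportion(margin, p=0.5, z=1.96, population=None):
    """Simple-random-sample size giving a ~95% confidence interval of
    +/- `margin` around an estimated proportion (worst case p = 0.5)."""
    n0 = z ** 2 * p * (1 - p) / margin ** 2
    if population is not None:
        n0 = n0 / (1 + (n0 - 1) / population)  # finite-population correction
    return math.ceil(n0)

# A margin of about +/-8 percentage points in a register of 580 children:
n = sample_size_for_proportion(margin=0.08, population=580)
```

With these illustrative inputs the formula yields a sample of roughly the size used in the audit, showing how a modest sample can deliver estimates of known precision from a practice register.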

  20. The U.S. Geological Survey Geologic Collections Management System (GCMS)—A master catalog and collections management plan for U.S. Geological Survey geologic samples and sample collections

    Science.gov (United States)

    ,

    2015-01-01

    The U.S. Geological Survey (USGS) is widely recognized in the earth science community as possessing extensive collections of earth materials collected by research personnel over the course of its history. In 2006, a Geologic Collections Inventory was conducted within the USGS Geology Discipline to determine the extent and nature of its sample collections, and in 2008, a working group was convened by the USGS National Geologic and Geophysical Data Preservation Program to examine ways in which these collections could be coordinated, cataloged, and made available to researchers both inside and outside the USGS. The charge to this working group was to evaluate the proposition of creating a Geologic Collections Management System (GCMS), a centralized database that would (1) identify all existing USGS geologic collections, regardless of size, (2) create a virtual link among the collections, and (3) provide a way for scientists and other researchers to obtain access to the samples and data in which they are interested. Additionally, the group was instructed to develop criteria for evaluating current collections and to establish an operating plan and set of standard practices for handling, identifying, and managing future sample collections. Policies and procedures promoted by the GCMS would be based on extant best practices established by the National Science Foundation and the Smithsonian Institution. The resulting report—USGS Circular 1410, “The U.S. Geological Survey Geologic Collections Management System (GCMS): A Master Catalog and Collections Management Plan for U.S. Geological Survey Geologic Samples and Sample Collections”—has been developed for sample repositories to be a guide to establishing common practices in the collection, retention, and disposal of geologic research materials throughout the USGS.

  1. On Survey Data Analysis in Corporate Finance

    OpenAIRE

    Serita, Toshio

    2008-01-01

Recently, survey data analysis has emerged as a new method for testing hypotheses and for clarifying the relative importance of different factors in corporate finance decisions. This paper investigates the advantages and drawbacks of survey data analysis, the methodology of survey data analysis such as questionnaire design, and analytical methods for survey data, in comparison with traditional large sample analysis. We show that survey data analysis does not replace traditional large sample analysi...

  2. Sampling design and procedures for fixed surface-water sites in the Georgia-Florida coastal plain study unit, 1993

    Science.gov (United States)

    Hatzell, H.H.; Oaksford, E.T.; Asbury, C.E.

    1995-01-01

The implementation of design guidelines for the National Water-Quality Assessment (NAWQA) Program has resulted in the development of new sampling procedures and the modification of existing procedures commonly used in the Water Resources Division of the U.S. Geological Survey. The Georgia-Florida Coastal Plain (GAFL) study unit began the intensive data collection phase of the program in October 1992. This report documents the implementation of the NAWQA guidelines by describing the sampling design and procedures for collecting surface-water samples in the GAFL study unit in 1993. This documentation is provided for agencies that use water-quality data and for future study units that will be entering the intensive phase of data collection. The sampling design is intended to account for large- and small-scale spatial variations, and temporal variations, in water quality for the study area. Nine fixed sites were selected in drainage basins of different sizes and different land-use characteristics located in different land-resource provinces. Each of the nine fixed sites was sampled regularly for a combination of six constituent groups composed of physical and chemical constituents: field measurements, major ions and metals, nutrients, organic carbon, pesticides, and suspended sediments. Some sites were also sampled during high-flow conditions and storm events. Discussion of the sampling procedure is divided into three phases: sample collection, sample splitting, and sample processing. A cone splitter was used to split water samples for the analysis of the sampling constituent groups except organic carbon from approximately nine liters of stream water collected at four fixed sites that were sampled intensively. An example of the sample splitting schemes designed to provide the sample volumes required for each sample constituent group is described in detail. Information about onsite sample processing has been organized into a flowchart that describes a pathway for each of

  3. Use of methods for specifying the target difference in randomised controlled trial sample size calculations: Two surveys of trialists' practice.

    Science.gov (United States)

    Cook, Jonathan A; Hislop, Jennifer M; Altman, Doug G; Briggs, Andrew H; Fayers, Peter M; Norrie, John D; Ramsay, Craig R; Harvey, Ian M; Vale, Luke D

    2014-06-01

    Central to the design of a randomised controlled trial (RCT) is a calculation of the number of participants needed. This is typically achieved by specifying a target difference, which enables the trial to identify a difference of a particular magnitude should one exist. Seven methods have been proposed for formally determining what the target difference should be. However, in practice, it may be driven by convenience or some other informal basis. It is unclear how aware the trialist community is of these formal methods or whether they are used. To determine current practice regarding the specification of the target difference by surveying trialists. Two surveys were conducted: (1) Members of the Society for Clinical Trials (SCT): participants were invited to complete an online survey through the society's email distribution list. Respondents were asked about their awareness, use of, and willingness to recommend methods; (2) Leading UK- and Ireland-based trialists: the survey was sent to UK Clinical Research Collaboration registered Clinical Trials Units, Medical Research Council UK Hubs for Trial Methodology Research, and the Research Design Services of the National Institute for Health Research. This survey also included questions about the most recent trial developed by the respondent's group. Survey 1: Of the 1182 members on the SCT membership email distribution list, 180 responses were received (15%). Awareness of methods ranged from 69 (38%) for health economic methods to 162 (90%) for pilot study. Willingness to recommend among those who had used a particular method ranged from 56% for the opinion-seeking method to 89% for the review of evidence-base method. Survey 2: Of the 61 surveys sent out, 34 (56%) responses were received. Awareness of methods ranged from 33 (97%) for the review of evidence-base and pilot methods to 14 (41%) for the distribution method. The highest level of willingness to recommend among users was for the anchor method (87%). 
Based upon

  4. THE ALFALFA H α SURVEY. I. PROJECT DESCRIPTION AND THE LOCAL STAR FORMATION RATE DENSITY FROM THE FALL SAMPLE

    International Nuclear Information System (INIS)

    Sistine, Angela Van; Salzer, John J.; Janowiecki, Steven; Sugden, Arthur; Giovanelli, Riccardo; Haynes, Martha P.; Jaskot, Anne E.; Wilcots, Eric M.

    2016-01-01

    The ALFALFA H α survey utilizes a large sample of H i-selected galaxies from the ALFALFA survey to study star formation (SF) in the local universe. ALFALFA H α contains 1555 galaxies with distances between ∼20 and ∼100 Mpc. We have obtained continuum-subtracted narrowband H α images and broadband R images for each galaxy, creating one of the largest homogeneous sets of H α images ever assembled. Our procedures were designed to minimize the uncertainties related to the calculation of the local SF rate density (SFRD). The galaxy sample we constructed is as close to volume-limited as possible, is a robust statistical sample, and spans a wide range of galaxy environments. In this paper, we discuss the properties of our Fall sample of 565 galaxies, our procedure for deriving individual galaxy SF rates, and our method for calculating the local SFRD. We present a preliminary value of log(SFRD[ M ⊙ yr −1 Mpc −3 ]) = −1.747 ± 0.018 (random) ±0.05 (systematic) based on the 565 galaxies in our Fall sub-sample. Compared to the weighted average of SFRD values around z ≈ 2, our local value indicates a drop in the global SFRD of a factor of 10.2 over that lookback time.
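
    As a quick consistency check (a sketch, not from the paper): the quoted drop of a factor of 10.2 follows directly from the difference between two log-space SFRD values. The z ≈ 2 value of log(SFRD) ≈ −0.74 used below is an assumed figure chosen to be consistent with the quoted factor, not a number taken from the abstract.

```python
# Hedged sketch: how a SFRD "drop factor" follows from two log10 values.
# log_sfrd_z2 is an ASSUMED z ~ 2 weighted average, not quoted in the abstract.
log_sfrd_local = -1.747   # log10(SFRD) from the ALFALFA H-alpha Fall sample
log_sfrd_z2 = -0.74       # assumed weighted average near cosmic noon

drop_factor = 10 ** (log_sfrd_z2 - log_sfrd_local)
print(f"SFRD drop since z ~ 2: factor of {drop_factor:.1f}")
```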

  5. 36 CFR 9.42 - Well records and reports, plots and maps, samples, tests and surveys.

    Science.gov (United States)

    2010-07-01

    36 CFR 9.42 (Title 36, Parks, Forests, and Public Property; revised as of 2010-07-01) — Well records and reports, plots and maps, samples, tests and surveys: any technical data gathered...

  6. System design description for sampling fuel in K basins

    International Nuclear Information System (INIS)

    Baker, R.B.

    1996-01-01

    This System Design Description provides: (1) statements of the Spent Nuclear Fuel Project's (SNFP) needs that require sampling of fuel in the K East and K West Basins, (2) the sampling equipment functions and requirements, (3) a general work plan and the design logic being followed to develop the equipment, and (4) a summary description of the design of the sampling equipment. The report summarizes the integrated application of both the subject equipment and the canister sludge sampler in near-term characterization campaigns at the K Basins.

  7. Sampling design for use by the soil decontamination project

    International Nuclear Information System (INIS)

    Rutherford, D.W.; Stevens, J.R.

    1981-01-01

    This report proposes a general approach to the problem and discusses sampling of soil both to map the contaminated area and to provide samples for characterization of soil components and contamination. Basic concepts in sample design are reviewed with reference to environmental transuranic studies. Common designs are reviewed and evaluated for use with the specific objectives that might be required by the soil decontamination project. Examples of a hierarchical design pilot study and a combined hierarchical and grid study are proposed for the Rocky Flats 903 pad area.

  8. An investigation of the effects of relevant samples and a comparison of verification versus discovery based lab design

    Science.gov (United States)

    Rieben, James C., Jr.

    This study focuses on the effects of relevance and lab design on student learning within the chemistry laboratory environment. A general chemistry conductivity of solutions experiment and an upper level organic chemistry cellulose regeneration experiment were employed. In the conductivity experiment, the two main variables studied were the effect of relevant (or "real world") samples on student learning and a verification-based lab design versus a discovery-based lab design. With the cellulose regeneration experiment, the effect of a discovery-based lab design vs. a verification-based lab design was the sole focus. Evaluation surveys consisting of six questions were used at three different times to assess student knowledge of experimental concepts. In the general chemistry laboratory portion of this study, four experimental variants were employed to investigate the effect of relevance and lab design on student learning. These variants consisted of a traditional (or verification) lab design, a traditional lab design using "real world" samples, a new lab design employing real world samples/situations using unknown samples, and the new lab design using real world samples/situations that were known to the student. Data used in this analysis were collected during the Fall 08, Winter 09, and Fall 09 terms. For the second part of this study a cellulose regeneration experiment was employed to investigate the effects of lab design. A demonstration creating regenerated cellulose "rayon" was modified and converted to an efficient and low-waste experiment. In the first variant students tested their products and verified a list of physical properties. In the second variant, students filled in a blank physical property chart with their own experimental results for the physical properties. 
Results from the conductivity experiment show significant student learning of the effects of concentration on conductivity and how to use conductivity to differentiate solution types with the

  9. Design and evaluation of representative indoor radon surveys

    International Nuclear Information System (INIS)

    Csige, I.; Csegzi, S.

    2004-01-01

    We have developed a procedure to design and evaluate representative indoor radon surveys. The procedure is based on random sampling of a population of houses and careful statistical analysis of the measured indoor radon concentrations. The method is designed to estimate the fraction of houses in which the annual average ²²²Rn activity concentration may exceed a certain reference level. Measurements of the annual average indoor ²²²Rn activity concentration were made in sleeping rooms at pillow level using etched-track radon detectors. We applied the procedure in an old-fashioned village and in a fast-developing small city in Transylvania, Romania. In the village, almost all houses were single-storey wooden houses without a cellar, built with traditional technology on a geologically uniform area. The distribution of indoor ²²²Rn activity concentration in a sample of 115 houses can be fitted almost perfectly with a log-normal probability density function; the correlation coefficient of a linear fit on linearized scales was k = -0.9980. The percentage of houses expected to have an annual average ²²²Rn activity concentration higher than 400 Bq m⁻³ is less than 1%, and the percentage higher than 600 Bq m⁻³ is estimated to be around 0.1%. The small city, on the other hand, lies on a geologically inhomogeneous area, and house construction technology there has also changed dramatically in past decades. The resulting distribution of measured indoor ²²²Rn activity concentration in a sample of 116 houses cannot be fitted with any simple probability density function, so the fraction of houses in which the annual average ²²²Rn activity concentration may exceed a certain reference level could not be predicted adequately. 
    With certain assumptions, we estimated that the percentage of houses expected to have an annual average ²²²Rn activity concentration higher than 400 Bq m⁻³ is between 3 and 7%, and of those higher than 600 Bq m⁻³ can be estimated to be between
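
    The village-sample analysis (fit a log-normal distribution, then estimate the tail fraction above a reference level) can be sketched as follows. The geometric mean and geometric standard deviation used here are illustrative assumptions, not the fitted values from the Transylvanian survey.

```python
import math

# Hedged sketch: estimate the fraction of houses exceeding a reference level
# from a fitted log-normal distribution. GM and GSD below are ILLUSTRATIVE,
# not the fitted values from the survey.
gm = 60.0    # assumed geometric mean of annual Rn-222 concentration, Bq/m^3
gsd = 2.2    # assumed geometric standard deviation

def fraction_exceeding(level, gm, gsd):
    """P(X > level) for X log-normal with geometric mean gm and geometric SD gsd."""
    z = (math.log(level) - math.log(gm)) / math.log(gsd)
    # Survival function of the standard normal via the complementary error function
    return 0.5 * math.erfc(z / math.sqrt(2))

for level in (400, 600):
    print(f"fraction above {level} Bq/m^3: {fraction_exceeding(level, gm, gsd):.4f}")
```

With these illustrative parameters the exceedance fraction at 400 Bq m⁻³ comes out below 1%, in line with the village result quoted above.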

  10. Sample size reduction in groundwater surveys via sparse data assimilation

    KAUST Repository

    Hussain, Z.; Muhammad, A.

    2013-01-01

    In this paper, we focus on sparse signal recovery methods for data assimilation in groundwater models. The objective of this work is to exploit the commonly understood spatial sparsity in hydrodynamic models and thereby reduce the number of measurements needed to image a dynamic groundwater profile. To achieve this we employ a Bayesian compressive sensing framework that lets us adaptively select the next measurement to reduce the estimation error. An extension to the Bayesian compressive sensing framework is also proposed which incorporates additional model information to estimate system states from even fewer measurements. Instead of using cumulative imaging-like measurements, such as those used in standard compressive sensing, we use sparse binary matrices. This choice of measurements can be interpreted as randomly sampling only a small subset of dug wells at each time step, instead of sampling the entire grid. Therefore, this framework offers groundwater surveyors a significant reduction in surveying effort without compromising the quality of the survey. © 2013 IEEE.
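
    A minimal sketch of the measurement model described above: each measurement reads a small random subset of wells (a sparse binary row of the measurement matrix), and a sparse-recovery routine reconstructs the full profile. Generic orthogonal matching pursuit is used here as a stand-in for the paper's Bayesian compressive sensing machinery; all sizes and parameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, k = 100, 50, 3          # grid cells, measurements, nonzero (anomalous) cells

# Ground-truth sparse groundwater anomaly profile
x_true = np.zeros(n)
x_true[rng.choice(n, size=k, replace=False)] = rng.uniform(1.0, 3.0, size=k)

# Sparse binary measurement matrix: each row "samples" 10 random wells
Phi = np.zeros((m, n))
for i in range(m):
    Phi[i, rng.choice(n, size=10, replace=False)] = 1.0
y = Phi @ x_true              # one measurement per visited subset of wells

def omp(Phi, y, k):
    """Greedy orthogonal matching pursuit for k-sparse recovery."""
    norms = np.linalg.norm(Phi, axis=0)
    residual, support = y.copy(), []
    for _ in range(k):
        # Pick the column most correlated with the residual (norm-adjusted)
        corr = np.abs(Phi.T @ residual) / np.maximum(norms, 1e-12)
        support.append(int(np.argmax(corr)))
        coef, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
        residual = y - Phi[:, support] @ coef
    x_hat = np.zeros(Phi.shape[1])
    x_hat[support] = coef
    return x_hat

x_hat = omp(Phi, y, k)
print("max reconstruction error:", np.max(np.abs(x_hat - x_true)))
```

Here 50 subset measurements recover a 3-sparse profile over 100 cells, illustrating why far fewer wells need to be visited than there are grid cells.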


  12. Design and development of multiple sample counting setup

    International Nuclear Information System (INIS)

    Rath, D.P.; Murali, S.; Babu, D.A.R.

    2010-01-01

    Full text: The analysis of active samples on a regular basis, for ambient air activity and floor contamination from the radiochemical lab, accounts for a major share of the operational activity for which the Health Physicist is responsible. The requirement for daily air sample analysis, with immediate and delayed counting of samples from various labs in addition to smear-swipe checks of the labs, motivated the development of a system that could carry out multiple sample analyses in a time-programmed manner from a single sample loading. A multiple alpha/beta counting system was designed and fabricated. It has arrangements for loading 10 samples in ordered slots, which are counted in a time-programmed manner with the results displayed and records maintained on a PC. The paper describes the design and development of the multiple sample counting setup presently in use at the facility, which has reduced the man-hours consumed in counting and recording the results.

  13. Research Note Pilot survey to assess sample size for herbaceous ...

    African Journals Online (AJOL)

    A pilot survey to determine sub-sample size (number of point observations per plot) for herbaceous species composition assessments, using a wheel-point apparatus applying the nearest-plant method, was conducted. Three plots differing in species composition on the Zululand coastal plain were selected, and on each plot ...

  14. Design and operation of the National Survey of Children with Special Health Care Needs, 2009-2010.

    Science.gov (United States)

    Bramlett, Matthew D; Blumberg, Stephen J; Ormson, A Elizabeth; George, Jacquelyn M; Williams, Kim L; Frasier, Alicia M; Skalland, Benjamin J; Santos, Kathleen B; Vsetecka, Danielle M; Morrison, Heather M; Pedlow, Steven; Wang, Fang

    2014-11-01

    This report presents the development, plan, and operation of the 2009-2010 National Survey of Children with Special Health Care Needs, a module of the State and Local Area Integrated Telephone Survey. The survey is conducted by the Centers for Disease Control and Prevention's National Center for Health Statistics. It was designed to produce national and state-specific prevalence estimates of children with special health care needs (CSHCN), to describe the types of services that they need and use, and to assess aspects of the system of care for CSHCN. A random-digit-dial sample of households with children under age 18 years was constructed for each of the 50 states and the District of Columbia. The sampling frame consisted of landline phone numbers and cellular (cell) phone numbers of households that reported a cell-phone-only or cell-phone-mainly status. Children in identified households were screened for special health care needs. If CSHCN were identified in the household, a detailed interview was conducted for one randomly selected child with special health care needs. Respondents were parents or guardians who knew about the children's health and health care. A total of 196,159 household screening interviews were completed from July 2009 through March 2011, resulting in 40,242 completed special-needs interviews, including 2,991 from cell-phone interviews. The weighted response rate was 43.7% for the landline sample, 15.2% for the cell-phone sample, and 25.5% overall. All material appearing in this report is in the public domain and may be reproduced or copied without permission; citation as to source, however, is appreciated.

  15. Large Synoptic Survey Telescope: From Science Drivers to Reference Design

    Energy Technology Data Exchange (ETDEWEB)

    Ivezic, Z.; Axelrod, T.; Brandt, W.N.; Burke, D.L.; Claver, C.F.; Connolly, A.; Cook, K.H.; Gee, P.; Gilmore, D.K.; Jacoby, S.H.; Jones, R.L.; Kahn, S.M.; Kantor, J.P.; Krabbendam, V.; Lupton, R.H.; Monet, D.G.; Pinto, P.A.; Saha, A.; Schalk, T.L.; Schneider, D.P.; Strauss, Michael A.; /Washington U., Seattle, Astron. Dept. /LSST Corp. /Penn State U., Astron. Astrophys. /KIPAC, Menlo Park /NOAO, Tucson /LLNL, Livermore /UC, Davis /Princeton U., Astrophys. Sci. Dept. /Naval Observ., Flagstaff /Arizona U., Astron. Dept. - Steward Observ. /UC, Santa Cruz /Harvard U. /Johns Hopkins U. /Illinois U., Urbana

    2011-10-14

    In the history of astronomy, major advances in our understanding of the Universe have come from dramatic improvements in our ability to accurately measure astronomical quantities. Aided by rapid progress in information technology, current sky surveys are changing the way we view and study the Universe. Next-generation surveys will maintain this revolutionary progress. We focus here on the most ambitious survey currently planned in the visible band, the Large Synoptic Survey Telescope (LSST). LSST will have unique survey capability in the faint time domain. The LSST design is driven by four main science themes: constraining dark energy and dark matter, taking an inventory of the Solar System, exploring the transient optical sky, and mapping the Milky Way. It will be a large, wide-field ground-based system designed to obtain multiple images covering the sky that is visible from Cerro Pachon in Northern Chile. The current baseline design, with an 8.4 m (6.5 m effective) primary mirror, a 9.6 deg² field of view, and a 3,200 Megapixel camera, will allow about 10,000 square degrees of sky to be covered using pairs of 15-second exposures in two photometric bands every three nights on average. The system is designed to yield high image quality, as well as superb astrometric and photometric accuracy. The survey area will include 30,000 deg² with δ < +34.5°, and will be imaged multiple times in six bands, ugrizy, covering the wavelength range 320-1050 nm. About 90% of the observing time will be devoted to a deep-wide-fast survey mode which will observe a 20,000 deg² region about 1000 times in the six bands during the anticipated 10 years of operation. These data will result in databases including 10 billion galaxies and a similar number of stars, and will serve the majority of science programs. The remaining 10% of the observing time will be allocated to special programs such as Very Deep and Very Fast time domain surveys. We describe how the

  16. Large Synoptic Survey Telescope: From science drivers to reference design

    Directory of Open Access Journals (Sweden)

    Ivezić Ž.

    2008-01-01

    Full Text Available In the history of astronomy, major advances in our understanding of the Universe have come from dramatic improvements in our ability to accurately measure astronomical quantities. Aided by rapid progress in information technology, current sky surveys are changing the way we view and study the Universe. Next-generation surveys will maintain this revolutionary progress. We focus here on the most ambitious survey currently planned in the visible band, the Large Synoptic Survey Telescope (LSST). LSST will have unique survey capability in the faint time domain. The LSST design is driven by four main science themes: constraining dark energy and dark matter, taking an inventory of the Solar System, exploring the transient optical sky, and mapping the Milky Way. It will be a large, wide-field ground-based system designed to obtain multiple images covering the sky that is visible from Cerro Pachon in Northern Chile. The current baseline design, with an 8.4 m (6.5 m effective) primary mirror, a 9.6 deg² field of view, and a 3,200 Megapixel camera, will allow about 10,000 square degrees of sky to be covered using pairs of 15-second exposures in two photometric bands every three nights on average. The system is designed to yield high image quality, as well as superb astrometric and photometric accuracy. The survey area will include 30,000 deg² with δ < +34.5°, and will be imaged multiple times in six bands, ugrizy, covering the wavelength range 320-1050 nm. About 90% of the observing time will be devoted to a deep-wide-fast survey mode which will observe a 20,000 deg² region about 1000 times in the six bands during the anticipated 10 years of operation. These data will result in databases including 10 billion galaxies and a similar number of stars, and will serve the majority of science programs. The remaining 10% of the observing time will be allocated to special programs such as Very Deep and Very Fast time domain surveys. We describe how the LSST


  18. Designing an enhanced groundwater sample collection system

    International Nuclear Information System (INIS)

    Schalla, R.

    1994-10-01

    As part of an ongoing technical support mission to achieve excellence and efficiency in environmental restoration activities at the Laboratory for Energy and Health-Related Research (LEHR), Pacific Northwest Laboratory (PNL) provided guidance on the design and construction of monitoring wells and identified the most suitable type of groundwater sampling pump and accessories for monitoring wells. The goal was to utilize a monitoring well design that would allow for hydrologic testing and reduce turbidity to minimize the impact of sampling. The sampling results of the newly designed monitoring wells were clearly superior to those of the previously installed monitoring wells. The new wells exhibited reduced turbidity, in addition to improved access for instrumentation and hydrologic testing. The variable-frequency submersible pump was selected as the best choice for obtaining groundwater samples. The literature references are listed at the end of this report. Despite some initial difficulties, the variable-frequency submersible pump and its accessories were effective in reducing sampling time and labor costs, and their ease of use was preferred over the previously used bladder pumps. The surface seal system, called the Dedicator, proved to be a useful accessory that prevents surface contamination while providing easy access for water-level measurements and for connecting the pump. Cost savings resulted from the use of the pre-production pumps (beta units) donated by the manufacturer for the demonstration; however, larger savings resulted from shortened field time due to the ease of using the submersible pumps and the surface seal access system. Proper deployment of the monitoring wells also resulted in cost savings and ensured representative samples.

  19. Quality Control in Survey Design: Evaluating a Survey of Educators’ Attitudes Concerning Differentiated Compensation

    OpenAIRE

    Kelly D. Bradley; Michael Peabody; Shannon O. Sampson

    2015-01-01

    This study utilized the Rasch model to assess the quality of a survey instrument designed to measure attitudes of administrators and teachers concerning a differentiated teacher compensation program piloted in Kentucky.  Researchers addressing potentially contentious issues should ensure their methods stand up to rigorous criticism.  The results indicate that the rating scale does not function as expected, with items being too easy to endorse.  Future iterations of this survey should be revis...

  20. Satellite-aided survey sampling and implementation in low- and middle-income contexts: a low-cost/low-tech alternative.

    Science.gov (United States)

    Haenssgen, Marco J

    2015-01-01

    The increasing availability of online maps, satellite imagery, and digital technology can ease common constraints of survey sampling in low- and middle-income countries. However, existing approaches require specialised software and user skills, professional GPS equipment, and/or commercial data sources; they tend to neglect spatial sampling considerations when using satellite maps; and they continue to face implementation challenges analogous to conventional survey implementation methods. This paper presents an alternative way of utilising satellite maps and digital aides that aims to address these challenges. The case studies of two rural household surveys in Rajasthan (India) and Gansu (China) compare conventional survey sampling and implementation techniques with the use of online map services such as Google, Bing, and HERE maps. Modern yet basic digital technology can be integrated into the processes of preparing, implementing, and monitoring a rural household survey. Satellite-aided systematic random sampling enhanced the spatial representativeness of the village samples and entailed savings of approximately £4000 compared to conventional household listing, while reducing the duration of the main survey by at least 25 %. This low-cost/low-tech satellite-aided survey sampling approach can be useful for student researchers and resource-constrained research projects operating in low- and middle-income contexts with high survey implementation costs. While achieving transparent and efficient survey implementation at low costs, researchers aiming to adopt a similar process should be aware of the locational, technical, and logistical requirements as well as the methodological challenges of this strategy.
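
    The "satellite-aided systematic random sampling" step can be sketched generically: given dwelling locations digitised from a satellite map, order them along a spatial axis, pick a random start, and take every k-th dwelling so the sample is spread across the settlement. The coordinates and sample size below are illustrative assumptions, not data from the Rajasthan or Gansu surveys.

```python
import random

# Illustrative dwelling coordinates digitised from a satellite map (x, y in metres).
random.seed(42)
dwellings = [(random.uniform(0, 2000), random.uniform(0, 2000)) for _ in range(240)]

def systematic_sample(points, n):
    """Systematic random sample of n points, ordered along the x-axis
    so selections are spread across the village rather than clustered."""
    ordered = sorted(points)              # sort by x, then y
    step = len(ordered) / n               # sampling interval
    start = random.uniform(0, step)       # random start within the first interval
    return [ordered[int(start + i * step)] for i in range(n)]

sample = systematic_sample(dwellings, 24)
print(len(sample))  # → 24
```

The random start keeps the design probabilistic while the fixed interval enforces spatial spread, which is the representativeness property the abstract highlights.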

  1. An affordable proxy of representative national survey on radon concentration in dwellings: Design, organisation and preliminary evaluation of representativeness

    International Nuclear Information System (INIS)

    Antignani, Sara; Carelli, Vinicio; Cordedda, Carlo; Zonno, Fedele; Ampollini, Marco; Carpentieri, Carmela; Venoso, Gennaro; Bochicchio, Francesco

    2013-01-01

    comparison showed that, at national level, the Italian dwellings are quite well represented by the sample. - Highlights: ► In Italy a new national survey was designed and organised. ► A new approach for a proxy of a representative national survey was used. ► The aim is to estimate the frequency distribution of indoor radon concentration in all the 110 Italian Provinces. ► The realisation of this survey was feasible and affordable. ► Preliminary evaluation of sample representativeness was performed at national level

  2. Thermal probe design for Europa sample acquisition

    Science.gov (United States)

    Horne, Mera F.

    2018-01-01

    The planned lander missions to the surface of Europa will access samples from the subsurface of the ice in a search for signs of life. A small thermal drill (probe) is proposed to meet the sample requirement of the Science Definition Team's (SDT) report for the Europa mission. The probe is 2 cm in diameter and 16 cm in length and is designed to reach 10 cm below the surface and to collect five ice samples of approximately 7 cm³ each. The energy required to penetrate the top 10 cm of ice in a vacuum is approximately 26 Wh, and the energy required to melt 7 cm³ of ice is approximately 1.2 Wh. The requirement stated in the SDT report of collecting samples from five different sites can be accommodated with repeated use of the same thermal drill. For smaller sample sizes, a smaller probe of 1.0 cm in diameter with the same length of 16 cm could be utilized; it would require approximately 6.4 Wh to penetrate the top 10 cm of ice and 0.02 Wh to collect a 0.1 g sample. The thermal drill has the advantages of simplicity of design and operation and the ability to penetrate ice over a range of densities and hardness while maintaining sample integrity.
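
    The ~1.2 Wh figure for melting a 7 cm³ sample is consistent with a simple heat budget. The sketch below is a back-of-the-envelope check, not the study's calculation; the surface temperature and material constants are textbook assumptions.

```python
# Back-of-the-envelope check of the ~1.2 Wh melt energy for a 7 cm^3 sample.
# Assumed constants (not from the study):
volume_cm3 = 7.0
density = 0.917          # g/cm^3, water ice
cp = 2.0                 # J/(g*K), rough average specific heat of ice
latent_heat = 334.0      # J/g, heat of fusion of water ice
t_surface = 100.0        # K, assumed Europa near-surface ice temperature
t_melt = 273.15          # K

mass = volume_cm3 * density                    # ~6.4 g of ice
sensible = mass * cp * (t_melt - t_surface)    # warm the ice to the melting point
latent = mass * latent_heat                    # then melt it
total_wh = (sensible + latent) / 3600.0        # J -> Wh

print(f"melt energy: {total_wh:.2f} Wh")
```

Roughly half the budget goes to warming the cryogenic ice and half to the phase change, which is why the melt energy is an order of magnitude smaller than the 26 Wh penetration energy.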

  3. Technology, design and dementia: an exploratory survey of developers.

    Science.gov (United States)

    Jiancaro, Tizneem; Jaglal, Susan B; Mihailidis, Alex

    2017-08-01

    Despite worldwide surges in dementia, we still know relatively little about the design of home technologies that support this population. The purpose of this study was to investigate design considerations from the perspective of developers. Participants, including technical and clinical specialists, were recruited internationally and answered web-based survey questions comprising Likert-type responses with text entry options. Developers were queried on 23 technology acceptance characteristics and 24 design practices. In all, forty developers completed the survey. Concerning "technology acceptance", cost, learnability, self-confidence (during use) and usability were deemed very important. Concerning "design practice", developers overwhelmingly valued user-centred design (UCD). As for general assistive technology (AT) models, these were largely unknown to technical specialists compared to clinical specialists. Recommendations based on this study include incorporating "self-confidence" into design protocols; examining the implications of "usability" and UCD in this context; and considering empathy-based design approaches to suit a diverse user population. Moreover, clinical specialists have much to offer development teams, particularly concerning the use of conceptual AT models. Implications for rehabilitation: Stipulate precise usability criteria. Consider "learnability" and "self-confidence" as technology adoption criteria. Recognize the important theoretical role that clinical specialists can fulfil concerning the use of design schemas. Acknowledge the diversity amongst users with dementia, potentially adopting techniques such as designing for "extraordinary users".

  4. A stratified two-stage sampling design for digital soil mapping in a Mediterranean basin

    Science.gov (United States)

    Blaschek, Michael; Duttmann, Rainer

    2015-04-01

    The quality of environmental modelling results often depends on reliable soil information. In order to obtain soil data in an efficient manner, several sampling strategies are at hand, depending on the level of prior knowledge and the overall objective of the planned survey. This study focuses on the collection of soil samples considering available continuous secondary information in an undulating, 16 km² river catchment near Ussana in southern Sardinia (Italy). A design-based, stratified, two-stage sampling design was applied, aiming at the spatial prediction of soil property values at individual locations. The stratification was based on quantiles from the density functions of two land-surface parameters, topographic wetness index and potential incoming solar radiation, derived from a digital elevation model. Combined with four main geological units, this procedure led to 30 different classes in the given test site. Up to six polygons of each available class were selected randomly, excluding areas smaller than 1 ha to avoid incorrect location of the points in the field. Further exclusion rules were applied before polygon selection, masking out roads and buildings using a 20 m buffer. The selection procedure was repeated ten times, and the set of polygons with the best geographical spread was chosen. Finally, exact point locations were selected randomly from inside the chosen polygon features. A second selection based on the same stratification and following the same methodology (selecting one polygon instead of six) was made in order to create an appropriate validation set. Supplementary samples were obtained during a second survey focusing on polygons that either had not been considered at all during the first phase or were not adequately represented with respect to feature size. In total, both field campaigns produced an interpolation set of 156 samples and a validation set of 41 points. 
The selection of sample point locations has been done using
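
    The stratification step described above (quantile classes of two terrain covariates crossed with geological units, then random first-stage selection within each stratum) can be sketched as follows. The arrays, grid size, and class counts are illustrative, not the Ussana data.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative raster covariates for a small grid (not the Ussana data):
twi = rng.normal(8.0, 2.0, size=1000)         # topographic wetness index
solar = rng.normal(5000.0, 800.0, size=1000)  # potential incoming solar radiation
geology = rng.integers(0, 4, size=1000)       # four main geological units

def quantile_class(values, n_classes):
    """Assign each cell to a quantile-based class (0 .. n_classes-1)."""
    edges = np.quantile(values, np.linspace(0, 1, n_classes + 1)[1:-1])
    return np.searchsorted(edges, values)

# Cross 3 TWI classes x 3 solar classes x 4 geology units -> up to 36 strata
strata = quantile_class(twi, 3) * 12 + quantile_class(solar, 3) * 4 + geology

# First stage: randomly select up to two cells (standing in for polygons) per stratum
sample_idx = []
for s in np.unique(strata):
    members = np.flatnonzero(strata == s)
    sample_idx.extend(rng.choice(members, size=min(2, members.size), replace=False))

print(len(np.unique(strata)), len(sample_idx))
```

Exact point placement within each selected polygon would follow as a second random stage, mirroring the two-stage design in the abstract.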

  5. [Study of spatial stratified sampling strategy of Oncomelania hupensis snail survey based on plant abundance].

    Science.gov (United States)

    Xun-Ping, W; An, Z

    2017-07-27

    Objective To optimize and simplify the survey method of Oncomelania hupensis snails in marshland endemic regions of schistosomiasis, so as to improve the precision, efficiency and economy of the snail survey. Methods A snail sampling strategy (Spatial Sampling Scenario of Oncomelania based on Plant Abundance, SOPA), which took plant abundance as an auxiliary variable, was explored in an experimental study in a 50 m×50 m plot in a marshland in the Poyang Lake region. Firstly, the push-broom survey data were stratified into 5 layers by the plant abundance data; then, the required number of optimal sampling points for each layer was calculated through the Hammond-McCullagh equation; thirdly, every sample point was pinpointed in line with the Multiple Directional Interpolation (MDI) placement scheme; and finally, a comparison study was performed among the outcomes of the spatial random sampling strategy, the traditional systematic sampling method, the spatial stratified sampling method, Sandwich spatial sampling and inference, and SOPA. Results The method (SOPA) proposed in this study had the minimal absolute error of 0.2138, while the traditional systematic sampling method had the largest estimate, with an absolute error of 0.9244. Conclusion The snail sampling strategy (SOPA) proposed in this study obtains higher estimation accuracy than the other four methods.
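As a rough illustration of allocating sample points across abundance strata, the sketch below uses classical Neyman allocation as a stand-in for the Hammond-McCullagh equation used in the study (which is not reproduced here); all stratum sizes and standard deviations are invented:

```python
import numpy as np

# Illustrative per-stratum summaries for 5 plant-abundance layers
# (values are made up for this sketch).
N_h = np.array([400, 300, 150, 100, 50])   # units per stratum
S_h = np.array([0.5, 0.8, 1.2, 1.6, 2.0])  # within-stratum SD of snail density
n_total = 60                               # overall sampling budget

# Neyman allocation: n_h proportional to N_h * S_h, which minimises the
# variance of the stratified mean for a fixed total sample size.
weights = N_h * S_h
n_h = np.maximum(1, np.round(n_total * weights / weights.sum()).astype(int))
print(n_h, n_h.sum())  # → [14 16 12 11  7] 60
```

Variable strata (here, the denser vegetation layers with larger SDs) receive proportionally more points, which is the intuition behind stratifying on plant abundance in the first place.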

  6. Triangulation based inclusion probabilities: a design-unbiased sampling approach

    OpenAIRE

    Fehrmann, Lutz; Gregoire, Timothy; Kleinn, Christoph

    2011-01-01

    A probabilistic sampling approach for design-unbiased estimation of area-related quantitative characteristics of spatially dispersed population units is proposed. The developed field protocol includes a fixed number of 3 units per sampling location and is based on partial triangulations over their natural neighbors to derive the individual inclusion probabilities. The performance of the proposed design is tested in comparison to fixed area sample plots in a simulation with two forest stands. ...

  7. A new design of neutron survey instrument

    International Nuclear Information System (INIS)

    Tanner, R.J.; Eakins, J.S.; Hager, L.G.

    2010-01-01

    A novel design of neutron survey instrument has been developed. The moderator has been modified via the use of 'neutron guides', which help thermal neutrons reach the central proportional counter. This innovation has allowed the variations in the energy dependence of ambient dose equivalent response to be reduced compared to prior single-detector designs, whilst maintaining a relatively light moderator and simple construction. In particular, the design has a relatively small over-response to neutrons with energies around 5 keV, when compared to prior designs. The final optimized design has been verified using MCNP5 calculations to ensure that the response is relatively independent of the energy and direction of the incident neutron. This has required the ends of the guides to be structured so that unidirectional and isotropic neutron fields have closely matched responses, as is necessary in the workplace. The reading of the instrument in workplace fields is calculated via folding and the suitability of the design for use in the workplace discussed.

  8. THE ALFALFA H α SURVEY. I. PROJECT DESCRIPTION AND THE LOCAL STAR FORMATION RATE DENSITY FROM THE FALL SAMPLE

    Energy Technology Data Exchange (ETDEWEB)

    Sistine, Angela Van [Department of Physics, University of Wisconsin-Milwaukee, Milwaukee, WI 53211 (United States); Salzer, John J.; Janowiecki, Steven [Department of Astronomy, Indiana University, Bloomington, IN 47405 (United States); Sugden, Arthur [Department of Endocrinology, Beth Israel Deaconess Medical Center, Harvard Medical School, Boston, MA 02115 (United States); Giovanelli, Riccardo; Haynes, Martha P. [Center for Astrophysics and Planetary Science, Cornell University, Ithaca, NY 14853 (United States); Jaskot, Anne E. [Department of Astronomy, Smith College, Northampton, MA 01063 (United States); Wilcots, Eric M. [Department of Astronomy, University of Wisconsin-Madison, Madison, WI 53706 (United States)

    2016-06-10

    The ALFALFA Hα survey utilizes a large sample of H I-selected galaxies from the ALFALFA survey to study star formation (SF) in the local universe. ALFALFA Hα contains 1555 galaxies with distances between ∼20 and ∼100 Mpc. We have obtained continuum-subtracted narrowband Hα images and broadband R images for each galaxy, creating one of the largest homogeneous sets of Hα images ever assembled. Our procedures were designed to minimize the uncertainties related to the calculation of the local SF rate density (SFRD). The galaxy sample we constructed is as close to volume-limited as possible, is a robust statistical sample, and spans a wide range of galaxy environments. In this paper, we discuss the properties of our Fall sample of 565 galaxies, our procedure for deriving individual galaxy SF rates, and our method for calculating the local SFRD. We present a preliminary value of log(SFRD [M_⊙ yr^−1 Mpc^−3]) = −1.747 ± 0.018 (random) ± 0.05 (systematic) based on the 565 galaxies in our Fall sub-sample. Compared to the weighted average of SFRD values around z ≈ 2, our local value indicates a drop in the global SFRD of a factor of 10.2 over that lookback time.

  9. Using Social Media and Targeted Snowball Sampling to Survey a Hard-to-reach Population: A Case Study

    Directory of Open Access Journals (Sweden)

    Gary Dusek

    2015-08-01

    Full Text Available Response rates to the academic surveys used in quantitative research are decreasing and have been for several decades among both individuals and organizations. Given this trend, providing doctoral students an opportunity to complete their dissertations in a timely and cost effective manner may necessitate identifying more innovative and relevant ways to collect data while maintaining appropriate research standards and rigor. The case of a research study is presented which describes the data collection process used to survey a hard-to-reach population. It details the use of social media, in this case LinkedIn, to facilitate the distribution of the web-based survey. A roadmap to illustrate how this data collection process unfolded is presented, as well as several “lessons learned” during this journey. An explanation of the considerations that impacted the sampling design is provided. The goal of this case study is to provide researchers, including doctoral students, with realistic expectations and an awareness of the benefits and risks associated with the use of this method of data collection.

  10. Conducting a respondent-driven sampling survey with the use of existing resources in Sydney, Australia.

    Science.gov (United States)

    Paquette, Dana M; Bryant, Joanne; Crawford, Sione; de Wit, John B F

    2011-07-01

    Respondent-driven sampling (RDS) is a form of chain-referral sampling that is increasingly being used for HIV behavioural surveillance. When used for surveillance purposes, a sampling method should be relatively inexpensive and simple to operate. This study examined whether an RDS survey of people who inject drugs (PWID) in Sydney, Australia, could be successfully conducted through the use of minimal and existing resources. The RDS survey was conducted on the premises of a local needle and syringe program (NSP) with some adjustments to take into account the constraints of existing resources. The impact of the survey on clients and on staff was examined by summarizing NSP service data and by conducting post-survey discussions with NSP staff. From November 2009 till March 2010, 261 participants were recruited in 16 waves. A significant increase was found in the number of services provided by the NSP during and after data collection. Generally, staff felt that the survey had a positive impact by exposing a broader group of people to the NSP. However, conducting the survey may have led to privacy issues for NSP clients due to an increased number of people gathering around the NSP. This study shows that RDS can be conducted with the use of minimal and existing resources under certain conditions (e.g., use of a self-administered questionnaire and no biological samples taken). A more detailed cost-utility analysis is needed to determine whether RDS' advantages outweigh potential challenges when compared to simpler and less costly convenience methods. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  11. Customer satisfaction surveys: Methodological recommendations for financial service providers

    Directory of Open Access Journals (Sweden)

    Đorđić Marko

    2010-01-01

    Full Text Available This methodological article investigates practical challenges that emerge when conducting customer satisfaction surveys (CSS) for financial service providers such as banks, insurance or leasing companies, and so forth. It presents methodological recommendations with reference to: (a) survey design, (b) sampling, (c) survey method, (d) questionnaire design, and (e) data acquisition. The article explains how the use of a two-stage survey design, the SRS method, large samples, and rigorous fieldwork preparation can enhance the overall quality of CSS in financial services. The proposed methodological recommendations apply primarily to primary quantitative marketing research in retail financial services; however, the majority of them can be successfully applied when conducting primary quantitative marketing research in corporate financial services as well.

  12. A Survey on the Color of Interior Design for University Students

    OpenAIRE

    Takata, Hiroshi

    2008-01-01

    The purpose of this study is to clarify the actual conditions of the color on interior design for university students. At first, the author carried out the questionnaire survey in order to grasp the characteristics of respondents and their consciousness of interior and color design. Based on the results of the questionnaire survey, the relationship between the characteristics of respondents and their consciousness was studied. Especially, the influence of the respondents' academic grade and t...

  13. ACS sampling system: design, implementation, and performance evaluation

    Science.gov (United States)

    Di Marcantonio, Paolo; Cirami, Roberto; Chiozzi, Gianluca

    2004-09-01

    By means of ACS (ALMA Common Software) framework we designed and implemented a sampling system which allows sampling of every Characteristic Component Property with a specific, user-defined, sustained frequency limited only by the hardware. Collected data are sent to various clients (one or more Java plotting widgets, a dedicated GUI or a COTS application) using the ACS/CORBA Notification Channel. The data transport is optimized: samples are cached locally and sent in packets with a lower and user-defined frequency to keep network load under control. Simultaneous sampling of the Properties of different Components is also possible. Together with the design and implementation issues we present the performance of the sampling system evaluated on two different platforms: on a VME based system using VxWorks RTOS (currently adopted by ALMA) and on a PC/104+ embedded platform using Red Hat 9 Linux operating system. The PC/104+ solution offers, as an alternative, a low cost PC compatible hardware environment with free and open operating system.
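The cache-and-flush transport strategy described above, where samples are buffered locally and sent in packets at a lower, user-defined rate to keep network load under control, can be sketched generically. This is a minimal illustration of the pattern, not the actual ACS/CORBA API:

```python
import time
from collections import deque


class BufferedSampler:
    """Cache samples locally and deliver them in fixed-size packets,
    so the sink (e.g. a notification channel) sees a lower message rate
    than the raw sampling rate."""

    def __init__(self, flush_every: int, sink):
        self.buffer = deque()
        self.flush_every = flush_every  # samples per packet
        self.sink = sink                # callable receiving one packet

    def sample(self, value):
        self.buffer.append((time.monotonic(), value))
        if len(self.buffer) >= self.flush_every:
            self.flush()

    def flush(self):
        # Deliver whatever is buffered as one packet, then clear the cache.
        if self.buffer:
            self.sink(list(self.buffer))
            self.buffer.clear()


packets = []
s = BufferedSampler(flush_every=10, sink=packets.append)
for v in range(25):
    s.sample(v)
s.flush()  # drain the remainder
print([len(p) for p in packets])  # → [10, 10, 5]
```

The same idea extends to time-based flushing (send every N milliseconds) when a fixed packet cadence matters more than a fixed packet size.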

  14. Progress in XRCS-Survey plant instrumentation and control design for ITER

    International Nuclear Information System (INIS)

    Varshney, Sanjeev; Jha, Shivakant; Simrock, Stefan; Barnsley, Robin; Martin, Vincent; Mishra, Sapna; Patil, Prabhakant; Patel, Shreyas; Kumar, Vinay

    2016-01-01

    Highlights: • An identification of the major process functions system compliant to Plant Control Design Handbook (PCDH) has been made for XRCS-Survey plant I&C. • I&C Functional Breakdown Structure (FBS) and Operation Procedure (OP) have been drafted using Enterprise architect (EA). • I&C architecture, interface with ITER networks and Plants, configuration of cubicles are discussed towards nine design review deliverables. - Abstract: A real time, plasma impurity survey system based on X-ray Crystal Spectroscopy (XRCS) has been designed for ITER and will be made available in the set of first plasma diagnostics for measuring impurity ion concentrations and their in-flux. For the purpose of developing a component level design of XRCS-Survey plant I&C system that is compliant to the rules and guidelines defined in the Plant Control Design Handbook (PCDH), firstly an identification of the major process functions has been made. The preliminary plant I&C Functional Breakdown Structure (FBS) and Operation Procedure (OP) have been drafted using a system engineering tool, Enterprise Architect (EA). Conceptual I&C architecture, interface with the ITER networks and other Plants have been discussed along with the basic configuration of I&C cubicles aiming towards nine I&C deliverables for the design review.

  15. Progress in XRCS-Survey plant instrumentation and control design for ITER

    Energy Technology Data Exchange (ETDEWEB)

    Varshney, Sanjeev, E-mail: sanjeev.varshney@iter-india.org [ITER-India, Institute for Plasma Research, Bhat, Gandhinagar, 382 428 (India); Jha, Shivakant [ITER-India, Institute for Plasma Research, Bhat, Gandhinagar, 382 428 (India); Simrock, Stefan; Barnsley, Robin; Martin, Vincent [ITER-Organization, Route de Vinon sur Verdon, CS 90 046, 13067 St. Paul-Lez-Durance, Cedex (France); Mishra, Sapna [ITER-India, Institute for Plasma Research, Bhat, Gandhinagar, 382 428 (India); Patil, Prabhakant [ITER-Organization, Route de Vinon sur Verdon, CS 90 046, 13067 St. Paul-Lez-Durance, Cedex (France); Patel, Shreyas; Kumar, Vinay [ITER-India, Institute for Plasma Research, Bhat, Gandhinagar, 382 428 (India)

    2016-11-15

    Highlights: • An identification of the major process functions system compliant to Plant Control Design Handbook (PCDH) has been made for XRCS-Survey plant I&C. • I&C Functional Breakdown Structure (FBS) and Operation Procedure (OP) have been drafted using Enterprise architect (EA). • I&C architecture, interface with ITER networks and Plants, configuration of cubicles are discussed towards nine design review deliverables. - Abstract: A real time, plasma impurity survey system based on X-ray Crystal Spectroscopy (XRCS) has been designed for ITER and will be made available in the set of first plasma diagnostics for measuring impurity ion concentrations and their in-flux. For the purpose of developing a component level design of XRCS-Survey plant I&C system that is compliant to the rules and guidelines defined in the Plant Control Design Handbook (PCDH), firstly an identification of the major process functions has been made. The preliminary plant I&C Functional Breakdown Structure (FBS) and Operation Procedure (OP) have been drafted using a system engineering tool, Enterprise Architect (EA). Conceptual I&C architecture, interface with the ITER networks and other Plants have been discussed along with the basic configuration of I&C cubicles aiming towards nine I&C deliverables for the design review.

  16. [Design and implementation of mobile terminal data acquisition for Chinese materia medica resources survey].

    Science.gov (United States)

    Qi, Yuan-Hua; Wang, Hui; Zhang, Xiao-Bo; Jin, Yan; Ge, Xiao-Guang; Jing, Zhi-Xian; Wang, Ling; Zhao, Yu-Ping; Guo, Lan-Ping; Huang, Lu-Qi

    2017-11-01

    In this paper, a data acquisition system for mobile terminals, combining GPS, offset correction, automatic speech recognition and database networking technology, was designed and implemented, with functions for quickly recording latitude and elevation information and conveniently taking various types of photos (Chinese herbal plants, samples, habitats, and so on). The mobile system realizes automatic association with Chinese materia medica source information; through the voice recognition function it records information on plant characteristics and environmental characteristics, as well as relevant plant specimen information. The data processing platform, based on the Chinese medicine resources survey data reporting client, effectively assists in indoor data processing and exports the mobile terminal data to the computer terminal. The established data acquisition system provides strong technical support for the fourth national survey of the Chinese materia medica resources (CMMR). Copyright© by the Chinese Pharmaceutical Association.

  17. Mobile Variable Depth Sampling System Design Study

    International Nuclear Information System (INIS)

    BOGER, R.M.

    2000-01-01

    A design study is presented for a mobile, variable depth sampling system (MVDSS) that will support the treatment and immobilization of Hanford LAW and HLW. The sampler can be deployed in a 4-inch tank riser and has a design that is based on requirements identified in the Level 2 Specification (latest revision). The waste feed sequence for the MVDSS is based on the Phase 1, Case 3S6 waste feed sequence. Technical information is also presented that supports the design study.

  18. Mobile Variable Depth Sampling System Design Study

    Energy Technology Data Exchange (ETDEWEB)

    BOGER, R.M.

    2000-08-25

    A design study is presented for a mobile, variable depth sampling system (MVDSS) that will support the treatment and immobilization of Hanford LAW and HLW. The sampler can be deployed in a 4-inch tank riser and has a design that is based on requirements identified in the Level 2 Specification (latest revision). The waste feed sequence for the MVDSS is based on Phase 1, Case 3S6 waste feed sequence. Technical information is also presented that supports the design study.

  19. Spatial distribution, sampling precision and survey design optimisation with non-normal variables: The case of anchovy (Engraulis encrasicolus) recruitment in Spanish Mediterranean waters

    Science.gov (United States)

    Tugores, M. Pilar; Iglesias, Magdalena; Oñate, Dolores; Miquel, Joan

    2016-02-01

    In the Mediterranean Sea, the European anchovy (Engraulis encrasicolus) plays a key role in ecological and economic terms. Ensuring stock sustainability requires the provision of crucial information, such as species spatial distribution or unbiased abundance and precision estimates, so that management strategies can be defined (e.g. fishing quotas, temporal closure areas or marine protected areas, MPAs). Furthermore, the estimation of the precision of global abundance at different sampling intensities can be used for survey design optimisation. Geostatistics provides a priori unbiased estimations of the spatial structure, global abundance and precision for autocorrelated data. However, its application to non-Gaussian data introduces difficulties in the analysis, along with low robustness or unbiasedness. The present study applied intrinsic geostatistics in two dimensions in order to (i) analyse the spatial distribution of anchovy in Spanish Western Mediterranean waters during the species' recruitment season, (ii) produce distribution maps, (iii) estimate global abundance and its precision, (iv) analyse the effect of changing the sampling intensity on the precision of global abundance estimates and, (v) evaluate the effects of several methodological options on the robustness of all the analysed parameters. The results suggested that while the spatial structure was usually non-robust to the tested methodological options when working with the original dataset, it became more robust for the transformed datasets (especially for the log-backtransformed dataset). The global abundance was always highly robust, and the global precision was highly or moderately robust to most of the methodological options, except for data transformation.
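One reason log-backtransformed analyses must be handled carefully with skewed data is the bias of naive back-transformation: for lognormal-like values the arithmetic mean is exp(μ + σ²/2), not exp(μ). The sketch below illustrates this with simulated skewed densities (the lognormal parameters are invented, not the survey's data):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated strongly skewed density values, standing in for
# non-Gaussian acoustic data; parameters are illustrative only.
z = rng.lognormal(mean=1.0, sigma=1.0, size=5000)

logz = np.log(z)
mu, s2 = logz.mean(), logz.var(ddof=1)

naive = np.exp(mu)              # geometric mean: biased low for E[z]
corrected = np.exp(mu + s2 / 2)  # lognormal back-transform of the mean

print(round(naive, 2), round(corrected, 2), round(z.mean(), 2))
```

The naive estimate sits well below both the corrected estimate and the sample mean, which is why back-transformation choices affect the robustness of global abundance estimates.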

  20. Wide Field Infrared Survey Telescope [WFIRST]: telescope design and simulated performance

    Science.gov (United States)

    Goullioud, R.; Content, D. A.; Kuan, G. M.; Moore, J. D.; Chang, Z.; Sunada, E. T.; Villalvazo, J.; Hawk, J. P.; Armani, N. V.; Johnson, E. L.; Powell, C. A.

    2012-09-01

    The Wide Field Infrared Survey Telescope (WFIRST) mission concept was ranked first in new space astrophysics missions by the Astro2010 Decadal Survey, incorporating the Joint Dark Energy Mission payload concept and multiple science white papers. This mission is based on a space telescope at L2 studying exoplanets [via gravitational microlensing], probing dark energy, and surveying the near infrared sky. Since the release of the Astro2010 Decadal Survey, the team has been working with the WFIRST Science Definition Team to refine mission and payload concepts. We present the current interim reference mission point design of the payload, based on the use of a 1.3m unobscured aperture three mirror anastigmat form, with focal imaging and slit-less spectroscopy science channels. We also present the first results of Structural/Thermal/Optical performance modeling of the telescope point design.

  1. Men who have sex with men in Great Britain: comparing methods and estimates from probability and convenience sample surveys

    Science.gov (United States)

    Prah, Philip; Hickson, Ford; Bonell, Chris; McDaid, Lisa M; Johnson, Anne M; Wayal, Sonali; Clifton, Soazig; Sonnenberg, Pam; Nardone, Anthony; Erens, Bob; Copas, Andrew J; Riddell, Julie; Weatherburn, Peter; Mercer, Catherine H

    2016-01-01

    Objective To examine sociodemographic and behavioural differences between men who have sex with men (MSM) participating in recent UK convenience surveys and a national probability sample survey. Methods We compared 148 MSM aged 18–64 years interviewed for Britain's third National Survey of Sexual Attitudes and Lifestyles (Natsal-3) undertaken in 2010–2012, with men in the same age range participating in contemporaneous convenience surveys of MSM: 15 500 British resident men in the European MSM Internet Survey (EMIS); 797 in the London Gay Men's Sexual Health Survey; and 1234 in Scotland's Gay Men's Sexual Health Survey. Analyses compared men reporting at least one male sexual partner (past year) on similarly worded questions and multivariable analyses accounted for sociodemographic differences between the surveys. Results MSM in convenience surveys were younger and better educated than MSM in Natsal-3, and a larger proportion identified as gay (85%–95% vs 62%). Partner numbers were higher and same-sex anal sex more common in convenience surveys. Unprotected anal intercourse was more commonly reported in EMIS. Compared with Natsal-3, MSM in convenience surveys were more likely to report gonorrhoea diagnoses and HIV testing (both past year). Differences between the samples were reduced when restricting analysis to gay-identifying MSM. Conclusions National probability surveys better reflect the population of MSM but are limited by their smaller samples of MSM. Convenience surveys recruit larger samples of MSM but tend to over-represent MSM identifying as gay and reporting more sexual risk behaviours. Because both sampling strategies have strengths and weaknesses, methods are needed to triangulate data from probability and convenience surveys. PMID:26965869

  2. Coupling methods for multistage sampling

    OpenAIRE

    Chauvet, Guillaume

    2015-01-01

    Multistage sampling is commonly used for household surveys when there exists no sampling frame, or when the population is scattered over a wide area. Multistage sampling usually introduces a complex dependence in the selection of the final units, which makes asymptotic results quite difficult to prove. In this work, we consider multistage sampling with simple random without replacement sampling at the first stage, and with an arbitrary sampling design for further stages. We consider coupling ...

  3. Sampling in health geography: reconciling geographical objectives and probabilistic methods. An example of a health survey in Vientiane (Lao PDR)

    Directory of Open Access Journals (Sweden)

    Bochaton Audrey

    2007-06-01

    Full Text Available Abstract Background Geographical objectives and probabilistic methods are difficult to reconcile in a unique health survey. Probabilistic methods focus on individuals to provide estimates of a variable's prevalence with a certain precision, while geographical approaches emphasise the selection of specific areas to study interactions between spatial characteristics and health outcomes. A sample selected from a small number of specific areas creates statistical challenges: the observations are not independent at the local level, and this results in poor statistical validity at the global level. Therefore, it is difficult to construct a sample that is appropriate for both geographical and probability methods. Methods We used a two-stage selection procedure with a first non-random stage of selection of clusters. Instead of randomly selecting clusters, we deliberately chose a group of clusters, which as a whole would contain all the variation in health measures in the population. As there was no health information available before the survey, we selected a priori determinants that can influence the spatial homogeneity of the health characteristics. This method yields a distribution of variables in the sample that closely resembles that in the overall population, something that cannot be guaranteed with randomly-selected clusters, especially if the number of selected clusters is small. In this way, we were able to survey specific areas while minimising design effects and maximising statistical precision. Application We applied this strategy in a health survey carried out in Vientiane, Lao People's Democratic Republic. We selected well-known health determinants with unequal spatial distribution within the city: nationality and literacy. We deliberately selected a combination of clusters whose distribution of nationality and literacy is similar to the distribution in the general population. 
Conclusion This paper describes the conceptual reasoning behind

  4. Sampling in health geography: reconciling geographical objectives and probabilistic methods. An example of a health survey in Vientiane (Lao PDR).

    Science.gov (United States)

    Vallée, Julie; Souris, Marc; Fournet, Florence; Bochaton, Audrey; Mobillion, Virginie; Peyronnie, Karine; Salem, Gérard

    2007-06-01

    Geographical objectives and probabilistic methods are difficult to reconcile in a unique health survey. Probabilistic methods focus on individuals to provide estimates of a variable's prevalence with a certain precision, while geographical approaches emphasise the selection of specific areas to study interactions between spatial characteristics and health outcomes. A sample selected from a small number of specific areas creates statistical challenges: the observations are not independent at the local level, and this results in poor statistical validity at the global level. Therefore, it is difficult to construct a sample that is appropriate for both geographical and probability methods. We used a two-stage selection procedure with a first non-random stage of selection of clusters. Instead of randomly selecting clusters, we deliberately chose a group of clusters, which as a whole would contain all the variation in health measures in the population. As there was no health information available before the survey, we selected a priori determinants that can influence the spatial homogeneity of the health characteristics. This method yields a distribution of variables in the sample that closely resembles that in the overall population, something that cannot be guaranteed with randomly-selected clusters, especially if the number of selected clusters is small. In this way, we were able to survey specific areas while minimising design effects and maximising statistical precision. We applied this strategy in a health survey carried out in Vientiane, Lao People's Democratic Republic. We selected well-known health determinants with unequal spatial distribution within the city: nationality and literacy. We deliberately selected a combination of clusters whose distribution of nationality and literacy is similar to the distribution in the general population. This paper describes the conceptual reasoning behind the construction of the survey sample and shows that it can be
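The deliberate (non-random) first-stage choice of clusters can be illustrated by searching for the combination of clusters whose covariate profile best matches the population profile. The cluster data below are synthetic, and brute-force search over combinations is just one way to operationalise "deliberately chose a group of clusters":

```python
import itertools

import numpy as np

rng = np.random.default_rng(1)

# Hypothetical profile per cluster: shares of two a priori determinants
# (e.g. a nationality share and a literacy share).
clusters = rng.uniform(0.1, 0.9, size=(12, 2))
population = clusters.mean(axis=0)  # city-wide profile

# Score every combination of 4 clusters; keep the one whose average
# profile is closest to the population profile.
best, best_dist = None, np.inf
for combo in itertools.combinations(range(12), 4):
    dist = np.linalg.norm(clusters[list(combo)].mean(axis=0) - population)
    if dist < best_dist:
        best, best_dist = combo, dist

print(best, round(best_dist, 4))
```

Choosing clusters this way mimics the paper's goal of a sample whose distribution of key determinants resembles the overall population, something a small random draw of clusters cannot guarantee.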

  5. Geochemical reanalysis of historical U.S. Geological Survey sediment samples from the Tonsina area, Valdez Quadrangle, Alaska

    Science.gov (United States)

    Werdon, Melanie B.; Granitto, Matthew; Azain, Jaime S.

    2015-01-01

    The State of Alaska’s Strategic and Critical Minerals (SCM) Assessment project, a State-funded Capital Improvement Project (CIP), is designed to evaluate Alaska’s statewide potential for SCM resources. The SCM Assessment is being implemented by the Alaska Division of Geological & Geophysical Surveys (DGGS), and involves obtaining new airborne-geophysical, geological, and geochemical data. As part of the SCM Assessment, thousands of historical geochemical samples from DGGS, U.S. Geological Survey (USGS), and U.S. Bureau of Mines archives are being reanalyzed by DGGS using modern, quantitative, geochemical-analytical methods. The objective is to update the statewide geochemical database to more clearly identify areas in Alaska with SCM potential. The USGS is also undertaking SCM-related geologic studies in Alaska through the federally funded Alaska Critical Minerals cooperative project. DGGS and USGS share the goal of evaluating Alaska’s strategic and critical minerals potential and together created a Letter of Agreement (signed December 2012) and a supplementary Technical Assistance Agreement (#14CMTAA143458) to facilitate the two agencies’ cooperative work. Under these agreements, DGGS contracted the USGS in Denver to reanalyze historical USGS sediment samples from Alaska. For this report, DGGS funded reanalysis of 128 historical USGS sediment samples from the statewide Alaska Geochemical Database Version 2.0 (AGDB2; Granitto and others, 2013). Samples were chosen from the Tonsina area in the Chugach Mountains, Valdez quadrangle, Alaska (fig. 1). The USGS was responsible for sample retrieval from the National Geochemical Sample Archive (NGSA) in Denver, Colorado through the final quality assurance/quality control (QA/QC) of the geochemical analyses obtained through the USGS contract lab. The new geochemical data are published in this report as a coauthored DGGS report, and will be incorporated into the statewide geochemical databases of both agencies

  6. Reliability of impingement sampling designs: An example from the Indian Point station

    International Nuclear Information System (INIS)

    Mattson, M.T.; Waxman, J.B.; Watson, D.A.

    1988-01-01

    A 4-year data base (1976-1979) of daily fish impingement counts at the Indian Point electric power station on the Hudson River was used to compare the precision and reliability of three random-sampling designs: (1) simple random, (2) seasonally stratified, and (3) empirically stratified. The precision of daily impingement estimates improved logarithmically for each design as more days in the year were sampled. Simple random sampling was the least, and empirically stratified sampling was the most precise design, and the difference in precision between the two stratified designs was small. Computer-simulated sampling was used to estimate the reliability of the two stratified-random-sampling designs. A seasonally stratified sampling design was selected as the most appropriate reduced-sampling program for Indian Point station because: (1) reasonably precise and reliable impingement estimates were obtained using this design for all species combined and for eight common Hudson River fish by sampling only 30% of the days in a year (110 d); and (2) seasonal strata may be more precise and reliable than empirical strata if future changes in annual impingement patterns occur. The seasonally stratified design applied to the 1976-1983 Indian Point impingement data showed that selection of sampling dates based on daily species-specific impingement variability gave results that were more precise, but not more consistently reliable, than sampling allocations based on the variability of all fish species combined. 14 refs., 1 fig., 6 tabs
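The precision gain from seasonal stratification reported above can be reproduced in miniature with simulated daily counts. The seasonal pattern, parameters, and four equal-length "seasons" below are invented for illustration; only the design comparison (simple random vs. seasonally stratified sampling of about 30% of days) mirrors the study:

```python
import numpy as np

rng = np.random.default_rng(7)

# Simulated daily impingement counts with a strong seasonal cycle
# (high in winter, low in summer); values are illustrative only.
days = np.arange(365)
counts = rng.poisson(200 + 180 * np.cos(2 * np.pi * days / 365))

n, reps = 110, 2000  # sample ~30% of days, many replicate draws


def annual_estimate(idx):
    return counts[idx].mean() * 365


# Design 1: simple random sampling of days.
srs = [annual_estimate(rng.choice(365, n, replace=False)) for _ in range(reps)]

# Design 2: seasonally stratified, roughly equal allocation over 4 seasons.
strata = np.array_split(days, 4)
strat = [
    annual_estimate(
        np.concatenate([rng.choice(s, n // 4, replace=False) for s in strata])
    )
    for _ in range(reps)
]

print(round(np.std(srs), 1), round(np.std(strat), 1))
```

Because stratification removes the between-season component of variance, the stratified estimates scatter much less around the true annual total, matching the paper's finding that stratified designs outperform simple random sampling.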

  7. Design Considerations: Falcon M Dwarf Habitable Exoplanet Survey

    Science.gov (United States)

    Polsgrove, Daniel; Novotny, Steven; Della-Rose, Devin J.; Chun, Francis; Tippets, Roger; O'Shea, Patrick; Miller, Matthew

    2016-01-01

    The Falcon Telescope Network (FTN) is an assemblage of twelve automated 20-inch telescopes positioned around the globe, controlled from the Cadet Space Operations Center (CSOC) at the US Air Force Academy (USAFA) in Colorado Springs, Colorado. Five of the 12 sites are currently installed, with full operational capability expected by the end of 2016. Though optimized for studying near-earth objects to accomplish its primary mission of Space Situational Awareness (SSA), the Falcon telescopes are in many ways similar to those used by ongoing and planned exoplanet transit surveys targeting individual M dwarf stars (e.g., MEarth, APACHE, SPECULOOS). The network's worldwide geographic distribution provides additional potential advantages. We have performed analytical and empirical studies exploring the viability of employing the FTN for a future survey of nearby late-type M dwarfs tailored to detect transits of 1-2 R_Earth exoplanets in habitable-zone orbits. We present empirical results on photometric precision derived from data collected with multiple Falcon telescopes on a set of nearby M dwarfs. A study of survey design parameters is also described, including an analysis of site-specific weather data, anticipated telescope time allocation and the percentage of nearby M dwarfs with sufficient check stars within the Falcons' 11' x 11' field-of-view required to perform effective differential photometry. The results of this ongoing effort will inform the likelihood of discovering one (or more) habitable-zone exoplanets given current occurrence rate estimates over a nominal five-year campaign, and will dictate specific survey design features in preparation for initiating project execution when the FTN begins full-scale automated operations.

  8. Men who have sex with men in Great Britain: comparing methods and estimates from probability and convenience sample surveys.

    Science.gov (United States)

    Prah, Philip; Hickson, Ford; Bonell, Chris; McDaid, Lisa M; Johnson, Anne M; Wayal, Sonali; Clifton, Soazig; Sonnenberg, Pam; Nardone, Anthony; Erens, Bob; Copas, Andrew J; Riddell, Julie; Weatherburn, Peter; Mercer, Catherine H

    2016-09-01

    To examine sociodemographic and behavioural differences between men who have sex with men (MSM) participating in recent UK convenience surveys and a national probability sample survey. We compared 148 MSM aged 18-64 years interviewed for Britain's third National Survey of Sexual Attitudes and Lifestyles (Natsal-3) undertaken in 2010-2012, with men in the same age range participating in contemporaneous convenience surveys of MSM: 15 500 British resident men in the European MSM Internet Survey (EMIS); 797 in the London Gay Men's Sexual Health Survey; and 1234 in Scotland's Gay Men's Sexual Health Survey. Analyses compared men reporting at least one male sexual partner (past year) on similarly worded questions and multivariable analyses accounted for sociodemographic differences between the surveys. MSM in convenience surveys were younger and better educated than MSM in Natsal-3, and a larger proportion identified as gay (85%-95% vs 62%). Partner numbers were higher and same-sex anal sex more common in convenience surveys. Unprotected anal intercourse was more commonly reported in EMIS. Compared with Natsal-3, MSM in convenience surveys were more likely to report gonorrhoea diagnoses and HIV testing (both past year). Differences between the samples were reduced when restricting analysis to gay-identifying MSM. National probability surveys better reflect the population of MSM but are limited by their smaller samples of MSM. Convenience surveys recruit larger samples of MSM but tend to over-represent MSM identifying as gay and reporting more sexual risk behaviours. Because both sampling strategies have strengths and weaknesses, methods are needed to triangulate data from probability and convenience surveys. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/

  9. Sampling and Analysis Plan for Supplemental Environmental Project: Aquatic Life Surveys

    Energy Technology Data Exchange (ETDEWEB)

    Berryhill, Jesse Tobias [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Gaukler, Shannon Marie [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-26

    As part of a settlement agreement for nuclear waste incidents in 2014, several supplemental environmental projects (SEPs) were initiated at Los Alamos National Laboratory (LANL or the Laboratory) between the U.S. Department of Energy and the state of New Mexico. One SEP from this agreement consists of performing aquatic life surveys and will be used to assess the applicability of generic ambient water-quality criteria (AWQC) for aquatic life. AWQC are generic criteria developed by the U.S. Environmental Protection Agency (EPA) to cover a broad range of aquatic species and are not unique to a specific region or state. AWQC are derived from compilations of toxicity data, called species sensitivity distributions (SSDs), built from acute toxicity experiments that measure the LC50 (the concentration lethal to 50% of the organisms tested) for chemicals of interest. It is of interest to determine whether aquatic species inhabiting waters on the Pajarito Plateau are adequately protected by the current generic AWQC. This study will determine which aquatic species are present in ephemeral, intermittent, and perennial waters within LANL boundaries and in reference waters adjacent to LANL. If the species identified in these waters do not generally represent the species used in the SSDs, then the SSDs may need to be modified and the AWQC updated. This sampling and analysis plan details the sampling methodology, surveillance locations, temporal scheduling, and analytical approaches that will be used to complete the aquatic life surveys. A significant portion of this sampling and analysis plan was formalized by reference to Appendix E: SEP Aquatic Life Surveys DQO (Data Quality Objectives).
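    The SSD logic behind AWQC can be sketched numerically: fit a lognormal distribution to species LC50 values and take its 5th percentile (the HC5 commonly used to set criteria). The LC50 values below are hypothetical:

```python
import math
import statistics

# Hypothetical acute LC50 values (mg/L) for eight aquatic species;
# SSDs are commonly modeled as a lognormal distribution of log-LC50s.
lc50 = [1.2, 3.4, 0.8, 5.6, 2.1, 9.7, 4.3, 1.9]
logs = [math.log10(c) for c in lc50]
mu, sigma = statistics.mean(logs), statistics.stdev(logs)

# HC5: the concentration expected to protect 95% of species,
# i.e., the 5th percentile of the fitted lognormal (z = -1.645).
hc5 = 10 ** (mu - 1.645 * sigma)
```

    If the species actually present differ systematically from those behind the SSD, mu and sigma shift and the HC5 moves with them, which is precisely the concern the sampling plan addresses.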

  10. Preferential sampling in veterinary parasitological surveillance

    Directory of Open Access Journals (Sweden)

    Lorenzo Cecconi

    2016-04-01

    In parasitological surveillance of livestock, prevalence surveys are conducted on a sample of farms using several sampling designs; opportunistic surveys and other informative sampling designs are very common. Preferential sampling refers to any situation in which the spatial process and the sampling locations are not independent. Most examples of preferential sampling in the spatial statistics literature come from environmental statistics, with a focus on pollutant monitoring, and it has been shown that if preferential sampling is present and is not accounted for in the statistical modelling and data analysis, statistical inference can be misleading. In this paper, working in the context of veterinary parasitology, we propose and use geostatistical models to predict the continuous and spatially varying risk of parasite infection. Specifically, breaking with the common practice in veterinary parasitological surveillance of ignoring preferential sampling even though informative or opportunistic samples are very common, we specify a two-stage hierarchical Bayesian model that adjusts for preferential sampling and apply it to data on Fasciola hepatica infection in sheep farms in the Campania region (southern Italy) in the years 2013-2014.
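    A one-dimensional toy example shows why ignoring preferential sampling misleads: when a farm's inclusion probability tracks the risk surface itself, the naive mean of the sampled values overstates the average risk. All values below are illustrative:

```python
import random
import statistics

random.seed(5)

# Toy risk surface over a 1-D region (values are illustrative).
risk = [0.1 + 0.4 * i / 999 for i in range(1000)]
true_mean = statistics.mean(risk)

# Opportunistic design: a farm at location i enters the sample with
# probability equal to its own risk -- the essence of preferential sampling.
sampled = [r for r in risk if random.random() < r]
naive_mean = statistics.mean(sampled)

# naive_mean is biased upward; a model that includes the sampling
# process (as in the paper's two-stage Bayesian approach) is needed
# to recover true_mean.
```

    High-risk locations are over-represented in the sample, so the naive estimate exceeds the true spatial average.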

  11. ANL small-sample calorimeter system design and operation

    International Nuclear Information System (INIS)

    Roche, C.T.; Perry, R.B.; Lewis, R.N.; Jung, E.A.; Haumann, J.R.

    1978-07-01

    The Small-Sample Calorimetric System is a portable instrument designed to measure the thermal power produced by radioactive decay of plutonium-containing fuels. The small-sample calorimeter is capable of measuring samples producing power up to 32 milliwatts at a rate of one sample every 20 min. The instrument is contained in two packages: a data-acquisition module consisting of a microprocessor with an 8K-byte nonvolatile memory, and a measurement module consisting of the calorimeter and a sample preheater. The total weight of the system is 18 kg

  12. Adaptive designs for the one-sample log-rank test.

    Science.gov (United States)

    Schmidt, Rene; Faldum, Andreas; Kwiecien, Robert

    2017-09-22

    Traditional designs in phase IIa cancer trials are single-arm designs with a binary outcome, for example, tumor response. In some settings, however, a time-to-event endpoint might appear more appropriate, particularly in the presence of loss to follow-up. Then the one-sample log-rank test might be the method of choice. It allows comparison of the survival curve of the patients under treatment with a prespecified reference survival curve, which usually represents the expected survival under standard of care. In this work, convergence of the one-sample log-rank statistic to Brownian motion is proven using Rebolledo's martingale central limit theorem while accounting for staggered entry times of the patients. On this basis, a confirmatory adaptive one-sample log-rank test is proposed in which provision is made for data-dependent sample size reassessment. The focus is on applying the inverse normal method, in two different directions. The first strategy exploits the independent increments property of the one-sample log-rank statistic. The second strategy is based on the patient-wise separation principle. It is shown by simulation that the proposed adaptive test might help to rescue an underpowered trial and at the same time lowers the average sample number (ASN) under the null hypothesis as compared to a single-stage fixed sample design. © 2017, The International Biometric Society.
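    At a single stage, the one-sample log-rank statistic compares observed deaths with those expected under the reference cumulative hazard; a minimal sketch, with a hypothetical exponential reference curve and invented patient data:

```python
import math

def one_sample_logrank(times, events, cum_hazard):
    """One-sample log-rank test against a reference survival curve.

    times: follow-up time per patient; events: 1 = death, 0 = censored;
    cum_hazard: reference cumulative hazard function Lambda_0(t).
    Returns the standardized statistic Z = (O - E) / sqrt(E).
    """
    observed = sum(events)
    expected = sum(cum_hazard(t) for t in times)
    return (observed - expected) / math.sqrt(expected)

# Hypothetical exponential reference with median survival of 12 months.
lam = math.log(2) / 12.0
ref = lambda t: lam * t

# Invented phase IIa data: mostly long, censored follow-up.
times  = [24, 18, 30, 6, 12, 36, 9, 27]
events = [0,   0,  0, 1,  0,  0, 1,  0]
z = one_sample_logrank(times, events, ref)
# z < 0 indicates fewer deaths than expected under the reference curve.
```

    The adaptive designs in the paper build on interim versions of this statistic, combined across stages with the inverse normal method.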

  13. SEDS: THE SPITZER EXTENDED DEEP SURVEY. SURVEY DESIGN, PHOTOMETRY, AND DEEP IRAC SOURCE COUNTS

    International Nuclear Information System (INIS)

    Ashby, M. L. N.; Willner, S. P.; Fazio, G. G.; Huang, J.-S.; Hernquist, L.; Hora, J. L.; Arendt, R.; Barmby, P.; Barro, G.; Faber, S.; Guhathakurta, P.; Bell, E. F.; Bouwens, R.; Cattaneo, A.; Croton, D.; Davé, R.; Dunlop, J. S.; Egami, E.; Finlator, K.; Grogin, N. A.

    2013-01-01

    The Spitzer Extended Deep Survey (SEDS) is a very deep infrared survey within five well-known extragalactic science fields: the UKIDSS Ultra-Deep Survey, the Extended Chandra Deep Field South, COSMOS, the Hubble Deep Field North, and the Extended Groth Strip. SEDS covers a total area of 1.46 deg² to a depth of 26 AB mag (3σ) in both of the warm Infrared Array Camera (IRAC) bands at 3.6 and 4.5 μm. Because of its uniform depth of coverage in so many widely-separated fields, SEDS is subject to roughly 25% smaller errors due to cosmic variance than a single-field survey of the same size. SEDS was designed to detect and characterize galaxies from intermediate to high redshifts (z = 2-7) with a built-in means of assessing the impact of cosmic variance on the individual fields. Because the full SEDS depth was accumulated in at least three separate visits to each field, typically with six-month intervals between visits, SEDS also furnishes an opportunity to assess the infrared variability of faint objects. This paper describes the SEDS survey design, processing, and publicly-available data products. Deep IRAC counts for the more than 300,000 galaxies detected by SEDS are consistent with models based on known galaxy populations. Discrete IRAC sources contribute 5.6 ± 1.0 and 4.4 ± 0.8 nW m⁻² sr⁻¹ at 3.6 and 4.5 μm to the diffuse cosmic infrared background (CIB). IRAC sources cannot contribute more than half of the total CIB flux estimated from DIRBE data. Barring an unexpected error in the DIRBE flux estimates, half the CIB flux must therefore come from a diffuse component.

  14. Baseline Design Compliance Matrix for the Rotary Mode Core Sampling System

    International Nuclear Information System (INIS)

    LECHELT, J.A.

    2000-01-01

    The purpose of the design compliance matrix (DCM) is to provide a single-source document of all design requirements associated with the fifteen subsystems that make up the rotary mode core sampling (RMCS) system. It is intended to be the baseline requirement document for the RMCS system and to be used in governing all future design and design verification activities associated with it. This document is the DCM for the RMCS system used on Hanford single-shell radioactive waste storage tanks. This includes the Exhauster System, Rotary Mode Core Sample Trucks, Universal Sampling System, Diesel Generator System, Distribution Trailer, X-Ray Cart System, Breathing Air Compressor, Nitrogen Supply Trailer, Casks and Cask Truck, Service Trailer, Core Sampling Riser Equipment, Core Sampling Support Trucks, Foot Clamp, Ramps and Platforms and Purged Camera System. Excluded items are tools such as light plants and light stands. Other items such as the breather inlet filter are covered by a different design baseline. In this case, the inlet breather filter is covered by the Tank Farms Design Compliance Matrix

  15. Variable selection and estimation for longitudinal survey data

    KAUST Repository

    Wang, Li

    2014-09-01

    There is wide interest in studying longitudinal surveys where sample subjects are observed successively over time. Longitudinal surveys are used in many areas today, for example, in the health and social sciences, to explore relationships or to identify significant variables in regression settings. This paper develops a general strategy for the model selection problem in longitudinal sample surveys. A survey-weighted penalized estimating equation approach is proposed to select significant variables and estimate the coefficients simultaneously. The proposed estimators are design consistent and perform as well as the oracle procedure in which the correct submodel is known. The estimating function bootstrap is applied to obtain the standard errors of the estimated parameters with good accuracy. A fast and efficient variable selection algorithm is developed to identify significant variables in complex longitudinal survey data. Simulation studies illustrate the usefulness of the proposed methodology under various model settings and sampling designs. © 2014 Elsevier Inc.

  16. Outcome-Dependent Sampling Design and Inference for Cox's Proportional Hazards Model.

    Science.gov (United States)

    Yu, Jichang; Liu, Yanyan; Cai, Jianwen; Sandler, Dale P; Zhou, Haibo

    2016-11-01

    We propose a cost-effective outcome-dependent sampling (ODS) design for failure time data and develop an efficient inference procedure for data collected with this design. To account for the biased sampling scheme, we derive estimators from a weighted partial likelihood estimating equation. The proposed estimators for the regression parameters are shown to be consistent and asymptotically normally distributed. A criterion that can be used to optimally implement the ODS design in practice is proposed and studied. The small-sample performance of the proposed method is evaluated by simulation studies. The proposed design and inference procedure are shown to be statistically more powerful than existing alternative designs with the same sample sizes. We illustrate the proposed method with existing data from the Cancer Incidence and Mortality of Uranium Miners Study.

  17. Point process models for spatio-temporal distance sampling data from a large-scale survey of blue whales

    KAUST Repository

    Yuan, Yuan; Bachl, Fabian E.; Lindgren, Finn; Borchers, David L.; Illian, Janine B.; Buckland, Stephen T.; Rue, Haavard; Gerrodette, Tim

    2017-01-01

    Distance sampling is a widely used method for estimating wildlife population abundance. The fact that conventional distance sampling methods are partly design-based constrains the spatial resolution at which animal density can be estimated using these methods. Estimates are usually obtained at survey stratum level. For an endangered species such as the blue whale, it is desirable to estimate density and abundance at a finer spatial scale than stratum. Temporal variation in the spatial structure is also important. We formulate the process generating distance sampling data as a thinned spatial point process and propose model-based inference using a spatial log-Gaussian Cox process. The method adopts a flexible stochastic partial differential equation (SPDE) approach to model spatial structure in density that is not accounted for by explanatory variables, and integrated nested Laplace approximation (INLA) for Bayesian inference. It allows simultaneous fitting of detection and density models and permits prediction of density at an arbitrarily fine scale. We estimate blue whale density in the Eastern Tropical Pacific Ocean from thirteen shipboard surveys conducted over 22 years. We find that higher blue whale density is associated with colder sea surface temperatures in space, and although there is some positive association between density and mean annual temperature, our estimates are consistent with no trend in density across years. Our analysis also indicates that there is substantial spatially structured variation in density that is not explained by available covariates.
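    The "thinned point process" view can be illustrated with a conventional toy calculation: animals are thinned by a half-normal detection function and abundance is recovered by Horvitz-Thompson weighting. The detection function is assumed known here, whereas the paper fits detection and density jointly; all parameter values are invented:

```python
import math
import random

random.seed(9)

# Toy line-transect sketch: N animals at uniform perpendicular distances,
# detected with half-normal probability.
sigma, w = 50.0, 150.0      # detection scale and truncation distance (m)
N = 2000
distances = [random.uniform(0, w) for _ in range(N)]

def p_detect(d):
    """Half-normal detection probability at perpendicular distance d."""
    return math.exp(-d * d / (2.0 * sigma * sigma))

# Detection thins the point process: each animal survives with prob p(d).
detected = [d for d in distances if random.random() < p_detect(d)]

# Horvitz-Thompson estimate of the number of animals in the strip.
n_hat = sum(1.0 / p_detect(d) for d in detected)
```

    In the paper, this thinning is embedded in a log-Gaussian Cox process so that the unthinned intensity surface, not just a stratum total, is estimated.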

  19. Amostra mestra e geoprocessamento: tecnologias para inquéritos domiciliares Master sample and geoprocessing: technologies for household surveys

    Directory of Open Access Journals (Sweden)

    Nilza Nunes da Silva

    2003-08-01

    census enumeration areas in several epidemiological surveys using updated information from the National Survey of Households (PNAD). METHODS: An address data file comprising 72 census enumeration areas was kept as primary sampling units for the city of São Paulo. During the period 1995-2000, three distinct household samples were drawn using the two-stage cluster sampling procedure. Geographic Information System (GIS) technology allowed delimiting boundaries, blocks and streets for any primary sampling unit and printing updated maps for selected sub-samples. RESULTS: Twenty-five thousand dwellings made up the permanent address data file of the master sample. Cheaper and quicker selection of each sample, together with information on the demographic and topographical profiles of the census enumeration areas, were the main contributions of the study. CONCLUSIONS: The master sample concept, integrated with GIS technology, is an advantageous alternative sampling design for household surveys in urban areas. Using the list of addresses from the PNAD, updated yearly, avoids the need to create an independent sampling procedure for each individual survey carried out in the period between demographic censuses, although it limits application to the most populated Brazilian cities; it is an important contribution to planning sampling surveys in public health.

  20. A Nationwide Random Sampling Survey of Potential Complicated Grief in Japan

    Science.gov (United States)

    Mizuno, Yasunao; Kishimoto, Junji; Asukai, Nozomu

    2012-01-01

    To investigate the prevalence of significant loss, potential complicated grief (CG), and its contributing factors, we conducted a nationwide random sampling survey of Japanese adults aged 18 or older (N = 1,343) using a self-rating Japanese-language version of the Complicated Grief Brief Screen. Among them, 37.0% experienced their most significant…

  1. A Frequency Domain Design Method For Sampled-Data Compensators

    DEFF Research Database (Denmark)

    Niemann, Hans Henrik; Jannerup, Ole Erik

    1990-01-01

    A new approach to the design of a sampled-data compensator in the frequency domain is investigated. The starting point is a continuous-time compensator for the continuous-time system which satisfy specific design criteria. The new design method will graphically show how the discrete...
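    A standard route from a continuous-time compensator to a sampled-data one is the bilinear (Tustin) transform; the first-order lead compensator and sample period below are illustrative choices, not taken from the paper:

```python
# Bilinear (Tustin) discretization of a first-order compensator
# C(s) = (s + a) / (s + b); a, b, and the sample period T are
# illustrative values, not from the paper.
a, b, T = 1.0, 10.0, 0.01

# Substituting s = (2/T)(z - 1)/(z + 1) gives
# C(z) = (c0 + c1 z^-1) / (1 + d1 z^-1).
k = 2.0 / T
c0 = (k + a) / (k + b)
c1 = (a - k) / (k + b)
d1 = (b - k) / (k + b)

# At z = 1 (DC), C(z) matches the continuous-time DC gain a/b,
# one of the frequency-domain properties such a design must preserve.
dc_gain = (c0 + c1) / (1 + d1)
```

    The frequency-domain design question the paper addresses is how closely the discretized compensator tracks the continuous-time frequency response away from DC, where the bilinear map warps frequencies.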

  2. Sampling methods. A survey of methods in use in the Nordic countries

    International Nuclear Information System (INIS)

    Isaksson, M.

    2000-10-01

    This report is a survey of sampling techniques currently in use for radioactivity measurements in the Nordic countries, restricted to sampling techniques for pasture, soil and deposition in emergency situations. It is found that the participating laboratories apply similar sampling procedures for pasture, including cutting height and size of sampled areas. Soil samples are generally taken with some sort of corer of varying diameter. The number of cores taken varies, different sampling patterns are used, and pooling of the samples is done by some of the laboratories. The analysis of pasture and of soil is made with NaI detectors or by high-resolution gamma spectrometry on fresh or dried samples. Precipitation collectors of a range of sizes are used to determine the activity concentration in precipitation and of dry-deposited radionuclides. The analysis is made with high-resolution gamma spectrometry, either directly on a water sample or on ion exchange resins. (au)

  3. CT dose survey in adults: what sample size for what precision?

    International Nuclear Information System (INIS)

    Taylor, Stephen; Muylem, Alain van; Howarth, Nigel; Gevenois, Pierre Alain; Tack, Denis

    2017-01-01

    To determine the variability of volume computed tomographic dose index (CTDIvol) and dose-length product (DLP) data, and to propose a minimum sample size to achieve an expected precision. CTDIvol and DLP values of 19,875 consecutive CT acquisitions of abdomen (7268), thorax (3805), lumbar spine (3161), cervical spine (1515) and head (4106) were collected in two centers. Their variabilities were investigated according to sample size (10 to 1000 acquisitions) and patient body weight categories (no weight selection, 67-73 kg and 60-80 kg). The 95 % confidence interval in percentage of their median (CI95/med) value was calculated for increasing sample sizes. We deduced the sample size that set a 95 % CI lower than 10 % of the median (CI95/med ≤ 10 %). Sample sizes ensuring CI95/med ≤ 10 % ranged from 15 to 900 depending on the body region and the dose descriptor considered. In sample sizes recommended by regulatory authorities (i.e., from 10-20 patients), mean CTDIvol and DLP of one sample ranged from 0.50 to 2.00 times their actual values extracted from 2000 samples. The sampling error in CTDIvol and DLP means is high in dose surveys based on small samples of patients. Sample size should be increased at least tenfold to decrease this variability. (orig.)
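    One reading of the CI95/med criterion can be explored by resampling: draw many samples of size n from a dose distribution and express the half-width of the 95% interval of the sample mean as a percentage of the median. The lognormal population below is synthetic, not the surveyed DLP data:

```python
import random
import statistics

random.seed(0)

# Synthetic DLP-like values (mGy*cm), right-skewed as real dose data are.
population = [random.lognormvariate(6.0, 0.5) for _ in range(20000)]
median_pop = statistics.median(population)

def ci95_over_median(n, reps=400):
    """Half-width of the 95% interval of sample means of size n,
    as a percentage of the population median."""
    means = sorted(statistics.mean(random.sample(population, n))
                   for _ in range(reps))
    lo, hi = means[int(0.025 * reps)], means[int(0.975 * reps)]
    return 100 * (hi - lo) / 2 / median_pop
```

    Precision improves roughly with the square root of n, which is why a 10-20 patient sample is far noisier than the samples of several hundred the authors recommend.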

  4. CT dose survey in adults: what sample size for what precision?

    Energy Technology Data Exchange (ETDEWEB)

    Taylor, Stephen [Hopital Ambroise Pare, Department of Radiology, Mons (Belgium); Muylem, Alain van [Hopital Erasme, Department of Pneumology, Brussels (Belgium); Howarth, Nigel [Clinique des Grangettes, Department of Radiology, Chene-Bougeries (Switzerland); Gevenois, Pierre Alain [Hopital Erasme, Department of Radiology, Brussels (Belgium); Tack, Denis [EpiCURA, Clinique Louis Caty, Department of Radiology, Baudour (Belgium)

    2017-01-15

    To determine the variability of volume computed tomographic dose index (CTDIvol) and dose-length product (DLP) data, and to propose a minimum sample size to achieve an expected precision. CTDIvol and DLP values of 19,875 consecutive CT acquisitions of abdomen (7268), thorax (3805), lumbar spine (3161), cervical spine (1515) and head (4106) were collected in two centers. Their variabilities were investigated according to sample size (10 to 1000 acquisitions) and patient body weight categories (no weight selection, 67-73 kg and 60-80 kg). The 95 % confidence interval in percentage of their median (CI95/med) value was calculated for increasing sample sizes. We deduced the sample size that set a 95 % CI lower than 10 % of the median (CI95/med ≤ 10 %). Sample sizes ensuring CI95/med ≤ 10 % ranged from 15 to 900 depending on the body region and the dose descriptor considered. In sample sizes recommended by regulatory authorities (i.e., from 10-20 patients), mean CTDIvol and DLP of one sample ranged from 0.50 to 2.00 times their actual values extracted from 2000 samples. The sampling error in CTDIvol and DLP means is high in dose surveys based on small samples of patients. Sample size should be increased at least tenfold to decrease this variability. (orig.)

  5. A Unimodal Model for Double Observer Distance Sampling Surveys.

    Directory of Open Access Journals (Sweden)

    Earl F Becker

    Distance sampling is a widely used method to estimate animal population size. Most distance sampling models utilize a monotonically decreasing detection function such as a half-normal. Recent advances in distance sampling modeling allow for the incorporation of covariates into the distance model, and the elimination of the assumption of perfect detection at some fixed distance (usually the transect line) with the use of double-observer models. The assumption of full observer independence in the double-observer model is problematic, but can be addressed by using the point independence assumption, which assumes there is one distance, the apex of the detection function, at which the two observers are independent. Aerially collected distance sampling data can have a unimodal shape and have been successfully modeled with a gamma detection function. Covariates in gamma detection models cause the apex of detection to shift depending upon covariate levels, making this model incompatible with the point independence assumption when using double-observer data. This paper reports a unimodal detection model based on a two-piece normal distribution that allows covariates, has only one apex, and is consistent with the point independence assumption when double-observer data are utilized. An aerial line-transect survey of black bears in Alaska illustrates how this method can be applied.
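    The shape of such a unimodal detection function can be sketched with a two-piece normal; the apex location and scale values below are illustrative, not estimates from the bear survey:

```python
import math

def two_piece_normal(x, apex, s1, s2):
    """Unimodal detection function built from a two-piece normal:
    one scale below the apex, another above, with a single mode at
    `apex`, scaled so detection probability equals 1 at the apex."""
    s = s1 if x < apex else s2
    return math.exp(-((x - apex) ** 2) / (2.0 * s * s))

# Detection rises toward an apex away from the aircraft (the strip
# directly below is hard to see) and falls off beyond it; the
# parameter values here are illustrative.
g = [two_piece_normal(d, apex=80.0, s1=40.0, s2=120.0) for d in (0.0, 80.0, 300.0)]
```

    Because the mode stays at `apex` regardless of how covariates shift the two scales, the function keeps a single apex, which is what makes it compatible with the point independence assumption.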

  6. A two-stage cluster sampling method using gridded population data, a GIS, and Google EarthTM imagery in a population-based mortality survey in Iraq

    Directory of Open Access Journals (Sweden)

    Galway LP

    2012-04-01

    Background: Mortality estimates can measure and monitor the impacts of conflict on a population, guide humanitarian efforts, and help to better understand the public health impacts of conflict. Vital statistics registration and surveillance systems are rarely functional in conflict settings, posing the challenge of estimating mortality using retrospective population-based surveys. Results: We present a two-stage cluster sampling method for application in population-based mortality surveys. The sampling method utilizes gridded population data and a geographic information system (GIS) to select clusters in the first sampling stage, and Google Earth™ imagery and sampling grids to select households in the second sampling stage. The sampling method was implemented in a household mortality study in Iraq in 2011. Factors affecting feasibility and methodological quality are described. Conclusion: Sampling is a challenge in retrospective population-based mortality studies, and alternatives that improve on the conventional approaches are needed. The sampling strategy presented here was designed to generate a representative sample of the Iraqi population while reducing the potential for bias and considering the context-specific challenges of the study setting. This sampling strategy, or variations on it, is adaptable and should be considered and tested in other conflict settings.
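    The first-stage selection can be sketched with systematic probability-proportional-to-size (PPS) sampling over gridded population counts; the frame, cell counts, and number of clusters below are hypothetical:

```python
import random

random.seed(3)

# Hypothetical frame: 200 grid cells with population counts (stage 1);
# households within selected cells would be chosen in stage 2.
cells = {f"cell_{i}": random.randint(50, 2000) for i in range(200)}

def pps_sample(frame, k):
    """Select k clusters with probability proportional to size via
    systematic sampling on the cumulative population. A cell larger
    than the sampling step can be selected more than once."""
    names = list(frame)
    totals, running = [], 0
    for name in names:
        running += frame[name]
        totals.append(running)
    step = running / k
    start = random.uniform(0, step)
    picks, idx = [], 0
    for j in range(k):
        target = start + j * step
        while totals[idx] < target:
            idx += 1
        picks.append(names[idx])
    return picks

clusters = pps_sample(cells, 30)
# Stage 2: a fixed take of households per selected cluster (e.g., 40)
# makes the overall two-stage design approximately self-weighting.
```

    In the paper's design, the gridded counts come from a GIS layer and the stage-2 household selection is done on sampling grids overlaid on Google Earth imagery.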

  7. Estimating HIES Data through Ratio and Regression Methods for Different Sampling Designs

    Directory of Open Access Journals (Sweden)

    Faqir Muhammad

    2007-01-01

    In this study, a comparison has been made of different sampling designs, using the HIES data of North West Frontier Province (NWFP) for 2001-02 and 1998-99 collected from the Federal Bureau of Statistics, Statistical Division, Government of Pakistan, Islamabad. The performance of the estimators has also been assessed using the bootstrap and the jackknife. A two-stage stratified random sample design is adopted by HIES. In the first stage, enumeration blocks and villages are treated as the first-stage primary sampling units (PSUs), selected with probability proportional to size. Secondary sampling units (SSUs), i.e., households, are selected by systematic sampling with a random start. HIES used a single study variable. We have compared the HIES technique with some other designs, namely stratified simple random sampling, stratified systematic sampling, stratified ranked set sampling, and stratified two-phase sampling. Ratio and regression methods were applied with two study variables: income (y) and household size (x). The jackknife and bootstrap were used for variance estimation. Simple random sampling with sample sizes of 462 to 561 gave moderate variances both by jackknife and bootstrap. Applying systematic sampling, we obtained moderate variance with a sample size of 467. In the jackknife with systematic sampling, the variance of the regression estimator was greater than that of the ratio estimator for sample sizes of 467 to 631; at a sample size of 952 the variance of the ratio estimator became greater than that of the regression estimator. The most efficient design proved to be ranked set sampling: with jackknife and bootstrap it gives minimum variance even at the smallest sample size (467). Two-phase sampling performed poorly. The multi-stage sampling applied by HIES gave large variances, especially when used with a single study variable.
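    The ratio and regression estimators, with a delete-one jackknife for the standard error, can be sketched as follows; the income and household-size data are simulated, not the HIES microdata:

```python
import math
import random
import statistics

random.seed(7)

# Hypothetical frame: household income (y) roughly proportional to
# household size (x); the population mean of x is assumed known,
# as when sizes come from a census.
N = 5000
X = [random.randint(2, 12) for _ in range(N)]
Y = [3000 * xi + random.gauss(0, 4000) for xi in X]
xbar_pop = statistics.mean(X)

units = random.sample(range(N), 500)
x = [X[i] for i in units]
y = [Y[i] for i in units]

def ratio_estimate(x, y):
    """Ratio estimator of mean income: (sum y / sum x) * known x-mean."""
    return sum(y) / sum(x) * xbar_pop

def regression_estimate(x, y):
    """Regression estimator: adjust the sample y-mean by the fitted
    slope times the gap between known and sampled x-means."""
    mx, my = statistics.mean(x), statistics.mean(y)
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return my + b * (xbar_pop - mx)

def jackknife_se(estimator, x, y):
    """Delete-one jackknife standard error of an estimator."""
    n = len(x)
    reps = [estimator(x[:i] + x[i + 1:], y[:i] + y[i + 1:]) for i in range(n)]
    m = statistics.mean(reps)
    return math.sqrt((n - 1) / n * sum((r - m) ** 2 for r in reps))
```

    Both estimators exploit the auxiliary variable x; which one wins depends on how close the y-x relationship is to proportionality, which mirrors the sample-size-dependent ordering the study reports.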

  8. Sampling designs matching species biology produce accurate and affordable abundance indices

    Directory of Open Access Journals (Sweden)

    Grant Harris

    2013-12-01

    Full Text Available Wildlife biologists often use grid-based designs to sample animals and generate abundance estimates. Although sampling in grids is theoretically sound, in application, the method can be logistically difficult and expensive when sampling elusive species inhabiting extensive areas. These factors make it challenging to sample animals and meet the statistical assumption of all individuals having an equal probability of capture. Violating this assumption biases results. Does an alternative exist? Perhaps by sampling only where resources attract animals (i.e., targeted sampling), it would provide accurate abundance estimates more efficiently and affordably. However, biases from this approach would also arise if individuals have an unequal probability of capture, especially if some failed to visit the sampling area. Since most biological programs are resource limited, and acquiring abundance data drives many conservation and management applications, it becomes imperative to identify economical and informative sampling designs. Therefore, we evaluated abundance estimates generated from grid and targeted sampling designs using simulations based on geographic positioning system (GPS) data from 42 Alaskan brown bears (Ursus arctos). Migratory salmon drew brown bears from the wider landscape, concentrating them at anadromous streams. This provided a scenario for testing the targeted approach. Grid and targeted sampling varied by trap amount, location (traps placed randomly, systematically or by expert opinion), and traps stationary or moved between capture sessions. We began by identifying when to sample, and if bears had equal probability of capture. We compared abundance estimates against seven criteria: bias, precision, accuracy, effort, plus encounter rates, and probabilities of capture and recapture. One grid (49 km² cells) and one targeted configuration provided the most accurate results.
Both placed traps by expert opinion and moved traps between capture sessions.

  9. Sampling designs matching species biology produce accurate and affordable abundance indices.

    Science.gov (United States)

    Harris, Grant; Farley, Sean; Russell, Gareth J; Butler, Matthew J; Selinger, Jeff

    2013-01-01

    Wildlife biologists often use grid-based designs to sample animals and generate abundance estimates. Although sampling in grids is theoretically sound, in application, the method can be logistically difficult and expensive when sampling elusive species inhabiting extensive areas. These factors make it challenging to sample animals and meet the statistical assumption of all individuals having an equal probability of capture. Violating this assumption biases results. Does an alternative exist? Perhaps by sampling only where resources attract animals (i.e., targeted sampling), it would provide accurate abundance estimates more efficiently and affordably. However, biases from this approach would also arise if individuals have an unequal probability of capture, especially if some failed to visit the sampling area. Since most biological programs are resource limited, and acquiring abundance data drives many conservation and management applications, it becomes imperative to identify economical and informative sampling designs. Therefore, we evaluated abundance estimates generated from grid and targeted sampling designs using simulations based on geographic positioning system (GPS) data from 42 Alaskan brown bears (Ursus arctos). Migratory salmon drew brown bears from the wider landscape, concentrating them at anadromous streams. This provided a scenario for testing the targeted approach. Grid and targeted sampling varied by trap amount, location (traps placed randomly, systematically or by expert opinion), and traps stationary or moved between capture sessions. We began by identifying when to sample, and if bears had equal probability of capture. We compared abundance estimates against seven criteria: bias, precision, accuracy, effort, plus encounter rates, and probabilities of capture and recapture. One grid (49 km² cells) and one targeted configuration provided the most accurate results. Both placed traps by expert opinion and moved traps between capture sessions.

  10. Sampling designs matching species biology produce accurate and affordable abundance indices

    Science.gov (United States)

    Farley, Sean; Russell, Gareth J.; Butler, Matthew J.; Selinger, Jeff

    2013-01-01

    Wildlife biologists often use grid-based designs to sample animals and generate abundance estimates. Although sampling in grids is theoretically sound, in application, the method can be logistically difficult and expensive when sampling elusive species inhabiting extensive areas. These factors make it challenging to sample animals and meet the statistical assumption of all individuals having an equal probability of capture. Violating this assumption biases results. Does an alternative exist? Perhaps by sampling only where resources attract animals (i.e., targeted sampling), it would provide accurate abundance estimates more efficiently and affordably. However, biases from this approach would also arise if individuals have an unequal probability of capture, especially if some failed to visit the sampling area. Since most biological programs are resource limited, and acquiring abundance data drives many conservation and management applications, it becomes imperative to identify economical and informative sampling designs. Therefore, we evaluated abundance estimates generated from grid and targeted sampling designs using simulations based on geographic positioning system (GPS) data from 42 Alaskan brown bears (Ursus arctos). Migratory salmon drew brown bears from the wider landscape, concentrating them at anadromous streams. This provided a scenario for testing the targeted approach. Grid and targeted sampling varied by trap amount, location (traps placed randomly, systematically or by expert opinion), and traps stationary or moved between capture sessions. We began by identifying when to sample, and if bears had equal probability of capture. We compared abundance estimates against seven criteria: bias, precision, accuracy, effort, plus encounter rates, and probabilities of capture and recapture. One grid (49 km² cells) and one targeted configuration provided the most accurate results.
Both placed traps by expert opinion and moved traps between capture sessions.
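
    The capture-recapture abundance estimation underlying the simulations in these abstracts can be illustrated with the simplest two-session case. The sketch below uses a Chapman-corrected Lincoln-Petersen estimator; the population size and capture probability are hypothetical, and equal capture probability is assumed, which is exactly the assumption the papers discuss.

    ```python
    import random

    random.seed(7)

    N_true = 500          # true (hypothetical) population size
    p_capture = 0.2       # equal capture probability per session (key assumption)

    def session(n, p):
        """Return the set of individuals captured in one sampling session."""
        return {i for i in range(n) if random.random() < p}

    first = session(N_true, p_capture)    # capture and mark
    second = session(N_true, p_capture)   # recapture session
    marked_again = len(first & second)    # marked animals seen again

    # Chapman's bias-corrected Lincoln-Petersen abundance estimator.
    N_hat = (len(first) + 1) * (len(second) + 1) / (marked_again + 1) - 1
    print(round(N_hat))
    ```

    When some individuals never visit the sampled area (as with targeted sampling at streams), their capture probability is zero, recaptures are inflated relative to the population actually at risk, and this estimator is biased; that is the trade-off the study quantifies.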

  11. Lagoa Real design. Description and evaluation of sampling system

    International Nuclear Information System (INIS)

    Hashizume, B.K.

    1982-10-01

    This report describes the sample preparation system for drill cores from the Lagoa Real Project, aimed at obtaining representative fractions of the drilled material. The combined sampling-plus-analysis error and the analytical accuracy were determined by delayed neutron analysis. (author)

  12. Design and methods in a survey of living conditions in the Arctic - the SLiCA study.

    Science.gov (United States)

    Eliassen, Bent-Martin; Melhus, Marita; Kruse, Jack; Poppel, Birger; Broderstad, Ann Ragnhild

    2012-03-19

    The main objective of this study is to describe the methods and design of the survey of living conditions in the Arctic (SLiCA), relevant participation rates and the distribution of participants, as applicable to the survey data in Alaska, Greenland and Norway. This article briefly addresses possible selection bias in the data and also the ways to tackle it in future studies. Population-based cross-sectional survey. Indigenous individuals aged 16 years and older, living in Greenland, Alaska and in traditional settlement areas in Norway, were invited to participate. Random sampling methods were applied in Alaska and Greenland, while non-probability sampling methods were applied in Norway. Data were collected in 3 periods: in Alaska, from January 2002 to February 2003; in Greenland, from December 2003 to August 2006; and in Norway, in 2003 and from June 2006 to June 2008. The principal method in SLiCA was standardised face-to-face interviews using a questionnaire. A total of 663, 1,197 and 445 individuals were interviewed in Alaska, Greenland and Norway, respectively. Very high overall participation rates of 83% were obtained in Greenland and Alaska, while a more conventional rate of 57% was achieved in Norway. A predominance of female respondents was obtained in Alaska. Overall, the Sami cohort is older than the cohorts from Greenland and Alaska. Preliminary assessments suggest that selection bias in the Sami sample is plausible but not a major threat. Few or no threats to validity are detected in the data from Alaska and Greenland. Despite different sampling and recruitment methods, and sociocultural differences, a unique database has been generated, which shall be used to explore relationships between health and other living conditions variables.

  13. Synthetic Multiple-Imputation Procedure for Multistage Complex Samples

    Directory of Open Access Journals (Sweden)

    Zhou Hanzhi

    2016-03-01

    Full Text Available Multiple imputation (MI) is commonly used when item-level missing data are present. However, MI requires that survey design information be built into the imputation models. For multistage stratified clustered designs, this requires dummy variables to represent strata as well as primary sampling units (PSUs) nested within each stratum in the imputation model. Such a modeling strategy is not only operationally burdensome but also inferentially inefficient when there are many strata in the sample design. Complexity only increases when sampling weights need to be modeled. This article develops a general-purpose analytic strategy for population inference from complex sample designs with item-level missingness. In a simulation study, the proposed procedures demonstrate efficient estimation and good coverage properties. We also consider an application to accommodate missing body mass index (BMI) data in the analysis of BMI percentiles using National Health and Nutrition Examination Survey (NHANES III) data. We argue that the proposed methods offer an easy-to-implement solution to problems that are not well-handled by current MI techniques. Note that, while the proposed method borrows from the MI framework to develop its inferential methods, it is not designed as an alternative strategy to release multiply imputed datasets for complex sample design data, but rather as an analytic strategy in and of itself.
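
    MI inference of the kind discussed here ultimately rests on combining point estimates and variances across imputed datasets via Rubin's standard combining rules (shown below as a generic sketch, not the article's specific synthetic procedure; the five estimates and variances are hypothetical).

    ```python
    import math

    # Hypothetical point estimates and variances from m = 5 imputed datasets.
    estimates = [24.1, 23.8, 24.5, 24.0, 24.3]
    variances = [0.40, 0.38, 0.42, 0.41, 0.39]
    m = len(estimates)

    q_bar = sum(estimates) / m                               # combined point estimate
    u_bar = sum(variances) / m                               # within-imputation variance
    b = sum((q - q_bar) ** 2 for q in estimates) / (m - 1)   # between-imputation variance
    total_var = u_bar + (1 + 1 / m) * b                      # Rubin's total variance

    se = math.sqrt(total_var)
    print(q_bar, se)
    ```

    The between-imputation term is what inflates the variance to reflect missing-data uncertainty; design-based complications (strata, PSUs, weights) enter through how each per-imputation estimate and variance is computed.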

  14. Handling missing data in ranked set sampling

    CERN Document Server

    Bouza-Herrera, Carlos N

    2013-01-01

    The existence of missing observations is a very important aspect to be considered in the application of survey sampling. In human populations they may be caused by a refusal of some interviewees to give the true value for the variable of interest. Traditionally, simple random sampling is used to select samples, and most statistical models assume samples selected by means of this design. In recent decades, an alternative design has come into use which, in many cases, improves accuracy compared with traditional sampling. It is called ranked set sampling.
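
    Ranked set sampling, the design this book treats, can be sketched under perfect ranking. The simulation below compares the variance of the RSS and SRS sample means for the same number of measured units; the population, set size, and cycle count are hypothetical.

    ```python
    import random
    import statistics

    random.seed(3)

    def srs_mean(pop, n):
        """Simple random sample mean of n measured units."""
        return statistics.fmean(random.sample(pop, n))

    def rss_mean(pop, set_size, cycles):
        """Balanced ranked set sample mean: each cycle draws `set_size` sets of
        `set_size` units, ranks each set (perfect ranking assumed), and measures
        only the i-th order statistic of the i-th set."""
        vals = []
        for _ in range(cycles):
            for i in range(set_size):
                s = sorted(random.sample(pop, set_size))
                vals.append(s[i])
        return statistics.fmean(vals)

    population = [random.gauss(50, 10) for _ in range(10_000)]
    reps = 2000
    k, c = 4, 3    # set size 4, 3 cycles -> 12 measured units either way
    var_srs = statistics.pvariance([srs_mean(population, k * c) for _ in range(reps)])
    var_rss = statistics.pvariance([rss_mean(population, k, c) for _ in range(reps)])
    print(var_srs, var_rss)
    ```

    With perfect ranking the RSS mean is typically markedly less variable than the SRS mean for the same measurement budget, which is the accuracy improvement the abstract refers to; missing responses disturb the balance across ranks, motivating the book's adjusted estimators.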

  15. The economic impact of poor sample quality in clinical chemistry laboratories: results from a global survey.

    Science.gov (United States)

    Erdal, Erik P; Mitra, Debanjali; Khangulov, Victor S; Church, Stephen; Plokhoy, Elizabeth

    2017-03-01

    Background Despite advances in clinical chemistry testing, poor blood sample quality continues to impact laboratory operations and the quality of results. While previous studies have identified the preanalytical causes of lower sample quality, few studies have examined the economic impact of poor sample quality on the laboratory. Specifically, the costs associated with workarounds related to fibrin and gel contaminants remain largely unexplored. Methods A quantitative survey of clinical chemistry laboratory stakeholders across 10 international regions, including countries in North America, Europe and Oceania, was conducted to examine current blood sample testing practices, sample quality issues and practices to remediate poor sample quality. Survey data were used to estimate costs incurred by laboratories to mitigate sample quality issues. Results Responses from 164 participants were included in the analysis, which was focused on three specific issues: fibrin strands, fibrin masses and gel globules. Fibrin strands were the most commonly reported issue, with an overall incidence rate of ∼3%. Further, 65% of respondents indicated that these issues contribute to analyzer probe clogging, and the majority of laboratories had visual inspection and manual remediation practices in place to address fibrin- and gel-related quality problems (55% and 70%, respectively). Probe maintenance/replacement, visual inspection and manual remediation were estimated to carry significant costs for the laboratories surveyed. Annual cost associated with lower sample quality and remediation related to fibrin and/or gel globules for an average US laboratory was estimated to be $100,247. Conclusions Measures to improve blood sample quality present an important step towards improved laboratory operations.

  16. Design review report for rotary mode core sample truck (RMCST) modifications for flammable gas tanks, preliminary design

    International Nuclear Information System (INIS)

    Corbett, J.E.

    1996-02-01

    This report documents the completion of a preliminary design review for the Rotary Mode Core Sample Truck (RMCST) modifications for flammable gas tanks. The RMCST modifications are intended to support core sampling operations in waste tanks requiring flammable gas controls. The objective of this review was to validate basic design assumptions and concepts to support a path forward leading to a final design. The conclusion reached by the review committee was that the design was acceptable and efforts should continue toward a final design review

  17. Development of a sampling strategy and sample size calculation to estimate the distribution of mammographic breast density in Korean women.

    Science.gov (United States)

    Jun, Jae Kwan; Kim, Mi Jin; Choi, Kui Son; Suh, Mina; Jung, Kyu-Won

    2012-01-01

    Mammographic breast density is a known risk factor for breast cancer. To conduct a survey to estimate the distribution of mammographic breast density in Korean women, appropriate sampling strategies for representative and efficient sampling design were evaluated through simulation. Using the target population from the National Cancer Screening Programme (NCSP) for breast cancer in 2009, we verified the distribution estimate by repeating the simulation 1,000 times using stratified random sampling to investigate the distribution of breast density of 1,340,362 women. According to the simulation results, using a sampling design stratifying the nation into three groups (metropolitan, urban, and rural), with a total sample size of 4,000, we estimated the distribution of breast density in Korean women at a level of 0.01% tolerance. Based on the results of our study, a nationwide survey for estimating the distribution of mammographic breast density among Korean women can be conducted efficiently.
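
    The simulation logic described above, repeatedly drawing stratified random samples of a candidate total size and checking how tightly the estimates cluster around the truth, can be sketched as follows. The strata sizes and prevalences are hypothetical stand-ins, not the NCSP values.

    ```python
    import random

    random.seed(11)

    # Hypothetical population: prevalence of dense breasts differs by stratum.
    strata = {
        "metropolitan": (600_000, 0.55),
        "urban":        (500_000, 0.50),
        "rural":        (240_000, 0.45),
    }
    N = sum(size for size, _ in strata.values())
    true_p = sum(size * p for size, p in strata.values()) / N

    def stratified_estimate(total_n):
        """Proportional-allocation stratified estimate of the overall proportion."""
        est = 0.0
        for size, p in strata.values():
            n_h = round(total_n * size / N)              # stratum sample size
            hits = sum(random.random() < p for _ in range(n_h))
            est += (size / N) * hits / n_h               # weighted stratum estimate
        return est

    reps = 500
    errors = [abs(stratified_estimate(4000) - true_p) for _ in range(reps)]
    # With n = 4000, nearly all estimates land within ~1.6 percentage points.
    print(max(errors), sum(e < 0.016 for e in errors) / reps)
    ```

    Repeating this for a grid of candidate sample sizes, as the study does 1,000 times per design, identifies the smallest n that meets the desired tolerance.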

  18. Mechanical design and simulation of an automatized sample exchanger

    International Nuclear Information System (INIS)

    Lopez, Yon; Gora, Jimmy; Bedregal, Patricia; Hernandez, Yuri; Baltuano, Oscar; Gago, Javier

    2013-01-01

    A turntable-type sample exchanger for irradiation, with a capacity of up to 20 capsules, was designed. Its function is the automatic dispatch of samples contained in polyethylene capsules for irradiation in the grid position of the reactor core, using a pneumatic system, with subsequent analysis by neutron activation. This study presents the structural design analysis and the calculations behind the selection of motors and actuators. This development will improve analysis efficiency, reducing manual handling by workers and thus their radiation exposure time. (authors).

  19. Precision of systematic and random sampling in clustered populations: habitat patches and aggregating organisms.

    Science.gov (United States)

    McGarvey, Richard; Burch, Paul; Matthews, Janet M

    2016-01-01

    Natural populations of plants and animals spatially cluster because (1) suitable habitat is patchy, and (2) within suitable habitat, individuals aggregate further into clusters of higher density. We first compare the precision of random and systematic field sampling survey designs under these two processes of species clustering. Second, we evaluate the performance of 13 estimators for the variance of the sample mean from a systematic survey. Replicated simulated surveys, as counts from 100 transects allocated either randomly or systematically within the study region, were used to estimate population density in six spatial point populations including habitat patches and Matérn circular clustered aggregations of organisms, separately and in combination. The standard one-start aligned systematic survey design, a uniform 10 × 10 grid of transects, was much more precise. Variances of the 10,000 replicated systematic survey mean densities were one-third to one-fifth of those from randomly allocated transects, implying transect sample sizes giving equivalent precision by random survey would need to be three to five times larger. Organisms being restricted to patches of habitat was alone sufficient to yield this precision advantage for the systematic design. But this improved precision for systematic sampling in clustered populations is underestimated by the standard variance estimators used to compute confidence intervals. True variance for the survey sample mean was computed from the variance of 10,000 simulated survey mean estimates. Testing 10 published and three newly proposed variance estimators, the two estimators that corrected for inter-transect correlation (ν₈ and ν_W) were the most accurate and also the most precise in clustered populations. These greatly outperformed the two "post-stratification" variance estimators (ν₂ and ν₃) that are now more commonly applied in systematic surveys. Similar variance estimator performance rankings were found with
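
    The qualitative finding, that systematic transects beat random ones when organisms are confined to habitat patches, can be reproduced in a toy one-dimensional simulation (landscape layout and densities are hypothetical, much simplified from the paper's spatial point populations).

    ```python
    import random
    import statistics

    random.seed(5)

    # 1-D landscape of 100 cells: animals occur only inside two habitat patches.
    cells = [0.0] * 100
    for start in (10, 50):                     # two 20-cell habitat patches
        for i in range(start, start + 20):
            cells[i] = random.gauss(20, 2)     # fairly even density within patches

    def random_mean(n):
        """Mean density from n transects placed at random, without replacement."""
        return statistics.fmean(random.sample(cells, n))

    def systematic_mean(n):
        """Mean density from a one-start aligned systematic sample of n transects."""
        step = len(cells) // n
        start = random.randrange(step)         # single random start
        return statistics.fmean(cells[start::step])

    reps = 3000
    v_rand = statistics.pvariance([random_mean(10) for _ in range(reps)])
    v_sys = statistics.pvariance([systematic_mean(10) for _ in range(reps)])
    print(v_rand, v_sys)
    ```

    Because every systematic start covers the patches evenly, the between-sample variance collapses, while random placement sometimes hits the patches heavily and sometimes misses them; computing the true variance from replicated simulated surveys, as here, is also how the paper benchmarks its 13 variance estimators.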

  20. Outcome-Dependent Sampling Design and Inference for Cox’s Proportional Hazards Model

    Science.gov (United States)

    Yu, Jichang; Liu, Yanyan; Cai, Jianwen; Sandler, Dale P.; Zhou, Haibo

    2016-01-01

    We propose a cost-effective outcome-dependent sampling (ODS) design for failure time data and develop an efficient inference procedure for data collected with this design. To account for the biased sampling scheme, we derive estimators from a weighted partial likelihood estimating equation. The proposed estimators for regression parameters are shown to be consistent and asymptotically normally distributed. A criterion that can be used to optimally implement the ODS design in practice is proposed and studied. The small-sample performance of the proposed method is evaluated by simulation studies. The proposed design and inference procedure are shown to be statistically more powerful than existing alternative designs with the same sample sizes. We illustrate the proposed method with real data from the Cancer Incidence and Mortality of Uranium Miners Study. PMID:28090134
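
    The full weighted partial likelihood for the Cox model is beyond a short sketch, but the core idea, inverse-probability weighting to undo outcome-dependent selection, can be shown for a simple mean. All distributions and sampling probabilities below are hypothetical.

    ```python
    import random

    random.seed(2)

    # Hypothetical population of outcomes; the design oversamples large values.
    population = [random.expovariate(1 / 10) for _ in range(50_000)]
    true_mean = sum(population) / len(population)

    def incl_prob(y):
        """Outcome-dependent design: large outcomes are 5x as likely to be sampled."""
        return 0.25 if y > 15 else 0.05

    sample = [(y, incl_prob(y)) for y in population if random.random() < incl_prob(y)]

    # Unweighted mean is biased upward; inverse-probability (Hajek) weighting
    # restores a consistent estimate, mirroring the weighted estimating equation.
    naive = sum(y for y, _ in sample) / len(sample)
    ipw = sum(y / p for y, p in sample) / sum(1 / p for y, p in sample)

    print(true_mean, naive, ipw)
    ```

    In the paper's setting the same weights enter the partial likelihood score for the Cox regression parameters rather than a simple mean, but the bias-correction mechanism is the same.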

  1. Shorlist Masterplan Wind. Evaluation of the sampling grid of the year-round ichthyoplankton survey

    NARCIS (Netherlands)

    Bolle, L.J.; Beek, van J.K.L.

    2011-01-01

    Within the research programme 'Shortlist Masterplan Wind' a year-round ichthyoplankton survey is being carried out. The sampling area is based on known spawning concentrations and prevailing currents.

  2. A survey of archaeological and geological samples dated in 1990

    International Nuclear Information System (INIS)

    Mejdahl, V.

    1991-01-01

    A survey of dated archaeological and geological samples is given, using thermoluminescence dating. Some of the sediment samples were also dated by means of optically stimulated luminescence (OSL) using a newly developed infrared diode system. In most cases the luminescence dates are in accordance with archaeological and geological estimates. Some discrepancies were found because some feldspar samples exhibited severe anomalous fading. It may be possible to avoid this problem by basing the dating on OSL of quartz. For sediment samples of Eemian or Early Weichselian age severe underestimates were encountered with both methods. The reason might be related to the large difference between the natural dose rate and that used in laboratory irradiations. Traps corresponding to low-temperature peaks such as the 150 deg. C peak in feldspars will remain almost empty under natural conditions, but will fill up to saturation under laboratory irradiation and thereby more charges will be captured in high-temperature traps. As a result, natural growth curves and laboratory produced luminescence growth curves will have different slopes and this will lead to underestimation. This problem might be avoided by holding samples at an elevated temperature during laboratory irradiation, thus keeping the low-temperature traps empty. Preliminary experiments where feldspar samples were held at 130 deg. C during irradiation have given promising results. (AB) (31 refs.)

  3. Hospital survey on patient safety culture: psychometric analysis on a Scottish sample.

    Science.gov (United States)

    Sarac, Cakil; Flin, Rhona; Mearns, Kathryn; Jackson, Jeanette

    2011-10-01

    To investigate the psychometric properties of the Hospital Survey on Patient Safety Culture on a Scottish NHS data set. The data were collected from 1969 clinical staff (estimated 22% response rate) from one acute hospital from each of seven Scottish Health boards. Using a split-half validation technique, the data were randomly split; an exploratory factor analysis was conducted on the calibration data set, and confirmatory factor analyses were conducted on the validation data set to investigate and check the original US model fit in a Scottish sample. Following the split-half validation technique, exploratory factor analysis results showed a 10-factor optimal measurement model. The confirmatory factor analyses were then performed to compare the model fit of two competing models (10-factor alternative model vs 12-factor original model). A Satorra-Bentler (S-B) scaled χ² difference test demonstrated that the original 12-factor model performed significantly better in a Scottish sample. Furthermore, reliability analyses of each component yielded satisfactory results. The mean scores on the climate dimensions in the Scottish sample were comparable with those found in other European countries. This study provided evidence that the original 12-factor structure of the Hospital Survey on Patient Safety Culture scale has been replicated in this Scottish sample. Therefore, no modifications are required to the original 12-factor model, which is suggested for use, since it would allow researchers the possibility of cross-national comparisons.

  4. Geochemical reanalysis of historical U.S. Geological Survey sediment samples from the Zane Hills, Hughes and Shungnak quadrangles, Alaska

    Science.gov (United States)

    Werdon, Melanie B.; Granitto, Matthew; Azain, Jaime S.

    2015-01-01

    The State of Alaska’s Strategic and Critical Minerals (SCM) Assessment project, a State-funded Capital Improvement Project (CIP), is designed to evaluate Alaska’s statewide potential for SCM resources. The SCM Assessment is being implemented by the Alaska Division of Geological & Geophysical Surveys (DGGS), and involves obtaining new airborne-geophysical, geological, and geochemical data. As part of the SCM Assessment, thousands of historical geochemical samples from DGGS, U.S. Geological Survey (USGS), and U.S. Bureau of Mines archives are being reanalyzed by DGGS using modern, quantitative, geochemical-analytical methods. The objective is to update the statewide geochemical database to more clearly identify areas in Alaska with SCM potential.The USGS is also undertaking SCM-related geologic studies in Alaska through the federally funded Alaska Critical Minerals cooperative project. DGGS and USGS share the goal of evaluating Alaska’s strategic and critical minerals potential and together created a Letter of Agreement (signed December 2012) and a supplementary Technical Assistance Agreement (#14CMTAA143458) to facilitate the two agencies’ cooperative work. Under these agreements, DGGS contracted the USGS in Denver to reanalyze historical USGS sediment samples from Alaska.For this report, DGGS funded reanalysis of 105 historical USGS sediment samples from the statewide Alaska Geochemical Database Version 2.0 (AGDB2; Granitto and others, 2013). Samples were chosen from the Zane Hills area in the Hughes and Shungnak quadrangles, Alaska (fig. 1). The USGS was responsible for sample retrieval from the National Geochemical Sample Archive (NGSA) in Denver, Colorado through the final quality assurance/quality control (QA/QC) of the geochemical analyses obtained through the USGS contract lab. The new geochemical data are published in this report as a coauthored DGGS report, and will be incorporated into the statewide geochemical databases of both agencies.

  5. Technical Report and Data File User's Manual for the 1992 National Adult Literacy Survey.

    Science.gov (United States)

    Kirsch, Irwin; Yamamoto, Kentaro; Norris, Norma; Rock, Donald; Jungeblut, Ann; O'Reilly, Patricia; Berlin, Martha; Mohadjer, Leyla; Waksberg, Joseph; Goksel, Huseyin; Burke, John; Rieger, Susan; Green, James; Klein, Merle; Campbell, Anne; Jenkins, Lynn; Kolstad, Andrew; Mosenthal, Peter; Baldi, Stephane

    Chapter 1 of this report and user's manual describes design and implementation of the 1992 National Adult Literacy Survey (NALS). Chapter 2 reviews stages of sampling for national and state survey components; presents weighted and unweighted response rates for the household component; and describes non-incentive and prison sample designs. Chapter…

  6. Metamodels for Computer-Based Engineering Design: Survey and Recommendations

    Science.gov (United States)

    Simpson, Timothy W.; Peplinski, Jesse; Koch, Patrick N.; Allen, Janet K.

    1997-01-01

    The use of statistical techniques to build approximations of expensive computer analysis codes pervades much of today's engineering design. These statistical approximations, or metamodels, are used to replace the actual expensive computer analyses, facilitating multidisciplinary, multiobjective optimization and concept exploration. In this paper we review several of these techniques including design of experiments, response surface methodology, Taguchi methods, neural networks, inductive learning, and kriging. We survey their existing application in engineering design and then address the dangers of applying traditional statistical techniques to approximate deterministic computer analysis codes. We conclude with recommendations for the appropriate use of statistical approximation techniques in given situations and how common pitfalls can be avoided.
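
    As a minimal example of the metamodel idea, one can fit a quadratic response surface to a handful of runs of an "expensive" analysis code and then explore the cheap surrogate instead. The function, design points, and one-variable setting below are a toy illustration, not from the paper.

    ```python
    def expensive_sim(x):
        """Stand-in for a costly analysis code (hypothetical)."""
        return (x - 2.0) ** 2 + 3.0

    # Design of experiments: a handful of runs of the expensive code.
    xs = [0.0, 1.0, 2.0, 3.0, 4.0]
    ys = [expensive_sim(x) for x in xs]

    def quad_fit(xs, ys):
        """Least-squares fit of y ~= a + b*x + c*x^2 via the normal equations."""
        n = len(xs)
        s = lambda k: sum(x ** k for x in xs)
        sy = lambda k: sum((x ** k) * y for x, y in zip(xs, ys))
        A = [[n,    s(1), s(2), sy(0)],
             [s(1), s(2), s(3), sy(1)],
             [s(2), s(3), s(4), sy(2)]]
        # Gauss-Jordan elimination on the 3x4 augmented matrix.
        for i in range(3):
            piv = A[i][i]
            A[i] = [v / piv for v in A[i]]
            for j in range(3):
                if j != i:
                    A[j] = [vj - A[j][i] * vi for vj, vi in zip(A[j], A[i])]
        return A[0][3], A[1][3], A[2][3]

    a, b, c = quad_fit(xs, ys)
    surrogate = lambda x: a + b * x + c * x ** 2

    # The cheap surrogate locates the optimum without further expensive runs.
    x_min = -b / (2 * c)
    print(x_min, surrogate(x_min))
    ```

    Real metamodels (kriging, neural networks) generalize this to many variables and noisy or deterministic responses; the workflow of sampling, fitting, then optimizing on the surrogate is the same.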

  7. 1979 Reserve Force Studies Surveys: Survey Design, Sample Design and Administrative Procedures,

    Science.gov (United States)

    1981-08-01


  8. A knowledge - based system to assist in the design of soil survey schemes

    NARCIS (Netherlands)

    Domburg, P.

    1994-01-01

    Soil survey information with quantified accuracy is relevant to decisions on land use and environmental problems. To obtain such information statistical strategies should be used for collecting and analysing data. A survey project based on a statistical sampling strategy requires a soil

  9. SABE Colombia: Survey on Health, Well-Being, and Aging in Colombia—Study Design and Protocol

    Science.gov (United States)

    Corchuelo, Jairo; Curcio, Carmen-Lucia; Calzada, Maria-Teresa; Mendez, Fabian

    2016-01-01

    Objective. To describe the design of the SABE Colombia study. The major health study of older people in Latin America and the Caribbean (LAC) is the Survey on Health, Well-Being, and Aging in LAC, SABE (from its initials in Spanish: SAlud, Bienestar & Envejecimiento). Methods. The SABE Colombia is a population-based cross-sectional study on health, aging, and well-being of elderly individuals aged at least 60 years, focusing attention on social determinants of health inequities. Methods and design were similar to the original LAC SABE. The total sample size of the study at the urban and rural research sites (244 municipalities) was 23,694 elderly Colombians, representative of the total population. The study had three components: (1) a questionnaire covering active aging determinants including anthropometry, blood pressure measurement, physical function, and biochemical and hematological measures; (2) a subsample survey among family caregivers; (3) a qualitative study with gender and cultural perspectives to understand the different meanings people attach to quality of life. Conclusions. The SABE Colombia is a comprehensive, multidisciplinary study of the elderly with respect to active aging determinants. The results of this study are intended to inform public policies aimed at tackling health inequalities for the aging society in Colombia. PMID:27956896

  10. Design and validation of the Quantum Mechanics Conceptual Survey

    Directory of Open Access Journals (Sweden)

    S. B. McKagan

    2010-11-01

    Full Text Available The Quantum Mechanics Conceptual Survey (QMCS) is a 12-question survey of students' conceptual understanding of quantum mechanics. It is intended to be used to measure the relative effectiveness of different instructional methods in modern physics courses. In this paper, we describe the design and validation of the survey, a process that included observations of students, a review of previous literature, textbooks, and syllabi, faculty and student interviews, and statistical analysis. We also discuss issues in the development of specific questions, which may be useful both for instructors who wish to use the QMCS in their classes and for researchers who wish to conduct further research on student understanding of quantum mechanics. The QMCS has been most thoroughly tested in, and is most appropriate for assessment of (as a posttest only), sophomore-level modern physics courses. We also describe testing with students in junior quantum courses and graduate quantum courses, from which we conclude that the QMCS may be appropriate for assessing junior quantum courses, but is not appropriate for assessing graduate courses. One surprising result of our faculty interviews is a lack of faculty consensus on what topics should be taught in modern physics, which has made designing a test that is valued by a majority of physics faculty more difficult than expected.

  11. An evaluation of the quality of statistical design and analysis of published medical research: results from a systematic survey of general orthopaedic journals.

    Science.gov (United States)

    Parsons, Nick R; Price, Charlotte L; Hiskens, Richard; Achten, Juul; Costa, Matthew L

    2012-04-25

    The application of statistics in reported research in trauma and orthopaedic surgery has become ever more important and complex. Despite the extensive use of statistical analysis, it is still a subject which is often not conceptually well understood, resulting in clear methodological flaws and inadequate reporting in many papers. A detailed statistical survey sampled 100 representative orthopaedic papers using a validated questionnaire that assessed the quality of the trial design and statistical analysis methods. The survey found evidence of failings in study design, statistical methodology and presentation of the results. Overall, in 17% (95% confidence interval; 10-26%) of the studies investigated the conclusions were not clearly justified by the results, in 39% (30-49%) of studies a different analysis should have been undertaken and in 17% (10-26%) a different analysis could have made a difference to the overall conclusions. It is only by an improved dialogue between statistician, clinician, reviewer and journal editor that the failings in design methodology and analysis highlighted by this survey can be addressed.
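The interval estimates quoted above (e.g. 17% of 100 papers, 95% CI 10-26%) are confidence intervals for a binomial proportion. The paper does not state which interval method was used; as an illustration, a Wilson score interval (one common choice for survey proportions) gives a very similar range:

```python
from math import sqrt

def wilson_ci(successes, n, z=1.96):
    """Wilson score 95% confidence interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    margin = z * sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return centre - margin, centre + margin

# 17 of 100 papers had conclusions not clearly justified by the results
lo, hi = wilson_ci(17, 100)
print(f"{lo:.1%} - {hi:.1%}")  # 10.9% - 25.5%, close to the reported 10-26%
```

Exact (Clopper-Pearson) intervals, another common choice, would be slightly wider.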

  12. An evaluation of the quality of statistical design and analysis of published medical research: results from a systematic survey of general orthopaedic journals

    Directory of Open Access Journals (Sweden)

    Parsons Nick R

    2012-04-01

    Full Text Available Abstract Background The application of statistics in reported research in trauma and orthopaedic surgery has become ever more important and complex. Despite the extensive use of statistical analysis, it is still a subject which is often not conceptually well understood, resulting in clear methodological flaws and inadequate reporting in many papers. Methods A detailed statistical survey sampled 100 representative orthopaedic papers using a validated questionnaire that assessed the quality of the trial design and statistical analysis methods. Results The survey found evidence of failings in study design, statistical methodology and presentation of the results. Overall, in 17% (95% confidence interval; 10–26%) of the studies investigated the conclusions were not clearly justified by the results, in 39% (30–49%) of studies a different analysis should have been undertaken and in 17% (10–26%) a different analysis could have made a difference to the overall conclusions. Conclusion It is only by an improved dialogue between statistician, clinician, reviewer and journal editor that the failings in design methodology and analysis highlighted by this survey can be addressed.

  13. Small-vessel Survey and Auction Sampling to Estimate Growth and Maturity of Eteline Snappers

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Small-vessel Survey and Auction Sampling to Estimate Growth and Maturity of Eteline Snappers and Improve Data-Limited Stock Assessments. This biosampling project...

  14. National sample survey to assess the new case disease burden of leprosy in India

    Directory of Open Access Journals (Sweden)

    Kiran Katoch

    2017-01-01

    Full Text Available A national sample survey of leprosy was undertaken in partnership with Indian Council of Medical Research (ICMR) institutions, the National Leprosy Eradication Programme (NLEP), Panchayati Raj members, and treated leprosy patients to detect new cases of leprosy in India. The objectives of the survey were to estimate the new leprosy case load; to record both Grade 1 and Grade 2 disabilities in the new cases; and to assess the magnitude of stigma and discrimination prevalent in the society. A cluster-based, cross-sectional survey involving all States was used for the door-to-door survey, using inverse sampling methodology. Rural and urban clusters were sampled separately. The population screened for detecting 28 new cases in rural and 30 in urban clusters was enumerated, recorded and analyzed. Data capture and analysis in different schedules were the main tools used. For quality control, three tiers of experts were utilized for the confirmation of cases and disabilities. Self-stigma was assessed in more than half of the total new patients detected with disabilities, using the approved questionnaire. A different questionnaire was used to assess the stigma in the community. A population of 14,725,525 (10,302,443 rural; 4,423,082 urban) was screened and 2161 new cases - 1300 paucibacillary (PB) and 861 multibacillary (MB) - were detected. The estimated number of new leprosy cases was 330,346 (95% confidence limits, 287,445-380,851). Disabilities observed were 2.05/100,000 population and 13.9 per cent (302/2161) among the new cases. Self-stigma in patients with disabilities was reduced, and the patients were well accepted by the spouse, neighbour, at the workplace and in social functions.
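Under the inverse sampling used here, a cluster is screened until a predetermined number of cases (e.g. 28 in rural clusters) has been found, and the number of people screened becomes the random quantity. A standard unbiased prevalence estimator in that setting is Haldane's (r - 1)/(n - 1). The sketch below is illustrative only; the number screened is hypothetical, and the survey's actual estimation involved design weights beyond this simple formula:

```python
def haldane_prevalence(r, n_screened):
    """Unbiased prevalence estimate under inverse sampling, where screening
    stops once the r-th case is found (Haldane's estimator)."""
    if r < 2:
        raise ValueError("requires r >= 2")
    return (r - 1) / (n_screened - 1)

# e.g. a rural cluster screened until the 28th new case was found,
# having examined 25,000 people by then (hypothetical count)
rate = haldane_prevalence(28, 25_000)
print(f"{rate * 1e5:.1f} per 100,000")  # 108.0 per 100,000
```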

  15. Design development of robotic system for on line sampling in fuel reprocessing

    International Nuclear Information System (INIS)

    Balasubramanian, G.R.; Venugopal, P.R.; Padmashali, G.K.

    1990-01-01

    This presentation describes the design and developmental work being carried out on an automated sampling system for fast reactor fuel reprocessing plants. The plant proposes to use an integrated sampling system. The sample is taken across regular process streams from any intermediate hold-up pot. A robot system is planned to take the sample from the sample pot, transfer it to the sample bottle, cap the bottle and transfer the bottle to a pneumatic conveying station. The system covers a large number of sample pots. Alternate automated systems are also examined (1). (author). 4 refs., 2 figs

  16. Research Methods in Healthcare Epidemiology: Survey and Qualitative Research.

    Science.gov (United States)

    Safdar, Nasia; Abbo, Lilian M; Knobloch, Mary Jo; Seo, Susan K

    2016-11-01

    Surveys are one of the most frequently employed study designs in healthcare epidemiology research. Generally easier to undertake and less costly than many other study designs, surveys can be invaluable to gain insights into opinions and practices in large samples and may be descriptive and/or be used to test associations. In this context, qualitative research methods may complement this study design either at the survey development phase and/or at the interpretation/extension of results stage. This methods article focuses on key considerations for designing and deploying surveys in healthcare epidemiology and antibiotic stewardship, including identification of whether or not de novo survey development is necessary, ways to optimally lay out and display a survey, denominator measurement, discussion of biases to keep in mind particularly in research using surveys, and the role of qualitative research methods to complement surveys. We review examples of surveys in healthcare epidemiology and antimicrobial stewardship and review the pros and cons of methods used. A checklist is provided to help aid design and deployment of surveys in healthcare epidemiology and antimicrobial stewardship. Infect Control Hosp Epidemiol 2016;1-6.

  17. Factors affecting study efficiency and item non-response in health surveys in developing countries: the Jamaica national healthy lifestyle survey

    Directory of Open Access Journals (Sweden)

    Bennett Franklyn

    2007-02-01

    Full Text Available Abstract Background Health surveys provide important information on the burden and secular trends of risk factors and disease. Several factors, including survey and item non-response, can affect data quality. There are few reports from developing countries on efficiency, validity and the impact of item non-response. This report examines factors associated with item non-response and study efficiency in a national health survey in a developing Caribbean island. Methods A national sample of participants aged 15–74 years was selected using a multi-stage sampling design accounting for 4 health regions and 14 parishes, with enumeration districts as primary sampling units. Means and proportions of the variables of interest were compared between various categories. Non-response was defined as failure to provide an analyzable response. Linear and logistic regression models accounting for sample design and post-stratification weighting were used to identify independent correlates of recruitment efficiency and item non-response. Results We recruited 2012 participants aged 15–74 years (66.2% females) at a response rate of 87.6%, with significant variation between regions (80.9% to 97.6%; p Conclusion Informative health surveys are possible in developing countries. While survey response rates may be satisfactory, item non-response was high in respect of income and sexual practice. In contrast to developed countries, non-response to questions on income is higher and has different correlates. These findings can inform future surveys.
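The post-stratification weighting mentioned above rescales the design weights within each stratum (e.g. age group) so that the weighted sample reproduces known population totals. A minimal sketch, with hypothetical counts (the Jamaica survey's actual strata and weighting cells are not given here):

```python
def poststratify(base_weights, strata, pop_totals):
    """Rescale base design weights so that, within each stratum, the
    weights sum to the known population total (post-stratification)."""
    wsum = {}
    for w, s in zip(base_weights, strata):
        wsum[s] = wsum.get(s, 0.0) + w
    return [w * pop_totals[s] / wsum[s] for w, s in zip(base_weights, strata)]

# hypothetical mini-survey: 4 respondents in 2 age strata
weights = [100.0, 100.0, 150.0, 150.0]
strata = ["15-44", "15-44", "45-74", "45-74"]
pop = {"15-44": 600.0, "45-74": 450.0}
adjusted = poststratify(weights, strata, pop)
print(adjusted)  # [300.0, 300.0, 225.0, 225.0]
```

After adjustment the weights in each stratum sum to that stratum's population total, which corrects for differential non-response across strata.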

  18. Web-based Surveys: Changing the Survey Process

    OpenAIRE

    Gunn, Holly

    2002-01-01

    Web-based surveys are having a profound influence on the survey process. Unlike other types of surveys, Web page design skills and computer programming expertise play a significant role in the design of Web-based surveys. Survey respondents face new and different challenges in completing a Web-based survey. This paper examines the different types of Web-based surveys, the advantages and challenges of using Web-based surveys, the design of Web-based surveys, and the issues of validity, error, ...

  19. Principles of survey development for telemedicine applications.

    Science.gov (United States)

    Demiris, George

    2006-01-01

    Surveys can be used in the evaluation of telemedicine applications but they must be properly designed, consistent and accurate. The purpose of the survey and the resources available will determine the extent of testing that a survey instrument should undergo prior to its use. The validity of an instrument is the correspondence between what is being measured and what was intended to be measured. The reliability of an instrument describes the 'consistency' or 'repeatability' of the measurements made with it. Survey instruments should be designed and tested following basic principles of survey development. The actual survey administration also requires consideration, for example data collection and processing, as well as the interpretation of the findings. Surveys are of two different types. Either they are self-administered, or they are administered by interview. In the latter case, they may be administered by telephone or in a face-to-face meeting. It is important to design a survey instrument based on a detailed definition of what it intends to measure and to test it before administering it to the larger sample.

  20. Induced Polarization Surveying for Acid Rock Screening in Highway Design

    Science.gov (United States)

    Butler, K. E.; Al, T.; Bishop, T.

    2004-05-01

    Highway and pipeline construction agencies have become increasingly vigilant in their efforts to avoid cutting through sulphide-bearing bedrock that has potential to produce acid rock drainage. Blasting and fragmentation of such rock increases the surface area available for sulphide oxidation and hence increases the risk of acid rock drainage unless the rock contains enough natural buffering capacity to neutralize the pH. In December, 2001, the New Brunswick Department of Transportation (NBOT) sponsored a field trial of geophysical surveying in order to assess its suitability as a screening tool for locating near-surface sulphides along proposed highway alignments. The goal was to develop a protocol that would allow existing programs of drilling and geochemical testing to be targeted more effectively, and provide design engineers with the information needed to reduce rock cuts where necessary and dispose of blasted material in a responsible fashion. Induced polarization (IP) was chosen as the primary geophysical method given its ability to detect low-grade disseminated mineralization. The survey was conducted in dipole-dipole mode using an exploration-style time domain IP system, dipoles 8 to 25 m in length, and six potential dipoles for each current dipole location (i.e. n = 1 - 6). Supplementary information was provided by resistivity and VLF-EM surveys sensitive to lateral changes in electrical conductivity, and by magnetic field surveying chosen for its sensitivity to the magnetic susceptibility of pyrrhotite. Geological and geochemical analyses of samples taken from several IP anomalies located along 4.3 line-km of proposed highway confirmed the effectiveness of the screening technique. IP pseudosections from a region of metamorphosed shales and volcaniclastic rocks identified discrete, well-defined mineralized zones. 
Stronger, overlapping, and more laterally extensive IP anomalies were observed over a section of graphitic and sulphide-bearing metasedimentary

  1. Informed Design of Mixed-Mode Surveys : Evaluating mode effects on measurement and selection error

    NARCIS (Netherlands)

    Klausch, Thomas

    2014-01-01

    “Mixed-mode designs” are innovative types of surveys which combine more than one mode of administration in the same project, such as surveys administered partly on the web (online), on paper, by telephone, or face-to-face. Mixed-mode designs have become increasingly popular in international survey

  2. Latent spatial models and sampling design for landscape genetics

    Science.gov (United States)

    Hanks, Ephraim M.; Hooten, Mevin B.; Knick, Steven T.; Oyler-McCance, Sara J.; Fike, Jennifer A.; Cross, Todd B.; Schwartz, Michael K.

    2016-01-01

    We propose a spatially-explicit approach for modeling genetic variation across space and illustrate how this approach can be used to optimize spatial prediction and sampling design for landscape genetic data. We propose a multinomial data model for categorical microsatellite allele data commonly used in landscape genetic studies, and introduce a latent spatial random effect to allow for spatial correlation between genetic observations. We illustrate how modern dimension reduction approaches to spatial statistics can allow for efficient computation in landscape genetic statistical models covering large spatial domains. We apply our approach to propose a retrospective spatial sampling design for greater sage-grouse (Centrocercus urophasianus) population genetics in the western United States.

  3. Can smartphones measure momentary quality of life and participation? A proof of concept using experience sampling surveys with university students.

    Science.gov (United States)

    Liddle, Jacki; Wishink, Anna; Springfield, Liz; Gustafsson, Louise; Ireland, David; Silburn, Peter

    2017-08-01

    Understanding quality of life and participation is a key aspect of occupational therapy research. The use of smartphones to deliver experience-sampling surveys may provide an accessible way to monitor these outcomes. This study used smartphone-based experience sampling methods (ESM) to investigate factors influencing momentary quality of life (mQOL) of university students. A convenience sample of students at an Australian university participated. Using a custom smartphone application, ESM surveys were sent six to eight times, every second day, over a week. Participants indicated their mQOL, occupational participation, occupational enjoyment, social context and location via surveys and provided demographic and health information in a single self-report questionnaire. The relationship between mQOL and variables was analysed at the survey level using logistic regression. Forty students completed 391 surveys. Higher mQOL was significantly related to participation in productive occupations (z = 3.48; P = 0.001), moderate (z = 4.00; P sample, analysing at the individual level, and using ESM in conjunction with other methodologies is recommended. © 2017 Occupational Therapy Australia.

  4. Prevalence of dementia-associated disability among Chinese older adults: results from a national sample survey.

    Science.gov (United States)

    Li, Ning; Zhang, Lei; Du, Wei; Pang, Lihua; Guo, Chao; Chen, Gong; Zheng, Xiaoying

    2015-03-01

    Due to rapid population aging, dementia has become an urgent public health issue in China. Few large-scale surveys on dementia have been conducted in China, and little is known about the magnitude of dysfunction and disability caused by dementia. In this study, using national sample survey data, we aimed to describe the prevalence rate of dementia-associated disability, its associated factors, and the daily activities and social functions of people with dementia-associated disability among Chinese older adults. We used the second China National Sample Survey on Disability, comprising 2,526,145 persons from 771,797 households. Identification of dementia was based on consensus manuals. Standard weighting procedures were used to construct sample weights reflecting the multistage stratified cluster sampling survey scheme. Population-weighted numbers, weighted prevalence, and odds ratios (ORs) were calculated. The prevalence rate of dementia-associated disability was 4.64% (95% CI: 4.26-5.01), accounting for 41.03% of mental disability among Chinese older adults. Urban residence (OR: 1.33 [1.12-1.57]), older age (80+ years) (OR: 4.12 [3.38-.03]), illiteracy (OR: 1.79 [1.27-2.53]), and currently not married (OR: 1.15 [1.00-1.32]) were associated with increased risk of dementia-associated disability. Compared with those with mental disability of other causes and those with other types of disabilities, older adults with dementia-associated disability were more likely to have severe or extreme difficulty in daily activities and social functions. Countermeasures are warranted to obtain a more precise overview of dementia in China, and strategies for enhancing early identification, treatment, and rehabilitation should be developed for people with dementia. Copyright © 2015 American Association for Geriatric Psychiatry. Published by Elsevier Inc. All rights reserved.

  5. THE HETDEX PILOT SURVEY. I. SURVEY DESIGN, PERFORMANCE, AND CATALOG OF EMISSION-LINE GALAXIES

    International Nuclear Information System (INIS)

    Adams, Joshua J.; Blanc, Guillermo A.; Gebhardt, Karl; Hao, Lei; Byun, Joyce; Fry, Alex; Jeong, Donghui; Komatsu, Eiichiro; Hill, Gary J.; Cornell, Mark E.; MacQueen, Phillip J.; Drory, Niv; Bender, Ralf; Hopp, Ulrich; Kelzenberg, Ralf; Ciardullo, Robin; Gronwall, Caryl; Finkelstein, Steven L.; Gawiser, Eric; Kelz, Andreas

    2011-01-01

    We present a catalog of emission-line galaxies selected solely by their emission-line fluxes using a wide-field integral field spectrograph. This work is partially motivated as a pilot survey for the upcoming Hobby-Eberly Telescope Dark Energy Experiment. We describe the observations, reductions, detections, redshift classifications, line fluxes, and counterpart information for 397 emission-line galaxies detected over 169 square arcminutes with a 3500-5800 Å bandpass under 5 Å full-width-half-maximum (FWHM) spectral resolution. The survey's best sensitivity for unresolved objects under photometric conditions is between 4 and 20 x 10^-17 erg s^-1 cm^-2 depending on the wavelength, and Lyα luminosities between 3 x 10^42 and 6 x 10^42 erg s^-1 are detectable. This survey method complements narrowband and color-selection techniques in the search for high-redshift galaxies with its different selection properties and large volume probed. The four survey fields within the COSMOS, GOODS-N, MUNICS, and XMM-LSS areas are rich with existing, complementary data. We find 105 galaxies via their high-redshift Lyα emission at 1.9 44 square arcminutes which appear to be extended Lyα nebulae. We also find three high-z objects with rest-frame Lyα equivalent widths above the level believed to be achievable with normal star formation, EW_0 > 240 Å. Future papers will investigate the physical properties of this sample.

  6. Global review of health care surveys using lot quality assurance sampling (LQAS), 1984-2004.

    Science.gov (United States)

    Robertson, Susan E; Valadez, Joseph J

    2006-09-01

    We conducted a global review on the use of lot quality assurance sampling (LQAS) to assess health care services, health behaviors, and disease burden. Publications and reports on LQAS surveys were sought from Medline and five other electronic databases; the World Health Organization; the World Bank; governments, nongovernmental organizations, and individual scientists. We identified a total of 805 LQAS surveys conducted by different management groups during January 1984 through December 2004. There was a striking increase in the annual number of LQAS surveys conducted in 2000-2004 (128/year) compared with 1984-1999 (10/year). Surveys were conducted in 55 countries, and in 12 of these countries there were 10 or more LQAS surveys. Geographically, 317 surveys (39.4%) were conducted in Africa, 197 (28.5%) in the Americas, 115 (14.3%) in the Eastern Mediterranean, 114 (14.2%) in South-East Asia, 48 (6.0%) in Europe, and 14 (1.8%) in the Western Pacific. Health care parameters varied, and some surveys assessed more than one parameter. There were 320 surveys about risk factors for HIV/AIDS/sexually transmitted infections; 266 surveys on immunization coverage, 240 surveys post-disasters, 224 surveys on women's health, 142 surveys on growth and nutrition, 136 surveys on diarrheal disease control, and 88 surveys on quality management. LQAS surveys to assess disease burden included 23 neonatal tetanus mortality surveys and 12 surveys on other diseases. LQAS is a practical field method which increasingly is being applied in assessment of preventive and curative health services, and may offer new research opportunities to social scientists. When LQAS data are collected recurrently at multiple time points, they can be used to measure the spatial variation in behavior change. Such data provide insight into understanding relationships between various investments in social, human, and physical capital, and into the effectiveness of different public health strategies in achieving
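LQAS decision rules of the kind these surveys rely on classify a "lot" (e.g. a health district) as acceptable when at least d of n sampled individuals are positive (e.g. vaccinated), with d chosen so that both misclassification risks stay below agreed limits. A sketch using the exact binomial distribution (the widely used n = 19 design with 80%/50% coverage thresholds yields the familiar decision rule d = 13):

```python
from math import comb

def binom_cdf(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

def lqas_decision_rule(n, p_high, p_low, alpha=0.10, beta=0.10):
    """Smallest d such that accepting a lot when >= d of n sampled
    individuals are positive keeps P(reject | coverage = p_high) <= alpha
    and P(accept | coverage = p_low) <= beta."""
    for d in range(n + 1):
        risk_reject_good = binom_cdf(d - 1, n, p_high)
        risk_accept_bad = 1 - binom_cdf(d - 1, n, p_low)
        if risk_reject_good <= alpha and risk_accept_bad <= beta:
            return d
    return None  # no rule meets both risk limits at this n

print(lqas_decision_rule(19, 0.80, 0.50))  # 13
```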

  7. Visual Sample Plan (VSP) Software: Designs and Data Analyses for Sampling Contaminated Buildings

    International Nuclear Information System (INIS)

    Pulsipher, Brent A.; Wilson, John E.; Gilbert, Richard O.; Nuffer, Lisa L.; Hassig, Nancy L.

    2005-01-01

    A new module of the Visual Sample Plan (VSP) software has been developed to provide sampling designs and data analyses for potentially contaminated buildings. An important application is assessing levels of contamination in buildings after a terrorist attack. This new module, funded by DHS through the Combating Terrorism Technology Support Office, Technical Support Working Group, was developed to provide a tailored, user-friendly and visually-orientated buildings module within the existing VSP software toolkit, the latest version of which can be downloaded from http://dqo.pnl.gov/vsp. In the case of, or when planning against, a chemical, biological, or radionuclide release within a building, the VSP module can be used to quickly and easily develop and visualize technically defensible sampling schemes for walls, floors, ceilings, and other surfaces to statistically determine whether contamination is present, its magnitude and extent throughout the building, and whether decontamination has been effective. This paper demonstrates the features of this new VSP buildings module, which include: the ability to import building floor plans or to easily draw, manipulate, and view rooms in several ways; the ability to insert doors, windows and annotations into a room; and 3-D graphic room views with surfaces labeled and floor plans that show building zones that have separate air handling units. The paper also discusses the statistical design and data analysis options available in the buildings module. Design objectives supported include comparing an average to a threshold when the data distribution is normal or unknown, and comparing measurements to a threshold to detect hotspots or to ensure most of the area is uncontaminated when the data distribution is normal or unknown.
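The last design objective, ensuring most of the area is uncontaminated when the data distribution is unknown, is commonly handled with a nonparametric tolerance-limit calculation: if n samples are taken and all fall below the action threshold, one can claim with confidence 1 - alpha that at least a fraction p of the surface is below it, which requires n >= ln(alpha)/ln(p). This is a simplified sketch of that class of calculation, not VSP's exact algorithm:

```python
from math import ceil, log

def compliance_sample_size(coverage, confidence):
    """Number of samples that must all fall below the action threshold to
    conclude, with the given confidence, that at least `coverage` of the
    surface is below it (nonparametric tolerance-limit rule)."""
    alpha = 1 - confidence
    return ceil(log(alpha) / log(coverage))

print(compliance_sample_size(0.95, 0.95))  # 59
print(compliance_sample_size(0.99, 0.95))  # 299
```

The steep growth in n as the required coverage rises is why hotspot-detection designs are usually planned separately from average-versus-threshold comparisons.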

  8. A Survey of Former Drafting & Engineering Design Technology Students. Summary Findings of Respondents District-Wide.

    Science.gov (United States)

    Glyer-Culver, Betty

    In fall 2001 staff of the Los Rios Community College District Office of Institutional Research collaborated with occupational deans, academic deans, and faculty to develop and administer a survey of former Drafting and Engineering Design Technology students. The survey was designed to determine how well courses had met the needs of former drafting…

  9. The challenge of comprehensively mapping children's health in a nation-wide health survey: Design of the German KiGGS-Study

    Directory of Open Access Journals (Sweden)

    Schlack Robert

    2008-06-01

    Full Text Available Abstract Background From May 2003 to May 2006, the Robert Koch Institute conducted the German Health Interview and Examination Survey for Children and Adolescents (KiGGS). The aim of this first nationwide interview and examination survey was to collect comprehensive data on the health status of children and adolescents aged 0 to 17 years. Methods/Design Participants were enrolled in two steps: first, 167 study locations (sample points) were chosen; second, subjects were randomly selected from the official registers of local residents. The survey involved questionnaires filled in by parents and parallel questionnaires for children aged 11 years and older, physical examinations and tests, and a computer-assisted personal interview performed by study physicians. A wide range of blood and urine testing was carried out at central laboratories. A total of 17 641 children and adolescents were surveyed – 8985 boys and 8656 girls. The proportion of sample-neutral drop-outs was 5.3%. The response rate was 66.6%. Discussion The response rate showed little variation between age groups and sexes, but marked variation between resident aliens and Germans, between inhabitants of cities with a population of 100 000 or more and sample points with fewer inhabitants, as well as between the old West German states and the former East German states. Analysis of the short non-responder questionnaires showed that the collected data give comprehensive and nationally representative evidence on the health status of children and adolescents aged 0 to 17 years.

  10. Rapid Sampling of Hydrogen Bond Networks for Computational Protein Design.

    Science.gov (United States)

    Maguire, Jack B; Boyken, Scott E; Baker, David; Kuhlman, Brian

    2018-05-08

    Hydrogen bond networks play a critical role in determining the stability and specificity of biomolecular complexes, and the ability to design such networks is important for engineering novel structures, interactions, and enzymes. One key feature of hydrogen bond networks that makes them difficult to rationally engineer is that they are highly cooperative and are not energetically favorable until the hydrogen bonding potential has been satisfied for all buried polar groups in the network. Existing computational methods for protein design are ill-equipped for creating these highly cooperative networks because they rely on energy functions and sampling strategies that are focused on pairwise interactions. To enable the design of complex hydrogen bond networks, we have developed a new sampling protocol in the molecular modeling program Rosetta that explicitly searches for sets of amino acid mutations that can form self-contained hydrogen bond networks. For a given set of designable residues, the protocol often identifies many alternative sets of mutations/networks, and we show that it can readily be applied to large sets of residues at protein-protein interfaces or in the interior of proteins. The protocol builds on a recently developed method in Rosetta for designing hydrogen bond networks that has been experimentally validated for small symmetric systems but was not extensible to many larger protein structures and complexes. The sampling protocol we describe here not only recapitulates previously validated designs with performance improvements but also yields viable hydrogen bond networks for cases where the previous method fails, such as the design of large, asymmetric interfaces relevant to engineering protein-based therapeutics.

  11. Practical guidelines for developing a smartphone-based survey instrument

    DEFF Research Database (Denmark)

    Ohme, Jakob; de Vreese, Claes Holger; Albæk, Erik

    The increasing relevance of mobile surveys makes it important to gather empirical evidence on designs of such surveys. This research note presents the results of a test study conducted to identify the best set-up for a smartphone-based survey. We base our analysis on a random sample of Danish...

  12. Random sampling or geostatistical modelling? Choosing between design-based and model-based sampling strategies for soil (with discussion)

    NARCIS (Netherlands)

    Brus, D.J.; Gruijter, de J.J.

    1997-01-01

    Classical sampling theory has been repeatedly identified with classical statistics which assumes that data are identically and independently distributed. This explains the switch of many soil scientists from design-based sampling strategies, based on classical sampling theory, to the model-based

  13. Internet, Phone, Mail, and Mixed-Mode Surveys: The Tailored Design Method, 4th Edition

    Science.gov (United States)

    Dillman, Don A.; Smyth, Jolene D.; Christian, Lean Melani

    2014-01-01

    For over two decades, Dillman's classic text on survey design has aided both students and professionals in effectively planning and conducting mail, telephone, and, more recently, Internet surveys. The new edition is thoroughly updated and revised, and covers all aspects of survey research. It features expanded coverage of mobile phones, tablets,…

  14. Effect of survey design and catch rate estimation on total catch estimates in Chinook salmon fisheries

    Science.gov (United States)

    McCormick, Joshua L.; Quist, Michael C.; Schill, Daniel J.

    2012-01-01

    Roving–roving and roving–access creel surveys are the primary techniques used to obtain information on harvest of Chinook salmon Oncorhynchus tshawytscha in Idaho sport fisheries. Once interviews are conducted using roving–roving or roving–access survey designs, mean catch rate can be estimated with the ratio-of-means (ROM) estimator, the mean-of-ratios (MOR) estimator, or the MOR estimator with exclusion of short-duration (≤0.5 h) trips. Our objective was to examine the relative bias and precision of total catch estimates obtained from use of the two survey designs and three catch rate estimators for Idaho Chinook salmon fisheries. Information on angling populations was obtained by direct visual observation of portions of Chinook salmon fisheries in three Idaho river systems over an 18-d period. Based on data from the angling populations, Monte Carlo simulations were performed to evaluate the properties of the catch rate estimators and survey designs. Among the three estimators, the ROM estimator provided the most accurate and precise estimates of mean catch rate and total catch for both roving–roving and roving–access surveys. On average, the root mean square error of simulated total catch estimates was 1.42 times greater and relative bias was 160.13 times greater for roving–roving surveys than for roving–access surveys. Length-of-stay bias and nonstationary catch rates in roving–roving surveys both appeared to affect catch rate and total catch estimates. Our results suggest that use of the ROM estimator in combination with an estimate of angler effort provided the least biased and most precise estimates of total catch for both survey designs. However, roving–access surveys were more accurate than roving–roving surveys for Chinook salmon fisheries in Idaho.
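The two catch rate estimators compared above differ in how trips are weighted: the ratio-of-means (ROM) estimator divides total catch by total effort, while the mean-of-ratios (MOR) estimator averages per-trip rates, optionally excluding short trips. A sketch with hypothetical interview data (not the Idaho study's data):

```python
def ratio_of_means(catches, hours):
    """ROM: total catch / total effort (implicitly weights trips by duration)."""
    return sum(catches) / sum(hours)

def mean_of_ratios(catches, hours, min_hours=0.0):
    """MOR: average of per-trip catch rates, optionally excluding
    short-duration trips (e.g. min_hours=0.5 drops trips of <= 0.5 h)."""
    rates = [c / h for c, h in zip(catches, hours) if h > min_hours]
    return sum(rates) / len(rates)

# hypothetical completed-trip interviews: (fish caught, hours fished)
catches = [0, 1, 2, 0, 3]
hours = [0.5, 2.0, 4.0, 1.0, 6.0]
print(ratio_of_means(catches, hours))       # ~0.444 fish/h
print(mean_of_ratios(catches, hours))       # 0.3 fish/h
print(mean_of_ratios(catches, hours, 0.5))  # 0.375 fish/h
```

Multiplying the chosen rate estimate by an independent estimate of total angler effort then yields total catch, which is why the bias properties of the rate estimator carry through to the total catch estimate.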

  15. Estimation of sample size and testing power (Part 4).

    Science.gov (United States)

    Hu, Liang-ping; Bao, Xiao-lei; Guan, Xue; Zhou, Shi-guo

    2012-01-01

Sample size estimation is necessary for any experimental or survey research, and an appropriate estimation based on known information and statistical knowledge is of great significance. This article introduces methods of sample size estimation for difference tests under a design of one factor with two levels, including the estimation formulas and their realization, based on the formulas and on the POWER procedure of SAS software, for both quantitative and qualitative data. In addition, the article presents worked examples, which will help researchers implement the repetition principle during the research design phase.
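For the quantitative-data case, the familiar normal-approximation formula for a two-sided comparison of two means under a one-factor, two-level design can be sketched as follows. This is a minimal illustration, not the article's SAS code, and it assumes equal group sizes and a common standard deviation.

```python
import math
from statistics import NormalDist

def n_per_group(delta, sigma, alpha=0.05, power=0.80):
    """Per-group sample size for a two-sided two-sample comparison of
    means, normal approximation with equal n and common SD sigma:
    n = 2 * ((z_{1-alpha/2} + z_{1-beta}) * sigma / delta)^2."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)
    z_b = NormalDist().inv_cdf(power)
    return math.ceil(2 * ((z_a + z_b) * sigma / delta) ** 2)

# Detect a mean difference of 5 units when the common SD is 10
n = n_per_group(delta=5, sigma=10)   # 63 per group
```

Exact t-based calculations, such as those in the SAS POWER procedure, typically return a slightly larger n than this normal approximation.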

  16. A proposal of optimal sampling design using a modularity strategy

    Science.gov (United States)

    Simone, A.; Giustolisi, O.; Laucelli, D. B.

    2016-08-01

Real water distribution networks (WDNs) contain thousands of nodes, and the optimal placement of pressure and flow observations is a relevant issue for different management tasks. Planning the number and spatial distribution of pressure observations is named sampling design, and it has traditionally been addressed with model calibration in mind. Nowadays, the design of system monitoring is a relevant issue for water utilities, e.g., in order to manage background leakages, to detect anomalies and bursts, to guarantee service quality, etc. In recent years, the optimal location of flow observations, related to the design of optimal district metering areas (DMAs) and to leakage management purposes, has been addressed considering optimal network segmentation and the modularity index using a multiobjective strategy. Optimal network segmentation is the basis for identifying network modules by means of optimal conceptual cuts, which are the candidate locations of the closed gates or flow meters that create the DMAs. Starting from the WDN-oriented modularity index as a metric for WDN segmentation, this paper proposes a new way to perform the sampling design, i.e., the optimal location of pressure meters, using a newly developed sampling-oriented modularity index. The strategy optimizes the pressure monitoring system mainly based on network topology and on weights assigned to pipes according to the specific technical tasks. A multiobjective optimization minimizes the cost of pressure meters while maximizing the sampling-oriented modularity index. The methodology is presented and discussed using the Apulian and Exnet networks.

  17. Using mark-recapture distance sampling methods on line transect surveys

    Science.gov (United States)

    Burt, Louise M.; Borchers, David L.; Jenkins, Kurt J.; Marques, Tigao A

    2014-01-01

Mark–recapture distance sampling (MRDS) methods are widely used for density and abundance estimation when the conventional DS assumption of certain detection at distance zero fails, as they allow detection at distance zero to be estimated and incorporated into the overall probability of detection to better estimate density and abundance. However, incorporating MR data in DS models raises survey and analysis issues not present in conventional DS. Conversely, incorporating DS assumptions in MR models raises issues not present in conventional MR. As a result, familiarity with either conventional DS methods or conventional MR methods does not on its own put practitioners in a good position to apply MRDS methods appropriately. This study explains the sometimes subtly different varieties of MRDS survey methods and the associated concepts underlying MRDS models. This is done as far as possible without giving mathematical details, in the hope that this will make the key concepts underlying the methods accessible to a wider audience than if we were to present the concepts via equations.

  18. Survalytics: An Open-Source Cloud-Integrated Experience Sampling, Survey, and Analytics and Metadata Collection Module for Android Operating System Apps.

    Science.gov (United States)

    O'Reilly-Shah, Vikas; Mackey, Sean

    2016-06-03

    We describe here Survalytics, a software module designed to address two broad areas of need. The first area is in the domain of surveys and app analytics: developers of mobile apps in both academic and commercial environments require information about their users, as well as how the apps are being used, to understand who their users are and how to optimally approach app development. The second area of need is in the field of ecological momentary assessment, also referred to as experience sampling: researchers in a wide variety of fields, spanning from the social sciences to psychology to clinical medicine, would like to be able to capture daily or even more frequent data from research subjects while in their natural environment. Survalytics is an open-source solution for the collection of survey responses as well as arbitrary analytic metadata from users of Android operating system apps. Surveys may be administered in any combination of one-time questions and ongoing questions. The module may be deployed as a stand-alone app for experience sampling purposes or as an add-on to existing apps. The module takes advantage of free-tier NoSQL cloud database management offered by the Amazon Web Services DynamoDB platform to package a secure, flexible, extensible data collection module. DynamoDB is capable of Health Insurance Portability and Accountability Act compliant storage of personal health information. The provided example app may be used without modification for a basic experience sampling project, and we provide example questions for daily collection of blood glucose data from study subjects. The module will help researchers in a wide variety of fields rapidly develop tailor-made Android apps for a variety of data collection purposes.

  19. Survalytics: An Open-Source Cloud-Integrated Experience Sampling, Survey, and Analytics and Metadata Collection Module for Android Operating System Apps

    Science.gov (United States)

    Mackey, Sean

    2016-01-01

    Background We describe here Survalytics, a software module designed to address two broad areas of need. The first area is in the domain of surveys and app analytics: developers of mobile apps in both academic and commercial environments require information about their users, as well as how the apps are being used, to understand who their users are and how to optimally approach app development. The second area of need is in the field of ecological momentary assessment, also referred to as experience sampling: researchers in a wide variety of fields, spanning from the social sciences to psychology to clinical medicine, would like to be able to capture daily or even more frequent data from research subjects while in their natural environment. Objective Survalytics is an open-source solution for the collection of survey responses as well as arbitrary analytic metadata from users of Android operating system apps. Methods Surveys may be administered in any combination of one-time questions and ongoing questions. The module may be deployed as a stand-alone app for experience sampling purposes or as an add-on to existing apps. The module takes advantage of free-tier NoSQL cloud database management offered by the Amazon Web Services DynamoDB platform to package a secure, flexible, extensible data collection module. DynamoDB is capable of Health Insurance Portability and Accountability Act compliant storage of personal health information. Results The provided example app may be used without modification for a basic experience sampling project, and we provide example questions for daily collection of blood glucose data from study subjects. Conclusions The module will help researchers in a wide variety of fields rapidly develop tailor-made Android apps for a variety of data collection purposes. PMID:27261155

  20. Farm survey design in the Sahel : Experiences from Burkina Faso

    NARCIS (Netherlands)

    Graaff, de J.; Mijl, van der J.P.; Nibbering, J.W.

    1999-01-01

    In designing farm household surveys in the Sahel in West Africa much attention should be paid to various specific features of the farming systems. These features relate in particular to the social organisation of the farming communities. Within households, kinship relations have a strong bearing on

  1. Nationwide survey of policies and practices related to capillary blood sampling in medical laboratories in Croatia.

    Science.gov (United States)

    Krleza, Jasna Lenicek

    2014-01-01

    Capillary sampling is increasingly used to obtain blood for laboratory tests in volumes as small as necessary and as non-invasively as possible. Whether capillary blood sampling is also frequent in Croatia, and whether it is performed according to international laboratory standards is unclear. All medical laboratories that participate in the Croatian National External Quality Assessment Program (N = 204) were surveyed on-line to collect information about the laboratory's parent institution, patient population, types and frequencies of laboratory tests based on capillary blood samples, choice of reference intervals, and policies and procedures specifically related to capillary sampling. Sampling practices were compared with guidelines from the Clinical and Laboratory Standards Institute (CLSI) and the World Health Organization (WHO). Of the 204 laboratories surveyed, 174 (85%) responded with complete questionnaires. Among the 174 respondents, 155 (89%) reported that they routinely perform capillary sampling, which is carried out by laboratory staff in 118 laboratories (76%). Nearly half of respondent laboratories (48%) do not have a written protocol including order of draw for multiple sampling. A single puncture site is used to provide capillary blood for up to two samples at 43% of laboratories that occasionally or regularly perform such sampling. Most respondents (88%) never perform arterialisation prior to capillary blood sampling. Capillary blood sampling is highly prevalent in Croatia across different types of clinical facilities and patient populations. Capillary sampling procedures are not standardised in the country, and the rate of laboratory compliance with CLSI and WHO guidelines is low.

  2. Secondary Analysis under Cohort Sampling Designs Using Conditional Likelihood

    Directory of Open Access Journals (Sweden)

    Olli Saarela

    2012-01-01

Under cohort sampling designs, additional covariate data are collected on cases of a specific type and a randomly selected subset of noncases, primarily for the purpose of studying associations with a time-to-event response of interest. With such data available, an interest may arise to reuse them for studying associations between the additional covariate data and a secondary non-time-to-event response variable, usually collected for the whole study cohort at the outset of the study. Following earlier literature, we refer to such a situation as secondary analysis. We outline a general conditional likelihood approach for secondary analysis under cohort sampling designs and discuss the specific situations of case-cohort and nested case-control designs. We also review alternative methods based on full likelihood and inverse probability weighting. We compare the alternative methods for secondary analysis in two simulated settings and apply them in a real-data example.

  3. Detecting spatial structures in throughfall data: The effect of extent, sample size, sampling design, and variogram estimation method

    Science.gov (United States)

    Voss, Sebastian; Zimmermann, Beate; Zimmermann, Alexander

    2016-09-01

    In the last decades, an increasing number of studies analyzed spatial patterns in throughfall by means of variograms. The estimation of the variogram from sample data requires an appropriate sampling scheme: most importantly, a large sample and a layout of sampling locations that often has to serve both variogram estimation and geostatistical prediction. While some recommendations on these aspects exist, they focus on Gaussian data and high ratios of the variogram range to the extent of the study area. However, many hydrological data, and throughfall data in particular, do not follow a Gaussian distribution. In this study, we examined the effect of extent, sample size, sampling design, and calculation method on variogram estimation of throughfall data. For our investigation, we first generated non-Gaussian random fields based on throughfall data with large outliers. Subsequently, we sampled the fields with three extents (plots with edge lengths of 25 m, 50 m, and 100 m), four common sampling designs (two grid-based layouts, transect and random sampling) and five sample sizes (50, 100, 150, 200, 400). We then estimated the variogram parameters by method-of-moments (non-robust and robust estimators) and residual maximum likelihood. Our key findings are threefold. First, the choice of the extent has a substantial influence on the estimation of the variogram. A comparatively small ratio of the extent to the correlation length is beneficial for variogram estimation. Second, a combination of a minimum sample size of 150, a design that ensures the sampling of small distances and variogram estimation by residual maximum likelihood offers a good compromise between accuracy and efficiency. Third, studies relying on method-of-moments based variogram estimation may have to employ at least 200 sampling points for reliable variogram estimates. These suggested sample sizes exceed the number recommended by studies dealing with Gaussian data by up to 100 %. 
Given that most previous
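The method-of-moments (Matheron) variogram estimator examined in this study can be sketched as follows. This is illustrative code, not the study's implementation, and it uses an arbitrary toy configuration of points.

```python
import numpy as np

def empirical_variogram(coords, values, bin_edges):
    """Matheron's method-of-moments estimator:
    gamma(h) = (1 / (2 N(h))) * sum of (z_i - z_j)^2 over the N(h) point
    pairs whose separation falls in distance bin h. Robust variants
    down-weight the large squared differences caused by outliers, which
    are common in throughfall data."""
    coords = np.asarray(coords, float)
    values = np.asarray(values, float)
    i, j = np.triu_indices(len(values), k=1)           # all point pairs
    d = np.linalg.norm(coords[i] - coords[j], axis=1)  # pair separations
    sq = (values[i] - values[j]) ** 2
    gamma, n_pairs = [], []
    for lo, hi in zip(bin_edges[:-1], bin_edges[1:]):
        m = (d >= lo) & (d < hi)
        n_pairs.append(int(m.sum()))
        gamma.append(0.5 * sq[m].mean() if m.any() else np.nan)
    return np.array(gamma), np.array(n_pairs)

# Toy example: three points on a line with a linear trend
g, c = empirical_variogram([[0], [1], [2]], [0.0, 1.0, 2.0], [0.5, 1.5, 2.5])
```

The per-bin pair counts returned here are what the sample-size recommendations above are ultimately about: sparse bins, especially at short distances, make the fitted variogram parameters unreliable.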

  4. Analysis and radiological assessment of survey results and samples from the beaches around Sellafield

    International Nuclear Information System (INIS)

    Webb, G.A.M.; Fry, F.A.

    1983-12-01

    After radioactive sea debris had been found on beaches near the BNFL, Sellafield, plant, NRPB was asked by the Department of the Environment to analyse some of the samples collected and to assess the radiological hazard to members of the public. A report is presented containing an analysis of survey reports for the period 19 November - 4 December 1983 and preliminary results of the analysis of all samples received, together with the Board's recommendations. (author)

  5. Hit by a Perfect Storm? Art & Design in the National Student Survey

    Science.gov (United States)

    Yorke, Mantz; Orr, Susan; Blair, Bernadette

    2014-01-01

    There has long been the suspicion amongst staff in Art & Design that the ratings given to their subject disciplines in the UK's National Student Survey are adversely affected by a combination of circumstances--a "perfect storm". The "perfect storm" proposition is tested by comparing ratings for Art & Design with those…

  6. Trypanosoma brucei gambiense trypanosomiasis in Terego county, northern Uganda, 1996: a lot quality assurance sampling survey.

    Science.gov (United States)

    Hutin, Yvan J F; Legros, Dominique; Owini, Vincent; Brown, Vincent; Lee, Evan; Mbulamberi, Dawson; Paquet, Christophe

    2004-04-01

    We estimated the pre-intervention prevalence of Trypanosoma brucei gambiense (Tbg) trypanosomiasis using the lot quality assurance sampling (LQAS) methods in 14 parishes of Terego County in northern Uganda. A total of 826 participants were included in the survey sample in 1996. The prevalence of laboratory confirmed Tbg trypanosomiasis adjusted for parish population sizes was 2.2% (95% confidence interval =1.1-3.2). This estimate was consistent with the 1.1% period prevalence calculated on the basis of cases identified through passive and active screening in 1996-1999. Ranking of parishes in four categories according to LQAS analysis of the 1996 survey predicted the prevalences observed during the first round of active screening in the population in 1997-1998 (P LQAS were validated by the results of the population screening, suggesting that these survey methods may be useful in the pre-intervention phase of sleeping sickness control programs.

  7. Optimal surveillance strategy for invasive species management when surveys stop after detection.

    Science.gov (United States)

    Guillera-Arroita, Gurutzeta; Hauser, Cindy E; McCarthy, Michael A

    2014-05-01

    Invasive species are a cause for concern in natural and economic systems and require both monitoring and management. There is a trade-off between the amount of resources spent on surveying for the species and conducting early management of occupied sites, and the resources that are ultimately spent in delayed management at sites where the species was present but undetected. Previous work addressed this optimal resource allocation problem assuming that surveys continue despite detection until the initially planned survey effort is consumed. However, a more realistic scenario is often that surveys stop after detection (i.e., follow a "removal" sampling design) and then management begins. Such an approach will indicate a different optimal survey design and can be expected to be more efficient. We analyze this case and compare the expected efficiency of invasive species management programs under both survey methods. We also evaluate the impact of mis-specifying the type of sampling approach during the program design phase. We derive analytical expressions that optimize resource allocation between monitoring and management in surveillance programs when surveys stop after detection. We do this under a scenario of unconstrained resources and scenarios where survey budget is constrained. The efficiency of surveillance programs is greater if a "removal survey" design is used, with larger gains obtained when savings from early detection are high, occupancy is high, and survey costs are not much lower than early management costs at a site. Designing a surveillance program disregarding that surveys stop after detection can result in an efficiency loss. Our results help guide the design of future surveillance programs for invasive species. Addressing program design within a decision-theoretic framework can lead to a better use of available resources. We show how species prevalence, its detectability, and the benefits derived from early detection can be considered.
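The trade-off analyzed in this paper can be caricatured with a simple expected-cost calculation for the "removal" design, in which surveying at a site stops at first detection. All symbols and cost values below are illustrative assumptions, not the authors' analytical expressions.

```python
def expected_cost_removal(psi, p, n, c_survey, c_early, c_late):
    """Expected per-site cost under a removal design: up to n visits,
    stopping at first detection. psi = occupancy probability, p =
    per-visit detection probability, c_survey = cost per visit,
    c_early/c_late = early vs delayed management costs (c_late > c_early)."""
    cost = (1 - psi) * n * c_survey              # empty site: all n visits
    for k in range(1, n + 1):                    # detected on visit k
        pk = psi * (1 - p) ** (k - 1) * p
        cost += pk * (k * c_survey + c_early)
    miss = psi * (1 - p) ** n                    # occupied, never detected
    cost += miss * (n * c_survey + c_late)
    return cost

# Hypothetical numbers: cheap surveys, expensive delayed management
cost = expected_cost_removal(psi=0.3, p=0.5, n=4,
                             c_survey=1.0, c_early=10.0, c_late=100.0)
```

Under a fixed-effort design, by contrast, all n visits are paid for even when detection occurs early; comparing the two expected costs across candidate n is the kind of optimization the paper formalizes.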

  8. Modeling the Sensitivity of Field Surveys for Detection of Environmental DNA (eDNA).

    Directory of Open Access Journals (Sweden)

    Martin T Schultz

The environmental DNA (eDNA) method is the practice of collecting environmental samples and analyzing them for the presence of a genetic marker specific to a target species. Little is known about the sensitivity of the eDNA method. Sensitivity is the probability that the target marker will be detected if it is present in the water body. Methods and tools are needed to assess the sensitivity of sampling protocols, design eDNA surveys, and interpret survey results. In this study, the sensitivity of the eDNA method is modeled as a function of ambient target marker concentration. The model accounts for five steps of sample collection and analysis: (1) collection of a filtered water sample from the source; (2) extraction of DNA from the filter and isolation in a purified elution; (3) removal of aliquots from the elution for use in the polymerase chain reaction (PCR) assay; (4) PCR; and (5) genetic sequencing. The model is applicable to any target species. For demonstration purposes, the model is parameterized for bighead carp (Hypophthalmichthys nobilis) and silver carp (H. molitrix), assuming sampling protocols used in the Chicago Area Waterway System (CAWS). Simulation results show that eDNA surveys have a high false negative rate at low concentrations of the genetic marker. This is attributed to processing of water samples and division of the extraction elution in preparation for the PCR assay. Increases in field survey sensitivity can be achieved by increasing sample volume, sample number, and PCR replicates. Increasing sample volume yields the greatest increase in sensitivity. It is recommended that investigators estimate and communicate the sensitivity of eDNA surveys to help facilitate interpretation of eDNA survey results. In the absence of such information, it is difficult to evaluate the results of surveys in which no water samples test positive for the target marker.
It is also recommended that invasive species managers articulate concentration
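The multi-step detection model described above can be caricatured with a Poisson subsampling sketch. Every parameter name and value here is an illustrative assumption, not the paper's calibrated model; amplification and sequencing are treated as perfect to keep the sketch short.

```python
from math import exp

def survey_sensitivity(conc_per_L, vol_L, extract_frac, aliquot_frac,
                       pcr_reps, n_samples):
    """Sketch: marker copies reaching one PCR aliquot are Poisson with
    mean lam; a PCR replicate is positive if it receives >= 1 copy; the
    survey detects the marker if any replicate of any sample is positive."""
    lam = conc_per_L * vol_L * extract_frac * aliquot_frac
    p_rep = 1 - exp(-lam)                    # one PCR replicate positive
    p_sample = 1 - (1 - p_rep) ** pcr_reps   # >= 1 replicate positive
    return 1 - (1 - p_sample) ** n_samples   # >= 1 sample positive

# At low concentration, sensitivity grows fastest with sample volume,
# since volume scales the Poisson mean directly.
s = survey_sensitivity(conc_per_L=1.0, vol_L=2.0, extract_frac=0.5,
                       aliquot_frac=0.05, pcr_reps=8, n_samples=10)
```

Even this toy version reproduces the abstract's qualitative point: most of the marker is lost when the water sample is processed and the elution is divided, so false negatives dominate at low ambient concentrations.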

  9. A field test of three LQAS designs to assess the prevalence of acute malnutrition.

    Science.gov (United States)

    Deitchler, Megan; Valadez, Joseph J; Egge, Kari; Fernandez, Soledad; Hennigan, Mary

    2007-08-01

    The conventional method for assessing the prevalence of Global Acute Malnutrition (GAM) in emergency settings is the 30 x 30 cluster-survey. This study describes alternative approaches: three Lot Quality Assurance Sampling (LQAS) designs to assess GAM. The LQAS designs were field-tested and their results compared with those from a 30 x 30 cluster-survey. Computer simulations confirmed that small clusters instead of a simple random sample could be used for LQAS assessments of GAM. Three LQAS designs were developed (33 x 6, 67 x 3, Sequential design) to assess GAM thresholds of 10, 15 and 20%. The designs were field-tested simultaneously with a 30 x 30 cluster-survey in Siraro, Ethiopia during June 2003. Using a nested study design, anthropometric, morbidity and vaccination data were collected on all children 6-59 months in sampled households. Hypothesis tests about GAM thresholds were conducted for each LQAS design. Point estimates were obtained for the 30 x 30 cluster-survey and the 33 x 6 and 67 x 3 LQAS designs. Hypothesis tests showed GAM as or =10% for the 67 x 3 and Sequential designs. Point estimates for the 33 x 6 and 67 x 3 designs were similar to those of the 30 x 30 cluster-survey for GAM (6.7%, CI = 3.2-10.2%; 8.2%, CI = 4.3-12.1%, 7.4%, CI = 4.8-9.9%) and all other indicators. The CIs for the LQAS designs were only slightly wider than the CIs for the 30 x 30 cluster-survey; yet the LQAS designs required substantially less time to administer. The LQAS designs provide statistically appropriate alternatives to the more time-consuming 30 x 30 cluster-survey. However, additional field-testing is needed using independent samples rather than a nested study design.
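The decision rules behind an LQAS design such as 33 x 6 (n = 198) can be examined with plain binomial calculations. This sketch ignores the clustering that the authors' simulations account for, and the threshold and decision values below are illustrative, not the field test's actual rules.

```python
from math import comb

def binom_cdf(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p ** i * (1 - p) ** (n - i) for i in range(k + 1))

def lqas_errors(n, d, p_high, p_low):
    """Classification errors for the rule 'flag GAM as >= p_high when
    more than d of the n sampled children are acutely malnourished':
    alpha = P(fail to flag | true prevalence p_high),
    beta  = P(flag anyway  | true prevalence p_low)."""
    alpha = binom_cdf(d, n, p_high)
    beta = 1 - binom_cdf(d, n, p_low)
    return alpha, beta

# Example: n = 198, decision value 25, testing a 15% threshold against 10%
alpha, beta = lqas_errors(n=198, d=25, p_high=0.15, p_low=0.10)
```

Scanning d for a fixed n (or n for target error rates) is how LQAS designs trade sample size against the two misclassification risks.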

  10. The Swift Gamma-Ray Burst Host Galaxy Legacy Survey. I. Sample Selection and Redshift Distribution

    Science.gov (United States)

Perley, D. A.; Kruhler, T.; Schulze, S.; Postigo, A. De Ugarte; Hjorth, J.; Berger, E.; Cenko, S. B.; Chary, R.; Cucchiara, A.; Ellis, R.; et al.

    2016-01-01

    We introduce the Swift Gamma-Ray Burst Host Galaxy Legacy Survey (SHOALS), a multi-observatory high redshift galaxy survey targeting the largest unbiased sample of long-duration gamma-ray burst (GRB) hosts yet assembled (119 in total). We describe the motivations of the survey and the development of our selection criteria, including an assessment of the impact of various observability metrics on the success rate of afterglow-based redshift measurement. We briefly outline our host galaxy observational program, consisting of deep Spitzer/IRAC imaging of every field supplemented by similarly deep, multicolor optical/near-IR photometry, plus spectroscopy of events without preexisting redshifts. Our optimized selection cuts combined with host galaxy follow-up have so far enabled redshift measurements for 110 targets (92%) and placed upper limits on all but one of the remainder. About 20% of GRBs in the sample are heavily dust obscured, and at most 2% originate from z > 5.5. Using this sample, we estimate the redshift-dependent GRB rate density, showing it to peak at z approx. 2.5 and fall by at least an order of magnitude toward low (z = 0) redshift, while declining more gradually toward high (z approx. 7) redshift. This behavior is consistent with a progenitor whose formation efficiency varies modestly over cosmic history. Our survey will permit the most detailed examination to date of the connection between the GRB host population and general star-forming galaxies, directly measure evolution in the host population over cosmic time and discern its causes, and provide new constraints on the fraction of cosmic star formation occurring in undetectable galaxies at all redshifts.

  11. The Design and Validation of the Colorado Learning Attitudes about Science Survey

    Science.gov (United States)

    Adams, W. K.; Perkins, K. K.; Dubson, M.; Finkelstein, N. D.; Wieman, C. E.

    2005-09-01

    The Colorado Learning Attitudes about Science Survey (CLASS) is a new instrument designed to measure various facets of student attitudes and beliefs about learning physics. This instrument extends previous work by probing additional facets of student attitudes and beliefs. It has been written to be suitably worded for students in a variety of different courses. This paper introduces the CLASS and its design and validation studies, which include analyzing results from over 2400 students, interviews and factor analyses. Methodology used to determine categories and how to analyze the robustness of categories for probing various facets of student learning are also described. This paper serves as the foundation for the results and conclusions from the analysis of our survey data.

  12. Estimating the encounter rate variance in distance sampling

    Science.gov (United States)

    Fewster, R.M.; Buckland, S.T.; Burnham, K.P.; Borchers, D.L.; Jupp, P.E.; Laake, J.L.; Thomas, L.

    2009-01-01

The dominant source of variance in line transect sampling is usually the encounter rate variance. Systematic survey designs are often used to reduce the true variability among different realizations of the design, but estimating the variance is difficult and estimators typically approximate the variance by treating the design as a simple random sample of lines. We explore the properties of different encounter rate variance estimators under random and systematic designs. We show that a design-based variance estimator improves upon the model-based estimator of Buckland et al. (2001, Introduction to Distance Sampling. Oxford: Oxford University Press, p. 79) when transects are positioned at random. However, if populations exhibit strong spatial trends, both estimators can have substantial positive bias under systematic designs. We show that poststratification is effective in reducing this bias. © 2008, The International Biometric Society.
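One common form of the baseline estimator discussed here, which treats the k transect lines as a simple random sample, can be sketched as follows (assumed notation: counts n_i on lines of length l_i; an illustrative sketch, not the authors' code):

```python
import numpy as np

def encounter_rate_var(n_i, l_i):
    """Variance of the encounter rate n/L when the k lines are treated
    as a simple random sample:
    var(n/L) = k / (L^2 (k-1)) * sum_i l_i^2 (n_i/l_i - n/L)^2.
    Under a systematic design with strong spatial trends this can be
    substantially biased, which motivates poststratification."""
    n_i = np.asarray(n_i, float)
    l_i = np.asarray(l_i, float)
    k, L, n = len(n_i), l_i.sum(), n_i.sum()
    dev = (n_i / l_i - n / L) ** 2
    return k / (L ** 2 * (k - 1)) * np.sum(l_i ** 2 * dev)
```

The estimator is driven entirely by between-line variation in the per-line encounter rates n_i/l_i: identical rates on every line give a variance of zero.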

  13. Comparing two survey methods of measuring health-related indicators: Lot Quality Assurance Sampling and Demographic Health Surveys.

    Science.gov (United States)

    Anoke, Sarah C; Mwai, Paul; Jeffery, Caroline; Valadez, Joseph J; Pagano, Marcello

    2015-12-01

    Two common methods used to measure indicators for health programme monitoring and evaluation are the demographic and health surveys (DHS) and lot quality assurance sampling (LQAS); each one has different strengths. We report on both methods when utilised in comparable situations. We compared 24 indicators in south-west Uganda, where data for prevalence estimations were collected independently for the two methods in 2011 (LQAS: n = 8876; DHS: n = 1200). Data were stratified (e.g. gender and age) resulting in 37 comparisons. We used a two-sample two-sided Z-test of proportions to compare both methods. The average difference between LQAS and DHS for 37 estimates was 0.062 (SD = 0.093; median = 0.039). The average difference among the 21 failures to reject equality of proportions was 0.010 (SD = 0.041; median = 0.009); among the 16 rejections, it was 0.130 (SD = 0.010, median = 0.118). Seven of the 16 rejections exhibited absolute differences of 0.10 and 0.20 (mean = 0.261, SD = 0.083). There is 75.7% agreement across the two surveys. Both methods yield regional results, but only LQAS provides information at less granular levels (e.g. the district level) where managerial action is taken. The cost advantage and localisation make LQAS feasible to conduct more frequently, and provides the possibility for real-time health outcomes monitoring. © 2015 The Authors. Tropical Medicine & International Health Published by John Wiley & Sons Ltd.
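The pooled two-sample two-sided Z-test of proportions used for these comparisons can be sketched as follows. The sample sizes match those reported above, but the prevalence values are placeholders, not the study's indicator estimates.

```python
from math import sqrt
from statistics import NormalDist

def two_sample_z(p1, n1, p2, n2):
    """Two-sided two-sample Z-test for equality of proportions,
    using the pooled proportion estimate under H0: p1 == p2."""
    x1, x2 = p1 * n1, p2 * n2
    p_pool = (x1 + x2) / (n1 + n2)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# Hypothetical indicator prevalences with the two surveys' sample sizes
z, p = two_sample_z(0.45, 8876, 0.48, 1200)
```

With n in the thousands, even small absolute differences in prevalence can be statistically significant, which is why the authors report the magnitude of the differences alongside the test results.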

  14. Sampling designs and methods for estimating fish-impingement losses at cooling-water intakes

    International Nuclear Information System (INIS)

    Murarka, I.P.; Bodeau, D.J.

    1977-01-01

    Several systems for estimating fish impingement at power plant cooling-water intakes are compared to determine the most statistically efficient sampling designs and methods. Compared to a simple random sampling scheme the stratified systematic random sampling scheme, the systematic random sampling scheme, and the stratified random sampling scheme yield higher efficiencies and better estimators for the parameters in two models of fish impingement as a time-series process. Mathematical results and illustrative examples of the applications of the sampling schemes to simulated and real data are given. Some sampling designs applicable to fish-impingement studies are presented in appendixes
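The efficiency gain from stratification can be illustrated with a toy impingement simulation. The diel pattern, rates, and sample sizes below are entirely hypothetical and are not the models or data of the study; they simply show why stratified schemes yield better estimators when impingement varies systematically over time.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical diel pattern: impingement is heavier at night (hours 0-7)
HOURLY_MEAN = np.where(np.arange(24) < 8, 20.0, 5.0)

def daily_counts():
    """One day of hourly impingement counts (Poisson noise)."""
    return rng.poisson(HOURLY_MEAN)

def srs_total(counts, n=6):
    """Estimate the daily total from a simple random sample of n hours."""
    hours = rng.choice(24, size=n, replace=False)
    return 24 * counts[hours].mean()

def stratified_total(counts, per_stratum=3):
    """Stratified random sample: hours drawn separately from the night
    and day strata, with stratum means weighted by stratum size."""
    night = rng.choice(8, size=per_stratum, replace=False)
    day = 8 + rng.choice(16, size=per_stratum, replace=False)
    return 8 * counts[night].mean() + 16 * counts[day].mean()

srs = np.array([srs_total(daily_counts()) for _ in range(2000)])
strat = np.array([stratified_total(daily_counts()) for _ in range(2000)])
# Stratification removes the between-stratum (day/night) component of the
# sampling variance, so strat spreads much less around the true total (240).
```

Both estimators are unbiased for the expected daily total here; the stratified one simply has far smaller sampling variance, which is the efficiency ranking the abstract reports.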

  15. Nationwide survey of policies and practices related to capillary blood sampling in medical laboratories in Croatia

    Science.gov (United States)

    Krleza, Jasna Lenicek

    2014-01-01

    Introduction: Capillary sampling is increasingly used to obtain blood for laboratory tests in volumes as small as necessary and as non-invasively as possible. Whether capillary blood sampling is also frequent in Croatia, and whether it is performed according to international laboratory standards is unclear. Materials and methods: All medical laboratories that participate in the Croatian National External Quality Assessment Program (N = 204) were surveyed on-line to collect information about the laboratory’s parent institution, patient population, types and frequencies of laboratory tests based on capillary blood samples, choice of reference intervals, and policies and procedures specifically related to capillary sampling. Sampling practices were compared with guidelines from the Clinical and Laboratory Standards Institute (CLSI) and the World Health Organization (WHO). Results: Of the 204 laboratories surveyed, 174 (85%) responded with complete questionnaires. Among the 174 respondents, 155 (89%) reported that they routinely perform capillary sampling, which is carried out by laboratory staff in 118 laboratories (76%). Nearly half of respondent laboratories (48%) do not have a written protocol including order of draw for multiple sampling. A single puncture site is used to provide capillary blood for up to two samples at 43% of laboratories that occasionally or regularly perform such sampling. Most respondents (88%) never perform arterialisation prior to capillary blood sampling. Conclusions: Capillary blood sampling is highly prevalent in Croatia across different types of clinical facilities and patient populations. Capillary sampling procedures are not standardised in the country, and the rate of laboratory compliance with CLSI and WHO guidelines is low. PMID:25351353

  16. Report on the Survey of the Design Review of New Reactor Applications. Volume 3: Reactor

    International Nuclear Information System (INIS)

Downey, Steven; Monninger, John; Nevalainen, Janne; Lorin, Aurelie; Webster, Philip; Joyer, Philippe; Kawamura, Tomonori; Lankin, Mikhail; Kubanyi, Jozef; Haluska, Ladislav; Persic, Andreja; Reierson, Craig; Kang, Kyungmin; Kim, Walter

    2016-01-01

    At the tenth meeting of the CNRA Working Group on the Regulation of New Reactors (WGRNR) in March 2013, the Working Group agreed to present the responses to the Second Phase, or Design Phase, of the Licensing Process Survey as a multi-volume text. As such, each report will focus on one of the eleven general technical categories covered in the survey. The general technical categories were selected to conform to the topics covered in the International Atomic Energy Agency (IAEA) Safety Guide GS-G-4.1. This document, which is the third report on the results of the Design Phase Survey, focuses on the Reactor. The Reactor category includes the following technical topics: fuel system design, reactor internals and core support, nuclear design and core nuclear performance, thermal and hydraulic design, reactor materials, and functional design of reactivity control system. For each technical topic, the member countries described the information provided by the applicant, the scope and level of detail of the technical review, the technical basis for granting regulatory authorisation, the skill sets required and the level of effort needed to perform the review. Based on a comparison of the information provided by the member countries in response to the survey, the following observations were made: - Although the description of the information provided by the applicant differs in scope and level of detail among the member countries that provided responses, there are similarities in the information that is required. - All of the technical topics covered in the survey are reviewed in some manner by all of the regulatory authorities that provided responses. - Design review strategies most commonly used to confirm that the regulatory requirements have been met include document review and independent verification of calculations, computer codes, or models used to describe the design and performance of the core and the fuel. - It is common to consider operating experience and

  17. Geochemical reanalysis of historical U.S. Geological Survey sediment samples from the Kougarok area, Bendeleben and Teller quadrangles, Seward Peninsula, Alaska

    Science.gov (United States)

    Werdon, Melanie B.; Granitto, Matthew; Azain, Jaime S.

    2015-01-01

    The State of Alaska’s Strategic and Critical Minerals (SCM) Assessment project, a State-funded Capital Improvement Project (CIP), is designed to evaluate Alaska’s statewide potential for SCM resources. The SCM Assessment is being implemented by the Alaska Division of Geological & Geophysical Surveys (DGGS), and involves obtaining new airborne-geophysical, geological, and geochemical data. As part of the SCM Assessment, thousands of historical geochemical samples from DGGS, U.S. Geological Survey (USGS), and U.S. Bureau of Mines archives are being reanalyzed by DGGS using modern, quantitative, geochemical-analytical methods. The objective is to update the statewide geochemical database to more clearly identify areas in Alaska with SCM potential. The USGS is also undertaking SCM-related geologic studies in Alaska through the federally funded Alaska Critical Minerals cooperative project. DGGS and USGS share the goal of evaluating Alaska’s strategic and critical minerals potential and together created a Letter of Agreement (signed December 2012) and a supplementary Technical Assistance Agreement (#14CMTAA143458) to facilitate the two agencies’ cooperative work. Under these agreements, DGGS contracted the USGS in Denver to reanalyze historical USGS sediment samples from Alaska. For this report, DGGS funded reanalysis of 302 historical USGS sediment samples from the statewide Alaska Geochemical Database Version 2.0 (AGDB2; Granitto and others, 2013). Samples were chosen from the Kougarok River drainage as well as smaller adjacent drainages in the Bendeleben and Teller quadrangles, Seward Peninsula, Alaska (fig. 1). The USGS was responsible for sample retrieval from the National Geochemical Sample Archive (NGSA) in Denver, Colorado through the final quality assurance/quality control (QA/QC) of the geochemical analyses obtained through the USGS contract lab. The new geochemical data are published in this report as a coauthored DGGS report, and will be incorporated

  18. Geochemical reanalysis of historical U.S. Geological Survey sediment samples from the Haines area, Juneau and Skagway quadrangles, southeast Alaska

    Science.gov (United States)

    Werdon, Melanie B.; Granitto, Matthew; Azain, Jaime S.

    2015-01-01

    The State of Alaska’s Strategic and Critical Minerals (SCM) Assessment project, a State-funded Capital Improvement Project (CIP), is designed to evaluate Alaska’s statewide potential for SCM resources. The SCM Assessment is being implemented by the Alaska Division of Geological & Geophysical Surveys (DGGS), and involves obtaining new airborne-geophysical, geological, and geochemical data. As part of the SCM Assessment, thousands of historical geochemical samples from DGGS, U.S. Geological Survey (USGS), and U.S. Bureau of Mines archives are being reanalyzed by DGGS using modern, quantitative, geochemical-analytical methods. The objective is to update the statewide geochemical database to more clearly identify areas in Alaska with SCM potential. The USGS is also undertaking SCM-related geologic studies in Alaska through the federally funded Alaska Critical Minerals cooperative project. DGGS and USGS share the goal of evaluating Alaska’s strategic and critical minerals potential and together created a Letter of Agreement (signed December 2012) and a supplementary Technical Assistance Agreement (#14CMTAA143458) to facilitate the two agencies’ cooperative work. Under these agreements, DGGS contracted the USGS in Denver to reanalyze historical USGS sediment samples from Alaska. For this report, DGGS funded reanalysis of 212 historical USGS sediment samples from the statewide Alaska Geochemical Database Version 2.0 (AGDB2; Granitto and others, 2013). Samples were chosen from the Chilkat, Klehini, Tsirku, and Takhin river drainages, as well as smaller drainages flowing into Chilkat and Chilkoot Inlets near Haines, Skagway Quadrangle, Southeast Alaska. Additionally some samples were also chosen from the Juneau gold belt, Juneau Quadrangle, Southeast Alaska (fig. 1). The USGS was responsible for sample retrieval from the National Geochemical Sample Archive (NGSA) in Denver, Colorado through the final quality assurance/quality control (QA/QC) of the geochemical

  19. Design for Restoration: beyond the survey

    Directory of Open Access Journals (Sweden)

    Giovanni Carbonara

    2015-01-01

    Full Text Available This special issue marks an important change for DISEGNARECON (its transfer from the University of Bologna to the University of L’Aquila) while addressing the topic of Design for Restoration in a way that is special too. It is in fact edited, alongside the outgoing editor-in-chief Roberto Mingucci, by Mario Centofanti, who now assumes responsibility for the magazine, and by Giovanni Carbonara, an authoritative reference in the field. Sharing a strong interest in communicating the restoration project, they intend to show the substantial union of methods and objectives between the disciplines of architectural survey and restoration, which gives meaning to an aggregation now also institutionally formalized and particularly significant for projects on existing architecture.

  20. Health inequalities: survey data compared to doctor defined data.

    NARCIS (Netherlands)

    Westert, G.P.; Schellevis, F.G.

    2003-01-01

    Aim: To compare the prevalence of conditions and health inequalities in one study population using two methods of data collection: a health interview survey and GP registration of consultations. Methods: Data are from the Second Dutch Survey of General Practice, using a multistage sampling design with

  1. Bionic Design for Mars Sampling Scoop Inspired by Himalayan Marmot Claw

    Directory of Open Access Journals (Sweden)

    Long Xue

    2016-01-01

    Full Text Available Cave animals are often adapted to digging and life underground, with claw toes similar in structure and function to a sampling scoop. In this paper, the clawed toes of the Himalayan marmot were selected as a biological prototype for bionic research. Based on geometric parameter optimization of the clawed toes, a bionic sampling scoop for use on Mars was designed. Using a 3D laser scanner, the point cloud data of the second front claw toe was acquired. Parametric equations and contour curves for the claw were then built with cubic polynomial fitting. We obtained 18 characteristic curve equations for the internal and external contours of the claw. A bionic sampling scoop was designed according to the structural parameters of Curiosity’s sampling shovel and the contours of the Himalayan marmot’s claw. Verification tests showed that when the penetration angle was 45° and the sampling speed was 0.33 r/min, the bionic sampling scoop’s resistance torque was 49.6% less than that of the prototype sampling scoop. When the penetration angle was 60° and the sampling speed was 0.22 r/min, the resistance torque of the bionic sampling scoop was 28.8% lower than that of the prototype sampling scoop.

  2. Evaluation of optimized bronchoalveolar lavage sampling designs for characterization of pulmonary drug distribution.

    Science.gov (United States)

    Clewe, Oskar; Karlsson, Mats O; Simonsson, Ulrika S H

    2015-12-01

    Bronchoalveolar lavage (BAL) is a pulmonary sampling technique for characterization of drug concentrations in epithelial lining fluid and alveolar cells. Two hypothetical drugs with different pulmonary distribution rates (fast and slow) were considered. An optimized BAL sampling design was generated assuming no previous information regarding the pulmonary distribution (rate and extent) and with a maximum of two samples per subject. Simulations were performed to evaluate the impact of the number of samples per subject (1 or 2) and the sample size on the relative bias and relative root mean square error of the parameter estimates (rate and extent of pulmonary distribution). The optimized BAL sampling design depends on a characterized plasma concentration time profile, a population plasma pharmacokinetic model, the limit of quantification (LOQ) of the BAL method and involves only two BAL sample time points, one early and one late. The early sample should be taken as early as possible, where concentrations in the BAL fluid ≥ LOQ. The second sample should be taken at a time point in the declining part of the plasma curve, where the plasma concentration is equivalent to the plasma concentration in the early sample. Using a previously described general pulmonary distribution model linked to a plasma population pharmacokinetic model, simulated data using the final BAL sampling design enabled characterization of both the rate and extent of pulmonary distribution. The optimized BAL sampling design enables characterization of both the rate and extent of the pulmonary distribution for both fast and slowly equilibrating drugs.
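The two-point design rule described above (take the early sample as early as the LOQ allows, and the late sample on the declining limb where plasma concentration returns to its early-sample value) can be sketched as follows. The one-compartment plasma model, its rate constants, and the BAL-to-plasma scale factor are illustrative assumptions, not values from the study.

```python
import math

def optimized_bal_times(conc, t_grid, loq, scale=1.0):
    """Sketch of the two-sample BAL design described above.
    conc   : plasma concentration as a function of time
    t_grid : candidate sampling times spanning the declining limb
    loq    : limit of quantification of the BAL assay
    scale  : assumed BAL-fluid/plasma concentration ratio (hypothetical)
    """
    # Early sample: as early as possible with BAL concentration >= LOQ.
    early = next(t for t in t_grid if scale * conc(t) >= loq)
    # Late sample: first time past the peak where the plasma concentration
    # falls back to its value at the early sample.
    target = conc(early)
    t_peak = max(t_grid, key=conc)
    late = next(t for t in t_grid if t > t_peak and conc(t) <= target)
    return early, late

# Illustrative one-compartment model with first-order absorption.
ka, ke, f = 1.5, 0.2, 10.0
plasma = lambda t: f * (math.exp(-ke * t) - math.exp(-ka * t))
grid = [i * 0.05 for i in range(1, 2000)]
early, late = optimized_bal_times(plasma, grid, loq=0.5)
```

Matching the two plasma concentrations is what lets both the rate and the extent of pulmonary distribution be identified from only two BAL samples per subject.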

  3. Survey of research on the optimal design of sea harbours

    Directory of Open Access Journals (Sweden)

    Hassan Diab

    2017-07-01

    Full Text Available The design of harbours, as with any other system design, must be an optimization process. In this study, a global examination of the different constraints in coastal engineering was performed and an optimization problem was defined. The problem has multiple objectives, and the criteria to be minimized are the structure cost and wave height disturbance inside a harbour. As concluded in this survey, the constraints are predefined parameters, mandatory constraints or optional constraints. All of these constraints are categorized into four categories: environmental, fluid mechanical, structural and manoeuvring.
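Because the problem defined in this survey minimizes two criteria at once (structure cost and wave-height disturbance inside the harbour), candidate designs are naturally compared by Pareto dominance. A minimal sketch with made-up (cost, disturbance) pairs, not real harbour data:

```python
def pareto_front(designs):
    """Filter candidate designs to those not dominated on the two minimized
    criteria named in the survey: structure cost and wave-height disturbance.
    Each design is a (cost, disturbance) pair; lower is better for both."""
    front = []
    for c1, d1 in designs:
        dominated = any(c2 <= c1 and d2 <= d1 and (c2, d2) != (c1, d1)
                        for c2, d2 in designs)
        if not dominated:
            front.append((c1, d1))
    return front

# Hypothetical candidates: (cost, disturbance)
candidates = [(10.0, 0.8), (12.0, 0.5), (11.0, 0.9), (14.0, 0.4)]
front = pareto_front(candidates)
# (11.0, 0.9) is dominated by (10.0, 0.8); the others trade cost against disturbance
```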

  4. The Army Communications Objectives Measurement System (ACOMS): Survey Design

    Science.gov (United States)

    1988-04-01

    ...tiveness, assessments of advertising strategy efficiencies, management of the advertising program, and planning and development of new marketing strategies... advertising strategy and market segmentation. The ACOMS development effort has focused on specifying the design and analysis plan for the survey... second set of goals involves the use of ACOMS data to assess the Army’s advertising strategy. ACOMS is examining the extent to which the Army’s in...

  5. Gathering Opinions on Depression Information Needs and Preferences: Samples and Opinions in Clinic Versus Web-Based Surveys.

    Science.gov (United States)

    Bernstein, Matthew T; Walker, John R; Sexton, Kathryn A; Katz, Alan; Beatie, Brooke E

    2017-04-24

    There has been limited research on the information needs and preferences of the public concerning treatment for depression. Very little research is available comparing samples and opinions when recruitment for surveys is done over the Web as opposed to a personal invitation to complete a paper survey. This study aimed to (1) explore information needs and preferences among members of the public and (2) compare Clinic and Web samples on sample characteristics and survey findings. Web survey participants were recruited with a notice on three self-help association websites (N=280). Clinic survey participants were recruited by a research assistant in the waiting rooms of a family medicine clinic and a walk-in medical clinic (N=238) and completed a paper version of the survey. The Clinic and Web samples were similar in age (39.0 years, SD 13.9 vs 40.2 years, SD 12.5, respectively), education, and proportion in full time employment. The Clinic sample was more diverse in demographic characteristics and closer to the demographic characteristics of the region (Winnipeg, Canada), with a higher proportion of males (102/238 [42.9%] vs 45/280 [16.1%]) and nonwhites (Aboriginal, Asian, and black) (69/238 [29.0%] vs 39/280 [13.9%]). The Web sample reported a higher level of emotional distress and had more previous psychological (224/280 [80.0%] vs 83/238 [34.9%]) and pharmacological (202/280 [72.1%] vs 57/238 [23.9%]) treatment. In terms of opinions, most respondents in both settings saw information on a wide range of topics around depression treatment as very important, including information about treatment choices, effectiveness of treatment, how long it takes treatment to work, how long treatment continues, what happens when treatment stops, advantages and disadvantages of treatments, and potential side effects. 
Females, respondents with a white background, and those who had received or felt they would have benefited from therapy in the past saw more information topics as very

  6. Archaeology in the Kilauea East Rift Zone: Part 2, A preliminary sample survey, Kapoho, Kamaili and Kilauea geothermal subzones, Puna District, Hawaii island

    Energy Technology Data Exchange (ETDEWEB)

    Sweeney, M.T.K.; Burtchard, G.C. [International Archaeological Research Inst., Inc., Honolulu, HI (United States)

    1995-05-01

    This report describes a preliminary sample inventory and offers an initial evaluation of settlement and land-use patterns for the Geothermal Resources Subzones (GRS) area, located in Puna District on the island of Hawaii. The report is the second of a two-part project dealing with archaeology of the Puna GRS area -- or more generally, the Kilauea East Rift Zone. In the first phase of the project, a long-term land-use model and inventory research design was developed for the GRS area and Puna District generally. That report is available under separate cover as Archaeology in the Kilauea East Rift Zone, Part I: Land-Use Model and Research Design. The present report gives results of a limited cultural resource survey built on research design recommendations. It offers a preliminary evaluation of modeled land-use expectations and offers recommendations for continuing research into Puna's rich cultural heritage. The present survey was conducted under the auspices of the United States Department of Energy, and subcontracted to International Archaeological Research Institute, Inc. (IARII) by Martin Marietta Energy Systems, Inc. The purpose of the archaeological work is to contribute toward the preparation of an environmental impact statement by identifying cultural materials which could be impacted through completion of the proposed Hawaii Geothermal Project.

  7. Optimal experiment design in a filtering context with application to sampled network data

    OpenAIRE

    Singhal, Harsh; Michailidis, George

    2010-01-01

    We examine the problem of optimal design in the context of filtering multiple random walks. Specifically, we define the steady state E-optimal design criterion and show that the underlying optimization problem leads to a second order cone program. The developed methodology is applied to tracking network flow volumes using sampled data, where the design variable corresponds to controlling the sampling rate. The optimal design is numerically compared to a myopic and a naive strategy. Finally, w...

  8. Prevalence of coronary artery disease and coronary risk factors in Kerala, South India: A population surveyDesign and methods

    Directory of Open Access Journals (Sweden)

    Geevar Zachariah

    2013-05-01

    Methods: The design of the study was cross-sectional population survey. We estimated the sample size based on an anticipated prevalence of 7.4% of CAD for rural and 11% for urban Kerala. The derived sample sizes for rural and urban areas were 3000 and 2400, respectively. The urban areas for sampling constituted one ward each from three municipal corporations at different parts of the state. The rural sample was drawn from two panchayats each in the same districts as the urban sample. One adult from each household in the age group of 20–59 years was selected using Kish method. All subjects between 60 and 79 years were included from each household. A detailed questionnaire was administered to assess the risk factors, history of CAD, family history, educational status, socioeconomic status, dietary habits, physical activity and treatment for CAD; anthropometric measurements, blood pressure, electrocardiogram and fasting blood levels of glucose and lipids were recorded.
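The Kish method mentioned above selects one eligible adult per household objectively, by listing adults in a fixed order and picking one via a pre-assigned selection table. The sketch below is a simplified illustration of that idea (a random fraction stands in for the published Kish tables; names and ages are hypothetical):

```python
import random

def kish_select(adults, selection_fraction):
    """Simplified illustration of Kish-style within-household selection
    (not the exact published Kish tables): eligible adults are listed in a
    fixed order (here by sex, then oldest first) and one is picked by a
    pre-assigned random fraction, removing interviewer discretion.
    adults: list of (name, sex, age); selection_fraction in [0, 1)."""
    roster = sorted(adults, key=lambda a: (a[1], -a[2]))  # fixed listing order
    return roster[int(selection_fraction * len(roster))]  # table-lookup analogue

household = [("A", "M", 52), ("B", "F", 48), ("C", "F", 21)]
chosen = kish_select(household, random.Random(7).random())
```

Fixing both the listing order and the selection fraction before the interview is what makes the within-household draw a probability sample rather than a convenience choice.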

  9. A Survey for Spectroscopic Binaries in a Large Sample of G Dwarfs

    Science.gov (United States)

    Udry, S.; Mayor, M.; Latham, D. W.; Stefanik, R. P.; Torres, G.; Mazeh, T.; Goldberg, D.; Andersen, J.; Nordstrom, B.

    For more than 5 years now, the radial velocities for a large sample of G dwarfs (3,347 stars) have been monitored in order to obtain an unequaled set of orbital parameters for solar-type stars (~400 orbits, up to now). This survey provides a considerable improvement on the classical systematic study by Duquennoy and Mayor (1991; DM91). The observational part of the survey has been carried out as a collaboration between the Geneva Observatory, on the two CORAVEL spectrometers for the southern sky, and the CfA, at the Oak Ridge and Whipple Observatories for the northern sky. As a first glance at these new results, we address in this contribution a special aspect of the orbital eccentricity distribution, namely the disappearance of the void observed in DM91 for quasi-circular orbits with periods larger than 10 days.

  10. Pseudo-populations a basic concept in statistical surveys

    CERN Document Server

    Quatember, Andreas

    2015-01-01

    This book emphasizes that artificial or pseudo-populations play an important role in statistical surveys from finite universes in two manners: firstly, the concept of pseudo-populations may substantially improve users’ understanding of various aspects in the sampling theory and survey methodology; an example of this scenario is the Horvitz-Thompson estimator. Secondly, statistical procedures exist in which pseudo-populations actually have to be generated. An example of such a scenario can be found in simulation studies in the field of survey sampling, where close-to-reality pseudo-populations are generated from known sample and population data to form the basis for the simulation process. The chapters focus on estimation methods, sampling techniques, nonresponse, questioning designs and statistical disclosure control. This book is a valuable reference in understanding the importance of the pseudo-population concept and applying it in teaching and research.
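The Horvitz-Thompson estimator named above weights each sampled value by the inverse of its inclusion probability, which is the intuition the pseudo-population concept formalizes (each sampled unit "stands for" 1/π_i population units). A minimal sketch:

```python
def horvitz_thompson_total(sample_values, inclusion_probs):
    """Horvitz-Thompson estimator of a population total: each sampled
    value y_i is expanded by 1/pi_i, its inclusion probability."""
    if len(sample_values) != len(inclusion_probs):
        raise ValueError("need one inclusion probability per sampled value")
    return sum(y / pi for y, pi in zip(sample_values, inclusion_probs))

# Equal-probability sample of n=4 from a population of N=100: pi_i = 0.04,
# so the estimator reduces to N times the sample mean.
est = horvitz_thompson_total([10, 12, 8, 14], [0.04] * 4)
# est == 1100.0
```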

  11. Survey, design, development, and installation of micro hydel power generation

    International Nuclear Information System (INIS)

    Ijaz, M.

    2011-01-01

    This paper presents the survey, design, development and installation of micro hydel power generation using a low-head Kaplan water turbine. Electricity production from hydro power was, and still is today, the first renewable source used to generate electricity. The development of energy from renewables is a very important step in the reduction of carbon emissions (CO₂).

  12. A Primer for Conducting Survey Research Using MTurk: Tips for the Field

    Science.gov (United States)

    Chambers, Silvana; Nimon, Kim; Anthony-McMann, Paula

    2016-01-01

    This paper presents best practices for conducting survey research using Amazon Mechanical Turk (MTurk). Readers will learn the benefits, limitations, and trade-offs of using MTurk as compared to other recruitment services, including SurveyMonkey and Qualtrics. A synthesis of survey design guidelines along with a sample survey are presented to help…

  13. THE SLOAN DIGITAL SKY SURVEY QUASAR LENS SEARCH. IV. STATISTICAL LENS SAMPLE FROM THE FIFTH DATA RELEASE

    International Nuclear Information System (INIS)

    Inada, Naohisa; Oguri, Masamune; Shin, Min-Su; Kayo, Issha; Fukugita, Masataka; Strauss, Michael A.; Gott, J. Richard; Hennawi, Joseph F.; Morokuma, Tomoki; Becker, Robert H.; Gregg, Michael D.; White, Richard L.; Kochanek, Christopher S.; Chiu, Kuenley; Johnston, David E.; Clocchiatti, Alejandro; Richards, Gordon T.; Schneider, Donald P.; Frieman, Joshua A.

    2010-01-01

    We present the second report of our systematic search for strongly lensed quasars from the data of the Sloan Digital Sky Survey (SDSS). From extensive follow-up observations of 136 candidate objects, we find 36 lenses in the full sample of 77,429 spectroscopically confirmed quasars in the SDSS Data Release 5. We then define a complete sample of 19 lenses, including 11 from our previous search in the SDSS Data Release 3, drawn from the sample of 36,287 quasars, and obtain Ω_Λ = 0.84 (+0.06/−0.08 stat., +0.09/−0.07 syst.) assuming a flat universe, which is in good agreement with other cosmological observations. We also report the discoveries of seven binary quasars with separations ranging from 1.1″ to 16.6″, which are identified in the course of our lens survey. This study concludes the construction of our statistical lens sample in the full SDSS-I data set.

  14. Survey on Cooled-Vessel Designs in High Temperature Gas-Cooled Reactors

    International Nuclear Information System (INIS)

    Kim, Min-Hwan; Lee, Won-Jae

    2006-01-01

    The core outlet temperature of the coolant in high temperature gas-cooled reactors (HTGR) has been increased to improve the overall efficiency of electricity generation using the Brayton cycle and of nuclear hydrogen production using thermo-chemical processes. The increase of the outlet temperature accompanies an increase of the coolant inlet temperature. A high coolant inlet temperature results in an increase of the reactor pressure vessel (RPV) operating temperature. Conventional steels, the proven vessel material in light water reactors, cannot be used as RPV materials at these elevated temperatures, which necessitate a design that accounts for creep effects. Some ferritic or martensitic steels, such as 2 1/4Cr-1Mo and 9Cr-1Mo-V, are well established creep-resistant materials for a temperature range of 400 to 550 C. Although these materials have been used in chemical plants, there is limited experience with using them in nuclear reactors. Even though 2 1/4Cr-1Mo steel was used to manufacture the RPV for the HTTR of the Japan Atomic Energy Agency (JAEA), a large RPV has not been manufactured using this material or 9Cr-1Mo-V steel. Due not only to manufacturing difficulties but also to high cost, the JAEA determined to exclude these materials from the GTHTR design. For the above reasons, KAERI has been considering a cooled-vessel design as an option for the RPV design of an NHDD (Nuclear Hydrogen Development and Demonstration) plant. In this study, we surveyed several HTGRs that adopt the cooled-vessel concept for their RPV design and discussed their design characteristics. The survey resulted in design considerations for the NHDD cooled-vessel design

  15. Inclusion of mobile telephone numbers into an ongoing population health survey in New South Wales, Australia, using an overlapping dual-frame design: impact on the time series.

    Science.gov (United States)

    Barr, Margo L; Ferguson, Raymond A; Steel, David G

    2014-08-12

    Since 1997, the NSW Population Health Survey (NSWPHS) had selected its sample using random digit dialing of landline telephone numbers. When the survey began, coverage of the population by landline phone frames was high (96%). As landline coverage in Australia has declined and continues to do so, in 2012 a sample of mobile telephone numbers was added to the survey using an overlapping dual-frame design. Details of the methodology are published elsewhere. This paper discusses the impacts of the sampling frame change on the time series and provides possible approaches to handling these impacts. Prevalence estimates were calculated for type of phone use and a range of health indicators. Prevalence ratios (PR) for each of the health indicators were also calculated by type of phone use, using Poisson regression analysis with robust variance estimation. Health estimates for 2012 were compared to 2011. The full time series was examined for selected health indicators. It was estimated from the 2012 NSWPHS that 20.0% of the NSW population were mobile-only phone users. Looking at the full time series for overweight or obese and for current smoking, if the NSWPHS had continued to be undertaken using only a landline frame, overweight or obese would have been shown to continue to increase and current smoking would have been shown to continue to decrease. However, with the introduction of the overlapping dual-frame design in 2012, overweight or obese increased until 2011 and then decreased in 2012, and current smoking decreased until 2011 and then increased in 2012. Our examination of these time series showed that the changes were a consequence of the sampling frame change and were not real changes. Both the backcasting method and the minimal coverage method could adequately adjust for the design change and allow for the continuation of the time series. 
The inclusion of the mobile telephone numbers, through an overlapping dual-frame design, did impact on the time series for some of
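One standard way to combine estimates from an overlapping dual-frame design like the one described is a Hartley-style composite estimator: the landline-only and mobile-only domains come from their own frames, while the overlap domain (people reachable by both) is a λ-weighted mix of the two frames' estimates. The domain totals and λ below are illustrative, not NSWPHS values:

```python
def dual_frame_estimate(landline_only, mobile_only,
                        overlap_landline, overlap_mobile, lam=0.5):
    """Hartley-style composite estimator for an overlapping dual-frame design.
    Inputs are estimated domain totals; the overlap domain is estimated from
    each frame separately and combined with mixing weight lam in [0, 1]."""
    return (landline_only + mobile_only
            + lam * overlap_landline + (1.0 - lam) * overlap_mobile)

# Hypothetical estimated totals (thousands of persons), lam = 0.5:
est_total = dual_frame_estimate(100.0, 50.0, 200.0, 220.0, lam=0.5)
# est_total == 100 + 50 + 0.5*200 + 0.5*220 == 360.0
```

Choosing λ trades off the variances of the two frames' overlap estimates; λ = 0.5 simply averages them.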

  16. Design/Operations review of core sampling trucks and associated equipment

    International Nuclear Information System (INIS)

    Shrivastava, H.P.

    1996-01-01

    A systematic review of the design and operations of the core sampling trucks was commissioned by Characterization Equipment Engineering of the Westinghouse Hanford Company in October 1995. The review team reviewed the design documents, specifications, operating procedure, training manuals and safety analysis reports. The review process, findings and corrective actions are summarized in this supporting document

  17. Practical iterative learning control with frequency domain design and sampled data implementation

    CERN Document Server

    Wang, Danwei; Zhang, Bin

    2014-01-01

    This book is on iterative learning control (ILC), with a focus on design and implementation. We approach ILC design based on frequency domain analysis and address ILC implementation based on sampled data methods. This is the first book to treat ILC from both frequency-domain and sampled-data methodologies. The frequency domain design methods offer ILC users insights into the convergence performance, which is of practical benefit. This book presents a comprehensive framework with various methodologies to ensure that the learnable bandwidth in the ILC system is set with a balance between learning performance and learning stability. The sampled data implementation ensures effective execution of ILC in practical dynamic systems. The presented sampled data ILC methods also ensure the balance of performance and stability of the learning process. Furthermore, the presented theories and methodologies are tested with an ILC-controlled robotic system. The experimental results show that the machines can work in much h...

  18. Design of sampling tools for Monte Carlo particle transport code JMCT

    International Nuclear Information System (INIS)

    Shangguan Danhua; Li Gang; Zhang Baoyin; Deng Li

    2012-01-01

    A class of sampling tools for the general Monte Carlo particle transport code JMCT is designed. Two ways are provided to sample from distributions. One is the use of special sampling methods for particular distributions; the other is the use of general sampling methods for arbitrary discrete distributions and one-dimensional continuous distributions on a finite interval. Some open source codes are included in the general sampling method for maximum user convenience. The sampling results show that distributions which are popular in particle transport can be sampled correctly with these tools, and the user's convenience is assured. (authors)
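The two general sampling routes described (arbitrary discrete distributions, and one-dimensional continuous distributions on a finite interval) can both be sketched with inverse-transform sampling. This is an illustrative implementation of the technique, not JMCT's actual code:

```python
import bisect

def sample_discrete(probs, u):
    """General sampling from an arbitrary discrete distribution by inverse
    transform: return the first index whose cumulative probability exceeds u."""
    cdf, total = [], 0.0
    for p in probs:
        total += p
        cdf.append(total)
    return bisect.bisect_left(cdf, u * total)  # scaling handles unnormalized probs

def sample_continuous(pdf, a, b, u, n_bins=10_000):
    """General sampling from a one-dimensional continuous distribution on a
    finite interval [a, b]: tabulate the CDF numerically and invert it."""
    h = (b - a) / n_bins
    cdf, total = [], 0.0
    for i in range(n_bins):
        total += pdf(a + (i + 0.5) * h) * h  # midpoint rule
        cdf.append(total)
    return a + (bisect.bisect_left(cdf, u * total) + 0.5) * h

k = sample_discrete([0.2, 0.5, 0.3], 0.25)  # lands in the second bin -> 1
# pdf(t) = 2t on [0, 1] has CDF t^2, so u = 0.25 should map to x ~ 0.5.
x = sample_continuous(lambda t: 2.0 * t, 0.0, 1.0, 0.25)
```

In a transport code the uniform variate u would come from the code's own random number stream; here it is passed in explicitly to keep the routines deterministic.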

  19. Geochemical surveys in the United States in relation to health

    Energy Technology Data Exchange (ETDEWEB)

    Tourtelot, H A

    1979-12-11

    Geochemical surveys in relation to health may be classified as having one, two or three dimensions. One-dimensional surveys examine relations between concentrations of elements such as Pb in soils and other media and burdens of the same elements in humans, at a given time. The spatial distributions of element concentrations are not investigated. The primary objective of two-dimensional surveys is to map the distributions of element concentrations, commonly according to stratified random sampling designs based on either conceptual landscape units or artificial sampling strata, but systematic sampling intervals have also been used. Political units have defined sample areas that coincide with the units used to accumulate epidemiological data. Element concentrations affected by point sources have also been mapped. Background values, location of natural or technological anomalies and the geographic scale of variation for several elements often are determined. Three-dimensional surveys result when two-dimensional surveys are repeated to detect environmental changes.
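Stratified random sampling designs like those described (strata defined by conceptual landscape units or artificial sampling strata) require allocating the overall sample across strata. A minimal proportional-allocation sketch, with hypothetical stratum sizes rather than data from any of these surveys:

```python
import math

def proportional_allocation(stratum_sizes, n_total):
    """Proportional allocation for a stratified random sample: stratum h of
    size N_h receives n_h ~ n_total * N_h / N, with largest-remainder
    rounding so the allocations sum exactly to n_total."""
    N = sum(stratum_sizes)
    raw = [n_total * Nh / N for Nh in stratum_sizes]
    alloc = [math.floor(r) for r in raw]
    # hand out any remaining units to the largest fractional remainders
    leftovers = sorted(range(len(raw)), key=lambda i: raw[i] - alloc[i], reverse=True)
    for i in leftovers[: n_total - sum(alloc)]:
        alloc[i] += 1
    return alloc

# Three hypothetical landscape-unit strata of 500, 300 and 200 units,
# with an overall budget of 50 samples:
alloc = proportional_allocation([500, 300, 200], 50)
# alloc == [25, 15, 10]
```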

  20. An open-population hierarchical distance sampling model

    Science.gov (United States)

    Sollmann, Rahel; Gardner, Beth; Chandler, Richard B.; Royle, J. Andrew; Sillett, T. Scott

    2015-01-01

    Modeling population dynamics while accounting for imperfect detection is essential to monitoring programs. Distance sampling allows estimating population size while accounting for imperfect detection, but existing methods do not allow for direct estimation of demographic parameters. We develop a model that uses temporal correlation in abundance arising from underlying population dynamics to estimate demographic parameters from repeated distance sampling surveys. Using a simulation study motivated by designing a monitoring program for island scrub-jays (Aphelocoma insularis), we investigated the power of this model to detect population trends. We generated temporally autocorrelated abundance and distance sampling data over six surveys, using population rates of change of 0.95 and 0.90. We fit the data generating Markovian model and a mis-specified model with a log-linear time effect on abundance, and derived post hoc trend estimates from a model estimating abundance for each survey separately. We performed these analyses for varying number of survey points. Power to detect population changes was consistently greater under the Markov model than under the alternatives, particularly for reduced numbers of survey points. The model can readily be extended to more complex demographic processes than considered in our simulations. This novel framework can be widely adopted for wildlife population monitoring.

  1. An open-population hierarchical distance sampling model.

    Science.gov (United States)

    Sollmann, Rahel; Gardner, Beth; Chandler, Richard B; Royle, J Andrew; Sillett, T Scott

    2015-02-01

    Modeling population dynamics while accounting for imperfect detection is essential to monitoring programs. Distance sampling allows estimating population size while accounting for imperfect detection, but existing methods do not allow for estimation of demographic parameters. We develop a model that uses temporal correlation in abundance arising from underlying population dynamics to estimate demographic parameters from repeated distance sampling surveys. Using a simulation study motivated by designing a monitoring program for Island Scrub-Jays (Aphelocoma insularis), we investigated the power of this model to detect population trends. We generated temporally autocorrelated abundance and distance sampling data over six surveys, using population rates of change of 0.95 and 0.90. We fit the data generating Markovian model and a mis-specified model with a log-linear time effect on abundance, and derived post hoc trend estimates from a model estimating abundance for each survey separately. We performed these analyses for varying numbers of survey points. Power to detect population changes was consistently greater under the Markov model than under the alternatives, particularly for reduced numbers of survey points. The model can readily be extended to more complex demographic processes than considered in our simulations. This novel framework can be widely adopted for wildlife population monitoring.
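Distance sampling accounts for imperfect detection by modeling detection probability as a function of distance from the observer; a common choice is the half-normal detection function g(d) = exp(−d²/2σ²). The simulation sketch below generates detected counts at a single survey point under that model (the parameters are illustrative, and this is not the authors' simulation code):

```python
import math
import random

def halfnormal_detection(d, sigma):
    """Half-normal detection function g(d) = exp(-d^2 / (2 sigma^2)):
    probability of detecting an animal at distance d from the point."""
    return math.exp(-d * d / (2.0 * sigma * sigma))

def simulate_point_count(true_n, radius, sigma, rng):
    """Scatter true_n animals uniformly in a disk of the given radius around
    a survey point and retain each with probability g(d)."""
    detected = 0
    for _ in range(true_n):
        d = radius * math.sqrt(rng.random())  # uniform over the disk area
        if rng.random() < halfnormal_detection(d, sigma):
            detected += 1
    return detected

count = simulate_point_count(true_n=200, radius=100.0, sigma=40.0,
                             rng=random.Random(42))
```

Repeating such simulated surveys over several occasions, with abundance evolving between them, is the kind of data-generating setup used in the power analysis the abstract describes.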

  2. CHAracteristics of research studies that iNfluence practice: a GEneral survey of Canadian orthopaedic Surgeons (CHANGES): a pilot survey.

    Science.gov (United States)

    de Sa, Darren; Thornley, Patrick; Evaniew, Nathan; Madden, Kim; Bhandari, Mohit; Ghert, Michelle

    2015-01-01

    Evidence-based medicine (EBM) is increasingly being applied to inform clinical decision-making in orthopaedic surgery. Despite the promotion of EBM in orthopaedic surgery, the adoption of results from high-quality clinical research seems highly unpredictable and does not appear to be driven strictly by randomized trial data. The objective of this study was to pilot a survey to determine whether we could identify surgeon opinions on the characteristics of research studies that are perceived as being most likely to influence clinical decision-making among orthopaedic surgeons in Canada. A 28-question electronic survey was distributed to active members of the Canadian Orthopaedic Association (COA) over a period of 11 weeks. The questionnaire sought to analyze the influence of both extrinsic and intrinsic characteristics of research studies and their potential to influence practice patterns. Extrinsic factors included perceived journal quality and investigator profiles, economic impact, peer/patient/industry influence, and individual surgeons' residency/fellowship training experiences. Intrinsic factors included study design, sample size, and outcomes reported. Descriptive statistics are provided. Of the 109 members of the COA who opened the survey, 95 (87%) completed it in its entirety. The overall response rate was 11% (95/841). Surgeons achieved consensus on the influence of three key designs on their practices: 1) randomized controlled trials, 94 (99%); 2) meta-analyses, 83 (87%); and 3) systematic reviews, 81 (85%). Sixty-seven percent of surgeons agreed that studies with sample sizes of 101-500 or more were more likely to influence clinical practice than smaller studies (n = …). Extrinsic factors influencing adoption included 1) the reputation of the investigators (99%) and 2) the perceived quality of the journal (75%). Although study design and sample size (i.e. a minimum of 100 patients) have some influence on clinical decision-making, surgeon respondents are equally influenced

  3. Men who have sex with men in Great Britain: comparing methods and estimates from probability and convenience sample surveys

    OpenAIRE

    Prah, Philip; Hickson, Ford; Bonell, Chris; McDaid, Lisa M; Johnson, Anne M; Wayal, Sonali; Clifton, Soazig; Sonnenberg, Pam; Nardone, Anthony; Erens, Bob; Copas, Andrew J; Riddell, Julie; Weatherburn, Peter; Mercer, Catherine H

    2016-01-01

    Objective: To examine sociodemographic and behavioural differences between men who have sex with men (MSM) participating in recent UK convenience surveys and a national probability sample survey. Methods: We compared 148 MSM aged 18–64 years interviewed for Britain's third National Survey of Sexual Attitudes and Lifestyles (Natsal-3) undertaken in 2010–2012, with men in the same age range participating in contemporaneous convenience surveys of MSM: 15 500 British resident men in the European...

  4. [A respondent-driven sampling survey on HIV and risk factors among men who have sex with men in Chongqing].

    Science.gov (United States)

    Ouyang, Lin; Feng, Lian-gui; Ding, Xian-bin; Zhao, Jin-kou; Xu, Jing; Han, Mei; Zhou, Chao

    2009-10-01

    To examine HIV prevalence and related risk factors among men who have sex with men (MSM) in Chongqing, and to explore the feasibility of using respondent-driven sampling (RDS) in the survey. Based on results from formative research, an RDS survey was designed and conducted to collect demographic, behavioral and serologic data. RDSAT was used to calculate point estimates and confidence intervals. SPSS was used for bivariate analysis of the RDSAT-exported weighted data. NETDRAW was used to draw the network diagram. Among the 617 subjects recruited, the adjusted HIV and syphilis prevalence rates were 16.8% and 10.9%, respectively. 73.0% of the subjects were 20 to 29 years old and 72.9% were officially registered residents of Chongqing. 83.4% were single, and students formed the largest occupational group, accounting for 24.6%. During the last six months, 83.4% of them reported ever having anal sex, and 54.0% reported having unprotected anal sex. This survey confirmed that Chongqing had a higher reported HIV prevalence among MSM than other Chinese cities. Comprehensive intervention services are required to address this alarmingly high prevalence, with a focus on intervention through the internet and on those with syphilis infection. Compared with the snowball method, RDS appeared to be an effective way of recruiting hidden MSM populations in Chongqing, which has a large population of active MSM who do not frequently visit MSM venues.

  5. Optimizing incomplete sample designs for item response model parameters

    NARCIS (Netherlands)

    van der Linden, Willem J.

    Several models for optimizing incomplete sample designs with respect to information on the item parameters are presented. The following cases are considered: (1) known ability parameters; (2) unknown ability parameters; (3) item sets with multiple ability scales; and (4) response models with

  6. The Lyα reference sample. I. Survey outline and first results for Markarian 259

    International Nuclear Information System (INIS)

    Östlin, Göran; Hayes, Matthew; Duval, Florent; Sandberg, Andreas; Rivera-Thorsen, Thøger; Marquart, Thomas; Adamo, Angela; Melinder, Jens; Guaita, Lucia; Micheva, Genoveva; Orlitová, Ivana; Atek, Hakim; Cannon, John M.; Pardy, Stephen A.; Gruyters, Pieter; Herenz, Edmund Christian; Kunth, Daniel; Laursen, Peter; Mas-Hesse, J. Miguel; Otí-Floranes, Héctor

    2014-01-01

    The Lyα Reference Sample (LARS) is a substantial program with the Hubble Space Telescope (HST) that provides a sample of local universe laboratory galaxies in which to study the detailed astrophysics of the visibility and strength of the Lyα line of neutral hydrogen. Lyα is the dominant spectral line in use for characterizing high-redshift (z) galaxies. This paper presents an overview of the survey, its selection function, and HST imaging observations. The sample was selected from the combined GALEX+Sloan Digital Sky Survey catalog at z = 0.028-0.19, in order to allow Lyα to be captured with combinations of long-pass filters in the Solar Blind Channel (SBC) of the Advanced Camera for Surveys (ACS) onboard HST. In addition, LARS utilizes Hα and Hβ narrowband and u, b, i broadband imaging with ACS and the Wide Field Camera 3 (WFC3). In order to study galaxies in which large numbers of Lyα photons are produced (whether or not they escape), we demanded an Hα equivalent width W(Hα) ≥100 Å. The final sample of 14 galaxies covers far-UV (FUV, λ ∼ 1500 Å) luminosities that overlap with those of high-z Lyα emitters (LAEs) and Lyman break galaxies (LBGs), making LARS a valid comparison sample. We present the reduction steps used to obtain the Lyα images, including our LARS eXtraction software (LaXs), which utilizes pixel-by-pixel spectral synthesis fitting of the energy distribution to determine and subtract the continuum at Lyα. We demonstrate that the use of SBC long-pass-filter combinations increases the signal-to-noise ratio by an order of magnitude compared to the nominal Lyα filter available in SBC. To exemplify the science potential of LARS, we also present some first results for a single galaxy, Mrk 259 (LARS #1). This irregular galaxy shows bright and extended (indicative of resonance scattering) but strongly asymmetric Lyα emission. Spectroscopy from the Cosmic Origins Spectrograph on board HST centered on the brightest UV knot shows a moderate
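
    LaXs performs pixel-by-pixel spectral synthesis fitting to estimate the continuum under Lyα. A much simpler stand-in for that idea is classic on/off continuum subtraction with a fixed scale factor, sketched below; the array values and the scale factor are invented for illustration and are not part of the LARS pipeline.

```python
import numpy as np

def continuum_subtract(on_line, off_line, scale):
    """Estimate a continuum-free emission-line image by subtracting the
    off-line (continuum) image, scaled to the on-line bandpass. This is
    the fixed-scale approximation; LaXs instead fits a spectral model in
    every pixel to predict the continuum at Lyα."""
    return on_line - scale * off_line

# Toy 3x3 'images': a flat continuum of 10 counts, which appears 1.2x
# brighter in the on-line bandpass, plus a 5-count line in the centre.
off = np.full((3, 3), 10.0)
on = 1.2 * off.copy()
on[1, 1] += 5.0                # emission-line flux
line = continuum_subtract(on, off, scale=1.2)
print(line)                    # zero everywhere except 5.0 at the centre
```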

  7. The Lyα reference sample. I. Survey outline and first results for Markarian 259

    Energy Technology Data Exchange (ETDEWEB)

    Östlin, Göran; Hayes, Matthew; Duval, Florent; Sandberg, Andreas; Rivera-Thorsen, Thøger; Marquart, Thomas; Adamo, Angela; Melinder, Jens; Guaita, Lucia; Micheva, Genoveva [Department of Astronomy, Stockholm University, Oscar Klein Centre, AlbaNova, Stockholm SE-106 91 (Sweden); Orlitová, Ivana [Observatoire de Genève, Université de Genève, Chemin des Maillettes 51, 1290 Versoix (Switzerland); Atek, Hakim [Laboratoire d'Astrophysique, Ecole Polytechnique Fédérale de Lausanne, Observatoire de Sauverny, CH-1290 Versoix (Switzerland); Cannon, John M.; Pardy, Stephen A. [Department of Physics and Astronomy, Macalester College, 1600 Grand Avenue, Saint Paul, MN 55105 (United States); Gruyters, Pieter [Department of Physics and Astronomy, Division of Astronomy and Space Physics, Uppsala University, Box 516, 75120 Uppsala (Sweden); Herenz, Edmund Christian [Leibniz-Institute for Astrophysics Potsdam (AIP), innoFSPEC, An der Sternwarte 16, D-14482 Potsdam (Germany); Kunth, Daniel [Institut d'Astrophysique de Paris, 98bis Bd Arago, F-75014 Paris (France); Laursen, Peter [Dark Cosmology Centre, Niels Bohr Institute, University of Copenhagen, DK-2100 Copenhagen (Denmark); Mas-Hesse, J. Miguel [Centro de Astrobiología (CSIC-INTA), Departamento de Astrofísica, POB 78, E-28691, Villanueva de la Cañada (Spain); Otí-Floranes, Héctor [Instituto de Astronomía, Universidad Nacional Autónoma de México, Apdo. Postal 106, Ensenada B. C. 22800 (Mexico); and others

    2014-12-10

    The Lyα Reference Sample (LARS) is a substantial program with the Hubble Space Telescope (HST) that provides a sample of local universe laboratory galaxies in which to study the detailed astrophysics of the visibility and strength of the Lyα line of neutral hydrogen. Lyα is the dominant spectral line in use for characterizing high-redshift (z) galaxies. This paper presents an overview of the survey, its selection function, and HST imaging observations. The sample was selected from the combined GALEX+Sloan Digital Sky Survey catalog at z = 0.028-0.19, in order to allow Lyα to be captured with combinations of long-pass filters in the Solar Blind Channel (SBC) of the Advanced Camera for Surveys (ACS) onboard HST. In addition, LARS utilizes Hα and Hβ narrowband and u, b, i broadband imaging with ACS and the Wide Field Camera 3 (WFC3). In order to study galaxies in which large numbers of Lyα photons are produced (whether or not they escape), we demanded an Hα equivalent width W(Hα) ≥100 Å. The final sample of 14 galaxies covers far-UV (FUV, λ ∼ 1500 Å) luminosities that overlap with those of high-z Lyα emitters (LAEs) and Lyman break galaxies (LBGs), making LARS a valid comparison sample. We present the reduction steps used to obtain the Lyα images, including our LARS eXtraction software (LaXs), which utilizes pixel-by-pixel spectral synthesis fitting of the energy distribution to determine and subtract the continuum at Lyα. We demonstrate that the use of SBC long-pass-filter combinations increases the signal-to-noise ratio by an order of magnitude compared to the nominal Lyα filter available in SBC. To exemplify the science potential of LARS, we also present some first results for a single galaxy, Mrk 259 (LARS #1). This irregular galaxy shows bright and extended (indicative of resonance scattering) but strongly asymmetric Lyα emission. Spectroscopy from the Cosmic Origins Spectrograph on board HST centered on the brightest UV knot shows a moderate

  8. A Survey on Efficient Collaboration of Design and Simulation in Product Development

    OpenAIRE

    Kreimeyer, M.; Deubzer, F.; Herfeld, U.; Lindemann, U.

    2017-01-01

    Efficient collaboration is a popular topic in all kinds of industry, with products evolving into ever more complex systems and with Taylorism in product development increasing. With the goal of satisfying the customer's functional desires for the product, the cooperation of the embodiment design, simulation, and testing departments in a company plays an essential role. The results of a survey exploring the problems and opportunities of the former two are laid out in the following. For the survey, about 50 qu...

  9. The Outer Solar System Origins Survey. I. Design and First-Quarter Discoveries

    Science.gov (United States)

    Bannister, Michele T.; Kavelaars, J. J.; Petit, Jean-Marc; Gladman, Brett J.; Gwyn, Stephen D. J.; Chen, Ying-Tung; Volk, Kathryn; Alexandersen, Mike; Benecchi, Susan D.; Delsanti, Audrey; and others

    2016-01-01

    We report the discovery, tracking, and detection circumstances for 85 trans-Neptunian objects (TNOs) from the first 42 square degrees of the Outer Solar System Origins Survey. This ongoing r-band solar system survey uses the 0.9 square degree field of view MegaPrime camera on the 3.6 meter Canada-France-Hawaii Telescope. Our orbital elements for these TNOs are precise to a fractional semimajor axis uncertainty of less than 0.1 percent. We achieve this precision in just two oppositions, as compared to the normal three to five oppositions, via a dense observing cadence and innovative astrometric technique. These discoveries are free of ephemeris bias, a first for large trans-Neptunian surveys. We also provide the necessary information to enable models of TNO orbital distributions to be tested against our TNO sample. We confirm the existence of a cold "kernel" of objects within the main cold classical Kuiper Belt and infer the existence of an extension of the "stirred" cold classical Kuiper Belt to at least several au beyond the 2:1 mean motion resonance with Neptune. We find that the population model of Petit et al. remains a plausible representation of the Kuiper Belt. The full survey, to be completed in 2017, will provide an exquisitely characterized sample of important resonant TNO populations, ideal for testing models of giant planet migration during the early history of the solar system.

  10. Network Sampling with Memory: A proposal for more efficient sampling from social networks

    Science.gov (United States)

    Mouw, Ted; Verdery, Ashton M.

    2013-01-01

    Techniques for sampling from networks have grown into an important area of research across several fields. For sociologists, the possibility of sampling from a network is appealing for two reasons: (1) A network sample can yield substantively interesting data about network structures and social interactions, and (2) it is useful in situations where study populations are difficult or impossible to survey with traditional sampling approaches because of the lack of a sampling frame. Despite its appeal, methodological concerns about the precision and accuracy of network-based sampling methods remain. In particular, recent research has shown that sampling from a network using a random walk based approach such as Respondent Driven Sampling (RDS) can result in high design effects (DE)—the ratio of the sampling variance to the sampling variance of simple random sampling (SRS). A high design effect means that more cases must be collected to achieve the same level of precision as SRS. In this paper we propose an alternative strategy, Network Sampling with Memory (NSM), which collects network data from respondents in order to reduce design effects and, correspondingly, the number of interviews needed to achieve a given level of statistical power. NSM combines a “List” mode, where all individuals on the revealed network list are sampled with the same cumulative probability, with a “Search” mode, which gives priority to bridge nodes connecting the current sample to unexplored parts of the network. We test the relative efficiency of NSM compared to RDS and SRS on 162 school and university networks from Add Health and Facebook that range in size from 110 to 16,278 nodes. The results show that the average design effect for NSM on these 162 networks is 1.16, which is very close to the efficiency of a simple random sample (DE=1), and 98.5% lower than the average DE we observed for RDS. PMID:24159246
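
    The design effect (DE) the abstract refers to can be estimated empirically by comparing the variance of an estimator under a clustered design against its variance under SRS on the same population. The following sketch uses a toy clustered population; all numbers (cluster counts, Beta parameters, replicate counts) are illustrative, not from the Add Health or Facebook data.

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy population: 500 clusters of 20 members each. Within-cluster
# correlation comes from cluster-level prevalences drawn from Beta(2, 5).
n_clusters, m = 500, 20
cluster_p = rng.beta(2, 5, n_clusters)
pop = rng.random((n_clusters, m)) < cluster_p[:, None]

def one_estimate(kind, k=10):
    """Estimate prevalence from k whole clusters, or an SRS of k*m people."""
    if kind == "cluster":
        idx = rng.choice(n_clusters, k, replace=False)
        return pop[idx].mean()
    return rng.choice(pop.ravel(), k * m, replace=False).mean()

reps = 2000
var_cluster = np.var([one_estimate("cluster") for _ in range(reps)], ddof=1)
var_srs = np.var([one_estimate("srs") for _ in range(reps)], ddof=1)
de = var_cluster / var_srs   # design effect: DE = Var(design) / Var(SRS)
print(f"design effect ~ {de:.2f}")
```

    A DE near 1, as reported for NSM, means the design loses almost no precision relative to SRS; the clustered toy design above pays a substantially larger penalty.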

  11. Nonprobability Web Surveys to Measure Sexual Behaviors and Attitudes in the General Population: A Comparison With a Probability Sample Interview Survey

    Science.gov (United States)

    Burkill, Sarah; Couper, Mick P; Conrad, Frederick; Clifton, Soazig; Tanton, Clare; Phelps, Andrew; Datta, Jessica; Mercer, Catherine H; Sonnenberg, Pam; Prah, Philip; Mitchell, Kirstin R; Wellings, Kaye; Johnson, Anne M; Copas, Andrew J

    2014-01-01

    Background Nonprobability Web surveys using volunteer panels can provide a relatively cheap and quick alternative to traditional health and epidemiological surveys. However, concerns have been raised about their representativeness. Objective The aim was to compare results from different Web panels with a population-based probability sample survey (n=8969 aged 18-44 years) that used computer-assisted self-interview (CASI) for sensitive behaviors, the third British National Survey of Sexual Attitudes and Lifestyles (Natsal-3). Methods Natsal-3 questions were included on 4 nonprobability Web panel surveys (n=2000 to 2099), 2 using basic quotas based on age and sex, and 2 using modified quotas based on additional variables related to key estimates. Results for sociodemographic characteristics were compared with external benchmarks and for sexual behaviors and opinions with Natsal-3. Odds ratios (ORs) were used to express differences between the benchmark data and each survey for each variable of interest. A summary measure of survey performance was the average absolute OR across variables. Another summary measure was the number of key estimates for which the survey differed significantly (at the 5% level) from the benchmarks. Results For sociodemographic variables, the Web surveys were less representative of the general population than Natsal-3. For example, for men, the average absolute OR for Natsal-3 was 1.14, whereas for the Web surveys the average absolute ORs ranged from 1.86 to 2.30. For all Web surveys, approximately two-thirds of the key estimates of sexual behaviors were different from Natsal-3 and the average absolute ORs ranged from 1.32 to 1.98. Differences were appreciable even for questions asked by CASI in Natsal-3. No single Web survey performed consistently better than any other did. Modified quotas slightly improved results for men, but not for women. Conclusions Consistent with studies from other countries on less sensitive topics, volunteer Web
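
    One of the summary measures described above, the average absolute odds ratio between each survey estimate and the benchmark, can be computed as follows. The proportions are invented for illustration, and the measure here inverts ORs below 1 so deviations in either direction count equally, which is one plausible reading of "absolute OR".

```python
import numpy as np

def odds_ratio(p_survey, p_benchmark):
    """OR comparing a survey proportion with the benchmark proportion."""
    return (p_survey / (1 - p_survey)) / (p_benchmark / (1 - p_benchmark))

def average_absolute_or(survey_props, benchmark_props):
    """Mean of 'absolute' ORs: each OR below 1 is inverted so that
    over- and under-estimation are penalized symmetrically."""
    ors = [odds_ratio(s, b) for s, b in zip(survey_props, benchmark_props)]
    return float(np.mean([max(o, 1 / o) for o in ors]))

# Hypothetical estimates for three behaviours: Web panel vs. benchmark.
web = [0.25, 0.10, 0.40]
benchmark = [0.20, 0.12, 0.35]
print(f"average absolute OR: {average_absolute_or(web, benchmark):.2f}")
```

    A survey matching the benchmarks exactly would score 1.0; larger values indicate poorer representativeness, as with the 1.86 to 2.30 range reported for the Web panels.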

  12. Three-dimensional seismic survey planning based on the newest data acquisition design technique; Saishin no data shutoku design ni motozuku sanjigen jishin tansa keikaku

    Energy Technology Data Exchange (ETDEWEB)

    Minehara, M; Nakagami, K; Tanaka, H [Japan National Oil Corp., Tokyo (Japan). Technology Research Center

    1996-10-01

    The theory of parameter setting for data acquisition is reviewed, mainly with respect to the seismic source and receiver geometry. This paper also introduces an example of survey planning for a three-dimensional land seismic exploration currently in progress. In designing data acquisition, fundamental parameters are first determined from the characteristics of reflection records in the given district, and the survey layout is then determined. In this study, information from modeling based on the existing interpretation of geologic structures is also used and reflected in the survey specifications. A land three-dimensional seismic survey was designed. The ground surface of the surveyed area consists of rice fields and hilly regions. The target was a nose-shaped structure at a depth of about 2,500 m. A survey area of 4 km {times} 5 km was set. Records of the shallow layers could not be obtained where near offsets were not ensured, so quality control of the offset distribution was important for resolving the required shallow structure. In this survey, seismic source points could be secured more readily than initially expected, which resulted in sufficient near-offset coverage. 2 refs., 2 figs.

  13. Don't spin the pen: two alternative methods for second-stage sampling in urban cluster surveys

    Directory of Open Access Journals (Sweden)

    Rose Angela MC

    2007-06-01

    In two-stage cluster surveys, the traditional method used in second-stage sampling (in which the first household in a cluster is selected) is time-consuming and may result in biased estimates of the indicator of interest. First, a random direction from the center of the cluster is selected, usually by spinning a pen. The houses along that direction are then counted out to the boundary of the cluster, and one is selected at random to be the first household surveyed. This process favors households towards the center of the cluster, but it could easily be improved. During a recent meningitis vaccination coverage survey in Maradi, Niger, we compared this method of first-household selection to two alternatives in urban zones: (1) superimposing a grid on the map of the cluster area and randomly selecting an intersection; and (2) drawing the perimeter of the cluster area using a Global Positioning System (GPS) and randomly selecting one point within the perimeter. Although we only compared a limited number of clusters using each method, we found the sampling-grid method to be the fastest and easiest for field survey teams, although it does require a map of the area. Selecting a random GPS point was also found to be a good method, provided that adequate training is given. Spinning the pen and counting households to the boundary was the most complicated and time-consuming. The two methods tested here represent simpler, quicker and potentially more robust alternatives to spinning the pen for cluster surveys in urban areas. However, in rural areas, these alternatives would favor initial household selection from lower-density (or even potentially empty) areas. Bearing in mind these limitations, as well as available resources and feasibility, investigators should choose the most appropriate method for their particular survey context.
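
    The GPS-perimeter method described above amounts to drawing a uniform random point inside a polygon. A minimal self-contained sketch uses rejection sampling against the bounding box plus a ray-casting point-in-polygon test; the waypoint coordinates below are invented for illustration (e.g. local metres rather than latitude/longitude).

```python
import random

def point_in_polygon(x, y, poly):
    """Ray-casting test: a point is inside if a horizontal ray from it
    crosses the polygon boundary an odd number of times."""
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        if (y1 > y) != (y2 > y):                       # edge spans the ray
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

def random_point_in_polygon(poly, rng=random):
    """Rejection-sample uniform points in the bounding box until one
    falls inside the polygon itself."""
    xs = [p[0] for p in poly]
    ys = [p[1] for p in poly]
    while True:
        x = rng.uniform(min(xs), max(xs))
        y = rng.uniform(min(ys), max(ys))
        if point_in_polygon(x, y, poly):
            return x, y

# Hypothetical cluster perimeter recorded as GPS waypoints.
perimeter = [(0, 0), (100, 0), (120, 80), (50, 130), (-10, 70)]
x, y = random_point_in_polygon(perimeter)
print(f"selected start point: ({x:.1f}, {y:.1f})")
```

    The field team would then walk to the nearest household to the selected point, which is where the training requirement mentioned in the abstract comes in.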

  14. The Hyper Suprime-Cam SSP Survey: Overview and survey design

    Science.gov (United States)

    Aihara, Hiroaki; Arimoto, Nobuo; Armstrong, Robert; Arnouts, Stéphane; Bahcall, Neta A.; Bickerton, Steven; Bosch, James; Bundy, Kevin; Capak, Peter L.; Chan, James H. H.; Chiba, Masashi; Coupon, Jean; Egami, Eiichi; Enoki, Motohiro; Finet, Francois; Fujimori, Hiroki; Fujimoto, Seiji; Furusawa, Hisanori; Furusawa, Junko; Goto, Tomotsugu; Goulding, Andy; Greco, Johnny P.; Greene, Jenny E.; Gunn, James E.; Hamana, Takashi; Harikane, Yuichi; Hashimoto, Yasuhiro; Hattori, Takashi; Hayashi, Masao; Hayashi, Yusuke; Hełminiak, Krzysztof G.; Higuchi, Ryo; Hikage, Chiaki; Ho, Paul T. P.; Hsieh, Bau-Ching; Huang, Kuiyun; Huang, Song; Ikeda, Hiroyuki; Imanishi, Masatoshi; Inoue, Akio K.; Iwasawa, Kazushi; Iwata, Ikuru; Jaelani, Anton T.; Jian, Hung-Yu; Kamata, Yukiko; Karoji, Hiroshi; Kashikawa, Nobunari; Katayama, Nobuhiko; Kawanomoto, Satoshi; Kayo, Issha; Koda, Jin; Koike, Michitaro; Kojima, Takashi; Komiyama, Yutaka; Konno, Akira; Koshida, Shintaro; Koyama, Yusei; Kusakabe, Haruka; Leauthaud, Alexie; Lee, Chien-Hsiu; Lin, Lihwai; Lin, Yen-Ting; Lupton, Robert H.; Mandelbaum, Rachel; Matsuoka, Yoshiki; Medezinski, Elinor; Mineo, Sogo; Miyama, Shoken; Miyatake, Hironao; Miyazaki, Satoshi; Momose, Rieko; More, Anupreeta; More, Surhud; Moritani, Yuki; Moriya, Takashi J.; Morokuma, Tomoki; Mukae, Shiro; Murata, Ryoma; Murayama, Hitoshi; Nagao, Tohru; Nakata, Fumiaki; Niida, Mana; Niikura, Hiroko; Nishizawa, Atsushi J.; Obuchi, Yoshiyuki; Oguri, Masamune; Oishi, Yukie; Okabe, Nobuhiro; Okamoto, Sakurako; Okura, Yuki; Ono, Yoshiaki; Onodera, Masato; Onoue, Masafusa; Osato, Ken; Ouchi, Masami; Price, Paul A.; Pyo, Tae-Soo; Sako, Masao; Sawicki, Marcin; Shibuya, Takatoshi; Shimasaku, Kazuhiro; Shimono, Atsushi; Shirasaki, Masato; Silverman, John D.; Simet, Melanie; Speagle, Joshua; Spergel, David N.; Strauss, Michael A.; Sugahara, Yuma; Sugiyama, Naoshi; Suto, Yasushi; Suyu, Sherry H.; Suzuki, Nao; Tait, Philip J.; Takada, Masahiro; Takata, Tadafumi; Tamura, Naoyuki; Tanaka, Manobu M.; Tanaka, Masaomi; Tanaka, Masayuki; Tanaka, Yoko; Terai, Tsuyoshi; Terashima, Yuichi; Toba, Yoshiki; Tominaga, Nozomu; Toshikawa, Jun; Turner, Edwin L.; Uchida, Tomohisa; Uchiyama, Hisakazu; Umetsu, Keiichi; Uraguchi, Fumihiro; Urata, Yuji; Usuda, Tomonori; Utsumi, Yousuke; Wang, Shiang-Yu; Wang, Wei-Hao; Wong, Kenneth C.; Yabe, Kiyoto; Yamada, Yoshihiko; Yamanoi, Hitomi; Yasuda, Naoki; Yeh, Sherry; Yonehara, Atsunori; Yuma, Suraphong

    2018-01-01

    Hyper Suprime-Cam (HSC) is a wide-field imaging camera on the prime focus of the 8.2-m Subaru telescope on the summit of Mauna Kea in Hawaii. A team of scientists from Japan, Taiwan, and Princeton University is using HSC to carry out a 300-night multi-band imaging survey of the high-latitude sky. The survey includes three layers: the Wide layer will cover 1400 deg2 in five broad bands (grizy), with a 5 σ point-source depth of r ≈ 26. The Deep layer covers a total of 26 deg2 in four fields, going roughly a magnitude fainter, while the UltraDeep layer goes almost a magnitude fainter still in two pointings of HSC (a total of 3.5 deg2). Here we describe the instrument, the science goals of the survey, and the survey strategy and data processing. This paper serves as an introduction to a special issue of the Publications of the Astronomical Society of Japan, which includes a large number of technical and scientific papers describing results from the early phases of this survey.

  15. Western states uranium resource survey

    International Nuclear Information System (INIS)

    Tinney, J.F.

    1977-01-01

    ERDA's National Uranium Resource Evaluation (NURE) program was established to provide a comprehensive description of uranium resources in the United States. To carry out this task, ERDA has contracted with various facilities, including universities, private companies, and state agencies, to undertake projects such as airborne radiometric surveys, geological and geochemical studies, and the development of advanced geophysical technology. LLL is one of four ERDA laboratories systematically studying uranium distribution in surface water, groundwater, and lake and stream sediments. We are specifically responsible for surveying seven western states. This past year we have designed and installed facilities for delayed-neutron counting and neutron-activation analysis, completed seven orientation surveys, and analyzed several thousand field samples. Full-scale reconnaissance surveys began last fall

  16. A survey of the French creep-fatigue design rules for LMFBR

    International Nuclear Information System (INIS)

    Tribout, J.; Cordier, G.; Moulin, D.

    1987-01-01

    The paper provides a survey of the creep-fatigue design rules for the LMFBR in France. These rules are the ones currently implemented in French component manufacturing. The background of each item is discussed and the trends for improvements currently investigated are described. The creep-fatigue rules apply to elastic analysis only. (orig.)

  17. THE CHANDRA COSMOS-LEGACY SURVEY: THE z > 3 SAMPLE

    Energy Technology Data Exchange (ETDEWEB)

    Marchesi, S.; Civano, F.; Urry, C. M. [Yale Center for Astronomy and Astrophysics, 260 Whitney Avenue, New Haven, CT 06520 (United States); Salvato, M. [Max-Planck-Institut für extraterrestrische Physik, Giessenbachstrasse 1, D-85748 Garching bei München (Germany); Shankar, F. [Department of Physics and Astronomy, University of Southampton, Highfield, SO17 1BJ (United Kingdom); Comastri, A.; Lanzuisi, G.; Vignali, C.; Zamorani, G.; Brusa, M.; Gilli, R. [INAF–Osservatorio Astronomico di Bologna, via Ranzani 1, 40127 Bologna (Italy); Elvis, M. [Harvard Smithsonian Center for Astrophysics, 60 Garden Street, Cambridge, MA 02138 (United States); Trakhtenbrot, B.; Schawinski, K. [Institute for Astronomy, Department of Physics, ETH Zurich, Wolfgang-Pauli-Strasse 27, CH-8093 Zurich (Switzerland); Allevato, V. [Department of Physics, University of Helsinki, Gustaf Hällströmin katu 2a, FI-00014 Helsinki (Finland); Fiore, F. [INAF–Osservatorio Astronomico di Roma, via di Frascati 33, I-00040 Monte Porzio Catone (Italy); Griffiths, R. [Physics and Astronomy Department, Natural Sciences Division, University of Hawaii at Hilo, 200 W. Kawili Street, Hilo, HI 96720 (United States); Hasinger, G. [Institute for Astronomy, 2680 Woodlawn Drive, University of Hawaii, Honolulu, HI 96822 (United States); Miyaji, T. [Instituto de Astronomía sede Ensenada, Universidad Nacional Autónoma de México, Km. 103, Carret. Tijuana-Ensenada, Ensenada, BC (Mexico); Treister, E. [Universidad de Concepción, Departamento de Astronomía, Casilla 160-C, Concepción (Chile)

    2016-08-20

    We present the largest high-redshift (3 < z < 6.85) sample of X-ray-selected active galactic nuclei (AGNs) on a contiguous field, using sources detected in the Chandra COSMOS-Legacy survey. The sample contains 174 sources, 87 with spectroscopic redshift and the other 87 with photometric redshift (z_phot). In this work, we treat z_phot as a probability-weighted sum of contributions, adding to our sample the contribution of sources with z_phot < 3 but with z_phot probability distribution >0 at z > 3. We compute the number counts in the observed 0.5–2 keV band, finding a decline in the number of sources at z > 3 and constraining phenomenological models of the X-ray background. We compute the AGN space density at z > 3 in two different luminosity bins. At higher luminosities (log L(2–10 keV) > 44.1 erg s⁻¹), the space density declines exponentially, dropping by a factor of ∼20 from z ∼ 3 to z ∼ 6. The observed decline is ∼80% steeper at lower luminosities (43.55 erg s⁻¹ < log L(2–10 keV) < 44.1 erg s⁻¹) from z ∼ 3 to z ∼ 4.5. We study the space density evolution dividing our sample into optically classified Type 1 and Type 2 AGNs. At log L(2–10 keV) > 44.1 erg s⁻¹, unobscured and obscured objects may have different evolution with redshift, with the obscured component being three times higher at z ∼ 5. Finally, we compare our space density with predictions of quasar activation merger models, whose calibration is based on optically luminous AGNs. These models significantly overpredict the number of expected AGNs at log L(2–10 keV) > 44.1 erg s⁻¹ with respect to our data.

  18. 105-F and DR Phase 1 Sampling and Analysis Plan

    International Nuclear Information System (INIS)

    Curry, L.R.

    1998-06-01

    This SAP presents the rationale and strategy for characterization of specific rooms within the 105-F and 105-DR reactor buildings. Figures 1-1 and 1-2 identify the rooms that are the subject of this SAP. These rooms are to be decontaminated and demolished as an initial step (Phase 1) in the Interim Safe Storage process for these reactors. Section 1.0 presents the background and site history for the reactor buildings and summarizes the data quality objective process, which provides the logical basis for this SAP. Preliminary surveys indicate that little radiochemical contamination is present. Section 2.0 presents the quality assurance project plan, which includes a project management structure, sampling methods and quality control, and oversight of the sampling process. Section 2.2.1 summarizes the sampling methods, reflecting the radiological and chemical sampling designs presented in Tables 1-17 and 1-18. Section 3.0 presents the Field Sampling Plan for Phase 1. The sampling design is broken into two stages. Stage 1 will verify the list of radioactive constituents of concern and generate the isotopic distribution. The objectives of Stage 2 are to estimate the radionuclide inventories of room debris, quantify chemical contamination, and survey room contents for potential salvage or recycle. Table 3-1 presents the sampling activities to be performed in Stage 1. Tables 1-17 and 1-18 identify the samples to be collected in Stage 2. Stage 2 will consist primarily of survey data collection, with fixed laboratory samples to be collected in areas showing visible stains. Quality control sampling requirements are presented in Table 3-2.

  19. National Household Education Surveys Program of 2012: Data File User's Manual. Parent and Family Involvement in Education Survey. Early Childhood Program Participation Survey. NCES 2015-030

    Science.gov (United States)

    McPhee, C.; Bielick, S.; Masterton, M.; Flores, L.; Parmer, R.; Amchin, S.; Stern, S.; McGowan, H.

    2015-01-01

    The 2012 National Household Education Surveys Program (NHES:2012) Data File User's Manual provides documentation and guidance for users of the NHES:2012 data files. The manual provides information about the purpose of the study, the sample design, data collection procedures, data processing procedures, response rates, imputation, weighting and…

  20. Geochemical reanalysis of historical U.S. Geological Survey sediment samples from the northeastern Alaska Range, Healy, Mount Hayes, Nabesna, and Tanacross quadrangles, Alaska

    Science.gov (United States)

    Werdon, Melanie B.; Granitto, Matthew; Azain, Jaime S.

    2015-01-01

    The State of Alaska’s Strategic and Critical Minerals (SCM) Assessment project, a State-funded Capital Improvement Project (CIP), is designed to evaluate Alaska’s statewide potential for SCM resources. The SCM Assessment is being implemented by the Alaska Division of Geological & Geophysical Surveys (DGGS), and involves obtaining new airborne-geophysical, geological, and geochemical data. As part of the SCM Assessment, thousands of historical geochemical samples from DGGS, U.S. Geological Survey (USGS), and U.S. Bureau of Mines archives are being reanalyzed by DGGS using modern, quantitative, geochemical-analytical methods. The objective is to update the statewide geochemical database to more clearly identify areas in Alaska with SCM potential. The USGS is also undertaking SCM-related geologic studies in Alaska through the federally funded Alaska Critical Minerals cooperative project. DGGS and USGS share the goal of evaluating Alaska’s strategic and critical minerals potential and together created a Letter of Agreement (signed December 2012) and a supplementary Technical Assistance Agreement (#14CMTAA143458) to facilitate the two agencies’ cooperative work. Under these agreements, DGGS contracted the USGS in Denver to reanalyze historical USGS sediment samples from Alaska. For this report, DGGS funded reanalysis of 670 historical USGS sediment samples from the statewide Alaska Geochemical Database Version 2.0 (AGDB2; Granitto and others, 2013). Samples were chosen from the northeastern Alaska Range, in the Healy, Mount Hayes, Nabesna, and Tanacross quadrangles, Alaska (fig. 1). The USGS was responsible for sample retrieval from the National Geochemical Sample Archive (NGSA) in Denver, Colorado through the final quality assurance/quality control (QA/QC) of the geochemical analyses obtained through the USGS contract lab. The new geochemical data are published in this report as a coauthored DGGS report, and will be incorporated into the statewide geochemical

  1. Methodology of the National School-based Health Survey in Malaysia, 2012.

    Science.gov (United States)

    Yusoff, Fadhli; Saari, Riyanti; Naidu, Balkish M; Ahmad, Noor Ani; Omar, Azahadi; Aris, Tahir

    2014-09-01

    The National School-Based Health Survey 2012 was a nationwide school health survey of students in Standard 4 to Form 5 (10-17 years of age), who were schooling in government schools in Malaysia during the period of data collection. The survey comprised 3 subsurveys: the Global School Health Survey (GSHS), the Mental Health Survey, and the National School-Based Nutrition Survey. The aim of the survey was to provide data on the health status of adolescents in Malaysia toward strengthening the adolescent health program in the country. The design of the survey was created to fulfill the requirements of the 3 subsurveys. A 2-stage stratified sampling method was adopted in the sampling. The methods for data collection were via questionnaire and physical examination. The National School-Based Health Survey 2012 adopted an appropriate methodology for a school-based survey to ensure valid and reliable findings. © 2014 APJPH.

  2. EuropeaN Energy balance Research to prevent excessive weight Gain among Youth (ENERGY project: Design and methodology of the ENERGY cross-sectional survey

    Directory of Open Access Journals (Sweden)

    Moreno Luis

    2011-01-01

    Full Text Available Abstract Background Obesity treatment is largely ineffective in the long term, and more emphasis on the prevention of excessive weight gain in childhood and adolescence is warranted. To inform energy balance related behaviour (EBRB) change interventions, insight into the potential personal, family and school environmental correlates of these behaviours is needed. Studies on such multilevel correlates of EBRB among schoolchildren in Europe are lacking. The ENERGY survey aims to (1) provide up-to-date prevalence rates of measured overweight, obesity, self-reported engagement in EBRBs, objective accelerometer-based assessment of physical activity and sedentary behaviour, and blood-sample biomarkers of metabolic function in countries in different regions of Europe, and (2) identify personal, family and school environmental correlates of these EBRBs. This paper describes the design, methodology and protocol of the survey. Method/Design A school-based cross-sectional survey was carried out in 2010 in seven different European countries: Belgium, Greece, Hungary, the Netherlands, Norway, Slovenia, and Spain. The survey included measurements of anthropometrics; child, parent and school-staff questionnaires; and school observations to measure and assess outcomes (i.e., height, weight, and waist circumference), EBRBs, and potential personal, family and school environmental correlates of these behaviours, including social-cultural, physical, political, and economic environmental factors. In addition, a selection of countries conducted accelerometer measurements to objectively assess physical activity and sedentary behaviour, and collected blood samples to assess several biomarkers of metabolic function. Discussion The ENERGY survey is a comprehensive cross-sectional study measuring anthropometrics and biomarkers as well as assessing a range of EBRBs and their potential correlates at the personal, family and school level, among 10-12 year old children in seven

  3. Improving the collection of knowledge, attitude and practice data with community surveys: a comparison of two second-stage sampling methods.

    Science.gov (United States)

    Davis, Rosemary H; Valadez, Joseph J

    2014-12-01

    Second-stage sampling techniques, including spatial segmentation, are widely used in community health surveys when reliable household sampling frames are not available. In India, an unresearched technique for household selection is used in eight states, which samples the house with the last marriage or birth as the starting point. Users question whether this last-birth or last-marriage (LBLM) approach introduces bias affecting survey results. We conducted two simultaneous population-based surveys. One used segmentation sampling; the other used LBLM. LBLM sampling required modification before assessment was possible and a more systematic approach was tested using last birth only. We compared coverage proportions produced by the two independent samples for six malaria indicators and demographic variables (education, wealth and caste). We then measured the level of agreement between the caste of the selected participant and the caste of the health worker making the selection. No significant difference between methods was found for the point estimates of six malaria indicators, education, caste or wealth of the survey participants (range of P: 0.06 to >0.99). A poor level of agreement occurred between the caste of the health worker used in household selection and the caste of the final participant, (Κ = 0.185), revealing little association between the two, and thereby indicating that caste was not a source of bias. Although LBLM was not testable, a systematic last-birth approach was tested. If documented concerns of last-birth sampling are addressed, this new method could offer an acceptable alternative to segmentation in India. However, inter-state caste variation could affect this result. Therefore, additional assessment of last birth is required before wider implementation is recommended. Published by Oxford University Press in association with The London School of Hygiene and Tropical Medicine © The Author 2013; all rights reserved.

  4. Evaluation of Nine Consensus Indices in Delphi Foresight Research and Their Dependency on Delphi Survey Characteristics: A Simulation Study and Debate on Delphi Design and Interpretation.

    Science.gov (United States)

    Birko, Stanislav; Dove, Edward S; Özdemir, Vural

    2015-01-01

    The extent of consensus (or the lack thereof) among experts in emerging fields of innovation can serve as antecedents of scientific, societal, investor and stakeholder synergy or conflict. Naturally, how we measure consensus is of great importance to science and technology strategic foresight. The Delphi methodology is a widely used anonymous survey technique to evaluate consensus among a panel of experts. Surprisingly, there is little guidance on how indices of consensus can be influenced by parameters of the Delphi survey itself. We simulated a classic three-round Delphi survey building on the concept of clustered consensus/dissensus. We evaluated three study characteristics that are pertinent for design of Delphi foresight research: (1) the number of survey questions, (2) the sample size, and (3) the extent to which experts conform to group opinion (the Group Conformity Index) in a Delphi study. Their impacts on the following nine Delphi consensus indices were then examined in 1000 simulations: Clustered Mode, Clustered Pairwise Agreement, Conger's Kappa, De Moivre index, Extremities Version of the Clustered Pairwise Agreement, Fleiss' Kappa, Mode, the Interquartile Range and Pairwise Agreement. The dependency of a consensus index on the Delphi survey characteristics was expressed from 0.000 (no dependency) to 1.000 (full dependency). The number of questions (range: 6 to 40) in a survey did not have a notable impact whereby the dependency values remained below 0.030. The variation in sample size (range: 6 to 50) displayed the top three impacts for the Interquartile Range, the Clustered Mode and the Mode (dependency = 0.396, 0.130, 0.116, respectively). The Group Conformity Index, a construct akin to measuring stubbornness/flexibility of experts' opinions, greatly impacted all nine Delphi consensus indices (dependency = 0.200 to 0.504), except the Extremity CPWA and the Interquartile Range that were impacted only beyond the first decimal point (dependency = 0
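    Two of the consensus indices named above, the Interquartile Range and Pairwise Agreement, are simple to compute. The sketch below is illustrative only (the simulated panels and function names are assumptions, not the authors' simulation code); a smaller IQR and a higher pairwise-agreement share both indicate stronger consensus.

    ```python
    import random
    import statistics

    def interquartile_range(ratings):
        """IQR of a panel's ratings; smaller IQR = stronger consensus."""
        qs = statistics.quantiles(ratings, n=4, method="inclusive")
        return qs[2] - qs[0]

    def pairwise_agreement(ratings):
        """Share of expert pairs giving identical ratings (0..1)."""
        n = len(ratings)
        agree = sum(1 for i in range(n) for j in range(i + 1, n)
                    if ratings[i] == ratings[j])
        return agree / (n * (n - 1) / 2)

    random.seed(1)
    consensus = [4] * 9 + [5]                          # near-unanimous panel
    dissensus = [random.randint(1, 5) for _ in range(10)]  # scattered panel
    print(interquartile_range(consensus), pairwise_agreement(consensus))  # 0, 0.8
    print(interquartile_range(dissensus), pairwise_agreement(dissensus))
    ```

    A fuller simulation would re-draw panels many times (the paper uses 1000 runs) and track how each index shifts as sample size or group conformity changes.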

  5. The French national survey on food consumption of children under 3 years of age - Nutri-Bébé 2013: design, methodology, population sampling and feeding practices.

    Science.gov (United States)

    Chouraqui, Jean-Pierre; Tavoularis, Gabriel; Emery, Yves; Francou, Aurée; Hébel, Pascale; Bocquet, Magali; Hankard, Régis; Turck, Dominique

    2018-02-01

    To update the data on food consumption and practices in children under 3 years of age in metropolitan France. The Nutri-Bébé 2013 cross-sectional study selected a random sample, according to the quota sampling method. After giving their informed consent, parents had to record the food consumption during three non-consecutive days framed by two face-to-face interviews, using for quantitative information different portion size measurement aids. One thousand one hundred and eighty-four children were enrolled. Mothers' mean age was 30·8 (sd 5·4) years; 38 % were primiparous; 89 % lived with a partner; 60 % had an occupation. Of the infants younger than 4 months, 31 % were breast-fed. One thousand and thirty-five children consumed infant formula followed by growing-up milk in 63 % of them; solid foods were introduced at a mean age of 5·4 (sd 2·13) months. From 8 months onwards, 25 % of children consumed the same foods as their parents on a more or less regular basis; 29 % ate in front of a screen, with a daily average screen time of 43·0 (sd 40·4) min. This robust survey highlights the low prevalence and duration of breast-feeding in France and shows a modest improvement since the previous survey of 2005 in the observance of recommendations concerning other feeding practices. The frequent consumption of adult foods and the screen time are of concern.

  6. Rapid Active Sampling Surveys as a Tool to Evaluate Factors Associated with Acute Gastroenteritis and Norovirus Infection among Children in Rural Guatemala.

    Science.gov (United States)

    Olson, Daniel; Lamb, Molly M; Lopez, Maria R; Paniagua-Avila, Maria A; Zacarias, Alma; Samayoa-Reyes, Gabriela; Cordon-Rosales, Celia; Asturias, Edwin J

    2017-09-01

    We examined burden and factors associated with norovirus (NoV) acute gastroenteritis (AGE) among children in rural Guatemala. Children age 6 weeks to 17 years were enrolled into three AGE surveillance groups, using two-stage cluster sampling: a prospective participatory syndromic surveillance (PSS) cohort and two cross-sectional rapid active sampling (RAS) surveys, conducted from April 2015 to February 2016. Epidemiologic and NoV testing data were used to identify factors associated with NoV infection, AGE, and NoV+ AGE. The three cross-sectional surveys (PSS enrollment visit, RAS Survey 1, and RAS Survey 2) enrolled 1,239 children, who reported 134 (11%) AGE cases, with 20% of AGE and 11% of non-AGE samples positive for NoV. Adjusted analyses identified several modifiable factors associated with AGE and NoV infection. The cross-sectional RAS surveys were practical and cost-effective in identifying population-level risk factors for AGE and NoV, supporting their use as a tool to direct limited public health resources toward high-risk populations.

  7. Nationwide survey of policies and practices related to capillary blood sampling in medical laboratories in Croatia

    OpenAIRE

    Lenicek Krleza, Jasna

    2014-01-01

    Introduction: Capillary sampling is increasingly used to obtain blood for laboratory tests in volumes as small as necessary and as non-invasively as possible. Whether capillary blood sampling is also frequent in Croatia, and whether it is performed according to international laboratory standards is unclear. Materials and methods: All medical laboratories that participate in the Croatian National External Quality Assessment Program (N = 204) were surveyed on-line to collect information about t...

  8. Spread of Traditional Medicines in India: Results of National Sample Survey Organization's Perception Survey on Use of AYUSH.

    Science.gov (United States)

    Srinivasan, R; Sugumar, V Raji

    2015-10-04

    For the first time, we have a comprehensive database on usage of AYUSH (acronym for Ayurveda, naturopathy and Yoga, Unani, Siddha, and Homeopathy) in India at the household level. This article aims at exploring the spread of the traditional medical systems in India and the perceptions of people on the access and effectiveness of these medical systems using this database. The article uses the unit level data purchased from the National Sample Survey Organization, New Delhi. Household is the basic unit of survey and the data are the collective opinion of the household. This survey shows that less than 30% of Indian households use the traditional medical systems. There is also a regional pattern in the usage of particular type of traditional medicine, reflecting the regional aspects of the development of such medical systems. The strong faith in AYUSH is the main reason for its usage; lack of need for AYUSH and lack of awareness about AYUSH are the main reasons for not using it. With regard to source of medicines in the traditional medical systems, home is the main source in the Indian medical system and private sector is the main source in Homeopathy. This shows that there is need for creating awareness and improving access to traditional medical systems in India. By and large, the users of AYUSH are also convinced about the effectiveness of these traditional medicines. © The Author(s) 2015.

  9. International blue whiting survey. A sample every second

    NARCIS (Netherlands)

    Faessler, S.M.M.

    2011-01-01

    IJmuiden - In 2011 the annual blue whiting survey will for the first time be coordinated by Schascha Fässler, Swiss-born and working since 2009 as a 'fisheries acoustics' scientist at IMARES in IJmuiden. In the article below, Fässler goes into more detail on acoustics in general and the survey

  10. A study on the representative sampling survey for the inspection of the clearance level for the radioisotope waste

    International Nuclear Information System (INIS)

    Hong Joo Ahn; Se Chul Sohn; Kwang Yong Jee; Ju Youl Kim; In Koo Lee

    2007-01-01

    Utilization facilities for radioisotopes (RI) are increasing annually in South Korea; the total number was 2,723 as of December 31, 2005. Inspection against a clearance level is very important for ensuring public trust when radioactive materials are released to the environment. Korean regulations for such clearance are described in Notice No. 2001-30 of the Ministry of Science and Technology (MOST) and Notice No. 2002-67 of the Ministry of Commerce, Industry and Energy (MOCIE). Most unsealed sources in RI waste drums at a storage facility are low-level beta-emitters with short half-lives, so it is impossible to measure their inventories by nondestructive analysis. Furthermore, RI wastes generated by hospitals, educational and research institutes, and industry form heterogeneous, multiple, irregular, and small-quantity waste streams. This study addresses a representative (master) sampling survey and analysis plan for RI wastes, because a complete enumeration of waste drums is impossible and undesirable in terms of cost and efficiency. Existing approaches to representative sampling include judgmental, simple random, stratified random, systematic grid, systematic random, composite, and adaptive sampling. A representative sampling plan may combine two or more of these approaches depending on the type and distribution of a waste stream. Stratified random sampling (constrained randomization) proves adequate for a sampling design for RI waste with respect to half-life, surface dose, time of transfer to the storage facility, and waste type. The developed sampling protocol includes estimating the number of drums within a waste stream, estimating the number of samples, and confirming the required number of samples. The statistical process control for the quality assurance plan includes control charts and a 95% upper control limit (UCL) to determine whether the clearance level is met. (authors)

  11. Total Survey Error for Longitudinal Surveys

    NARCIS (Netherlands)

    Lynn, Peter; Lugtig, P.J.

    2016-01-01

    This article describes the application of the total survey error paradigm to longitudinal surveys. Several aspects of survey error, and of the interactions between different types of error, are distinct in the longitudinal survey context. Furthermore, error trade-off decisions in survey design and

  12. The Apollo lunar samples collection analysis and results

    CERN Document Server

    Young, Anthony

    2017-01-01

    This book focuses on the specific mission planning for lunar sample collection, the equipment used, and the analysis and findings concerning the samples at the Lunar Receiving Laboratory in Texas. Anthony Young documents the collection of Apollo samples for the first time for readers of all backgrounds, and includes interviews with many of those involved in planning and analyzing the samples. NASA contracted with the U.S. Geologic Survey to perform classroom and field training of the Apollo astronauts. NASA’s Geology Group within the Manned Spacecraft Center in Houston, Texas, helped to establish the goals of sample collection, as well as the design of sample collection tools, bags, and storage containers. In this book, detailed descriptions are given on the design of the lunar sampling tools, the Modular Experiment Transporter used on Apollo 14, and the specific areas of the Lunar Rover vehicle used for the Apollo 15, 16, and 17 missions, which carried the sampling tools, bags, and other related equipment ...

  13. Tools for Real-Time Control Systems Co-Design - A Survey

    OpenAIRE

    Henriksson, Dan; El-Khoury, Jad; Årzén, Karl-Erik; Törngren, Martin; Redell, Ola

    2005-01-01

    This report presents a survey of current simulation tools in the area of integrated control and real-time systems design. Each tool is presented with a quick overview followed by a more detailed section describing comparative aspects of the tool. These aspects describe the context and purpose of the tool (scenarios, development stages, activities, and qualities/constraints being addressed) and the actual tool technology (tool architecture, inputs, outputs, modeling content, extensibility and ...

  14. A statistical evaluation of the design and precision of the shrimp trawl survey off West Greenland

    DEFF Research Database (Denmark)

    Folmer, Ole; Pennington, M.

    2000-01-01

    Statistical techniques were used to estimate two indices of shrimp abundance and their precision, and to determine the effective sample sizes for estimates of length-frequency distributions. It is concluded that the surveys produce a fairly precise abundance index and that, given the relatively small effective sample size, reducing tow duration to 15 min would increase overall survey precision. An unexpected outcome of the analysis is that the density of shrimp appears to have been fairly stable over the last 11 years. (C) 2000 Elsevier Science B.V. All rights reserved.

  15. Design of a gravity corer for near shore sediment sampling

    Digital Repository Service at National Institute of Oceanography (India)

    Bhat, S.T.; Sonawane, A.V.; Nayak, B.U.

    For the purpose of geotechnical investigation a gravity corer has been designed and fabricated to obtain undisturbed sediment core samples from near shore waters. The corer was successfully operated at 75 stations up to water depth 30 m. Simplicity...

  16. A binary logistic regression model with complex sampling design of ...

    African Journals Online (AJOL)

    2017-09-03

    Sep 3, 2017 ... Bi-variable and multi-variable binary logistic regression models with complex sampling design were fitted. ... Data were entered into STATA-12 and analyzed using SPSS-21. ...

  17. Optimum sample size to estimate mean parasite abundance in fish parasite surveys

    Directory of Open Access Journals (Sweden)

    Shvydka S.

    2018-03-01

    Full Text Available To reach ethically and scientifically valid mean abundance values in parasitological and epidemiological studies, this paper considers analytic and simulation approaches for sample size determination. The sample size estimation was carried out by applying a mathematical formula with a predetermined precision level and the parameter of the negative binomial distribution estimated from the empirical data. A simulation approach to optimum sample size determination, aimed at estimating the true value of the mean abundance and its confidence interval (CI), was based on the Bag of Little Bootstraps (BLB). The abundance of two species of monogenean parasites, Ligophorus cephali and L. mediterraneus, from Mugil cephalus across the Azov-Black Seas localities was subjected to the analysis. The dispersion pattern of both helminth species could be characterized as a highly aggregated distribution, with the variance being substantially larger than the mean abundance. The holistic approach applied here offers a wide range of appropriate methods for searching for the optimum sample size and for understanding the expected precision level of the mean. Given the superior performance of the BLB relative to formulae, with its few assumptions, the bootstrap procedure is the preferred method. Two important assessments were performed in the present study: (i) based on CI width, a reasonable precision level for the mean abundance in parasitological surveys of Ligophorus spp. could be chosen between 0.8 and 0.5, with 1.6x and 1x the mean CI width, and (ii) a sample size of 80 or more host individuals allows accurate and precise estimation of mean abundance. Meanwhile, for host sample sizes in the range between 25 and 40 individuals, the median estimates showed minimal bias but the sampling distribution was skewed to low values; a sample size of 10 host individuals yielded unreliable estimates.
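    The Bag of Little Bootstraps idea referenced above can be sketched for a mean-abundance CI. This is an illustrative stdlib-only sketch under assumed tuning values (subset exponent 0.6, 20 subsets, 50 resamples); the simulated counts and all parameter choices are assumptions, not the paper's data or settings.

    ```python
    import random
    import statistics

    def blb_mean_ci(counts, subset_exp=0.6, s=20, r=50, alpha=0.05, seed=0):
        """Bag of Little Bootstraps CI for the mean: draw s subsets of size
        b = n**subset_exp; within each, take r multinomial resamples of full
        size n; average the per-subset percentile intervals."""
        rng = random.Random(seed)
        n = len(counts)
        b = max(2, int(n ** subset_exp))
        lowers, uppers = [], []
        for _ in range(s):
            subset = rng.sample(counts, b)
            means = []
            for _ in range(r):
                # Multinomial(n, 1/b) weights over the b subset points
                draws = [rng.randrange(b) for _ in range(n)]
                weights = [draws.count(j) for j in range(b)]
                means.append(sum(w * x for w, x in zip(weights, subset)) / n)
            means.sort()
            lowers.append(means[int((alpha / 2) * r)])
            uppers.append(means[int((1 - alpha / 2) * r) - 1])
        return statistics.mean(lowers), statistics.mean(uppers)

    # Hypothetical aggregated counts: many uninfected hosts, a few heavy ones.
    counts = [0] * 50 + [1] * 20 + [2] * 10 + [8, 12, 15, 20, 30] * 4
    lo, hi = blb_mean_ci(counts)
    print(round(lo, 2), round(hi, 2))
    ```

    The appeal of BLB here is that each inner resample touches only b distinct observations, so the procedure scales to large host samples while still mimicking full-size bootstrap variability.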

  18. The Laboratory Course Assessment Survey: A Tool to Measure Three Dimensions of Research-Course Design

    Science.gov (United States)

    Corwin, Lisa A.; Runyon, Christopher; Robinson, Aspen; Dolan, Erin L.

    2015-01-01

    Course-based undergraduate research experiences (CUREs) are increasingly being offered as scalable ways to involve undergraduates in research. Yet few if any design features that make CUREs effective have been identified. We developed a 17-item survey instrument, the Laboratory Course Assessment Survey (LCAS), that measures students’ perceptions of three design features of biology lab courses: 1) collaboration, 2) discovery and relevance, and 3) iteration. We assessed the psychometric properties of the LCAS using established methods for instrument design and validation. We also assessed the ability of the LCAS to differentiate between CUREs and traditional laboratory courses, and found that the discovery and relevance and iteration scales differentiated between these groups. Our results indicate that the LCAS is suited for characterizing and comparing undergraduate biology lab courses and should be useful for determining the relative importance of the three design features for achieving student outcomes. PMID:26466990

  19. Multiple sensitive estimation and optimal sample size allocation in the item sum technique.

    Science.gov (United States)

    Perri, Pier Francesco; Rueda García, María Del Mar; Cobo Rodríguez, Beatriz

    2018-01-01

    For surveys of sensitive issues in life sciences, statistical procedures can be used to reduce nonresponse and social desirability response bias. Both of these phenomena provoke nonsampling errors that are difficult to deal with and can seriously flaw the validity of the analyses. The item sum technique (IST) is a very recent indirect questioning method derived from the item count technique that seeks to procure more reliable responses on quantitative items than direct questioning while preserving respondents' anonymity. This article addresses two important questions concerning the IST: (i) its implementation when two or more sensitive variables are investigated and efficient estimates of their unknown population means are required; (ii) the determination of the optimal sample size to achieve minimum variance estimates. These aspects are of great relevance for survey practitioners engaged in sensitive research and, to the best of our knowledge, were not studied so far. In this article, theoretical results for multiple estimation and optimal allocation are obtained under a generic sampling design and then particularized to simple random sampling and stratified sampling designs. Theoretical considerations are integrated with a number of simulation studies based on data from two real surveys and conducted to ascertain the efficiency gain derived from optimal allocation in different situations. One of the surveys concerns cannabis consumption among university students. Our findings highlight some methodological advances that can be obtained in life sciences IST surveys when optimal allocation is achieved. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
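    For the stratified-design case mentioned above, the classical benchmark for minimum-variance allocation is Neyman allocation, n_h ∝ N_h·S_h. The sketch below shows that benchmark only; it is not the paper's IST-specific optimum, and the strata figures are invented for illustration.

    ```python
    def neyman_allocation(strata, n):
        """Classical Neyman allocation: sample n_h proportional to N_h * S_h,
        which minimizes the variance of the stratified mean for fixed total n.
        strata: list of (N_h, S_h) pairs (stratum size, stratum std dev)."""
        total = sum(N * S for N, S in strata)
        return [round(n * N * S / total) for N, S in strata]

    # Hypothetical strata: (population size, std dev of the sensitive variable)
    strata = [(500, 2.0), (300, 5.0), (200, 10.0)]
    print(neyman_allocation(strata, 150))  # → [33, 50, 67]
    ```

    Note how the smallest but most variable stratum receives the largest sample; proportional-to-size allocation would instead give it the fewest units.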

  20. Small population size of Pribilof Rock Sandpipers confirmed through distance-sampling surveys in Alaska

    Science.gov (United States)

    Ruthrauff, Daniel R.; Tibbitts, T. Lee; Gill, Robert E.; Dementyev, Maksim N.; Handel, Colleen M.

    2012-01-01

    The Rock Sandpiper (Calidris ptilocnemis) is endemic to the Bering Sea region and unique among shorebirds in the North Pacific for wintering at high latitudes. The nominate subspecies, the Pribilof Rock Sandpiper (C. p. ptilocnemis), breeds on four isolated islands in the Bering Sea and appears to spend the winter primarily in Cook Inlet, Alaska. We used a stratified systematic sampling design and line-transect method to survey the entire breeding range of this population during springs 2001-2003. Densities were up to four times higher on the uninhabited and more northerly St. Matthew and Hall islands than on St. Paul and St. George islands, which both have small human settlements and introduced reindeer herds. Differences in density, however, appeared to be more related to differences in vegetation than to anthropogenic factors, raising some concern for prospective effects of climate change. We estimated the total population at 19 832 birds (95% CI 17 853–21 930), ranking it among the smallest of North American shorebird populations. To determine the vulnerability of C. p. ptilocnemis to anthropogenic and stochastic environmental threats, future studies should focus on determining the amount of gene flow among island subpopulations, the full extent of the subspecies' winter range, and the current trajectory of this small population.
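    The line-transect method used above can be illustrated with the simplest distance-sampling estimator: fit a half-normal detection function to perpendicular distances and divide detections by the effectively surveyed area. This is a textbook sketch, not the authors' analysis; the distances and line length are invented.

    ```python
    import math

    def halfnormal_density(perp_dists, total_line_km):
        """Line-transect density with a half-normal detection function
        g(x) = exp(-x^2 / (2 sigma^2)). The MLE of sigma^2 from perpendicular
        distances is mean(x^2); the effective strip half-width is
        ESW = sigma * sqrt(pi/2); density = n / (2 * L * ESW)."""
        n = len(perp_dists)
        sigma = math.sqrt(sum(x * x for x in perp_dists) / n)
        esw = sigma * math.sqrt(math.pi / 2)   # same units as the distances
        return n / (2 * total_line_km * esw)   # individuals per km^2

    # Hypothetical detections (perpendicular distances, km) on 40 km of line.
    dists = [0.01, 0.02, 0.02, 0.03, 0.05, 0.05, 0.08, 0.10, 0.12, 0.15]
    print(round(halfnormal_density(dists, 40.0), 1))
    ```

    A stratified design like the one in the abstract would apply this within each stratum and weight the stratum densities by area to get the total population estimate.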

  1. school-based survey of adolescents' opinion on premarital sex in ...

    African Journals Online (AJOL)

    PROF. BARTH EKWEME

    Method: A cross sectional descriptive survey design was used. The sample size was 313 senior secondary school students from four public secondary schools in Yakurr Local Government Area of Cross River State. Simple random sampling technique was used to select 313 students from 4 schools in Yakurr Local ...

  2. [Design and implementation of data checking system for Chinese materia medica resources survey].

    Science.gov (United States)

    Wang, Hui; Zhang, Xiao-Bo; Ge, Xiao-Guang; Jin, Yan; Jing, Zhi-Xian; Qi, Yuan-Hua; Wang, Ling; Zhao, Yu-Ping; Wang, Wei; Guo, Lan-Ping; Huang, Lu-Qi

    2017-11-01

    The Chinese materia medica resources (CMMR) national survey information management system has collected a large amount of data. To assist with data rechecking, reduce the in-house workload, and improve the rechecking of survey data at the provincial and county levels, the National Resource Center for Chinese Materia Medica has designed a data checking system for the Chinese materia medica resources survey based on J2EE technology, the Java language, and an Oracle database, in accordance with the SOA framework. It includes single data checks, check scoring, content management, and both manual and automatic checking of census and survey data covering nine aspects (census implementation plans, key research information, general survey information, medicinal-material cultivation information, germplasm resource information, medicinal material information, market research information, traditional knowledge information, and specimen information), comprising 20 classes and 175 indicators, in terms of both quantity and quality. The established system assists in verifying data consistency and accuracy, and prompts county survey teams to complete data entry and arrangement work in a timely manner, so as to improve the integrity, consistency and accuracy of the survey data and ensure that the data are valid and available, laying a foundation for providing accurate data support for the national CMMR survey results summary, results display, and sharing. Copyright© by the Chinese Pharmaceutical Association.

  3. Participant dropout as a function of survey length in internet-mediated university studies: implications for study design and voluntary participation in psychological research.

    Science.gov (United States)

    Hoerger, Michael

    2010-12-01

    Internet-mediated research has offered substantial advantages over traditional laboratory-based research in terms of efficiently and affordably allowing for the recruitment of large samples of participants for psychology studies. Core technical, ethical, and methodological issues have been addressed in recent years, but the important issue of participant dropout has received surprisingly little attention. Specifically, web-based psychology studies often involve undergraduates completing lengthy and time-consuming batteries of online personality questionnaires, but no known published studies to date have closely examined the natural course of participant dropout during attempted completion of these studies. The present investigation examined participant dropout among 1,963 undergraduates completing one of six web-based survey studies relatively representative of those conducted in university settings. Results indicated that 10% of participants could be expected to drop out of these studies nearly instantaneously, with an additional 2% dropping out per 100 survey items included in the study. For individual project investigators, these findings hold ramifications for study design considerations, such as conducting a priori power analyses. The present results also have broader ethical implications for understanding and improving voluntary participation in research involving human subjects. Nonetheless, the generalizability of these conclusions may be limited to studies involving similar design or survey content.
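
    The reported dropout pattern (roughly 10% of participants lost nearly instantaneously, plus about 2% per 100 survey items) suggests a simple linear planning rule. The sketch below is illustrative only: the function names and the linear form are assumptions for demonstration, not the authors' fitted model.

    ```python
    # Hedged sketch: expected dropout for a web survey, using the rates reported
    # above (~10% instantaneous dropout plus ~2% per 100 survey items). The
    # linear model and parameter names are illustrative assumptions.

    def expected_dropout(n_items, base_rate=0.10, rate_per_100=0.02):
        """Return the expected proportion of participants dropping out."""
        return base_rate + rate_per_100 * (n_items / 100.0)

    def expected_completers(n_recruited, n_items):
        """Expected number of participants who finish the survey."""
        return round(n_recruited * (1.0 - expected_dropout(n_items)))

    # A 300-item battery with 500 recruits:
    print(expected_dropout(300))          # ≈ 0.16
    print(expected_completers(500, 300))  # 420
    ```

    A rule of this form could feed directly into an a priori power analysis, inflating the recruitment target so the expected number of completers meets the required sample size.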

  4. The Sloan Digital Sky Survey Quasar Lens Search. IV. Statistical Lens Sample from the Fifth Data Release

    Energy Technology Data Exchange (ETDEWEB)

    Inada, Naohisa; /Wako, RIKEN /Tokyo U., ICEPP; Oguri, Masamune; /Natl. Astron. Observ. of Japan /Stanford U., Phys. Dept.; Shin, Min-Su; /Michigan U. /Princeton U. Observ.; Kayo, Issha; /Tokyo U., ICRR; Strauss, Michael A.; /Princeton U. Observ.; Hennawi, Joseph F.; /UC, Berkeley /Heidelberg, Max Planck Inst. Astron.; Morokuma, Tomoki; /Natl. Astron. Observ. of Japan; Becker, Robert H.; /LLNL, Livermore /UC, Davis; White, Richard L.; /Baltimore, Space Telescope Sci.; Kochanek, Christopher S.; /Ohio State U.; Gregg, Michael D.; /LLNL, Livermore /UC, Davis /Exeter U.

    2010-05-01

    We present the second report of our systematic search for strongly lensed quasars from the data of the Sloan Digital Sky Survey (SDSS). From extensive follow-up observations of 136 candidate objects, we find 36 lenses in the full sample of 77,429 spectroscopically confirmed quasars in the SDSS Data Release 5. We then define a complete sample of 19 lenses, including 11 from our previous search in the SDSS Data Release 3, from the sample of 36,287 quasars with i < 19.1 in the redshift range 0.6 < z < 2.2, where we require the lenses to have image separations of 1″ < θ < 20″ and i-band magnitude differences between the two images smaller than 1.25 mag. Among the 19 lensed quasars, 3 have quadruple-image configurations, while the remaining 16 show double images. This lens sample constrains the cosmological constant to be Ω_Λ = 0.84 +0.06/-0.08 (stat.) +0.09/-0.07 (syst.) assuming a flat universe, which is in good agreement with other cosmological observations. We also report the discoveries of 7 binary quasars with separations ranging from 1.1″ to 16.6″, which are identified in the course of our lens survey. This study concludes the construction of our statistical lens sample in the full SDSS-I data set.

  5. The TRacking Adolescents' Individual Lives Survey (TRAILS) : Design, Current Status, and Selected Findings

    NARCIS (Netherlands)

    Ormel, Johan; Oldehinkel, Albertine J; Sijtsema, Jelle; van Oort, Floor; Raven, Dennis; Veenstra, Rene; Vollebergh, Wilma A M; Verhulst, Frank C

    2012-01-01

    Objectives: The objectives of this study were as follows: to present a concise overview of the sample, outcomes, determinants, non-response and attrition of the ongoing TRacking Adolescents' Individual Lives Survey (TRAILS), which started in 2001; to summarize a selection of recent findings on

  6. Discussion on the source survey method in a natural evaporation pond

    International Nuclear Information System (INIS)

    Dai Xiaoshu; Fan Chengrong; Fu Yunshan

    2014-01-01

    A natural evaporation pond was to be decommissioned. The survey of the pond focused on investigating the distribution of radioactive contamination and estimating the total amount of deposits in the pond, in order to support subsequent decommissioning activities. Based on the source survey of the pond, this paper describes how radiation measurements and sampling (of water and sediment) were carried out in the water. A movable work platform was built on the pond to facilitate sampling and measurement. In addition, a sludge sampler was designed to accurately control the sampling amount and depth. This paper also describes the distribution of sampling points. (authors)

  7. Survey of injury sources for a trampoline with equipment hazards designed out.

    Science.gov (United States)

    Eager, David; Scarrott, Carl; Nixon, Jim; Alexander, Keith

    2012-07-01

    In Australia, trampolines contribute approximately one-quarter of all childhood play-equipment injuries. The purpose of this study was to gather and evaluate injury data from a nontraditional, 'soft-edged', consumer trampoline in which the equipment injury sources have been designed out. A survey was undertaken in Queensland and New South Wales. The manufacturer of the nontraditional trampoline provided the University of Technology, Sydney, with their Australian customer database. Injury data were gathered in a pilot study by phone interview, then in a full study through an email survey. Results from 3817 respondents were compared with earlier Australian and US data from traditional trampolines gathered from emergency departments.   A significantly lower proportion of the injuries caused by falling off or striking the equipment was found for this new design when compared with traditional trampolines both in Australia and in the USA. The age of children being injured on trampolines in Australia was found to be markedly lower than in North America. This research indicates that with appropriate design the more severe injuries on traditional trampolines can be significantly reduced. © 2012 The Authors. Journal of Paediatrics and Child Health © 2012 Paediatrics and Child Health Division (Royal Australasian College of Physicians).
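
    The key comparison in such a study is between the proportions of equipment-contact injuries on the two trampoline types. A standard way to test such a difference is a two-proportion z-test; the sketch below uses invented counts for demonstration, not the study's data.

    ```python
    # Illustrative two-proportion z-test of the kind used to compare the share
    # of injuries caused by falling off or striking the equipment on the new
    # vs. traditional designs. The counts below are invented, not study data.
    from math import sqrt, erf

    def two_prop_z(x1, n1, x2, n2):
        """z statistic and two-sided p-value for H0: p1 == p2."""
        p1, p2 = x1 / n1, x2 / n2
        p = (x1 + x2) / (n1 + n2)  # pooled proportion under H0
        z = (p1 - p2) / sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
        p_two = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # normal tails
        return z, p_two

    # e.g. 30 equipment-contact injuries out of 400 on the soft-edged design
    # vs 120 out of 400 on traditional trampolines:
    z, p = two_prop_z(30, 400, 120, 400)
    print(round(z, 2), p < 0.001)
    ```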

  8. Sample size reassessment for a two-stage design controlling the false discovery rate.

    Science.gov (United States)

    Zehetmayer, Sonja; Graf, Alexandra C; Posch, Martin

    2015-11-01

    Sample size calculations for gene expression microarray and NGS-RNA-Seq experiments are challenging because the overall power depends on unknown quantities such as the proportion of true null hypotheses and the distribution of effect sizes under the alternative. We propose a two-stage design with an adaptive interim analysis in which these quantities are estimated from the interim data. The second-stage sample size is chosen based on these estimates to achieve a specified overall power. The proposed procedure controls the power in all considered scenarios except for very low first-stage sample sizes. The false discovery rate (FDR) is controlled despite the data-dependent choice of sample size. The two-stage design can be a useful tool for determining the sample size of high-dimensional studies when, in the planning phase, there is high uncertainty regarding the expected effect sizes and variability.
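
    Two ingredients of such a design can be sketched in a few lines: the Benjamini-Hochberg step-up rule that provides FDR control, and a simple Storey-type estimate of the proportion of true nulls (pi0) of the kind an interim analysis might use when re-planning the stage-two sample size. This is an illustrative sketch, not the authors' procedure.

    ```python
    # Illustrative sketch: Benjamini-Hochberg FDR control plus a Storey-type
    # pi0 estimate, two building blocks of adaptive two-stage designs.

    def benjamini_hochberg(pvals, alpha=0.05):
        """Return indices of hypotheses rejected at FDR level alpha."""
        m = len(pvals)
        order = sorted(range(m), key=lambda i: pvals[i])
        k = 0  # largest rank i with p_(i) <= (i/m) * alpha
        for rank, idx in enumerate(order, start=1):
            if pvals[idx] <= rank / m * alpha:
                k = rank
        return sorted(order[:k])

    def estimate_pi0(pvals, lam=0.5):
        """Storey estimator: share of p-values above lam, rescaled."""
        m = len(pvals)
        return min(1.0, sum(p > lam for p in pvals) / ((1.0 - lam) * m))

    pvals = [0.001, 0.008, 0.039, 0.041, 0.042, 0.06, 0.3, 0.5, 0.7, 0.9]
    print(benjamini_hochberg(pvals, alpha=0.05))  # [0, 1]
    print(estimate_pi0(pvals))                    # 0.4
    ```

    At an interim analysis, a smaller estimated pi0 (more true effects) generally means a smaller second-stage sample suffices for the same expected power.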

  9. Report of the Survey on the Design Review of New Reactor Applications. Volume 1 - Instrumentation and Control

    International Nuclear Information System (INIS)

    Downey, Steven

    2014-06-01

    At the tenth meeting of the CNRA Working Group on the Regulation of New Reactors (WGRNR) in March 2013, the members agreed to present the responses to the Second Phase, or Design Phase, of the Licensing Process Survey as a multi-volume text. As such, each report will focus on one of the eleven general technical categories covered in the survey. The general technical categories were selected to conform to the topics covered in the International Atomic Energy Agency (IAEA) Safety Guide GS-G-4.1. This report, the first volume, discusses the survey responses related to Instrumentation and Control (I and C). The Instrumentation and Control category includes the following twelve technical topics: reactor trip system, actuation systems for Engineered Safety Features (ESF), safe shutdown system, safety-related display instrumentation, information and interlock systems important to safety, control systems, main control room, supplementary control room, diverse I and C systems, data communication systems, software reliability, and cyber-security. For each technical topic, the member countries described the information provided by the applicant, the scope and level of detail of the technical review, the technical basis for granting regulatory authorisation, the skill sets required, and the level of effort needed to perform the review. Based on a comparison of the information provided in response to the survey, the following observations were made: - Among the regulatory organisations that responded to the survey, there are similarities in the design information provided by an applicant. In most countries, the design information provided by an applicant includes, but is not limited to, a description of the I and C system design and functions, a description of the verification and validation programmes, and provisions for analysis, testing, and inspection of various I and C systems. 
- In addition to the regulations, it is a common practice for countries

  10. The Influence of the Design of Web Survey Questionnaires on the Quality of Responses

    Directory of Open Access Journals (Sweden)

    Stéphane Ganassali

    2008-03-01

    Full Text Available The first objective of this article is to propose a conceptual framework of the effects of on-line questionnaire design on the quality of collected responses. Secondly, we present the results of an experiment in which different protocols were tested and compared in a randomised design on the basis of several quality indexes. Starting from previous categorizations and from the main factors identified in the literature, we first propose an initial global framework of questionnaire and question characteristics in a web survey, divided into five groups of factors. Our framework was built to follow the successive stages of the response process, that is, the contact between the respondent and the questionnaire itself. Then, because it has been treated in the survey methodology literature in a very restricted way, the concept of 'response quality' is discussed and extended with some more 'qualitative' criteria that could help researchers and practitioners obtain a deeper assessment of the survey output. As an experiment, on the basis of the factors chosen as major characteristics of questionnaire design, eight versions of a questionnaire related to young people's consumption patterns were created. The links to these on-line questionnaires were sent in November 2005 to a target of 10,000 young people. The article finally presents the results of our study and discusses the conclusions. Very interesting results come to light, especially regarding the influence of the length, interaction, and question wording dimensions on response quality. We discuss the effects of web-questionnaire design characteristics on the quality of data.

  11. A Survey of Structural Design of Diagnostic X-ray Imaging Facilities and Compliance to Shielding Design Goals in a Limited Resource Setting

    Directory of Open Access Journals (Sweden)

    Flavious B. Nkubli

    2017-11-01

    Purpose: To survey the structural designs of x-ray rooms and compliance with shielding design goals at three x-ray imaging facilities. Methods and Materials: The survey was conducted in three radiodiagnostic centers in South East Nigeria, labeled X, Y and Z for anonymity. A stretchable non-elastic meter rule was used to measure x-ray room dimensions, a Vernier caliper to measure lead thickness, and a calibrated digital survey meter (Radalert 100X) for the radiation survey of controlled and uncontrolled areas. Simple statistical tools such as the mean and standard deviation were used for analysis with the aid of Microsoft Excel 2007. Results: Center X had a room dimension of 2.4 m × 2.1 m, Center Y had an x-ray room dimension of 3.6 m × 3.3 m, and Center Z had two x-ray rooms with identical dimensions of 6.3 m × 3.6 m. Measured exit radiation doses for the controlled areas were 0.00152 mSv/wk, 0.00496 mSv/wk, 0.00168 mSv/wk and 0.00224 mSv/wk, respectively. Lead was the common shielding material used. Conclusion: Based on the parameters studied, Center Z had the ideal room size and layout. Relative distances from the x-ray tubes to the nearest walls were not optimized in any of the centers except Center Z. Measured exit doses were within recommended limits except in Center Y. The locations of the control consoles and the measured doses were appropriate and within recommended design goals.

  12. Quantification of physical activity using the QAPACE Questionnaire: a two stage cluster sample design survey of children and adolescents attending urban school.

    Science.gov (United States)

    Barbosa, Nicolas; Sanchez, Carlos E; Patino, Efrain; Lozano, Benigno; Thalabard, Jean C; LE Bozec, Serge; Rieu, Michel

    2016-05-01

    Quantification of physical activity as energy expenditure is important from youth onward for the prevention of chronic non-communicable diseases in adulthood. The aim was to quantify physical activity, expressed as daily energy expenditure (DEE), in school children and adolescents aged 8-16 years, by age, gender and socioeconomic level (SEL), in Bogotá. This was a two-stage cluster survey sample drawn from a universe of 4700 schools and 760000 students across the three socioeconomic levels existing in Bogotá (low, medium and high). The random sample was 20 schools and 1840 students (904 boys and 936 girls). Anticipating participant dropout and inconsistent questionnaire responses, the sample size was increased: 6 individuals of each gender were selected for each of the nine age groups, giving a total sample of 2160 individuals. Selected students filled in the QAPACE questionnaire under supervision. The data were analyzed by comparing means with a multivariate general linear model. The fixed factors were gender (boys and girls), age (8 to 16 years) and tri-strata SEL (low, medium and high); the independent variables assessed were height, weight and leisure time (hours/day); the dependent variables were daily energy expenditure (kJ.kg-1.day-1) during leisure time (DEE-LT), during school time (DEE-ST), during vacation time (DEE-VT), and total mean DEE per year (DEEm-TY) RESULTS: In boys, differences in leisure time and all DEE variables were significant with SEL; the age-SEL interaction was significant only for DEE-VT. In girls, all variables were significant with SEL. Post hoc multiple comparisons using Fisher's Least Significant Difference (LSD) test were significant with age for all variables. Across genders and SELs, girls had the higher values except in the high SEL (5-6); boys had higher values of DEE-LT, DEE-ST and DEE-VT, except for DEEm-TY in the high SEL (5-6). In the high SEL (5-6), all DEEs were highest for both genders. 
For SEL

  13. Dark Energy Survey Year 1 Results: Galaxy Sample for BAO Measurement

    Energy Technology Data Exchange (ETDEWEB)

    Crocce, M.; et al.

    2017-12-17

    We define and characterise a sample of 1.3 million galaxies extracted from the first year of Dark Energy Survey data, optimised to measure Baryon Acoustic Oscillations in the presence of significant redshift uncertainties. The sample is dominated by luminous red galaxies located at redshifts $z \\gtrsim 0.6$. We define the exact selection using color and magnitude cuts that balance the need of high number densities and small photometric redshift uncertainties, using the corresponding forecasted BAO distance error as a figure-of-merit in the process. The typical photo-$z$ uncertainty varies from $2.3\\%$ to $3.6\\%$ (in units of 1+$z$) from $z=0.6$ to $1$, with number densities from $200$ to $130$ galaxies per deg$^2$ in tomographic bins of width $\\Delta z = 0.1$. Next we summarise the validation of the photometric redshift estimation. We characterise and mitigate observational systematics including stellar contamination, and show that the clustering on large scales is robust in front of those contaminants. We show that the clustering signal in the auto-correlations and cross-correlations is generally consistent with theoretical models, which serves as an additional test of the redshift distributions.

  14. Small-scale and reconnaissance surveys

    Science.gov (United States)

    Bart, Jonathan; Andres, Brad A.; Elliott, Kyle; Francis, Charles M.; Johnston, Victoria; Morrison, R.I.G.; Pierce, Elin P.; Rausch, Jennie; Bart, Jonathan; Johnston, Victoria

    2012-01-01

    This brief chapter addresses two related issues: how effort should be allocated to different parts of the sampling plan and, given optimal allocation, how large a sample will be required to achieve the PRISM accuracy target. Simulations based on data collected to date showed that 2 plots per cluster on rapid surveys, 2 intensive camps per field crew-year, 2-4 intensive plots per intensive camp, and 2-3 rapid surveys per intensive plot is the most efficient allocation of resources. Using this design, we investigated how crew-years should be allocated to each region in order to meet the PRISM accuracy target most efficiently. The analysis indicated that 40-50 crew-years would achieve the accuracy target for 18-24 of the 26 species breeding widely in the Arctic. This analysis was based on assuming that two rounds of surveys were conducted and that a 50% decline occurred between them. We discuss the complexity of making these estimates and why they should be viewed as first approximations.

  15. Current State of Agile User-Centered Design: A Survey

    Science.gov (United States)

    Hussain, Zahid; Slany, Wolfgang; Holzinger, Andreas

    Agile software development methods are quite popular nowadays and are being adopted at an increasing rate in the industry every year. However, these methods still lack usability awareness in their development lifecycle, and the integration of usability/User-Centered Design (UCD) into agile methods is not adequately addressed. This paper presents the preliminary results of a recently conducted online survey regarding the current state of the integration of agile methods and usability/UCD. Responses were received from 92 practitioners worldwide. The results show that the majority of practitioners perceive that the integration of agile methods with usability/UCD has added value to their adopted processes and to their teams; has resulted in the improvement of usability and quality of the product developed; and has increased the satisfaction of the end-users of the product developed. The most-used HCI techniques are low-fidelity prototyping, conceptual designs, observational studies of users, usability expert evaluations, field studies, personas, rapid iterative testing, and laboratory usability testing.

  16. A statistically rigorous sampling design to integrate avian monitoring and management within Bird Conservation Regions.

    Science.gov (United States)

    Pavlacky, David C; Lukacs, Paul M; Blakesley, Jennifer A; Skorkowsky, Robert C; Klute, David S; Hahn, Beth A; Dreitz, Victoria J; George, T Luke; Hanni, David J

    2017-01-01

    Monitoring is an essential component of wildlife management and conservation. However, the usefulness of monitoring data is often undermined by the lack of 1) coordination across organizations and regions, 2) meaningful management and conservation objectives, and 3) rigorous sampling designs. Although many improvements to avian monitoring have been discussed, the recommendations have been slow to emerge in large-scale programs. We introduce the Integrated Monitoring in Bird Conservation Regions (IMBCR) program designed to overcome the above limitations. Our objectives are to outline the development of a statistically defensible sampling design to increase the value of large-scale monitoring data and provide example applications to demonstrate the ability of the design to meet multiple conservation and management objectives. We outline the sampling process for the IMBCR program with a focus on the Badlands and Prairies Bird Conservation Region (BCR 17). We provide two examples for the Brewer's sparrow (Spizella breweri) in BCR 17 demonstrating the ability of the design to 1) determine hierarchical population responses to landscape change and 2) estimate hierarchical habitat relationships to predict the response of the Brewer's sparrow to conservation efforts at multiple spatial scales. The collaboration across organizations and regions provided economy of scale by leveraging a common data platform over large spatial scales to promote the efficient use of monitoring resources. We designed the IMBCR program to address the information needs and core conservation and management objectives of the participating partner organizations. Although it has been argued that probabilistic sampling designs are not practical for large-scale monitoring, the IMBCR program provides a precedent for implementing a statistically defensible sampling design from local to bioregional scales. We demonstrate that integrating conservation and management objectives with rigorous statistical

  17. A statistically rigorous sampling design to integrate avian monitoring and management within Bird Conservation Regions.

    Directory of Open Access Journals (Sweden)

    David C Pavlacky

    Monitoring is an essential component of wildlife management and conservation. However, the usefulness of monitoring data is often undermined by the lack of 1) coordination across organizations and regions, 2) meaningful management and conservation objectives, and 3) rigorous sampling designs. Although many improvements to avian monitoring have been discussed, the recommendations have been slow to emerge in large-scale programs. We introduce the Integrated Monitoring in Bird Conservation Regions (IMBCR) program designed to overcome the above limitations. Our objectives are to outline the development of a statistically defensible sampling design to increase the value of large-scale monitoring data and provide example applications to demonstrate the ability of the design to meet multiple conservation and management objectives. We outline the sampling process for the IMBCR program with a focus on the Badlands and Prairies Bird Conservation Region (BCR 17). We provide two examples for the Brewer's sparrow (Spizella breweri) in BCR 17 demonstrating the ability of the design to 1) determine hierarchical population responses to landscape change and 2) estimate hierarchical habitat relationships to predict the response of the Brewer's sparrow to conservation efforts at multiple spatial scales. The collaboration across organizations and regions provided economy of scale by leveraging a common data platform over large spatial scales to promote the efficient use of monitoring resources. We designed the IMBCR program to address the information needs and core conservation and management objectives of the participating partner organizations. Although it has been argued that probabilistic sampling designs are not practical for large-scale monitoring, the IMBCR program provides a precedent for implementing a statistically defensible sampling design from local to bioregional scales. We demonstrate that integrating conservation and management objectives with rigorous

  18. The TRacking Adolescents' Individual Lives Survey (TRAILS): Design, Current Status, and Selected Findings

    Science.gov (United States)

    Ormel, Johan; Oldehinkel, Albertine J.; Sijtsema, Jelle; van Oort, Floor; Raven, Dennis; Veenstra, Rene; Vollebergh, Wilma A. M.; Verhulst, Frank C.

    2012-01-01

    Objectives: The objectives of this study were as follows: to present a concise overview of the sample, outcomes, determinants, non-response and attrition of the ongoing TRacking Adolescents' Individual Lives Survey (TRAILS), which started in 2001; to summarize a selection of recent findings on continuity, discontinuity, risk, and protective…

  19. Design for mosquito abundance, diversity, and phenology sampling within the National Ecological Observatory Network

    Science.gov (United States)

    Hoekman, D.; Springer, Yuri P.; Barker, C.M.; Barrera, R.; Blackmore, M.S.; Bradshaw, W.E.; Foley, D. H.; Ginsberg, Howard; Hayden, M. H.; Holzapfel, C. M.; Juliano, S. A.; Kramer, L. D.; LaDeau, S. L.; Livdahl, T. P.; Moore, C. G.; Nasci, R.S.; Reisen, W.K.; Savage, H. M.

    2016-01-01

    The National Ecological Observatory Network (NEON) intends to monitor mosquito populations across its broad geographical range of sites because of their prevalence in food webs, sensitivity to abiotic factors and relevance for human health. We describe the design of mosquito population sampling in the context of NEON’s long term continental scale monitoring program, emphasizing the sampling design schedule, priorities and collection methods. Freely available NEON data and associated field and laboratory samples, will increase our understanding of how mosquito abundance, demography, diversity and phenology are responding to land use and climate change.

  20. Using the Superpopulation Model for Imputations and Variance Computation in Survey Sampling

    Directory of Open Access Journals (Sweden)

    Petr Novák

    2012-03-01

    This study is aimed at variance computation techniques for estimates of population characteristics based on survey sampling and imputation. We use the superpopulation regression model, which means that the target variable values for each statistical unit are treated as random realizations of a linear regression model with weighted variance. We focus on regression models with one auxiliary variable and no intercept, which have many applications and a straightforward interpretation in business statistics. Furthermore, we deal with cases where the estimates are not independent and thus the covariance must be computed. We also consider chained regression models with auxiliary variables treated as random variables instead of constants.
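
    Under the no-intercept model described above, with variance proportional to the auxiliary variable, the weighted least-squares slope reduces to the classical ratio sum(y)/sum(x) over respondents, and missing values are imputed as beta_hat * x. The sketch below illustrates that mechanic with invented data; function and variable names are assumptions, not the paper's notation.

    ```python
    # Minimal sketch of ratio imputation under the superpopulation model
    # y_i = beta * x_i with variance proportional to x_i (no intercept).
    # Under that variance structure the WLS slope is sum(y)/sum(x) over
    # respondents. Data values below are invented for illustration.

    def ratio_impute(x, y):
        """Fill in missing y (None) with beta_hat * x, beta_hat = sum(y)/sum(x)."""
        resp = [(xi, yi) for xi, yi in zip(x, y) if yi is not None]
        beta_hat = sum(yi for _, yi in resp) / sum(xi for xi, _ in resp)
        return beta_hat, [yi if yi is not None else beta_hat * xi
                          for xi, yi in zip(x, y)]

    x = [10.0, 20.0, 40.0, 30.0]   # auxiliary variable (always observed)
    y = [25.0, 50.0, None, None]   # target variable with nonresponse
    beta, y_full = ratio_impute(x, y)
    print(beta)    # 2.5
    print(y_full)  # [25.0, 50.0, 100.0, 75.0]
    ```

    Treating imputed values as if they were observed understates the variance, which is why the paper's variance formulas must account for the imputation model.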

  1. Using lot quality-assurance sampling and area sampling to identify priority areas for trachoma control: Viet Nam.

    Science.gov (United States)

    Myatt, Mark; Mai, Nguyen Phuong; Quynh, Nguyen Quang; Nga, Nguyen Huy; Tai, Ha Huy; Long, Nguyen Hung; Minh, Tran Hung; Limburg, Hans

    2005-10-01

    We report on the use of lot quality-assurance sampling (LQAS) surveys undertaken within an area-sampling framework to identify priority areas for intervention with trachoma control activities in Viet Nam. The LQAS survey method for rapid assessment of the prevalence of active trachoma was adapted for use in Viet Nam with the aim of classifying individual communes by the prevalence of active trachoma among children in primary school. School-based sampling was used; the school sites to be sampled were selected using an area-sampling approach. A total of 719 communes in 41 districts in 18 provinces were surveyed. Survey staff found the LQAS survey method both simple and rapid to use once initial problems with the area-sampling methods were identified and remedied. The method yielded a finer spatial resolution of prevalence than had previously been achieved in Viet Nam using semiquantitative rapid assessment surveys and multistage cluster-sampled surveys. When used with area-sampling techniques, the LQAS survey method has the potential to form the basis of survey instruments that can be used to efficiently target resources for interventions against active trachoma. With additional work, such methods could provide a generally applicable tool for effective programme planning and for certification of the elimination of trachoma as a blinding disease.
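
    LQAS classification rests on a binomial decision rule: sample n children in a commune and call it "high prevalence" when the number found with active trachoma reaches a threshold d, with d chosen to bound the two misclassification risks. The sketch below is a generic illustration of that rule; the prevalence standards and error bounds are illustrative, not those used in the Viet Nam survey.

    ```python
    # Hedged sketch of the binomial decision rule underlying LQAS
    # classification. Thresholds below are illustrative assumptions.
    from math import comb

    def binom_cdf(k, n, p):
        """P(X <= k) for X ~ Binomial(n, p)."""
        return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

    def choose_threshold(n, p_low, p_high, alpha=0.10, beta=0.10):
        """Smallest d with P(X >= d | p_low) <= alpha and
        P(X < d | p_high) <= beta, or None if n cannot meet both bounds."""
        for d in range(n + 1):
            err_low = 1 - binom_cdf(d - 1, n, p_low)   # false "high" call
            err_high = binom_cdf(d - 1, n, p_high)     # false "low" call
            if err_low <= alpha and err_high <= beta:
                return d
        return None

    # Classify communes against a 10% vs 40% prevalence standard with n = 25:
    d = choose_threshold(25, 0.10, 0.40)
    print(d)  # decision threshold (number of positive children)
    ```

    With both error rates bounded, communes exceeding the threshold can be prioritized for intervention without estimating an exact prevalence in each.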

  2. Statistical searches for microlensing events in large, non-uniformly sampled time-domain surveys: A test using palomar transient factory data

    Energy Technology Data Exchange (ETDEWEB)

    Price-Whelan, Adrian M.; Agüeros, Marcel A. [Department of Astronomy, Columbia University, 550 W 120th Street, New York, NY 10027 (United States); Fournier, Amanda P. [Department of Physics, Broida Hall, University of California, Santa Barbara, CA 93106 (United States); Street, Rachel [Las Cumbres Observatory Global Telescope Network, Inc., 6740 Cortona Drive, Suite 102, Santa Barbara, CA 93117 (United States); Ofek, Eran O. [Benoziyo Center for Astrophysics, Weizmann Institute of Science, 76100 Rehovot (Israel); Covey, Kevin R. [Lowell Observatory, 1400 West Mars Hill Road, Flagstaff, AZ 86001 (United States); Levitan, David; Sesar, Branimir [Division of Physics, Mathematics, and Astronomy, California Institute of Technology, Pasadena, CA 91125 (United States); Laher, Russ R.; Surace, Jason, E-mail: adrn@astro.columbia.edu [Spitzer Science Center, California Institute of Technology, Mail Stop 314-6, Pasadena, CA 91125 (United States)

    2014-01-20

    Many photometric time-domain surveys are driven by specific goals, such as searches for supernovae or transiting exoplanets, which set the cadence with which fields are re-imaged. In the case of the Palomar Transient Factory (PTF), several sub-surveys are conducted in parallel, leading to non-uniform sampling over its ∼20,000 deg^2 footprint. While the median 7.26 deg^2 PTF field has been imaged ∼40 times in the R band, ∼2300 deg^2 have been observed >100 times. We use PTF data to study the trade-off between searching for microlensing events in a survey whose footprint is much larger than that of typical microlensing searches, but with far-from-optimal time sampling. To examine the probability that microlensing events can be recovered in these data, we test statistics used on uniformly sampled data to identify variables and transients. We find that the von Neumann ratio performs best for identifying simulated microlensing events in our data. We develop a selection method using this statistic and apply it to data from fields with >10 R-band observations, comprising 1.1 × 10^9 light curves, and uncover three candidate microlensing events. We lack simultaneous, multi-color photometry to confirm these as microlensing events. However, their number is consistent with predictions for the event rate in the PTF footprint over the survey's three years of operations, as estimated from near-field microlensing models. This work can help constrain all-sky event rate predictions and tests microlensing signal recovery in large data sets, which will be useful to future time-domain surveys, such as that planned with the Large Synoptic Survey Telescope.
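
    The von Neumann ratio mentioned above is the mean squared successive difference of a light curve divided by its variance: values well below 2 flag smoothly varying, correlated signals (such as a microlensing bump), while ~2 is expected for uncorrelated noise. A minimal sketch, with illustrative data rather than PTF photometry:

    ```python
    # Minimal sketch of the von Neumann ratio; example magnitudes are invented.
    # eta well below 2 suggests smooth, correlated variation; eta ~ 2 or above
    # is noise-like. Thresholds here are illustrative, not those of the PTF
    # search.

    def von_neumann_ratio(mags):
        """eta = mean squared successive difference / variance of mags."""
        n = len(mags)
        mean = sum(mags) / n
        var = sum((m - mean) ** 2 for m in mags) / n
        msd = sum((mags[i + 1] - mags[i]) ** 2 for i in range(n - 1)) / (n - 1)
        return msd / var

    flat_noise = [0.1, -0.2, 0.15, -0.1, 0.05, -0.15, 0.2, -0.05]
    smooth_bump = [0.0, 0.1, 0.3, 0.6, 1.0, 0.6, 0.3, 0.1]
    print(von_neumann_ratio(flat_noise))   # near or above 2: noise-like
    print(von_neumann_ratio(smooth_bump))  # well below 2: smooth variation
    ```

    Because it needs only the ordering of observations, not uniform spacing, the statistic is a natural fit for non-uniformly sampled survey light curves.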

  3. Empirically simulated study to compare and validate sampling methods used in aerial surveys of wildlife populations

    NARCIS (Netherlands)

    Khaemba, W.M.; Stein, A.; Rasch, D.; Leeuw, de J.; Georgiadis, N.

    2001-01-01

    This paper compares the distribution, sampling and estimation of abundance for two animal species in an African ecosystem by means of an intensive simulation of the sampling process under a geographical information system (GIS) environment. It focuses on systematic and random sampling designs,
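The kind of comparison the abstract describes can be sketched with a toy replication experiment: draw repeated systematic and simple random samples from a synthetic spatial population and compare the variance of the resulting mean-abundance estimates. All numbers below are illustrative, not from the GIS study:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical 1-D "transect" population with a smooth spatial trend,
# standing in for animal density along a survey strip.
N, n = 1000, 50
x = np.arange(N)
population = 20 + 10 * np.sin(2 * np.pi * x / N) + rng.normal(0, 2, N)
true_mean = population.mean()

def systematic_sample(pop, n, rng):
    k = len(pop) // n
    start = rng.integers(0, k)       # random start, then every k-th unit
    return pop[start::k][:n]

def random_sample(pop, n, rng):
    return rng.choice(pop, size=n, replace=False)

reps = 2000
sys_means = [systematic_sample(population, n, rng).mean() for _ in range(reps)]
rand_means = [random_sample(population, n, rng).mean() for _ in range(reps)]

# With a smooth spatial trend, systematic sampling spreads points evenly
# across the trend and usually estimates the mean with lower variance.
print(np.var(sys_means), np.var(rand_means))
```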

  4. The WEAVE-LOFAR Survey

    Science.gov (United States)

    Smith, D. J. B.; Best, P. N.; Duncan, K. J.; Hatch, N. A.; Jarvis, M. J.; Röttgering, H. J. A.; Simpson, C. J.; Stott, J. P.; Cochrane, R. K.; Coppin, K. E.; Dannerbauer, H.; Davis, T. A.; Geach, J. E.; Hale, C. L.; Hardcastle, M. J.; Hatfield, P. W.; Houghton, R. C. W.; Maddox, N.; McGee, S. L.; Morabito, L.; Nisbet, D.; Pandey-Pommier, M.; Prandoni, I.; Saxena, A.; Shimwell, T. W.; Tarr, M.; van Bemmel, I.; Verma, A.; White, G. J.; Williams, W. L.

    2016-12-01

    In these proceedings we highlight the primary scientific goals and design of the WEAVE-LOFAR survey, which will use the new WEAVE spectrograph on the 4.2m William Herschel Telescope to provide the primary source of spectroscopic information for the LOFAR Surveys Key Science Project. Beginning in 2018, WEAVE-LOFAR will generate more than 10^6 R=5000 365-960nm spectra of low-frequency selected radio sources, across three tiers designed to efficiently sample the redshift-luminosity plane, and produce a data set of enormous legacy value. The radio frequency selection, combined with the high multiplex and throughput of the WEAVE spectrograph, make obtaining redshifts in this way very efficient, and we expect that the redshift success rate will approach 100 per cent at z < 1. This unprecedented spectroscopic sample - which will be complemented by an integral field component - will be transformational in key areas, including studying the star formation history of the Universe, the role of accretion and AGN-driven feedback, properties of the epoch of reionisation, cosmology, cluster haloes and relics, as well as the nature of radio galaxies and protoclusters. Each topic will be addressed in unprecedented detail, and with the most reliable source classifications and redshift information in existence.

  5. Multi-saline sample distillation apparatus for hydrogen isotope analyses : design and accuracy

    Science.gov (United States)

    Hassan, Afifa Afifi

    1981-01-01

    A distillation apparatus for saline water samples was designed and tested. Six samples may be distilled simultaneously. The temperature was maintained at 400 °C to ensure complete dehydration of the precipitating salts. Consequently, the error in the measured ratio of stable hydrogen isotopes resulting from incomplete dehydration of hydrated salts during distillation was eliminated. (USGS)

  6. Can we do better than the grid survey: Optimal synoptic surveys in presence of variable uncertainty and decorrelation scales

    Science.gov (United States)

    Frolov, Sergey; Garau, Bartolame; Bellingham, James

    2014-08-01

    Regular grid ("lawnmower") survey is a classical strategy for synoptic sampling of the ocean. Is it possible to achieve a more effective use of available resources if one takes into account a priori knowledge about variability in magnitudes of uncertainty and decorrelation scales? In this article, we develop and compare the performance of several path-planning algorithms: optimized "lawnmower," a graph-search algorithm (A*), and a fully nonlinear genetic algorithm. We use the machinery of the best linear unbiased estimator (BLUE) to quantify the ability of a vehicle fleet to synoptically map distribution of phytoplankton off the central California coast. We used satellite and in situ data to specify covariance information required by the BLUE estimator. Computational experiments showed that two types of sampling strategies are possible: a suboptimal space-filling design (produced by the "lawnmower" and the A* algorithms) and an optimal uncertainty-aware design (produced by the genetic algorithm). Unlike the space-filling designs that attempted to cover the entire survey area, the optimal design focused on revisiting areas of high uncertainty. Results of the multivehicle experiments showed that fleet performance predictors, such as cumulative speed or the weight of the fleet, predicted the performance of a homogeneous fleet well; however, these were poor predictors for comparing the performance of different platforms.
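The uncertainty-aware behaviour the genetic algorithm converged to can be illustrated with a much simpler greedy sketch: build a prior covariance from assumed uncertainty magnitudes and a decorrelation scale, then repeatedly sample the site with the largest remaining variance and condition on it. The covariance model, station layout, and numbers here are invented for illustration, not taken from the paper's BLUE setup:

```python
import numpy as np

rng = np.random.default_rng(1)
pts = rng.uniform(0, 10, size=(60, 2))           # candidate stations

# Spatially varying prior uncertainty: one "energetic" corner.
sigma = 1.0 + 2.0 * np.exp(-np.sum((pts - [8, 8])**2, axis=1) / 8.0)
scale = 2.0                                       # decorrelation scale
d2 = np.sum((pts[:, None] - pts[None, :])**2, axis=-1)
C = np.outer(sigma, sigma) * np.exp(-d2 / (2 * scale**2))

def greedy_survey(C, n_samples, noise=0.05):
    """Sequentially pick the site with the largest current variance,
    then condition the covariance on that (noisy) observation."""
    C = C.copy()
    chosen = []
    for _ in range(n_samples):
        j = int(np.argmax(np.diag(C)))
        chosen.append(j)
        cj = C[:, j].copy()
        C -= np.outer(cj, cj) / (C[j, j] + noise**2)
    return chosen, np.trace(C)

chosen, residual = greedy_survey(C, 8)
# The first pick lands in the high-uncertainty corner, and the total
# mapped uncertainty (trace) shrinks with every observation.
print(chosen[:3], residual, np.trace(C))
```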

  7. Note: Design and development of wireless controlled aerosol sampling network for large scale aerosol dispersion experiments

    International Nuclear Information System (INIS)

    Gopalakrishnan, V.; Subramanian, V.; Baskaran, R.; Venkatraman, B.

    2015-01-01

    A wireless-based, custom-built aerosol sampling network has been designed, developed, and implemented for environmental aerosol sampling. These aerosol sampling systems are used in a field measurement campaign in which sodium aerosol dispersion experiments were conducted as part of environmental impact studies related to the sodium-cooled fast reactor. The sampling network contains 40 aerosol sampling units, each comprising a custom-built sampling head and wireless control networking designed with Programmable System on Chip (PSoC™) and XBee Pro RF modules. The base station control is designed using the graphical programming language LabVIEW. The sampling network is programmed to operate at a preset time, and the running status of the samplers in the network is visualized from the base station. The system is developed in such a way that it can be used for any other environmental sampling system deployed over a wide area and uneven terrain, where manual operation is difficult due to the requirement of simultaneous operation and status logging.

  8. Note: Design and development of wireless controlled aerosol sampling network for large scale aerosol dispersion experiments

    Energy Technology Data Exchange (ETDEWEB)

    Gopalakrishnan, V.; Subramanian, V.; Baskaran, R.; Venkatraman, B. [Radiation Impact Assessment Section, Radiological Safety Division, Indira Gandhi Centre for Atomic Research, Kalpakkam 603 102 (India)

    2015-07-15

    A wireless-based, custom-built aerosol sampling network has been designed, developed, and implemented for environmental aerosol sampling. These aerosol sampling systems are used in a field measurement campaign in which sodium aerosol dispersion experiments were conducted as part of environmental impact studies related to the sodium-cooled fast reactor. The sampling network contains 40 aerosol sampling units, each comprising a custom-built sampling head and wireless control networking designed with Programmable System on Chip (PSoC™) and XBee Pro RF modules. The base station control is designed using the graphical programming language LabVIEW. The sampling network is programmed to operate at a preset time, and the running status of the samplers in the network is visualized from the base station. The system is developed in such a way that it can be used for any other environmental sampling system deployed over a wide area and uneven terrain, where manual operation is difficult due to the requirement of simultaneous operation and status logging.

  9. How Big of a Problem is Analytic Error in Secondary Analyses of Survey Data?

    Directory of Open Access Journals (Sweden)

    Brady T West

    Full Text Available Secondary analyses of survey data collected from large probability samples of persons or establishments further scientific progress in many fields. The complex design features of these samples improve data collection efficiency, but also require analysts to account for these features when conducting analysis. Unfortunately, many secondary analysts from fields outside of statistics, biostatistics, and survey methodology do not have adequate training in this area, and as a result may apply incorrect statistical methods when analyzing these survey data sets. This in turn could lead to the publication of incorrect inferences based on the survey data that effectively negate the resources dedicated to these surveys. In this article, we build on the results of a preliminary meta-analysis of 100 peer-reviewed journal articles presenting analyses of data from a variety of national health surveys, which suggested that analytic errors may be extremely prevalent in these types of investigations. We first perform a meta-analysis of a stratified random sample of 145 additional research products analyzing survey data from the Scientists and Engineers Statistical Data System (SESTAT), which describes features of the U.S. Science and Engineering workforce, and examine trends in the prevalence of analytic error across the decades used to stratify the sample. We once again find that analytic errors appear to be quite prevalent in these studies. Next, we present several example analyses of real SESTAT data, and demonstrate that a failure to perform these analyses correctly can result in substantially biased estimates with standard errors that do not adequately reflect complex sample design features. Collectively, the results of this investigation suggest that reviewers of this type of research need to pay much closer attention to the analytic methods employed by researchers attempting to publish or present secondary analyses of survey data.
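The kind of analytic error the authors document — ignoring weights and clustering — is easy to demonstrate. Below is a sketch of a design-based standard error for a weighted mean under a stratified, with-replacement cluster design (the usual Taylor-linearized "ultimate cluster" formula), compared against the naive i.i.d. formula. The data are synthetic, not SESTAT:

```python
import numpy as np

def design_based_mean_se(y, w, strata, clusters):
    """Weighted mean and Taylor-linearized SE under a stratified,
    with-replacement cluster design (ultimate-cluster variance)."""
    y, w = np.asarray(y, float), np.asarray(w, float)
    W = w.sum()
    ybar = np.sum(w * y) / W
    z = w * (y - ybar) / W                 # linearized contributions
    var = 0.0
    for h in np.unique(strata):
        in_h = strata == h
        totals = np.array([z[in_h & (clusters == c)].sum()
                           for c in np.unique(clusters[in_h])])
        n_h = len(totals)
        var += n_h / (n_h - 1) * np.sum((totals - totals.mean())**2)
    return ybar, np.sqrt(var)

rng = np.random.default_rng(3)
strata = np.repeat([0, 1], 200)
clusters = np.repeat(np.arange(20), 20)        # 10 clusters per stratum
cluster_effect = rng.normal(0, 1.0, 20)[clusters]
y = 5 + cluster_effect + rng.normal(0, 1.0, 400)
w = rng.uniform(1, 3, 400)

ybar, se_design = design_based_mean_se(y, w, strata, clusters)
se_naive = np.std(y, ddof=1) / np.sqrt(len(y))  # ignores weights/clustering

# With strong intracluster correlation, the naive SE badly understates
# the uncertainty -- exactly the error pattern the paper describes.
print(ybar, se_design, se_naive)
```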

  10. How Big of a Problem is Analytic Error in Secondary Analyses of Survey Data?

    Science.gov (United States)

    West, Brady T.; Sakshaug, Joseph W.; Aurelien, Guy Alain S.

    2016-01-01

    Secondary analyses of survey data collected from large probability samples of persons or establishments further scientific progress in many fields. The complex design features of these samples improve data collection efficiency, but also require analysts to account for these features when conducting analysis. Unfortunately, many secondary analysts from fields outside of statistics, biostatistics, and survey methodology do not have adequate training in this area, and as a result may apply incorrect statistical methods when analyzing these survey data sets. This in turn could lead to the publication of incorrect inferences based on the survey data that effectively negate the resources dedicated to these surveys. In this article, we build on the results of a preliminary meta-analysis of 100 peer-reviewed journal articles presenting analyses of data from a variety of national health surveys, which suggested that analytic errors may be extremely prevalent in these types of investigations. We first perform a meta-analysis of a stratified random sample of 145 additional research products analyzing survey data from the Scientists and Engineers Statistical Data System (SESTAT), which describes features of the U.S. Science and Engineering workforce, and examine trends in the prevalence of analytic error across the decades used to stratify the sample. We once again find that analytic errors appear to be quite prevalent in these studies. Next, we present several example analyses of real SESTAT data, and demonstrate that a failure to perform these analyses correctly can result in substantially biased estimates with standard errors that do not adequately reflect complex sample design features. Collectively, the results of this investigation suggest that reviewers of this type of research need to pay much closer attention to the analytic methods employed by researchers attempting to publish or present secondary analyses of survey data. PMID:27355817

  11. The State of Environmentally Sustainable Interior Design Practice

    OpenAIRE

    Mihyun Kang; Denise A. Guerin

    2009-01-01

    Problem statement: Research that investigates how interior designers use environmentally sustainable interior design criteria in their design solutions has not been done. To provide a base to develop education strategies for sustainable interior design, this study examined the state of environmentally sustainable interior design practice. Approach: A national, Internet-based survey of interior design practitioners was conducted. To collect data, the random sample of US interior design practit...

  12. Male Circumcision and STI Acquisition in Britain: Evidence from a National Probability Sample Survey.

    Directory of Open Access Journals (Sweden)

    Virginia Homfray

    Full Text Available It is well-established that male circumcision reduces acquisition of HIV, herpes simplex virus 2, chancroid, and syphilis. However, the effect on the acquisition of non-ulcerative sexually transmitted infections (STIs) remains unclear. We examined the relationship between circumcision and biological measures of three STIs: human papillomavirus (HPV), Chlamydia trachomatis and Mycoplasma genitalium. A probability sample survey of 15,162 men and women aged 16-74 years (including 4,060 men aged 16-44 years) was carried out in Britain between 2010 and 2012. Participants completed a computer-assisted personal interview, including a computer-assisted self-interview, which asked about experience of STI diagnoses, and circumcision. Additionally, 1,850 urine samples from sexually-experienced men aged 16-44 years were collected and tested for STIs. Multivariable logistic regression was used to calculate adjusted odds ratios (AORs) to quantify associations between circumcision and (i) self-reporting any STI diagnosis and (ii) presence of STIs in urine, in men aged 16-44 years, adjusting for key socio-demographic and sexual behavioural factors. The prevalence of circumcision in sexually-experienced men aged 16-44 years was 17.4% (95% CI 16.0-19.0). There was no association between circumcision and reporting any previous STI diagnoses, and specifically previous chlamydia or genital warts. However, circumcised men were less likely to have any HPV type (AOR 0.26, 95% confidence interval (CI) 0.13-0.50), including high-risk HPV types (HPV-16, 18, 31, 33, 35, 39, 45, 51, 52, 56, 58, 59 and/or 68; AOR 0.14, 95% CI 0.05-0.40), detected in urine. Circumcised men had reduced odds of HPV detection in urine. These findings have implications for improving the precision of models of STI transmission in populations with different circumcision prevalence and in designing interventions to reduce STI acquisition.
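For context, the adjusted odds ratios reported above generalize the crude 2×2-table odds ratio, which with a Woolf confidence interval is a few lines of arithmetic. The counts below are invented for illustration; the paper's AORs additionally adjust for socio-demographic and behavioural covariates via multivariable logistic regression:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio with Woolf 95% CI from a 2x2 table.

    a: exposed cases, b: exposed non-cases,
    c: unexposed cases, d: unexposed non-cases.
    """
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# e.g. HPV-positive vs negative among circumcised vs uncircumcised men
# (hypothetical counts, chosen only to give an OR below 1)
print(odds_ratio_ci(a=10, b=290, c=120, d=1430))
```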

  13. Shielding design of highly activated sample storage at reactor TRIGA PUSPATI

    International Nuclear Information System (INIS)

    Naim Syauqi Hamzah; Julia Abdul Karim; Mohamad Hairie Rabir; Muhd Husamuddin Abdul Khalil; Mohd Amin Sharifuldin Salleh

    2010-01-01

    Radiation protection has always been one of the most important considerations in the management of Reaktor TRIGA PUSPATI (RTP). Demand for sample activation has increased from a variety of applicants in different research fields. Radiological hazards may occur if sample evaluations are misjudged or miscalculated. At present, there is no appropriate storage for highly activated samples. For that purpose, a special storage box for irradiated samples should be provided in order to segregate highly activated samples, which produce high dose levels, from typical activated samples, which produce lower dose levels (1 - 2 mR/hr). In this study, the thicknesses required by common shielding materials such as lead and concrete to reduce a highly activated radiotracer sample (potassium bromide) with an initial exposure dose of 5 R/hr to background level (0.05 mR/hr) were determined. Analyses were done using several methods, including the conventional shielding equation, half-value layer calculation, and the MicroShield computer code. A design for a new irradiated-samples storage box for RTP capable of containing high-level gamma radioactivity is then proposed. (author)
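The half-value-layer calculation mentioned above reduces to counting halvings: attenuating 5 R/hr down to 0.05 mR/hr is a factor of 10{sup 5}, i.e. log2(10{sup 5}) ≈ 16.6 half-value layers. A sketch with assumed HVL values for ~1 MeV gammas (not the paper's numbers) and no buildup correction:

```python
import math

def shield_thickness(dose_in_mR_hr, target_mR_hr, hvl_cm):
    """Thickness needed to attenuate from dose_in to target, ignoring
    buildup (real designs add a buildup factor on top of this)."""
    n_hvl = math.log2(dose_in_mR_hr / target_mR_hr)
    return n_hvl * hvl_cm

initial = 5000.0   # 5 R/hr at the sample surface, in mR/hr
target = 0.05      # background level, mR/hr

# Illustrative half-value layers for ~1 MeV gammas (assumed values):
# roughly 1.2 cm for lead, 6 cm for ordinary concrete.
for material, hvl in [("lead", 1.2), ("concrete", 6.0)]:
    print(material, round(shield_thickness(initial, target, hvl), 1), "cm")
```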

  14. Windscale pile core surveys

    International Nuclear Information System (INIS)

    Curtis, R.F.; Mathews, R.F.

    1996-01-01

    The two Windscale Piles were closed down, defueled as far as possible and mothballed for thirty years following a fire in the core of Pile 1 in 1957 resulting from the spontaneous release of stored Wigner energy in the graphite moderator. Decommissioning of the reactors commenced in 1987 and has reached the stage where the condition of both cores needs to be determined. To this end, non-intrusive and intrusive surveys and sampling of the cores have been planned and partly implemented. The objectives for each Pile differ slightly. The location and quantity of fuel remaining in the damaged core of Pile 1 needed to be established, whereas the removal of all fuel from Pile 2 needed to be confirmed. In Pile 1, the possible existence of a void in the core is to be explored and in Pile 2, the level of Wigner energy remaining required to be quantified. Levels of radioactivity in both cores needed to be measured. The planning of the surveys is described including strategy, design, safety case preparation and the remote handling and viewing equipment required to carry out the inspection, sampling and monitoring work. The results from the completed non-intrusive survey of Pile 2 are summarised. They confirm that the core is empty and the graphite is in good condition. The survey of Pile 1 has just started. (UK)

  15. Formerly utilized MED/AEC sites remedial action program. Radiological survey of the Middlesex Sampling Plant, Middlesex, New Jersey. Final report

    International Nuclear Information System (INIS)

    1977-11-01

    The results of a radiological survey of the former Middlesex Sampling Plant, Middlesex, New Jersey, are presented in this report. The surveyed property served as a uranium ore sampling plant during the 1940s and early 1950s. It was released for unrestricted use in 1968 following a radiological survey by the Atomic Energy Commission and is now a reserve training center for the U.S. Marine Sixth Motor Transport Battalion. The present survey was undertaken to determine whether the existing radiological status of the property is consistent with current health guidelines and radiation protection practices. The radiological survey included measurement of residual alpha and beta-gamma contamination levels, radon and radon daughter concentrations in buildings, external gamma radiation levels on the site and on adjacent property, and radium concentrations in soil on the site and on adjacent property. Surface contamination levels exceeded U.S. Nuclear Regulatory Commission (NRC) guidelines, and {sup 222}Rn concentration levels exceeded the non-occupational maximum permissible concentration (MPC{sub a}) of 3 pCi/liter in some structures. These results indicate the possible need for extensive radon and radon daughter measurements in structures both onsite and offsite over periods as suggested by the U.S. Surgeon General.

  16. Two specialized delayed-neutron detector designs for assays of fissionable elements in water and sediment samples

    International Nuclear Information System (INIS)

    Balestrini, S.J.; Balagna, J.P.; Menlove, H.O.

    1976-01-01

    Two specialized neutron-sensitive detectors are described which are employed for rapid assays of fissionable elements by sensing for delayed neutrons emitted by samples after they have been irradiated in a nuclear reactor. The more sensitive of the two detectors, designed to assay for uranium in water samples, is 40% efficient; the other, designed for sediment sample assays, is 27% efficient. These detectors are also designed to operate under water as an inexpensive shielding against neutron leakage from the reactor and neutrons from cosmic rays. (Auth.)

  17. A phoswich detector design for improved spatial sampling in PET

    Science.gov (United States)

    Thiessen, Jonathan D.; Koschan, Merry A.; Melcher, Charles L.; Meng, Fang; Schellenberg, Graham; Goertzen, Andrew L.

    2018-02-01

    Block detector designs, utilizing a pixelated scintillator array coupled to a photosensor array in a light-sharing design, are commonly used for positron emission tomography (PET) imaging applications. In practice, the spatial sampling of these designs is limited by the crystal pitch, which must be large enough for individual crystals to be resolved in the detector flood image. Replacing the conventional 2D scintillator array with an array of phoswich elements, each consisting of an optically coupled side-by-side scintillator pair, may improve spatial sampling in one direction of the array without requiring smaller crystal elements to be resolved. To test the feasibility of this design, a 4 × 4 phoswich array was constructed, with each phoswich element consisting of two optically coupled, 3.17 × 1.58 × 10 mm{sup 3} LSO crystals co-doped with cerium and calcium. The amount of calcium doping was varied to create a 'fast' LSO crystal with a decay time of 32.9 ns and a 'slow' LSO crystal with a decay time of 41.2 ns. Using a Hamamatsu R8900U-00-C12 position-sensitive photomultiplier tube (PS-PMT) and a CAEN V1720 250 MS/s waveform digitizer, we were able to show effective discrimination of the fast and slow LSO crystals in the phoswich array. Although a side-by-side phoswich array is feasible, reflections at the crystal boundary due to a mismatch between the refractive index of the optical adhesive (n = 1.5) and LSO (n = 1.82) caused it to behave optically as an 8 × 4 array rather than a 4 × 4 array. Direct coupling of each phoswich element to individual photodetector elements may be necessary with the current phoswich array design. Alternatively, in order to implement this phoswich design with a conventional light-sharing PET block detector, a high-refractive-index optical adhesive is necessary to closely match the refractive index of LSO.
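The fast/slow discrimination hinges on the 32.9 ns vs 41.2 ns decay times. A toy pulse-shape discrimination sketch: simulate scintillation pulses as exponential photon-arrival histograms and classify by the fraction of charge arriving after a fixed delay. The photon statistics and the 40 ns split are assumptions for illustration, not the paper's method, which worked on digitized PS-PMT waveforms:

```python
import numpy as np

t = np.arange(0, 300, 4.0)   # ns; 4 ns bins match 250 MS/s digitization

def pulse(tau, rng, n_photons=2000):
    """Histogram of photon arrival times from an exponential decay."""
    arrivals = rng.exponential(tau, n_photons)
    hist, _ = np.histogram(arrivals, bins=np.append(t, 300.0))
    return hist.astype(float)

def tail_fraction(p, t, t_split=40.0):
    return p[t >= t_split].sum() / p.sum()

rng = np.random.default_rng(7)
fast = [tail_fraction(pulse(32.9, rng), t) for _ in range(200)]
slow = [tail_fraction(pulse(41.2, rng), t) for _ in range(200)]

# Expected tail fractions exp(-40/32.9) ~ 0.30 and exp(-40/41.2) ~ 0.38,
# so a threshold near 0.34 separates the two crystals.
print(np.mean(fast), np.mean(slow))
```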

  18. A sero-survey of rinderpest in nomadic pastoral systems in central and southern Somalia from 2002 to 2003, using a spatially integrated random sampling approach.

    Science.gov (United States)

    Tempia, S; Salman, M D; Keefe, T; Morley, P; Freier, J E; DeMartini, J C; Wamwayi, H M; Njeumi, F; Soumaré, B; Abdi, A M

    2010-12-01

    A cross-sectional sero-survey, using a two-stage cluster sampling design, was conducted between 2002 and 2003 in ten administrative regions of central and southern Somalia, to estimate the seroprevalence and geographic distribution of rinderpest (RP) in the study area, as well as to identify potential risk factors for the observed seroprevalence distribution. The study was also used to test the feasibility of the spatially integrated investigation technique in nomadic and semi-nomadic pastoral systems. In the absence of a systematic list of livestock holdings, the primary sampling units were selected by generating random map coordinates. A total of 9,216 serum samples were collected from cattle aged 12 to 36 months at 562 sampling sites. Two apparent clusters of RP seroprevalence were detected. Four potential risk factors associated with the observed seroprevalence were identified: the mobility of cattle herds, the cattle population density, the proximity of cattle herds to cattle trade routes and cattle herd size. Risk maps were then generated to assist in designing more targeted surveillance strategies. The observed seroprevalence in these areas declined over time. In subsequent years, similar seroprevalence studies in neighbouring areas of Kenya and Ethiopia also showed a very low seroprevalence of RP or the absence of antibodies against RP. The progressive decline in RP antibody prevalence is consistent with virus extinction. Verification of freedom from RP infection in the Somali ecosystem is currently in progress.
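The frame-free first stage — selecting primary sampling units as random map coordinates in the absence of a list of livestock holdings — is straightforward to sketch. The bounding box below is illustrative, not the actual study-area window:

```python
import random

def draw_sampling_sites(n_sites, lat_range, lon_range, seed=0):
    """Draw primary sampling units as random coordinates inside a
    bounding box (a stand-in for the study area's GIS polygon)."""
    rng = random.Random(seed)
    return [(round(rng.uniform(*lat_range), 4),
             round(rng.uniform(*lon_range), 4)) for _ in range(n_sites)]

# 562 sites, as in the survey; the coordinate window is hypothetical.
sites = draw_sampling_sites(562, lat_range=(1.0, 5.0), lon_range=(42.0, 47.0))
print(len(sites), sites[0])
# Second stage: at each site, nearby eligible herds would be enrolled
# and cattle aged 12-36 months sampled.
```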

  19. Quality-control design for surface-water sampling in the National Water-Quality Network

    Science.gov (United States)

    Riskin, Melissa L.; Reutter, David C.; Martin, Jeffrey D.; Mueller, David K.

    2018-04-10

    The data-quality objectives for samples collected at surface-water sites in the National Water-Quality Network include estimating the extent to which contamination, matrix effects, and measurement variability affect interpretation of environmental conditions. Quality-control samples provide insight into how well the samples collected at surface-water sites represent the true environmental conditions. Quality-control samples used in this program include field blanks, replicates, and field matrix spikes. This report describes the design for collection of these quality-control samples and the data management needed to properly identify these samples in the U.S. Geological Survey’s national database.

  20. Planning Considerations for a Mars Sample Receiving Facility: Summary and Interpretation of Three Design Studies

    Science.gov (United States)

    Beaty, David W.; Allen, Carlton C.; Bass, Deborah S.; Buxbaum, Karen L.; Campbell, James K.; Lindstrom, David J.; Miller, Sylvia L.; Papanastassiou, Dimitri A.

    2009-10-01

    It has been widely understood for many years that an essential component of a Mars Sample Return mission is a Sample Receiving Facility (SRF). The purpose of such a facility would be to take delivery of the flight hardware that lands on Earth, open the spacecraft and extract the sample container and samples, and conduct an agreed-upon test protocol, while ensuring strict containment and contamination control of the samples while in the SRF. Any samples that are found to be non-hazardous (or are rendered non-hazardous by sterilization) would then be transferred to long-term curation. Although the general concept of an SRF is relatively straightforward, there has been considerable discussion about implementation planning. The Mars Exploration Program carried out an analysis of the attributes of an SRF to establish its scope, including minimum size and functionality, budgetary requirements (capital cost, operating costs, cost profile), and development schedule. The approach was to arrange for three independent design studies, each led by an architectural design firm, and compare the results. While there were many design elements in common identified by each study team, there were significant differences in the way human operators were to interact with the systems. In aggregate, the design studies provided insight into the attributes of a future SRF and the complex factors to consider for future programmatic planning.

  1. Low-sensitivity H∞ filter design for linear delta operator systems with sampling time jitter

    Science.gov (United States)

    Guo, Xiang-Gui; Yang, Guang-Hong

    2012-04-01

    This article is concerned with the problem of designing H∞ filters for a class of linear discrete-time systems with low sensitivity to sampling time jitter via a delta operator approach. A delta-domain model is used to avoid the inherent numerical ill-conditioning resulting from the use of the standard shift-domain model at high sampling rates. Based on the projection lemma in combination with the descriptor system approach often used to solve problems related to delay, a novel bounded real lemma with three slack variables for delta operator systems is presented. A sensitivity approach based on this novel lemma is proposed to mitigate the effects of sampling time jitter on system performance. Then, the problem of designing a low-sensitivity filter can be reduced to a convex optimisation problem. An important consideration in the design of correlation filters is the optimal trade-off between the standard H∞ criterion and the sensitivity of the transfer function with respect to sampling time jitter. Finally, a numerical example demonstrating the validity of the proposed design method is given.
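The numerical ill-conditioning that motivates the delta operator is easy to see: discretizing dx/dt = -ax with period T gives a shift-operator pole exp(-aT) that crowds toward 1 as T shrinks, whereas the delta-operator pole (exp(-aT) - 1)/T stays near the continuous-time pole -a. A quick numeric demonstration:

```python
import math

a = 1.0  # continuous-time pole at -a
for T in [1e-1, 1e-3, 1e-6]:
    q_pole = math.exp(-a * T)              # shift-domain pole
    delta_pole = (q_pole - 1.0) / T        # delta-domain pole
    # As T -> 0 the shift pole becomes indistinguishable from 1,
    # while the delta pole converges to the continuous pole -1.
    print(f"T={T:g}  shift pole={q_pole:.9f}  delta pole={delta_pole:.6f}")
```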

  2. [The methodology and sample description of the National Survey on Addiction Problems in Hungary 2015 (NSAPH 2015)].

    Science.gov (United States)

    Paksi, Borbala; Demetrovics, Zsolt; Magi, Anna; Felvinczi, Katalin

    2017-06-01

    This paper introduces the methods and methodological findings of the National Survey on Addiction Problems in Hungary (NSAPH 2015). Use patterns of smoking, alcohol use and other psychoactive substances were measured, as well as those of certain behavioural addictions (problematic gambling - PGSI, DSM-5; eating disorders - SCOFF; problematic internet use - PIUQ; problematic on-line gaming - POGO; problematic social media use - FAS; exercise addiction - EAI-HU; work addiction - BWAS; compulsive buying - CBS). The paper describes the applied measurement techniques, sample selection, recruitment of respondents and the data collection strategy as well. Methodological results of the survey, including reliability and validity of the measures, are reported. The NSAPH 2015 research was carried out on a nationally representative sample of the Hungarian adult population aged 16-64 years (gross sample 2477, net sample 2274 persons), with the age group of 18-34 being overrepresented. Statistical analysis of the weight distribution suggests that weighting did not create any artificial distortion in the database, leaving the representativeness of the sample unaffected. The size of the weighted sample of the 18-64-year-old adult population is 1490 persons. The extent of the theoretical margin of error in the weighted sample is ±2.5% at a reliability level of 95%, which is in line with the original data collection plans. Based on the analysis of reliability and the extent of errors beyond sampling within the context of the database, we conclude that inconsistencies create relatively minor distortions in cumulative prevalence rates; consequently, the database makes possible the reliable estimation of risk factors related to different substance use behaviours. The reliability indexes of measurements used for prevalence estimates of behavioural addictions proved to be appropriate, though the psychometric features in some cases suggest the presence of redundant items. The comparison of

  3. Reward-allocation judgments in Romania : A factorial survey approach

    NARCIS (Netherlands)

    Buzea, C.; Meseşan-Schmitz, L.; van de Vijver, F.J.R.

    2013-01-01

    We investigated reward-allocation judgments when positive outcomes (monetary rewards) were distributed and the allocator was not a co-recipient, in a sample of 200 Romanian students. Within a full factorial survey design, seven factors, selected to affect the allocation decision, were orthogonally

  4. Architectural Design Space Exploration of an FPGA-based Compressed Sampling Engine

    DEFF Research Database (Denmark)

    El-Sayed, Mohammad; Koch, Peter; Le Moullec, Yannick

    2015-01-01

    We present the architectural design space exploration of a compressed sampling engine for use in a wireless heart-rate monitoring system. We show how parallelism affects execution time at the register transfer level. Furthermore, two example solutions (modified semi-parallel and full...

  5. The Hubble Space Telescope Medium Deep Survey Cluster Sample: Methodology and Data

    Science.gov (United States)

    Ostrander, E. J.; Nichol, R. C.; Ratnatunga, K. U.; Griffiths, R. E.

    1998-12-01

    We present a new, objectively selected, sample of galaxy overdensities detected in the Hubble Space Telescope Medium Deep Survey (MDS). These clusters/groups were found using an automated procedure that involved searching for statistically significant galaxy overdensities. The contrast of the clusters against the field galaxy population is increased when morphological data are used to search around bulge-dominated galaxies. In total, we present 92 overdensities above a probability threshold of 99.5%. We show, via extensive Monte Carlo simulations, that at least 60% of these overdensities are likely to be real clusters and groups and not random line-of-sight superpositions of galaxies. For each overdensity in the MDS cluster sample, we provide a richness and the average of the bulge-to-total ratio of galaxies within each system. This MDS cluster sample potentially contains some of the most distant clusters/groups ever detected, with about 25% of the overdensities having estimated redshifts z > ~0.9. We have made this sample publicly available to facilitate spectroscopic confirmation of these clusters and help more detailed studies of cluster and galaxy evolution. We also report the serendipitous discovery of a new cluster close on the sky to the rich optical cluster Cl 0016+16 at z = 0.546. This new overdensity, HST 001831+16208, may be coincident with both an X-ray source and a radio source. HST 001831+16208 is the third cluster/group discovered near to Cl 0016+16 and appears to strengthen the claims of Connolly et al. of superclustering at high redshift.
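The 99.5% probability threshold for an overdensity can be illustrated with a simple Poisson background model: count galaxies in an aperture and compute the chance probability of seeing at least that many. The counts below are invented, and the actual MDS procedure also weighted by morphology:

```python
import math

def poisson_sf(k, mu):
    """P(N >= k) for N ~ Poisson(mu)."""
    return 1.0 - sum(math.exp(-mu) * mu**i / math.factorial(i)
                     for i in range(k))

background = 4.0   # mean field-galaxy count expected in the aperture (assumed)
observed = 12      # galaxies actually counted (assumed)

p = poisson_sf(observed, background)
print(p)           # chance probability of so rich an aperture
print(p < 0.005)   # i.e. significant at the 99.5% threshold used in the paper
```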

  6. Measuring the health of the Indian elderly: evidence from National Sample Survey data

    Directory of Open Access Journals (Sweden)

    Mahal Ajay

    2010-11-01

Full Text Available Abstract Background Comparable health measures across different sets of populations are essential for describing the distribution of health outcomes and assessing the impact of interventions on these outcomes. Self-reported health (SRH) is a commonly used indicator of health in household surveys and has been shown to be predictive of future mortality. However, the susceptibility of SRH to influence by individuals' expectations complicates its interpretation and undermines its usefulness. Methods This paper applies the empirical methodology of Lindeboom and van Doorslaer (2004) to investigate elderly health in India using data from the 52nd round of the National Sample Survey conducted in 1995-96 that includes both an SRH variable as well as a range of objective indicators of disability and ill health. The empirical testing was conducted on stratified homogeneous groups, based on four factors: gender, education, rural-urban residence, and region. Results We find that region generally has a significant impact on how women perceive their health. Reporting heterogeneity can arise not only from cut-point shifts, but also from differences in health effects by objective health measures. In contrast, we find little evidence of reporting heterogeneity due to differences in gender or educational status within regions. Rural-urban residence does matter in some cases. The findings are robust with different specifications of objective health indicators. Conclusions Our exercise supports the thesis that the region of residence is associated with different cut-points and reporting behavior on health surveys. We believe this is the first paper that applies the Lindeboom-van Doorslaer methodology to data on the elderly in a developing country, showing the feasibility of applying this methodology to data from many existing cross-sectional health surveys.

  7. [Design and implementation of data reporting system for Chinese materia medica resources survey].

    Science.gov (United States)

    Wang, Hui; Zhang, Xiao-Bo; Ge, Xiao-Guang; Jing, Zhi-Xian; Wang, Ling; Zhao, Yu-Ping; Guo, Lan-Ping; Huang, Lu-Qi

    2017-11-01

The collection, summarization, and sharing of survey data are among the main tasks and achievements of the national census of Chinese materia medica resources organized and implemented by the State Administration of Traditional Chinese Medicine, and a key link in its implementation. The data reporting system for the Chinese materia medica resources survey was built on a client/server architecture with a Web-based geospatial data service under an SOA framework. It supports data collection and summarization through local data configuration, data reporting, data verification, PDA data import and export, APP data import, and track-instrument data import. The system covers reporting on 312 indicators across seven survey components: general investigation, key investigation, specimen information, herb sample information, market research, germplasm survey, and traditional knowledge survey, serving both field data collection and internal data collation for the Chinese materia medica resources survey. The system provides technical support for the national census of Chinese materia medica resources, improves the efficiency of the census, and facilitates the long-term preservation of the census data and the transformation and sharing of its results. Copyright© by the Chinese Pharmaceutical Association.

  8. Survey of control-room design practices with respect to human factors engineering

    International Nuclear Information System (INIS)

    Seminara, J.L.; Parsons, S.O.

    1980-01-01

Human factors engineering is an interdisciplinary specialty concerned with influencing the design of equipment systems, facilities, and operational environments to promote safe, efficient, and reliable operator performance. This emphasis has been applied to most military and space systems over the past 30 years. A review of five nuclear power-plant control rooms, reported in the November-December 1977 issue of Nuclear Safety, revealed that human factors principles of design have generally not been incorporated in present-generation control rooms. This article summarizes the findings of a survey of 20 control-board designers from a mix of nuclear steam-supply system and architect-engineering firms. The interviews with these designers probed design methods currently used in developing control rooms. From these data it was concluded that there is currently no consistent, formal, uniform concern for the human factors aspects of control-room design on the part of the design organizations, the utilities, or the Nuclear Regulatory Commission. Although all the parties involved are concerned with human factors issues, this responsibility is not focused, and human factors yardsticks, or design standards, specific to power plants have not been evolved and applied in the development and verification of control-room designs from the standpoint of the man-machine interface.

  9. Sampling and analysis plan for the preoperational environmental survey for the immobilized low activity waste (ILAW) project W-465

    International Nuclear Information System (INIS)

    Mitchell, R.M.

    1998-01-01

This document provides a detailed description of the Sampling and Analysis Plan for the Preoperational Survey to be conducted at the Immobilized Low Activity Waste (ILAW) Project Site in the 200 East Area.

  10. A random cluster survey and a convenience sample give comparable estimates of immunity to vaccine preventable diseases in children of school age in Victoria, Australia.

    Science.gov (United States)

    Kelly, Heath; Riddell, Michaela A; Gidding, Heather F; Nolan, Terry; Gilbert, Gwendolyn L

    2002-08-19

    We compared estimates of the age-specific population immunity to measles, mumps, rubella, hepatitis B and varicella zoster viruses in Victorian school children obtained by a national sero-survey, using a convenience sample of residual sera from diagnostic laboratories throughout Australia, with those from a three-stage random cluster survey. When grouped according to school age (primary or secondary school) there was no significant difference in the estimates of immunity to measles, mumps, hepatitis B or varicella. Compared with the convenience sample, the random cluster survey estimated higher immunity to rubella in samples from both primary (98.7% versus 93.6%, P = 0.002) and secondary school students (98.4% versus 93.2%, P = 0.03). Despite some limitations, this study suggests that the collection of a convenience sample of sera from diagnostic laboratories is an appropriate sampling strategy to provide population immunity data that will inform Australia's current and future immunisation policies. Copyright 2002 Elsevier Science Ltd.
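Comparisons like the rubella result above (98.7% versus 93.6%, P = 0.002) are tests of the difference between two independent proportions. A minimal sketch of such a test is below; the sample sizes are invented for illustration, and the paper's actual analysis would also need to account for the cluster design (design effects), which this plain z-test ignores.

```python
import math

def two_proportion_z(p1, n1, p2, n2):
    """Two-sided z-test for the difference between two independent proportions,
    using the pooled proportion under the null hypothesis of equality."""
    x1, x2 = p1 * n1, p2 * n2
    p_pool = (x1 + x2) / (n1 + n2)                      # pooled proportion under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))          # two-sided normal tail
    return z, p_value

# Hypothetical sample sizes (not from the paper): 400 children per survey arm.
z, p = two_proportion_z(0.987, 400, 0.936, 400)
print(f"z = {z:.2f}, p = {p:.4f}")
```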

  11. Spatiotemporally Representative and Cost-Efficient Sampling Design for Validation Activities in Wanglang Experimental Site

    Directory of Open Access Journals (Sweden)

    Gaofei Yin

    2017-11-01

Full Text Available Spatiotemporally representative Elementary Sampling Units (ESUs) are required for capturing the temporal variations in surface spatial heterogeneity through field measurements. Since inaccessibility often coexists with heterogeneity, a cost-efficient sampling design is mandatory. We proposed a sampling strategy to generate spatiotemporally representative and cost-efficient ESUs based on the conditioned Latin hypercube sampling scheme. The proposed strategy was constrained by multi-temporal Normalized Difference Vegetation Index (NDVI) imagery, and the ESUs were limited to a sampling-feasible region established on accessibility criteria. A novel criterion based on the Overlapping Area (OA) between the NDVI frequency distribution histogram from the sampled ESUs and that from the entire study area was used to assess the sampling efficiency. A case study in Wanglang National Nature Reserve in China showed that the proposed strategy improves the spatiotemporal representativeness of sampling (mean annual OA = 74.7%) compared to the single-temporally constrained (OA = 68.7%) and the random sampling (OA = 63.1%) strategies. The introduction of the feasible-region constraint significantly reduces the labour-intensive in-situ characterization required, at the expense of about a 9% loss in the spatiotemporal representativeness of the sampling. Our study will support the validation activities in the Wanglang experimental site, providing a benchmark for locating the nodes of automatic observation systems (e.g., LAINet) that need a spatially distributed and temporally fixed sampling design.
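The OA criterion described above is the summed bin-wise minimum of two relative-frequency histograms. The sketch below assumes NDVI values rescaled to [0, 1] and uses synthetic data, since the paper's imagery and ESU sets are not reproduced here.

```python
import random

def overlapping_area(sample_vals, population_vals, bins=20):
    """OA between the NDVI histogram of sampled ESUs and that of the whole
    study area: sum over bins of min(sample frequency, population frequency).
    Returns a value in [0, 1]; 1 means a perfectly representative sample."""
    def rel_freq(vals):
        counts = [0] * bins
        for v in vals:
            counts[min(int(v * bins), bins - 1)] += 1   # clamp v == 1.0 into last bin
        return [c / len(vals) for c in counts]
    f_s, f_p = rel_freq(sample_vals), rel_freq(population_vals)
    return sum(min(s, p) for s, p in zip(f_s, f_p))

rng = random.Random(0)
population = [rng.betavariate(4, 2) for _ in range(10_000)]  # synthetic NDVI field
esus = rng.sample(population, 30)                            # 30 candidate ESUs
print(f"OA = {overlapping_area(esus, population):.3f}")
```

In a multi-temporal setting the same criterion is evaluated per acquisition date and averaged, as in the paper's mean annual OA.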

  12. Pain and Joy of a Panel Survey on Transport Studies

    Energy Technology Data Exchange (ETDEWEB)

    Comendador Arquero, María Eugenia López-Lambas

    2016-07-01

Over ten years ago, it was established that the most frequent reason motivating a panel survey in transport studies is the evaluation of a change in the transportation system, or of a specific transportation-planning project, especially when the project involves novel elements. From a statistical viewpoint, a panel survey has the definite advantage of offering more accurate estimates of changes than cross-sectional surveys of the same sample size. Observing the travel patterns of individuals and households over several consecutive days has offered insights into activity scheduling and travel planning. Variability in travel patterns has important policy implications as well, but how much effort is it worth putting into the design of a panel survey? To evaluate the effects of the transport policies introduced in Madrid during the last five years, a 'short-long' panel survey was built, based on a sample of the Madrid-worker subpopulation most affected by those recent changes in transport policy. The paper describes both the design and construction of the panel, based on GPS technology, and presents some results from an analysis of its two waves; for example, it registered a 10% increase in public transport use and walking trips. The panel overcomes the known attrition problem by providing incentives, maintaining contact, using the same interviewer for the same respondents, and conducting face-to-face interviews. (Author)

  13. A nationwide population-based cross-sectional survey of health-related quality of life in patients with myeloproliferative neoplasms in Denmark (MPNhealthSurvey: survey design and characteristics of respondents and nonrespondents

    Directory of Open Access Journals (Sweden)

    Brochmann N

    2017-03-01

Full Text Available Nana Brochmann,1 Esben Meulengracht Flachs,2 Anne Illemann Christensen,3 Christen Lykkegaard Andersen,1 Knud Juel,3 Hans Carl Hasselbalch,1 Ann-Dorthe Zwisler4 1Department of Hematology, Zealand University Hospital, University of Copenhagen, Roskilde, 2Department of Occupational and Environmental Medicine, Bispebjerg University Hospital, Copenhagen, 3National Institute of Public Health, University of Southern Denmark, Copenhagen, 4Danish Knowledge Centre for Rehabilitation and Palliative Care, University of Southern Denmark and Odense University Hospital, Odense, Denmark Objective: The Department of Hematology, Zealand University Hospital, Denmark, and the National Institute of Public Health, University of Southern Denmark, created the first nationwide, population-based, and the most comprehensive cross-sectional health-related quality of life (HRQoL) survey of patients with myeloproliferative neoplasms (MPNs). In Denmark, all MPN patients are treated in public hospitals and treatments received are free of charge for these patients. Therefore, MPN patients receive the best available treatment to the extent of its suitability for them and if they wish to receive the treatment. The aims of this article are to describe the survey design and the characteristics of respondents and nonrespondents. Material and methods: Individuals with MPN diagnoses registered in the Danish National Patient Register (NPR) were invited to participate. The registers of the Danish Civil Registration System and Statistics Denmark provided information regarding demographics. The survey contained 120 questions: validated patient-reported outcome (PRO) questionnaires and additional questions addressing lifestyle. Results: A total of 4,704 individuals were registered with MPN diagnoses in the NPR, of whom 4,236 were eligible for participation and 2,613 (62%) responded. Overall, the respondents covered the broad spectrum of MPN patients, but patients 70–79 years old, living with

  14. BMR in a Brazilian adult probability sample: the Nutrition, Physical Activity and Health Survey.

    Science.gov (United States)

    Anjos, Luiz A; Wahrlich, Vivian; Vasconcellos, Mauricio Tl

    2014-04-01

To measure BMR in a probability sample of adults from an urban city of Brazil and to compare indirectly measured BMR (BMRi) with BMR predicted from different equations. BMR data were obtained by indirect calorimetry and estimated by different predictive equations (Schofield; Harris and Benedict; Henry and Rees). Anthropometric and body composition measures were also obtained. The Nutrition, Physical Activity and Health Survey (PNAFS), a household survey conducted in Niterói, Rio de Janeiro state, Brazil. Representative sample of 529 adults (aged ≥20 years; 339 females) living in Niterói, Rio de Janeiro state, Brazil. Mean BMRi values were 5839.7 (se 73.9) kJ/d and 4758.1 (se 39.5) kJ/d for men and women, respectively. BMR predicted by all equations was significantly higher (difference between means and 95% CI did not include zero) than BMRi in both men and women of all ages. Overall bias in BMR (predicted BMR minus BMRi) using the Schofield equations (overestimation of about 20%) was higher than when using the Henry and Rees equations (13% and 16% overestimation for males and females, respectively). The percentage of individuals whose BMR predicted by the Schofield equations fell within 10% of BMRi was very low (7.8% and 14.1% of males and females, respectively). Current available predictive equations of BMR are not adequate to estimate BMR in Brazilians living in Niterói, Rio de Janeiro, Brazil.
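The bias metrics used above (predicted minus measured BMR, and the share of predictions falling within 10% of the measured value) can be sketched as follows. The numbers in the usage example are invented for illustration, not PNAFS data.

```python
def bmr_bias_stats(predicted, measured):
    """Overall bias of a BMR prediction equation against indirectly measured
    BMR (kJ/d): mean bias, percent bias, and the percentage of individuals
    whose predicted BMR falls within +/-10% of the measured value."""
    n = len(measured)
    diffs = [p - m for p, m in zip(predicted, measured)]
    mean_bias = sum(diffs) / n
    pct_bias = 100.0 * sum(diffs) / sum(measured)
    within_10 = 100.0 * sum(abs(d) <= 0.10 * m for d, m in zip(diffs, measured)) / n
    return mean_bias, pct_bias, within_10

# Toy cohort: an equation that systematically overestimates, as reported for Schofield.
measured = [4800.0, 5800.0, 5000.0]
predicted = [5760.0, 6960.0, 5500.0]
print(bmr_bias_stats(predicted, measured))
```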

  15. Southeast Region Headboat Survey-PPS Survey Design Project

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This dataset is a record of trips selected during pilot procedures for the PPS design project designed to track the port agents ability to follow the PPS design and...

  16. THE BOSS EMISSION-LINE LENS SURVEY (BELLS). I. A LARGE SPECTROSCOPICALLY SELECTED SAMPLE OF LENS GALAXIES AT REDSHIFT ~0.5

    Energy Technology Data Exchange (ETDEWEB)

    Brownstein, Joel R.; Bolton, Adam S.; Pandey, Parul [Department of Physics and Astronomy, University of Utah, Salt Lake City, UT 84112 (United States); Schlegel, David J. [Lawrence Berkeley National Laboratory, Berkeley, CA 94720 (United States); Eisenstein, Daniel J. [Harvard College Observatory, 60 Garden Street, MS 20, Cambridge, MA 02138 (United States); Kochanek, Christopher S. [Department of Astronomy and Center for Cosmology and Astroparticle Physics, Ohio State University, Columbus, OH 43210 (United States); Connolly, Natalia [Department of Physics, Hamilton College, Clinton, NY 13323 (United States); Maraston, Claudia [Institute of Cosmology and Gravitation, University of Portsmouth, Portsmouth PO1 3FX (United Kingdom); Seitz, Stella [University Observatory Munich, Scheinstrasse 1, 81679 Muenchen (Germany); Wake, David A. [Department of Astronomy, Yale University, New Haven, CT 06520 (United States); Wood-Vasey, W. Michael [Pittsburgh Center for Particle Physics, Astrophysics, and Cosmology (PITT-PACC), Department of Physics and Astronomy, University of Pittsburgh, Pittsburgh, PA 15260 (United States); Brinkmann, Jon [Apache Point Observatory, P.O. Box 59, Sunspot, NM 88349 (United States); Schneider, Donald P. [Department of Astronomy and Astrophysics and Institute for Gravitation and the Cosmos, Pennsylvania State University, University Park, PA 16802 (United States); Weaver, Benjamin A. [Center for Cosmology and Particle Physics, New York University, New York, NY 10003 (United States)

    2012-01-01

We present a catalog of 25 definite and 11 probable strong galaxy-galaxy gravitational lens systems with lens redshifts 0.4 ≲ z ≲ 0.7, discovered spectroscopically by the presence of higher-redshift emission lines within the Baryon Oscillation Spectroscopic Survey (BOSS) of luminous galaxies, and confirmed with high-resolution Hubble Space Telescope (HST) images of 44 candidates. Our survey extends the methodology of the Sloan Lens Advanced Camera for Surveys survey (SLACS) to higher redshift. We describe the details of the BOSS spectroscopic candidate detections, our HST ACS image processing and analysis methods, and our strong gravitational lens modeling procedure. We report BOSS spectroscopic parameters and ACS photometric parameters for all candidates, and mass-distribution parameters for the best-fit singular isothermal ellipsoid models of definite lenses. Our sample to date was selected using only the first six months of BOSS survey-quality spectroscopic data. The full five-year BOSS database should produce a sample of several hundred strong galaxy-galaxy lenses and, in combination with SLACS lenses at lower redshift, strongly constrain the redshift evolution of the structure of elliptical, bulge-dominated galaxies as a function of luminosity, stellar mass, and rest-frame color, thereby providing a powerful test for competing theories of galaxy formation and evolution.

  17. Stratification in Business and Agriculture Surveys with R

    Directory of Open Access Journals (Sweden)

    Marco Ballin

    2016-06-01

Full Text Available Usually sample surveys on enterprises and farms adopt a one-stage stratified sampling design. In practice the sampling frame is divided into non-overlapping strata and simple random sampling is carried out independently in each stratum. Stratification allows for reduction of the sampling error and permits the derivation of accurate estimates. Stratified sampling requires a number of closely related decisions: (i) how to stratify the population and how many strata to consider; (ii) the size of the whole sample and its partitioning among the strata (so-called allocation). This paper deals mainly with problem (i) and shows how to tackle it in the R environment using packages already available on CRAN.
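Decision (ii), allocating a fixed total sample across strata, is commonly solved with Neyman allocation; the CRAN packages the abstract refers to implement more elaborate versions. A minimal sketch with hypothetical strata (the stratum names and figures are invented):

```python
def neyman_allocation(strata, total_n):
    """Neyman allocation: sample size n_h in stratum h proportional to
    N_h * S_h (stratum size times stratum standard deviation), which
    minimizes the variance of the stratified mean for a fixed total n."""
    weights = {h: n_h * s_h for h, (n_h, s_h) in strata.items()}
    total_w = sum(weights.values())
    # round to integers, forcing at least one unit per stratum
    return {h: max(1, round(total_n * w / total_w)) for h, w in weights.items()}

# Hypothetical enterprise strata: (N_h = stratum size, S_h = std dev of turnover)
strata = {
    "small":  (5000, 10.0),
    "medium": (1500, 40.0),
    "large":  (500, 120.0),
}
print(neyman_allocation(strata, total_n=300))
```

Note how the small, variable "large" stratum receives far more than its proportional share of the sample, which is exactly the effect stratified designs exploit.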

  18. A Questionnaire Survey On Use Of The Internet By Students Of The ...

    African Journals Online (AJOL)

The study was carried out to examine use of the Internet by students of the University of Ibadan, Nigeria's premier higher educational institution. Adopting a sample survey research design, systematic sampling was used to select 560 students resident in the main campus hostels, and data was collected from the ...

  19. International survey of self-reported medicine use among adolescents

    DEFF Research Database (Denmark)

    Hansen, Ebba H; Holstein, Bjørn E; Due, Pernille

    2003-01-01

    OBJECTIVE: To examine gender, age, and country variations in adolescents' self-reported medicine use. DESIGN: Cross-sectional school surveys of representative samples of 11- to 15-year-old girls and boys were used. The 1997/1998 Health Behaviour in School-aged Children study was referenced. A sta...

  20. Contributions to sampling statistics

    CERN Document Server

    Conti, Pier; Ranalli, Maria

    2014-01-01

    This book contains a selection of the papers presented at the ITACOSM 2013 Conference, held in Milan in June 2013. ITACOSM is the bi-annual meeting of the Survey Sampling Group S2G of the Italian Statistical Society, intended as an international  forum of scientific discussion on the developments of theory and application of survey sampling methodologies and applications in human and natural sciences. The book gathers research papers carefully selected from both invited and contributed sessions of the conference. The whole book appears to be a relevant contribution to various key aspects of sampling methodology and techniques; it deals with some hot topics in sampling theory, such as calibration, quantile-regression and multiple frame surveys, and with innovative methodologies in important topics of both sampling theory and applications. Contributions cut across current sampling methodologies such as interval estimation for complex samples, randomized responses, bootstrap, weighting, modeling, imputati...

  1. Sampling Design of Soil Physical Properties in a Conilon Coffee Field

    Directory of Open Access Journals (Sweden)

    Eduardo Oliveira de Jesus Santos

Full Text Available ABSTRACT Establishing the number of samples required to determine values of soil physical properties ultimately results in optimization of labor and allows better representation of such attributes. The objective of this study was to analyze the spatial variability of soil physical properties in a Conilon coffee field and propose a soil sampling method better attuned to conditions of the management system. The experiment was performed in a Conilon coffee field in Espírito Santo state, Brazil, under a 3.0 × 2.0 × 1.0 m (4,000 plants ha-1) double-spacing design. An irregular grid, with dimensions of 107 × 95.7 m and 65 sampling points, was set up. Soil samples were collected from the 0.00-0.20 m depth at each sampling point. Data were analyzed using descriptive statistical and geostatistical methods. Using statistical parameters, the adequate number of samples for analyzing the attributes under study was established, which ranged from 1 to 11 sampling points. With the exception of particle density, all soil physical properties showed a spatial dependence structure best fitted to the spherical model. Establishment of the number of samples and spatial variability for the physical properties of soils may be useful in developing sampling strategies that minimize costs for farmers within a tolerable and predictable level of error.
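The required number of samples per attribute is commonly derived from its coefficient of variation with a Cochran-style formula, n = (t · CV / E)². The sketch below assumes that formula for illustration; the paper's exact statistical parameters are not reproduced here.

```python
import math

def samples_needed(cv_percent, error_percent, t_value=2.0):
    """Samples required to estimate the mean of a soil property to within
    error_percent of its value, given the property's coefficient of
    variation (CV, %) and a t value for the desired confidence level:
    n = (t * CV / E)^2, rounded up to the next whole sample."""
    return math.ceil((t_value * cv_percent / error_percent) ** 2)

# A property with CV = 15% estimated within 10% of the mean at ~95% confidence:
print(samples_needed(cv_percent=15.0, error_percent=10.0))
```

Low-variability attributes (small CV) thus need only one or two points, while highly variable ones drive the 1-to-11 range reported above.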

  2. African Primary Care Research: Performing surveys using questionnaires

    OpenAIRE

    Govender, Indiran; Mabuza, Langalibalele H.; Ogunbanjo, Gboyega A.; Mash, Bob

    2014-01-01

The aim of this article is to provide practical guidance on conducting surveys and the use of questionnaires for postgraduate students at a Masters level who are undertaking primary care research. The article is intended to assist with writing the methods section of the research proposal and thinking through the relevant issues that apply to sample size calculation, sampling strategy, design of a questionnaire and administration of a questionnaire. The article is part of a larger series on pri...

  3. Survey research.

    Science.gov (United States)

    Alderman, Amy K; Salem, Barbara

    2010-10-01

    Survey research is a unique methodology that can provide insight into individuals' perspectives and experiences and can be collected on a large population-based sample. Specifically, in plastic surgery, survey research can provide patients and providers with accurate and reproducible information to assist with medical decision-making. When using survey methods in research, researchers should develop a conceptual model that explains the relationships of the independent and dependent variables. The items of the survey are of primary importance. Collected data are only useful if they accurately measure the concepts of interest. In addition, administration of the survey must follow basic principles to ensure an adequate response rate and representation of the intended target sample. In this article, the authors review some general concepts important for successful survey research and discuss the many advantages this methodology has for obtaining limitless amounts of valuable information.

  4. Sample Loss and Survey Bias in Estimates of Social Security Beneficiaries: A Tale of Two Surveys.

    OpenAIRE

    John L. Czajka; James Mabli; Scott Cody

    2008-01-01

    Data from the Census Bureau’s Survey of Income and Program Participation (SIPP) and the Current Population Survey (CPS) provide information on current and potential beneficiaries served by Social Security Administration (SSA) programs. SSA also links administrative records to the records of survey respondents who provide Social Security numbers. These matched data expand the content of the SIPP and CPS files to fields available only through SSA and Internal Revenue Service records—such as l...

  5. Developing a weighting strategy to include mobile phone numbers into an ongoing population health survey using an overlapping dual-frame design with limited benchmark information.

    Science.gov (United States)

    Barr, Margo L; Ferguson, Raymond A; Hughes, Phil J; Steel, David G

    2014-09-04

    In 2012 mobile phone numbers were included into the ongoing New South Wales Population Health Survey (NSWPHS) using an overlapping dual-frame design. Previously in the NSWPHS the sample was selected using random digit dialing (RDD) of landline phone numbers. The survey was undertaken using computer assisted telephone interviewing (CATI). The weighting strategy needed to be significantly expanded to manage the differing probabilities of selection by frame, including that of children of mobile-only phone users, and to adjust for the increased chance of selection of dual-phone users. This paper describes the development of the final weighting strategy to properly combine the data from two overlapping sample frames accounting for the fact that population benchmarks for the different sampling frames were not available at the state or regional level. Estimates of the number of phone numbers for the landline and mobile phone frames used to calculate the differing probabilities of selection by frame, for New South Wales (NSW) and by stratum, were obtained by apportioning Australian estimates as none were available for NSW. The weighting strategy was then developed by calculating person selection probabilities, selection weights, applying a constant composite factor to the dual-phone users sample weights, and benchmarking to the latest NSW population by age group, sex and stratum. Data from the NSWPHS for the first quarter of 2012 was used to test the weighting strategy. This consisted of data on 3395 respondents with 2171 (64%) from the landline frame and 1224 (36%) from the mobile frame. However, in order to calculate the weights, data needed to be available for all core weighting variables and so 3378 respondents, 2933 adults and 445 children, had sufficient data to be included. Average person weights were 3.3 times higher for the mobile-only respondents, 1.3 times higher for the landline-only respondents and 1.7 times higher for dual-phone users in the mobile frame
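The core of such a weighting strategy, frame-specific selection weights plus a constant composite factor for dual-phone users, can be sketched as below. The selection probabilities and the 0.5 compositing factor are illustrative assumptions, not the NSWPHS values.

```python
def base_weight(phone_status, sampled_frame, p_landline, p_mobile, lam=0.5):
    """Design weight for an overlapping dual-frame telephone survey.
    Single-frame users get the inverse of their selection probability;
    dual-phone users, reachable through either frame, have their weight
    scaled by a constant composite factor lam (a Hartley-style estimator)
    so that the two frames together do not over-represent them."""
    if phone_status == "landline-only":
        return 1.0 / p_landline
    if phone_status == "mobile-only":
        return 1.0 / p_mobile
    if phone_status == "dual":
        return lam / p_landline if sampled_frame == "landline" else (1.0 - lam) / p_mobile
    raise ValueError(f"unknown phone status: {phone_status}")

# Illustrative selection probabilities (not the survey's actual figures):
print(base_weight("dual", "mobile", p_landline=0.001, p_mobile=0.002))
```

In the full strategy these base weights would then be benchmarked (post-stratified) to population totals by age group, sex and stratum, as the paper describes.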

  6. A Novel Simulation Technician Laboratory Design: Results of a Survey-Based Study.

    Science.gov (United States)

    Ahmed, Rami; Hughes, Patrick G; Friedl, Ed; Ortiz Figueroa, Fabiana; Cepeda Brito, Jose R; Frey, Jennifer; Birmingham, Lauren E; Atkinson, Steven Scott

    2016-03-16

OBJECTIVE: The purpose of this study was to elicit feedback from simulation technicians prior to developing the first simulation technician-specific simulation laboratory in Akron, OH. Simulation technicians serve a vital role in simulation centers within hospitals/health centers around the world. The first simulation technician degree program in the US has been approved in Akron, OH. To satisfy the requirements of this program and to meet the needs of this special audience of learners, a customized simulation lab is essential. A web-based survey was circulated to simulation technicians prior to completion of the lab for the new program. The survey consisted of questions aimed at identifying structural and functional design elements of a novel simulation center for the training of simulation technicians. Quantitative methods were utilized to analyze data. Over 90% of technicians (n=65) think that a lab designed explicitly for the training of technicians is novel and beneficial. Approximately 75% of respondents think that the space provided appropriate audiovisual (AV) infrastructure and space to evaluate the ability of technicians to be independent. The respondents think that the lab needed more storage space, visualization space for a large number of students, and more space in the technical/repair area. CONCLUSIONS: A space designed for the training of simulation technicians was considered to be beneficial. This laboratory requires distinct space for technical repair, adequate bench space for the maintenance and repair of simulators, an appropriate AV infrastructure, and space to evaluate the ability of technicians to be independent.

  7. A novel sampling design to explore gene-longevity associations

    DEFF Research Database (Denmark)

    De Rango, Francesco; Dato, Serena; Bellizzi, Dina

    2008-01-01

To investigate the genetic contribution to familial similarity in longevity, we set up a novel experimental design in which cousin-pairs born from siblings who were concordant or discordant for the longevity trait were analyzed. To check this design, two chromosomal regions already known to encompass... from concordant and discordant siblings. In addition, we analyzed haplotype transmission from centenarians to offspring, and a statistically significant Transmission Ratio Distortion (TRD) was observed for both chromosomal regions in the discordant families (P=0.007 for 6p21.3 and P=0.015 for 11p15.5). In concordant families, a marginally significant TRD was observed at 6p21.3 only (P=0.06). Although no significant difference emerged between the two groups of cousin-pairs, our study gave new insights on the hindrances to recruiting a suitable sample to obtain significant IBD data on longevity...

  8. Trajectory Design to Mitigate Risk on the Transiting Exoplanet Survey Satellite (TESS) Mission

    Science.gov (United States)

    Dichmann, Donald

    2016-01-01

    The Transiting Exoplanet Survey Satellite (TESS) will employ a highly eccentric Earth orbit, in 2:1 lunar resonance, reached with a lunar flyby preceded by 3.5 phasing loops. The TESS mission has limited propellant and several orbit constraints. Based on analysis and simulation, we have designed the phasing loops to reduce delta-V and to mitigate risk due to maneuver execution errors. We have automated the trajectory design process and use distributed processing to generate and to optimize nominal trajectories, check constraint satisfaction, and finally model the effects of maneuver errors to identify trajectories that best meet the mission requirements.

  9. "Is This Ethical?" A Survey of Opinion on Principles and Practices of Document Design.

    Science.gov (United States)

    Dragga, Sam

    1996-01-01

    Reprints a corrected version of an article originally published in the volume 43, number 1 issue of this journal. Presents results of a national survey of technical communicators and technical communication teachers assessing the ethics of seven document design cases involving manipulation of typography, illustrations, and photographs. Offers…

  10. Representative mass reduction in sampling

    DEFF Research Database (Denmark)

    Petersen, Lars; Esbensen, Harry Kim; Dahl, Casper Kierulf

    2004-01-01

    We present a comprehensive survey of current mass reduction principles and hardware available on the market. We conduct a rigorous comparison study of the performance of 17 field and/or laboratory instruments or methods which are quantitatively characterized (and ranked) for accuracy...... dividers, the Boerner Divider, the "spoon method", alternate/fractional shoveling and grab sampling. Only devices based on riffle splitting principles (static or rotational) pass the ultimate representativity test (with minor, but significant relative differences). Grab sampling, the overwhelmingly...... most often used mass reduction method, performs appallingly; its use must be discontinued (with the singular exception of completely homogenized fine powders). Only proper mass reduction (i.e. carried out in complete compliance with all appropriate design principles, maintenance and cleaning rules) can...

  11. Utility FGD Survey, January--December 1989. Volume 2, Design performance data for operating FGD systems, Part 1

    Energy Technology Data Exchange (ETDEWEB)

    Hance, S.L.; McKibben, R.S.; Jones, F.M. [IT Corp., Cincinnati, OH (United States)

    1992-03-01

    The Utility flue gas desulfurization (FGD) Survey report, which is generated by a computerized data base management system, represents a survey of operational and planned domestic utility flue gas desulfurization (FGD) systems. It summarizes information contributed by the utility industry, system and equipment suppliers, system designers, research organizations, and regulatory agencies. The data cover system design, fuel characteristics, operating history, and actual system performance. Also included is a unit-by-unit discussion of problems and solutions associated with the boilers, scrubbers, and FGD systems. The development status (operational, under construction, or in the planning stages), system supplier, process, waste disposal practice, and regulatory class are tabulated alphabetically by utility company.

  12. Engagement with HIV prevention treatment and care among female sex workers in Zimbabwe: a respondent driven sampling survey.

    Science.gov (United States)

    Cowan, Frances M; Mtetwa, Sibongile; Davey, Calum; Fearon, Elizabeth; Dirawo, Jeffrey; Wong-Gruenwald, Ramona; Ndikudze, Theresa; Chidiya, Samson; Benedikt, Clemens; Busza, Joanna; Hargreaves, James R

    2013-01-01

    To determine the HIV prevalence and extent of engagement with HIV prevention and care among a representative sample of Zimbabwean sex workers working in Victoria Falls, Hwange and Mutare. Respondent driven sampling (RDS) surveys conducted at each site. Sex workers were recruited using respondent driven sampling with each respondent limited to recruiting 2 peers. Participants completed an interviewer-administered questionnaire and provided a finger prick blood sample for HIV antibody testing. Statistical analysis took account of sampling method. 870 women were recruited from the three sites. HIV prevalence was between 50 and 70%. Around half of those confirmed HIV positive were aware of their HIV status and of those 50-70% reported being enrolled in HIV care programmes. Overall only 25-35% of those with laboratory-confirmed HIV were accessing antiretroviral therapy. Among those reporting they were HIV negative, 21-28% reported having an HIV test in the last 6 months. Of those tested HIV negative, most (65-82%) were unaware of their status. Around two-thirds of sex workers reported consistent condom use with their clients. As in other settings, sex workers reported high rates of gender based violence and police harassment. This survey suggests that prevalence of HIV is high among sex workers in Zimbabwe and that their engagement with prevention, treatment and care is sub-optimal. Intensifying prevention and care interventions for sex workers has the potential to markedly reduce HIV and social risks for sex workers, their clients and the general population in Zimbabwe and elsewhere in the region.

  13. Assessment of long-term gas sampling design at two commercial manure-belt layer barns.

    Science.gov (United States)

    Chai, Li-Long; Ni, Ji-Qin; Chen, Yan; Diehl, Claude A; Heber, Albert J; Lim, Teng T

    2010-06-01

    Understanding temporal and spatial variations of aerial pollutant concentrations is important for designing air quality monitoring systems. In long-term and continuous air quality monitoring in large livestock and poultry barns, these systems usually use location-shared analyzers and sensors and can only sample air at a limited number of locations. To assess the validity of the gas sampling design at a commercial layer farm, a new methodology was developed to map pollutant gas concentrations using portable sensors under steady-state or quasi-steady-state barn conditions. Three assessment tests were conducted from December 2008 to February 2009 in two manure-belt layer barns. Each barn was 140.2 m long and 19.5 m wide and had 250,000 birds. Each test included four measurements of ammonia and carbon dioxide concentrations at 20 locations that covered all operating fans, including six of the fans used in the long-term sampling that represented three zones along the lengths of the barns, to generate data for complete-barn monitoring. To simulate the long-term monitoring, gas concentrations from the six long-term sampling locations were extracted from the 20 assessment locations. Statistical analyses were performed to test the variances (F-test) and sample means (t-test) between the 6- and 20-sample data. The study clearly demonstrated ammonia and carbon dioxide concentration gradients that were characterized by increasing concentrations from the west to east ends of the barns following the under-cage manure-belt travel direction. Mean concentrations increased from 7.1 to 47.7 parts per million (ppm) for ammonia and from 2303 to 3454 ppm for carbon dioxide from the west to east of the barns. Variations of mean gas concentrations were much less apparent between the south and north sides of the barns, because they were 21.2 and 20.9 ppm for ammonia and 2979 and 2951 ppm for carbon dioxide, respectively.
The null hypotheses that the variances and means between the 6- and 20
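The mean comparison described above can be sketched with a Welch two-sample t statistic between a 6-location subset and a full 20-location data set. This is a minimal illustration only; the ammonia concentrations (ppm) below are hypothetical stand-ins, not measurements from the study.

```python
import math

def welch_t(a, b):
    """Welch two-sample t statistic (unequal variances)."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    va = sum((x - ma) ** 2 for x in a) / (len(a) - 1)  # sample variances
    vb = sum((x - mb) ** 2 for x in b) / (len(b) - 1)
    return (ma - mb) / math.sqrt(va / len(a) + vb / len(b))

# hypothetical ammonia concentrations (ppm) at 6 vs. 20 sampling locations
six = [7.5, 14.2, 21.0, 27.8, 35.1, 46.9]
twenty = [7.1, 8.0, 10.5, 12.3, 14.8, 17.2, 19.6, 21.2, 23.4, 25.1,
          27.0, 29.3, 31.5, 33.8, 35.9, 38.2, 40.6, 42.8, 45.1, 47.7]
print(round(welch_t(six, twenty), 3))
```

A small |t| (relative to the relevant t distribution) would support the conclusion that the 6-location subsample tracks the complete-barn means.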

  14. Sampling design considerations for demographic studies: a case of colonial seabirds

    Science.gov (United States)

    Kendall, William L.; Converse, Sarah J.; Doherty, Paul F.; Naughton, Maura B.; Anders, Angela; Hines, James E.; Flint, Elizabeth

    2009-01-01

    For the purposes of making many informed conservation decisions, the main goal for data collection is to assess population status and allow prediction of the consequences of candidate management actions. Reducing the bias and variance of estimates of population parameters reduces uncertainty in population status and projections, thereby reducing the overall uncertainty under which a population manager must make a decision. In capture-recapture studies, imperfect detection of individuals, unobservable life-history states, local movement outside study areas, and tag loss can cause bias or precision problems with estimates of population parameters. Furthermore, excessive disturbance to individuals during capture?recapture sampling may be of concern because disturbance may have demographic consequences. We address these problems using as an example a monitoring program for Black-footed Albatross (Phoebastria nigripes) and Laysan Albatross (Phoebastria immutabilis) nesting populations in the northwestern Hawaiian Islands. To mitigate these estimation problems, we describe a synergistic combination of sampling design and modeling approaches. Solutions include multiple capture periods per season and multistate, robust design statistical models, dead recoveries and incidental observations, telemetry and data loggers, buffer areas around study plots to neutralize the effect of local movements outside study plots, and double banding and statistical models that account for band loss. We also present a variation on the robust capture?recapture design and a corresponding statistical model that minimizes disturbance to individuals. For the albatross case study, this less invasive robust design was more time efficient and, when used in combination with a traditional robust design, reduced the standard error of detection probability by 14% with only two hours of additional effort in the field. 
These field techniques and associated modeling approaches are applicable to studies of

  15. SamplingStrata: An R Package for the Optimization of Stratified Sampling

    Directory of Open Access Journals (Sweden)

    Giulio Barcaroli

    2014-11-01

    Full Text Available When designing a sampling survey, constraints are usually set on the desired precision levels regarding one or more target estimates (the Ys). If a sampling frame is available, containing auxiliary information related to each unit (the Xs), it is possible to adopt a stratified sample design. For any given stratification of the frame, in the multivariate case it is possible to solve the problem of the best allocation of units in strata, by minimizing a cost function subject to precision constraints (or, conversely, by maximizing the precision of the estimates under a given budget). The problem is to determine the best stratification in the frame, i.e., the one that ensures the overall minimal cost of the sample necessary to satisfy precision constraints. The Xs can be categorical or continuous; continuous ones can be transformed into categorical ones. The most detailed stratification is given by the Cartesian product of the Xs (the atomic strata). A way to determine the best stratification is to explore exhaustively the set of all possible partitions derivable from the set of atomic strata, evaluating each one by calculating the corresponding cost in terms of the sample required to satisfy precision constraints. This is unaffordable in practical situations, where the dimension of the space of the partitions can be very high. Another possible way is to explore the space of partitions with an algorithm that is particularly suitable in such situations: the genetic algorithm. The R package SamplingStrata, based on the use of a genetic algorithm, makes it possible to determine the best stratification for a population frame, i.e., the one that ensures the minimum sample cost necessary to satisfy precision constraints, in a multivariate and multi-domain case.
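The allocation sub-problem that packages such as SamplingStrata solve repeatedly inside the genetic search can be illustrated with classical Neyman allocation, which for a fixed total sample size n minimizes the variance of the estimated mean by allocating proportionally to N_h * S_h. This is a minimal sketch, not the package's implementation; the stratum sizes and standard deviations are hypothetical.

```python
def neyman_allocation(N_h, S_h, n):
    """Allocate n sample units across strata proportionally to N_h * S_h."""
    weights = [N * S for N, S in zip(N_h, S_h)]
    total = sum(weights)
    return [round(n * w / total) for w in weights]

N_h = [5000, 3000, 2000]   # stratum population sizes (hypothetical)
S_h = [10.0, 40.0, 5.0]    # stratum standard deviations (hypothetical)
print(neyman_allocation(N_h, S_h, 100))  # → [28, 67, 6]
```

Note that rounding can make the allocation sum to slightly more or less than n (here 101); production code repairs this, and cost-constrained designs replace the fixed n with a search over it.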

  16. When is a species declining? Optimizing survey effort to detect population changes in reptiles.

    Directory of Open Access Journals (Sweden)

    David Sewell

    Full Text Available Biodiversity monitoring programs need to be designed so that population changes can be detected reliably. This can be problematic for species that are cryptic and have imperfect detection. We used occupancy modeling and power analysis to optimize the survey design for reptile monitoring programs in the UK. Surveys were carried out six times a year in 2009-2010 at multiple sites. Four of the six species--grass snake, adder, common lizard, and slow-worm--were encountered during every survey from March-September. The exceptions were the two rarest species--sand lizard and smooth snake--which were not encountered in July 2009 and March 2010 respectively. The most frequently encountered and most easily detected species was the slow-worm. For the four widespread reptile species in the UK, three to four survey visits that used a combination of directed transect walks and artificial cover objects resulted in 95% certainty that a species would be detected if present. Using artificial cover objects was an effective detection method for most species, considerably increased the detection rate of some, and reduced misidentifications. To achieve an 85% power to detect a decline in any of the four widespread species when the true decline is 15%, three surveys at a total of 886 sampling sites, or four surveys at a total of 688 sites would be required. The sampling effort needed reduces to 212 sites surveyed three times, or 167 sites surveyed four times, if the target is to detect a true decline of 30% with the same power. The results obtained can be used to refine reptile survey protocols in the UK and elsewhere. On a wider scale, the occupancy study design approach can be used to optimize survey effort and help set targets for conservation outcomes for regional or national biodiversity assessments.
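The "number of visits for 95% certainty" figure rests on a simple cumulative-detection calculation: if a present species is detected independently on each visit with probability p, then k visits detect it with probability 1 - (1 - p)^k. The sketch below shows this back-of-envelope version only; the per-visit detection probability is hypothetical, not an estimate from the study.

```python
import math

def visits_needed(p, target=0.95):
    """Smallest k such that 1 - (1 - p)**k >= target."""
    return math.ceil(math.log(1 - target) / math.log(1 - p))

print(visits_needed(0.6))  # → 4 (1 - 0.4**3 = 0.936 < 0.95, but 1 - 0.4**4 = 0.974)
```

Occupancy models estimate p jointly with site occupancy, but the same inequality governs how many repeat visits a design needs.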

  17. Designing Surveys for Language Programs.

    Science.gov (United States)

    Brown, James Dean

    A discussion of survey methodology for investigating second language programs and instruction examines two methods: oral interviews and written questionnaires. Each method is defined, and variations are explored. For interviews, this includes individual, group, and telephone interviews. For questionnaires, this includes self-administered and…

  18. Methodology of the fasting sub-sample from the Mexican Health Survey, 2000 Metodología de la submuestra de suero de la Encuesta Nacional de Salud 2000

    OpenAIRE

    Simón Barquera; Citlalli Carrión; Ismael Campos; Juan Espinosa; Juan Rivera; Gustavo Olaiz-Fernández

    2007-01-01

    OBJECTIVE: To report the comparative results of the sub-sample of fasting adults selected for the biochemical measurement of cardiovascular risk factors and the rest of the Mexican Health Survey (MHS) (2000) participants. MATERIAL AND METHODS: The nationally representative, cross-sectional Mexican Health Survey (2000) was analyzed. Survey participants reporting a fasting state period of 9- to 12-h were included in a sub-sample (n= 2 535) and compared with all other participants (n= 41 126). P...

  19. SDSS-II SUPERNOVA SURVEY: AN ANALYSIS OF THE LARGEST SAMPLE OF TYPE IA SUPERNOVAE AND CORRELATIONS WITH HOST-GALAXY SPECTRAL PROPERTIES

    International Nuclear Information System (INIS)

    Wolf, Rachel C.; Gupta, Ravi R.; Sako, Masao; Fischer, John A.; March, Marisa C.; Fischer, Johanna-Laina; D’Andrea, Chris B.; Smith, Mathew; Kessler, Rick; Scolnic, Daniel M.; Jha, Saurabh W.; Campbell, Heather; Nichol, Robert C.; Olmstead, Matthew D.; Richmond, Michael; Schneider, Donald P.

    2016-01-01

    Using the largest single-survey sample of Type Ia supernovae (SNe Ia) to date, we study the relationship between properties of SNe Ia and those of their host galaxies, focusing primarily on correlations with Hubble residuals (HRs). Our sample consists of 345 photometrically classified or spectroscopically confirmed SNe Ia discovered as part of the SDSS-II Supernova Survey (SDSS-SNS). This analysis utilizes host-galaxy spectroscopy obtained during the SDSS-I/II spectroscopic survey and from an ancillary program on the SDSS-III Baryon Oscillation Spectroscopic Survey that obtained spectra for nearly all host galaxies of SDSS-II SN candidates. In addition, we use photometric host-galaxy properties from the SDSS-SNS data release such as host stellar mass and star formation rate. We confirm the well-known relation between HR and host-galaxy mass and find a 3.6 σ significance of a nonzero linear slope. We also recover correlations between HR and host-galaxy gas-phase metallicity and specific star formation rate as they are reported in the literature. With our large data set, we examine correlations between HR and multiple host-galaxy properties simultaneously and find no evidence of a significant correlation. We also independently analyze our spectroscopically confirmed and photometrically classified SNe Ia and comment on the significance of similar combined data sets for future surveys.

  20. Classifier-guided sampling for discrete variable, discontinuous design space exploration: Convergence and computational performance

    Energy Technology Data Exchange (ETDEWEB)

    Backlund, Peter B. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Shahan, David W. [HRL Labs., LLC, Malibu, CA (United States); Seepersad, Carolyn Conner [Univ. of Texas, Austin, TX (United States)

    2014-04-22

    A classifier-guided sampling (CGS) method is introduced for solving engineering design optimization problems with discrete and/or continuous variables and continuous and/or discontinuous responses. The method merges concepts from metamodel-guided sampling and population-based optimization algorithms. The CGS method uses a Bayesian network classifier for predicting the performance of new designs based on a set of known observations or training points. Unlike most metamodeling techniques, however, the classifier assigns a categorical class label to a new design, rather than predicting the resulting response in continuous space, and thereby accommodates nondifferentiable and discontinuous functions of discrete or categorical variables. The CGS method uses these classifiers to guide a population-based sampling process towards combinations of discrete and/or continuous variable values with a high probability of yielding preferred performance. Accordingly, the CGS method is appropriate for discrete/discontinuous design problems that are ill-suited for conventional metamodeling techniques and too computationally expensive to be solved by population-based algorithms alone. In addition, the rates of convergence and computational properties of the CGS method are investigated when applied to a set of discrete variable optimization problems. Results show that the CGS method significantly improves the rate of convergence towards known global optima, on average, when compared to genetic algorithms.
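The loop structure of classifier-guided sampling can be sketched in a few lines: a cheap classifier screens randomly generated candidate designs, and only "promising" ones are evaluated with the expensive objective. This is a loose illustration under stated substitutions: the paper uses a Bayesian network classifier, whereas a 1-nearest-neighbour rule stands in for it here, and the discrete quadratic objective is a toy problem.

```python
import random

def objective(x):
    # hypothetical expensive response; global optimum at (3, -2)
    return (x[0] - 3) ** 2 + (x[1] + 2) ** 2

def predict_good(x, data):
    # 1-NN surrogate classifier: reuse the label of the nearest known design
    nearest = min(data, key=lambda d: (d[0][0] - x[0]) ** 2 + (d[0][1] - x[1]) ** 2)
    return nearest[1]

random.seed(0)
grid = range(-10, 11)

# initial training set: random designs labelled good/bad against the median response
points = [(random.choice(grid), random.choice(grid)) for _ in range(20)]
median = sorted(objective(p) for p in points)[len(points) // 2]
data = [(p, objective(p) <= median) for p in points]
best = min(objective(p) for p in points)

for _ in range(100):  # guided sampling: evaluate only predicted-good candidates
    cand = (random.choice(grid), random.choice(grid))
    if predict_good(cand, data):
        y = objective(cand)
        data.append((cand, y <= median))
        best = min(best, y)

print(best)
```

The saving comes from the screening step: rejected candidates cost only a classifier query, not an objective evaluation, which is the trade-off the abstract describes for problems too expensive for population-based algorithms alone.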

  1. A survey of attitudes and factors associated with successful cardiopulmonary resuscitation (CPR) knowledge transfer in an older population most likely to witness cardiac arrest: design and methodology

    Directory of Open Access Journals (Sweden)

    Brehaut Jamie C

    2008-11-01

    Full Text Available Abstract Background Overall survival rates for out-of-hospital cardiac arrest rarely exceed 5%. While bystander cardiopulmonary resuscitation (CPR) can increase survival for cardiac arrest victims by up to four times, bystander CPR rates remain low in Canada (15%). Most cardiac arrest victims are men in their sixties; they usually collapse in their own home (85%) and the event is witnessed 50% of the time. These statistics would appear to support a strategy of targeted CPR training for an older population that is most likely to witness a cardiac arrest event. However, interest in CPR training appears to decrease with advancing age. Behaviour surrounding CPR training and performance has never been studied using well validated behavioural theories. Methods/Design The overall goal of this study is to conduct a survey to better understand the behavioural factors influencing CPR training and performance in men and women 55 years of age and older. The study will proceed in three phases. In phase one, semi-structured qualitative interviews will be conducted and recorded to identify common categories and themes regarding seeking CPR training and providing CPR to a cardiac arrest victim. The themes identified in the first phase will be used in phase two to develop, pilot-test, and refine a survey instrument based upon the Theory of Planned Behaviour. In the third phase of the project, the final survey will be administered to a sample of the study population over the telephone. Analyses will include measures of sampling bias, reliability of the measures, construct validity, as well as multiple regression analyses to identify constructs and beliefs most salient to seniors' decisions about whether to attend CPR classes or perform CPR on a cardiac arrest victim. Discussion The results of this survey will provide valuable insight into factors influencing the interest in CPR training and performance among a targeted group of individuals most susceptible to

  2. Requirements and concept design for large earth survey telescope for SEOS

    Science.gov (United States)

    Mailhot, P.; Bisbee, J.

    1975-01-01

    The efforts of a one-year program of Requirements Analysis and Conceptual Design for the Large Earth Survey Telescope for the Synchronous Earth Observatory Satellite are summarized. A 1.4 meter aperture Cassegrain telescope with 0.6 deg field of view is shown to do an excellent job of satisfying the observational requirements for a wide range of earth resources and meteorological applications. The telescope provides imagery or thermal mapping in ten spectral bands at one time in a field-sharing grouping of linear detector arrays. Pushbroom scanning is accomplished by spacecraft slew.

  3. Designing a two-rank acceptance sampling plan for quality inspection of geospatial data products

    Science.gov (United States)

    Tong, Xiaohua; Wang, Zhenhua; Xie, Huan; Liang, Dan; Jiang, Zuoqin; Li, Jinchao; Li, Jun

    2011-10-01

    To address the disadvantages of classical sampling plans designed for traditional industrial products, we propose a novel two-rank acceptance sampling plan (TRASP) for the inspection of geospatial data outputs based on the acceptance quality level (AQL). The first rank sampling plan is to inspect the lot consisting of map sheets, and the second is to inspect the lot consisting of features in an individual map sheet. The TRASP design is formulated as an optimization problem with respect to sample size and acceptance number, which covers two lot size cases. The first case is for a small lot size with nonconformities being modeled by a hypergeometric distribution function, and the second is for a larger lot size with nonconformities being modeled by a Poisson distribution function. The proposed TRASP is illustrated through two empirical case studies. Our analysis demonstrates that: (1) the proposed TRASP provides a general approach for quality inspection of geospatial data outputs consisting of non-uniform items and (2) the proposed acceptance sampling plan based on TRASP performs better than other classical sampling plans. It overcomes the drawbacks of percent sampling, i.e., "strictness for large lot size, toleration for small lot size," and those of a national standard used specifically for industrial outputs, i.e., "lots with different sizes corresponding to the same sampling plan."
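The large-lot case described above, with nonconformities modeled by a Poisson distribution, can be illustrated by the standard search for the smallest single sampling plan (n, c) that bounds both the producer's risk at the AQL and the consumer's risk at a limiting quality level. This sketch is not the TRASP optimization itself, and the risk and quality levels below are hypothetical.

```python
import math

def poisson_cdf(c, mu):
    """P(X <= c) for X ~ Poisson(mu)."""
    return sum(math.exp(-mu) * mu ** k / math.factorial(k) for k in range(c + 1))

def find_plan(aql, lq, alpha=0.05, beta=0.10, max_n=2000):
    """Smallest n (with its acceptance number c) such that
    P(accept | aql) >= 1 - alpha and P(accept | lq) <= beta."""
    for n in range(1, max_n + 1):
        for c in range(0, min(n, 20) + 1):  # c stays small in practice
            if (poisson_cdf(c, n * aql) >= 1 - alpha
                    and poisson_cdf(c, n * lq) <= beta):
                return n, c
    return None

print(find_plan(0.01, 0.05))  # → (134, 3)
```

With AQL = 1% and limiting quality 5%, the search returns n = 134 map features with acceptance number c = 3; tightening either risk pushes n up quickly, which is why lot-size-aware plans matter.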

  4. Injury survey of a non-traditional 'soft-edged' trampoline designed to lower equipment hazards.

    Science.gov (United States)

    Eager, David B; Scarrott, Carl; Nixon, Jim; Alexander, Keith

    2013-01-01

    In Australia trampolines contribute one quarter of all childhood play equipment injuries. The objective of this study was to gather and evaluate injury data from a non-traditional, 'soft-edged', consumer trampoline, where the design aimed to minimise injuries from the equipment and from falling off. The manufacturer of the non-traditional trampoline provided the University of Technology Sydney with their Australian customer database. The study involved surveys in Queensland and New South Wales, between May 2007 and March 2010. Initially injury data was gathered by a phone interview pilot study, then in the full study, through an email survey. The 3817 respondents were the carers of child users of the 'soft-edge' trampolines. Responses were compared with Australian and US emergency department data. In both countries the proportion of injuries caused by the equipment and falling off was compared with the proportion caused by the jumpers to themselves or each other. The comparisons showed a significantly lower proportion resulted from falling-off or hitting the equipment for this design when compared to traditional trampolines, both in Australia and the US. This research concludes that equipment-induced and falling-off injuries, the more severe injuries on traditional trampolines, can be significantly reduced with appropriate trampoline design.

  5. Japanese structure survey of radiation oncology in 2007 with special reference to designated cancer care hospitals

    International Nuclear Information System (INIS)

    Numasaki, Hodaka; Shibuya, Hitoshi; Nishio, Masamichi

    2011-01-01

    Background and Purpose: The structure of radiation oncology in designated cancer care hospitals in Japan was investigated in terms of equipment, personnel, patient load, and geographic distribution. The effect of changes in the health care policy in Japan on radiotherapy structure was also examined. Material and Methods: The Japanese Society of Therapeutic Radiology and Oncology surveyed the national structure of radiation oncology in 2007. The structures of 349 designated cancer care hospitals and 372 other radiotherapy facilities were compared. Results: Respective findings for equipment and personnel at designated cancer care hospitals and other facilities included the following: linear accelerators/facility: 1.3 and 1.0; annual patients/linear accelerator: 296.5 and 175.0; and annual patient load/full-time equivalent radiation oncologist was 237.0 and 273.3, respectively. Geographically, the number of designated cancer care hospitals was associated with population size. Conclusion: The structure of radiation oncology in Japan in terms of equipment, especially for designated cancer care hospitals, was as mature as that in European countries and the United States, even though the medical costs in relation to GDP in Japan are lower. There is still a shortage of manpower. The survey data proved to be important to fully understand the radiation oncology medical care system in Japan. (orig.)

  6. Fast Bayesian experimental design: Laplace-based importance sampling for the expected information gain

    KAUST Repository

    Beck, Joakim

    2018-02-19

    In calculating expected information gain in optimal Bayesian experimental design, the computation of the inner loop in the classical double-loop Monte Carlo requires a large number of samples and suffers from underflow if the number of samples is small. These drawbacks can be avoided by using an importance sampling approach. We present a computationally efficient method for optimal Bayesian experimental design that introduces importance sampling based on the Laplace method to the inner loop. We derive the optimal values for the method parameters in which the average computational cost is minimized for a specified error tolerance. We use three numerical examples to demonstrate the computational efficiency of our method compared with the classical double-loop Monte Carlo, and a single-loop Monte Carlo method that uses the Laplace approximation of the return value of the inner loop. The first demonstration example is a scalar problem that is linear in the uncertain parameter. The second example is a nonlinear scalar problem. The third example deals with the optimal sensor placement for an electrical impedance tomography experiment to recover the fiber orientation in laminate composites.
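The classical double-loop Monte Carlo estimator that this method improves on can be written compactly for a toy linear-Gaussian model (theta ~ N(0,1), y = theta + noise), for which the exact expected information gain is 0.5*log(1 + 1/sigma^2). This is a sketch of the baseline only, not of the Laplace-based importance sampling; the model and sample sizes are illustrative.

```python
import math
import random

def likelihood(y, theta, sigma):
    """Gaussian likelihood p(y | theta) with noise std sigma."""
    return math.exp(-((y - theta) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def dlmc_eig(sigma, n_outer=500, n_inner=500, seed=1):
    """Double-loop Monte Carlo estimate of expected information gain."""
    random.seed(seed)
    total = 0.0
    for _ in range(n_outer):
        theta = random.gauss(0, 1)              # outer loop: draw from the prior
        y = theta + random.gauss(0, sigma)      # simulate an observation
        # inner loop: Monte Carlo estimate of the evidence p(y);
        # with few inner samples this average can underflow for tiny likelihoods
        evidence = sum(likelihood(y, random.gauss(0, 1), sigma)
                       for _ in range(n_inner)) / n_inner
        total += math.log(likelihood(y, theta, sigma)) - math.log(evidence)
    return total / n_outer

sigma = 1.0
print(dlmc_eig(sigma))                    # Monte Carlo estimate
print(0.5 * math.log(1 + 1 / sigma**2))  # exact value, about 0.3466
```

The cost is n_outer * n_inner likelihood evaluations; the paper's contribution is replacing the naive inner average with a Laplace-based importance sampler so that far fewer inner samples suffice.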

  7. Fast Bayesian experimental design: Laplace-based importance sampling for the expected information gain

    Science.gov (United States)

    Beck, Joakim; Dia, Ben Mansour; Espath, Luis F. R.; Long, Quan; Tempone, Raúl

    2018-06-01

    In calculating expected information gain in optimal Bayesian experimental design, the computation of the inner loop in the classical double-loop Monte Carlo requires a large number of samples and suffers from underflow if the number of samples is small. These drawbacks can be avoided by using an importance sampling approach. We present a computationally efficient method for optimal Bayesian experimental design that introduces importance sampling based on the Laplace method to the inner loop. We derive the optimal values for the method parameters in which the average computational cost is minimized according to the desired error tolerance. We use three numerical examples to demonstrate the computational efficiency of our method compared with the classical double-loop Monte Carlo, and a more recent single-loop Monte Carlo method that uses the Laplace method as an approximation of the return value of the inner loop. The first example is a scalar problem that is linear in the uncertain parameter. The second example is a nonlinear scalar problem. The third example deals with the optimal sensor placement for an electrical impedance tomography experiment to recover the fiber orientation in laminate composites.

  8. Infall and outflow motions towards a sample of massive star-forming regions from the RMS survey

    Science.gov (United States)

    Cunningham, N.; Lumsden, S. L.; Moore, T. J. T.; Maud, L. T.; Mendigutía, I.

    2018-06-01

    We present the results of an outflow and infall survey towards a distance-limited sample of 31 massive star-forming regions drawn from the Red MSX source (RMS) survey. The presence of young, active outflows is identified from SiO (8-7) emission and the infall dynamics are explored using HCO+/H13CO+ (4-3) emission. We investigate if the infall and outflow parameters vary with source properties, exploring whether regions hosting potentially young active outflows show similarities or differences with regions harbouring more evolved, possibly momentum-driven, `fossil' outflows. SiO emission is detected towards approximately 46 per cent of the sources. When considering sources with and without an SiO detection (i.e. potentially active and fossil outflows, respectively), only the 12CO outflow velocity shows a significant difference between samples, indicating SiO is more prevalent towards sources with higher outflow velocities. Furthermore, we find the SiO luminosity increases as a function of the Herschel 70 μm to WISE 22 μm flux ratio, suggesting the production of SiO is prevalent in younger, more embedded regions. Similarly, we find tentative evidence that sources with an SiO detection have a smaller bolometric luminosity-to-mass ratio, indicating SiO (8-7) emission is associated with potentially younger regions. We do not find a prevalence towards sources displaying signatures of infall in our sample. However, the higher energy HCO+ transitions may not be the best suited tracer of infall at this spatial resolution in these regions.

  9. Single-subject withdrawal designs in delayed matching-to-sample procedures

    OpenAIRE

    Eilifsen, Christoffer; Arntzen, Erik

    2011-01-01

    In most studies of delayed matching-to-sample (DMTS) and stimulus equivalence, the delay has remained fixed throughout a single experimental condition. We wanted to expand on the DMTS and stimulus equivalence literature by examining the effects of using titrating delays with different starting points during the establishment of conditional discriminations prerequisite for stimulus equivalence. In Experiment 1, a variation of a single-subject withdrawal design was used. Ten adults were exposed...

  10. The Unique Optical Design of the CTI-II Survey Telescope

    Science.gov (United States)

    Ackermann, Mark R.; McGraw, J. T.; MacFarlane, M.

    2006-12-01

    The CCD/Transit Instrument with Innovative Instrumentation (CTI-II) is being developed for precision ground-based astrometric and photometric astronomical observations. The 1.8m telescope will be stationary, near-zenith pointing and will feature a CCD-mosaic array operated in time-delay and integrate (TDI) mode to image a continuous strip of the sky in five bands. The heart of the telescope is a Nasmyth-like bent-Cassegrain optical system optimized to produce near diffraction-limited images with near zero distortion over a circular 1.42 deg field. The optical design includes an f/2.2 parabolic ULE primary with no central hole salvaged from the original CTI telescope and adds the requisite hyperbolic secondary, a folding flat and a highly innovative all-spherical, five lens corrector which includes three plano surfaces. The reflective and refractive portions of the design have been optimized as individual but interdependent systems so that the same reflective system can be used with slightly different refractive correctors. At present, two nearly identical corrector designs are being evaluated, one fabricated from BK-7 glass and the other of fused silica. The five lens corrector consists of an air-spaced triplet separated from a follow-on air-spaced doublet. Either design produces 0.25 arcsecond images at 83% encircled energy with a maximum of 0.0005% distortion. The innovative five lens corrector design has been applied to other current and planned Cassegrain, RC and super RC optical systems requiring correctors. The basic five lens approach always results in improved performance compared to the original designs. In some cases, the improvement in image quality is small but includes substantial reductions in distortion. In other cases, the improvement in image quality is substantial. Because the CTI-II corrector is designed for a parabolic primary, it might be especially useful for liquid mirror telescopes.
We describe and discuss the CTI-II optical design with respect

  11. Designing efficient nitrous oxide sampling strategies in agroecosystems using simulation models

    Science.gov (United States)

    Debasish Saha; Armen R. Kemanian; Benjamin M. Rau; Paul R. Adler; Felipe Montes

    2017-01-01

    Annual cumulative soil nitrous oxide (N2O) emissions calculated from discrete chamber-based flux measurements have unknown uncertainty. We used outputs from simulations obtained with an agroecosystem model to design sampling strategies that yield accurate cumulative N2O flux estimates with a known uncertainty level. Daily soil N2O fluxes were simulated for Ames, IA (...

  12. An Alternative View of Some FIA Sample Design and Analysis Issues

    Science.gov (United States)

    Paul C. Van Deusen

    2005-01-01

    Sample design and analysis decisions are the result of compromises and inputs from many sources. The end result would likely change if different individuals or groups were involved in the planning process. Discussed here are some alternatives to the procedures that are currently being used for the annual inventory. The purpose is to indicate that alternatives exist and...

  13. Dimensions of design space: a decision-theoretic approach to optimal research design.

    Science.gov (United States)

    Conti, Stefano; Claxton, Karl

    2009-01-01

    Bayesian decision theory can be used not only to establish the optimal sample size and its allocation in a single clinical study but also to identify an optimal portfolio of research combining different types of study design. Within a single study, the highest societal payoff to proposed research is achieved when its sample sizes and allocation between available treatment options are chosen to maximize the expected net benefit of sampling (ENBS). Where a number of different types of study informing different parameters in the decision problem could be conducted, the simultaneous estimation of ENBS across all dimensions of the design space is required to identify the optimal sample sizes and allocations within such a research portfolio. This is illustrated through a simple example of a decision model of zanamivir for the treatment of influenza. The possible study designs include: 1) a single trial of all the parameters, 2) a clinical trial providing evidence only on clinical endpoints, 3) an epidemiological study of the natural history of disease, and 4) a survey of quality of life. The possible combinations, sample sizes, and allocations between trial arms are evaluated over a range of cost-effectiveness thresholds. The computational challenges are addressed by implementing optimization algorithms to search the ENBS surface more efficiently over such large dimensions.
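
    The ENBS maximization described here can be sketched for a two-option decision with a normal prior on incremental net benefit. This is a minimal illustration, not the zanamivir model: the prior mean, variances, population size, and per-patient cost below are hypothetical.

```python
import math

def _pdf(x):   # standard normal density
    return math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)

def _cdf(x):   # standard normal CDF via erf
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def evsi(n, mu0, s0, sigma):
    """Per-patient expected value of sample information for choosing between
    two options when incremental net benefit has prior N(mu0, s0^2) and a
    trial of size n measures it with sampling SD sigma."""
    if n == 0:
        return 0.0
    # variance of the preposterior (predicted posterior) mean
    v = s0 ** 4 / (s0 ** 2 + sigma ** 2 / n)
    sd = math.sqrt(v)
    e_max = mu0 * _cdf(mu0 / sd) + sd * _pdf(mu0 / sd)  # E[max(M, 0)]
    return e_max - max(mu0, 0.0)

def enbs(n, pop=10000, cost=200.0, mu0=50.0, s0=120.0, sigma=800.0):
    """Expected net benefit of sampling: population EVSI minus trial cost."""
    return pop * evsi(n, mu0, s0, sigma) - cost * n

# search one dimension of the design space for the optimal sample size
best_n = max(range(0, 2001, 10), key=enbs)
```

    In the paper the search runs jointly over several study types and their allocations; the same grid search (or a smarter optimizer, as the authors implement) simply extends to more dimensions.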

  14. Trends in CAD education in interior design programs

    OpenAIRE

    Ko, Hye Mi

    1990-01-01

    This research investigated Computer Aided Design (CAD) education in interior design programs, focusing on educators' opinions about creativity aspects, including computer application, teaching materials, and teaching methods, as well as other trends in CAD education. A questionnaire was sent to one hundred eighty-two members of the Interior Design Educators Council (IDEC). A frequency distribution was used on 69 usable returned surveys to describe the sample characteristi...

  15. A UAV-Based Fog Collector Design for Fine-Scale Aerobiological Sampling

    Science.gov (United States)

    Gentry, Diana; Guarro, Marcello; Demachkie, Isabella Siham; Stumfall, Isabel; Dahlgren, Robert P.

    2017-01-01

    Airborne microbes are found throughout the troposphere and into the stratosphere. Knowing how the activity of airborne microorganisms can alter water, carbon, and other geochemical cycles is vital to a full understanding of local and global ecosystems. Just as on the land or in the ocean, atmospheric regions vary in habitability; the underlying geochemical, climatic, and ecological dynamics must be characterized at different scales to be effectively modeled. Most aerobiological studies have focused on a high level: 'How high are airborne microbes found?' and 'How far can they travel?' Most fog and cloud water studies collect from stationary ground stations (point) or along flight transects (1D). To complement and provide context for these data, we have designed a UAV-based modified fog and cloud water collector to retrieve 4D-resolved samples for biological and chemical analysis. Our design uses a passive impacting collector hanging from a rigid rod suspended between two multi-rotor UAVs. The suspension design reduces the effect of turbulence and the potential for contamination from the UAV downwash. The UAVs are currently modeled in a leader-follower configuration, taking advantage of recent advances in modular UAVs, UAV swarming, and flight planning. The collector itself is a hydrophobic mesh. Materials including Tyvek, PTFE, nylon, and polypropylene monofilament, fabricated via laser cutting, CNC knife, or 3D printing, were characterized for droplet collection efficiency using a benchtop atomizer and particle counter. Because the meshes can be easily and inexpensively fabricated, a set can be pre-sterilized and brought to the field for 'hot swapping' to decrease cross-contamination between flight sessions or for use as negative controls. An onboard sensor and logging system records the time and location of each sample; when combined with flight tracking data, the samples can be resolved into a 4D volumetric map of the fog bank. 
Collected samples can be returned to the lab for

  16. Workshop Synthesis: Stated Preference Surveys and Experimental Design, an Audit of the Journey so far and Future Research Perspectives

    DEFF Research Database (Denmark)

    Cherchi, Elisabetta; Hensher, David A.

    2015-01-01

    This paper is a synthesis of the discussions and ideas that were generated during the workshop on “Stated preference surveys and experimental design” at the 2014 Travel Survey Methods Conference in Leura (Australia). The workshop addressed the challenges related to the design and implementation o...

  17. Effort versus Reward: Preparing Samples for Fungal Community Characterization in High-Throughput Sequencing Surveys of Soils.

    Directory of Open Access Journals (Sweden)

    Zewei Song

    Full Text Available Next generation fungal amplicon sequencing is being used with increasing frequency to study fungal diversity in various ecosystems; however, the influence of sample preparation on the characterization of the fungal community is poorly understood. We investigated the effects of four procedural modifications to library preparation for high-throughput sequencing (HTS). The following treatments were considered: (1) the amount of soil used in DNA extraction, (2) the inclusion of additional steps (freeze/thaw cycles, sonication, or hot water bath incubation) in the extraction procedure, (3) the amount of DNA template used in PCR, and (4) the effect of sample pooling, either physically or computationally. Soils from two different ecosystems in Minnesota, USA, one prairie and one forest site, were used to assess the generality of our results. The first three treatments did not significantly influence observed fungal OTU richness or community structure at either site. Physical pooling captured more OTU richness compared to individual samples, but total OTU richness at each site was highest when individual samples were computationally combined. We conclude that standard extraction kit protocols are well optimized for fungal HTS surveys, but because sample pooling can significantly influence OTU richness estimates, it is important to carefully consider the study aims when planning sampling procedures.
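
    Computational pooling as compared here can be sketched in a few lines (hypothetical OTU identifiers; real studies work from full OTU-by-sample abundance tables):

```python
# hypothetical per-sample OTU presence sets (one soil core each)
samples = {
    "soil_1": {"otu1", "otu2", "otu3"},
    "soil_2": {"otu2", "otu4"},
    "soil_3": {"otu1", "otu5", "otu6"},
}

def computational_pool(otu_sets):
    """In-silico pooling: sequence each sample separately, then take the
    union of observed OTUs, so rare taxa from any one sample survive."""
    pooled = set()
    for otus in otu_sets:
        pooled |= otus
    return pooled

richness = {name: len(otus) for name, otus in samples.items()}
pooled = computational_pool(samples.values())
```

    Physical pooling homogenizes DNA before a single library preparation, so a taxon rare in one core can fall below detection in the pooled extract; that is consistent with the finding that computational combination yielded the highest total richness.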

  18. Monitoring oil persistence on beaches : SCAT versus stratified random sampling designs

    International Nuclear Information System (INIS)

    Short, J.W.; Lindeberg, M.R.; Harris, P.M.; Maselko, J.M.; Pella, J.J.; Rice, S.D.

    2003-01-01

    In the event of a coastal oil spill, shoreline clean-up assessment teams (SCAT) commonly rely on visual inspection of the entire affected area to monitor the persistence of the oil on beaches. Occasionally, pits are excavated to evaluate the persistence of subsurface oil. This approach is practical for directing clean-up efforts directly following a spill. However, sampling of the 1989 Exxon Valdez oil spill in Prince William Sound 12 years later has shown that visual inspection combined with pit excavation does not offer estimates of contaminated beach area or stranded oil volumes. This information is needed to statistically evaluate the significance of change with time. Assumptions regarding the correlation of visually-evident surface oil and cryptic subsurface oil are usually not evaluated as part of the SCAT mandate. Stratified random sampling can avoid such problems and could produce precise estimates of oiled area and volume that allow for statistical assessment of major temporal trends and the extent of the impact. The 2001 sampling of the shoreline of Prince William Sound showed that 15 per cent of surface oil occurrences were associated with subsurface oil. This study demonstrates the usefulness of the stratified random sampling method and shows how sampling design parameters impact the statistical outcome. Power analysis based on the study results indicates that optimum power is derived when unnecessary stratification is avoided. It was emphasized that sampling effort should be balanced between choosing sufficient beaches for sampling and the intensity of sampling.
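
    The stratified estimator at the heart of this approach can be sketched as follows. This is illustrative Python; the segment counts and oiled-area values are hypothetical, not the Prince William Sound data.

```python
def stratified_total(strata):
    """Design-based estimate of a population total and its variance from
    stratified random sampling. Each stratum is (N_h, sample_values),
    where N_h is the number of units (beach segments) in the stratum."""
    total, var = 0.0, 0.0
    for n_units, y in strata:
        n = len(y)
        mean = sum(y) / n
        s2 = sum((v - mean) ** 2 for v in y) / (n - 1) if n > 1 else 0.0
        total += n_units * mean
        # stratum contribution with finite population correction
        var += n_units ** 2 * (1.0 - n / n_units) * s2 / n
    return total, var

# hypothetical strata: (segments in stratum, sampled oiled area in m^2)
strata = [
    (120, [5.0, 0.0, 12.0, 3.0, 0.0]),        # heavily oiled shoreline
    (400, [0.0, 1.0, 0.0, 0.0, 2.0, 0.0]),    # lightly oiled shoreline
]
est, var = stratified_total(strata)
```

    The variance term is what SCAT-style visual surveys cannot supply, and it is what makes statistical assessment of temporal trends possible; the paper's point about avoiding unnecessary stratification shows up here as extra strata inflating the variance when their means do not actually differ.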

  19. Accuracy assessment of the National Forest Inventory map of Mexico: sampling designs and the fuzzy characterization of landscapes

    Directory of Open Access Journals (Sweden)

    Stéphane Couturier

    2009-10-01

    Full Text Available There is no record so far in the literature of a comprehensive method to assess the accuracy of regional scale Land Cover/Land Use (LCLU) maps in the sub-tropical belt. The elevated biodiversity and the presence of highly fragmented classes hamper the use of sampling designs commonly employed in previous assessments of mainly temperate zones. A sampling design for assessing the accuracy of the Mexican National Forest Inventory (NFI) map at community level is presented. A pilot study was conducted on the Cuitzeo Lake watershed region covering 400 000 ha of the 2000 Landsat-derived map. Various sampling designs were tested in order to find a trade-off between operational costs, a good spatial distribution of the sample and the inclusion of all scarcely distributed classes (‘rare classes’). A two-stage sampling design where the selection of Primary Sampling Units (PSU) was done under separate schemes for commonly and scarcely distributed classes showed the best characteristics. A total of 2 023 point secondary sampling units were verified against their NFI map label. Issues regarding the assessment strategy and trends of class confusions are discussed.
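
    The idea of separate selection schemes for common and rare classes can be sketched minimally. The class names, rarity threshold, and PSU frame below are hypothetical, and the actual NFI assessment used a more elaborate two-stage design than this simple draw.

```python
import random

def select_psus(frame, n_common, n_rare, rare_threshold=0.05, seed=1):
    """Two-scheme selection of Primary Sampling Units: simple random
    sampling over commonly distributed map classes, plus a separate draw
    that guarantees scarcely distributed ('rare') classes are included."""
    rng = random.Random(seed)
    counts = {}
    for _, cls in frame:
        counts[cls] = counts.get(cls, 0) + 1
    total = len(frame)
    rare = {cls for cls, k in counts.items() if k / total < rare_threshold}
    common_pool = [pid for pid, cls in frame if cls not in rare]
    rare_pool = [pid for pid, cls in frame if cls in rare]
    chosen = rng.sample(common_pool, min(n_common, len(common_pool)))
    chosen += rng.sample(rare_pool, min(n_rare, len(rare_pool)))
    return chosen, rare

# hypothetical frame of PSUs labeled with their dominant map class
frame = ([(f"psu{i:03d}", "forest") for i in range(60)]
         + [(f"psu{i:03d}", "cropland") for i in range(60, 95)]
         + [("psu095", "wetland"), ("psu096", "wetland"), ("psu097", "wetland")])
sample, rare = select_psus(frame, n_common=10, n_rare=3)
```

    Without the separate rare-class draw, a fragmented class covering a few percent of the map would often be missed entirely, leaving its accuracy unassessed.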

  20. A two-stage Bayesian design with sample size reestimation and subgroup analysis for phase II binary response trials.

    Science.gov (United States)

    Zhong, Wei; Koopmeiners, Joseph S; Carlin, Bradley P

    2013-11-01

    Frequentist sample size determination for binary outcome data in a two-arm clinical trial requires initial guesses of the event probabilities for the two treatments. Misspecification of these event rates may lead to a poor estimate of the necessary sample size. In contrast, the Bayesian approach, which considers the treatment effect to be a random variable having some distribution, may offer a better, more flexible alternative. The Bayesian sample size proposed by Whitehead et al. (2008) for exploratory studies on efficacy justifies the acceptable minimum sample size by a "conclusiveness" condition. In this work, we introduce a new two-stage Bayesian design with sample size reestimation at the interim stage. Our design inherits the properties of good interpretation and easy implementation from Whitehead et al. (2008), generalizes their method to a two-sample setting, and uses a fully Bayesian predictive approach to reduce an overly large initial sample size when necessary. Moreover, our design can be extended to allow patient level covariates via logistic regression, now adjusting sample size within each subgroup based on interim analyses. We illustrate the benefits of our approach with a design in non-Hodgkin lymphoma with a simple binary covariate (patient gender), offering an initial step toward within-trial personalized medicine. Copyright © 2013 Elsevier Inc. All rights reserved.
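
    The flavor of interim sample-size re-estimation can be sketched with a conjugate Beta-Binomial model. Note the stopping criterion here is a simple posterior-precision target, a stand-in rather than Whitehead et al.'s exact "conclusiveness" condition; by the law of total variance the predictive expected posterior variance after m further observations has a closed form.

```python
import math

def expected_posterior_sd(a, b, m):
    """Root of the preposterior expected posterior variance of a Beta(a, b)
    belief after m further Bernoulli trials. By the law of total variance
    this equals v0 * (a + b) / (a + b + m), with v0 the current variance."""
    v0 = a * b / ((a + b) ** 2 * (a + b + 1))
    return math.sqrt(v0 * (a + b) / (a + b + m))

def reestimate_n(x, n1, n_max, target_sd=0.05):
    """After observing x responses in the first n1 patients (Beta(1,1)
    prior), return the smallest final sample size n <= n_max whose
    predictive expected posterior SD meets the precision target."""
    a, b = 1 + x, 1 + n1 - x
    for n in range(n1, n_max + 1):
        if expected_posterior_sd(a, b, n - n1) <= target_sd:
            return n
    return n_max
```

    A highly informative interim posterior shrinks the re-estimated n; a diffuse one extends it, which is the mechanism the paper uses to deflate an overly large initial sample size.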

  1. Economic Design of Acceptance Sampling Plans in a Two-Stage Supply Chain

    Directory of Open Access Journals (Sweden)

    Lie-Fern Hsu

    2012-01-01

    Full Text Available Supply Chain Management, which is concerned with material and information flows between facilities and the final customers, has been considered the most popular operations strategy for improving organizational competitiveness nowadays. With the advanced development of computer technology, it is getting easier to derive an acceptance sampling plan satisfying both the producer's and consumer's quality and risk requirements. However, all the available QC tables and computer software determine the sampling plan on a noneconomic basis. In this paper, we design an economic model to determine the optimal sampling plan in a two-stage supply chain that minimizes the producer's and the consumer's total quality cost while satisfying both the producer's and consumer's quality and risk requirements. Numerical examples show that the optimal sampling plan is quite sensitive to the producer's product quality. The product's inspection, internal failure, and postsale failure costs also have an effect on the optimal sampling plan.
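
    The kind of search such a model automates can be sketched as a scan over single-sampling plans (n, c): keep the plans meeting the producer's and consumer's risk requirements, then pick the one with the lowest expected quality cost. All cost figures below are made-up illustrations, not the paper's model.

```python
from math import comb

def accept_prob(n, c, p):
    """P(accept a lot): at most c defectives in a sample of n (binomial)."""
    return sum(comb(n, k) * p ** k * (1.0 - p) ** (n - k) for k in range(c + 1))

def economic_plan(p_true, aql=0.01, ltpd=0.06, alpha=0.05, beta=0.10,
                  lot=1000, c_inspect=1.0, c_internal=5.0, c_postsale=50.0,
                  n_max=400):
    """Lowest-cost single-sampling plan (n, c) satisfying the producer's
    risk (reject good lots <= alpha at AQL) and the consumer's risk
    (accept bad lots <= beta at LTPD). Returns (expected_cost, n, c)."""
    best = None
    for n in range(1, n_max + 1):
        for c in range(0, n + 1):
            if 1.0 - accept_prob(n, c, aql) > alpha:
                continue            # c too small for the producer's risk
            if accept_prob(n, c, ltpd) > beta:
                break               # raising c only worsens consumer's risk
            pa = accept_prob(n, c, p_true)
            # inspection + internal failures found in the sample
            # + post-sale failures among shipped units of accepted lots
            cost = (n * c_inspect + n * p_true * c_internal
                    + pa * (lot - n) * p_true * c_postsale)
            if best is None or cost < best[0]:
                best = (cost, n, c)
            break                   # one candidate (n, c) per n
    return best

plan = economic_plan(p_true=0.01)
```

    This mirrors the paper's observation that the optimal plan is sensitive to the producer's true quality level: re-running with a different `p_true` shifts the cost balance between inspection, internal failure, and post-sale failure.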

  2. Optimizing trial design in pharmacogenetics research: comparing a fixed parallel group, group sequential, and adaptive selection design on sample size requirements.

    Science.gov (United States)

    Boessen, Ruud; van der Baan, Frederieke; Groenwold, Rolf; Egberts, Antoine; Klungel, Olaf; Grobbee, Diederick; Knol, Mirjam; Roes, Kit

    2013-01-01

    Two-stage clinical trial designs may be efficient in pharmacogenetics research when there is some but inconclusive evidence of effect modification by a genomic marker. Two-stage designs allow stopping early for efficacy or futility and can offer the additional opportunity to enrich the study population to a specific patient subgroup after an interim analysis. This study compared sample size requirements for fixed parallel group, group sequential, and adaptive selection designs with equal overall power and control of the family-wise type I error rate. The designs were evaluated across scenarios that defined the effect sizes in the marker positive and marker negative subgroups and the prevalence of marker positive patients in the overall study population. Effect sizes were chosen to reflect realistic planning scenarios, where at least some effect is present in the marker negative subgroup. In addition, scenarios were considered in which the assumed 'true' subgroup effects (i.e., the postulated effects) differed from those hypothesized at the planning stage. As expected, both two-stage designs generally required fewer patients than a fixed parallel group design, and the advantage increased as the difference between subgroups increased. The adaptive selection design added little further reduction in sample size, as compared with the group sequential design, when the postulated effect sizes were equal to those hypothesized at the planning stage. However, when the postulated effects deviated strongly in favor of enrichment, the comparative advantage of the adaptive selection design increased, which precisely reflects the adaptive nature of the design. Copyright © 2013 John Wiley & Sons, Ltd.

  3. Survey of perceived influence of the conceptual design model of interactive television advertising towards impulse purchase tendency

    Science.gov (United States)

    Sarif, Siti Mahfuzah; Omar, Azizah Che; Shiratuddin, Norshuhada

    2016-08-01

    With the proliferation of technology-assisted shopping, there is growing evidence that impulse buying is an emerging phenomenon, which has been the focus of this study. The literature indicates that studies related to impulse purchase for interactive television (iTV) advertising are highly scarce. It was found that most of the existing impulse purchase elements mainly focus on traditional retail stores, website advertising, and traditional TV advertising, but not on iTV advertising. Due to that, through a systematic process, a design model for developing iTV advertising with influence towards impulse purchase tendency was developed and tested in this study. The design model is named iTVAdIP and comprises three main components: technology, impulse purchase components, and development process. This paper describes the survey, which measures the influence of the iTVAdIP design model towards impulse purchase tendency. 37 potential advertising designers were involved in the survey. The results indicate that the iTVAdIP is practical and workable in developing iTV advertisements that could influence consumers to buy the advertised product.

  4. A free-knot spline modeling framework for piecewise linear logistic regression in complex samples with body mass index and mortality as an example

    Directory of Open Access Journals (Sweden)

    Scott W. Keith

    2014-09-01

    Full Text Available This paper details the design, evaluation, and implementation of a framework for detecting and modeling nonlinearity between a binary outcome and a continuous predictor variable adjusted for covariates in complex samples. The framework provides familiar-looking parameterizations of output in terms of linear slope coefficients and odds ratios. Estimation methods focus on maximum likelihood optimization of piecewise linear free-knot splines formulated as B-splines. Correctly specifying the optimal number and positions of the knots improves the model, but is marked by computational intensity and numerical instability. Our inference methods utilize both parametric and nonparametric bootstrapping. Unlike other nonlinear modeling packages, this framework is designed to incorporate multistage survey sample designs common to nationally representative datasets. We illustrate the approach and evaluate its performance in specifying the correct number of knots under various conditions with an example using body mass index (BMI; kg/m2) and the complex multi-stage sampling design from the Third National Health and Nutrition Examination Survey to simulate binary mortality outcomes data having realistic nonlinear sample-weighted risk associations with BMI. BMI and mortality data provide a particularly apt example and area of application since BMI is commonly recorded in large health surveys with complex designs, often categorized for modeling, and nonlinearly related to mortality. When complex sample design considerations were ignored, our method was generally similar to or more accurate than two common model selection procedures, Schwarz’s Bayesian Information Criterion (BIC) and Akaike’s Information Criterion (AIC), in terms of correctly selecting the correct number of knots. Our approach provided accurate knot selections when complex sampling weights were incorporated, while AIC and BIC were not effective under these conditions.
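
    The piecewise-linear spline idea is easiest to see in the truncated-line parameterization, where each knot coefficient is a change in slope. The paper itself works with the numerically better-behaved B-spline basis and survey weights; the knot value and coefficients below are hypothetical.

```python
def plinear_basis(x, knots):
    """Design-matrix row for a piecewise-linear spline in truncated-line
    form: [1, x, (x - t1)+, (x - t2)+, ...]. The coefficient on each
    hinge term is the change in slope at that knot."""
    return [1.0, x] + [max(0.0, x - t) for t in knots]

def slopes(coefs, knots):
    """Per-interval slopes implied by truncated-line coefficients."""
    assert len(coefs) == 2 + len(knots)
    s = coefs[1]
    out = [s]
    for b in coefs[2:]:
        s += b
        out.append(s)
    return out
```

    With a single knot at BMI 25 and coefficients [0.0, -0.05, 0.12], the fitted log-odds of mortality fall with BMI below 25 and rise above it, the kind of J-shaped association the paper simulates.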

  5. Ochratoxin A in raisins and currants: basic extraction procedure used in two small marketing surveys of the occurrence and control of the heterogeneity of the toxins in samples.

    Science.gov (United States)

    Möller, T E; Nyberg, M

    2003-11-01

    A basic extraction procedure for the analysis of ochratoxin A (OTA) in currants and raisins is described, as well as the occurrence of OTA and a control of the heterogeneity of the toxin in samples bought for two small marketing surveys in 1999/2000 and 2001/02. Most samples in the surveys were divided into two subsamples that were individually prepared as slurries and analysed separately. The limit of quantification for the method was estimated as 0.1 microg kg(-1) and recoveries of 85, 90 and 115% were achieved in recovery experiments at 10, 5 and 0.1 microg kg(-1), respectively. Of all 118 subsamples analysed in the surveys, 96 (84%) contained ochratoxin A at levels above the quantification level and five samples (4%) contained more than the European Community legislative limit of 10 microg kg(-1). The OTA concentrations found in the first survey were in the range …. Big differences were often observed between the individual subsamples of the original sample, which indicates a widely heterogeneous distribution of the toxin. Data from the repeatability test as well as recovery experiments from the same slurries showed that preparation of slurries as described here seemed to give a homogeneous and representative sample. The extraction with the basic sodium bicarbonate-methanol mixture used in the surveys gave similar or somewhat higher OTA values on some samples tested in a comparison with a weak phosphoric acid water-methanol extraction mixture.

  6. Design-based estimators for snowball sampling

    OpenAIRE

    Shafie, Termeh

    2010-01-01

    Snowball sampling, where existing study subjects recruit further subjects from among their acquaintances, is a popular approach when sampling from hidden populations. Since people with many in-links are more likely to be selected, there will be a selection bias in the samples obtained. In order to eliminate this bias, the sample data must be weighted. However, the exact selection probabilities are unknown for snowball samples and need to be approximated in an appropriate way. This paper proposes d...
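
    A minimal sketch of the design-based idea: treat each subject's selection probability as roughly proportional to their in-degree and weight by its inverse. The toy network and values are hypothetical, and the paper develops more careful approximations than this.

```python
def approx_weights(in_degree, sample_ids):
    """Weight sampled subjects inversely to in-degree, the rough driver of
    selection probability in a snowball sample; normalized to sum to 1."""
    raw = {i: 1.0 / in_degree[i] for i in sample_ids}
    total = sum(raw.values())
    return {i: w / total for i, w in raw.items()}

def weighted_mean(values, weights):
    return sum(values[i] * weights[i] for i in weights)

# toy hidden population: "a" has few in-links, "c" has many
in_degree = {"a": 1, "b": 4, "c": 5}
values = {"a": 10.0, "b": 20.0, "c": 30.0}
w = approx_weights(in_degree, ["a", "b", "c"])
```

    The weighted mean here (21/1.45, about 14.5) pulls the estimate back toward the hard-to-reach, low-degree subjects that the unweighted mean (20.0) over-discounts.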

  7. Evaluation of carbon monoxide in blood samples from the second health and nutrition survey. Progress report No. 1

    Energy Technology Data Exchange (ETDEWEB)

    Radford, E.P.

    1976-01-01

    This is a study of carbon monoxide (CO) in the blood of human subjects participating in the Second National Health and Nutrition Survey (HANES II), a detailed study of health indicators in sample populations of many communities throughout the U.S. The purpose of this aspect of the survey is to evaluate the levels of blood carboxyhemoglobin in normal individuals of all ages in typical U.S. communities, from whom accurate histories and clinical studies are available. This report gives results of the first of three years of analyses. A careful calibration of the analytical method has been completed, and more than 3000 blood samples have been analyzed. Although smoking histories are not yet available to permit evaluation of carboxyhemoglobin in non-smokers, blood COHb in children under 12 years of age has been found to be consistently low, with fewer than 3% of samples exceeding 1.5% COHb. These preliminary results suggest that urban exposure to carbon monoxide among the general population is not now significant in the U.S., at least during the period of these early examinations.

  8. Sampling and energy evaluation challenges in ligand binding protein design.

    Science.gov (United States)

    Dou, Jiayi; Doyle, Lindsey; Jr Greisen, Per; Schena, Alberto; Park, Hahnbeom; Johnsson, Kai; Stoddard, Barry L; Baker, David

    2017-12-01

    The steroid hormone 17α-hydroxyprogesterone (17-OHP) is a biomarker for congenital adrenal hyperplasia and hence there is considerable interest in the development of sensors for this compound. We used computational protein design to generate protein models with binding sites for 17-OHP containing an extended, nonpolar, shape-complementary binding pocket for the four-ring core of the compound, and hydrogen bonding residues at the base of the pocket to interact with carbonyl and hydroxyl groups at the more polar end of the ligand. Eight of 16 designed proteins experimentally tested bind 17-OHP with micromolar affinity. A co-crystal structure of one of the designs revealed that 17-OHP is rotated 180° around a pseudo-two-fold axis in the compound and displays multiple binding modes within the pocket, while still interacting with all of the designed residues in the engineered site. Subsequent rounds of mutagenesis and binding selection improved the ligand affinity to nanomolar range, while appearing to constrain the ligand to a single bound conformation that maintains the same "flipped" orientation relative to the original design. We trace the discrepancy in the design calculations to two sources: first, a failure to model subtle backbone changes which alter the distribution of sidechain rotameric states and second, an underestimation of the energetic cost of desolvating the carbonyl and hydroxyl groups of the ligand. The difference between design model and crystal structure thus arises from both sampling limitations and energy function inaccuracies that are exacerbated by the near two-fold symmetry of the molecule. © 2017 The Authors Protein Science published by Wiley Periodicals, Inc. on behalf of The Protein Society.

  9. Regional soil erosion assessment based on a sample survey and geostatistics

    Science.gov (United States)

    Yin, Shuiqing; Zhu, Zhengyuan; Wang, Li; Liu, Baoyuan; Xie, Yun; Wang, Guannan; Li, Yishan

    2018-03-01

    Soil erosion is one of the most significant environmental problems in China. From 2010 to 2012, the fourth national census for soil erosion sampled 32 364 PSUs (Primary Sampling Units, small watersheds) with areas of 0.2-3 km2. Land use and soil erosion controlling factors including rainfall erosivity, soil erodibility, slope length, slope steepness, biological practice, engineering practice, and tillage practice for the PSUs were surveyed, and the soil loss rate for each land use in the PSUs was estimated using an empirical model, the Chinese Soil Loss Equation (CSLE). Though the information collected from the sample units can be aggregated to estimate soil erosion conditions on a large scale, the problem of estimating soil erosion condition on a regional scale has not been addressed well. The aim of this study is to introduce a new model-based regional soil erosion assessment method combining a sample survey and geostatistics. We compared seven spatial interpolation models based on the bivariate penalized spline over triangulation (BPST) method to generate a regional soil erosion assessment from the PSUs. Shaanxi Province (3116 PSUs) in China was selected for the comparison and assessment as it is one of the areas with the most serious erosion problem. Ten-fold cross-validation based on the PSU data showed the model assisted by the land use, rainfall erosivity factor (R), soil erodibility factor (K), slope steepness factor (S), and slope length factor (L) derived from a 1 : 10 000 topography map is the best one, with the model efficiency coefficient (ME) being 0.75 and the MSE being 55.8% of that for the model assisted by the land use alone. Among four erosion factors as the covariates, the S factor contributed the most information, followed by K and L factors, and R factor made almost no contribution to the spatial estimation of soil loss. The LS factor derived from 30 or 90 m Shuttle Radar Topography Mission (SRTM) digital elevation model (DEM) data
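
    The model efficiency coefficient (ME) used in the ten-fold cross-validation is, in its Nash-Sutcliffe form, simple to compute:

```python
def model_efficiency(obs, pred):
    """Model efficiency coefficient (Nash-Sutcliffe form): 1 - SSE/SST.
    1.0 is a perfect fit; 0.0 is no better than predicting the mean."""
    mean = sum(obs) / len(obs)
    sse = sum((o - p) ** 2 for o, p in zip(obs, pred))
    sst = sum((o - mean) ** 2 for o in obs)
    return 1.0 - sse / sst
```

    An ME of 0.75, as reported for the best interpolation model, means the covariate-assisted spline removes three quarters of the variance left by simply predicting the mean soil loss rate.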

  10. "Best Practices in Using Large, Complex Samples: The Importance of Using Appropriate Weights and Design Effect Compensation"

    Directory of Open Access Journals (Sweden)

    Jason W. Osborne

    2011-09-01

    Full Text Available Large surveys often use probability sampling in order to obtain representative samples, and these data sets are valuable tools for researchers in all areas of science. Yet many researchers are not formally prepared to appropriately utilize these resources. Indeed, users of one popular dataset were generally found not to have modeled the analyses to take account of the complex sample (Johnson & Elliott, 1998), even when publishing in highly-regarded journals. It is well known that failure to appropriately model the complex sample can substantially bias the results of the analysis. Examples presented in this paper highlight the risk of errors of inference and mis-estimation of parameters that follows from failure to analyze these data sets appropriately.
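
    Two of the quantities at issue, a design-weighted estimate and Kish's approximate design effect from unequal weighting, can be sketched directly (toy numbers, not drawn from any particular survey):

```python
def weighted_mean(y, w):
    """Design-weighted estimate: weight each case by (for example) the
    inverse of its probability of selection."""
    return sum(yi * wi for yi, wi in zip(y, w)) / sum(w)

def design_effect_kish(w):
    """Kish's approximation deff = n * sum(w^2) / (sum w)^2: the variance
    inflation attributable to unequal weighting alone."""
    n = len(w)
    return n * sum(wi * wi for wi in w) / sum(w) ** 2

# toy data: the first two cases represent under-sampled groups
y = [10.0, 10.0, 20.0, 20.0, 20.0]
w = [2.0, 2.0, 1.0, 1.0, 1.0]
```

    Here the unweighted mean is 16.0 while the design-weighted mean is 100/7 (about 14.3): ignoring the weights biases the estimate, and treating the weighted sample as a simple random one understates standard errors by roughly the design effect.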

  11. The Modular Optical Underwater Survey System

    Directory of Open Access Journals (Sweden)

    Ruhul Amin

    2017-10-01

    Full Text Available The Pacific Islands Fisheries Science Center deploys the Modular Optical Underwater Survey System (MOUSS to estimate the species-specific, size-structured abundance of commercially-important fish species in Hawaii and the Pacific Islands. The MOUSS is an autonomous stereo-video camera system designed for the in situ visual sampling of fish assemblages. This system is rated to 500 m and its low-light, stereo-video cameras enable identification, counting, and sizing of individuals at a range of 0.5–10 m. The modular nature of MOUSS allows for the efficient and cost-effective use of various imaging sensors, power systems, and deployment platforms. The MOUSS is in use for surveys in Hawaii, the Gulf of Mexico, and Southern California. In Hawaiian waters, the system can effectively identify individuals to a depth of 250 m using only ambient light. In this paper, we describe the MOUSS’s application in fisheries research, including the design, calibration, analysis techniques, and deployment mechanism.
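
    The stereo sizing the MOUSS enables rests on ordinary triangulation. This is a generic rectified pinhole-camera sketch; the system's actual calibration model and parameters are not given in the abstract, and the focal length, baseline, and disparities below are hypothetical.

```python
def stereo_range(focal_px, baseline_m, disparity_px):
    """Range from a rectified, calibrated stereo pair: Z = f * B / d."""
    return focal_px * baseline_m / disparity_px

def stereo_length(focal_px, baseline_m, d_head, d_tail, dx_px):
    """Approximate fish length from the head/tail disparities and their
    image-plane separation dx, assuming both points lie near the same
    range (back-projection of the pixel distance)."""
    z = stereo_range(focal_px, baseline_m, 0.5 * (d_head + d_tail))
    return dx_px * z / focal_px
```

    With a hypothetical 1400 px focal length and 0.7 m baseline, a 98 px disparity places a target at 10 m, the upper end of the system's stated 0.5-10 m working range.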

  12. Sampling plan design and analysis for a low level radioactive waste disposal program

    International Nuclear Information System (INIS)

    Hassig, N.L.; Wanless, J.W.

    1989-01-01

    Low-level wastes that are candidates for BRC (below regulatory concern) disposal must be subjected to an extensive monitoring program to ensure the wastes meet (potential) bulk property and contamination concentration BRC criteria for disposal. This paper addresses the statistical implications of using various methods to verify BRC criteria. While surface and volumetric monitoring each have their advantages and disadvantages, a dual, sequential monitoring process is the preferred choice from a statistical reliability perspective. With dual monitoring, measurements on the contamination are verifiable, and sufficient to allow for a complete characterization of the wastes. As these characterizations become more reliable and stable, something less than 100% sampling may be possible for release of wastes for BRC disposal. This paper provides a survey of the issues involved in the selection of a monitoring and sampling program for the disposal of BRC wastes.

  13. Utilizing the Total Design Method in medicine: maximizing response rates in long, non-incentivized, personal questionnaire postal surveys.

    Science.gov (United States)

    Kazzazi, Fawz; Haggie, Rebecca; Forouhi, Parto; Kazzazi, Nazar; Malata, Charles M

    2018-01-01

    Maximizing response rates in questionnaires can improve their validity and quality by reducing non-response bias. A comprehensive analysis is essential for producing reasonable conclusions in patient-reported outcome research, particularly for topics of a sensitive nature. This often makes long (≥7 pages) questionnaires necessary, but these have been shown to reduce response rates in mail surveys. Our work adapted the "Total Design Method," initially produced for commercial markets, to raise response rates in a long (total: 11 pages, 116 questions), non-incentivized, very personal postal survey sent to almost 350 women. A total of 346 women who had undergone mastectomy and immediate breast reconstruction from 2008-2014 (inclusive) at Addenbrooke's University Hospital were sent our study pack (Breast-Q satisfaction questionnaire and support documents) using our modified "Total Design Method." Participants were sent packs and reminders according to our designed schedule. Of the 346 participants, we received 258 responses, an overall response rate of 74.6% with a usable response rate of 72.3%. One hundred and six responses were received before the week 1 reminder (30.6%), 120 before week 3 (34.6%), 225 before the week 7 reminder (64.6%) and the remainder within 3 weeks of the final pack being sent. The median age of patients that the survey was sent to, and the median age of the respondents, was 54 years. In this study, we have demonstrated the successful implementation of a novel approach to postal surveys. Despite the length of the questionnaire (nine pages, 116 questions) and limitations of expenses to mail a survey to ~350 women, we were able to attain a response rate of 74.6%.

  14. Adaptive clinical trial designs with pre-specified rules for modifying the sample size: understanding efficient types of adaptation.

    Science.gov (United States)

    Levin, Gregory P; Emerson, Sarah C; Emerson, Scott S

    2013-04-15

    Adaptive clinical trial design has been proposed as a promising new approach that may improve the drug discovery process. Proponents of adaptive sample size re-estimation promote its ability to avoid 'up-front' commitment of resources, better address the complicated decisions faced by data monitoring committees, and minimize accrual to studies having delayed ascertainment of outcomes. We investigate aspects of adaptation rules, such as timing of the adaptation analysis and magnitude of sample size adjustment, that lead to greater or lesser statistical efficiency. Owing in part to the recent Food and Drug Administration guidance that promotes the use of pre-specified sampling plans, we evaluate alternative approaches in the context of well-defined, pre-specified adaptation. We quantify the relative costs and benefits of fixed sample, group sequential, and pre-specified adaptive designs with respect to standard operating characteristics such as type I error, maximal sample size, power, and expected sample size under a range of alternatives. Our results build on others' prior research by demonstrating in realistic settings that simple and easily implemented pre-specified adaptive designs provide only very small efficiency gains over group sequential designs with the same number of analyses. In addition, we describe optimal rules for modifying the sample size, providing efficient adaptation boundaries on a variety of scales for the interim test statistic for adaptation analyses occurring at several different stages of the trial. We thus provide insight into what are good and bad choices of adaptive sampling plans when the added flexibility of adaptive designs is desired. Copyright © 2012 John Wiley & Sons, Ltd.
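
The trade-offs quantified above rest on standard sample-size arithmetic. As a hedged illustration (not taken from the paper), here is a minimal sketch of the fixed-sample benchmark against which group sequential and adaptive designs are compared; it uses the usual normal approximation for a two-arm comparison of means, with `delta` and `sigma` assumed known:

```python
import math
from statistics import NormalDist

def fixed_sample_size(delta, sigma, alpha=0.05, power=0.9):
    """Per-group n for a two-arm comparison of means (normal approximation).

    delta: clinically relevant difference in means (assumed known)
    sigma: common standard deviation (assumed known)
    """
    z = NormalDist().inv_cdf
    z_alpha = z(1 - alpha / 2)   # two-sided type I error
    z_beta = z(power)            # power = 1 - type II error
    return math.ceil(2 * (z_alpha + z_beta) ** 2 * sigma ** 2 / delta ** 2)
```

Group sequential and pre-specified adaptive designs start from such a fixed-sample n and trade a somewhat larger maximal sample size for a smaller expected sample size under a range of alternatives.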

  15. Visual Sample Plan Version 7.0 User's Guide

    Energy Technology Data Exchange (ETDEWEB)

    Matzke, Brett D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Newburn, Lisa LN [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Hathaway, John E. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Bramer, Lisa M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Wilson, John E. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Dowson, Scott T. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Sego, Landon H. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Pulsipher, Brent A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2014-03-01

    This user's guide describes Visual Sample Plan (VSP) Version 7.0 and provides instructions for using the software. VSP selects the appropriate number and location of environmental samples to ensure that the results of statistical tests performed to provide input to risk decisions have the required confidence and performance. VSP Version 7.0 provides sample-size equations or algorithms needed by specific statistical tests appropriate for specific environmental sampling objectives. It also provides data quality assessment and statistical analysis functions to support evaluation of the data and determine whether the data support decisions regarding sites suspected of contamination. The easy-to-use program is highly visual and graphic. VSP runs on personal computers with Microsoft Windows operating systems (XP, Vista, Windows 7, and Windows 8). Designed primarily for project managers and users without expertise in statistics, VSP is applicable to two- and three-dimensional populations to be sampled (e.g., rooms and buildings, surface soil, a defined layer of subsurface soil, water bodies, and other similar applications) for studies of environmental quality. VSP is also applicable for designing sampling plans for assessing chem/rad/bio threat and hazard identification within rooms and buildings, and for designing geophysical surveys for unexploded ordnance (UXO) identification.
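
VSP's sample-size modules implement equations of the kind sketched below. This example is illustrative only (not VSP code): an idealized square-grid spacing for hot-spot detection, one of the classic sampling-design calculations such tools support, assuming the probability of a grid node landing inside a circular hot spot of radius R is approximately πR²/G² when R ≤ G/2:

```python
import math

def grid_spacing(radius, p_detect):
    """Square-grid spacing G so a circular hot spot of the given radius
    is hit with probability >= p_detect.

    Uses the idealized approximation P = pi * R^2 / G^2, which holds
    for R <= G/2 (i.e., p_detect <= pi/4, about 0.785).
    """
    return radius * math.sqrt(math.pi / p_detect)
```

For example, detecting a 5 m hot spot with 70% probability needs grid points roughly every 10.6 m; tighter detection goals shrink the spacing and inflate the number of samples quadratically.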

  16. High School and Beyond Transcripts Survey (1982). Data File User's Manual. Contractor Report.

    Science.gov (United States)

    Jones, Calvin; And Others

    This data file user's manual documents the procedures used to collect and process high school transcripts for a large sample of the younger cohort (1980 sophomores) in the High School and Beyond survey. The manual provides the user with the technical assistance needed to use the computer file and also discusses the following: (1) sample design for…

  17. Optimal design of a lagrangian observing system for hydrodynamic surveys in coastal areas

    Science.gov (United States)

    Cucco, Andrea; Quattrocchi, Giovanni; Antognarelli, Fabio; Satta, Andrea; Maicu, Francesco; Ferrarin, Christian; Umgiesser, Georg

    2014-05-01

    The optimization of ocean observing systems is a pressing need for scientific research. In particular, the improvement of short-term ocean observing networks is achievable by reducing the cost-benefit ratio of the field campaigns and by increasing the quality of measurements. Numerical modeling is a powerful tool for determining the appropriateness of a specific observing system and for optimizing the sampling design. This is particularly true when observations are carried out in coastal areas and lagoons, where the use of satellites is prohibitive due to the shallowness of the water. For such areas, numerical models are the most efficient tool both to provide a preliminary assessment of the local physical environment and to make short-term predictions of its changes. In this context, a test-case experiment was carried out within an enclosed shallow-water area, the Cabras Lagoon (Sardinia, Italy). The aim of the experiment was to explore the optimal design for a field survey based on the use of coastal lagrangian buoys. A three-dimensional hydrodynamic model based on the finite element method (SHYFEM3D, Umgiesser et al., 2004) was implemented to simulate the lagoon water circulation. The model domain extends over the whole Cabras Lagoon and the whole Gulf of Oristano, including the surrounding coastal area. Lateral open boundary conditions were provided by the operational ocean model system WMED, and only wind forcing, provided by the SKIRON atmospheric model (Kallos et al., 1997), was considered as the surface boundary condition. The model was applied to provide a number of ad hoc scenarios and to explore the efficiency of the short-term hydrodynamic survey. A first field campaign was carried out to investigate the lagrangian circulation inside the lagoon under the main wind forcing condition (Mistral wind from the north-west). The trajectories followed by the lagrangian buoys and the estimated lagrangian velocities were used to calibrate the model parameters and to validate the

  18. Digital readout alpha survey instrument

    International Nuclear Information System (INIS)

    Jacobs, M.E.

    1976-01-01

    A prototype solid-state digital-readout alpha-particle survey instrument has been designed and constructed. The meter incorporates a Ludlum alpha scintillator as a detector, digital logic circuits for control and timing, and a Digilin counting module with a reflective liquid crystal display. The device is used to monitor alpha radiation from a surface. Sample counts are totalized over 10-second intervals and displayed digitally in counts per minute up to 19,999. Tests with source samples at counts up to 15,600 cpm have shown the device to be rapid, versatile, and accurate. The instrument can be fabricated in one man-week and requires about $835 in material costs. A complete set of drawings is included.

  19. Geochemical drainage surveys for uranium: sampling and analytical methods based on trial surveys in Pennsylvania

    International Nuclear Information System (INIS)

    Rose, A.W.; Keith, M.L.; Suhr, N.H.

    1976-01-01

    Geochemical surveys near sandstone-type uranium prospects in northeastern and north-central Pennsylvania show that the deposits can be detected by carefully planned stream sediment surveys, but not by stream water surveys. Stream waters at single sites changed in U content by factors of 10 to 50 during the 18 months of our studies and, even near known prospects, contain less than 0.2 ppb U most of the time. Uranium extractable from stream sediment by acetic acid-H₂O₂ provides useful contrast between mineralized and non-mineralized drainages of a square mile or less; total U in sediment does not. High organic material results in increased U content of sediments and must be corrected for. Changes in U content of sediment with time reach a maximum of a factor of 3 and appear to be of short duration. A sediment survey of about 200 mi² near Jim Thorpe detects anomalies extending over several square miles near known occurrences, and a second anomaly about two miles northeast of Penn Haven Jct. A similar survey in Lycoming and Sullivan Counties shows anomalous zones near known prospects of the Beaver Lake area and northwest of Muncy Creek. As, Mn, Pb, and V are enriched in the mineralized zones, and perhaps in surrounding halo zones, but do not appear to be pathfinder elements useful for reconnaissance exploration.

  20. Annual Omnibus Survey: A survey of life in Qatar 2014

    OpenAIRE

    Diop, Abdoulaye; Gengler, Justin John; Khan, Mohammad N.; Traugott, Michael; Elawad, Elmogiera Fadlallh; Al Ansari, Majed; Le, Kien T.; El-Maghraby, Engi; Elkassem, Rima Charbaji; Qutteina, Yara; Al Khulaifi, Buthaina; Nasrallah, Catherine; Al Subaey, Mohammed; Mustafa, Semsia Al-Ali; Alqassass, Haneen

    2015-01-01

    This Executive Summary presents the highlights of the 2014 Omnibus survey, the fourth in a series of Omnibus surveys since 2010. The surveys were carried out by the Social and Economic Survey Research Institute (SESRI) of Qatar University. Each Omnibus survey interviews a large and representative sample of Qatari citizens, resident expatriates and laborers. In these surveys, we asked a number of questions covering several topics of importance to Qatari society, including their ...

  1. Hepatitis C bio-behavioural surveys in people who inject drugs-a systematic review of sensitivity to the theoretical assumptions of respondent driven sampling.

    Science.gov (United States)

    Buchanan, Ryan; Khakoo, Salim I; Coad, Jonathan; Grellier, Leonie; Parkes, Julie

    2017-07-11

    New, more effective and better-tolerated therapies for hepatitis C (HCV) have made the elimination of HCV a feasible objective. However, for this to be achieved, it is necessary to have a detailed understanding of HCV epidemiology in people who inject drugs (PWID). Respondent-driven sampling (RDS) can provide prevalence estimates in hidden populations such as PWID. The aims of this systematic review are to identify published studies that use RDS in PWID to measure the prevalence of HCV, and compare each study against the STROBE-RDS checklist to assess their sensitivity to the theoretical assumptions underlying RDS. Searches were undertaken in accordance with PRISMA systematic review guidelines. Included studies were English language publications in peer-reviewed journals, which reported the use of RDS to recruit PWID to an HCV bio-behavioural survey. Data was extracted under three headings: (1) survey overview, (2) survey outcomes, and (3) reporting against selected STROBE-RDS criteria. Thirty-one studies met the inclusion criteria. They varied in scale (range 1-15 survey sites) and the sample sizes achieved (range 81-1000 per survey site) but were consistent in describing the use of standard RDS methods including: seeds, coupons and recruitment incentives. Twenty-seven studies (87%) either calculated or reported the intention to calculate population prevalence estimates for HCV and two used RDS data to calculate the total population size of PWID. Detailed operational and analytical procedures and reporting against selected criteria from the STROBE-RDS checklist varied between studies. There were widespread indications that sampling did not meet the assumptions underlying RDS, which led to two studies being unable to report an estimated HCV population prevalence in at least one survey location. RDS can be used to estimate a population prevalence of HCV in PWID and estimate the PWID population size. Accordingly, as a single instrument, it is a useful tool for
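
The RDS estimators these surveys rely on weight each respondent by the inverse of their reported network degree. Below is a minimal sketch of the RDS-II (Volz-Heckathorn) prevalence estimator, one common choice; it is not claimed to be the estimator used by every reviewed study:

```python
def rds_ii_prevalence(degrees, outcomes):
    """RDS-II (Volz-Heckathorn) prevalence estimator.

    degrees:  each respondent's reported network size (must be > 0)
    outcomes: 1 if the respondent tests HCV-positive, else 0

    Respondents with large networks are over-recruited, so each is
    down-weighted by 1/degree.
    """
    num = sum(1.0 / d for d, y in zip(degrees, outcomes) if y)
    den = sum(1.0 / d for d in degrees)
    return num / den
```

With degrees [2, 4, 4, 8] and outcomes [1, 1, 0, 0], the naive prevalence is 0.50, but the degree-weighted estimate rises to about 0.67 because the two positives had small networks and were therefore under-sampled.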

  2. Within-otolith variability in chemical fingerprints: implications for sampling designs and possible environmental interpretation.

    Directory of Open Access Journals (Sweden)

    Antonio Di Franco

    Full Text Available Largely used as a natural biological tag in studies of dispersal/connectivity of fish, otolith elemental fingerprinting is usually analyzed by laser ablation-inductively coupled plasma-mass spectrometry (LA-ICP-MS. LA-ICP-MS produces an elemental fingerprint at a discrete time-point in the life of a fish and can generate data on within-otolith variability of that fingerprint. The presence of within-otolith variability has been previously acknowledged but not incorporated into experimental designs on the presumed, but untested, grounds of both its negligibility compared to among-otolith variability and of spatial autocorrelation among multiple ablations within an otolith. Here, using a hierarchical sampling design of spatial variation at multiple scales in otolith chemical fingerprints for two Mediterranean coastal fishes, we explore: (1) whether multiple ablations within an otolith can be used as independent replicates for significance tests among otoliths, and (2) the implications of incorporating within-otolith variability when assessing spatial variability in otolith chemistry at a hierarchy of spatial scales (different fish, from different sites, at different locations on the Apulian Adriatic coast). We find that multiple ablations along the same daily rings do not necessarily exhibit spatial dependency within the otolith and can be used to estimate residual variability in a hierarchical sampling design. Inclusion of within-otolith measurements reveals that individuals at the same site can show significant variability in elemental uptake. Within-otolith variability examined across the spatial hierarchy identifies differences between the two fish species investigated, and this finding leads to discussion of the potential for within-otolith variability to be used as a marker for fish exposure to stressful conditions. We also demonstrate that a 'cost'-optimal allocation of sampling effort should typically include some level of within

  3. Within-otolith variability in chemical fingerprints: implications for sampling designs and possible environmental interpretation.

    Science.gov (United States)

    Di Franco, Antonio; Bulleri, Fabio; Pennetta, Antonio; De Benedetto, Giuseppe; Clarke, K Robert; Guidetti, Paolo

    2014-01-01

    Largely used as a natural biological tag in studies of dispersal/connectivity of fish, otolith elemental fingerprinting is usually analyzed by laser ablation-inductively coupled plasma-mass spectrometry (LA-ICP-MS). LA-ICP-MS produces an elemental fingerprint at a discrete time-point in the life of a fish and can generate data on within-otolith variability of that fingerprint. The presence of within-otolith variability has been previously acknowledged but not incorporated into experimental designs on the presumed, but untested, grounds of both its negligibility compared to among-otolith variability and of spatial autocorrelation among multiple ablations within an otolith. Here, using a hierarchical sampling design of spatial variation at multiple scales in otolith chemical fingerprints for two Mediterranean coastal fishes, we explore: 1) whether multiple ablations within an otolith can be used as independent replicates for significance tests among otoliths, and 2) the implications of incorporating within-otolith variability when assessing spatial variability in otolith chemistry at a hierarchy of spatial scales (different fish, from different sites, at different locations on the Apulian Adriatic coast). We find that multiple ablations along the same daily rings do not necessarily exhibit spatial dependency within the otolith and can be used to estimate residual variability in a hierarchical sampling design. Inclusion of within-otolith measurements reveals that individuals at the same site can show significant variability in elemental uptake. Within-otolith variability examined across the spatial hierarchy identifies differences between the two fish species investigated, and this finding leads to discussion of the potential for within-otolith variability to be used as a marker for fish exposure to stressful conditions. We also demonstrate that a 'cost'-optimal allocation of sampling effort should typically include some level of within-otolith replication in the
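
The 'cost'-optimal allocation mentioned above follows classic two-stage design theory: the optimal number of within-otolith ablations balances the cost ratio against the ratio of the variance components. A hedged sketch with illustrative variable names and costs (not values from the paper):

```python
import math

def optimal_replicates(c_otolith, c_ablation, var_among, var_within):
    """Optimal number of within-unit replicates in a two-stage design.

    Classic result: m = sqrt((c1/c2) * (sigma2_within / sigma2_among)),
    where c1 is the cost of adding an otolith (fish capture + preparation)
    and c2 the cost of one extra ablation on an otolith already in hand.
    All inputs here are illustrative assumptions.
    """
    m = math.sqrt((c_otolith / c_ablation) * (var_within / var_among))
    return max(1, round(m))
```

For example, if an extra otolith costs 25 times as much as an extra ablation and within-otolith variance is 40% of among-otolith variance, about three ablations per otolith are optimal, which matches the paper's qualitative conclusion that some within-otolith replication is worth its cost.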

  4. The SAGES Legacy Unifying Globulars and Galaxies survey (SLUGGS): sample definition, methods, and initial results

    Energy Technology Data Exchange (ETDEWEB)

    Brodie, Jean P.; Romanowsky, Aaron J.; Jennings, Zachary G.; Pota, Vincenzo; Kader, Justin; Roediger, Joel C.; Villaume, Alexa; Arnold, Jacob A.; Woodley, Kristin A. [University of California Observatories, 1156 High Street, Santa Cruz, CA 95064 (United States); Strader, Jay [Department of Physics and Astronomy, Michigan State University, East Lansing, MI 48824 (United States); Forbes, Duncan A.; Pastorello, Nicola; Usher, Christopher; Blom, Christina; Kartha, Sreeja S. [Centre for Astrophysics and Supercomputing, Swinburne University, Hawthorn, VIC 3122 (Australia); Foster, Caroline; Spitler, Lee R., E-mail: jbrodie@ucsc.edu [Australian Astronomical Observatory, P.O. Box 915, North Ryde, NSW 1670 (Australia)

    2014-11-20

    We introduce and provide the scientific motivation for a wide-field photometric and spectroscopic chemodynamical survey of nearby early-type galaxies (ETGs) and their globular cluster (GC) systems. The SAGES Legacy Unifying Globulars and GalaxieS (SLUGGS) survey is being carried out primarily with Subaru/Suprime-Cam and Keck/DEIMOS. The former provides deep gri imaging over a 900 arcmin² field-of-view to characterize GC and host galaxy colors and spatial distributions, and to identify spectroscopic targets. The NIR Ca II triplet provides GC line-of-sight velocities and metallicities out to typically ∼8 R_e, and to ∼15 R_e in some cases. New techniques to extract integrated stellar kinematics and metallicities to large radii (∼2-3 R_e) are used in concert with GC data to create two-dimensional (2D) velocity and metallicity maps for comparison with simulations of galaxy formation. The advantages of SLUGGS compared with other, complementary, 2D-chemodynamical surveys are its superior velocity resolution, radial extent, and multiple halo tracers. We describe the sample of 25 nearby ETGs, the selection criteria for galaxies and GCs, the observing strategies, the data reduction techniques, and modeling methods. The survey observations are nearly complete and more than 30 papers have so far been published using SLUGGS data. Here we summarize some initial results, including signatures of two-phase galaxy assembly, evidence for GC metallicity bimodality, and a novel framework for the formation of extended star clusters and ultracompact dwarfs. An integrated overview of current chemodynamical constraints on GC systems points to separate, in situ formation modes at high redshifts for metal-poor and metal-rich GCs.

  5. Multidrug resistance among new tuberculosis cases: detecting local variation through lot quality-assurance sampling.

    Science.gov (United States)

    Hedt, Bethany Lynn; van Leth, Frank; Zignol, Matteo; Cobelens, Frank; van Gemert, Wayne; Nhung, Nguyen Viet; Lyepshina, Svitlana; Egwaga, Saidi; Cohen, Ted

    2012-03-01

    Current methodology for multidrug-resistant tuberculosis (MDR TB) surveys endorsed by the World Health Organization provides estimates of MDR TB prevalence among new cases at the national level. On the aggregate, local variation in the burden of MDR TB may be masked. This paper investigates the utility of applying lot quality-assurance sampling to identify geographic heterogeneity in the proportion of new cases with multidrug resistance. We simulated the performance of lot quality-assurance sampling by applying these classification-based approaches to data collected in the most recent TB drug-resistance surveys in Ukraine, Vietnam, and Tanzania. We explored three classification systems (two-way static, three-way static, and three-way truncated sequential sampling) at two sets of thresholds: low MDR TB = 2%, high MDR TB = 10%; and low MDR TB = 5%, high MDR TB = 20%. The lot quality-assurance sampling systems identified local variability in the prevalence of multidrug resistance in both high-resistance (Ukraine) and low-resistance (Vietnam) settings. In Tanzania, prevalence was uniformly low, and the lot quality-assurance sampling approach did not reveal variability. The three-way classification systems provide additional information, but the required sample sizes may not be obtainable in some settings. New rapid drug-sensitivity testing methods may allow truncated sequential sampling designs and early stopping within static designs, producing even greater efficiency gains. Lot quality-assurance sampling study designs may offer an efficient approach for collecting critical information on local variability in the burden of multidrug-resistant TB. Before this methodology is adopted, programs must determine appropriate classification thresholds, the most useful classification system, and appropriate weighting if unbiased national estimates are also desired.
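
LQAS decision rules of the kind simulated above are derived from binomial tail probabilities: a lot of n sampled cases is classified "high" when the number of MDR TB cases reaches a threshold d chosen to control the classification errors. A hedged sketch using the paper's 2%/10% thresholds (the rule-finding procedure here is a generic illustration, not the authors' exact method):

```python
from math import comb

def binom_tail(n, d, p):
    """P(X >= d) for X ~ Binomial(n, p)."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(d, n + 1))

def lqas_rule(n, p_low, p_high, alpha=0.05):
    """Smallest d such that classifying 'high' when >= d positives are seen
    keeps P(classify high | true prevalence p_low) <= alpha.

    Returns (d, err_low, err_high), where err_high is the companion risk
    P(classify low | true prevalence p_high).
    """
    for d in range(n + 1):
        err_low = binom_tail(n, d, p_low)
        if err_low <= alpha:
            return d, err_low, 1 - binom_tail(n, d, p_high)
    return n + 1, 0.0, 1.0
```

For n = 50 at thresholds 2%/10% this yields d = 4, with the low-prevalence error below 5% but a high-prevalence misclassification risk near 25%, illustrating why thresholds and sample sizes must be chosen jointly.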

  6. Importance of sampling design and analysis in animal population studies: a comment on Sergio et al

    Science.gov (United States)

    Kery, M.; Royle, J. Andrew; Schmid, Hans

    2008-01-01

    1. The use of predators as indicators and umbrellas in conservation has been criticized. In the Trentino region, Sergio et al. (2006; hereafter SEA) counted almost twice as many bird species in quadrats located in raptor territories than in controls. However, SEA detected astonishingly few species. We used contemporary Swiss Breeding Bird Survey data from an adjacent region and a novel statistical model that corrects for overlooked species to estimate the expected number of bird species per quadrat in that region. 2. There are two anomalies in SEA which render their results ambiguous. First, SEA detected on average only 6.8 species, whereas a value of 32 might be expected. Hence, they probably overlooked almost 80% of all species. Secondly, the precision of their mean species counts was greater in two-thirds of cases than in the unlikely case that all quadrats harboured exactly the same number of equally detectable species. This suggests that they detected consistently only a biased, unrepresentative subset of species. 3. Conceptually, expected species counts are the product of true species number and species detectability p. Plenty of factors may affect p, including date, hour, observer, previous knowledge of a site and mobbing behaviour of passerines in the presence of predators. Such differences in p between raptor and control quadrats could have easily created the observed effects. Without a method that corrects for such biases, or without quantitative evidence that species detectability was indeed similar between raptor and control quadrats, the meaning of SEA's counts is hard to evaluate. Therefore, the evidence presented by SEA in favour of raptors as indicator species for enhanced levels of biodiversity remains inconclusive. 4. Synthesis and application. Ecologists should pay greater attention to sampling design and analysis in animal population estimation. Species richness estimation means sampling a community. Samples should be representative for the
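
Correcting species counts for overlooked species is the crux of the argument above. The commented paper used a model-based estimator; as a simpler, purely illustrative stand-in, the nonparametric Chao1 lower bound estimates true richness from the counts of species seen exactly once or twice:

```python
def chao1(counts):
    """Chao1 lower-bound estimate of species richness.

    counts: per-species abundances from one sampled quadrat/community.
    Many singletons relative to doubletons signal many unseen species.
    """
    s_obs = sum(1 for c in counts if c > 0)
    f1 = sum(1 for c in counts if c == 1)   # singletons
    f2 = sum(1 for c in counts if c == 2)   # doubletons
    if f2 == 0:
        return s_obs + f1 * (f1 - 1) / 2.0  # bias-corrected form
    return s_obs + f1 * f1 / (2.0 * f2)
```

If detectability differs between raptor and control quadrats, raw counts are biased in exactly the way such estimators are designed to expose: the same observed count can correspond to very different estimated richness.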

  7. Trajectory Design Enhancements to Mitigate Risk for the Transiting Exoplanet Survey Satellite (TESS)

    Science.gov (United States)

    Dichmann, Donald; Parker, Joel; Nickel, Craig; Lutz, Stephen

    2016-01-01

    The Transiting Exoplanet Survey Satellite (TESS) will employ a highly eccentric Earth orbit, in 2:1 lunar resonance, which will be reached with a lunar flyby preceded by 3.5 phasing loops. The TESS mission has limited propellant and several constraints on the science orbit and on the phasing loops. Based on analysis and simulation, we have designed the phasing loops to reduce delta-V (DV) and to mitigate risk due to maneuver execution errors. We have automated the trajectory design process and use distributed processing to generate optimal nominal trajectories, to check constraint satisfaction, and finally to model the effects of maneuver errors to identify trajectories that best meet the mission requirements.

  8. OSIRIS-REx Touch-and-Go (TAG) Mission Design for Asteroid Sample Collection

    Science.gov (United States)

    May, Alexander; Sutter, Brian; Linn, Timothy; Bierhaus, Beau; Berry, Kevin; Mink, Ron

    2014-01-01

    The Origins Spectral Interpretation Resource Identification Security Regolith Explorer (OSIRIS-REx) mission is a NASA New Frontiers mission launching in September 2016 to rendezvous with the near-Earth asteroid Bennu in October 2018. After several months of proximity operations to characterize the asteroid, OSIRIS-REx flies a Touch-And-Go (TAG) trajectory to the asteroid's surface to collect at least 60 g of pristine regolith sample for Earth return. This paper provides mission and flight system overviews, with more details on the TAG mission design and the key events that occur to safely and successfully collect the sample. An overview of the navigation performed relative to a chosen sample site, along with the maneuvers to reach the desired site, is described. Safety monitoring during descent is performed with onboard sensors, providing an option to abort, troubleshoot, and try again if necessary. Sample collection occurs using a collection device at the end of an articulating robotic arm during a brief five-second contact period, while a constant-force spring mechanism in the arm assists to rebound the spacecraft away from the surface. Finally, the sample is measured quantitatively utilizing the law of conservation of angular momentum, along with qualitative data from imagery of the sampling device. Upon sample mass verification, the arm places the sample into the Stardust-heritage Sample Return Capsule (SRC) for return to Earth in September 2023.
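
The angular-momentum mass measurement can be sketched in a few lines: if the same angular momentum is imparted in spin maneuvers before and after TAG, the drop in spin rate gives the added moment of inertia, which is attributed to the collected mass at the sampler-head radius. All numbers and the point-mass assumption below are illustrative, not OSIRIS-REx flight values:

```python
def sample_mass(L, omega_before, omega_after, r):
    """Estimate collected mass from spin-rate change.

    Assumes (illustratively) the same angular momentum L is imparted in
    both spin maneuvers, so I = L / omega, and that the added inertia
    comes from a point mass at radius r (the sampler head on the arm):
        m = (I_after - I_before) / r**2
    """
    delta_inertia = L / omega_after - L / omega_before
    return delta_inertia / r**2
```

In practice the measurable spin-rate change from tens of grams on a multi-tonne spacecraft is tiny, which is why the result is cross-checked against imagery of the sampling device.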

  9. Sampling large landscapes with small-scale stratification-User's Manual

    Science.gov (United States)

    Bart, Jonathan

    2011-01-01

    This manual explains procedures for partitioning a large landscape into plots, assigning the plots to strata, and selecting plots in each stratum to be surveyed. These steps are referred to as the "sampling large landscapes (SLL) process." We assume that users of the manual have a moderate knowledge of ArcGIS and Microsoft® Excel. The manual is written for a single user, but in many cases some steps will be carried out by a biologist designing the survey and some steps will be carried out by a quantitative assistant. Thus, the manual essentially may be passed back and forth between these users. The SLL process primarily has been used to survey birds, and we refer to birds as the subjects of the counts. The process, however, could be used to count any objects.
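
The final SLL step, selecting plots within each stratum, amounts to stratified random sampling. A minimal sketch (the manual's actual workflow uses ArcGIS and Excel; this is an illustrative stand-alone version with hypothetical stratum names):

```python
import random

def select_plots(plot_strata, n_per_stratum, seed=42):
    """Simple stratified random selection of survey plots.

    plot_strata:   dict mapping stratum name -> list of plot IDs
    n_per_stratum: dict mapping stratum name -> number of plots to survey
    A fixed seed keeps the draw reproducible for documentation.
    """
    rng = random.Random(seed)
    return {stratum: sorted(rng.sample(plots, n_per_stratum[stratum]))
            for stratum, plots in plot_strata.items()}
```

For example, `select_plots({"wetland": list(range(10)), "upland": list(range(10, 30))}, {"wetland": 3, "upland": 5})` draws three wetland and five upland plots, so each stratum's sampling fraction can be set independently of its size.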

  10. Robotic Irradiated Sample Handling Concept Design in Reactor TRIGA PUSPATI using Simulation Software

    International Nuclear Information System (INIS)

    Mohd Khairulezwan Abdul Manan; Mohd Sabri Minhat; Ridzuan Abdul Mutalib; Zareen Khan Abdul Jalil Khan; Nurfarhana Ayuni Joha

    2015-01-01

    This paper introduces the concept design of a Robotic Irradiated Sample Handling Machine using a graphical simulation application, designed as a general, flexible, and open platform for work on robotics. Webots has proven to be a useful tool in many fields of robotics, such as manipulator programming, mobile robot control (wheeled, sub-aquatic, and walking robots), distance computation, sensor simulation, collision detection, motion planning, and so on. Webots is used as the common interface for all the applications. Some practical cases and applications of this concept design are illustrated in the paper to present the possibilities of this simulation software. (author)

  11. Mechanical Design of NESSI: New Mexico Tech Extrasolar Spectroscopic Survey Instrument

    Science.gov (United States)

    Santoro, Fernando G.; Olivares, Andres M.; Salcido, Christopher D.; Jimenez, Stephen R.; Jurgenson, Colby A.; Hrynevych, Michael A.; Creech-Eakman, Michelle J.; Boston, Penny J.; Schmidt, Luke M.; Bloemhard, Heather

    2011-01-01

    NESSI: the New Mexico Tech Extrasolar Spectroscopic Survey Instrument is a ground-based multi-object spectrograph that operates in the near-infrared. It will be installed on one of the Nasmyth ports of the Magdalena Ridge Observatory (MRO) 2.4-meter Telescope sited in the Magdalena Mountains, about 48 km west of Socorro, NM. NESSI operates stationary to the telescope fork so as not to produce differential flexure between internal opto-mechanical components during or between observations. An appropriate mechanical design allows the instrument alignment to be highly repeatable and stable on both short and long observation timescales, within a wide range of temperature variation. NESSI is optically composed of a field lens, a field de-rotator, re-imaging optics, an auto-guider, and a Dewar spectrograph that operates at LN2 temperature. In this paper we report on NESSI's detailed mechanical and opto-mechanical design, and the planning for mechanical construction, assembly, integration, and verification.

  12. The Jamaica asthma and allergies national prevalence survey: rationale and methods

    Directory of Open Access Journals (Sweden)

    Edwards Nancy C

    2010-04-01

    Full Text Available Abstract Background Asthma is a significant public health problem in the Caribbean. Prevalence surveys using standardized measures of asthma provide valid prevalence estimates to facilitate regional and international comparisons and monitoring of trends. This paper describes the methods used in the Jamaica Asthma and Allergies National Prevalence Survey, the challenges associated with this survey, and the strategies used to overcome these challenges. Methods/Design An island-wide, cross-sectional, community-based survey of asthma, asthma symptoms, and allergies was done among adults and children using the European Community Respiratory Health Survey Questionnaire for adults and the International Study of Asthma and Allergies in Childhood (ISAAC) questionnaire for children. Stratified multi-stage cluster sampling was used to select 2,163 adults aged 18 years and older and 2,017 children aged 2-17 years for the survey. The Kish selection table was used to select one adult and one child per household. Data analysis accounted for the sampling design, and prevalence estimates were weighted to produce national estimates. Discussion The Jamaica Asthma and Allergies National Prevalence Survey is the first population-based survey in the Caribbean to determine the prevalence of asthma and allergies in both adults and children using standardized methods. With response rates exceeding 80% in both groups, this approach facilitated cost-effective gathering of high-quality asthma prevalence data that will facilitate international and regional comparison and monitoring of asthma prevalence trends. Another unique feature of this study was the partnership with the Ministry of Health in Jamaica, which ensured the collection of data relevant for decision-making to facilitate the uptake of research evidence. The findings of this study will provide important data on the burden of asthma and allergies in Jamaica and contribute to evidence-informed planning of comprehensive asthma management and education programs.
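
Selecting one adult per household via a Kish grid gives each respondent a selection probability of 1/(number of eligible adults), so estimates must weight each respondent by household eligibility. A hedged sketch of this single weighting step (the survey's full weighting also accounts for the stratified cluster design, which is omitted here):

```python
def weighted_prevalence(outcomes, n_eligible):
    """Prevalence weighted for one-respondent-per-household selection.

    outcomes:   1 if the sampled adult reports asthma symptoms, else 0
    n_eligible: eligible adults in that respondent's household; the
                design weight is proportional to this count, since the
                selection probability was 1/n_eligible.
    """
    total_weight = sum(n_eligible)
    return sum(y * w for y, w in zip(outcomes, n_eligible)) / total_weight
```

With outcomes [1, 0, 0, 1] and household sizes [1, 4, 2, 1], the unweighted prevalence is 0.50 but the design-weighted estimate is 0.25, because both positives came from easy-to-select single-adult households.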

  13. The outlier sample effects on multivariate statistical data processing geochemical stream sediment survey (Moghangegh region, North West of Iran)

    International Nuclear Information System (INIS)

    Ghanbari, Y.; Habibnia, A.; Memar, A.

    2009-01-01

    In a geochemical stream sediment survey of the Moghangegh region in north-west Iran (1:50,000 sheet), 152 samples were collected. After analysis and processing of the data, it was revealed that the Yb, Sc, Ni, Li, Eu, Cd, Co and As contents in one sample were far higher than in the other samples. After identifying this sample as an outlier, its effect on multivariate statistical data processing, and the destructive effects of outlier samples in geochemical exploration generally, were investigated. Pearson and Spearman correlation coefficient methods and cluster analysis were used for the multivariate studies, and scatter plots of selected element pairs together with their regression lines are given for both the 152- and 151-sample cases and the results compared. This investigation showed that the presence of an outlier sample can produce the following relations between elements: a true relation between two elements, neither of which has an outlying value in the outlier sample; a false relation between two elements, one of which has an outlying value in the outlier sample; and a completely false relation between two elements, both of which have outlying values in the outlier sample.
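The false-relation effect can be illustrated with a small simulation: two element concentrations that are genuinely uncorrelated appear strongly correlated under Pearson once a single anomalous sample is included, while the rank-based Spearman coefficient is barely affected. The element pair, concentrations and magnitudes below are invented for illustration:

```python
import random

def pearson(x, y):
    """Pearson product-moment correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return sxy / (sx * sy)

def ranks(v):
    """Ranks 1..n of the values (no ties expected with continuous data)."""
    order = sorted(range(len(v)), key=lambda i: v[i])
    r = [0.0] * len(v)
    for rank, i in enumerate(order):
        r[i] = float(rank + 1)
    return r

def spearman(x, y):
    """Spearman correlation: Pearson correlation of the ranks."""
    return pearson(ranks(x), ranks(y))

random.seed(1)
# two unrelated element concentrations across 151 ordinary samples
ni = [random.gauss(50, 5) for _ in range(151)]
cd = [random.gauss(0.3, 0.05) for _ in range(151)]
# the 152nd, anomalous sample is extreme in both elements
ni_out, cd_out = ni + [500.0], cd + [5.0]

print(f"Pearson  without outlier: {pearson(ni, cd):+.2f}")
print(f"Pearson  with outlier:    {pearson(ni_out, cd_out):+.2f}")
print(f"Spearman with outlier:    {spearman(ni_out, cd_out):+.2f}")
```

With the outlier included, Pearson jumps to nearly +1 (a completely false relation, since both elements have outlying values in that sample), whereas Spearman stays near zero.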

  14. Evaluation of single and two-stage adaptive sampling designs for estimation of density and abundance of freshwater mussels in a large river

    Science.gov (United States)

    Smith, D.R.; Rogala, J.T.; Gray, B.R.; Zigler, S.J.; Newton, T.J.

    2011-01-01

    Reliable estimates of abundance are needed to assess the consequences of proposed habitat restoration and enhancement projects on freshwater mussels in the Upper Mississippi River (UMR). Although there is general guidance on sampling techniques for population assessment of freshwater mussels, the actual performance of a sampling design can depend critically on the population density and spatial distribution at the project site. To evaluate various sampling designs, we simulated sampling of populations that varied in density and degree of spatial clustering. Because of the logistics and costs of large-river sampling and the spatial clustering of freshwater mussels, we focused on adaptive and non-adaptive versions of single and two-stage sampling. The candidate designs performed similarly in terms of precision (CV) and probability of species detection for a fixed sample size. Both CV and species detection were determined largely by density, spatial distribution and sample size. However, the designs did differ in the rate at which occupied quadrats were encountered: occupied units had a higher probability of selection under the adaptive designs than under conventional designs. We used two measures of cost: sample size (i.e. number of quadrats) and distance travelled between quadrats. Adaptive and two-stage designs tended to reduce the distance between sampling units, and thus performed better when distance travelled was considered. Based on these comparisons, we provide general recommendations on sampling designs for freshwater mussels in the UMR and, presumably, other large rivers.
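A minimal version of this kind of simulation, using simple random quadrat sampling of a spatially clustered population and repeated draws to estimate the CV of the density estimator, might look like the sketch below. The grid size, cluster parameters and sample sizes are illustrative assumptions, not the study's settings:

```python
import random
import statistics

random.seed(7)
GRID = 50  # the study area as a 50 x 50 grid of quadrats

def clustered_population(n_parents, per_parent, spread):
    """Counts per quadrat for a spatially clustered population:
    individuals scatter around randomly placed parent locations."""
    counts = [[0] * GRID for _ in range(GRID)]
    for _ in range(n_parents):
        px, py = random.randrange(GRID), random.randrange(GRID)
        for _ in range(per_parent):
            x = min(GRID - 1, max(0, px + random.randint(-spread, spread)))
            y = min(GRID - 1, max(0, py + random.randint(-spread, spread)))
            counts[x][y] += 1
    return counts

def srs_density_cv(counts, n_quadrats, reps=500):
    """CV of the mean-count estimator under repeated simple random
    sampling of quadrats (without replacement within each draw)."""
    cells = [c for row in counts for c in row]
    estimates = []
    for _ in range(reps):
        sample = random.sample(cells, n_quadrats)
        estimates.append(sum(sample) / n_quadrats)
    return statistics.stdev(estimates) / statistics.mean(estimates)

sparse = clustered_population(n_parents=20, per_parent=30, spread=2)
dense = clustered_population(n_parents=200, per_parent=30, spread=2)
print(f"CV at low density:  {srs_density_cv(sparse, 100):.2f}")
print(f"CV at high density: {srs_density_cv(dense, 100):.2f}")
```

As in the study's findings, precision is driven largely by density: the same sample size yields a much higher CV for the sparse, clustered population. Extending the sketch to adaptive or two-stage designs would add neighbourhood follow-up sampling around occupied quadrats.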

  15. Cluster-sample surveys and lot quality assurance sampling to evaluate yellow fever immunisation coverage following a national campaign, Bolivia, 2007.

    Science.gov (United States)

    Pezzoli, Lorenzo; Pineda, Silvia; Halkyer, Percy; Crespo, Gladys; Andrews, Nick; Ronveaux, Olivier

    2009-03-01

    To estimate the yellow fever (YF) vaccine coverage for the endemic and non-endemic areas of Bolivia and to determine whether selected districts had acceptable levels of coverage (>70%). We conducted two surveys of 600 individuals (25 x 12 clusters) to estimate coverage in the endemic and non-endemic areas. We assessed 11 districts using lot quality assurance sampling (LQAS). The lot (district) sample was 35 individuals with six as the decision value (6% alpha error if true coverage is 70%; 6% beta error if true coverage is 90%). To increase feasibility, we divided the lots into five clusters of seven individuals; to investigate the effect of clustering, we calculated alpha and beta by conducting simulations in which each cluster's true coverage was sampled from a normal distribution with a mean of 70% or 90% and a standard deviation of 5% or 10%. Estimated coverage was 84.3% (95% CI: 78.9-89.7) in endemic areas, 86.8% (82.5-91.0) in non-endemic areas and 86.0% (82.8-89.1) nationally. LQAS showed that four lots had unacceptable coverage levels. In six lots, results were inconsistent with the estimated administrative coverage. The simulations suggested that clustering the lots is unlikely to have significantly increased the risk of making incorrect accept/reject decisions. Estimated YF coverage was high. Discrepancies between administrative coverage and LQAS results may be due to incorrect population data. Even allowing for clustering in LQAS, the statistical errors would remain low. Catch-up campaigns are recommended in districts with unacceptable coverage.
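The clustered LQAS simulation can be sketched as follows. The decision rule (reject a lot when more than 6 of the 35 sampled individuals are unvaccinated) follows the abstract, but the normal-distribution parameters are plugged in for illustration only, so the resulting error rates will not reproduce the paper's exact figures:

```python
import random

def lqas_rejection_rate(mean_cov, sd, threshold=6, n_clusters=5,
                        per_cluster=7, reps=20000):
    """Share of simulated lots rejected (more than `threshold` unvaccinated
    among the 35 sampled) when each cluster's true coverage is drawn from
    a normal distribution around `mean_cov`, clipped to [0, 1]."""
    rejections = 0
    for _ in range(reps):
        failures = 0
        for _ in range(n_clusters):
            p = min(1.0, max(0.0, random.gauss(mean_cov, sd)))
            failures += sum(random.random() > p for _ in range(per_cluster))
        if failures > threshold:
            rejections += 1
    return rejections / reps

random.seed(3)
# With true coverage 90%, rejecting the lot is the beta-type error;
# with true coverage 70%, accepting the lot is the alpha-type error.
beta = lqas_rejection_rate(0.90, 0.10)
alpha = 1 - lqas_rejection_rate(0.70, 0.10)
print(f"P(reject | 90% coverage) ~ {beta:.3f}")
print(f"P(accept | 70% coverage) ~ {alpha:.3f}")
```

Comparing these rates against the non-clustered binomial benchmarks (about 6% each, per the abstract) quantifies how much the between-cluster variation inflates the risk of incorrect accept/reject decisions.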

  16. Design and characterization of poly(dimethylsiloxane)-based valves for interfacing continuous-flow sampling to microchip electrophoresis.

    Science.gov (United States)

    Li, Michelle W; Huynh, Bryan H; Hulvey, Matthew K; Lunte, Susan M; Martin, R Scott

    2006-02-15

    This work describes the fabrication and evaluation of a poly(dimethylsiloxane) (PDMS)-based device that enables the discrete injection of a sample plug from a continuous-flow stream into a microchannel for subsequent analysis by electrophoresis. Devices were fabricated by aligning valving and flow channel layers, followed by plasma sealing the combined layers onto a glass plate that contained fittings for the introduction of liquid sample and nitrogen gas. The design incorporates a reduced-volume pneumatic valve that actuates (on the order of hundreds of milliseconds) to allow analyte from a continuously flowing sampling channel to be injected into a separation channel for electrophoresis. The injector design was optimized to include a pushback channel to flush away stagnant sample associated with the injector dead volume. The effects of the valve actuation time, the pushback voltage, and the sampling stream flow rate on the performance of the device were characterized. Using the optimized design at an injection frequency of 0.64 Hz showed that the injection process is reproducible (RSD of 1.77%, n = 15). Concentration change experiments using fluorescein as the analyte showed that the device could achieve a lag time as small as 14 s. Finally, to demonstrate potential uses of this device, the microchip was coupled to a microdialysis probe to monitor a concentration change and sample a fluorescein dye mixture.

  17. Sampling design for the Study of Cardiovascular Risks in Adolescents (ERICA)

    Directory of Open Access Journals (Sweden)

    Mauricio Teixeira Leite de Vasconcellos

    2015-05-01

    Full Text Available The Study of Cardiovascular Risks in Adolescents (ERICA) aims to estimate the prevalence of cardiovascular risk factors and metabolic syndrome in adolescents (12-17 years) enrolled in public and private schools of the 273 municipalities with over 100,000 inhabitants in Brazil. The study population was stratified into 32 geographical strata (27 capitals and five sets of other municipalities, one per macro-region of the country), and a sample of 1,251 schools was selected with probability proportional to size. In each school, three combinations of shift (morning and afternoon) and grade were selected, and within each of these combinations one class was selected. All eligible students in the selected classes were included in the study. The design sampling weights were calculated as the product of the reciprocals of the inclusion probabilities at each sampling stage, and were later calibrated against projections of the numbers of adolescents enrolled in schools located in the geographical strata, by sex and age.
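The weighting scheme described above, design weights as products of reciprocals of stage-wise inclusion probabilities followed by calibration to enrolment projections, can be sketched as follows; all probabilities and totals are invented for illustration:

```python
def design_weight(p_school, p_combination, p_class):
    """Basic design weight for a three-stage sample: the product of the
    reciprocals of the inclusion probabilities at each sampling stage."""
    return (1 / p_school) * (1 / p_combination) * (1 / p_class)

def calibrate(weights, population_total):
    """Simple calibration: rescale the weights so they sum to a known
    enrolment projection for the stratum."""
    factor = population_total / sum(weights)
    return [w * factor for w in weights]

# e.g. a school selected with probability 0.05, one of 3 shift/grade
# combinations chosen, and one of 4 classes within that combination
w = design_weight(0.05, 1 / 3, 1 / 4)
print(round(w, 6))  # weight before calibration

# 30 such students calibrated to a projected stratum enrolment of 9,000
calibrated = calibrate([w] * 30, population_total=9000)
print(round(sum(calibrated), 1))
```

In practice the calibration would be done within cells defined by stratum, sex and age, but the rescaling logic is the same per cell.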

  18. Report on the Survey of the Design Review of New Reactor Applications. Volume 4: Reactor Coolant and Associated Systems

    International Nuclear Information System (INIS)

    Downey, Steven; Monninger, John; Nevalainen, Janne; Joyer, Philippe; Koley, Jaharlal; Kawamura, Tomonori; Chung, Yeon-Ki; Haluska, Ladislav; Persic, Andreja; Reierson, Craig; Monninger, John; Choi, Young-Joon; )

    2017-01-01

    At the tenth meeting of the Committee on Nuclear Regulatory Activities (CNRA) Working Group on the Regulation of New Reactors (WGRNR) in March 2013, the Working Group agreed to present the responses to the Second Phase, or Design Phase, of the licensing process survey as a multi-volume text. As such, each report will focus on one of the eleven general technical categories covered in the survey. The general technical categories were selected to conform to the topics covered in the International Atomic Energy Agency (IAEA) Safety Guide GS-G-4.1. This report provides a discussion of the survey responses related to the Reactor Coolant and Associated Systems category. The Reactor Coolant and Associated Systems category includes the following technical topics: overpressure protection, reactor coolant pressure boundary, reactor vessel, and design of the reactor coolant system. For each technical topic, the member countries described the information provided by the applicant, the scope and level of detail of the technical review, the technical basis for granting regulatory authorisation, the skill sets required and the level of effort needed to perform the review. Based on a comparison of the information provided by the member countries in response to the survey, the following observations were made: - Although the description of the information provided by the applicant differs in scope and level of detail among the member countries that provided responses, there are similarities in the information that is required. - All of the technical topics covered in the survey are reviewed in some manner by all of the regulatory authorities that provided responses. - It is common to consider operating experience and lessons learnt from the current fleet during the review process. - The most commonly and consistently identified technical expertise needed to perform design reviews related to this category are mechanical engineering and materials engineering. The complete survey

  19. Estimation of Social Exclusion Indicators from Complex Surveys: The R Package laeken

    Directory of Open Access Journals (Sweden)

    Andreas Alfons

    2013-09-01

    Full Text Available Units sampled from finite populations typically come with different inclusion probabilities. Together with additional preprocessing steps of the raw data, this yields unequal sampling weights of the observations. Whenever indicators are estimated from such complex samples, the corresponding sampling weights have to be taken into account. In addition, many indicators suffer from a strong influence of outliers, which are a common problem in real-world data. The R package laeken is an object-oriented toolkit for the estimation of indicators from complex survey samples via standard or robust methods. In particular, the most widely used social exclusion and poverty indicators are implemented in the package. A general calibrated bootstrap method to estimate the variance of indicators for common survey designs is included as well. Furthermore, the package contains synthetically generated close-to-reality data for the European Union Statistics on Income and Living Conditions and the Structure of Earnings Survey, which are used in the code examples throughout the paper. Even though the paper is focused on showing the functionality of package laeken, it also provides a brief mathematical description of the implemented indicator methodology.
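As an illustration of one widely used indicator of the kind the package implements, here is a sketch (in Python rather than R, so independent of laeken itself) of the at-risk-of-poverty rate: the weighted share of people below 60% of the weighted median income. The incomes and weights below are invented:

```python
def weighted_median(values, weights):
    """Smallest value at which the cumulative weight reaches half the total."""
    pairs = sorted(zip(values, weights))
    total = sum(weights)
    cum = 0.0
    for v, w in pairs:
        cum += w
        if cum >= total / 2:
            return v

def at_risk_of_poverty_rate(incomes, weights):
    """Weighted share of people below 60% of the weighted median income."""
    threshold = 0.6 * weighted_median(incomes, weights)
    poor = sum(w for inc, w in zip(incomes, weights) if inc < threshold)
    return poor / sum(weights)

incomes = [8000, 12000, 15000, 20000, 22000, 30000, 45000]
weights = [1500, 1200, 1000, 900, 800, 700, 400]  # unequal sampling weights
print(f"at-risk-of-poverty rate: {at_risk_of_poverty_rate(incomes, weights):.3f}")
```

Ignoring the weights here would both shift the median and change the share below the threshold, which is exactly why the abstract stresses that sampling weights must be taken into account.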

  20. A design-based approximation to the Bayes Information Criterion in finite population sampling

    Directory of Open Access Journals (Sweden)

    Enrico Fabrizi

    2014-05-01

    Full Text Available In this article, various issues related to the implementation of the usual Bayesian Information Criterion (BIC) are critically examined in the context of modelling a finite population. A suitable design-based approximation to the BIC is proposed in order to avoid the derivation of the exact likelihood of the sample, which is often very complex in finite population sampling. The approximation is justified using a theoretical argument and a Monte Carlo simulation study.
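To make the ingredients concrete, here is a hedged sketch: the ordinary BIC computed from a Gaussian log-likelihood, alongside a simple design-based variant that substitutes a survey-weighted pseudo-log-likelihood with the weights rescaled to sum to the sample size. This illustrates the general idea only; it is not the approximation actually proposed in the article, and the data and weights are invented:

```python
import math

def normal_loglik(data, weights=None):
    """(Weighted) Gaussian log-likelihood evaluated at the (weighted) MLE."""
    w = weights or [1.0] * len(data)
    sw = sum(w)
    mu = sum(wi * x for wi, x in zip(w, data)) / sw
    var = sum(wi * (x - mu) ** 2 for wi, x in zip(w, data)) / sw
    return -0.5 * sw * (math.log(2 * math.pi * var) + 1)

def bic(loglik, k, n):
    """BIC = k * ln(n) - 2 * ln(L), with k parameters and n observations."""
    return k * math.log(n) - 2 * loglik

data = [4.1, 5.0, 5.2, 6.3, 7.8, 8.4]
weights = [2.0, 1.0, 1.0, 1.0, 1.0, 2.0]  # unequal inclusion probabilities
n = len(data)
scale = n / sum(weights)  # normalise weights to sum to the sample size

plain = bic(normal_loglik(data), k=2, n=n)                    # ignores the design
design = bic(scale * normal_loglik(data, weights), k=2, n=n)  # weighted pseudo-likelihood
print(f"ordinary BIC: {plain:.2f}, design-based variant: {design:.2f}")
```

The gap between the two values grows with how strongly the weights are correlated with the outcome, which is when ignoring the design is most misleading.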