WorldWideScience

Sample records for outcomes sampling sample

  1. Graphical models for inference under outcome-dependent sampling

    DEFF Research Database (Denmark)

    Didelez, V; Kreiner, S; Keiding, N

    2010-01-01

We consider situations where data have been collected such that the sampling depends on the outcome of interest and possibly further covariates, as for instance in case-control studies. Graphical models represent assumptions about the conditional independencies among the variables. By including a node for the sampling indicator, assumptions about sampling processes can be made explicit. We demonstrate how to read off such graphs whether consistent estimation of the association between exposure and outcome is possible. Moreover, we give sufficient graphical conditions for testing and estimating…

  2. Sample size estimation and sampling techniques for selecting a representative sample

    Directory of Open Access Journals (Sweden)

    Aamir Omair

    2014-01-01

Full Text Available Introduction: The purpose of this article is to provide a general understanding of the concepts of sampling as applied to health-related research. Sample Size Estimation: It is important to select a representative sample in quantitative research in order to be able to generalize the results to the target population. The sample should be of the required sample size and must be selected using an appropriate probability sampling technique. There are many hidden biases which can adversely affect the outcome of the study. Important factors to consider in estimating the sample size include the size of the study population, the confidence level, the expected proportion of the outcome variable (for categorical variables) or the standard deviation of the outcome variable (for numerical variables), and the required precision (margin of accuracy) of the study. The greater the required precision, the larger the required sample size. Sampling Techniques: The probability sampling techniques applied in health-related research include simple random sampling, systematic random sampling, stratified random sampling, cluster sampling, and multistage sampling. These are recommended over nonprobability sampling techniques because the results of the study can be generalized to the target population.
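
    The single-proportion calculation this record describes is easy to make concrete. Below is a minimal Python sketch of the standard formula n = z²·p·(1−p)/d²; the prevalence, margin, and confidence values are illustrative, not taken from the article.

    ```python
    from math import ceil
    from statistics import NormalDist

    def sample_size_proportion(p, margin, confidence=0.95):
        """Sample size for estimating a proportion p within +/- margin.

        Uses the standard formula n = z^2 * p * (1 - p) / margin^2,
        where z is the two-sided normal quantile for the confidence level.
        """
        z = NormalDist().inv_cdf(1 - (1 - confidence) / 2)
        return ceil(z**2 * p * (1 - p) / margin**2)

    # Illustrative values: expected prevalence 30%, +/-5% precision, 95% confidence.
    print(sample_size_proportion(0.30, 0.05))  # -> 323
    ```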

  3. Evaluation of the International Outcome Inventory for Hearing Aids in a veteran sample.

    Science.gov (United States)

    Smith, Sherri L; Noe, Colleen M; Alexander, Genevieve C

    2009-06-01

The International Outcome Inventory for Hearing Aids (IOI-HA) was developed as a global hearing aid outcome measure targeting seven outcome domains. The published norms were based on a private-pay sample fitted with analog hearing aids. The purpose of this study was to evaluate the psychometric properties of the IOI-HA and to establish normative data in a veteran sample. Survey. The participants were 131 male veterans (mean age 74.3 years, SD = 7.4) who had been fitted bilaterally between 2005 and 2007 with hearing aids featuring digital signal processing (DSP). Veterans were mailed two copies of the IOI-HA. The participants were instructed to complete the first copy of the questionnaire immediately and the second copy two weeks later. The completed questionnaires were mailed to the laboratory, and the psychometric properties of the questionnaire were evaluated. As suggested by Cox and colleagues, the participants were divided into two categories based on their unaided subjective hearing difficulty: (1) those with less hearing difficulty (none-to-moderate category) and (2) those who reported more hearing difficulty (moderately severe+ category). The norms from the current veteran sample were then compared to the original published sample. For each hearing difficulty category, critical difference values were calculated for each item and for the total score. A factor analysis showed that the IOI-HA in the veteran sample had the identical subscale structure as reported in the original sample. For the total scale, internal consistency was good (Cronbach's alpha = 0.83) and test-retest reliability was high (lambda = 0.94). Group and individual norms were developed for both hearing difficulty categories in the veteran sample. For each IOI-HA item, the critical difference score was one response unit; a difference of more than one response unit between two test sessions reflects a true change in outcome for a given domain. The results of this study…

  4. Outcome-Dependent Sampling Design and Inference for Cox's Proportional Hazards Model.

    Science.gov (United States)

    Yu, Jichang; Liu, Yanyan; Cai, Jianwen; Sandler, Dale P; Zhou, Haibo

    2016-11-01

We propose a cost-effective outcome-dependent sampling (ODS) design for failure time data and develop an efficient inference procedure for data collected with this design. To account for the biased sampling scheme, we derive estimators from a weighted partial likelihood estimating equation. The proposed estimators for the regression parameters are shown to be consistent and asymptotically normally distributed. A criterion that can be used to optimally implement the ODS design in practice is proposed and studied. The small-sample performance of the proposed method is evaluated by simulation studies. The proposed design and inference procedure are shown to be statistically more powerful than existing alternative designs with the same sample sizes. We illustrate the proposed method with real data from the Cancer Incidence and Mortality of Uranium Miners Study.

  5. Outcome-Dependent Sampling Design and Inference for Cox’s Proportional Hazards Model

    Science.gov (United States)

    Yu, Jichang; Liu, Yanyan; Cai, Jianwen; Sandler, Dale P.; Zhou, Haibo

    2016-01-01

We propose a cost-effective outcome-dependent sampling (ODS) design for failure time data and develop an efficient inference procedure for data collected with this design. To account for the biased sampling scheme, we derive estimators from a weighted partial likelihood estimating equation. The proposed estimators for the regression parameters are shown to be consistent and asymptotically normally distributed. A criterion that can be used to optimally implement the ODS design in practice is proposed and studied. The small-sample performance of the proposed method is evaluated by simulation studies. The proposed design and inference procedure are shown to be statistically more powerful than existing alternative designs with the same sample sizes. We illustrate the proposed method with real data from the Cancer Incidence and Mortality of Uranium Miners Study. PMID:28090134

  6. Investigating Treatment Outcomes Across OCD Symptom Dimensions in a Clinical Sample of OCD Patients.

    Science.gov (United States)

    Chase, Tannah; Wetterneck, Chad T; Bartsch, Robert A; Leonard, Rachel C; Riemann, Bradley C

    2015-01-01

    Despite the heterogeneous nature of obsessive-compulsive disorder (OCD), many self-report assessments do not adequately capture the clinical picture presenting within each symptom dimension, particularly unacceptable thoughts (UTs). In addition, obsessions and ordering/arranging compulsions are often underrepresented in samples of treatment outcome studies for OCD. Such methodological discrepancies may obscure research findings comparing treatment outcomes across OCD symptom dimensions. This study aimed to improve upon previous research by investigating treatment outcomes across OCD symptom dimensions using the Dimensional Obsessive-Compulsive Scale, which offers a more comprehensive assessment of UTs. The study included a primarily residential sample of 134 OCD patients. Results indicated that there were no significant differences in treatment outcomes across symptom dimensions. However, the severity of UTs remained significantly greater than other symptom dimensions at both admission and discharge. Thus, it is possible that UTs may exhibit uniquely impairing features, compared with other symptom dimensions. It is also possible that these findings may reflect the characteristics of the residential OCD samples. These speculations as well as implications for OCD treatment and future research are discussed.

  7. Sampling depth confounds soil acidification outcomes

    Science.gov (United States)

In the northern Great Plains (NGP) of North America, surface sampling depths of 0-15 or 0-20 cm are suggested for testing soil characteristics such as pH. However, acidification is often most pronounced near the soil surface. Thus, sampling deeper can potentially dilute (increase) pH measurements and…

  8. Sample size calculations based on a difference in medians for positively skewed outcomes in health care studies

    Directory of Open Access Journals (Sweden)

    Aidan G. O’Keeffe

    2017-12-01

Full Text Available Abstract Background: In healthcare research, outcomes with skewed probability distributions are common. Sample size calculations for such outcomes are typically based on estimates on a transformed scale (e.g. log), which may sometimes be difficult to obtain. In contrast, estimates of the median and variance on the untransformed scale are generally easier to pre-specify. The aim of this paper is to describe how to calculate a sample size for a two-group comparison of interest based on median and untransformed variance estimates for log-normal outcome data. Methods: A log-normal distribution for outcome data is assumed, and a sample size calculation approach for a two-sample t-test that compares log-transformed outcome data is demonstrated, where the change of interest is specified as a difference in median values on the untransformed scale. A simulation study is used to compare the method with a non-parametric alternative (Mann-Whitney U test) in a variety of scenarios, and the method is applied to a real example in neurosurgery. Results: The method attained a nominal power value in simulation studies and compared favourably with a Mann-Whitney U test and a two-sample t-test of untransformed outcomes. In addition, the method can be adjusted and used in some situations where the outcome distribution is not strictly log-normal. Conclusions: We recommend the use of this sample size calculation approach for outcome data that are expected to be positively skewed and where a two-group comparison on a log-transformed scale is planned. An advantage of this method over usual calculations based on estimates on the log-transformed scale is that it allows clinical efficacy to be specified as a difference in medians and requires a variance estimate on the untransformed scale. Such estimates are often easier to obtain and more interpretable than those for log-transformed outcomes.
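
    As a rough illustration of the approach this abstract describes, the sketch below backs out the log-scale variance implied by a raw-scale median and variance under a log-normal assumption, then applies the usual two-sample formula to the difference of log-medians. The common-variance treatment (pooling via the average median) and all numeric values are assumptions for illustration; the paper's exact formulation may differ.

    ```python
    from math import log, sqrt, ceil
    from statistics import NormalDist

    def lognormal_sigma2(median, variance):
        """Log-scale variance sigma^2 implied by a raw-scale median and variance.

        For X ~ LogNormal(mu, sigma^2): median = exp(mu) and
        Var(X) = median^2 * (exp(sigma^2) - 1) * exp(sigma^2).
        Solving the quadratic in exp(sigma^2) gives the expression below.
        """
        r = variance / median**2
        return log((1 + sqrt(1 + 4 * r)) / 2)

    def n_per_group(median1, median2, variance, alpha=0.05, power=0.9):
        """Per-group n for a two-sample t-test on log-transformed data,
        with the effect specified as a difference in raw-scale medians."""
        z = NormalDist().inv_cdf
        # Assumption: a common log-scale variance derived from the average median.
        sigma2 = lognormal_sigma2((median1 + median2) / 2, variance)
        delta = log(median2) - log(median1)  # log-scale mean difference
        n = 2 * sigma2 * (z(1 - alpha / 2) + z(power))**2 / delta**2
        return ceil(n)

    # Illustrative: medians 10 vs 14, raw-scale variance 60, alpha 0.05, power 0.9.
    print(n_per_group(10, 14, 60))  # -> 52
    ```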

  9. [Transabdominal chorionic villus sampling using biopsy forceps or needle: pregnancy outcomes by technique used].

    Science.gov (United States)

    Spallina, J; Anselem, O; Haddad, B; Touboul, C; Tsatsaris, V; Le Ray, C

    2014-11-01

To compare pregnancy outcomes after transabdominal chorionic villus sampling using biopsy forceps or needle. Retrospective bicentric study including all women who had transabdominal chorionic villus sampling between 2005 and 2009 (172 using biopsy forceps and 160 using needle). The primary endpoint was the rate of fetal loss, after excluding medical abortions due to the result of the biopsy. The secondary endpoint was the rate of premature rupture of the membranes. All cases were reviewed to try to determine the responsibility of the biopsy. Pregnancy outcomes did not differ between the two groups: 4 (4.4%) fetal losses in the biopsy forceps group and 6 (7.4%) in the needle group (P=0.52). Only one case (1.2%) of fetal loss could be attributed to the biopsy, using a needle, and none (0%) following a forceps biopsy (P=0.29). The rate of premature rupture of the membranes was comparable in the two groups. Pregnancy outcomes following chorionic villus sampling using biopsy forceps or a needle appear comparable.

  10. Statistical inference for the additive hazards model under outcome-dependent sampling.

    Science.gov (United States)

    Yu, Jichang; Liu, Yanyan; Sandler, Dale P; Zhou, Haibo

    2015-09-01

Cost-effective study designs and proper inference procedures for data from such designs are always of particular interest to study investigators. In this article, we propose a biased sampling scheme, an outcome-dependent sampling (ODS) design, for survival data with right censoring under the additive hazards model. We develop a weighted pseudo-score estimator for the regression parameters for the proposed design and derive the asymptotic properties of the proposed estimator. We also provide suggestions for using the proposed method by evaluating its relative efficiency against the simple random sampling design and derive the optimal allocation of the subsamples for the proposed design. Simulation studies show that the proposed ODS design is more powerful than other existing designs and the proposed estimator is more efficient than other estimators. We apply our method to a cancer study conducted at NIEHS, the Cancer Incidence and Mortality of Uranium Miners Study, to study the association between radon exposure and cancer risk.

  11. Evaluation of three sampling methods to monitor outcomes of antiretroviral treatment programmes in low- and middle-income countries.

    Science.gov (United States)

    Tassie, Jean-Michel; Malateste, Karen; Pujades-Rodríguez, Mar; Poulet, Elisabeth; Bennett, Diane; Harries, Anthony; Mahy, Mary; Schechter, Mauro; Souteyrand, Yves; Dabis, François

    2010-11-10

Retention of patients on antiretroviral therapy (ART) over time is a proxy for quality of care and an outcome indicator to monitor ART programs. Using existing databases (Antiretroviral in Lower Income Countries of the International Databases to Evaluate AIDS and Médecins Sans Frontières), we evaluated three sampling approaches to simplify the generation of outcome indicators. We used individual patient data from 27 ART sites and included 27,201 ART-naive adults (≥15 years) who initiated ART in 2005. For each site, we generated two outcome indicators at 12 months, retention on ART and proportion of patients lost to follow-up (LFU), first using all patient data and then within a smaller group of patients selected using three sampling methods (random, systematic and consecutive sampling). For each method and each site, 500 samples were generated, and the average result was compared with the unsampled value. The 95% sampling distribution (SD) was expressed as the 2.5th and 97.5th percentile values from the 500 samples. Overall, retention on ART was 76.5% (range 58.9-88.6) and the proportion of patients LFU, 13.5% (range 0.8-31.9). Estimates of retention from sampling (n = 5696) were 76.5% (SD 75.4-77.7) for random, 76.5% (75.3-77.5) for systematic and 76.0% (74.1-78.2) for the consecutive method. Estimates for the proportion of patients LFU were 13.5% (12.6-14.5), 13.5% (12.6-14.3) and 14.0% (12.5-15.5), respectively. With consecutive sampling, 50% of sites had SD within ±5% of the unsampled site value. Our results suggest that random, systematic or consecutive sampling methods are feasible for monitoring ART indicators at national level. However, sampling may not produce precise estimates in some sites.
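
    A small simulation makes the three sampling schemes concrete. The cohort below is synthetic (76.5% retention, echoing the paper's overall figure), and the sampling functions are generic implementations, not the authors' code.

    ```python
    import random

    random.seed(1)

    # Hypothetical cohort: 1 = retained on ART at 12 months, 0 = lost to follow-up.
    cohort = [1] * 765 + [0] * 235
    random.shuffle(cohort)

    def systematic_sample(data, n):
        """Every k-th record from a random start (k = N/n)."""
        step = len(data) / n
        start = random.uniform(0, step)
        return [data[int(start + i * step)] for i in range(n)]

    def consecutive_sample(data, n):
        """A run of n consecutive records from a random start."""
        start = random.randrange(len(data) - n + 1)
        return data[start:start + n]

    n = 200
    for name, sample in [
        ("random", random.sample(cohort, n)),
        ("systematic", systematic_sample(cohort, n)),
        ("consecutive", consecutive_sample(cohort, n)),
    ]:
        print(f"{name}: retention = {100 * sum(sample) / n:.1f}%")
    ```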

  12. Joint analysis of binary and quantitative traits with data sharing and outcome-dependent sampling.

    Science.gov (United States)

    Zheng, Gang; Wu, Colin O; Kwak, Minjung; Jiang, Wenhua; Joo, Jungnam; Lima, Joao A C

    2012-04-01

We study the joint association between a genetic marker and both binary (case-control) and quantitative (continuous) traits, where the quantitative trait values are only available for the cases due to data sharing and outcome-dependent sampling. Data sharing has become common in genetic association studies, and outcome-dependent sampling is a consequence of data sharing, under which a phenotype of interest is not measured for some subgroup. The trend test (or Pearson's test) and the F-test are often used, respectively, to analyze the binary and quantitative traits. Because of the outcome-dependent sampling, the usual F-test can be applied using the subgroup with observed quantitative traits. We propose a modified F-test that also incorporates the genotype frequencies of the subgroup whose traits are not observed. Further, a combination of this modified F-test and Pearson's test is proposed, using Fisher's combination of their P-values as a joint analysis. Because the two analyses are correlated, we propose to use a Gamma (scaled chi-squared) distribution to fit the asymptotic null distribution of the joint analysis. The proposed modified F-test and the joint analysis can also be applied to test single-trait association (either binary or quantitative trait). Through simulations, we identify the situations under which the proposed tests are more powerful than the existing ones. An application to a real dataset of rheumatoid arthritis is presented.
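
    To illustrate the combination step, the sketch below computes Fisher's statistic T = −2(log p1 + log p2) and evaluates it against a moment-matched Gamma whose variance is inflated for correlation. The covariance term (4·rho) is a simple stand-in, not the paper's derived expression; with rho = 0 the Gamma reduces to the usual chi-squared with 4 degrees of freedom.

    ```python
    import numpy as np
    from scipy import stats

    def fisher_combined_pvalue(p1, p2, rho=0.0):
        """Fisher's combination of two P-values, calibrated with a
        moment-matched Gamma to allow for correlation between the tests.

        Independent tests: T ~ chi^2(4), i.e. mean 4, variance 8. Here the
        variance is inflated by 2*cov, with cov = 4*rho as an assumed
        stand-in for the covariance derived in the paper.
        """
        T = -2.0 * (np.log(p1) + np.log(p2))
        mean = 4.0
        var = 8.0 + 2.0 * (4.0 * rho)
        shape, scale = mean**2 / var, var / mean  # Gamma moment matching
        return stats.gamma.sf(T, shape, scale=scale)

    # With rho = 0 this reproduces the chi^2(4) tail probability.
    print(fisher_combined_pvalue(0.03, 0.01, rho=0.0))
    print(fisher_combined_pvalue(0.03, 0.01, rho=0.3))
    ```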

  13. Outcomes of couples with infidelity in a community-based sample of couple therapy.

    Science.gov (United States)

    Atkins, David C; Marín, Rebeca A; Lo, Tracy T Y; Klann, Notker; Hahlweg, Kurt

    2010-04-01

Infidelity is an often cited problem for couples seeking therapy, but the research literature to date is very limited on couple therapy outcomes when infidelity is a problem. The current study is a secondary analysis of a community-based sample of couple therapy in Germany and Austria. Outcomes for 145 couples who reported infidelity as a problem in their relationship were compared with 385 couples who sought therapy for other reasons. Analyses based on hierarchical linear modeling revealed that infidelity couples were significantly more distressed and reported more depressive symptoms at the start of therapy but continued improving through the end of therapy and to 6 months posttherapy. At the follow-up assessment, infidelity couples were not statistically distinguishable from non-infidelity couples, replicating previous research. Sexual dissatisfaction did not depend on infidelity status. Although there was substantial missing data, sensitivity analyses suggested that the primary findings were not due to missing data. The current findings based on a large community sample replicated previous work from an efficacy trial and show generally optimistic results for couples in which there has been an affair.

  14. Comparative analysis of whole mount processing and systematic sampling of radical prostatectomy specimens: pathological outcomes and risk of biochemical recurrence.

    Science.gov (United States)

    Salem, Shady; Chang, Sam S; Clark, Peter E; Davis, Rodney; Herrell, S Duke; Kordan, Yakup; Wills, Marcia L; Shappell, Scott B; Baumgartner, Roxelyn; Phillips, Sharon; Smith, Joseph A; Cookson, Michael S; Barocas, Daniel A

    2010-10-01

Whole mount processing is more resource intensive than routine systematic sampling of radical retropubic prostatectomy specimens. We compared whole mount and systematic sampling for detecting pathological outcomes, and compared the prognostic value of pathological findings across pathological methods. We included men (608 whole mount and 525 systematic sampling specimens) with no prior treatment who underwent radical retropubic prostatectomy at Vanderbilt University Medical Center between January 2000 and June 2008. We used univariate and multivariate analysis to compare the pathological outcome detection rate between pathological methods. Kaplan-Meier curves and the log rank test were used to compare the prognostic value of pathological findings across pathological methods. There were no significant differences between the whole mount and the systematic sampling groups in detecting extraprostatic extension (25% vs 30%), positive surgical margins (31% vs 31%), pathological Gleason score less than 7 (49% vs 43%), 7 (39% vs 43%) or greater than 7 (12% vs 13%), seminal vesicle invasion (8% vs 10%) or lymph node involvement (3% vs 5%). Tumor volume was higher in the systematic sampling group and whole mount detected more multiple surgical margins. Overall, whole mount processing and systematic sampling yield similar pathological information. Each method stratifies patients into comparable risk groups for biochemical recurrence. Thus, while whole mount is more resource intensive, it does not appear to result in improved detection of clinically important pathological outcomes or prognostication.

  15. Catch me if you can: Comparing ballast water sampling skids to traditional net sampling

    Science.gov (United States)

    Bradie, Johanna; Gianoli, Claudio; Linley, Robert Dallas; Schillak, Lothar; Schneider, Gerd; Stehouwer, Peter; Bailey, Sarah

    2018-03-01

With the recent ratification of the International Convention for the Control and Management of Ships' Ballast Water and Sediments, 2004, it will soon be necessary to assess ships for compliance with ballast water discharge standards. Sampling skids that allow the efficient collection of ballast water samples in a compact space have been developed for this purpose. We ran 22 trials on board the RV Meteor from June 4-15, 2015 to evaluate the performance of three ballast water sampling devices (traditional plankton net, Triton sampling skid, SGS sampling skid) for three organism size classes: ≥ 50 μm, ≥ 10 μm to < 50 μm, and < 10 μm. Natural sea water was run through the ballast water system and untreated samples were collected using paired sampling devices. Collected samples were analyzed in parallel by multiple analysts using several different analytic methods to quantify organism concentrations. To determine whether there were differences in the number of viable organisms collected across sampling devices, results were standardized and statistically treated to filter out other sources of variability, resulting in an outcome variable representing the mean difference in measurements that can be attributed to sampling devices. These results were tested for significance using pairwise Tukey contrasts. Differences in organism concentrations were found in 50% of comparisons between sampling skids and the plankton net for the ≥ 50 μm and ≥ 10 μm to < 50 μm size classes, with net samples containing either higher or lower densities. There were no differences for < 10 μm organisms. Future work will be required to explicitly examine the potential effects of flow velocity, sampling duration, sampled volume, and organism concentrations on sampling device performance.

  16. Quantification of errors in ordinal outcome scales using shannon entropy: effect on sample size calculations.

    Science.gov (United States)

    Mandava, Pitchaiah; Krumpelman, Chase S; Shah, Jharna N; White, Donna L; Kent, Thomas A

    2013-01-01

Clinical trial outcomes often involve an ordinal scale of subjective functional assessments, but the optimal way to quantify results is not clear. In stroke, for the most commonly used scale, the modified Rankin Score (mRS), a range of scores ("Shift") is proposed as superior to dichotomization because of greater information transfer. The influence of known uncertainties in mRS assessment has not been quantified. We hypothesized that errors caused by uncertainties could be quantified by applying information theory. Using Shannon's model, we quantified errors of the "Shift" compared to dichotomized outcomes using published distributions of mRS uncertainties and applied this model to clinical trials. We identified 35 randomized stroke trials that met inclusion criteria. Each trial's mRS distribution was multiplied with the noise distribution from published mRS inter-rater variability to generate an error percentage for "shift" and dichotomized cut-points. For the SAINT I neuroprotectant trial, considered positive by "shift" mRS while the larger follow-up SAINT II trial was negative, we recalculated the sample size required if classification uncertainty was taken into account. Considering the full mRS range, the error rate was 26.1%±5.31 (Mean±SD). Error rates were lower for all dichotomizations tested using cut-points (e.g. mRS 1: 6.8%±2.89; overall p<0.001). Taking errors into account, SAINT I would have required 24% more subjects than were randomized. We show that when uncertainty in assessments is considered, the lowest error rates are achieved with dichotomization. While using the full range of mRS is conceptually appealing, a gain of information is counter-balanced by a decrease in reliability. The resultant errors need to be considered since sample size may otherwise be underestimated. In principle, we have outlined an approach to error estimation for any condition in which there are uncertainties in outcome assessment. We provide the user with programs to calculate and incorporate errors into sample size estimation.
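
    The core computation, multiplying an mRS distribution by an inter-rater confusion matrix, can be sketched as follows; both distributions here are hypothetical stand-ins, not the published ones.

    ```python
    import numpy as np

    K = 7  # mRS scores 0-6
    # Hypothetical inter-rater confusion matrix C[i, j] = P(rated j | true i):
    # 80% exact agreement, the remainder spread over adjacent scores.
    C = np.zeros((K, K))
    for i in range(K):
        C[i, i] = 0.8
        neighbours = [j for j in (i - 1, i + 1) if 0 <= j < K]
        for j in neighbours:
            C[i, j] = 0.2 / len(neighbours)

    # Hypothetical trial mRS distribution.
    p = np.array([0.10, 0.15, 0.15, 0.20, 0.20, 0.10, 0.10])

    # "Shift" analysis uses the full scale, so any misrating is an error.
    shift_error = 1 - (p * np.diag(C)).sum()

    # Dichotomized outcome (mRS 0-1 vs 2-6): only misratings crossing the cut count.
    cut = 2
    dich_error = sum(p[i] * C[i, j] for i in range(K) for j in range(K)
                     if (i < cut) != (j < cut))
    print(f"shift error: {shift_error:.1%}, dichotomized error: {dich_error:.1%}")
    ```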

  17. A review of empirical research related to the use of small quantitative samples in clinical outcome scale development.

    Science.gov (United States)

    Houts, Carrie R; Edwards, Michael C; Wirth, R J; Deal, Linda S

    2016-11-01

    There has been a notable increase in the advocacy of using small-sample designs as an initial quantitative assessment of item and scale performance during the scale development process. This is particularly true in the development of clinical outcome assessments (COAs), where Rasch analysis has been advanced as an appropriate statistical tool for evaluating the developing COAs using a small sample. We review the benefits such methods are purported to offer from both a practical and statistical standpoint and detail several problematic areas, including both practical and statistical theory concerns, with respect to the use of quantitative methods, including Rasch-consistent methods, with small samples. The feasibility of obtaining accurate information and the potential negative impacts of misusing large-sample statistical methods with small samples during COA development are discussed.

  18. The significance of Sampling Design on Inference: An Analysis of Binary Outcome Model of Children’s Schooling Using Indonesian Large Multi-stage Sampling Data

    OpenAIRE

    Ekki Syamsulhakim

    2008-01-01

This paper aims to exercise a rather recent trend in applied microeconometrics, namely the effect of sampling design on statistical inference, especially on binary outcome models. Much theoretical research in econometrics has shown the inappropriateness of applying i.i.d.-assumed statistical analysis to non-i.i.d. data. This research has provided proofs showing that applying i.i.d.-assumed analysis to non-i.i.d. observations would result in inflated standard errors which could make the esti…

  19. Defining And Characterizing Sample Representativeness For DWPF Melter Feed Samples

    Energy Technology Data Exchange (ETDEWEB)

    Shine, E. P.; Poirier, M. R.

    2013-10-29

Representative sampling is important throughout the Defense Waste Processing Facility (DWPF) process, and the demonstrated success of the DWPF process in achieving glass product quality over the past two decades is a direct result of the quality of information obtained from the process. The objective of this report was to present the sampling methods that the Savannah River Site (SRS) used to qualify waste being dispositioned at the DWPF. The goal was to emphasize the methodology, not to list outcomes from those studies. This methodology includes proven methods for taking representative samples, the use of controlled analytical methods, and data interpretation and reporting that considers the uncertainty of all error sources. Numerous sampling studies were conducted during the development of the DWPF process and continue to be performed in order to evaluate options for process improvement. Study designs were based on the use of statistical tools applicable to the determination of uncertainties associated with the data needs. Successful designs are apt to be repeated, so this report chose to include only prototypic case studies that typify the characteristics of frequently used designs. Case studies are presented for studying in-tank homogeneity, evaluating the suitability of sampler systems, determining factors that affect mixing and sampling, comparing the chemical composition and durability of the final waste glass product to those of the glass pour stream sample and other samples from process vessels, and assessing the uniformity of the chemical composition of the waste glass product. Many of these studies efficiently addressed more than one of these areas of concern associated with demonstrating sample representativeness and provide examples of statistical tools in use for DWPF. Many of these designs were implemented at a time when the sampling ideas of Pierre Gy were not as widespread as they are today. Nonetheless, the engineers and…

  20. Defining And Characterizing Sample Representativeness For DWPF Melter Feed Samples

    International Nuclear Information System (INIS)

    Shine, E. P.; Poirier, M. R.

    2013-01-01

Representative sampling is important throughout the Defense Waste Processing Facility (DWPF) process, and the demonstrated success of the DWPF process in achieving glass product quality over the past two decades is a direct result of the quality of information obtained from the process. The objective of this report was to present the sampling methods that the Savannah River Site (SRS) used to qualify waste being dispositioned at the DWPF. The goal was to emphasize the methodology, not to list outcomes from those studies. This methodology includes proven methods for taking representative samples, the use of controlled analytical methods, and data interpretation and reporting that considers the uncertainty of all error sources. Numerous sampling studies were conducted during the development of the DWPF process and continue to be performed in order to evaluate options for process improvement. Study designs were based on the use of statistical tools applicable to the determination of uncertainties associated with the data needs. Successful designs are apt to be repeated, so this report chose to include only prototypic case studies that typify the characteristics of frequently used designs. Case studies are presented for studying in-tank homogeneity, evaluating the suitability of sampler systems, determining factors that affect mixing and sampling, comparing the chemical composition and durability of the final waste glass product to those of the glass pour stream sample and other samples from process vessels, and assessing the uniformity of the chemical composition of the waste glass product. Many of these studies efficiently addressed more than one of these areas of concern associated with demonstrating sample representativeness and provide examples of statistical tools in use for DWPF. Many of these designs were implemented at a time when the sampling ideas of Pierre Gy were not as widespread as they are today. Nonetheless, the engineers and…

  1. Systematic sampling with errors in sample locations

    DEFF Research Database (Denmark)

    Ziegel, Johanna; Baddeley, Adrian; Dorph-Petersen, Karl-Anton

    2010-01-01

Systematic sampling of points in continuous space is widely used in microscopy and spatial surveys. Classical theory provides asymptotic expressions for the variance of estimators based on systematic sampling as the grid spacing decreases. However, the classical theory assumes that the sample grid is exactly periodic; real physical sampling procedures may introduce errors in the placement of the sample points. This paper studies the effect of errors in sample positioning on the variance of estimators in the case of one-dimensional systematic sampling. First we sketch a general approach to variance analysis using point process methods. We then analyze three different models for the error process, calculate exact expressions for the variances, and derive asymptotic variances. Errors in the placement of sample points can lead to substantial inflation of the variance, dampening of zitterbewegung…
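
    A quick simulation shows the variance inflation the abstract describes for one-dimensional systematic sampling: with a smooth periodic integrand, an exactly periodic grid estimates the integral with zero variance, while Gaussian placement errors inflate it. The integrand, the wrap-around error model, and all parameters are illustrative assumptions, not the paper's error models.

    ```python
    import random
    from math import sin, pi, fsum

    def systematic_estimate(f, T, n, jitter_sd=0.0):
        """Estimate the integral of f over [0, T) from n systematically placed
        points, with optional Gaussian errors in point placement (wrapped to [0, T))."""
        step = T / n
        start = random.uniform(0, step)
        pts = [(start + i * step + random.gauss(0, jitter_sd)) % T for i in range(n)]
        return step * fsum(f(x) for x in pts)

    random.seed(2)
    f = lambda x: 2 + sin(2 * pi * x)  # smooth periodic test function, true integral = 2
    for sd in (0.0, 0.02, 0.05):
        est = [systematic_estimate(f, 1.0, 10, sd) for _ in range(2000)]
        m = fsum(est) / len(est)
        var = fsum((e - m) ** 2 for e in est) / len(est)
        print(f"jitter sd {sd}: estimator variance {var:.2e}")
    ```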

  2. [A comparison of convenience sampling and purposive sampling].

    Science.gov (United States)

    Suen, Lee-Jen Wu; Huang, Hui-Man; Lee, Hao-Hsien

    2014-06-01

Convenience sampling and purposive sampling are two different sampling methods. This article first explains sampling terms such as target population, accessible population, simple random sampling, intended sample, actual sample, and statistical power analysis. These terms are then used to explain the difference between "convenience sampling" and "purposive sampling." Convenience sampling is a non-probabilistic sampling technique applicable to qualitative or quantitative studies, although it is most frequently used in quantitative studies. In convenience samples, subjects more readily accessible to the researcher are more likely to be included. Thus, in quantitative studies, the opportunity to participate is not equal for all qualified individuals in the target population and study results are not necessarily generalizable to this population. As in all quantitative studies, increasing the sample size increases the statistical power of the convenience sample. In contrast, purposive sampling is typically used in qualitative studies. Researchers who use this technique carefully select subjects based on study purpose, with the expectation that each participant will provide unique and rich information of value to the study. As a result, members of the accessible population are not interchangeable and sample size is determined by data saturation, not by statistical power analysis.

  3. Rational learning and information sampling: on the "naivety" assumption in sampling explanations of judgment biases.

    Science.gov (United States)

    Le Mens, Gaël; Denrell, Jerker

    2011-04-01

Recent research has argued that several well-known judgment biases may be due to biases in the available information sample rather than to biased information processing. Most of these sample-based explanations assume that decision makers are "naive": They are not aware of the biases in the available information sample and do not correct for them. Here, we show that this "naivety" assumption is not necessary. Systematically biased judgments can emerge even when decision makers process available information perfectly and are also aware of how the information sample has been generated. Specifically, we develop a rational analysis of Denrell's (2005) experience sampling model, and we prove that when information search is interested rather than disinterested, even rational information sampling and processing can give rise to systematic patterns of errors in judgments. Our results illustrate that a tendency to favor alternatives for which outcome information is more accessible can be consistent with rational behavior. The model offers a rational explanation for behaviors that had previously been attributed to cognitive and motivational biases, such as the in-group bias or the tendency to prefer popular alternatives.

  4. Sample size and number of outcome measures of veterinary randomised controlled trials of pharmaceutical interventions funded by different sources, a cross-sectional study.

    Science.gov (United States)

    Wareham, K J; Hyde, R M; Grindlay, D; Brennan, M L; Dean, R S

    2017-10-04

Randomised controlled trials (RCTs) are a key component of the veterinary evidence base. Sample sizes and defined outcome measures are crucial components of RCTs. The objective was to describe the sample size and number of outcome measures of veterinary RCTs, either funded by the pharmaceutical industry or not, published in 2011. A structured search of PubMed identified RCTs examining the efficacy of pharmaceutical interventions. The number of outcome measures, the number of animals enrolled per trial, whether a primary outcome was identified, and the presence of a sample size calculation were extracted from the RCTs. The source of funding was identified for each trial and the groups were compared on the above parameters. Literature searches returned 972 papers; 86 papers comprising 126 individual trials were analysed. The median number of outcomes per trial was 5.0; there were no significant differences across funding groups (p = 0.133). The median number of animals enrolled per trial was 30.0; this was similar across funding groups (p = 0.302). A primary outcome was identified in 40.5% of trials and was significantly more likely to be stated in trials funded by a pharmaceutical company. A very low percentage of trials reported a sample size calculation (14.3%). Failure to report primary outcomes, failure to justify sample sizes, and the reporting of multiple outcome measures were common features in all of the clinical trials examined in this study. It is possible that some of these factors may be affected by the source of funding of the studies, but the influence of funding needs to be explored with a larger number of trials. Some veterinary RCTs provide a weak evidence base, and targeted strategies are required to improve the quality of veterinary RCTs to ensure there is reliable evidence on which to base clinical decisions.

  5. Information sampling behavior with explicit sampling costs

    Science.gov (United States)

    Juni, Mordechai Z.; Gureckis, Todd M.; Maloney, Laurence T.

    2015-01-01

    The decision to gather information should take into account both the value of information and its accrual costs in time, energy and money. Here we explore how people balance the monetary costs and benefits of gathering additional information in a perceptual-motor estimation task. Participants were rewarded for touching a hidden circular target on a touch-screen display. The target’s center coincided with the mean of a circular Gaussian distribution from which participants could sample repeatedly. Each “cue” — sampled one at a time — was plotted as a dot on the display. Participants had to repeatedly decide, after sampling each cue, whether to stop sampling and attempt to touch the hidden target or continue sampling. Each additional cue increased the participants’ probability of successfully touching the hidden target but reduced their potential reward. Two experimental conditions differed in the initial reward associated with touching the hidden target and the fixed cost per cue. For each condition we computed the optimal number of cues that participants should sample, before taking action, to maximize expected gain. Contrary to recent claims that people gather less information than they objectively should before taking action, we found that participants over-sampled in one experimental condition, and did not significantly under- or over-sample in the other. Additionally, while the ideal observer model ignores the current sample dispersion, we found that participants used it to decide whether to stop sampling and take action or continue sampling, a possible consequence of imperfect learning of the underlying population dispersion across trials. PMID:27429991
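
    The trade-off in this task can be sketched with a simple expected-gain model: aiming at the mean of n cues, the miss distance is approximately Rayleigh with scale σ/√n, so the hit probability is 1 − exp(−r²n/2σ²). This model and all parameter values below are assumptions for illustration, not the authors' ideal-observer analysis.

    ```python
    from math import exp

    def expected_gain(n, reward, cost, sigma, radius):
        """Expected gain after sampling n cues.

        Aiming at the mean of n draws from a circular Gaussian (per-axis SD
        sigma), the miss distance is Rayleigh with scale sigma/sqrt(n), so
        P(hit) = 1 - exp(-radius**2 * n / (2 * sigma**2)).
        """
        p_hit = 1 - exp(-radius**2 * n / (2 * sigma**2))
        return (reward - cost * n) * p_hit

    # Illustrative parameters (not taken from the paper).
    reward, cost, sigma, radius = 20.0, 1.0, 3.0, 1.5
    gains = {n: expected_gain(n, reward, cost, sigma, radius) for n in range(1, 20)}
    best = max(gains, key=gains.get)
    print(f"optimal number of cues: {best} (expected gain {gains[best]:.2f})")
    ```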

  6. Samples and Sampling Protocols for Scientific Investigations | Joel ...

    African Journals Online (AJOL)

    ... from sampling, through sample preparation, calibration to final measurement and reporting. This paper, therefore offers useful information on practical guidance on sampling protocols in line with best practice and international standards. Keywords: Sampling, sampling protocols, chain of custody, analysis, documentation ...

  7. Quantification of errors in ordinal outcome scales using shannon entropy: effect on sample size calculations.

    Directory of Open Access Journals (Sweden)

    Pitchaiah Mandava

Full Text Available OBJECTIVE: Clinical trial outcomes often involve an ordinal scale of subjective functional assessments, but the optimal way to quantify results is not clear. In stroke, for the most commonly used scale, the modified Rankin Score (mRS), a range of scores ("Shift") is proposed as superior to dichotomization because of greater information transfer. The influence of known uncertainties in mRS assessment has not been quantified. We hypothesized that errors caused by uncertainties could be quantified by applying information theory. Using Shannon's model, we quantified errors of the "Shift" compared to dichotomized outcomes using published distributions of mRS uncertainties and applied this model to clinical trials. METHODS: We identified 35 randomized stroke trials that met inclusion criteria. Each trial's mRS distribution was multiplied with the noise distribution from published mRS inter-rater variability to generate an error percentage for "shift" and dichotomized cut-points. For the SAINT I neuroprotectant trial, considered positive by "shift" mRS while the larger follow-up SAINT II trial was negative, we recalculated the sample size required if classification uncertainty was taken into account. RESULTS: Considering the full mRS range, the error rate was 26.1%±5.31 (Mean±SD). Error rates were lower for all dichotomizations tested using cut-points (e.g. mRS 1: 6.8%±2.89; overall p<0.001). Taking errors into account, SAINT I would have required 24% more subjects than were randomized. CONCLUSION: We show that when uncertainty in assessments is considered, the lowest error rates are with dichotomization. While using the full range of mRS is conceptually appealing, a gain of information is counter-balanced by a decrease in reliability. The resultant errors need to be considered since sample size may otherwise be underestimated. In principle, we have outlined an approach to error estimation for any condition in which there are uncertainties in outcome assessment. We…

  8. Mental illness and housing outcomes among a sample of homeless men in an Australian urban centre.

    Science.gov (United States)

    Spicer, Bridget; Smith, David I; Conroy, Elizabeth; Flatau, Paul R; Burns, Lucy

    2015-05-01

The over-representation of mental illness among homeless people across the globe is well documented. However, there is a dearth of Australian literature on the mental health needs of homeless individuals. Furthermore, longitudinal research examining the factors that contribute to better housing outcomes among this population is sparse. The aim of this research is to describe the mental illness profile of a sample of homeless men in an Australian urban centre (in Sydney) and examine the factors associated with better housing outcomes at 12-month follow-up. A longitudinal survey was administered to 253 homeless men who were involved in the Michael Project: a 3-year initiative which combined existing accommodation support services with assertive case management and access to coordinated additional specialist allied health and support services. A total of 107 participants were followed up 12 months later. The survey examined the demographics of the sample and lifetime mental disorder diagnoses, and also included psychological screeners for current substance use and dependence, psychological distress, psychosis, and post-traumatic stress. Consistent with existing literature, the prevalence of mental illness was significantly greater amongst this sample than the general Australian population. However, mental illness presentation was not associated with housing situation at 12-month follow-up. Instead, type of support service at baseline was the best predictor of housing outcome, wherein participants who received short to medium-term accommodation and support were significantly more likely to be housed in stable, long-term housing at the 12-month follow-up than participants who received outreach or emergency accommodation support. This study provides evidence to support an innovative support model for homeless people in Australia and contributes to the limited Australian research on mental illness in this population.

  9. Sampling and chemical analysis in environmental samples around Nuclear Power Plants and some environmental samples

    Energy Technology Data Exchange (ETDEWEB)

    Cho, Yong Woo; Han, Man Jung; Cho, Seong Won; Cho, Hong Jun; Oh, Hyeon Kyun; Lee, Jeong Min; Chang, Jae Sook [KORTIC, Taejon (Korea, Republic of)

    2002-12-15

Twelve kinds of environmental samples, such as soil, seawater, and underground water, were collected around Nuclear Power Plants (NPPs). Tritium analysis was performed on samples of rain water, pine needles, air, seawater, underground water, Chinese cabbage, grains of rice and milk collected around NPPs, and on surface seawater and rain water sampled across the country. Strontium was analyzed in soil sampled at 60 districts in Korea. Tritium was analyzed in 21 samples of surface seawater around the Korean peninsula that were supplied by KFRDI (National Fisheries Research and Development Institute). Sampling and chemical analysis of environmental samples around the Kori, Woolsung, Youngkwang and Wooljin NPPs and the Taeduk science town for tritium and strontium analysis were managed according to plans. The work was handed over to KINS after all samples were analyzed.

  10. Fetal scalp blood sampling during labor

    DEFF Research Database (Denmark)

    Chandraharan, Edwin; Wiberg, Nana

    2014-01-01

Fetal cardiotocography is characterized by low specificity; therefore, in an attempt to ensure fetal well-being, fetal scalp blood sampling has been recommended by most obstetric societies in the case of a non-reassuring cardiotocography. The scientific agreement on the evidence for using fetal scalp blood sampling to decrease the rate of operative delivery for fetal distress is ambiguous. Based on the same studies, a Cochrane review states that fetal scalp blood sampling increases the rate of instrumental delivery while decreasing neonatal acidosis, whereas the National Institute for Health and Clinical Excellence guideline considers that fetal scalp blood sampling decreases instrumental delivery without differences in other outcome variables. The fetal scalp is supplied by vessels outside the skull below the level of the cranial vault, which is likely to be compressed during contractions…

  11. Pierre Gy's sampling theory and sampling practice heterogeneity, sampling correctness, and statistical process control

    CERN Document Server

    Pitard, Francis F

    1993-01-01

    Pierre Gy's Sampling Theory and Sampling Practice, Second Edition is a concise, step-by-step guide for process variability management and methods. Updated and expanded, this new edition provides a comprehensive study of heterogeneity, covering the basic principles of sampling theory and its various applications. It presents many practical examples to allow readers to select appropriate sampling protocols and assess the validity of sampling protocols from others. The variability of dynamic process streams using variography is discussed to help bridge sampling theory with statistical process control. Many descriptions of good sampling devices, as well as descriptions of poor ones, are featured to educate readers on what to look for when purchasing sampling systems. The book uses its accessible, tutorial style to focus on professional selection and use of methods. The book will be a valuable guide for mineral processing engineers; metallurgists; geologists; miners; chemists; environmental scientists; and practit...

  12. Systematic Sampling and Cluster Sampling of Packet Delays

    OpenAIRE

    Lindh, Thomas

    2006-01-01

Based on experiences of a traffic flow performance meter this paper suggests and evaluates cluster sampling and systematic sampling as methods to estimate average packet delays. Systematic sampling facilitates, for example, time analysis, frequency analysis and jitter measurements. Cluster sampling with repeated trains of periodically spaced sampling units separated by random starting periods, and systematic sampling, are evaluated with respect to accuracy and precision. Packet delay traces have been…

  13. Sampling

    CERN Document Server

    Thompson, Steven K

    2012-01-01

Praise for the Second Edition: "This book has never had a competitor. It is the only book that takes a broad approach to sampling . . . any good personal statistics library should include a copy of this book." —Technometrics "Well-written . . . an excellent book on an important subject. Highly recommended." —Choice "An ideal reference for scientific researchers and other professionals who use sampling." —Zentralblatt Math. Featuring new developments in the field combined with all aspects of obtaining, interpreting, and using sample data, Sampling provides an up-to-date treat…

  14. Water born pollutants sampling using porous suction samples

    International Nuclear Information System (INIS)

    Baig, M.A.

    1997-01-01

The common standard method for sampling water-borne pollutants in the vadose zone is core sampling, followed by extraction of pore fluid. This method does not allow repeated sampling at the same location. An alternative approach for sampling fluids (water-borne pollutants) from both saturated and unsaturated regions of the vadose zone is to use porous suction samplers. There are three types of porous suction samplers: vacuum-operated samplers, pressure-vacuum lysimeters, and high-pressure-vacuum samplers. The suction samplers are operated in the range of 0-70 centibars and usually consist of ceramic or polytetrafluoroethylene (PTFE); the operating range of PTFE samplers is higher than that of ceramic cups. These samplers are well suited for in situ and repeated sampling from the same location. This paper discusses the physical properties and operating conditions of such samplers to be utilized under our environmental sampling conditions. (author)

  15. Choosing a suitable sample size in descriptive sampling

    International Nuclear Information System (INIS)

    Lee, Yong Kyun; Choi, Dong Hoon; Cha, Kyung Joon

    2010-01-01

Descriptive sampling (DS) is an alternative to crude Monte Carlo sampling (CMCS) in finding solutions to structural reliability problems. It is known to be an effective sampling method in approximating the distribution of a random variable because it uses the deterministic selection of sample values and their random permutation. However, because this method is difficult to apply to complex simulations, the sample size is occasionally determined without thorough consideration. Input sample variability may cause the sample size to change between runs, leading to poor simulation results. This paper proposes a numerical method for choosing a suitable sample size for use in DS. Using this method, one can estimate a more accurate probability of failure in a reliability problem while running a minimal number of simulations. The method is then applied to several examples and compared with CMCS and conventional DS to validate its usefulness and efficiency.
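
    A minimal sketch of descriptive sampling as characterized here: the n sample values are the deterministic stratified quantiles of the target distribution, and only their order is randomized. Crude Monte Carlo is included for contrast; the standard normal target is an illustrative choice.

    ```python
    import random
    from statistics import NormalDist

    def descriptive_sample(n, dist=NormalDist()):
        """Descriptive sampling: take the n stratified quantile values of the
        target distribution deterministically, then randomly permute their order."""
        values = [dist.inv_cdf((i + 0.5) / n) for i in range(n)]
        random.shuffle(values)
        return values

    def crude_mc_sample(n):
        """Crude Monte Carlo: n independent standard normal draws."""
        return [random.gauss(0.0, 1.0) for _ in range(n)]

    random.seed(0)
    n = 100
    # The descriptive sample reproduces the target's mean almost exactly by
    # construction; the crude Monte Carlo mean is noisy around 0.
    print(sum(descriptive_sample(n)) / n)
    print(sum(crude_mc_sample(n)) / n)
    ```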

  16. Marijuana use and inpatient outcomes among hospitalized patients: analysis of the nationwide inpatient sample database

    OpenAIRE

Vin-Raviv, Neomi; Akinyemiju, Tomi; Meng, Qingrui; Sakhuja, Swati; Hayward, Reid

    2016-01-01

Abstract The purpose of this paper is to examine the relationship between marijuana use and health outcomes among hospitalized patients, including those hospitalized with a diagnosis of cancer. A total of 387,608 current marijuana users were identified based on ICD-9 codes for marijuana use among hospitalized patients in the Nationwide Inpatient Sample database between 2007 and 2011. Logistic regression analysis was performed to determine the association between marijuana use and heart failur…

  17. Sampling and examination methods used for TMI-2 samples

    International Nuclear Information System (INIS)

    Marley, A.W.; Akers, D.W.; McIsaac, C.V.

    1988-01-01

The purpose of this paper is to summarize the sampling and examination techniques that were used in the collection and analysis of TMI-2 samples. Samples ranging from auxiliary building air to core debris were collected and analyzed. Handling of the larger samples and many of the smaller samples had to be done remotely, and many standard laboratory analytical techniques were modified to accommodate the extremely high radiation fields associated with these samples. The TMI-2 samples presented unique problems with the sampling and laboratory analysis of previously molten fuel debris. 14 refs., 8 figs

  18. Sampling and sample processing in pesticide residue analysis.

    Science.gov (United States)

    Lehotay, Steven J; Cook, Jo Marie

    2015-05-13

    Proper sampling and sample processing in pesticide residue analysis of food and soil have always been essential to obtain accurate results, but the subject is becoming a greater concern as approximately 100 mg test portions are being analyzed with automated high-throughput analytical methods by agrochemical industry and contract laboratories. As global food trade and the importance of monitoring increase, the food industry and regulatory laboratories are also considering miniaturized high-throughput methods. In conjunction with a summary of the symposium "Residues in Food and Feed - Going from Macro to Micro: The Future of Sample Processing in Residue Analytical Methods" held at the 13th IUPAC International Congress of Pesticide Chemistry, this is an opportune time to review sampling theory and sample processing for pesticide residue analysis. If collected samples and test portions do not adequately represent the actual lot from which they came and provide meaningful results, then all costs, time, and efforts involved in implementing programs using sophisticated analytical instruments and techniques are wasted and can actually yield misleading results. This paper is designed to briefly review the often-neglected but crucial topic of sample collection and processing and put the issue into perspective for the future of pesticide residue analysis. It also emphasizes that analysts should demonstrate the validity of their sample processing approaches for the analytes/matrices of interest and encourages further studies on sampling and sample mass reduction to produce a test portion.

  19. Boat sampling

    International Nuclear Information System (INIS)

    Citanovic, M.; Bezlaj, H.

    1994-01-01

This presentation describes essential boat sampling activities: on-site boat sampling process optimization and qualification; boat sampling of base material (beltline region); boat sampling of weld material (weld No. 4); and problems associated with weld crown variations, RPV shell inner radius tolerance, local corrosion pitting and water clarity. The equipment used for boat sampling is also described. 7 pictures

  20. Burnout and Engagement: Relative Importance of Predictors and Outcomes in Two Health Care Worker Samples.

    Science.gov (United States)

    Fragoso, Zachary L; Holcombe, Kyla J; McCluney, Courtney L; Fisher, Gwenith G; McGonagle, Alyssa K; Friebe, Susan J

    2016-06-09

This study's purpose was twofold: first, to examine the relative importance of job demands and resources as predictors of burnout and engagement, and second, to examine the relative importance of engagement and burnout as predictors of health, depressive symptoms, work ability, organizational commitment, and turnover intentions in two samples of health care workers. Nurse leaders (n = 162) and licensed emergency medical technicians (EMTs; n = 102) completed surveys. In both samples, job demands predicted burnout more strongly than job resources, and job resources predicted engagement more strongly than job demands. Engagement held more weight than burnout for predicting commitment, and burnout held more weight for predicting health outcomes, depressive symptoms, and work ability. The results have implications for the design, evaluation, and effectiveness of workplace interventions to reduce burnout and improve engagement among health care workers. Actionable recommendations for increasing engagement and decreasing burnout in health care organizations are provided.

  1. A Bayesian Justification for Random Sampling in Sample Survey

    Directory of Open Access Journals (Sweden)

    Glen Meeden

    2012-07-01

Full Text Available In the usual Bayesian approach to survey sampling, the sampling design plays a minimal role, at best. Although a close relationship between exchangeable prior distributions and simple random sampling has been noted, how to formally integrate simple random sampling into the Bayesian paradigm is not clear. Recently it has been argued that the sampling design can be thought of as part of a Bayesian's prior distribution. We show here that under this scenario simple random sampling can be given a Bayesian justification in survey sampling.

  2. An assessment of Lot Quality Assurance Sampling to evaluate malaria outcome indicators: extending malaria indicator surveys.

    Science.gov (United States)

    Biedron, Caitlin; Pagano, Marcello; Hedt, Bethany L; Kilian, Albert; Ratcliffe, Amy; Mabunda, Samuel; Valadez, Joseph J

    2010-02-01

    Large investments and increased global prioritization of malaria prevention and treatment have resulted in greater emphasis on programme monitoring and evaluation (M&E) in many countries. Many countries currently use large multistage cluster sample surveys to monitor malaria outcome indicators on a regional and national level. However, these surveys often mask local-level variability important to programme management. Lot Quality Assurance Sampling (LQAS) has played a valuable role in local-level programme M&E. If incorporated into these larger surveys, it would provide a comprehensive M&E plan at little, if any, extra cost. The Mozambique Ministry of Health conducted a Malaria Indicator Survey (MIS) in June and July 2007. We applied LQAS classification rules to the 345 sampled enumeration areas to demonstrate how to identify high- and low-performing areas with respect to two malaria programme indicators: 'household possession of any bednet' and 'household possession of any insecticide-treated bednet (ITN)'. As shown by the MIS, no province in Mozambique achieved the 70% coverage target for household possession of bednets or ITNs. By applying LQAS classification rules to the data, we identified 266 of the 345 enumeration areas as having bednet coverage severely below the 70% target. An additional 73 were identified with low ITN coverage. This article demonstrates the feasibility of integrating LQAS into multistage cluster sampling surveys and using these results to support a comprehensive national, regional and local programme M&E system. Furthermore, in the recommendations we outline how to integrate the Large Country-LQAS design into macro-surveys while still obtaining the results available through current sampling practices.
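
    To make the classification step concrete, the sketch below (Python) applies a generic LQAS decision rule to a single area: sample n households, count how many have the item, and classify the area as reaching the coverage target only if the count meets a threshold d chosen so that both misclassification risks stay small. The design shown (n = 19, d = 13, with 80%/50% benchmarks) is a commonly cited LQAS rule used here for illustration, not necessarily the one applied to the Mozambique data.

```python
from math import comb

def binom_cdf(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

def lqas_risks(n, d, p_upper, p_lower):
    """Misclassification risks of the rule 'classify the area as reaching
    the target if at least d of n sampled households are covered'."""
    alpha = binom_cdf(d - 1, n, p_upper)     # high-coverage area called low
    beta = 1 - binom_cdf(d - 1, n, p_lower)  # low-coverage area called high
    return alpha, beta

def classify(covered, d):
    return "reaches target" if covered >= d else "below target"

# A commonly cited design: n = 19, d = 13, benchmarks 80% / 50% coverage.
print(lqas_risks(19, 13, 0.80, 0.50))  # both risks come out below 10%
print(classify(11, 13))                # 11 of 19 covered -> "below target"
```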

  3. Graph sampling

    OpenAIRE

    Zhang, L.-C.; Patone, M.

    2017-01-01

    We synthesise the existing theory of graph sampling. We propose a formal definition of sampling in finite graphs and provide a classification of potential graph parameters. We develop a general approach of Horvitz–Thompson estimation for T-stage snowball sampling, and present various reformulations of some common network sampling methods in the literature in terms of the outlined graph sampling theory.
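
    A minimal illustration of the Horvitz–Thompson estimation the authors generalize: estimate a population total from a sample in which each unit's inclusion probability is known. In snowball and other network designs the hard part, assumed away in this sketch, is deriving those probabilities from the graph sampling process.

```python
def horvitz_thompson_total(values, incl_probs):
    """Horvitz-Thompson estimator of a population total: each sampled
    unit's observed value is weighted by the inverse of its probability
    of entering the sample, which makes the estimator design-unbiased."""
    return sum(y / p for y, p in zip(values, incl_probs))

# Toy data: three sampled units with known inclusion probabilities.
y = [10.0, 4.0, 7.0]
pi = [0.5, 0.25, 0.1]
print(horvitz_thompson_total(y, pi))  # 10/0.5 + 4/0.25 + 7/0.1 = 106.0
```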

  4. Acceptance sampling using judgmental and randomly selected samples

    Energy Technology Data Exchange (ETDEWEB)

    Sego, Landon H.; Shulman, Stanley A.; Anderson, Kevin K.; Wilson, John E.; Pulsipher, Brent A.; Sieber, W. Karl

    2010-09-01

    We present a Bayesian model for acceptance sampling where the population consists of two groups, each with different levels of risk of containing unacceptable items. Expert opinion, or judgment, may be required to distinguish between the high- and low-risk groups. Hence, high-risk items are likely to be identified (and sampled) using expert judgment, while the remaining low-risk items are sampled randomly. We focus on the situation where all observed samples must be acceptable. Consequently, the objective of the statistical inference is to quantify the probability that a large percentage of the unsampled items in the population are also acceptable. We demonstrate that traditional (frequentist) acceptance sampling and simpler Bayesian formulations of the problem are essentially special cases of the proposed model. We explore the properties of the model in detail and discuss the conditions necessary to ensure that required sample sizes are a non-decreasing function of the population size. The method is applicable to a variety of acceptance sampling problems and, in particular, to environmental sampling where the objective is to demonstrate the safety of reoccupying a remediated facility that has been contaminated with a lethal agent.
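
    The sketch below is one of the simpler Bayesian formulations that the authors note arise as special cases of their model: a single homogeneous group, a Beta prior on the defect rate, and all observed samples acceptable. The numbers and the uniform prior are illustrative assumptions, not values from the paper.

```python
import random

def prob_defect_rate_below(p0, n_clean, a=1.0, b=1.0, draws=200_000):
    """Posterior probability that the defect rate is below p0 after n_clean
    items were sampled and all were acceptable. With a binomial likelihood
    and zero observed defects, a Beta(a, b) prior updates to
    Beta(a, b + n_clean); the tail probability is estimated by Monte Carlo
    using the standard library's Beta sampler."""
    return sum(random.betavariate(a, b + n_clean) < p0
               for _ in range(draws)) / draws

# After 50 acceptable samples under a uniform prior, how confident are we
# that fewer than 5% of items in the population are unacceptable?
print(prob_defect_rate_below(0.05, 50))  # about 0.93
```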

  5. Perceived social environment and adolescents' well-being and adjustment: Comparing a foster care sample with a matched sample

    OpenAIRE

    Farruggia, SP; Greenberger, E; Chen, C; Heckhausen, J

    2006-01-01

    Previous research has demonstrated that former foster care youth are at risk for poor outcomes (e.g., more problem behaviors, more depression, lower self-esteem, and poor social relationships). It is not clear, however, whether these findings reflect preemancipation developmental deficits. This study used 163 preemancipation foster care youth and a matched sample of 163 comparison youth. Results showed that foster-care youth did not differ from the comparison sample on measures of well-being,...

  6. Irritability Trajectories, Cortical Thickness, and Clinical Outcomes in a Sample Enriched for Preschool Depression.

    Science.gov (United States)

    Pagliaccio, David; Pine, Daniel S; Barch, Deanna M; Luby, Joan L; Leibenluft, Ellen

    2018-05-01

    Cross-sectional, longitudinal, and genetic associations exist between irritability and depression. Prior studies have examined developmental trajectories of irritability, clinical outcomes, and associations with child and familial depression. However, studies have not integrated neurobiological measures. The present study examined developmental trajectories of irritability, clinical outcomes, and cortical structure among preschoolers oversampled for depressive symptoms. Beginning at 3 to 5 years old, a sample of 271 children enriched for early depressive symptoms were assessed longitudinally by clinical interview. Latent class mixture models identified trajectories of irritability severity. Risk factors, clinical outcomes, and cortical thickness were compared across trajectory classes. Cortical thickness measures were extracted from 3 waves of magnetic resonance imaging at 7 to 12 years of age. Three trajectory classes were identified among these youth: 53.50% of children exhibited elevated irritability during preschool that decreased longitudinally, 30.26% exhibited consistently low irritability, and 16.24% exhibited consistently elevated irritability. Compared with other classes, the elevated irritability class exhibited higher rates of maternal depression, early life adversity, later psychiatric diagnoses, and functional impairment. Further, elevated baseline irritability predicted later depression beyond adversity and personal and maternal depression history. The elevated irritability class exhibited a thicker cortex in the left superior frontal and temporal gyri and the right inferior parietal lobule. Irritability manifested with specific developmental trajectories in this sample enriched for early depression. Persistently elevated irritability predicted poor psychiatric outcomes, higher risk for later depression, and decreased overall function later in development. Greater frontal, temporal, and parietal cortical thickness also was found, providing neural

  7. 40 CFR 1065.245 - Sample flow meter for batch sampling.

    Science.gov (United States)

    2010-07-01

    Title 40: Protection of Environment. § 1065.245 Sample flow meter for batch sampling. (a) Application. Use a sample flow meter to determine sample flow... difference between a diluted exhaust sample flow meter and a dilution air meter to calculate raw exhaust flow...

  8. Balanced sampling

    NARCIS (Netherlands)

    Brus, D.J.

    2015-01-01

    In balanced sampling a linear relation between the soil property of interest and one or more covariates with known means is exploited in selecting the sampling locations. Recent developments make this sampling design attractive for statistical soil surveys. This paper introduces balanced sampling

  9. Low ficolin-3 levels in early follow-up serum samples are associated with the severity and unfavorable outcome of acute ischemic stroke

    DEFF Research Database (Denmark)

    Füst, George; Munthe-Fog, Lea; Illes, Zsolt

    2011-01-01

    demonstrated the significance of MBL in ischemic stroke, the role of ficolins has not been examined. METHODS: Sera were obtained within 12 hours after the onset of ischemic stroke (admission samples) and 3-4 days later (follow-up samples) from 65 patients. The control group comprised 100 healthy individuals... In the follow-up samples, an inverse correlation was observed between ficolin-3 levels and the concentration of S100β, an indicator of the size of cerebral infarct. Patients with low ficolin-3 levels and high CRP levels in the follow-up samples had a significantly worse outcome (adjusted ORs 5.6 and 3.9, respectively...

  10. Ensemble Sampling

    OpenAIRE

    Lu, Xiuyuan; Van Roy, Benjamin

    2017-01-01

    Thompson sampling has emerged as an effective heuristic for a broad range of online decision problems. In its basic form, the algorithm requires computing and sampling from a posterior distribution over models, which is tractable only for simple special cases. This paper develops ensemble sampling, which aims to approximate Thompson sampling while maintaining tractability even in the face of complex models such as neural networks. Ensemble sampling dramatically expands on the range of applica...
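
    A minimal sketch of the idea for a K-armed Bernoulli bandit: instead of sampling from an exact posterior as Thompson sampling does, keep a small ensemble of perturbed estimates and act greedily with respect to one member drawn uniformly at random each round. The Gaussian perturbation scheme below is one simple choice for illustration, not the paper's specific construction.

```python
import random

class EnsembleSamplingBandit:
    """Ensemble-sampling sketch: M perturbed models stand in for posterior
    samples; each round one model is chosen at random and followed greedily."""
    def __init__(self, n_arms, n_models=10, noise=1.0):
        self.noise = noise
        # each model holds per-arm [perturbed reward sum, pull count]
        self.models = [[[random.gauss(0, noise), 1] for _ in range(n_arms)]
                       for _ in range(n_models)]

    def select_arm(self):
        model = random.choice(self.models)   # sample a model, not a posterior
        estimates = [s / n for s, n in model]
        return max(range(len(estimates)), key=lambda a: estimates[a])

    def update(self, arm, reward):
        for model in self.models:            # each member sees a perturbed reward
            model[arm][0] += reward + random.gauss(0, self.noise)
            model[arm][1] += 1

# Usage: the ensemble concentrates its pulls on the best (0.6) arm over time.
bandit = EnsembleSamplingBandit(n_arms=3)
true_p = [0.3, 0.5, 0.6]
for _ in range(2000):
    arm = bandit.select_arm()
    bandit.update(arm, 1.0 if random.random() < true_p[arm] else 0.0)
```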

  11. Atypical antipsychotics: trends in analysis and sample preparation of various biological samples.

    Science.gov (United States)

    Fragou, Domniki; Dotsika, Spyridoula; Sarafidou, Parthena; Samanidou, Victoria; Njau, Samuel; Kovatsi, Leda

    2012-05-01

    Atypical antipsychotics are increasingly popular and increasingly prescribed. In some countries, they can even be obtained over-the-counter, without a prescription, making their abuse quite easy. Although atypical antipsychotics are thought to be safer than typical antipsychotics, they still have severe side effects. Intoxications are not rare and some of them have a fatal outcome. Drug interactions involving atypical antipsychotics complicate patient management in clinical settings and the determination of the cause of death in fatalities. In view of the above, analytical strategies that can efficiently isolate atypical antipsychotics from a variety of biological samples and quantify them accurately, sensitively and reliably, are of utmost importance both for the clinical, as well as for the forensic toxicologist. In this review, we will present and discuss novel analytical strategies that have been developed from 2004 to the present day for the determination of atypical antipsychotics in various biological samples.

  12. Modern survey sampling

    CERN Document Server

    Chaudhuri, Arijit

    2014-01-01

    Exposure to Sampling: Concepts of Population, Sample, and Sampling. Initial Ramifications: Sampling Design, Sampling Scheme; Random Numbers and Their Uses in Simple Random Sampling (SRS); Drawing Simple Random Samples with and without Replacement; Estimation of Mean, Total, Ratio of Totals/Means: Variance and Variance Estimation; Determination of Sample Sizes; Appendix: More on Equal Probability Sampling, Horvitz-Thompson Estimator, Sufficiency, Likelihood, Non-Existence Theorem. More Intricacies: Unequal Probability Sampling Strategies; PPS Sampling. Exploring Improved Ways: Stratified Sampling; Cluster Sampling; Multi-Stage Sampling; Multi-Phase Sampling: Ratio and Regression Estimation; Controlled Sampling. Modeling: Super-Population Modeling; Prediction Approach; Model-Assisted Approach; Bayesian Methods; Spatial Smoothing; Sampling on Successive Occasions: Panel Rotation; Non-Response and Not-at-Homes; Weighting Adj...

  13. Respondent-Driven Sampling – Testing Assumptions: Sampling with Replacement

    Directory of Open Access Journals (Sweden)

    Barash Vladimir D.

    2016-03-01

    Full Text Available Classical Respondent-Driven Sampling (RDS estimators are based on a Markov Process model in which sampling occurs with replacement. Given that respondents generally cannot be interviewed more than once, this assumption is counterfactual. We join recent work by Gile and Handcock in exploring the implications of the sampling-with-replacement assumption for bias of RDS estimators. We differ from previous studies in examining a wider range of sampling fractions and in using not only simulations but also formal proofs. One key finding is that RDS estimates are surprisingly stable even in the presence of substantial sampling fractions. Our analyses show that the sampling-with-replacement assumption is a minor contributor to bias for sampling fractions under 40%, and bias is negligible for the 20% or smaller sampling fractions typical of field applications of RDS.
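
    For context on what such estimators do, a widely used design-based RDS estimator weights each respondent by the inverse of their reported network degree, because a recruitment chain behaving like a random walk oversamples well-connected people. The sketch below shows only this weighting step (a Volz-Heckathorn style estimator), not the Markov-chain machinery whose with-replacement assumption the paper tests.

```python
def rds_proportion(degrees, has_trait):
    """Degree-weighted estimate of the proportion of a hidden population
    with a trait: respondents reached by a random walk appear with
    probability roughly proportional to their degree, so each is
    weighted by 1/degree to compensate."""
    num = sum(x / d for x, d in zip(has_trait, degrees))
    den = sum(1 / d for d in degrees)
    return num / den

# Five respondents: reported degrees and a 0/1 trait indicator.
print(rds_proportion([10, 2, 5, 20, 4], [1, 0, 1, 0, 1]))  # 0.5
```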

  14. Turnover, staffing, skill mix, and resident outcomes in a national sample of US nursing homes.

    Science.gov (United States)

    Trinkoff, Alison M; Han, Kihye; Storr, Carla L; Lerner, Nancy; Johantgen, Meg; Gartrell, Kyungsook

    2013-12-01

    The authors examined the relationship of staff turnover to selected nursing home quality outcomes, in the context of staffing and skill mix. Staff turnover is a serious concern in nursing homes as it has been found to adversely affect care. When employee turnover is minimized, better care quality is more likely in nursing homes. Data from the National Nursing Home Survey, a nationally representative sample of US nursing homes, were linked to Nursing Home Compare quality outcomes and analyzed using logistic regression. Nursing homes with high certified nursing assistant turnover had significantly higher odds of pressure ulcers, pain, and urinary tract infections even after controlling for staffing, skill mix, bed size, and ownership. Nurse turnover was associated with twice the odds of pressure ulcers, although this was attenuated when staffing was controlled. This study suggests turnover may be more important in explaining nursing home (NH) outcomes than staffing and skill mix and should therefore be given greater emphasis.

  15. Sample preparation

    International Nuclear Information System (INIS)

    Anon.

    1992-01-01

    Sample preparation prior to HPLC analysis is certainly one of the most important steps to consider in trace or ultratrace analysis. For many years scientists have tried to simplify the sample preparation process. It is rarely possible to inject a neat liquid sample, and sample preparation is seldom as simple as dissolving the sample in a given solvent. Dissolution alone can remove insoluble materials, which is especially helpful with samples in complex matrices if other interactions do not affect extraction; here it is very likely that a large number of components will not dissolve and are therefore eliminated by a simple filtration step. In most cases, however, sample preparation is not as simple as dissolution of the component of interest. At times enrichment is necessary, that is, the component of interest is present in a very large volume or mass of material and needs to be concentrated in some manner so that a small volume of the concentrated or enriched sample can be injected into the HPLC. 88 refs

  16. Relational Intimacy Mediates Sexual Outcomes Associated With Impaired Sexual Function: Examination in a Clinical Sample.

    Science.gov (United States)

    Witherow, Marta Parkanyi; Chandraiah, Shambhavi; Seals, Samantha R; Sarver, Dustin E; Parisi, Kathryn E; Bugan, Antal

    2017-06-01

    Relational intimacy is hypothesized to underlie the association between female sexual functioning and various sexual outcomes, and married women and women with sexual dysfunction have been generally absent from prior studies investigating these associations, thus restricting generalizability. To investigate whether relational intimacy mediates sexual outcomes (sexual satisfaction, coital frequency, and sexual distress) in a sample of married women with and without impaired sexual functioning presenting in clinical settings. Using a cross-sectional design, 64 heterosexual married women with (n = 44) and without (n = 20) impaired sexual functioning completed a battery of validated measurements assessing relational intimacy, sexual dysfunction, sexual frequency, satisfaction, and distress. Intimacy measurements were combined using latent factor scores before analysis. Bias-corrected mediation models of the indirect effect were used to test mediation effects. Moderated mediation models examined whether indirect effects were influenced by age and marital duration. Patients completed the Female Sexual Function Index, the Couple's Satisfaction Index, the Sexual Satisfaction Scale for Women, the Inclusion of the Other in the Self Scale, and the Miller Social Intimacy Test. Mediation models showed that impaired sexual functioning is associated with all sexual outcomes directly and indirectly through relational intimacy. Results were predominantly independent of age and marital duration. Findings have important treatment implications for modifying interventions to focus on enhancing relational intimacy to improve the sexual functioning of women with impaired sexual functioning. The importance of the role relational intimacy plays in broad sexual outcomes of women with impaired sexual functioning is supported in clinically referred and married women. Latent factor scores to improve estimation of study constructs and the use of contemporary mediation analysis also are

  17. Sample summary report for ARG 1 pressure tube sample

    International Nuclear Information System (INIS)

    Belinco, C.

    2006-01-01

    The ARG 1 sample is made from an un-irradiated Zr-2.5% Nb pressure tube. The sample has a 103.4 mm ID, 112 mm OD and approximately 500 mm length. A punch mark was made very close to one end of the sample. The punch mark indicates the 12 o'clock position and also identifies the face of the tube for making all the measurements. The ARG 1 sample contains flaws on the ID and OD surfaces. There was no intentional flaw within the wall of the pressure tube sample. Once the flaws were machined, the pressure tube sample was covered from the outside to hide the OD flaws. Approximately 50 mm of pressure tube length was left open at both ends to facilitate holding the sample in the fixtures for inspection. No flaw was machined in this 50 mm zone on either end of the pressure tube sample. A total of 20 flaws were machined in the ARG 1 sample. Of these, 16 flaws were on the OD surface and the remaining 4 on the ID surface of the pressure tube. The flaws were characterized into various groups, such as axial flaws, circumferential flaws, etc.

  18. Spot sputum samples are at least as good as early morning samples for identifying Mycobacterium tuberculosis.

    Science.gov (United States)

    Murphy, Michael E; Phillips, Patrick P J; Mendel, Carl M; Bongard, Emily; Bateson, Anna L C; Hunt, Robert; Murthy, Saraswathi; Singh, Kasha P; Brown, Michael; Crook, Angela M; Nunn, Andrew J; Meredith, Sarah K; Lipman, Marc; McHugh, Timothy D; Gillespie, Stephen H

    2017-10-27

    The use of early morning sputum samples (EMS) to diagnose tuberculosis (TB) can result in treatment delay given the need for the patient to return to the clinic with the EMS, increasing the chance of patients being lost during their diagnostic workup. However, there is little evidence to support the superiority of EMS over spot sputum samples. In this new analysis of the REMoxTB study, we compare the diagnostic accuracy of EMS with spot samples for identifying Mycobacterium tuberculosis pre- and post-treatment. Patients who were smear positive at screening were enrolled into the study. Paired sputum samples (one EMS and one spot) were collected at each trial visit pre- and post-treatment. Microscopy and culture on solid LJ and liquid MGIT media were performed on all samples; those missing corresponding paired results were excluded from the analyses. Data from 1115 pre- and 2995 post-treatment paired samples from 1931 patients enrolled in the REMoxTB study were analysed. Patients were recruited from South Africa (47%), East Africa (21%), India (20%), Asia (11%), and North America (1%); 70% were male, median age 31 years (IQR 24-41), and 139 (7%) were co-infected with HIV with a median CD4 cell count of 399 cells/μL (IQR 318-535). Pre-treatment spot samples had a higher yield of positive Ziehl-Neelsen smears (98% vs. 97%, P = 0.02) and LJ cultures (87% vs. 82%, P = 0.006) than EMS, but there was no difference in positivity by MGIT (93% vs. 95%, P = 0.18). Contaminated and false-positive MGIT were found more often with EMS than with spot samples. Surprisingly, pre-treatment EMS had a higher smear grading and a shorter time-to-positivity in MGIT culture, by 1 day, than spot samples (4.5 vs. 5.5 days). Comparing EMS with spot samples in those with unfavourable outcomes, there were no differences in smear or culture results, and positive results were not detected earlier in Kaplan-Meier analyses in either EMS or spot samples. Our data do not support the hypothesis that EMS

  19. Enhanced conformational sampling using enveloping distribution sampling.

    Science.gov (United States)

    Lin, Zhixiong; van Gunsteren, Wilfred F

    2013-10-14

    Lessening the problem of insufficient conformational sampling in biomolecular simulations remains a major challenge in computational biochemistry. In this article, an application of the method of enveloping distribution sampling (EDS) is proposed that addresses this challenge, and its sampling efficiency is demonstrated in simulations of a hexa-β-peptide whose conformational equilibrium encompasses two different helical folds, i.e., a right-handed 2.7(10/12)-helix and a left-handed 3(14)-helix, separated by a high energy barrier. Standard MD simulations of this peptide using the GROMOS 53A6 force field did not reach convergence of the free enthalpy difference between the two helices even after 500 ns of simulation time. The use of soft-core non-bonded interactions in the centre of the peptide did enhance the number of transitions between the helices, but at the same time led to neglect of relevant helical configurations. In simulations of a two-state EDS reference Hamiltonian that envelops both the physical peptide and the soft-core peptide, sampling of the conformational space of the physical peptide ensures that physically relevant conformations can be visited, and sampling of the conformational space of the soft-core peptide helps to enhance the transitions between the two helices. The EDS simulations sampled many more transitions between the two helices and showed much faster convergence of the relative free enthalpy of the two helices compared with the standard MD simulations, with only a slightly larger computational effort to determine optimized EDS parameters. Combined with various methods to smoothen the potential energy surface, the proposed EDS application will be a powerful technique to enhance the sampling efficiency in biomolecular simulations.

  20. Drinking game participation and outcomes in a sample of Australian university students.

    Science.gov (United States)

    George, Amanda M; Zamboanga, Byron L

    2018-05-15

    Most drinking game (DG) research among university students has been conducted among USA college samples. The extent to which demographics and game type (e.g. team and sculling games) are linked to DG behaviours/consequences among non-USA students is not well understood. As such, the current study investigated characteristics of DG participation (and associated outcomes) among a sample of Australian university students. University students (N = 252; aged 18-24 years; 67% female) who had consumed alcohol in the prior year completed an online survey. Measures included demographics, DG behaviours (lifetime, frequency and consumption) and gaming-specific consequences. Most of the students reported lifetime DG participation (85%). Among those who played a DG in the prior 6 months (69%), most had experienced a negative gaming-specific consequence. While team games were the most popular DGs played, regression analysis demonstrated that participation in games which encouraged consumption (e.g. sculling) was associated with increased alcohol consumption during play. In addition to being older, playing DGs more frequently, and consuming more alcohol while playing, participation in both consumption and dice games (e.g. 7-11, doubles) predicted more negative gaming-specific consequences. DG participation is common among Australian university students, as it is in other parts of the world. The importance of game type is clear, particularly the risk of consumption games. Findings could help inform interventions to reduce participation in consumption games and identify students who might be especially at risk for experiencing negative DG consequences. © 2018 Australasian Professional Society on Alcohol and other Drugs.

  1. Standard methods for sampling and sample preparation for gamma spectroscopy

    International Nuclear Information System (INIS)

    Taskaeva, M.; Taskaev, E.; Nikolov, P.

    1993-01-01

    The strategy for sampling and sample preparation is outlined: the necessary number of samples; analysis and treatment of the results received; the quantity of analysed material according to the radionuclide concentrations and analytical methods; and the minimal quantity and kind of data needed for drawing final conclusions and making decisions on the basis of the results received. This strategy was tested in gamma spectroscopic analysis of radionuclide contamination of the region of the Eleshnitsa Uranium Mines. The water samples were taken and stored according to ASTM D 3370-82. The general sampling procedures were in conformity with the recommendations of ISO 5667. The radionuclides were concentrated by coprecipitation with iron hydroxide and ion exchange. The sampling of soil samples complied with the rules of ASTM C 998, and their sample preparation with ASTM C 999. After preparation the samples were sealed hermetically and measured. (author)

  2. The Internet of Samples in the Earth Sciences (iSamples)

    Science.gov (United States)

    Carter, M. R.; Lehnert, K. A.

    2015-12-01

    Across most Earth Science disciplines, research depends on the availability of samples collected above, at, and beneath Earth's surface, on the moon and in space, or generated in experiments. Many domains in the Earth Sciences have recently expressed the need for better discovery, access, and sharing of scientific samples and collections (EarthCube End-User Domain workshops, 2012 and 2013, http://earthcube.org/info/about/end-user-workshops), as has the US government (OSTP Memo, March 2014). The Internet of Samples in the Earth Sciences (iSamples) is an initiative funded as a Research Coordination Network (RCN) within the EarthCube program to address this need. iSamples aims to advance the use of innovative cyberinfrastructure to connect physical samples and sample collections across the Earth Sciences with digital data infrastructures to revolutionize their utility for science. iSamples strives to build, grow, and foster a new community of practice, in which domain scientists, curators of sample repositories and collections, computer and information scientists, software developers and technology innovators engage in and collaborate on defining, articulating, and addressing the needs and challenges of physical samples as a critical component of digital data infrastructure. A primary goal of iSamples is to deliver a community-endorsed set of best practices and standards for the registration, description, identification, and citation of physical specimens and define an actionable plan for implementation. iSamples conducted a broad community survey about sample sharing and has created 5 different working groups to address the different challenges of developing the internet of samples - from metadata schemas and unique identifiers to an architecture of a shared cyberinfrastructure for collections, to digitization of existing collections, to education, and ultimately to establishing the physical infrastructure that will ensure preservation and access of the physical

  3. Sample vial inserts: A better approach for sampling heterogeneous slurry samples in the SRS Defense Waste Processing Facility

    International Nuclear Information System (INIS)

    Coleman, C.J.; Goode, S.R.

    1996-01-01

    A convenient and effective new approach for analyzing DWPF samples involves the use of inserts with volumes of 1.5-3 ml placed in the neck of 14 ml sample vials. The inserts have rims that conform to the rim of the vials so that they sit straight and stable in the vial. The DWPF tank sampling system fills the pre-weighed insert rather than the entire vial, so the vial functions only as the insert holder. The shielded cell operator then removes the vial cap and decants the insert containing the sample into a plastic bottle, crucible, etc., for analysis. Inert materials such as Teflon, plastic, and zirconium are used for the insert, so it is unnecessary to separate the insert from the sample for most analyses. The key technical advantage of using inserts to take DWPF samples, rather than filling sample vials, is that it provides a convenient and almost foolproof way of obtaining and handling small volumes of slurry samples in a shielded cell without corrupting the sample. Since the insert allows the entire sample to be analyzed, this approach eliminates the errors inherent in subsampling the heterogeneous slurries that comprise DWPF samples. Slurry samples can then be analyzed with confidence. Analysis times are dramatically reduced by eliminating the drying and vitrification steps normally used to produce a homogeneous solid sample. Direct dissolution and elemental analysis of slurry samples are achieved in 8 hours or less, compared with 40 hours for analysis of vitrified slurry samples. Comparison of samples taken in inserts versus full vials indicates that the insert does not significantly affect sample composition

  4. Concepts in sample size determination

    Directory of Open Access Journals (Sweden)

    Umadevi K Rao

    2012-01-01

    Full Text Available Investigators involved in clinical, epidemiological or translational research have the drive to publish their results so that they can extrapolate their findings to the population. This begins with the preliminary steps of deciding the topic to be studied, the subjects, and the type of study design. In this context, the researcher must determine how many subjects would be required for the proposed study. Thus, the number of individuals to be included in the study, i.e., the sample size, is an important consideration in the design of many clinical studies. The sample size determination should be based on the difference in the outcome between the two groups studied, as in an analytical study, as well as on the accepted p value for statistical significance and the required statistical power to test the hypothesis. The accepted risk of type I error, or alpha value, which by convention is set at the 0.05 level in biomedical research, defines the cutoff point at which the p value obtained in the study is judged as significant or not. The power in clinical research is the likelihood of finding a statistically significant result when it exists and is typically set to >80%. This is necessary since even the most rigorously executed studies may fail to answer the research question if the sample size is too small. Alternatively, a study with too large a sample size will be difficult to conduct and will waste time and resources. Thus, the goal of sample size planning is to estimate an appropriate number of subjects for a given study design. This article describes the concepts involved in estimating the sample size.
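
    As a worked example combining these ingredients (alpha, power, and the expected difference between groups), the sketch below computes the per-group sample size for comparing two proportions with a two-sided z-test, using the standard normal-approximation formula.

```python
from math import ceil, sqrt
from statistics import NormalDist

def n_per_group(p1, p2, alpha=0.05, power=0.80):
    """Per-group sample size for detecting a difference between two
    proportions with a two-sided z-test (normal approximation)."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)  # type I error cutoff
    z_b = NormalDist().inv_cdf(power)          # power requirement
    p_bar = (p1 + p2) / 2
    numerator = (z_a * sqrt(2 * p_bar * (1 - p_bar))
                 + z_b * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return ceil(numerator / (p1 - p2) ** 2)

# Detecting an improvement from 60% to 75% at alpha = 0.05 with 80% power:
print(n_per_group(0.60, 0.75))  # 152 per group
```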

  5. Adrenal Vein Sampling for Conn's Syndrome: Diagnosis and Clinical Outcomes.

    Science.gov (United States)

    Deipolyi, Amy R; Bailin, Alexander; Wicky, Stephan; Alansari, Shehab; Oklu, Rahmi

    2015-06-19

    Adrenal vein sampling (AVS) is the gold standard test for determining unilateral causes of primary aldosteronism (PA). We retrospectively characterized our experience with AVS, including the concordance of AVS results and imaging, and describe the approach for the PA patient in whom bilateral AVS is unsuccessful. We reviewed the medical records of 85 patients with PA, compared patients treated medically and surgically on pre-procedure presentation and post-treatment outcomes, and evaluated how technically unsuccessful AVS results were used in further patient management. Of the 92 AVS procedures performed in 85 patients, AVS was technically successful bilaterally in 58 (63%) of cases. An unsuccessful AVS either prompted a repeat AVS, or results from the contralateral side and from CT imaging were used to guide further therapy. Patients managed surgically with adrenalectomy had higher initial blood pressure and lower potassium levels than patients managed medically. Adrenalectomy resulted in significantly decreased blood pressure and normalization of potassium levels. AVS can identify surgically curable causes of PA but can be technically challenging. When one adrenal vein fails to be cannulated, results from the contralateral vein can be useful in conjunction with imaging and clinical findings to suggest further management.

  6. Signal sampling circuit

    NARCIS (Netherlands)

    Louwsma, S.M.; Vertregt, Maarten

    2011-01-01

    A sampling circuit for sampling a signal is disclosed. The sampling circuit comprises a plurality of sampling channels adapted to sample the signal in time-multiplexed fashion, each sampling channel comprising a respective track-and-hold circuit connected to a respective analogue-to-digital converter.

  7. Signal sampling circuit

    NARCIS (Netherlands)

    Louwsma, S.M.; Vertregt, Maarten

    2010-01-01

    A sampling circuit for sampling a signal is disclosed. The sampling circuit comprises a plurality of sampling channels adapted to sample the signal in time-multiplexed fashion, each sampling channel comprising a respective track-and-hold circuit connected to a respective analogue-to-digital converter.

  8. Understanding reasons for and outcomes of patients lost to follow-up in antiretroviral therapy programs in Africa through a sampling-based approach.

    Science.gov (United States)

    Geng, Elvin H; Bangsberg, David R; Musinguzi, Nicolas; Emenyonu, Nneka; Bwana, Mwebesa Bosco; Yiannoutsos, Constantin T; Glidden, David V; Deeks, Steven G; Martin, Jeffrey N

    2010-03-01

    Losses to follow-up after initiation of antiretroviral therapy (ART) are common in Africa and are a considerable obstacle to understanding the effectiveness of nascent treatment programs. We sought to characterize, through a sampling-based approach, reasons for and outcomes of patients who become lost to follow-up. Cohort study. We searched for and interviewed a representative sample of lost patients or close informants in the community to determine reasons for and outcomes among lost patients. Three thousand six hundred twenty-eight HIV-infected adults initiated ART between January 1, 2004 and September 30, 2007 in Mbarara, Uganda. Eight hundred twenty-nine became lost to follow-up (cumulative incidence at 1, 2, and 3 years of 16%, 30%, and 39%). We sought a representative sample of 128 lost patients in the community and ascertained vital status in 111 (87%). Top reasons for loss included lack of transportation or money and work/child care responsibilities. Among the 111 lost patients who had their vital status ascertained through tracking, 32 deaths occurred (cumulative 1-year incidence 36%); mortality was highest shortly after the last clinic visit. Lower pre-ART CD4 T-cell count, older age, low blood pressure, and a central nervous system syndrome at the last clinic visit predicted deaths. Of patients directly interviewed, 83% were in care at another clinic and 71% were still using ART. Sociostructural factors are the primary reasons for loss to follow-up. Outcomes among the lost are heterogeneous: both deaths and transfers to other clinics were common. Tracking a sample of lost patients is an efficient means for programs to understand site-specific reasons for and outcomes among patients lost to follow-up.

  9. Microfabricated Devices for Sample Extraction, Concentrations, and Related Sample Processing Technologies

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Gang; Lin, Yuehe

    2006-12-01

    This is an invited book chapter. As with other analytical techniques, sample pretreatment, sample extraction, sample introduction, and related techniques are of extreme importance for micro-electro-mechanical systems (MEMS). Bio-MEMS devices and systems start with a sampling step. The biological sample then usually undergoes some kind of sample preparation before the actual analysis. These steps may involve extracting the target analyte from its matrix, removing interferences from the sample, derivatizing the sample to detectable species, or performing a sample preconcentration step. The integration of components for sample pretreatment into microfluidic devices represents one of the remaining bottlenecks towards achieving true miniaturized total analysis systems (µTAS). This chapter provides a thorough state-of-the-art review of the developments in this field to date.

  10. Limited-sampling strategies for anti-infective agents: systematic review.

    Science.gov (United States)

    Sprague, Denise A; Ensom, Mary H H

    2009-09-01

    Area under the concentration-time curve (AUC) is a pharmacokinetic parameter that represents overall exposure to a drug. For selected anti-infective agents, pharmacokinetic-pharmacodynamic parameters, such as AUC/MIC (where MIC is the minimal inhibitory concentration), have been correlated with outcome in a few studies. A limited-sampling strategy may be used to estimate pharmacokinetic parameters such as AUC, without the frequent, costly, and inconvenient blood sampling that would be required to directly calculate the AUC. To discuss, by means of a systematic review, the strengths, limitations, and clinical implications of published studies involving a limited-sampling strategy for anti-infective agents and to propose improvements in methodology for future studies. The PubMed and EMBASE databases were searched using the terms "anti-infective agents", "limited sampling", "optimal sampling", "sparse sampling", "AUC monitoring", "abbreviated AUC", "abbreviated sampling", and "Bayesian". The reference lists of retrieved articles were searched manually. Included studies were classified according to modified criteria from the US Preventive Services Task Force. Twenty studies met the inclusion criteria. Six of the studies (involving didanosine, zidovudine, nevirapine, ciprofloxacin, efavirenz, and nelfinavir) were classified as providing level I evidence, 4 studies (involving vancomycin, didanosine, lamivudine, and lopinavir-ritonavir) provided level II-1 evidence, 2 studies (involving saquinavir and ceftazidime) provided level II-2 evidence, and 8 studies (involving ciprofloxacin, nelfinavir, vancomycin, ceftazidime, ganciclovir, pyrazinamide, meropenem, and alpha interferon) provided level III evidence. All of the studies providing level I evidence used prospectively collected data and proper validation procedures with separate, randomly selected index and validation groups. However, most of the included studies did not provide an adequate description of the methods or
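
    To illustrate what a limited-sampling strategy replaces, the sketch below contrasts a full trapezoidal AUC over a rich sampling schedule with a two-timepoint regression estimate of the general form such studies report. The timepoints, concentrations and coefficients are hypothetical placeholders; real values must be fitted in a development cohort and validated in an independent group, as the level I studies did.

```python
def auc_trapezoidal(times, concs):
    """AUC by the linear trapezoidal rule over a rich sampling schedule."""
    return sum((t1 - t0) * (c0 + c1) / 2
               for t0, t1, c0, c1 in zip(times, times[1:], concs, concs[1:]))

def auc_limited(c_2h, c_6h, b0=0.0, b1=3.0, b2=8.0):
    """Limited-sampling estimate of the form AUC = b0 + b1*C(2h) + b2*C(6h).
    The coefficients are placeholders; published strategies report fitted
    values together with validation statistics."""
    return b0 + b1 * c_2h + b2 * c_6h

times = [0, 0.5, 1, 2, 4, 6, 8, 12]              # hours
concs = [0, 4.0, 6.5, 5.8, 3.9, 2.5, 1.6, 0.7]   # mg/L
print(auc_trapezoidal(times, concs))  # full-AUC reference from rich sampling
print(auc_limited(concs[3], concs[5]))  # estimate from only two samples
```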

  11. Utility of the Mayo-Portland adaptability inventory-4 for self-reported outcomes in a military sample with traumatic brain injury.

    Science.gov (United States)

    Kean, Jacob; Malec, James F; Cooper, Douglas B; Bowles, Amy O

    2013-12-01

    To investigate the psychometric properties of the Mayo-Portland Adaptability Inventory-4 (MPAI-4) obtained by self-report in a large sample of active duty military personnel with traumatic brain injury (TBI). Consecutive cohort who completed the MPAI-4 as part of a larger battery of clinical outcome measures at the time of intake to an outpatient brain injury clinic. Medical center. Consecutively referred sample of active duty military personnel (N=404) who suffered predominantly mild (n=355), but also moderate (n=37) and severe (n=12), TBI. Interventions: not applicable. Main outcome measure: the MPAI-4. Initial factor analysis suggested 2 salient dimensions. In subsequent analysis, the ratio of the first and second eigenvalues (6.84:1) and parallel analysis indicated sufficient unidimensionality in 26 retained items. Iterative Rasch analysis resulted in the rescaling of the measure and the removal of 5 additional items for poor fit. The items of the final 21-item Mayo-Portland Adaptability Inventory-military were locally independent, demonstrated monotonically increasing responses, adequately fit the item response model, and permitted the identification of nearly 5 statistically distinct levels of disability in the study population. Slight mistargeting of the population resulted in the global outcome, as measured by the Mayo-Portland Adaptability Inventory-military, tending to be less reflective of very mild levels of disability. These data, collected in a relatively large sample of active duty service members with TBI, provide insight into the ability of patients to self-report functional impairment and the distinct effects of military deployment on outcome, providing important guidance for the meaningful measurement of outcome in this population. Copyright © 2013 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.

  12. Sampling bias in climate-conflict research

    Science.gov (United States)

    Adams, Courtland; Ide, Tobias; Barnett, Jon; Detges, Adrien

    2018-03-01

    Critics have argued that the evidence of an association between climate change and conflict is flawed because the research relies on a dependent-variable sampling strategy [1-4]. Similarly, it has been hypothesized that convenience of access biases the sample of cases studied (the 'streetlight effect' [5]). This also gives rise to claims that the climate-conflict literature stigmatizes some places as being more 'naturally' violent [6-8]. Yet there has been no proof of such sampling patterns. Here we test whether climate-conflict research is based on such a biased sample through a systematic review of the literature. We demonstrate that research on climate change and violent conflict suffers from a streetlight effect. Further, studies that focus on a small number of cases are strongly informed by cases where there has been conflict, do not sample on the independent variables (climate impact or risk), and hence tend to find some association between these two variables. These biases mean that research on climate change and conflict primarily focuses on a few accessible regions, overstates the links between the two phenomena and cannot explain peaceful outcomes from climate change. This could result in maladaptive responses in places stigmatized as inherently more prone to climate-induced violence.

  13. Environmental sampling

    International Nuclear Information System (INIS)

    Puckett, J.M.

    1998-01-01

    Environmental Sampling (ES) is a technology option that can have application in transparency in nuclear nonproliferation. The basic process is to take a sample from the environment, e.g., soil, water, vegetation, or dust and debris from a surface, and through very careful sample preparation and analysis, determine the types, elemental concentrations, and isotopic composition of actinides in the sample. The sample is prepared and the analysis performed in a clean chemistry laboratory (CCL). This ES capability is part of the IAEA Strengthened Safeguards System. Such a laboratory is planned to be built by JAERI at Tokai and will give Japan an intrinsic ES capability. This paper presents options for the use of ES as a transparency measure for nuclear nonproliferation

  14. Soil Gas Sample Handling: Evaluation of Water Removal and Sample Ganging

    Energy Technology Data Exchange (ETDEWEB)

    Fritz, Brad G. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Abrecht, David G. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Hayes, James C. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Mendoza, Donaldo P. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2016-10-31

    Soil gas sampling is currently conducted in support of Nuclear Test Ban Treaty verification. Soil gas samples are collected and analyzed for isotopes of interest. Issues that can impact sampling and analysis of these samples include excess moisture and sample processing time. Here we discuss three potential improvements to the current sampling protocol: a desiccant for water removal, use of a molecular sieve to remove CO2 from the sample during collection, and a ganging manifold to allow composite analysis of multiple samples.

  15. Optimization of sampling parameters for standardized exhaled breath sampling.

    Science.gov (United States)

    Doran, Sophie; Romano, Andrea; Hanna, George B

    2017-09-05

    The lack of standardization of breath sampling is a major contributing factor to the poor repeatability of results and hence represents a barrier to the adoption of breath tests in clinical practice. On-line and bag breath sampling have advantages but do not suit multicentre clinical studies, whereas storage and robust transport are essential for the conduct of wide-scale studies. Several devices have been developed to control sampling parameters and to concentrate volatile organic compounds (VOCs) onto thermal desorption (TD) tubes and subsequently transport those tubes for laboratory analysis. We conducted three experiments to investigate (i) the fraction of breath sampled (whole vs. lower expiratory exhaled breath); (ii) breath sample volume (125, 250, 500 and 1000 ml) and (iii) breath sample flow rate (400, 200, 100 and 50 ml/min). The target VOCs were acetone and potential volatile biomarkers for oesophago-gastric cancer belonging to the aldehyde, fatty acid and phenol chemical classes. We also examined the collection execution time and the impact of environmental contamination. The experiments showed that the use of exhaled breath-sampling devices requires the selection of optimum sampling parameters. Increasing the sample volume improved the levels of VOCs detected. However, the influence of the fraction of exhaled breath and of the flow rate depends on the target VOCs measured. The concentration of potential volatile biomarkers for oesophago-gastric cancer was not significantly different between whole and lower airway exhaled breath. While the recovery of phenols and acetone from TD tubes was lower when breath sampling was performed at a higher flow rate, other VOCs were not affected. A dedicated 'clean air supply' overcomes contamination from ambient air, but the breath collection device itself can be a source of contaminants. In clinical studies using VOCs to diagnose gastro-oesophageal cancer, the optimum parameters are a 500 ml sample volume

  16. Sample summary report for KOR1 pressure tube sample

    International Nuclear Information System (INIS)

    Lee, Hee Jong; Nam, Min Woo; Choi, Young Ha

    2006-01-01

    This summary report basically includes the following: - The FLAW CHARACTERIZATION TABLE of the KOR1 sample and supporting documentation. - The CROSS REFERENCE TABLES for each investigator, i.e., the SAMPLE INSPECTION TABLEs that cross-reference the FLAW CHARACTERIZATION TABLE. - Each Sample Inspection Report, as appendices

  17. Language sampling

    DEFF Research Database (Denmark)

    Rijkhoff, Jan; Bakker, Dik

    1998-01-01

    This article has two aims: [1] to present a revised version of the sampling method that was originally proposed in 1993 by Rijkhoff, Bakker, Hengeveld and Kahrel, and [2] to discuss a number of other approaches to language sampling in the light of our own method. We will also demonstrate how our... sampling method is used with different genetic classifications (Voegelin & Voegelin 1977, Ruhlen 1987, Grimes ed. 1997) and argue that, on the whole, our sampling technique compares favourably with other methods, especially in the case of exploratory research.

  18. Clinical presentation and outcome of avoidant/restrictive food intake disorder in a Japanese sample.

    Science.gov (United States)

    Nakai, Yoshikatsu; Nin, Kazuko; Noma, Shun'ichi; Hamagaki, Seiji; Takagi, Ryuro; Teramukai, Satoshi; Wonderlich, Stephen A

    2017-01-01

    We conducted a study of the clinical presentation and outcome in patients with avoidant/restrictive food intake disorder (ARFID), aged 15-40 years, and compared this group to an anorexia nervosa (AN) group in a Japanese sample. A retrospective chart review was completed on 245 patients with feeding and eating disorders (FEDs), analyzing prevalence, clinical presentation, psychopathological properties, and outcomes. Using the DSM-5 criteria, 27 (11.0%) of the 245 patients with a FED met the criteria for ARFID at entry. All patients with ARFID were women. In terms of eating disorder symptoms, all patients with ARFID had restrictive eating related to emotional problems and/or gastrointestinal symptoms. However, none of the ARFID patients reported food avoidance related to sensory characteristics or functional dysphagia. Additionally, none of them exhibited binge eating or purging behaviors, and none of them reported excessive exercise. The ARFID group had a significantly shorter duration of illness, lower rates of admission history, and less severe psychopathology than the AN group. The ARFID group reported significantly better outcome results than the AN group. These results suggest that patients with ARFID in this study were clinically distinct from those with AN and somewhat different from pediatric patients with ARFID in previous studies. Copyright © 2016 Elsevier Ltd. All rights reserved.

  19. A Note on Information-Directed Sampling and Thompson Sampling

    OpenAIRE

    Zhou, Li

    2015-01-01

    This note introduces three Bayesian-style multi-armed bandit algorithms: Information-Directed Sampling, Thompson Sampling and Generalized Thompson Sampling. The goal is to give an intuitive explanation of these three algorithms and their regret bounds, and to provide some derivations that are omitted in the original papers.
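
    For readers who want the basic form as runnable code, here is a minimal Thompson sampling loop for a Bernoulli bandit, the tractable special case the note starts from: each arm keeps a Beta posterior over its success rate, and each round plays the arm whose posterior draw is largest.

```python
import random

def thompson_step(successes, failures):
    """One round of Thompson sampling: draw a plausible success rate for
    each arm from its Beta posterior and play the arm with the largest draw."""
    draws = [random.betavariate(s + 1, f + 1) for s, f in zip(successes, failures)]
    return max(range(len(draws)), key=lambda a: draws[a])

# Simulate 1000 rounds against hidden success rates.
true_p = [0.3, 0.5, 0.6]
wins = [0, 0, 0]
losses = [0, 0, 0]
for _ in range(1000):
    arm = thompson_step(wins, losses)
    if random.random() < true_p[arm]:
        wins[arm] += 1
    else:
        losses[arm] += 1
print(wins, losses)  # most pulls concentrate on the 0.6 arm
```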

  20. Occurrence of multiple mental health or substance use outcomes among bisexuals: a respondent-driven sampling study

    Directory of Open Access Journals (Sweden)

    Greta R. Bauer

    2016-06-01

    Full Text Available Background: Bisexual populations have higher prevalence of depression, anxiety, suicidality and substance use than heterosexuals, and often than gay men or lesbians. The co-occurrence of multiple outcomes has rarely been studied. Methods: Data were collected from 405 bisexuals using respondent-driven sampling. Weighted analyses were conducted for 387 with outcome data. Multiple outcomes were defined as 3 or more of: depression, anxiety, suicide ideation, problematic alcohol use, or polysubstance use. Results: Among bisexuals, 19.0% had multiple outcomes. We did not find variation in the raw frequency of multiple outcomes across sociodemographic variables (e.g., gender, age). After adjustment, gender and sexual orientation identity were associated, with transgender women and those identifying as bisexual only more likely to have multiple outcomes. Social equity factors had a strong impact in both crude and adjusted analyses: controlling for other factors, high mental health/substance use burden was associated with greater discrimination (prevalence risk ratio (PRR) = 5.71; 95% CI: 2.08, 15.63) and lower education (PRR = 2.41; 95% CI: 1.06, 5.49), while a higher income-to-needs ratio was protective (PRR = 0.44; 95% CI: 0.20, 1.00). Conclusions: Mental health and substance use outcomes with high prevalence among bisexuals frequently co-occurred. We find some support for the theory that these multiple outcomes represent a syndemic, defined as co-occurring and mutually reinforcing adverse outcomes driven by social inequity.

  1. How Sample Size Affects a Sampling Distribution

    Science.gov (United States)

    Mulekar, Madhuri S.; Siegel, Murray H.

    2009-01-01

    If students are to understand inferential statistics successfully, they must have a profound understanding of the nature of the sampling distribution. Specifically, they must comprehend the determination of the expected value and standard error of a sampling distribution as well as the meaning of the central limit theorem. Many students in a high…
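
    The article's point can be seen in a few lines of simulation: draw repeated samples of increasing size from a skewed population and watch the standard error of the sample mean shrink roughly as sigma divided by the square root of n, while the distribution of means grows more normal.

```python
import random
from statistics import mean, stdev

random.seed(1)
population = [random.expovariate(1.0) for _ in range(100_000)]  # skewed, sigma ~ 1

for n in (5, 30, 200):
    sample_means = [mean(random.sample(population, n)) for _ in range(2000)]
    # the spread of the sampling distribution shrinks like 1/sqrt(n),
    # and its shape grows more normal (central limit theorem)
    print(n, round(stdev(sample_means), 3))
```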

  2. Comparisons of methods for generating conditional Poisson samples and Sampford samples

    OpenAIRE

    Grafström, Anton

    2005-01-01

    Methods for conditional Poisson sampling (CP-sampling) and Sampford sampling are compared, with a focus on the efficiency of the methods. The efficiency is investigated by simulation in different sampling situations. It was of interest to compare methods since new methods for both CP-sampling and Sampford sampling were introduced by Bondesson, Traat & Lundqvist in 2004. The new methods are acceptance-rejection methods that use the efficient Pareto sampling method. They are found to be ...
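
    A naive acceptance-rejection implementation of CP-sampling, the baseline such methods improve on: draw Poisson samples (independent Bernoulli trials with the working inclusion probabilities) and accept the first realization of the required size. Per the abstract, the 2004 methods speed this scheme up by building on Pareto sampling; the sketch below is only the brute-force idea.

```python
import random

def conditional_poisson_sample(probs, n, max_tries=100_000):
    """Naive CP-sampling by acceptance-rejection: run independent Bernoulli
    trials with the working inclusion probabilities (a Poisson sample) and
    accept the first realization containing exactly n units."""
    for _ in range(max_tries):
        sample = [i for i, p in enumerate(probs) if random.random() < p]
        if len(sample) == n:
            return sample
    raise RuntimeError("no fixed-size sample accepted; tune probs or max_tries")

# Four units with unequal working probabilities; fix the sample size at 2.
print(conditional_poisson_sample([0.2, 0.5, 0.5, 0.8], n=2))
```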

  3. Sampling the Mouse Hippocampal Dentate Gyrus

    Directory of Open Access Journals (Sweden)

    Lisa Basler

    2017-12-01

    Full Text Available Sampling is a critical step in procedures that generate quantitative morphological data in the neurosciences. Samples need to be representative to allow statistical evaluations, and samples need to deliver a precision that makes statistical evaluations not only possible but also meaningful. Sampling-generated variability should, e.g., not be able to hide significant group differences from statistical detection if they are present. Estimators of the coefficient of error (CE) have been developed to provide tentative answers to the question of whether sampling has been "good enough" to provide meaningful statistical outcomes. We tested the performance of the commonly used Gundersen-Jensen CE estimator, using the layers of the mouse hippocampal dentate gyrus as an example (molecular layer, granule cell layer and hilus). We found that this estimator provided useful estimates of the precision that can be expected from samples of different sizes. For all layers, we found that a smoothness factor (m) of 0 generally provided better estimates than an m of 1. Only for the combined layers, i.e., the entire dentate gyrus, could better CE estimates be obtained using an m of 1. The orientation of the sections impacted CE sizes. Frontal (coronal) sections are typically most efficient, providing the smallest CEs for a given amount of work. Applying the estimator to 3D-reconstructed layers and using very intense sampling, we observed CE size plots with m = 0 to m = 1 transitions that should also be expected but are not often observed in real section series. The data we present also allow the reader to approximate the sampling intervals in frontal, horizontal or sagittal sections that provide CEs of specified sizes for the layers of the mouse dentate gyrus.

  4. Large Sample Neutron Activation Analysis of Heterogeneous Samples

    International Nuclear Information System (INIS)

    Stamatelatos, I.E.; Vasilopoulou, T.; Tzika, F.

    2018-01-01

    A Large Sample Neutron Activation Analysis (LSNAA) technique was developed for non-destructive analysis of heterogeneous bulk samples. The technique incorporated collimated scanning and combined experimental measurements with Monte Carlo simulations to identify inhomogeneities in large volume samples and to correct for their effect on the interpretation of gamma-spectrometry data. Corrections were applied for the effects of neutron self-shielding, gamma-ray attenuation, the geometrical factor and heterogeneous activity distribution within the sample. A benchmark experiment was performed to investigate the effect of heterogeneity on the accuracy of LSNAA. Moreover, a ceramic vase was analyzed as a whole, demonstrating the feasibility of the technique. The LSNAA results were compared against results obtained by INAA and a satisfactory agreement between the two methods was observed. This study showed that LSNAA is a technique capable of performing accurate non-destructive, multi-elemental compositional analysis of heterogeneous objects. It also revealed the great potential of the technique for the analysis of precious objects and artefacts that need to be preserved intact and cannot be damaged for sampling purposes. (author)

  5. Sampling in practice

    DEFF Research Database (Denmark)

    Esbensen, Kim Harry; Petersen, Lars

    2005-01-01

    A basic knowledge of the Theory of Sampling (TOS) and a set of only eight sampling unit operations is all the practical sampler needs to ensure representativeness of samples extracted from all kinds of lots: production batches, truckloads, barrels, sub-division in the laboratory, sampling...... in nature and in the field (environmental sampling, forestry, geology, biology), from raw materials or manufacturing processes, etc. Here we can only give a brief introduction to the Fundamental Sampling Principle (FSP) and these eight Sampling Unit Operations (SUOs). Always respecting FSP and invoking only...... the necessary SUOs (dependent on the practical situation) is the only prerequisite needed for eliminating all sampling bias and simultaneously minimizing sampling variance; it is, in addition, a sure guarantee for making the final analytical results trustworthy. No reliable conclusions can be made unless......

  6. Effective sample labeling

    International Nuclear Information System (INIS)

    Rieger, J.T.; Bryce, R.W.

    1990-01-01

    Ground-water samples collected for hazardous-waste and radiological monitoring have come under strict regulatory and quality assurance requirements as a result of laws such as the Resource Conservation and Recovery Act. To comply with these laws, the labeling system used to identify environmental samples had to be upgraded to ensure proper handling and to protect collection personnel from exposure to sample contaminants and sample preservatives. The sample label now used at the Pacific Northwest Laboratory is a complete sample document. In the event other paperwork on a labeled sample were lost, the necessary information could be found on the label.

  7. On the Sampling

    OpenAIRE

    Güleda Doğan

    2017-01-01

    This editorial is on statistical sampling, which is one of the two most important reasons for editorial rejection from our journal, Turkish Librarianship. The stages of quantitative research, the stage at which sampling takes place, the importance of sampling for a research study, deciding on sample size, and sampling methods are summarised briefly.

  8. Sampling of ore

    International Nuclear Information System (INIS)

    Boehme, R.C.; Nicholas, B.L.

    1987-01-01

    This invention relates to a method of and apparatus for ore sampling. The method includes the steps of periodically removing a sample of the output material of a sorting machine, weighing each sample so that each is of the same weight, measuring a characteristic such as the radioactivity, magnetivity or the like of each sample, subjecting at least an equal portion of each sample to chemical analysis to determine the mineral content of the sample, and comparing the characteristic measurement with the mineral content of the chemically analysed portion of the sample to determine the characteristic/mineral ratio of the sample. The apparatus includes an ore sample collector, a deflector for deflecting a sample of ore particles from the output of an ore sorter into the collector, and means for moving the deflector from a first position, in which it is clear of the particle path from the sorter, to a second position, in which it is in the particle path, at predetermined time intervals and for predetermined time periods, to deflect the sample particles into the collector. The apparatus conveniently includes an ore crusher for comminuting the sample particles, a sample hopper, means for weighing the hopper, a detector in the hopper for measuring a characteristic such as radioactivity, magnetivity or the like of particles in the hopper, a discharge outlet from the hopper, and means for feeding the particles from the collector to the crusher and then to the hopper.

  9. Are most samples of animals systematically biased? Consistent individual trait differences bias samples despite random sampling.

    Science.gov (United States)

    Biro, Peter A

    2013-02-01

    Sampling animals from the wild for study is something nearly every biologist has done, but despite our best efforts to obtain random samples of animals, 'hidden' trait biases may still exist. For example, consistent behavioral traits can affect trappability/catchability independent of obvious factors such as size and gender, and these traits are often correlated with other repeatable physiological and/or life history traits. If so, systematic sampling bias may exist for any of these traits. The extent to which this is a problem, of course, depends on the magnitude of bias, which is presently unknown because the underlying trait distributions in populations are usually unknown, or unknowable. Indeed, our present knowledge about sampling bias comes from samples (not complete population censuses), which can possess bias to begin with. I had the unique opportunity to create naturalized populations of fish by seeding each of four small fishless lakes with equal densities of slow-, intermediate-, and fast-growing fish. Using sampling methods that are not size-selective, I observed that fast-growing fish were up to two times more likely to be sampled than slower-growing fish. This indicates substantial and systematic bias with respect to an important life history trait (growth rate). If correlations between behavioral, physiological and life-history traits are as widespread as the literature suggests, then many animal samples may be systematically biased with respect to these traits (e.g., when collecting animals for laboratory use), which could affect our inferences about population structure and abundance. I conclude with a discussion on ways to minimize sampling bias for particular physiological/behavioral/life-history types within animal populations.

  10. Survey research with a random digit dial national mobile phone sample in Ghana: Methods and sample quality

    Science.gov (United States)

    Sefa, Eunice; Adimazoya, Edward Akolgo; Yartey, Emmanuel; Lenzi, Rachel; Tarpo, Cindy; Heward-Mills, Nii Lante; Lew, Katherine; Ampeh, Yvonne

    2018-01-01

    Introduction Generating a nationally representative sample in low and middle income countries typically requires resource-intensive household level sampling with door-to-door data collection. High mobile phone penetration rates in developing countries provide new opportunities for alternative sampling and data collection methods, but there is limited information about response rates and sample biases in coverage and nonresponse using these methods. We utilized data from an interactive voice response, random-digit dial, national mobile phone survey in Ghana to calculate standardized response rates and assess representativeness of the obtained sample. Materials and methods The survey methodology was piloted in two rounds of data collection. The final survey included 18 demographic, media exposure, and health behavior questions. Call outcomes and response rates were calculated according to the American Association of Public Opinion Research guidelines. Sample characteristics, productivity, and costs per interview were calculated. Representativeness was assessed by comparing data to the Ghana Demographic and Health Survey and the National Population and Housing Census. Results The survey was fielded during a 27-day period in February-March 2017. There were 9,469 completed interviews and 3,547 partial interviews. Response, cooperation, refusal, and contact rates were 31%, 81%, 7%, and 39% respectively. Twenty-three calls were dialed to produce an eligible contact: nonresponse was substantial due to the automated calling system and dialing of many unassigned or non-working numbers. Younger, urban, better educated, and male respondents were overrepresented in the sample. Conclusions The innovative mobile phone data collection methodology yielded a large sample in a relatively short period. Response rates were comparable to other surveys, although substantial coverage bias resulted from fewer women, rural, and older residents completing the mobile phone survey in

  11. Survey research with a random digit dial national mobile phone sample in Ghana: Methods and sample quality.

    Science.gov (United States)

    L'Engle, Kelly; Sefa, Eunice; Adimazoya, Edward Akolgo; Yartey, Emmanuel; Lenzi, Rachel; Tarpo, Cindy; Heward-Mills, Nii Lante; Lew, Katherine; Ampeh, Yvonne

    2018-01-01

    Generating a nationally representative sample in low and middle income countries typically requires resource-intensive household level sampling with door-to-door data collection. High mobile phone penetration rates in developing countries provide new opportunities for alternative sampling and data collection methods, but there is limited information about response rates and sample biases in coverage and nonresponse using these methods. We utilized data from an interactive voice response, random-digit dial, national mobile phone survey in Ghana to calculate standardized response rates and assess representativeness of the obtained sample. The survey methodology was piloted in two rounds of data collection. The final survey included 18 demographic, media exposure, and health behavior questions. Call outcomes and response rates were calculated according to the American Association of Public Opinion Research guidelines. Sample characteristics, productivity, and costs per interview were calculated. Representativeness was assessed by comparing data to the Ghana Demographic and Health Survey and the National Population and Housing Census. The survey was fielded during a 27-day period in February-March 2017. There were 9,469 completed interviews and 3,547 partial interviews. Response, cooperation, refusal, and contact rates were 31%, 81%, 7%, and 39% respectively. Twenty-three calls were dialed to produce an eligible contact: nonresponse was substantial due to the automated calling system and dialing of many unassigned or non-working numbers. Younger, urban, better educated, and male respondents were overrepresented in the sample. The innovative mobile phone data collection methodology yielded a large sample in a relatively short period. Response rates were comparable to other surveys, although substantial coverage bias resulted from fewer women, rural, and older residents completing the mobile phone survey in comparison to household surveys. Random digit dialing of mobile

  12. Survey research with a random digit dial national mobile phone sample in Ghana: Methods and sample quality.

    Directory of Open Access Journals (Sweden)

    Kelly L'Engle

    Full Text Available Generating a nationally representative sample in low and middle income countries typically requires resource-intensive household level sampling with door-to-door data collection. High mobile phone penetration rates in developing countries provide new opportunities for alternative sampling and data collection methods, but there is limited information about response rates and sample biases in coverage and nonresponse using these methods. We utilized data from an interactive voice response, random-digit dial, national mobile phone survey in Ghana to calculate standardized response rates and assess representativeness of the obtained sample.The survey methodology was piloted in two rounds of data collection. The final survey included 18 demographic, media exposure, and health behavior questions. Call outcomes and response rates were calculated according to the American Association of Public Opinion Research guidelines. Sample characteristics, productivity, and costs per interview were calculated. Representativeness was assessed by comparing data to the Ghana Demographic and Health Survey and the National Population and Housing Census.The survey was fielded during a 27-day period in February-March 2017. There were 9,469 completed interviews and 3,547 partial interviews. Response, cooperation, refusal, and contact rates were 31%, 81%, 7%, and 39% respectively. Twenty-three calls were dialed to produce an eligible contact: nonresponse was substantial due to the automated calling system and dialing of many unassigned or non-working numbers. Younger, urban, better educated, and male respondents were overrepresented in the sample.The innovative mobile phone data collection methodology yielded a large sample in a relatively short period. Response rates were comparable to other surveys, although substantial coverage bias resulted from fewer women, rural, and older residents completing the mobile phone survey in comparison to household surveys. Random digit
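
    The AAPOR outcome rates cited in the three records above are simple functions of the final call-disposition counts. The sketch below computes simplified AAPOR-style rates; it omits the eligibility adjustment ('e') applied in the full guidelines, and the counts are hypothetical rather than the Ghana study's actual disposition table.

    ```python
    def aapor_rates(I, P, R, NC, O, UH):
        """Simplified AAPOR-style outcome rates (no eligibility adjustment 'e').
        I=complete, P=partial, R=refusal, NC=non-contact, O=other,
        UH=cases of unknown eligibility."""
        eligible = I + P + R + NC + O
        response    = I / (eligible + UH)
        cooperation = I / (I + P + R + O)
        refusal     = R / (eligible + UH)
        contact     = (I + P + R + O) / (eligible + UH)
        return response, cooperation, refusal, contact

    # hypothetical call-outcome counts, not the Ghana study's disposition codes
    rates = aapor_rates(I=9469, P=3547, R=2100, NC=12000, O=500, UH=3000)
    print([round(r, 2) for r in rates])
    ```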

  13. A normative inference approach for optimal sample sizes in decisions from experience

    Science.gov (United States)

    Ostwald, Dirk; Starke, Ludger; Hertwig, Ralph

    2015-01-01

    “Decisions from experience” (DFE) refers to a body of work that emerged in research on behavioral decision making over the last decade. One of the major experimental paradigms employed to study experience-based choice is the “sampling paradigm,” which serves as a model of decision making under limited knowledge about the statistical structure of the world. In this paradigm respondents are presented with two payoff distributions, which, in contrast to standard approaches in behavioral economics, are specified not in terms of explicit outcome-probability information, but by the opportunity to sample outcomes from each distribution without economic consequences. Participants are encouraged to explore the distributions until they feel confident enough to decide from which they would prefer to draw in a final trial involving real monetary payoffs. One commonly employed measure to characterize the behavior of participants in the sampling paradigm is the sample size, that is, the number of outcome draws which participants choose to obtain from each distribution prior to terminating sampling. A natural question that arises in this context concerns the “optimal” sample size, which could be used as a normative benchmark to evaluate human sampling behavior in DFE. In this theoretical study, we relate the DFE sampling paradigm to the classical statistical decision theoretic literature and, under a probabilistic inference assumption, evaluate optimal sample sizes for DFE. In our treatment we go beyond analytically established results by showing how the classical statistical decision theoretic framework can be used to derive optimal sample sizes under arbitrary, but numerically evaluable, constraints. Finally, we critically evaluate the value of deriving optimal sample sizes under this framework as testable predictions for the experimental study of sampling behavior in DFE. PMID:26441720
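
    A toy version of the question the paper formalizes can be set up in a few lines: for two Bernoulli payoff distributions with uniform priors and a fixed cost per free draw, the expected utility of taking n draws from each option can be estimated by Monte Carlo and maximized over n. This is an illustrative instantiation under assumed priors and costs, not the paper's decision-theoretic derivation.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def expected_utility(n, cost=0.005, n_sim=20_000):
        """Monte Carlo expected utility of n free draws from each of two
        Bernoulli payoff distributions (uniform priors), then choosing the
        option with the higher observed mean for one real payoff; sampling
        effort is charged at `cost` per draw."""
        p = rng.random((n_sim, 2))            # true payoff probabilities
        means = rng.binomial(n, p) / n        # observed sample means
        choice = np.argmax(means, axis=1)
        payoff = p[np.arange(n_sim), choice]
        return payoff.mean() - cost * 2 * n

    utils = {n: expected_utility(n) for n in (1, 2, 5, 10, 20, 50)}
    print(max(utils, key=utils.get), utils)   # sample size with highest utility
    ```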

  14. Laser sampling

    International Nuclear Information System (INIS)

    Gorbatenko, A A; Revina, E I

    2015-01-01

    The review is devoted to the major advances in laser sampling. The advantages and drawbacks of the technique are considered. Specific features of combinations of laser sampling with various instrumental analytical methods, primarily inductively coupled plasma mass spectrometry, are discussed. Examples of practical implementation of hybrid methods involving laser sampling as well as corresponding analytical characteristics are presented. The bibliography includes 78 references

  15. Urine sampling techniques in symptomatic primary-care patients

    DEFF Research Database (Denmark)

    Holm, Anne; Aabenhus, Rune

    2016-01-01

    Background: Choice of urine sampling technique in urinary tract infection may impact diagnostic accuracy and thus lead to possible over- or undertreatment. Currently no evidence-based consensus exists regarding correct sampling technique of urine from women with symptoms of urinary tract infection...... a randomized or paired design to compare the result of urine culture obtained with two or more collection techniques in adult, female, non-pregnant patients with symptoms of urinary tract infection. We evaluated quality of the studies and compared accuracy based on dichotomized outcomes. Results: We included...... in infection rate between mid-stream-clean-catch, mid-stream-urine and random samples. Conclusions: At present, no evidence suggests that sampling technique affects the accuracy of the microbiological diagnosis in non-pregnant women with symptoms of urinary tract infection in primary care. However......

  16. Effects of sample size on robustness and prediction accuracy of a prognostic gene signature

    Directory of Open Access Journals (Sweden)

    Kim Seon-Young

    2009-05-01

    Full Text Available Abstract Background: Limited overlap between independently developed gene signatures and poor inter-study applicability of gene signatures are two major concerns raised in the development of microarray-based prognostic gene signatures. One recent study suggested that thousands of samples are needed to generate a robust prognostic gene signature. Results: A data set of 1,372 samples was generated by combining eight breast cancer gene expression data sets produced using the same microarray platform and, using this data set, the effects of varying sample sizes on several performance measures of a prognostic gene signature were investigated. The overlap between independently developed gene signatures increased linearly with more samples, attaining an average overlap of 16.56% with 600 samples. The concordance between outcomes predicted by different gene signatures also increased with more samples, up to 94.61% with 300 samples. The accuracy of outcome prediction also increased with more samples. Finally, analysis using only Estrogen Receptor-positive (ER+) patients attained higher prediction accuracy than analysis using both ER+ and ER- patients, suggesting that sub-type-specific analysis can lead to the development of better prognostic gene signatures. Conclusion: Increasing sample sizes generated a gene signature with better stability, better concordance in outcome prediction, and better prediction accuracy. However, the degree of performance improvement from the increased sample size differed between the degree of overlap and the degree of concordance in outcome prediction, suggesting that the sample size required for a study should be determined according to the specific aims of the study.

  17. Sample Transport for a European Sample Curation Facility

    Science.gov (United States)

    Berthoud, L.; Vrublevskis, J. B.; Bennett, A.; Pottage, T.; Bridges, J. C.; Holt, J. M. C.; Dirri, F.; Longobardo, A.; Palomba, E.; Russell, S.; Smith, C.

    2018-04-01

    This work has looked at the recovery of a Mars Sample Return capsule once it arrives on Earth. It covers possible landing sites, planetary protection requirements, and transportation from the landing site to a European Sample Curation Facility.

  18. Estimating Sample Size for Usability Testing

    Directory of Open Access Journals (Sweden)

    Alex Cazañas

    2017-02-01

    Full Text Available One strategy used to assure that an interface meets user requirements is to conduct usability testing. When conducting such testing, one of the unknowns is sample size. Since extensive testing is costly, minimizing the number of participants can contribute greatly to successful resource management of a project. Even though a significant number of models have been proposed to estimate sample size in usability testing, there is still no consensus on the optimal size. Several studies claim that 3 to 5 users suffice to uncover 80% of problems in a software interface. However, many other studies challenge this assertion. This study analyzed data collected from the user testing of a web application to verify the rule of thumb, commonly known as the “magic number 5”. The outcomes of the analysis showed that the 5-user rule significantly underestimates the required sample size to achieve reasonable levels of problem detection.
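
    The 5-user rule rests on the classic cumulative discovery model, in which the expected share of problems found by n users is 1 - (1 - p)^n for a per-user detection probability p. The sketch below shows why the rule holds at the often-quoted p of about 0.31 but breaks down for harder-to-hit problems; the p values are illustrative.

    ```python
    def discovery_rate(p, n):
        """Expected share of problems found by n users when each user
        independently encounters a given problem with probability p."""
        return 1 - (1 - p) ** n

    for p in (0.31, 0.15, 0.05):  # 0.31 is the classic average detectability estimate
        print(p, [round(discovery_rate(p, n), 2) for n in (3, 5, 10, 20)])
    # at p = 0.31, five users find ~84% of problems; at p = 0.05, only ~23%
    ```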

  19. Preeminence and prerequisites of sample size calculations in clinical trials

    Directory of Open Access Journals (Sweden)

    Richa Singhal

    2015-01-01

    Full Text Available The key components when planning a clinical study are the study design, study duration, and sample size. These features are an integral part of planning a clinical trial efficiently, ethically, and cost-effectively. This article describes some of the prerequisites for sample size calculation. It also explains that sample size calculation differs across study designs. The article describes in detail the sample size calculation for a randomized controlled trial when the primary outcome is a continuous variable and when it is a proportion or a qualitative variable.
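
    The two calculations the article walks through follow the standard normal-approximation formulas. A minimal sketch is given below, using the usual textbook expressions; the effect sizes in the example are illustrative.

    ```python
    import math
    from statistics import NormalDist

    def n_per_group_continuous(delta, sd, alpha=0.05, power=0.80):
        """Per-group n for a parallel-group trial with a continuous outcome:
        n = 2 * (z_{1-alpha/2} + z_{power})^2 * sd^2 / delta^2."""
        z = NormalDist().inv_cdf
        return math.ceil(2 * (z(1 - alpha / 2) + z(power)) ** 2 * sd ** 2 / delta ** 2)

    def n_per_group_proportion(p1, p2, alpha=0.05, power=0.80):
        """Per-group n when the primary outcome is a proportion."""
        z = NormalDist().inv_cdf
        num = (z(1 - alpha / 2) + z(power)) ** 2 * (p1 * (1 - p1) + p2 * (1 - p2))
        return math.ceil(num / (p1 - p2) ** 2)

    # e.g., detect a 5-point difference (SD 10), or an improvement from 25% to 40%
    print(n_per_group_continuous(delta=5, sd=10))  # -> 63 per group
    print(n_per_group_proportion(0.40, 0.25))      # -> 150 per group
    ```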

  20. A simple vibrating sample magnetometer for macroscopic samples

    Science.gov (United States)

    Lopez-Dominguez, V.; Quesada, A.; Guzmán-Mínguez, J. C.; Moreno, L.; Lere, M.; Spottorno, J.; Giacomone, F.; Fernández, J. F.; Hernando, A.; García, M. A.

    2018-03-01

    We here present a simple model of a vibrating sample magnetometer (VSM). The system allows recording magnetization curves at room temperature with a resolution of the order of 0.01 emu and is appropriate for macroscopic samples. The setup can be mounted in different configurations depending on the requirements of the sample to be measured (mass, saturation magnetization, saturation field, etc.). We also include examples of curves obtained with our setup and comparison curves measured with a standard commercial VSM that confirm the reliability of our device.

  1. Optimal sampling designs for large-scale fishery sample surveys in Greece

    Directory of Open Access Journals (Sweden)

    G. BAZIGOS

    2007-12-01

    The paper deals with the optimization of the following three large-scale sample surveys: the biological sample survey of commercial landings (BSCL), the experimental fishing sample survey (EFSS), and the commercial landings and effort sample survey (CLES).

  2. Attention Deficit Hyperactivity Disorder Symptoms, Comorbidities, Substance Use, and Social Outcomes among Men and Women in a Canadian Sample

    Directory of Open Access Journals (Sweden)

    Evelyn Vingilis

    2015-01-01

    Full Text Available Background. Attention deficit hyperactivity disorder (ADHD) is a neurodevelopmental disorder that can persist into adolescence and adulthood. Aim. To examine the prevalence of ADHD symptoms and correlates in a representative sample of adults 18 years and older living in Ontario, Canada. Method. We used the Centre for Addiction and Mental Health Monitor, an ongoing cross-sectional telephone survey, to examine the relationships between positive ADHD symptoms and comorbidities, substance use, medication use, social outcomes, and sociodemographics. Results. Of 4014 residents sampled in 2011-2012, 3.30% (2.75%–3.85%) screened positively for ADHD symptoms (women = 3.6%; men = 3.0%). For men, distress, antisocial symptoms, cocaine use, antianxiety medication use, antidepressant medication use, and criminal offence arrest were associated with a positive ADHD screen. For women, distress, cocaine use, antianxiety medication use, antidepressant medication use, pain medication use, and motor vehicle collision in the past year were associated with a positive ADHD screen. Conclusions. ADHD symptoms are associated with adverse medical and social outcomes that are in some cases gender specific.

  3. Fluidic sampling

    International Nuclear Information System (INIS)

    Houck, E.D.

    1992-01-01

    This paper covers the development of the fluidic sampler and its testing in a fluidic transfer system. The major findings are as follows. Fluidic jet samplers can dependably produce unbiased samples of acceptable volume. The fluidic transfer system with a fluidic sampler in-line will transfer water to a net lift of 37.2–39.9 feet at an average rate of 0.02–0.05 gpm (77–192 cc/min). The fluidic sample system circulation rate compares very favorably with the normal 0.016–0.026 gpm (60–100 cc/min) circulation rate that is commonly produced for this lift and solution with the jet-assisted airlift sample system normally used at ICPP. The volume of the sample taken with a fluidic sampler depends on the motive pressure to the fluidic sampler, the sample bottle size and the fluidic sampler jet characteristics. The fluidic sampler should be supplied with fluid at a motive pressure of 140–150 percent of the peak vacuum-producing motive pressure for the jet in the sampler. Fluidic transfer systems should be operated by emptying a full pumping chamber to nearly empty or empty during the pumping cycle; this maximizes the solution transfer rate.

  4. Patient activation and disparate health care outcomes in a racially diverse sample of chronically ill older adults.

    Science.gov (United States)

    Ryvicker, Miriam; Peng, Timothy R; Feldman, Penny Hollander

    2012-11-01

    The Patient Activation Measure (PAM) assesses people's ability to self-manage their health. Variations in PAM score have been linked with health behaviors, outcomes, and potential disparities. This study assessed the relative impacts of activation, socio-demographic and clinical factors on health care outcomes in a racially diverse sample of chronically ill, elderly homecare patients. Using survey and administrative data from 249 predominantly non-White patients, logistic regression was conducted to examine the effects of activation level and patient characteristics on the likelihood of subsequent hospitalization and emergency department (ED) use. Activation was not a significant predictor of hospitalization or ED use in adjusted models. Non-Whites were more likely than Whites to have a hospitalization or ED visit. Obesity was a strong predictor of both outcomes. Further research should examine potential sources of disadvantage among chronically ill homecare patients to design effective interventions to reduce health disparities in this population.

  5. AND/OR Importance Sampling

    OpenAIRE

    Gogate, Vibhav; Dechter, Rina

    2012-01-01

    The paper introduces AND/OR importance sampling for probabilistic graphical models. In contrast to importance sampling, AND/OR importance sampling caches samples in the AND/OR space and then extracts a new sample mean from the stored samples. We prove that AND/OR importance sampling may have lower variance than importance sampling; thereby providing a theoretical justification for preferring it over importance sampling. Our empirical evaluation demonstrates that AND/OR importance sampling is ...
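
    For readers unfamiliar with the baseline the paper improves on, plain importance sampling reweights draws from a proposal distribution by the ratio of target to proposal densities. The sketch below estimates the rare-event probability P(Z > 4) for a standard normal, where naive Monte Carlo fails; it illustrates ordinary importance sampling only, not the AND/OR caching scheme of the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Estimate P(Z > 4) for Z ~ N(0,1), a rare event for naive Monte Carlo.
    n = 100_000
    z = rng.standard_normal(n)
    naive = (z > 4).mean()                      # almost always 0 at this n

    # Proposal: shifted exponential on [4, inf); weight = target pdf / proposal pdf.
    x = 4 + rng.exponential(1.0, n)
    target = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)
    proposal = np.exp(-(x - 4))
    importance = np.mean(target / proposal)

    print(naive, importance)                    # true value ~3.17e-5
    ```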

  6. Conservative Sample Size Determination for Repeated Measures Analysis of Covariance.

    Science.gov (United States)

    Morgan, Timothy M; Case, L Douglas

    2013-07-05

    In the design of a randomized clinical trial with one pre-randomization and multiple post-randomization assessments of the outcome variable, one needs to account for the repeated measures in determining the appropriate sample size. Unfortunately, one seldom has a good estimate of the variance of the outcome measure, let alone the correlations among the measurements over time. We show how sample sizes can be calculated by making conservative assumptions regarding the correlations for a variety of covariance structures. The most conservative choice for the correlation depends on the covariance structure and the number of repeated measures. In the absence of good estimates of the correlations, the sample size is often based on a two-sample t-test, making the 'ultra' conservative and unrealistic assumption that there are zero correlations between the baseline and follow-up measures while at the same time assuming there are perfect correlations between the follow-up measures. Compared to the case of taking a single measurement, substantial savings in sample size can be realized by accounting for the repeated measures, even with very conservative assumptions regarding the parameters of the assumed correlation matrix. Assuming compound symmetry, the sample size from the two-sample t-test calculation can be reduced at least 44%, 56%, and 61% for repeated measures analysis of covariance by taking 2, 3, and 4 follow-up measures, respectively. The results offer a rational basis for determining a fairly conservative, yet efficient, sample size for clinical trials with repeated measures and a baseline value.
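
    The quoted 44%, 56%, and 61% reductions can be reproduced by maximizing the baseline-adjusted ANCOVA variance multiplier over the compound-symmetry correlation. The sketch below assumes the Frison-Pocock form of that multiplier for one baseline and k follow-up measures; it is an illustration consistent with the abstract's numbers, not the authors' own code.

    ```python
    import numpy as np

    def variance_factor(k, rho):
        """Frison-Pocock variance multiplier for baseline-adjusted analysis of
        covariance with one baseline and k compound-symmetry follow-up measures,
        relative to a single post-treatment measurement."""
        return (1 + (k - 1) * rho) / k - rho ** 2

    rhos = np.linspace(0, 1, 10_001)
    for k in (2, 3, 4):
        worst = variance_factor(k, rhos).max()  # most conservative correlation
        print(k, f"reduction vs two-sample t-test: {1 - worst:.0%}")
    # prints 44%, 56%, 61% for k = 2, 3, 4
    ```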

  7. Compressive sampling of polynomial chaos expansions: Convergence analysis and sampling strategies

    International Nuclear Information System (INIS)

    Hampton, Jerrad; Doostan, Alireza

    2015-01-01

    Sampling orthogonal polynomial bases via Monte Carlo is of interest for uncertainty quantification of models with random inputs, using Polynomial Chaos (PC) expansions. It is known that bounding a probabilistic parameter, referred to as coherence, yields a bound on the number of samples necessary to identify coefficients in a sparse PC expansion via solution to an ℓ1-minimization problem. Utilizing results for orthogonal polynomials, we bound the coherence parameter for polynomials of Hermite and Legendre type under their respective natural sampling distribution. In both polynomial bases we identify an importance sampling distribution which yields a bound with weaker dependence on the order of the approximation. For more general orthonormal bases, we propose the coherence-optimal sampling: a Markov Chain Monte Carlo sampling, which directly uses the basis functions under consideration to achieve a statistical optimality among all sampling schemes with identical support. We demonstrate these different sampling strategies numerically in both high-order and high-dimensional, manufactured PC expansions. In addition, the quality of each sampling method is compared in the identification of solutions to two differential equations, one with a high-dimensional random input and the other with a high-order PC expansion. In both cases, the coherence-optimal sampling scheme leads to similar or considerably improved accuracy.

  8. Compressive sampling of polynomial chaos expansions: Convergence analysis and sampling strategies

    Science.gov (United States)

    Hampton, Jerrad; Doostan, Alireza

    2015-01-01

    Sampling orthogonal polynomial bases via Monte Carlo is of interest for uncertainty quantification of models with random inputs, using Polynomial Chaos (PC) expansions. It is known that bounding a probabilistic parameter, referred to as coherence, yields a bound on the number of samples necessary to identify coefficients in a sparse PC expansion via solution to an ℓ1-minimization problem. Utilizing results for orthogonal polynomials, we bound the coherence parameter for polynomials of Hermite and Legendre type under their respective natural sampling distribution. In both polynomial bases we identify an importance sampling distribution which yields a bound with weaker dependence on the order of the approximation. For more general orthonormal bases, we propose the coherence-optimal sampling: a Markov Chain Monte Carlo sampling, which directly uses the basis functions under consideration to achieve a statistical optimality among all sampling schemes with identical support. We demonstrate these different sampling strategies numerically in both high-order and high-dimensional, manufactured PC expansions. In addition, the quality of each sampling method is compared in the identification of solutions to two differential equations, one with a high-dimensional random input and the other with a high-order PC expansion. In both cases, the coherence-optimal sampling scheme leads to similar or considerably improved accuracy.

  9. Sampling soils for 137Cs using various field-sampling volumes

    International Nuclear Information System (INIS)

    Nyhan, J.W.; Schofield, T.G.; White, G.C.; Trujillo, G.

    1981-10-01

    The sediments from a liquid effluent receiving area at the Los Alamos National Laboratory and soils from an intensive study area in the fallout pathway of Trinity were sampled for 137Cs using 25-, 500-, 2500-, and 12500-cm³ field sampling volumes. A highly replicated sampling program was used to determine mean concentrations and inventories of 137Cs at each site, as well as estimates of spatial, aliquoting, and counting variance components of the radionuclide data. The sampling methods were also analyzed as a function of the soil size fractions collected in each field sampling volume and of the total cost of the program for a given variation in the radionuclide survey results. Coefficients of variation (CV) of 137Cs inventory estimates ranged from 0.063 to 0.14 for Mortandad Canyon sediments, while CV values for Trinity soils ranged from 0.38 to 0.57. Spatial variance components of the 137Cs concentration data were usually found to be larger than either the aliquoting or counting variance estimates and were inversely related to field sampling volume at the Trinity intensive site. Subsequent optimization studies of the sampling schemes demonstrated that each aliquot should be counted once, and that only 2 to 4 aliquots out of as many as 30 collected need be assayed for 137Cs. The optimization studies showed that as sample costs increased to 45 man-hours of labor per sample, the variance of the mean 137Cs concentration decreased dramatically, but decreased very little with additional labor.

  10. Outcome Measurement Using Naturalistic Language Samples: A Feasibility Pilot Study Using Language Transcription Software and Speech and Language Therapy Assistants

    Science.gov (United States)

    Overton, Sarah; Wren, Yvonne

    2014-01-01

    The ultimate aim of intervention for children with language impairment is an improvement in their functional language skills. Baseline and outcome measurement of this is often problematic, however, and practitioners commonly resort to using formal assessments that may not adequately reflect the child's competence. Language sampling,…

  11. Choice of Sample Split in Out-of-Sample Forecast Evaluation

    DEFF Research Database (Denmark)

    Hansen, Peter Reinhard; Timmermann, Allan

    Out-of-sample tests of forecast performance depend on how a given data set is split into estimation and evaluation periods, yet no guidance exists on how to choose the split point. Empirical forecast evaluation results can therefore be difficult to interpret, particularly when several values......, while conversely the power of forecast evaluation tests is strongest with long out-of-sample periods. To deal with size distortions, we propose a test statistic that is robust to the effect of considering multiple sample split points. Empirical applications to predictability of stock returns...... and inflation demonstrate that out-of-sample forecast evaluation results can critically depend on how the sample split is determined....

  12. Sampling or gambling

    Energy Technology Data Exchange (ETDEWEB)

    Gy, P.M.

    1981-12-01

    Sampling can be compared to no other technique. A mechanical sampler must above all be selected according to its aptitude for suppressing or reducing all components of the sampling error. Sampling is said to be correct when it gives all elements making up the batch of matter submitted to sampling a uniform probability of being selected. A sampler must be correctly designed, built, installed, operated and maintained. When the conditions of sampling correctness are not strictly respected, the sampling error can no longer be controlled and can, unknown to the user, be unacceptably large: the sample is no longer representative. The implementation of an incorrect sampler is a form of gambling, and this paper intends to show that at this game the user is nearly always the loser in the long run. The users' and the manufacturers' interests may diverge, and the standards which should safeguard the users' interests very often fail to do so by tolerating or even recommending incorrect techniques such as the implementation of too-narrow cutters traveling too fast through the stream to be sampled.
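
    Gy's theory also gives a widely used closed form for the fundamental sampling error of a particulate lot. The sketch below implements one common statement of it, relative variance = f*g*l*c*d^3*(1/Ms - 1/ML), with shape factor f of about 0.5, granulometric factor g of about 0.25, liberation factor l, and mineralogical factor c; factor conventions vary between texts, and the example values are illustrative.

    ```python
    import math

    def fse_rel_std(d_cm, sample_mass_g, lot_mass_g, a, rho_min, rho_gangue,
                    f=0.5, g=0.25, liberation=1.0):
        """Gy-style fundamental sampling error (relative standard deviation):
        var = f*g*liberation*c*d^3 * (1/Ms - 1/ML), with mineralogical factor
        c = ((1-a)/a) * ((1-a)*rho_min + a*rho_gangue) for mineral mass fraction a."""
        c = ((1 - a) / a) * ((1 - a) * rho_min + a * rho_gangue)
        var = f * g * liberation * c * d_cm ** 3 * (1 / sample_mass_g - 1 / lot_mass_g)
        return math.sqrt(var)

    # e.g., a low-grade ore (mineral fraction 1e-4) crushed to 1 cm, 2 kg split from 1 t:
    # the relative error is enormous, illustrating why correct sampling protocols matter
    print(f"{fse_rel_std(1.0, 2000, 1_000_000, 1e-4, rho_min=7.5, rho_gangue=2.7):.0%}")
    ```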

  13. Soil sampling

    International Nuclear Information System (INIS)

    Fortunati, G.U.; Banfi, C.; Pasturenzi, M.

    1994-01-01

    This study attempts to survey the problems associated with techniques and strategies of soil sampling. Keeping in mind the well-defined objectives of a sampling campaign, the aim was to highlight the most important aspect, the representativeness of samples, as a function of the available resources. Particular emphasis was given to the techniques, and particularly to a description of the many types of samplers which are in use. The procedures and techniques employed during the investigations following the Seveso accident are described. (orig.)

  14. Sample representativeness verification of the FADN CZ farm business sample

    Directory of Open Access Journals (Sweden)

    Marie Prášilová

    2011-01-01

    Full Text Available Sample representativeness verification is one of the key stages of statistical work. After joining the European Union, the Czech Republic also joined the Farm Accountancy Data Network (FADN) system of the Union. This is a sample of bodies and companies doing business in agriculture. Detailed production and economic data on the results of the farming business are collected from that sample annually, and results for the entire population of the country's farms are then estimated and assessed. It is important, hence, that the sample be representative. Representativeness is to be assessed with respect to the number of farms included in the survey and also the degree of accordance of the measures and indices with the population. The paper deals with the statistical techniques and methods of FADN CZ sample representativeness verification, including the necessary sample-size statement procedure. The Czech farm population data were obtained from the Czech Statistical Office data bank.

  15. Present status of NMCC and sample preparation method for bio-samples

    International Nuclear Information System (INIS)

    Futatsugawa, S.; Hatakeyama, S.; Saitou, S.; Sera, K.

    1993-01-01

    In NMCC (Nishina Memorial Cyclotron Center) we are doing research on PET (Positron Emission Computed Tomography) in nuclear medicine and on PIXE (Particle Induced X-ray Emission) analysis, using a compactly designed small cyclotron. The NMCC facilities have been opened to researchers of other institutions since April 1993. The present status of NMCC is described. Bio-samples (medical samples, plants, animals and environmental samples) have mainly been analyzed by PIXE at NMCC. Small amounts of bio-samples for PIXE are decomposed quickly and easily in a sealed PTFE (polytetrafluoroethylene) vessel with a microwave oven. This sample preparation method for bio-samples is also described. (author)

  16. Adrenal Vein Sampling for Conn’s Syndrome: Diagnosis and Clinical Outcomes

    Directory of Open Access Journals (Sweden)

    Amy R. Deipolyi

    2015-06-01

    Full Text Available Adrenal vein sampling (AVS) is the gold standard test to determine unilateral causes of primary aldosteronism (PA). We retrospectively characterized our experience with AVS, including concordance of AVS results and imaging, and describe the approach for the PA patient in whom bilateral AVS is unsuccessful. We reviewed the medical records of 85 patients with PA, compared patients who were treated medically and surgically on pre-procedure presentation and post-treatment outcomes, and evaluated how technically unsuccessful AVS results were used in further patient management. Of the 92 AVS procedures performed in 85 patients, AVS was technically successful bilaterally in 58 (63%) of cases. An unsuccessful AVS either prompted a repeat AVS, or results from the contralateral side and from CT imaging were used to guide further therapy. Patients who were managed surgically with adrenalectomy had higher initial blood pressure and lower potassium levels compared with patients who were managed medically. Adrenalectomy resulted in significantly decreased blood pressure and normalization of potassium levels. AVS can identify surgically curable causes of PA but can be technically challenging. When one adrenal vein fails to be cannulated, results from the contralateral vein can be used in conjunction with imaging and clinical findings to suggest further management.

  17. Experimental and Sampling Design for the INL-2 Sample Collection Operational Test

    Energy Technology Data Exchange (ETDEWEB)

    Piepel, Gregory F.; Amidan, Brett G.; Matzke, Brett D.

    2009-02-16

    This report describes the experimental and sampling design developed to assess sampling approaches and methods for detecting contamination in a building and clearing the building for use after decontamination. An Idaho National Laboratory (INL) building will be contaminated with BG (Bacillus globigii, renamed Bacillus atrophaeus), a simulant for Bacillus anthracis (BA). The contamination, sampling, decontamination, and re-sampling will occur per the experimental and sampling design. This INL-2 Sample Collection Operational Test is being planned by the Validated Sampling Plan Working Group (VSPWG). The primary objectives are: 1) Evaluate judgmental and probabilistic sampling for characterization as well as probabilistic and combined (judgment and probabilistic) sampling approaches for clearance, 2) Conduct these evaluations for gradient contamination (from low or moderate down to absent or undetectable) for different initial concentrations of the contaminant, 3) Explore judgment composite sampling approaches to reduce sample numbers, 4) Collect baseline data to serve as an indication of the actual levels of contamination in the tests. A combined judgmental and random (CJR) approach uses Bayesian methodology to combine judgmental and probabilistic samples to make clearance statements of the form “X% confidence that at least Y% of an area does not contain detectable contamination” (X%/Y% clearance statements). The INL-2 experimental design has five test events, which 1) vary the floor of the INL building on which the contaminant will be released, 2) provide for varying the amount of contaminant released to obtain desired concentration gradients, and 3) investigate overt as well as covert release of contaminants. Desirable contaminant gradients would have moderate to low concentrations of contaminant in rooms near the release point, with concentrations down to zero in other rooms. Such gradients would provide a range of contamination levels to challenge the sampling

  18. C-Arm Computed Tomography-Assisted Adrenal Venous Sampling Improved Right Adrenal Vein Cannulation and Sampling Quality in Primary Aldosteronism.

    Science.gov (United States)

    Park, Chung Hyun; Hong, Namki; Han, Kichang; Kang, Sang Wook; Lee, Cho Rok; Park, Sungha; Rhee, Yumie

    2018-05-04

    Adrenal venous sampling (AVS) is a gold standard for subtype classification of primary aldosteronism (PA). However, this procedure has a high failure rate because of the anatomical difficulties in accessing the right adrenal vein. We investigated whether C-arm computed tomography-assisted AVS (C-AVS) could improve the success rate of adrenal sampling. A total of 156 patients diagnosed with PA who underwent AVS from May 2004 through April 2017 were included. Based on the medical records, we retrospectively compared the overall, left, and right catheterization success rates of adrenal veins during the periods without C-AVS (2004 to 2010, n=32) and with C-AVS (2011 to 2016, n=134). The primary outcome was adequate bilateral sampling, defined as a selectivity index (SI) >5. With C-AVS, the rate of adequate bilateral AVS increased from 40.6% to 88.7%, and C-AVS was an independent predictor of adequate bilateral sampling in the multivariate model (odds ratio, 9.01). C-AVS improved the overall success rate of AVS, possibly as a result of better catheterization of the right adrenal vein. Copyright © 2018 Korean Endocrine Society.
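
    The sampling-adequacy criterion in the abstract (selectivity index > 5) and the lateralization decision it feeds into are simple ratios of measured hormone levels. The sketch below uses common definitions (SI as adrenal-vein over peripheral cortisol, and a lateralization index from cortisol-corrected aldosterone) with hypothetical values; cutoffs such as LI >= 4 for unilateral disease vary by protocol and are not from this study.

    ```python
    def avs_indices(aldo_adrenal, cort_adrenal, cort_ivc):
        """Selectivity index (SI) for one adrenal vein sample (adrenal-vein
        cortisol over peripheral/IVC cortisol; this study required SI > 5),
        plus the cortisol-corrected aldosterone ratio used for lateralization."""
        si = cort_adrenal / cort_ivc
        a_c = aldo_adrenal / cort_adrenal
        return si, a_c

    # hypothetical values (aldosterone ng/dL, cortisol ug/dL), not study data
    si_r, ac_r = avs_indices(aldo_adrenal=2500, cort_adrenal=320, cort_ivc=18)
    si_l, ac_l = avs_indices(aldo_adrenal=90,  cort_adrenal=300, cort_ivc=18)
    lateralization = max(ac_r, ac_l) / min(ac_r, ac_l)  # LI >= 4 often read as unilateral
    print(round(si_r, 1), round(si_l, 1), round(lateralization, 1))
    ```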

  19. Sample collection and sample analysis plan in support of the 105-C/190-C concrete and soil sampling activities

    International Nuclear Information System (INIS)

    Marske, S.G.

    1996-07-01

    This sampling and analysis plan describes the sample collection and sample analysis in support of the 105-C water tunnels and 190-C main pumphouse concrete and soil sampling activities. These analytical data will be used to identify the radiological contamination and presence of hazardous materials to support the decontamination and disposal activities

  20. Sample size of the reference sample in a case-augmented study.

    Science.gov (United States)

    Ghosh, Palash; Dewanji, Anup

    2017-05-01

    The case-augmented study, in which a case sample is augmented with a reference (random) sample from the source population with only covariates information known, is becoming popular in different areas of applied science such as pharmacovigilance, ecology, and econometrics. In general, the case sample is available from some source (for example, hospital database, case registry, etc.); however, the reference sample is required to be drawn from the corresponding source population. The required minimum size of the reference sample is an important issue in this regard. In this work, we address the minimum sample size calculation and discuss related issues. Copyright © 2017 John Wiley & Sons, Ltd.

  1. Large sample neutron activation analysis of a reference inhomogeneous sample

    International Nuclear Information System (INIS)

    Vasilopoulou, T.; Athens National Technical University, Athens; Tzika, F.; Stamatelatos, I.E.; Koster-Ammerlaan, M.J.J.

    2011-01-01

    A benchmark experiment was performed for Neutron Activation Analysis (NAA) of a large inhomogeneous sample. The reference sample was developed in-house and consisted of an SiO2 matrix and an Al-Zn alloy 'inhomogeneity' body. Monte Carlo simulations were employed to derive appropriate correction factors for neutron self-shielding during irradiation as well as for self-attenuation of gamma rays and sample geometry during counting. The large sample neutron activation analysis (LSNAA) results were compared against reference values and the trueness of the technique was evaluated. An agreement within ±10% was observed between LSNAA and reference elemental mass values, for all matrix and inhomogeneity elements except samarium, provided that the inhomogeneity body was fully simulated. However, in cases where the inhomogeneity was treated as unknown, the results showed a reasonable agreement for most matrix elements, while large discrepancies were observed for the inhomogeneity elements. This study provided a quantification of the uncertainties associated with inhomogeneity in large sample analysis and contributed to the identification of the needs for future development of LSNAA facilities for analysis of inhomogeneous samples. (author)

  2. Influence of sampling depth and post-sampling analysis time on the ...

    African Journals Online (AJOL)

    Bacteriological analysis was carried out for samples taken at water depth and at 1, 6, 12 and 24 hours post-sampling. It was observed that the total and faecal coliform bacteria were significantly higher in the 3 m water depth samples than in the surface water samples (ANOVA, F = 59.41, 26.751, 9.82 (T.C); 46.41, 26.81, ...

  3. DNA Sampling Hook

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The DNA Sampling Hook is a significant improvement on a method of obtaining a tissue sample from a live fish in situ from an aquatic environment. A tissue sample...

  4. Laboratory Sampling Guide

    Science.gov (United States)

    2012-05-11

    environment, and by ingestion of foodstuffs that have incorporated C-14 by photosynthesis. Like tritium, C-14 is a very low energy beta emitter and is... bacterial growth and to minimize development of solids in the sample. • Properly identify each sample container with name, SSN, and collection start and... sampling in the same cardboard carton. The sample may be kept cool or frozen during collection to control odor and bacterial growth. • Once

  5. Sampling Development

    Science.gov (United States)

    Adolph, Karen E.; Robinson, Scott R.

    2011-01-01

    Research in developmental psychology requires sampling at different time points. Accurate depictions of developmental change provide a foundation for further empirical studies and theories about developmental mechanisms. However, overreliance on widely spaced sampling intervals in cross-sectional and longitudinal designs threatens the validity of…

  6. ExSample. A library for sampling Sudakov-type distributions

    Energy Technology Data Exchange (ETDEWEB)

    Plaetzer, Simon

    2011-08-15

    Sudakov-type distributions are at the heart of generating radiation in parton showers as well as contemporary NLO matching algorithms along the lines of the POWHEG algorithm. In this paper, the C++ library ExSample is introduced, which implements adaptive sampling of Sudakov-type distributions for splitting kernels which are in general only known numerically. Besides the evolution variable, the splitting kernels can depend on an arbitrary number of other degrees of freedom to be sampled, and any number of further parameters which are fixed on an event-by-event basis. (orig.)

  7. ExSample. A library for sampling Sudakov-type distributions

    International Nuclear Information System (INIS)

    Plaetzer, Simon

    2011-08-01

    Sudakov-type distributions are at the heart of generating radiation in parton showers as well as contemporary NLO matching algorithms along the lines of the POWHEG algorithm. In this paper, the C++ library ExSample is introduced, which implements adaptive sampling of Sudakov-type distributions for splitting kernels which are in general only known numerically. Besides the evolution variable, the splitting kernels can depend on an arbitrary number of other degrees of freedom to be sampled, and any number of further parameters which are fixed on an event-by-event basis. (orig.)
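
    The standard way to sample such Sudakov-type densities when the kernel is only known point-by-point is the veto algorithm, which ExSample generalizes with adaptive overestimates. A minimal fixed-overestimate sketch is given below; the kernel, cutoff, and overestimate are illustrative, and the library's adaptive machinery is not reproduced.

    ```python
    import math, random

    def next_emission(kernel, t_start, t_cut, k_max):
        """Veto algorithm for a Sudakov-type density
        p(t) = kernel(t) * exp(-integral_t^{t_start} kernel(s) ds), t < t_start,
        using a constant overestimate k_max >= kernel on (t_cut, t_start).
        Returns the sampled scale, or None if evolution falls below the cutoff."""
        t = t_start
        while True:
            t += math.log(random.random()) / k_max   # trial scale from overestimate
            if t < t_cut:
                return None                          # no resolvable emission
            if random.random() < kernel(t) / k_max:  # veto step corrects to true kernel
                return t

    # toy kernel, known only point-by-point (as for numerical splitting kernels)
    kernel = lambda t: 0.3 + 0.2 * math.sin(t) ** 2  # bounded above by k_max = 0.5
    print([next_emission(kernel, t_start=10.0, t_cut=1.0, k_max=0.5) for _ in range(5)])
    ```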

  8. Spherical sampling

    CERN Document Server

    Freeden, Willi; Schreiner, Michael

    2018-01-01

    This book presents, in a consistent and unified overview, results and developments in the field of today's spherical sampling, particularly as arising in the mathematical geosciences. Although the book often refers to original contributions, the authors have made them accessible to (graduate) students and scientists not only from mathematics but also from the geosciences and geoengineering. Building a library of topics in spherical sampling theory, it shows how advances in this theory lead to new discoveries in mathematical, geodetic and geophysical as well as other scientific branches like neuro-medicine. A must-read for everybody working in the area of spherical sampling.

  9. Treatment outcome for a sample of patients with Class II division 1 malocclusion treated at a regional hospital orthodontic department.

    LENUS (Irish Health Repository)

    Burden, D J

    1998-01-01

    This retrospective study assessed the outcome of orthodontic treatment of 264 patients with Class II division 1 malocclusion (overjet greater than 6 mm). The sample comprised patients who had completed their fixed appliance orthodontic treatment at a regional hospital orthodontic unit in the Republic of Ireland. The PAR Index (Peer Assessment Rating) was used to evaluate treatment outcome using before- and after-treatment study casts. The results revealed that treatment for this particular type of malocclusion was highly effective, with very few patients failing to benefit from their orthodontic treatment.

  10. An integrated and accessible sample data library for Mars sample return science

    Science.gov (United States)

    Tuite, M. L., Jr.; Williford, K. H.

    2015-12-01

    Over the course of the next decade or more, many thousands of geological samples will be collected and analyzed in a variety of ways by researchers at the Jet Propulsion Laboratory (California Institute of Technology) in order to facilitate discovery and contextualize observations made of Mars rocks both in situ and here on Earth if samples are eventually returned. Integration of data from multiple analyses of samples, including petrography, thin section and SEM imaging, isotope and organic geochemistry, XRF, XRD, and Raman spectrometry, is a challenge and a potential obstacle to discoveries that require supporting lines of evidence. We report the development of a web-accessible repository, the Sample Data Library (SDL), for the sample-based data that are generated by the laboratories and instruments that comprise JPL's Center for Analysis of Returned Samples (CARS), in order to facilitate collaborative interpretation of potential biosignatures in Mars-analog geological samples. The SDL is constructed using low-cost, open-standards-based Amazon Web Services (AWS), including web-accessible storage, relational database services, and a virtual web server. The data structure is sample-centered, with a shared registry for assigning unique identifiers to all samples, including International Geo-Sample Numbers. Both raw and derived data produced by instruments and post-processing workflows are automatically uploaded to online storage and linked via the unique identifiers. Through the web interface, users are able to find all the analyses associated with a single sample or search across features shared by multiple samples, sample localities, and analysis types. Planned features include more sophisticated search and analytical interfaces as well as data discoverability through NSF's EarthCube program.
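
    A sample-centered data structure of the kind described (one registry of unique sample identifiers, with every analysis row linking back to it) can be sketched with any relational store. The toy below uses an in-memory SQLite database; the table layout, the IGSN-like identifier, and the s3:// URIs are all hypothetical illustrations, not the SDL's actual schema.

    ```python
    import sqlite3

    # Minimal relational sketch of a sample-centered registry: every analysis
    # row points back to a registered sample via its unique identifier.
    con = sqlite3.connect(":memory:")
    con.executescript("""
    CREATE TABLE samples  (sample_id TEXT PRIMARY KEY, locality TEXT);
    CREATE TABLE analyses (analysis_id INTEGER PRIMARY KEY,
                           sample_id TEXT REFERENCES samples(sample_id),
                           method TEXT, data_uri TEXT);
    """)
    con.execute("INSERT INTO samples VALUES ('IEXYZ0001', 'Mars-analog outcrop A')")
    con.executemany("INSERT INTO analyses (sample_id, method, data_uri) VALUES (?,?,?)",
                    [("IEXYZ0001", "XRD",   "s3://bucket/xrd/0001.raw"),
                     ("IEXYZ0001", "Raman", "s3://bucket/raman/0001.csv")])

    # find all analyses associated with a single sample, across methods
    for row in con.execute("SELECT method, data_uri FROM analyses WHERE sample_id=?",
                           ("IEXYZ0001",)):
        print(row)
    ```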

  11. Towards Representative Metallurgical Sampling and Gold Recovery Testwork Programmes

    Directory of Open Access Journals (Sweden)

    Simon C. Dominy

    2018-05-01

    Full Text Available When developing a process flowsheet, the risks in achieving positive financial outcomes are minimised by ensuring representative metallurgical samples and high-quality testwork. The quality and type of samples used are as important as the testwork itself. The key characteristic required of any set of samples is that they represent a given domain and quantify its variability. There are those who think that stating that a sample(s) is representative makes it representative, without justification. There is a need to consider both (1) in-situ and (2) testwork sub-sample representativity. Early ore/waste characterisation and domain definition are required, so that sampling and testwork protocols can be designed to suit the style of mineralisation in question. The Theory of Sampling (TOS) provides an insight into the causes and magnitude of errors that may occur during the sampling of particulate materials (e.g., broken rock) and is wholly applicable to metallurgical sampling. Quality assurance/quality control (QAQC) is critical throughout all programmes. Metallurgical sampling and testwork should be fully integrated into geometallurgical studies. Traditional metallurgical testwork is critical for plant design and is an inherent part of geometallurgy. In a geometallurgical study, multiple spatially distributed small-scale tests are used as proxies for process parameters. These will be validated against traditional testwork results. This paper focusses on sampling and testwork for gold recovery determination. It aims to provide the reader with the background to move towards the design, implementation and reporting of representative and fit-for-purpose sampling and testwork programmes. While the paper does not intend to provide a definitive commentary, it critically assesses the hard-rock sampling methods used and their optimal collection and preparation. The need for representative sampling and quality testwork to avoid financial and intangible losses is

  12. 21 CFR 203.38 - Sample lot or control numbers; labeling of sample units.

    Science.gov (United States)

    2010-04-01

    ... numbers; labeling of sample units. (a) Lot or control number required on drug sample labeling and sample... identifying lot or control number that will permit the tracking of the distribution of each drug sample unit...

  13. Remote sampling and analysis of highly radioactive samples in shielded boxes

    International Nuclear Information System (INIS)

    Kirpikov, D.A.; Miroshnichenko, I.V.; Pykhteev, O.Yu.

    2010-01-01

    The sampling procedure used for highly radioactive coolant water is associated with a high risk of personnel irradiation and uncontrolled radioactive contamination. Remote sample manipulation with provision for proper radiation shielding is intended to enhance the safety of the sampling procedure. The sampling lines are located in an isolated compartment, a shielded box. Various equipment which enables remote or automatic sample manipulation is used for this purpose. The main issues in the development of shielded box equipment, intended for a wider range of remote chemical analyses and manipulation techniques for highly radioactive water samples, are considered in the paper. There were three principal directions of work: transfer of chemical analysis performed in the laboratory into the shielded box; wider use of computer-aided and remote techniques for highly radioactive sample manipulation inside the shielded box; and increased control over sampling and determination of thermal-hydraulic parameters of the coolant water in the sampling lines. The developed equipment and solutions enable remote chemical analysis in the restricted volume of the shielded box using ion-chromatographic, amperometric, fluorimetric, flow-injection, phototurbidimetric, conductometric and potentiometric methods. The extent of control performed in the shielded box is determined taking into account the requirements of the regulatory documents as well as the feasibility and cost of the technical adaptation of various methods to the shielded box conditions. The work resulted in highly precise determination of more than 15 indices of coolant water quality, performed in on-line mode in the shielded box. This averages to 80% of the total extent of control performed at the prototype reactor plants. The novel solutions for highly radioactive sample handling are implemented in the shielded box (for example, packaging, sample transportation to the laboratory, volume measurement). The shielded box is

  14. Testing a groundwater sampling tool: Are the samples representative?

    International Nuclear Information System (INIS)

    Kaback, D.S.; Bergren, C.L.; Carlson, C.A.; Carlson, C.L.

    1989-01-01

    A ground water sampling tool, the HydroPunch™, was tested at the Department of Energy's Savannah River Site in South Carolina to determine if representative ground water samples could be obtained without installing monitoring wells. Chemical analyses of ground water samples collected with the HydroPunch™ from various depths within a borehole were compared with chemical analyses of ground water from nearby monitoring wells. The site selected for the test was in the vicinity of a large coal storage pile and a coal pile runoff basin that was constructed to collect the runoff from the coal storage pile. Existing monitoring wells in the area indicate the presence of a ground water contaminant plume that: (1) contains elevated concentrations of trace metals; (2) has an extremely low pH; and (3) contains elevated concentrations of major cations and anions. Ground water samples collected with the HydroPunch™ provide an excellent estimate of ground water quality at discrete depths. Ground water chemical data collected from various depths using the HydroPunch™ can be averaged to simulate what a screened zone in a monitoring well would sample. The averaged depth-discrete data compared favorably with the data obtained from the nearby monitoring wells
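
    As a hedged illustration of the averaging step described above, the sketch below combines invented depth-discrete concentrations into a thickness-weighted mean, approximating what a screened monitoring-well interval would integrate.

```python
# Thickness-weighted averaging of depth-discrete sample results (values invented).
depth_intervals_m = [(10, 13), (13, 17), (17, 20)]  # sampled depth slices
concentration_mg_l = [4.2, 7.8, 5.1]                # analyte result per slice

thickness = [bottom - top for top, bottom in depth_intervals_m]
weighted_mean = sum(c * t for c, t in zip(concentration_mg_l, thickness)) / sum(thickness)
print(f"simulated screen-zone concentration: {weighted_mean:.2f} mg/L")  # 5.91 mg/L
```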

  15. Sample size methodology

    CERN Document Server

    Desu, M M

    2012-01-01

    One of the most important problems in designing an experiment or a survey is sample size determination and this book presents the currently available methodology. It includes both random sampling from standard probability distributions and from finite populations. Also discussed is sample size determination for estimating parameters in a Bayesian setting by considering the posterior distribution of the parameter and specifying the necessary requirements. The determination of the sample size is considered for ranking and selection problems as well as for the design of clinical trials. Appropria

  16. Novel Sample-handling Approach for XRD Analysis with Minimal Sample Preparation

    Science.gov (United States)

    Sarrazin, P.; Chipera, S.; Bish, D.; Blake, D.; Feldman, S.; Vaniman, D.; Bryson, C.

    2004-01-01

    Sample preparation and sample handling are among the most critical operations associated with X-ray diffraction (XRD) analysis. These operations require attention in a laboratory environment, but they become a major constraint in the deployment of XRD instruments for robotic planetary exploration. We are developing a novel sample handling system that dramatically relaxes the constraints on sample preparation by allowing characterization of coarse-grained material that would normally be impossible to analyze with conventional powder-XRD techniques.

  17. Types of non-probabilistic sampling used in marketing research. „Snowball” sampling

    OpenAIRE

    Manuela Rozalia Gabor

    2007-01-01

    A significant way of investigating a firm's market is statistical sampling. The sampling typology provides non-probabilistic models of gathering information, and this paper describes thorough information related to network sampling, named “snowball” sampling. This type of sampling enables the survey of occurrence forms concerning the decision power within an organisation and of the interpersonal relation network governing a certain collectivity, a certain consumer panel. The snowball s...

  18. Sampling procedure, receipt and conservation of water samples to determine environmental radioactivity

    International Nuclear Information System (INIS)

    Herranz, M.; Navarro, E.; Payeras, J.

    2009-01-01

    The present document informs about the essential goals, processes and contents that the subgroups Sampling and Samples Preparation and Conservation believe should be part of the procedure for correct sampling, receipt, conservation and preparation of samples of continental, marine and waste water before determining their radioactive content.

  19. Development and Demonstration of a Method to Evaluate Bio-Sampling Strategies Using Building Simulation and Sample Planning Software.

    Science.gov (United States)

    Dols, W Stuart; Persily, Andrew K; Morrow, Jayne B; Matzke, Brett D; Sego, Landon H; Nuffer, Lisa L; Pulsipher, Brent A

    2010-01-01

    In an effort to validate and demonstrate response and recovery sampling approaches and technologies, the U.S. Department of Homeland Security (DHS), along with several other agencies, has simulated a biothreat agent release within a facility at Idaho National Laboratory (INL) on two separate occasions in the fall of 2007 and the fall of 2008. Because these events constitute only two realizations of many possible scenarios, increased understanding of sampling strategies can be obtained by virtually examining a wide variety of release and dispersion scenarios using computer simulations. This research effort demonstrates the use of two software tools, CONTAM, developed by the National Institute of Standards and Technology (NIST), and Visual Sample Plan (VSP), developed by Pacific Northwest National Laboratory (PNNL). The CONTAM modeling software was used to virtually contaminate a model of the INL test building under various release and dissemination scenarios as well as a range of building design and operation parameters. The results of these CONTAM simulations were then used to investigate the relevance and performance of various sampling strategies using VSP. One of the fundamental outcomes of this project was the demonstration of how CONTAM and VSP can be used together to effectively develop sampling plans to support the various stages of response to an airborne chemical, biological, radiological, or nuclear event. Following such an event (or prior to an event), incident details and the conceptual site model could be used to create an ensemble of CONTAM simulations which model contaminant dispersion within a building. These predictions could then be used to identify priority area zones within the building and then sampling designs and strategies could be developed based on those zones.

  20. Recruiting a representative sample in adherence research-The MALT multisite prospective cohort study experience.

    Science.gov (United States)

    Shemesh, Eyal; Mitchell, Jeffrey; Neighbors, Katie; Feist, Susan; Hawkins, Andre; Brown, Amanda; Wanrong, Yin; Anand, Ravinder; Stuber, Margaret L; Annunziato, Rachel A

    2017-12-01

    Medication adherence is an important determinant of transplant outcomes. Attempts to investigate adherence are frequently undermined by selection bias: It is very hard to recruit and retain non-adherent patients in research efforts. This manuscript presents recruitment strategies and results from the MALT (Medication Adherence in children who had a Liver Transplant) multisite prospective cohort study. MALT sites recruited 400 pediatric liver transplant patients who agreed to be followed for 2 years. The primary purpose was to determine whether a marker of adherence, the Medication Level Variability Index (MLVI), predicts rejection outcomes. The present manuscript describes methods used in MALT to ensure that a representative sample was recruited, and presents detailed recruitment results. MALT sites were able to recruit a nationally representative sample, as determined by a comparison between the MALT cohort and a national sample of transplant recipients. Strategies that helped ensure that the sample was representative included monitoring of the outcome measure in comparison with a national sample, drastically limiting patient burden, and specific recruitment methods. We discuss the importance of a representative sample in adherence research and recommend that future efforts to study adherence pay special attention to sample characteristics. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
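
    The MLVI is described in the transplant literature as the standard deviation of a patient's consecutive medication (e.g., tacrolimus) trough levels; the sketch below assumes that definition and at least three levels per patient.

```python
from statistics import stdev

def mlvi(trough_levels):
    # Higher variability across consecutive blood levels suggests erratic adherence.
    if len(trough_levels) < 3:
        raise ValueError("MLVI is conventionally computed from at least 3 levels")
    return stdev(trough_levels)

print(round(mlvi([5.1, 9.4, 3.8, 7.2]), 2))  # -> 2.46
```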

  1. IAEA Sampling Plan

    Energy Technology Data Exchange (ETDEWEB)

    Geist, William H. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-15

    The objectives for this presentation are to describe the method that the IAEA uses to determine a sampling plan for nuclear material measurements; describe the terms detection probability and significant quantity; list the three nuclear materials measurement types; describe the sampling method applied to an item facility; and describe multiple method sampling.
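
    A hedged sketch of the classical attribute-sampling relation often quoted in safeguards teaching: to reach detection probability DP when M of N items are falsified, a random sample of roughly n = N(1 - (1 - DP)^(1/M)) items is needed. This is the textbook approximation, not necessarily the exact plan used by the IAEA.

```python
import math

def attribute_sample_size(N, M, detection_probability):
    # Solves 1 - (1 - n/N)**M >= DP for n (textbook approximation).
    return math.ceil(N * (1.0 - (1.0 - detection_probability) ** (1.0 / M)))

print(attribute_sample_size(N=300, M=10, detection_probability=0.95))  # -> 78
```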

  2. Sample Acquisition for Materials in Planetary Exploration (SAMPLE), Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — ORBITEC proposes to analyze, design, and develop a device for autonomous lunar surface/subsurface sampling and processing applications. The Sample Acquisition for...

  3. Network and adaptive sampling

    CERN Document Server

    Chaudhuri, Arijit

    2014-01-01

    Combining the two statistical techniques of network sampling and adaptive sampling, this book illustrates the advantages of using them in tandem to effectively capture sparsely located elements in unknown pockets. It shows how network sampling is a reliable guide in capturing inaccessible entities through linked auxiliaries. The text also explores how adaptive sampling is strengthened in information content through subsidiary sampling with devices to mitigate unmanageable expanding sample sizes. Empirical data illustrates the applicability of both methods.

  4. Sampling bee communities using pan traps: alternative methods increase sample size

    Science.gov (United States)

    Monitoring of the status of bee populations and inventories of bee faunas require systematic sampling. Efficiency and ease of implementation has encouraged the use of pan traps to sample bees. Efforts to find an optimal standardized sampling method for pan traps have focused on pan trap color. Th...

  5. SAMPLING IN EXTERNAL AUDIT - THE MONETARY UNIT SAMPLING METHOD

    Directory of Open Access Journals (Sweden)

    E. Dascalu

    2016-12-01

    This article approaches the general issue of diminishing the evidence investigation space in audit activities by means of sampling techniques, given that in the instance of a significant data volume an exhaustive examination of the assessed population is not possible and/or effective. The general perspective of the presentation involves dealing with sampling risk, in essence the risk that a selected sample may not be representative of the overall population, in correlation with the audit risk model and with the component parts of this model (inherent risk, control risk and non-detection risk), and highlights the inter-conditioning between these two models.
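
    The following sketch illustrates, under textbook assumptions, the audit risk model named above (AR = IR x CR x DR) together with a common monetary unit sampling interval computation; the Poisson confidence factor of 3.0 for roughly 95% assurance is a standard tabulated value, not taken from the article.

```python
# Audit risk model: detection risk implied by target audit risk (illustrative).
audit_risk, inherent_risk, control_risk = 0.05, 1.0, 1.0
detection_risk = audit_risk / (inherent_risk * control_risk)

# Monetary unit sampling: sample size and sampling interval (values invented).
book_value = 5_000_000.0           # recorded population value
tolerable_misstatement = 250_000.0
confidence_factor = 3.0            # ~95% assurance, zero expected misstatements

n = confidence_factor * book_value / tolerable_misstatement
interval = book_value / n
print(f"detection risk = {detection_risk:.2f}, n = {n:.0f}, interval = {interval:,.0f}")
```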

  6. Statistical sampling strategies

    International Nuclear Information System (INIS)

    Andres, T.H.

    1987-01-01

    Systems assessment codes use mathematical models to simulate natural and engineered systems. Probabilistic systems assessment codes carry out multiple simulations to reveal the uncertainty in values of output variables due to uncertainty in the values of the model parameters. In this paper, methods are described for sampling sets of parameter values to be used in a probabilistic systems assessment code. Three Monte Carlo parameter selection methods are discussed: simple random sampling, Latin hypercube sampling, and sampling using two-level orthogonal arrays. Three post-selection transformations are also described: truncation, importance transformation, and discretization. Advantages and disadvantages of each method are summarized
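
    Of the three Monte Carlo selection methods listed, Latin hypercube sampling is the easiest to show compactly. The sketch below is a minimal illustration, not code from the paper: each parameter is stratified into n equal-probability bins, one draw is taken per bin, and the columns are shuffled independently.

```python
import random

def latin_hypercube(n_samples, n_dims, seed=42):
    rng = random.Random(seed)
    columns = []
    for _ in range(n_dims):
        # one point inside each of n equal-width strata, order randomised
        column = [(i + rng.random()) / n_samples for i in range(n_samples)]
        rng.shuffle(column)
        columns.append(column)
    return list(zip(*columns))  # n_samples points on the unit hypercube

for point in latin_hypercube(5, 2):
    print(point)
```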

  7. Measurement of radioactivity in the environment - Soil - Part 2: Guidance for the selection of the sampling strategy, sampling and pre-treatment of samples

    International Nuclear Information System (INIS)

    2007-01-01

    This part of ISO 18589 specifies the general requirements, based on ISO 11074 and ISO/IEC 17025, for all steps in the planning (desk study and area reconnaissance) of the sampling and the preparation of samples for testing. It includes the selection of the sampling strategy, the outline of the sampling plan, the presentation of general sampling methods and equipment, as well as the methodology of the pre-treatment of samples adapted to the measurements of the activity of radionuclides in soil. This part of ISO 18589 is addressed to the people responsible for determining the radioactivity present in soil for the purpose of radiation protection. It is applicable to soil from gardens, farmland, urban or industrial sites, as well as soil not affected by human activities. This part of ISO 18589 is applicable to all laboratories regardless of the number of personnel or the range of the testing performed. When a laboratory does not undertake one or more of the activities covered by this part of ISO 18589, such as planning, sampling or testing, the corresponding requirements do not apply. Information is provided on scope, normative references, terms and definitions and symbols, principle, sampling strategy, sampling plan, sampling process, pre-treatment of samples and recorded information. Five annexes inform about selection of the sampling strategy according to the objectives and the radiological characterization of the site and sampling areas, diagram of the evolution of the sample characteristics from the sampling site to the laboratory, example of sampling plan for a site divided in three sampling areas, example of a sampling record for a single/composite sample and example for a sample record for a soil profile with soil description. A bibliography is provided

  8. Biological sample collector

    Science.gov (United States)

    Murphy, Gloria A [French Camp, CA

    2010-09-07

    A biological sample collector is adapted to collect several biological samples in a plurality of filter wells. A biological sample collector may comprise a manifold plate for mounting a filter plate thereon, the filter plate having a plurality of filter wells therein; a hollow slider for engaging and positioning a tube that slides therethrough; and a slide case within which the hollow slider travels to allow the tube to be aligned with a selected filter well of the plurality of filter wells, wherein when the tube is aligned with the selected filter well, the tube is pushed through the hollow slider and into the selected filter well to sealingly engage the selected filter well and to allow the tube to deposit a biological sample onto a filter in the bottom of the selected filter well. The biological sample collector may be portable.

  9. Quality Control Samples for the Radiological Determination of Tritium in Urine Samples

    International Nuclear Information System (INIS)

    Ost'pezuk, P.; Froning, M.; Laumen, S.; Richert, I.; Hill, P.

    2004-01-01

    The radioactive decay product of tritium is a low-energy beta particle that cannot penetrate the outer dead layer of human skin. Therefore, the main hazard associated with tritium is internal exposure. In addition, due to its relatively long physical half-life and short biological half-life, tritium must be ingested in large amounts to pose a significant health risk. On the other hand, the internal exposure should be kept as low as practical. For incorporation monitoring of professional radiation workers, quality control is of utmost importance. In the Research Centre Juelich GmbH (FZJ) a considerable fraction of monitoring by excretion analysis relates to the isotope tritium. Usually an aliquot of a urine sample is mixed with a liquid scintillator and measured in a liquid scintillation counter. Quality control samples in the form of three kinds of internal reference samples (a blank, a reference sample with low activity and a reference sample with elevated activity) were prepared from mixed, tritium-free urine samples. 1 ml of each of these samples was pipetted into a liquid scintillation vial, and known amounts of tritium were added to part of these vials. All these samples were stored at 20 degrees. Based on long-term use of all these reference samples it was possible to construct appropriate control charts with upper and lower alarm limits. Daily use of these reference samples significantly decreases the risk of false results for original urine samples, with no significant increase of the determination time. (Author) 2 refs
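
    A hedged sketch of the control-chart logic described above: warning and alarm limits placed two and three standard deviations around the long-term mean of the reference-sample results. All numbers are invented for illustration.

```python
from statistics import mean, stdev

reference_results = [102, 98, 105, 99, 101, 97, 103, 100]  # spiked QC urine, Bq/L
m, s = mean(reference_results), stdev(reference_results)

def chart_status(result):
    if abs(result - m) > 3 * s:
        return "alarm: stop and investigate before reporting results"
    if abs(result - m) > 2 * s:
        return "warning"
    return "in control"

print(chart_status(108))  # -> "warning" for these illustrative data
```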

  10. A Method for Choosing the Best Samples for Mars Sample Return.

    Science.gov (United States)

    Gordon, Peter R; Sephton, Mark A

    2018-05-01

    Success of a future Mars Sample Return mission will depend on the correct choice of samples. Pyrolysis-FTIR can be employed as a triage instrument for Mars Sample Return. The technique can thermally dissociate minerals and organic matter for detection. Identification of certain mineral types can determine the habitability of the depositional environment, past or present, while detection of organic matter may suggest past or present habitation. In Mars' history, the Theiikian era represents an attractive target for life search missions and the acquisition of samples. The acidic and increasingly dry Theiikian may have been habitable and followed a lengthy neutral and wet period in Mars' history during which life could have originated and proliferated to achieve relatively abundant levels of biomass with a wide distribution. Moreover, the sulfate minerals produced in the Theiikian are also known to be good preservers of organic matter. We have used pyrolysis-FTIR and samples from a Mars analog ferrous acid stream with a thriving ecosystem to test the triage concept. Pyrolysis-FTIR identified those samples with the greatest probability of habitability and habitation. A three-tier scoring system was developed based on the detection of (i) organic signals, (ii) carbon dioxide and water, and (iii) sulfur dioxide. The presence of each component was given a score of A, B, or C depending on whether the substance had been detected, tentatively detected, or not detected, respectively. Single-step (for greatest possible sensitivity) or multistep (for more diagnostic data) pyrolysis-FTIR methods informed the assignments. The system allowed the highest-priority samples to be categorized as AAA (or A*AA if the organic signal was complex), while the lowest-priority samples could be categorized as CCC. Our methods provide a mechanism with which to rank samples and identify those that should take the highest priority for return to Earth during a Mars Sample Return mission. Key Words
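
    The three-tier scoring described above lends itself to a small sketch; the function below simply concatenates the letter grades for the three detection targets, with the complex-organics flag mapping A to A*. The grade inputs are assumed to be assigned by the analyst from the pyrolysis-FTIR data.

```python
def triage_score(organics, co2_h2o, so2, organics_complex=False):
    # Each argument is "A" (detected), "B" (tentative) or "C" (not detected).
    first = "A*" if organics == "A" and organics_complex else organics
    return first + co2_h2o + so2

print(triage_score("A", "A", "B", organics_complex=True))  # -> "A*AB"
```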

  11. Urine sample collection protocols for bioassay samples

    Energy Technology Data Exchange (ETDEWEB)

    MacLellan, J.A.; McFadden, K.M.

    1992-11-01

    In vitro radiobioassay analyses are used to measure the amount of radioactive material excreted by personnel exposed to the potential intake of radioactive material. The analytical results are then used with various metabolic models to estimate the amount of radioactive material in the subject's body and the original intake of radioactive material. Proper application of these metabolic models requires knowledge of the excretion period. It is normal practice to design the bioassay program based on a 24-hour excretion sample. The Hanford bioassay program simulates a total 24-hour urine excretion sample with urine collection periods lasting from one-half hour before retiring to one-half hour after rising on two consecutive days. Urine passed during the specified periods is collected in three 1-L bottles. Because the daily excretion volume given in Publication 23 of the International Commission on Radiological Protection (ICRP 1975, p. 354) for Reference Man is 1.4 L, it was proposed to use only two 1-L bottles as a cost-saving measure. This raised the broader question of what should be the design capacity of a 24-hour urine sample kit.

  12. Improved sample size determination for attributes and variables sampling

    International Nuclear Information System (INIS)

    Stirpe, D.; Picard, R.R.

    1985-01-01

    Earlier INMM papers have addressed the attributes/variables problem and, under conservative/limiting approximations, have reported analytical solutions for the attributes and variables sample sizes. Through computer simulation of this problem, we have calculated attributes and variables sample sizes as a function of falsification, measurement uncertainties, and required detection probability without using approximations. Using realistic assumptions for uncertainty parameters of measurement, the simulation results support the conclusions: (1) previously used conservative approximations can be expensive because they lead to larger sample sizes than needed; and (2) the optimal verification strategy, as well as the falsification strategy, are highly dependent on the underlying uncertainty parameters of the measurement instruments. 1 ref., 3 figs

  13. Statistical issues in reporting quality data: small samples and casemix variation.

    Science.gov (United States)

    Zaslavsky, A M

    2001-12-01

    To present two key statistical issues that arise in analysis and reporting of quality data. Casemix variation is relevant to quality reporting when the units being measured have differing distributions of patient characteristics that also affect the quality outcome. When this is the case, adjustment using stratification or regression may be appropriate. Such adjustments may be controversial when the patient characteristic does not have an obvious relationship to the outcome. Stratified reporting poses problems for sample size and reporting format, but may be useful when casemix effects vary across units. Although there are no absolute standards of reliability, high reliabilities (interunit F ≥ 10 or reliability ≥ 0.9) are desirable for distinguishing above- and below-average units. When small or unequal sample sizes complicate reporting, precision may be improved using indirect estimation techniques that incorporate auxiliary information, and 'shrinkage' estimation can help to summarize the strength of evidence about units with small samples. With broader understanding of casemix adjustment and methods for analyzing small samples, quality data can be analysed and reported more accurately.
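
    The 'shrinkage' estimation mentioned above can be sketched as follows: a unit's small-sample mean is pulled toward the overall mean in proportion to its reliability (between-unit variance over total variance). The variance components below are invented for illustration.

```python
def shrunken_estimate(unit_mean, unit_n, grand_mean, var_between, var_within):
    reliability = var_between / (var_between + var_within / unit_n)
    return reliability * unit_mean + (1 - reliability) * grand_mean

# A unit with only 5 observations is pulled strongly toward the grand mean:
print(round(shrunken_estimate(unit_mean=0.60, unit_n=5, grand_mean=0.80,
                              var_between=0.01, var_within=0.16), 3))  # -> 0.752
```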

  14. Lunar Sample Compendium

    Science.gov (United States)

    Meyer, Charles

    2005-01-01

    The purpose of the Lunar Sample Compendium will be to inform scientists, astronauts and the public about the various lunar samples that have been returned from the Moon. This Compendium will be organized rock by rock in the manner of a catalog, but will not be as comprehensive, nor as complete, as the various lunar sample catalogs that are available. Likewise, this Compendium will not duplicate the various excellent books and reviews on the subject of lunar samples (Cadogen 1981, Heiken et al. 1991, Papike et al. 1998, Warren 2003, Eugster 2003). However, it is thought that an online Compendium, such as this, will prove useful to scientists proposing to study individual lunar samples and should help provide backup information for lunar sample displays. This Compendium will allow easy access to the scientific literature by briefly summarizing the significant findings of each rock along with the documentation of where the detailed scientific data are to be found. In general, discussion and interpretation of the results is left to the formal reviews found in the scientific literature. An advantage of this Compendium will be that it can be updated, expanded and corrected as need be.

  15. Sampling free energy surfaces as slices by combining umbrella sampling and metadynamics.

    Science.gov (United States)

    Awasthi, Shalini; Kapil, Venkat; Nair, Nisanth N

    2016-06-15

    Metadynamics (MTD) is a very powerful technique to sample high-dimensional free energy landscapes, and due to its self-guiding property, the method has been successful in studying complex reactions and conformational changes. MTD sampling is based on filling the free energy basins by biasing potentials and thus for cases with flat, broad, and unbound free energy wells, the computational time to sample them becomes very large. To alleviate this problem, we combine the standard Umbrella Sampling (US) technique with MTD to sample orthogonal collective variables (CVs) in a simultaneous way. Within this scheme, we construct the equilibrium distribution of CVs from biased distributions obtained from independent MTD simulations with umbrella potentials. Reweighting is carried out by a procedure that combines US reweighting and Tiwary-Parrinello MTD reweighting within the Weighted Histogram Analysis Method (WHAM). The approach is ideal for a controlled sampling of a CV in a MTD simulation, making it computationally efficient in sampling flat, broad, and unbound free energy surfaces. This technique also allows for a distributed sampling of a high-dimensional free energy surface, further increasing the computational efficiency in sampling. We demonstrate the application of this technique in sampling high-dimensional surface for various chemical reactions using ab initio and QM/MM hybrid molecular dynamics simulations. Further, to carry out MTD bias reweighting for computing forward reaction barriers in ab initio or QM/MM simulations, we propose a computationally affordable approach that does not require recrossing trajectories. © 2016 Wiley Periodicals, Inc.

  16. Sample size adjustments for varying cluster sizes in cluster randomized trials with binary outcomes analyzed with second-order PQL mixed logistic regression.

    Science.gov (United States)

    Candel, Math J J M; Van Breukelen, Gerard J P

    2010-06-30

    Adjustments of sample size formulas are given for varying cluster sizes in cluster randomized trials with a binary outcome when testing the treatment effect with mixed effects logistic regression using second-order penalized quasi-likelihood estimation (PQL). Starting from first-order marginal quasi-likelihood (MQL) estimation of the treatment effect, the asymptotic relative efficiency of unequal versus equal cluster sizes is derived. A Monte Carlo simulation study shows this asymptotic relative efficiency to be rather accurate for realistic sample sizes, when employing second-order PQL. An approximate, simpler formula is presented to estimate the efficiency loss due to varying cluster sizes when planning a trial. In many cases sampling 14 per cent more clusters is sufficient to repair the efficiency loss due to varying cluster sizes. Since current closed-form formulas for sample size calculation are based on first-order MQL, planning a trial also requires a conversion factor to obtain the variance of the second-order PQL estimator. In a second Monte Carlo study, this conversion factor turned out to be 1.25 at most. (c) 2010 John Wiley & Sons, Ltd.

  17. Procedures for sampling and sample reduction within quality assurance systems for solid biofuels

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2005-07-01

    The objective of this experimental study on sampling was to determine the size and number of samples of biofuels required (taken at two sampling points in each case) and to compare two methods of sampling. The first objective of the sample-reduction exercise was to compare the reliability of various sampling methods, and the second objective was to measure the variations introduced as a result of reducing the sample size to form suitable test portions. The materials studied were sawdust, wood chips, wood pellets and bales of straw, and these were analysed for moisture, ash, particle size and chloride. The sampling procedures are described. The study was conducted in Scandinavia. The results of the study were presented in Leipzig in October 2004. The work was carried out as part of the UK's DTI Technology Programme: New and Renewable Energy.

  18. Procedures for sampling and sample-reduction within quality assurance systems for solid biofuels

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2005-04-15

    The bias introduced when sampling solid biofuels from stockpiles or containers instead of from moving streams is assessed as well as the number and size of samples required to represent accurately the bulk sample, variations introduced when reducing bulk samples into samples for testing, and the usefulness of sample reduction methods. Details are given of the experimental work carried out in Sweden and Denmark using sawdust, wood chips, wood pellets, forestry residues and straw. The production of a model European Standard for quality assurance of solid biofuels is examined.

  19. Superposition Enhanced Nested Sampling

    Directory of Open Access Journals (Sweden)

    Stefano Martiniani

    2014-08-01

    The theoretical analysis of many problems in physics, astronomy, and applied mathematics requires an efficient numerical exploration of multimodal parameter spaces that exhibit broken ergodicity. Monte Carlo methods are widely used to deal with these classes of problems, but such simulations suffer from a ubiquitous sampling problem: The probability of sampling a particular state is proportional to its entropic weight. Devising an algorithm capable of sampling efficiently the full phase space is a long-standing problem. Here, we report a new hybrid method for the exploration of multimodal parameter spaces exhibiting broken ergodicity. Superposition enhanced nested sampling combines the strengths of global optimization with the unbiased or athermal sampling of nested sampling, greatly enhancing its efficiency with no additional parameters. We report extensive tests of this new approach for atomic clusters that are known to have energy landscapes for which conventional sampling schemes suffer from broken ergodicity. We also introduce a novel parallelization algorithm for nested sampling.

  1. Network Sampling with Memory: A proposal for more efficient sampling from social networks

    Science.gov (United States)

    Mouw, Ted; Verdery, Ashton M.

    2013-01-01

    Techniques for sampling from networks have grown into an important area of research across several fields. For sociologists, the possibility of sampling from a network is appealing for two reasons: (1) A network sample can yield substantively interesting data about network structures and social interactions, and (2) it is useful in situations where study populations are difficult or impossible to survey with traditional sampling approaches because of the lack of a sampling frame. Despite its appeal, methodological concerns about the precision and accuracy of network-based sampling methods remain. In particular, recent research has shown that sampling from a network using a random walk based approach such as Respondent Driven Sampling (RDS) can result in high design effects (DE)—the ratio of the sampling variance to the sampling variance of simple random sampling (SRS). A high design effect means that more cases must be collected to achieve the same level of precision as SRS. In this paper we propose an alternative strategy, Network Sampling with Memory (NSM), which collects network data from respondents in order to reduce design effects and, correspondingly, the number of interviews needed to achieve a given level of statistical power. NSM combines a “List” mode, where all individuals on the revealed network list are sampled with the same cumulative probability, with a “Search” mode, which gives priority to bridge nodes connecting the current sample to unexplored parts of the network. We test the relative efficiency of NSM compared to RDS and SRS on 162 school and university networks from Add Health and Facebook that range in size from 110 to 16,278 nodes. The results show that the average design effect for NSM on these 162 networks is 1.16, which is very close to the efficiency of a simple random sample (DE=1), and 98.5% lower than the average DE we observed for RDS. PMID:24159246

  2. Image Sampling with Quasicrystals

    Directory of Open Access Journals (Sweden)

    Mark Grundland

    2009-07-01

    We investigate the use of quasicrystals in image sampling. Quasicrystals produce space-filling, non-periodic point sets that are uniformly discrete and relatively dense, thereby ensuring the sample sites are evenly spread out throughout the sampled image. Their self-similar structure can be attractive for creating sampling patterns endowed with a decorative symmetry. We present a brief general overview of the algebraic theory of cut-and-project quasicrystals based on the geometry of the golden ratio. To assess the practical utility of quasicrystal sampling, we evaluate the visual effects of a variety of non-adaptive image sampling strategies on photorealistic image reconstruction and non-photorealistic image rendering used in multiresolution image representations. For computer visualization of point sets used in image sampling, we introduce a mosaic rendering technique.
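
    A one-dimensional cut-and-project set based on the golden ratio (a Fibonacci quasicrystal) is the simplest concrete example of the point sets discussed above: non-periodic, uniformly discrete and relatively dense. The construction below is a standard Beatty-sequence sketch, not code from the paper.

```python
import math

PHI = (1 + math.sqrt(5)) / 2  # golden ratio

def fibonacci_quasicrystal(n_points):
    # Points n + (phi - 1) * floor((n + 1) / phi); gaps take only two values.
    return [n + (PHI - 1) * math.floor((n + 1) / PHI) for n in range(n_points)]

pts = fibonacci_quasicrystal(10)
print([round(b - a, 3) for a, b in zip(pts, pts[1:])])  # gaps are 1.0 or ~1.618
```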

  3. Sample preparation of environmental samples using benzene synthesis followed by high-performance LSC

    International Nuclear Information System (INIS)

    Filippis, S. De; Noakes, J.E.

    1991-01-01

    Liquid scintillation counting (LSC) techniques have been widely employed as the detection method for determining environmental levels of tritium and 14C. Since anthropogenic and nonanthropogenic inputs to the environment are a concern, sampling the environment surrounding a nuclear power facility or fuel reprocessing operation requires the collection of many different sample types, including agriculture products, water, biota, aquatic life, soil, and vegetation. These sample types are not suitable for the direct detection of tritium or 14C using liquid scintillation techniques. Each sample type must be initially prepared in order to obtain the carbon or hydrogen component of interest and present this in a chemical form that is compatible with common chemicals used in scintillation counting applications. Converting the sample of interest to chemically pure benzene as a sample preparation technique has been widely accepted for processing samples for radiocarbon age-dating applications. The synthesized benzene is composed of the carbon or hydrogen atoms from the original sample and is ideal as a solvent for LSC with excellent photo-optical properties. Benzene synthesis followed by low-background scintillation counting can be applied to the preparation and measurement of environmental samples, yielding good detection sensitivities, high radionuclide counting efficiency, and shorter preparation time. The method of benzene synthesis provides a unique approach to the preparation of a wide variety of environmental sample types using similar chemistry for all samples

  4. Sampling in Qualitative Research: Improving the Quality of ...

    African Journals Online (AJOL)

    Sampling consideration in qualitative research is very important, yet in practice this appears not to be given the prominence and the rigour it deserves among Higher Education researchers. Accordingly, the quality of research outcomes in Higher Education has suffered from low utilisation. This has motivated the production ...

  5. Quantitative portable gamma spectroscopy sample analysis for non-standard sample geometries

    International Nuclear Information System (INIS)

    Enghauser, M.W.; Ebara, S.B.

    1997-01-01

    Utilizing a portable spectroscopy system, a quantitative method for analysis of samples containing a mixture of fission and activation products in nonstandard geometries was developed. The method can be used with various sample and shielding configurations where analysis on a laboratory based gamma spectroscopy system is impractical. The portable gamma spectroscopy method involves calibration of the detector and modeling of the sample and shielding to identify and quantify the radionuclides present in the sample. The method utilizes the intrinsic efficiency of the detector and the unattenuated gamma fluence rate at the detector surface per unit activity from the sample to calculate the nuclide activity and Minimum Detectable Activity (MDA). For a complex geometry, a computer code written for shielding applications (MICROSHIELD) is utilized to determine the unattenuated gamma fluence rate per unit activity at the detector surface. Lastly, the method is only applicable to nuclides which emit gamma rays and cannot be used for pure beta emitters. In addition, if sample self-absorption and shielding is significant, the attenuation will result in high MDAs for nuclides which solely emit low energy gamma rays. The following presents the analysis technique and presents verification results demonstrating the accuracy of the method
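
    A hedged sketch of the Currie-style minimum detectable activity computation commonly paired with the approach above, assuming the detection efficiency and the gamma emission probability are known; the numbers are illustrative only, not from the record.

```python
import math

def mda_bq(background_counts, count_time_s, efficiency, gamma_yield):
    # Currie (1968): MDA = (2.71 + 4.65 * sqrt(B)) / (eps * I_gamma * t)
    return (2.71 + 4.65 * math.sqrt(background_counts)) / (
        efficiency * gamma_yield * count_time_s)

print(round(mda_bq(background_counts=400, count_time_s=600,
                   efficiency=0.02, gamma_yield=0.85), 1))  # -> ~9.4 Bq
```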

  6. Gravimetric dust sampling for control purposes and occupational dust sampling.

    CSIR Research Space (South Africa)

    Unsted, AD

    1997-02-01

    Prior to the introduction of gravimetric dust sampling, konimeters had been used for dust sampling, which was largely for control purposes. Whether or not absolute results were achievable was not an issue since relative results were used to evaluate...

  7. Adaptive designs for the one-sample log-rank test.

    Science.gov (United States)

    Schmidt, Rene; Faldum, Andreas; Kwiecien, Robert

    2017-09-22

    Traditional designs in phase IIa cancer trials are single-arm designs with a binary outcome, for example, tumor response. In some settings, however, a time-to-event endpoint might appear more appropriate, particularly in the presence of loss to follow-up. Then the one-sample log-rank test might be the method of choice. It allows the survival curve of the patients under treatment to be compared with a prespecified reference survival curve. The reference curve usually represents the expected survival under standard of care. In this work, convergence of the one-sample log-rank statistic to Brownian motion is proven using Rebolledo's martingale central limit theorem while accounting for staggered entry times of the patients. On this basis, a confirmatory adaptive one-sample log-rank test is proposed where provision is made for data-dependent sample size reassessment. The focus is to apply the inverse normal method. This is done in two different directions. The first strategy exploits the independent increments property of the one-sample log-rank statistic. The second strategy is based on the patient-wise separation principle. It is shown by simulation that the proposed adaptive test might help to rescue an underpowered trial and at the same time lower the average sample number (ASN) under the null hypothesis as compared to a single-stage fixed sample design. © 2017, The International Biometric Society.
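
    The inverse normal method named above combines stage-wise p-values with prespecified weights, which is what makes data-dependent sample size reassessment possible without inflating the type I error. A minimal sketch, with SciPy assumed for the normal quantile function:

```python
from scipy.stats import norm

def inverse_normal_p(p1, p2, w1=0.5, w2=0.5):
    # Prespecified weights with w1 + w2 = 1; z_i = Phi^{-1}(1 - p_i).
    z = (w1 ** 0.5) * norm.isf(p1) + (w2 ** 0.5) * norm.isf(p2)
    return norm.sf(z)  # combined one-sided p-value

print(round(inverse_normal_p(0.10, 0.02), 4))  # -> ~0.0092
```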

  8. Sampling procedures and tables

    International Nuclear Information System (INIS)

    Franzkowski, R.

    1980-01-01

    Characteristics, defects, defectives - Sampling by attributes and by variables - Sample versus population - Frequency distributions for the number of defectives or the number of defects in the sample - Operating characteristic curve, producer's risk, consumer's risk - Acceptable quality level AQL - Average outgoing quality AOQ - Standard ISO 2859 - Fundamentals of sampling by variables for fraction defective. (RW)
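
    The operating characteristic curve mentioned above can be computed directly for a single attribute sampling plan (sample size n, acceptance number c): the lot is accepted when at most c defectives appear in the sample. The plan parameters below are invented for illustration.

```python
from math import comb

def prob_accept(p_defective, n=80, c=2):
    # Binomial acceptance probability: P(X <= c) with X ~ Bin(n, p).
    return sum(comb(n, k) * p_defective ** k * (1 - p_defective) ** (n - k)
               for k in range(c + 1))

for p in (0.01, 0.025, 0.05, 0.10):  # low p: producer's risk; high p: consumer's risk
    print(f"p = {p:.3f}: P(accept) = {prob_accept(p):.3f}")
```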

  9. Integrated sampling and analysis plan for samples measuring >10 mrem/hour

    International Nuclear Information System (INIS)

    Haller, C.S.

    1992-03-01

    This integrated sampling and analysis plan was prepared to assist in planning and scheduling of Hanford Site sampling and analytical activities for all waste characterization samples that measure greater than 10 mrem/hour. This report also satisfies the requirements of the renegotiated Interim Milestone M-10-05 of the Hanford Federal Facility Agreement and Consent Order (the Tri-Party Agreement). For purposes of comparing the various analytical needs with the Hanford Site laboratory capabilities, the analytical requirements of the various programs were normalized by converting required laboratory effort for each type of sample to a common unit of work, the standard analytical equivalency unit (AEU). The AEU approximates the amount of laboratory resources required to perform an extensive suite of analyses on five core segments individually plus one additional suite of analyses on a composite sample derived from a mixture of the five core segments and prepare a validated RCRA-type data package

  10. Evaluation of Respondent-Driven Sampling

    Science.gov (United States)

    McCreesh, Nicky; Frost, Simon; Seeley, Janet; Katongole, Joseph; Tarsh, Matilda Ndagire; Ndunguse, Richard; Jichi, Fatima; Lunel, Natasha L; Maher, Dermot; Johnston, Lisa G; Sonnenberg, Pam; Copas, Andrew J; Hayes, Richard J; White, Richard G

    2012-01-01

    Background: Respondent-driven sampling is a novel variant of link-tracing sampling for estimating the characteristics of hard-to-reach groups, such as HIV prevalence in sex-workers. Despite its use by leading health organizations, the performance of this method in realistic situations is still largely unknown. We evaluated respondent-driven sampling by comparing estimates from a respondent-driven sampling survey with total-population data. Methods: Total-population data on age, tribe, religion, socioeconomic status, sexual activity and HIV status were available on a population of 2402 male household-heads from an open cohort in rural Uganda. A respondent-driven sampling (RDS) survey was carried out in this population, employing current methods of sampling (RDS sample) and statistical inference (RDS estimates). Analyses were carried out for the full RDS sample and then repeated for the first 250 recruits (small sample). Results: We recruited 927 household-heads. Full and small RDS samples were largely representative of the total population, but both samples under-represented men who were younger, of higher socioeconomic status, and with unknown sexual activity and HIV status. Respondent-driven-sampling statistical-inference methods failed to reduce these biases. Only 31%-37% (depending on method and sample size) of RDS estimates were closer to the true population proportions than the RDS sample proportions. Only 50%-74% of respondent-driven-sampling bootstrap 95% confidence intervals included the population proportion. Conclusions: Respondent-driven sampling produced a generally representative sample of this well-connected non-hidden population. However, current respondent-driven-sampling inference methods failed to reduce bias when it occurred. Whether the data required to remove bias and measure precision can be collected in a respondent-driven sampling survey is unresolved. Respondent-driven sampling should be regarded as a (potentially superior) form of convenience-sampling

  11. Evaluation of respondent-driven sampling.

    Science.gov (United States)

    McCreesh, Nicky; Frost, Simon D W; Seeley, Janet; Katongole, Joseph; Tarsh, Matilda N; Ndunguse, Richard; Jichi, Fatima; Lunel, Natasha L; Maher, Dermot; Johnston, Lisa G; Sonnenberg, Pam; Copas, Andrew J; Hayes, Richard J; White, Richard G

    2012-01-01

    Respondent-driven sampling is a novel variant of link-tracing sampling for estimating the characteristics of hard-to-reach groups, such as HIV prevalence in sex workers. Despite its use by leading health organizations, the performance of this method in realistic situations is still largely unknown. We evaluated respondent-driven sampling by comparing estimates from a respondent-driven sampling survey with total population data. Total population data on age, tribe, religion, socioeconomic status, sexual activity, and HIV status were available on a population of 2402 male household heads from an open cohort in rural Uganda. A respondent-driven sampling (RDS) survey was carried out in this population, using current methods of sampling (RDS sample) and statistical inference (RDS estimates). Analyses were carried out for the full RDS sample and then repeated for the first 250 recruits (small sample). We recruited 927 household heads. Full and small RDS samples were largely representative of the total population, but both samples underrepresented men who were younger, of higher socioeconomic status, and with unknown sexual activity and HIV status. Respondent-driven sampling statistical inference methods failed to reduce these biases. Only 31%-37% (depending on method and sample size) of RDS estimates were closer to the true population proportions than the RDS sample proportions. Only 50%-74% of respondent-driven sampling bootstrap 95% confidence intervals included the population proportion. Respondent-driven sampling produced a generally representative sample of this well-connected nonhidden population. However, current respondent-driven sampling inference methods failed to reduce bias when it occurred. Whether the data required to remove bias and measure precision can be collected in a respondent-driven sampling survey is unresolved. Respondent-driven sampling should be regarded as a (potentially superior) form of convenience sampling method, and caution is required

  12. Sample Curation in Support of the OSIRIS-REx Asteroid Sample Return Mission

    Science.gov (United States)

    Righter, Kevin; Nakamura-Messenger, Keiko

    2017-01-01

    The OSIRIS-REx asteroid sample return mission launched to asteroid Bennu Sept. 8, 2016. The spacecraft will arrive at Bennu in late 2019, orbit and map the asteroid, and perform a touch and go (TAG) sampling maneuver in July 2020. After sample is stowed and confirmed the spacecraft will return to Earth, and the sample return capsule (SRC) will land in Utah in September 2023. Samples will be recovered from Utah [2] and then transported and stored in a new sample cleanroom at NASA Johnson Space Center in Houston [3]. The materials curated for the mission are described here. a) Materials Archive and Witness Plate Collection: The SRC and TAGSAM were built between March 2014 and Summer of 2015, and instruments (OTES,OVIRS, OLA, OCAMS, REXIS) were integrated from Summer 2015 until May 2016. A total of 395 items were received for the materials archive at NASA-JSC, with archiving finishing 30 days after launch (with the final archived items being related to launch operations)[4]. The materials fall into several general categories including metals (stainless steel, aluminum, titanium alloys, brass and BeCu alloy), epoxies, paints, polymers, lubricants, non-volatile-residue samples (NVR), sapphire, and various miscellaneous materials. All through the ATLO process (from March 2015 until late August 2016) contamination knowledge witness plates (Si wafer and Al foil) were deployed in the various cleanrooms in Denver and KSC to provide an additional record of particle counts and volatiles that is archived for current and future scientific studies. These plates were deployed in roughly monthly increments with each unit containing 4 Si wafers and 4 Al foils. We archived 128 individual witness plates (64 Si wafers and 64 Al foils); one of each witness plate (Si and Al) was analyzed immediately by the science team after archiving, while the remaining 3 of each are archived indefinitely. Information about each material archived is stored in an extensive database at NASA-JSC, and key

  13. Vibronic Boson Sampling: Generalized Gaussian Boson Sampling for Molecular Vibronic Spectra at Finite Temperature.

    Science.gov (United States)

    Huh, Joonsuk; Yung, Man-Hong

    2017-08-07

    Molecular vibronic spectroscopy, where the transitions involve non-trivial Bosonic correlation due to the Duschinsky Rotation, is strongly believed to be in a similar complexity class as Boson Sampling. At finite temperature, the problem is represented as a Boson Sampling experiment with correlated Gaussian input states. This molecular problem with temperature effect is intimately related to the various versions of Boson Sampling sharing a similar computational complexity. Here we provide a full description of this relation in the context of Gaussian Boson Sampling. We find a hierarchical structure, which illustrates the relationship among various Boson Sampling schemes. Specifically, we show that every instance of Gaussian Boson Sampling with an initial correlation can be simulated by an instance of Gaussian Boson Sampling without initial correlation, with only a polynomial overhead. Since every Gaussian state is associated with a thermal state, our result implies that every sampling problem in molecular vibronic transitions, at any temperature, can be simulated by Gaussian Boson Sampling associated with a product of vacuum modes. We refer to such a generalized Gaussian Boson Sampling, motivated by the molecular sampling problem, as Vibronic Boson Sampling.

  14. Sampling on Quasicrystals

    OpenAIRE

    Grepstad, Sigrid

    2011-01-01

    We prove that quasicrystals are universal sets of stable sampling in any dimension. Necessary and sufficient density conditions for stable sampling and interpolation sets in one dimension are studied in detail.

  15. The Lunar Sample Compendium

    Science.gov (United States)

    Meyer, Charles

    2009-01-01

    The Lunar Sample Compendium is a succinct summary of the data obtained from 40 years of study of Apollo and Luna samples of the Moon. Basic petrographic, chemical and age information is compiled, sample-by-sample, in the form of an advanced catalog in order to provide a basic description of each sample. The LSC can be found online using Google. The initial allocation of lunar samples was done sparingly, because it was realized that scientific techniques would improve over the years and new questions would be formulated. The LSC is important because it enables scientists to select samples within the context of the work that has already been done and facilitates better review of proposed allocations. It also provides back up material for public displays, captures information found only in abstracts, grey literature and curatorial databases and serves as a ready access to the now-vast scientific literature.

  16. A simulative comparison of respondent driven sampling with incentivized snowball sampling--the "strudel effect".

    Science.gov (United States)

    Gyarmathy, V Anna; Johnston, Lisa G; Caplinskiene, Irma; Caplinskas, Saulius; Latkin, Carl A

    2014-02-01

    Respondent driven sampling (RDS) and incentivized snowball sampling (ISS) are two sampling methods that are commonly used to reach people who inject drugs (PWID). We generated a set of simulated RDS samples on an actual sociometric ISS sample of PWID in Vilnius, Lithuania ("original sample") to assess if the simulated RDS estimates were statistically significantly different from the original ISS sample prevalences for HIV (9.8%), Hepatitis A (43.6%), Hepatitis B (Anti-HBc 43.9% and HBsAg 3.4%), Hepatitis C (87.5%), syphilis (6.8%) and Chlamydia (8.8%) infections and for selected behavioral risk characteristics. The original sample consisted of a large component of 249 people (83% of the sample) and 13 smaller components with 1-12 individuals. Generally, as long as all seeds were recruited from the large component of the original sample, the simulation samples simply recreated the large component. There were no significant differences between the large component and the entire original sample for the characteristics of interest. Altogether 99.2% of 360 simulation sample point estimates were within the confidence interval of the original prevalence values for the characteristics of interest. When population characteristics are reflected in large network components that dominate the population, RDS and ISS may produce samples that have statistically non-different prevalence values, even though some isolated network components may be under-sampled and/or statistically significantly different from the main groups. This so-called "strudel effect" is discussed in the paper. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  17. Convenience samples and caregiving research: how generalizable are the findings?

    Science.gov (United States)

    Pruchno, Rachel A; Brill, Jonathan E; Shands, Yvonne; Gordon, Judith R; Genderson, Maureen Wilson; Rose, Miriam; Cartwright, Francine

    2008-12-01

    We contrast characteristics of respondents recruited using convenience strategies with those of respondents recruited by random digit dial (RDD) methods. We compare sample variances, means, and interrelationships among variables generated from the convenience and RDD samples. Women aged 50 to 64 who work full time and provide care to a community-dwelling older person were recruited using either RDD (N = 55) or convenience methods (N = 87). Telephone interviews were conducted using reliable, valid measures of demographics, characteristics of the care recipient, help provided to the care recipient, evaluations of caregiver-care recipient relationship, and outcomes common to caregiving research. Convenience and RDD samples had similar variances on 68.4% of the examined variables. We found significant mean differences for 63% of the variables examined. Bivariate correlations suggest that one would reach different conclusions using the convenience and RDD sample data sets. Researchers should use convenience samples cautiously, as they may have limited generalizability.

  18. Chorionic villus sampling

    Science.gov (United States)

    Chorionic villus sampling (CVS) is a test some pregnant women have ...

  19. The new Chalk River AMS ion source, sample changer and external sample magazine

    International Nuclear Information System (INIS)

    Koslowsky, V.T.; Bray, N.; Imahori, Y.; Andrews, H.R.; Davies, W.G.

    1997-01-01

    A new sample magazine, sample changer and ion source have been developed and are in routine use at Chalk River. The system features a readily accessible 40-sample magazine at ground potential that is external to the ion source and high-voltage cage. The samples are held in an inert atmosphere and can be individually examined or removed; they can be exchanged en masse as a complete magazine concurrent with an AMS measurement. On-line sample changing is done with a pneumatic rabbit transfer system employing two stages of differential pumping. At Chalk River this is routinely performed across a 200 kV potential. Sample positioning is precise, and hundreds of 36 Cl and 129 I samples have been measured over a period of several days without interruption or alteration of ion source operating conditions. (author)

  20. Genetic Sample Inventory

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This database archives genetic tissue samples from marine mammals collected primarily from the U.S. east coast. The collection includes samples from field programs,...

  1. Fluid sample collection and distribution system. [qualitative analysis of aqueous samples from several points

    Science.gov (United States)

    Brooks, R. L. (Inventor)

    1979-01-01

    A multipoint fluid sample collection and distribution system is provided wherein the sample inputs are made through one or more of a number of sampling valves to a progressive cavity pump which is not susceptible to damage by large unfiltered particles. The pump output is through a filter unit that can provide a filtered multipoint sample. An unfiltered multipoint sample is also provided. An effluent sample can be taken and applied to a second progressive cavity pump for pumping to a filter unit that can provide one or more filtered effluent samples. The second pump can also provide an unfiltered effluent sample. Means are provided to periodically back flush each filter unit without shutting off the whole system.

  2. Norm Block Sample Sizes: A Review of 17 Individually Administered Intelligence Tests

    Science.gov (United States)

    Norfolk, Philip A.; Farmer, Ryan L.; Floyd, Randy G.; Woods, Isaac L.; Hawkins, Haley K.; Irby, Sarah M.

    2015-01-01

    The representativeness, recency, and size of norm samples strongly influence the accuracy of inferences drawn from their scores. Inadequate norm samples may lead to inflated or deflated scores for individuals and poorer prediction of developmental and academic outcomes. The purpose of this study was to apply Kranzler and Floyd's method for…

  3. Contributions to sampling statistics

    CERN Document Server

    Conti, Pier; Ranalli, Maria

    2014-01-01

    This book contains a selection of the papers presented at the ITACOSM 2013 Conference, held in Milan in June 2013. ITACOSM is the biennial meeting of the Survey Sampling Group S2G of the Italian Statistical Society, intended as an international forum of scientific discussion on the developments of theory and application of survey sampling methodologies and applications in human and natural sciences. The book gathers research papers carefully selected from both invited and contributed sessions of the conference. The whole book is a relevant contribution to various key aspects of sampling methodology and techniques; it deals with some hot topics in sampling theory, such as calibration, quantile-regression and multiple frame surveys, and with innovative methodologies in important topics of both sampling theory and applications. Contributions cut across current sampling methodologies such as interval estimation for complex samples, randomized responses, bootstrap, weighting, modeling, imputati...

  4. MaxEnt queries and sequential sampling

    International Nuclear Information System (INIS)

    Riegler, Peter; Caticha, Nestor

    2001-01-01

    In this paper we pose the question: After gathering N data points, at what value of the control parameter should the next measurement be done? We propose an on-line algorithm which samples optimally by maximizing the gain in information on the parameters to be measured. We show analytically that the information gain is maximum for those potential measurements whose outcome is most unpredictable, i.e. for which the predictive distribution has maximum entropy. The resulting algorithm is applied to exponential analysis
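
    The maximum-entropy query rule lends itself to a short numerical sketch. The toy model below, with a binary outcome y ~ Bernoulli(exp(-theta t)) and an assumed posterior ensemble over theta, illustrates the rule only; it is not the authors' exponential-analysis application, and all names and numbers are invented.

        import numpy as np

        rng = np.random.default_rng(0)
        thetas = rng.gamma(2.0, 0.5, size=2000)      # assumed posterior samples
        candidate_t = np.linspace(0.1, 5.0, 50)      # candidate control parameters

        def predictive_entropy(t):
            p = np.mean(np.exp(-thetas * t))         # predictive P(y = 1 | t)
            return -p * np.log(p) - (1 - p) * np.log(1 - p)

        gains = [predictive_entropy(t) for t in candidate_t]
        t_next = candidate_t[int(np.argmax(gains))]  # most unpredictable outcome
        print(f"measure next at t = {t_next:.2f}")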

  5. Iowa Geologic Sampling Points

    Data.gov (United States)

    Iowa State University GIS Support and Research Facility — Point locations of geologic samples/files in the IGS repository. Types of samples include well cuttings, outcrop samples, cores, drillers logs, measured sections,...

  6. Sampling device for withdrawing a representative sample from single and multi-phase flows

    Science.gov (United States)

    Apley, Walter J.; Cliff, William C.; Creer, James M.

    1984-01-01

    A fluid stream sampling device has been developed for the purpose of obtaining a representative sample from a single or multi-phase fluid flow. This objective is carried out by means of a probe which may be inserted into the fluid stream. Individual samples are withdrawn from the fluid flow by sampling ports with particular spacings, and the sampling ports are coupled to various analytical systems for characterization of the physical, thermal, and chemical properties of the fluid flow as a whole and also individually.

  7. Coordination of Conditional Poisson Samples

    Directory of Open Access Journals (Sweden)

    Grafström Anton

    2015-12-01

    Sample coordination seeks to maximize or to minimize the overlap of two or more samples. The former is known as positive coordination, and the latter as negative coordination. Positive coordination is mainly used for estimation purposes and to reduce data collection costs. Negative coordination is mainly performed to diminish the response burden of the sampled units. Poisson sampling design with permanent random numbers provides an optimum coordination degree of two or more samples. The size of a Poisson sample is, however, random. Conditional Poisson (CP) sampling is a modification of the classical Poisson sampling that produces a fixed-size πps sample. We introduce two methods to coordinate Conditional Poisson samples over time or simultaneously. The first one uses permanent random numbers and the list-sequential implementation of CP sampling. The second method uses a CP sample in the first selection and provides an approximate one in the second selection because the prescribed inclusion probabilities are not respected exactly. The methods are evaluated using the size of the expected sample overlap, and are compared with their competitors using Monte Carlo simulation. The new methods provide a good coordination degree of two samples, close to the performance of Poisson sampling with permanent random numbers.
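
    The permanent-random-number (PRN) mechanism behind this coordination fits in a few lines. The sketch below shows plain (unconditional) Poisson sampling with PRNs, the building block that the paper's conditional variant modifies; the fixed-size, list-sequential CP step is not shown, and the inclusion probabilities are illustrative.

        import numpy as np

        rng = np.random.default_rng(42)
        N = 1000
        u = rng.random(N)                # permanent random numbers, stored per unit
        pi1 = np.full(N, 0.10)           # inclusion probabilities, occasion 1
        pi2 = np.full(N, 0.12)           # inclusion probabilities, occasion 2

        s1 = u < pi1                     # Poisson sample 1 (random size)
        s2 = u < pi2                     # Poisson sample 2, positively coordinated
        # Reusing the same u maximizes the expected overlap, about min(pi1, pi2) * N.
        print(s1.sum(), s2.sum(), (s1 & s2).sum())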

  8. Sampling methods

    International Nuclear Information System (INIS)

    Loughran, R.J.; Wallbrink, P.J.; Walling, D.E.; Appleby, P.G.

    2002-01-01

    Methods for the collection of soil samples to determine levels of 137Cs and other fallout radionuclides, such as excess 210Pb and 7Be, will depend on the purposes (aims) of the project, site and soil characteristics, analytical capacity, the total number of samples that can be analysed, and the sample mass required. The latter two will depend partly on detector type and capabilities. A variety of field methods have been developed for different field conditions and circumstances over the past twenty years, many of them inherited or adapted from soil science and sedimentology. The use of 137Cs in erosion studies has been widely developed, while the application of fallout 210Pb and 7Be is still developing. Although it is possible to measure these nuclides simultaneously, it is common for experiments to be designed around the use of 137Cs alone. Caesium studies typically involve comparison of the inventories found at eroded or sedimentation sites with that of a 'reference' site. An accurate characterization of the depth distribution of these fallout nuclides is often required in order to apply and/or calibrate the conversion models. However, depending on the tracer involved, the depth distribution, and thus the sampling resolution required to define it, differs. For example, a depth resolution of 1 cm is often adequate when using 137Cs. However, fallout 210Pb and 7Be commonly have very strong surface maxima that decrease exponentially with depth, and fine depth increments are required at or close to the soil surface. Consequently, different depth incremental sampling methods are required when using different fallout radionuclides. Geomorphic investigations also frequently require determination of the depth distribution of fallout nuclides on slopes and depositional sites as well as their total inventories.

  9. Applications of Liquid-Phase Microextraction in the Sample Preparation of Environmental Solid Samples

    Directory of Open Access Journals (Sweden)

    Helena Prosen

    2014-05-01

    Solvent extraction remains one of the fundamental sample preparation techniques in the analysis of environmental solid samples, but organic solvents are toxic and environmentally harmful, therefore one of the possible greening directions is its miniaturization. The present review covers the relevant research from the field of application of microextraction to the sample preparation of environmental solid samples (soil, sediments, sewage sludge, dust etc.) published in the last decade. Several innovative liquid-phase microextraction (LPME) techniques that have emerged recently have also been applied as an aid in sample preparation of these samples: single-drop microextraction (SDME), hollow fiber-liquid phase microextraction (HF-LPME), dispersive liquid-liquid microextraction (DLLME). Besides the common organic solvents, surfactants and ionic liquids are also used. However, these techniques have to be combined with another technique to release the analytes from the solid sample into an aqueous solution. In the present review, the published methods were categorized into three groups: LPME in combination with a conventional solvent extraction; LPME in combination with an environmentally friendly extraction; LPME without previous extraction. The applicability of these approaches to the sample preparation for the determination of pollutants in solid environmental samples is discussed, with emphasis on their strengths, weak points and environmental impact.

  10. Applications of liquid-phase microextraction in the sample preparation of environmental solid samples.

    Science.gov (United States)

    Prosen, Helena

    2014-05-23

    Solvent extraction remains one of the fundamental sample preparation techniques in the analysis of environmental solid samples, but organic solvents are toxic and environmentally harmful, therefore one of the possible greening directions is its miniaturization. The present review covers the relevant research from the field of application of microextraction to the sample preparation of environmental solid samples (soil, sediments, sewage sludge, dust etc.) published in the last decade. Several innovative liquid-phase microextraction (LPME) techniques that have emerged recently have also been applied as an aid in sample preparation of these samples: single-drop microextraction (SDME), hollow fiber-liquid phase microextraction (HF-LPME), dispersive liquid-liquid microextraction (DLLME). Besides the common organic solvents, surfactants and ionic liquids are also used. However, these techniques have to be combined with another technique to release the analytes from the solid sample into an aqueous solution. In the present review, the published methods were categorized into three groups: LPME in combination with a conventional solvent extraction; LPME in combination with an environmentally friendly extraction; LPME without previous extraction. The applicability of these approaches to the sample preparation for the determination of pollutants in solid environmental samples is discussed, with emphasis on their strengths, weak points and environmental impact.

  11. High speed network sampling

    OpenAIRE

    Rindalsholt, Ole Arild

    2005-01-01

    Master's thesis in network and system administration. Classical sampling methods play an important role in the current practice of Internet measurement. With today's high speed networks, routers cannot manage to generate complete Netflow data for every packet; they have to perform restricted sampling. This thesis summarizes some of the most important sampling schemes and their applications before diving into an analysis of the effect of sampling Netflow records.

  12. Generalized sampling in Julia

    DEFF Research Database (Denmark)

    Jacobsen, Christian Robert Dahl; Nielsen, Morten; Rasmussen, Morten Grud

    2017-01-01

    Generalized sampling is a numerically stable framework for obtaining reconstructions of signals in different bases and frames from their samples. For example, one can use wavelet bases for reconstruction given frequency measurements. In this paper, we will introduce a carefully documented toolbox ... for performing generalized sampling in Julia. Julia is a new language for technical computing with focus on performance, which is ideally suited to handle the large size problems often encountered in generalized sampling. The toolbox provides specialized solutions for the setup of Fourier bases and wavelets. ... The performance of the toolbox is compared to existing implementations of generalized sampling in MATLAB.

  13. How iSamples (Internet of Samples in the Earth Sciences) Improves Sample and Data Stewardship in the Next Generation of Geoscientists

    Science.gov (United States)

    Hallett, B. W.; Dere, A. L. D.; Lehnert, K.; Carter, M.

    2016-12-01

    Vast numbers of physical samples are routinely collected by geoscientists to probe key scientific questions related to global climate change, biogeochemical cycles, magmatic processes, mantle dynamics, etc. Despite their value as irreplaceable records of nature, the majority of these samples remain undiscoverable by the broader scientific community because they lack a digital presence or are not well-documented enough to facilitate their discovery and reuse for future scientific and educational use. The NSF EarthCube iSamples Research Coordination Network seeks to develop a unified approach across all Earth Science disciplines for the registration, description, identification, and citation of physical specimens in order to take advantage of the new opportunities that cyberinfrastructure offers. Even as consensus around best practices begins to emerge, such as the use of the International Geo Sample Number (IGSN), more work is needed to communicate these practices to investigators to encourage widespread adoption. Recognizing the importance of students and early career scientists in particular to transforming data and sample management practices, the iSamples Education and Training Working Group is developing training modules for sample collection, documentation, and management workflows. These training materials are made available to educators/research supervisors online at http://earthcube.org/group/isamples and can be modularized for supervisors to create a customized research workflow. This study details the design and development of several sample management tutorials, created by early career scientists and documented in collaboration with undergraduate research students in field and lab settings. Modules under development focus on rock outcrops, rock cores, soil cores, and coral samples, with an emphasis on sample management throughout the collection, analysis and archiving process. We invite others to share their sample management/registration workflows and to

  14. Low-sampling-rate ultra-wideband channel estimation using equivalent-time sampling

    KAUST Repository

    Ballal, Tarig

    2014-09-01

    In this paper, a low-sampling-rate scheme for ultra-wideband channel estimation is proposed. The scheme exploits multiple observations generated by transmitting multiple pulses. In the proposed scheme, P pulses are transmitted to produce channel impulse response estimates at a desired sampling rate, while the ADC samples at a rate that is P times slower. To avoid loss of fidelity, the number of sampling periods (based on the desired rate) in the inter-pulse interval is restricted to be co-prime with P. This condition is violated when clock drift is present and the transmitted pulse locations change. To handle this case, and to achieve an overall good channel estimation performance without using prior information, we derive an improved estimator based on the bounded data uncertainty (BDU) model. It is shown that this estimator is related to the Bayesian linear minimum mean squared error (LMMSE) estimator. The channel estimation performance of the proposed sub-sampling scheme combined with the new estimator is assessed in simulation. The results show that a large reduction in sampling rate can be achieved. The proposed estimator outperforms the least squares estimator in almost all cases, while in the high SNR regime it also outperforms the LMMSE estimator. In addition to channel estimation, a synchronization method is also proposed that utilizes the same pulse sequence used for channel estimation. © 2014 IEEE.
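
    The co-prime restriction is easy to verify numerically: with P pulses and an inter-pulse interval of L desired-rate sampling periods, a P-times-slower ADC visits every sampling phase exactly when gcd(L, P) = 1. The sketch below uses illustrative values and only checks that counting argument, not the estimators themselves.

        from math import gcd

        P, L = 5, 8                    # ADC P times slower; L co-prime with P
        assert gcd(L, P) == 1
        offsets = {(p * L) % P for p in range(P)}   # phase of pulse p's samples
        print(sorted(offsets))         # [0, 1, 2, 3, 4]: all phases covered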

  15. Early Educational Intervention, Early Cumulative Risk, and the Early Home Environment as Predictors of Young Adult Outcomes within a High-Risk Sample

    Science.gov (United States)

    Pungello, Elizabeth P.; Kainz, Kirsten; Burchinal, Margaret; Wasik, Barbara H.; Sparling, Joseph J.; Ramey, Craig T.; Campbell, Frances A.

    2010-01-01

    The extent to which early educational intervention, early cumulative risk, and the early home environment were associated with young adult outcomes was investigated in a sample of 139 young adults (age 21) from high-risk families enrolled in randomized trials of early intervention. Positive effects of treatment were found for education attainment,…

  16. Test sample handling apparatus

    International Nuclear Information System (INIS)

    1981-01-01

    A test sample handling apparatus using automatic scintillation counting for gamma detection, for use in such fields as radioimmunoassay, is described. The apparatus automatically and continuously counts large numbers of samples rapidly and efficiently by the simultaneous counting of two samples. By means of sequential ordering of non-sequential counting data, it is possible to obtain precisely ordered data while utilizing sample carrier holders having a minimum length. (U.K.)

  17. The Effect of Asymmetrical Sample Training on Retention Functions for Hedonic Samples in Rats

    Science.gov (United States)

    Simmons, Sabrina; Santi, Angelo

    2012-01-01

    Rats were trained in a symbolic delayed matching-to-sample task to discriminate sample stimuli that consisted of the presence of food or the absence of food. Asymmetrical sample training was provided in which one group was initially trained with only the food sample and the other group was initially trained with only the no-food sample. In…

  18. Green approaches in sample preparation of bioanalytical samples prior to chromatographic analysis.

    Science.gov (United States)

    Filippou, Olga; Bitas, Dimitrios; Samanidou, Victoria

    2017-02-01

    Sample preparation is considered the most challenging step of the analytical procedure: it affects the whole analytical methodology and therefore contributes significantly to the greenness, or lack of it, of the entire process. The basis for greening sample preparation and analytical methods is the elimination of sample treatment steps, together with reductions in the amount of sample, in the consumption of hazardous reagents and energy, and in the use of large amounts of organic solvents, while maximizing safety for operators and the environment. In the last decade, the development and utilization of greener and sustainable microextraction techniques has offered an alternative to classical sample preparation procedures. In this review, the main green microextraction techniques (solid-phase microextraction, stir bar sorptive extraction, hollow-fiber liquid-phase microextraction, dispersive liquid-liquid microextraction, etc.) are presented, with special attention to bioanalytical applications of these environment-friendly sample preparation techniques which comply with the green analytical chemistry principles. Copyright © 2016 Elsevier B.V. All rights reserved.

  19. Toward cost-efficient sampling methods

    Science.gov (United States)

    Luo, Peng; Li, Yongli; Wu, Chong; Zhang, Guijie

    2015-09-01

    Sampling methods have received much attention in the field of complex networks in general and statistical physics in particular. This paper proposes two new sampling methods based on the idea that a small set of vertices with high node degree can carry most of the structural information of a complex network. The two proposed sampling methods sample high-degree nodes efficiently, so they remain useful even when the sampling rate is low, which makes them cost-efficient. The first new sampling method is developed on the basis of the widely used stratified random sampling (SRS) method, and the second improves the well-known snowball sampling (SBS) method. To demonstrate the validity and accuracy of the two new sampling methods, we compare them with existing sampling methods on three commonly used simulated networks (scale-free, random, and small-world) and on two real networks. The experimental results illustrate that the two proposed sampling methods perform much better than the existing sampling methods in recovering the true network structure characteristics reflected by the clustering coefficient, Bonacich centrality and average path length, especially when the sampling rate is low.
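
    To illustrate the second idea, here is a minimal sketch of a degree-biased snowball sampler in which each wave follows the highest-degree unvisited neighbors first. The function, the three-peers-per-node limit and the test graph are assumptions for illustration, not the paper's exact algorithm.

        import networkx as nx

        def degree_biased_snowball(G, seed, budget):
            visited, frontier = {seed}, [seed]
            while frontier and len(visited) < budget:
                nxt = []
                for v in frontier:
                    nbrs = sorted((n for n in G.neighbors(v) if n not in visited),
                                  key=G.degree, reverse=True)
                    for n in nbrs[:3]:           # keep the 3 best-connected peers
                        visited.add(n)
                        nxt.append(n)
                        if len(visited) >= budget:
                            return visited
                frontier = nxt
            return visited

        G = nx.barabasi_albert_graph(1000, 3, seed=0)
        sample = degree_biased_snowball(G, seed=0, budget=100)
        mean_deg = sum(d for _, d in G.degree(sample)) / len(sample)
        print(f"mean degree in sample: {mean_deg:.1f}")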

  20. Final Sampling and Analysis Plan for Background Sampling, Fort Sheridan, Illinois

    National Research Council Canada - National Science Library

    1995-01-01

    .... This Background Sampling and Analysis Plan (BSAP) is designed to address this issue through the collection of additional background samples at Fort Sheridan to support the statistical analysis and the Baseline Risk Assessment (BRA...

  1. Low-sampling-rate ultra-wideband digital receiver using equivalent-time sampling

    KAUST Repository

    Ballal, Tarig

    2014-09-01

    In this paper, we propose an all-digital scheme for ultra-wideband symbol detection. In the proposed scheme, the received symbols are sampled many times below the Nyquist rate. It is shown that when the number of symbol repetitions, P, is co-prime with the symbol duration given in Nyquist samples, the receiver can sample the received data P times below the Nyquist rate, without loss of fidelity. The proposed scheme is applied to perform channel estimation and binary pulse position modulation (BPPM) detection. Results are presented for two receivers operating at two different sampling rates that are 10 and 20 times below the Nyquist rate. The feasibility of the proposed scheme is demonstrated in different scenarios, with reasonable bit error rates obtained in most of the cases.

  2. Low-sampling-rate ultra-wideband digital receiver using equivalent-time sampling

    KAUST Repository

    Ballal, Tarig; Al-Naffouri, Tareq Y.

    2014-01-01

    In this paper, we propose an all-digital scheme for ultra-wideband symbol detection. In the proposed scheme, the received symbols are sampled many times below the Nyquist rate. It is shown that when the number of symbol repetitions, P, is co-prime with the symbol duration given in Nyquist samples, the receiver can sample the received data P times below the Nyquist rate, without loss of fidelity. The proposed scheme is applied to perform channel estimation and binary pulse position modulation (BPPM) detection. Results are presented for two receivers operating at two different sampling rates that are 10 and 20 times below the Nyquist rate. The feasibility of the proposed scheme is demonstrated in different scenarios, with reasonable bit error rates obtained in most of the cases.

  3. 'Intelligent' approach to radioimmunoassay sample counting employing a microprocessor controlled sample counter

    International Nuclear Information System (INIS)

    Ekins, R.P.; Sufi, S.; Malan, P.G.

    1977-01-01

    The enormous impact on medical science in the last two decades of microanalytical techniques employing radioisotopic labels has, in turn, generated a large demand for automatic radioisotopic sample counters. Such instruments frequently comprise the most important item of capital equipment required in the use of radioimmunoassay and related techniques and often form a principal bottleneck in the flow of samples through a busy laboratory. It is therefore particularly imperative that such instruments should be used 'intelligently' and in an optimal fashion to avoid both the very large capital expenditure involved in the unnecessary proliferation of instruments and the time delays arising from their sub-optimal use. The majority of the current generation of radioactive sample counters nevertheless rely on primitive control mechanisms based on a simplistic statistical theory of radioactive sample counting which preclude their efficient and rational use. The fundamental principle upon which this approach is based is that it is useless to continue counting a radioactive sample for a time longer than that required to yield a significant increase in precision of the measurement. Thus, since substantial experimental errors occur during sample preparation, these errors should be assessed and must be related to the counting errors for that sample. It is the objective of this presentation to demonstrate that the combination of a realistic statistical assessment of radioactive sample measurement, together with the more sophisticated control mechanisms that modern microprocessor technology makes possible, may often enable savings in counter usage of the order of 5-10 fold to be made. (orig.)
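
    The stopping principle can be made concrete with a small calculation: counting continues only while the Poisson counting error still adds materially to the fixed sample-preparation error. All values below are illustrative assumptions.

        import math

        prep_cv = 0.05          # assumed relative error from sample preparation
        rate = 200.0            # observed count rate, counts per second
        threshold = 0.10        # stop once counting adds < 10% to the total CV

        t = 0.0
        while True:
            t += 1.0
            counts = rate * t
            counting_cv = 1.0 / math.sqrt(counts)        # Poisson relative error
            total_cv = math.hypot(prep_cv, counting_cv)  # errors add in quadrature
            if total_cv < prep_cv * (1 + threshold):
                break
        print(f"stop after {t:.0f} s; counting CV {counting_cv:.4f} vs prep CV {prep_cv}")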

  4. Visualizing the Sample Standard Deviation

    Science.gov (United States)

    Sarkar, Jyotirmoy; Rashid, Mamunur

    2017-01-01

    The standard deviation (SD) of a random sample is defined as the square-root of the sample variance, which is the "mean" squared deviation of the sample observations from the sample mean. Here, we interpret the sample SD as the square-root of twice the mean square of all pairwise half deviations between any two sample observations. This…
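
    The interpretation rests on a standard identity; a sketch in LaTeX, writing the pairwise half deviations as h_ij = (x_i - x_j)/2:

        \[
        s^2 \;=\; \frac{1}{n-1}\sum_{i=1}^{n}(x_i-\bar{x})^2
        \;=\; \frac{1}{n(n-1)}\sum_{i<j}(x_i-x_j)^2
        \;=\; 2\cdot\frac{1}{\binom{n}{2}}\sum_{i<j}\Bigl(\frac{x_i-x_j}{2}\Bigr)^{2},
        \]

    so the sample SD is indeed the square root of twice the mean square of all pairwise half deviations.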

  5. Radioactivity in environmental samples

    International Nuclear Information System (INIS)

    Fornaro, Laura

    2001-01-01

    The objective of this practical work is to familiarize the student with radioactivity measurements in environmental samples. For that purpose, the samples chosen were a salt of natural potassium, a salt of uranium or thorium, and a sample of drinking water.

  6. Applications of Liquid-Phase Microextraction in the Sample Preparation of Environmental Solid Samples

    OpenAIRE

    Helena Prosen

    2014-01-01

    Solvent extraction remains one of the fundamental sample preparation techniques in the analysis of environmental solid samples, but organic solvents are toxic and environmentally harmful, therefore one of the possible greening directions is its miniaturization. The present review covers the relevant research from the field of application of microextraction to the sample preparation of environmental solid samples (soil, sediments, sewage sludge, dust etc.) published in the last decade. Several...

  7. Perpendicular distance sampling: an alternative method for sampling downed coarse woody debris

    Science.gov (United States)

    Michael S. Williams; Jeffrey H. Gove

    2003-01-01

    Coarse woody debris (CWD) plays an important role in many forest ecosystem processes. In recent years, a number of new methods have been proposed to sample CWD. These methods select individual logs into the sample using some form of unequal probability sampling. One concern with most of these methods is the difficulty in estimating the volume of each log. A new method...

  8. Jenis Sample: Keuntungan dan Kerugiannya

    OpenAIRE

    Suprapto, Agus

    1994-01-01

    A sample is a part of a population that is used in a study for the purpose of making estimations about the nature of the total population, and it is obtained with a sampling technique. Sampling techniques are more advantageous than a census because they can reduce cost and time, and they can gather deeper information and more accurate data. It is useful to distinguish two major types of sampling techniques. First, probability sampling, i.e. simple random sampling. Second, non-probability sampling, i.e. systematic samplin...

  9. A method of language sampling

    DEFF Research Database (Denmark)

    Rijkhoff, Jan; Bakker, Dik; Hengeveld, Kees

    1993-01-01

    In recent years more attention has been paid to the quality of language samples in typological work. Without an adequate sampling strategy, samples may suffer from various kinds of bias. In this article we propose a sampling method in which the genetic criterion is taken as the most important: samples ... to determine how many languages from each phylum should be selected, given any required sample size.

  10. Wet gas sampling

    Energy Technology Data Exchange (ETDEWEB)

    Welker, T.F.

    1997-07-01

    The quality of gas has changed drastically in the past few years. Most gas is wet with hydrocarbons, water, and heavier contaminants that tend to condense if not handled properly. If a gas stream is contaminated with condensables, the sampling of that stream must be done in a manner that will ensure all of the components in the stream are introduced into the sample container as the composite. The sampling and handling of wet gas is extremely difficult under ideal conditions, and there are no ideal conditions in the real world. Offshore operations and other wet gas systems, as well as transportation of the sample, pose additional problems that must be overcome if the analysis is to mean anything to the producer and gatherer. The sampling of wet gas systems is decidedly more difficult than sampling conventional dry gas systems. Wet gas systems generally result in the measurement of one heating value at the inlet of the pipe and a drastically reduced heating value at the outlet end of the system. This is caused by the fallout or accumulation of the heavier products that, at the inlet, may be in the vapor state in the pipeline; hence the high gravity and high BTU. But, in fact, because of pressure and temperature variances, these liquids condense and form a liquid that actually runs down the pipe as a stream or accumulates in drips to be blown from the system. (author)

  11. An algorithm to improve sampling efficiency for uncertainty propagation using sampling based method

    International Nuclear Information System (INIS)

    Campolina, Daniel; Lima, Paulo Rubens I.; Pereira, Claubia; Veloso, Maria Auxiliadora F.

    2015-01-01

    Sample size and computational uncertainty were varied in order to investigate the sampling efficiency and convergence of the sampling-based method for uncertainty propagation. The transport code MCNPX was used to simulate a LWR model and to map uncertain inputs of the benchmark experiment to uncertain outputs. Random sampling efficiency was improved through the use of an algorithm for selecting distributions. Mean range, standard deviation range and skewness were verified in order to obtain a better representation of the uncertainty figures. A standard deviation of 5 pcm in the propagated uncertainties over 10 replicates of n samples was adopted as the convergence criterion for the method. An uncertainty of 75 pcm on the reactor k_eff was estimated by using a sample of size 93 and a computational uncertainty of 28 pcm to propagate the 1σ uncertainty of the burnable poison radius. For a fixed computational time, in order to reduce the variance of the propagated uncertainty, it was found, for the example under investigation, that it is preferable to double the sample size rather than to double the number of particles followed by the Monte Carlo process in the MCNPX code. (author)
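
    The replicate-based convergence check described above can be imitated with a toy model standing in for the transport code: propagate an uncertain input through the model repeatedly and watch the spread of the propagated standard deviation shrink as the sample size grows. The quadratic "model" and every number below are assumptions for illustration only.

        import numpy as np

        rng = np.random.default_rng(7)

        def model(radius):              # placeholder for a code run (MCNPX there)
            return 1.0 + 0.02 * radius + 0.005 * radius**2

        def propagated_sd(sample_size):
            radii = rng.normal(1.0, 0.05, size=sample_size)  # 1-sigma input
            return np.std(model(radii), ddof=1)

        for n in (20, 93, 400):
            reps = [propagated_sd(n) for _ in range(10)]
            print(n, f"spread of propagated sd over replicates: {np.std(reps):.2e}")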

  12. Mars Sample Handling Functionality

    Science.gov (United States)

    Meyer, M. A.; Mattingly, R. L.

    2018-04-01

    The final leg of a Mars Sample Return campaign would be an entity that we have referred to as Mars Returned Sample Handling (MRSH.) This talk will address our current view of the functional requirements on MRSH, focused on the Sample Receiving Facility (SRF).

  13. Big Data, Small Sample.

    Science.gov (United States)

    Gerlovina, Inna; van der Laan, Mark J; Hubbard, Alan

    2017-05-20

    Multiple comparisons and small sample size, common characteristics of many types of "Big Data" including those that are produced by genomic studies, present specific challenges that affect reliability of inference. Use of multiple testing procedures necessitates calculation of very small tail probabilities of a test statistic distribution. Results based on large deviation theory provide a formal condition that is necessary to guarantee error rate control given practical sample sizes, linking the number of tests and the sample size; this condition, however, is rarely satisfied. Using methods that are based on Edgeworth expansions (relying especially on the work of Peter Hall), we explore the impact of departures of sampling distributions from typical assumptions on actual error rates. Our investigation illustrates how far the actual error rates can be from the declared nominal levels, suggesting potentially widespread problems with error rate control, specifically excessive false positives. This is an important factor that contributes to the "reproducibility crisis". We also review some other commonly used methods (such as permutation and methods based on finite sampling inequalities) in their application to multiple testing/small sample data. We point out that Edgeworth expansions, providing higher order approximations to the sampling distribution, offer a promising direction for data analysis that could improve reliability of studies relying on large numbers of comparisons with modest sample sizes.
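
    The core phenomenon is easy to reproduce: with a skewed null distribution, a small sample and a very small nominal alpha, the realized type-I error of a t-test drifts far from its target. The simulation below is an illustrative sketch, not the authors' Edgeworth-based analysis.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        n, n_tests, alpha = 5, 100_000, 1e-4   # small n, many tests, tiny alpha

        x = rng.exponential(1.0, size=(n_tests, n)) - 1.0   # skewed nulls, mean 0
        t = x.mean(axis=1) / (x.std(axis=1, ddof=1) / np.sqrt(n))
        realized = np.mean(np.abs(t) > stats.t.ppf(1 - alpha / 2, df=n - 1))
        print(f"realized error {realized:.1e} vs nominal {alpha:.1e}")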

  14. Sampling Strategies and Processing of Biobank Tissue Samples from Porcine Biomedical Models.

    Science.gov (United States)

    Blutke, Andreas; Wanke, Rüdiger

    2018-03-06

    In translational medical research, porcine models have steadily become more popular. Considering the high value of individual animals, particularly of genetically modified pig models, and the often-limited number of available animals of these models, the establishment of (biobank) collections of adequately processed tissue samples suited for a broad spectrum of subsequent analysis methods, including analyses not specified at the time point of sampling, represents a meaningful approach to take full advantage of the translational value of the model. With respect to the peculiarities of porcine anatomy, comprehensive guidelines have recently been established for standardized generation of representative, high-quality samples from different porcine organs and tissues. These guidelines are essential prerequisites for the reproducibility of results and their comparability between different studies and investigators. The recording of basic data, such as organ weights and volumes, the determination of the sampling locations and of the numbers of tissue samples to be generated, as well as their orientation, size, processing and trimming directions, are relevant factors determining the generalizability and usability of the specimens for molecular, qualitative, and quantitative morphological analyses. Here, an illustrative, practical, step-by-step demonstration of the most important techniques for generation of representative, multi-purpose biobank specimens from porcine tissues is presented. The methods described here include determination of organ/tissue volumes and densities, the application of a volume-weighted systematic random sampling procedure for parenchymal organs by point-counting, determination of the extent of tissue shrinkage related to histological embedding of samples, and generation of randomly oriented samples for quantitative stereological analyses, such as isotropic uniform random (IUR) sections generated by the "Orientator" and "Isector" methods, and vertical

  15. PFP Wastewater Sampling Facility

    International Nuclear Information System (INIS)

    Hirzel, D.R.

    1995-01-01

    This test report documents the results obtained while conducting operational testing of the sampling equipment in the 225-WC building, the PFP Wastewater Sampling Facility. The Wastewater Sampling Facility houses equipment to sample and monitor the PFP's liquid effluents before discharging the stream to the 200 Area Treated Effluent Disposal Facility (TEDF). The majority of the streams are not radioactive and consist of discharges from the PFP heating, ventilation, and air conditioning (HVAC) systems. The streams that might be contaminated are processed through the Low Level Waste Treatment Facility (LLWTF) before discharging to TEDF. The sampling equipment consists of two flow-proportional composite samplers, an ultrasonic flowmeter, pH and conductivity monitors, a chart recorder, and associated relays and current isolators to interconnect the equipment to allow proper operation. Data signals from the monitors are received in the 234-5Z Shift Office, which contains a chart recorder and alarm annunciator panel. The data signals are also duplicated and sent to the TEDF control room through the Local Control Unit (LCU). Performing the OTP has verified the operability of the PFP wastewater sampling system. This Operability Test Report documents the acceptance of the sampling system for use

  16. Associations between ADHD symptoms and smoking outcome expectancies in a non-clinical sample of daily cigarette smokers.

    Science.gov (United States)

    Goldenson, Nicholas I; Pang, Raina D; Leventhal, Adam M

    2016-03-01

    Smoking outcome expectancies for positive reinforcement (PR: beliefs that smoking produces desirable outcomes) and negative reinforcement (NR: beliefs that smoking alleviates negative affect) are modifiable cognitive manifestations of affect-mediated smoking motivation. Based on prior data and theory, we hypothesized that NR and PR expectancies are associated with ADHD symptom levels in a non-clinical sample of cigarette smokers. Daily cigarette smokers (N = 256) completed self-report measures of ADHD symptoms and smoking outcome expectancies. Cross-sectional associations of overall ADHD symptomatology and the ADHD symptom dimensions of inattention (IN: difficulty concentrating and distractibility) and hyperactivity-impulsivity (HI: poor inhibitory control and motor activity restlessness) with PR and NR smoking outcome expectancies were examined. Higher levels of overall, IN and HI ADHD symptoms were positively associated with NR smoking expectancies after statistically controlling for anxiety, depression, alcohol/drug use problems, nicotine dependence, and other smoking expectancies. Although neither HI nor IN symptom dimensions exhibited empirically unique relations to NR expectancies over and above one another, the collective variance across IN and HI was associated with NR expectancies. PR expectancies were not associated with ADHD symptoms. Although PR and NR expectancies may be important etiological influences in the overall population of smokers, NR outcome expectancies appear to be disproportionately expressed in smokers with elevated ADHD symptoms. Cognitive manifestations of NR motivation, which may be modifiable via intervention, are prominent in smokers with elevated ADHD symptoms. Beliefs that smoking alleviates negative affect may underlie ADHD-smoking comorbidity. © American Academy of Addiction Psychiatry.

  17. Some connections between importance sampling and enhanced sampling methods in molecular dynamics.

    Science.gov (United States)

    Lie, H C; Quer, J

    2017-11-21

    In molecular dynamics, enhanced sampling methods enable the collection of better statistics of rare events from a reference or target distribution. We show that a large class of these methods is based on the idea of importance sampling from mathematical statistics. We illustrate this connection by comparing the Hartmann-Schütte method for rare event simulation (J. Stat. Mech. Theor. Exp. 2012, P11004) and the Valsson-Parrinello method of variationally enhanced sampling [Phys. Rev. Lett. 113, 090601 (2014)]. We use this connection in order to discuss how recent results from the Monte Carlo methods literature can guide the development of enhanced sampling methods.
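
    The identity in question is E_p[f(X)] = E_q[f(X) p(X)/q(X)]: sample from a biased (enhanced) distribution q, then reweight by the likelihood ratio. A minimal rare-event sketch with illustrative Gaussian densities:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(3)
        p = stats.norm(0.0, 1.0)        # target distribution
        q = stats.norm(3.0, 1.0)        # proposal centered on the rare event

        x = q.rvs(size=100_000, random_state=rng)
        weights = p.pdf(x) / q.pdf(x)   # likelihood ratios
        estimate = np.mean((x > 3.0) * weights)
        print(f"IS estimate {estimate:.2e} vs exact {p.sf(3.0):.2e}")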

  18. Experimental determination of size distributions: analyzing proper sample sizes

    International Nuclear Information System (INIS)

    Buffo, A; Alopaeus, V

    2016-01-01

    The measurement of various particle size distributions is a crucial aspect for many applications in the process industry. Size distribution is often related to the final product quality, as in crystallization or polymerization. In other cases it is related to the correct evaluation of heat and mass transfer and reaction rates, which depend on the interfacial area between the different phases, or to the assessment of yield stresses of polycrystalline metal/alloy samples. The experimental determination of such distributions often involves laborious sampling procedures, and the statistical significance of the outcome is rarely investigated. In this work, we propose a novel rigorous tool, based on inferential statistics, to determine the number of samples needed to obtain reliable measurements of size distribution, according to specific requirements defined a priori. Such methodology can be adopted regardless of the measurement technique used. (paper)
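
    One simple instance of such an a priori requirement is fixing the half-width of the confidence interval on the mean size. The sketch below, with invented numbers, shows the type of calculation involved; it is not the authors' tool, which addresses the full distribution.

        from scipy import stats

        pilot_sd = 12.0      # micron, size sd from a pilot sample (assumed)
        half_width = 2.0     # micron, required CI half-width
        conf = 0.95

        z = stats.norm.ppf(0.5 + conf / 2)
        n = (z * pilot_sd / half_width) ** 2
        print(f"need about {round(n)} sampled particles")   # ~138 here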

  19. Reactor water sampling device

    International Nuclear Information System (INIS)

    Sakamaki, Kazuo.

    1992-01-01

    The present invention concerns a reactor water sampling device for sampling reactor water in an in-core monitor (neutron measuring tube) housing in a BWR type reactor. The upper end portion of a drain pipe of the reactor water sampling device is attached detachably to an in-core monitor flange. A push-up rod is inserted in the drain pipe so as to be vertically movable. A sampling vessel and a vacuum pump are connected to the lower end of the drain pipe. The vacuum pump is operated to depressurize the inside of the device and move the push-up rod upward. Reactor water in the in-core monitor housing flows between the drain pipe and the push-up rod and into the sampling vessel. With such a constitution, reactor water in the in-core monitor housing can be sampled rapidly without opening the lid of the reactor pressure vessel and without contact with air. Accordingly, operator's exposure dose can be reduced. (I.N.)

  20. Sequential sampling: a novel method in farm animal welfare assessment.

    Science.gov (United States)

    Heath, C A E; Main, D C J; Mullan, S; Haskell, M J; Browne, W J

    2016-02-01

    Lameness in dairy cows is an important welfare issue. As part of a welfare assessment, herd level lameness prevalence can be estimated from scoring a sample of animals, where higher levels of accuracy are associated with larger sample sizes. As the financial cost is related to the number of cows sampled, smaller samples are preferred. Sequential sampling schemes have been used for informing decision making in clinical trials. Sequential sampling involves taking samples in stages, where sampling can stop early depending on the estimated lameness prevalence. When welfare assessment is used for a pass/fail decision, a similar approach could be applied to reduce the overall sample size. The sampling schemes proposed here apply the principles of sequential sampling within a diagnostic testing framework. This study develops three sequential sampling schemes of increasing complexity to classify 80 fully assessed UK dairy farms, each with known lameness prevalence. Using the Welfare Quality herd-size-based sampling scheme, the first 'basic' scheme involves two sampling events. At the first sampling event half the Welfare Quality sample size is drawn, and then depending on the outcome, sampling either stops or is continued and the same number of animals is sampled again. In the second 'cautious' scheme, an adaptation is made to ensure that correctly classifying a farm as 'bad' is done with greater certainty. The third scheme is the only scheme to go beyond lameness as a binary measure and investigates the potential for increasing accuracy by incorporating the number of severely lame cows into the decision. The three schemes are evaluated with respect to accuracy and average sample size by running 100 000 simulations for each scheme, and a comparison is made with the fixed size Welfare Quality herd-size-based sampling scheme. All three schemes performed almost as well as the fixed size scheme but with much smaller average sample sizes. For the third scheme, an overall
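
    A minimal sketch of the 'basic' two-stage scheme described above, with invented herd-sample size, pass/fail threshold and stopping margin in place of the Welfare Quality values:

        import numpy as np

        rng = np.random.default_rng(5)

        def two_stage(herd_prevalence, full_n=60, threshold=0.15, margin=0.07):
            first = rng.random(full_n // 2) < herd_prevalence
            p1 = first.mean()
            if abs(p1 - threshold) > margin:      # decisive: stop after stage 1
                return p1 > threshold, full_n // 2
            second = rng.random(full_n - full_n // 2) < herd_prevalence
            p = np.concatenate([first, second]).mean()
            return p > threshold, full_n

        fails, sizes = zip(*(two_stage(0.25) for _ in range(10_000)))
        print(f"fail rate {np.mean(fails):.2f}, mean sample size {np.mean(sizes):.1f}")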

  1. Coupling methods for multistage sampling

    OpenAIRE

    Chauvet, Guillaume

    2015-01-01

    Multistage sampling is commonly used for household surveys when there exists no sampling frame, or when the population is scattered over a wide area. Multistage sampling usually introduces a complex dependence in the selection of the final units, which makes asymptotic results quite difficult to prove. In this work, we consider multistage sampling with simple random sampling without replacement at the first stage, and with an arbitrary sampling design for further stages. We consider coupling ...

  2. Cr(VI) generation during sample preparation of solid samples – A ...

    African Journals Online (AJOL)

    Cr(VI) generation during sample preparation of solid samples – A chromite ore case study. R.I Glastonbury, W van der Merwe, J.P Beukes, P.G van Zyl, G Lachmann, C.J.H Steenkamp, N.F Dawson, M.H Stewart ...

  3. An evaluation of soil sampling for 137Cs using various field-sampling volumes.

    Science.gov (United States)

    Nyhan, J W; White, G C; Schofield, T G; Trujillo, G

    1983-05-01

    The sediments from a liquid effluent receiving area at the Los Alamos National Laboratory and soils from an intensive study area in the fallout pathway of Trinity were sampled for 137Cs using 25-, 500-, 2500- and 12,500-cm3 field sampling volumes. A highly replicated sampling program was used to determine mean concentrations and inventories of 137Cs at each site, as well as estimates of the spatial, aliquoting, and counting variance components of the radionuclide data. The sampling methods were also analyzed as a function of the soil size fractions collected in each field sampling volume and of the total cost of the program for a given variation in the radionuclide survey results. Coefficients of variation (CV) of 137Cs inventory estimates ranged from 0.063 to 0.14 for Mortandad Canyon sediments, whereas CV values for Trinity soils ranged from 0.38 to 0.57. Spatial variance components of 137Cs concentration data were usually found to be larger than either the aliquoting or counting variance estimates and were inversely related to field sampling volume at the Trinity intensive site. Subsequent optimization studies of the sampling schemes demonstrated that each aliquot should be counted once, and that only 2-4 aliquots out of as many as 30 collected need be assayed for 137Cs. The optimization studies showed that as sample costs increased to 45 man-hours of labor per sample, the variance of the mean 137Cs concentration decreased dramatically, but decreased very little with additional labor.
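
    The optimization rests on a nested variance decomposition: with n field samples, a aliquots per sample and c counts per aliquot, the variance of a site mean is roughly var_spatial/n + var_aliquot/(n a) + var_count/(n a c), so extra aliquots or repeat counts pay off only while their component dominates. The brute-force sketch below uses assumed component and cost values, not the study's data.

        import itertools

        var_spatial, var_aliquot, var_count = 1.00, 0.10, 0.05  # relative components
        cost_sample, cost_aliquot, cost_count = 10.0, 1.0, 0.5  # man-hours (assumed)
        budget = 400.0

        best = None
        for n, a, c in itertools.product(range(2, 40), range(1, 31), range(1, 4)):
            cost = n * (cost_sample + a * (cost_aliquot + c * cost_count))
            if cost > budget:
                continue
            var = var_spatial / n + var_aliquot / (n * a) + var_count / (n * a * c)
            if best is None or var < best[0]:
                best = (var, n, a, c)
        print("variance %.4f with n=%d samples, a=%d aliquots, c=%d counts" % best)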

  4. Uniform Sampling Table Method and its Applications II--Evaluating the Uniform Sampling by Experiment.

    Science.gov (United States)

    Chen, Yibin; Chen, Jiaxi; Chen, Xuan; Wang, Min; Wang, Wei

    2015-01-01

    A new method of uniform sampling is evaluated in this paper. Items and indexes were adopted to evaluate the rationality of the uniform sampling. The evaluation items included convenience of operation, uniformity of sampling site distribution, and accuracy and precision of measured results. The evaluation indexes included operational complexity, occupation rate of sampling sites in a row and column, relative accuracy of pill weight, and relative deviation of pill weight. They were obtained from three kinds of drugs with different shapes and sizes by four kinds of sampling methods. Gray correlation analysis was adopted to make the comprehensive evaluation by comparison with the standard method. The experimental results showed that the convenience of the uniform sampling method was 1 (100%), the odds ratio of the occupation rate in a row and column was infinite, the relative accuracy was 99.50-99.89%, and the reproducibility (RSD) was 0.45-0.89%; the weighted incidence degree exceeded that of the standard method. Hence, the uniform sampling method is easy to operate, and the selected samples are distributed uniformly. The experimental results demonstrated that the uniform sampling method has good accuracy and reproducibility and can be put into use in drug analysis.

  5. Small sample whole-genome amplification

    Science.gov (United States)

    Hara, Christine; Nguyen, Christine; Wheeler, Elizabeth; Sorensen, Karen; Arroyo, Erin; Vrankovich, Greg; Christian, Allen

    2005-11-01

    Many challenges arise when trying to amplify and analyze human samples collected in the field due to limitations in sample quantity and contamination of the starting material. Tests such as DNA fingerprinting and mitochondrial typing require a certain sample size and are carried out in large volume reactions; in cases where insufficient sample is present, whole genome amplification (WGA) can be used. WGA allows very small quantities of DNA to be amplified in a way that enables subsequent DNA-based tests to be performed. A limiting step to WGA is sample preparation. To minimize the necessary sample size, we have developed two modifications of WGA: the first allows for an increase in amplified product from small, nanoscale, purified samples with the use of carrier DNA while the second is a single-step method for cleaning and amplifying samples all in one column. Conventional DNA cleanup involves binding the DNA to silica, washing away impurities, and then releasing the DNA for subsequent testing. We have eliminated losses associated with incomplete sample release, thereby decreasing the required amount of starting template for DNA testing. Both techniques address the limitations of sample size by providing ample copies of genomic samples. Carrier DNA, included in our WGA reactions, can be used when amplifying samples with the standard purification method, or can be used in conjunction with our single-step DNA purification technique to potentially further decrease the amount of starting sample necessary for future forensic DNA-based assays.

  6. Quantitative portable gamma-spectroscopy sample analysis for non-standard sample geometries

    International Nuclear Information System (INIS)

    Ebara, S.B.

    1998-01-01

    Utilizing a portable spectroscopy system, a quantitative method for analysis of samples containing a mixture of fission and activation products in nonstandard geometries was developed. This method was not developed to replace other methods such as Monte Carlo or Discrete Ordinates but rather to offer an alternative rapid solution. The method can be used with various sample and shielding configurations where analysis on a laboratory based gamma-spectroscopy system is impractical. The portable gamma-spectroscopy method involves calibration of the detector and modeling of the sample and shielding to identify and quantify the radionuclides present in the sample. The method utilizes the intrinsic efficiency of the detector and the unattenuated gamma fluence rate at the detector surface per unit activity from the sample to calculate the nuclide activity and Minimum Detectable Activity (MDA). For a complex geometry, a computer code written for shielding applications (MICROSHIELD) is utilized to determine the unattenuated gamma fluence rate per unit activity at the detector surface. Lastly, the method is only applicable to nuclides which emit gamma-rays and cannot be used for pure beta or alpha emitters. In addition, if sample self absorption and shielding is significant, the attenuation will result in high MDA's for nuclides which solely emit low energy gamma-rays. The following presents the analysis technique and presents verification results using actual experimental data, rather than comparisons to other approximations such as Monte Carlo techniques, to demonstrate the accuracy of the method given a known geometry and source term. (author)
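
    The final calculation step can be sketched under assumed values: the modeled unattenuated fluence rate per unit activity and the intrinsic efficiency convert a net peak rate into activity, here with a Currie-style MDA. The formula shape and every number below are illustrative assumptions, not the report's worked data.

        import math

        net_rate = 1.2          # net counts/s in the photopeak
        bkg_counts = 400.0      # background counts in the peak region
        live_time = 600.0       # s
        eps_intrinsic = 0.25    # counts per photon reaching the crystal (assumed)
        phi_per_bq = 3.0e-4     # photons/(s*cm^2) per Bq, from shielding modeling
        area = 20.0             # detector face area, cm^2

        cps_per_bq = eps_intrinsic * phi_per_bq * area
        activity = net_rate / cps_per_bq
        mda = (2.71 + 4.65 * math.sqrt(bkg_counts)) / (live_time * cps_per_bq)
        print(f"activity {activity:.0f} Bq, MDA {mda:.0f} Bq")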

  7. Developing Water Sampling Standards

    Science.gov (United States)

    Environmental Science and Technology, 1974

    1974-01-01

    Participants in the D-19 symposium on aquatic sampling and measurement for water pollution assessment were informed that determining the extent of waste water stream pollution is not a cut-and-dried procedure. Topics discussed include field sampling, representative sampling from storm sewers, suggested sampler features and application of improved…

  8. Waste classification sampling plan

    International Nuclear Information System (INIS)

    Landsman, S.D.

    1998-01-01

    The purpose of this sampling plan is to explain the method used to collect and analyze data necessary to verify and/or determine the radionuclide content of the B-Cell decontamination and decommissioning waste stream, so that the correct waste classification for the waste stream can be made, and to collect samples for studies of decontamination methods that could be used to remove fixed contamination present on the waste. The scope of this plan is to establish the technical basis for collecting samples and compiling quantitative data on the radioactive constituents present in waste generated during deactivation activities in B-Cell. Sampling and radioisotopic analysis will be performed on the fixed layers of contamination present on structural material and internal surfaces of process piping and tanks. In addition, dose rate measurements on existing waste material will be performed to determine the fraction of dose rate attributable to both removable and fixed contamination. Samples will also be collected to support studies of decontamination methods that are effective in removing the fixed contamination present on the waste. Sampling performed under this plan will meet criteria established in BNF-2596, Data Quality Objectives for the B-Cell Waste Stream Classification Sampling, J. M. Barnett, May 1998

  9. Prevalence and Mental Health Outcomes of Homicide Survivors in a Representative US Sample of Adolescents: Data from the 2005 National Survey of Adolescents

    Science.gov (United States)

    Rheingold, Alyssa A.; Zinzow, Heidi; Hawkins, Alesia; Saunders, Benjamin E.; Kilpatrick, Dean G.

    2012-01-01

    Background: Each homicide leaves behind several friends and family members, or homicide survivors. However, limited information is available on the impact of homicide on adolescent survivors. The purpose of the current study was to identify the prevalence of homicide survivorship and to determine mental health outcomes within a sample of US…

  10. Efficiently sampling conformations and pathways using the concurrent adaptive sampling (CAS) algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Ahn, Surl-Hee; Grate, Jay W.; Darve, Eric F.

    2017-08-21

    Molecular dynamics (MD) simulations are useful in obtaining thermodynamic and kinetic properties of bio-molecules but are limited by the timescale barrier, i.e., we may be unable to efficiently obtain properties because we need to run microsecond or longer simulations using femtosecond time steps. While there are several existing methods to overcome this timescale barrier and efficiently sample thermodynamic and/or kinetic properties, problems remain in regard to being able to sample unknown systems, deal with a high-dimensional space of collective variables, and focus the computational effort on slow timescales. Hence, a new sampling method, called the “Concurrent Adaptive Sampling (CAS) algorithm,” has been developed to tackle these three issues and efficiently obtain conformations and pathways. The method is not constrained to use only one or two collective variables, unlike most reaction coordinate-dependent methods. Instead, it can use a large number of collective variables and uses macrostates (a partition of the collective variable space) to enhance the sampling. The exploration is done by running a large number of short simulations, and a clustering technique is used to accelerate the sampling. In this paper, we introduce the new methodology and show results from two-dimensional models and bio-molecules, such as penta-alanine and a triazine polymer
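
    The macrostate-resampling loop can be sketched on a 1-D double well (an assumption; the paper targets high-dimensional collective-variable spaces): run many short trajectories, cluster their endpoints into macrostates, and restart an equal number of walkers from each so exploration is not trapped in the well-sampled region.

        import numpy as np
        from scipy.cluster.vq import kmeans2

        rng = np.random.default_rng(0)

        def short_trajectory(x0, steps=100, dt=1e-2):
            x = x0
            for _ in range(steps):                  # overdamped Langevin step
                force = -4 * x * (x**2 - 1)         # -dV/dx for V = (x^2 - 1)^2
                x += force * dt + np.sqrt(2 * dt) * rng.normal()
            return x

        walkers = np.full(100, -1.0)                # all start in the left well
        for _ in range(20):                         # CAS-like outer iterations
            ends = np.array([short_trajectory(x) for x in walkers])
            _, labels = kmeans2(ends.reshape(-1, 1), 5, minit="++", seed=1)
            groups = [ends[labels == k] for k in range(5) if (labels == k).any()]
            walkers = np.concatenate([rng.choice(g, size=20) for g in groups])
        print(f"fraction in right well: {np.mean(walkers > 0):.2f}")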

  11. Towards Cost-efficient Sampling Methods

    OpenAIRE

    Peng, Luo; Yongli, Li; Chong, Wu

    2014-01-01

    The sampling method has been paid much attention in the field of complex network in general and statistical physics in particular. This paper presents two new sampling methods based on the perspective that a small part of vertices with high node degree can possess the most structure information of a network. The two proposed sampling methods are efficient in sampling the nodes with high degree. The first new sampling method is improved on the basis of the stratified random sampling method and...

  12. Identification of proteomic biomarkers predicting prostate cancer aggressiveness and lethality despite biopsy-sampling error.

    Science.gov (United States)

    Shipitsin, M; Small, C; Choudhury, S; Giladi, E; Friedlander, S; Nardone, J; Hussain, S; Hurley, A D; Ernst, C; Huang, Y E; Chang, H; Nifong, T P; Rimm, D L; Dunyak, J; Loda, M; Berman, D M; Blume-Jensen, P

    2014-09-09

    Key challenges of biopsy-based determination of prostate cancer aggressiveness include tumour heterogeneity, biopsy-sampling error, and variations in biopsy interpretation. The resulting uncertainty in risk assessment leads to significant overtreatment, with associated costs and morbidity. We developed a performance-based strategy to identify protein biomarkers predictive of prostate cancer aggressiveness and lethality regardless of biopsy-sampling variation. Prostatectomy samples from a large patient cohort with long follow-up were blindly assessed by expert pathologists who identified the tissue regions with the highest and lowest Gleason grade from each patient. To simulate biopsy-sampling error, a core from a high- and a low-Gleason area from each patient sample was used to generate a 'high' and a 'low' tumour microarray, respectively. Using a quantitative proteomics approach, we identified from 160 candidates 12 biomarkers that predicted prostate cancer aggressiveness (surgical Gleason and TNM stage) and lethal outcome robustly in both high- and low-Gleason areas. Conversely, a previously reported lethal outcome-predictive marker signature for prostatectomy tissue was unable to perform under circumstances of maximal sampling error. Our results have important implications for cancer biomarker discovery in general and development of a sampling error-resistant clinical biopsy test for prediction of prostate cancer aggressiveness.

  13. Sample preparation in foodomic analyses.

    Science.gov (United States)

    Martinović, Tamara; Šrajer Gajdošik, Martina; Josić, Djuro

    2018-04-16

    Representative sampling and adequate sample preparation are key factors for the successful performance of subsequent steps in foodomic analyses, as well as for correct data interpretation. Incorrect sampling and improper sample preparation can be sources of severe bias in foodomic analyses, and it is well known that errors introduced by wrong sampling or sample treatment cannot be corrected later. These facts, frequently neglected in the past, are now taken into consideration, and the progress in sampling and sample preparation in foodomics is reviewed here. We report the use of highly sophisticated instruments for both high-performance and high-throughput analyses, as well as miniaturization and the use of laboratory robotics in metabolomics, proteomics, peptidomics and genomics. This article is protected by copyright. All rights reserved.

  14. Protocol for Microplastics Sampling on the Sea Surface and Sample Analysis

    Science.gov (United States)

    Kovač Viršek, Manca; Palatinus, Andreja; Koren, Špela; Peterlin, Monika; Horvat, Petra; Kržan, Andrej

    2016-01-01

    Microplastic pollution in the marine environment is a scientific topic that has received increasing attention over the last decade. The majority of scientific publications address microplastic pollution of the sea surface. The protocol below describes the methodology for sampling, sample preparation, separation and chemical identification of microplastic particles. A manta net fixed on an »A frame« attached to the side of the vessel was used for sampling. Microplastic particles caught in the cod end of the net were separated from samples by visual identification and use of stereomicroscopes. Particles were analyzed for their size using an image analysis program and for their chemical structure using ATR-FTIR and micro FTIR spectroscopy. The described protocol is in line with recommendations for microplastics monitoring published by the Marine Strategy Framework Directive (MSFD) Technical Subgroup on Marine Litter. This written protocol with video guide will support the work of researchers that deal with microplastics monitoring all over the world. PMID:28060297

  15. Comparing fixed sampling with minimizer sampling when using k-mer indexes to find maximal exact matches.

    Science.gov (United States)

    Almutairy, Meznah; Torng, Eric

    2018-01-01

    Bioinformatics applications and pipelines increasingly use k-mer indexes to search for similar sequences. The major problem with k-mer indexes is that they require lots of memory. Sampling is often used to reduce index size and query time. Most applications use one of two major types of sampling: fixed sampling and minimizer sampling. It is well known that fixed sampling will produce a smaller index, typically by roughly a factor of two, whereas it is generally assumed that minimizer sampling will produce faster query times since query k-mers can also be sampled. However, no direct comparison of fixed and minimizer sampling has been performed to verify these assumptions. We systematically compare fixed and minimizer sampling using the human genome as our database. We use the resulting k-mer indexes for fixed sampling and minimizer sampling to find all maximal exact matches between our database, the human genome, and three separate query sets, the mouse genome, the chimp genome, and an NGS data set. We reach the following conclusions. First, using larger k-mers reduces query time for both fixed sampling and minimizer sampling at a cost of requiring more space. If we use the same k-mer size for both methods, fixed sampling requires typically half as much space whereas minimizer sampling processes queries only slightly faster. If we are allowed to use any k-mer size for each method, then we can choose a k-mer size such that fixed sampling both uses less space and processes queries faster than minimizer sampling. The reason is that although minimizer sampling is able to sample query k-mers, the number of shared k-mer occurrences that must be processed is much larger for minimizer sampling than fixed sampling. In conclusion, we argue that for any application where each shared k-mer occurrence must be processed, fixed sampling is the right sampling method.
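
    The two schemes being compared are easy to state concretely. The following minimal sketch (not the paper's index code) implements both: fixed sampling keeps every w-th k-mer position, while minimizer sampling keeps the smallest k-mer in each window of w consecutive k-mers, which guarantees that two sequences sharing a whole window also share a sampled k-mer.

```python
def fixed_sampling(seq, k, w):
    """Keep the k-mer at every w-th position."""
    return {i: seq[i:i + k] for i in range(0, len(seq) - k + 1, w)}

def minimizer_sampling(seq, k, w):
    """Keep the lexicographically smallest k-mer in each window of w
    consecutive k-mers (ties broken by leftmost position)."""
    picked = {}
    for start in range(len(seq) - k - w + 2):
        window = [(seq[i:i + k], i) for i in range(start, start + w)]
        kmer, pos = min(window)
        picked[pos] = kmer
    return picked

seq = "ACGTACGTGGTACCAGTTA"
print(sorted(fixed_sampling(seq, k=4, w=3).items()))
print(sorted(minimizer_sampling(seq, k=4, w=3).items()))
```

    Note how fixed sampling retains about 1/w of all positions by construction, whereas the number of distinct minimizers depends on the sequence itself, which is one root of the space/time trade-off the study measures.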

  16. Comparing fixed sampling with minimizer sampling when using k-mer indexes to find maximal exact matches.

    Directory of Open Access Journals (Sweden)

    Meznah Almutairy

    Full Text Available Bioinformatics applications and pipelines increasingly use k-mer indexes to search for similar sequences. The major problem with k-mer indexes is that they require lots of memory. Sampling is often used to reduce index size and query time. Most applications use one of two major types of sampling: fixed sampling and minimizer sampling. It is well known that fixed sampling will produce a smaller index, typically by roughly a factor of two, whereas it is generally assumed that minimizer sampling will produce faster query times since query k-mers can also be sampled. However, no direct comparison of fixed and minimizer sampling has been performed to verify these assumptions. We systematically compare fixed and minimizer sampling using the human genome as our database. We use the resulting k-mer indexes for fixed sampling and minimizer sampling to find all maximal exact matches between our database, the human genome, and three separate query sets, the mouse genome, the chimp genome, and an NGS data set. We reach the following conclusions. First, using larger k-mers reduces query time for both fixed sampling and minimizer sampling at a cost of requiring more space. If we use the same k-mer size for both methods, fixed sampling requires typically half as much space whereas minimizer sampling processes queries only slightly faster. If we are allowed to use any k-mer size for each method, then we can choose a k-mer size such that fixed sampling both uses less space and processes queries faster than minimizer sampling. The reason is that although minimizer sampling is able to sample query k-mers, the number of shared k-mer occurrences that must be processed is much larger for minimizer sampling than fixed sampling. In conclusion, we argue that for any application where each shared k-mer occurrence must be processed, fixed sampling is the right sampling method.

  17. Comparing fixed sampling with minimizer sampling when using k-mer indexes to find maximal exact matches

    Science.gov (United States)

    Torng, Eric

    2018-01-01

    Bioinformatics applications and pipelines increasingly use k-mer indexes to search for similar sequences. The major problem with k-mer indexes is that they require lots of memory. Sampling is often used to reduce index size and query time. Most applications use one of two major types of sampling: fixed sampling and minimizer sampling. It is well known that fixed sampling will produce a smaller index, typically by roughly a factor of two, whereas it is generally assumed that minimizer sampling will produce faster query times since query k-mers can also be sampled. However, no direct comparison of fixed and minimizer sampling has been performed to verify these assumptions. We systematically compare fixed and minimizer sampling using the human genome as our database. We use the resulting k-mer indexes for fixed sampling and minimizer sampling to find all maximal exact matches between our database, the human genome, and three separate query sets, the mouse genome, the chimp genome, and an NGS data set. We reach the following conclusions. First, using larger k-mers reduces query time for both fixed sampling and minimizer sampling at a cost of requiring more space. If we use the same k-mer size for both methods, fixed sampling requires typically half as much space whereas minimizer sampling processes queries only slightly faster. If we are allowed to use any k-mer size for each method, then we can choose a k-mer size such that fixed sampling both uses less space and processes queries faster than minimizer sampling. The reason is that although minimizer sampling is able to sample query k-mers, the number of shared k-mer occurrences that must be processed is much larger for minimizer sampling than fixed sampling. In conclusion, we argue that for any application where each shared k-mer occurrence must be processed, fixed sampling is the right sampling method. PMID:29389989

  18. Influence of population versus convenience sampling on sample characteristics in studies of cognitive aging.

    Science.gov (United States)

    Brodaty, Henry; Mothakunnel, Annu; de Vel-Palumbo, Melissa; Ames, David; Ellis, Kathryn A; Reppermund, Simone; Kochan, Nicole A; Savage, Greg; Trollor, Julian N; Crawford, John; Sachdev, Perminder S

    2014-01-01

    We examined whether differences in findings of studies examining mild cognitive impairment (MCI) were associated with recruitment methods by comparing sample characteristics in two contemporaneous Australian studies, using population-based and convenience sampling. The Sydney Memory and Aging Study invited participants randomly from the electoral roll in defined geographic areas in Sydney. The Australian Imaging, Biomarkers and Lifestyle Study of Ageing recruited cognitively normal (CN) individuals via media appeals and MCI participants via referrals from clinicians in Melbourne and Perth. Demographic and cognitive variables were harmonized, and similar diagnostic criteria were applied to both samples retrospectively. CN participants recruited via convenience sampling were younger, better educated, more likely to be married and have a family history of dementia, and performed better cognitively than those recruited via population-based sampling. MCI participants recruited via population-based sampling had better memory performance and were less likely to carry the apolipoprotein E ε4 allele than clinically referred participants but did not differ on other demographic variables. A convenience sample of normal controls is likely to be younger and better functioning and that of an MCI group likely to perform worse than a purportedly random sample. Sampling bias should be considered when interpreting findings. Copyright © 2014 Elsevier Inc. All rights reserved.

  19. Searching for the Optimal Sampling Solution: Variation in Invertebrate Communities, Sample Condition and DNA Quality.

    Directory of Open Access Journals (Sweden)

    Martin M Gossner

    Full Text Available There is a great demand for standardising biodiversity assessments in order to allow optimal comparison across research groups. For invertebrates, pitfall or flight-interception traps are commonly used, but sampling solution differs widely between studies, which could influence the communities collected and affect sample processing (morphological or genetic). We assessed arthropod communities with flight-interception traps using three commonly used sampling solutions across two forest types and two vertical strata. We first considered the effect of sampling solution and its interaction with forest type, vertical stratum, and position of sampling jar at the trap on sample condition and community composition. We found that samples collected in copper sulphate were more mouldy and fragmented relative to other solutions, which might impair morphological identification, but condition depended on forest type, trap type and the position of the jar. Community composition, based on order-level identification, did not differ across sampling solutions and only varied with forest type and vertical stratum. Species richness and species-level community composition, however, differed greatly among sampling solutions. Renner solution was highly attractant for beetles and repellent for true bugs. Secondly, we tested whether sampling solution affects subsequent molecular analyses and found that DNA barcoding success was species-specific. Samples from copper sulphate produced the fewest successful DNA sequences for genetic identification, and since DNA yield or quality was not particularly reduced in these samples, additional interactions between the solution and DNA must also be occurring. Our results show that the choice of sampling solution should be an important consideration in biodiversity studies. Due to the potential bias towards or against certain species of ethanol-containing sampling solutions we suggest ethylene glycol as a suitable sampling solution when

  20. Sample Size Determination for One- and Two-Sample Trimmed Mean Tests

    Science.gov (United States)

    Luh, Wei-Ming; Olejnik, Stephen; Guo, Jiin-Huarng

    2008-01-01

    Formulas to determine the necessary sample sizes for parametric tests of group comparisons are available from several sources and appropriate when population distributions are normal. However, in the context of nonnormal population distributions, researchers recommend Yuen's trimmed mean test, but formulas to determine sample sizes have not been…
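
    The abstract is truncated; for context, the sketch below implements the test those sample-size formulas target, Yuen's two-sample trimmed-mean test, in its standard textbook form (trimmed means compared via winsorized variances with Welch-style degrees of freedom). It illustrates the method in general, not the authors' formulas.

```python
import numpy as np
from scipy import stats

def yuen_test(x, y, trim=0.2):
    """Yuen's two-sample test on trimmed means (standard textbook form)."""
    results = []
    for s in (np.sort(np.asarray(x, float)), np.sort(np.asarray(y, float))):
        n = len(s)
        g = int(np.floor(trim * n))
        h = n - 2 * g                        # effective size after trimming
        tmean = s[g:n - g].mean()            # trimmed mean
        w = s.copy()                         # winsorize: clamp the tails
        w[:g] = s[g]
        w[n - g:] = s[n - g - 1]
        sw2 = w.var(ddof=1)                  # winsorized variance
        results.append((tmean, (n - 1) * sw2 / (h * (h - 1)), h))
    (m1, d1, h1), (m2, d2, h2) = results
    t = (m1 - m2) / np.sqrt(d1 + d2)
    df = (d1 + d2) ** 2 / (d1 ** 2 / (h1 - 1) + d2 ** 2 / (h2 - 1))
    return t, df, 2 * stats.t.sf(abs(t), df)

rng = np.random.default_rng(1)
a = rng.standard_t(df=3, size=30) + 0.8      # heavy-tailed, shifted group
b = rng.standard_t(df=3, size=30)
print(yuen_test(a, b))
```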

  1. Clinical Outcomes among Transferred Children with Ischemic and Hemorrhagic Strokes in the Nationwide Inpatient Sample.

    Science.gov (United States)

    Adil, Malik M; Vidal, Gabriel A; Beslow, Lauren A

    2016-11-01

    Children with ischemic stroke (IS) and hemorrhagic stroke (HS) may require interfacility transfer for higher level of care. We compared the characteristics and clinical outcomes of transferred and nontransferred children with IS and HS. Children aged 1-18 years admitted to hospitals in the United States from 2008 to 2011 with a primary discharge diagnosis of IS and HS were identified from the National Inpatient Sample database by ICD-9 codes. Using logistic regression, we estimated the odds ratios (OR) and 95% confidence intervals (CI) for in-hospital mortality and discharge to nursing facilities (versus discharge home) between transferred and nontransferred patients. Of the 2815 children with IS, 26.7% were transferred. In-hospital mortality and discharge to nursing facilities were not different between transferred and nontransferred children in univariable analysis or in multivariable analysis that adjusted for age, sex, and confounding factors. Of the 6879 children with HS, 27.1% were transferred. Transferred compared to nontransferred children had higher rates of both in-hospital mortality (8% versus 4%, P = .003) and discharge to nursing facilities (25% versus 20%, P = .03). After adjusting for age, sex, and confounding factors, in-hospital mortality (OR 1.5, 95% CI 1.1-2.4, P = .04) remained higher in transferred children, whereas discharge to nursing facilities was not different between the groups. HS but not IS was associated with worse outcomes for children transferred to another hospital compared to children who were not transferred. Additional study is needed to understand what factors may contribute to poorer outcomes among transferred children with HS. Copyright © 2016 National Stroke Association. Published by Elsevier Inc. All rights reserved.
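
    As a hedged illustration of the reported analysis pattern (odds ratios and 95% confidence intervals from logistic regression), the sketch below fits such a model on synthetic data; the variable names and coefficients are invented stand-ins, not the National Inpatient Sample data.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(7)
n = 2000

# Synthetic stand-in for the admissions data (variable names hypothetical)
df = pd.DataFrame({
    "transferred": rng.integers(0, 2, n),
    "age": rng.integers(1, 19, n),
    "male": rng.integers(0, 2, n),
})
logit = -3.2 + 0.45 * df["transferred"] + 0.02 * df["age"]
df["died"] = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X = sm.add_constant(df[["transferred", "age", "male"]])
fit = sm.Logit(df["died"], X).fit(disp=0)

# Exponentiate coefficients to report odds ratios with 95% CIs
ci = fit.conf_int()
print(pd.DataFrame({"OR": np.exp(fit.params),
                    "CI_low": np.exp(ci[0]),
                    "CI_high": np.exp(ci[1])}).loc[["transferred"]])
```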

  2. System for Earth Sample Registration SESAR: Services for IGSN Registration and Sample Metadata Management

    Science.gov (United States)

    Chan, S.; Lehnert, K. A.; Coleman, R. J.

    2011-12-01

    SESAR, the System for Earth Sample Registration, is an online registry for physical samples collected for Earth and environmental studies. SESAR generates and administers the International Geo Sample Number (IGSN), a unique identifier for samples that is dramatically advancing interoperability amongst information systems for sample-based data. SESAR was developed to provide the complete range of registry services, including definition of IGSN syntax and metadata profiles, registration and validation of name spaces requested by users, tools for users to submit and manage sample metadata, validation of submitted metadata, generation and validation of the unique identifiers, archiving of sample metadata, and public or private access to the sample metadata catalog. With the development of SESAR v3, we placed particular emphasis on creating enhanced tools that make metadata submission easier and more efficient for users, and that provide superior functionality for users to manage metadata of their samples in their private workspace (MySESAR). For example, SESAR v3 includes a module where users can generate custom spreadsheet templates to enter metadata for their samples, then upload these templates online for sample registration. Once the content of the template is uploaded, it is displayed online in an editable grid format. Validation rules are executed in real-time on the grid data to ensure data integrity. Other new features of SESAR v3 include the capability to transfer ownership of samples to other SESAR users, the ability to upload and store images and other files in a sample metadata profile, and the tracking of changes to sample metadata profiles. In the next version of SESAR (v3.5), we will further improve the discovery, sharing, and registration of samples. For example, we are developing a more comprehensive suite of web services that will allow discovery and registration access to SESAR from external systems. Both batch and individual registrations will be possible.
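
    The real-time grid validation described above can be pictured with a small sketch. The field names and rules below are invented for illustration only; they are not SESAR's actual metadata schema or API. The point is the per-row, per-cell rule checking that an editable upload grid needs.

```python
# Hypothetical per-row validation in the spirit of the editable grid:
# the field names and rules are invented, not SESAR's actual schema.
REQUIRED = {"sample_name", "material", "latitude", "longitude"}

def validate_row(row):
    errors = [f"missing field: {f}" for f in sorted(REQUIRED - row.keys())]
    try:
        if not -90.0 <= float(row.get("latitude", "x")) <= 90.0:
            errors.append("latitude out of range")
        if not -180.0 <= float(row.get("longitude", "x")) <= 180.0:
            errors.append("longitude out of range")
    except ValueError:
        errors.append("coordinates must be numeric")
    return errors

rows = [
    {"sample_name": "XYZ-001", "material": "Rock",
     "latitude": "41.0", "longitude": "-71.5"},
    {"sample_name": "XYZ-002", "material": "Rock",
     "latitude": "95.0", "longitude": "-71.5"},   # latitude fails validation
]
for i, r in enumerate(rows):
    print(i, validate_row(r) or "ok")
```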

  3. Methodology Series Module 5: Sampling Strategies.

    Science.gov (United States)

    Setia, Maninder Singh

    2016-01-01

    Once the research question and the research design have been finalised, it is important to select the appropriate sample for the study. The method by which the researcher selects the sample is the 'Sampling Method'. There are essentially two types of sampling methods: 1) probability sampling - based on chance events (such as random numbers, flipping a coin, etc.); and 2) non-probability sampling - based on the researcher's choice and a population that is accessible and available. Some of the non-probability sampling methods are: purposive sampling, convenience sampling, or quota sampling. Random sampling method (such as simple random sample or stratified random sample) is a form of probability sampling. It is important to understand the different sampling methods used in clinical studies and mention this method clearly in the manuscript. The researcher should not misrepresent the sampling method in the manuscript (such as using the term 'random sample' when the researcher has used a convenience sample). The sampling method will depend on the research question. For instance, the researcher may want to understand an issue in greater detail for one particular population rather than worry about the 'generalizability' of these results. In such a scenario, the researcher may want to use 'purposive sampling' for the study.
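
    As a hedged illustration of two of the probability sampling methods named here, the sketch below draws a simple random sample and a proportionally allocated stratified random sample from a toy sampling frame (the frame and its 'clinic' stratification variable are invented).

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(3)

# A toy sampling frame with one stratification variable (invented)
frame = pd.DataFrame({
    "id": range(1000),
    "clinic": rng.choice(["urban", "rural"], size=1000, p=[0.7, 0.3]),
})

# Simple random sample: every unit has the same inclusion probability
srs = frame.sample(n=100, random_state=3)

# Stratified random sample with proportional allocation per stratum
strat = frame.groupby("clinic").sample(frac=0.1, random_state=3)

print(srs["clinic"].value_counts(normalize=True))
print(strat["clinic"].value_counts(normalize=True))
```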

  4. Methodology series module 5: Sampling strategies

    Directory of Open Access Journals (Sweden)

    Maninder Singh Setia

    2016-01-01

    Full Text Available Once the research question and the research design have been finalised, it is important to select the appropriate sample for the study. The method by which the researcher selects the sample is the 'Sampling Method'. There are essentially two types of sampling methods: 1) probability sampling – based on chance events (such as random numbers, flipping a coin, etc.); and 2) non-probability sampling – based on the researcher's choice and a population that is accessible and available. Some of the non-probability sampling methods are: purposive sampling, convenience sampling, or quota sampling. Random sampling method (such as simple random sample or stratified random sample) is a form of probability sampling. It is important to understand the different sampling methods used in clinical studies and mention this method clearly in the manuscript. The researcher should not misrepresent the sampling method in the manuscript (such as using the term 'random sample' when the researcher has used a convenience sample). The sampling method will depend on the research question. For instance, the researcher may want to understand an issue in greater detail for one particular population rather than worry about the 'generalizability' of these results. In such a scenario, the researcher may want to use 'purposive sampling' for the study.

  5. A simple sample size formula for analysis of covariance in cluster randomized trials.

    NARCIS (Netherlands)

    Teerenstra, S.; Eldridge, S.; Graff, M.J.; Hoop, E. de; Borm, G.F.

    2012-01-01

    For cluster randomized trials with a continuous outcome, the sample size is often calculated as if an analysis of the outcomes at the end of the treatment period (follow-up scores) would be performed. However, often a baseline measurement of the outcome is available or feasible to obtain. An
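
    The abstract is cut off, but the general recipe it points at can be sketched. The code below is a simplified approximation, not the paper's formula: the individually randomized sample size is inflated by the usual design effect 1 + (m - 1)*ICC and deflated by (1 - r^2) to credit the baseline adjustment; the paper's exact correction distinguishes within- and between-cluster baseline correlations, which this sketch does not.

```python
import math
from scipy.stats import norm

def cluster_ancova_n_per_arm(delta, sd, m, icc, r, alpha=0.05, power=0.8):
    """Approximate n per arm: the unclustered two-sample n, inflated by the
    design effect 1 + (m - 1)*icc and deflated by (1 - r**2) for baseline
    adjustment. A simplified approximation, not the paper's exact formula."""
    z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
    n_unclustered = 2 * (z * sd / delta) ** 2
    return math.ceil(n_unclustered * (1 + (m - 1) * icc) * (1 - r ** 2))

# Detect a 0.3 SD difference with clusters of 20, ICC 0.05, and a
# baseline-to-follow-up correlation of 0.6
print(cluster_ancova_n_per_arm(delta=0.3, sd=1.0, m=20, icc=0.05, r=0.6))
```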

  6. Evaluation of common methods for sampling invertebrate pollinator assemblages: net sampling out-perform pan traps.

    Science.gov (United States)

    Popic, Tony J; Davila, Yvonne C; Wardle, Glenda M

    2013-01-01

    Methods for sampling ecological assemblages strive to be efficient, repeatable, and representative. Unknowingly, common methods may be limited in terms of revealing species function and so of less value for comparative studies. The global decline in pollination services has stimulated surveys of flower-visiting invertebrates, using pan traps and net sampling. We explore the relative merits of these two methods in terms of species discovery, quantifying abundance, function, and composition, and responses of species to changing floral resources. Using a spatially-nested design we sampled across a 5000 km² area of arid grasslands, including 432 hours of net sampling and 1296 pan trap-days, between June 2010 and July 2011. Net sampling yielded 22% more species and 30% higher abundance than pan traps, and better reflected the spatio-temporal variation of floral resources. Species composition differed significantly between methods; from 436 total species, 25% were sampled by both methods, 50% only by nets, and the remaining 25% only by pans. Apart from being less comprehensive, if pan traps do not sample flower-visitors, the link to pollination is questionable. By contrast, net sampling functionally linked species to pollination through behavioural observations of flower-visitation interaction frequency. Netted specimens are also necessary for evidence of pollen transport. Benefits of net-based sampling outweighed minor differences in overall sampling effort. As pan traps and net sampling methods are not equivalent for sampling invertebrate-flower interactions, we recommend net sampling of invertebrate pollinator assemblages, especially if datasets are intended to document declines in pollination and guide measures to retain this important ecosystem service.

  7. Statistics and sampling in transuranic studies

    International Nuclear Information System (INIS)

    Eberhardt, L.L.; Gilbert, R.O.

    1980-01-01

    The existing data on transuranics in the environment exhibit a remarkably high variability from sample to sample (coefficients of variation of 100% or greater). This chapter stresses the necessity of adequate sample size and suggests various ways to increase sampling efficiency. Objectives in sampling are regarded as being of great importance in making decisions as to sampling methodology. Four different classes of sampling methods are described: (1) descriptive sampling, (2) sampling for spatial pattern, (3) analytical sampling, and (4) sampling for modeling. A number of research needs are identified in the various sampling categories along with several problems that appear to be common to two or more such areas
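
    The chapter's stress on adequate sample size can be made concrete with a standard normal-theory approximation (a hedged sketch, not the chapter's own derivation): the number of samples needed to estimate a mean within a relative margin E grows with the square of the coefficient of variation, so a CV of 100% is punishing.

```python
from scipy.stats import norm

def n_for_relative_error(cv, rel_error, confidence=0.95):
    """Approximate n so the sample mean falls within rel_error of the true
    mean with the given confidence: n ~ (z * CV / E)^2 (normal theory)."""
    z = norm.ppf(1 - (1 - confidence) / 2)
    return int((z * cv / rel_error) ** 2) + 1

# CV of 100% (typical of transuranic field data) vs a more benign 30%,
# both targeting a 20% relative margin of error
print(n_for_relative_error(cv=1.0, rel_error=0.2))   # about 97 samples
print(n_for_relative_error(cv=0.3, rel_error=0.2))   # about 9 samples
```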

  8. Sample Size for Measuring Grammaticality in Preschool Children from Picture-Elicited Language Samples

    Science.gov (United States)

    Eisenberg, Sarita L.; Guo, Ling-Yu

    2015-01-01

    Purpose: The purpose of this study was to investigate whether a shorter language sample elicited with fewer pictures (i.e., 7) would yield a percent grammatical utterances (PGU) score similar to that computed from a longer language sample elicited with 15 pictures for 3-year-old children. Method: Language samples were elicited by asking forty…

  9. Sampling pig farms at the abattoir in a cross-sectional study - Evaluation of a sampling method.

    Science.gov (United States)

    Birkegård, Anna Camilla; Halasa, Tariq; Toft, Nils

    2017-09-15

    A cross-sectional study design is relatively inexpensive, fast and easy to conduct when compared to other study designs. Careful planning is essential to obtaining a representative sample of the population, and the recommended approach is to use simple random sampling from an exhaustive list of units in the target population. This approach is rarely feasible in practice, and other sampling procedures must often be adopted. For example, when slaughter pigs are the target population, sampling the pigs on the slaughter line may be an alternative to on-site sampling at a list of farms. However, it is difficult to sample a large number of farms from an exact predefined list, due to the logistics and workflow of an abattoir. Therefore, it is necessary to have a systematic sampling procedure and to evaluate the obtained sample with respect to the study objective. We propose a method for 1) planning, 2) conducting, and 3) evaluating the representativeness and reproducibility of a cross-sectional study when simple random sampling is not possible. We used an example of a cross-sectional study with the aim of quantifying the association of antimicrobial resistance and antimicrobial consumption in Danish slaughter pigs. It was not possible to visit farms within the designated timeframe. Therefore, it was decided to use convenience sampling at the abattoir. Our approach was carried out in three steps: 1) planning: using data from meat inspection to plan at which abattoirs and how many farms to sample; 2) conducting: sampling was carried out at five abattoirs; 3) evaluation: representativeness was evaluated by comparing sampled and non-sampled farms, and the reproducibility of the study was assessed through simulated sampling based on meat inspection data from the period where the actual data collection was carried out. In the cross-sectional study, samples were taken from 681 Danish pig farms during five weeks from February to March 2015. The evaluation showed that the sampling
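
    Step 3 (evaluating representativeness by comparing sampled and non-sampled farms) can be sketched minimally as below. The data are synthetic and 'herd size' is a hypothetical covariate standing in for the meat-inspection variables actually used.

```python
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(11)

# Synthetic stand-in for meat-inspection data: herd size per farm
herd_size = rng.lognormal(mean=6.0, sigma=0.8, size=5000)

# Pretend 681 farms ended up in the convenience sample
mask = np.zeros(herd_size.size, dtype=bool)
mask[rng.choice(herd_size.size, size=681, replace=False)] = True

# Compare the covariate distribution of sampled vs non-sampled farms
stat, p = ks_2samp(herd_size[mask], herd_size[~mask])
print(f"KS statistic {stat:.3f}, p = {p:.3f}")
```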

  10. An empirical comparison of respondent-driven sampling, time location sampling, and snowball sampling for behavioral surveillance in men who have sex with men, Fortaleza, Brazil.

    Science.gov (United States)

    Kendall, Carl; Kerr, Ligia R F S; Gondim, Rogerio C; Werneck, Guilherme L; Macena, Raimunda Hermelinda Maia; Pontes, Marta Kerr; Johnston, Lisa G; Sabin, Keith; McFarland, Willi

    2008-07-01

    Obtaining samples of populations at risk for HIV challenges surveillance, prevention planning, and evaluation. Methods used include snowball sampling, time location sampling (TLS), and respondent-driven sampling (RDS). Few studies have made side-by-side comparisons to assess their relative advantages. We compared snowball, TLS, and RDS surveys of men who have sex with men (MSM) in Fortaleza, Brazil, comparing the socio-economic status (SES) and risk behaviors of the samples to each other, to known AIDS cases, and to the general population. RDS produced a sample with wider inclusion of lower SES than snowball sampling or TLS, a finding of health significance given that the majority of AIDS cases reported among MSM in the state were low SES. RDS also achieved the sample size faster and at lower cost. For reasons of inclusion and cost-efficiency, RDS is the sampling methodology of choice for HIV surveillance of MSM in Fortaleza.

  11. Chorionic villus sampling and amniocentesis.

    Science.gov (United States)

    Brambati, Bruno; Tului, Lucia

    2005-04-01

    The advantages and disadvantages of common invasive methods for prenatal diagnosis are presented in light of new investigations. Several aspects of first-trimester chorionic villus sampling and mid-trimester amniocentesis remain controversial, especially fetal loss rate, feto-maternal complications, and the extension of both sampling methods to less traditional gestational ages (early amniocentesis, late chorionic villus sampling), all of which complicate genetic counseling. A recent randomized trial involving early amniocentesis and late chorionic villus sampling has confirmed previous studies, leading to the unquestionable conclusion that transabdominal chorionic villus sampling is safer. The old dispute over whether limb reduction defects are caused by chorionic villus sampling gains new vigor, with a paper suggesting that this technique has distinctive teratogenic effects. The large experience involving maternal and fetal complications following mid-trimester amniocentesis allows a better estimate of risk for comparison with chorionic villus sampling. Transabdominal chorionic villus sampling, which appears to be the gold standard sampling method for genetic investigations between 10 and 15 completed weeks, permits rapid diagnosis in high-risk cases detected by first-trimester screening of aneuploidies. Sampling efficiency and karyotyping reliability are as high as in mid-trimester amniocentesis with fewer complications, provided the operator has the required training, skill and experience.

  12. Good outcome of adolescent onset anorexia nervosa after systematic treatment. Intermediate to long-term follow-up of a representative county-sample.

    Science.gov (United States)

    Halvorsen, Inger; Andersen, Anne; Heyerdahl, Sonja

    2004-10-01

    We studied the intermediate to long-term outcome of childhood and adolescent onset anorexia nervosa (AN), in a sample that had received systematic treatment based on close cooperation between parents, paediatric department and child and adolescent psychiatry. Of 55 female AN-patients, 51 were examined 3.5-14.5 years after treatment start. The material includes all AN-patients under 18 years in one county that received inpatient treatment and almost all that received outpatient treatment, during the time period 1986-1998. Forty-two (82%) subjects had no eating disorder (ED) at follow-up, one (2%) had AN, one (2%) bulimia nervosa (BN) and seven (14%) had less severe ED (EDNOS). Except the one with BN, none had bulimic symptoms. There was no mortality. Twenty (41%) had one or more other axis-1 psychiatric diagnoses at follow-up. Depression and anxiety disorders were most frequent. Psychosocial functioning assessed by Global Assessment of Functioning (GAF) was fairly good: mean 73 ± 14 (SD) for symptoms and mean 77 ± 13 (SD) for functioning. Only 48% were satisfied with life, compared to 83% in a normal population sample. Our conclusion is that the eating disorder outcome was good. However, in accordance with other studies, many subjects had other psychiatric problems at follow-up.

  13. An 'intelligent' approach to radioimmunoassay sample counting employing a microprocessor-controlled sample counter

    International Nuclear Information System (INIS)

    Ekins, R.P.; Sufi, S.; Malan, P.G.

    1978-01-01

    The enormous impact on medical science in the last two decades of microanalytical techniques employing radioisotopic labels has, in turn, generated a large demand for automatic radioisotopic sample counters. Such instruments frequently comprise the most important item of capital equipment required in the use of radioimmunoassay and related techniques and often form a principal bottleneck in the flow of samples through a busy laboratory. It is therefore imperative that such instruments should be used 'intelligently' and in an optimal fashion to avoid both the very large capital expenditure involved in the unnecessary proliferation of instruments and the time delays arising from their sub-optimal use. Most of the current generation of radioactive sample counters nevertheless rely on primitive control mechanisms based on a simplistic statistical theory of radioactive sample counting which preclude their efficient and rational use. The fundamental principle upon which this approach is based is that it is useless to continue counting a radioactive sample for a time longer than that required to yield a significant increase in precision of the measurement. Thus, since substantial experimental errors occur during sample preparation, these errors should be assessed and must be related to the counting errors for that sample. The objective of the paper is to demonstrate that the combination of a realistic statistical assessment of radioactive sample measurement, together with the more sophisticated control mechanisms that modern microprocessor technology makes possible, may often enable savings in counter usage of the order of 5- to 10-fold to be made. (author)
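
    The paper's fundamental principle translates directly into a stopping rule, sketched here under simple stated assumptions (Poisson counting statistics and a known sample-preparation CV): stop once the counting CV, 1/sqrt(N), is a small fraction of the preparation CV, because further counting then barely improves the combined precision.

```python
import math

def counts_needed(prep_cv, ratio=0.25):
    """Counts N so the Poisson counting CV, 1/sqrt(N), is at most `ratio`
    times the sample-preparation CV."""
    return math.ceil(1.0 / (ratio * prep_cv) ** 2)

def counting_time_s(count_rate_cps, prep_cv, ratio=0.25):
    """Seconds of counter time implied by the stopping rule."""
    return counts_needed(prep_cv, ratio) / count_rate_cps

# A sample with 5% preparation error counted at 200 counts per second
print(counts_needed(prep_cv=0.05))                    # 6400 counts
print(counting_time_s(200.0, prep_cv=0.05), "s")      # 32.0 s
```

    With a 5% preparation error, 6400 counts push the counting error down to 1.25%, and the combined CV is about 5.2% rather than 5.0%; counting much longer is largely wasted counter time, which is the kind of saving the paper quantifies.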

  14. UNLABELED SELECTED SAMPLES IN FEATURE EXTRACTION FOR CLASSIFICATION OF HYPERSPECTRAL IMAGES WITH LIMITED TRAINING SAMPLES

    Directory of Open Access Journals (Sweden)

    A. Kianisarkaleh

    2015-12-01

    Full Text Available Feature extraction plays a key role in hyperspectral image classification. Unsupervised and semisupervised feature extraction methods, which can exploit unlabeled samples that are often available in practically unlimited numbers, show better performance when only a limited number of training samples exists. This paper illustrates the importance of selecting the appropriate unlabeled samples that are used in feature extraction methods, and proposes a new method for unlabeled sample selection using spectral and spatial information. The proposed method has four parts: PCA, prior classification, posterior classification and sample selection. As a hyperspectral image passes through these parts, the selected unlabeled samples can be used in arbitrary feature extraction methods. The effectiveness of the proposed unlabeled sample selection in unsupervised and semisupervised feature extraction is demonstrated using two real hyperspectral datasets. Results show that, by selecting appropriate unlabeled samples, the proposed method can improve the performance of feature extraction methods and increase classification accuracy.

  15. Sample pretreatment in microsystems

    DEFF Research Database (Denmark)

    Perch-Nielsen, Ivan R.

    2003-01-01

    When a sample, e.g. from a patient, is processed using conventional methods, the sample must be transported to the laboratory where it is analyzed, after which the results are sent back. By integrating the separate steps of the analysis in a micro total analysis system (μTAS), results can be obtained faster and better, preferably with all the processes from sample to signal moved to the bedside of the patient. Of course there is still much to learn and study in the process of miniaturization. DNA analysis is one process subject to integration. There are roughly three steps in a DNA analysis: sample preparation → DNA amplification → DNA analysis. The overall goal of the project is integration of as many as possible of these steps. This thesis covers mainly pretreatment in a microchip. Some methods for sample pretreatment have been tested; the most conventional is fluorescence-activated cell sorting...

  16. Groundwater sampling in uranium reconnaissance

    International Nuclear Information System (INIS)

    Butz, T.R.

    1977-03-01

    The groundwater sampling program is based on the premise that ground water geochemistry reflects the chemical composition of, and geochemical processes active in the strata from which the sample is obtained. Pilot surveys have shown that wells are the best source of groundwater, although springs are sampled on occasion. The procedures followed in selecting a sampling site, the sampling itself, and the field measurements, as well as the site records made, are described

  17. Sampling Key Populations for HIV Surveillance: Results From Eight Cross-Sectional Studies Using Respondent-Driven Sampling and Venue-Based Snowball Sampling.

    Science.gov (United States)

    Rao, Amrita; Stahlman, Shauna; Hargreaves, James; Weir, Sharon; Edwards, Jessie; Rice, Brian; Kochelani, Duncan; Mavimbela, Mpumelelo; Baral, Stefan

    2017-10-20

    In using regularly collected or existing surveillance data to characterize engagement in human immunodeficiency virus (HIV) services among marginalized populations, differences in sampling methods may produce different pictures of the target population and may therefore result in different priorities for response. The objective of this study was to use existing data to evaluate the sample distribution of eight studies of female sex workers (FSW) and men who have sex with men (MSM), who were recruited using different sampling approaches in two locations within Sub-Saharan Africa: Manzini, Swaziland and Yaoundé, Cameroon. MSM and FSW participants were recruited using either respondent-driven sampling (RDS) or venue-based snowball sampling. Recruitment took place between 2011 and 2016. Participants at each study site were administered a face-to-face survey to assess sociodemographics, along with the prevalence of self-reported HIV status, frequency of HIV testing, stigma, and other HIV-related characteristics. Crude and RDS-adjusted prevalence estimates were calculated. Crude prevalence estimates from the venue-based snowball samples were compared with the overlap of the RDS-adjusted prevalence estimates, between both FSW and MSM in Cameroon and Swaziland. RDS samples tended to be younger (MSM aged 18-21 years in Swaziland: 47.6% [139/310] in RDS vs 24.3% [42/173] in Snowball, in Cameroon: 47.9% [99/306] in RDS vs 20.1% [52/259] in Snowball; FSW aged 18-21 years in Swaziland 42.5% [82/325] in RDS vs 8.0% [20/249] in Snowball; in Cameroon 15.6% [75/576] in RDS vs 8.1% [25/306] in Snowball). They were less educated (MSM: primary school completed or less in Swaziland 42.6% [109/310] in RDS vs 4.0% [7/173] in Snowball, in Cameroon 46.2% [138/306] in RDS vs 14.3% [37/259] in Snowball; FSW: primary school completed or less in Swaziland 86.6% [281/325] in RDS vs 23.9% [59/247] in Snowball, in Cameroon 87.4% [520/576] in RDS vs 77.5% [238/307] in Snowball) than the snowball
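
    As background, a common way to produce the 'RDS-adjusted' prevalence estimates mentioned above is the RDS-II (Volz-Heckathorn) estimator, which weights each respondent by the inverse of their reported network degree; the sketch below shows the idea on toy numbers and is not necessarily the authors' exact estimator.

```python
import numpy as np

def rds2_prevalence(outcome, degree):
    """RDS-II (Volz-Heckathorn) estimator: weight each respondent by the
    inverse of their reported personal network size (degree)."""
    outcome = np.asarray(outcome, dtype=float)
    weights = 1.0 / np.asarray(degree, dtype=float)
    return np.sum(weights * outcome) / np.sum(weights)

# Toy data: high-degree members are over-recruited, and the outcome is
# concentrated among low-degree members
degree = np.array([2, 2, 3, 5, 8, 10, 15, 20, 30, 40])
outcome = np.array([1, 1, 1, 0, 1, 0, 0, 0, 0, 0])

print("crude prevalence:", outcome.mean())
print("RDS-II adjusted: ", round(rds2_prevalence(outcome, degree), 3))
```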

  18. Methodology Series Module 5: Sampling Strategies

    Science.gov (United States)

    Setia, Maninder Singh

    2016-01-01

    Once the research question and the research design have been finalised, it is important to select the appropriate sample for the study. The method by which the researcher selects the sample is the ‘Sampling Method’. There are essentially two types of sampling methods: 1) probability sampling – based on chance events (such as random numbers, flipping a coin, etc.); and 2) non-probability sampling – based on the researcher's choice and a population that is accessible and available. Some of the non-probability sampling methods are: purposive sampling, convenience sampling, or quota sampling. Random sampling method (such as simple random sample or stratified random sample) is a form of probability sampling. It is important to understand the different sampling methods used in clinical studies and mention this method clearly in the manuscript. The researcher should not misrepresent the sampling method in the manuscript (such as using the term ‘random sample’ when the researcher has used a convenience sample). The sampling method will depend on the research question. For instance, the researcher may want to understand an issue in greater detail for one particular population rather than worry about the ‘generalizability’ of these results. In such a scenario, the researcher may want to use ‘purposive sampling’ for the study. PMID:27688438

  19. Evaluation of common methods for sampling invertebrate pollinator assemblages: net sampling out-perform pan traps.

    Directory of Open Access Journals (Sweden)

    Tony J Popic

    Full Text Available Methods for sampling ecological assemblages strive to be efficient, repeatable, and representative. Unknowingly, common methods may be limited in terms of revealing species function and so of less value for comparative studies. The global decline in pollination services has stimulated surveys of flower-visiting invertebrates, using pan traps and net sampling. We explore the relative merits of these two methods in terms of species discovery, quantifying abundance, function, and composition, and responses of species to changing floral resources. Using a spatially-nested design we sampled across a 5000 km² area of arid grasslands, including 432 hours of net sampling and 1296 pan trap-days, between June 2010 and July 2011. Net sampling yielded 22% more species and 30% higher abundance than pan traps, and better reflected the spatio-temporal variation of floral resources. Species composition differed significantly between methods; from 436 total species, 25% were sampled by both methods, 50% only by nets, and the remaining 25% only by pans. Apart from being less comprehensive, if pan traps do not sample flower-visitors, the link to pollination is questionable. By contrast, net sampling functionally linked species to pollination through behavioural observations of flower-visitation interaction frequency. Netted specimens are also necessary for evidence of pollen transport. Benefits of net-based sampling outweighed minor differences in overall sampling effort. As pan traps and net sampling methods are not equivalent for sampling invertebrate-flower interactions, we recommend net sampling of invertebrate pollinator assemblages, especially if datasets are intended to document declines in pollination and guide measures to retain this important ecosystem service.

  20. Manual versus automated blood sampling

    DEFF Research Database (Denmark)

    Teilmann, A C; Kalliokoski, Otto; Sørensen, Dorte B

    2014-01-01

    Facial vein (cheek blood) and caudal vein (tail blood) phlebotomy are two commonly used techniques for obtaining blood samples from laboratory mice, while automated blood sampling through a permanent catheter is a relatively new technique in mice. The present study compared physiological parameters, glucocorticoid dynamics as well as the behavior of mice sampled repeatedly for 24 h by cheek blood, tail blood or automated blood sampling from the carotid artery. Mice subjected to cheek blood sampling lost significantly more body weight, had elevated levels of plasma corticosterone, excreted more fecal corticosterone metabolites, and expressed more anxious behavior than did the mice of the other groups. Plasma corticosterone levels of mice subjected to tail blood sampling were also elevated, although less significantly. Mice subjected to automated blood sampling were less affected with regard to the parameters...

  1. Specified assurance level sampling procedure

    International Nuclear Information System (INIS)

    Willner, O.

    1980-11-01

    In the nuclear industry design specifications for certain quality characteristics require that the final product be inspected by a sampling plan which can demonstrate product conformance to stated assurance levels. The Specified Assurance Level (SAL) Sampling Procedure has been developed to permit the direct selection of attribute sampling plans which can meet commonly used assurance levels. The SAL procedure contains sampling plans which yield the minimum sample size at stated assurance levels. The SAL procedure also provides sampling plans with acceptance numbers ranging from 0 to 10, thus, making available to the user a wide choice of plans all designed to comply with a stated assurance level
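
    A minimal sketch of how such attribute plans can be derived (an illustration under stated assumptions, not the SAL procedure's actual tables): for each acceptance number c, find the smallest sample size n such that a lot at the limiting fraction defective would be accepted with probability at most 1 - assurance.

```python
from scipy.stats import binom

def min_sample_size(c, p_limit, assurance):
    """Smallest n such that a lot at fraction defective p_limit passes the
    plan (<= c defectives in the sample) with probability <= 1 - assurance."""
    n = c + 1
    while binom.cdf(c, n, p_limit) > 1 - assurance:
        n += 1
    return n

# 90% assurance of rejecting lots that are 5% defective,
# for acceptance numbers c = 0..3
for c in range(4):
    print(f"c = {c}: n = {min_sample_size(c, p_limit=0.05, assurance=0.90)}")
```

    For c = 0 this reduces to the familiar condition (1 - p)^n <= 1 - assurance, which gives n = 45 at p = 0.05 and 90% assurance.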

  2. Test of a sample container for shipment of small size plutonium samples with PAT-2

    International Nuclear Information System (INIS)

    Kuhn, E.; Aigner, H.; Deron, S.

    1981-11-01

    A light-weight container for the air transport of plutonium, to be designated PAT-2, has been developed in the USA and is presently undergoing licensing. The very limited effective space for bearing plutonium required the design of small size sample canisters to meet the needs of international safeguards for the shipment of plutonium samples. The applicability of a small canister for the sampling of small size powder and solution samples has been tested in an intralaboratory experiment. The results of the experiment, based on the concept of pre-weighed samples, show that the tested canister can successfully be used for the sampling of small size PuO2 powder samples of homogeneous source material, as well as for dried aliquands of plutonium nitrate solutions. (author)

  3. Systematic sampling for suspended sediment

    Science.gov (United States)

    Robert B. Thomas

    1991-01-01

    Abstract - Because of high costs or complex logistics, scientific populations cannot be measured entirely and must be sampled. Accepted scientific practice holds that sample selection be based on statistical principles to assure objectivity when estimating totals and variances. Probability sampling--obtaining samples with known probabilities--is the only method that...

  4. Air sampling in the workplace

    International Nuclear Information System (INIS)

    Hickey, E.E.; Stoetzel, G.A.; Strom, D.J.; Cicotte, G.R.; Wiblin, C.M.; McGuire, S.A.

    1993-09-01

    This report provides technical information on air sampling that will be useful for facilities following the recommendations in the NRC's Regulatory Guide 8.25, Revision 1, ''Air sampling in the Workplace.'' That guide addresses air sampling to meet the requirements in NRC's regulations on radiation protection, 10 CFR Part 20. This report describes how to determine the need for air sampling based on the amount of material in process modified by the type of material, release potential, and confinement of the material. The purposes of air sampling and how the purposes affect the types of air sampling provided are discussed. The report discusses how to locate air samplers to accurately determine the concentrations of airborne radioactive materials that workers will be exposed to. The need for and the methods of performing airflow pattern studies to improve the accuracy of air sampling results are included. The report presents and gives examples of several techniques that can be used to evaluate whether the airborne concentrations of material are representative of the air inhaled by workers. Methods to adjust derived air concentrations for particle size are described. Methods to calibrate for volume of air sampled and estimate the uncertainty in the volume of air sampled are described. Statistical tests for determining minimum detectable concentrations are presented. How to perform an annual evaluation of the adequacy of the air sampling is also discussed

  5. SamplingStrata: An R Package for the Optimization of Stratified Sampling

    Directory of Open Access Journals (Sweden)

    Giulio Barcaroli

    2014-11-01

    Full Text Available When designing a sampling survey, constraints are usually set on the desired precision levels regarding one or more target estimates (the Ys). If a sampling frame is available, containing auxiliary information related to each unit (the Xs), it is possible to adopt a stratified sample design. For any given stratification of the frame, in the multivariate case it is possible to solve the problem of the best allocation of units in strata, by minimizing a cost function subject to precision constraints (or, conversely, by maximizing the precision of the estimates under a given budget). The problem is to determine the best stratification in the frame, i.e., the one that ensures the overall minimal cost of the sample necessary to satisfy precision constraints. The Xs can be categorical or continuous; continuous ones can be transformed into categorical ones. The most detailed stratification is given by the Cartesian product of the Xs (the atomic strata). A way to determine the best stratification is to explore exhaustively the set of all possible partitions derivable from the set of atomic strata, evaluating each one by calculating the corresponding cost in terms of the sample required to satisfy precision constraints. This is unaffordable in practical situations, where the dimension of the space of the partitions can be very high. Another possible way is to explore the space of partitions with an algorithm that is particularly suitable in such situations: the genetic algorithm. The R package SamplingStrata, based on the use of a genetic algorithm, allows one to determine the best stratification for a population frame, i.e., the one that ensures the minimum sample cost necessary to satisfy precision constraints, in a multivariate and multi-domain case.
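
    The allocation subproblem solved for each candidate stratification can be illustrated, in the univariate case, with Neyman allocation (a hedged sketch; the package itself handles the multivariate case with Bethel's algorithm and searches the space of stratifications with a genetic algorithm, neither of which is reproduced here).

```python
import numpy as np

def neyman_allocation(N_h, S_h, n_total):
    """Allocate n_total sample units across strata proportionally to
    N_h * S_h (Neyman allocation), which minimizes the variance of the
    estimated mean at a fixed total sample size (univariate case)."""
    N_h = np.asarray(N_h, dtype=float)
    S_h = np.asarray(S_h, dtype=float)
    weights = N_h * S_h
    n_h = n_total * weights / weights.sum()
    return np.maximum(2, np.round(n_h)).astype(int)   # at least 2 per stratum

# Three strata: population sizes and standard deviations of the target Y
print(neyman_allocation(N_h=[5000, 3000, 500], S_h=[10.0, 25.0, 80.0],
                        n_total=300))
```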

  6. Do resettlement variables predict psychiatric treatment outcomes in a sample of asylum-seeking survivors of torture?

    Science.gov (United States)

    Whitsett, David; Sherman, Martin F

    2017-12-01

    Mental health clinicians who work with asylum seekers provide services to patients who face stressful everyday living conditions. However, little is known about how these problems potentially impact psychiatric treatment within these populations. The purpose of this study was thus to examine whether resettlement factors predict outcomes of a mental health intervention for a sample of asylum-seeking survivors of torture. The study included data from a US outpatient clinic that specialized in treating asylum-seeking survivors of torture. Patients (primarily from Iraq, Afghanistan and African countries) were evaluated on demographic factors at intake and psychiatric symptoms throughout the course of treatment. Patients experienced significant reductions in depression, anxiety and trauma symptoms, although symptoms still remained near or above clinical thresholds. Stable, uncrowded housing conditions significantly predicted lower depression, anxiety and trauma symptoms at follow-up. These findings support the hypotheses that individuals seeking asylum within the United States who have survived torture can benefit from psychiatric treatment and emphasize the importance of stable living conditions in improving treatment effectiveness. This suggests the need for further research on social predictors of treatment outcomes, as well as the need for clinicians and policymakers to target improved housing as a potentially important tool to reduce psychiatric problems related to torture and forced migration.

  7. 45 CFR 1356.84 - Sampling.

    Science.gov (United States)

    2010-10-01

    Section 1356.84 Sampling. (a) The State agency may collect and report the information required in section 1356.83(e) of this part on a sample of the baseline population consistent with the sampling requirements...

  8. Environmental surveillance master sampling schedule

    International Nuclear Information System (INIS)

    Bisping, L.E.

    1995-02-01

    Environmental surveillance of the Hanford Site and surrounding areas is conducted by the Pacific Northwest Laboratory (PNL) for the U.S. Department of Energy (DOE). This document contains the planned 1994 schedules for routine collection of samples for the Surface Environmental Surveillance Project (SESP), Drinking Water Project, and Ground-Water Surveillance Project. Samples are routinely collected for the SESP and analyzed to determine the quality of air, surface water, soil, sediment, wildlife, vegetation, foodstuffs, and farm products at the Hanford Site and surrounding communities. The responsibility for monitoring onsite drinking water falls outside the scope of the SESP. PNL conducts the drinking water monitoring project concurrent with the SESP to promote efficiency and consistency, utilize expertise developed over the years, and reduce costs associated with management, procedure development, data management, quality control, and reporting. The ground-water sampling schedule identifies ground-water sampling events used by PNL for environmental surveillance of the Hanford Site. Sampling is indicated as annual, semi-annual, quarterly, or monthly in the sampling schedule. Some samples are collected and analyzed as part of ground-water monitoring and characterization programs at Hanford (e.g. Resource Conservation and Recovery Act (RCRA), Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA), or Operational). The number of samples planned by other programs is identified in the sampling schedule by a number in the analysis column and a project designation in the Cosample column. Well sampling events may be merged to avoid redundancy in cases where sampling is planned by both environmental surveillance and another program.

  9. Environmental surveillance master sampling schedule

    Energy Technology Data Exchange (ETDEWEB)

    Bisping, L.E.

    1995-02-01

    Environmental surveillance of the Hanford Site and surrounding areas is conducted by the Pacific Northwest Laboratory (PNL) for the U.S. Department of Energy (DOE). This document contains the planned 1994 schedules for routine collection of samples for the Surface Environmental Surveillance Project (SESP), Drinking Water Project, and Ground-Water Surveillance Project. Samples are routinely collected for the SESP and analyzed to determine the quality of air, surface water, soil, sediment, wildlife, vegetation, foodstuffs, and farm products at the Hanford Site and surrounding communities. The responsibility for monitoring onsite drinking water falls outside the scope of the SESP. PNL conducts the drinking water monitoring project concurrent with the SESP to promote efficiency and consistency, utilize expertise developed over the years, and reduce costs associated with management, procedure development, data management, quality control, and reporting. The ground-water sampling schedule identifies ground-water sampling events used by PNL for environmental surveillance of the Hanford Site. Sampling is indicated as annual, semi-annual, quarterly, or monthly in the sampling schedule. Some samples are collected and analyzed as part of ground-water monitoring and characterization programs at Hanford (e.g. Resource Conservation and Recovery Act (RCRA), Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA), or Operational). The number of samples planned by other programs is identified in the sampling schedule by a number in the analysis column and a project designation in the Cosample column. Well sampling events may be merged to avoid redundancy in cases where sampling is planned by both environmental surveillance and another program.

  10. Sample Length Affects the Reliability of Language Sample Measures in 3-Year-Olds: Evidence from Parent-Elicited Conversational Samples

    Science.gov (United States)

    Guo, Ling-Yu; Eisenberg, Sarita

    2015-01-01

    Purpose: The goal of this study was to investigate the extent to which sample length affected the reliability of total number of words (TNW), number of different words (NDW), and mean length of C-units in morphemes (MLCUm) in parent-elicited conversational samples for 3-year-olds. Method: Participants were sixty 3-year-olds. A 22-min language…

  11. Operational air sampling report

    International Nuclear Information System (INIS)

    Lyons, C.L.

    1994-03-01

    Nevada Test Site vertical shaft and tunnel events generate beta/gamma fission products. The REECo air sampling program is designed to measure these radionuclides at the various facilities supporting these events. The current testing moratorium and the closure of the Decontamination Facility have decreased the scope of the program significantly. Of the 118 air samples collected in the only active tunnel complex, only one showed any airborne fission products. Tritiated water vapor concentrations were very similar to previously reported levels. The 206 air samples collected at the Area-6 decontamination bays and laundry were again well below any Derived Air Concentration calculation standard. Laboratory analyses of these samples were negative for any airborne fission products.

  12. Efficient sample preparation from complex biological samples using a sliding lid for immobilized droplet extractions.

    Science.gov (United States)

    Casavant, Benjamin P; Guckenberger, David J; Beebe, David J; Berry, Scott M

    2014-07-01

    Sample preparation is a major bottleneck in many biological processes. Paramagnetic particles (PMPs) are a ubiquitous method for isolating analytes of interest from biological samples and are used for their ability to thoroughly sample a solution and be easily collected with a magnet. There are three main methods by which PMPs are used for sample preparation: (1) removal of fluid from the analyte-bound PMPs, (2) removal of analyte-bound PMPs from the solution, and (3) removal of the substrate (with immobilized analyte-bound PMPs). In this paper, we explore the third and least studied method for PMP-based sample preparation using a platform termed Sliding Lid for Immobilized Droplet Extractions (SLIDE). SLIDE leverages principles of surface tension and patterned hydrophobicity to create a simple-to-operate platform for sample isolation (cells, DNA, RNA, protein) and preparation (cell staining) without the need for time-intensive wash steps, use of immiscible fluids, or precise pinning geometries. Compared to other standard isolation protocols using PMPs, SLIDE is able to perform rapid sample preparation with low (0.6%) carryover of contaminants from the original sample. The natural recirculation occurring within the pinned droplets of SLIDE makes it possible to perform multistep cell staining protocols by simply resting the lid over the various sample droplets. SLIDE thus offers a simple, easy-to-use platform for sample preparation on a range of complex biological samples.

  13. Probability Sampling Method for a Hidden Population Using Respondent-Driven Sampling: Simulation for Cancer Survivors.

    Science.gov (United States)

    Jung, Minsoo

    2015-01-01

    When there is no sampling frame within a certain group, or when the group fears that making its membership public would bring social stigma, we say the population is hidden. Such populations are difficult to approach with standard survey methodology because response rates are low and members are often not honest in their responses when probability sampling is used. The only alternative known to address the problems of earlier methods such as snowball sampling is respondent-driven sampling (RDS), which was developed by Heckathorn and his colleagues. RDS is based on a Markov chain and uses the social network information of the respondents. This characteristic allows for probability sampling when surveying a hidden population. We verified through computer simulation whether RDS can be used on a hidden population of cancer survivors. According to the simulation results, the bias of RDS's chain-referral sampling tends to diminish as the sample gets bigger, and the estimates stabilize as the recruitment waves progress. The final sample can therefore be effectively independent of the initial seeds once a sufficient sample size is secured, even if the seeds were selected through convenience sampling. Thus, RDS can be considered an alternative that improves upon both key-informant sampling and ethnographic surveys, and it deserves to be applied to a wider range of domestic cases.
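
    To make the chain-referral mechanism concrete, here is a minimal sketch (not the author's simulator) of RDS recruitment as coupon-limited referral chains on a contact network; the network layout, seed count, and coupon limit are illustrative assumptions.

```python
# A minimal sketch of respondent-driven sampling: seeds recruit contacts,
# recruits recruit further contacts, and each referral is one Markov step.
import random
from collections import deque

def simulate_rds(neighbors, n_seeds=3, coupons=3, target=200, seed=1):
    """neighbors: dict mapping each person to a list of contacts."""
    rng = random.Random(seed)
    seeds = rng.sample(list(neighbors), n_seeds)   # convenience-chosen seeds
    recruited = set(seeds)
    wave = {s: 0 for s in seeds}                   # seeds are wave 0
    queue = deque(seeds)
    while queue and len(recruited) < target:
        person = queue.popleft()
        unrecruited = [c for c in neighbors[person] if c not in recruited]
        for contact in rng.sample(unrecruited, min(coupons, len(unrecruited))):
            recruited.add(contact)
            wave[contact] = wave[person] + 1       # one chain-referral step
            queue.append(contact)
    return recruited, wave
```

    Tracking each recruit's wave index is what allows one to check the claim above: that estimates stabilize, and grow independent of the seeds, as the waves progress.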

  14. Uranium tailings sampling manual

    International Nuclear Information System (INIS)

    Feenstra, S.; Reades, D.W.; Cherry, J.A.; Chambers, D.B.; Case, G.G.; Ibbotson, B.G.

    1985-01-01

    The purpose of this manual is to describe the requisite sampling procedures for the application of uniform high-quality standards to detailed geotechnical, hydrogeological, geochemical and air quality measurements at Canadian uranium tailings disposal sites. The selection and implementation of applicable sampling procedures for such measurements at uranium tailings disposal sites are complicated by two primary factors. Firstly, the physical and chemical nature of uranium mine tailings and effluent is considerably different from natural soil materials and natural waters. Consequently, many conventional methods for the collection and analysis of natural soils and waters are not directly applicable to tailings. Secondly, there is a wide range in the physical and chemical nature of uranium tailings. The composition of the ore, the milling process, the nature of tailings deposition, and effluent treatment vary considerably and are highly site-specific. Therefore, the definition and implementation of sampling programs for uranium tailings disposal sites require considerable evaluation, and often innovation, to ensure that appropriate sampling and analysis methods are used which provide the flexibility to take into account site-specific considerations. The following chapters describe the objective and scope of a sampling program, preliminary data collection, and the procedures for sampling of tailings solids, surface water and seepage, tailings pore-water, and wind-blown dust and radon.

  15. A method of language sampling

    DEFF Research Database (Denmark)

    Rijkhoff, Jan; Bakker, Dik; Hengeveld, Kees

    1993-01-01

    In recent years more attention has been paid to the quality of language samples in typological work. Without an adequate sampling strategy, samples may suffer from various kinds of bias. In this article we propose a sampling method in which the genetic criterion is taken as the most important: samples...... created with this method will reflect optimally the diversity of the languages of the world. On the basis of the internal structure of each genetic language tree a measure is computed that reflects the linguistic diversity in the language families represented by these trees. This measure is used...... to determine how many languages from each phylum should be selected, given any required sample size....

  16. Self-sampling with HPV mRNA analyses from vagina and urine compared with cervical samples.

    Science.gov (United States)

    Asciutto, Katrin Christine; Ernstson, Avalon; Forslund, Ola; Borgfeldt, Christer

    2018-04-01

    In order to increase coverage in the organized cervical screening program, self-sampling with HPV analyses has been suggested. The aim was to compare human papillomavirus (HPV) mRNA detection in vaginal and urine self-collected samples with clinician-taken cervical samples and the corresponding clinician-taken histological specimens. Self-collected vaginal, urine and clinician-taken cervical samples were analyzed from 209 women with the Aptima mRNA assay (Hologic Inc, MA, USA). Cervical cytology, colposcopy, biopsy and/or the loop electrosurgical excision procedure (LEEP) were performed in every examination. The sensitivity of the HPV mRNA test in detecting high-grade squamous intraepithelial lesions (HSIL)/adenocarcinoma in situ (AIS)/cancer cases was as follows: for the vaginal self-samples 85.5% (95% CI; 75.0-92.8), the urinary samples 44.8% (95% CI; 32.6-57.4), and for routine cytology 81.7% (95% CI; 70.7-89.9). For the clinician-taken cervical HPV samples the sensitivity of the HPV mRNA test in detecting HSIL/AIS/cancer was 100.0% (95% CI; 94.9-100.0). The specificity of the HPV mRNA was similar for the clinician-taken cervical HPV samples and the self-samples: 49.0% vs. 48.1%. The urinary HPV samples had a specificity of 61.9% and cytology had a specificity of 93.3%. The sensitivity of the Aptima HPV mRNA test in detecting HSIL/AIS/cancer from vaginal self-samples was similar to that of routine cytology. The Aptima HPV mRNA vaginal self-sampling analysis may serve as a complement in screening programs. Copyright © 2018 Elsevier B.V. All rights reserved.
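
    As an aside on how sensitivity figures such as those above are computed, the sketch below derives a sensitivity and a Wilson score confidence interval from hypothetical counts (59 of 69 verified cases detected, chosen to match 85.5%); the study does not state which interval method it used, so the numbers will agree only approximately.

```python
# A minimal sketch: sensitivity with a Wilson score 95% CI from raw counts.
from math import sqrt
from scipy.stats import norm

def sensitivity_ci(tp, n_pos, alpha=0.05):
    """Wilson score interval for sensitivity = tp / n_pos."""
    z = norm.ppf(1 - alpha / 2)
    p = tp / n_pos
    denom = 1 + z**2 / n_pos
    centre = (p + z**2 / (2 * n_pos)) / denom
    half = z * sqrt(p * (1 - p) / n_pos + z**2 / (4 * n_pos**2)) / denom
    return p, (centre - half, centre + half)

# sensitivity_ci(59, 69) -> (0.855, (~0.75, ~0.92)), close to the reported CI
```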

  17. An Internationally Coordinated Science Management Plan for Samples Returned from Mars

    Science.gov (United States)

    Haltigin, T.; Smith, C. L.

    2015-12-01

    Mars Sample Return (MSR) remains a high priority of the planetary exploration community. Such an effort will undoubtedly be too large for any individual agency to conduct itself, and thus will require extensive global cooperation. To help prepare for an eventual MSR campaign, the International Mars Exploration Working Group (IMEWG) chartered the international Mars Architecture for the Return of Samples (iMARS) Phase II working group in 2014, consisting of representatives from 17 countries and agencies. The overarching task of the team was to provide recommendations for progressing towards campaign implementation, including a proposed science management plan. Building upon the iMARS Phase I (2008) outcomes, the Phase II team proposed the development of an International MSR Science Institute as part of the campaign governance, centering its deliberations around four themes: Organization: including an organizational structure for the Institute that outlines roles and responsibilities of key members and describes sample return facility requirements; Management: presenting issues surrounding scientific leadership, defining guidelines and assumptions for Institute membership, and proposing a possible funding model; Operations & Data: outlining a science implementation plan that details the preliminary sample examination flow, sample allocation process, and data policies; and Curation: introducing a sample curation plan that comprises sample tracking and routing procedures, sample sterilization considerations, and long-term archiving recommendations. This work presents a summary of the group's activities, findings, and recommendations, highlighting the role of international coordination in managing the returned samples.

  18. Sampling Operations on Big Data

    Science.gov (United States)

    2015-11-29

    Gadepally, Vijay; Herr, Taylor; Johnson, Luke; Milechin, Lauren; Milosavljevic, Maja; Miller, Benjamin A.; Lincoln… …process and disseminate information for discovery and exploration under real-time constraints. Common signal processing operations such as sampling and… …categories. These include edge sampling methods, where edges are selected by a predetermined criterion; snowball sampling methods, where algorithms start…

  19. Effects of systematic sampling on satellite estimates of deforestation rates

    International Nuclear Information System (INIS)

    Steininger, M K; Godoy, F; Harper, G

    2009-01-01

    Options for satellite monitoring of deforestation rates over large areas include the use of sampling. Sampling may reduce the cost of monitoring but is also a source of error in estimates of areas and rates. A common sampling approach is systematic sampling, in which sample units of a constant size are distributed in some regular manner, such as a grid. The proposed approach for the 2010 Forest Resources Assessment (FRA) of the UN Food and Agriculture Organization (FAO) is a systematic sample of 10 km wide squares at every 1 deg. intersection of latitude and longitude. We assessed the outcome of this and other systematic samples for estimating deforestation at national, sub-national and continental levels. The study is based on digital data on deforestation patterns for the five Amazonian countries outside Brazil plus the Brazilian Amazon. We tested these schemes by varying sample-unit size and frequency. We calculated two estimates of sampling error. First we calculated the standard errors, based on the size, variance and covariance of the samples, and from this calculated the 95% confidence intervals (CI). Second, we calculated the actual errors, based on the difference between the sample-based estimates and the estimates from the full-coverage maps. At the continental level, the 1 deg., 10 km scheme had a CI of 21% and an actual error of 8%. At the national level, this scheme had CIs of 126% for Ecuador and up to 67% for other countries. At this level, increasing sampling density to every 0.25 deg. produced a CI of 32% for Ecuador and CIs of up to 25% for other countries, with only Brazil having a CI of less than 10%. Actual errors were within the limits of the CIs in all but two of the 56 cases. Actual errors were half or less of the CIs in all but eight of these cases. These results indicate that the FRA 2010 should have CIs of smaller than or close to 10% at the continental level. However, systematic sampling at the national level yields large CIs unless the
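
    A minimal sketch of the kind of confidence-interval calculation involved (treating, for simplicity, the systematic sample as if it were a simple random sample; the study's actual standard errors also account for the covariance of the sample units):

```python
# A minimal sketch: estimate a deforestation rate from per-unit observations
# and attach a 95% confidence interval, also expressed as a % of the estimate.
import numpy as np

def rate_with_ci(unit_rates, z=1.96):
    """unit_rates: deforestation rate observed in each sample unit."""
    r = np.asarray(unit_rates, dtype=float)
    est = r.mean()
    se = r.std(ddof=1) / np.sqrt(r.size)     # standard error of the mean
    return est, (est - z * se, est + z * se), 100 * z * se / est

# e.g. rate_with_ci(np.random.default_rng(0).gamma(2, 0.001, 100))
```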

  20. Estimation of plant sampling uncertainty: an example based on chemical analysis of moss samples.

    Science.gov (United States)

    Dołęgowska, Sabina

    2016-11-01

    In order to estimate the level of uncertainty arising from sampling, 54 samples (primary and duplicate) of the moss species Pleurozium schreberi (Brid.) Mitt. were collected within three forested areas (Wierna Rzeka, Piaski, Posłowice Range) in the Holy Cross Mountains (south-central Poland). During the fieldwork, each primary sample composed of 8 to 10 increments (subsamples) was taken over an area of 10 m², whereas duplicate samples were collected in the same way at a distance of 1-2 m. Subsequently, all samples were triple rinsed with deionized water, dried, milled, and digested (8 mL HNO₃ (1:1) + 1 mL 30% H₂O₂) in a closed microwave system Multiwave 3000. The prepared solutions were analyzed twice for Cu, Fe, Mn, and Zn using FAAS and GFAAS techniques. All datasets were checked for normality, and for normally distributed elements (Cu from Piaski; Zn from Posłowice; Fe and Zn from Wierna Rzeka) the sampling uncertainty was computed with (i) classical ANOVA, (ii) classical RANOVA, (iii) modified RANOVA, and (iv) range statistics. For the remaining elements, the sampling uncertainty was calculated with traditional and/or modified RANOVA (if the amount of outliers did not exceed 10%) or classical ANOVA after Box-Cox transformation (if the amount of outliers exceeded 10%). The highest concentrations of all elements were found in moss samples from Piaski, whereas the sampling uncertainty calculated with the different statistical methods ranged from 4.1 to 22%.
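
    The range-statistics idea can be illustrated with a short sketch (an assumed data layout, not the study's code): for a primary/duplicate pair, (x1 − x2)²/2 estimates the combined sampling-plus-analysis variance at that location, and pooling over pairs gives a relative uncertainty.

```python
# A minimal sketch: pooled relative sampling uncertainty (%) from duplicates.
import numpy as np

def duplicate_rsd(primary, duplicate):
    """primary, duplicate: concentrations measured on paired samples."""
    x1 = np.asarray(primary, dtype=float)
    x2 = np.asarray(duplicate, dtype=float)
    mean = (x1 + x2) / 2.0
    rel_var = (x1 - x2) ** 2 / (2.0 * mean**2)   # per-pair relative variance
    return 100.0 * np.sqrt(rel_var.mean())

# e.g. duplicate_rsd([8.1, 7.4, 9.0], [7.6, 8.0, 8.5]) -> pooled RSD in %
```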

  1. On incomplete sampling under birth-death models and connections to the sampling-based coalescent.

    Science.gov (United States)

    Stadler, Tanja

    2009-11-07

    The constant rate birth-death process is used as a stochastic model for many biological systems, for example phylogenies or disease transmission. As the biological data are usually not fully available, it is crucial to understand the effect of incomplete sampling. In this paper, we analyze the constant rate birth-death process with incomplete sampling. We derive the density of the bifurcation events for trees on n leaves which evolved under this birth-death-sampling process. This density is used for calculating prior distributions in Bayesian inference programs and for efficiently simulating trees. We show that the birth-death-sampling process can be interpreted as a birth-death process with reduced rates and complete sampling. This shows that joint inference of birth rate, death rate and sampling probability is not possible. The birth-death-sampling process is compared to the sampling-based population genetics model, the coalescent. It is shown that despite many similarities between these two models, the distribution of bifurcation times remains different even in the case of very large population sizes. We illustrate these findings on a hepatitis C virus dataset from Egypt. We show that the transmission time estimates are significantly different: the widely used Gamma statistic even changes its sign from negative to positive when switching from the coalescent to the birth-death process.
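
    A minimal Gillespie-style sketch of the birth-death-sampling process described above (an illustration of sampling each tip with probability rho, not the paper's analytical derivation; all parameter values are placeholders):

```python
# A minimal sketch: simulate lineage counts under a constant-rate birth-death
# process, then model incomplete sampling by keeping each tip w.p. rho.
import numpy as np

def birth_death_sampled(lam=1.0, mu=0.5, rho=0.7, t_max=5.0, seed=0):
    rng = np.random.default_rng(seed)
    t, n = 0.0, 1                                # one ancestral lineage
    while n > 0:
        t += rng.exponential(1.0 / (n * (lam + mu)))  # time to next event
        if t >= t_max:
            break
        n += 1 if rng.random() < lam / (lam + mu) else -1  # birth or death
    return n, (rng.binomial(n, rho) if n > 0 else 0)       # true vs sampled tips
```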

  2. A simulative comparison of respondent driven sampling with incentivized snowball sampling – the “strudel effect”

    Science.gov (United States)

    Gyarmathy, V. Anna; Johnston, Lisa G.; Caplinskiene, Irma; Caplinskas, Saulius; Latkin, Carl A.

    2014-01-01

    Background Respondent driven sampling (RDS) and Incentivized Snowball Sampling (ISS) are two sampling methods that are commonly used to reach people who inject drugs (PWID). Methods We generated a set of simulated RDS samples on an actual sociometric ISS sample of PWID in Vilnius, Lithuania (“original sample”) to assess if the simulated RDS estimates were statistically significantly different from the original ISS sample prevalences for HIV (9.8%), Hepatitis A (43.6%), Hepatitis B (Anti-HBc 43.9% and HBsAg 3.4%), Hepatitis C (87.5%), syphilis (6.8%) and Chlamydia (8.8%) infections and for selected behavioral risk characteristics. Results The original sample consisted of a large component of 249 people (83% of the sample) and 13 smaller components with 1 to 12 individuals. Generally, as long as all seeds were recruited from the large component of the original sample, the simulation samples simply recreated the large component. There were no significant differences between the large component and the entire original sample for the characteristics of interest. Altogether 99.2% of 360 simulation sample point estimates were within the confidence interval of the original prevalence values for the characteristics of interest. Conclusions When population characteristics are reflected in large network components that dominate the population, RDS and ISS may produce samples that have statistically non-different prevalence values, even though some isolated network components may be under-sampled and/or statistically significantly different from the main groups. This so-called “strudel effect” is discussed in the paper. PMID:24360650

  3. Reconstructing genealogies of serial samples under the assumption of a molecular clock using serial-sample UPGMA.

    Science.gov (United States)

    Drummond, A; Rodrigo, A G

    2000-12-01

    Reconstruction of evolutionary relationships from noncontemporaneous molecular samples provides a new challenge for phylogenetic reconstruction methods. With recent biotechnological advances there has been an increase in molecular sequencing throughput, and the potential to obtain serial samples of sequences from populations, including rapidly evolving pathogens, is fast being realized. A new method called the serial-sample unweighted pair grouping method with arithmetic means (sUPGMA) is presented that reconstructs a genealogy or phylogeny of sequences sampled serially in time using a matrix of pairwise distances. The resulting tree depicts the terminal lineages of each sample ending at a different level consistent with the sample's temporal order. Since sUPGMA is a variant of UPGMA, it will perform best when sequences have evolved at a constant rate (i.e., according to a molecular clock). On simulated data, this new method performs better than standard cluster analysis under a variety of longitudinal sampling strategies. Serial-sample UPGMA is particularly useful for analysis of longitudinal samples of viruses and bacteria, as well as ancient DNA samples, with the minimal requirement that samples of sequences be ordered in time.
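
    For orientation, the sketch below runs plain UPGMA (average linkage) on a pairwise distance matrix with SciPy; sUPGMA's serial-sample correction, which lets tips terminate at levels matching their sampling times, is not reproduced here.

```python
# A minimal sketch of the UPGMA baseline that sUPGMA modifies.
import numpy as np
from scipy.spatial.distance import squareform
from scipy.cluster.hierarchy import linkage

D = np.array([[0.0, 0.3, 0.5],
              [0.3, 0.0, 0.4],
              [0.5, 0.4, 0.0]])                  # pairwise distances (illustrative)
tree = linkage(squareform(D), method="average")  # "average" linkage == UPGMA
print(tree)  # each row: clusters merged, merge height, new cluster size
```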

  4. The experience sampling method: Investigating students' affective experience

    Science.gov (United States)

    Nissen, Jayson M.; Stetzer, MacKenzie R.; Shemwell, Jonathan T.

    2013-01-01

    Improving non-cognitive outcomes such as attitudes, efficacy, and persistence in physics courses is an important goal of physics education. This investigation implemented an in-the-moment surveying technique called the Experience Sampling Method (ESM) [1] to measure students' affective experience in physics. Measurements included: self-efficacy, cognitive efficiency, activation, intrinsic motivation, and affect. Data are presented that show contrasts in students' experiences (e.g., in physics vs. non-physics courses).

  5. Automatic sample changers maintenance manual

    International Nuclear Information System (INIS)

    Myers, T.A.

    1978-10-01

    This manual describes and provides trouble-shooting aids for the Automatic Sample Changer electronics on the automatic beta counting system developed by the Los Alamos Scientific Laboratory Group CNC-11. The output of a gas detector is shaped by a preamplifier and then coupled to an amplifier. The amplifier output is discriminated and becomes the input to a scaler. An identification number is associated with each sample. At a predetermined count length, the identification number, scaler data, and other information are punched out on a data card. The next sample to be counted is then automatically selected. The beta counter uses the same electronics for each count, the only difference being the sample itself and its identification number. This manual is intended as a step-by-step aid in trouble-shooting the electronics associated with positioning the sample, counting the sample, and punching the needed data on an 80-column data card.

  6. Sample size determination for logistic regression on a logit-normal distribution.

    Science.gov (United States)

    Kim, Seongho; Heath, Elisabeth; Heilbrun, Lance

    2017-06-01

    Although the sample size for simple logistic regression can be readily determined using currently available methods, the sample size calculation for multiple logistic regression requires some additional information, such as the coefficient of determination (R²) of a covariate of interest with other covariates, which is often unavailable in practice. The response variable of logistic regression follows a logit-normal distribution, which can be generated from a logistic transformation of a normal distribution. Using this property of logistic regression, we propose new methods of determining the sample size for simple and multiple logistic regressions using a normal transformation of outcome measures. Simulation studies and a motivating example show several advantages of the proposed methods over the existing methods: (i) no need for R² for multiple logistic regression, (ii) available interim or group-sequential designs, and (iii) much smaller required sample size.
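
    For comparison, here is a classical normal-approximation calculation (not the authors' logit-normal method) for the special case of simple logistic regression with a single balanced binary covariate, where the test reduces to a two-proportion comparison; the probabilities in the example are illustrative.

```python
# A minimal sketch: sample size per group for detecting p1 vs p2.
from math import ceil, sqrt
from scipy.stats import norm

def n_per_group(p1, p2, alpha=0.05, power=0.80):
    za, zb = norm.ppf(1 - alpha / 2), norm.ppf(power)
    pbar = (p1 + p2) / 2
    num = (za * sqrt(2 * pbar * (1 - pbar))
           + zb * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return ceil(num / (p1 - p2) ** 2)

# n_per_group(0.20, 0.35) -> 138 per group (illustrative probabilities)
```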

  7. On sampling social networking services

    OpenAIRE

    Wang, Baiyang

    2012-01-01

    This article aims at summarizing the existing methods for sampling social networking services and proposing a faster confidence interval for related sampling methods. It also includes comparisons of common network sampling techniques.

  8. Sample collection and documentation

    International Nuclear Information System (INIS)

    Cullings, Harry M.; Fujita, Shoichiro; Watanabe, Tadaaki; Yamashita, Tomoaki; Tanaka, Kenichi; Endo, Satoru; Shizuma, Kiyoshi; Hoshi, Masaharu; Hasai, Hiromi

    2005-01-01

    Beginning within a few weeks after the bombings and periodically during the intervening decades, investigators in Hiroshima and Nagasaki have collected samples of materials that were in the cities at the time of the bombings. Although some early efforts were not driven by specific measurement objectives, many others were. Even some of the very earliest samples collected in 1945 were based on carefully conceived research plans and detailed specifications for samples appropriate to particular retrospective measurements, i.e., of particular residual quantities remaining from exposure to the neutrons and gamma rays from the bombs. This chapter focuses mainly on the work of groups at two institutions that have actively collaborated since the 1980s in major collection efforts and have shared samples among themselves and with other investigators: the Radiation Effects Research Foundation (RERF) and its predecessor the Atomic Bomb Casualty Commission (ABCC), and Hiroshima University. In addition, a number of others are listed, who also contributed to the literature by their collection of samples. (J.P.N.)

  9. Sample Handling Considerations for a Europa Sample Return Mission: An Overview

    Science.gov (United States)

    Fries, M. D.; Calaway, M. L.; Evans, C. A.; McCubbin, F. M.

    2015-01-01

    The intent of this abstract is to provide a basic overview of mission requirements for a generic Europan plume sample return mission, based on NASA Curation experience across sample return missions ranging from Apollo to OSIRIS-REx. This should be useful for mission conception and early stage planning. We will break the mission down into Outbound and Return legs and discuss them separately.

  10. Sampling method of environmental radioactivity monitoring

    International Nuclear Information System (INIS)

    1984-01-01

    This manual provides sampling methods for environmental samples of airborne dust, precipitated dust, precipitated water (rain or snow), fresh water, soil, river or lake sediment, discharged water from a nuclear facility, grains, tea, milk, pasture grass, limnetic organisms, daily diet, index organisms, sea water, marine sediment, and marine organisms, as well as sampling for tritium and radioiodine determination, for radiation monitoring of radioactive fallout or radioactivity released by nuclear facilities. The manual aims to present standard sampling procedures for environmental radioactivity monitoring regardless of the monitoring objective, and it gives preservation methods for environmental samples acquired at the sampling point prior to radiation counting (samples of the human body excepted). The sampling techniques adopted in this manual were selected on the criteria that they be suitable for routine monitoring and require no special skill. Based on these principles, the manual presents the outline and aims of sampling, the sampling position or object, sampling quantity, apparatus, equipment or vessels for sampling, sampling location, sampling procedures, pretreatment and preparation procedures of a sample for radiation counting, necessary recording items for sampling, and sample transportation procedures. Special attention is given in the chapter on tritium and radioiodine, because these radionuclides might be lost under the preservation methods mentioned above, which are intended for radiation counting of radionuclides less volatile than tritium or radioiodine. (Takagi, S.)

  11. Comet coma sample return instrument

    Science.gov (United States)

    Albee, A. L.; Brownlee, Don E.; Burnett, Donald S.; Tsou, Peter; Uesugi, K. T.

    1994-01-01

    The sample collection technology and instrument concept for the Sample of Comet Coma Earth Return Mission (SOCCER) are described. The scientific goals of this Flyby Sample Return are to return coma dust and volatile samples from a known comet source, which will permit accurate elemental and isotopic measurements for thousands of individual solid particles and volatiles, detailed analysis of the dust structure, morphology, and mineralogy of the intact samples, and identification of the biogenic elements or compounds in the solid and volatile samples. With these intact samples, morphologic, petrographic, and phase-structural features can be determined. Information on dust particle size, shape, and density can be ascertained by analyzing penetration holes and tracks in the capture medium. Time and spatial data of dust capture will provide understanding of the flux dynamics of the coma and the jets. Additional information will include the identification of cosmic ray tracks in the cometary grains, which can provide a particle's process history and perhaps even the age of the comet. The measurements will be made with the same equipment used for studying micrometeorites for decades past; hence, the results can be directly compared without extrapolation or modification. The data will provide a powerful and direct technique for comparing the cometary samples with all known types of meteorites and interplanetary dust. This sample collection system will provide the first sample return from a specifically identified primitive body and will allow, for the first time, a direct method of matching meteoritic materials captured on Earth with known parent bodies.

  12. Soil sampling in emergency situations

    International Nuclear Information System (INIS)

    Carvalho, Zenildo Lara de; Ramos Junior, Anthenor Costa

    1997-01-01

    The soil sampling methods used after the Goiania accident (1987) by the environmental team of the Brazilian Nuclear Energy Commission (CNEN) are described. The development of this method into an emergency sampling method, used in a nuclear emergency exercise at the Angra dos Reis reactor site (1991), is presented. A new method for soil sampling based on Chernobyl environmental monitoring experience (1995) is suggested. (author)

  13. Feasibility of automated speech sample collection with stuttering children using interactive voice response (IVR) technology.

    Science.gov (United States)

    Vogel, Adam P; Block, Susan; Kefalianos, Elaina; Onslow, Mark; Eadie, Patricia; Barth, Ben; Conway, Laura; Mundt, James C; Reilly, Sheena

    2015-04-01

    To investigate the feasibility of adopting automated interactive voice response (IVR) technology for remotely capturing standardized speech samples from stuttering children. Participants were ten 6-year-old stuttering children. Their parents called a toll-free number from their homes and were prompted to elicit speech from their children using a standard protocol involving conversation, picture description and games. The automated IVR system was implemented using an off-the-shelf telephony software program and delivered by a standard desktop computer. The software infrastructure utilizes voice over internet protocol. Speech samples were automatically recorded during the calls. Video recordings were simultaneously acquired in the home at the time of the call to evaluate the fidelity of the telephone-collected samples. Key outcome measures included syllables spoken, percentage of syllables stuttered and an overall rating of stuttering severity using a 10-point scale. Data revealed a high level of relative reliability in terms of intra-class correlation between the video and telephone-acquired samples on all outcome measures during the conversation task. Findings were less consistent for speech samples during picture description and games. Results suggest that IVR technology can be used successfully to automate remote capture of child speech samples.

  14. Nonuniform sampling by quantiles

    Science.gov (United States)

    Craft, D. Levi; Sonstrom, Reilly E.; Rovnyak, Virginia G.; Rovnyak, David

    2018-03-01

    A flexible strategy for choosing samples nonuniformly from a Nyquist grid using the concept of statistical quantiles is presented for broad classes of NMR experimentation. Quantile-directed scheduling is intuitive and flexible for any weighting function, promotes reproducibility and seed independence, and is generalizable to multiple dimensions. In brief, weighting functions are divided into regions of equal probability, which define the samples to be acquired. Quantile scheduling therefore achieves close adherence to a probability distribution function, thereby minimizing gaps for any given degree of subsampling of the Nyquist grid. A characteristic of quantile scheduling is that one-dimensional, weighted NUS schedules are deterministic; however, higher-dimensional schedules are similar within a user-specified jittering parameter. To develop unweighted sampling, we investigated the minimum jitter needed to disrupt subharmonic tracts, and show that this criterion can be met in many cases by jittering within 25-50% of the subharmonic gap. For nD-NUS, three supplemental components to choosing samples by quantiles are proposed in this work: (i) forcing the corner samples to ensure sampling to specified maximum values in indirect evolution times, (ii) providing an option to triangular backfill sampling schedules to promote dense/uniform tracts at the beginning of signal evolution periods, and (iii) providing an option to force the edges of nD-NUS schedules to be identical to the 1D quantiles. Quantile-directed scheduling meets the diverse needs of current NUS experimentation, but can also be used for future NUS implementations such as off-grid NUS and more. A computer program implementing these principles (a.k.a. QSched) in 1D- and 2D-NUS is available under the general public license.
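
    A minimal sketch of the 1D quantile idea as described (my reading of the approach, not the QSched program): divide the weighting function into m regions of equal probability and acquire one Nyquist grid point per region.

```python
# A minimal sketch: pick m grid indices at the quantiles of a weighting function.
import numpy as np

def quantile_schedule(weights, m):
    """weights: sampling density over an N-point Nyquist grid; m: samples."""
    w = np.asarray(weights, dtype=float)
    cdf = np.cumsum(w) / w.sum()
    targets = (np.arange(m) + 0.5) / m        # midpoints of m equal-probability bins
    return np.unique(np.searchsorted(cdf, targets))  # unique guards vs. duplicates

# e.g. an exponentially weighted 1D schedule, 64 of 256 points:
# quantile_schedule(np.exp(-np.arange(256) / 64.0), 64)
```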

  15. Apparatus and method for maintaining multi-component sample gas constituents in vapor phase during sample extraction and cooling

    Science.gov (United States)

    Felix, Larry Gordon; Farthing, William Earl; Irvin, James Hodges; Snyder, Todd Robert

    2010-05-11

    A dilution apparatus for diluting a gas sample. The apparatus includes a sample gas conduit having a sample gas inlet end and a diluted sample gas outlet end, and a sample gas flow restricting orifice disposed proximate the sample gas inlet end connected with the sample gas conduit and providing fluid communication between the exterior and the interior of the sample gas conduit. A diluted sample gas conduit is provided within the sample gas conduit having a mixing end with a mixing space inlet opening disposed proximate the sample gas inlet end, thereby forming an annular space between the sample gas conduit and the diluted sample gas conduit. The mixing end of the diluted sample gas conduit is disposed at a distance from the sample gas flow restricting orifice. A dilution gas source connected with the sample gas inlet end of the sample gas conduit is provided for introducing a dilution gas into the annular space, and a filter is provided for filtering the sample gas. The apparatus is particularly suited for diluting heated sample gases containing one or more condensable components.

  16. Planetary Sample Caching System Design Options

    Science.gov (United States)

    Collins, Curtis; Younse, Paulo; Backes, Paul

    2009-01-01

    Potential Mars Sample Return missions would aspire to collect small core and regolith samples using a rover with a sample acquisition tool and sample caching system. Samples would need to be stored in individual sealed tubes in a canister that could be transferred to a Mars ascent vehicle and returned to Earth. A sample handling, encapsulation and containerization system (SHEC) has been developed as part of an integrated system for acquiring and storing core samples for application to future potential MSR and other potential sample return missions. Requirements and design options for the SHEC system were studied and a recommended design concept developed. Two families of solutions were explored: (1) transfer of a raw sample from the tool to the SHEC subsystem, and (2) transfer of a tube containing the sample to the SHEC subsystem. The recommended design utilizes sample tool bit change-out as the mechanism for transferring tubes to, and samples in tubes from, the tool. The SHEC subsystem design, called the Bit Changeout Caching (BiCC) design, is intended for operations on a MER-class rover.

  17. The new LLNL AMS sample changer

    International Nuclear Information System (INIS)

    Roberts, M.L.; Norman, P.J.; Garibaldi, J.L.; Hornady, R.S.

    1993-01-01

    The Center for Accelerator Mass Spectrometry at LLNL has installed a new 64-position AMS sample changer on our spectrometer. This new sample changer can be controlled manually by an operator or automatically by the AMS data acquisition computer. Automatic control of the sample changer by the data acquisition system is a necessary step towards unattended AMS operation in our laboratory. The sample changer uses a fiber optic shaft encoder for rough rotational indexing of the sample wheel and a series of sequenced pneumatic cylinders for final mechanical indexing of the wheel and insertion and retraction of samples. Transit time from sample to sample varies from 4 s to 19 s, depending on the distance moved. Final sample location can be set to within 50 microns on the x and y axes and within 100 microns on the z axis. Changing sample wheels on the new sample changer is also easier and faster than was possible on our previous sample changer and does not require the use of any tools.

  18. Genetic Sample Inventory - NRDA

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This database archives genetic tissue samples from marine mammals collected in the North-Central Gulf of Mexico from 2010-2015. The collection includes samples from...

  19. Two phase sampling

    CERN Document Server

    Ahmad, Zahoor; Hanif, Muhammad

    2013-01-01

    The development of estimators of population parameters based on two-phase sampling schemes has seen a dramatic increase in the past decade. Various authors have developed estimators of population parameters using either one or two auxiliary variables. The present volume is a comprehensive collection of estimators available in single and two phase sampling. The book covers estimators which utilize information on single, two and multiple auxiliary variables of both quantitative and qualitative nature. Th...

  20. Analysis of the Touch-And-Go Surface Sampling Concept for Comet Sample Return Missions

    Science.gov (United States)

    Mandic, Milan; Acikmese, Behcet; Bayard, David S.; Blackmore, Lars

    2012-01-01

    This paper studies the Touch-and-Go (TAG) concept for enabling a spacecraft to take a sample from the surface of a small primitive body, such as an asteroid or comet. The idea behind the TAG concept is to let the spacecraft descend to the surface, make contact with the surface for several seconds, and then ascend to a safe location. Sampling would be accomplished by an end-effector that is active during the few seconds of surface contact. The TAG event is one of the most critical events in a primitive body sample-return mission. The purpose of this study is to evaluate the dynamic behavior of a representative spacecraft during the TAG event, i.e., immediately prior to, during, and after surface contact of the sampler. The study evaluates the sample-collection performance of the proposed sampling end-effector, in this case a brushwheel sampler, while acquiring material from the surface during the contact. A main result of the study is a guidance and control (G&C) validation of the overall TAG concept, in addition to specific contributions to demonstrating the effectiveness of using nonlinear clutch mechanisms in the sampling arm joints, and increasing the length of the sampling arms to improve robustness.

  1. Proteomic Challenges: Sample Preparation Techniques for Microgram-Quantity Protein Analysis from Biological Samples

    Directory of Open Access Journals (Sweden)

    Peter Feist

    2015-02-01

    Proteins regulate many cellular functions, and analyzing the presence and abundance of proteins in biological samples is a central focus of proteomics. The discovery and validation of biomarkers, pathways, and drug targets for various diseases can be accomplished using mass spectrometry-based proteomics. However, with mass-limited samples like tumor biopsies, it can be challenging to obtain sufficient amounts of proteins to generate high-quality mass spectrometric data. Techniques developed for macroscale quantities recover sufficient amounts of protein from milligram quantities of starting material, but sample losses become crippling with these techniques when only microgram amounts of material are available. To combat this challenge, proteomicists have developed micro-scale techniques that are compatible with decreased sample size (100 μg or lower) and still enable excellent proteome coverage. Extraction, contaminant removal, protein quantitation, and sample handling techniques for the microgram protein range are reviewed here, with an emphasis on liquid chromatography and bottom-up mass spectrometry-compatible techniques. Also, a range of biological specimens, including mammalian tissues and model cell culture systems, are discussed.

  2. Proteomic Challenges: Sample Preparation Techniques for Microgram-Quantity Protein Analysis from Biological Samples

    Science.gov (United States)

    Feist, Peter; Hummon, Amanda B.

    2015-01-01

    Proteins regulate many cellular functions, and analyzing the presence and abundance of proteins in biological samples is a central focus of proteomics. The discovery and validation of biomarkers, pathways, and drug targets for various diseases can be accomplished using mass spectrometry-based proteomics. However, with mass-limited samples like tumor biopsies, it can be challenging to obtain sufficient amounts of proteins to generate high-quality mass spectrometric data. Techniques developed for macroscale quantities recover sufficient amounts of protein from milligram quantities of starting material, but sample losses become crippling with these techniques when only microgram amounts of material are available. To combat this challenge, proteomicists have developed micro-scale techniques that are compatible with decreased sample size (100 μg or lower) and still enable excellent proteome coverage. Extraction, contaminant removal, protein quantitation, and sample handling techniques for the microgram protein range are reviewed here, with an emphasis on liquid chromatography and bottom-up mass spectrometry-compatible techniques. Also, a range of biological specimens, including mammalian tissues and model cell culture systems, are discussed. PMID:25664860

  3. Proteomic challenges: sample preparation techniques for microgram-quantity protein analysis from biological samples.

    Science.gov (United States)

    Feist, Peter; Hummon, Amanda B

    2015-02-05

    Proteins regulate many cellular functions, and analyzing the presence and abundance of proteins in biological samples is a central focus of proteomics. The discovery and validation of biomarkers, pathways, and drug targets for various diseases can be accomplished using mass spectrometry-based proteomics. However, with mass-limited samples like tumor biopsies, it can be challenging to obtain sufficient amounts of proteins to generate high-quality mass spectrometric data. Techniques developed for macroscale quantities recover sufficient amounts of protein from milligram quantities of starting material, but sample losses become crippling with these techniques when only microgram amounts of material are available. To combat this challenge, proteomicists have developed micro-scale techniques that are compatible with decreased sample size (100 μg or lower) and still enable excellent proteome coverage. Extraction, contaminant removal, protein quantitation, and sample handling techniques for the microgram protein range are reviewed here, with an emphasis on liquid chromatography and bottom-up mass spectrometry-compatible techniques. Also, a range of biological specimens, including mammalian tissues and model cell culture systems, are discussed.

  4. Synchronizing data from irregularly sampled sensors

    Science.gov (United States)

    Uluyol, Onder

    2017-07-11

    A system and method include receiving a set of sampled measurements for each of multiple sensors, wherein the sampled measurements arrive at irregular intervals or different rates; re-sampling the measurements of each of the multiple sensors at a rate higher than that of at least one sensor's original set of sampled measurements; and synchronizing the re-sampled measurements of each of the multiple sensors.
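
    A minimal sketch of the claimed steps (the patent does not specify the interpolator; linear interpolation, the overlap-only grid, and the rate multiplier below are assumptions):

```python
# A minimal sketch: align irregular sensor streams on a common, faster grid.
import numpy as np

def synchronize(streams, rate_multiplier=4):
    """streams: list of (times, values) arrays with irregular time stamps."""
    t0 = max(t[0] for t, _ in streams)            # overlapping interval only
    t1 = min(t[-1] for t, _ in streams)
    finest = min(np.diff(t).min() for t, _ in streams)
    grid = np.arange(t0, t1, finest / rate_multiplier)  # faster than any stream
    return grid, [np.interp(grid, t, v) for t, v in streams]
```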

  5. Advanced pressure tube sampling tools

    International Nuclear Information System (INIS)

    Wittich, K.C.; King, J.M.

    2002-01-01

    Deuterium concentration is an important parameter that must be assessed to evaluate the fitness for service of CANDU pressure tubes. In-reactor pressure tube sampling allows accurate deuterium concentration assessment to be made without the expense associated with fuel channel removal. This technology, which AECL has developed over the past fifteen years, has become the standard method for deuterium concentration assessment. AECL is developing a multi-head tool that would reduce in-reactor handling overhead by allowing one tool to sequentially sample at all four axial pressure tube locations before removal from the reactor. Four sets of independent cutting heads, like those on the existing sampling tools, facilitate this, incorporating proven technology demonstrated in over 1400 in-reactor samples taken to date. The multi-head tool is delivered by AECL's Advanced Delivery Machine or other similar delivery machines. Further, AECL has developed an automated sample handling system that receives and processes the tool once it is out of the reactor. This system retrieves samples from the tool, then dries, weighs, and places them in labelled vials, which are then directed into shielded shipping flasks. The multi-head wet sampling tool and the automated sample handling system are based on proven technology and offer continued savings and dose reduction to utilities in a competitive electricity market. (author)

  6. FUZZY ACCEPTANCE SAMPLING AND CHARACTERISTIC CURVES

    Directory of Open Access Journals (Sweden)

    Ebru Turanoğlu

    2012-02-01

    Acceptance sampling is primarily used for the inspection of incoming or outgoing lots. It refers to the application of specific sampling plans to a designated lot or sequence of lots. The parameters of acceptance sampling plans are sample sizes and acceptance numbers. In some cases, it may not be possible to define acceptance sampling parameters as crisp values; these parameters can instead be expressed by linguistic variables. Fuzzy set theory can be successfully used to cope with the vagueness in these linguistic expressions for acceptance sampling. In this paper, the main distributions of acceptance sampling plans are handled with fuzzy parameters and their acceptance probability functions are derived. The characteristic curves of acceptance sampling are then examined under fuzziness. Illustrative examples are given.
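
    As a crisp (non-fuzzy) baseline for the characteristic curves discussed here, the acceptance probability of a single sampling plan (n, c) is the binomial probability of observing at most c defectives; the plan parameters in the example are illustrative.

```python
# A minimal sketch: operating characteristic of a single sampling plan.
from scipy.stats import binom

def oc_single(p, n, c):
    """P(accept a lot with defective fraction p) = P(defectives in sample <= c)."""
    return binom.cdf(c, n, p)

# oc_single(0.02, n=80, c=2) -> probability of accepting a 2%-defective lot
```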

  7. Special nuclear material inventory sampling plans

    International Nuclear Information System (INIS)

    Vaccaro, H.; Goldman, A.

    1987-01-01

    Since their introduction in 1942, sampling inspection procedures have been common quality assurance practice. The U.S. Department of Energy (DOE) supports such sampling of special nuclear material inventories. DOE Order 5630.7 states that "Operations Offices may develop and use statistically valid sampling plans appropriate for their site-specific needs." The benefits for nuclear facility operations include reduced worker exposure and reduced workload. Improved procedures have been developed for obtaining statistically valid sampling plans that maximize these benefits. The double sampling concept is described, and the resulting sample sizes for double sampling plans are compared with those of other plans. An algorithm is given for finding optimal double sampling plans that assist in choosing the appropriate detection and false alarm probabilities for various sampling plans.
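
    The double sampling concept can be sketched as follows (standard textbook logic; the plan parameters in the example are illustrative, not DOE's): accept if the first sample shows d1 <= c1 defectives, reject if d1 > c2, and otherwise draw a second sample and accept if d1 + d2 <= c2.

```python
# A minimal sketch: acceptance probability of a double sampling plan.
from scipy.stats import binom

def p_accept_double(p, n1, c1, n2, c2):
    prob = binom.cdf(c1, n1, p)                  # decided on the first sample
    for d1 in range(c1 + 1, c2 + 1):             # undecided: take second sample
        prob += binom.pmf(d1, n1, p) * binom.cdf(c2 - d1, n2, p)
    return prob

# p_accept_double(0.05, n1=50, c1=1, n2=100, c2=4)
```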

  8. Downsampling Non-Uniformly Sampled Data

    Directory of Open Access Journals (Sweden)

    Fredrik Gustafsson

    2007-10-01

    Decimating a uniformly sampled signal by a factor D involves low-pass antialias filtering with normalized cutoff frequency 1/D followed by picking out every Dth sample. Alternatively, decimation can be done in the frequency domain using the fast Fourier transform (FFT) algorithm, after zero-padding the signal and truncating the FFT. We outline three approaches to decimate non-uniformly sampled signals, which are all based on interpolation. The interpolation is done in different domains, and the inter-sample behavior does not need to be known. The first approach interpolates the signal onto a uniform sampling grid, after which standard decimation can be applied. The second interpolates a continuous-time convolution integral, which implements the antialias filter, after which every Dth sample can be picked out. The third, frequency-domain approach computes an approximate Fourier transform, after which truncation and IFFT give the desired result. Simulations indicate that the second approach is particularly useful. A thorough analysis is therefore performed for this case, using the assumption that the non-uniformly distributed sampling instants are generated by a stochastic process.
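
    A minimal sketch of the first approach (interpolate to a uniform grid, then apply standard decimation; linear interpolation and a median-spacing grid are my assumptions, not the paper's choices):

```python
# A minimal sketch: uniform re-sampling followed by antialias decimation.
import numpy as np
from scipy.signal import decimate

def downsample_nonuniform(t, x, D):
    """t: irregular sample times; x: sample values; D: integer decimation factor."""
    dt = np.median(np.diff(t))                   # target uniform spacing
    grid = np.arange(t[0], t[-1], dt)
    xu = np.interp(grid, t, x)                   # interpolate to uniform grid
    return grid[::D], decimate(xu, D)            # antialias filter, keep every D-th
```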

  9. Solvent Hold Tank Sample Results for MCU-16-991-992-993: July 2016 Monthly sample and MCU-16-1033-1034-1035: July 2016 Superwashed Sample

    Energy Technology Data Exchange (ETDEWEB)

    Fondeur, F. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Jones, D. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2016-11-25

    SRNL received one set of SHT samples (MCU-16-991, MCU-16-992 and MCU-16-993), pulled on 07/13/2016, and another set of SHT samples (MCU-16-1033, MCU-16-1034, and MCU-16-1035), pulled on 07/24/2016 after the solvent was superwashed with 300 mM sodium hydroxide, for analysis. Samples MCU-16-991, MCU-16-992, and MCU-16-993 were combined into one sample (MCU-16-991-992-993), and samples MCU-16-1033, MCU-16-1034, and MCU-16-1035 were combined into one sample (MCU-16-1033-1034-1035). Of the two composite samples, MCU-16-1033-1034-1035 represents the current chemical state of the solvent at MCU. All analytical conclusions are based on the chemical analysis of MCU-16-1033-1034-1035. There were no chemical differences between MCU-16-991-992-993 and the superwashed MCU-16-1033-1034-1035.

  10. Development of SYVAC sampling techniques

    International Nuclear Information System (INIS)

    Prust, J.O.; Dalrymple, G.J.

    1985-04-01

    This report describes the requirements of a sampling scheme for use with the SYVAC radiological assessment model. The constraints on the number of samples that may be taken are considered. The conclusions from earlier studies using the deterministic generator sampling scheme are summarised. The method of Importance Sampling and a High Dose algorithm, which are designed to preferentially sample the high dose region of the parameter space, are reviewed in the light of experience gained from earlier studies and the requirements of a site assessment and sensitivity analyses. In addition, the use of an alternative numerical integration method for estimating risk is discussed. It is recommended that the method of Importance Sampling be developed and tested for use with SYVAC. An alternative numerical integration method is not recommended for investigation at this stage but should be the subject of future work. (author)
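
    The idea of importance sampling, drawing preferentially from the high-consequence region and reweighting by the likelihood ratio, can be sketched in a few lines (the "dose" model and both densities below are hypothetical, not SYVAC's):

```python
# A minimal sketch: importance sampling for a rare-tail expectation.
import numpy as np

rng = np.random.default_rng(0)
dose = lambda x: np.maximum(x - 4.0, 0.0)        # nonzero only in a rare tail

# Crude Monte Carlo under the nominal parameter density p = Exp(1)
x = rng.exponential(1.0, 100_000)
crude = dose(x).mean()

# Importance sampling: draw from the heavier-tailed q = Exp(scale=5)
y = rng.exponential(5.0, 100_000)
w = np.exp(-y) / (np.exp(-y / 5.0) / 5.0)        # likelihood ratio p(y)/q(y)
weighted = (dose(y) * w).mean()

print(crude, weighted)  # both estimate ~exp(-4); the weighted one varies less
```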

  11. Monitoring oil persistence on beaches : SCAT versus stratified random sampling designs

    International Nuclear Information System (INIS)

    Short, J.W.; Lindeberg, M.R.; Harris, P.M.; Maselko, J.M.; Pella, J.J.; Rice, S.D.

    2003-01-01

    In the event of a coastal oil spill, shoreline clean-up assessment teams (SCAT) commonly rely on visual inspection of the entire affected area to monitor the persistence of the oil on beaches. Occasionally, pits are excavated to evaluate the persistence of subsurface oil. This approach is practical for directing clean-up efforts directly following a spill. However, sampling of the 1989 Exxon Valdez oil spill in Prince William Sound 12 years later has shown that visual inspection combined with pit excavation does not offer estimates of contaminated beach area or stranded oil volume. This information is needed to statistically evaluate the significance of change with time. Assumptions regarding the correlation of visually evident surface oil and cryptic subsurface oil are usually not evaluated as part of the SCAT mandate. Stratified random sampling can avoid such problems and can produce precise estimates of oiled area and volume that allow for statistical assessment of major temporal trends and the extent of the impact. The 2001 sampling of the shoreline of Prince William Sound showed that 15 per cent of surface oil occurrences were associated with subsurface oil. This study demonstrates the usefulness of the stratified random sampling method and shows how sampling design parameters affect statistical outcomes. Power analysis based on the study results indicates that optimum power is achieved when unnecessary stratification is avoided. It was emphasized that sampling effort should be balanced between choosing sufficient beaches for sampling and the intensity of sampling.
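
    The stratified estimator at issue has the standard textbook form; a minimal sketch follows (stratum sizes and values are illustrative, not the study's data):

```python
# A minimal sketch: stratified estimate of a total with its standard error.
import numpy as np

def stratified_total(strata):
    """strata: list of (N_h, sampled_values) with N_h units in stratum h."""
    total, var = 0.0, 0.0
    for N_h, y in strata:
        y = np.asarray(y, dtype=float)
        n_h = y.size
        total += N_h * y.mean()
        var += N_h**2 * (1 - n_h / N_h) * y.var(ddof=1) / n_h  # with FPC
    return total, np.sqrt(var)                   # estimated total and its SE

# stratified_total([(120, [0.0, 3.1, 0.4]), (40, [8.2, 5.9, 7.5])])
```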

  12. Wideband 4-diode sampling circuit

    Science.gov (United States)

    Wojtulewicz, Andrzej; Radtke, Maciej

    2016-09-01

    The objective of this work was to develop a wide-band sampling circuit. The device should have the ability to collect samples of a very fast signal applied to its input, strengthen them, and prepare them for further processing. The study emphasizes the method of sampling-pulse shaping. The use of an ultrafast pulse generator allows sampling of signals with a wide frequency spectrum, reaching several gigahertz. The device uses a pulse transformer to prepare symmetrical pulses. Their final shape is formed with the help of a step-recovery diode, two coplanar strips, and a Schottky diode. The resulting device can be used in a sampling oscilloscope as well as in other measurement systems.

  13. Where Will All Your Samples Go?

    Science.gov (United States)

    Lehnert, K.

    2017-12-01

    Even in the digital age, physical samples remain an essential component of Earth and space science research. Geoscientists collect samples, sometimes locally, often in remote locations during expensive field expeditions, or at sample repositories and museums. They take these samples to their labs to describe and analyze them. When the analyses are completed and the results are published, the samples get stored away in sheds, basements, or desk drawers, where they remain unknown and inaccessible to the broad science community. In some cases, they will get re-analyzed or shared with other researchers, who know of their existence through personal connections. The sad end comes when the researcher retires: There are many stories of samples and entire collections being discarded to free up space for new samples or other purposes, even though these samples may be unique and irreplaceable. Institutions do not feel obligated and do not have the resources to store samples in perpetuity. Only samples collected in large sampling campaigns such as the Ocean Discovery Program or cores taken on ships find a home in repositories that curate and preserve them for reuse in future science endeavors. In the era of open, transparent, and reproducible science, preservation and persistent access to samples must be considered a mandate. Policies need to be developed that guide investigators, institutions, and funding agencies to plan and implement solutions for reliably and persistently curating and providing access to samples. Registration of samples in online catalogs and use of persistent identifiers such as the International Geo Sample Number are first steps to ensure discovery and access of samples. But digital discovery and access loses its value if the physical objects are not preserved and accessible. It is unreasonable to expect that every sample ever collected can be archived. Selection of those samples that are worth preserving requires guidelines and policies. We also need to

  14. Alternating sample changer and an automatic sample changer for liquid scintillation counting of alpha-emitting materials

    International Nuclear Information System (INIS)

    Thorngate, J.H.

    1977-08-01

    Two sample changers are described that were designed for liquid scintillation counting of alpha-emitting samples prepared using solvent-extraction chemistry. One operates manually but changes samples without exposing the photomultiplier tube to light, allowing the high voltage to remain on for improved stability. The other is capable of automatically counting up to 39 samples. An electronic control for the automatic sample changer is also described

  15. Apparatus for sectioning demountable semiconductor samples

    Science.gov (United States)

    Sopori, B.L.; Wolf, A.

    1984-01-01

    Apparatus for use during polishing and sectioning operations of a ribbon sample is described. The sample holder includes a cylinder having an axially extending sample cavity terminated in a first funnel-shaped opening and a second slot-like opening. A spring-loaded pressure plunger is located adjacent to the second opening of the sample cavity for frictional engagement of the sample. A heat-softenable molding medium is inserted in the funnel-shaped opening to surround the sample. After polishing, the heater is energized to allow draining of the molding medium from the sample cavity. During manual polishing, the second end of the sample holder is inserted in a support ring which provides mechanical support as well as alignment of the sample holder during polishing. A gauge block for measuring the protrusion of a sample beyond the second wall of the holder is also disclosed.

  16. The Role of the Sampling Distribution in Understanding Statistical Inference

    Science.gov (United States)

    Lipson, Kay

    2003-01-01

    Many statistics educators believe that few students develop the level of conceptual understanding essential for them to apply correctly the statistical techniques at their disposal and to interpret their outcomes appropriately. It is also commonly believed that the sampling distribution plays an important role in developing this understanding.…

  17. Systematic sampling of discrete and continuous populations: sample selection and the choice of estimator

    Science.gov (United States)

    Harry T. Valentine; David L. R. Affleck; Timothy G. Gregoire

    2009-01-01

    Systematic sampling is easy, efficient, and widely used, though it is not generally recognized that a systematic sample may be drawn from the population of interest with or without restrictions on randomization. The restrictions or the lack of them determine which estimators are unbiased, when using the sampling design as the basis for inference. We describe the...
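
    As a minimal sketch of one common implementation, the following draws a 1-in-k systematic sample with a random start, which preserves design unbiasedness for the sample mean; the population and sample size are illustrative.

```python
import random

def systematic_sample(population, n):
    """Draw a systematic sample of about n units with a random start.

    Classic 1-in-k selection: choose a random start in [0, k) and take
    every k-th unit thereafter (fractional intervals are supported).
    """
    k = len(population) / n        # sampling interval, possibly fractional
    start = random.uniform(0, k)   # random start randomizes which units are hit
    indices = []
    x = start
    while x < len(population):
        indices.append(int(x))
        x += k
    return [population[i] for i in indices]

population = list(range(1, 101))   # e.g. 100 plot centers along a transect
print(systematic_sample(population, 10))
```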

  18. Postoperative Outcomes in Graves' Disease Patients: Results from the Nationwide Inpatient Sample Database.

    Science.gov (United States)

    Rubio, Gustavo A; Koru-Sengul, Tulay; Vaghaiwalla, Tanaz M; Parikh, Punam P; Farra, Josefina C; Lew, John I

    2017-06-01

    Current surgical indications for Graves' disease include intractability to medical and/or radioablative therapy, compressive symptoms, and worsening ophthalmopathy. Total thyroidectomy for Graves' disease may be technically challenging and lead to untoward perioperative outcomes. This study examines outcomes in patients with Graves' disease who underwent total thyroidectomy and assesses its safety for this patient population. A retrospective cross-sectional analysis was performed using the Nationwide Inpatient Sample database from 2006 to 2011. Total thyroidectomy performed in patients with Graves' disease, benign multinodular goiter (MNG), and thyroid cancer was identified. Demographic factors, comorbidities, and postoperative complications were evaluated. Chi-square, one-way analysis of variance, and risk-adjusted multivariable logistic regression were performed. Of 215,068 patients who underwent total thyroidectomy during the study period, 11,205 (5.2%) had Graves' disease, 110,124 (51.2%) MNG, and 93,739 (43.6%) thyroid malignancy. Patients with Graves' disease were younger than MNG and thyroid cancer patients (mean age 42.8 years vs. 55.5 and 51.0 years), and the Graves' disease group included a higher proportion of women. Graves' disease was independently associated with a higher risk of vocal-cord paralysis (odds ratio [OR] = 1.36 [confidence interval (CI) 1.08-1.69]), tracheostomy (OR = 1.35 [CI 1.1-1.67]), postoperative hypocalcemia (OR = 1.65 [CI 1.54-1.77]), and hematoma requiring reoperation (OR = 2.79 [CI 2.16-3.62]) compared to MNG patients. High-volume centers for total thyroidectomy were independently associated with lower risk of postoperative complications, including in patients with Graves' disease. Despite low overall morbidity following total thyroidectomy, Graves' disease patients are at increased risk of postoperative complications, including bleeding, vocal-cord paralysis, tracheostomy, and hypocalcemia. These risks appear

  19. 40 CFR 1065.150 - Continuous sampling.

    Science.gov (United States)

    2010-07-01

    40 CFR 1065.150 (Protection of Environment; Engine-Testing Procedures, Equipment Specifications), Continuous sampling: You may use continuous sampling techniques for measurements that involve raw or dilute sampling. Make sure continuous sampling...

  20. Mars Sample Return Architecture Overview

    Science.gov (United States)

    Edwards, C. D.; Vijendran, S.

    2018-04-01

    NASA and ESA are exploring potential concepts for a Sample Retrieval Lander and Earth Return Orbiter that could return samples planned to be collected and cached by the Mars 2020 rover mission. We provide an overview of the Mars Sample Return architecture.

  1. Sampling Methodologies for Epidemiologic Surveillance of Men Who Have Sex with Men and Transgender Women in Latin America: An Empiric Comparison of Convenience Sampling, Time Space Sampling, and Respondent Driven Sampling

    OpenAIRE

    Clark, J. L.; Konda, K. A.; Silva-Santisteban, A.; Peinado, J.; Lama, J. R.; Kusunoki, L.; Perez-Brumer, A.; Pun, M.; Cabello, R.; Sebastian, J. L.; Suarez-Ognio, L.; Sanchez, J.

    2014-01-01

    Alternatives to convenience sampling (CS) are needed for HIV/STI surveillance of most-at-risk populations in Latin America. We compared CS, time space sampling (TSS), and respondent driven sampling (RDS) for recruitment of men who have sex with men (MSM) and transgender women (TW) in Lima, Peru. During concurrent 60-day periods from June-August, 2011, we recruited MSM/TW for epidemiologic surveillance using CS, TSS, and RDS. A total of 748 participants were recruited through CS, 233 through T...

  2. Micro-organism distribution sampling for bioassays

    Science.gov (United States)

    Nelson, B. A.

    1975-01-01

    The purpose of the sampling distribution is to characterize sample-to-sample variation so that statistical tests may be applied, to estimate the error due to sampling (confidence limits), and to evaluate observed differences between samples. The distribution could be used for bioassays taken in hospitals, breweries, food-processing plants, and pharmaceutical plants.

  3. Ball assisted device for analytical surface sampling

    Science.gov (United States)

    ElNaggar, Mariam S; Van Berkel, Gary J; Covey, Thomas R

    2015-11-03

    A system for sampling a surface includes a sampling probe having a housing and a socket, and a rolling sampling sphere within the socket. The housing has a sampling fluid supply conduit and a sampling fluid exhaust conduit. The sampling fluid supply conduit supplies sampling fluid to the sampling sphere. The sampling fluid exhaust conduit has an inlet opening for receiving sampling fluid carried from the surface by the sampling sphere. A surface sampling probe and a method for sampling a surface are also disclosed.

  4. 7 CFR 75.18 - Sampling.

    Science.gov (United States)

    2010-01-01

    7 CFR 75.18 (Agriculture; Agricultural Marketing Service; Certification of Quality of Agricultural and Vegetable Seeds, Inspection), Sampling: Sampling, when...

  5. Multiple sample, radioactive particle counting apparatus

    International Nuclear Information System (INIS)

    Reddy, R.R.V.; Kelso, D.M.

    1978-01-01

    An apparatus is described for determining the radioactive particle count emitted by each of a set of radioactive samples. It includes means for modulating the information on the radioactive particles being emitted from the samples, coded detecting means for sequentially detecting different coded combinations of the radioactive particles emitted from more than one but fewer than all of the samples, and means for processing the modulated information to derive the count for each sample. It includes a single light-emitting crystal next to a number of samples and an encoder belt sequentially movable between the crystal and the samples. The encoder belt has a coded array of apertures to provide correspondingly modulated light pulses from the crystal, and a photomultiplier tube converts the modulated light pulses to decodable electrical signals for deriving the respective sample counts

  6. Statistical sampling plans

    International Nuclear Information System (INIS)

    Jaech, J.L.

    1984-01-01

    In auditing and in inspection, one selects a number of items by some set of procedures and performs measurements which are compared with the operator's values. This session considers the problem of how to select the samples to be measured, and what kinds of measurements to make. In the inspection situation, the ultimate aim is to independently verify the operator's material balance. The effectiveness of the sample plan in achieving this objective is briefly considered. The discussion focuses on the model plant

  7. Directional dependency of air sampling

    International Nuclear Information System (INIS)

    1994-01-01

    A field study was performed by Idaho State University-Environmental Monitoring Laboratory (EML) to examine the directional dependency of low-volume air samplers. A typical continuous low-volume air sampler contains a sample head that is mounted on the sampler housing either horizontally through one of four walls or vertically on an exterior wall 'looking down or up.' In 1992, a field study was undertaken to estimate sampling error and to detect the directional effect of sampler head orientation. Approximately 1/2 mile downwind from a phosphate plant (a continuous source of alpha activity), four samplers were positioned in identical orientation alongside one sampler configured with the sample head 'looking down'. At least five consecutive weekly samples were collected. The alpha activity, beta activity, and the Be-7 activity collected on the particulate filter were analyzed to determine sampling error. Four sample heads were then oriented to the four different horizontal directions. Samples were collected for at least five weeks. Analysis of the alpha data can show the effect of sampler orientation relative to a known near source term. Analysis of the beta and Be-7 activity shows the effect of sampler orientation to a ubiquitous source term

  8. Environmental surveillance master sampling schedule

    International Nuclear Information System (INIS)

    Bisping, L.E.

    1991-01-01

    Environmental surveillance of the Hanford Site and surrounding areas is conducted by the Pacific Northwest Laboratory (PNL) for the US Department of Energy (DOE). This document contains the planned schedule for routine sample collection for the Surface Environmental Surveillance Project (SESP) and Ground-Water Monitoring Project. The routine sampling plan for the SESP has been revised this year to reflect changing site operations and priorities. Some sampling previously performed at least annually has been reduced in frequency, and some new sampling to be performed at a less than annual frequency has been added. Therefore, the SESP schedule reflects sampling to be conducted in calendar year 1991 as well as future years. The ground-water sampling schedule is for 1991. This schedule is subject to modification during the year in response to changes in Site operation, program requirements, and the nature of the observed results. Operational limitations such as weather, mechanical failures, sample availability, etc., may also require schedule modifications. Changes will be documented in the respective project files, but this plan will not be reissued. The purpose of these monitoring projects is to evaluate levels of radioactive and nonradioactive pollutants in the Hanford environs

  9. Environmental surveillance master sampling schedule

    Energy Technology Data Exchange (ETDEWEB)

    Bisping, L.E.

    1991-01-01

    Environmental surveillance of the Hanford Site and surrounding areas is conducted by the Pacific Northwest Laboratory (PNL) for the US Department of Energy (DOE). This document contains the planned schedule for routine sample collection for the Surface Environmental Surveillance Project (SESP) and Ground-Water Monitoring Project. The routine sampling plan for the SESP has been revised this year to reflect changing site operations and priorities. Some sampling previously performed at least annually has been reduced in frequency, and some new sampling to be performed at a less than annual frequency has been added. Therefore, the SESP schedule reflects sampling to be conducted in calendar year 1991 as well as future years. The ground-water sampling schedule is for 1991. This schedule is subject to modification during the year in response to changes in Site operation, program requirements, and the nature of the observed results. Operational limitations such as weather, mechanical failures, sample availability, etc., may also require schedule modifications. Changes will be documented in the respective project files, but this plan will not be reissued. The purpose of these monitoring projects is to evaluate levels of radioactive and nonradioactive pollutants in the Hanford environs.

  10. Choosing a Cluster Sampling Design for Lot Quality Assurance Sampling Surveys

    OpenAIRE

    Hund, Lauren; Bedrick, Edward J.; Pagano, Marcello

    2015-01-01

    Lot quality assurance sampling (LQAS) surveys are commonly used for monitoring and evaluation in resource-limited settings. Recently several methods have been proposed to combine LQAS with cluster sampling for more timely and cost-effective data collection. For some of these methods, the standard binomial model can be used for constructing decision rules as the clustering can be ignored. For other designs, considered here, clustering is accommodated in the design phase. In this paper, we comp...
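
    As a minimal sketch of the standard (non-clustered) binomial construction mentioned above, the following scans candidate decision rules for a common LQAS sample size; the coverage standards and thresholds are illustrative choices, not values from the paper.

```python
from scipy.stats import binom

# Standard LQAS decision rule under the binomial model: classify an area as
# "adequate coverage" if at least d of n sampled individuals are covered.
# p_hi / p_lo are hypothetical upper and lower coverage standards.

def lqas_errors(n, d, p_hi=0.80, p_lo=0.50):
    # alpha: probability of failing a truly adequate area (true coverage p_hi)
    alpha = binom.cdf(d - 1, n, p_hi)
    # beta: probability of passing a truly inadequate area (true coverage p_lo)
    beta = 1.0 - binom.cdf(d - 1, n, p_lo)
    return alpha, beta

# Scan decision rules for n = 19, a commonly used LQAS sample size
for d in range(10, 16):
    a, b = lqas_errors(19, d)
    print(f"d = {d:2d}: alpha = {a:.3f}, beta = {b:.3f}")
```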

  11. Combining Electrochemical Sensors with Miniaturized Sample Preparation for Rapid Detection in Clinical Samples

    Science.gov (United States)

    Bunyakul, Natinan; Baeumner, Antje J.

    2015-01-01

    Clinical analyses benefit worldwide from rapid and reliable diagnostic tests. New tests are sought with greatest demand not only for new analytes, but also to reduce the costs, complexity, and lengthy analysis times of current techniques. Among the myriad of possibilities available today to develop new test systems, amperometric biosensors are prominent players, best represented by the ubiquitous amperometric glucose sensors. Electrochemical approaches in general require little and often enough only simple hardware components, are rugged, and yet provide low limits of detection. They thus offer many of the desirable attributes for point-of-care/point-of-need tests. This review focuses on the important integration of sample preparation with (primarily electrochemical) biosensors. Sample clean-up requirements, miniaturized sample preparation strategies, and their potential integration with sensors will be discussed, focusing on clinical sample analyses. PMID:25558994

  12. Methodology Series Module 5: Sampling Strategies

    OpenAIRE

    Setia, Maninder Singh

    2016-01-01

    Once the research question and the research design have been finalised, it is important to select the appropriate sample for the study. The method by which the researcher selects the sample is the 'sampling method'. There are essentially two types of sampling methods: (1) probability sampling, based on chance events (such as random numbers, flipping a coin, etc.); and (2) non-probability sampling, based on the researcher's choice and on the population that is accessible and available. Some of the non-probabilit...

  13. Automatic remote sampling and delivery system incorporating decontamination and disposal of sample bottles

    International Nuclear Information System (INIS)

    Savarkar, V.K.; Mishra, A.K.; Bajpai, D.D.; Nair, M.K.T.

    1990-01-01

    The present generation of reprocessing plants has sampling and delivery systems that have to be operated manually, with their associated problems. The complete automation and remotisation of the sampling system has hence been considered, to reduce manual intervention and personnel exposure. As part of this scheme, an attempt to automate and remotise various steps in the sampling system has been made. This paper discusses in detail the development work carried out in this area as well as the tests conducted to incorporate the same in the existing plants. (author). 3 figs

  14. Patient identification in blood sampling.

    Science.gov (United States)

    Davidson, Anne; Bolton-Maggs, Paula

    The majority of adverse reports relating to blood transfusions result from human error, including misidentification of patients and incorrect labelling of samples. This article outlines best practice in blood sampling for transfusion (but is recommended for all pathology samples) and the role of patient empowerment in improving safety.

  15. The rise of survey sampling

    NARCIS (Netherlands)

    Bethlehem, J.

    2009-01-01

    This paper is about the history of survey sampling. It describes how sampling became an accepted scientific method. From the first ideas in 1895 it took some 50 years before the principles of probability sampling were widely accepted. This paper has a focus on developments in official statistics in

  16. Mass counting of radioactivity samples

    International Nuclear Information System (INIS)

    Oesterlin, D.L.; Obrycki, R.F.

    1977-01-01

    A method and apparatus for concurrently counting a plurality of radioactive samples is claimed. The position sensitive circuitry of a scintillation camera is employed to sort electrical pulses resulting from scintillations according to the geometrical locations of scintillations causing those pulses. A scintillation means, in the form of a scintillating crystal material or a liquid scintillator, is positioned proximate to an array of radioactive samples. Improvement in the accuracy of pulse classification may be obtained by employing collimating means. If a plurality of scintillation crystals are employed to measure the iodine-125 content of samples, a method and means are provided for correcting for variations in crystal light transmission properties, sample volume, and sample container radiation absorption. 2 claims, 7 drawing figures

  17. Sample size for post-marketing safety studies based on historical controls.

    Science.gov (United States)

    Wu, Yu-te; Makuch, Robert W

    2010-08-01

    As part of a drug's entire life cycle, post-marketing studies are an important part of the identification of rare, serious adverse events. Recently, the US Food and Drug Administration (FDA) has begun to implement new post-marketing safety mandates as a consequence of increased emphasis on safety. The purpose of this research is to provide an exact sample size formula for the proposed hybrid design, based on a two-group cohort study with incorporation of historical external data. An exact sample size formula based on the Poisson distribution is developed, because the detection of rare events is our outcome of interest. Performance of the exact method is compared to its approximate large-sample theory counterpart. The proposed hybrid design requires a smaller sample size compared to the standard, two-group prospective study design. In addition, the exact method reduces the number of subjects required in the treatment group by up to 30% compared to the approximate method for the study scenarios examined. The proposed hybrid design satisfies the advantages and rationale of the two-group design with smaller sample sizes generally required. 2010 John Wiley & Sons, Ltd.
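
    To illustrate the flavor of an exact Poisson calculation, the sketch below searches for the follow-up needed to detect an elevated rate of a rare event against a known historical rate. This is a generic one-group construction for illustration only, not the paper's hybrid-design formula; the rate, rate ratio, and error levels are assumed values.

```python
from scipy.stats import poisson

# Exact sample-size search under a Poisson model: find n (person-years) such
# that an exact one-sided level-alpha test of rate lambda0 has the desired
# power against rate lambda0 * ratio. All parameter values are illustrative.

def poisson_sample_size(lambda0, ratio, alpha=0.05, power=0.80, max_n=200000):
    for n in range(100, max_n, 100):
        mu0, mu1 = n * lambda0, n * lambda0 * ratio
        # smallest critical count c with P(X >= c | mu0) <= alpha
        c = int(poisson.ppf(1 - alpha, mu0)) + 1
        while poisson.sf(c - 1, mu0) > alpha:   # sf(c-1) = P(X >= c)
            c += 1
        if poisson.sf(c - 1, mu1) >= power:     # power at the alternative rate
            return n, c
    return None

n, c = poisson_sample_size(lambda0=0.001, ratio=3.0)
print(f"~{n} person-years; reject if {c} or more events are observed")
```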

  18. 19 CFR 151.10 - Sampling.

    Science.gov (United States)

    2010-04-01

    19 CFR 151.10 (Customs Duties; U.S. Customs and Border Protection, Department of Homeland Security; Department of the Treasury; Examination, Sampling, and Testing of Merchandise, General), Sampling: When necessary, the port director...

  19. An unbiased estimator of the variance of simple random sampling using mixed random-systematic sampling

    OpenAIRE

    Padilla, Alberto

    2009-01-01

    Systematic sampling is a commonly used technique due to its simplicity and ease of implementation. The drawback of this simplicity is that it is not possible to estimate the design variance without bias. There are several ways to circumvent this problem. One method is to suppose that the variable of interest has a random order in the population, so that the sample variance of simple random sampling without replacement can be used. By means of a mixed random-systematic sample, an unbiased estimator o...

  20. Adaptive Rate Sampling and Filtering Based on Level Crossing Sampling

    Directory of Open Access Journals (Sweden)

    Saeed Mian Qaisar

    2009-01-01

    The recent sophistication of mobile systems and sensor networks demands more and more processing resources. In order to maintain system autonomy, energy saving is becoming one of the most difficult industrial challenges in mobile computing. Most efforts to achieve this goal focus on improving embedded systems design and battery technology, but very few studies attempt to exploit the time-varying nature of the input signal. This paper aims to achieve power efficiency by intelligently adapting the processing activity to the input signal's local characteristics. It does so by completely rethinking the processing chain, adopting a non-conventional sampling scheme and adaptive rate filtering. The proposed approach, based on the LCSS (Level Crossing Sampling Scheme), presents two filtering techniques able to adapt their sampling rate and filter order by analyzing the input signal variations online. The principle is to exploit the signal's local characteristics (which are usually never considered) to filter only the relevant signal parts, by employing filters of the relevant order. This idea leads to a drastic gain in computational efficiency, and hence in processing power, when compared to classical techniques.
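
    As a minimal sketch of the level-crossing idea, the following records a sample only when the signal crosses one of a set of predefined amplitude levels, so the effective sampling rate tracks the signal's local activity; the test signal and levels are illustrative.

```python
import numpy as np

# Level Crossing Sampling Scheme (LCSS) sketch: a sample is taken whenever
# the signal crosses a quantization level, instead of at uniform instants.

def level_crossing_sample(t, x, levels):
    samples = []
    for i in range(1, len(x)):
        for L in levels:
            # a crossing of level L occurred between instants i-1 and i
            if (x[i - 1] - L) * (x[i] - L) < 0:
                # linear interpolation for the crossing instant
                frac = (L - x[i - 1]) / (x[i] - x[i - 1])
                samples.append((t[i - 1] + frac * (t[i] - t[i - 1]), L))
    return samples

t = np.linspace(0, 1, 5000)
x = np.sin(2 * np.pi * 3 * t) * np.exp(-2 * t)   # decaying burst: activity varies
levels = np.linspace(-1, 1, 9)                    # uniform quantization levels
pts = level_crossing_sample(t, x, levels)
print(f"{len(pts)} level-crossing samples vs {len(t)} uniform samples")
```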

  1. The Viking X ray fluorescence experiment - Sampling strategies and laboratory simulations. [Mars soil sampling

    Science.gov (United States)

    Baird, A. K.; Castro, A. J.; Clark, B. C.; Toulmin, P., III; Rose, H., Jr.; Keil, K.; Gooding, J. L.

    1977-01-01

    Ten samples of Mars regolith material (six on Viking Lander 1 and four on Viking Lander 2) have been delivered to the X ray fluorescence spectrometers as of March 31, 1977. At least six additional samples are planned for acquisition in the remaining Extended Mission (to January 1979) for each lander. All samples acquired are Martian fines from the near surface (less than 6-cm depth) of the landing sites, except the latest on Viking Lander 1, which is fine material from the bottom of a trench dug to a depth of 25 cm. Several attempts on each lander to acquire fresh rock material (in pebble sizes) for analysis have yielded only cemented surface crustal material (duricrust). Laboratory simulation and experimentation are required both for mission planning of sampling and for interpretation of data returned from Mars. This paper is concerned with the rationale for sample site selections, surface sampler operations, and the supportive laboratory studies needed to interpret X ray results from Mars.

  2. Sample Preprocessing For Atomic Spectrometry

    International Nuclear Information System (INIS)

    Kim, Sun Tae

    2004-08-01

    This book describes atomic spectrometry. It covers atomic absorption spectrometry (the Maxwell-Boltzmann equation and the Beer-Lambert law; solvent extraction, HGAAS, ETAAS, and CVAAS), inductively coupled plasma emission spectrometry (basic principles, plasma generation, devices and equipment, and interferences), and inductively coupled plasma mass spectrometry (instrumentation, pros and cons of ICP/MS, sample analysis, reagents, water, acids, fluxes, experimental materials, sampling, sample decomposition, and contamination and loss in open and closed systems).

  3. Enhancing sampling design in mist-net bat surveys by accounting for sample size optimization

    OpenAIRE

    Trevelin, Leonardo Carreira; Novaes, Roberto Leonan Morim; Colas-Rosas, Paul François; Benathar, Thayse Cristhina Melo; Peres, Carlos A.

    2017-01-01

    The advantages of mist-netting, the main technique used in Neotropical bat community studies to date, include logistical implementation, standardization and sampling representativeness. Nonetheless, study designs still have to deal with issues of detectability related to how different species behave and use the environment. Yet there is considerable sampling heterogeneity across available studies in the literature. Here, we approach the problem of sample size optimization. We evaluated the co...

  4. Multilocus sequence typing of Trichomonas vaginalis clinical samples from Amsterdam, the Netherlands.

    Science.gov (United States)

    van der Veer, C; Himschoot, M; Bruisten, S M

    2016-10-13

    In this cross-sectional epidemiological study we aimed to identify molecular profiles for Trichomonas vaginalis and to determine how these molecular profiles were related to patient demographic and clinical characteristics. Molecular typing methods previously identified two genetically distinct subpopulations for T. vaginalis; however, few molecular epidemiological studies have been performed. We now increased the sensitivity of a previously described multilocus sequence typing (MLST) tool for T. vaginalis by using nested PCR. This enabled the typing of direct patient samples. From January to December 2014, we collected all T. vaginalis positive samples as detected by routine laboratory testing. Samples from patients either came from general practitioners offices or from the sexually transmitted infections (STI) clinic in Amsterdam. Epidemiological data for the STI clinic patients were retrieved from electronic patient files. The primary outcome was the success rate of genotyping direct T. vaginalis positive samples. The secondary outcome was the relation between T. vaginalis genotypes and risk factors for STI. All 7 MLST loci were successfully typed for 71/87 clinical samples. The 71 typed samples came from 69 patients, the majority of whom were women (n=62; 90%) and half (n=34; 49%) were STI clinic patients. Samples segregated into a two population structure for T. vaginalis representing genotypes I and II. Genotype I was most common (n=40; 59.7%). STI clinic patients infected with genotype II reported more sexual partners in the preceding 6 months than patients infected with genotype I (p=0.028). No other associations for gender, age, ethnicity, urogenital discharge or co-occurring STIs with T. vaginalis genotype were found. MLST with nested PCR is a sensitive typing method that allows typing of direct (uncultured) patient material. Genotype II is possibly more prevalent in high-risk sexual networks. Published by the BMJ Publishing Group Limited. For

  5. Rapid Sampling from Sealed Containers

    International Nuclear Information System (INIS)

    Johnston, R.G.; Garcia, A.R.E.; Martinez, R.K.; Baca, E.T.

    1999-01-01

    The authors have developed several different types of tools for sampling from sealed containers. These tools allow the user to rapidly drill into a closed container, extract a sample of its contents (gas, liquid, or free-flowing powder), and permanently reseal the point of entry. This is accomplished without exposing the user or the environment to the container contents, even while drilling. The entire process is completed in less than 15 seconds for a 55 gallon drum. Almost any kind of container can be sampled (regardless of the materials) with wall thicknesses up to 1.3 cm and internal pressures up to 8 atm. Samples can be taken from the top, sides, or bottom of a container. The sampling tools are inexpensive, small, and easy to use. They work with any battery-powered hand drill. This allows considerable safety, speed, flexibility, and maneuverability. The tools also permit the user to rapidly attach plumbing, a pressure relief valve, alarms, or other instrumentation to a container. Possible applications include drum venting, liquid transfer, container flushing, waste characterization, monitoring, sampling for archival or quality control purposes, emergency sampling by rapid response teams, counter-terrorism, non-proliferation and treaty verification, and use by law enforcement personnel during drug or environmental raids

  6. Field sampling, preparation procedure and plutonium analyses of large freshwater samples

    International Nuclear Information System (INIS)

    Straelberg, E.; Bjerk, T.O.; Oestmo, K.; Brittain, J.E.

    2002-01-01

    This work is part of an investigation of the mobility of plutonium in freshwater systems containing humic substances. A well-defined bog-stream system located in the catchment area of a subalpine lake, Oevre Heimdalsvatn, Norway, is being studied. During the summer of 1999, six water samples were collected from the tributary stream Lektorbekken and the lake itself. However, the analyses showed that the plutonium concentration was below the detection limit in all the samples. Therefore renewed sampling at the same sites was carried out in August 2000. The results so far are in agreement with previous analyses from the Heimdalen area. However, 100 times higher concentrations are found in the lowlands in the eastern part of Norway. The reason for this is not understood, but may be caused by differences in the concentrations of humic substances and/or the fact that the mountain areas are covered with snow for a longer period of time every year. (LN)

  7. Sample normalization methods in quantitative metabolomics.

    Science.gov (United States)

    Wu, Yiman; Li, Liang

    2016-01-22

    To reveal metabolomic changes caused by a biological event in quantitative metabolomics, it is critical to use an analytical tool that can perform accurate and precise quantification to examine the true concentration differences of individual metabolites found in different samples. A number of steps are involved in metabolomic analysis including pre-analytical work (e.g., sample collection and storage), analytical work (e.g., sample analysis) and data analysis (e.g., feature extraction and quantification). Each one of them can influence the quantitative results significantly and thus should be performed with great care. Among them, the total sample amount or concentration of metabolites can be significantly different from one sample to another. Thus, it is critical to reduce or eliminate the effect of total sample amount variation on quantification of individual metabolites. In this review, we describe the importance of sample normalization in the analytical workflow with a focus on mass spectrometry (MS)-based platforms, discuss a number of methods recently reported in the literature and comment on their applicability in real world metabolomics applications. Sample normalization has been sometimes ignored in metabolomics, partially due to the lack of a convenient means of performing sample normalization. We show that several methods are now available and sample normalization should be performed in quantitative metabolomics where the analyzed samples have significant variations in total sample amounts. Copyright © 2015 Elsevier B.V. All rights reserved.
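
    As a minimal sketch of one simple, widely used option among the methods the review discusses, the following applies "total sum" normalization, scaling each sample to a common total intensity; the data matrix is purely illustrative.

```python
import numpy as np

# Total-sum normalization sketch: rows = samples, columns = metabolite
# features. Sample 2 simulates roughly 2x more total material loaded.
data = np.array([
    [120.0,  30.0, 450.0,  60.0],
    [240.0,  55.0, 900.0, 130.0],
    [110.0,  28.0, 430.0,  55.0],
])

row_totals = data.sum(axis=1, keepdims=True)
normalized = data / row_totals * row_totals.mean()  # rescale to a common total

print(normalized.round(1))
# After normalization, sample 2's apparent 2x fold-changes largely disappear,
# separating true concentration differences from total-amount variation.
```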

  8. Sampling device for radioactive molten salt

    International Nuclear Information System (INIS)

    Shindo, Masato

    1998-01-01

    The present invention provides a device for accurately sampling, for analysis, molten salts mixed with various kinds of metals in a molten salt storage tank during spent fuel dry reprocessing. The device comprises a sampling tube, open at its lower end, to be inserted into the radioactive molten salts stored in the tank and evacuated from its upper end, and a pressure-reducing pipeline having one end connected to the sampling tube and the other end connected to an evacuating pump. In operation, the tip of the sampling tube is inserted to the position for sampling the radioactive molten salts (molten salts). The pressure inside the pipeline connected to the upper portion of the sampling tube is reduced for a while; the inside of the pipeline is previously evacuated by the pump so as to keep a predetermined pressure. Since the pressure in the sampling tube is lowered, molten salts are drawn into the sampling tube; the tube is then withdrawn, and the molten salts that have flowed into it are analyzed. (I.S.)

  9. Preferential sampling in veterinary parasitological surveillance

    Directory of Open Access Journals (Sweden)

    Lorenzo Cecconi

    2016-04-01

    In parasitological surveillance of livestock, prevalence surveys are conducted on a sample of farms using several sampling designs. For example, opportunistic surveys or informative sampling designs are very common. Preferential sampling refers to any situation in which the spatial process and the sampling locations are not independent. Most examples of preferential sampling in the spatial statistics literature are in environmental statistics, with a focus on pollutant monitors, and it has been shown that, if preferential sampling is present and is not accounted for in the statistical modelling and data analysis, statistical inference can be misleading. In this paper, working in the context of veterinary parasitology, we propose and use geostatistical models to predict the continuous and spatially-varying risk of a parasite infection. Specifically, breaking with the common practice in veterinary parasitological surveillance of ignoring preferential sampling even though informative or opportunistic samples are very common, we specify a two-stage hierarchical Bayesian model that adjusts for preferential sampling, and we apply it to data on Fasciola hepatica infection in sheep farms in the Campania region (Southern Italy) in the years 2013-2014.

  10. Sampling system and method

    Science.gov (United States)

    Decker, David L.; Lyles, Brad F.; Purcell, Richard G.; Hershey, Ronald Lee

    2013-04-16

    The present disclosure provides an apparatus and method for coupling conduit segments together. A first pump obtains a sample and transmits it through a first conduit to a reservoir accessible by a second pump. The second pump further conducts the sample from the reservoir through a second conduit.

  11. Inorganic elements in sugar samples

    International Nuclear Information System (INIS)

    Salles, Paulo M.B. de; Campos, Tarcisio P.R. de

    2013-01-01

    Sugar is considered a safe food ingredient; however, it can become contaminated by inorganic elements anywhere from planting to the production process. Thus, this study aims at checking the presence of inorganic elements in samples of crystal, refined and brown sugar available for consumption in Brazil. The applied technique was neutron activation analysis, the k₀ method, using the TRIGA MARK - IPR-R1 reactor located at CDTN/CNEN, in Belo Horizonte. The presence of elements such as Au, Br, Co, Cr, Hf, K, Na, Sb, Sc and Zn was identified in the samples of crystal/refined sugar, and the presence of As, Au, Br, Ca, Co, Cr, Cs, Fe, Hf, K, Na, Sb, Sc, Sm, Sr, Th and Zn in the brown sugar samples. The applied technique was appropriate for this study because it was not necessary to put the samples into solution, an essential condition for applying other techniques; this avoids contamination and sample losses, besides allowing multielementary detection in different sugar samples. (author)

  12. Inorganic elements in sugar samples

    Energy Technology Data Exchange (ETDEWEB)

    Salles, Paulo M.B. de; Campos, Tarcisio P.R. de, E-mail: pauladesalles@yahoo.com.br, E-mail: tprcampos@pq.cnpq.br [Universidade Federal de Minas Gerais (DEN/UFMG), Belo Horizonte, MG (Brazil). Departamento de Engenharia Nuclear; Menezes, Maria Angela de B.C., E-mail: menezes@cdtn.br [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil)

    2013-07-01

    Sugar is considered a safe food ingredient; however, it can become contaminated by inorganic elements anywhere from planting to the production process. Thus, this study aims at checking the presence of inorganic elements in samples of crystal, refined and brown sugar available for consumption in Brazil. The applied technique was neutron activation analysis, the k₀ method, using the TRIGA MARK - IPR-R1 reactor located at CDTN/CNEN, in Belo Horizonte. The presence of elements such as Au, Br, Co, Cr, Hf, K, Na, Sb, Sc and Zn was identified in the samples of crystal/refined sugar, and the presence of As, Au, Br, Ca, Co, Cr, Cs, Fe, Hf, K, Na, Sb, Sc, Sm, Sr, Th and Zn in the brown sugar samples. The applied technique was appropriate for this study because it was not necessary to put the samples into solution, an essential condition for applying other techniques; this avoids contamination and sample losses, besides allowing multielementary detection in different sugar samples. (author)

  13. Supporting Sampling and Sample Preparation Tools for Isotope and Nuclear Analysis

    International Nuclear Information System (INIS)

    2016-03-01

    Nuclear and related techniques can help develop climate-smart agricultural practices by optimizing water and nutrient use efficiency, assessing organic carbon sequestration in soil, and assisting in the evaluation of soil erosion control measures. Knowledge on the behaviour of radioactive materials in soil, water and foodstuffs is also essential in enhancing nuclear emergency preparedness and response. Appropriate sampling and sample preparation are the first steps to ensure the quality and effective use of the measurements and this publication provides comprehensive detail on the necessary steps

  14. An antithetic variate to facilitate upper-stem height measurements for critical height sampling with importance sampling

    Science.gov (United States)

    Thomas B. Lynch; Jeffrey H. Gove

    2013-01-01

    Critical height sampling (CHS) estimates cubic volume per unit area by multiplying the sum of critical heights measured on trees tallied in a horizontal point sample (HPS) by the HPS basal area factor. One of the barriers to practical application of CHS is the fact that trees near the field location of the point-sampling sample point have critical heights that occur...
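
    The estimator stated above is simple enough to show directly; in the sketch below the basal area factor and the critical heights are hypothetical numbers, not data from the study.

```python
# Critical height sampling (CHS) estimator: per-unit-area volume equals the
# HPS basal area factor times the sum of critical heights of tallied trees.

baf = 4.0                                          # basal area factor, m^2/ha
critical_heights = [11.2, 8.7, 14.5, 9.9, 12.3]    # meters, trees tallied at one point

volume_per_ha = baf * sum(critical_heights)        # m^3/ha
print(f"CHS estimate: {volume_per_ha:.1f} m^3/ha")
```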

  15. Sampling for Beryllium Surface Contamination using Wet, Dry and Alcohol Wipe Sampling

    Energy Technology Data Exchange (ETDEWEB)

    Kerr, Kent [Central Missouri State Univ., Warrensburg, MO (United States)

    2004-12-01

    This research project was conducted at the National Nuclear Security Administration's Kansas City Plant, operated by Honeywell Federal Manufacturing and Technologies, in conjunction with the Safety Sciences Department of Central Missouri State University, to compare the relative removal efficiencies of three wipe sampling techniques currently used at Department of Energy facilities. Efficiencies of removal of beryllium contamination from typical painted surfaces were tested by wipe sampling with dry Whatman 42 filter paper, with water-moistened (Ghost Wipe) materials, and with methanol-moistened wipes. Test plates were prepared using 100 mm x 15 mm Pyrex Petri dishes with interior surfaces spray painted with a bond coat primer. To achieve uniform deposition over the test plate surface, 10 ml aliquots of solution containing 1 beryllium and 0.1 ml of metal working fluid were transferred to the test plates and subsequently evaporated. Metal working fluid was added to simulate the slight oiliness common on surfaces in metal working shops where fugitive oil mist accumulates over time. Sixteen test plates for each wipe method (dry, water, and methanol) were processed and sampled using a modification of the wiping patterns recommended by OSHA Method 125G. Laboratory and statistical analysis showed that methanol-moistened wipe sampling removed significantly more (about twice as much) beryllium/oil-film surface contamination as water-moistened wipes (p < 0.001), which removed significantly more (about twice as much) residue as dry wipes (p < 0.001). Evidence for beryllium sensitization via skin exposure argues in favor of wipe sampling with wetting agents that provide enhanced residue removal efficiency.

  16. 16 CFR 305.6 - Sampling.

    Science.gov (United States)

    2010-01-01

    16 CFR 305.6 (Commercial Practices; Energy Policy and Conservation Act "Appliance Labeling Rule", Testing), Sampling: (a) For any... based upon the sampling procedures set forth in § 430.24 of 10 CFR part 430, subpart B. (b) For any...

  17. Sampling Large Graphs for Anticipatory Analytics

    Science.gov (United States)

    2015-05-15

    Random area sampling [8] is a "snowball" sampling method in which a set of random seed vertices is selected and the areas around them are sampled... The challenges of analyzing large graphs may be addressed through improved systems, greater human-in-the-loop involvement, or through complex algorithms. We are investigating the use of sampling to mitigate these challenges
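
    A hedged sketch of the random-area (snowball-style) idea described in the record: pick random seed vertices, then expand a breadth-first "area" around each seed up to a fixed number of vertices. The graph, seed count, and area size are all illustrative assumptions.

```python
import random
from collections import deque

def random_area_sample(adj, num_seeds=3, area_size=50):
    """Snowball-style sampling: BFS areas grown around random seeds."""
    sampled = set()
    seeds = random.sample(list(adj), num_seeds)
    for seed in seeds:
        frontier, grabbed = deque([seed]), 0
        while frontier and grabbed < area_size:
            v = frontier.popleft()
            if v in sampled:
                continue
            sampled.add(v)
            grabbed += 1
            frontier.extend(adj[v])
    return sampled

# toy graph: a ring of 1000 vertices plus a few random chords
adj = {i: [(i - 1) % 1000, (i + 1) % 1000] for i in range(1000)}
for _ in range(200):
    a, b = random.randrange(1000), random.randrange(1000)
    adj[a].append(b)
    adj[b].append(a)

print(f"sampled {len(random_area_sample(adj))} of {len(adj)} vertices")
```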

  18. Scalability on LHS (Latin Hypercube Sampling) samples for use in uncertainty analysis of large numerical models

    International Nuclear Information System (INIS)

    Baron, Jorge H.; Nunez Mac Leod, J.E.

    2000-01-01

    The present paper deals with the utilization of advanced statistical sampling methods to perform uncertainty and sensitivity analysis on numerical models. Such models may represent physical phenomena, logical structures (such as boolean expressions) or other systems, and several of their intrinsic parameters and/or input variables are usually treated simultaneously as random variables. In the present paper a simple method to scale up Latin Hypercube Sampling (LHS) samples is presented, starting with a small sample and duplicating its size at each step, making it possible to re-use the numerical model results already obtained with the smaller sample. The method does not distort the statistical properties of the random variables and does not add any bias to the samples. The result is that a significant reduction in numerical model running time can be achieved (by re-using the previously run samples), keeping all the advantages of LHS, until an acceptable representation level is achieved in the output variables. (author)
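
    The sketch below shows one way such doubling can be realized (an illustrative construction, not the authors' code): in each dimension the n old points already occupy n of the 2n finer strata, so the n new points are placed in the empty strata, randomly paired across dimensions, and all previous model runs are kept.

```python
import random

def lhs(n, d):
    """Plain Latin Hypercube sample of n points in [0,1)^d."""
    cols = []
    for _ in range(d):
        perm = random.sample(range(n), n)          # one point per stratum
        cols.append([(k + random.random()) / n for k in perm])
    return [[cols[j][i] for j in range(d)] for i in range(n)]

def double_lhs(points):
    """Double an LHS from n to 2n points, re-using the original n points."""
    n, d = len(points), len(points[0])
    new_cols = []
    for j in range(d):
        occupied = {int(p[j] * 2 * n) for p in points}   # strata already hit
        empty = [s for s in range(2 * n) if s not in occupied]
        random.shuffle(empty)                            # random pairing across dims
        new_cols.append([(s + random.random()) / (2 * n) for s in empty])
    new_points = [[new_cols[j][i] for j in range(d)] for i in range(n)]
    return points + new_points                           # old runs are kept

sample = lhs(8, 2)
sample = double_lhs(sample)    # 16 points; the first 8 model runs are re-used
print(len(sample), "points")
```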

  19. Environmental sampling for trace analysis

    International Nuclear Information System (INIS)

    Markert, B.

    1994-01-01

    Often too little attention is given to the sampling before and after the actual instrumental measurement. This leads to errors, despite increasingly sensitive analytical systems. This is one of the first books to pay proper attention to representative sampling. It offers an overview of the most common techniques used today for taking environmental samples. The techniques are clearly presented, yield accurate and reproducible results, and can be used to sample air, water, soil and sediments, and plants and animals. A comprehensive handbook, this volume provides an excellent starting point for researchers in the rapidly expanding field of environmental analysis. (orig.)

  20. Use of robotic systems for radiochemical sample changing and for analytical sample preparation

    International Nuclear Information System (INIS)

    Delmastro, J.R.; Hartenstein, S.D.; Wade, M.A.

    1989-01-01

    Two uses of the Perkin-Elmer (PE) robotic system will be presented. In the first, a PE robot functions as an automatic sample changer for up to five low energy photon spectrometry (LEPS) detectors operated with a Nuclear Data ND 6700 system. The entire system, including the robot, is controlled by an IBM PC-AT using software written in compiled BASIC. Problems associated with the development of the system and modifications to the robot will be presented. In the second, an evaluation study was performed to assess the abilities of the PE robotic system for performing complex analytical sample preparation procedures. For this study, a robotic system based upon the PE robot and auxiliary devices was constructed and programmed to perform the preparation of final product samples (UO₃) for accountability and impurity specification analyses. These procedures require sample dissolution, dilution, and liquid-liquid extraction steps. The results of an in-depth evaluation of all system components will be presented

  1. The optimal sampling of outsourcing product

    International Nuclear Information System (INIS)

    Yang Chao; Pei Jiacheng

    2014-01-01

    In order to improve quality and cost, a c = 0 sampling plan has been introduced for the inspection of outsourced product. According to the current quality level (p = 0.4%), we confirmed the optimal sampling plan, namely: Ac = 0; if N ≤ 3000, n = 55; if 3001 ≤ N ≤ 10000, n = 86; if N ≥ 10001, n = 108. Through analysis of the OC curve, we came to the conclusion that when N ≤ 3000, the protective ability of the optimal sampling plan for product quality is stronger than that of the current plan. For the same 'consumer risk', the product quality under the optimal plan is superior to that under the current plan. (authors)
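
    For a zero-acceptance-number plan the OC curve mentioned above has a particularly simple form: a lot is accepted only if the sample contains no defectives, so for large lots P(accept) ≈ (1 - p)^n (the binomial approximation to the hypergeometric). A minimal sketch, using the sample sizes and quality level quoted in the abstract:

```python
# OC curve for an Ac = 0 acceptance sampling plan (binomial approximation,
# appropriate when the lot size N is large relative to the sample size n).

def prob_accept(p, n):
    return (1.0 - p) ** n   # accept only if zero defectives are found

for n in (55, 86, 108):     # the sample sizes quoted in the abstract
    pa = prob_accept(0.004, n)   # at the stated quality level p = 0.4%
    print(f"n = {n:3d}: P(accept | p = 0.4%) = {pa:.3f}")
```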

  2. Lunar sample studies

    International Nuclear Information System (INIS)

    1977-01-01

    Lunar samples discussed and the nature of their analyses are: (1) an Apollo 15 breccia which is thoroughly analyzed as to the nature of the mature regolith from which it derived and the time and nature of the lithification process, (2) two Apollo 11 and one Apollo 12 basalts analyzed in terms of chemistry, Cross-Iddings-Pirsson-Washington norms, mineralogy, and petrography, (3) eight Apollo 17 mare basalts, also analyzed in terms of chemistry, Cross-Iddings-Pirsson-Washington norms, mineralogy, and petrography. The first seven are shown to be chemically similar although of two main textural groups; the eighth is seen to be distinct in both chemistry and mineralogy, (4) a troctolitic clast from a Fra Mauro breccia, analyzed and contrasted with other high-temperature lunar mineral assemblages. Two basaltic clasts from the same breccia are shown to have affinities with rock 14053, and (5) the uranium-thorium-lead systematics of three Apollo 16 samples are determined; serious terrestrial-lead contamination of the first two samples is attributed to bandsaw cutting in the lunar curatorial facility

  3. Bottom sample taker

    Energy Technology Data Exchange (ETDEWEB)

    Garbarenko, O V; Slonimskiy, L D

    1982-01-01

    In order to improve the quality of samples taken from benthic sediments during offshore exploration, the proposed sample taker has a device which makes it possible to regulate the depth of submersion of the core lifter. For this purpose the upper part of the core lifter has an inner limiting ring, and within the core lifter there is a piston suspended on a cable. The position of the piston in relation to the core lifter is assigned in advance, depending on the compactness of the benthic sediments, and is fixed by tension of the cable, which is held by a clamp in the cover of the core taker housing. When lowered to the bottom, the core taker is released and, under the influence of the hydrostatic pressure of sea water, enters the sediments. The depth of penetration is limited by the distance between the piston and the stop ring. The piston also ensures better preservation of the sample when the instrument is lifted to the surface.

  4. Sample-taking apparatus

    Energy Technology Data Exchange (ETDEWEB)

    Tanov, Y I; Ismailov, R A; Orazov, A

    1980-10-07

    The invention refers to equipment for testing water-bearing levels in loose rocks. Its purpose is to remove, together with the rock sample, a separate fluid sample from the assigned interval. The sample-taking apparatus contains a core lifter, which can be submerged into the casing string, with a housing and a front endpiece in the form of a rod with a piston which covers the cavity of the core lifter, as well as a mechanism for fixing and moving the endpiece within the core lifter cavity. The device differs from known similar devices in that the upper part of the core lifter housing is equipped with a filter and a mobile casing which covers the filter. The casing is connected to the endpiece rod, and the endpiece is installed so that it can move within limits fixed in the upper position; in its extreme upper position it divides the core lifter cavity into two parts, a filter settling tank and a core-receiving cavity.

  5. CHOMIK -Sampling Device of Penetrating Type for Russian Phobos Sample Return Mission

    Science.gov (United States)

    Seweryn, Karol; Grygorczuk, Jerzy; Rickmann, Hans; Morawski, Marek; Aleksashkin, Sergey; Banaszkiewicz, Marek; Drogosz, Michal; Gurgurewicz, Joanna; Kozlov, Oleg E.; Krolikowska-Soltan, Malgorzata; Sutugin, Sergiej E.; Wawrzaszek, Roman; Wisniewski, Lukasz; Zakharov, Alexander

    Measurements of the physical properties of planetary bodies allow the determination of many parameters important to scientists working in different fields of research. For example, the effective heat conductivity of the regolith can help with better understanding of processes occurring in the body's interior. Chemical and mineralogical composition gives us a chance to better understand the origin and evolution of the moons. In principle, such parameters of planetary bodies can be determined using three different measurement techniques: (i) in situ measurements, (ii) measurements of samples in laboratory conditions on Earth, and (iii) remote sensing measurements. Scientific missions which allow all three types of measurements give us a chance not only for parameter determination but also for cross-calibration of the instruments. The Russian Phobos Sample Return (PhSR) mission is one of the few which allow all such measurements. The spacecraft will be equipped with remote sensing instruments (spectrometers, long-wave radar and a dust counter), instruments for in-situ measurements (a gas chromatograph, a seismometer, a thermodetector and others), and also a robotic arm and a sampling device. The PhSR mission will be launched in November 2011 on board a Zenit launch vehicle. About a year later (11 months) the vehicle will reach Martian orbit. It is anticipated that it will land on Phobos in the beginning of 2013. Take-off back will take place a month later, and the re-entry module, containing a capsule that will hold the soil sample enclosed in a container, will be on its way back to Earth. The 11 kg re-entry capsule with the container will land in Kazakhstan in mid-2014. A unique geological penetrator, CHOMIK, dedicated to the Phobos Sample Return space mission, will be designed and manufactured at the Space Mechatronics and Robotics Laboratory, Space Research Centre Polish Academy of Sciences (SRC PAS) in Warsaw. Functionally CHOMIK is based on the well known MUPUS

  6. Computer Graphics Simulations of Sampling Distributions.

    Science.gov (United States)

    Gordon, Florence S.; Gordon, Sheldon P.

    1989-01-01

    Describes the use of computer graphics simulations to enhance student understanding of sampling distributions that arise in introductory statistics. Highlights include the distribution of sample proportions, the distribution of the difference of sample means, the distribution of the difference of sample proportions, and the distribution of sample…
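
    A minimal text-based version of the kind of simulation described above (the article itself uses interactive graphics) builds the sampling distribution of a sample proportion empirically; the population proportion, sample size, and repetition count are illustrative.

```python
import random

# Simulate the sampling distribution of a sample proportion by repeatedly
# drawing samples of size n from a population with true proportion p_true.
p_true, n, reps = 0.30, 50, 10000
props = [sum(random.random() < p_true for _ in range(n)) / n for _ in range(reps)]

mean = sum(props) / reps
sd = (sum((x - mean) ** 2 for x in props) / reps) ** 0.5
print(f"simulated mean {mean:.3f} (theory {p_true}), "
      f"simulated SD {sd:.3f} (theory {(p_true * (1 - p_true) / n) ** 0.5:.3f})")
```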

  7. Design-based Sample and Probability Law-Assumed Sample: Their Role in Scientific Investigation.

    Science.gov (United States)

    Ojeda, Mario Miguel; Sahai, Hardeo

    2002-01-01

    Discusses some key statistical concepts in probabilistic and non-probabilistic sampling to provide an overview for understanding the inference process. Suggests a statistical model constituting the basis of statistical inference and provides a brief review of the finite population descriptive inference and a quota sampling inferential theory.…

  8. Chapter 12. Sampling and analytical methods

    International Nuclear Information System (INIS)

    Busenberg, E.; Plummer, L.N.; Cook, P.G.; Solomon, D.K.; Han, L.F.; Groening, M.; Oster, H.

    2006-01-01

    When water samples are taken for the analysis of CFCs, regardless of the sampling method used, contamination of samples by contact with atmospheric air (with its 'high' CFC concentrations) is a major concern. This is because groundwaters usually have lower CFC concentrations than waters which have been exposed to modern air. Some groundwaters might not contain CFCs at all and are, therefore, most sensitive to trace contamination by atmospheric air. Thus, extreme precautions are needed to obtain uncontaminated samples when groundwaters, particularly those with older ages, are sampled. It is recommended at the start of any CFC investigation that samples from a CFC-free source be collected and analysed as a check upon the sampling equipment and methodology. The CFC-free source might be a deep monitoring well or, alternatively, CFC-free water could be carefully prepared in the laboratory. It is especially important that all tubing, pumps and connections that will be used in the sampling campaign be checked in this manner

  9. Subrandom methods for multidimensional nonuniform sampling.

    Science.gov (United States)

    Worley, Bradley

    2016-08-01

    Methods of nonuniform sampling that utilize pseudorandom number sequences to select points from a weighted Nyquist grid are commonplace in biomolecular NMR studies, due to the beneficial incoherence introduced by pseudorandom sampling. However, these methods require the specification of a non-arbitrary seed number in order to initialize a pseudorandom number generator. Because the performance of pseudorandom sampling schedules can substantially vary based on seed number, this can complicate the task of routine data collection. Approaches such as jittered sampling and stochastic gap sampling are effective at reducing random seed dependence of nonuniform sampling schedules, but still require the specification of a seed number. This work formalizes the use of subrandom number sequences in nonuniform sampling as a means of seed-independent sampling, and compares the performance of three subrandom methods to their pseudorandom counterparts using commonly applied schedule performance metrics. Reconstruction results using experimental datasets are also provided to validate claims made using these performance metrics. Copyright © 2016 Elsevier Inc. All rights reserved.
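
    To illustrate the seed-free idea in spirit, the sketch below replaces a seeded pseudorandom generator with a deterministic low-discrepancy (van der Corput) sequence and maps it onto a weighted Nyquist grid via the inverse cumulative weighting. The exponential weighting, grid size, and schedule length are illustrative assumptions, not the paper's exact scheme.

```python
import math

def van_der_corput(i, base=2):
    """i-th element of the base-b van der Corput sequence in [0, 1)."""
    v, denom = 0.0, 1.0
    while i > 0:
        i, rem = divmod(i, base)
        denom *= base
        v += rem / denom
    return v

grid = 256                                                   # Nyquist grid points
weights = [math.exp(-3.0 * k / grid) for k in range(grid)]   # decay weighting
total = sum(weights)
cum = [sum(weights[: k + 1]) / total for k in range(grid)]

def pick(u):
    """Inverse-CDF map of u in [0,1) onto the weighted grid."""
    return next(k for k, c in enumerate(cum) if c >= u)

# Deterministic schedule: no random seed, so the result is reproducible
schedule = sorted({pick(van_der_corput(i + 1)) for i in range(64)})
print(f"{len(schedule)} unique grid points:", schedule[:10], "...")
```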

  10. PIXE analysis of thin samples

    International Nuclear Information System (INIS)

    Kiss, Ildiko; Koltay, Ede; Szabo, Gyula; Laszlo, S.; Meszaros, A.

    1985-01-01

    Particle-induced X-ray emission (PIXE) multielemental analyses of thin film samples are reported. Calibration methods for the K and L X-ray lines are discussed. The application of PIXE to aerosol monitoring and multielement aerosol analysis is described. Results of PIXE analysis of samples from two locations in Hungary are compared with the results for aerosol samples from Scandinavia and the USA. (D.Gy.)

  11. BWIP-RANDOM-SAMPLING, Random Sample Generation for Nuclear Waste Disposal

    International Nuclear Information System (INIS)

    Sagar, B.

    1989-01-01

    1 - Description of program or function: Random samples for different distribution types are generated. Distribution types as required for performance assessment modeling of geologic nuclear waste disposal are provided. These are: - Uniform, - Log-uniform (base 10 or natural), - Normal, - Lognormal (base 10 or natural), - Exponential, - Bernoulli, - User defined continuous distribution. 2 - Method of solution: A linear congruential generator is used for uniform random numbers. A set of functions is used to transform the uniform distribution to the other distributions. Stratified, rather than random, sampling can be chosen. Truncated limits can be specified on many distributions, whose usual definition has an infinite support. 3 - Restrictions on the complexity of the problem: Generation of correlated random variables is not included
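    The program's two building blocks, a linear congruential generator and inverse-transform sampling with truncation, can be sketched as follows; the LCG constants, rate, and truncation limits are illustrative assumptions, not values taken from BWIP-RANDOM-SAMPLING.

    ```python
    import math

    def lcg(seed, a=1664525, c=1013904223, m=2**32):
        """Linear congruential generator yielding uniform deviates in (0, 1)."""
        x = seed
        while True:
            x = (a * x + c) % m
            yield (x + 0.5) / m

    def truncated_exponential(u, rate, lo, hi):
        """Inverse-transform an exponential restricted to [lo, hi]: the uniform
        deviate is mapped into the CDF band [F(lo), F(hi)] before inversion."""
        F = lambda x: 1.0 - math.exp(-rate * x)
        p = F(lo) + u * (F(hi) - F(lo))
        return -math.log(1.0 - p) / rate

    gen = lcg(seed=12345)
    print([round(truncated_exponential(next(gen), 0.1, 0.0, 50.0), 2) for _ in range(5)])
    ```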

  12. Characterization of Vadose Zone Sediment: Uncontaminated RCRA Borehole Core Samples and Composite Samples

    International Nuclear Information System (INIS)

    Serne, R. Jeffrey; Bjornstad, Bruce N.; Schaef, Herbert T.; Williams, Bruce A.; Lanigan, David C.; Horton, Duane G.; Clayton, Ray E.; Mitroshkov, Alexandre V.; Legore, Virginia L.; O'Hara, Matthew J.; Brown, Christopher F.; Parker, Kent E.; Kutnyakov, Igor V.; Serne, Jennifer N.; Last, George V.; Smith, Steven C.; Lindenmeier, Clark W.; Zachara, John M.; Burke, Deborah Sd.

    2001-01-01

The overall goal of the Tank Farm Vadose Zone Project, led by CH2M HILL Hanford Group, Inc., is to define risks from past and future single-shell tank farm activities. To meet this goal, CH2M HILL Hanford Group, Inc. asked scientists from Pacific Northwest National Laboratory to perform detailed analyses on vadose zone sediment from within the S-SX Waste Management Area. This report is the first in a series of four reports to present the results of these analyses. Specifically, this report contains all the geologic, geochemical, and selected physical characterization data collected on vadose zone sediment recovered from RCRA borehole core samples and composite samples. Intact cores from two RCRA boreholes (299-W22-48 and 299-W22-50) near the SX Tank Farm and four large-quantity grab samples from outcrop sediment on and off the Hanford Site were sampled to better understand the fate of contaminants in the vadose zone beneath underground storage tanks at the Hanford Site. Borehole and outcrop samples analyzed for this report are located outside the tank farms, and therefore may be considered standard or background samples against which to compare contaminated sediments within the tank farms themselves. This report presents our interpretation of the physical, chemical, and mineralogical properties of the uncontaminated vadose zone sediments, and of variations in the vertical distribution of these properties. The information presented in this report is intended to support preparation of the S-SX Field Investigation Report to be prepared by CH2M HILL Hanford Group, Inc. as well as future remediation actions at the S-SX Tank Farm

  13. Sampling Methodologies for Epidemiologic Surveillance of Men Who Have Sex with Men and Transgender Women in Latin America: An Empiric Comparison of Convenience Sampling, Time Space Sampling, and Respondent Driven Sampling

    Science.gov (United States)

    Clark, J. L.; Konda, K. A.; Silva-Santisteban, A.; Peinado, J.; Lama, J. R.; Kusunoki, L.; Perez-Brumer, A.; Pun, M.; Cabello, R.; Sebastian, J. L.; Suarez-Ognio, L.; Sanchez, J.

    2014-01-01

    Alternatives to convenience sampling (CS) are needed for HIV/STI surveillance of most-at-risk populations in Latin America. We compared CS, time space sampling (TSS), and respondent driven sampling (RDS) for recruitment of men who have sex with men (MSM) and transgender women (TW) in Lima, Peru. During concurrent 60-day periods from June–August, 2011, we recruited MSM/TW for epidemiologic surveillance using CS, TSS, and RDS. A total of 748 participants were recruited through CS, 233 through TSS, and 127 through RDS. The TSS sample included the largest proportion of TW (30.7 %) and the lowest percentage of subjects who had previously participated in HIV/STI research (14.9 %). The prevalence of newly diagnosed HIV infection, according to participants’ self-reported previous HIV diagnosis, was highest among TSS recruits (17.9 %) compared with RDS (12.6 %) and CS (10.2 %). TSS identified diverse populations of MSM/TW with higher prevalences of HIV/STIs not accessed by other methods. PMID:24362754

  15. On Invertible Sampling and Adaptive Security

    DEFF Research Database (Denmark)

    Ishai, Yuval; Kumarasubramanian, Abishek; Orlandi, Claudio

    2011-01-01

    functionalities was left open. We provide the first convincing evidence that the answer to this question is negative, namely that some (randomized) functionalities cannot be realized with adaptive security. We obtain this result by studying the following related invertible sampling problem: given an efficient...... sampling algorithm A, obtain another sampling algorithm B such that the output of B is computationally indistinguishable from the output of A, but B can be efficiently inverted (even if A cannot). This invertible sampling problem is independently motivated by other cryptographic applications. We show......, under strong but well studied assumptions, that there exist efficient sampling algorithms A for which invertible sampling as above is impossible. At the same time, we show that a general feasibility result for adaptively secure MPC implies that invertible sampling is possible for every A, thereby...

  16. 40 CFR 761.323 - Sample preparation.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 30 2010-07-01 2010-07-01 false Sample preparation. 761.323 Section... Remediation Waste Samples § 761.323 Sample preparation. (a) The comparison study requires analysis of a... concentrations by dilution. Any excess material resulting from the preparation of these samples, which is not...

  17. 30 CFR 90.208 - Bimonthly sampling.

    Science.gov (United States)

    2010-07-01

    ... MANDATORY HEALTH STANDARDS-COAL MINERS WHO HAVE EVIDENCE OF THE DEVELOPMENT OF PNEUMOCONIOSIS Sampling Procedures § 90.208 Bimonthly sampling. (a) Each operator shall take one valid respirable dust sample for... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Bimonthly sampling. 90.208 Section 90.208...

  18. 30 CFR 90.207 - Compliance sampling.

    Science.gov (United States)

    2010-07-01

    ... MANDATORY HEALTH STANDARDS-COAL MINERS WHO HAVE EVIDENCE OF THE DEVELOPMENT OF PNEUMOCONIOSIS Sampling Procedures § 90.207 Compliance sampling. (a) The operator shall take five valid respirable dust samples for... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Compliance sampling. 90.207 Section 90.207...

  19. Special nuclear material inventory sampling plans

    International Nuclear Information System (INIS)

    Vaccaro, H.S.; Goldman, A.S.

    1987-01-01

    This paper presents improved procedures for obtaining statistically valid sampling plans for nuclear facilities. The double sampling concept and methods for developing optimal double sampling plans are described. An algorithm is described that is satisfactory for finding optimal double sampling plans and choosing appropriate detection and false alarm probabilities
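    The double sampling concept itself reduces to a short binomial calculation. The sketch below evaluates a plan with textbook acceptance rules and illustrative parameters; it is not the authors' optimization algorithm, only the probability machinery such an algorithm would search over.

    ```python
    from scipy.stats import binom

    def accept_prob(p, n1, c1, r1, n2, c2):
        """Probability that a lot with defect rate p is accepted by a double
        sampling plan: accept if d1 <= c1, reject if d1 >= r1, otherwise draw
        a second sample and accept if d1 + d2 <= c2."""
        pa = binom.cdf(c1, n1, p)                        # accepted on the first sample
        for d1 in range(c1 + 1, r1):                     # second sample required
            pa += binom.pmf(d1, n1, p) * binom.cdf(c2 - d1, n2, p)
        return pa

    plan = dict(n1=50, c1=1, r1=4, n2=100, c2=3)         # illustrative plan
    print(1 - accept_prob(0.01, **plan))                 # false alarm (good lot rejected)
    print(accept_prob(0.08, **plan))                     # missed detection (bad lot accepted)
    ```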

  20. Sample container for neutron activation analysis

    International Nuclear Information System (INIS)

    Lersmacher, B.; Verheijke, M.L.; Jaspers, H.J.

    1983-01-01

    The sample container avoids contaminating the sample substance by diffusion of foreign matter from the wall of the sample container into the sample. It cannot be activated, so that the results of measurements are not falsified by a radioactive container wall. It consists of solid carbon. (orig./HP) [de

  1. Simple street tree sampling

    Science.gov (United States)

    David J. Nowak; Jeffrey T. Walton; James Baldwin; Jerry. Bond

    2015-01-01

    Information on street trees is critical for management of this important resource. Sampling of street tree populations provides an efficient means to obtain street tree population information. Long-term repeat measures of street tree samples supply additional information on street tree changes and can be used to report damages from catastrophic events. Analyses of...

  2. Rational Learning and Information Sampling: On the "Naivety" Assumption in Sampling Explanations of Judgment Biases

    Science.gov (United States)

    Le Mens, Gael; Denrell, Jerker

    2011-01-01

    Recent research has argued that several well-known judgment biases may be due to biases in the available information sample rather than to biased information processing. Most of these sample-based explanations assume that decision makers are "naive": They are not aware of the biases in the available information sample and do not correct for them.…

  3. A Bayesian Method for Weighted Sampling

    OpenAIRE

    Lo, Albert Y.

    1993-01-01

    Bayesian statistical inference for sampling from weighted distribution models is studied. Small-sample Bayesian bootstrap clone (BBC) approximations to the posterior distribution are discussed. A second-order property for the BBC in unweighted i.i.d. sampling is given. A consequence is that BBC approximations to a posterior distribution of the mean and to the sampling distribution of the sample average, can be made asymptotically accurate by a proper choice of the random variables that genera...

  4. On-capillary sample cleanup method for the electrophoretic determination of carbohydrates in juice samples.

    Science.gov (United States)

    Morales-Cid, Gabriel; Simonet, Bartolomé M; Cárdenas, Soledad; Valcárcel, Miguel

    2007-05-01

    On many occasions, sample treatment is a critical step in electrophoretic analysis. As an alternative to batch procedures, in this work, a new strategy is presented with a view to develop an on-capillary sample cleanup method. This strategy is based on the partial filling of the capillary with carboxylated single-walled carbon nanotube (c-SWNT). The nanoparticles retain interferences from the matrix allowing the determination and quantification of carbohydrates (viz glucose, maltose and fructose). The precision of the method for the analysis of real samples ranged from 5.3 to 6.4%. The proposed method was compared with a method based on a batch filtration of the juice sample through diatomaceous earth and further electrophoretic determination. This method was also validated in this work. The RSD for this other method ranged from 5.1 to 6%. The results obtained by both methods were statistically comparable demonstrating the accuracy of the proposed methods and their effectiveness. Electrophoretic separation of carbohydrates was achieved using 200 mM borate solution as a buffer at pH 9.5 and applying 15 kV. During separation, the capillary temperature was kept constant at 40 degrees C. For the on-capillary cleanup method, a solution containing 50 mg/L of c-SWNTs prepared in 300 mM borate solution at pH 9.5 was introduced for 60 s into the capillary just before sample introduction. For the electrophoretic analysis of samples cleaned in batch with diatomaceous earth, it is also recommended to introduce into the capillary, just before the sample, a 300 mM borate solution as it enhances the sensitivity and electrophoretic resolution.

  5. Sample Preparation Report of the Fourth OPCW Confidence Building Exercise on Biomedical Sample Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Udey, R. N. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Corzett, T. H. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Alcaraz, A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2014-07-03

    Following the successful completion of the 3rd biomedical confidence building exercise (February 2013 – March 2013), which included the analysis of plasma and urine samples spiked at low ppb levels as part of the exercise scenario, another confidence building exercise was targeted to be conducted in 2014. In this 4th exercise, it was desired to focus specifically on the analysis of plasma samples. The scenario was designed as an investigation of an alleged use of chemical weapons where plasma samples were collected, as plasma has been reported to contain CWA adducts which remain present in the human body for several weeks (Solano et al. 2008). In the 3rd exercise most participants used the fluoride regeneration method to analyze for the presence of nerve agents in plasma samples. For the 4th biomedical exercise it was decided to evaluate the analysis of human plasma samples for the presence/absence of the VX adducts and aged adducts to blood proteins (e.g., VX-butyrylcholinesterase (BuChE) and aged BuChE adducts using a pepsin digest technique to yield nonapeptides; or equivalent). As the aging of VX-BuChE adducts is relatively slow (t1/2 = 77 hr at 37 °C [Aurbek et al. 2009]), soman (GD), which ages much more quickly (t1/2 = 9 min at 37 °C [Masson et al. 2010]), was used to simulate an aged VX sample. Additional objectives of this exercise included having laboratories assess novel OP-adducted plasma sample preparation techniques and analytical instrumentation methodologies, as well as refining/designating the reporting formats for these new techniques.

  6. Nomogram for sample size calculation on a straightforward basis for the kappa statistic.

    Science.gov (United States)

    Hong, Hyunsook; Choi, Yunhee; Hahn, Seokyung; Park, Sue Kyung; Park, Byung-Joo

    2014-09-01

Kappa is a widely used measure of agreement. However, it may not be straightforward in some situations, such as sample size calculation, due to the kappa paradox: high agreement but low kappa. Hence, it seems reasonable in sample size calculation to consider the level of agreement under a certain marginal prevalence in terms of a simple proportion of agreement rather than a kappa value. Therefore, sample size formulae and nomograms using a simple proportion of agreement rather than a kappa under certain marginal prevalences are proposed. A sample size formula was derived using the kappa statistic under the common correlation model and a goodness-of-fit statistic. The nomogram for the sample size formula was developed using SAS 9.3. Sample size formulae using a simple proportion of agreement instead of a kappa statistic, and nomograms to eliminate the inconvenience of using a mathematical formula, were produced. A nomogram for sample size calculation with a simple proportion of agreement should be useful in the planning stages when the focus of interest is on testing the hypothesis of interobserver agreement involving two raters and nominal outcome measures. Copyright © 2014 Elsevier Inc. All rights reserved.
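    As a rough illustration of sizing a study on a simple proportion of agreement, here is the generic normal-approximation calculation (not the formula derived in the paper, and the inputs are assumptions):

    ```python
    from math import ceil
    from scipy.stats import norm

    def n_for_agreement(p_agree, half_width, alpha=0.05):
        """Subjects needed so that a 100(1-alpha)% CI for the proportion of
        agreement between two raters has the requested half-width."""
        z = norm.ppf(1 - alpha / 2)
        return ceil(z**2 * p_agree * (1 - p_agree) / half_width**2)

    print(n_for_agreement(p_agree=0.85, half_width=0.05))   # 196 subjects
    ```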

  7. Sampling a guide for internal auditors

    CERN Document Server

    Apostolou, Barbara

    2004-01-01

    While it is possible to examine 100 percent of an audit customer's data, the time and cost associated with such a study are often prohibitive. To obtain sufficient, reliable, and relevant information with a limited data set, sampling is an efficient and effective tool. It can help you evaluate the customer's assertions, as well as reach audit conclusions and provide reasonable assurance to your organization. This handbook will help you understand sampling. It also serves as a guide for auditors and students preparing for certification. Topics include: An overview of sampling. Statistical and nonstatistical sampling issues. Sampling selection methods and risks. The pros and cons of popular sampling plans.

  8. Consensus of heterogeneous multi-agent systems based on sampled data with a small sampling delay

    International Nuclear Information System (INIS)

    Wang Na; Wu Zhi-Hai; Peng Li

    2014-01-01

In this paper, consensus problems of heterogeneous multi-agent systems based on sampled data with a small sampling delay are considered. First, a consensus protocol based on sampled data with a small sampling delay for heterogeneous multi-agent systems is proposed. Then, algebraic graph theory, the matrix method, the stability theory of linear systems, and some other techniques are employed to derive the necessary and sufficient conditions guaranteeing heterogeneous multi-agent systems to asymptotically achieve the stationary consensus. Finally, simulations are performed to demonstrate the correctness of the theoretical results. (interdisciplinary physics and related areas of science and technology)
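    A minimal numerical sketch of the sampled-data idea, simplified to first-order agents with a one-period sampling delay (the ring topology, sampling period, and first-order dynamics are assumptions; the paper treats heterogeneous, mixed-order agents):

    ```python
    import numpy as np

    # Undirected ring of four agents (illustrative topology)
    A = np.array([[0, 1, 0, 1],
                  [1, 0, 1, 0],
                  [0, 1, 0, 1],
                  [1, 0, 1, 0]], dtype=float)
    L = np.diag(A.sum(axis=1)) - A        # graph Laplacian
    T = 0.1                               # sampling period
    x = np.array([1.0, -2.0, 0.5, 3.0])  # initial states
    x_prev = x.copy()                     # states one sampling period ago (the delay)

    for k in range(200):
        # Each agent updates using neighbours' delayed sampled states
        x, x_prev = x - T * (L @ x_prev), x

    print(x)  # all entries approach the average of the initial states (0.625)
    ```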

  9. Using silver nano particles for sampling of toxic mercury vapors from industrial air sample

    Directory of Open Access Journals (Sweden)

    M. Osanloo

    2014-05-01

Conclusion: The presented adsorbent is very useful for sampling trace amounts of mercury vapors from air. Moreover, it can be regenerated easily and is suitable for sampling at 25 to 70 °C. Due to oxidation of the silver and a reduction in the uptake of the nanoparticles, an oven temperature of 245 °C is used for the recovery of metallic silver. A low amount of adsorbent, high absorbency, high repeatability for sampling, low cost and high accuracy are among the advantages of the presented method.

  10. Optimum sample size allocation to minimize cost or maximize power for the two-sample trimmed mean test.

    Science.gov (United States)

    Guo, Jiin-Huarng; Luh, Wei-Ming

    2009-05-01

    When planning a study, sample size determination is one of the most important tasks facing the researcher. The size will depend on the purpose of the study, the cost limitations, and the nature of the data. By specifying the standard deviation ratio and/or the sample size ratio, the present study considers the problem of heterogeneous variances and non-normality for Yuen's two-group test and develops sample size formulas to minimize the total cost or maximize the power of the test. For a given power, the sample size allocation ratio can be manipulated so that the proposed formulas can minimize the total cost, the total sample size, or the sum of total sample size and total cost. On the other hand, for a given total cost, the optimum sample size allocation ratio can maximize the statistical power of the test. After the sample size is determined, the present simulation applies Yuen's test to the sample generated, and then the procedure is validated in terms of Type I errors and power. Simulation results show that the proposed formulas can control Type I errors and achieve the desired power under the various conditions specified. Finally, the implications for determining sample sizes in experimental studies and future research are discussed.
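    The flavor of the cost argument is captured by the classical square-root allocation rule for a two-group mean comparison, shown below with assumed standard deviations and unit costs (a textbook result, not the trimmed-mean formulas developed in the paper):

    ```python
    import math

    def optimal_allocation_ratio(sd1, sd2, cost1, cost2):
        """Classical cost-optimal allocation: for a fixed budget, the variance of
        the mean difference is minimized when n1/n2 = (sd1/sd2)*sqrt(cost2/cost1)."""
        return (sd1 / sd2) * math.sqrt(cost2 / cost1)

    # Group 1 twice as variable, group 2 twice as costly per observation:
    print(optimal_allocation_ratio(sd1=2.0, sd2=1.0, cost1=1.0, cost2=2.0))  # ~2.83
    ```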

  11. A recommended procedure for establishing the source level relationships between heroin case samples of unknown origins

    Directory of Open Access Journals (Sweden)

    Kar-Weng Chan

    2014-06-01

    Full Text Available A recent concern of how to reliably establish the source level relationships of heroin case samples is addressed in this paper. Twenty-two trafficking heroin case samples of unknown origins seized from two major regions (Kuala Lumpur and Penang in Malaysia were studied. A procedure containing six major steps was followed to analyze and classify these samples. Subsequently, with the aid of statistical control samples, reliability of the clustering result was assessed. The final outcome reveals that the samples seized from the two regions in 2013 had highly likely originated from two different sources. Hence, the six-step procedure is sufficient for any chemist who attempts to assess the relative source level relationships of heroin samples.

  12. Monolith Chromatography as Sample Preparation Step in Virome Studies of Water Samples.

    Science.gov (United States)

    Gutiérrez-Aguirre, Ion; Kutnjak, Denis; Rački, Nejc; Rupar, Matevž; Ravnikar, Maja

    2018-01-01

Viruses exist in aquatic media and many of them use this medium as a transmission route. Next-generation sequencing (NGS) technologies have opened new doors in virus research, also revealing a hidden diversity of viral species in aquatic environments. Not surprisingly, many of the newly discovered viruses are found in environmental fresh and marine waters. One of the problems in virome research can be the low amount of viral nucleic acids present in the sample in contrast to the background ones (host, eukaryotic, prokaryotic, environmental). Therefore, virus enrichment prior to NGS is necessary in many cases. In water samples, an added problem resides in the low concentration of viruses typically present in aquatic media. Different concentration strategies have been used to overcome such limitations. CIM monoliths are a new generation of chromatographic supports that, owing to their particular structural characteristics, are very efficient in the concentration and purification of viruses. In this chapter, we describe the use of CIM monolithic chromatography as a sample preparation step in NGS studies targeting viruses in fresh or marine water. The step-by-step protocol includes a case study where CIM concentration was used to study the virome of a wastewater sample using NGS.

  13. Solvent Hold Tank Sample Results for MCU-16-934-935-936: June 2016 Monthly Sample

    Energy Technology Data Exchange (ETDEWEB)

    Fondeur, F. F. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Jones, D. H. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2016-08-30

Savannah River National Laboratory (SRNL) received one set of Solvent Hold Tank (SHT) samples (MCU-16-934-935-936), pulled on 07/01/2016, for analysis. The samples were combined and analyzed for composition. Analysis of the composite sample MCU-16-934-935-936 indicated the Isopar™L concentration is above its nominal level (101%). The modifier (CS-7SB) and TiDG concentrations are 8% and 29% below their nominal concentrations, respectively. This analysis confirms the solvent may require the addition of TiDG, and possibly of modifier. Based on the current monthly sample, the levels of TiDG, Isopar™L, MaxCalix, and modifier are sufficient for continuing operation but are expected to decrease with time. Periodic characterization and trimming additions to the solvent are recommended. No impurities above the 1000 ppm level were found in this solvent by Semi-Volatile Organic Analysis (SVOA). No impurities were observed by Hydrogen Nuclear Magnetic Resonance (HNMR). However, up to 21.1 ± 4 micrograms of mercury per gram of solvent (or 17.5 μg/mL) was detected in this sample (as determined by XRF on an undigested sample). The current gamma level (1.41E5 dpm/mL) confirmed that the gamma concentration has returned to previous levels (as observed in the late 2015 samples) when the process operated normally and as expected.

  14. 40 CFR 141.702 - Sampling schedules.

    Science.gov (United States)

    2010-07-01

    ... serving at least 10,000 people must submit their sampling schedule for the initial round of source water... submitting the sampling schedule that EPA approves. (3) Systems serving fewer than 10,000 people must submit... analytical result for a scheduled sampling date due to equipment failure, loss of or damage to the sample...

  15. 7 CFR 51.17 - Official sampling.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 2 2010-01-01 2010-01-01 false Official sampling. 51.17 Section 51.17 Agriculture... Inspection Service § 51.17 Official sampling. Samples may be officially drawn by any duly authorized... time and place of the sampling and the brands or other identifying marks of the containers from which...

  16. 40 CFR 61.34 - Air sampling.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 8 2010-07-01 2010-07-01 false Air sampling. 61.34 Section 61.34... sampling. (a) Stationary sources subject to § 61.32(b) shall locate air sampling sites in accordance with a... concentrations calculated within 30 days after filters are collected. Records of concentrations at all sampling...

  17. Credit in Acceptance Sampling on Attributes

    NARCIS (Netherlands)

    Klaassen, Chris A.J.

    2000-01-01

Credit is introduced in acceptance sampling on attributes and a Credit Based Acceptance sampling system is developed that is very easy to apply in practice. The credit of a producer is defined as the total number of items accepted since the last rejection. In our sampling system the sample size for a

  18. Calibration samples for accelerator mass spectrometry

    International Nuclear Information System (INIS)

    Hershberger, R.L.; Flynn, D.S.; Gabbard, F.

    1981-01-01

    Radioactive samples with precisely known numbers of atoms are useful as calibration sources for lifetime measurements using accelerator mass spectrometry. Such samples can be obtained in two ways: either by measuring the production rate as the sample is created or by measuring the decay rate after the sample has been obtained. The latter method requires that a large sample be produced and that the decay constant be accurately known. The former method is a useful and independent alternative, especially when the decay constant is not well known. The facilities at the University of Kentucky for precision measurements of total neutron production cross sections offer a source of such calibration samples. The possibilities, while quite extensive, would be limited to the proton rich side of the line of stability because of the use of (p,n) and (α,n) reactions for sample production

  19. Including Below Detection Limit Samples in Post Decommissioning Soil Sample Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jung Hwan; Yim, Man Sung [KAIST, Daejeon (Korea, Republic of)

    2016-05-15

To meet the required standards, the site owner has to show that the soil at the facility has been sufficiently cleaned up. To do this, one must know the contamination of the soil at the site prior to cleanup. This involves sampling that soil to identify the degree of contamination. However, there is a technical difficulty in determining how much decontamination should be done. The problem arises when measured samples are below the detection limit. Regulatory guidelines for site reuse after decommissioning are commonly challenged because the majority of the activity in the soil is at or below the limit of detection. Using additional statistical analyses of contaminated soil after decommissioning is expected to have the following advantages: a better and more reliable probabilistic exposure assessment, better economics (lower project costs), and improved communication with the public. This research will develop an approach that defines an acceptable method for demonstrating compliance of decommissioned NPP sites and validates that compliance. Soil samples from NPPs often contain censored data. Conventional methods for dealing with censored data sets are statistically biased and limited in their usefulness.
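    One standard statistical treatment of below-detection-limit values, of the kind such an approach might build on, is maximum likelihood for left-censored lognormal data: detected values contribute density terms and non-detects contribute the probability mass below the limit. A sketch with synthetic illustrative data (not measurements from any site):

    ```python
    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import norm

    def censored_lognormal_mle(detects, n_nondetects, lod):
        """MLE of (mu, sigma) on the log scale when n_nondetects observations
        fell below the limit of detection `lod`."""
        logs, loglod = np.log(detects), np.log(lod)

        def negloglik(theta):
            mu, log_sigma = theta
            sigma = np.exp(log_sigma)                 # keeps sigma positive
            ll = norm.logpdf(logs, mu, sigma).sum()   # detected values
            ll += n_nondetects * norm.logcdf((loglod - mu) / sigma)  # censored mass
            return -ll

        res = minimize(negloglik, x0=[logs.mean(), np.log(logs.std())])
        return res.x[0], np.exp(res.x[1])

    rng = np.random.default_rng(0)                    # synthetic data: mu=0, sigma=1
    sample = rng.lognormal(0.0, 1.0, size=200)
    lod = 0.5
    print(censored_lognormal_mle(sample[sample >= lod], int((sample < lod).sum()), lod))
    ```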

  20. Sample design effects in landscape genetics

    Science.gov (United States)

    Oyler-McCance, Sara J.; Fedy, Bradley C.; Landguth, Erin L.

    2012-01-01

An important research gap in landscape genetics is the impact of different field sampling designs on the ability to detect the effects of landscape pattern on gene flow. We evaluated how five different sampling regimes (random, linear, systematic, cluster, and single study site) affected the probability of correctly identifying the generating landscape process of population structure. Sampling regimes were chosen to represent a suite of designs common in field studies. We used genetic data generated from a spatially explicit, individual-based program and simulated gene flow in a continuous population across a landscape with gradual spatial changes in resistance to movement. Additionally, we evaluated the sampling regimes using realistic and obtainable numbers of loci (10 and 20), numbers of alleles per locus (5 and 10), numbers of individuals sampled (10-300), and generational times after the landscape was introduced (20 and 400). For a simulated continuously distributed species, we found that the random, linear, and systematic sampling regimes performed well with high sample sizes (>200), levels of polymorphism (10 alleles per locus), and numbers of molecular markers (20). The cluster and single study site sampling regimes were not able to correctly identify the generating process under any conditions and thus are not advisable strategies for scenarios similar to our simulations. Our research emphasizes the importance of sampling data at ecologically appropriate spatial and temporal scales and suggests careful consideration for sampling near landscape components that are likely to most influence the genetic structure of the species. In addition, simulating sampling designs a priori could help guide field data collection efforts.
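    The four spatial regimes other than the single study site can be sketched as simple coordinate generators on a unit-square landscape (sample counts and cluster spread are illustrative assumptions):

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n = 100  # individuals to sample on a unit-square landscape

    random_pts = rng.uniform(0, 1, size=(n, 2))                       # random
    linear_pts = np.column_stack([np.linspace(0, 1, n),               # linear transect
                                  np.full(n, 0.5)])
    g = int(np.sqrt(n))                                               # systematic grid
    gx, gy = np.meshgrid(np.linspace(0, 1, g), np.linspace(0, 1, g))
    systematic_pts = np.column_stack([gx.ravel(), gy.ravel()])
    centers = rng.uniform(0.1, 0.9, size=(5, 2))                      # cluster design:
    cluster_pts = np.vstack([c + rng.normal(0, 0.03, size=(n // 5, 2))
                             for c in centers])                       # 5 tight clusters
    ```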

  1. Procedures for sampling radium-contaminated soils

    International Nuclear Information System (INIS)

    Fleischhauer, H.L.

    1985-10-01

Two procedures for sampling the surface layer (0 to 15 centimeters) of radium-contaminated soil are recommended for use in remedial action projects. Both procedures adhere to the philosophy that soil samples should have constant geometry and constant volume in order to ensure uniformity. In the first procedure, a "cookie cutter" fashioned from pipe or steel plate is driven to the desired depth by means of a slide hammer, and the sample extracted as a core or plug. The second procedure requires use of a template to outline the sampling area, from which the sample is obtained using a trowel or spoon. Sampling to the desired depth must then be performed incrementally. Selection of one procedure over the other is governed primarily by soil conditions, the cookie cutter being effective in nongravelly soils, and the template procedure appropriate for use in both gravelly and nongravelly soils. In any event, a minimum sample volume of 1000 cubic centimeters is recommended. The step-by-step procedures are accompanied by a description of the minimum requirements for sample documentation. Transport of the soil samples from the field is then addressed in a discussion of the federal regulations for shipping radioactive materials. Interpretation of those regulations, particularly in light of their application to remedial action soil-sampling programs, is provided in the form of guidance and suggested procedures. Due to the complex nature of the regulations, however, there is no guarantee that our interpretations of them are complete or entirely accurate. Preparation of soil samples for radium-226 analysis by means of gamma-ray spectroscopy is described

  2. Influence of Sampling Practices on the Appearance of DNA Image Histograms of Prostate Cells in FNAB Samples

    Directory of Open Access Journals (Sweden)

    Abdelbaset Buhmeida

    1999-01-01

Full Text Available Twenty‐one fine needle aspiration biopsies (FNAB) of the prostate, diagnostically classified as definitely malignant, were studied. The Papanicolaou or H&E stained samples were destained and then stained for DNA with the Feulgen reaction. DNA cytometry was applied after different sampling rules. The histograms varied according to the sampling rule applied. Because free cells between cell groups were easier to measure than cells in the cell groups, two sampling rules were tested in all samples: (i) cells in the cell groups were measured, and (ii) free cells between cell groups were measured. Abnormal histograms were more common after the sampling rule based on free cells, suggesting that abnormal patterns are best revealed through the free cells in these samples. The conclusions were independent of the applied histogram interpretation method.

  3. Superfund Site Information - Site Sampling Data

    Data.gov (United States)

    U.S. Environmental Protection Agency — This asset includes Superfund site-specific sampling information including location of samples, types of samples, and analytical chemistry characteristics of...

  4. Sampling and sample preparation methods for the analysis of trace elements in biological material

    International Nuclear Information System (INIS)

    Sansoni, B.; Iyengar, V.

    1978-05-01

The authors attempt to give as systematic a treatment as possible of the sampling and sample preparation of biological material (particularly in human medicine) for trace analysis (e.g., neutron activation analysis, atomic absorption spectrometry). Contamination and loss problems are discussed, as are the manifold problems posed by the differing consistencies of solid and liquid biological materials and by the stabilization of the sample material. Dry and wet ashing are dealt with in particular detail, and new methods are also described. (RB) [de

  5. Conversion of National Health Insurance Service-National Sample Cohort (NHIS-NSC) Database into Observational Medical Outcomes Partnership-Common Data Model (OMOP-CDM).

    Science.gov (United States)

    You, Seng Chan; Lee, Seongwon; Cho, Soo-Yeon; Park, Hojun; Jung, Sungjae; Cho, Jaehyeong; Yoon, Dukyong; Park, Rae Woong

    2017-01-01

It is increasingly necessary to generate medical evidence applicable to Asian people compared to those in Western countries. Observational Health Data Sciences and Informatics (OHDSI) is an international collaborative which aims to facilitate generating high-quality evidence by creating and applying open-source data analytic solutions to a large network of health databases across countries. We aimed to incorporate Korean nationwide cohort data into the OHDSI network by converting the national sample cohort into the Observational Medical Outcomes Partnership-Common Data Model (OMOP-CDM). The data of 1.13 million subjects were converted to OMOP-CDM, with an average conversion rate of 99.1%. ACHILLES, an open-source OMOP-CDM-based data profiling tool, was run on the converted database to visualize a data-driven characterization and assess the quality of the data. The OMOP-CDM version of the National Health Insurance Service-National Sample Cohort (NHIS-NSC) can be a valuable tool for multiple aspects of medical research by incorporation into the OHDSI research network.

  6. Device for sampling liquid radioactive materials

    International Nuclear Information System (INIS)

    Vlasak, L.

    1987-01-01

Remote sampling of radioactive materials in the process of radioactive waste treatment is claimed in Czechoslovak Patent Document 238599. The device eliminates existing difficulties, namely the complex remote control of sampling, which required controlling both sliding and rotary movements of the sampling device. The new device consists of a vertical pipe with an opening provided with a cover. A bend is provided above the opening level, housing flow distributors. A sampling tray is pivoted in the cover. During sampling, the tray is tilted into the vertical pipe space, and it tilts back when filled. The sample flows into a vessel below the tray. A rotary movement alone is thus sufficient for controlling the tray. (Z.M.)

  7. Sample Curation at a Lunar Outpost

    Science.gov (United States)

    Allen, Carlton C.; Lofgren, Gary E.; Treiman, A. H.; Lindstrom, Marilyn L.

    2007-01-01

The six Apollo surface missions returned 2,196 individual rock and soil samples, with a total mass of 381.6 kg. Samples were collected based on visual examination by the astronauts and consultation with geologists in the science back room in Houston. The samples were photographed during collection, packaged in uniquely-identified containers, and transported to the Lunar Module. All samples collected on the Moon were returned to Earth. NASA's upcoming return to the Moon will be different. Astronauts will have extended stays at an outpost and will collect more samples than they will return. They will need curation and analysis facilities on the Moon in order to carefully select samples for return to Earth.

  8. New Approach Based on Compressive Sampling for Sample Rate Enhancement in DASs for Low-Cost Sensing Nodes

    Directory of Open Access Journals (Sweden)

    Francesco Bonavolontà

    2014-10-01

Full Text Available The paper deals with the problem of improving the maximum sample rate of analog-to-digital converters (ADCs) included in low-cost wireless sensing nodes. To this aim, the authors propose an efficient acquisition strategy based on the combined use of a high-resolution time-basis and compressive sampling. In particular, the high-resolution time-basis is adopted to provide a proper sequence of random sampling instants, and a suitable software procedure, based on a compressive sampling approach, is exploited to reconstruct the signal of interest from the acquired samples. Thanks to the proposed strategy, the effective sample rate of the reconstructed signal can be as high as the frequency of the considered time-basis, thus significantly improving the inherent ADC sample rate. Several tests are carried out in simulated and real conditions to assess the performance of the proposed acquisition strategy in terms of reconstruction error. In particular, the results obtained in experimental tests with ADCs included in actual 8- and 32-bit microcontrollers highlight the possibility of achieving an effective sample rate up to 50 times higher than the original ADC sample rate.
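    The reconstruction step can be sketched with a generic orthogonal matching pursuit over an inverse-DCT dictionary; the signal model, sparsity level, and random sampling instants below are illustrative assumptions, not the authors' exact procedure.

    ```python
    import numpy as np
    from scipy.fft import idct

    rng = np.random.default_rng(1)
    N, M, K = 256, 64, 5                  # grid length, samples kept, sparsity

    # Sparse-in-frequency test signal, observed at M random sampling instants
    coeffs = np.zeros(N)
    coeffs[rng.choice(N, K, replace=False)] = rng.normal(0, 10, K)
    signal = idct(coeffs, norm='ortho')
    keep = np.sort(rng.choice(N, M, replace=False))   # random instants on a fine time-basis
    y = signal[keep]

    # Sensing matrix: rows of the inverse-DCT basis at the sampled instants
    Phi = idct(np.eye(N), axis=0, norm='ortho')[keep, :]

    # Orthogonal matching pursuit: greedily grow the support, refit by least squares
    residual, support = y.copy(), []
    for _ in range(K):
        support.append(int(np.argmax(np.abs(Phi.T @ residual))))
        est, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
        residual = y - Phi[:, support] @ est

    x_hat = np.zeros(N)
    x_hat[support] = est
    print(np.max(np.abs(idct(x_hat, norm='ortho') - signal)))  # near-zero error
    ```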

  9. Distance sampling methods and applications

    CERN Document Server

    Buckland, S T; Marques, T A; Oedekoven, C S

    2015-01-01

    In this book, the authors cover the basic methods and advances within distance sampling that are most valuable to practitioners and in ecology more broadly. This is the fourth book dedicated to distance sampling. In the decade since the last book published, there have been a number of new developments. The intervening years have also shown which advances are of most use. This self-contained book covers topics from the previous publications, while also including recent developments in method, software and application. Distance sampling refers to a suite of methods, including line and point transect sampling, in which animal density or abundance is estimated from a sample of distances to detected individuals. The book illustrates these methods through case studies; data sets and computer code are supplied to readers through the book’s accompanying website.  Some of the case studies use the software Distance, while others use R code. The book is in three parts.  The first part addresses basic methods, the ...

  10. Biological Sampling Variability Study

    Energy Technology Data Exchange (ETDEWEB)

    Amidan, Brett G. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Hutchison, Janine R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2016-11-08

There are many sources of variability that exist in the sample collection and analysis process. This paper addresses many, but not all, sources of variability. The main focus of this paper was to better understand and estimate variability due to differences between samplers. Variability between days was also studied, as well as random variability within each sampler. Experiments were performed using multiple surface materials (ceramic and stainless steel), multiple contaminant concentrations (10 spores and 100 spores), and with and without the presence of interfering material. All testing was done with sponge sticks using 10-inch by 10-inch coupons. Bacillus atrophaeus was used as the BA surrogate. Spores were deposited using wet deposition. Grime was coated onto the coupons planned to include the interfering material (Section 3.3). Samples were prepared and analyzed at PNNL using CDC protocol (Section 3.4) and then cultured and counted. Five samplers were trained so that samples were taken using the same protocol. Each sampler randomly sampled eight coupons each day, four coupons with 10 spores deposited and four coupons with 100 spores deposited. Each day consisted of one material being tested. The clean samples (no interfering materials) were run first, followed by the dirty samples (coated with interfering material). There was a significant difference in recovery efficiency between the coupons with 10 spores deposited (mean of 48.9%) and those with 100 spores deposited (mean of 59.8%). There was no general significant difference between the clean and dirty (containing interfering material) coupons or between the two surface materials; however, there was a significant interaction between concentration amount and presence of interfering material. The recovery efficiency was close to the same for coupons with 10 spores deposited, but for the coupons with 100 spores deposited, the recovery efficiency for the dirty samples was significantly larger (65

  11. Multielement analysis of aerosol samples by X-ray fluorescence analysis with totally reflecting sample holders

    International Nuclear Information System (INIS)

    Ketelsen, P.; Knoechel, A.

    1984-01-01

Aerosol samples on filter supports were analyzed using the X-ray fluorescence analytical method (Mo excitation) with totally reflecting sample carriers (TXFA). Wet decomposition of the sample material with HNO3 in an enclosed system and subsequent sample preparation by evaporating an aliquot of the solution on the sample carrier yield detection limits down to 0.3 ng/cm². The reproducibilities of the measurements of the elements K, Ca, Ti, V, Cr, Mn, Fe, Ni, Cu, Zn, As, Se, Rb, Sr, Ba and Pb lie between 5 and 25%. Similar detection limits and reproducibilities are obtained when low-temperature oxygen plasma is employed for the direct ashing of the homogeneously covered filter on the sample carrier. The systematic loss of elements in both methods was investigated with radiotracers as well as with inactive techniques. A comparison of the results with those obtained by NAA, AAS and PIXE shows good agreement in most cases. For bromine determination and fast coverage of the main elements, a possibility for measuring the filter membrane directly has been indicated, which omits the ashing step. The corresponding detection limits are down to 3 ng/cm². (orig.) [de

  12. Accounting for Diversity in Suicide Research: Sampling and Sample Reporting Practices in the United States.

    Science.gov (United States)

    Cha, Christine B; Tezanos, Katherine M; Peros, Olivia M; Ng, Mei Yi; Ribeiro, Jessica D; Nock, Matthew K; Franklin, Joseph C

    2018-04-01

    Research on suicidal thoughts and behaviors (STB) has identified many risk factors, but whether these findings generalize to diverse populations remains unclear. We review longitudinal studies on STB risk factors over the past 50 years in the United States and evaluate the methodological practices of sampling and reporting sample characteristics. We found that articles frequently reported participant age and sex, less frequently reported participant race and ethnicity, and rarely reported participant veteran status or lesbian, gay, bisexual, and transgender status. Sample reporting practices modestly and inconsistently improved over time. Finally, articles predominantly featured White, non-Hispanic, young adult samples. © 2017 The American Association of Suicidology.

  13. Sodium sampling and impurities determination

    International Nuclear Information System (INIS)

    Docekal, J.; Kovar, C.; Stuchlik, S.

    1980-01-01

Samples may be obtained from tubes built into the sodium facility and further processed, or they are taken into crucibles, stored, and processed later. Another sampling method involves vacuum distillation of the sodium, thus concentrating the impurities. Oxygen is determined by amalgamation, distillation, or vanadium balance methods. Hydrogen is determined by metal diaphragm extraction, direct extraction, or amalgamation methods. Carbon is determined using dry techniques, involving burning a sodium sample at 1100 °C, or using wet techniques, by dissolving the sample with an acid. Trace amounts of metal impurities are determined after dissolving sodium in ethanol. The trace metals are concentrated and the sodium excess is removed. (M.S.)

  14. Sample Size Calculations for Population Size Estimation Studies Using Multiplier Methods With Respondent-Driven Sampling Surveys.

    Science.gov (United States)

    Fearon, Elizabeth; Chabata, Sungai T; Thompson, Jennifer A; Cowan, Frances M; Hargreaves, James R

    2017-09-14

    While guidance exists for obtaining population size estimates using multiplier methods with respondent-driven sampling surveys, we lack specific guidance for making sample size decisions. To guide the design of multiplier method population size estimation studies using respondent-driven sampling surveys to reduce the random error around the estimate obtained. The population size estimate is obtained by dividing the number of individuals receiving a service or the number of unique objects distributed (M) by the proportion of individuals in a representative survey who report receipt of the service or object (P). We have developed an approach to sample size calculation, interpreting methods to estimate the variance around estimates obtained using multiplier methods in conjunction with research into design effects and respondent-driven sampling. We describe an application to estimate the number of female sex workers in Harare, Zimbabwe. There is high variance in estimates. Random error around the size estimate reflects uncertainty from M and P, particularly when the estimate of P in the respondent-driven sampling survey is low. As expected, sample size requirements are higher when the design effect of the survey is assumed to be greater. We suggest a method for investigating the effects of sample size on the precision of a population size estimate obtained using multipler methods and respondent-driven sampling. Uncertainty in the size estimate is high, particularly when P is small, so balancing against other potential sources of bias, we advise researchers to consider longer service attendance reference periods and to distribute more unique objects, which is likely to result in a higher estimate of P in the respondent-driven sampling survey. ©Elizabeth Fearon, Sungai T Chabata, Jennifer A Thompson, Frances M Cowan, James R Hargreaves. Originally published in JMIR Public Health and Surveillance (http://publichealth.jmir.org), 14.09.2017.
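    The core arithmetic can be sketched as follows, treating the service count M as fixed and applying a delta-method approximation with an assumed design effect (a simplification of the authors' approach; all inputs are illustrative):

    ```python
    from math import ceil
    from scipy.stats import norm

    def rds_sample_size(p_service, rel_precision, deff=2.0, alpha=0.05):
        """Survey size n so that the multiplier estimate N = M / P has an
        approximate 100(1-alpha)% CI of +/- rel_precision * N, with the
        randomness attributed entirely to P (design effect deff)."""
        z = norm.ppf(1 - alpha / 2)
        return ceil(z**2 * deff * (1 - p_service) / (rel_precision**2 * p_service))

    # A lower P (few survey respondents report the service) demands a far larger n:
    print(rds_sample_size(p_service=0.10, rel_precision=0.20))  # 1729
    print(rds_sample_size(p_service=0.40, rel_precision=0.20))  # 289
    ```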

  15. Groundwater sampling: Chapter 5

    Science.gov (United States)

    Wang, Qingren; Munoz-Carpena, Rafael; Foster, Adam; Migliaccio, Kati W.; Li, Yuncong; Migliaccio, Kati

    2011-01-01

    About the book: As water quality becomes a leading concern for people and ecosystems worldwide, it must be properly assessed in order to protect water resources for current and future generations. Water Quality Concepts, Sampling, and Analyses supplies practical information for planning, conducting, or evaluating water quality monitoring programs. It presents the latest information and methodologies for water quality policy, regulation, monitoring, field measurement, laboratory analysis, and data analysis. The book addresses water quality issues, water quality regulatory development, monitoring and sampling techniques, best management practices, and laboratory methods related to the water quality of surface and ground waters. It also discusses basic concepts of water chemistry and hydrology related to water sampling and analysis; instrumentation; water quality data analysis; and evaluation and reporting results.

  16. Solvent Hold Tank Sample Results for MCU-16-701-702-703: May 2016 Monthly Sample and MCU-16-710-711-712: May 2016 Superwashed Sample

    Energy Technology Data Exchange (ETDEWEB)

    Fondeur, F. F. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Jones, D. H. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2016-08-30

The Savannah River National Laboratory (SRNL) received for analysis one set of Solvent Hold Tank (SHT) samples (MCU-16-701, MCU-16-702 and MCU-16-703), pulled on 05/23/2016, and another set of SHT samples (MCU-16-710, MCU-16-711, and MCU-16-712), pulled on 05/28/2016 after the solvent was superwashed with 300 mM sodium hydroxide. Samples MCU-16-701, MCU-16-702, and MCU-16-703 were combined into one sample (MCU-16-701-702-703), and samples MCU-16-710, MCU-16-711, and MCU-16-712 were combined into one sample (MCU-16-710-711-712). Of the two composite samples, MCU-16-710-711-712 represents the current chemical state of the solvent at MCU. All analytical conclusions are based on the chemical analysis of MCU-16-710-711-712. There were no chemical differences between MCU-16-701-702-703 and the superwashed MCU-16-710-711-712. Analysis of the composite sample MCU-16-710-711-712 indicated the Isopar™L concentration is above its nominal level (102%). The modifier (CS-7SB) is 16% below its nominal concentration, while the TiDG and MaxCalix concentrations are at and above their nominal concentrations, respectively. The TiDG level has begun to decrease, and it is 7% below its nominal level as of May 28, 2016. Based on this current analysis, the levels of TiDG, Isopar™L, MaxCalix, and modifier are sufficient for continuing operation but are expected to decrease with time. Periodic characterization and trimming additions to the solvent are recommended.

  17. Rational Arithmetic Mathematica Functions to Evaluate the Two-Sided One Sample K-S Cumulative Sampling Distribution

    Directory of Open Access Journals (Sweden)

    J. Randall Brown

    2007-06-01

Full Text Available One of the most widely used goodness-of-fit tests is the two-sided one-sample Kolmogorov-Smirnov (K-S) test, which has been implemented by many computer statistical software packages. To calculate a two-sided p value (that is, to evaluate the cumulative sampling distribution), these packages use various methods including recursion formulae, limiting distributions, and approximations of unknown accuracy developed over thirty years ago. Based on an extensive literature search for the two-sided one-sample K-S test, this paper identifies an exact formula for sample sizes up to 31, six recursion formulae, and one matrix formula that can be used to calculate a p value. To ensure accurate calculation by avoiding catastrophic cancellation and eliminating rounding error, each of these formulae is implemented in rational arithmetic. For the six recursion formulae and the matrix formula, computational experience for sample sizes up to 500 shows that computational times are increasing functions of both the sample size and the number of digits in the numerator and denominator integers of the rational number test statistic. The computational times of the seven formulae vary immensely, but the Durbin recursion formula is almost always the fastest. Linear search is used to calculate the inverse of the cumulative sampling distribution (that is, to find the confidence interval half-width), and tables of calculated half-widths are presented for sample sizes up to 500. Using calculated half-widths as input, computational times for the fastest formula, the Durbin recursion formula, are given for sample sizes up to two thousand.
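    As an illustration of evaluating this distribution in exact rational arithmetic, here is a sketch of the Marsaglia-Tsang-Wang matrix formula implemented with Python's Fraction type; it is one matrix approach of the kind the paper catalogs, not its fastest (Durbin) recursion.

    ```python
    from fractions import Fraction
    from math import factorial

    def ks_cdf_exact(n, d):
        """Exact P(D_n < d) for the two-sided one-sample K-S statistic via the
        Marsaglia-Tsang-Wang matrix formula, evaluated in rational arithmetic."""
        d = Fraction(d)
        k = int(n * d) + 1                 # so that h = k - n*d lies in (0, 1]
        h, m = k - n * d, 2 * k - 1
        H = [[Fraction(1) if i - j + 1 >= 0 else Fraction(0) for j in range(m)]
             for i in range(m)]
        for i in range(m):
            H[i][0] -= h ** (i + 1)
            H[m - 1][i] -= h ** (m - i)
        if 2 * h > 1:
            H[m - 1][0] += (2 * h - 1) ** m
        for i in range(m):
            for j in range(m):
                if i - j + 1 > 0:
                    H[i][j] /= factorial(i - j + 1)

        def matmul(A, B):
            return [[sum(A[i][t] * B[t][j] for t in range(m)) for j in range(m)]
                    for i in range(m)]

        P = H
        for _ in range(n - 1):             # H**n by repeated multiplication (small n)
            P = matmul(P, H)
        return P[k - 1][k - 1] * factorial(n) / Fraction(n) ** n

    # Exact two-sided p-value for n = 5 and an observed statistic D = 1/2:
    print(1 - ks_cdf_exact(5, Fraction(1, 2)))
    ```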

  18. Comparing Respondent-Driven Sampling and Targeted Sampling Methods of Recruiting Injection Drug Users in San Francisco

    Science.gov (United States)

    Malekinejad, Mohsen; Vaudrey, Jason; Martinez, Alexis N.; Lorvick, Jennifer; McFarland, Willi; Raymond, H. Fisher

    2010-01-01

The objective of this article is to compare demographic characteristics, risk behaviors, and service utilization among injection drug users (IDUs) recruited from two separate studies in San Francisco in 2005, one of which used targeted sampling (TS) and the other respondent-driven sampling (RDS). IDUs were recruited using TS (n = 651) and RDS (n = 534) and participated in quantitative interviews that included demographic characteristics, risk behaviors, and service utilization. Prevalence estimates and 95% confidence intervals (CIs) were calculated to assess whether there were differences in these variables by sampling method. There was overlap in 95% CIs for all demographic variables except African American race (TS: 45%, 53%; RDS: 29%, 44%). Maps showed that the proportions of IDUs across zip codes were similar for the TS and RDS samples, with the exception of a single zip code that was more represented in the TS sample. This zip code includes an isolated, predominantly African American neighborhood where only the TS study had a field site. Risk behavior estimates were similar for both TS and RDS samples, although self-reported hepatitis C infection was lower in the RDS sample. In terms of service utilization, more IDUs in the RDS sample reported no recent use of drug treatment and syringe exchange program services. Our study suggests that perhaps a hybrid sampling plan is best suited for recruiting IDUs in San Francisco, whereby the more intensive ethnographic and secondary analysis components of TS would aid in the planning of seed placement and field locations for RDS. PMID:20582573

  19. Failure Probability Estimation Using Asymptotic Sampling and Its Dependence upon the Selected Sampling Scheme

    Directory of Open Access Journals (Sweden)

    Martinásková Magdalena

    2017-12-01

Full Text Available The article examines the use of Asymptotic Sampling (AS) for the estimation of failure probability. The AS algorithm requires samples of multidimensional Gaussian random vectors, which may be obtained by many alternative means that influence the performance of the AS method. Several reliability problems (test functions) have been selected in order to test AS with various sampling schemes: (i) Monte Carlo designs; (ii) LHS designs optimized using the Periodic Audze-Eglājs (PAE) criterion; (iii) designs prepared using Sobol' sequences. All results are compared with the exact failure probability value.
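    The three routes to the required Gaussian design points can be sketched with scipy's quasi-Monte Carlo module; note that scipy's Latin Hypercube is a plain (optionally scrambled) design, so the Periodic Audze-Eglājs optimization itself is not reproduced here.

    ```python
    import numpy as np
    from scipy.stats import norm, qmc

    dim, n = 2, 256   # dimension and number of Gaussian vectors (n a power of 2 for Sobol')

    # (i) Plain Monte Carlo
    mc = np.random.default_rng(7).standard_normal((n, dim))

    # (ii) Latin Hypercube design, pushed through the inverse normal CDF
    lhs = norm.ppf(qmc.LatinHypercube(d=dim, seed=7).random(n))

    # (iii) Scrambled Sobol' sequence, likewise transformed
    sobol = norm.ppf(qmc.Sobol(d=dim, scramble=True, seed=7).random(n))

    for name, pts in [("MC", mc), ("LHS", lhs), ("Sobol'", sobol)]:
        print(name, pts.mean(axis=0).round(3))   # low-discrepancy designs sit tighter at 0
    ```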

  20. Sampling, storage and sample preparation procedures for X ray fluorescence analysis of environmental materials

    International Nuclear Information System (INIS)

    1997-06-01

The X ray fluorescence (XRF) method is one of the most commonly used nuclear analytical techniques because of its multielement and non-destructive character, speed, economy and ease of operation. From the point of view of quality assurance practices, sampling and sample preparation procedures are the most crucial steps in all analytical techniques (including X ray fluorescence) applied to the analysis of heterogeneous materials. This technical document covers recent modes of the X ray fluorescence method and recent developments in sample preparation techniques for the analysis of environmental materials. Refs, figs, tabs

  1. Core sampling system spare parts assessment

    International Nuclear Information System (INIS)

    Walter, E.J.

    1995-01-01

    Soon, there will be 4 independent core sampling systems obtaining samples from the underground tanks. It is desirable that these systems be available for sampling during the next 2 years. This assessment was prepared to evaluate the adequacy of the spare parts identified for the core sampling system and to provide recommendations that may remediate overages or inadequacies of spare parts

  2. Strategies for achieving high sequencing accuracy for low diversity samples and avoiding sample bleeding using illumina platform.

    Science.gov (United States)

    Mitra, Abhishek; Skrzypczak, Magdalena; Ginalski, Krzysztof; Rowicka, Maga

    2015-01-01

    Sequencing microRNA, reduced representation sequencing, Hi-C technology and any method requiring the use of in-house barcodes result in sequencing libraries with low initial sequence diversity. Sequencing such data on the Illumina platform typically produces low quality data due to the limitations of the Illumina cluster calling algorithm. Moreover, even in the case of diverse samples, these limitations are causing substantial inaccuracies in multiplexed sample assignment (sample bleeding). Such inaccuracies are unacceptable in clinical applications, and in some other fields (e.g. detection of rare variants). Here, we discuss how both problems with quality of low-diversity samples and sample bleeding are caused by incorrect detection of clusters on the flowcell during initial sequencing cycles. We propose simple software modifications (Long Template Protocol) that overcome this problem. We present experimental results showing that our Long Template Protocol remarkably increases data quality for low diversity samples, as compared with the standard analysis protocol; it also substantially reduces sample bleeding for all samples. For comprehensiveness, we also discuss and compare experimental results from alternative approaches to sequencing low diversity samples. First, we discuss how the low diversity problem, if caused by barcodes, can be avoided altogether at the barcode design stage. Second and third, we present modified guidelines, which are more stringent than the manufacturer's, for mixing low diversity samples with diverse samples and lowering cluster density, which in our experience consistently produces high quality data from low diversity samples. Fourth and fifth, we present rescue strategies that can be applied when sequencing results in low quality data and when there is no more biological material available. In such cases, we propose that the flowcell be re-hybridized and sequenced again using our Long Template Protocol. Alternatively, we discuss how

  3. Strategies for achieving high sequencing accuracy for low diversity samples and avoiding sample bleeding using illumina platform.

    Directory of Open Access Journals (Sweden)

    Abhishek Mitra

    Full Text Available Sequencing microRNA, reduced representation sequencing, Hi-C technology and any method requiring the use of in-house barcodes result in sequencing libraries with low initial sequence diversity. Sequencing such data on the Illumina platform typically produces low quality data due to the limitations of the Illumina cluster calling algorithm. Moreover, even in the case of diverse samples, these limitations are causing substantial inaccuracies in multiplexed sample assignment (sample bleeding). Such inaccuracies are unacceptable in clinical applications, and in some other fields (e.g. detection of rare variants). Here, we discuss how both problems with quality of low-diversity samples and sample bleeding are caused by incorrect detection of clusters on the flowcell during initial sequencing cycles. We propose simple software modifications (Long Template Protocol) that overcome this problem. We present experimental results showing that our Long Template Protocol remarkably increases data quality for low diversity samples, as compared with the standard analysis protocol; it also substantially reduces sample bleeding for all samples. For comprehensiveness, we also discuss and compare experimental results from alternative approaches to sequencing low diversity samples. First, we discuss how the low diversity problem, if caused by barcodes, can be avoided altogether at the barcode design stage. Second and third, we present modified guidelines, which are more stringent than the manufacturer's, for mixing low diversity samples with diverse samples and lowering cluster density, which in our experience consistently produces high quality data from low diversity samples. Fourth and fifth, we present rescue strategies that can be applied when sequencing results in low quality data and when there is no more biological material available. In such cases, we propose that the flowcell be re-hybridized and sequenced again using our Long Template Protocol. Alternatively

  4. Teknik Sampling Snowball dalam Penelitian Lapangan [The Snowball Sampling Technique in Field Research]

    Directory of Open Access Journals (Sweden)

    Nina Nurdiani

    2014-12-01

    Full Text Available Field research can be associated with both qualitative and quantitative research methods, depending on the problems faced and the goals to be achieved. The success of data collection in field research depends on choosing the appropriate sampling technique so as to obtain accurate and reliable data. Studies addressing specific issues may require a non-probability sampling technique, one of which is the snowball sampling technique. This technique is useful for finding, identifying, selecting and sampling within a network or chain of relationships. The procedure is implemented in stages through interviews and questionnaires. Snowball sampling has strengths and weaknesses in its application. Field research in the housing sector serves as the case study illustrating this sampling technique.
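
    The chain-referral logic described above is easy to state in code. The sketch below is a minimal, hypothetical illustration on a toy acquaintance network (it is not drawn from the article): starting from seed respondents, each wave recruits up to k not-yet-sampled contacts named by current members.

    ```python
    import random

    def snowball_sample(network, seeds, waves, k=3, seed=7):
        """Snowball sampling: each wave recruits up to k unseen contacts
        named by members of the previous wave."""
        rng = random.Random(seed)
        sampled = set(seeds)
        frontier = list(seeds)
        for _ in range(waves):
            next_frontier = []
            for person in frontier:
                contacts = [c for c in network.get(person, []) if c not in sampled]
                for recruit in rng.sample(contacts, min(k, len(contacts))):
                    sampled.add(recruit)
                    next_frontier.append(recruit)
            frontier = next_frontier
        return sampled

    # Toy acquaintance network given as adjacency lists.
    network = {
        "A": ["B", "C", "D"], "B": ["A", "E"], "C": ["A", "F", "G"],
        "D": ["A"], "E": ["B", "H"], "F": ["C"], "G": ["C"], "H": ["E"],
    }
    print(sorted(snowball_sample(network, seeds=["A"], waves=2)))
    ```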

  5. Hungry to learn: the prevalence and effects of food insecurity on health behaviors and outcomes over time among a diverse sample of university freshmen.

    Science.gov (United States)

    Bruening, Meg; van Woerden, Irene; Todd, Michael; Laska, Melissa N

    2018-01-18

    To examine longitudinal associations between food insecurity (FI) and health behaviors/outcomes among a diverse sample of university freshmen. Freshman students (n = 1138; 65% female; 49% non-white) participating in the Social impact of Physical Activity and nutRition in College study completed surveys on health behaviors and had height/weight measured up to 4 times (T1-T4) in Arizona during 2015-2016. Structural equation models were estimated to determine if, after adjusting for covariates, FI predicted concurrent behaviors/outcomes and subsequent behaviors/outcomes. Analyses reported here were conducted in 2017. The prevalence of FI was significantly higher at the end of each semester (35% and 36%, respectively) than at the start of the year (28%). Longitudinally, FI was not related to any health behaviors/outcomes at future time points. However, FI was significantly and inversely associated with concurrent breakfast consumption on most days of the week (OR = 0.67, 99% CI = 0.46, 0.99), daily evening meal consumption (OR = 0.55, 99% CI = 0.36, 0.86), healthy eating habits on campus (OR = 0.68, 99% CI = 0.46, 1.00), and healthy physical activity habits on campus (OR = 0.66, 99% CI = 0.44, 1.00), and was positively related to the likelihood of experiencing stress (OR = 1.69, 99% CI = 1.16, 2.46) and depressed mood (OR = 1.98, 99% CI = 1.34, 2.91). Compared with US prevalence rates, the sample FI prevalence was high. FI was related to poorer eating patterns, physical activity behaviors, and mental health, even after adjusting for prior levels of behavior.

  6. Producing standard damaged DNA samples by heating: pitfalls and suggestions.

    Science.gov (United States)

    Fattorini, Paolo; Marrubini, Giorgio; Bonin, Serena; Bertoglio, Barbara; Grignani, Pierangela; Recchia, Elisa; Pitacco, Paola; Procopio, Francesca; Cantoni, Carolina; Pajnič, Irena Zupanič; Sorçaburu-Cigliero, Solange; Previderè, Carlo

    2018-05-15

    Heat-mediated hydrolysis of DNA is a simple and inexpensive method for producing damaged samples in vitro. Although heat-mediated DNA hydrolysis is widely used in forensic and clinical validation procedures, the lack of standardized procedures makes it impossible to compare the intra- and inter-laboratory outcomes of the damaging treatments. In this work, a systematic approach to heat-induced DNA hydrolysis was taken: samples were hydrolyzed at 70 °C for 0-18 h to test the role both of the hydrolysis buffer and of the experimental conditions. Specifically, a trial DNA sample, resuspended in three different media (ultrapure water, 0.1% DEPC-water, and TE), was treated both in Eppendorf tubes ("Protocol P") and in Eppendorf tubes fitted with screwcaps ("Protocol S"). The results of these comparative tests were assessed by normalization of the qPCR results. DEPC-water increased the degradation of the samples up to about 100 times compared to ultrapure water. Conversely, TE protected the DNA from degradation, whose level was about 1700 times lower than in samples treated in ultrapure water. The use of "Protocol S" also affected the level of degradation, consistently increasing it (up to about 180 times in DEPC-water). Thus, this comparative approach showed that even seemingly trivial and often underestimated parameters can modify the degradation level by 2-3 orders of magnitude. The physicochemical reasons for these findings are discussed together with the role of potential factors such as enhanced reactivity of CO2, ROS, NOx and pressure, which are likely to be involved. Since the intra- and inter-laboratory comparison of the outcomes of the hydrolytic procedure is the first step toward its standardization, normalization of the qPCR data by the UV/qPCR ratio seems to be the simplest and most reliable way to allow this. Finally, the supplying (provided with the commercial qPCR kits) of a DNA sample whose degree of

  7. Synchrotron/crystal sample preparation

    Science.gov (United States)

    Johnson, R. Barry

    1993-01-01

    The Center for Applied Optics (CAO) of the University of Alabama in Huntsville (UAH) prepared this final report entitled 'Synchrotron/Crystal Sample Preparation' in completion of contract NAS8-38609, Delivery Order No. 53. Hughes Danbury Optical Systems (HDOS) is manufacturing the Advanced X-ray Astrophysics Facility (AXAF) mirrors. These thin-walled, grazing incidence, Wolter Type-1 mirrors, varying in diameter from 1.2 to 0.68 meters, must be ground and polished using state-of-the-art techniques in order to prevent undue stress due to damage or the presence of crystals and inclusions. The effect of crystals on the polishing and grinding process must also be understood. This involves coating special samples of Zerodur and measuring the reflectivity of the coatings in a synchrotron system. In order to gain the understanding needed on the effect of the Zerodur crystals by the grinding and polishing process, UAH prepared glass samples by cutting, grinding, etching, and polishing as required to meet specifications for witness bars for synchrotron measurements and for investigations of crystals embedded in Zerodur. UAH then characterized these samples for subsurface damage and surface roughness and figure.

  8. Liquid waste sampling device

    International Nuclear Information System (INIS)

    Kosuge, Tadashi

    1998-01-01

    A liquid-pumping pressure regulator is disposed midway along a pressure control tube that connects the upper portion of a sampling pot with the upper portion of a liquid waste storage vessel. With this arrangement, when the pressure in the sampling pot is made negative and liquid wastes are drawn into the liquid pumping tube passing through the sampling pot, the difference between the pressure at the inlet of the liquid-pumping pressure regulator on the pressure regulating tube and the pressure at the bottom of the liquid waste storage vessel is kept constant. An opening-degree controller is provided to control the opening of a pressure regulating valve that supplies actuating pressurized air to the liquid-pumping pressure regulator. Accordingly, even if the level of the liquid wastes in the liquid waste storage vessel changes, the suction height of the liquid wastes in the liquid pumping tube can be kept constant. In this way, sampling can be conducted correctly, and discharge of the liquid wastes to the outside can be prevented. (T.M.)

  9. Optimal sampling strategy for data mining

    International Nuclear Information System (INIS)

    Ghaffar, A.; Shahbaz, M.; Mahmood, W.

    2013-01-01

    Modern technologies such as the Internet, corporate intranets, data warehouses, ERP systems, satellites, digital sensors, embedded systems and mobile networks are all generating such massive amounts of data that it is becoming very difficult to analyze and understand them, even using data mining tools. Huge datasets are becoming a difficult challenge for classification algorithms. With increasing amounts of data, data mining algorithms are getting slower and analysis is getting less interactive. Sampling can be a solution: using a fraction of the computing resources, sampling can often provide the same level of accuracy. The process of sampling requires care, however, because many factors are involved in the determination of the correct sample size. The approach proposed in this paper tries to find a solution to this problem. Based on a statistical formula, after setting some parameters, it returns a sample size, called the 'sufficient sample size', which is then selected through probability sampling. Results indicate the usefulness of this technique in coping with the problem of huge datasets. (author)
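
    The paper's exact formula is not reproduced in this record, so the sketch below uses Cochran's classic sample-size formula with a finite-population correction as a stand-in; the parameter defaults (95% confidence, ±5% margin, worst-case proportion 0.5) are assumptions for illustration.

    ```python
    import math

    def sufficient_sample_size(N, e=0.05, p=0.5, z=1.96):
        """Cochran's formula with finite-population correction:
        n0 = z^2 * p * (1 - p) / e^2,  n = n0 / (1 + (n0 - 1) / N)."""
        n0 = (z ** 2) * p * (1 - p) / (e ** 2)
        return math.ceil(n0 / (1 + (n0 - 1) / N))

    # Rows to sample from datasets of various sizes at 95% conf., +/-5% margin:
    for N in (10_000, 1_000_000, 100_000_000):
        print(N, sufficient_sample_size(N))
    ```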

  10. Controlled sample program publication No. 1: characterization of rock samples

    International Nuclear Information System (INIS)

    Ames, L.L.

    1978-10-01

    A description is presented of the methodology used and the geologic parameters measured on several rocks which are being used in round-robin laboratory and nuclide adsorption methodology experiments. Presently, investigators from various laboratories are determining nuclide distribution coefficients using numerous experimental techniques. Unfortunately, it appears that the resultant data often depend not only on the type of groundwater and rock utilized, but also on the experimenter or method used. The Controlled Sample Program is a WISAP (Waste Isolation Safety Assessment Program) attempt to resolve the apparent method dependencies and to identify individual experimenters' biases. The rock samples characterized in this interlaboratory Kd methodology comparison program include Westerly granite, Argillaceous shale, Oolitic limestone, Sentinel Gap basalt, Conasauga shale, Climax Stock granite, anhydrite, Magenta dolomite and Culebra dolomite. Techniques used in the characterization include whole-rock chemical analysis, X-ray diffraction, optical examination, electron microprobe elemental mapping, and chemical analysis of specific mineral phases. Surface areas were determined by the B.E.T. and ethylene glycol sorption methods. Cation exchange capacities were determined with 85Sr, but were of questionable value for the high-calcium rocks. A quantitative mineralogy was also estimated for each rock. Characteristics which have the potential of strongly affecting radionuclide Kd values, such as the presence of sulfides, water-soluble pH-buffering carbonates, glass, and ferrous iron, were listed for each rock sample

  11. Calculating Confidence, Uncertainty, and Numbers of Samples When Using Statistical Sampling Approaches to Characterize and Clear Contaminated Areas

    Energy Technology Data Exchange (ETDEWEB)

    Piepel, Gregory F.; Matzke, Brett D.; Sego, Landon H.; Amidan, Brett G.

    2013-04-27

    This report discusses the methodology, formulas, and inputs needed to make characterization and clearance decisions for Bacillus anthracis-contaminated and uncontaminated (or decontaminated) areas using a statistical sampling approach. Specifically, the report gives the methods and formulas for calculating (1) the number of samples required to achieve a specified confidence in characterization and clearance decisions, and (2) the confidence in making characterization and clearance decisions for a specified number of samples, for two common statistically based environmental sampling approaches. In particular, the report addresses an issue raised by the Government Accountability Office by providing methods and formulas to calculate the confidence that a decision area is uncontaminated (or successfully decontaminated) if all samples collected according to a statistical sampling approach have negative results. Key to addressing this topic is the probability that an individual sample result is a false negative, commonly referred to as the false negative rate (FNR). The two statistical sampling approaches currently discussed in this report are 1) hotspot sampling to detect small isolated contaminated locations during the characterization phase, and 2) combined judgment and random (CJR) sampling during the clearance phase. Typically, if contamination is widely distributed in a decision area, it will be detectable via judgment sampling during the characterization phase. Hotspot sampling is appropriate for characterization situations where contamination is not widely distributed and may not be detected by judgment sampling. CJR sampling is appropriate during the clearance phase when it is desired to augment judgment samples with statistical (random) samples. The hotspot and CJR statistical sampling approaches are discussed in the report for four situations: 1. qualitative data (detect and non-detect) when the FNR = 0 or when using statistical sampling methods that account
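
    A minimal sketch of the kind of calculation involved, not the report's own formulas: under simple random sampling, the number of samples needed to detect, with confidence C, contamination covering a fraction p of the decision area follows from 1 - (1 - p)^n >= C. A nonzero false negative rate shrinks the effective per-sample hit probability.

    ```python
    import math

    def n_for_detection(confidence, hot_fraction, fnr=0.0):
        """Samples needed so that P(at least one detection) >= confidence,
        assuming each random sample independently hits contamination with
        probability hot_fraction * (1 - fnr)."""
        p_hit = hot_fraction * (1.0 - fnr)
        return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - p_hit))

    print(n_for_detection(0.95, 0.01))        # FNR = 0   -> 299 samples
    print(n_for_detection(0.95, 0.01, 0.10))  # FNR = 10% -> 332 samples
    ```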

  12. Pulsed Direct Current Electrospray: Enabling Systematic Analysis of Small Volume Sample by Boosting Sample Economy.

    Science.gov (United States)

    Wei, Zhenwei; Xiong, Xingchuang; Guo, Chengan; Si, Xingyu; Zhao, Yaoyao; He, Muyi; Yang, Chengdui; Xu, Wei; Tang, Fei; Fang, Xiang; Zhang, Sichun; Zhang, Xinrong

    2015-11-17

    We developed pulsed direct current electrospray ionization mass spectrometry (pulsed-dc-ESI-MS) for systematically profiling and determining the components of small-volume samples. Pulsed-dc-ESI uses a constant high voltage to remotely induce the generation of single-polarity pulsed electrospray. The method significantly boosts sample economy, yielding several minutes of MS signal from a sample of merely picoliter volume. The elongated MS signal duration enables the collection of abundant MS(2) information on components of interest in a small-volume sample for systematic analysis. The method has been successfully applied to single-cell metabolomics, producing 2-D profiles of metabolites (including exact mass and MS(2) data) from single plant and mammalian cells: 1034 components for Allium cepa and 656 components for HeLa cells, respectively. Further identification found 162 compounds and 28 different modification groups of 141 saccharides in a single Allium cepa cell, indicating that pulsed-dc-ESI is a powerful tool for the systematic analysis of small-volume samples.

  13. Reference samples for the earth sciences

    Science.gov (United States)

    Flanagan, F.J.

    1974-01-01

    A revised list of reference samples of interest to geoscientists has been extended to include samples for the agronomist, the archaeologist and the environmentalist. In addition to the source from which standard samples may be obtained, references or pertinent notes for some samples are included. The number of rock reference samples is now almost adequate, and the variety of ore samples will soon be sufficient. There are very few samples for microprobe work. Oil shales will become more important because of the outlook for world petroleum resources. The dryland equivalent of a submarine basalt might be useful in studies of sea-floor spreading and of the geochemistry of basalts. The Na- and K-feldspars of BCS (British Chemical Standards-Bureau of Analysed Samples), NBS (National Bureau of Standards), and ANRT (Association Nationale de la Recherche Technique) could serve as trace-element standards if such data were available. Similarly, the present NBS flint and plastic clays, as well as their predecessors, might be useful for archaeological pottery studies. The International Decade for Ocean Exploration may stimulate the preparation of ocean-water standards for trace elements or pollutants and a standard for manganese nodules. © 1974.

  14. Research-Grade 3D Virtual Astromaterials Samples: Novel Visualization of NASA's Apollo Lunar Samples and Antarctic Meteorite Samples to Benefit Curation, Research, and Education

    Science.gov (United States)

    Blumenfeld, E. H.; Evans, C. A.; Oshel, E. R.; Liddle, D. A.; Beaulieu, K. R.; Zeigler, R. A.; Righter, K.; Hanna, R. D.; Ketcham, R. A.

    2017-01-01

    NASA's vast and growing collections of astromaterials are both scientifically and culturally significant, requiring unique preservation strategies that must be updated regularly to keep pace with contemporary technological capabilities and increasing accessibility demands. New technologies have made it possible to advance documentation and visualization practices that can enhance conservation and curation protocols for NASA's Astromaterials Collections. Our interdisciplinary team has developed a method to create 3D Virtual Astromaterials Samples (VAS) of the existing collections of Apollo Lunar Samples and Antarctic Meteorites. Research-grade 3D VAS will virtually put these samples in the hands of researchers and educators worldwide, increasing the accessibility and visibility of these significant collections. With new sample return missions on the horizon, it is of primary importance to develop advanced curation standards for documentation and visualization methodologies.

  15. Sample Results From Tank 48H Samples HTF-48-14-158, -159, -169, and -170

    Energy Technology Data Exchange (ETDEWEB)

    Peters, T. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Hang, T. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2015-04-28

    Savannah River National Laboratory (SRNL) analyzed samples from Tank 48H in support of determining the cause of the unusually high dose rates at the sampling points for this tank. A set of two samples was taken from the quiescent tank, and two additional samples were taken after the contents of the tank were mixed. The results of the analyses of all the samples show that the contents of the tank have changed very little since the analysis of the previous sample in 2012. The solids are almost exclusively composed of tetraphenylborate (TPB) salts, and there is no indication of an acceleration in TPB decomposition. The filtrate composition shows a moderate increase in salt concentration and density, which is attributable to the addition of NaOH for the purposes of corrosion control. An older modeling simulation of the TPB degradation was updated, and the supernate results from a 2012 sample were run in the model. This result was compared to the results from the recent 2014 samples reported in this document; the model indicates there is no change in the TPB degradation from 2012 to 2014. SRNL measured the buoyancy of the TPB solids in Tank 48H simulant solutions. It was determined that a solution of density 1.279 g/mL (~6.5M sodium) was capable of indefinitely suspending the TPB solids evenly throughout the solution, whereas a solution of density 1.296 g/mL (~7M sodium) caused a significant fraction of the solids to float on the solution surface. As the experiments could not include the effect of additional buoyancy elements such as benzene or hydrogen generation, the buoyancy measurements provide an upper-bound estimate of the density in Tank 48H required to float the solids.

  16. Gaseous radiocarbon measurements of small samples

    International Nuclear Information System (INIS)

    Ruff, M.; Szidat, S.; Gaeggeler, H.W.; Suter, M.; Synal, H.-A.; Wacker, L.

    2010-01-01

    Radiocarbon dating by means of accelerator mass spectrometry (AMS) is a well-established method for samples containing carbon in the milligram range. However, the measurement of small samples containing less than 50 μg carbon often fails. It is difficult to graphitise these samples, and the preparation is prone to contamination. To avoid graphitisation, a solution can be the direct measurement of carbon dioxide. The MICADAS, the smallest accelerator for radiocarbon dating in Zurich, is equipped with a hybrid Cs sputter ion source. It allows the measurement of both graphite targets and gaseous CO2 samples without any rebuilding. This work presents experience in dealing with small samples containing 1-40 μg carbon. So far, 500 unknown samples from different environmental research fields have been measured, most of them with the gas ion source. These data are compared with earlier measurements of small graphite samples. The performance of the two techniques is discussed and the main contributions to the blank are determined. An analysis of blank and standard data measured over several years allowed a quantification of the contamination, which was found to be of the order of 55 ng and 750 ng carbon (50 pMC) for the gaseous and the graphite samples, respectively. For quality control, a number of certified standards were measured using the gas ion source to demonstrate the reliability of the data.
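
    For context, small-sample results like these are usually corrected with a constant-contamination mass balance. The sketch below illustrates that standard correction; it is not the authors' procedure, and the default blank mass and activity are taken loosely from the gas-source figures quoted above (about 55 ng C at 50 pMC, i.e. F14C = 0.5).

    ```python
    def blank_corrected_f14c(f_measured, m_sample_ug, m_blank_ug=0.055, f_blank=0.5):
        """Constant-contamination correction (mass balance):
        F_meas * (m_s + m_b) = F_s * m_s + F_b * m_b, solved for F_s."""
        m_total = m_sample_ug + m_blank_ug
        return (f_measured * m_total - f_blank * m_blank_ug) / m_sample_ug

    # A 10 ug C gas sample measured at F14C = 0.600:
    print(round(blank_corrected_f14c(0.600, 10.0), 4))  # -> 0.6006
    ```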

  17. Sampling pig farms at the abattoir in a cross-sectional study − Evaluation of a sampling method

    DEFF Research Database (Denmark)

    Birkegård, Anna Camilla; Hisham Beshara Halasa, Tariq; Toft, Nils

    2017-01-01

    A cross-sectional study design is relatively inexpensive, fast and easy to conduct when compared to other study designs. Careful planning is essential to obtaining a representative sample of the population, and the recommended approach is to use simple random sampling from an exhaustive list of units in the target population. This approach is rarely feasible in practice, and other sampling procedures must often be adopted. For example, when slaughter pigs are the target population, sampling the pigs on the slaughter line may be an alternative to on-site sampling at a list of farms. However, … slaughter pigs. It was not possible to visit farms within the designated timeframe. Therefore, it was decided to use convenience sampling at the abattoir. Our approach was carried out in three steps: 1) planning: using data from meat inspection to plan at which abattoirs and how many farms to sample; 2) …

  18. Public Use Microdata Samples (PUMS)

    Data.gov (United States)

    National Aeronautics and Space Administration — Public Use Microdata Samples (PUMS) are computer-accessible files containing records for a sample of housing units, with information on the characteristics of each...

  19. Vapor and gas sampling of single-shell tank 241-B-102 using the in situ vapor sampling system

    International Nuclear Information System (INIS)

    Lockrem, L.L.

    1997-01-01

    The Vapor Issue Resolution Program tasked the Vapor Team (the team) to collect representative headspace samples from Hanford Site single-shell tank (SST) 241-B-102. This document presents sampling data resulting from the April 18, 1996 sampling of SST 241-B-102. Analytical results will be presented in a separate report issued by Pacific Northwest National Laboratory (PNNL), which supplied and analyzed the sampling media. The team, consisting of Sampling and Mobile Laboratories (SML) and Special Analytical Studies (SAS) personnel, used the vapor sampling system (VSS) to collect representative samples of the air, gases, and vapors from the headspace of SST 241-B-102 with sorbent traps and SUMMA canisters

  20. Bayesian sample size determination for cost-effectiveness studies with censored data.

    Directory of Open Access Journals (Sweden)

    Daniel P Beavers

    Full Text Available Cost-effectiveness models are commonly utilized to determine the combined clinical and economic impact of one treatment compared to another. However, most methods for sample size determination of cost-effectiveness studies assume fully observed costs and effectiveness outcomes, which presents challenges for survival-based studies in which censoring exists. We propose a Bayesian method for the design and analysis of cost-effectiveness data in which costs and effectiveness may be censored, and the sample size is approximated for both power and assurance. We explore two parametric models and demonstrate the flexibility of the approach to accommodate a variety of modifications to study assumptions.
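
    The assurance calculation mentioned above can be sketched by simulation. The example below is a deliberately simplified illustration with uncensored normal outcomes, not the authors' censored cost-effectiveness model: assurance is the power of a two-sided z-test averaged over a normal prior on the true effect, and all parameter values are assumptions.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)

    def assurance(n_per_arm, prior_mean=0.5, prior_sd=0.2, sigma=1.0,
                  alpha=0.05, n_sim=20_000):
        """Bayesian assurance for a two-arm comparison of means: the power
        of a two-sided z-test averaged over a normal prior on the effect."""
        delta = rng.normal(prior_mean, prior_sd, n_sim)  # effects drawn from prior
        se = sigma * np.sqrt(2.0 / n_per_arm)
        z_crit = stats.norm.ppf(1 - alpha / 2)
        power = stats.norm.sf(z_crit - delta / se) + stats.norm.cdf(-z_crit - delta / se)
        return float(power.mean())

    for n in (30, 60, 120):
        print(n, round(assurance(n), 3))
    ```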

  1. Visual Sample Plan (VSP) - FIELDS Integration

    Energy Technology Data Exchange (ETDEWEB)

    Pulsipher, Brent A.; Wilson, John E.; Gilbert, Richard O.; Hassig, Nancy L.; Carlson, Deborah K.; Bing-Canar, John; Cooper, Brian; Roth, Chuck

    2003-04-19

    Two software packages, VSP 2.1 and FIELDS 3.5, are being used by environmental scientists to plan the number and type of samples required to meet project objectives, display those samples on maps, query a database of past sample results, produce spatial models of the data, and analyze the data in order to arrive at defensible decisions. VSP 2.0 is an interactive tool to calculate optimal sample size and optimal sample location based on user goals, risk tolerance, and variability in the environment and in lab methods. FIELDS 3.0 is a set of tools to explore the sample results in a variety of ways to make defensible decisions with quantified levels of risk and uncertainty. However, FIELDS 3.0 has only a small sample design module. VSP 2.0, on the other hand, has over 20 sampling goals, allowing the user to input site-specific assumptions such as non-normality of sample results and separate variability between field and laboratory measurements, to make two-sample comparisons, perform confidence interval estimation, use sequential search sampling methods, and much more. Over 1,000 copies of VSP are in use today. FIELDS is used in nine of the ten U.S. EPA regions, by state regulatory agencies, and most recently by several international countries. Both software packages have been peer-reviewed, enjoy broad usage, and have been accepted by regulatory agencies as well as site project managers as key tools to help collect data and make environmental cleanup decisions. Recently, the two software packages were integrated, allowing the user to take advantage of the many design options of VSP and the analysis and modeling options of FIELDS. The transition between the two is simple for the user: VSP can be called from within FIELDS, automatically passing a map to VSP and automatically retrieving sample locations and design information when the user returns to FIELDS. This paper will describe the integration, give a demonstration of the integrated package, and give users download

  2. Solvent hold tank sample results for MCU-16-1363-1365. November 2016 monthly sample

    Energy Technology Data Exchange (ETDEWEB)

    Fondeur, F. F. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Jones, D. H. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2017-03-22

    Savannah River National Laboratory (SRNL) received one set of three Solvent Hold Tank (SHT) samples (MCU-16-1363-1364-1365), pulled on 11/15/2016, for analysis. The samples were combined and analyzed for composition. Analysis of the composite sample MCU-16-1363-1364-1365 indicated that the Isopar™L concentration is at its nominal level (100%). The extractant (MaxCalix) and the modifier (CS-7SB) are 8% and 2% below their nominal concentrations, and the suppressor (TiDG) is 7% below its nominal concentration. This analysis confirms the trim and Isopar™ additions to the solvent in November and indicates that the solvent did not require further additions. Based on the current monthly sample, the levels of TiDG, Isopar™L, MaxCalix, and modifier are sufficient for continued operation but are expected to decrease with time. Periodic characterization and trimming additions to the solvent are recommended.

  3. Electrokinetic sample preconcentration and hydrodynamic sample injection for microchip electrophoresis using a pneumatic microvalve.

    Science.gov (United States)

    Cong, Yongzheng; Katipamula, Shanta; Geng, Tao; Prost, Spencer A; Tang, Keqi; Kelly, Ryan T

    2016-02-01

    A microfluidic platform was developed to perform online electrokinetic sample preconcentration and rapid hydrodynamic sample injection for zone electrophoresis using a single microvalve. The polydimethylsiloxane microchip comprises a separation channel, a side channel for sample introduction, and a control channel which is used as a pneumatic microvalve aligned at the intersection of the two flow channels. The closed microvalve, created by multilayer soft lithography, serves as a nanochannel preconcentrator under an applied electric potential, enabling current to pass through while preventing bulk flow. Once analytes are concentrated, the valve is briefly opened and the stacked sample is pressure injected into the separation channel for electrophoretic separation. Fluorescently labeled peptides were enriched by a factor of ∼450 in 230 s. This method enables both rapid analyte concentration and controlled injection volume for high sensitivity, high-resolution CE. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  4. Verification of Representative Sampling in RI waste

    International Nuclear Information System (INIS)

    Ahn, Hong Joo; Song, Byung Cheul; Sohn, Se Cheul; Song, Kyu Seok; Jee, Kwang Yong; Choi, Kwang Seop

    2009-01-01

    For evaluating the radionuclide inventories of RI wastes, representative sampling is one of the most important parts of the radiochemical assay process. Sampling to characterize RI waste conditions has typically been based on judgment or convenience sampling of individual drums or groups. However, it is difficult to obtain a representative sample from among the numerous drums. In addition, RI waste drums may be classified as heterogeneous wastes because they contain cotton, glass, vinyl, gloves, etc. In order to obtain representative samples, the material to be analyzed must be collected from every selected drum. Considering the expense and time of analysis, however, the number of samples has to be minimized. In this study, RI waste drums were classified by various conditions: half-life, surface dose, acceptance date, waste form, generator, etc. A sample for radiochemical assay was obtained by mixing samples from each drum. The sample then has to be prepared for radiochemical assay; although the sample should be reasonably uniform, it is rare that a completely homogeneous material is received. Every sample is shredded to pieces of about 1-2 cm² and a representative aliquot taken for the required analysis. For verification of representative sampling, every classified group was tested to evaluate 'selection of a representative drum in a group' and 'representative sampling in a drum'

  5. Use of Sequenom sample ID Plus® SNP genotyping in identification of FFPE tumor samples.

    Directory of Open Access Journals (Sweden)

    Jessica K Miller

    Full Text Available Short tandem repeat (STR) analysis, such as the AmpFlSTR® Identifiler® Plus kit, is a standard, PCR-based human genotyping method used in the field of forensics. Misidentification of cell line and tissue DNA can be costly if not detected early; therefore it is necessary to have quality control measures such as STR profiling in place. A major issue in large-scale research studies involving archival formalin-fixed paraffin embedded (FFPE) tissues is that varying levels of DNA degradation can result in failure to correctly identify samples using STR genotyping. PCR amplification of STRs of several hundred base pairs is not always possible when DNA is degraded. The Sample ID Plus® panel from Sequenom allows for human DNA identification and authentication using SNP genotyping. In comparison to lengthy STR amplicons, this multiplexing PCR assay requires amplification of only 76-139 base pairs, and utilizes 47 SNPs to discriminate between individual samples. In this study, we evaluated both STR and SNP genotyping methods of sample identification, with a focus on paired FFPE tumor/normal DNA samples intended for next-generation sequencing (NGS). The ability to successfully validate the identity of FFPE samples can enable cost savings by reducing rework.

  6. Automated Sampling and Extraction of Krypton from Small Air Samples for Kr-85 Measurement Using Atom Trap Trace Analysis

    International Nuclear Information System (INIS)

    Hebel, S.; Hands, J.; Goering, F.; Kirchner, G.; Purtschert, R.

    2015-01-01

    Atom-Trap-Trace-Analysis (ATTA) provides the capability of measuring the Krypton-85 concentration in microlitre amounts of krypton extracted from air samples of about 1 litre. This sample size is sufficiently small to allow for a range of applications, including on-site spot sampling and continuous sampling over periods of several hours. All samples can be easily handled and transported to an off-site laboratory for ATTA measurement, or stored and analyzed on demand. Bayesian sampling methodologies can be applied by blending samples for bulk measurement and performing in-depth analysis as required. A prerequisite for measurement is the extraction of a pure krypton fraction from the sample. This paper introduces an extraction unit able to isolate the krypton in small ambient air samples quickly, efficiently and in a fully automated manner using a combination of cryogenic distillation and gas chromatography. Air samples are collected using an automated smart sampler developed in-house to achieve a constant sampling rate over adjustable time periods ranging from 5 minutes to 3 hours per sample. The smart sampler can be deployed in the field and operate on battery power for one week to take up to 60 air samples. This high flexibility of sampling and the fast, robust sample preparation are a valuable tool for research and for the application of Kr-85 measurements to novel Safeguards procedures. (author)

  7. Adaptive sampling of AEM transients

    Science.gov (United States)

    Di Massa, Domenico; Florio, Giovanni; Viezzoli, Andrea

    2016-02-01

    This paper focuses on the sampling of the electromagnetic transient as acquired by airborne time-domain electromagnetic (TDEM) systems. Typically, the sampling of the electromagnetic transient is done using a fixed number of gates whose width grows logarithmically (log-gating). The log-gating has two main benefits: improving the signal to noise (S/N) ratio at late times, when the electromagnetic signal has amplitudes equal or lower than the natural background noise, and ensuring a good resolution at the early times. However, as a result of fixed time gates, the conventional log-gating does not consider any geological variations in the surveyed area, nor the possibly varying characteristics of the measured signal. We show, using synthetic models, how a different, flexible sampling scheme can increase the resolution of resistivity models. We propose a new sampling method, which adapts the gating on the base of the slope variations in the electromagnetic (EM) transient. The use of such an alternative sampling scheme aims to get more accurate inverse models by extracting the geoelectrical information from the measured data in an optimal way.
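
    To make the conventional scheme concrete, the sketch below generates log-gates: gate edges equally spaced in log time, so widths grow toward late times where stacking is needed to beat the noise. The time range and gate count are assumed values for illustration; the adaptive scheme proposed in the paper would instead place gates according to slope variations in the transient.

    ```python
    import numpy as np

    def log_gates(t_start, t_end, n_gates):
        """Conventional log-gating: gate edges equally spaced in log10(time)."""
        edges = np.logspace(np.log10(t_start), np.log10(t_end), n_gates + 1)
        centers = np.sqrt(edges[:-1] * edges[1:])  # geometric mean of each gate
        widths = np.diff(edges)
        return edges, centers, widths

    # Assumed values: 10 us to 10 ms transient, 25 gates.
    edges, centers, widths = log_gates(1e-5, 1e-2, 25)
    print(widths[:2], widths[-2:])  # early gates narrow, late gates wide
    ```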

  8. ALARA ASSESSMENT OF SETTLER SLUDGE SAMPLING METHODS

    International Nuclear Information System (INIS)

    Nelsen, L.A.

    2009-01-01

    The purpose of this assessment is to compare underwater and above-water settler sludge sampling methods to determine whether the added cost of underwater sampling, for the sole purpose of worker dose reduction, is justified. Initial planning for sludge sampling included container, settler and knock-out-pot (KOP) sampling. Due to the significantly higher dose consequence of KOP sludge, a decision was made to sample KOP sludge underwater to achieve worker dose reductions, and initial plans were to utilize the underwater sampling apparatus for settler sludge as well. Since there are no longer plans to sample KOP sludge, the decision to sample settler sludge underwater needs to be revisited. The present sampling plan calls for spending an estimated $2,500,000 to design and construct a new underwater sampling system (per A21 C-PL-001 RevOE). This evaluation will compare and contrast the present method of above-water sampling with the underwater method planned by the Sludge Treatment Project (STP) and determine whether settler samples can be taken using the existing sampling cart (with potentially minor modifications) while maintaining worker doses As Low As Reasonably Achievable (ALARA), eliminating the need for costly redesigns, testing and personnel retraining

  9. ALARA ASSESSMENT OF SETTLER SLUDGE SAMPLING METHODS

    Energy Technology Data Exchange (ETDEWEB)

    NELSEN LA

    2009-01-30

    The purpose of this assessment is to compare underwater and above-water settler sludge sampling methods to determine whether the added cost of underwater sampling, for the sole purpose of worker dose reduction, is justified. Initial planning for sludge sampling included container, settler and knock-out-pot (KOP) sampling. Due to the significantly higher dose consequence of KOP sludge, a decision was made to sample KOP sludge underwater to achieve worker dose reductions, and initial plans were to utilize the underwater sampling apparatus for settler sludge as well. Since there are no longer plans to sample KOP sludge, the decision to sample settler sludge underwater needs to be revisited. The present sampling plan calls for spending an estimated $2,500,000 to design and construct a new underwater sampling system (per A21 C-PL-001 RevOE). This evaluation will compare and contrast the present method of above-water sampling with the underwater method planned by the Sludge Treatment Project (STP) and determine whether settler samples can be taken using the existing sampling cart (with potentially minor modifications) while maintaining worker doses As Low As Reasonably Achievable (ALARA), eliminating the need for costly redesigns, testing and personnel retraining.

  10. Statistical distribution sampling

    Science.gov (United States)

    Johnson, E. S.

    1975-01-01

    The determination of the distribution of statistics by sampling was investigated. Characteristic functions, the quadratic regression problem, and the differential equations for the characteristic functions are analyzed.

  11. On-line Automated Sample Preparation-Capillary Gas Chromatography for the Analysis of Plasma Samples.

    NARCIS (Netherlands)

    Louter, A.J.H.; van der Wagt, R.A.C.A.; Brinkman, U.A.T.

    1995-01-01

    An automated sample preparation module (the automated sample preparation with extraction columns, ASPEC) was interfaced with a capillary gas chromatograph (GC) by means of an on-column interface. The system was optimised for the determination of the antidepressant trazodone in plasma. The clean-up

  12. Equilibrium Molecular Thermodynamics from Kirkwood Sampling

    OpenAIRE

    Somani, Sandeep; Okamoto, Yuko; Ballard, Andrew J.; Wales, David J.

    2015-01-01

    We present two methods for barrierless equilibrium sampling of molecular systems based on the recently proposed Kirkwood method (J. Chem. Phys. 2009, 130, 134102). Kirkwood sampling employs low-order correlations among internal coordinates of a molecule for random (or non-Markovian) sampling of the high dimensional conformational space. This is a geometrical sampling method independent of the potential energy surface. The first method is a variant of biased Monte Carlo, wher...

  13. Sampling marine sediments for radionuclide monitoring

    International Nuclear Information System (INIS)

    Papucci, C.

    1997-01-01

    The most common devices used for sampling marine sediments are described. The systems are compared to show their intrinsic usefulness for collecting samples under different environmental conditions or with different scientific objectives. Perturbations and artifacts introduced during the various steps of the sampling procedure are also reviewed, and suggestions are proposed for obtaining and preserving, as far as possible, the representativeness of the sediment samples. (author)

  14. Sample preparation and EFTEM of Meat Samples for Nanoparticle Analysis in Food

    International Nuclear Information System (INIS)

    Lari, L; Dudkiewicz, A

    2014-01-01

    Nanoparticles are used in industry for personal care products and in the preparation of food. In the latter application, their functions include preventing microbial growth and increasing the food's nutritional value and sensory quality. EU regulations require a risk assessment of the nanoparticles used in foods and food contact materials before the products can reach the market. However, the limited availability of validated analytical methodologies for the detection and characterisation of nanoparticles in food hampers appropriate risk assessment. As part of research on the evaluation of methods for screening and quantification of Ag nanoparticles in meat, we have tested a new TEM sample preparation method as an alternative to resin embedding and cryo-sectioning. Energy filtered TEM analysis was applied to evaluate the thickness and uniformity of thin meat layers acquired at increasing sample input, demonstrating that the protocols used ensured good stability under the electron beam, reliable sample concentration and reproducibility

  15. Sample preparation and EFTEM of Meat Samples for Nanoparticle Analysis in Food

    Science.gov (United States)

    Lari, L.; Dudkiewicz, A.

    2014-06-01

    Nanoparticles are used in industry for personal care products and in the preparation of food. In the latter application, their functions include preventing microbial growth and increasing the food's nutritional value and sensory quality. EU regulations require a risk assessment of the nanoparticles used in foods and food contact materials before the products can reach the market. However, the limited availability of validated analytical methodologies for the detection and characterisation of nanoparticles in food hampers appropriate risk assessment. As part of research on the evaluation of methods for screening and quantification of Ag nanoparticles in meat, we have tested a new TEM sample preparation method as an alternative to resin embedding and cryo-sectioning. Energy filtered TEM analysis was applied to evaluate the thickness and uniformity of thin meat layers acquired at increasing sample input, demonstrating that the protocols used ensured good stability under the electron beam, reliable sample concentration and reproducibility.

  16. 14CO2 analysis of soil gas: Evaluation of sample size limits and sampling devices

    Science.gov (United States)

    Wotte, Anja; Wischhöfer, Philipp; Wacker, Lukas; Rethemeyer, Janet

    2017-12-01

    Radiocarbon (14C) analysis of CO2 respired from soils or sediments is a valuable tool to identify different carbon sources. The collection and processing of the CO2, however, is challenging and prone to contamination. We thus continuously improve our handling procedures and present a refined method for the collection of even small amounts of CO2 in molecular sieve cartridges (MSCs) for accelerator mass spectrometry 14C analysis. Using a modified vacuum rig and an improved desorption procedure, we were able to increase the CO2 recovery from the MSC (95%) as well as the sample throughput compared to our previous study. By processing series of different sample sizes, we show that our MSCs can be used for CO2 samples as small as 50 μg C. The contamination by exogenous carbon determined in these laboratory tests was less than 2.0 μg C from fossil and less than 3.0 μg C from modern sources. Additionally, we tested two sampling devices for the collection of CO2 samples released from soils or sediments, including a respiration chamber and a depth sampler, which are connected to the MSC. We obtained a very promising, low process blank for the entire CO2 sampling and purification procedure of ∼0.004 F14C (equal to 44,000 yrs BP) and ∼0.003 F14C (equal to 47,000 yrs BP). In contrast to previous studies, we observed no isotopic fractionation towards lighter δ13C values during the passive sampling with the depth samplers.
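
    The blank values quoted above convert to apparent ages via the standard relation between F14C and conventional radiocarbon age (Libby mean life, 8033 yr); the short sketch below reproduces the ~44,000 and ~47,000 yr BP figures.

    ```python
    import math

    def f14c_to_age_bp(f14c):
        """Conventional radiocarbon age: age = -8033 * ln(F14C)."""
        return -8033.0 * math.log(f14c)

    print(round(f14c_to_age_bp(0.004)))  # ~44,350 yr BP (F14C = 0.004 blank)
    print(round(f14c_to_age_bp(0.003)))  # ~46,660 yr BP (F14C = 0.003 blank)
    ```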

  17. Effect of sampling site, repeated sampling, pH, and PCO2 on plasma lactate concentration in healthy dogs.

    Science.gov (United States)

    Hughes, D; Rozanski, E R; Shofer, F S; Laster, L L; Drobatz, K J

    1999-04-01

    To characterize the variation in plasma lactate concentration among samples from commonly used blood sampling sites in conscious, healthy dogs. 60 healthy dogs. Cross-sectional study using a replicated Latin square design. Each dog was assigned to 1 of 6 groups (n = 10) representing all possible orders for 3 sites (cephalic vein, jugular vein, and femoral artery) used to obtain blood. Samples were analyzed immediately, by use of direct amperometry, for pH, PO2, PCO2, glucose, and lactate concentration. Significant differences in plasma lactate concentrations were detected among blood samples from the cephalic vein (highest), femoral artery, and jugular vein (lowest). Mean plasma lactate concentration in the first sample obtained, irrespective of sampling site, was lower than in subsequent samples. Covariation was identified among plasma lactate concentration, pH, and PCO2, but correlation coefficients were low. Plasma lactate concentrations differed among blood samples from various sites. A reference range for plasma lactate concentration was 0.3 to 2.5 mmol/L. Differences in plasma lactate concentrations among samples from various sites, and with repeated sampling, in healthy dogs are small. Use of the reference range may facilitate the clinical use of plasma lactate concentration in dogs.

  18. Sample requirements and design of an inter-laboratory trial for radiocarbon laboratories

    International Nuclear Information System (INIS)

    Bryant, Charlotte; Carmi, Israel; Cook, Gordon; Gulliksen, Steinar; Harkness, Doug; Heinemeier, Jan; McGee, Edward; Naysmith, Philip; Possnert, Goran; Scott, Marian; Plicht, Hans van der; Strydonck, Mark van

    2000-01-01

    An on-going inter-comparison programme is described that focuses on assessing and establishing consensus protocols to be applied in the identification, selection and sub-sampling of materials for subsequent 14C analysis. The outcome of the programme will provide a detailed quantification of the uncertainties associated with 14C measurements, including the issues of accuracy and precision. Such projects have become recognised as a fundamental aspect of continuing laboratory quality assurance schemes, providing a mechanism for the harmonisation of measurements and for demonstrating the traceability of results. The design of this study and its rationale are described. In summary, a suite of core samples has been defined which will be made available to both AMS and radiometric laboratories. These core materials are representative of routinely dated material and their ages span the full range of the applied 14C time-scale. Two of the samples are of wood from the German and Irish dendrochronologies, thus providing a direct connection to the master dendrochronological calibration curve. Further samples link this new inter-comparison to past studies. Sample size and precision have been identified as being of paramount importance in defining dating confidence, and so several core samples have been identified for more in-depth study of these practical issues. In addition to the core samples, optional samples have been identified and prepared specifically for either AMS and/or radiometric laboratories. For AMS laboratories, these include bone, textile, leather and parchment samples. Participation in the study requires a commitment to a minimum of 10 core analyses, with results to be returned within a year

  19. PhD dissertation on: Effects of an expressive writing intervention in a nationwide sample of breast cancer patients

    DEFF Research Database (Denmark)

    Jensen-Johansen, Mikael Birkelund

    2010-01-01

    … at baseline with respect to the prevalence of somatic symptoms, depression, distress, and sociodemographic patterns, and to compare the present sample with a large Danish cohort of 3500 Danish women treated for breast cancer. Objective 4 – To investigate the impact of EWI on the outcomes of cancer-related … is a literature review and meta-analysis of studies of expressive writing intervention (EWI) focusing on health outcomes in samples of cancer patients. A list of randomized clinical trials (RCT) of EWI with cancer patients published before December 2009 was compiled using relevant search engines and previously … and cancer patients. EWI had been used in 10 of the studies (n=661), with the number of participants ranging from 30 to 234 and an average sample size of 82.6. Across all available studies, EWI did not show the hypothesized therapeutic effect on either psychological or physical outcomes. However, when including …

  20. Methodological integrative review of the work sampling technique used in nursing workload research.

    Science.gov (United States)

    Blay, Nicole; Duffield, Christine M; Gallagher, Robyn; Roche, Michael

    2014-11-01

    To critically review the work sampling technique used in nursing workload research. Work sampling is a technique frequently used by researchers and managers to explore and measure nursing activities. However, work sampling methods used are diverse making comparisons of results between studies difficult. Methodological integrative review. Four electronic databases were systematically searched for peer-reviewed articles published between 2002-2012. Manual scanning of reference lists and Rich Site Summary feeds from contemporary nursing journals were other sources of data. Articles published in the English language between 2002-2012 reporting on research which used work sampling to examine nursing workload. Eighteen articles were reviewed. The review identified that the work sampling technique lacks a standardized approach, which may have an impact on the sharing or comparison of results. Specific areas needing a shared understanding included the training of observers and subjects who self-report, standardization of the techniques used to assess observer inter-rater reliability, sampling methods and reporting of outcomes. Work sampling is a technique that can be used to explore the many facets of nursing work. Standardized reporting measures would enable greater comparison between studies and contribute to knowledge more effectively. Author suggestions for the reporting of results may act as guidelines for researchers considering work sampling as a research method. © 2014 John Wiley & Sons Ltd.
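
    The review stops short of prescribing formulas, but a standardized approach would typically include the classic work-sampling size calculation sketched below; the 30% activity proportion and ±3% absolute precision are assumed values for illustration, not figures from the review.

    ```python
    import math

    def required_observations(p_expected, e_abs, z=1.96):
        # Classic work-sampling size: n = z^2 * p * (1 - p) / e^2
        return math.ceil(z ** 2 * p_expected * (1 - p_expected) / e_abs ** 2)

    def achieved_precision(k_observed, n, z=1.96):
        # Absolute precision of the estimated activity proportion after n observations
        p = k_observed / n
        return p, z * math.sqrt(p * (1 - p) / n)

    print(required_observations(0.30, 0.03))  # ~897 observations needed
    p, e = achieved_precision(269, 897)
    print(round(p, 3), round(e, 3))           # 0.30 +/- ~0.03
    ```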

  1. Quantitating morphological changes in biological samples during scanning electron microscopy sample preparation with correlative super-resolution microscopy.

    Science.gov (United States)

    Zhang, Ying; Huang, Tao; Jorgens, Danielle M; Nickerson, Andrew; Lin, Li-Jung; Pelz, Joshua; Gray, Joe W; López, Claudia S; Nan, Xiaolin

    2017-01-01

    Sample preparation is critical to biological electron microscopy (EM), and there have been continuous efforts to optimize the procedures so as to best preserve structures of interest in the sample. However, a quantitative characterization of the morphological changes associated with each step in EM sample preparation is currently lacking. Using correlative EM and super-resolution microscopy (SRM), we have examined the effects of different drying methods as well as osmium tetroxide (OsO4) post-fixation on cell morphology during scanning electron microscopy (SEM) sample preparation. Here, SRM images of the sample acquired under hydrated conditions were used as a baseline for evaluating morphological changes as the sample went through SEM sample processing. We found that both chemical drying and critical point drying lead to a mild cellular boundary retraction of ~60 nm. Post-fixation with OsO4 causes at least 40 nm of additional boundary retraction. We also found that coating coverslips with adhesion molecules such as fibronectin prior to cell plating helps reduce cell distortion from OsO4 post-fixation. These quantitative measurements offer useful information for identifying causes of cell distortion in SEM sample preparation and improving current procedures.

  2. Solid phase microextraction headspace sampling of chemical warfare agent contaminated samples : method development for GC-MS analysis

    Energy Technology Data Exchange (ETDEWEB)

    Jackson Lepage, C.R.; Hancock, J.R. [Defence Research and Development Canada, Medicine Hat, AB (Canada); Wyatt, H.D.M. [Regina Univ., SK (Canada)

    2004-07-01

    Defence R and D Canada-Suffield (DRDC-Suffield) is responsible for analyzing samples that are suspected to contain chemical warfare agents, either collected by the Canadian Forces or by first-responders in the event of a terrorist attack in Canada. The analytical techniques used to identify the composition of the samples include gas chromatography-mass spectrometry (GC-MS), liquid chromatography-mass spectrometry (LC-MS), Fourier-transform infrared spectroscopy (FT-IR) and nuclear magnetic resonance spectroscopy. GC-MS and LC-MS generally require solvent extraction and reconcentration, thereby increasing sample handling. The authors examined analytical techniques which reduce or eliminate sample manipulation. In particular, this paper presented a screening method based on solid phase microextraction (SPME) headspace sampling and GC-MS analysis for chemical warfare agents such as mustard, sarin, soman, and cyclohexyl methylphosphonofluoridate in contaminated soil samples. SPME is a method which uses small adsorbent polymer coated silica fibers that trap vaporous or liquid analytes for GC or LC analysis. Collection efficiency can be increased by adjusting sampling time and temperature. This method was tested on two real-world samples, one from excavated chemical munitions and the second from a caustic decontamination mixture. 7 refs., 2 tabs., 3 figs.

  3. Sampling artifacts in measurement of elemental and organic carbon: Low-volume sampling in indoor and outdoor environments

    Science.gov (United States)

    Olson, David A.; Norris, Gary A.

    Experiments were completed to determine the extent of artifacts from sampling elemental carbon (EC) and organic carbon (OC) under sample conditions consistent with personal sampling. Two different types of experiments were completed; the first examined possible artifacts from oils used in personal environmental monitor (PEM) impactor plates, and the second examined artifacts from microenvironmental sampling using different sampling media combinations (quartz, Teflon, XAD denuder, and electrostatic precipitator). The effectiveness of front and backup filters was evaluated for most sampling configurations. Mean total carbon concentrations from sampling configurations using impactor oils were not statistically different from the control case (using a sharp cut cyclone). Three microenvironments were tested (kitchen, library, and ambient); carbon concentrations were highest in the kitchen using a front quartz filter (mean OC of 16.4 μg m⁻³). The lowest front quartz filter concentrations were measured in the library using XAD denuders (mean OC of 3.6 μg m⁻³). Denuder removal efficiencies (average of 82% for total carbon) were lower compared with previous ambient studies and may indicate that indoor sources influenced denuder efficiency during sample collection. The highest carbon concentrations from backup quartz filters were measured using the Teflon-quartz combination.

  4. Physical sampling for site and waste characterization

    International Nuclear Information System (INIS)

    Bonnough, T.L.

    1996-01-01

    Physical sampling plays a basic role in the high-level radioactive waste management program effort. The term "physical sampling" as used here means collecting tangible, physical samples of soil, water, air, waste streams, or other materials. The industry defines the term "physical sampling" broadly to include measurements of physical conditions such as temperature, wind conditions, and pH, which are also often taken in a sample collection effort. Most environmental compliance actions are supported by the results of taking, recording, and analyzing physical samples and the measurements of physical conditions taken in association with sample collecting. Therefore, when and how to take samples must be known and planned

  5. Material sampling for rotor evaluation

    International Nuclear Information System (INIS)

    Mercaldi, D.; Parker, J.

    1990-01-01

    Decisions regarding continued operation of aging rotating machinery must often be made without adequate knowledge of rotor material conditions. Physical specimens of the material are not generally available due to lack of an appropriate sampling technique or the high cost and inconvenience of obtaining such samples. This is despite the fact that examination of such samples may be critical to effectively assess the degradation of mechanical properties of the components in service or to permit detailed examination of microstructure and surface flaws. Such information permits a reduction in the uncertainty of remaining life estimates for turbine rotors to avoid unnecessarily premature and costly rotor retirement decisions. This paper describes the operation and use of a recently developed material sampling device which machines and recovers an undeformed specimen from the surface of rotor bores or other components for metallurgical analysis. The removal of the thin, wafer-like sample has a negligible effect on the structural integrity of these components, due to the geometry and smooth surface finish of the resulting shallow depression. Samples measuring approximately 0.03 to 0.1 inches (0.76 to 2.5 mm) thick by 0.5 to 1.0 inch (1.3 to 2.5 cm) in diameter can be removed without mechanical deformation or thermal degradation of the sample or the remaining component material. The device is operated remotely from a control console and can be used externally or internally on any surface for which there is at least a three inch (7.6 cm) working clearance. The application of the device in two case studies of turbine-generator evaluations is presented

  6. Detecting spatial structures in throughfall data: The effect of extent, sample size, sampling design, and variogram estimation method

    Science.gov (United States)

    Voss, Sebastian; Zimmermann, Beate; Zimmermann, Alexander

    2016-09-01

    In the last decades, an increasing number of studies analyzed spatial patterns in throughfall by means of variograms. The estimation of the variogram from sample data requires an appropriate sampling scheme: most importantly, a large sample and a layout of sampling locations that often has to serve both variogram estimation and geostatistical prediction. While some recommendations on these aspects exist, they focus on Gaussian data and high ratios of the variogram range to the extent of the study area. However, many hydrological data, and throughfall data in particular, do not follow a Gaussian distribution. In this study, we examined the effect of extent, sample size, sampling design, and calculation method on variogram estimation of throughfall data. For our investigation, we first generated non-Gaussian random fields based on throughfall data with large outliers. Subsequently, we sampled the fields with three extents (plots with edge lengths of 25 m, 50 m, and 100 m), four common sampling designs (two grid-based layouts, transect and random sampling) and five sample sizes (50, 100, 150, 200, 400). We then estimated the variogram parameters by method-of-moments (non-robust and robust estimators) and residual maximum likelihood. Our key findings are threefold. First, the choice of the extent has a substantial influence on the estimation of the variogram. A comparatively small ratio of the extent to the correlation length is beneficial for variogram estimation. Second, a combination of a minimum sample size of 150, a design that ensures the sampling of small distances and variogram estimation by residual maximum likelihood offers a good compromise between accuracy and efficiency. Third, studies relying on method-of-moments based variogram estimation may have to employ at least 200 sampling points for reliable variogram estimates. These suggested sample sizes exceed the number recommended by studies dealing with Gaussian data by up to 100 %. Given that most previous
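
    Since the Matheron estimator is the pivot of the comparison above, a minimal sketch of it may help. Everything here is a placeholder chosen to echo the study design (a 50 m plot, 150 collectors, a skewed simulated field), not the authors' data or code.

      import numpy as np

      def matheron_variogram(coords, values, bin_edges):
          """Method-of-moments (Matheron) empirical variogram:
          gamma(h) = 1 / (2 N(h)) * sum of (z_i - z_j)^2 over pairs at lag ~h."""
          d = np.sqrt(((coords[:, None, :] - coords[None, :, :]) ** 2).sum(-1))
          sq = (values[:, None] - values[None, :]) ** 2
          iu = np.triu_indices(len(values), k=1)          # count each pair once
          lags, sqdiff = d[iu], sq[iu]
          centers, gammas = [], []
          for lo, hi in zip(bin_edges[:-1], bin_edges[1:]):
              mask = (lags >= lo) & (lags < hi)
              if mask.any():
                  centers.append(0.5 * (lo + hi))
                  gammas.append(0.5 * sqdiff[mask].mean())
          return np.array(centers), np.array(gammas)

      rng = np.random.default_rng(1)
      xy = rng.uniform(0, 50, size=(150, 2))               # 150 collectors on a 50 m plot
      z = rng.lognormal(mean=0.0, sigma=0.5, size=150)     # skewed, non-Gaussian values
      h, gamma = matheron_variogram(xy, z, np.arange(0, 30, 5))

    A robust (Cressie–Hawkins-type) variant replaces the squared differences with a transformation that damps outliers; residual maximum likelihood requires fitting a full covariance model and is not sketched here.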

  7. Sampling for stereology in lungs

    Directory of Open Access Journals (Sweden)

    J. R. Nyengaard

    2006-12-01

    The present article reviews the relevant stereological estimators for obtaining reliable quantitative structural data from the lungs. Stereological sampling achieves reliable, quantitative information either about the whole lung or complete lobes, whilst minimising the workload. Studies have used systematic random sampling, which has fixed and constant sampling probabilities on all blocks, sections and fields of view. For an estimation of total lung or lobe volume, the Cavalieri principle can be used, but it is not useful in estimating individual cell volume due to various effects from over- or underprojection. If the number of certain structures is required, two methods can be used: the disector and the fractionator. The disector method is a three-dimensional stereological probe for sampling objects according to their number. However, it may be affected by tissue deformation and, therefore, the fractionator method is often the preferred sampling principle. In this method, a known and predetermined fraction of an object is sampled in one or more steps, with the final step estimating the number. Both methods can be performed in a physical and optical manner, therefore enabling cells and larger lung structure numbers (e.g. number of alveoli) to be estimated. Some estimators also require randomisation of orientation, so that all directions have an equal chance of being chosen. Using such isotropic sections, surface area, length, and diameter can be estimated on a Cavalieri set of sections. Stereology can also illustrate the potential for transport between two compartments by analysing the barrier width. Estimating the individual volume of cells can be achieved by local stereology using a two-step procedure that first samples lung cells using the disector and then introduces individual volume estimation of the sampled cells. The coefficient of error of most unbiased stereological estimators is a combination of variance from blocks, sections, fields
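
    The fractionator arithmetic described above reduces to scaling a raw count by the inverse of each known sampling fraction. A minimal sketch, with invented fractions and counts:

      def fractionator_estimate(counted: int, *sampling_fractions: float) -> float:
          """Fractionator principle: total number = raw count multiplied by
          the inverse of each sampling fraction (section, area, thickness, ...)."""
          estimate = float(counted)
          for f in sampling_fractions:
              estimate /= f
          return estimate

      # 120 cells counted under a 1/10 section fraction, 1/25 area fraction
      # and 1/2 thickness fraction -> an estimated 60,000 cells in the lobe.
      total = fractionator_estimate(120, 1 / 10, 1 / 25, 1 / 2)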

  8. Environmental sample banking-research and methodology

    International Nuclear Information System (INIS)

    Becker, D.A.

    1976-01-01

    The National Bureau of Standards (NBS), in cooperation with the Environmental Protection Agency and the National Science Foundation, is engaged in a research program establishing methodology for environmental sample banking. This program is aimed toward evaluating the feasibility of a National Environmental Specimen Bank (NESB). The capability for retrospective chemical analyses to evaluate changes in our environment would provide useful information. Much of this information could not be obtained using data from previously analyzed samples. However, to assure validity for these stored samples, they must be sampled, processed and stored under rigorously evaluated, controlled and documented conditions. The program currently under way in the NBS Analytical Chemistry Division has 3 main components. The first is an extensive survey of available literature concerning problems of contamination, losses and storage. The components of interest include trace elements, pesticides, other trace organics (PCBs, plasticizers, etc.), radionuclides and microbiological species. The second component is an experimental evaluation of contamination and losses during sampling and sample handling. Of particular interest here is research into container cleaning methodology for trace elements, with respect to adsorption, desorption, leaching and partial dissolution by various sample matrices. The third component of this program is an evaluation of existing methodology for long-term sample storage

  9. Waste sampling and characterization facility (WSCF)

    International Nuclear Information System (INIS)

    1994-10-01

    The Waste Sampling and Characterization Facility (WSCF) complex consists of the main structure (WSCF) and four support structures located in the 600 Area of the Hanford Site east of the 200 West Area and south of the Hanford Meteorology Station. WSCF is to be used for low-level sample analysis (less than 2 mRem). The laboratory features state-of-the-art analytical and low-level radiological counting equipment for gaseous, soil, and liquid sample analysis. In particular, this facility is to be used to perform Resource Conservation and Recovery Act (RCRA) of 1976 and Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) of 1980 sample analysis in accordance with U.S. Environmental Protection Agency protocols, room air and stack monitoring sample analysis, waste water treatment process support, and contractor laboratory quality assurance checks. The samples to be analyzed contain very low concentrations of radioisotopes. The main reason that WSCF is considered a nuclear facility is the storage of samples at the facility. This Maintenance Implementation Plan has been developed for maintenance functions associated with the WSCF

  10. A standardized method for sampling and extraction methods for quantifying microplastics in beach sand.

    Science.gov (United States)

    Besley, Aiken; Vijver, Martina G; Behrens, Paul; Bosker, Thijs

    2017-01-15

    Microplastics are ubiquitous in the environment, are frequently ingested by organisms, and may potentially cause harm. A range of studies have found significant levels of microplastics in beach sand. However, there is a considerable amount of methodological variability among these studies. Methodological variation currently limits comparisons as there is no standard procedure for sampling or extraction of microplastics. We identify key sampling and extraction procedures across the literature through a detailed review. We find that sampling depth, sampling location, number of repeat extractions, and settling times are the critical parameters of variation. Next, using a case-study we determine whether and to what extent these differences impact study outcomes. By investigating the common practices identified in the literature with the case-study, we provide a standard operating procedure for sampling and extracting microplastics from beach sand. Copyright © 2016 Elsevier Ltd. All rights reserved.

  11. Comparative analysis of vaginal microbiota sampling using 16S rRNA gene analysis.

    Science.gov (United States)

    Virtanen, Seppo; Kalliala, Ilkka; Nieminen, Pekka; Salonen, Anne

    2017-01-01

    Molecular methods such as next-generation sequencing are actively being employed to characterize the vaginal microbiota in health and disease. Previous studies have focused on characterizing the biological variation in the microbiota, and less is known about how factors related to sampling contribute to the results. Our aim was to investigate the impact of the sampling device and anatomical sampling site on the quantitative and qualitative outcomes relevant for vaginal microbiota research. We sampled 10 Finnish women representing diverse clinical characteristics with flocked swabs, the Evalyn® self-sampling device, sterile plastic spatulas and a cervical brush that were used to collect samples from fornix, vaginal wall and cervix. Samples were compared on DNA and protein yield, bacterial load, and microbiota diversity and species composition based on Illumina MiSeq sequencing of the 16S rRNA gene. We quantified the relative contributions of sampling variables versus intrinsic variables in the overall microbiota variation, and evaluated the microbiota profiles using several commonly employed metrics such as alpha and beta diversity as well as abundance of major bacterial genera and species. The total DNA yield was strongly dependent on the sampling device and to a lesser extent on the anatomical site of sampling. The sampling strategy did not affect the protein yield or the bacterial load. All tested sampling methods produced highly comparable microbiota profiles based on MiSeq sequencing. The sampling method explained only 2% (p = 0.89) of the overall microbiota variation, markedly surpassed by intrinsic factors such as clinical status (microscopy for bacterial vaginosis 53%, p = 0.0001), bleeding (19%, p = 0.0001), and the variation between subjects (11%, p = 0.0001). The results indicate that different sampling strategies yield comparable vaginal microbiota composition and diversity. Hence, past and future vaginal microbiota studies employing different
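
    Percentages of explained community variation like those reported above are typically obtained from a PERMANOVA-style decomposition of a sample-by-sample distance matrix. The study's exact pipeline is not reproduced here; the following is only a generic sketch of that R² computation, with random toy data.

      import numpy as np

      def permanova_r2(dist: np.ndarray, groups: np.ndarray) -> float:
          """Share of total variation explained by a grouping factor:
          R^2 = 1 - SS_within / SS_total, computed on a distance matrix."""
          n = len(groups)
          iu = np.triu_indices(n, k=1)
          ss_total = (dist[iu] ** 2).sum() / n
          ss_within = 0.0
          for g in np.unique(groups):
              idx = np.where(groups == g)[0]
              if len(idx) > 1:
                  sub = dist[np.ix_(idx, idx)]
                  ss_within += (sub[np.triu_indices(len(idx), k=1)] ** 2).sum() / len(idx)
          return 1.0 - ss_within / ss_total

      # Toy data: 9 samples (3 subjects x 3 devices) in an arbitrary feature space.
      rng = np.random.default_rng(0)
      pts = rng.normal(size=(9, 4))
      dist = np.sqrt(((pts[:, None] - pts[None]) ** 2).sum(-1))   # Euclidean distances
      device = np.repeat([0, 1, 2], 3)
      print(permanova_r2(dist, device))

    The p-values quoted in the abstract additionally require permuting the group labels and recomputing the statistic, which this sketch omits.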

  12. Diagnostic herd sensitivity using environmental samples

    DEFF Research Database (Denmark)

    Vigre, Håkan; Josefsen, Mathilde Hartmann; Seyfarth, Anne Mette

    either at farm or slaughter. Three sample matrices were collected; dust samples (5 environmental swabs), nasal swabs (10 pools with 5 animals per pool) and air samples (1 filter). Based on the assumption that MRSA occurred in all 48 herds the overall herd sensitivity was 58% for nasal swabs, 33% for dust....... In our example, the prevalence of infected pigs in each herd was estimated from the pooled samples of nasal swabs. Logistic regression was used to estimate the effect of animal prevalence on the probability to detect MRSA in the dust and air samples at herd level. The results show a significant increase...

  13. Sample size in qualitative interview studies

    DEFF Research Database (Denmark)

    Malterud, Kirsti; Siersma, Volkert Dirk; Guassora, Ann Dorrit Kristiane

    2016-01-01

    Sample sizes must be ascertained in qualitative studies, as in quantitative studies, but not by the same means. The prevailing concept for sample size in qualitative studies is “saturation.” Saturation is closely tied to a specific methodology, and the term is inconsistently applied. We propose...... the concept “information power” to guide adequate sample size for qualitative studies. Information power indicates that the more information the sample holds, relevant for the actual study, the fewer participants are needed. We suggest that the size of a sample with sufficient information power...... and during data collection of a qualitative study is discussed....

  14. Small-sample-worth perturbation methods

    International Nuclear Information System (INIS)

    1985-01-01

    It has been assumed that the perturbed region, R_p, is large enough so that: (1) even without a great deal of biasing there is a substantial probability that an average source-neutron will enter it; and (2) once having entered, the neutron is likely to make several collisions in R_p during its lifetime. Unfortunately neither assumption is valid for the typical configurations one encounters in small-sample-worth experiments. In such experiments one measures the reactivity change which is induced when a very small void in a critical assembly is filled with a sample of some test material. Only a minute fraction of the fission-source neutrons ever gets into the sample and, of those neutrons that do, most emerge uncollided. Monte Carlo small-sample perturbation computations are described

  15. Physical sampling for site and waste characterization

    International Nuclear Information System (INIS)

    Bonnough, T.L.

    1994-01-01

    Physical sampling plays a basic role in the site and waste characterization program effort. The term "physical sampling" as used here means collecting tangible, physical samples of soil, water, air, waste streams, or other materials. The industry defines the term "physical sampling" broadly to include measurements of physical conditions such as temperature, wind conditions, and pH, which are also often taken in a sample collection effort. Most environmental compliance actions are supported by the results of taking, recording, and analyzing physical samples and the measurements of physical conditions taken in association with sample collecting

  16. Sampling the Mouse Hippocampal Dentate Gyrus

    OpenAIRE

    Lisa Basler; Stephan Gerdes; David P. Wolfer; Lutz Slomianka

    2017-01-01

    Sampling is a critical step in procedures that generate quantitative morphological data in the neurosciences. Samples need to be representative to allow statistical evaluations, and samples need to deliver a precision that makes statistical evaluations not only possible but also meaningful. Variability generated by sampling should, for example, not be able to hide significant group differences from statistical detection if they are present. Estimators of the coefficient of error (CE) have been develope...

  17. Design-based estimators for snowball sampling

    OpenAIRE

    Shafie, Termeh

    2010-01-01

    Snowball sampling, where existing study subjects recruit further subjects from among their acquaintances, is a popular approach when sampling from hidden populations. Since people with many in-links are more likely to be selected, there will be a selection bias in the samples obtained. In order to eliminate this bias, the sample data must be weighted. However, the exact selection probabilities are unknown for snowball samples and need to be approximated in an appropriate way. This paper proposes d...

  18. 1999 Baseline Sampling and Analysis Sampling Locations, Geographic NAD83, LOSCO (2004) [BSA_1999_sample_locations_LOSCO_2004

    Data.gov (United States)

    Louisiana Geographic Information Center — The monitor point data set was produced as a part of the Baseline Sampling and Analysis program coordinated by the Louisiana Oil Spill Coordinator's Office. This...

  19. 1997 Baseline Sampling and Analysis Sample Locations, Geographic NAD83, LOSCO (2004) [BSA_1997_sample_locations_LOSCO_2004

    Data.gov (United States)

    Louisiana Geographic Information Center — The monitor point data set was produced as a part of the Baseline Sampling and Analysis (BSA) program coordinated by the Louisiana Oil Spill Coordinator's Office....

  20. 1998 Baseline Sampling and Analysis Sampling Locations, Geographic NAD83, LOSCO (2004) [BSA_1998_sample_locations_LOSCO_2004

    Data.gov (United States)

    Louisiana Geographic Information Center — The monitor point data set was produced as a part of the Baseline Sampling and Analysis program coordinated by the Louisiana Oil Spill Coordinator's Office. This...

  1. Standard practices for sampling uranium-Ore concentrate

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2010-01-01

    1.1 These practices are intended to provide the nuclear industry with procedures for obtaining representative bulk samples from uranium-ore concentrates (UOC) (see Specification C967). 1.2 These practices also provide for obtaining a series of representative secondary samples from the original bulk sample for the determination of moisture and other test purposes, and for the preparation of pulverized analytical samples (see Test Methods C1022). 1.3 These practices consist of a number of alternative procedures for sampling and sample preparation which have been shown to be satisfactory through long experience in the nuclear industry. These procedures are described in the following order (procedure — section): Primary Sampling: one-stage falling stream (Section 4), two-stage falling stream (Section 5), auger (Section 6); Secondary Sampling: straight-path reciprocating (Section 7), rotating Vezin (Sections 8, 9); Sample Preparation (Section 10): concurrent-drying (Sections 11-13), natural moisture (Sections 14-16), calcination (Sections 17, 18); Sample Packaging (Section 19): wax s...

  2. In-Sample Confidence Bands and Out-of-Sample Forecast Bands for Time-Varying Parameters in Observation Driven Models

    NARCIS (Netherlands)

    Blasques, F.; Koopman, S.J.; Lasak, K.A.; Lucas, A.

    2016-01-01

    We study the performances of alternative methods for calculating in-sample confidence and out-of-sample forecast bands for time-varying parameters. The in-sample bands reflect parameter uncertainty, while the out-of-sample bands reflect not only parameter uncertainty, but also innovation

  3. Equipment for collecting samples of radioactive solutions

    International Nuclear Information System (INIS)

    Raggenbass, A.; Fradin, J.; Joubert, G.

    1958-01-01

    The authors present equipment for collecting samples of fission products to perform radiochemical analysis. As the sample must have a total activity between 1 and 50 micro-Curie, the installation comprises a sampling system and a dilution device that brings the sample to a suitable activity. Samples are collected by means of needles. The sample reproducibility is discussed. The dilution device is described

  4. Commutability of food microbiology proficiency testing samples.

    Science.gov (United States)

    Abdelmassih, M; Polet, M; Goffaux, M-J; Planchon, V; Dierick, K; Mahillon, J

    2014-03-01

    Food microbiology proficiency testing (PT) is a useful tool to assess the analytical performances among laboratories. PT items should be close to routine samples to accurately evaluate the acceptability of the methods. However, most PT providers distribute exclusively artificial samples such as reference materials or irradiated foods. This raises the issue of the suitability of these samples because the equivalence-or 'commutability'-between results obtained on artificial vs. authentic food samples has not been demonstrated. In the clinical field, the use of noncommutable PT samples has led to erroneous evaluation of the performances when different analytical methods were used. This study aimed to provide a first assessment of the commutability of samples distributed in food microbiology PT. REQUASUD and IPH organized 13 food microbiology PTs including 10-28 participants. Three types of PT items were used: genuine food samples, sterile food samples and reference materials. The commutability of the artificial samples (reference material or sterile samples) was assessed by plotting the distribution of the results on natural and artificial PT samples. This comparison highlighted matrix-correlated issues when nonfood matrices, such as reference materials, were used. Artificially inoculated food samples, on the other hand, raised only isolated commutability issues. In the organization of a PT-scheme, authentic or artificially inoculated food samples are necessary to accurately evaluate the analytical performances. Reference materials, used as PT items because of their convenience, may present commutability issues leading to inaccurate penalizing conclusions for methods that would have provided accurate results on food samples. For the first time, the commutability of food microbiology PT samples was investigated. The nature of the samples provided by the organizer turned out to be an important factor because matrix effects can impact on the analytical results. © 2013

  5. Unit 06 - Sampling the World

    OpenAIRE

    Unit 06, CC in GIS; Parson, Charles; Nyerges, Timothy

    1990-01-01

    This unit begins the section on data acquisition by looking at how the infinite complexity of the real world can be discretized and sampled. It considers sampling techniques and associated issues of accuracy and standards.

  6. 40 CFR 761.312 - Compositing of samples.

    Science.gov (United States)

    2010-07-01

    ... to composite surface wipe test samples and to use the composite measurement to represent the PCB concentration of the entire surface. Composite samples consist of more than one sample gauze extracted and... arithmetic mean of the composited samples. (a) Compositing samples from surfaces to be used or reused. For...

  7. Air sampling with solid phase microextraction

    Science.gov (United States)

    Martos, Perry Anthony

    There is an increasing need for simple yet accurate air sampling methods. The acceptance of new air sampling methods requires compatibility with conventional chromatographic equipment, and the new methods have to be environmentally friendly and simple to use, yet with equal, or better, detection limits, accuracy and precision than standard methods. Solid phase microextraction (SPME) satisfies these conditions. Analyte detection limits, accuracy and precision of analysis with SPME are typically better than with any conventional air sampling method. Yet, air sampling with SPME requires no pumps or solvents, is re-usable, extremely simple to use, completely compatible with current chromatographic equipment, and requires only a small capital investment. The first SPME fiber coating used in this study was poly(dimethylsiloxane) (PDMS), a hydrophobic liquid film, used to sample a large range of airborne hydrocarbons such as benzene and octane. Quantification without an external calibration procedure is possible with this coating. The physical and chemical properties of this coating are well understood and are quite similar to those of the siloxane stationary phase used in capillary columns. The log of the analyte distribution coefficient for PDMS is linearly related to chromatographic retention indices and to the inverse of temperature. Therefore, the actual chromatogram from the analysis of the PDMS air sampler will yield the calibration parameters which are used to quantify unknown airborne analyte concentrations (ppbv to ppmv range). The second fiber coating used in this study was PDMS/divinylbenzene (PDMS/DVB) onto which o-(2,3,4,5,6-pentafluorobenzyl)hydroxylamine (PFBHA) was adsorbed for the on-fiber derivatization of gaseous formaldehyde (ppbv range), with and without external calibration. The oxime formed from the reaction can be detected with conventional gas chromatographic detectors. Typical grab sampling times were as small as 5 seconds
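
    Quantification without external calibration, as described above, rests on the equilibrium relation n = K · V_f · C_air together with a regression that predicts log K from the chromatographic retention index. The sketch below is only schematic: the coefficients A and B are hypothetical stand-ins for the thesis's calibration parameters, and the coating volume is a nominal figure for a 100-μm PDMS fiber.

      # Hypothetical regression: log10(K) = A * LTPRI + B (placeholder values,
      # not the coefficients determined in the thesis).
      A, B = 0.0042, -0.18

      def air_concentration(mass_on_fiber_ng: float, ltpri: float,
                            fiber_volume_uL: float = 0.612) -> float:
          """Equilibrium SPME grab sample: C_air = n / (K * V_fiber).
          Assumes equilibrium and negligible depletion of the sampled air;
          0.612 uL is a nominal 100-um PDMS coating volume (an assumption)."""
          K = 10 ** (A * ltpri + B)                      # PDMS/air distribution constant
          c_fiber = mass_on_fiber_ng / fiber_volume_uL   # ng per uL of coating
          return c_fiber / K * 1e6                       # ng/L of air (= ug/m3)

      print(f"{air_concentration(25.0, 800):.0f} ug/m3")  # illustrative numbers only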

  8. Radioactive air sampling methods

    CERN Document Server

    Maiello, Mark L

    2010-01-01

    Although the field of radioactive air sampling has matured and evolved over decades, it has lacked a single resource that assimilates technical and background information on its many facets. Edited by experts and with contributions from top practitioners and researchers, Radioactive Air Sampling Methods provides authoritative guidance on measuring airborne radioactivity from industrial, research, and nuclear power operations, as well as naturally occurring radioactivity in the environment. Designed for industrial hygienists, air quality experts, and health physicists, the book delves into the applied research advancing and transforming practice with improvements to measurement equipment, human dose modeling of inhaled radioactivity, and radiation safety regulations. To present a wide picture of the field, it covers the international and national standards that guide the quality of air sampling measurements and equipment. It discusses emergency response issues, including radioactive fallout and the assets used ...

  9. Gleeble Testing of Tungsten Samples

    Science.gov (United States)

    2013-02-01

    temperature on an Instron load frame with a 222.41 kN (50 kip) load cell. The samples were compressed at the same strain rate as on the Gleeble... [Table fragment — columns: sample ID, % RE, initial density (cm³), density after compression (cm³), % change in density, test temperature. First rows: NT1, 0, 18.08, 18.27, 1.06, 1000; NT3, 0, ...] 4.1 Nano-Tungsten: The results for the compression of the nano-tungsten samples are shown in tables 2 and 3 and figure 5. During testing, sample NT1

  10. Robotic system for process sampling

    International Nuclear Information System (INIS)

    Dyches, G.M.

    1985-01-01

    A three-axis cartesian geometry robot for process sampling was developed at the Savannah River Laboratory (SRL) and implemented in one of the site radioisotope separations facilities. Use of the robot reduces personnel radiation exposure and contamination potential by routinely handling sample containers under operator control in a low-level radiation area. This robot represents the initial phase of a longer term development program to use robotics for further sample automation. Preliminary design of a second generation robot with additional capabilities is also described. 8 figs

  11. Determination of copper in powdered chocolate samples by slurry-sampling flame atomic-absorption spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Santos, Walter N.L. dos; Silva, Erik G.P. da; Fernandes, Marcelo S.; Araujo, Rennan G.O.; Costa, Antônio C.S.; Ferreira, Sergio L.C. [Nucleo de Excelencia em Quimica Analitica da Bahia, Universidade Federal da Bahia, Instituto de Quimica, Salvador, Bahia (Brazil); Vale, M.G.R. [Instituto de Quimica, Universidade Federal do Rio Grande do Sul, Porto Alegre, Rio Grande do Sul (Brazil)

    2005-06-01

    Chocolate is a complex sample with a high content of organic compounds and its analysis generally involves digestion procedures that might include the risk of losses and/or contamination. The determination of copper in chocolate is important because copper compounds are extensively used as fungicides in the farming of cocoa. In this paper, a slurry-sampling flame atomic-absorption spectrometric method is proposed for determination of copper in powdered chocolate samples. Optimization was carried out using univariate methodology involving the variables nature and concentration of the acid solution for slurry preparation, sonication time, and sample mass. The recommended conditions include a sample mass of 0.2 g, 2.0 mol L⁻¹ hydrochloric acid solution, and a sonication time of 15 min. The calibration curve was prepared using aqueous copper standards in 2.0 mol L⁻¹ hydrochloric acid. This method allowed determination of copper in chocolate with a detection limit of 0.4 μg g⁻¹ and precision, expressed as relative standard deviation (RSD), of 2.5% (n=10) for a copper content of approximately 30 μg g⁻¹, using a chocolate mass of 0.2 g. The accuracy was confirmed by analyzing the certified reference materials NIST SRM 1568a rice flour and NIES CRM 10-b rice flour. The proposed method was used for determination of copper in three powdered chocolate samples, the copper content of which varied between 26.6 and 31.5 μg g⁻¹. The results showed no significant differences with those obtained after complete digestion, using a t-test for comparison. (orig.)
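
    The quantification chain in such a slurry method is an aqueous linear calibration followed by a volume/mass conversion back to the solid. A minimal sketch with invented absorbances; the 25 mL slurry make-up volume is an assumption, and only the 0.2 g sample mass comes from the recommended conditions above.

      import numpy as np

      # Aqueous Cu standards (ug/mL) and absorbances -- illustrative numbers only.
      std_conc = np.array([0.0, 0.5, 1.0, 2.0, 4.0])
      std_abs = np.array([0.001, 0.052, 0.101, 0.205, 0.408])
      slope, intercept = np.polyfit(std_conc, std_abs, 1)   # A = m*C + b

      def copper_content(sample_abs: float, slurry_volume_mL: float = 25.0,
                         sample_mass_g: float = 0.2) -> float:
          """Cu content of the chocolate (ug/g) from the slurry absorbance.
          The slurry volume is an assumed make-up volume, not from the paper."""
          c_solution = (sample_abs - intercept) / slope      # ug/mL in the slurry
          return c_solution * slurry_volume_mL / sample_mass_g

      print(f"{copper_content(0.024):.1f} ug/g")   # -> ~28.7, near the range above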

  12. Maintaining continuity of knowledge on safeguards samples

    International Nuclear Information System (INIS)

    Franssen, F.; Islam, A.B.M.N.; Sonnier, C.; Schoeneman, J.L.; Baumann, M.

    1992-01-01

    The conclusions of the vulnerability test on VOPAN (Verification of Operator's Analysis), conducted at the Safeguards Analytical Laboratory (SAL) at Seibersdorf, Austria in October 1990 and documented in STR-266, indicate that "whenever samples are taken for safeguards purposes extreme care must be taken to ensure that they have not been interfered with during the sample taking, transportation, storage or sample preparation process." Indeed, there are a number of possibilities to alter the content of a safeguards sample vial from the moment of sampling up to the arrival of the treated (or untreated) sample at SAL. The time lapse between these two events can range from a few days up to months. The sample history over this period can be subdivided into three main sub-periods: (1) the period from when the sampling activities are commenced up to the treatment in the operator's laboratory, (2) during treatment of samples in the operator's laboratory, and finally, (3) the period between that treatment and the arrival of the sample at SAL. A combined effort between the Agency and the United States Support Program to the Agency (POTAS) has resulted in two active tasks and one proposed task to investigate improving the maintenance of continuity of knowledge on safeguards samples during the entire period of their existence. This paper describes the use of the Sample Vial Secure Container (SVSC), the Authenticated Secure Container System (ASCS), and the Secure Container for Storage and Transportation of samples (SCST) to guarantee that a representative portion of the solution sample will be received at SAL

  13. Two sampling techniques for game meat

    OpenAIRE

    van der Merwe, Maretha; Jooste, Piet J.; Hoffman, Louw C.; Calitz, Frikkie J.

    2013-01-01

    A study was conducted to compare the excision sampling technique used by the export market and the sampling technique preferred by European countries, namely the biotrace cattle and swine test. The measuring unit for the excision sampling was grams (g) and square centimetres (cm2) for the swabbing technique. The two techniques were compared after a pilot test was conducted on spiked approved beef carcasses (n = 12) that statistically proved the two measuring units correlated. The two sampling...

  14. Effects of depression, anxiety, self-esteem, and health behaviour on neonatal outcomes in a population-based Hungarian sample.

    Science.gov (United States)

    Bödecs, Tamás; Horváth, Boldizsár; Szilágyi, Eniko; Gonda, Xénia; Rihmer, Zoltán; Sándor, János

    2011-01-01

    To investigate possible associations of maternal antenatal depression, anxiety and self-esteem with negative neonatal outcomes, controlling for the effects of demographic covariates and health behaviour, in a Hungarian sample. A population-based monitoring system was established in 10 districts of health visitors in Szombathely, Hungary, covering every woman registered as pregnant between February 1, 2008 and February 1, 2009. Three hundred and seven expectant women in the early stage of their pregnancy were surveyed using the Short Form of the Beck Depression Inventory for the measurement of depression and the Spielberger Trait-Anxiety Inventory for the measurement of anxiety. Self-esteem was evaluated by the Rosenberg Self-Esteem Scale. At the end of the follow-up period, data on 261 mothers and their singleton neonates were available. The relationship between the explanatory and outcome variables (birth weight, length, chest circumference, gestational age, and 1- and 5-min Apgar scores) was tested in girls and boys separately by multiple linear regression analysis (Forward method). Categorical variables were used as "dummy variables". Maternal depression, anxiety and health behaviour did not show any association with neonatal outcomes. Higher level of maternal self-esteem was associated with higher birth weight and birth length in boys and higher birth length in girls. Maternal education positively correlated with birth length, gestational age and chest circumference in boys, and with birth length in girls. In girls, maternal socioeconomic status showed a positive association with birth weight and gestational age, while common law marriage had a negative effect on birth weight and chest circumference. Lower level of maternal self-esteem possibly leads to a higher level of maternal stress which may reduce fetal growth via physiologic changes. Gender differences in associations between demographic factors and neonatal outcome measures indicate differences in fetal

  15. 100 Area Columbia River sediment sampling

    International Nuclear Information System (INIS)

    Weiss, S.G.

    1993-01-01

    Forty-four sediment samples were collected from 28 locations in the Hanford Reach of the Columbia River to assess the presence of metals and man-made radionuclides in the near shore and shoreline settings of the Hanford Site. Three locations were sampled upriver of the Hanford Site plutonium production reactors. Twenty-two locations were sampled near the reactors. Three locations were sampled downstream of the reactors near the Hanford Townsite. Sediment was collected from depths of 0 to 6 in. and between 12 to 24 in. below the surface. Samples containing concentrations of metals exceeding the 95 % upper threshold limit values (DOE-RL 1993b) are considered contaminated. Contamination by arsenic, chromium, copper, lead, and zinc was found. Man-made radionuclides occur in all samples except four collected opposite the Hanford Townsite. Man-made radionuclide concentrations were generally less than 1 pCi/g

  16. 100 Area Columbia River sediment sampling

    Energy Technology Data Exchange (ETDEWEB)

    Weiss, S.G. [Westinghouse Hanford Co., Richland, WA (United States)

    1993-09-08

    Forty-four sediment samples were collected from 28 locations in the Hanford Reach of the Columbia River to assess the presence of metals and man-made radionuclides in the near shore and shoreline settings of the Hanford Site. Three locations were sampled upriver of the Hanford Site plutonium production reactors. Twenty-two locations were sampled near the reactors. Three locations were sampled downstream of the reactors near the Hanford Townsite. Sediment was collected from depths of 0 to 6 in. and between 12 to 24 in. below the surface. Samples containing concentrations of metals exceeding the 95 % upper threshold limit values (DOE-RL 1993b) are considered contaminated. Contamination by arsenic, chromium, copper, lead, and zinc was found. Man-made radionuclides occur in all samples except four collected opposite the Hanford Townsite. Man-made radionuclide concentrations were generally less than 1 pCi/g.

  17. Perilymph sampling from the cochlear apex: a reliable method to obtain higher purity perilymph samples from scala tympani.

    Science.gov (United States)

    Salt, Alec N; Hale, Shane A; Plontke, Stefan K R

    2006-05-15

    Measurements of drug levels in the fluids of the inner ear are required to establish kinetic parameters and to determine the influence of specific local delivery protocols. For most substances, this requires cochlear fluid samples to be obtained for analysis. When auditory function is of primary interest, the drug level in the perilymph of scala tympani (ST) is most relevant, since drug in this scala has ready access to the auditory sensory cells. In many prior studies, ST perilymph samples have been obtained from the basal turn, either by aspiration through the round window membrane (RWM) or through an opening in the bony wall. A number of studies have demonstrated that such samples are likely to be contaminated with cerebrospinal fluid (CSF). CSF enters the basal turn of ST through the cochlear aqueduct when the bony capsule is perforated or when fluid is aspirated. The degree of sample contamination has, however, not been widely appreciated. Recent studies have shown that perilymph samples taken through the round window membrane are highly contaminated with CSF, with samples greater than 2 μL in volume containing more CSF than perilymph. In spite of this knowledge, many groups continue to sample from the base of the cochlea, as it is a well-established method. We have developed an alternative, technically simple method to increase the proportion of ST perilymph in a fluid sample. The sample is taken from the apex of the cochlea, a site that is distant from the cochlear aqueduct. A previous problem with sampling through a perforation in the bone was that the native perilymph rapidly leaked out, driven by CSF pressure, and was lost to the middle ear space. We therefore developed a procedure to collect all the fluid that emerged from the apex after perforation. We evaluated the method using the marker ion trimethylphenylammonium (TMPA). TMPA was applied to the perilymph of guinea pigs either by RW irrigation or by microinjection into the apical turn. The

  18. Using lot quality-assurance sampling and area sampling to identify priority areas for trachoma control: Viet Nam.

    Science.gov (United States)

    Myatt, Mark; Mai, Nguyen Phuong; Quynh, Nguyen Quang; Nga, Nguyen Huy; Tai, Ha Huy; Long, Nguyen Hung; Minh, Tran Hung; Limburg, Hans

    2005-10-01

    To report on the use of lot quality-assurance sampling (LQAS) surveys undertaken within an area-sampling framework to identify priority areas for intervention with trachoma control activities in Viet Nam. The LQAS survey method for the rapid assessment of the prevalence of active trachoma was adapted for use in Viet Nam with the aim of classifying individual communes by the prevalence of active trachoma among children in primary school. School-based sampling was used; school sites to be sampled were selected using an area-sampling approach. A total of 719 communes in 41 districts in 18 provinces were surveyed. Survey staff found the LQAS survey method both simple and rapid to use after initial problems with area-sampling methods were identified and remedied. The method yielded a finer spatial resolution of prevalence than had been previously achieved in Viet Nam using semiquantitative rapid assessment surveys and multistage cluster-sampled surveys. When used with area-sampling techniques, the LQAS survey method has the potential to form the basis of survey instruments that can be used to efficiently target resources for interventions against active trachoma. With additional work, such methods could provide a generally applicable tool for effective programme planning and for the certification of the elimination of trachoma as a blinding disease.
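
    At its core, an LQAS classification is a binomial decision rule: examine n children in a commune and flag the commune when the number with active trachoma exceeds a threshold d. The sketch below computes the rule's operating characteristic; n = 50 and d = 7 are illustrative values, not the design actually used in Viet Nam.

      from scipy.stats import binom

      def classify(positives: int, d: int) -> str:
          """LQAS decision rule: more than d positives flags the lot."""
          return "high prevalence" if positives > d else "low prevalence"

      def prob_flagged(n: int, d: int, true_prevalence: float) -> float:
          """Operating characteristic: P(X > d) for X ~ Binomial(n, prevalence)."""
          return float(binom.sf(d, n, true_prevalence))

      # How often would a commune be flagged at various true prevalences?
      for p in (0.05, 0.10, 0.20, 0.30):
          print(f"prevalence {p:.0%}: flagged with probability {prob_flagged(50, 7, p):.3f}")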

  19. 30 CFR 71.202 - Certified person; sampling.

    Science.gov (United States)

    2010-07-01

    ... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Certified person; sampling. 71.202 Section 71... Sampling Procedures § 71.202 Certified person; sampling. (a) The respirable dust sampling required by this... on sampling of respirable coal mine dust. (c) A person may be temporarily certified by MSHA to take...

  20. Smoking, activity level and exercise test outcomes in a young population sample without cardiopulmonary disease.

    Science.gov (United States)

    Vozoris, N T; O'Donnell, D E

    2015-01-01

    Whether reduced activity level and exercise intolerance precede the clinical diagnosis of cardiopulmonary disorders in smokers is not known. We examined activity level and exercise test outcomes in a young population-based sample without overt cardiopulmonary disease, differentiating by smoking history. This was a multiyear cross-sectional study using United States National Health and Nutrition Examination Survey data from 1999-2004. Self-reported activity level and incremental exercise treadmill testing were obtained on survey participants ages 20-49 years, excluding individuals with cardiopulmonary disease. Three thousand seven hundred and one individuals completed exercise testing. Compared to never smokers, current smokers with >10 pack years reported significantly higher odds of little or no recreation, sport, or physical activity (adjusted OR 1.62; 95% CI 1.12-2.35). Mean perceived exertion ratings (Borg 6-20) at an estimated standardized workload were significantly greater among current smokers (18.3-18.6) compared to never (17.3) and former smokers (17.9) (p ...). Smoking abstinence was associated with significantly lower likelihood of low estimated peak oxygen uptake categorization (p ...). ... smoking cessation, these results set the stage for future studies that examine mechanisms of activity restriction in young smokers and the utility of measures of activity restriction in the earlier diagnosis of smoking-related diseases.