WorldWideScience

Sample records for sample selection sample

  1. Sample size estimation and sampling techniques for selecting a representative sample

    Directory of Open Access Journals (Sweden)

    Aamir Omair

    2014-01-01

    Full Text Available Introduction: The purpose of this article is to provide a general understanding of the concepts of sampling as applied to health-related research. Sample Size Estimation: It is important to select a representative sample in quantitative research in order to be able to generalize the results to the target population. The sample should be of the required sample size and must be selected using an appropriate probability sampling technique. There are many hidden biases which can adversely affect the outcome of the study. Important factors to consider for estimating the sample size include the size of the study population, the confidence level, the expected proportion of the outcome variable (for categorical variables) or the standard deviation of the outcome variable (for numerical variables), and the required precision (margin of accuracy) of the study. The greater the required precision, the larger the required sample size. Sampling Techniques: The probability sampling techniques applied in health-related research include simple random sampling, systematic random sampling, stratified random sampling, cluster sampling, and multistage sampling. These are recommended over nonprobability sampling techniques because their results can be generalized to the target population.
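
The sample-size reasoning above can be sketched in code. A minimal illustration, assuming Cochran's formula for a proportion with an optional finite-population correction; the function name and example numbers are invented for this sketch, not taken from the article:

```python
import math

def sample_size_proportion(p, margin, z=1.96, population=None):
    """Required sample size for estimating a proportion.

    p          -- expected proportion of the outcome variable
    margin     -- required precision (half-width of the interval)
    z          -- z-score for the confidence level (1.96 for 95%)
    population -- optional finite population size for correction
    """
    n0 = (z ** 2) * p * (1 - p) / margin ** 2
    if population is not None:
        # finite-population correction: a small study population
        # reduces the required sample size
        n0 = n0 / (1 + (n0 - 1) / population)
    return math.ceil(n0)

print(sample_size_proportion(0.5, 0.05))                   # 385
print(sample_size_proportion(0.5, 0.03))                   # larger: 1068
print(sample_size_proportion(0.5, 0.05, population=2000))  # smaller: 323
```

As the abstract notes, tightening the precision from ±5% to ±3% raises the required sample size sharply, while a small study population lowers it.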

  2. UNLABELED SELECTED SAMPLES IN FEATURE EXTRACTION FOR CLASSIFICATION OF HYPERSPECTRAL IMAGES WITH LIMITED TRAINING SAMPLES

    Directory of Open Access Journals (Sweden)

    A. Kianisarkaleh

    2015-12-01

    Full Text Available Feature extraction plays a key role in the classification of hyperspectral images. Using unlabeled samples, which are often available in practically unlimited numbers, unsupervised and semisupervised feature extraction methods show better performance when only a limited number of training samples exists. This paper illustrates the importance of selecting appropriate unlabeled samples for use in feature extraction methods, and proposes a new method for unlabeled sample selection using spectral and spatial information. The proposed method has four parts: PCA, prior classification, posterior classification, and sample selection. As a hyperspectral image passes through these parts, the selected unlabeled samples can be used in arbitrary feature extraction methods. The effectiveness of the proposed unlabeled sample selection in unsupervised and semisupervised feature extraction is demonstrated using two real hyperspectral datasets. Results show that by selecting appropriate unlabeled samples, the proposed method can improve the performance of feature extraction methods and increase classification accuracy.

  3. 40 CFR 89.507 - Sample selection.

    Science.gov (United States)

    2010-07-01

    ... Auditing § 89.507 Sample selection. (a) Engines comprising a test sample will be selected at the location...). However, once the manufacturer ships any test engine, it relinquishes the prerogative to conduct retests...

  4. 40 CFR 90.507 - Sample selection.

    Science.gov (United States)

    2010-07-01

    ... Auditing § 90.507 Sample selection. (a) Engines comprising a test sample will be selected at the location... manufacturer ships any test engine, it relinquishes the prerogative to conduct retests as provided in § 90.508...

  5. Sample Selection for Training Cascade Detectors.

    Science.gov (United States)

    Vállez, Noelia; Deniz, Oscar; Bueno, Gloria

    2015-01-01

    Automatic detection systems usually require large and representative training datasets in order to obtain good detection and false positive rates. Training datasets are such that the positive set has few samples and/or the negative set should represent anything except the object of interest. In this respect, the negative set typically contains orders of magnitude more images than the positive set. However, imbalanced training databases lead to biased classifiers. In this paper, we focus our attention on a negative sample selection method to properly balance the training data for cascade detectors. The method is based on the selection of the most informative false positive samples generated in one stage to feed the next stage. The results show that the proposed cascade detector with sample selection obtains on average better partial AUC and smaller standard deviation than the other compared cascade detectors.
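
The stage-to-stage selection idea can be sketched generically. This is a hedged illustration of hard-negative mining, not the authors' exact procedure; the scores and labels below are made up:

```python
import numpy as np

def select_hard_negatives(scores, labels, k):
    """Pick the k most informative false positives: negative samples
    that the current stage scores highest, i.e. the ones it is closest
    to misclassifying as positives."""
    neg_idx = np.flatnonzero(labels == 0)
    # sort negatives by descending classifier score -> hardest first
    hardest = neg_idx[np.argsort(scores[neg_idx])[::-1]]
    return hardest[:k]

# toy output of one cascade stage: 3 positives, 7 negatives
labels = np.array([1, 1, 1, 0, 0, 0, 0, 0, 0, 0])
scores = np.array([0.9, 0.8, 0.7, 0.65, 0.1, 0.6, 0.2, 0.05, 0.5, 0.3])
print(select_hard_negatives(scores, labels, 3))  # [3 5 8]
```

Feeding only these high-scoring negatives to the next stage keeps the training set balanced without enumerating the entire negative pool.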

  7. Robust inference in sample selection models

    KAUST Repository

    Zhelonkin, Mikhail; Genton, Marc G.; Ronchetti, Elvezio

    2015-01-01

    The problem of non-random sample selectivity often occurs in practice in many fields. The classical estimators introduced by Heckman are the backbone of the standard statistical analysis of these models. However, these estimators are very sensitive to small deviations from the distributional assumptions which are often not satisfied in practice. We develop a general framework to study the robustness properties of estimators and tests in sample selection models. We derive the influence function and the change-of-variance function of Heckman's two-stage estimator, and we demonstrate the non-robustness of this estimator and its estimated variance to small deviations from the model assumed. We propose a procedure for robustifying the estimator, prove its asymptotic normality and give its asymptotic variance. Both cases with and without an exclusion restriction are covered. This allows us to construct a simple robust alternative to the sample selection bias test. We illustrate the use of our new methodology in an analysis of ambulatory expenditures and we compare the performance of the classical and robust methods in a Monte Carlo simulation study.
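
For orientation, the classical (non-robust) Heckman two-stage estimator that the paper takes as its starting point can be sketched on synthetic data; all coefficients, sample sizes, and variable names below are invented for the illustration:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

# Synthetic data with correlated errors, so naive OLS on the selected
# subsample would be biased.
rng = np.random.default_rng(42)
n = 5000
z = rng.normal(size=n)                        # exclusion restriction
x = rng.normal(size=n)
u, e = rng.multivariate_normal([0, 0], [[1, 0.7], [0.7, 1]], size=n).T
selected = (0.5 + 1.0 * z + 0.8 * x + u) > 0  # selection equation
y = 1.0 + 2.0 * x + e                         # outcome equation, true slope = 2

# Stage 1: probit of the selection indicator on (1, z, x).
W = np.column_stack([np.ones(n), z, x])
def nll(g):
    p = np.clip(norm.cdf(W @ g), 1e-10, 1 - 1e-10)
    return -(selected * np.log(p) + (~selected) * np.log(1 - p)).sum()
gamma = minimize(nll, np.zeros(3)).x

# Stage 2: OLS of y on (1, x, inverse Mills ratio) over selected rows.
idx = W[selected] @ gamma
mills = norm.pdf(idx) / norm.cdf(idx)
X2 = np.column_stack([np.ones(mills.size), x[selected], mills])
beta = np.linalg.lstsq(X2, y[selected], rcond=None)[0]
print(round(beta[1], 2))  # slope estimate, close to the true value 2.0
```

A robustified version along the paper's lines would replace both the probit MLE and the second-stage OLS with bounded-influence estimators.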

  9. 40 CFR 205.171-3 - Test motorcycle sample selection.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 24 2010-07-01 2010-07-01 false Test motorcycle sample selection. 205... ABATEMENT PROGRAMS TRANSPORTATION EQUIPMENT NOISE EMISSION CONTROLS Motorcycle Exhaust Systems § 205.171-3 Test motorcycle sample selection. A test motorcycle to be used for selective enforcement audit testing...

  10. Acceptance sampling using judgmental and randomly selected samples

    Energy Technology Data Exchange (ETDEWEB)

    Sego, Landon H.; Shulman, Stanley A.; Anderson, Kevin K.; Wilson, John E.; Pulsipher, Brent A.; Sieber, W. Karl

    2010-09-01

    We present a Bayesian model for acceptance sampling where the population consists of two groups, each with different levels of risk of containing unacceptable items. Expert opinion, or judgment, may be required to distinguish between the high and low-risk groups. Hence, high-risk items are likely to be identified (and sampled) using expert judgment, while the remaining low-risk items are sampled randomly. We focus on the situation where all observed samples must be acceptable. Consequently, the objective of the statistical inference is to quantify the probability that a large percentage of the unsampled items in the population are also acceptable. We demonstrate that traditional (frequentist) acceptance sampling and simpler Bayesian formulations of the problem are essentially special cases of the proposed model. We explore the properties of the model in detail, and discuss the conditions necessary to ensure that required sample sizes are a non-decreasing function of the population size. The method is applicable to a variety of acceptance sampling problems, and, in particular, to environmental sampling where the objective is to demonstrate the safety of reoccupying a remediated facility that has been contaminated with a lethal agent.
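
The frequentist special case, and a one-group Bayesian counterpart with a Beta prior, reduce to a few lines. This is a textbook sketch of those special cases, not the paper's two-group judgmental model:

```python
import math
from scipy.stats import beta

def frequentist_n(p_target, confidence):
    """Smallest all-acceptable random sample size giving the stated
    confidence that the acceptable fraction is at least p_target."""
    return math.ceil(math.log(1 - confidence) / math.log(p_target))

def bayes_confidence(n_clean, p_target, a=1.0, b=1.0):
    """Posterior probability that the acceptable fraction exceeds
    p_target after n_clean acceptable observations and no failures,
    under a Beta(a, b) prior (a = b = 1 is the uniform prior)."""
    return beta.sf(p_target, a + n_clean, b)

print(frequentist_n(0.99, 0.95))              # 299
print(round(bayes_confidence(299, 0.99), 3))  # about 0.951
```

In the paper's setting the sample would additionally be split between judgmentally chosen high-risk items and randomly chosen low-risk items.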

  11. Risk Attitudes, Sample Selection and Attrition in a Longitudinal Field Experiment

    DEFF Research Database (Denmark)

    Harrison, Glenn W.; Lau, Morten Igel

    with respect to risk attitudes. Our design builds in explicit randomization on the incentives for participation. We show that there are significant sample selection effects on inferences about the extent of risk aversion, but that the effects of subsequent sample attrition are minimal. Ignoring sample selection leads to inferences that subjects in the population are more risk averse than they actually are. Correcting for sample selection and attrition affects utility curvature, but does not affect inferences about probability weighting. Properly accounting for sample selection and attrition effects leads to findings of temporal stability in overall risk aversion. However, that stability is around different levels of risk aversion than one might naively infer without the controls for sample selection and attrition we are able to implement. This evidence of “randomization bias” from sample selection...

  12. Measurement of radioactivity in the environment - Soil - Part 2: Guidance for the selection of the sampling strategy, sampling and pre-treatment of samples

    International Nuclear Information System (INIS)

    2007-01-01

    This part of ISO 18589 specifies the general requirements, based on ISO 11074 and ISO/IEC 17025, for all steps in the planning (desk study and area reconnaissance) of the sampling and the preparation of samples for testing. It includes the selection of the sampling strategy, the outline of the sampling plan, the presentation of general sampling methods and equipment, as well as the methodology of the pre-treatment of samples adapted to the measurements of the activity of radionuclides in soil. This part of ISO 18589 is addressed to the people responsible for determining the radioactivity present in soil for the purpose of radiation protection. It is applicable to soil from gardens, farmland, urban or industrial sites, as well as soil not affected by human activities. This part of ISO 18589 is applicable to all laboratories regardless of the number of personnel or the range of the testing performed. When a laboratory does not undertake one or more of the activities covered by this part of ISO 18589, such as planning, sampling or testing, the corresponding requirements do not apply. Information is provided on scope, normative references, terms and definitions and symbols, principle, sampling strategy, sampling plan, sampling process, pre-treatment of samples and recorded information. Five annexes inform about selection of the sampling strategy according to the objectives and the radiological characterization of the site and sampling areas, diagram of the evolution of the sample characteristics from the sampling site to the laboratory, example of sampling plan for a site divided in three sampling areas, example of a sampling record for a single/composite sample and example for a sample record for a soil profile with soil description. A bibliography is provided

  13. Failure Probability Estimation Using Asymptotic Sampling and Its Dependence upon the Selected Sampling Scheme

    Directory of Open Access Journals (Sweden)

    Martinásková Magdalena

    2017-12-01

    Full Text Available The article examines the use of Asymptotic Sampling (AS) for the estimation of failure probability. The AS algorithm requires samples of multidimensional Gaussian random vectors, which may be obtained by many alternative means that influence the performance of the AS method. Several reliability problems (test functions) have been selected in order to test AS with various sampling schemes: (i) Monte Carlo designs; (ii) LHS designs optimized using the Periodic Audze-Eglājs (PAE) criterion; (iii) designs prepared using Sobol' sequences. All results are compared with the exact failure probability value.
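
The three families of sampling schemes can be generated side by side with SciPy. This is a sketch: plain Latin Hypercube sampling stands in for the paper's PAE-optimized designs, which would need a dedicated optimizer, and the dimension and sample size are arbitrary:

```python
import numpy as np
from scipy.stats import norm, qmc

d, n = 4, 128  # dimension and sample size (illustrative)

# (i) plain Monte Carlo
rng = np.random.default_rng(1)
mc = rng.standard_normal((n, d))

# (ii) Latin Hypercube design mapped through the inverse Gaussian CDF
lhs = norm.ppf(qmc.LatinHypercube(d=d, seed=1).random(n))

# (iii) Sobol' low-discrepancy sequence, same inverse-CDF mapping
sobol = norm.ppf(qmc.Sobol(d=d, scramble=True, seed=1).random(n))

for name, s in [("MC", mc), ("LHS", lhs), ("Sobol", sobol)]:
    print(name, round(s.mean(), 3), round(s.std(), 3))
```

All three produce standard Gaussian vectors usable by AS; they differ in how evenly they cover the space, which is exactly what the article's comparison probes.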

  14. 40 CFR 205.160-2 - Test sample selection and preparation.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 24 2010-07-01 2010-07-01 false Test sample selection and preparation... sample selection and preparation. (a) Vehicles comprising the sample which are required to be tested... maintained in any manner unless such preparation, tests, modifications, adjustments or maintenance are part...

  15. The quasar luminosity function from a variability-selected sample

    Science.gov (United States)

    Hawkins, M. R. S.; Veron, P.

    1993-01-01

    A sample of quasars is selected from a 10-yr sequence of 30 UK Schmidt plates. Luminosity functions are derived in several redshift intervals, which in each case show a featureless power-law rise towards low luminosities. There is no sign of the 'break' found in the recent UVX sample of Boyle et al. It is suggested that reasons for the disagreement are connected with biases in the selection of the UVX sample. The question of the nature of quasar evolution appears to be still unresolved.

  16. On a Robust MaxEnt Process Regression Model with Sample-Selection

    Directory of Open Access Journals (Sweden)

    Hea-Jung Kim

    2018-04-01

    Full Text Available In a regression analysis, a sample-selection bias arises when a dependent variable is only partially observed as a result of the sample selection. This study introduces a Maximum Entropy (MaxEnt) process regression model that assumes a MaxEnt prior distribution for its nonparametric regression function and finds that the MaxEnt process regression model includes the well-known Gaussian process regression (GPR) model as a special case. This special MaxEnt process regression model, i.e., the GPR model, is then generalized to obtain a robust sample-selection Gaussian process regression (RSGPR) model that deals with non-normal data in the sample selection. Various properties of the RSGPR model are established, including the stochastic representation, distributional hierarchy, and magnitude of the sample-selection bias. These properties are used to develop a hierarchical Bayesian methodology to estimate the model. This involves a simple and computationally feasible Markov chain Monte Carlo algorithm that avoids analytical or numerical derivatives of the log-likelihood function of the model. The performance of the RSGPR model in terms of sample-selection bias correction, robustness to non-normality, and prediction is demonstrated through simulation results that attest to its good finite-sample performance.

  17. HOT-DUST-POOR QUASARS IN MID-INFRARED AND OPTICALLY SELECTED SAMPLES

    International Nuclear Information System (INIS)

    Hao Heng; Elvis, Martin; Civano, Francesca; Lawrence, Andy

    2011-01-01

    We show that the hot-dust-poor (HDP) quasars, originally found in the X-ray-selected XMM-COSMOS type 1 active galactic nucleus (AGN) sample, are just as common in two samples selected at optical/infrared wavelengths: the Richards et al. Spitzer/SDSS sample (8.7% ± 2.2%) and the Palomar-Green-quasar-dominated sample of Elvis et al. (9.5% ± 5.0%). The properties of the HDP quasars in these two samples are consistent with the XMM-COSMOS sample, except that, at the 99% (∼2.5σ) significance level, a larger proportion of the HDP quasars in the Spitzer/SDSS sample have weak host galaxy contributions, probably due to the selection criteria used. Either the host dust is destroyed (dynamically or by radiation) or is offset from the central black hole due to recoiling. Alternatively, the universality of HDP quasars in samples with different selection methods and the continuous distribution of dust covering factor in type 1 AGNs suggest that the range of spectral energy distributions could be related to the range of tilts in warped fueling disks, as in the model of Lawrence and Elvis, with HDP quasars having relatively small warps.

  18. The genealogy of samples in models with selection.

    Science.gov (United States)

    Neuhauser, C; Krone, S M

    1997-02-01

    We introduce the genealogy of a random sample of genes taken from a large haploid population that evolves according to random reproduction with selection and mutation. Without selection, the genealogy is described by Kingman's well-known coalescent process. In the selective case, the genealogy of the sample is embedded in a graph with a coalescing and branching structure. We describe this graph, called the ancestral selection graph, and point out differences and similarities with Kingman's coalescent. We present simulations for a two-allele model with symmetric mutation in which one of the alleles has a selective advantage over the other. We find that when the allele frequencies in the population are already in equilibrium, then the genealogy does not differ much from the neutral case. This is supported by rigorous results. Furthermore, we describe the ancestral selection graph for other selective models with finitely many selection classes, such as K-allele models, infinitely-many-alleles models, DNA sequence models, and infinitely-many-sites models, and briefly discuss the diploid case.
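
The neutral baseline, Kingman's coalescent, can be simulated in a few lines (a standard textbook construction, with time measured in units of the population size):

```python
import random

def coalescent_tmrca(k, rng):
    """Time to the most recent common ancestor of a sample of k genes
    under Kingman's coalescent: with i lineages present, the waiting
    time to the next coalescence is exponential with rate i*(i-1)/2."""
    t, i = 0.0, k
    while i > 1:
        t += rng.expovariate(i * (i - 1) / 2)
        i -= 1
    return t

rng = random.Random(0)
times = [coalescent_tmrca(10, rng) for _ in range(20000)]
# theory: E[T_MRCA] = 2 * (1 - 1/k) = 1.8 for k = 10
print(round(sum(times) / len(times), 2))
```

The ancestral selection graph generalizes this by adding branching events at a rate proportional to the selection intensity, which are later resolved into the true genealogy.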

  19. A novel heterogeneous training sample selection method on space-time adaptive processing

    Science.gov (United States)

    Wang, Qiang; Zhang, Yongshun; Guo, Yiduo

    2018-04-01

    The ground target detection performance of space-time adaptive processing (STAP) degrades when training samples are contaminated by target-like signals, which makes the clutter power non-homogeneous. In order to solve this problem, a novel non-homogeneous training sample selection method based on sample similarity is proposed, which converts the training sample selection into a convex optimization problem. Firstly, the existing deficiencies of sample selection using the generalized inner product (GIP) are analyzed. Secondly, the similarities of different training samples are obtained by calculating the mean-Hausdorff distance, so as to reject the contaminated training samples. Thirdly, the cell under test (CUT) and the residual training samples are projected into the orthogonal subspace of the target in the CUT, and mean-Hausdorff distances between the projected CUT and training samples are calculated. Fourthly, the distances are sorted by value, and the training samples with the largest distances are preferentially selected to realize the dimension reduction. Finally, simulation results with Mountain-Top data verify the effectiveness of the proposed method.
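
The similarity step can be illustrated with SciPy's Hausdorff distance. This is a sketch: the symmetric Hausdorff distance stands in for the paper's mean-Hausdorff variant, and the point clouds and contamination offset are invented:

```python
import numpy as np
from scipy.spatial.distance import directed_hausdorff

def hausdorff(u, v):
    # symmetric Hausdorff distance between two point sets
    return max(directed_hausdorff(u, v)[0], directed_hausdorff(v, u)[0])

rng = np.random.default_rng(3)
cut = rng.normal(size=(16, 2))                         # cell under test
clean = [rng.normal(size=(16, 2)) for _ in range(8)]   # homogeneous samples
contaminated = rng.normal(size=(16, 2)) + [10.0, 0.0]  # target-like component

samples = clean + [contaminated]
dists = [hausdorff(cut, s) for s in samples]
# the sample least similar to the cell under test is rejected
print(int(np.argmax(dists)))  # index 8: the contaminated sample
```

Ranking training samples by such a distance to the CUT is what lets the method drop target-contaminated cells before estimating the clutter covariance.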

  20. Evaluation of pump pulsation in respirable size-selective sampling: part II. Changes in sampling efficiency.

    Science.gov (United States)

    Lee, Eun Gyung; Lee, Taekhee; Kim, Seung Won; Lee, Larry; Flemmer, Michael M; Harper, Martin

    2014-01-01

    This second, and concluding, part of this study evaluated changes in sampling efficiency of respirable size-selective samplers due to air pulsations generated by the selected personal sampling pumps characterized in Part I (Lee E, Lee L, Möhlmann C et al. Evaluation of pump pulsation in respirable size-selective sampling: Part I. Pulsation measurements. Ann Occup Hyg 2013). Nine particle sizes of monodisperse ammonium fluorescein (from 1 to 9 μm mass median aerodynamic diameter) were generated individually by a vibrating orifice aerosol generator from dilute solutions of fluorescein in aqueous ammonia and then injected into an environmental chamber. To collect these particles, 10-mm nylon cyclones, also known as Dorr-Oliver (DO) cyclones, were used with five medium volumetric flow rate pumps. Those were the Apex IS, HFS513, GilAir5, Elite5, and Basic5 pumps, which were found in Part I to generate pulsations of 5% (the lowest), 25%, 30%, 56%, and 70% (the highest), respectively. GK2.69 cyclones were used with the Legacy [pump pulsation (PP) = 15%] and Elite12 (PP = 41%) pumps for collection at high flows. The DO cyclone was also used to evaluate changes in sampling efficiency due to pulse shape. The HFS513 pump, which generates a more complex pulse shape, was compared to a single sine wave fluctuation generated by a piston. The luminescent intensity of the fluorescein extracted from each sample was measured with a luminescence spectrometer. Sampling efficiencies were obtained by dividing the intensity of the fluorescein extracted from the filter placed in a cyclone with the intensity obtained from the filter used with a sharp-edged reference sampler. Then, sampling efficiency curves were generated using a sigmoid function with three parameters and each sampling efficiency curve was compared to that of the reference cyclone by constructing bias maps. In general, no change in sampling efficiency (bias under ±10%) was observed until pulsations exceeded 25% for the
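
The curve-fitting step can be sketched with SciPy. The sigmoid form and the efficiency values below are assumptions for illustration, not the study's data:

```python
import numpy as np
from scipy.optimize import curve_fit

def sigmoid(d, a, b, c):
    # three-parameter sigmoid: efficiency falls from a toward 0 around d = b
    return a / (1.0 + np.exp((d - b) / c))

# hypothetical efficiencies for nine monodisperse sizes (1-9 um)
sizes = np.arange(1.0, 10.0)
eff = np.array([0.99, 0.98, 0.95, 0.80, 0.50, 0.25, 0.10, 0.04, 0.02])

(a, b, c), _ = curve_fit(sigmoid, sizes, eff, p0=[1.0, 5.0, 1.0])
print(round(b, 1))  # diameter at 50% efficiency (the "cut point")
```

Comparing such fitted curves for a pulsating pump against the sharp-edged reference sampler is what the bias maps in the abstract summarize.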

  1. Label-free selective plane illumination microscopy of tissue samples

    Directory of Open Access Journals (Sweden)

    Muteb Alharbi

    2017-10-01

    Conclusion: Overall, this method meets current needs for 3D imaging of tissue samples in a label-free manner. Label-free Selective Plane Microscopy directly provides excellent information about the structure of tissue samples. This work has highlighted the superiority of Label-free Selective Plane Microscopy over current approaches to label-free 3D imaging of tissue.

  2. Patch-based visual tracking with online representative sample selection

    Science.gov (United States)

    Ou, Weihua; Yuan, Di; Li, Donghao; Liu, Bin; Xia, Daoxun; Zeng, Wu

    2017-05-01

    Occlusion is one of the most challenging problems in visual object tracking. Recently, a lot of discriminative methods have been proposed to deal with this problem. For the discriminative methods, it is difficult to select the representative samples for the target template updating. In general, the holistic bounding boxes that contain tracked results are selected as the positive samples. However, when the objects are occluded, this simple strategy easily introduces the noises into the training data set and the target template and then leads the tracker to drift away from the target seriously. To address this problem, we propose a robust patch-based visual tracker with online representative sample selection. Different from previous works, we divide the object and the candidates into several patches uniformly and propose a score function to calculate the score of each patch independently. Then, the average score is adopted to determine the optimal candidate. Finally, we utilize the non-negative least square method to find the representative samples, which are used to update the target template. The experimental results on the object tracking benchmark 2013 and on the 13 challenging sequences show that the proposed method is robust to the occlusion and achieves promising results.

  3. Selection of the Sample for Data-Driven $Z \to \nu\bar{\nu}$ Background Estimation

    CERN Document Server

    Krauss, Martin

    2009-01-01

    The topic of this study was to improve the selection of the sample for data-driven Z → νν̄ background estimation, which is a major contribution in supersymmetric searches in a no-lepton search mode. The data is based on Z → ℓ+ℓ− samples created with ATLAS simulation software. This method works if two leptons are reconstructed, but with cuts that are typical for SUSY searches the reconstruction efficiency for electrons and muons is rather low. For this reason an attempt was made to enhance the data sample. Events were therefore considered where only one electron was reconstructed. In this case the invariant mass of the electron and each jet was computed, selecting the jet whose pairing best matches the Z boson mass as the non-reconstructed electron. This way the sample can be extended, but it significantly loses purity because background events are also reconstructed. To improve this method, other variables would have to be considered which were not available for this study. Applying a similar method to muons using ...

  4. Mineral Composition of Selected Serbian Propolis Samples

    Directory of Open Access Journals (Sweden)

    Tosic Snezana

    2017-06-01

    Full Text Available The aim of this work was to determine the content of 22 macro- and microelements in ten raw Serbian propolis samples, which differ in geographical and botanical origin as well as in pollutant content, by atomic emission spectrometry with inductively coupled plasma (ICP-OES). The macroelements were the more abundant: Ca content was the highest, while Na content was the lowest. Among the studied essential trace elements, Fe was the most abundant. The levels of toxic elements (Pb, Cd, As and Hg) were also analyzed, since these are possible environmental contaminants that could be transferred into propolis products for human consumption. As and Hg were not detected in any of the analyzed samples, but a high level of Pb (2.0-9.7 mg/kg) was detected, so only selected portions of raw propolis could be used to produce natural medicines and dietary supplements for humans. The obtained results were statistically analyzed, and the examined samples showed a wide range of element contents.

  5. Robust online tracking via adaptive samples selection with saliency detection

    Science.gov (United States)

    Yan, Jia; Chen, Xi; Zhu, QiuPing

    2013-12-01

    Online tracking has shown to be successful in tracking of previously unknown objects. However, there are two important factors which lead to drift problem of online tracking, the one is how to select the exact labeled samples even when the target locations are inaccurate, and the other is how to handle the confusors which have similar features with the target. In this article, we propose a robust online tracking algorithm with adaptive samples selection based on saliency detection to overcome the drift problem. To deal with the problem of degrading the classifiers using mis-aligned samples, we introduce the saliency detection method to our tracking problem. Saliency maps and the strong classifiers are combined to extract the most correct positive samples. Our approach employs a simple yet saliency detection algorithm based on image spectral residual analysis. Furthermore, instead of using the random patches as the negative samples, we propose a reasonable selection criterion, in which both the saliency confidence and similarity are considered with the benefits that confusors in the surrounding background are incorporated into the classifiers update process before the drift occurs. The tracking task is formulated as a binary classification via online boosting framework. Experiment results in several challenging video sequences demonstrate the accuracy and stability of our tracker.

  6. Using maximum entropy modeling for optimal selection of sampling sites for monitoring networks

    Science.gov (United States)

    Stohlgren, Thomas J.; Kumar, Sunil; Barnett, David T.; Evangelista, Paul H.

    2011-01-01

    Environmental monitoring programs must efficiently describe state shifts. We propose using maximum entropy modeling to select dissimilar sampling sites to capture environmental variability at low cost, and demonstrate a specific application: sample site selection for the Central Plains domain (453,490 km2) of the National Ecological Observatory Network (NEON). We relied on four environmental factors: mean annual temperature and precipitation, elevation, and vegetation type. A “sample site” was defined as a 20 km × 20 km area (equal to NEON’s airborne observation platform [AOP] footprint), within which each 1 km2 cell was evaluated for each environmental factor. After each model run, the most environmentally dissimilar site was selected from all potential sample sites. The iterative selection of eight sites captured approximately 80% of the environmental envelope of the domain, an improvement over stratified random sampling and simple random designs for sample site selection. This approach can be widely used for cost-efficient selection of survey and monitoring sites.
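
The iterative pick-the-most-dissimilar-site loop can be sketched directly. This is a simplified greedy stand-in for the MaxEnt-based ranking; the sites and the four environmental factors are random placeholders:

```python
import numpy as np

rng = np.random.default_rng(7)

# 200 candidate sites x 4 standardized environmental factors
# (e.g. temperature, precipitation, elevation, vegetation score)
sites = rng.normal(size=(200, 4))

def greedy_dissimilar(sites, k):
    """Iteratively pick the site farthest (in environment space) from
    everything chosen so far, starting from the most extreme site."""
    chosen = [int(np.argmax(np.linalg.norm(sites - sites.mean(0), axis=1)))]
    while len(chosen) < k:
        # distance of every candidate to its nearest already-chosen site
        d = np.min(
            np.linalg.norm(sites[:, None, :] - sites[chosen][None, :, :], axis=2),
            axis=1,
        )
        chosen.append(int(np.argmax(d)))
    return chosen

picks = greedy_dissimilar(sites, 8)
print(picks)  # 8 mutually dissimilar sample sites
```

Each iteration adds the candidate whose environment is farthest from all selected sites, which is why a handful of sites can cover most of the environmental envelope.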

  7. Pierre Gy's sampling theory and sampling practice heterogeneity, sampling correctness, and statistical process control

    CERN Document Server

    Pitard, Francis F

    1993-01-01

    Pierre Gy's Sampling Theory and Sampling Practice, Second Edition is a concise, step-by-step guide for process variability management and methods. Updated and expanded, this new edition provides a comprehensive study of heterogeneity, covering the basic principles of sampling theory and its various applications. It presents many practical examples to allow readers to select appropriate sampling protocols and assess the validity of sampling protocols from others. The variability of dynamic process streams using variography is discussed to help bridge sampling theory with statistical process control. Many descriptions of good sampling devices, as well as descriptions of poor ones, are featured to educate readers on what to look for when purchasing sampling systems. The book uses its accessible, tutorial style to focus on professional selection and use of methods. The book will be a valuable guide for mineral processing engineers; metallurgists; geologists; miners; chemists; environmental scientists; and practit...

  8. Data Quality Objectives For Selecting Waste Samples For Bench-Scale Reformer Treatability Studies

    International Nuclear Information System (INIS)

    Banning, D.L.

    2011-01-01

    This document describes the data quality objectives for selecting archived samples located at the 222-S Laboratory for bench-scale reformer testing. The type, quantity, and quality of the data required to select the samples for fluidized bed steam reformer testing are discussed. In order to maximize efficiency and minimize the time needed to treat Hanford tank waste in the Waste Treatment and Immobilization Plant, additional treatment processes may be required. One of the potential treatment processes is the fluidized bed steam reformer. A determination of the adequacy of the fluidized bed steam reformer process to treat Hanford tank waste is required. The initial step in determining this adequacy is to select archived waste samples from the 222-S Laboratory that will be used in bench-scale tests. Analyses of the selected samples will be required to confirm that the samples meet the shipping requirements and for comparison to the bench-scale reformer (BSR) test sample selection requirements.

  9. A quick method based on SIMPLISMA-KPLS for simultaneously selecting outlier samples and informative samples for model standardization in near infrared spectroscopy

    Science.gov (United States)

    Li, Li-Na; Ma, Chang-Ming; Chang, Ming; Zhang, Ren-Cheng

    2017-12-01

    A novel method based on SIMPLe-to-use Interactive Self-modeling Mixture Analysis (SIMPLISMA) and Kernel Partial Least Squares (KPLS), named SIMPLISMA-KPLS, is proposed in this paper for the simultaneous selection of outlier samples and informative samples. It is a quick algorithm for model standardization (also termed model transfer) in near infrared (NIR) spectroscopy. NIR data from a corn experiment for the analysis of protein content are used to evaluate the proposed method. Piecewise direct standardization (PDS) is employed in model transfer, and SIMPLISMA-PDS-KPLS and KS-PDS-KPLS are compared in terms of the prediction accuracy for protein content and the calculation speed of each algorithm. The conclusions include that SIMPLISMA-KPLS can be used as an alternative sample selection method for model transfer. Although its accuracy is similar to Kennard-Stone (KS), it differs from KS in that it employs concentration information in the selection program. This ensures that analyte information is involved in the analysis, and that the spectra (X) of the selected samples are interrelated with the concentrations (y). It can also be used for outlier elimination at the same time, by validation of the calibration. The statistics on running time make clear that the sample selection process is more rapid when using KPLS. The quick SIMPLISMA-KPLS algorithm is beneficial for improving the speed of online measurement using NIR spectroscopy.

  10. 40 CFR 761.247 - Sample site selection for pipe segment removal.

    Science.gov (United States)

    2010-07-01

    ... end of the pipe segment. (3) If the pipe segment is cut with a saw or other mechanical device, take..., take samples from a total of seven segments. (A) Sample the first and last segments removed. (B) Select... total length for purposes of disposal, take samples of each segment that is 1/2 mile distant from the...

  11. An improved selective sampling method

    International Nuclear Information System (INIS)

    Miyahara, Hiroshi; Iida, Nobuyuki; Watanabe, Tamaki

    1986-01-01

    The coincidence methods currently used for the accurate activity standardisation of radionuclides require dead-time and resolving-time corrections, which tend to become increasingly uncertain as count rates exceed about 10 k. To reduce the dependence on such corrections, Muller, in 1981, proposed the selective sampling method, using a fast multichannel analyser (50 ns ch-1) for measuring the count rates. It is in many ways more convenient, and potentially more reliable, to replace the MCA with scalers, and a circuit is described employing five scalers, two of which serve to measure the background correction. Comparisons of our new method with the coincidence method for measuring the activity of 60Co sources yielded agreement within statistical uncertainties. (author)

  12. Balanced sampling

    NARCIS (Netherlands)

    Brus, D.J.

    2015-01-01

    In balanced sampling a linear relation between the soil property of interest and one or more covariates with known means is exploited in selecting the sampling locations. Recent developments make this sampling design attractive for statistical soil surveys. This paper introduces balanced sampling

  13. 40 CFR 205.171-2 - Test exhaust system sample selection and preparation.

    Science.gov (United States)

    2010-07-01

    ... Systems § 205.171-2 Test exhaust system sample selection and preparation. (a)(1) Exhaust systems... 40 Protection of Environment 24 2010-07-01 2010-07-01 false Test exhaust system sample selection and preparation. 205.171-2 Section 205.171-2 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY...

  14. 40 CFR 205.57-2 - Test vehicle sample selection.

    Science.gov (United States)

    2010-07-01

    ... pursuant to a test request in accordance with this subpart will be selected in the manner specified in the... then using a table of random numbers to select the number of vehicles as specified in paragraph (c) of... with the designated AQL are contained in Appendix I, Table II. (c) The appropriate batch sample size...

  15. Biological sample collector

    Science.gov (United States)

    Murphy, Gloria A [French Camp, CA

    2010-09-07

    A biological sample collector is adapted to collect several biological samples in a plurality of filter wells. A biological sample collector may comprise a manifold plate for mounting a filter plate thereon, the filter plate having a plurality of filter wells therein; a hollow slider for engaging and positioning a tube that slides therethrough; and a slide case within which the hollow slider travels to allow the tube to be aligned with a selected filter well of the plurality of filter wells, wherein when the tube is aligned with the selected filter well, the tube is pushed through the hollow slider and into the selected filter well to sealingly engage the selected filter well and to allow the tube to deposit a biological sample onto a filter in the bottom of the selected filter well. The biological sample collector may be portable.

  16. Hybrid nested sampling algorithm for Bayesian model selection applied to inverse subsurface flow problems

    International Nuclear Information System (INIS)

    Elsheikh, Ahmed H.; Wheeler, Mary F.; Hoteit, Ibrahim

    2014-01-01

    A Hybrid Nested Sampling (HNS) algorithm is proposed for efficient Bayesian model calibration and prior model selection. The proposed algorithm combines the Nested Sampling (NS) algorithm, Hybrid Monte Carlo (HMC) sampling, and gradient estimation using the Stochastic Ensemble Method (SEM). NS is an efficient sampling algorithm that can be used for Bayesian calibration and estimating the Bayesian evidence for prior model selection. Nested sampling has the advantage of computational feasibility. Within the nested sampling algorithm, a constrained sampling step is performed. For this step, we utilize HMC to reduce the correlation between successive sampled states. HMC relies on the gradient of the logarithm of the posterior distribution, which we estimate using a stochastic ensemble method based on an ensemble of directional derivatives. SEM only requires forward model runs; the simulator is then used as a black box and no adjoint code is needed. The developed HNS algorithm is successfully applied for Bayesian calibration and prior model selection of several nonlinear subsurface flow problems.

  17. Hybrid nested sampling algorithm for Bayesian model selection applied to inverse subsurface flow problems

    Energy Technology Data Exchange (ETDEWEB)

    Elsheikh, Ahmed H., E-mail: aelsheikh@ices.utexas.edu [Institute for Computational Engineering and Sciences (ICES), University of Texas at Austin, TX (United States); Institute of Petroleum Engineering, Heriot-Watt University, Edinburgh EH14 4AS (United Kingdom); Wheeler, Mary F. [Institute for Computational Engineering and Sciences (ICES), University of Texas at Austin, TX (United States); Hoteit, Ibrahim [Department of Earth Sciences and Engineering, King Abdullah University of Science and Technology (KAUST), Thuwal (Saudi Arabia)

    2014-02-01

    A Hybrid Nested Sampling (HNS) algorithm is proposed for efficient Bayesian model calibration and prior model selection. The proposed algorithm combines the Nested Sampling (NS) algorithm, Hybrid Monte Carlo (HMC) sampling, and gradient estimation using the Stochastic Ensemble Method (SEM). NS is an efficient sampling algorithm that can be used for Bayesian calibration and estimating the Bayesian evidence for prior model selection. Nested sampling has the advantage of computational feasibility. Within the nested sampling algorithm, a constrained sampling step is performed. For this step, we utilize HMC to reduce the correlation between successive sampled states. HMC relies on the gradient of the logarithm of the posterior distribution, which we estimate using a stochastic ensemble method based on an ensemble of directional derivatives. SEM only requires forward model runs; the simulator is then used as a black box and no adjoint code is needed. The developed HNS algorithm is successfully applied for Bayesian calibration and prior model selection of several nonlinear subsurface flow problems.

  18. Hybrid nested sampling algorithm for Bayesian model selection applied to inverse subsurface flow problems

    KAUST Repository

    Elsheikh, Ahmed H.

    2014-02-01

    A Hybrid Nested Sampling (HNS) algorithm is proposed for efficient Bayesian model calibration and prior model selection. The proposed algorithm combines the Nested Sampling (NS) algorithm, Hybrid Monte Carlo (HMC) sampling, and gradient estimation using the Stochastic Ensemble Method (SEM). NS is an efficient sampling algorithm that can be used for Bayesian calibration and estimating the Bayesian evidence for prior model selection. Nested sampling has the advantage of computational feasibility. Within the nested sampling algorithm, a constrained sampling step is performed. For this step, we utilize HMC to reduce the correlation between successive sampled states. HMC relies on the gradient of the logarithm of the posterior distribution, which we estimate using a stochastic ensemble method based on an ensemble of directional derivatives. SEM only requires forward model runs; the simulator is then used as a black box and no adjoint code is needed. The developed HNS algorithm is successfully applied for Bayesian calibration and prior model selection of several nonlinear subsurface flow problems. © 2013 Elsevier Inc.

  19. Does self-selection affect samples' representativeness in online surveys? An investigation in online video game research.

    Science.gov (United States)

    Khazaal, Yasser; van Singer, Mathias; Chatton, Anne; Achab, Sophia; Zullino, Daniele; Rothen, Stephane; Khan, Riaz; Billieux, Joel; Thorens, Gabriel

    2014-07-07

    The number of medical studies performed through online surveys has increased dramatically in recent years. Despite their numerous advantages (e.g., sample size, facilitated access to individuals presenting stigmatizing issues), selection bias may exist in online surveys. However, evidence on the representativeness of self-selected samples in online studies is patchy. Our objective was to explore the representativeness of a self-selected sample of online gamers using online players' virtual characters (avatars). All avatars belonged to individuals playing World of Warcraft (WoW), currently the most widely used online game. Avatars' characteristics were defined using various game scores reported on WoW's official website, and two self-selected samples from previous studies were compared with a randomly selected sample of avatars. We used scores linked to 1240 avatars (762 from the self-selected samples and 478 from the random sample). The two self-selected samples of avatars had higher scores on most of the assessed variables (except for guild membership and exploration). Furthermore, some guilds were overrepresented in the self-selected samples. Our results suggest that more proficient players, or players more involved in the game, may be more likely to participate in online surveys. Caution is needed in the interpretation of studies based on online surveys that used a self-selection recruitment procedure. Epidemiological evidence on the reduced representativeness of samples in online surveys is warranted.

  20. Proposal for selecting an ore sample from mining shaft under Kvanefjeld

    International Nuclear Information System (INIS)

    Lund Clausen, F.

    1979-02-01

    Uranium ore recovered from the tunnel under Kvanefjeld (Greenland) will be processed in a pilot plant. Selection of a fully representative ore sample for both the whole area and single local sites is discussed. A FORTRAN program for ore distribution is presented, in order to enable correct sampling. (EG)

  1. Privacy problems in the small sample selection

    Directory of Open Access Journals (Sweden)

    Loredana Cerbara

    2013-05-01

    Full Text Available The side of social research that uses small samples for the production of micro data today faces operating difficulties due to the privacy law. The privacy code is an important and necessary law because it guarantees the Italian citizen's rights, as already happens in other countries of the world. However, it does not seem appropriate to further limit the data-production possibilities of the national research centres. Those possibilities are, moreover, already compromised by insufficient funds, a problem that is becoming more and more frequent in the research field. It would therefore be necessary to include in the law the possibility of using telephone lists to select samples for activities directly of interest and importance to the citizen, such as data collection carried out on the basis of opinion polls by the research centres of the Italian CNR and some universities.

  2. Adult health study reference papers. Selection of the sample. Characteristics of the sample

    Energy Technology Data Exchange (ETDEWEB)

    Beebe, G W; Fujisawa, Hideo; Yamasaki, Mitsuru

    1960-12-14

    The characteristics and selection of the clinical sample have been described in some detail to provide information on the comparability of the exposure groups with respect to factors excluded from the matching criteria and to provide basic descriptive information potentially relevant to individual studies that may be done within the framework of the Adult Health Study. The characteristics under review here are age, sex, many different aspects of residence, marital status, occupation and industry, details of location and shielding ATB, acute radiation signs and symptoms, and prior ABCC medical or pathology examinations. 5 references, 57 tables.

  3. [A comparison of convenience sampling and purposive sampling].

    Science.gov (United States)

    Suen, Lee-Jen Wu; Huang, Hui-Man; Lee, Hao-Hsien

    2014-06-01

    Convenience sampling and purposive sampling are two different sampling methods. This article first explains sampling terms such as target population, accessible population, simple random sampling, intended sample, actual sample, and statistical power analysis. These terms are then used to explain the difference between "convenience sampling" and "purposive sampling." Convenience sampling is a non-probabilistic sampling technique applicable to qualitative or quantitative studies, although it is most frequently used in quantitative studies. In convenience samples, subjects more readily accessible to the researcher are more likely to be included. Thus, in quantitative studies, the opportunity to participate is not equal for all qualified individuals in the target population, and study results are not necessarily generalizable to this population. As in all quantitative studies, increasing the sample size increases the statistical power of the convenience sample. In contrast, purposive sampling is typically used in qualitative studies. Researchers who use this technique carefully select subjects based on study purpose, with the expectation that each participant will provide unique and rich information of value to the study. As a result, members of the accessible population are not interchangeable, and sample size is determined by data saturation, not by statistical power analysis.

  4. Field-based random sampling without a sampling frame: control selection for a case-control study in rural Africa.

    Science.gov (United States)

    Crampin, A C; Mwinuka, V; Malema, S S; Glynn, J R; Fine, P E

    2001-01-01

    Selection bias, particularly of controls, is common in case-control studies and may materially affect the results. Methods of control selection should be tailored both for the risk factors and disease under investigation and for the population being studied. We present here a control selection method devised for a case-control study of tuberculosis in rural Africa (Karonga, northern Malawi) that selects an age/sex frequency-matched random sample of the population, with a geographical distribution in proportion to the population density. We also present an audit of the selection process, and discuss the potential of this method in other settings.

  5. An Improved Nested Sampling Algorithm for Model Selection and Assessment

    Science.gov (United States)

    Zeng, X.; Ye, M.; Wu, J.; WANG, D.

    2017-12-01

    Multimodel strategies are a general approach for treating model structure uncertainty in recent research. The unknown groundwater system is represented by several plausible conceptual models, and each alternative conceptual model is assigned a weight that represents its plausibility. In the Bayesian framework, the posterior model weight is computed as the product of the model's prior weight and its marginal likelihood (also termed model evidence). As a result, estimating marginal likelihoods is crucial for reliable model selection and assessment in multimodel analysis. The nested sampling estimator (NSE) is a newly proposed algorithm for marginal likelihood estimation. The implementation of NSE searches the parameter space gradually from the low-likelihood area to the high-likelihood area, and this evolution is carried out iteratively via a local sampling procedure. Thus, the efficiency of NSE is dominated by the strength of the local sampling procedure. Currently, the Metropolis-Hastings (M-H) algorithm and its variants are often used for local sampling in NSE. However, M-H is not an efficient sampling algorithm for high-dimensional or complex likelihood functions. To improve the performance of NSE, it is feasible to integrate a more efficient and elaborate sampling algorithm, DREAMzs, into the local sampling. In addition, to overcome the computational burden of the large number of repeated model executions required in marginal likelihood estimation, an adaptive sparse grid stochastic collocation method is used to build surrogates for the original groundwater model.
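The low-to-high-likelihood evolution that nested sampling performs can be illustrated with a minimal, self-contained sketch. This is not the NSE/DREAMzs implementation the record describes: the constrained local-sampling step is replaced here by naive rejection from the prior, and the toy problem (uniform prior on [-1, 1], Gaussian likelihood) is an assumption chosen so the true evidence is known analytically.

```python
import math
import random

def logaddexp(a, b):
    """Numerically stable log(exp(a) + exp(b))."""
    if a == -math.inf:
        return b
    m = max(a, b)
    return m + math.log(math.exp(a - m) + math.exp(b - m))

def nested_sampling(log_like, prior_draw, n_live, n_iter, rng):
    """Toy nested sampling: estimates the log-evidence log Z = log ∫ L(θ) dπ(θ).
    The constrained step uses rejection from the prior; production codes
    replace it with MCMC/HMC moves, as the abstracts above describe."""
    live = [prior_draw(rng) for _ in range(n_live)]
    live_ll = [log_like(x) for x in live]
    log_z = -math.inf
    for i in range(n_iter):
        worst = min(range(n_live), key=lambda j: live_ll[j])
        ll_star = live_ll[worst]
        # prior volume shrinks geometrically: X_i ≈ exp(-i / n_live)
        log_w = -i / n_live + math.log(1.0 - math.exp(-1.0 / n_live))
        log_z = logaddexp(log_z, ll_star + log_w)
        while True:  # draw a replacement with likelihood above L*
            x = prior_draw(rng)
            if log_like(x) > ll_star:
                live[worst], live_ll[worst] = x, log_like(x)
                break
    # add the contribution of the remaining live points
    log_x = -n_iter / n_live - math.log(n_live)
    for ll in live_ll:
        log_z = logaddexp(log_z, ll + log_x)
    return log_z

# Uniform prior on [-1, 1], Gaussian likelihood with sigma = 0.2:
# true Z ≈ 0.5 * sigma * sqrt(2*pi) ≈ 0.2507, so log Z ≈ -1.38
sigma = 0.2
log_z = nested_sampling(
    log_like=lambda t: -t * t / (2.0 * sigma * sigma),
    prior_draw=lambda r: r.uniform(-1.0, 1.0),
    n_live=50, n_iter=150, rng=random.Random(7))
```

With 50 live points the statistical scatter of the log-evidence estimate is a few tenths, which is why real applications use many more live points and an efficient constrained sampler.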

  6. Approaches to sampling and case selection in qualitative research: examples in the geography of health.

    Science.gov (United States)

    Curtis, S; Gesler, W; Smith, G; Washburn, S

    2000-04-01

    This paper focuses on the question of sampling (or selection of cases) in qualitative research. Although the literature includes some very useful discussions of qualitative sampling strategies, the question of sampling often seems to receive less attention in methodological discussion than questions of how data are collected or analysed. Decisions about sampling are likely to be important in many qualitative studies (although it may not be an issue in some research). There are varying accounts of the principles applicable to sampling or case selection. Those who espouse 'theoretical sampling', based on a 'grounded theory' approach, are in some ways opposed to those who promote forms of 'purposive sampling' suitable for research informed by an existing body of social theory. Diversity also results from the many different methods for drawing purposive samples which are applicable to qualitative research. We explore the value of a framework suggested by Miles and Huberman [Miles, M., Huberman, A., 1994. Qualitative Data Analysis. Sage, London.] to evaluate the sampling strategies employed in three examples of research by the authors. Our examples comprise three studies which respectively involve selection of: 'healing places'; rural places which incorporated national anti-malarial policies; young male interviewees, identified as either chronically ill or disabled. The examples are used to show how in these three studies the (sometimes conflicting) requirements of the different criteria were resolved, as well as the potential and constraints placed on the research by the selection decisions which were made. We also consider how far the criteria Miles and Huberman suggest seem helpful for planning 'sample' selection in qualitative research.

  7. Choosing a suitable sample size in descriptive sampling

    International Nuclear Information System (INIS)

    Lee, Yong Kyun; Choi, Dong Hoon; Cha, Kyung Joon

    2010-01-01

    Descriptive sampling (DS) is an alternative to crude Monte Carlo sampling (CMCS) in finding solutions to structural reliability problems. It is known to be an effective sampling method in approximating the distribution of a random variable because it uses the deterministic selection of sample values and their random permutation. However, because this method is difficult to apply to complex simulations, the sample size is occasionally determined without thorough consideration. Input sample variability may cause the sample size to change between runs, leading to poor simulation results. This paper proposes a numerical method for choosing a suitable sample size for use in DS. Using this method, one can estimate a more accurate probability of failure in a reliability problem while running a minimal number of simulations. The method is then applied to several examples and compared with CMCS and conventional DS to validate its usefulness and efficiency.
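The core of DS, deterministic selection of sample values combined with their random permutation, can be sketched as follows. This is an illustrative toy, not the paper's code; the exponential example is an assumption chosen because its inverse CDF and true mean (equal to 1) are known in closed form.

```python
import math
import random

def descriptive_sample(n, inv_cdf, rng):
    """Descriptive sampling: deterministic, equally spaced probabilities
    (stratum midpoints), randomly permuted, then mapped through the
    inverse CDF of the target distribution."""
    u = [(i + 0.5) / n for i in range(n)]  # deterministic selection
    rng.shuffle(u)                         # random permutation
    return [inv_cdf(p) for p in u]

# Standard exponential via its inverse CDF; the true mean is 1
inv_exp = lambda p: -math.log(1.0 - p)
sample = descriptive_sample(1000, inv_exp, random.Random(0))
est_mean = sum(sample) / len(sample)
```

Because the sample values are fixed by the strata and only their order is random, the variance of such estimates is much lower than with crude Monte Carlo at the same sample size.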

  8. Selecting Sample Preparation Workflows for Mass Spectrometry-Based Proteomic and Phosphoproteomic Analysis of Patient Samples with Acute Myeloid Leukemia.

    Science.gov (United States)

    Hernandez-Valladares, Maria; Aasebø, Elise; Selheim, Frode; Berven, Frode S; Bruserud, Øystein

    2016-08-22

    Global mass spectrometry (MS)-based proteomic and phosphoproteomic studies of acute myeloid leukemia (AML) biomarkers represent a powerful strategy to identify and confirm proteins and their phosphorylated modifications that could be applied in diagnosis and prognosis, as a support for individual treatment regimens and selection of patients for bone marrow transplant. MS-based studies require optimal and reproducible workflows that allow a satisfactory coverage of the proteome and its modifications. Preparation of samples for global MS analysis is a crucial step and it usually requires method testing, tuning and optimization. Different proteomic workflows that have been used to prepare AML patient samples for global MS analysis usually include a standard protein in-solution digestion procedure with a urea-based lysis buffer. The enrichment of phosphopeptides from AML patient samples has previously been carried out either with immobilized metal affinity chromatography (IMAC) or metal oxide affinity chromatography (MOAC). We have recently tested several methods of sample preparation for MS analysis of the AML proteome and phosphoproteome and introduced filter-aided sample preparation (FASP) as a superior methodology for the sensitive and reproducible generation of peptides from patient samples. FASP-prepared peptides can be further fractionated or IMAC-enriched for proteome or phosphoproteome analyses. Herein, we will review both in-solution and FASP-based sample preparation workflows and encourage the use of the latter for the highest protein and phosphorylation coverage and reproducibility.

  9. Hybrid nested sampling algorithm for Bayesian model selection applied to inverse subsurface flow problems

    KAUST Repository

    Elsheikh, Ahmed H.; Wheeler, Mary Fanett; Hoteit, Ibrahim

    2014-01-01

    A Hybrid Nested Sampling (HNS) algorithm is proposed for efficient Bayesian model calibration and prior model selection. The proposed algorithm combines the Nested Sampling (NS) algorithm, Hybrid Monte Carlo (HMC) sampling and gradient estimation using

  10. Statistical sampling strategies

    International Nuclear Information System (INIS)

    Andres, T.H.

    1987-01-01

    Systems assessment codes use mathematical models to simulate natural and engineered systems. Probabilistic systems assessment codes carry out multiple simulations to reveal the uncertainty in values of output variables due to uncertainty in the values of the model parameters. In this paper, methods are described for sampling sets of parameter values to be used in a probabilistic systems assessment code. Three Monte Carlo parameter selection methods are discussed: simple random sampling, Latin hypercube sampling, and sampling using two-level orthogonal arrays. Three post-selection transformations are also described: truncation, importance transformation, and discretization. Advantages and disadvantages of each method are summarized
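Of the three Monte Carlo selection methods listed above, Latin hypercube sampling can be sketched compactly. This is an illustrative implementation on the unit cube, not code from the paper: each parameter's range is split into equal strata, each stratum is sampled exactly once, and shuffling decouples the strata across parameters.

```python
import random

def latin_hypercube(n_samples, n_dims, rng):
    """Latin hypercube sampling on the unit cube: every dimension is split
    into n_samples equal strata, each stratum contributes exactly one draw,
    and a per-dimension shuffle breaks correlation between dimensions."""
    points = [[0.0] * n_dims for _ in range(n_samples)]
    for d in range(n_dims):
        # one uniform draw inside each stratum [k/n, (k+1)/n)
        column = [(k + rng.random()) / n_samples for k in range(n_samples)]
        rng.shuffle(column)
        for i in range(n_samples):
            points[i][d] = column[i]
    return points

pts = latin_hypercube(10, 2, random.Random(42))
```

In a probabilistic assessment code, each coordinate would then be pushed through the inverse CDF of the corresponding model parameter (and possibly truncated, importance-transformed, or discretized, as the abstract's post-selection transformations describe).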

  11. New sorbent materials for selective extraction of cocaine and benzoylecgonine from human urine samples.

    Science.gov (United States)

    Bujak, Renata; Gadzała-Kopciuch, Renata; Nowaczyk, Alicja; Raczak-Gutknecht, Joanna; Kordalewska, Marta; Struck-Lewicka, Wiktoria; Waszczuk-Jankowska, Małgorzata; Tomczak, Ewa; Kaliszan, Michał; Buszewski, Bogusław; Markuszewski, Michał J

    2016-02-20

    An increase in cocaine consumption has been observed in Europe during the last decade. Benzoylecgonine, the main urinary metabolite of cocaine in humans, is so far the most reliable marker of cocaine consumption. Determination of cocaine and its metabolite in complex biological samples such as urine or blood requires efficient and selective sample pretreatment. In this preliminary study, newly synthesized sorbent materials are proposed for the selective extraction of cocaine and benzoylecgonine from urine samples. Application of these sorbent media allowed cocaine and benzoylecgonine to be determined in urine samples at a concentration level of 100 ng/ml with good recovery values of 81.7% ± 6.6 and 73.8% ± 4.2, respectively. The newly synthesized materials provided efficient, inexpensive and selective extraction of both cocaine and benzoylecgonine from urine samples, which can consequently lead to an increase in the sensitivity of the currently available screening diagnostic tests. Copyright © 2015 Elsevier B.V. All rights reserved.

  12. Selecting Sample Preparation Workflows for Mass Spectrometry-Based Proteomic and Phosphoproteomic Analysis of Patient Samples with Acute Myeloid Leukemia

    Directory of Open Access Journals (Sweden)

    Maria Hernandez-Valladares

    2016-08-01

    Full Text Available Global mass spectrometry (MS)-based proteomic and phosphoproteomic studies of acute myeloid leukemia (AML) biomarkers represent a powerful strategy to identify and confirm proteins and their phosphorylated modifications that could be applied in diagnosis and prognosis, as a support for individual treatment regimens and selection of patients for bone marrow transplant. MS-based studies require optimal and reproducible workflows that allow a satisfactory coverage of the proteome and its modifications. Preparation of samples for global MS analysis is a crucial step and it usually requires method testing, tuning and optimization. Different proteomic workflows that have been used to prepare AML patient samples for global MS analysis usually include a standard protein in-solution digestion procedure with a urea-based lysis buffer. The enrichment of phosphopeptides from AML patient samples has previously been carried out either with immobilized metal affinity chromatography (IMAC) or metal oxide affinity chromatography (MOAC). We have recently tested several methods of sample preparation for MS analysis of the AML proteome and phosphoproteome and introduced filter-aided sample preparation (FASP) as a superior methodology for the sensitive and reproducible generation of peptides from patient samples. FASP-prepared peptides can be further fractionated or IMAC-enriched for proteome or phosphoproteome analyses. Herein, we will review both in-solution and FASP-based sample preparation workflows and encourage the use of the latter for the highest protein and phosphorylation coverage and reproducibility.

  13. Frequency-Selective Signal Sensing with Sub-Nyquist Uniform Sampling Scheme

    DEFF Research Database (Denmark)

    Pierzchlewski, Jacek; Arildsen, Thomas

    2015-01-01

    In this paper the authors discuss a problem of acquisition and reconstruction of a signal polluted by adjacent-channel interference. The authors propose a method to find a sub-Nyquist uniform sampling pattern which allows for correct reconstruction of selected frequencies. The method is inspired by the Restricted Isometry Property, which is known from the field of compressed sensing. Then, compressed sensing is used to successfully reconstruct a wanted signal even if some of the uniform samples were randomly lost, e.g. due to ADC saturation. An experiment which tests the proposed method in practice...

  14. Data Quality Objectives For Selecting Waste Samples To Test The Fluid Bed Steam Reformer Test

    International Nuclear Information System (INIS)

    Banning, D.L.

    2010-01-01

    This document describes the data quality objectives to select archived samples located at the 222-S Laboratory for Fluid Bed Steam Reformer testing. The type, quantity and quality of the data required to select the samples for Fluid Bed Steam Reformer testing are discussed. In order to maximize the efficiency and minimize the time to treat Hanford tank waste in the Waste Treatment and Immobilization Plant, additional treatment processes may be required. One of the potential treatment processes is the fluid bed steam reformer (FBSR). A determination of the adequacy of the FBSR process to treat Hanford tank waste is required. The initial step in determining the adequacy of the FBSR process is to select archived waste samples from the 222-S Laboratory that will be used to test the FBSR process. Analyses of the selected samples will be required to confirm the samples meet the testing criteria.

  15. Sampling or gambling

    Energy Technology Data Exchange (ETDEWEB)

    Gy, P.M.

    1981-12-01

    Sampling can be compared to no other technique. A mechanical sampler must above all be selected according to its aptitude for suppressing or reducing all components of the sampling error. Sampling is said to be correct when it gives all elements making up the batch of matter submitted to sampling a uniform probability of being selected. A sampler must be correctly designed, built, installed, operated and maintained. When the conditions of sampling correctness are not strictly respected, the sampling error can no longer be controlled and can, unknown to the user, be unacceptably large: the sample is no longer representative. The implementation of an incorrect sampler is a form of gambling, and this paper intends to show that at this game the user is nearly always the loser in the long run. The users' and the manufacturers' interests may diverge, and the standards which should safeguard the users' interests very often fail to do so by tolerating or even recommending incorrect techniques, such as the implementation of too-narrow cutters traveling too fast through the stream to be sampled.

  16. Nested sampling algorithm for subsurface flow model selection, uncertainty quantification, and nonlinear calibration

    KAUST Repository

    Elsheikh, A. H.

    2013-12-01

    Calibration of subsurface flow models is an essential step for managing groundwater aquifers, designing contaminant remediation plans, and maximizing recovery from hydrocarbon reservoirs. We investigate an efficient sampling algorithm known as nested sampling (NS), which can simultaneously sample the posterior distribution for uncertainty quantification and estimate the Bayesian evidence for model selection. Model selection statistics, such as the Bayesian evidence, are needed to choose or assign different weights to different models of different levels of complexity. In this work, we report the first successful application of nested sampling for calibration of several nonlinear subsurface flow problems. The Bayesian evidence estimated by the NS algorithm is used to weight different parameterizations of the subsurface flow models (prior model selection). The results of the numerical evaluation implicitly enforced Occam's razor, where simpler models with fewer parameters are favored over complex models. The proper level of model complexity was automatically determined based on the information content of the calibration data and the data mismatch of the calibrated model.
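The evidence-based weighting of model parameterizations mentioned in this record amounts to normalizing the (prior-weighted) evidences across candidate models. A small sketch, with the function name and example numbers being illustrative rather than from the paper:

```python
import math

def model_weights(log_evidences, log_priors=None):
    """Posterior model weights from per-model log-evidences and optional
    log prior model weights, normalized stably via log-sum-exp."""
    if log_priors is None:
        log_priors = [0.0] * len(log_evidences)  # uniform prior over models
    log_post = [le + lp for le, lp in zip(log_evidences, log_priors)]
    m = max(log_post)
    log_norm = m + math.log(sum(math.exp(v - m) for v in log_post))
    return [math.exp(v - log_norm) for v in log_post]

# Two candidate models whose log-evidences differ by ln 4
w = model_weights([-10.0, -10.0 - math.log(4.0)])
# → weights 0.8 and 0.2
```

Working in log space matters in practice: evidences of subsurface flow models can easily underflow a double if exponentiated directly.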

  17. Observed Characteristics and Teacher Quality: Impacts of Sample Selection on a Value Added Model

    Science.gov (United States)

    Winters, Marcus A.; Dixon, Bruce L.; Greene, Jay P.

    2012-01-01

    We measure the impact of observed teacher characteristics on student math and reading proficiency using a rich dataset from Florida. We expand upon prior work by accounting directly for nonrandom attrition of teachers from the classroom in a sample selection framework. We find evidence that sample selection is present in the estimation of the…

  18. Sampling point selection for energy estimation in the quasicontinuum method

    NARCIS (Netherlands)

    Beex, L.A.A.; Peerlings, R.H.J.; Geers, M.G.D.

    2010-01-01

    The quasicontinuum (QC) method reduces computational costs of atomistic calculations by using interpolation between a small number of so-called repatoms to represent the displacements of the complete lattice and by selecting a small number of sampling atoms to estimate the total potential energy of

  19. Size selective isocyanate aerosols personal air sampling using porous plastic foams

    International Nuclear Information System (INIS)

    Cong Khanh Huynh; Trinh Vu Duc

    2009-01-01

    As part of a European project (SMT4-CT96-2137), various European institutions specialized in occupational hygiene (BGIA, HSL, IOM, INRS, IST, Ambiente e Lavoro) established a program of scientific collaboration to develop one or more prototypes of European personal samplers for the simultaneous collection of three dust fractions: inhalable, thoracic and respirable. These samplers, based on existing sampling heads (IOM, GSP and cassettes), use polyurethane foam (PUF) plugs whose porosity provides both the sampling support and the size-selective separation of the particles. In this study, the authors present an original application of size-selective personal air sampling, using chemically impregnated PUF to capture and derivatize isocyanate aerosols in industrial spray-painting shops.

  20. Sample selection based on kernel-subclustering for the signal reconstruction of multifunctional sensors

    International Nuclear Information System (INIS)

    Wang, Xin; Wei, Guo; Sun, Jinwei

    2013-01-01

    Signal reconstruction methods based on inverse modeling for multifunctional sensors have been widely studied in recent years. To improve the accuracy, the reconstruction methods have become more and more complicated because of the increase in model parameters and sample points. However, there is another factor that affects the reconstruction accuracy, the position of the sample points, which has not been studied. A reasonable selection of the sample points could improve the signal reconstruction quality in at least two ways: improved accuracy with the same number of sample points, or the same accuracy obtained with a smaller number of sample points. Both ways are valuable for improving accuracy and decreasing workload, especially for large batches of multifunctional sensors. In this paper, we propose a sample selection method based on kernel-subclustering to distill groupings of the sample data and produce a representation of the data set for inverse modeling. The method calculates the distance between two data points based on the kernel-induced distance instead of the conventional distance. The kernel function is a generalization of the distance metric, mapping data that are non-separable in the original space into homogeneous groups in the high-dimensional space. The method obtained the best results compared with the other three methods in the simulation. (paper)
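
    The kernel-induced distance at the heart of the method expands into kernel evaluations only. The sketch below is an illustration under stated assumptions, not the paper's algorithm: it uses an RBF kernel with a made-up width and a simple farthest-first traversal to pick representative sample points, whereas the paper builds full subclusters.

```python
import math

def rbf(x, y, gamma=0.5):
    # RBF kernel; gamma is an assumed width, not a value from the paper.
    return math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(x, y)))

def kernel_dist(x, y):
    # Kernel-induced squared distance:
    # ||phi(x) - phi(y)||^2 = k(x,x) - 2 k(x,y) + k(y,y).
    return rbf(x, x) - 2.0 * rbf(x, y) + rbf(y, y)

def select_representatives(points, k):
    # Farthest-first traversal under the kernel-induced distance: start from
    # point 0, then repeatedly add the point farthest from the chosen set.
    chosen = [0]
    while len(chosen) < k:
        best, best_d = None, -1.0
        for i, p in enumerate(points):
            if i in chosen:
                continue
            d = min(kernel_dist(p, points[j]) for j in chosen)
            if d > best_d:
                best, best_d = i, d
        chosen.append(best)
    return chosen

# Two tight clusters; two representatives should come from different clusters.
points = [(0.0, 0.0), (0.1, 0.0), (0.0, 0.1),
          (5.0, 5.0), (5.1, 5.0), (5.0, 5.1)]
reps = select_representatives(points, 2)
```

    Computing distances through k(·,·) rather than through coordinates is what lets points that are non-separable in the original space fall into homogeneous groups in the implicit high-dimensional space.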

  1. Accounting for animal movement in estimation of resource selection functions: sampling and data analysis.

    Science.gov (United States)

    Forester, James D; Im, Hae Kyung; Rathouz, Paul J

    2009-12-01

    Patterns of resource selection by animal populations emerge as a result of the behavior of many individuals. Statistical models that describe these population-level patterns of habitat use can miss important interactions between individual animals and characteristics of their local environment; however, identifying these interactions is difficult. One approach to this problem is to incorporate models of individual movement into resource selection models. To do this, we propose a model for step selection functions (SSF) that is composed of a resource-independent movement kernel and a resource selection function (RSF). We show that standard case-control logistic regression may be used to fit the SSF; however, the sampling scheme used to generate control points (i.e., the definition of availability) must be accommodated. We used three sampling schemes to analyze simulated movement data and found that ignoring sampling and the resource-independent movement kernel yielded biased estimates of selection. The level of bias depended on the method used to generate control locations, the strength of selection, and the spatial scale of the resource map. Using empirical or parametric methods to sample control locations produced biased estimates under stronger selection; however, we show that the addition of a distance function to the analysis substantially reduced that bias. Assuming a uniform availability within a fixed buffer yielded strongly biased selection estimates that could be corrected by including the distance function but remained inefficient relative to the empirical and parametric sampling methods. As a case study, we used location data collected from elk in Yellowstone National Park, USA, to show that selection and bias may be temporally variable. Because under constant selection the amount of bias depends on the scale at which a resource is distributed in the landscape, we suggest that distance always be included as a covariate in SSF analyses. 
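
    The availability definitions compared in the abstract differ only in how control locations are generated around a used location. The sketch below illustrates two of them under assumed parameters (a 2-unit buffer radius, an exponential step-length kernel with mean 1.5); the helper that attaches a distance covariate reflects the authors' recommendation that distance always enter the SSF analysis.

```python
import math, random

random.seed(3)

def sample_controls_uniform(start, radius, n):
    # "Uniform availability" scheme: controls uniform within a fixed buffer.
    out = []
    while len(out) < n:
        x = random.uniform(-radius, radius)
        y = random.uniform(-radius, radius)
        if x * x + y * y <= radius * radius:   # rejection step keeps the disk
            out.append((start[0] + x, start[1] + y))
    return out

def sample_controls_parametric(start, mean_step, n):
    # Parametric movement kernel: exponential step lengths, uniform angles.
    out = []
    for _ in range(n):
        r = random.expovariate(1.0 / mean_step)
        a = random.uniform(0.0, 2.0 * math.pi)
        out.append((start[0] + r * math.cos(a), start[1] + r * math.sin(a)))
    return out

def with_distance_covariate(start, points):
    # Attach the distance-to-start covariate before case-control regression.
    return [(p, math.hypot(p[0] - start[0], p[1] - start[1])) for p in points]

uniform_controls = sample_controls_uniform((0.0, 0.0), 2.0, 50)
param_controls = sample_controls_parametric((0.0, 0.0), 1.5, 50)
annotated = with_distance_covariate((0.0, 0.0), uniform_controls)
```

    The annotated control points would then enter a standard case-control logistic regression against the observed steps, with the distance covariate absorbing the resource-independent movement kernel.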

  2. HICOSMO - cosmology with a complete sample of galaxy clusters - I. Data analysis, sample selection and luminosity-mass scaling relation

    Science.gov (United States)

    Schellenberger, G.; Reiprich, T. H.

    2017-08-01

    The X-ray regime, where the most massive visible component of galaxy clusters, the intracluster medium, is visible, offers directly measured quantities, like the luminosity, and derived quantities, like the total mass, to characterize these objects. The aim of this project is to analyse a complete sample of galaxy clusters in detail and constrain cosmological parameters, like the matter density, Ωm, or the amplitude of initial density fluctuations, σ8. The purely X-ray flux-limited sample (HIFLUGCS) consists of the 64 X-ray brightest galaxy clusters, which are excellent targets to study the systematic effects that can bias results. We analysed in total 196 Chandra observations of the 64 HIFLUGCS clusters, with a total exposure time of 7.7 Ms. Here, we present our data analysis procedure (including automated substructure detection and energy-band optimization for surface brightness profile analysis) that gives individually determined, robust total mass estimates. These masses are tested against dynamical and Planck Sunyaev-Zeldovich (SZ) derived masses of the same clusters, where good overall agreement is found with the dynamical masses. The Planck SZ masses seem to show a mass-dependent bias relative to our hydrostatic masses; possible biases in this mass-mass comparison are discussed, including the Planck selection function. Furthermore, we show the results for the (0.1-2.4) keV luminosity versus mass scaling relation. The overall slope of the sample (1.34) is in agreement with expectations and values from the literature. Splitting the sample into galaxy groups and clusters reveals, even after a selection bias correction, that galaxy groups exhibit a significantly steeper slope (1.88) compared to clusters (1.06).
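
    The quoted slopes (1.34 overall, 1.88 for groups, 1.06 for clusters) are power-law indices of the luminosity-mass relation, i.e. straight-line slopes in log-log space. The sketch below recovers such a slope by ordinary least squares on synthetic, noise-free clusters that obey L ∝ M^1.34 by construction; the mass range and normalization are invented, and the paper's actual fit also accounts for measurement errors and selection bias, which plain OLS does not.

```python
import math, random

random.seed(0)

def loglog_slope(masses, lums):
    # Ordinary least squares of log10(L) on log10(M): (slope, intercept).
    xs = [math.log10(m) for m in masses]
    ys = [math.log10(l) for l in lums]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

# Synthetic, noise-free clusters obeying log10(L) = 1.34*log10(M) - 10;
# 64 objects to echo the sample size, but the values are not HIFLUGCS data.
masses = [10 ** random.uniform(13.5, 15.0) for _ in range(64)]
lums = [10 ** (1.34 * math.log10(m) - 10.0) for m in masses]
slope, _ = loglog_slope(masses, lums)
```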

  3. A Uniformly Selected Sample of Low-mass Black Holes in Seyfert 1 Galaxies. II. The SDSS DR7 Sample

    Science.gov (United States)

    Liu, He-Yang; Yuan, Weimin; Dong, Xiao-Bo; Zhou, Hongyan; Liu, Wen-Juan

    2018-04-01

    A new sample of 204 low-mass black holes (LMBHs) in active galactic nuclei (AGNs) is presented, with black hole masses in the range (1–20) × 10^5 M⊙. The AGNs are selected through a systematic search among galaxies in the Seventh Data Release (DR7) of the Sloan Digital Sky Survey (SDSS), with careful analyses of their optical spectra and precise measurement of spectral parameters. Combining them with our previous sample selected from SDSS DR4 makes this the largest LMBH sample so far, totaling over 500 objects. Some of the statistical properties of the combined LMBH AGN sample are briefly discussed in the context of exploring the low-mass end of the AGN population. Their X-ray luminosities follow the extension of the previously known correlation with the [O III] luminosity. The effective optical-to-X-ray spectral indices α_OX, albeit with a large scatter, are broadly consistent with the extension of the relation with the near-UV luminosity L_2500Å. Interestingly, a correlation of α_OX with black hole mass is also found, with α_OX being statistically flatter (stronger X-ray relative to optical) for lower black hole masses. Only 26 objects, mostly radio loud, were detected in radio at 20 cm in the FIRST survey, giving a radio-loud fraction of 4%. The host galaxies of LMBHs have stellar masses in the range 10^8.8–10^12.4 M⊙ and optical colors typical of Sbc spirals. They are dominated by young stellar populations that seem to have undergone continuous star formation history.

  4. Evaluation of Stress Loaded Steel Samples Using Selected Electromagnetic Methods

    International Nuclear Information System (INIS)

    Chady, T.

    2004-01-01

    In this paper the magnetic leakage flux and eddy current methods were used to evaluate changes in material properties caused by stress. Seven samples made of ferromagnetic material with different levels of applied stress were prepared. First, the leakage magnetic fields were measured by scanning the surface of the specimens with a GMR gradiometer. Next, the same samples were evaluated using an eddy current sensor. A comparison between the results obtained from both methods was carried out. Finally, selected parameters of the measured signal were calculated and used to evaluate the level of the applied stress. A strong correspondence between the amount of applied stress and the maximum amplitude of the signal derivative was confirmed.

  5. The Impact of Selection, Gene Conversion, and Biased Sampling on the Assessment of Microbial Demography.

    Science.gov (United States)

    Lapierre, Marguerite; Blin, Camille; Lambert, Amaury; Achaz, Guillaume; Rocha, Eduardo P C

    2016-07-01

    Recent studies have linked demographic changes and epidemiological patterns in bacterial populations using coalescent-based approaches. We identified 26 studies using skyline plots and found that 21 inferred overall population expansion. This surprising result led us to analyze the impact of natural selection, recombination (gene conversion), and sampling biases on demographic inference using skyline plots and site frequency spectra (SFS). Forward simulations based on biologically relevant parameters from Escherichia coli populations showed that theoretical arguments on the detrimental impact of recombination and especially natural selection on the reconstructed genealogies cannot be ignored in practice. In fact, both processes systematically lead to spurious interpretations of population expansion in skyline plots (and in SFS for selection). Weak purifying selection, and especially positive selection, had important effects on skyline plots, showing patterns akin to those of population expansions. State-of-the-art techniques to remove recombination further amplified these biases. We simulated three common sampling biases in microbiological research: uniform, clustered, and mixed sampling. Alone, or together with recombination and selection, they further mislead demographic inferences, producing almost any possible skyline shape or SFS. Interestingly, sampling sub-populations also affected skyline plots and SFS, because the coalescent rates of populations and their sub-populations had different distributions. This study suggests that extreme caution is needed to infer demographic changes solely based on reconstructed genealogies. We suggest that the development of novel sampling strategies and the joint analyses of diverse population genetic methods are strictly necessary to estimate demographic changes in populations where selection, recombination, and biased sampling are present. © The Author 2016.

  6. Systematic sampling of discrete and continuous populations: sample selection and the choice of estimator

    Science.gov (United States)

    Harry T. Valentine; David L. R. Affleck; Timothy G. Gregoire

    2009-01-01

    Systematic sampling is easy, efficient, and widely used, though it is not generally recognized that a systematic sample may be drawn from the population of interest with or without restrictions on randomization. The restrictions or the lack of them determine which estimators are unbiased, when using the sampling design as the basis for inference. We describe the...
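
    A systematic sample in its most common restricted-randomization form fixes the sampling interval and randomizes only the starting unit. A minimal sketch (a frame size divisible by the sample size is assumed for simplicity):

```python
import random

random.seed(42)

def systematic_sample(population, n):
    # Equal-interval systematic sample: pick every k-th unit, k = N // n,
    # starting from a single random offset in [0, k).
    N = len(population)
    k = N // n
    start = random.randrange(k)
    return [population[start + i * k] for i in range(n)]

units = list(range(100))                 # an assumed frame of 100 units
sample = systematic_sample(units, 10)
```

    Because only the start is randomized, which estimators remain unbiased under the design depends on that restriction — the issue the abstract's estimator discussion addresses.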

  7. 40 CFR 761.306 - Sampling 1 meter square surfaces by random selection of halves.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 30 2010-07-01 2010-07-01 false Sampling 1 meter square surfaces by...(b)(3) § 761.306 Sampling 1 meter square surfaces by random selection of halves. (a) Divide each 1 meter square portion where it is necessary to collect a surface wipe test sample into two equal (or as...

  8. Selection bias in population-based cancer case-control studies due to incomplete sampling frame coverage.

    Science.gov (United States)

    Walsh, Matthew C; Trentham-Dietz, Amy; Gangnon, Ronald E; Nieto, F Javier; Newcomb, Polly A; Palta, Mari

    2012-06-01

    Increasing numbers of individuals are choosing to opt out of population-based sampling frames due to privacy concerns. This is especially a problem in the selection of controls for case-control studies, as the cases often arise from relatively complete population-based registries, whereas control selection requires a sampling frame. If opt-out is also related to risk factors, bias can arise. We linked breast cancer cases who reported having a valid driver's license from the 2004-2008 Wisconsin women's health study (N = 2,988) with a master list of licensed drivers from the Wisconsin Department of Transportation (WDOT). This master list excludes Wisconsin drivers who requested that their information not be sold by the state. Multivariate-adjusted selection probability ratios (SPRs) were calculated to estimate the potential bias when using this driver's license sampling frame to select controls. A total of 962 cases (32%) had opted out of the WDOT sampling frame. Cases aged <40 (SPR = 0.90), with income either unreported (SPR = 0.89) or greater than $50,000 (SPR = 0.94), lower parity (SPR = 0.96 per one-child decrease), and hormone use (SPR = 0.93) were significantly less likely to be covered by the WDOT sampling frame (at the α = 0.05 level). Our results indicate the potential for selection bias due to differential opt-out between various demographic and behavioral subgroups of controls. As selection bias may differ by exposure and study base, the assessment of potential bias needs to be ongoing. SPRs can be used to predict the direction of bias when cases and controls stem from different sampling frames in population-based case-control studies.
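
    A selection probability ratio compares a subgroup's chance of being covered by the sampling frame with the overall coverage. The study's SPRs are multivariate-adjusted; the sketch below computes only the crude, unadjusted analogue, and the subgroup counts (90 of 150) are hypothetical — only the overall figures (962 opt-outs among 2,988 cases) come from the abstract.

```python
def selection_probability_ratio(covered_sub, total_sub, covered_all, total_all):
    # Crude (unadjusted) SPR: subgroup frame-coverage rate over the overall
    # coverage rate. The study reports multivariate-adjusted SPRs instead.
    return (covered_sub / total_sub) / (covered_all / total_all)

# From the abstract: 962 of 2,988 cases had opted out of the WDOT frame.
covered_all, total_all = 2988 - 962, 2988
# Hypothetical subgroup, for illustration only: 90 of 150 young cases covered.
spr = selection_probability_ratio(90, 150, covered_all, total_all)
print(round(spr, 3))   # SPR < 1 means the subgroup is under-covered
```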

  9. THE zCOSMOS-SINFONI PROJECT. I. SAMPLE SELECTION AND NATURAL-SEEING OBSERVATIONS

    Energy Technology Data Exchange (ETDEWEB)

    Mancini, C.; Renzini, A. [INAF-OAPD, Osservatorio Astronomico di Padova, Vicolo Osservatorio 5, I-35122 Padova (Italy); Foerster Schreiber, N. M.; Hicks, E. K. S.; Genzel, R.; Tacconi, L.; Davies, R. [Max-Planck-Institut fuer Extraterrestrische Physik, Giessenbachstrasse, D-85748 Garching (Germany); Cresci, G. [Osservatorio Astrofisico di Arcetri (OAF), INAF-Firenze, Largo E. Fermi 5, I-50125 Firenze (Italy); Peng, Y.; Lilly, S.; Carollo, M.; Oesch, P. [Institute of Astronomy, Department of Physics, Eidgenossische Technische Hochschule, ETH Zurich CH-8093 (Switzerland); Vergani, D.; Pozzetti, L.; Zamorani, G. [INAF-Bologna, Via Ranzani, I-40127 Bologna (Italy); Daddi, E. [CEA-Saclay, DSM/DAPNIA/Service d' Astrophysique, F-91191 Gif-Sur Yvette Cedex (France); Maraston, C. [Institute of Cosmology and Gravitation, University of Portsmouth, Dennis Sciama Building, Burnaby Road, PO1 3HE Portsmouth (United Kingdom); McCracken, H. J. [IAP, 98bis bd Arago, F-75014 Paris (France); Bouche, N. [Department of Physics, University of California, Santa Barbara, CA 93106 (United States); Shapiro, K. [Aerospace Research Laboratories, Northrop Grumman Aerospace Systems, Redondo Beach, CA 90278 (United States); and others

    2011-12-10

    The zCOSMOS-SINFONI project is aimed at studying the physical and kinematical properties of a sample of massive z ≈ 1.4-2.5 star-forming galaxies, through SINFONI near-infrared integral field spectroscopy (IFS), combined with the multiwavelength information from the zCOSMOS (COSMOS) survey. The project is based on one hour of natural-seeing observations per target, and adaptive optics (AO) follow-up for a major part of the sample, which includes 30 galaxies selected from the zCOSMOS/VIMOS spectroscopic survey. This first paper presents the sample selection, and the global physical characterization of the target galaxies from multicolor photometry, i.e., star formation rate (SFR), stellar mass, age, etc. The Hα integrated properties, such as flux, velocity dispersion, and size, are derived from the natural-seeing observations, while the follow-up AO observations will be presented in the next paper of this series. Our sample appears to be well representative of star-forming galaxies at z ≈ 2, covering a wide range in mass and SFR. The Hα integrated properties of the 25 Hα-detected galaxies are similar to those of other IFS samples at the same redshifts. Good agreement is found among the SFRs derived from the Hα luminosity and other diagnostic methods, provided the extinction affecting the Hα luminosity is about twice that affecting the continuum. A preliminary kinematic analysis, based on the maximum observed velocity difference across the source and on the integrated velocity dispersion, indicates that the sample splits nearly 50-50 into rotation-dominated and velocity-dispersion-dominated galaxies, in good agreement with previous surveys.

  10. Contributions to sampling statistics

    CERN Document Server

    Conti, Pier; Ranalli, Maria

    2014-01-01

    This book contains a selection of the papers presented at the ITACOSM 2013 Conference, held in Milan in June 2013. ITACOSM is the bi-annual meeting of the Survey Sampling Group S2G of the Italian Statistical Society, intended as an international  forum of scientific discussion on the developments of theory and application of survey sampling methodologies and applications in human and natural sciences. The book gathers research papers carefully selected from both invited and contributed sessions of the conference. The whole book appears to be a relevant contribution to various key aspects of sampling methodology and techniques; it deals with some hot topics in sampling theory, such as calibration, quantile-regression and multiple frame surveys, and with innovative methodologies in important topics of both sampling theory and applications. Contributions cut across current sampling methodologies such as interval estimation for complex samples, randomized responses, bootstrap, weighting, modeling, imputati...

  11. Sample size methodology

    CERN Document Server

    Desu, M M

    2012-01-01

    One of the most important problems in designing an experiment or a survey is sample size determination and this book presents the currently available methodology. It includes both random sampling from standard probability distributions and from finite populations. Also discussed is sample size determination for estimating parameters in a Bayesian setting by considering the posterior distribution of the parameter and specifying the necessary requirements. The determination of the sample size is considered for ranking and selection problems as well as for the design of clinical trials. Appropria
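
    For estimating a proportion, the determination the book covers reduces in the simplest case to the normal-approximation formula n = z²p(1−p)/d². A minimal sketch (the 95% z-value and the example inputs are illustrative):

```python
import math

def sample_size_proportion(p, margin, z=1.96):
    # n = z^2 * p * (1 - p) / d^2, rounded up; z = 1.96 for ~95% confidence.
    return math.ceil(z * z * p * (1 - p) / (margin * margin))

n = sample_size_proportion(0.5, 0.05)   # conservative p = 0.5, ±5% margin
```

    With the conservative p = 0.5 and a ±5% margin this gives the familiar 385 subjects; halving the margin roughly quadruples the required n.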

  12. Selective solid-phase extraction of Ni(II) by an ion-imprinted polymer from water samples

    International Nuclear Information System (INIS)

    Saraji, Mohammad; Yousefi, Hamideh

    2009-01-01

    A new ion-imprinted polymer (IIP) material was synthesized by copolymerization of 4-vinylpyridine as monomer, ethyleneglycoldimethacrylate as crosslinking agent and 2,2'-azobisisobutyronitrile as initiator in the presence of the Ni-dithizone complex. The IIP was used as the sorbent in a solid-phase extraction column. The effects of sampling volume, elution conditions, sample pH and sample flow rate on the extraction of Ni ions from water samples were studied. The maximum adsorption capacity and the relative selectivity coefficients of the imprinted polymer for Ni(II)/Co(II), Ni(II)/Cu(II) and Ni(II)/Cd(II) were calculated. Compared with non-imprinted polymer particles, the IIP had higher selectivity for Ni(II). The relative selectivity factor (α_r) values for Ni(II)/Co(II), Ni(II)/Cu(II) and Ni(II)/Cd(II) were 21.6, 54.3, and 22.7, respectively, all greater than 1. The relative standard deviation of five replicate determinations of Ni(II) was 3.4%. The detection limit for 150 mL of sample was 1.6 μg L⁻¹ using flame atomic absorption spectrometry. The developed method was successfully applied to the determination of trace nickel in water samples with satisfactory results.

  13. 40 CFR 761.308 - Sample selection by random number generation on any two-dimensional square grid.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 30 2010-07-01 2010-07-01 false Sample selection by random number... § 761.79(b)(3) § 761.308 Sample selection by random number generation on any two-dimensional square... area created in accordance with paragraph (a) of this section, select two random numbers: one each for...
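
    The regulation's selection rule — draw two random numbers, one per axis, to fix a position on the two-dimensional square grid — can be sketched in a few lines. The grid dimensions and seed below are arbitrary assumptions, and an actual compliance procedure would follow the documented random-number protocol rather than a language's default generator.

```python
import random

random.seed(2010)

def random_grid_point(n_cells_x, n_cells_y):
    # Two random numbers, one per axis, select one cell of the square grid.
    return random.randrange(n_cells_x), random.randrange(n_cells_y)

# An assumed 10 x 10 grid; pick five sampling positions.
grid_points = [random_grid_point(10, 10) for _ in range(5)]
```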

  14. Selection Component Analysis of Natural Polymorphisms using Population Samples Including Mother-Offspring Combinations, II

    DEFF Research Database (Denmark)

    Jarmer, Hanne Østergaard; Christiansen, Freddy Bugge

    1981-01-01

    Population samples including mother-offspring combinations provide information on the selection components: zygotic selection, sexual selection, gametic selection and fecundity selection, on the mating pattern, and on the deviation from linkage equilibrium among the loci studied. The theory

  15. Passive sampling of selected endocrine disrupting compounds using polar organic chemical integrative samplers

    International Nuclear Information System (INIS)

    Arditsoglou, Anastasia; Voutsa, Dimitra

    2008-01-01

    Two types of polar organic chemical integrative samplers (pharmaceutical POCIS and pesticide POCIS) were examined for their sampling efficiency of selected endocrine disrupting compounds (EDCs). Laboratory-based calibration of POCISs was conducted by exposing them at high and low concentrations of 14 EDCs (4-alkyl-phenols, their ethoxylate oligomers, bisphenol A, selected estrogens and synthetic steroids) for different time periods. The kinetic studies showed an integrative uptake up to 28 days. The sampling rates for the individual compounds were obtained. The use of POCISs could result in an integrative approach to the quality status of the aquatic systems especially in the case of high variation of water concentrations of EDCs. The sampling efficiency of POCISs under various field conditions was assessed after their deployment in different aquatic environments. - Calibration and field performance of polar organic integrative samplers for monitoring EDCs in aquatic environments

  16. 40 CFR 86.607-84 - Sample selection.

    Science.gov (United States)

    2010-07-01

    ... Auditing of New Light-Duty Vehicles, Light-Duty Trucks, and Heavy-Duty Vehicles § 86.607-84 Sample..., once a manufacturer ships any vehicle from the test sample, it relinquishes the prerogative to conduct...

  17. Electromembrane extraction as a rapid and selective miniaturized sample preparation technique for biological fluids

    DEFF Research Database (Denmark)

    Gjelstad, Astrid; Pedersen-Bjergaard, Stig; Seip, Knut Fredrik

    2015-01-01

    This special report discusses the sample preparation method electromembrane extraction, which was introduced in 2006 as a rapid and selective miniaturized extraction method. The extraction principle is based on isolation of charged analytes extracted from an aqueous sample, across a thin film… Technical aspects of electromembrane extraction, important extraction parameters, as well as a handful of examples of applications from different biological samples and bioanalytical areas are discussed in the paper…

  18. Gender Wage Gap : A Semi-Parametric Approach With Sample Selection Correction

    NARCIS (Netherlands)

    Picchio, M.; Mussida, C.

    2010-01-01

    Sizeable gender differences in employment rates are observed in many countries. Sample selection into the workforce might therefore be a relevant issue when estimating gender wage gaps. This paper proposes a new semi-parametric estimator of densities in the presence of covariates which incorporates

  19. Selective parathyroid venous sampling in primary hyperparathyroidism: A systematic review and meta-analysis.

    Science.gov (United States)

    Ibraheem, Kareem; Toraih, Eman A; Haddad, Antoine B; Farag, Mahmoud; Randolph, Gregory W; Kandil, Emad

    2018-05-14

    Minimally invasive parathyroidectomy requires accurate preoperative localization techniques. There is considerable controversy about the effectiveness of selective parathyroid venous sampling (sPVS) in patients with primary hyperparathyroidism (PHPT). The aim of this meta-analysis is to examine the diagnostic accuracy of sPVS as a preoperative localization modality in PHPT. Studies evaluating the diagnostic accuracy of sPVS for PHPT were electronically searched in the PubMed, EMBASE, Web of Science, and Cochrane Controlled Trials Register databases. Two independent authors reviewed the studies, and the revised Quality Assessment of Diagnostic Accuracy Studies tool was used for the quality assessment. Study heterogeneity and pooled estimates were calculated. Two hundred and two unique studies were identified. Of those, 12 studies were included in the meta-analysis. Pooled sensitivity, specificity, and positive likelihood ratio (PLR) of sPVS were 74%, 41%, and 1.55, respectively. The area under the receiver operating characteristic curve was 0.684, indicating an average discriminatory ability of sPVS. On comparison between sPVS and noninvasive imaging modalities, sensitivity, PLR, and positive posttest probability were significantly higher for sPVS. Interestingly, super-selective venous sampling had the highest sensitivity, accuracy, and positive posttest probability compared to other parathyroid venous sampling techniques. This is the first meta-analysis to examine the accuracy of sPVS in PHPT. sPVS had higher pooled sensitivity than noninvasive modalities in revision parathyroid surgery. However, the invasiveness of this technique does not favor its routine use for preoperative localization. Super-selective venous sampling was the most accurate among all parathyroid venous sampling techniques. Laryngoscope, 2018. © 2018 The American Laryngological, Rhinological and Otological Society, Inc.

  20. Methodology Series Module 5: Sampling Strategies

    OpenAIRE

    Setia, Maninder Singh

    2016-01-01

    Once the research question and the research design have been finalised, it is important to select the appropriate sample for the study. The method by which the researcher selects the sample is the "sampling method". There are essentially two types of sampling methods: 1) probability sampling, based on chance events (such as random numbers, flipping a coin, etc.); and 2) non-probability sampling, based on the researcher's choice and on the population that is accessible and available. Some of the non-probabilit...
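
    The chance-based branch of the module — probability sampling — is the one that supports generalization, and its simplest member is the simple random sample. A minimal sketch with an assumed frame of 500 numbered subjects:

```python
import random

random.seed(1)

def simple_random_sample(frame, n):
    # Probability sampling: every unit has the same known chance n/N,
    # and selection is without replacement.
    return random.sample(frame, n)

frame = list(range(1, 501))          # an assumed frame of 500 numbered subjects
chosen = simple_random_sample(frame, 25)
```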

  1. Statistical sampling plans

    International Nuclear Information System (INIS)

    Jaech, J.L.

    1984-01-01

    In auditing and in inspection, one selects a number of items by some set of procedures and performs measurements which are compared with the operator's values. This session considers the problem of how to select the samples to be measured, and what kinds of measurements to make. In the inspection situation, the ultimate aim is to independently verify the operator's material balance. The effectiveness of the sample plan in achieving this objective is briefly considered. The discussion focuses on the model plant

  2. A genetic algorithm-based framework for wavelength selection on sample categorization.

    Science.gov (United States)

    Anzanello, Michel J; Yamashita, Gabrielli; Marcelo, Marcelo; Fogliatto, Flávio S; Ortiz, Rafael S; Mariotti, Kristiane; Ferrão, Marco F

    2017-08-01

    In forensic and pharmaceutical scenarios, the application of chemometrics and optimization techniques has unveiled common and peculiar features of seized medicine and drug samples, helping investigative forces to track illegal operations. This paper proposes a novel framework aimed at identifying relevant subsets of attenuated total reflectance Fourier transform infrared (ATR-FTIR) wavelengths for classifying samples into two classes, for example authentic or forged categories in the case of medicines, or salt or base form in cocaine analysis. In the first step of the framework, the ATR-FTIR spectra were partitioned into equidistant intervals and the k-nearest neighbour (KNN) classification technique was applied to each interval to assign samples to the proper classes. In the next step, the selected intervals were refined through a genetic algorithm (GA), which identifies a limited number of wavelengths from the previously selected intervals so as to maximize classification accuracy. When applied to Cialis®, Viagra®, and cocaine ATR-FTIR datasets, the proposed method substantially decreased the number of wavelengths needed for categorization and increased the classification accuracy. From a practical perspective, the proposed method provides investigative forces with valuable information for monitoring the illegal production of drugs and medicines. In addition, focusing on a reduced subset of wavelengths allows the development of portable devices capable of testing the authenticity of samples during police checks, avoiding the need for later laboratory analyses and reducing equipment expenses. Theoretically, the proposed GA-based approach yields more refined solutions than current methods relying on interval approaches, which tend to insert irrelevant wavelengths in the retained intervals. Copyright © 2016 John Wiley & Sons, Ltd.
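
    The select-then-refine idea can be miniaturized end to end. The sketch below is not the paper's pipeline: it uses synthetic "spectra" in which only two of 40 wavelengths separate the classes, leave-one-out 1-NN accuracy as the GA fitness (the paper applies KNN to equidistant intervals first), and made-up GA settings (population 20, 30 generations, a 0.002-per-wavelength parsimony penalty).

```python
import random

random.seed(2017)

N_WL, N_PER_CLASS = 40, 15
INFORMATIVE = (5, 20)      # only these wavelengths separate the two classes

def make_sample(label):
    x = [random.gauss(0.0, 1.0) for _ in range(N_WL)]
    if label == 1:
        for w in INFORMATIVE:
            x[w] += 4.0
    return x

X = [make_sample(0) for _ in range(N_PER_CLASS)] + \
    [make_sample(1) for _ in range(N_PER_CLASS)]
y = [0] * N_PER_CLASS + [1] * N_PER_CLASS

def loo_knn_accuracy(mask):
    # Leave-one-out 1-NN accuracy using only the selected wavelengths.
    sel = [i for i in range(N_WL) if mask[i]]
    if not sel:
        return 0.0
    correct = 0
    for i in range(len(X)):
        best_j, best_d = None, float("inf")
        for j in range(len(X)):
            if j == i:
                continue
            d = sum((X[i][w] - X[j][w]) ** 2 for w in sel)
            if d < best_d:
                best_j, best_d = j, d
        correct += y[best_j] == y[i]
    return correct / len(X)

cache = {}
def fitness(mask):
    # Accuracy minus a small parsimony penalty per retained wavelength.
    key = tuple(mask)
    if key not in cache:
        cache[key] = loo_knn_accuracy(mask) - 0.002 * sum(mask)
    return cache[key]

def ga_select(pop_size=20, generations=30, p_mut=0.02):
    pop = [[1 if random.random() < 0.15 else 0 for _ in range(N_WL)]
           for _ in range(pop_size)]
    for _ in range(generations):
        nxt = [max(pop, key=fitness)]        # elitism: carry the best over
        while len(nxt) < pop_size:
            a, b = (max(random.sample(pop, 3), key=fitness) for _ in range(2))
            child = [random.choice(g) for g in zip(a, b)]  # uniform crossover
            child = [g ^ (random.random() < p_mut) for g in child]  # bit flips
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

best = ga_select()
acc = loo_knn_accuracy(best)
```

    Even at this scale it shows the mechanism the abstract describes: the GA converges on a small wavelength subset that classifies accurately, rather than carrying the whole spectrum.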

  3. Experimental and Sampling Design for the INL-2 Sample Collection Operational Test

    Energy Technology Data Exchange (ETDEWEB)

    Piepel, Gregory F.; Amidan, Brett G.; Matzke, Brett D.

    2009-02-16

    , sample extraction, and analytical methods to be used in the INL-2 study. For each of the five test events, the specified floor of the INL building will be contaminated with BG using a point-release device located in the room specified in the experimental design. Then quality control (QC), reference material coupon (RMC), judgmental, and probabilistic samples will be collected according to the sampling plan for each test event. Judgmental samples will be selected based on professional judgment and prior information. Probabilistic samples were selected with a random aspect and in sufficient numbers to provide desired confidence for detecting contamination or clearing uncontaminated (or decontaminated) areas. Following sample collection for a given test event, the INL building will be decontaminated. For possibly contaminated areas, the numbers of probabilistic samples were chosen to provide 95% confidence of detecting contaminated areas of specified sizes. For rooms that may be uncontaminated following a contamination event, or for whole floors after decontamination, the numbers of judgmental and probabilistic samples were chosen using the CJR approach. The numbers of samples were chosen to support making X%/Y% clearance statements with X = 95% or 99% and Y = 96% or 97%. The experimental and sampling design also provides for making X%/Y% clearance statements using only probabilistic samples. For each test event, the numbers of characterization and clearance samples were selected within limits based on operational considerations while still maintaining high confidence for detection and clearance aspects. The sampling design for all five test events contains 2085 samples, with 1142 after contamination and 943 after decontamination. These numbers include QC, RMC, judgmental, and probabilistic samples. The experimental and sampling design specified in this report provides a good statistical foundation for achieving the objectives of the INL-2 study.

  4. Coordination of Conditional Poisson Samples

    Directory of Open Access Journals (Sweden)

    Grafström Anton

    2015-12-01

    Full Text Available Sample coordination seeks to maximize or to minimize the overlap of two or more samples. The former is known as positive coordination, and the latter as negative coordination. Positive coordination is mainly used for estimation purposes and to reduce data collection costs. Negative coordination is mainly performed to diminish the response burden of the sampled units. Poisson sampling design with permanent random numbers provides an optimum coordination degree of two or more samples. The size of a Poisson sample is, however, random. Conditional Poisson (CP sampling is a modification of the classical Poisson sampling that produces a fixed-size πps sample. We introduce two methods to coordinate Conditional Poisson samples over time or simultaneously. The first one uses permanent random numbers and the list-sequential implementation of CP sampling. The second method uses a CP sample in the first selection and provides an approximate one in the second selection because the prescribed inclusion probabilities are not respected exactly. The methods are evaluated using the size of the expected sample overlap, and are compared with their competitors using Monte Carlo simulation. The new methods provide a good coordination degree of two samples, close to the performance of Poisson sampling with permanent random numbers.
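The permanent-random-number (PRN) mechanism this abstract builds on can be sketched as follows: each unit keeps a single uniform random number across occasions and enters a Poisson sample whenever that number falls below its inclusion probability, which maximizes the overlap between occasions. The inclusion probabilities below are invented for illustration; the list-sequential CP implementation itself is not shown.

```python
import random

random.seed(7)
units = list(range(100))
prn = {i: random.random() for i in units}      # permanent random numbers

def poisson_sample(incl_prob):
    """Poisson sampling: unit i is selected iff its PRN < pi_i."""
    return {i for i in units if prn[i] < incl_prob[i]}

pi_t1 = {i: 0.30 for i in units}               # occasion 1
pi_t2 = {i: 0.35 for i in units}               # occasion 2 (probabilities grew)

s1, s2 = poisson_sample(pi_t1), poisson_sample(pi_t2)
# Because pi_t2 >= pi_t1 for every unit, s1 is nested inside s2:
# the overlap is as large as it can possibly be (positive coordination).
print(len(s1), len(s2), len(s1 & s2))
```

The price of this optimal coordination in plain Poisson sampling is the random sample size, which is exactly what the conditional Poisson modification fixes.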

  5. SAMPLING IN EXTERNAL AUDIT - THE MONETARY UNIT SAMPLING METHOD

    Directory of Open Access Journals (Sweden)

    E. Dascalu

    2016-12-01

Full Text Available This article addresses the general issue of reducing the evidence investigation space in audit activities by means of sampling techniques, given that, when the data volume is significant, an exhaustive examination of the assessed population is not possible and/or effective. The presentation deals with sampling risk (in essence, the risk that a selected sample may not be representative of the overall population) in correlation with the audit risk model and its component parts (inherent risk, control risk and non-detection risk), and highlights the inter-conditionings between these two models.
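Monetary Unit Sampling itself is mechanical enough to sketch: items are laid end-to-end on a line of cumulative monetary units and every J-th unit after a random start is selected, so large items enter the sample with proportionally higher probability. The book values below are invented and chosen to sum to an exact multiple of the sample size.

```python
import itertools
import random

random.seed(1)
book_values = [1200, 50, 830, 15000, 40, 2600, 700, 95, 4100, 385]
total = sum(book_values)                    # 25,000 monetary units
n = 4                                       # desired sample size
interval = total // n                       # sampling interval J = 6250
start = random.randrange(1, interval + 1)   # random start in [1, J]

bounds = list(itertools.accumulate(book_values))  # cumulative upper bounds

selected = []
for hit in range(start, total + 1, interval):
    for idx, upper in enumerate(bounds):
        if hit <= upper:                    # the hit falls inside item idx
            if idx not in selected:         # items larger than J can be hit twice
                selected.append(idx)
            break

print(selected)
```

Note that item 3 (book value 15,000, wider than two whole intervals) is hit on every possible random start, which is the point of MUS: material amounts cannot escape the sample.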

  6. Perpendicular distance sampling: an alternative method for sampling downed coarse woody debris

    Science.gov (United States)

    Michael S. Williams; Jeffrey H. Gove

    2003-01-01

    Coarse woody debris (CWD) plays an important role in many forest ecosystem processes. In recent years, a number of new methods have been proposed to sample CWD. These methods select individual logs into the sample using some form of unequal probability sampling. One concern with most of these methods is the difficulty in estimating the volume of each log. A new method...

  7. Transfer function design based on user selected samples for intuitive multivariate volume exploration

    KAUST Repository

    Zhou, Liang

    2013-02-01

    Multivariate volumetric datasets are important to both science and medicine. We propose a transfer function (TF) design approach based on user selected samples in the spatial domain to make multivariate volumetric data visualization more accessible for domain users. Specifically, the user starts the visualization by probing features of interest on slices and the data values are instantly queried by user selection. The queried sample values are then used to automatically and robustly generate high dimensional transfer functions (HDTFs) via kernel density estimation (KDE). Alternatively, 2D Gaussian TFs can be automatically generated in the dimensionality reduced space using these samples. With the extracted features rendered in the volume rendering view, the user can further refine these features using segmentation brushes. Interactivity is achieved in our system and different views are tightly linked. Use cases show that our system has been successfully applied for simulation and complicated seismic data sets. © 2013 IEEE.

  8. Transfer function design based on user selected samples for intuitive multivariate volume exploration

    KAUST Repository

    Zhou, Liang; Hansen, Charles

    2013-01-01

    Multivariate volumetric datasets are important to both science and medicine. We propose a transfer function (TF) design approach based on user selected samples in the spatial domain to make multivariate volumetric data visualization more accessible for domain users. Specifically, the user starts the visualization by probing features of interest on slices and the data values are instantly queried by user selection. The queried sample values are then used to automatically and robustly generate high dimensional transfer functions (HDTFs) via kernel density estimation (KDE). Alternatively, 2D Gaussian TFs can be automatically generated in the dimensionality reduced space using these samples. With the extracted features rendered in the volume rendering view, the user can further refine these features using segmentation brushes. Interactivity is achieved in our system and different views are tightly linked. Use cases show that our system has been successfully applied for simulation and complicated seismic data sets. © 2013 IEEE.

  9. Selective removal of phosphate for analysis of organic acids in complex samples.

    Science.gov (United States)

    Deshmukh, Sandeep; Frolov, Andrej; Marcillo, Andrea; Birkemeyer, Claudia

    2015-04-03

    Accurate quantitation of compounds in samples of biological origin is often hampered by matrix interferences one of which occurs in GC-MS analysis from the presence of highly abundant phosphate. Consequently, high concentrations of phosphate need to be removed before sample analysis. Within this context, we screened 17 anion exchange solid-phase extraction (SPE) materials for selective phosphate removal using different protocols to meet the challenge of simultaneous recovery of six common organic acids in aqueous samples prior to derivatization for GC-MS analysis. Up to 75% recovery was achieved for the most organic acids, only the low pKa tartaric and citric acids were badly recovered. Compared to the traditional approach of phosphate removal by precipitation, SPE had a broader compatibility with common detection methods and performed more selectively among the organic acids under investigation. Based on the results of this study, it is recommended that phosphate removal strategies during the analysis of biologically relevant small molecular weight organic acids consider the respective pKa of the anticipated analytes and the detection method of choice. Copyright © 2015 Elsevier B.V. All rights reserved.

  10. A method of language sampling

    DEFF Research Database (Denmark)

    Rijkhoff, Jan; Bakker, Dik; Hengeveld, Kees

    1993-01-01

    In recent years more attention is paid to the quality of language samples in typological work. Without an adequate sampling strategy, samples may suffer from various kinds of bias. In this article we propose a sampling method in which the genetic criterion is taken as the most important: samples...... to determine how many languages from each phylum should be selected, given any required sample size....

  11. Principal Stratification in sample selection problems with non normal error terms

    DEFF Research Database (Denmark)

    Rocci, Roberto; Mellace, Giovanni

    The aim of the paper is to relax distributional assumptions on the error terms, often imposed in parametric sample selection models to estimate causal effects, when plausible exclusion restrictions are not available. Within the principal stratification framework, we approximate the true distribut...... an application to the Job Corps training program....

  12. Effective traffic features selection algorithm for cyber-attacks samples

    Science.gov (United States)

    Li, Yihong; Liu, Fangzheng; Du, Zhenyu

    2018-05-01

By studying defense schemes against network attacks, this paper proposes an effective traffic-feature selection algorithm based on k-means++ clustering to deal with the high dimensionality of the traffic features extracted from cyber-attack samples. First, the algorithm divides the original feature set into an attack traffic feature set and a background traffic feature set by clustering. Then, it calculates the variation in clustering performance after removing a certain feature. Finally, it evaluates the degree of distinctiveness of each feature according to this result; a feature is considered effective if its degree of distinctiveness exceeds a set threshold. The purpose is to select the effective features from the extracted original feature set, reducing the dimensionality of the features and thus the space-time overhead of subsequent detection. The experimental results show that the proposed algorithm is feasible and has advantages over other selection algorithms.
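The select-by-distinctiveness idea can be illustrated with a much-simplified stand-in for the clustering step: score each feature by how well it separates attack traffic from background traffic and keep the features whose score exceeds a threshold. This toy uses labelled synthetic data and a mean-separation score, not the paper's k-means++ procedure.

```python
import random
import statistics

random.seed(0)

# toy traffic records: feature 0 separates attack from background,
# feature 1 is pure noise (labels are assumed for illustration only)
attack     = [[random.gauss(5, 0.5), random.gauss(0, 1)] for _ in range(50)]
background = [[random.gauss(0, 0.5), random.gauss(0, 1)] for _ in range(50)]

def separation(att, bg, feat):
    """Crude distinctiveness of one feature: distance between class
    means divided by the pooled spread (larger = more distinguishing)."""
    a = [row[feat] for row in att]
    b = [row[feat] for row in bg]
    spread = statistics.pstdev(a + b)
    return abs(statistics.fmean(a) - statistics.fmean(b)) / spread

scores = [separation(attack, background, f) for f in range(2)]
effective = [f for f, s in enumerate(scores) if s > 1.0]   # threshold
print(scores, effective)
```

Only the separating feature survives the threshold, which mirrors the paper's goal of shrinking the feature space before detection.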

  13. 40 CFR 91.506 - Engine sample selection.

    Science.gov (United States)

    2010-07-01

... paragraph (b)(2) of this section. It defines one-tail, 95 percent confidence intervals. σ = actual test sample ... individual engine; x = mean of emission test results of the actual sample; FEL = Family Emission Limit; n = the actual ... carry-over engine families: after one engine is tested, the manufacturer will combine the test with the ...

  14. Application of the Sampling Selection Technique in Approaching Financial Audit

    Directory of Open Access Journals (Sweden)

    Victor Munteanu

    2018-03-01

Full Text Available In his professional approach, the financial auditor has a wide range of working techniques, including selection techniques. They are applied depending on the nature of the information available to the financial auditor, the manner in which it is presented (paper or electronic format) and, last but not least, the time available. Several techniques are applied, successively or in parallel, to increase the reliability of the expressed opinion and to give the audit report a solid basis of information. Sampling is used in the phase of controlling or clarifying identified errors; its main purpose is to corroborate or measure the degree of risk detected following a pertinent analysis. Since the auditor has neither the time nor the means to rebuild the information exhaustively, the sampling technique can provide an effective response to this need.

  15. Systematic sampling for suspended sediment

    Science.gov (United States)

    Robert B. Thomas

    1991-01-01

    Abstract - Because of high costs or complex logistics, scientific populations cannot be measured entirely and must be sampled. Accepted scientific practice holds that sample selection be based on statistical principles to assure objectivity when estimating totals and variances. Probability sampling--obtaining samples with known probabilities--is the only method that...
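Systematic sampling, one of the probability designs referred to above, is easy to sketch: fix the frame order, choose an interval k = N/n, and take every k-th unit after a random start, so selection probabilities are known, which is what permits objective estimation of totals and variances. The frame below is a hypothetical year of daily sediment readings.

```python
import random

def systematic_sample(population, n, seed=None):
    """Systematic sampling: frame order fixed, every k-th unit taken
    after a random start in [0, k). Probabilities are approximately
    equal when len(population) is not an exact multiple of n."""
    rng = random.Random(seed)
    k = len(population) // n                 # sampling interval
    start = rng.randrange(k)
    return [population[i] for i in range(start, start + n * k, k)]

days = list(range(365))                      # e.g. daily sediment readings
sample = systematic_sample(days, n=26, seed=42)
print(sample)
```

The fixed spacing spreads the sample evenly through time, which is why systematic designs are popular for suspended-sediment records.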

  16. Coupling methods for multistage sampling

    OpenAIRE

    Chauvet, Guillaume

    2015-01-01

    Multistage sampling is commonly used for household surveys when there exists no sampling frame, or when the population is scattered over a wide area. Multistage sampling usually introduces a complex dependence in the selection of the final units, which makes asymptotic results quite difficult to prove. In this work, we consider multistage sampling with simple random without replacement sampling at the first stage, and with an arbitrary sampling design for further stages. We consider coupling ...
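A minimal two-stage design of the kind analysed here: simple random sampling without replacement of primary units (clusters) at the first stage, then SRSWOR of final units inside each selected cluster. The frame below is invented; note that no unit-level frame is needed until a cluster is actually selected, which is the practical appeal of multistage designs.

```python
import random

def two_stage_sample(clusters, m, n_per_cluster, seed=None):
    """Two-stage design: SRSWOR of m clusters at stage one, then
    SRSWOR of units inside each selected cluster at stage two."""
    rng = random.Random(seed)
    stage1 = rng.sample(sorted(clusters), m)            # primary units
    return {c: rng.sample(clusters[c], n_per_cluster) for c in stage1}

# illustrative frame: households grouped by village
frame = {f"village_{v}": [f"hh_{v}_{h}" for h in range(20)] for v in range(10)}
picked = two_stage_sample(frame, m=3, n_per_cluster=5, seed=0)
print(picked)
```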

  17. Selection of Sampling Pumps Used for Groundwater Monitoring at the Hanford Site

    Energy Technology Data Exchange (ETDEWEB)

    Schalla, Ronald; Webber, William D.; Smith, Ronald M.

    2001-11-05

The variable frequency drive centrifugal submersible pump, Redi-Flo2, made by Grundfos, was selected for universal application for Hanford Site groundwater monitoring. Specifications for the selected pump and five other pumps were evaluated against current and future Hanford groundwater monitoring performance requirements, and the Redi-Flo2 was selected as the most versatile and applicable for the range of monitoring conditions. The Redi-Flo2 pump distinguished itself from the other pumps considered because of its wide range in output flow rate and its comparatively moderate maintenance and low capital costs. The Redi-Flo2 pump is able to purge a well at a high flow rate and then supply water for sampling at a low flow rate. Groundwater sampling using a low-volume-purging technique (e.g., low flow, minimal purge, no purge, or micropurge) is planned in the future, eliminating the need for the pump to supply a high-output flow rate. Under those conditions, the Well Wizard bladder pump, manufactured by QED Environmental Systems, Inc., may be the preferred pump because of the lower capital cost.

18. Correlations of Sc, rare earths and other elements in selected rock samples from Arrua-i

    Energy Technology Data Exchange (ETDEWEB)

    Facetti, J F; Prats, M [Asuncion Nacional Univ. (Paraguay). Inst. de Ciencias

    1972-01-01

The Sc and Eu contents in selected rock samples from the stock of Arrua-i have been determined and correlations established with other elements and with the relative amounts of some rare earths. These correlations suggest metasomatic phenomena in the formation of the rock samples.

19. Correlations of Sc, rare earths and other elements in selected rock samples from Arrua-i

    International Nuclear Information System (INIS)

    Facetti, J.F.; Prats, M.

    1972-01-01

The Sc and Eu contents in selected rock samples from the stock of Arrua-i have been determined and correlations established with other elements and with the relative amounts of some rare earths. These correlations suggest metasomatic phenomena in the formation of the rock samples.

  20. The Complete Local Volume Groups Sample - I. Sample selection and X-ray properties of the high-richness subsample

    Science.gov (United States)

    O'Sullivan, Ewan; Ponman, Trevor J.; Kolokythas, Konstantinos; Raychaudhury, Somak; Babul, Arif; Vrtilek, Jan M.; David, Laurence P.; Giacintucci, Simona; Gitti, Myriam; Haines, Chris P.

    2017-12-01

    We present the Complete Local-Volume Groups Sample (CLoGS), a statistically complete optically selected sample of 53 groups within 80 Mpc. Our goal is to combine X-ray, radio and optical data to investigate the relationship between member galaxies, their active nuclei and the hot intra-group medium (IGM). We describe sample selection, define a 26-group high-richness subsample of groups containing at least four optically bright (log LB ≥ 10.2 LB⊙) galaxies, and report the results of XMM-Newton and Chandra observations of these systems. We find that 14 of the 26 groups are X-ray bright, possessing a group-scale IGM extending at least 65 kpc and with luminosity >1041 erg s-1, while a further three groups host smaller galaxy-scale gas haloes. The X-ray bright groups have masses in the range M500 ≃ 0.5-5 × 1013 M⊙, based on system temperatures of 0.4-1.4 keV, and X-ray luminosities in the range 2-200 × 1041 erg s-1. We find that ∼53-65 per cent of the X-ray bright groups have cool cores, a somewhat lower fraction than found by previous archival surveys. Approximately 30 per cent of the X-ray bright groups show evidence of recent dynamical interactions (mergers or sloshing), and ∼35 per cent of their dominant early-type galaxies host active galactic nuclei with radio jets. We find no groups with unusually high central entropies, as predicted by some simulations, and confirm that CLoGS is in principle capable of detecting such systems. We identify three previously unrecognized groups, and find that they are either faint (LX, R500 < 1042 erg s-1) with no concentrated cool core, or highly disturbed. This leads us to suggest that ∼20 per cent of X-ray bright groups in the local universe may still be unidentified.

  1. Design-based estimators for snowball sampling

    OpenAIRE

    Shafie, Termeh

    2010-01-01

Snowball sampling, where existing study subjects recruit further subjects from among their acquaintances, is a popular approach when sampling from hidden populations. Since people with many in-links are more likely to be selected, there will be a selection bias in the samples obtained. In order to eliminate this bias, the sample data must be weighted. However, the exact selection probabilities are unknown for snowball samples and need to be approximated in an appropriate way. This paper proposes d...
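The weighting this abstract calls for is Horvitz-Thompson style: each sampled unit is weighted by the inverse of its (approximated) inclusion probability, so well-connected, easily reached people count for less. The values and probabilities below are invented; how to approximate the probabilities in a snowball sample is precisely what the paper addresses and is not shown here.

```python
# Each sampled person: (observed value, approximated inclusion probability).
# Well-connected people have higher inclusion probability and therefore
# receive smaller weights; all numbers below are invented.
sample = [(10, 0.8), (12, 0.6), (3, 0.1), (5, 0.2)]

# Horvitz-Thompson estimate of the population total: sum of y_i / pi_i
ht_total = sum(y / pi for y, pi in sample)

# Hajek-style weighted mean (weights w_i = 1 / pi_i)
weights = [1 / pi for _, pi in sample]
weighted_mean = sum(y * w for (y, _), w in zip(sample, weights)) / sum(weights)

naive_mean = sum(y for y, _ in sample) / len(sample)
print(ht_total, weighted_mean, naive_mean)
```

The unweighted mean is pulled toward the values of the high-probability (well-connected) units; the weighted mean undoes that pull.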

  2. Selective information sampling

    Directory of Open Access Journals (Sweden)

    Peter A. F. Fraser-Mackenzie

    2009-06-01

Full Text Available This study investigates the amount and valence of information selected during single item evaluation. One hundred and thirty-five participants evaluated a cell phone by reading hypothetical customer reports. Some participants were first asked to provide a preliminary rating based on a picture of the phone and some technical specifications. The participants who were given the customer reports only after they made a preliminary rating exhibited valence bias in their selection of customer reports. In contrast, the participants who did not make an initial rating sought subsequent information in a more balanced, albeit still selective, manner. The preliminary raters used the least amount of information in their final decision, resulting in faster decision times. The study appears to support the notion that selective exposure is utilized in order to develop cognitive coherence.

  3. Methodology Series Module 5: Sampling Strategies.

    Science.gov (United States)

    Setia, Maninder Singh

    2016-01-01

Once the research question and the research design have been finalised, it is important to select the appropriate sample for the study. The method by which the researcher selects the sample is the 'Sampling Method'. There are essentially two types of sampling methods: 1) probability sampling - based on chance events (such as random numbers, flipping a coin etc.); and 2) non-probability sampling - based on the researcher's choice and on the population that is accessible and available. Some of the non-probability sampling methods are: purposive sampling, convenience sampling, or quota sampling. Random sampling methods (such as simple random sampling or stratified random sampling) are a form of probability sampling. It is important to understand the different sampling methods used in clinical studies and to mention the method clearly in the manuscript. The researcher should not misrepresent the sampling method in the manuscript (such as using the term 'random sample' when the researcher has used a convenience sample). The sampling method will depend on the research question. For instance, the researcher may want to understand an issue in greater detail for one particular population rather than worry about the 'generalizability' of the results. In such a scenario, the researcher may want to use 'purposive sampling' for the study.
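The two families of methods can be contrasted in a few lines of code: a simple random sample drawn from the whole frame versus a stratified random sample drawn separately within each stratum. The frame and strata below are hypothetical.

```python
import random

rng = random.Random(0)
# toy sampling frame with a known subgroup label (hypothetical clinic roster)
frame = [{"id": i, "sex": "F" if i % 3 else "M"} for i in range(90)]

# 1) probability sampling: simple random sample without replacement
srs = rng.sample(frame, 12)

# 2) stratified random sample: SRS within each stratum
strata = {}
for person in frame:
    strata.setdefault(person["sex"], []).append(person)
stratified = [p for group in strata.values() for p in rng.sample(group, 6)]

print(len(srs), len(stratified))
```

The stratified draw guarantees six people per subgroup; the simple random sample leaves the subgroup balance to chance. Both are probability methods, unlike a convenience sample.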

  4. Methodology series module 5: Sampling strategies

    Directory of Open Access Journals (Sweden)

    Maninder Singh Setia

    2016-01-01

Full Text Available Once the research question and the research design have been finalised, it is important to select the appropriate sample for the study. The method by which the researcher selects the sample is the 'Sampling Method'. There are essentially two types of sampling methods: 1) probability sampling – based on chance events (such as random numbers, flipping a coin etc.); and 2) non-probability sampling – based on the researcher's choice and on the population that is accessible and available. Some of the non-probability sampling methods are: purposive sampling, convenience sampling, or quota sampling. Random sampling methods (such as simple random sampling or stratified random sampling) are a form of probability sampling. It is important to understand the different sampling methods used in clinical studies and to mention the method clearly in the manuscript. The researcher should not misrepresent the sampling method in the manuscript (such as using the term 'random sample' when the researcher has used a convenience sample). The sampling method will depend on the research question. For instance, the researcher may want to understand an issue in greater detail for one particular population rather than worry about the 'generalizability' of the results. In such a scenario, the researcher may want to use 'purposive sampling' for the study.

  5. Sampling Operations on Big Data

    Science.gov (United States)

    2015-11-29

Vijay Gadepally, Taylor Herr, Luke Johnson, Lauren Milechin, Maja Milosavljevic, Benjamin A. Miller, Lincoln... process and disseminate information for discovery and exploration under real-time constraints. Common signal processing operations such as sampling and... ...categories. These include edge sampling methods, where edges are selected by predetermined criteria; snowball sampling methods, where algorithms start...

  6. Optimal Selection of the Sampling Interval for Estimation of Modal Parameters by an ARMA- Model

    DEFF Research Database (Denmark)

    Kirkegaard, Poul Henning

    1993-01-01

    Optimal selection of the sampling interval for estimation of the modal parameters by an ARMA-model for a white noise loaded structure modelled as a single degree of- freedom linear mechanical system is considered. An analytical solution for an optimal uniform sampling interval, which is optimal...

  7. Random selection of items. Selection of n1 samples among N items composing a stratum

    International Nuclear Information System (INIS)

    Jaech, J.L.; Lemaire, R.J.

    1987-02-01

    STR-224 provides generalized procedures to determine required sample sizes, for instance in the course of a Physical Inventory Verification at Bulk Handling Facilities. The present report describes procedures to generate random numbers and select groups of items to be verified in a given stratum through each of the measurement methods involved in the verification. (author). 3 refs
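The random selection step the report describes, drawing n1 items out of the N composing a stratum, reduces to a seeded equal-probability draw without replacement; seeding makes the draw reproducible for later inspection, as a verification protocol requires. The numbers below are illustrative.

```python
import random

def select_items(n_items: int, n1: int, seed: int):
    """Draw n1 distinct item indices out of N with equal probability.
    A seeded generator keeps the draw reproducible, so the selection
    can be audited after the fact."""
    rng = random.Random(seed)
    return sorted(rng.sample(range(n_items), n1))

# e.g. verify 12 items out of a stratum of 200 with one measurement method
chosen = select_items(200, 12, seed=20240101)
print(chosen)
```

Repeating the call with the same seed reproduces exactly the same item list, which is the property an inspectorate needs to document its selections.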

  8. Semiparametric efficient and robust estimation of an unknown symmetric population under arbitrary sample selection bias

    KAUST Repository

    Ma, Yanyuan

    2013-09-01

    We propose semiparametric methods to estimate the center and shape of a symmetric population when a representative sample of the population is unavailable due to selection bias. We allow an arbitrary sample selection mechanism determined by the data collection procedure, and we do not impose any parametric form on the population distribution. Under this general framework, we construct a family of consistent estimators of the center that is robust to population model misspecification, and we identify the efficient member that reaches the minimum possible estimation variance. The asymptotic properties and finite sample performance of the estimation and inference procedures are illustrated through theoretical analysis and simulations. A data example is also provided to illustrate the usefulness of the methods in practice. © 2013 American Statistical Association.

  9. Magnetically separable polymer (Mag-MIP) for selective analysis of biotin in food samples.

    Science.gov (United States)

    Uzuriaga-Sánchez, Rosario Josefina; Khan, Sabir; Wong, Ademar; Picasso, Gino; Pividori, Maria Isabel; Sotomayor, Maria Del Pilar Taboada

    2016-01-01

This work presents an efficient method for the preparation of magnetic nanoparticles modified with molecularly imprinted polymers (Mag-MIP) through the core-shell method for the determination of biotin in milk food samples. The functional monomer, acrylic acid, was selected by molecular modeling; EGDMA was used as the cross-linking monomer and AIBN as the radical initiator. The Mag-MIP and Mag-NIP were characterized by FTIR, magnetic hysteresis, XRD, SEM and N2-sorption measurements. The capacity of Mag-MIP for biotin adsorption, its kinetics and its selectivity were studied in detail. The adsorption data were well described by the Freundlich isotherm model, with an adsorption equilibrium constant (KF) of 1.46 mL g(-1). The selectivity experiments revealed that the prepared Mag-MIP had higher selectivity toward biotin than toward other molecules with different chemical structures. The material was successfully applied to the determination of biotin in diverse milk samples, using HPLC for quantification of the analyte and obtaining a mean recovery of 87.4%. Copyright © 2015 Elsevier Ltd. All rights reserved.

  10. Are most samples of animals systematically biased? Consistent individual trait differences bias samples despite random sampling.

    Science.gov (United States)

    Biro, Peter A

    2013-02-01

    Sampling animals from the wild for study is something nearly every biologist has done, but despite our best efforts to obtain random samples of animals, 'hidden' trait biases may still exist. For example, consistent behavioral traits can affect trappability/catchability, independent of obvious factors such as size and gender, and these traits are often correlated with other repeatable physiological and/or life history traits. If so, systematic sampling bias may exist for any of these traits. The extent to which this is a problem, of course, depends on the magnitude of bias, which is presently unknown because the underlying trait distributions in populations are usually unknown, or unknowable. Indeed, our present knowledge about sampling bias comes from samples (not complete population censuses), which can possess bias to begin with. I had the unique opportunity to create naturalized populations of fish by seeding each of four small fishless lakes with equal densities of slow-, intermediate-, and fast-growing fish. Using sampling methods that are not size-selective, I observed that fast-growing fish were up to two-times more likely to be sampled than slower-growing fish. This indicates substantial and systematic bias with respect to an important life history trait (growth rate). If correlations between behavioral, physiological and life-history traits are as widespread as the literature suggests, then many animal samples may be systematically biased with respect to these traits (e.g., when collecting animals for laboratory use), and affect our inferences about population structure and abundance. I conclude with a discussion on ways to minimize sampling bias for particular physiological/behavioral/life-history types within animal populations.
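The bias mechanism is easy to reproduce in simulation: if catchability rises with a trait (here, growth rate), the sample mean of that trait overshoots the population mean even though every capture attempt picks individuals at random. All numbers below are invented.

```python
import random

random.seed(3)

# population: a growth-rate trait, mean 1.0 (arbitrary units)
population = [random.uniform(0.5, 1.5) for _ in range(10_000)]

# catchability proportional to growth rate: fast growers are up to
# ~2x more likely to enter the sample, echoing the paper's finding
def catch_sample(pop, n):
    caught = []
    while len(caught) < n:
        fish = random.choice(pop)            # random encounter...
        if random.random() < fish / 2.0:     # ...but trait-biased capture
            caught.append(fish)
    return caught

sample = catch_sample(population, 500)
pop_mean = sum(population) / len(population)
sample_mean = sum(sample) / len(sample)
print(pop_mean, sample_mean)   # sample mean overshoots the population mean
```

The sampling procedure looks random from the outside; the bias is invisible unless the underlying trait distribution is known, which is the paper's central warning.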

  11. Methodology Series Module 5: Sampling Strategies

    Science.gov (United States)

    Setia, Maninder Singh

    2016-01-01

Once the research question and the research design have been finalised, it is important to select the appropriate sample for the study. The method by which the researcher selects the sample is the ‘Sampling Method’. There are essentially two types of sampling methods: 1) probability sampling – based on chance events (such as random numbers, flipping a coin etc.); and 2) non-probability sampling – based on the researcher's choice and on the population that is accessible and available. Some of the non-probability sampling methods are: purposive sampling, convenience sampling, or quota sampling. Random sampling methods (such as simple random sampling or stratified random sampling) are a form of probability sampling. It is important to understand the different sampling methods used in clinical studies and to mention the method clearly in the manuscript. The researcher should not misrepresent the sampling method in the manuscript (such as using the term ‘random sample’ when the researcher has used a convenience sample). The sampling method will depend on the research question. For instance, the researcher may want to understand an issue in greater detail for one particular population rather than worry about the ‘generalizability’ of the results. In such a scenario, the researcher may want to use ‘purposive sampling’ for the study. PMID:27688438

  12. Groundwater sampling in uranium reconnaissance

    International Nuclear Information System (INIS)

    Butz, T.R.

    1977-03-01

    The groundwater sampling program is based on the premise that ground water geochemistry reflects the chemical composition of, and geochemical processes active in the strata from which the sample is obtained. Pilot surveys have shown that wells are the best source of groundwater, although springs are sampled on occasion. The procedures followed in selecting a sampling site, the sampling itself, and the field measurements, as well as the site records made, are described

  13. 40 CFR Appendix A to Subpart G of... - Sampling Plans for Selective Enforcement Auditing of Marine Engines

    Science.gov (United States)

    2010-07-01

... Protection of Environment ...-IGNITION ENGINES, Selective Enforcement Auditing Regulations. Appendix A to Subpart G of Part 91—Sampling Plans for Selective Enforcement Auditing of Marine Engines. Table 1—Sampling...

  14. Optimization of sampling parameters for standardized exhaled breath sampling.

    Science.gov (United States)

    Doran, Sophie; Romano, Andrea; Hanna, George B

    2017-09-05

The lack of standardization of breath sampling is a major contributing factor to the poor repeatability of results and hence represents a barrier to the adoption of breath tests in clinical practice. On-line and bag breath sampling have advantages but do not suit multicentre clinical studies, whereas storage and robust transport are essential for the conduct of wide-scale studies. Several devices have been developed to control sampling parameters and to concentrate volatile organic compounds (VOCs) onto thermal desorption (TD) tubes and subsequently transport those tubes for laboratory analysis. We conducted three experiments to investigate (i) the fraction of breath sampled (whole vs. lower expiratory exhaled breath); (ii) breath sample volume (125, 250, 500 and 1000 ml) and (iii) breath sample flow rate (400, 200, 100 and 50 ml/min). The target VOCs were acetone and potential volatile biomarkers for oesophago-gastric cancer belonging to the aldehyde, fatty acid and phenol chemical classes. We also examined the collection execution time and the impact of environmental contamination. The experiments showed that the use of exhaled breath-sampling devices requires the selection of optimum sampling parameters. Increasing the sample volume improved the levels of VOCs detected. However, the influence of the fraction of exhaled breath and of the flow rate depends on the target VOCs measured. The concentration of potential volatile biomarkers for oesophago-gastric cancer was not significantly different between the whole and lower airway exhaled breath. While the recovery of phenols and acetone from TD tubes was lower when breath sampling was performed at a higher flow rate, other VOCs were not affected. A dedicated 'clean air supply' overcomes the contamination from ambient air, but the breath collection device itself can be a source of contaminants. In clinical studies using VOCs to diagnose gastro-oesophageal cancer, the optimum parameters are a 500 ml sample volume

  15. 40 CFR Appendix A to Subpart F of... - Sampling Plans for Selective Enforcement Auditing of Nonroad Engines

    Science.gov (United States)

    2010-07-01

... Protection of Environment ... NONROAD COMPRESSION-IGNITION ENGINES, Selective Enforcement Auditing. Appendix A to Subpart F of Part 89—Sampling Plans for Selective Enforcement Auditing of Nonroad Engines. Table 1—Sampling...

  16. Acrylamide exposure among Turkish toddlers from selected cereal-based baby food samples.

    Science.gov (United States)

    Cengiz, Mehmet Fatih; Gündüz, Cennet Pelin Boyacı

    2013-10-01

    In this study, acrylamide exposure from selected cereal-based baby food samples was investigated among toddlers aged 1-3 years in Turkey. The study contained three steps. The first step was collecting food consumption data and toddlers' physical properties, such as gender, age and body weight, using a questionnaire given to parents by a trained interviewer between January and March 2012. The second step was determining the acrylamide levels in the food samples that were reported on by the parents in the questionnaire, using a gas chromatography-mass spectrometry (GC-MS) method. The last step was combining the determined acrylamide levels in the selected food samples with individual food consumption and body weight data, using a deterministic approach, to estimate the acrylamide exposure levels. The mean acrylamide levels of baby biscuits, breads, baby bread-rusks, crackers, biscuits, breakfast cereals and powdered cereal-based baby foods were 153, 225, 121, 604, 495, 290 and 36 μg/kg, respectively. The minimum, mean and maximum acrylamide exposures were estimated to be 0.06, 1.43 and 6.41 μg/kg BW per day, respectively. The foods contributing to acrylamide exposure, ranked from high to low, were bread, crackers, biscuits, baby biscuits, powdered cereal-based baby foods, baby bread-rusks and breakfast cereals. Copyright © 2013 Elsevier Ltd. All rights reserved.
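
The deterministic approach described above combines each food's measured acrylamide concentration with the individual's consumption and body weight. A minimal sketch of that calculation (the intake figures and body weight below are illustrative, not the study's survey data):

```python
# Deterministic dietary exposure estimate: sum over foods of
# (concentration [ug/kg food] * daily intake [kg food]) / body weight [kg].
def acrylamide_exposure(concentrations_ug_per_kg, intakes_kg_per_day, body_weight_kg):
    total_ug = sum(c * q for c, q in zip(concentrations_ug_per_kg, intakes_kg_per_day))
    return total_ug / body_weight_kg  # ug per kg BW per day

# Illustrative toddler: 30 g/day of biscuits (495 ug/kg) and
# 50 g/day of bread (225 ug/kg), 12 kg body weight.
exposure = acrylamide_exposure([495, 225], [0.030, 0.050], 12.0)
```

With these assumed intakes the estimate lands within the 0.06-6.41 μg/kg BW per day range reported in the abstract.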

  17. Stability of selected volatile breath constituents in Tedlar, Kynar and Flexfilm sampling bags

    Science.gov (United States)

    Mochalski, Paweł; King, Julian; Unterkofler, Karl; Amann, Anton

    2016-01-01

    The stability of 41 selected breath constituents in three types of polymer sampling bags, Tedlar, Kynar, and Flexfilm, was investigated using solid phase microextraction and gas chromatography mass spectrometry. The tested molecular species belong to different chemical classes (hydrocarbons, ketones, aldehydes, aromatics, sulphurs, esters, terpenes, etc.) and exhibit close-to-breath low ppb levels (3–12 ppb) with the exception of isoprene, acetone and acetonitrile (106 ppb, 760 ppb, 42 ppb respectively). Stability tests comprised the background emission of contaminants, recovery from dry samples, recovery from humid samples (RH 80% at 37 °C), influence of the bag’s filling degree, and reusability. Findings yield evidence of the superiority of Tedlar bags over remaining polymers in terms of background emission, species stability (up to 7 days for dry samples), and reusability. Recoveries of species under study suffered from the presence of high amounts of water (losses up to 10%). However, only heavier volatiles, with molecular masses higher than 90, exhibited more pronounced losses (20–40%). The sample size (the degree of bag filling) was found to be one of the most important factors affecting the sample integrity. To sum up, it is recommended to store breath samples in pre-conditioned Tedlar bags up to 6 hours at the maximum possible filling volume. Among the remaining films, Kynar can be considered as an alternative to Tedlar; however, higher losses of compounds should be expected even within the first hours of storage. Due to the high background emission Flexfilm is not suitable for sampling and storage of samples for analyses aiming at volatiles at a low ppb level. PMID:23323261

  18. Decomposing the Gender Wage Gap in the Netherlands with Sample Selection Adjustments

    NARCIS (Netherlands)

    Albrecht, James; Vuuren, van Aico; Vroman, Susan

    2004-01-01

    In this paper, we use quantile regression decomposition methods to analyze the gender gap between men and women who work full time in the Netherlands. Because the fraction of women working full time in the Netherlands is quite low, sample selection is a serious issue. In addition to shedding light

  19. Enhanced Sampling and Analysis, Selection of Technology for Testing

    Energy Technology Data Exchange (ETDEWEB)

    Svoboda, John; Meikrantz, David

    2010-02-01

    The focus of this study includes the investigation of sampling technologies used in industry and their potential application to nuclear fuel processing. The goal is to identify innovative sampling methods using state of the art techniques that could evolve into the next generation sampling and analysis system for metallic elements. This report details the progress made in the first half of FY 2010 and includes a further consideration of the research focus and goals for this year. Our sampling options and focus for the next generation sampling method are presented along with the criteria used for choosing our path forward. We have decided to pursue the option of evaluating the feasibility of microcapillary based chips to remotely collect, transfer, track and supply microliters of sample solutions to analytical equipment in support of aqueous processes for used nuclear fuel cycles. Microchip vendors have been screened and a choice made for the development of a suitable microchip design followed by production of samples for evaluation by ANL, LANL, and INL on an independent basis.

  20. The diagnostic value of CT scan and selective venous sampling in Cushing's syndrome

    International Nuclear Information System (INIS)

    Negoro, Makoto; Kuwayama, Akio; Yamamoto, Naoto; Nakane, Toshichi; Yokoe, Toshio; Kageyama, Naoki; Ichihara, Kaoru; Ishiguchi, Tsuneo; Sakuma, Sadayuki

    1986-01-01

    We studied 24 patients with Cushing's syndrome in order to find the best way to confirm a pituitary adenoma preoperatively. First, the sellar content was studied by means of a high-resolution CT scan in each patient. Second, by selective catheterization of the bilateral internal jugular veins and the inferior petrosal sinus, venous samples (C) were obtained for ACTH assay. Simultaneously, peripheral blood sampling (P) was done at the antecubital vein for the same purpose, and the C/P ratio was carefully calculated for each patient. A C/P ratio exceeding 2 was highly suggestive of the presence of a pituitary adenoma. Even with an advanced high-resolution CT scan at a slice thickness of 2 mm, pituitary adenomas were detected in only 32% of the patients studied. The result of image diagnosis in Cushing disease was discouraging. As for the chemical diagnosis, the results were as follows. At the early stage of this study, catheterization was terminated in the jugular veins of nine patients. Among these, in five patients the presence of a pituitary adenoma was predicted correctly at the preoperative stage. Later, by means of inferior petrosal sinus sampling, pituitary microadenomas were detected in ten of twelve patients. Selective venous sampling for ACTH in the inferior petrosal sinus or jugular vein proved to be useful for the differential diagnosis of Cushing's syndrome when other diagnostic measures such as CT scan were inconclusive. (author)

  1. The Lunar Sample Compendium

    Science.gov (United States)

    Meyer, Charles

    2009-01-01

    The Lunar Sample Compendium is a succinct summary of the data obtained from 40 years of study of Apollo and Luna samples of the Moon. Basic petrographic, chemical and age information is compiled, sample-by-sample, in the form of an advanced catalog in order to provide a basic description of each sample. The LSC can be found online using Google. The initial allocation of lunar samples was done sparingly, because it was realized that scientific techniques would improve over the years and new questions would be formulated. The LSC is important because it enables scientists to select samples within the context of the work that has already been done and facilitates better review of proposed allocations. It also provides back up material for public displays, captures information found only in abstracts, grey literature and curatorial databases and serves as a ready access to the now-vast scientific literature.

  2. Testing a groundwater sampling tool: Are the samples representative?

    International Nuclear Information System (INIS)

    Kaback, D.S.; Bergren, C.L.; Carlson, C.A.; Carlson, C.L.

    1989-01-01

    A ground water sampling tool, the HydroPunch™, was tested at the Department of Energy's Savannah River Site in South Carolina to determine if representative ground water samples could be obtained without installing monitoring wells. Chemical analyses of ground water samples collected with the HydroPunch™ from various depths within a borehole were compared with chemical analyses of ground water from nearby monitoring wells. The site selected for the test was in the vicinity of a large coal storage pile and a coal pile runoff basin that was constructed to collect the runoff from the coal storage pile. Existing monitoring wells in the area indicate the presence of a ground water contaminant plume that: (1) contains elevated concentrations of trace metals; (2) has an extremely low pH; and (3) contains elevated concentrations of major cations and anions. Ground water samples collected with the HydroPunch™ provide an excellent estimate of ground water quality at discrete depths. Ground water chemical data collected from various depths using the HydroPunch™ can be averaged to simulate what a screened zone in a monitoring well would sample. The averaged depth-discrete data compared favorably with the data obtained from the nearby monitoring wells

  3. Sample selection via angular distance in the space of the arguments of an artificial neural network

    Science.gov (United States)

    Fernández Jaramillo, J. M.; Mayerle, R.

    2018-05-01

    In the construction of an artificial neural network (ANN) a proper data splitting of the available samples plays a major role in the training process. This selection of subsets for training, testing and validation affects the generalization ability of the neural network. Also the number of samples has an impact in the time required for the design of the ANN and the training. This paper introduces an efficient and simple method for reducing the set of samples used for training a neural network. The method reduces the required time to calculate the network coefficients, while keeping the diversity and avoiding overtraining the ANN due the presence of similar samples. The proposed method is based on the calculation of the angle between two vectors, each one representing one input of the neural network. When the angle formed among samples is smaller than a defined threshold only one input is accepted for the training. The accepted inputs are scattered throughout the sample space. Tidal records are used to demonstrate the proposed method. The results of a cross-validation show that with few inputs the quality of the outputs is not accurate and depends on the selection of the first sample, but as the number of inputs increases the accuracy is improved and differences among the scenarios with a different starting sample have and important reduction. A comparison with the K-means clustering algorithm shows that for this application the proposed method with a smaller number of samples is producing a more accurate network.
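
The acceptance rule described above — admit a candidate input only if the angle it forms with every previously accepted input exceeds a threshold — can be sketched as follows (the 2-D vectors and the 10° threshold are illustrative, not the paper's tidal data):

```python
import math

def angle(u, v):
    """Angle in degrees between two input vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    cos = max(-1.0, min(1.0, dot / (norm_u * norm_v)))
    return math.degrees(math.acos(cos))

def select_by_angle(samples, threshold_deg):
    """Admit a sample only if its angle to every already-accepted
    sample exceeds the threshold, so near-duplicates are dropped."""
    accepted = []
    for s in samples:
        if all(angle(s, a) > threshold_deg for a in accepted):
            accepted.append(s)
    return accepted

# Illustrative inputs; the second vector is nearly parallel to the first.
vectors = [(1.0, 0.0), (0.99, 0.05), (0.0, 1.0), (0.7, 0.7)]
kept = select_by_angle(vectors, threshold_deg=10.0)
```

Because acceptance depends on what was admitted earlier, the outcome depends on which sample is processed first — consistent with the paper's observation that the choice of the first sample matters when few inputs are kept.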

  4. Specified assurance level sampling procedure

    International Nuclear Information System (INIS)

    Willner, O.

    1980-11-01

    In the nuclear industry design specifications for certain quality characteristics require that the final product be inspected by a sampling plan which can demonstrate product conformance to stated assurance levels. The Specified Assurance Level (SAL) Sampling Procedure has been developed to permit the direct selection of attribute sampling plans which can meet commonly used assurance levels. The SAL procedure contains sampling plans which yield the minimum sample size at stated assurance levels. The SAL procedure also provides sampling plans with acceptance numbers ranging from 0 to 10, thus, making available to the user a wide choice of plans all designed to comply with a stated assurance level
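
For an attribute plan with acceptance number c, demonstrating conformance at a stated assurance level amounts to finding the smallest sample size n whose probability of accepting a lot at the limiting fraction defective is at most 1 − assurance. A generic binomial search along these lines (a sketch of the underlying calculation, not the SAL procedure's published tables):

```python
from math import comb

def accept_prob(n, c, p):
    """Probability that a lot with fraction defective p passes a plan
    accepting on at most c defectives in a sample of n (binomial)."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(c + 1))

def min_sample_size(c, p, assurance, n_max=10000):
    """Smallest n such that plan (n, c) rejects a lot at fraction
    defective p with probability at least `assurance`."""
    for n in range(c + 1, n_max + 1):
        if accept_prob(n, c, p) <= 1.0 - assurance:
            return n
    raise ValueError("no plan found up to n_max")

# Smallest zero-acceptance plan giving 90% assurance against 10% defectives.
n0 = min_sample_size(c=0, p=0.10, assurance=0.90)
```

For c = 0, p = 10% and 90% assurance this yields n = 22, since 0.9^22 ≈ 0.098 ≤ 0.10 while 0.9^21 ≈ 0.109 is not.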

  5. Selection of representative calibration sample sets for near-infrared reflectance spectroscopy to predict nitrogen concentration in grasses

    DEFF Research Database (Denmark)

    Shetty, Nisha; Rinnan, Åsmund; Gislum, René

    2012-01-01

    ) algorithm were used and compared. Both the Puchwein and CADEX methods provide a calibration set equally distributed in space, and both methods require a minimum of prior knowledge. The samples were also selected randomly using complete random, cultivar random (year fixed), year random (cultivar fixed) and interaction (cultivar × year fixed) random procedures to see the influence of different factors on sample selection. Puchwein's method performed best with the lowest RMSEP, followed by CADEX, interaction random, year random, cultivar random and complete random. Out of 118 samples of the complete calibration set...... effectively enhance the cost-effectiveness of NIR spectral analysis by reducing the number of analyzed samples in the calibration set by more than 80%, which substantially reduces the effort of laboratory analyses with no significant loss in prediction accuracy.

  6. Concentration of ions in selected bottled water samples sold in Malaysia

    Science.gov (United States)

    Aris, Ahmad Zaharin; Kam, Ryan Chuan Yang; Lim, Ai Phing; Praveena, Sarva Mangala

    2013-03-01

    Many consumers around the world, including Malaysians, have turned to bottled water as their main source of drinking water. The aim of this study is to determine the physical and chemical properties of bottled water samples sold in Selangor, Malaysia. A total of 20 bottled water brands consisting of `natural mineral (NM)' and `packaged drinking (PD)' types were randomly collected and analyzed for their physical-chemical characteristics: hydrogen ion concentration (pH), electrical conductivity (EC) and total dissolved solids (TDS), selected major ions: calcium (Ca), potassium (K), magnesium (Mg) and sodium (Na), and minor trace constituents: copper (Cu) and zinc (Zn) to ascertain their suitability for human consumption. The results obtained were compared with guideline values recommended by World Health Organization (WHO) and Malaysian Ministry of Health (MMOH), respectively. It was found that all bottled water samples were in accordance with the guidelines set by WHO and MMOH except for one sample (D3) which was below the pH limit of 6.5. Both NM and PD bottled water were dominated by Na + K > Ca > Mg. Low values for EC and TDS in the bottled water samples showed that water was deficient in essential elements, likely an indication that these were removed by water treatment. Minerals like major ions were present in very low concentrations which could pose a risk to individuals who consume this water on a regular basis. Generally, the overall quality of the supplied bottled water was in accordance to standards and guidelines set by WHO and MMOH and safe for consumption.

  7. Estimating the residential demand function for natural gas in Seoul with correction for sample selection bias

    International Nuclear Information System (INIS)

    Yoo, Seung-Hoon; Lim, Hea-Jin; Kwak, Seung-Jun

    2009-01-01

    Over the last twenty years, the consumption of natural gas in Korea has increased dramatically. This increase has mainly resulted from the rise of consumption in the residential sector. The main objective of the study is to estimate households' demand function for natural gas by applying a sample selection model using data from a survey of households in Seoul. The results show that there exists a selection bias in the sample and that failure to correct for sample selection bias distorts the mean estimate, of the demand for natural gas, downward by 48.1%. In addition, according to the estimation results, the size of the house, the dummy variable for dwelling in an apartment, the dummy variable for having a bed in an inner room, and the household's income all have positive relationships with the demand for natural gas. On the other hand, the size of the family and the price of gas negatively contribute to the demand for natural gas. (author)
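
The sample selection correction applied in the study follows Heckman's two-step logic: a probit selection equation supplies an inverse Mills ratio, which then enters the outcome regression as an extra regressor. A self-contained sketch on simulated data (the variables, coefficients, and sample size are all illustrative, not the Seoul household survey):

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(0)
n = 5000

# Simulated data: the outcome is observed only for selected units, and the
# selection and outcome errors are correlated (rho = 0.7), so naive OLS on
# the observed subsample is biased.
z = rng.normal(size=n)                         # selection covariate
x = rng.normal(size=n)                         # outcome covariate
u, e = rng.multivariate_normal([0.0, 0.0], [[1.0, 0.7], [0.7, 1.0]], size=n).T
selected = (0.5 + z + u) > 0                   # selection rule
y = 1.0 + 2.0 * x + e                          # outcome (observed if selected)

# Step 1: probit of selection on z, by maximum likelihood.
def neg_loglik(b):
    p = np.clip(norm.cdf(b[0] + b[1] * z), 1e-12, 1 - 1e-12)
    return -np.sum(selected * np.log(p) + (~selected) * np.log(1 - p))

b = minimize(neg_loglik, x0=[0.0, 0.0], method="BFGS").x
xb = b[0] + b[1] * z
imr = norm.pdf(xb) / norm.cdf(xb)              # inverse Mills ratio

# Step 2: OLS of y on x plus the inverse Mills ratio, selected units only.
X = np.column_stack([np.ones(selected.sum()), x[selected], imr[selected]])
beta, *_ = np.linalg.lstsq(X, y[selected], rcond=None)
```

Dropping the `imr` column reproduces the kind of distortion the abstract warns about; in this simulated design the bias loads mainly on the intercept, since the outcome covariate is independent of selection.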

  8. Sampling Large Graphs for Anticipatory Analytics

    Science.gov (United States)

    2015-05-15

    low. C. Random Area Sampling. Random area sampling [8] is a "snowball" sampling method in which a set of random seed vertices are selected and areas... Sampling Large Graphs for Anticipatory Analytics. Lauren Edwards, Luke Johnson, Maja Milosavljevic, Vijay Gadepally, Benjamin A. Miller, Lincoln... systems, greater human-in-the-loop involvement, or through complex algorithms. We are investigating the use of sampling to mitigate these challenges

  9. A method of language sampling

    DEFF Research Database (Denmark)

    Rijkhoff, Jan; Bakker, Dik; Hengeveld, Kees

    1993-01-01

    In recent years more attention has been paid to the quality of language samples in typological work. Without an adequate sampling strategy, samples may suffer from various kinds of bias. In this article we propose a sampling method in which the genetic criterion is taken as the most important: samples created with this method will reflect optimally the diversity of the languages of the world. On the basis of the internal structure of each genetic language tree, a measure is computed that reflects the linguistic diversity in the language families represented by these trees. This measure is used to determine how many languages from each phylum should be selected, given any required sample size.

  10. Handling missing data in ranked set sampling

    CERN Document Server

    Bouza-Herrera, Carlos N

    2013-01-01

    The existence of missing observations is a very important aspect to be considered in the application of survey sampling, for example. In human populations they may be caused by a refusal of some interviewees to give the true value for the variable of interest. Traditionally, simple random sampling is used to select samples. Most statistical models are supported by the use of samples selected by means of this design. In recent decades, an alternative design has started being used, which, in many cases, shows an improvement in terms of accuracy compared with traditional sampling. It is called R

  11. A GMM-Based Test for Normal Disturbances of the Heckman Sample Selection Model

    Directory of Open Access Journals (Sweden)

    Michael Pfaffermayr

    2014-10-01

    The Heckman sample selection model relies on the assumption of normal and homoskedastic disturbances. However, before considering more general, alternative semiparametric models that do not need the normality assumption, it seems useful to test this assumption. Following Meijer and Wansbeek (2007), the present contribution derives a GMM-based pseudo-score LM test on whether the third and fourth moments of the disturbances of the outcome equation of the Heckman model conform to those implied by the truncated normal distribution. The test is easy to calculate and in Monte Carlo simulations it shows good performance for sample sizes of 1000 or larger.

  12. An algorithm to improve sampling efficiency for uncertainty propagation using sampling based method

    International Nuclear Information System (INIS)

    Campolina, Daniel; Lima, Paulo Rubens I.; Pereira, Claubia; Veloso, Maria Auxiliadora F.

    2015-01-01

    Sample size and computational uncertainty were varied in order to investigate sampling efficiency and convergence of the sampling-based method for uncertainty propagation. The transport code MCNPX was used to simulate an LWR model and allow the mapping from uncertain inputs of the benchmark experiment to uncertain outputs. Random sampling efficiency was improved through the use of an algorithm for selecting distributions. Mean range, standard deviation range and skewness were verified in order to obtain a better representation of the uncertainty figures. A standard deviation of 5 pcm in the propagated uncertainties over 10 replicates of n samples was adopted as the convergence criterion for the method. An estimate of 75 pcm uncertainty on reactor keff was accomplished by using a sample of size 93 and a computational uncertainty of 28 pcm to propagate the 1σ uncertainty of the burnable poison radius. For a fixed computational time, in order to reduce the variance of the propagated uncertainty, it was found, for the example under investigation, that it is preferable to double the sample size rather than to double the number of particles followed in the Monte Carlo process of the MCNPX code. (author)
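
The sampling-based propagation described above — draw the uncertain input from its distribution, run the model on each draw, and take the spread of the outputs — can be illustrated with a toy model standing in for the transport code (the linear model and all numbers are illustrative, not the MCNPX benchmark):

```python
import random
import statistics

def model(radius):
    """Toy stand-in for the transport calculation: maps the uncertain
    input (e.g. a burnable poison radius) to an output (reactivity, pcm)."""
    return 1000.0 * radius

def propagate(mean, sigma, n_samples, seed=42):
    """Draw inputs from a normal distribution, run the model on each
    draw, and report the standard deviation of the outputs."""
    rng = random.Random(seed)
    outputs = [model(rng.gauss(mean, sigma)) for _ in range(n_samples)]
    return statistics.stdev(outputs)

# For this linear model the propagated std dev tends to 1000 * sigma = 10.
s = propagate(mean=0.5, sigma=0.01, n_samples=1000)
```

Repeating `propagate` over several replicates and watching the spread of the resulting standard deviations is the same convergence check the abstract describes with its 5 pcm criterion.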

  13. Imaging a Large Sample with Selective Plane Illumination Microscopy Based on Multiple Fluorescent Microsphere Tracking

    Science.gov (United States)

    Ryu, Inkeon; Kim, Daekeun

    2018-04-01

    A typical selective plane illumination microscopy (SPIM) image size is basically limited by the field of view, which is a characteristic of the objective lens. If an image larger than the imaging area of the sample is to be obtained, image stitching, which combines step-scanned images into a single panoramic image, is required. However, accurately registering the step-scanned images is very difficult because the SPIM system uses a customized sample mount in which uncertainties in the translational and rotational motions exist. In this paper, an image registration technique based on multiple fluorescent microsphere tracking is proposed, which quantifies the constellations of, and measures the distances between, at least two fluorescent microspheres embedded in the sample. Image stitching results are demonstrated for optically cleared large tissue with various staining methods. Compensation for the effect of sample rotation that occurs during translational motion in the sample mount is also discussed.

  14. A quantitative method to detect explosives and selected semivolatiles in soil samples by Fourier transform infrared spectroscopy

    International Nuclear Information System (INIS)

    Clapper-Gowdy, M.; Dermirgian, J.; Robitaille, G.

    1995-01-01

    This paper describes a novel Fourier transform infrared (FTIR) spectroscopic method that can be used to rapidly screen soil samples from potentially hazardous waste sites. Samples are heated in a thermal desorption unit and the resultant vapors are collected and analyzed in a long-path gas cell mounted in a FTIR. Laboratory analysis of a soil sample by FTIR takes approximately 10 minutes. This method has been developed to identify and quantify microgram concentrations of explosives in soil samples and is directly applicable to the detection of selected volatile organics, semivolatile organics, and pesticides

  15. Population genetics inference for longitudinally-sampled mutants under strong selection.

    Science.gov (United States)

    Lacerda, Miguel; Seoighe, Cathal

    2014-11-01

    Longitudinal allele frequency data are becoming increasingly prevalent. Such samples permit statistical inference of the population genetics parameters that influence the fate of mutant variants. To infer these parameters by maximum likelihood, the mutant frequency is often assumed to evolve according to the Wright-Fisher model. For computational reasons, this discrete model is commonly approximated by a diffusion process that requires the assumption that the forces of natural selection and mutation are weak. This assumption is not always appropriate. For example, mutations that impart drug resistance in pathogens may evolve under strong selective pressure. Here, we present an alternative approximation to the mutant-frequency distribution that does not make any assumptions about the magnitude of selection or mutation and is much more computationally efficient than the standard diffusion approximation. Simulation studies are used to compare the performance of our method to that of the Wright-Fisher and Gaussian diffusion approximations. For large populations, our method is found to provide a much better approximation to the mutant-frequency distribution when selection is strong, while all three methods perform comparably when selection is weak. Importantly, maximum-likelihood estimates of the selection coefficient are severely attenuated when selection is strong under the two diffusion models, but not when our method is used. This is further demonstrated with an application to mutant-frequency data from an experimental study of bacteriophage evolution. We therefore recommend our method for estimating the selection coefficient when the effective population size is too large to utilize the discrete Wright-Fisher model. Copyright © 2014 by the Genetics Society of America.
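
The discrete Wright-Fisher model that the authors recommend when selection is strong is straightforward to simulate directly: each generation, selection reweights the mutant frequency, and drift enters as a binomial draw of the next generation's mutant count. A minimal sketch (population size, selection coefficient, and starting frequency are illustrative):

```python
import random

def wright_fisher(N, s, p0, generations, seed=1):
    """Mutant frequency under the discrete Wright-Fisher model with
    selection coefficient s in a population of constant size N."""
    rng = random.Random(seed)
    p = p0
    for _ in range(generations):
        # Selection reweights the mutant; drift is the binomial resampling.
        w = p * (1.0 + s) / (p * (1.0 + s) + (1.0 - p))
        k = sum(rng.random() < w for _ in range(N))  # Binomial(N, w) draw
        p = k / N
    return p

# Strong selection (s = 0.5) carries a 5% mutant toward fixation.
p_final = wright_fisher(N=1000, s=0.5, p0=0.05, generations=100)
```

With s this large the trajectory is dominated by selection rather than drift, which is exactly the regime in which the abstract reports that diffusion-based likelihoods attenuate the estimated selection coefficient.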

  16. Uniform Sampling Table Method and its Applications II--Evaluating the Uniform Sampling by Experiment.

    Science.gov (United States)

    Chen, Yibin; Chen, Jiaxi; Chen, Xuan; Wang, Min; Wang, Wei

    2015-01-01

    A new method of uniform sampling is evaluated in this paper. Items and indexes were adopted to evaluate the rationality of the uniform sampling. The evaluation items included convenience of operation, uniformity of sampling site distribution, and accuracy and precision of measured results. The evaluation indexes included operational complexity, occupation rate of sampling sites in a row and column, relative accuracy of pill weight, and relative deviation of pill weight. They were obtained from three kinds of drugs of different shape and size by four kinds of sampling methods. Gray correlation analysis was adopted to make a comprehensive evaluation by comparison with the standard method. The experimental results showed that the convenience of the uniform sampling method was 1 (100%), the odds ratio of the occupation rate in a row and column was infinity, relative accuracy was 99.50-99.89%, reproducibility RSD was 0.45-0.89%, and the weighted incidence degree exceeded that of the standard method. Hence, the uniform sampling method is easy to operate, and the selected samples are distributed uniformly. The experimental results demonstrated that the uniform sampling method has good accuracy and reproducibility, and can be put into use in drug analysis.

  17. Continuous sampling from distributed streams

    DEFF Research Database (Denmark)

    Graham, Cormode; Muthukrishnan, S.; Yi, Ke

    2012-01-01

    A fundamental problem in data management is to draw and maintain a sample of a large data set, for approximate query answering, selectivity estimation, and query planning. With large, streaming data sets, this problem becomes particularly difficult when the data is shared across multiple distributed sites. The main challenge is to ensure that a sample is drawn uniformly across the union of the data while minimizing the communication needed to run the protocol on the evolving data. At the same time, it is also necessary to make the protocol lightweight, by keeping the space and time costs low for each participant. In this article, we present communication-efficient protocols for continuously maintaining a sample (both with and without replacement) from k distributed streams. These apply to the case when we want a sample from the full streams, and to the sliding window cases of only the W most...
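
For a single stream, the standard way to maintain a uniform without-replacement sample of fixed size is reservoir sampling (Algorithm R); the paper's contribution is extending this kind of guarantee across k distributed streams with low communication. The single-stream building block, for orientation (this is the classic algorithm, not the paper's distributed protocol):

```python
import random

def reservoir_sample(stream, k, seed=7):
    """Maintain a uniform random sample of size k over a stream of
    unknown length, using O(k) space and one pass."""
    rng = random.Random(seed)
    reservoir = []
    for i, item in enumerate(stream):
        if i < k:
            reservoir.append(item)
        else:
            # Item i survives with probability k / (i + 1), which keeps
            # every prefix uniformly represented.
            j = rng.randrange(i + 1)
            if j < k:
                reservoir[j] = item
    return reservoir

sample = reservoir_sample(range(1000), k=10)
```

The distributed difficulty the abstract points to is visible even here: each site running this locally does not yield a uniform sample of the union, which is why coordination between sites is needed.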

  18. Detection of Salmonella spp. in veterinary samples by combining selective enrichment and real-time PCR.

    Science.gov (United States)

    Goodman, Laura B; McDonough, Patrick L; Anderson, Renee R; Franklin-Guild, Rebecca J; Ryan, James R; Perkins, Gillian A; Thachil, Anil J; Glaser, Amy L; Thompson, Belinda S

    2017-11-01

    Rapid screening for enteric bacterial pathogens in clinical environments is essential for biosecurity. Salmonella found in veterinary hospitals, particularly Salmonella enterica serovar Dublin, can pose unique challenges for culture and testing because of its poor growth. Multiple Salmonella serovars including Dublin are emerging threats to public health given increasing prevalence and antimicrobial resistance. We adapted an automated food testing method to veterinary samples and evaluated the performance of the method in a variety of matrices including environmental samples (n = 81), tissues (n = 52), feces (n = 148), and feed (n = 29). A commercial kit was chosen as the basis for this approach in view of extensive performance characterizations published by multiple independent organizations. A workflow was established for efficiently and accurately testing veterinary matrices and environmental samples by use of real-time PCR after selective enrichment in Rappaport-Vassiliadis soya (RVS) medium. Using this method, the detection limit for S. Dublin improved by 100-fold over subculture on selective agars (eosin-methylene blue, brilliant green, and xylose-lysine-deoxycholate). Overall, the procedure was effective in detecting Salmonella spp. and provided next-day results.

  19. Correlations Between Life-Detection Techniques and Implications for Sampling Site Selection in Planetary Analog Missions

    Science.gov (United States)

    Gentry, Diana M.; Amador, Elena S.; Cable, Morgan L.; Chaudry, Nosheen; Cullen, Thomas; Jacobsen, Malene B.; Murukesan, Gayathri; Schwieterman, Edward W.; Stevens, Adam H.; Stockton, Amanda; Tan, George; Yin, Chang; Cullen, David C.; Geppert, Wolf

    2017-10-01

    We conducted an analog sampling expedition under simulated mission constraints to areas dominated by basaltic tephra of the Eldfell and Fimmvörðuháls lava fields (Iceland). Sites were selected to be "homogeneous" at a coarse remote sensing resolution (10-100 m) in apparent color, morphology, moisture, and grain size, with best-effort realism in numbers of locations and replicates. Three different biomarker assays (counting of nucleic-acid-stained cells via fluorescent microscopy, a luciferin/luciferase assay for adenosine triphosphate, and quantitative polymerase chain reaction (qPCR) to detect DNA associated with bacteria, archaea, and fungi) were characterized at four nested spatial scales (1 m, 10 m, 100 m, and >1 km) by using five common metrics for sample site representativeness (sample mean variance, group F tests, pairwise t tests, and the distribution-free rank sum H and u tests). Correlations between all assays were characterized with Spearman's rank test. The bioluminescence assay showed the most variance across the sites, followed by qPCR for bacterial and archaeal DNA; these results could not be considered representative at the finest resolution tested (1 m). Cell concentration and fungal DNA also had significant local variation, but they were homogeneous over scales of >1 km. These results show that the selection of life detection assays and the number, distribution, and location of sampling sites in a low biomass environment with limited a priori characterization can yield both contrasting and complementary results, and that their interdependence must be given due consideration to maximize science return in future biomarker sampling expeditions.

  20. Effects of soil water saturation on sampling equilibrium and kinetics of selected polycyclic aromatic hydrocarbons.

    Science.gov (United States)

    Kim, Pil-Gon; Roh, Ji-Yeon; Hong, Yongseok; Kwon, Jung-Hwan

    2017-10-01

    Passive sampling can be applied for measuring the freely dissolved concentration of hydrophobic organic chemicals (HOCs) in soil pore water. When using passive samplers under field conditions, however, there are factors that might affect passive sampling equilibrium and kinetics, such as soil water saturation. To determine the effects of soil water saturation on passive sampling, the equilibrium and kinetics of passive sampling were evaluated by observing changes in the distribution coefficient between sampler and soil (K_sampler/soil) and the uptake rate constant (k_u) at various soil water saturations. Polydimethylsiloxane (PDMS) passive samplers were deployed into artificial soils spiked with seven selected polycyclic aromatic hydrocarbons (PAHs). In dry soil (0% water saturation), both K_sampler/soil and k_u values were much lower than those in wet soils, likely due to the contribution of adsorption of PAHs onto soil mineral surfaces and the conformational changes in soil organic matter. For high molecular weight PAHs (chrysene, benzo[a]pyrene, and dibenzo[a,h]anthracene), both K_sampler/soil and k_u values increased with increasing soil water saturation, whereas they decreased with increasing soil water saturation for low molecular weight PAHs (phenanthrene, anthracene, fluoranthene, and pyrene). Changes in the sorption capacity of soil organic matter with soil water content would be the main cause of the changes in passive sampling equilibrium. Henry's law constant could explain the different behaviors in uptake kinetics of the selected PAHs. The results of this study would be helpful when passive samplers are deployed under various soil water saturations. Copyright © 2017 Elsevier Ltd. All rights reserved.
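
The two quantities tracked in the study, the sampler-soil distribution coefficient and the uptake rate constant, fit naturally into a common one-compartment first-order uptake description, in which the sampler concentration approaches K·C_soil with elimination rate ke = k_u/K. A sketch of that generic model (the parameter values are illustrative, and this is a textbook description, not the study's fitted kinetics):

```python
import math

def sampler_conc(t, c_soil, K, ku):
    """One-compartment first-order uptake: the sampler approaches
    equilibrium K * c_soil with elimination rate ke = ku / K."""
    ke = ku / K
    return K * c_soil * (1.0 - math.exp(-ke * t))

# Illustrative parameters: K = 100, ku = 0.5 per day, and a soil
# pore-water concentration of 1.0 (arbitrary units).
halfway_time = math.log(2.0) / (0.5 / 100.0)   # time to reach half of equilibrium
c_day10 = sampler_conc(10.0, 1.0, 100.0, 0.5)
```

In this picture, the study's observations translate directly: a lower K_sampler/soil lowers the plateau, while a lower k_u slows the approach to it.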

  1. Verification of Representative Sampling in RI waste

    International Nuclear Information System (INIS)

    Ahn, Hong Joo; Song, Byung Cheul; Sohn, Se Cheul; Song, Kyu Seok; Jee, Kwang Yong; Choi, Kwang Seop

    2009-01-01

    For evaluating the radionuclide inventories of RI wastes, representative sampling is one of the most important parts of the radiochemical assay process. Sampling to characterize RI waste has typically been based on judgment or convenience sampling of individual drums or groups of drums. However, it is difficult to obtain a representative sample from among the numerous drums. In addition, RI waste drums may be classified as heterogeneous wastes because they contain cotton, glass, vinyl, gloves, etc. In order to obtain representative samples, the sample to be analyzed must be collected from every selected drum. Considering the expense and time of analysis, however, the number of samples has to be minimized. In this study, RI waste drums were classified by various conditions: half-life, surface dose, acceptance date, waste form, generator, etc. A sample for radiochemical assay was obtained by mixing samples from each drum. The sample has to be prepared for radiochemical assay, and although it should be reasonably uniform, it is rare that a completely homogeneous material is received. Every sample is shredded to a diameter of 1-2 cm and a representative aliquot taken for the required analysis. For verification of representative sampling, every classified group is tested to evaluate both 'selection of a representative drum in a group' and 'representative sampling in a drum'

  2. A simulative comparison of respondent driven sampling with incentivized snowball sampling--the "strudel effect".

    Science.gov (United States)

    Gyarmathy, V Anna; Johnston, Lisa G; Caplinskiene, Irma; Caplinskas, Saulius; Latkin, Carl A

    2014-02-01

    Respondent driven sampling (RDS) and incentivized snowball sampling (ISS) are two sampling methods that are commonly used to reach people who inject drugs (PWID). We generated a set of simulated RDS samples on an actual sociometric ISS sample of PWID in Vilnius, Lithuania ("original sample") to assess if the simulated RDS estimates were statistically significantly different from the original ISS sample prevalences for HIV (9.8%), Hepatitis A (43.6%), Hepatitis B (Anti-HBc 43.9% and HBsAg 3.4%), Hepatitis C (87.5%), syphilis (6.8%) and Chlamydia (8.8%) infections and for selected behavioral risk characteristics. The original sample consisted of a large component of 249 people (83% of the sample) and 13 smaller components with 1-12 individuals. Generally, as long as all seeds were recruited from the large component of the original sample, the simulation samples simply recreated the large component. There were no significant differences between the large component and the entire original sample for the characteristics of interest. Altogether 99.2% of 360 simulation sample point estimates were within the confidence interval of the original prevalence values for the characteristics of interest. When population characteristics are reflected in large network components that dominate the population, RDS and ISS may produce samples that have statistically non-different prevalence values, even though some isolated network components may be under-sampled and/or statistically significantly different from the main groups. This so-called "strudel effect" is discussed in the paper. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
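
The component-trapping behavior described above ("as long as all seeds were recruited from the large component, the simulation samples simply recreated the large component") can be sketched as a reachability computation on a toy contact network; all nodes and edges below are invented:

```python
# Sketch of the "strudel effect": respondent-driven recruitment started from
# seeds inside the largest network component can only ever resample that
# component. The toy network below is invented for illustration.
from collections import deque

network = {                      # adjacency list: person -> contacts
    1: [2, 3], 2: [1, 3], 3: [1, 2, 4], 4: [3],   # large component
    5: [6], 6: [5],                               # small, isolated component
}

def rds_reachable(seeds):
    """Everyone who could ever be recruited from the given seeds."""
    seen, queue = set(seeds), deque(seeds)
    while queue:
        for contact in network[queue.popleft()]:
            if contact not in seen:
                seen.add(contact)
                queue.append(contact)
    return seen

assert rds_reachable([1]) == {1, 2, 3, 4}   # small component never sampled
assert rds_reachable([5]) == {5, 6}
```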

  3. Experimental breakdown of selected anodized aluminum samples in dilute plasmas

    Science.gov (United States)

    Grier, Norman T.; Domitz, Stanley

    1992-01-01

    Anodized aluminum samples representative of Space Station Freedom structural material were tested for electrical breakdown under space plasma conditions. In space, this potential arises across the insulating anodized coating when the spacecraft structure is driven to a negative bias relative to the external plasma potential due to plasma-surface interaction phenomena. For anodized materials used in the tests, it was found that breakdown voltage varied from 100 to 2000 volts depending on the sample. The current in the arcs depended on the sample, the capacitor, and the voltage. The level of the arc currents varied from 60 to 1000 amperes. The plasma number density varied from 3 × 10^6 to 10^3 ions per cc. The time between arcs increased as the number density was lowered. Corona testing of anodized samples revealed that samples with higher corona inception voltage had higher arcing inception voltages. From this it is concluded that corona testing may provide a method of screening the samples.

  4. 42 CFR 431.814 - Sampling plan and procedures.

    Science.gov (United States)

    2010-10-01

    ... reliability of the reduced sample. (4) The sample selection procedure. Systematic random sampling is... sampling, and yield estimates with the same or better precision than achieved in systematic random sampling... 42 Public Health 4 2010-10-01 2010-10-01 false Sampling plan and procedures. 431.814 Section 431...

  5. Selective extraction of dimethoate from cucumber samples by use of molecularly imprinted microspheres

    Directory of Open Access Journals (Sweden)

    Jiao-Jiao Du

    2015-06-01

    Full Text Available Molecularly imprinted polymers for dimethoate recognition were synthesized by the precipitation polymerization technique using methyl methacrylate (MMA) as the functional monomer and ethylene glycol dimethacrylate (EGDMA) as the cross-linker. The morphology, adsorption and recognition properties were investigated by scanning electron microscopy (SEM), a static adsorption test, and a competitive adsorption test. To obtain the best selectivity and binding performance, the synthesis and adsorption conditions of the MIPs were optimized through single-factor experiments. Under the optimized conditions, the resultant polymers exhibited uniform size, satisfactory binding capacity and significant selectivity. Furthermore, the imprinted polymers were successfully applied as specific solid-phase extractants combined with high performance liquid chromatography (HPLC) for determination of dimethoate residues in cucumber samples. The average recoveries of three spiked samples ranged from 78.5% to 87.9% with relative standard deviations (RSDs) less than 4.4%, and the limit of detection (LOD) obtained for dimethoate was as low as 2.3 μg/mL. Keywords: Molecularly imprinted polymer, Precipitation polymerization, Dimethoate, Cucumber, HPLC

  6. Molecularly imprinted membrane extraction combined with high-performance liquid chromatography for selective analysis of cloxacillin from shrimp samples.

    Science.gov (United States)

    Du, Wei; Sun, Min; Guo, Pengqi; Chang, Chun; Fu, Qiang

    2018-09-01

    Nowadays, the abuse of antibiotics in aquaculture has generated considerable problems for food safety. Therefore, it is imperative to develop a simple and selective method for monitoring illegal use of antibiotics in aquatic products. In this study, a method combined molecularly imprinted membranes (MIMs) extraction and liquid chromatography was developed for the selective analysis of cloxacillin from shrimp samples. The MIMs was synthesized by UV photopolymerization, and characterized by scanning electron microscope, Fourier transform infrared spectra, thermo-gravimetric analysis and swelling test. The results showed that the MIMs exhibited excellent permselectivity, high adsorption capacity and fast adsorption rate for cloxacillin. Finally, the method was utilized to determine cloxacillin from shrimp samples, with good accuracies and acceptable relative standard deviation values for precision. The proposed method was a promising alternative for selective analysis of cloxacillin in shrimp samples, due to the easy-operation and excellent selectivity. Copyright © 2018. Published by Elsevier Ltd.

  7. 40 CFR Appendix Xi to Part 86 - Sampling Plans for Selective Enforcement Auditing of Light-Duty Vehicles

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 19 2010-07-01 2010-07-01 false Sampling Plans for Selective Enforcement Auditing of Light-Duty Vehicles XI Appendix XI to Part 86 Protection of Environment ENVIRONMENTAL... Enforcement Auditing of Light-Duty Vehicles 40% AQL Table 1—Sampling Plan Code Letter Annual sales of...

  8. Soft X-Ray Observations of a Complete Sample of X-Ray--selected BL Lacertae Objects

    Science.gov (United States)

    Perlman, Eric S.; Stocke, John T.; Wang, Q. Daniel; Morris, Simon L.

    1996-01-01

    We present the results of ROSAT PSPC observations of the X-ray-selected BL Lacertae objects (XBLs) in the complete Einstein Extended Medium Sensitivity Survey (EMSS) sample. None of the objects is resolved in its respective PSPC image, but all are easily detected. All BL Lac objects in this sample are well fitted by single power laws. Their X-ray spectra exhibit a variety of spectral slopes, with best-fit energy power-law spectral indices between α = 0.5 and 2.3. The PSPC spectra of this sample are slightly steeper than those typical of flat radio-spectrum quasars. Because almost all of the individual PSPC spectral indices are equal to or slightly steeper than the overall optical to X-ray spectral indices for these same objects, we infer that BL Lac soft X-ray continua are dominated by steep-spectrum synchrotron radiation from a broad X-ray jet, rather than flat-spectrum inverse Compton radiation linked to the narrower radio/millimeter jet. The softness of the X-ray spectra of these XBLs revives the possibility proposed by Guilbert, Fabian, & McCray (1983) that BL Lac objects are lineless because the circumnuclear gas cannot be heated sufficiently to permit two stable gas phases, the cooler of which would comprise the broad emission-line clouds. Because unified schemes predict that hard self-Compton radiation is beamed only into a small solid angle in BL Lac objects, the steep-spectrum synchrotron tail controls the temperature of the circumnuclear gas at r ≤ 10^18 cm and prevents broad-line cloud formation. We use these new ROSAT data to recalculate the X-ray luminosity function and cosmological evolution of the complete EMSS sample by determining accurate K-corrections for the sample and estimating the effects of variability and the possibility of incompleteness in the sample. Our analysis confirms that XBLs are evolving "negatively," opposite in sense to quasars, with Ve/Va = 0.331±0.060. The statistically significant difference between the values for X
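
The evolution statistic quoted above (Ve/Va) is a V/V_max-type test: for each source one compares the volume enclosed at its distance with the maximum volume within which the survey could still detect it, and a mean well below 0.5 indicates negative evolution. A toy sketch with invented distances, not the EMSS data:

```python
# Toy sketch of a Ve/Va (V/V_max-type) evolution test. Distances are invented
# Euclidean values, so enclosed volumes scale as distance cubed.
def ve_over_va(d_source, d_max):
    return (d_source / d_max) ** 3

# Hypothetical (source distance, maximum detectable distance) pairs
sample = [(0.3, 1.0), (0.5, 1.0), (0.4, 0.9), (0.6, 1.2)]
ratios = [ve_over_va(d, dmax) for d, dmax in sample]
mean_ratio = sum(ratios) / len(ratios)

# A mean well below 0.5 means sources crowd the near half of the
# accessible volume, i.e. "negative" evolution.
print(f"<Ve/Va> = {mean_ratio:.3f}")
```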

  9. Automatic sample changers maintenance manual

    International Nuclear Information System (INIS)

    Myers, T.A.

    1978-10-01

    This manual describes and provides trouble-shooting aids for the Automatic Sample Changer electronics on the automatic beta counting system, developed by the Los Alamos Scientific Laboratory Group CNC-11. The output of a gas detector is shaped by a preamplifier, then is coupled to an amplifier. Amplifier output is discriminated and is the input to a scaler. An identification number is associated with each sample. At a predetermined count length, the identification number, scaler data plus other information is punched out on a data card. The next sample to be counted is automatically selected. The beta counter uses the same electronics as the prior count did, the only difference being the sample identification number and sample itself. This manual is intended as a step-by-step aid in trouble-shooting the electronics associated with positioning the sample, counting the sample, and getting the needed data punched on an 80-column data card

  10. Studying hardness, workability and minimum bending radius in selectively laser-sintered Ti–6Al–4V alloy samples

    Science.gov (United States)

    Galkina, N. V.; Nosova, Y. A.; Balyakin, A. V.

    2018-03-01

    This research is relevant as it seeks to improve the mechanical and service performance of the Ti–6Al–4V titanium alloy obtained by selective laser sintering. For that purpose, sintered samples were annealed at 750 and 850 °C for an hour. Sintered and annealed samples were tested for hardness, workability and microstructure. It was found that incomplete annealing of selectively laser-sintered Ti–6Al–4V samples results in an insignificant reduction in hardness and ductility. Sintered and incompletely annealed samples had a hardness of 32–33 HRC, which is lower than the value for annealed parts specified in standards. Complete annealing at 850 °C reduces the hardness to 25 HRC and ductility by 15–20%. Incomplete annealing lowers the ductility factor from 0.08 to 0.06; complete annealing lowers it to 0.025. Complete annealing probably results in embrittlement of the sintered samples, perhaps due to their oxidation and hydrogenation in air. Optical metallography showed lateral fractures in both sintered and annealed samples, which might be the reason why they had lower hardness and ductility.

  11. Sampling a guide for internal auditors

    CERN Document Server

    Apostolou, Barbara

    2004-01-01

    While it is possible to examine 100 percent of an audit customer's data, the time and cost associated with such a study are often prohibitive. To obtain sufficient, reliable, and relevant information with a limited data set, sampling is an efficient and effective tool. It can help you evaluate the customer's assertions, as well as reach audit conclusions and provide reasonable assurance to your organization. This handbook will help you understand sampling. It also serves as a guide for auditors and students preparing for certification. Topics include: An overview of sampling. Statistical and nonstatistical sampling issues. Sampling selection methods and risks. The pros and cons of popular sampling plans.

  12. X-Ray Temperatures, Luminosities, and Masses from XMM-Newton Follow-up of the First Shear-selected Galaxy Cluster Sample

    Energy Technology Data Exchange (ETDEWEB)

    Deshpande, Amruta J.; Hughes, John P. [Department of Physics and Astronomy, Rutgers the State University of New Jersey, 136 Frelinghuysen Road, Piscataway, NJ 08854 (United States); Wittman, David, E-mail: amrejd@physics.rutgers.edu, E-mail: jph@physics.rutgers.edu, E-mail: dwittman@physics.ucdavis.edu [Department of Physics, University of California, Davis, One Shields Avenue, Davis, CA 95616 (United States)

    2017-04-20

    We continue the study of the first sample of shear-selected clusters from the initial 8.6 square degrees of the Deep Lens Survey (DLS); a sample with well-defined selection criteria corresponding to the highest ranked shear peaks in the survey area. We aim to characterize the weak lensing selection by examining the sample’s X-ray properties. There are multiple X-ray clusters associated with nearly all the shear peaks: 14 X-ray clusters corresponding to seven DLS shear peaks. An additional three X-ray clusters cannot be definitively associated with shear peaks, mainly due to large positional offsets between the X-ray centroid and the shear peak. Here we report on the XMM-Newton properties of the 17 X-ray clusters. The X-ray clusters display a wide range of luminosities and temperatures; the L_X − T_X relation we determine for the shear-associated X-ray clusters is consistent with X-ray cluster samples selected without regard to dynamical state, while it is inconsistent with self-similarity. For a subset of the sample, we measure X-ray masses using temperature as a proxy, and compare to weak lensing masses determined by the DLS team. The resulting mass comparison is consistent with equality. The X-ray and weak lensing masses show considerable intrinsic scatter (∼48%), which is consistent with X-ray selected samples when their X-ray and weak lensing masses are independently determined.

  13. Determination of specific activity of americium and plutonium in selected environmental samples

    International Nuclear Information System (INIS)

    Trebunova, T.

    1999-01-01

    The aim of this work was the development of a method for the determination of americium and plutonium in environmental samples. The developed method was evaluated on soil samples and afterwards applied to selected fish samples (smoked mackerel, herring and fillet from Alaska hake). The method for separation of americium is based on liquid-liquid extraction with Aliquat-336, precipitation with oxalic acid and use of the chromatographic material TRU-Spec™. Radiochemical yields ranged from 13.0% to 80.9% for plutonium-236 and from 10.5% to 100% for americium-241. Determined specific activities of plutonium-239,240 were from (2.3 ± 1.4) mBq/kg to (82 ± 29) mBq/kg; the specific activities of plutonium-238 were from (14.2 ± 3.7) mBq/kg to (708 ± 86) mBq/kg. The specific activities of americium-241 were from (1.4 ± 0.9) mBq/kg to (3360 ± 210) mBq/kg. Fish from the Baltic Sea and the North Sea showed higher specific activities than freshwater fish from Slovakia. Therefore, monitoring of alpha radionuclides in foods imported from territories with nuclear testing is recommended

  14. Phytochemical analysis and biological evaluation of selected African propolis samples from Cameroon and Congo

    NARCIS (Netherlands)

    Papachroni, D.; Graikou, K.; Kosalec, I.; Damianakos, H.; Ingram, V.J.; Chinou, I.

    2015-01-01

    The objective of this study was the chemical analysis of four selected samples of African propolis (Congo and Cameroon) and their biological evaluation. Twenty-one secondary metabolites belonging to four different chemical groups were isolated from the 70% ethanolic extracts of propolis and their

  15. Climate Change and Agricultural Productivity in Sub-Saharan Africa: A Spatial Sample Selection Model

    NARCIS (Netherlands)

    Ward, P.S.; Florax, R.J.G.M.; Flores-Lagunes, A.

    2014-01-01

    Using spatially explicit data, we estimate a cereal yield response function using a recently developed estimator for spatial error models when endogenous sample selection is of concern. Our results suggest that yields across Sub-Saharan Africa will decline with projected climatic changes, and that

  16. Sampling in epidemiological research: issues, hazards and pitfalls

    Science.gov (United States)

    Tyrer, Stephen; Heyman, Bob

    2016-01-01

    Surveys of people's opinions are fraught with difficulties. It is easier to obtain information from those who respond to text messages or to emails than to attempt to obtain a representative sample. Samples of the population that are selected non-randomly in this way are termed convenience samples as they are easy to recruit. This introduces a sampling bias. Such non-probability samples have merit in many situations, but an epidemiological enquiry is of little value unless a random sample is obtained. If a sufficient number of those selected actually complete a survey, the results are likely to be representative of the population. This editorial describes probability and non-probability sampling methods and illustrates the difficulties and suggested solutions in performing accurate epidemiological research. PMID:27087985

  17. Sampling Methods in Cardiovascular Nursing Research: An Overview.

    Science.gov (United States)

    Kandola, Damanpreet; Banner, Davina; O'Keefe-McCarthy, Sheila; Jassal, Debbie

    2014-01-01

    Cardiovascular nursing research covers a wide array of topics from health services to psychosocial patient experiences. The selection of specific participant samples is an important part of the research design and process. The sampling strategy employed is of utmost importance to ensure that a representative sample of participants is chosen. There are two main categories of sampling methods: probability and non-probability. Probability sampling is the random selection of elements from the population, where each element of the population has an equal and independent chance of being included in the sample. There are five main types of probability sampling including simple random sampling, systematic sampling, stratified sampling, cluster sampling, and multi-stage sampling. Non-probability sampling methods are those in which elements are chosen through non-random methods for inclusion into the research study and include convenience sampling, purposive sampling, and snowball sampling. Each approach offers distinct advantages and disadvantages and must be considered critically. In this research column, we provide an introduction to these key sampling techniques and draw on examples from cardiovascular research. Understanding the differences in sampling techniques may aid nurses in effective appraisal of research literature and provide a reference point for nurses who engage in cardiovascular research.
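
Three of the probability sampling methods surveyed above can be sketched in a few lines; the sampling frame and strata below are hypothetical:

```python
# Sketch of three probability sampling methods described above, applied to a
# hypothetical sampling frame of 1000 patient IDs.
import random

random.seed(42)
frame = list(range(1000))
n = 50

# Simple random sampling: every element equally likely
srs = random.sample(frame, n)

# Systematic sampling: every k-th element after a random start
k = len(frame) // n
start = random.randrange(k)
systematic = frame[start::k][:n]

# Stratified sampling: proportional draws from each stratum (two toy clinics)
strata = {"clinic_a": frame[:600], "clinic_b": frame[600:]}
stratified = [unit for stratum in strata.values()
              for unit in random.sample(stratum, len(stratum) * n // len(frame))]

assert len(srs) == len(systematic) == len(stratified) == n
```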

  18. 40 CFR 761.353 - Second level of sample selection.

    Science.gov (United States)

    2010-07-01

    .... At the chemical extraction and analysis laboratory, pour the 19-liter subsample onto a plastic sheet..., AND USE PROHIBITIONS Sampling Non-Liquid, Non-Metal PCB Bulk Product Waste for Purposes of Characterization for PCB Disposal in Accordance With § 761.62, and Sampling PCB Remediation Waste Destined for Off...

  19. PWR steam generator tubing sample library

    International Nuclear Information System (INIS)

    Anon.

    1987-01-01

    In order to compile the tubing sample library, two approaches were employed: (a) tubing sample replication by either chemical or mechanical means, based on field tube data and metallography reports for tubes already destructively examined; and (b) acquisition of field tubes removed from operating or retired steam generators. In addition, a unique mercury modeling concept is in use to guide the selection of replica samples. A compendium was compiled that summarizes field observations and morphologies of steam generator tube degradation types based on available NDE, destructive examinations, and field reports. This compendium was used in selecting candidate degradation types that were manufactured for inclusion in the tube library

  20. Antibiotic content of selective culture media for isolation of Capnocytophaga species from oral polymicrobial samples.

    Science.gov (United States)

    Ehrmann, E; Jolivet-Gougeon, A; Bonnaure-Mallet, M; Fosse, T

    2013-10-01

    In the oral microbiome, because of the abundance of commensal competitive flora, selective media with antibiotics are necessary for the recovery of fastidious Capnocytophaga species. The performances of six culture media (blood agar, chocolate blood agar, VCAT medium, CAPE medium, bacitracin chocolate blood agar and VK medium) were compared with literature data concerning five other media (FAA, LB, TSBV, CapR and TBBP media). To understand variable growth on selective media, the MICs of each antimicrobial agent contained in these different media (colistin, kanamycin, trimethoprim, trimethoprim-sulfamethoxazole, vancomycin, aztreonam and bacitracin) were determined for all Capnocytophaga species. Overall, VCAT medium (Columbia, 10% cooked horse blood, polyvitaminic supplement, 3·75 mg l⁻¹ of colistin, 1·5 mg l⁻¹ of trimethoprim, 1 mg l⁻¹ of vancomycin and 0·5 mg l⁻¹ of amphotericin B, Oxoid, France) was the more efficient selective medium with regard to the detection of Capnocytophaga species from oral samples. In pure culture, a simple blood agar allowed the growth of all Capnocytophaga species. Nonetheless, in oral samples, because of the abundance of commensal competitive flora, selective media with antibiotics are necessary for the recovery of Capnocytophaga species. The demonstrated superiority of VCAT medium makes its use essential for the optimal detection of this bacterial genus. This work showed that extreme caution should be exercised when reporting the isolation of Capnocytophaga species from oral polymicrobial samples, because the culture medium is a determining factor. © 2013 The Society for Applied Microbiology.

  1. Assessment of selected contaminants in streambed- and suspended-sediment samples collected in Bexar County, Texas, 2007-09

    Science.gov (United States)

    Wilson, Jennifer T.

    2011-01-01

    Elevated concentrations of sediment-associated contaminants are typically associated with urban areas such as San Antonio, Texas, in Bexar County, the seventh most populous city in the United States. This report describes an assessment of selected sediment-associated contaminants in samples collected in Bexar County from sites on the following streams: Medio Creek, Medina River, Elm Creek, Martinez Creek, Chupaderas Creek, Leon Creek, Salado Creek, and San Antonio River. During 2007-09, the U.S. Geological Survey periodically collected surficial streambed-sediment samples during base flow and suspended-sediment (large-volume suspended-sediment) samples from selected streams during stormwater runoff. All sediment samples were analyzed for major and trace elements and for organic compounds including halogenated organic compounds and polycyclic aromatic hydrocarbons (PAHs). Selected contaminants in streambed and suspended sediments in watersheds of the eight major streams in Bexar County were assessed by using a variety of methods—observations of occurrence and distribution, comparison to sediment-quality guidelines and data from previous studies, statistical analyses, and source indicators. Trace element concentrations were low compared to the consensus-based sediment-quality guidelines threshold effect concentration (TEC) and probable effect concentration (PEC). Trace element concentrations were greater than the TEC in 28 percent of the samples and greater than the PEC in 1.5 percent of the samples. Chromium concentrations exceeded sediment-quality guidelines more frequently than concentrations of any other constituents analyzed in this study (greater than the TEC in 69 percent of samples and greater than the PEC in 8 percent of samples). Mean trace element concentrations generally are lower in Bexar County samples compared to concentrations in samples collected during previous studies in the Austin and Fort Worth, Texas, areas, but considering the relatively

  2. Sample Selection for Training Cascade Detectors

    OpenAIRE

    Vállez, Noelia; Deniz, Oscar; Bueno, Gloria

    2015-01-01

    Automatic detection systems usually require large and representative training datasets in order to obtain good detection and false positive rates. Training datasets are such that the positive set has few samples and/or the negative set should represent anything except the object of interest. In this respect, the negative set typically contains orders of magnitude more images than the positive set. However, imbalanced training databases lead to biased classifiers. In this paper, we focus our a...

  3. Systematic sampling with errors in sample locations

    DEFF Research Database (Denmark)

    Ziegel, Johanna; Baddeley, Adrian; Dorph-Petersen, Karl-Anton

    2010-01-01

    Systematic sampling of points in continuous space is widely used in microscopy and spatial surveys. Classical theory provides asymptotic expressions for the variance of estimators based on systematic sampling as the grid spacing decreases. However, the classical theory assumes that the sample grid is exactly periodic; real physical sampling procedures may introduce errors in the placement of the sample points. This paper studies the effect of errors in sample positioning on the variance of estimators in the case of one-dimensional systematic sampling. First we sketch a general approach to variance analysis using point process methods. We then analyze three different models for the error process, calculate exact expressions for the variances, and derive asymptotic variances. Errors in the placement of sample points can lead to substantial inflation of the variance and dampening of zitterbewegung...
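
The effect analyzed above can be illustrated with a small Monte Carlo experiment: estimate a one-dimensional integral by systematic sampling, with and without Gaussian errors in the placement of the sample points. The integrand, grid size, and error scale are invented for illustration:

```python
# Monte Carlo sketch: variance of a 1-D systematic-sampling estimator with and
# without Gaussian errors in sample placement. Integrand and scales invented.
import math
import random

random.seed(1)
f = lambda x: math.sin(2 * math.pi * x) ** 2   # true mean over [0, 1) is 0.5

def estimate(n, jitter_sd=0.0):
    """One systematic-sampling estimate of the mean of f over [0, 1)."""
    u = random.random() / n                    # random start of the grid
    pts = [(u + i / n + random.gauss(0, jitter_sd)) % 1.0 for i in range(n)]
    return sum(f(x) for x in pts) / n

def variance(jitter_sd, reps=2000, n=16):
    ests = [estimate(n, jitter_sd) for _ in range(reps)]
    m = sum(ests) / reps
    return sum((e - m) ** 2 for e in ests) / reps

# Placement errors inflate the estimator variance relative to an exact grid
assert variance(0.05) > variance(0.0)
```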

  4. Selective whole genome amplification for resequencing target microbial species from complex natural samples.

    Science.gov (United States)

    Leichty, Aaron R; Brisson, Dustin

    2014-10-01

    Population genomic analyses have demonstrated power to address major questions in evolutionary and molecular microbiology. Collecting populations of genomes is hindered in many microbial species by the absence of a cost-effective and practical method to collect ample quantities of sufficiently pure genomic DNA for next-generation sequencing. Here we present a simple method to amplify genomes of a target microbial species present in a complex, natural sample. The selective whole genome amplification (SWGA) technique amplifies target genomes using nucleotide sequence motifs that are common in the target microbe genome, but rare in the background genomes, to prime the highly processive phi29 polymerase. SWGA thus selectively amplifies the target genome from samples in which it originally represented a minor fraction of the total DNA. The post-SWGA samples are enriched in target genomic DNA, which is ideal for population resequencing. We demonstrate the efficacy of SWGA using both laboratory-prepared mixtures of cultured microbes and a natural host-microbe association. Targeted amplification of Borrelia burgdorferi mixed with Escherichia coli at genome ratios of 1:2000 resulted in >10^5-fold amplification of the target genomes. SWGA of genomic extracts from Wolbachia pipientis-infected Drosophila melanogaster resulted in up to 70% of high-throughput resequencing reads mapping to the W. pipientis genome. By contrast, 2-9% of sequencing reads were derived from W. pipientis without prior amplification. The SWGA technique results in high sequencing coverage at a fraction of the sequencing effort, thus allowing population genomic studies at affordable costs. Copyright © 2014 by the Genetics Society of America.
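
The motif-selection idea behind SWGA can be sketched as a k-mer frequency comparison; the sequences below are toy fragments, not real genomes:

```python
# Sketch of the motif-selection idea behind SWGA: count how often each k-mer
# occurs in the target versus background sequence, and keep motifs that are
# common in the target but rare in the background. Sequences are toy strings.
from collections import Counter

def kmer_counts(seq, k):
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

target = "ATATATGCATATATCGATATAT"        # hypothetical target fragment
background = "GGCCGGCCTTGGCCGGCCAAGGCC"  # hypothetical background fragment

k = 4
t, b = kmer_counts(target, k), kmer_counts(background, k)
# Rank motifs by target frequency, requiring absence in the background
primers = sorted((m for m in t if b[m] == 0), key=lambda m: -t[m])

assert primers[0] == "ATAT"   # most frequent target-only motif
```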

  5. A large sample of Kohonen-selected SDSS quasars with weak emission lines: selection effects and statistical properties

    Science.gov (United States)

    Meusinger, H.; Balafkan, N.

    2014-08-01

    Aims: A tiny fraction of the quasar population shows remarkably weak emission lines. Several hypotheses have been developed, but the weak line quasar (WLQ) phenomenon still remains puzzling. The aim of this study was to create a sizeable sample of WLQs and WLQ-like objects and to evaluate various properties of this sample. Methods: We performed a search for WLQs in the spectroscopic data from the Sloan Digital Sky Survey Data Release 7 based on Kohonen self-organising maps for nearly 10^5 quasar spectra. The final sample consists of 365 quasars in the redshift range z = 0.6 - 4.2 (z¯ = 1.50 ± 0.45) and includes in particular a subsample of 46 WLQs with small Mg II equivalent widths. Particular attention was paid to selection effects. Results: The WLQs have, on average, significantly higher luminosities, Eddington ratios, and accretion rates. About half of the excess comes from a selection bias, but an intrinsic excess remains, probably caused primarily by higher accretion rates. The spectral energy distribution shows a bluer continuum at rest-frame wavelengths ≳1500 Å. The variability in the optical and UV is relatively low, even taking the variability-luminosity anti-correlation into account. The percentage of radio-detected quasars and of core-dominant radio sources is significantly higher than for the control sample, whereas the mean radio-loudness is lower. Conclusions: The properties of our WLQ sample can be consistently understood assuming that it consists of a mix of quasars at the beginning of a stage of increased accretion activity and of beamed radio-quiet quasars. The higher luminosities and Eddington ratios in combination with a bluer spectral energy distribution can be explained by hotter continua, i.e. higher accretion rates. If quasar activity consists of subphases with different accretion rates, a change towards a higher rate is probably accompanied by an only slow development of the broad line region. The composite WLQ spectrum can be reasonably matched by the

  6. Study on the effects of sample selection on spectral reflectance reconstruction based on the algorithm of compressive sensing

    International Nuclear Information System (INIS)

    Zhang, Leihong; Liang, Dong

    2016-01-01

    To address the low efficiency and precision of spectral reflectance reconstruction, different training samples are selected in this paper and a spectral reflectance reconstruction method based on the compressive sensing algorithm is presented. Four matte color charts with different numbers of patches, namely the ColorChecker Color Rendition Chart, the ColorChecker SG, the Pantone copperplate-paper spot color card, and the Munsell color card, are chosen as training samples; the spectral image is reconstructed with the compressive sensing, pseudo-inverse, and Wiener algorithms, and the results are compared. The spectral reconstructions are evaluated by root mean square error and color difference accuracy. The experiments show that, under the same reconstruction conditions, the cumulative contribution rate and color difference of the Munsell color card are better than those of the other three charts, and that reconstruction accuracy is affected by the number of color patches in the training sample. A key point is that the uniformity and representativeness of the training sample selection have an important influence on the reconstruction. In this paper, the influence of sample selection on spectral image reconstruction is studied. The precision of spectral reconstruction based on the compressive sensing algorithm is higher than that of traditional spectral reconstruction algorithms. The MATLAB simulation results show that reconstruction precision and efficiency depend on the number of color patches in the training sample. (paper)
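    The pseudo-inverse baseline mentioned in the abstract can be sketched as follows. This is a minimal illustration with invented data (random reflectances and sensor responsivities), not the paper's color charts or its compressive-sensing method: a linear map from camera responses back to reflectance is learned from training samples and applied to a new patch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical training set: 24 color patches, reflectance sampled at 31
# wavelengths (e.g. 400-700 nm), observed through 3 camera channels.
n_patches, n_waves, n_channels = 24, 31, 3
R_train = rng.uniform(0.05, 0.95, size=(n_patches, n_waves))  # training reflectances
M = rng.uniform(size=(n_waves, n_channels))                   # sensor responsivities
C_train = R_train @ M                                         # camera responses

# Pseudo-inverse reconstruction: learn a linear map from camera responses
# back to reflectance using the training samples, then apply it to a new patch.
W = np.linalg.pinv(C_train) @ R_train          # (n_channels x n_waves) map
r_true = rng.uniform(0.05, 0.95, size=n_waves)
r_est = (r_true @ M) @ W

rmse = float(np.sqrt(np.mean((r_est - r_true) ** 2)))
print(round(rmse, 3))
```

    With only three channels the reconstruction is necessarily lossy; the abstract's point is that the RMSE depends strongly on which training patches are chosen.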

  7. Cold Spray Deposition of Freestanding Inconel Samples and Comparative Analysis with Selective Laser Melting

    Science.gov (United States)

    Bagherifard, Sara; Roscioli, Gianluca; Zuccoli, Maria Vittoria; Hadi, Mehdi; D'Elia, Gaetano; Demir, Ali Gökhan; Previtali, Barbara; Kondás, Ján; Guagliano, Mario

    2017-10-01

    Cold spray offers the possibility of obtaining almost zero-porosity buildups with no theoretical limit to the thickness. Moreover, cold spray can eliminate particle melting, evaporation, crystallization, grain growth, unwanted oxidation, undesirable phases and thermally induced tensile residual stresses. Such characteristics can boost its potential to be used as an additive manufacturing technique. Indeed, deposition via cold spray is recently finding its path toward fabrication of freeform components since it can address the common challenges of powder-bed additive manufacturing techniques including major size constraints, deposition rate limitations and high process temperature. Herein, we prepared nickel-based superalloy Inconel 718 samples with cold spray technique and compared them with similar samples fabricated by selective laser melting method. The samples fabricated using both methods were characterized in terms of mechanical strength, microstructural and porosity characteristics, Vickers microhardness and residual stresses distribution. Different heat treatment cycles were applied to the cold-sprayed samples in order to enhance their mechanical characteristics. The obtained data confirm that cold spray technique can be used as a complementary additive manufacturing method for fabrication of high-quality freestanding components where higher deposition rate, larger final size and lower fabrication temperatures are desired.

  8. Sample Curation at a Lunar Outpost

    Science.gov (United States)

    Allen, Carlton C.; Lofgren, Gary E.; Treiman, A. H.; Lindstrom, Marilyn L.

    2007-01-01

    The six Apollo surface missions returned 2,196 individual rock and soil samples, with a total mass of 381.6 kg. Samples were collected based on visual examination by the astronauts and consultation with geologists in the science back room in Houston. The samples were photographed during collection, packaged in uniquely-identified containers, and transported to the Lunar Module. All samples collected on the Moon were returned to Earth. NASA's upcoming return to the Moon will be different. Astronauts will have extended stays at an outpost and will collect more samples than they will return. They will need curation and analysis facilities on the Moon in order to carefully select samples for return to Earth.

  9. Edge Effects in Line Intersect Sampling With

    Science.gov (United States)

    David L. R. Affleck; Timothy G. Gregoire; Harry T. Valentine

    2005-01-01

    Transects consisting of multiple, connected segments with a prescribed configuration are commonly used in ecological applications of line intersect sampling. The transect configuration has implications for the probability with which population elements are selected and for how the selection probabilities can be modified by the boundary of the tract being sampled. As...

  10. Obscured AGN at z ~ 1 from the zCOSMOS-Bright Survey. I. Selection and optical properties of a [Ne v]-selected sample

    Science.gov (United States)

    Mignoli, M.; Vignali, C.; Gilli, R.; Comastri, A.; Zamorani, G.; Bolzonella, M.; Bongiorno, A.; Lamareille, F.; Nair, P.; Pozzetti, L.; Lilly, S. J.; Carollo, C. M.; Contini, T.; Kneib, J.-P.; Le Fèvre, O.; Mainieri, V.; Renzini, A.; Scodeggio, M.; Bardelli, S.; Caputi, K.; Cucciati, O.; de la Torre, S.; de Ravel, L.; Franzetti, P.; Garilli, B.; Iovino, A.; Kampczyk, P.; Knobel, C.; Kovač, K.; Le Borgne, J.-F.; Le Brun, V.; Maier, C.; Pellò, R.; Peng, Y.; Perez Montero, E.; Presotto, V.; Silverman, J. D.; Tanaka, M.; Tasca, L.; Tresse, L.; Vergani, D.; Zucca, E.; Bordoloi, R.; Cappi, A.; Cimatti, A.; Koekemoer, A. M.; McCracken, H. J.; Moresco, M.; Welikala, N.

    2013-08-01

    Aims: The application of multi-wavelength selection techniques is essential for obtaining a complete and unbiased census of active galactic nuclei (AGN). We present here a method for selecting z ~ 1 obscured AGN from optical spectroscopic surveys. Methods: A sample of 94 narrow-line AGN with 0.65 … Taking advantage of the large amount of data available in the COSMOS field, the properties of the [Ne v]-selected type 2 AGN were investigated, focusing on their host galaxies, X-ray emission, and optical line-flux ratios. Finally, a previously developed diagnostic, based on the X-ray-to-[Ne v] luminosity ratio, was exploited to search for the more heavily obscured AGN. Results: We found that [Ne v]-selected narrow-line AGN have Seyfert 2-like optical spectra, although their emission line ratios are diluted by a star-forming component. The ACS morphologies and stellar component in the optical spectra indicate a preference for our type 2 AGN to be hosted in early-type spirals with stellar masses greater than 10^9.5-10 M⊙, on average higher than those of the galaxy parent sample. The fraction of galaxies hosting [Ne v]-selected obscured AGN increases with the stellar mass, reaching a maximum of about 3% at ≈2 × 10^11 M⊙. A comparison with other selection techniques at z ~ 1, namely the line-ratio diagnostics and X-ray detections, shows that the detection of the [Ne v] λ3426 line is an effective method for selecting AGN in the optical band, in particular the most heavily obscured ones, but cannot provide a complete census of type 2 AGN by itself. Finally, the high fraction of [Ne v]-selected type 2 AGN not detected in medium-deep (≈100-200 ks) Chandra observations (67%) is suggestive of the inclusion of Compton-thick (i.e., with N_H > 10^24 cm^-2) sources in our sample. The presence of a population of heavily obscured AGN is corroborated by the X-ray-to-[Ne v] ratio; we estimated, by means of an X-ray stacking technique and simulations, that the Compton-thick fraction in our

  11. Method and apparatus for sampling atmospheric mercury

    Science.gov (United States)

    Trujillo, Patricio E.; Campbell, Evan E.; Eutsler, Bernard C.

    1976-01-20

    A method of simultaneously sampling particulate mercury, organic mercurial vapors, and metallic mercury vapor in the working and occupational environment and determining the amount of mercury derived from each such source in the sampled air. A known volume of air is passed through a sampling tube containing a filter for particulate mercury collection, a first adsorber for the selective adsorption of organic mercurial vapors, and a second adsorber for the adsorption of metallic mercury vapor. Carbon black molecular sieves are particularly useful as the selective adsorber for organic mercurial vapors. The amount of mercury adsorbed or collected in each section of the sampling tube is readily quantitatively determined by flameless atomic absorption spectrophotometry.

  12. Uranium tailings sampling manual

    International Nuclear Information System (INIS)

    Feenstra, S.; Reades, D.W.; Cherry, J.A.; Chambers, D.B.; Case, G.G.; Ibbotson, B.G.

    1985-01-01

    The purpose of this manual is to describe the requisite sampling procedures for the application of uniform high-quality standards to detailed geotechnical, hydrogeological, geochemical and air quality measurements at Canadian uranium tailings disposal sites. The selection and implementation of applicable sampling procedures for such measurements at uranium tailings disposal sites are complicated by two primary factors. Firstly, the physical and chemical nature of uranium mine tailings and effluent is considerably different from natural soil materials and natural waters. Consequently, many conventional methods for the collection and analysis of natural soils and waters are not directly applicable to tailings. Secondly, there is a wide range in the physical and chemical nature of uranium tailings. The composition of the ore, the milling process, the nature of tailings depositon, and effluent treatment vary considerably and are highly site-specific. Therefore, the definition and implementation of sampling programs for uranium tailings disposal sites require considerable evaluation, and often innovation, to ensure that appropriate sampling and analysis methods are used which provide the flexibility to take into account site-specific considerations. The following chapters describe the objective and scope of a sampling program, preliminary data collection, and the procedures for sampling of tailings solids, surface water and seepage, tailings pore-water, and wind-blown dust and radon

  13. Critical evaluation of sample pretreatment techniques.

    Science.gov (United States)

    Hyötyläinen, Tuulia

    2009-06-01

    Sample preparation before chromatographic separation is the most time-consuming and error-prone part of the analytical procedure. Therefore, selecting and optimizing an appropriate sample preparation scheme is a key factor in the final success of the analysis, and the judicious choice of an appropriate procedure greatly influences the reliability and accuracy of a given analysis. The main objective of this review is to critically evaluate the applicability, disadvantages, and advantages of various sample preparation techniques. Particular emphasis is placed on extraction techniques suitable for both liquid and solid samples.

  14. The Atacama Cosmology Telescope: Physical Properties and Purity of a Galaxy Cluster Sample Selected Via the Sunyaev-Zel'Dovich Effect

    Science.gov (United States)

    Menanteau, Felipe; Gonzalez, Jorge; Juin, Jean-Baptiste; Marriage, Tobias; Reese, Erik D.; Acquaviva, Viviana; Aguirre, Paula; Appel, John Willam; Baker, Andrew J.; Barrientos, L. Felipe

    2010-01-01

    We present optical and X-ray properties for the first confirmed galaxy cluster sample selected by the Sunyaev-Zel'dovich Effect from 148 GHz maps over 455 square degrees of sky made with the Atacama Cosmology Telescope. These maps, coupled with multi-band imaging on 4-meter-class optical telescopes, have yielded a sample of 23 galaxy clusters with redshifts between 0.118 and 1.066. Of these 23 clusters, 10 are newly discovered. The selection of this sample is approximately mass limited and essentially independent of redshift. We provide optical positions, images, redshifts and X-ray fluxes and luminosities for the full sample, and X-ray temperatures of an important subset. The mass limit of the full sample is around 8.0 × 10^14 M⊙, with a number distribution that peaks around a redshift of 0.4. For the 10 highest significance SZE-selected cluster candidates, all of which are optically confirmed, the mass threshold is 1 × 10^15 M⊙ and the redshift range is 0.167 to 1.066. Archival observations from Chandra, XMM-Newton, and ROSAT provide X-ray luminosities and temperatures that are broadly consistent with this mass threshold. Our optical follow-up procedure also allowed us to assess the purity of the ACT cluster sample. Eighty (one hundred) percent of the 148 GHz candidates with signal-to-noise ratios greater than 5.1 (5.7) are confirmed as massive clusters. The reported sample represents one of the largest SZE-selected samples of massive clusters over all redshifts within a cosmologically significant survey volume, which will enable cosmological studies as well as future studies on the evolution, morphology, and stellar populations in the most massive clusters in the Universe.

  15. The Viking X ray fluorescence experiment - Sampling strategies and laboratory simulations. [Mars soil sampling

    Science.gov (United States)

    Baird, A. K.; Castro, A. J.; Clark, B. C.; Toulmin, P., III; Rose, H., Jr.; Keil, K.; Gooding, J. L.

    1977-01-01

    Ten samples of Mars regolith material (six on Viking Lander 1 and four on Viking Lander 2) have been delivered to the X ray fluorescence spectrometers as of March 31, 1977. An additional six samples at least are planned for acquisition in the remaining Extended Mission (to January 1979) for each lander. All samples acquired are Martian fines from the near surface (less than 6-cm depth) of the landing sites except the latest on Viking Lander 1, which is fine material from the bottom of a trench dug to a depth of 25 cm. Several attempts on each lander to acquire fresh rock material (in pebble sizes) for analysis have yielded only cemented surface crustal material (duricrust). Laboratory simulation and experimentation are required both for mission planning of sampling and for interpretation of data returned from Mars. This paper is concerned with the rationale for sample site selections, surface sampler operations, and the supportive laboratory studies needed to interpret X ray results from Mars.

  16. Snowball Sampling Techniques in Field Research (Teknik Sampling Snowball dalam Penelitian Lapangan)

    Directory of Open Access Journals (Sweden)

    Nina Nurdiani

    2014-12-01

    Full Text Available Field research can be associated with both qualitative and quantitative research methods, depending on the problems faced and the goals to be achieved. The success of data collection in field research depends on choosing the appropriate sampling technique, so as to obtain accurate and reliable data. Studies dealing with specific issues may require a non-probability sampling technique, one of which is snowball sampling. This technique is useful for finding, identifying, selecting, and recruiting samples along a network or chain of relationships. The procedure is implemented in stages through interviews and questionnaires. Snowball sampling has strengths and weaknesses in its application. A field study of the housing sector serves as the case study illustrating this sampling technique.
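    The chain-referral procedure described above can be sketched in a few lines. This is an illustrative simulation on an invented contact network (the adjacency list, seed choice, and referral limit are all assumptions, not from the study): each sampled respondent refers a few contacts, and the chain grows until the target sample size is reached.

```python
import random
from collections import deque

def snowball_sample(network, seeds, referrals_per_node=2, target_size=10, rng=None):
    """Chain-referral (snowball) sampling sketch: starting from seed
    respondents, each sampled person refers up to `referrals_per_node`
    of their contacts until the target sample size is reached."""
    rng = rng or random.Random(42)
    sampled, queue = set(seeds), deque(seeds)
    while queue and len(sampled) < target_size:
        person = queue.popleft()
        contacts = [c for c in network.get(person, []) if c not in sampled]
        for contact in rng.sample(contacts, min(referrals_per_node, len(contacts))):
            sampled.add(contact)
            queue.append(contact)
            if len(sampled) >= target_size:
                break
    return sampled

# Toy contact network (adjacency list) standing in for a field-survey chain.
network = {i: [(i + d) % 20 for d in (1, 2, 3)] for i in range(20)}
sample = snowball_sample(network, seeds=[0], target_size=8)
print(len(sample))
```

    The weakness noted in the abstract is visible here: which members enter the sample depends entirely on the seeds and the referral chains, so the result is not a probability sample.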

  17. ITOUGH2 sample problems

    International Nuclear Information System (INIS)

    Finsterle, S.

    1997-11-01

    This report contains a collection of ITOUGH2 sample problems. It complements the ITOUGH2 User's Guide [Finsterle, 1997a], and the ITOUGH2 Command Reference [Finsterle, 1997b]. ITOUGH2 is a program for parameter estimation, sensitivity analysis, and uncertainty propagation analysis. It is based on the TOUGH2 simulator for non-isothermal multiphase flow in fractured and porous media [Pruess, 1987, 1991a]. The report ITOUGH2 User's Guide [Finsterle, 1997a] describes the inverse modeling framework and provides the theoretical background. The report ITOUGH2 Command Reference [Finsterle, 1997b] contains the syntax of all ITOUGH2 commands. This report describes a variety of sample problems solved by ITOUGH2. Table 1.1 contains a short description of the seven sample problems discussed in this report. The TOUGH2 equation-of-state (EOS) module that needs to be linked to ITOUGH2 is also indicated. Each sample problem focuses on a few selected issues shown in Table 1.2. ITOUGH2 input features and the usage of program options are described. Furthermore, interpretations of selected inverse modeling results are given. Problem 1 is a multipart tutorial, describing basic ITOUGH2 input files for the main ITOUGH2 application modes; no interpretation of results is given. Problem 2 focuses on non-uniqueness, residual analysis, and correlation structure. Problem 3 illustrates a variety of parameter and observation types, and describes parameter selection strategies. Problem 4 compares the performance of minimization algorithms and discusses model identification. Problem 5 explains how to set up a combined inversion of steady-state and transient data. Problem 6 provides a detailed residual and error analysis. Finally, Problem 7 illustrates how the estimation of model-related parameters may help compensate for errors in that model.
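    The core of the parameter-estimation mode described above is minimization of the misfit between simulated and observed data. The toy below is not ITOUGH2 or TOUGH2: the exponential "forward model", the noise level, and the 1-D grid search are invented stand-ins that illustrate the idea of fitting a flow parameter to transient data.

```python
import numpy as np

# Hypothetical forward model standing in for a TOUGH2 run: a pressure decline
# p(t) = p0 * exp(-k * t), where k plays the role of an unknown flow parameter.
def forward(k, t, p0=10.0):
    return p0 * np.exp(-k * t)

rng = np.random.default_rng(1)
t = np.linspace(0.0, 5.0, 25)
k_true = 0.7
observed = forward(k_true, t) + rng.normal(0.0, 0.05, t.size)  # noisy "data"

# Inverse problem: pick the parameter minimizing the sum of squared residuals
# between model output and observations (here reduced to a 1-D grid search;
# ITOUGH2 itself uses gradient-based minimization algorithms).
candidates = np.linspace(0.1, 1.5, 281)
objective = [np.sum((forward(k, t) - observed) ** 2) for k in candidates]
k_est = candidates[int(np.argmin(objective))]
print(round(k_est, 2))
```

    Issues the sample problems explore, such as non-uniqueness and residual analysis, correspond to a flat or multi-modal objective curve and to the structure of `forward(k_est, t) - observed`.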

  18. Information sampling behavior with explicit sampling costs

    Science.gov (United States)

    Juni, Mordechai Z.; Gureckis, Todd M.; Maloney, Laurence T.

    2015-01-01

    The decision to gather information should take into account both the value of information and its accrual costs in time, energy and money. Here we explore how people balance the monetary costs and benefits of gathering additional information in a perceptual-motor estimation task. Participants were rewarded for touching a hidden circular target on a touch-screen display. The target’s center coincided with the mean of a circular Gaussian distribution from which participants could sample repeatedly. Each “cue” — sampled one at a time — was plotted as a dot on the display. Participants had to repeatedly decide, after sampling each cue, whether to stop sampling and attempt to touch the hidden target or continue sampling. Each additional cue increased the participants’ probability of successfully touching the hidden target but reduced their potential reward. Two experimental conditions differed in the initial reward associated with touching the hidden target and the fixed cost per cue. For each condition we computed the optimal number of cues that participants should sample, before taking action, to maximize expected gain. Contrary to recent claims that people gather less information than they objectively should before taking action, we found that participants over-sampled in one experimental condition, and did not significantly under- or over-sample in the other. Additionally, while the ideal observer model ignores the current sample dispersion, we found that participants used it to decide whether to stop sampling and take action or continue sampling, a possible consequence of imperfect learning of the underlying population dispersion across trials. PMID:27429991
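    The optimal-sampling computation described above can be sketched with a simple model. The numbers and the hit-probability formula below are illustrative assumptions, not the paper's fitted model: the sample mean of n cues from a 2-D circular Gaussian has standard deviation sigma/sqrt(n), so the probability of landing within the target radius follows a Rayleigh CDF, while each cue subtracts a fixed cost from the reward.

```python
import math

def expected_gain(n, reward0=5.0, cost=0.1, sigma=1.0, radius=0.4):
    """Expected gain after n cues: the sample-mean estimate of the target
    center is 2-D Gaussian with std sigma/sqrt(n), so the probability of
    touching within `radius` is a Rayleigh CDF (illustrative model)."""
    p_hit = 1.0 - math.exp(-n * radius ** 2 / (2.0 * sigma ** 2))
    return (reward0 - cost * n) * p_hit

# Optimal number of cues: the n maximizing expected gain before acting.
gains = {n: expected_gain(n) for n in range(1, 41)}
n_opt = max(gains, key=gains.get)
print(n_opt)
```

    Comparing participants' actual stopping points against `n_opt` is how over- or under-sampling is diagnosed in this kind of task.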

  19. Samples and Sampling Protocols for Scientific Investigations | Joel ...

    African Journals Online (AJOL)

    ... from sampling, through sample preparation, calibration to final measurement and reporting. This paper, therefore offers useful information on practical guidance on sampling protocols in line with best practice and international standards. Keywords: Sampling, sampling protocols, chain of custody, analysis, documentation ...

  20. Evaluation of sampling strategies to estimate crown biomass

    Directory of Open Access Journals (Sweden)

    Krishna P Poudel

    2015-01-01

    Full Text Available Background Depending on tree and site characteristics, crown biomass accounts for a significant portion of the total aboveground biomass in the tree. Crown biomass estimation is useful for different purposes including evaluating the economic feasibility of crown utilization for energy production or forest products, fuel load assessments and fire management strategies, and wildfire modeling. However, crown biomass is difficult to predict because of the variability within and among species and sites. Thus the allometric equations used for predicting crown biomass should be based on data collected with precise and unbiased sampling strategies. In this study, we evaluate the performance of different sampling strategies to estimate crown biomass and evaluate the effect of sample size on the estimates. Methods Using data collected from 20 destructively sampled trees, we evaluated 11 different sampling strategies using six evaluation statistics: bias, relative bias, root mean square error (RMSE), relative RMSE, amount of biomass sampled, and relative biomass sampled. We also evaluated the performance of the selected sampling strategies when different numbers of branches (3, 6, 9, and 12) are selected from each tree. A tree-specific log-linear model with branch diameter and branch length as covariates was used to obtain individual branch biomass. Results Compared to all other methods, stratified sampling with probability-proportional-to-size estimation produced better results when three or six branches per tree were sampled. However, systematic sampling with ratio estimation was the best when at least nine branches per tree were sampled. Under the stratified sampling strategy, selecting an unequal number of branches per stratum produced results approximately similar to simple random sampling, but RMSE decreased further when information on branch diameter was used in the design and estimation phases. Conclusions Use of
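    The probability-proportional-to-size estimator mentioned above can be sketched for a single tree. The branch diameters and the biomass-diameter relation below are invented for illustration (the study fits tree-specific log-linear models instead); the Hansen-Hurwitz form shown is a standard PPS-with-replacement estimator, not necessarily the exact one used in the paper.

```python
import random

random.seed(7)

# Hypothetical tree: 12 branches with known diameters (the size variable)
# and "true" biomasses that scale roughly with diameter squared.
diameters = [2, 3, 3, 4, 5, 5, 6, 7, 8, 9, 10, 12]
biomass = [0.4 * d ** 2 for d in diameters]
total = sum(biomass)

def pps_estimate(n):
    """Hansen-Hurwitz estimator of total crown biomass under sampling
    with probability proportional to branch diameter, with replacement."""
    size_total = sum(diameters)
    picks = random.choices(range(len(diameters)), weights=diameters, k=n)
    # Each draw i contributes y_i / (n * p_i), where p_i = d_i / size_total.
    return sum(biomass[i] * size_total / (diameters[i] * n) for i in picks)

estimates = [pps_estimate(6) for _ in range(2000)]
mean_est = sum(estimates) / len(estimates)
print(round(mean_est, 1), round(total, 1))
```

    Because the biomass of a branch correlates with its diameter, weighting selection by diameter keeps the estimator unbiased while reducing its variance relative to simple random sampling of branches.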

  1. Soil sampling strategies: Evaluation of different approaches

    Energy Technology Data Exchange (ETDEWEB)

    De Zorzi, Paolo [Agenzia per la Protezione dell' Ambiente e per i Servizi Tecnici (APAT), Servizio Metrologia Ambientale, Via di Castel Romano, 100-00128 Roma (Italy)], E-mail: paolo.dezorzi@apat.it; Barbizzi, Sabrina; Belli, Maria [Agenzia per la Protezione dell' Ambiente e per i Servizi Tecnici (APAT), Servizio Metrologia Ambientale, Via di Castel Romano, 100-00128 Roma (Italy); Mufato, Renzo; Sartori, Giuseppe; Stocchero, Giulia [Agenzia Regionale per la Prevenzione e Protezione dell' Ambiente del Veneto, ARPA Veneto, U.O. Centro Qualita Dati, Via Spalato, 14-36045 Vicenza (Italy)

    2008-11-15

    The National Environmental Protection Agency of Italy (APAT) performed a soil sampling intercomparison, inviting 14 regional agencies to test their own soil sampling strategies. The intercomparison was carried out at a reference site, previously characterised for metal mass fraction distribution. A wide range of sampling strategies, in terms of sampling patterns, type and number of samples collected, were used to assess the mean mass fraction values of some selected elements. The different strategies led in general to acceptable bias values (D) less than 2σ, calculated according to ISO 13258. Sampling on arable land was relatively easy, with comparable results between different sampling strategies.

  2. Soil sampling strategies: Evaluation of different approaches

    International Nuclear Information System (INIS)

    De Zorzi, Paolo; Barbizzi, Sabrina; Belli, Maria; Mufato, Renzo; Sartori, Giuseppe; Stocchero, Giulia

    2008-01-01

    The National Environmental Protection Agency of Italy (APAT) performed a soil sampling intercomparison, inviting 14 regional agencies to test their own soil sampling strategies. The intercomparison was carried out at a reference site, previously characterised for metal mass fraction distribution. A wide range of sampling strategies, in terms of sampling patterns, type and number of samples collected, were used to assess the mean mass fraction values of some selected elements. The different strategies led in general to acceptable bias values (D) less than 2σ, calculated according to ISO 13258. Sampling on arable land was relatively easy, with comparable results between different sampling strategies

  3. Soil sampling strategies: evaluation of different approaches.

    Science.gov (United States)

    de Zorzi, Paolo; Barbizzi, Sabrina; Belli, Maria; Mufato, Renzo; Sartori, Giuseppe; Stocchero, Giulia

    2008-11-01

    The National Environmental Protection Agency of Italy (APAT) performed a soil sampling intercomparison, inviting 14 regional agencies to test their own soil sampling strategies. The intercomparison was carried out at a reference site, previously characterised for metal mass fraction distribution. A wide range of sampling strategies, in terms of sampling patterns, type and number of samples collected, were used to assess the mean mass fraction values of some selected elements. The different strategies led in general to acceptable bias values (D) less than 2σ, calculated according to ISO 13258. Sampling on arable land was relatively easy, with comparable results between different sampling strategies.
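    The acceptance criterion used in the intercomparison, a bias D below twice the target standard deviation, can be sketched as follows. The reference value, sigma, and agency results are invented numbers for illustration; ISO 13258 specifies the actual procedure.

```python
import statistics

# Acceptance check sketch: an agency's mean mass fraction is acceptable when
# its bias D from the reference value is below 2 sigma (after ISO 13258).
reference_value = 32.0      # reference-site mass fraction, mg/kg (invented)
target_sigma = 1.5          # target standard deviation (invented)

agency_results = [31.2, 33.1, 30.8, 32.9, 31.7]   # one agency's samples
agency_mean = statistics.mean(agency_results)
bias = abs(agency_mean - reference_value)
acceptable = bias < 2 * target_sigma
print(round(bias, 2), acceptable)
```

    Each of the 14 agencies' strategies would be scored this way against the characterised reference site, element by element.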

  4. Subrandom methods for multidimensional nonuniform sampling.

    Science.gov (United States)

    Worley, Bradley

    2016-08-01

    Methods of nonuniform sampling that utilize pseudorandom number sequences to select points from a weighted Nyquist grid are commonplace in biomolecular NMR studies, due to the beneficial incoherence introduced by pseudorandom sampling. However, these methods require the specification of a non-arbitrary seed number in order to initialize a pseudorandom number generator. Because the performance of pseudorandom sampling schedules can substantially vary based on seed number, this can complicate the task of routine data collection. Approaches such as jittered sampling and stochastic gap sampling are effective at reducing random seed dependence of nonuniform sampling schedules, but still require the specification of a seed number. This work formalizes the use of subrandom number sequences in nonuniform sampling as a means of seed-independent sampling, and compares the performance of three subrandom methods to their pseudorandom counterparts using commonly applied schedule performance metrics. Reconstruction results using experimental datasets are also provided to validate claims made using these performance metrics. Copyright © 2016 Elsevier Inc. All rights reserved.
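    The seed-independence argument above can be illustrated with the simplest subrandom generator, an additive recurrence based on the golden ratio. The 1-D unweighted grid below is an invented simplification (the paper concerns weighted multidimensional Nyquist grids), but it shows the key property: the schedule is fully determined with no seed number.

```python
import math

def subrandom_schedule(grid_size, n_points):
    """Seed-independent nonuniform sampling sketch: an additive-recurrence
    (golden-ratio) subrandom sequence mapped onto a 1-D Nyquist grid.
    Unlike a pseudorandom schedule, no seed is needed and the points
    are low-discrepancy by construction."""
    alpha = (math.sqrt(5) - 1) / 2          # fractional part of the golden ratio
    points, used = [], set()
    x = 0.0
    while len(points) < n_points:
        x = (x + alpha) % 1.0               # equidistributed mod 1
        idx = int(x * grid_size)
        if idx not in used:                 # keep schedule points unique
            used.add(idx)
            points.append(idx)
    return sorted(points)

schedule = subrandom_schedule(grid_size=64, n_points=16)
print(schedule)
```

    Running this twice gives the identical schedule, which is exactly the property that removes the seed-selection step from routine data collection.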

  5. Adrenal vein sampling in primary aldosteronism: concordance of simultaneous vs sequential sampling.

    Science.gov (United States)

    Almarzooqi, Mohamed-Karji; Chagnon, Miguel; Soulez, Gilles; Giroux, Marie-France; Gilbert, Patrick; Oliva, Vincent L; Perreault, Pierre; Bouchard, Louis; Bourdeau, Isabelle; Lacroix, André; Therasse, Eric

    2017-02-01

    Many investigators believe that basal adrenal venous sampling (AVS) should be done simultaneously, whereas others opt for sequential AVS for simplicity and reduced cost. This study aimed to evaluate the concordance of sequential and simultaneous AVS methods. Between 1989 and 2015, bilateral simultaneous sets of basal AVS were obtained twice within 5 min, in 188 consecutive patients (59 women and 129 men; mean age: 53.4 years). Selectivity was defined by an adrenal-to-peripheral cortisol ratio ≥2, and lateralization was defined as an adrenal aldosterone-to-cortisol ratio ≥2 times that of the contralateral side. Sequential AVS was simulated using right sampling at -5 min (t = -5) and left sampling at 0 min (t = 0). There was no significant difference in mean selectivity ratio (P = 0.12 and P = 0.42 for the right and left sides respectively) or in mean lateralization ratio (P = 0.93) between t = -5 and t = 0. Kappa for selectivity between 2 simultaneous AVS was 0.71 (95% CI: 0.60-0.82), whereas it was 0.84 (95% CI: 0.76-0.92) and 0.85 (95% CI: 0.77-0.93) between sequential and simultaneous AVS at -5 min and at 0 min, respectively. Kappa for lateralization between 2 simultaneous AVS was 0.84 (95% CI: 0.75-0.93), whereas it was 0.86 (95% CI: 0.78-0.94) and 0.80 (95% CI: 0.71-0.90) between sequential AVS and simultaneous AVS at -5 min and at 0 min, respectively. Concordance between simultaneous and sequential AVS was not different from that between 2 repeated simultaneous AVS in the same patient. Therefore, better diagnostic performance is not a good argument for selecting the AVS method. © 2017 European Society of Endocrinology.
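    The kappa statistics reported above measure chance-corrected agreement between two classifications. A minimal sketch with invented patient data (not the study's 188 patients) shows the computation for a binary lateralization call under two methods:

```python
def cohens_kappa(ratings_a, ratings_b):
    """Cohen's kappa for two binary ratings (e.g. lateralization yes/no
    from simultaneous vs sequential AVS); toy data, not the study's."""
    n = len(ratings_a)
    observed = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    p_a_yes = sum(ratings_a) / n
    p_b_yes = sum(ratings_b) / n
    # Agreement expected by chance from the two marginal "yes" rates.
    expected = p_a_yes * p_b_yes + (1 - p_a_yes) * (1 - p_b_yes)
    return (observed - expected) / (1 - expected)

# 1 = lateralized, 0 = not, for 12 hypothetical patients under two methods.
simultaneous = [1, 1, 0, 1, 0, 1, 1, 0, 0, 1, 1, 0]
sequential   = [1, 1, 0, 1, 0, 1, 0, 0, 0, 1, 1, 1]
kappa = cohens_kappa(simultaneous, sequential)
print(round(kappa, 2))
```

    The study's argument rests on comparing such kappa values: sequential-vs-simultaneous agreement was no worse than the agreement between two repeated simultaneous samplings.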

  6. Probability Sampling Method for a Hidden Population Using Respondent-Driven Sampling: Simulation for Cancer Survivors.

    Science.gov (United States)

    Jung, Minsoo

    2015-01-01

    When there is no sampling frame within a certain group, or the group is concerned that making its population public would bring social stigma, we say the population is hidden. Such populations are difficult to approach with standard survey methodology because response rates are low and members may not respond honestly when probability sampling is used. The only alternative known to address the problems of previous methods such as snowball sampling is respondent-driven sampling (RDS), developed by Heckathorn and his colleagues. RDS is based on a Markov chain and uses the social network information of the respondent. This characteristic allows for probability sampling when surveying a hidden population. We verified through computer simulation whether RDS can be used on a hidden population of cancer survivors. According to the simulation results of this thesis, the dependence of RDS chain-referral sampling on the initial seeds tends to diminish as the sample gets bigger, and the sample becomes stabilized as the waves progress. This shows that the final sample information can be completely independent of the initial seeds if a certain sample size is secured, even if the initial seeds were selected through convenience sampling. Thus, RDS can be considered an alternative that improves upon both key-informant sampling and ethnographic surveys, and it should be applied to various cases domestically as well.
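    The Markov-chain property underlying RDS can be illustrated with a toy simulation. The two-group population, the 80% homophily, and the one-referral-per-wave chain below are invented assumptions (and this omits Heckathorn's weighting estimator): the point is only that after enough waves the chain's composition forgets the convenience seed and approaches the chain's stationary distribution.

```python
import random

random.seed(3)

# Toy hidden population: 200 people in two groups (0/1) with homophilous ties.
people = [i % 2 for i in range(200)]

def refer(current):
    """Each respondent refers one contact; contacts are mostly (80%) from
    the respondent's own group, mimicking network homophily."""
    same = random.random() < 0.8
    group = people[current] if same else 1 - people[current]
    candidates = [i for i in range(200) if people[i] == group and i != current]
    return random.choice(candidates)

# Start from a convenience seed (person 0, group 0) and record the group
# membership of each recruit, wave by wave.
node, waves = 0, []
for _ in range(500):
    node = refer(node)
    waves.append(people[node])

late_share = sum(waves[100:]) / len(waves[100:])   # composition after burn-in
print(round(late_share, 2))
```

    Although every chain starts in group 0, the late-wave share of group 1 settles near the stationary value of 0.5, which is the sense in which the final sample is independent of the seeds.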

  7. A redshift survey of IRAS galaxies. I. Sample selection

    International Nuclear Information System (INIS)

    Strauss, M.A.; Davis, M.; Yahil, A.; Huchra, J.P.

    1990-01-01

    A complete all-sky sample of objects, flux-limited at 60 microns, has been extracted from the database of the IRAS. The sample consists of 5014 objects, of which 2649 are galaxies and 13 are not yet identified. In order to study large-scale structure with this sample, it must be free of systematic biases. Corrections are applied for a major systematic effect in the flux densities listed in the IRAS Point Source Catalog: sources resolved by the IRAS beam have flux densities systematically underestimated. In addition, accurate flux densities are obtained for sources flagged as variable, or of moderate flux quality at 60 microns. The IRAS detectors suffered radiation-induced responsivity enhancement (hysteresis) due to crossings of the satellite scans across the Galactic plane; this effect is measured and is shown to be negligible. 53 refs

  8. Sampling dynamics: an alternative to payoff-monotone selection dynamics

    DEFF Research Database (Denmark)

    Berkemer, Rainer

    Sampling dynamics is neither payoff-monotone nor payoff-positive, which has interesting consequences. This can be demonstrated by application to the travelers dilemma, a deliberately constructed social dilemma. The game has just one symmetric Nash equilibrium, which is Pareto inefficient. Especially when the travelers have many … of the standard game theory result. Both analytical tools and agent-based simulation are used to investigate the dynamic stability of sampling equilibria in a generalized travelers dilemma. Two parameters are of interest: the number of strategy options (m) available to each traveler and an experience parameter (k), which indicates the number of samples an agent would evaluate before fixing his decision. The special case (k=1) can be treated analytically. The stationary points of the dynamics must be sampling equilibria, and one can calculate that for m>3 there will be an interior solution in addition

  9. Development of ion imprinted polymers for the selective extraction of lanthanides from environmental samples

    International Nuclear Information System (INIS)

    Moussa, Manel

    2016-01-01

The analysis of lanthanide ions present at trace level in complex environmental matrices often requires a purification and preconcentration step. Solid phase extraction (SPE) is the most widely used sample preparation technique. To improve the selectivity of this step, Ion Imprinted Polymers (IIPs) can be used as SPE solid supports. The aim of this work was the development of IIPs for the selective extraction of lanthanide ions from environmental samples. In a first part, IIPs were prepared according to the trapping approach using 5,7-dichloroquinoline-8-ol as a non-vinylated ligand. For the first time, the loss of the trapped ligand during the template ion removal and sedimentation steps was demonstrated by HPLC-UV. Moreover, this loss was not repeatable, which led to a lack of repeatability of the SPE profiles. It was then demonstrated that the trapping approach is not appropriate for IIP synthesis. In a second part, IIPs were synthesized by chemical immobilization of methacrylic acid as a vinylated monomer. The repeatability of the synthesis and of the SPE protocol was confirmed. A good selectivity of the IIPs for all the lanthanide ions was obtained. IIPs were successfully used to selectively extract lanthanide ions from tap and river water. Finally, IIPs were synthesized by chemical immobilization of methacrylic acid and 4-vinylpyridine as functional monomers and either a light (Nd 3+ ) or a heavy (Er 3+ ) lanthanide ion as template. Both kinds of IIPs led to a similar selectivity for all lanthanide ions. Nevertheless, this selectivity can be modified by changing the nature and the pH of the washing solution used in the SPE protocol. (author)

  10. Sampling and chemical analysis in environmental samples around Nuclear Power Plants and some environmental samples

    Energy Technology Data Exchange (ETDEWEB)

    Cho, Yong Woo; Han, Man Jung; Cho, Seong Won; Cho, Hong Jun; Oh, Hyeon Kyun; Lee, Jeong Min; Chang, Jae Sook [KORTIC, Taejon (Korea, Republic of)

    2002-12-15

Twelve kinds of environmental samples, such as soil, seawater and underground water, were collected around Nuclear Power Plants (NPPs). Tritium analysis was performed on samples of rain water, pine needles, air, seawater, underground water, chinese cabbage, rice grains and milk collected around the NPPs, and on surface seawater and rain water sampled across the country. Strontium in soil sampled at 60 districts across Korea was analyzed. Tritium was analyzed in 21 samples of surface seawater from around the Korean peninsula supplied by KFRDI (National Fisheries Research and Development Institute). Sampling and chemical analysis of environmental samples around the Kori, Woolsung, Youngkwang and Wooljin NPPs and the Taeduk science town for tritium and strontium analysis proceeded according to plan. All samples were then handed over to KINS.

  11. Progressive sample processing of band selection for hyperspectral imagery

    Science.gov (United States)

    Liu, Keng-Hao; Chien, Hung-Chang; Chen, Shih-Yu

    2017-10-01

Band selection (BS) is one of the most important topics in hyperspectral image (HSI) processing. The objective of BS is to find a set of representative bands that can represent the whole image with low inter-band redundancy. Many types of BS algorithms have been proposed in the past. However, most of them operate in an off-line manner, meaning they can only be applied to pre-collected data. Such off-line methods are of limited use in time-critical applications, particularly disaster prevention and target detection. To tackle this issue, a new concept, called progressive sample processing (PSP), was proposed recently. PSP is an "on-line" framework in which a specific type of algorithm can process the currently collected data during data transmission under the band-interleaved-by-sample/pixel (BIS/BIP) protocol. This paper proposes an online BS method that integrates sparse-based BS into the PSP framework, called PSP-BS. In PSP-BS, BS is carried out by updating the BS result recursively pixel by pixel, in the same way that a Kalman filter updates data information in a recursive fashion. The sparse regression is solved by the orthogonal matching pursuit (OMP) algorithm, and the recursive equations of PSP-BS are derived using matrix decomposition. Experiments conducted on a real hyperspectral image show that PSP-BS can progressively output the BS status with very low computing time. Convergence of the BS results during transmission can be achieved quickly by using a rearranged pixel transmission sequence. This significant advantage allows BS to be implemented in real time as the HSI data is transmitted pixel by pixel.
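The sparse solver at the heart of PSP-BS is orthogonal matching pursuit. A minimal stand-alone OMP sketch on synthetic data (illustrative only, not the recursive pixel-by-pixel PSP-BS update itself):

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal matching pursuit: greedily pick k columns of A
    (here, candidate bands) that best reconstruct the signal y."""
    residual = y.astype(float).copy()
    support = []
    for _ in range(k):
        scores = np.abs(A.T @ residual)   # correlate residual with all columns
        scores[support] = -np.inf         # skip already-selected columns
        support.append(int(np.argmax(scores)))
        # least-squares refit on the current support, then update residual
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    return sorted(support)

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))         # 50 pixels x 20 candidate bands
x_true = np.zeros(20)
x_true[[3, 7, 15]] = [2.0, -1.5, 1.0]     # 3 truly informative columns
y = A @ x_true                            # noiseless 3-sparse observation
print(omp(A, y, 3))                       # recovers the active columns
```

In PSP-BS this greedy fit would be updated recursively as each new pixel arrives rather than recomputed from scratch.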

  12. THE SAMPLING PROCESS IN THE FINANCIAL AUDIT. TECHNICAL PRACTICE APPROACH

    Directory of Open Access Journals (Sweden)

    Cardos Vasile-Daniel

    2014-12-01

“Audit sampling” assumes applying audit procedures to less than 100% of the elements within an account balance or transaction class, such that all sampling units have a chance of selection. This allows the auditor to obtain and evaluate audit evidence about selected characteristics of the sampled elements, in order to form, or assist in forming, a conclusion concerning the population from which the sample was drawn. Audit sampling can use either a statistical or a non-statistical approach. (INTERNATIONAL STANDARD ON AUDITING 530 – AUDIT SAMPLING AND OTHER SELECTIVE TESTING PROCEDURES)
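The statistical approach can be as simple as random selection without replacement from the population of account items. A minimal sketch (the invoice population and function name are hypothetical):

```python
import random

def select_audit_sample(items, sample_size, seed=42):
    """Simple random selection (without replacement) of account items
    for audit testing: fewer than 100% of the elements are examined,
    and every item has an equal chance of selection."""
    rng = random.Random(seed)   # fixed seed makes the selection reproducible
    return rng.sample(items, sample_size)

invoices = [f"INV-{i:04d}" for i in range(1, 501)]   # population of 500 invoices
sample = select_audit_sample(invoices, 30)
print(len(sample))   # 30 distinct invoices drawn from the population
```

A fixed seed documents the selection so another auditor can reproduce exactly which items were tested.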

  13. THE SAMPLING PROCESS IN THE FINANCIAL AUDIT. TECHNICAL PRACTICE APPROACH

    Directory of Open Access Journals (Sweden)

    GRIGORE MARIAN

    2014-07-01

“Audit sampling” assumes applying audit procedures to less than 100% of the elements within an account balance or transaction class, such that all sampling units have a chance of selection. This allows the auditor to obtain and evaluate audit evidence about selected characteristics of the sampled elements, in order to form, or assist in forming, a conclusion concerning the population from which the sample was drawn. Audit sampling can use either a statistical or a non-statistical approach. (INTERNATIONAL STANDARD ON AUDITING 530 – AUDIT SAMPLING AND OTHER SELECTIVE TESTING PROCEDURES)

  14. Sampling Efficiency and Performance of Selected Thoracic Aerosol Samplers.

    Science.gov (United States)

    Görner, Peter; Simon, Xavier; Boivin, Alexis; Bau, Sébastien

    2017-08-01

Measurement of worker exposure to the thoracic health-related aerosol fraction is necessary in a number of occupational situations. This is the case in workplaces with atmospheres polluted by fibrous particles, such as cotton dust or asbestos, and by particles inducing irritation or bronchoconstriction, such as acid mists or flour dust. Three personal and two static thoracic aerosol samplers were tested under laboratory conditions. Sampling efficiency with respect to particle aerodynamic diameter was measured in a horizontal low wind tunnel and in a vertical calm air chamber. Sampling performance was evaluated against conventional thoracic penetration. Three of the tested samplers performed well when sampling the thoracic aerosol at nominal flow rate, and two others performed well at optimized flow rate. The limit of flow rate optimization was reached when using cyclone samplers. © The Author 2017. Published by Oxford University Press on behalf of the British Occupational Hygiene Society.

  15. Determination of selected metals in coal samples from Lafia-Obi and ...

    African Journals Online (AJOL)

    coal samples were determined using atomic absorption spectroscopy (AAS). All the samples have comparable chromium and copper contents, while iron, aluminum, magnesium and potassium content vary to some extent. Metals concentrations in both Lafia-Obi and Chikila coal samples are within the limits allowed by the ...

  16. 40 CFR Appendix X to Part 86 - Sampling Plans for Selective Enforcement Auditing of Heavy-Duty Engines and Light-Duty Trucks

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 19 2010-07-01 2010-07-01 false Sampling Plans for Selective Enforcement Auditing of Heavy-Duty Engines and Light-Duty Trucks X Appendix X to Part 86 Protection of... Plans for Selective Enforcement Auditing of Heavy-Duty Engines and Light-Duty Trucks Table 1—Sampling...

  17. Systematic Sampling and Cluster Sampling of Packet Delays

    OpenAIRE

    Lindh, Thomas

    2006-01-01

Based on experiences of a traffic flow performance meter, this paper suggests and evaluates cluster sampling and systematic sampling as methods to estimate average packet delays. Systematic sampling facilitates for example time analysis, frequency analysis and jitter measurements. Cluster sampling with repeated trains of periodically spaced sampling units separated by random starting periods, and systematic sampling, are evaluated with respect to accuracy and precision. Packet delay traces have been ...
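The two schemes can be sketched on a synthetic delay trace (exponential delays and all parameter values are illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(1)
# synthetic per-packet delays in milliseconds, true mean 10 ms
delays = rng.exponential(scale=10.0, size=100_000)

def systematic_sample(x, period, start):
    """Systematic sampling: every `period`-th packet from a random offset."""
    return x[start::period]

def cluster_sample(x, train_len, period, rng):
    """Cluster sampling: trains of `train_len` consecutive packets,
    periodically spaced, with a random starting period."""
    start = int(rng.integers(0, period))
    idx = []
    for s in range(start, len(x) - train_len, period):
        idx.extend(range(s, s + train_len))
    return x[np.array(idx)]

sys_est = systematic_sample(delays, period=100, start=int(rng.integers(0, 100))).mean()
clu_est = cluster_sample(delays, train_len=5, period=500, rng=rng).mean()
print(round(sys_est, 2), round(clu_est, 2))   # both estimates close to 10 ms
```

With independent delays both estimators are unbiased; the paper's comparison concerns their precision when delays are correlated in time.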

  18. Innovative methods for inorganic sample preparation

    Energy Technology Data Exchange (ETDEWEB)

    Essling, A.M.; Huff, E.A.; Graczyk, D.G.

    1992-04-01

    Procedures and guidelines are given for the dissolution of a variety of selected materials using fusion, microwave, and Parr bomb techniques. These materials include germanium glass, corium-concrete mixtures, and zeolites. Emphasis is placed on sample-preparation approaches that produce a single master solution suitable for complete multielement characterization of the sample. In addition, data are presented on the soil microwave digestion method approved by the Environmental Protection Agency (EPA). Advantages and disadvantages of each sample-preparation technique are summarized.

  19. Innovative methods for inorganic sample preparation

    International Nuclear Information System (INIS)

    Essling, A.M.; Huff, E.A.; Graczyk, D.G.

    1992-04-01

    Procedures and guidelines are given for the dissolution of a variety of selected materials using fusion, microwave, and Parr bomb techniques. These materials include germanium glass, corium-concrete mixtures, and zeolites. Emphasis is placed on sample-preparation approaches that produce a single master solution suitable for complete multielement characterization of the sample. In addition, data are presented on the soil microwave digestion method approved by the Environmental Protection Agency (EPA). Advantages and disadvantages of each sample-preparation technique are summarized

  20. Optimal sampling strategy for data mining

    International Nuclear Information System (INIS)

    Ghaffar, A.; Shahbaz, M.; Mahmood, W.

    2013-01-01

Modern technologies such as the Internet, corporate intranets, data warehouses, ERP systems, satellites, digital sensors, embedded systems and mobile networks generate such a massive amount of data that it is becoming very difficult to analyze and understand it all, even using data mining tools. Huge datasets pose a difficult challenge for classification algorithms: with increasing amounts of data, data mining algorithms get slower and analysis becomes less interactive. Sampling can be a solution; using a fraction of the computing resources, sampling can often provide the same level of accuracy. The sampling process requires much care because many factors are involved in determining the correct sample size. The approach proposed in this paper addresses this problem. Based on a statistical formula, after setting some parameters, it returns a sample size called the sufficient sample size, which is then selected through probability sampling. Results indicate the usefulness of this technique in coping with the problem of huge datasets. (author)
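The abstract does not reproduce its statistical formula; a common stand-in for this kind of calculation is Cochran's sample-size formula for a proportion with a finite-population correction, sketched here (an assumption, not necessarily the authors' exact formula):

```python
import math

def cochran_sample_size(z, p, e, population=None):
    """Cochran's formula n0 = z^2 * p * (1 - p) / e^2 for estimating a
    proportion p with margin of error e at confidence level z, with an
    optional finite-population correction for a population of known size."""
    n0 = (z ** 2) * p * (1 - p) / (e ** 2)
    if population is not None:
        n0 = n0 / (1 + (n0 - 1) / population)   # finite-population correction
    return math.ceil(n0)

# 95% confidence (z = 1.96), worst-case p = 0.5, 5% margin of error
print(cochran_sample_size(1.96, 0.5, 0.05))                      # 385
print(cochran_sample_size(1.96, 0.5, 0.05, population=10_000))   # smaller with FPC
```

The resulting size would then be drawn via probability sampling (e.g. simple random sampling) as the abstract describes.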

  1. A simulative comparison of respondent driven sampling with incentivized snowball sampling – the “strudel effect”

    Science.gov (United States)

    Gyarmathy, V. Anna; Johnston, Lisa G.; Caplinskiene, Irma; Caplinskas, Saulius; Latkin, Carl A.

    2014-01-01

    Background Respondent driven sampling (RDS) and Incentivized Snowball Sampling (ISS) are two sampling methods that are commonly used to reach people who inject drugs (PWID). Methods We generated a set of simulated RDS samples on an actual sociometric ISS sample of PWID in Vilnius, Lithuania (“original sample”) to assess if the simulated RDS estimates were statistically significantly different from the original ISS sample prevalences for HIV (9.8%), Hepatitis A (43.6%), Hepatitis B (Anti-HBc 43.9% and HBsAg 3.4%), Hepatitis C (87.5%), syphilis (6.8%) and Chlamydia (8.8%) infections and for selected behavioral risk characteristics. Results The original sample consisted of a large component of 249 people (83% of the sample) and 13 smaller components with 1 to 12 individuals. Generally, as long as all seeds were recruited from the large component of the original sample, the simulation samples simply recreated the large component. There were no significant differences between the large component and the entire original sample for the characteristics of interest. Altogether 99.2% of 360 simulation sample point estimates were within the confidence interval of the original prevalence values for the characteristics of interest. Conclusions When population characteristics are reflected in large network components that dominate the population, RDS and ISS may produce samples that have statistically non-different prevalence values, even though some isolated network components may be under-sampled and/or statistically significantly different from the main groups. This so-called “strudel effect” is discussed in the paper. PMID:24360650

  2. Composition of Trace Metals in Dust Samples Collected from Selected High Schools in Pretoria, South Africa

    Directory of Open Access Journals (Sweden)

    J. O. Olowoyo

    2016-01-01

Full Text Available Potential health risks associated with trace metal pollution make it important to monitor trace metal levels in the environment. The present study investigated the concentrations and compositions of trace metals in dust samples collected from classrooms and playgrounds of selected high schools in Pretoria. Schools were selected from Pretoria based on factors such as proximity to high-traffic roads, industrial areas and residential areas. Thirty-two dust samples were collected from inside and outside the classrooms, where learners often stay during recess periods. The dust samples were analysed for trace metal concentrations using Inductively Coupled Plasma-Mass Spectrometry (ICP-MS). The composition of the elements showed that the concentrations of Zn exceeded those of all other elements except at one of the schools. There were significant differences in the concentrations of trace metals between the schools (p<0.05). Regular cleaning, proximity to busy roads and well maintained gardens seem to affect the concentrations of trace metals recorded in the classroom dust. The results further revealed a positive correlation for elements such as Pb, Cu, Zn, Mn and Sb, indicating that the dust might have a common source.

  3. Soil sampling intercomparison exercise by selected laboratories of the ALMERA Network

    International Nuclear Information System (INIS)

    2009-01-01

    The IAEA's Seibersdorf Laboratories in Austria have the programmatic responsibility to provide assistance to Member State laboratories in maintaining and improving the reliability of analytical measurement results, both in radionuclide and trace element determinations. This is accomplished through the provision of reference materials of terrestrial origin, validated analytical procedures, training in the implementation of internal quality control, and through the evaluation of measurement performance by the organization of worldwide and regional interlaboratory comparison exercises. The IAEA is mandated to support global radionuclide measurement systems related to accidental or intentional releases of radioactivity in the environment. To fulfil this obligation and ensure a reliable, worldwide, rapid and consistent response, the IAEA coordinates an international network of analytical laboratories for the measurement of environmental radioactivity (ALMERA). The network was established by the IAEA in 1995 and makes available to Member States a world-wide network of analytical laboratories capable of providing reliable and timely analysis of environmental samples in the event of an accidental or intentional release of radioactivity. A primary requirement for the ALMERA members is participation in the IAEA interlaboratory comparison exercises, which are specifically organized for ALMERA on a regular basis. These exercises are designed to monitor and demonstrate the performance and analytical capabilities of the network members, and to identify gaps and problem areas where further development is needed. In this framework, the IAEA organized a soil sampling intercomparison exercise (IAEA/SIE/01) for selected laboratories of the ALMERA network. The main objective of this exercise was to compare soil sampling procedures used by different participating laboratories. 
The performance evaluation results of the interlaboratory comparison exercises performed in the framework of

  4. Blood sampling from adrenal gland vein

    International Nuclear Information System (INIS)

    Sun Yong; Ni Caifang

    2009-01-01

Adrenal gland vein sampling is an interventional method of obtaining blood samples from the adrenal gland vein. The blood is obtained via a catheter selectively inserted into the adrenal gland vein. This technique is mainly used in the diagnosis of primary hyperaldosteronism. A full knowledge of the anatomy and variations of the adrenal gland vein, thorough preoperative preparation and skilled catheterization are necessary to obtain a sufficient blood sample and to reduce the occurrence of complications. Since it provides physicians with definite diagnostic evidence and is technically feasible, adrenal gland vein sampling should become one of the routine examinations for clarifying the cause of primary hyperaldosteronism. (authors)

  5. Tomographic imaging of 12 fracture samples selected from Olkiluoto deep drillholes

    International Nuclear Information System (INIS)

    Kuva, J.; Voutilainen, M.; Timonen, J.; Aaltonen, I.

    2010-06-01

Rock samples from Olkiluoto were imaged with X-ray tomography to analyze the distributions of mineral components and the alteration of rock around different fracture types. Twelve samples containing three types of fractures were analyzed, and each sample was scanned at two different resolutions. Three-dimensional reconstructions of the samples with four or five distinct mineral components displayed changes in the mineral distribution around previously water-conducting fractures, extending several millimeters from the fracture surfaces. In addition, the structure of fracture-filling minerals is depicted. (orig.)

  6. Sampling

    CERN Document Server

    Thompson, Steven K

    2012-01-01

    Praise for the Second Edition "This book has never had a competitor. It is the only book that takes a broad approach to sampling . . . any good personal statistics library should include a copy of this book." —Technometrics "Well-written . . . an excellent book on an important subject. Highly recommended." —Choice "An ideal reference for scientific researchers and other professionals who use sampling." —Zentralblatt Math Features new developments in the field combined with all aspects of obtaining, interpreting, and using sample data Sampling provides an up-to-date treat

  7. A novel variable selection approach that iteratively optimizes variable space using weighted binary matrix sampling.

    Science.gov (United States)

    Deng, Bai-chuan; Yun, Yong-huan; Liang, Yi-zeng; Yi, Lun-zhao

    2014-10-07

In this study, a new optimization algorithm called the Variable Iterative Space Shrinkage Approach (VISSA), based on the idea of model population analysis (MPA), is proposed for variable selection. Unlike most existing optimization methods for variable selection, VISSA statistically evaluates the performance of the variable space at each step of the optimization. Weighted binary matrix sampling (WBMS) is proposed to generate sub-models that span the variable subspace. Two rules are highlighted during the optimization procedure. First, the variable space shrinks at each step. Second, the new variable space outperforms the previous one. The second rule, which is rarely satisfied by most existing methods, is the core of the VISSA strategy. Compared with some promising variable selection methods such as competitive adaptive reweighted sampling (CARS), Monte Carlo uninformative variable elimination (MCUVE) and iteratively retaining informative variables (IRIV), VISSA showed better prediction ability for the calibration of NIR data. In addition, VISSA is user-friendly; only a few insensitive parameters are needed, and the program terminates automatically without any additional conditions. The Matlab codes for implementing VISSA are freely available on the website: https://sourceforge.net/projects/multivariateanalysis/files/VISSA/.
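The weighted binary matrix sampling idea can be sketched as follows. This is a simplified single iteration that scores sub-models by training RMSE; the published method evaluates sub-models by cross-validated prediction error and applies further rules, so every name and parameter below is illustrative:

```python
import numpy as np

def wbms_iteration(X, y, weights, n_models=500, rng=None):
    """One round of weighted binary matrix sampling (simplified sketch):
    draw binary rows with per-variable inclusion probabilities, score
    each sub-model by least-squares RMSE, and return updated weights as
    inclusion frequencies within the better-performing half."""
    rng = rng or np.random.default_rng(0)
    n_var = X.shape[1]
    B = rng.random((n_models, n_var)) < weights   # binary sampling matrix
    B[B.sum(axis=1) == 0, 0] = True               # avoid empty sub-models
    rmse = np.empty(n_models)
    for i, row in enumerate(B):
        Xs = X[:, row]
        coef, *_ = np.linalg.lstsq(Xs, y, rcond=None)
        rmse[i] = np.sqrt(np.mean((y - Xs @ coef) ** 2))
    best = B[np.argsort(rmse)[: n_models // 2]]   # keep the better half
    return best.mean(axis=0)                      # new inclusion probabilities

rng = np.random.default_rng(2)
X = rng.standard_normal((100, 10))
y = X[:, 0] * 3 - X[:, 1] * 2 + rng.standard_normal(100) * 0.1
w = np.full(10, 0.5)                              # start with equal weights
for _ in range(3):
    w = wbms_iteration(X, y, w, rng=rng)
print(np.argsort(w)[-2:])   # informative variables rise to the top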

  8. Automatic Samples Selection Using Histogram of Oriented Gradients (HOG Feature Distance

    Directory of Open Access Journals (Sweden)

    Inzar Salfikar

    2018-01-01

Full Text Available Finding victims at a disaster site is the primary goal of Search-and-Rescue (SAR) operations. Many technologies for searching for disaster victims through aerial imaging have been created, but most of them have difficulty detecting victims at tsunami disaster sites, where victims and background look similar. This research collects post-tsunami aerial imaging data from the internet to build a dataset and model for detecting tsunami disaster victims. Datasets are built based on distance differences between the features of the samples using the Histogram-of-Oriented-Gradients (HOG) method. We use the longest distance to collect samples from photos to generate victim and non-victim samples: the HOG feature distance is measured over all samples, the samples with the longest distances are taken as candidates to build the dataset, and the candidates are then classified manually as victim (positive) and non-victim (negative) samples. The dataset of tsunami disaster victims was re-analyzed using Leave-One-Out (LOO) cross-validation with the Support-Vector-Machine (SVM) method. The experimental results on two test photos show 61.70% precision, 77.60% accuracy, 74.36% recall and 67.44% f-measure in distinguishing victim (positive) and non-victim (negative) samples.
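Selecting the candidates with the longest HOG feature distances resembles greedy farthest-first selection, sketched below. Random vectors stand in for real HOG descriptors, which in practice would come from an extractor such as skimage.feature.hog; the function name is illustrative:

```python
import numpy as np

def farthest_first_selection(features, n_select):
    """Greedy farthest-first selection over feature vectors: repeatedly
    add the sample with the largest distance to the set already chosen,
    so the selected candidates cover the feature space broadly."""
    features = np.asarray(features, dtype=float)
    chosen = [0]                                   # seed with the first sample
    while len(chosen) < n_select:
        # pairwise distances from every sample to each chosen sample
        d = np.linalg.norm(features[:, None] - features[chosen], axis=2)
        min_d = d.min(axis=1)                      # distance to nearest chosen
        min_d[chosen] = -1.0                       # never re-pick a chosen one
        chosen.append(int(np.argmax(min_d)))
    return chosen

rng = np.random.default_rng(3)
feats = rng.random((40, 64))                       # stand-in HOG descriptors
picked = farthest_first_selection(feats, 5)
print(picked)
```

The selected candidates would then be labeled manually as victim or non-victim samples, as the abstract describes.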

  9. Water-borne pollutant sampling using porous suction samplers

    International Nuclear Information System (INIS)

    Baig, M.A.

    1997-01-01

The common standard method of sampling water-borne pollutants in the vadose zone is core sampling followed by extraction of the pore fluid. This method does not allow repeated sampling at the same location. An alternative approach for sampling fluids (water-borne pollutants) from both saturated and unsaturated regions of the vadose zone is to use porous suction samplers. There are three types of porous suction samplers: vacuum-operated samplers, pressure-vacuum lysimeters, and high-pressure vacuum samplers. The suction samplers are operated in the range of 0-70 centibars and usually consist of ceramic or polytetrafluoroethylene (PTFE). The operating range of PTFE is higher than that of ceramic cups. These samplers are well suited for in situ and repeated sampling from the same location. This paper discusses the physical properties and operating conditions of such samplers to be utilized in environmental sampling. (author)

  10. Apparatus for sampling hazardous media

    International Nuclear Information System (INIS)

    Gardner, J.F.; Showalter, T.W.

    1984-01-01

    An apparatus for sampling a hazardous medium, such as radioactive or chemical waste, selectively collects a predetermined quantity of the medium in a recess of an end-over-end rotatable valving member. This collected quantity is deposited in a receiving receptacle located in a cavity while the receiving receptacle is in a sealed relationship with a recess to prevent dusting of the sampled media outside the receiving receptacle. The receiving receptacle is removably fitted within a vehicle body which is, in turn, slidably movable upon a track within a transport tube. The receiving receptacle is transported in the vehicle body from its sample receiving position within a container for the hazardous medium to a sample retrieval position outside the medium container. The receiving receptacle may then be removed from the vehicle body, capped and taken to a laboratory for chemical analysis. (author)

  11. Testing the normality assumption in the sample selection model with an application to travel demand

    NARCIS (Netherlands)

    van der Klaauw, B.; Koning, R.H.

    2003-01-01

    In this article we introduce a test for the normality assumption in the sample selection model. The test is based on a flexible parametric specification of the density function of the error terms in the model. This specification follows a Hermite series with bivariate normality as a special case.

  12. Testing the normality assumption in the sample selection model with an application to travel demand

    NARCIS (Netherlands)

    van der Klauw, B.; Koning, R.H.

    In this article we introduce a test for the normality assumption in the sample selection model. The test is based on a flexible parametric specification of the density function of the error terms in the model. This specification follows a Hermite series with bivariate normality as a special case.

  13. Microbiological sampling plan based on risk classification to verify supplier selection and production of served meals in food service operation.

    Science.gov (United States)

    Lahou, Evy; Jacxsens, Liesbeth; Van Landeghem, Filip; Uyttendaele, Mieke

    2014-08-01

Food service operations are confronted with a diverse range of raw materials and served meals. The implementation of a microbial sampling plan in the framework of verification of suppliers and of their own production process (functionality of the prerequisite and HACCP programs) demands a selection of food products and sampling frequencies. However, these are often selected without a well-described, scientifically underpinned sampling plan. Therefore, an approach is presented on how to set up a focused sampling plan, enabled by a microbial risk categorization of food products, for both incoming raw materials and meals served to consumers. The sampling plan was implemented as a case study during a one-year period in an institutional food service operation to test the feasibility of the chosen approach. This resulted in 123 samples of raw materials and 87 samples of meal servings (focused on high-risk-categorized food products), which were analyzed for spoilage bacteria, hygiene indicators and foodborne pathogens. Although sampling plans are intrinsically limited in assessing the quality and safety of sampled foods, the plan was shown to be useful in revealing major non-compliances and opportunities to improve the food safety management system in place. Points of attention deduced from the case study were the control of Listeria monocytogenes in raw meat spread and raw fish, as well as the overall microbial quality of served sandwiches and salads. Copyright © 2014 Elsevier Ltd. All rights reserved.

  14. Sampling and examination methods used for TMI-2 samples

    International Nuclear Information System (INIS)

    Marley, A.W.; Akers, D.W.; McIsaac, C.V.

    1988-01-01

The purpose of this paper is to summarize the sampling and examination techniques that were used in the collection and analysis of TMI-2 samples. Samples ranging from auxiliary building air to core debris were collected and analyzed. Handling of the larger samples, and of many of the smaller samples, had to be done remotely, and many standard laboratory analytical techniques were modified to accommodate the extremely high radiation fields associated with these samples. The TMI-2 samples presented unique problems with the sampling and laboratory analysis of previously molten fuel debris. 14 refs., 8 figs

  15. Geochemical and mineralogical study of selected weathered samples from Olkiluoto site

    International Nuclear Information System (INIS)

    Lindberg, A.

    2009-02-01

Optical microscopy, chemical analyses and the X-ray diffraction method were used to study the influence of weathering in 11 drill core samples from shallow depths (< 25 m). The samples, 4 to 22 cm in length, were drilled at the Olkiluoto study site, Eurajoki, and represent the common rock types of the local bedrock: mica gneiss and tonalitic and granodioritic gneiss. Two of the samples were macroscopically unweathered and 9 were remarkably altered. The alteration was evident as porosity, the abundance of chlorite instead of biotite, and pink, turbid feldspars. Many samples also contained red-brown hematite and fractures, some of them coated with secondary minerals, even clay. Microscopically, the most visible feature of weathering was the total alteration of plagioclase and cordierite to sericite. In many samples biotite was also extensively altered to chlorite and opaque minerals. Microfractures were common and were filled by hematite, kaolinite and fine-grained muscovite (sericite). Hematite was, in some cases, also largely replacing the weathered minerals, feldspars and cordierite. Chemical alteration was not clear, because the alteration of the main minerals has produced secondary minerals with almost the same chemical composition, without any appreciable depletion or enrichment of particular elements. X-ray diffraction determination showed that plagioclase was often replaced by mica and biotite by chlorite. In some cases the samples contained products of chemical weathering, kaolinite and smectite. (orig.)

  16. PeptideManager: A Peptide Selection Tool for Targeted Proteomic Studies Involving Mixed Samples from Different Species

    Directory of Open Access Journals (Sweden)

    Kevin eDemeure

    2014-09-01

Full Text Available The search for clinically useful protein biomarkers using advanced mass spectrometry approaches represents a major focus in cancer research. However, the direct analysis of human samples may be challenging due to limited availability, the absence of appropriate control samples, or the large background variability observed in patient material. As an alternative approach, human tumors orthotopically implanted into a different species (xenografts) are clinically relevant models that have proven their utility in pre-clinical research. Patient-derived xenografts for glioblastoma have been extensively characterized in our laboratory and have been shown to retain the characteristics of the parental tumor at the phenotypic and genetic level. Such models were also found to adequately mimic the behavior and treatment response of human tumors. The reproducibility of such xenograft models, and the possibility to identify their host background and perform tumor-host interaction studies, are major advantages over the direct analysis of human samples. At the proteome level, the analysis of xenograft samples is challenged by the presence of proteins from two different species which, depending on tumor size, type or location, often appear at variable ratios. Any proteomics approach aimed at quantifying proteins within such samples must consider the identification of species-specific peptides in order to avoid biases introduced by the host proteome. Here, we present an in-house methodology and tool developed to select peptides used as surrogates for protein candidates from a defined proteome (e.g., human) in a host proteome background (e.g., mouse, rat) suited for a mass spectrometry analysis. The tools presented here are applicable to any species-specific proteome, provided a protein database is available. By linking the information from both proteomes, PeptideManager significantly facilitates and expedites the selection of peptides used as surrogates to analyze

  17. Novel joint selection methods can reduce sample size for rheumatoid arthritis clinical trials with ultrasound endpoints.

    Science.gov (United States)

    Allen, John C; Thumboo, Julian; Lye, Weng Kit; Conaghan, Philip G; Chew, Li-Ching; Tan, York Kiat

    2018-03-01

    To determine whether novel methods of selecting joints through (i) ultrasonography (individualized-ultrasound [IUS] method), or (ii) ultrasonography and clinical examination (individualized-composite-ultrasound [ICUS] method) translate into smaller rheumatoid arthritis (RA) clinical trial sample sizes when compared to existing methods utilizing predetermined joint sites for ultrasonography. Cohen's effect size (ES) was estimated (ES^) and a 95% CI (ES^_L, ES^_U) calculated on the mean change in 3-month total inflammatory score for each method. Corresponding 95% CIs [n_L(ES^_U), n_U(ES^_L)] were obtained on a post hoc sample size reflecting the uncertainty in ES^. Sample size calculations were based on a one-sample t-test as the patient numbers needed to provide 80% power at α = 0.05 to reject a null hypothesis H_0: ES = 0 versus alternative hypotheses H_1: ES = ES^, ES = ES^_L and ES = ES^_U. We aimed to provide point and interval estimates on projected sample sizes for future studies reflecting the uncertainty in our study ES^s. Twenty-four treated RA patients were followed up for 3 months. Utilizing the 12-joint approach and existing methods, the post hoc sample size (95% CI) was 22 (10-245). Corresponding sample sizes using ICUS and IUS were 11 (7-40) and 11 (6-38), respectively. Utilizing a seven-joint approach, the corresponding sample sizes using ICUS and IUS methods were nine (6-24) and 11 (6-35), respectively. Our pilot study suggests that sample sizes for RA clinical trials with ultrasound endpoints may be reduced using the novel methods, providing justification for larger studies to confirm these observations. © 2017 Asia Pacific League of Associations for Rheumatology and John Wiley & Sons Australia, Ltd.
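The post hoc sample size logic described in this abstract (patient numbers for 80% power at α = 0.05 against a given effect size, via a one-sample t-test) can be sketched with the usual normal-approximation formula. This is a generic illustration, not the authors' exact t-based computation; the function name is ours.

```python
import math
from statistics import NormalDist

def post_hoc_sample_size(es, alpha=0.05, power=0.80):
    """Patients needed for a one-sample t-test to detect effect size `es`
    with the given power at two-sided level alpha (normal approximation;
    exact t-based calculations give slightly larger n for small samples)."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)  # two-sided critical value
    z_power = z.inv_cdf(power)          # quantile for the target power
    return math.ceil(((z_alpha + z_power) / es) ** 2)
```

Evaluating this at the point estimate ES^ and at the CI limits ES^_L and ES^_U yields a point estimate and interval on n, mirroring the [n_L(ES^_U), n_U(ES^_L)] construction in the abstract.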

  18. Strategy for Ranking the Science Value of the Surface of Asteroid 101955 Bennu for Sample Site Selection for Osiris-REx

    Science.gov (United States)

    Nakamura-Messenger, K.; Connolly, H. C., Jr.; Lauretta, D. S.

    2014-01-01

    OSIRIS-REx is NASA's New Frontiers 3 sample return mission that will return at least 60 g of pristine surface material from near-Earth asteroid 101955 Bennu in September 2023. The scientific value of the sample increases enormously with the amount of knowledge captured about the geological context from which the sample is collected. The OSIRIS-REx spacecraft is highly maneuverable and capable of investigating the surface of Bennu at scales down to the sub-cm. The OSIRIS-REx instruments will characterize the overall surface geology, including spectral properties, microtexture, and geochemistry of the regolith at the sampling site, in exquisite detail for up to 505 days after encountering Bennu in August 2018. The mission requires at the very minimum one acceptable location on the asteroid where a touch-and-go (TAG) sample collection maneuver can be successfully performed. Sample site selection requires that the following maps be produced: Safety, Deliverability, Sampleability, and finally Science Value. If areas on the surface are designated as safe, navigation can fly to them, and they have ingestible regolith, then the scientific value of one site over another will guide site selection.

  19. Forecasting Urban Air Quality via a Back-Propagation Neural Network and a Selection Sample Rule

    Directory of Open Access Journals (Sweden)

    Yonghong Liu

    2015-07-01

    Full Text Available In this paper, based on a sample selection rule and a Back Propagation (BP) neural network, a new model for forecasting daily SO2, NO2, and PM10 concentrations at seven sites in Guangzhou was developed using data from January 2006 to April 2012. A meteorological similarity principle was applied in the development of the sample selection rule. The key meteorological factors influencing daily SO2, NO2, and PM10 concentrations, as well as weight matrices and threshold matrices, were determined. A basic model was then developed based on the improved BP neural network. To improve the basic model, identification of factor variation consistency was added to the rule, and seven sets of sensitivity experiments at one of the seven sites were conducted to obtain the selected model. A comparison with the basic model from May 2011 to April 2012 at one site showed that the selected model for PM10 displayed better forecasting performance, with Mean Absolute Percentage Error (MAPE) values decreasing by 4% and R2 values increasing from 0.53 to 0.68. Evaluations conducted at the six other sites revealed a similar performance. On the whole, the analysis showed that the models presented here could provide local authorities with reliable and precise predictions and alarms about air quality if used at an operational scale.
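The meteorological similarity principle behind such a sample selection rule can be sketched as a nearest-neighbour filter over historical days: train the network only on days whose weather most resembles the forecast day. The `met` feature key, the Euclidean distance, and `k` are illustrative assumptions, not the paper's exact rule.

```python
import math

def select_training_days(history, target_met, k=50):
    """Meteorological-similarity selection (sketch): keep the k historical
    days whose meteorological feature vectors lie closest to the forecast
    day's vector; the BP network is then trained on these samples only."""
    ranked = sorted(history, key=lambda day: math.dist(day["met"], target_met))
    return ranked[:k]
```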

  20. β-NMR sample optimization

    CERN Document Server

    Zakoucka, Eva

    2013-01-01

    During my summer student programme I was working on sample optimization for a new β-NMR project at the ISOLDE facility. The β-NMR technique is well-established in solid-state physics, and it has only recently been introduced for applications in biochemistry and the life sciences. The β-NMR collaboration will be applying to the INTC committee in September for beam time for three nuclei: Cu, Zn and Mg. Sample optimization for Mg was already performed last year during the summer student programme, so sample optimization for Cu and Zn had to be completed as well for the project proposal. My part in the project was to perform thorough literature research on techniques for studying Cu and Zn complexes in native conditions, search for relevant binding candidates for Cu and Zn applicable for β-NMR, and eventually evaluate selected binding candidates using UV-VIS spectrometry.

  1. Guidance for establishment and implementation of field sample management programs in support of EM environmental sampling and analysis activities

    International Nuclear Information System (INIS)

    1994-01-01

    The role of the National Sample Management Program (NSMP) proposed by the Department of Energy's Office of Environmental Management (EM) is to be a resource for EM programs and for local Field Sample Management Programs (FSMPs). It will be a source of information on sample analysis and data collection within the DOE complex. The purpose of this document is to establish the suggested scope of the FSMP activities to be performed under each Operations Office, list the drivers under which the program will operate, define terms and list references. This guidance will apply only to EM sampling and analysis activities associated with project planning, contracting, laboratory selection, sample collection, sample transportation, laboratory analysis and data management

  2. Re-Emergence of Under-Selected Stimuli, after the Extinction of Over-Selected Stimuli in an Automated Match to Samples Procedure

    Science.gov (United States)

    Broomfield, Laura; McHugh, Louise; Reed, Phil

    2008-01-01

    Stimulus over-selectivity occurs when one of potentially many aspects of the environment comes to control behaviour. In two experiments, adults with no developmental disabilities were trained and tested in an automated match to samples (MTS) paradigm. In Experiment 1, participants completed two conditions, in one of which the over-selected…

  3. Autonomous site selection and instrument positioning for sample acquisition

    Science.gov (United States)

    Shaw, A.; Barnes, D.; Pugh, S.

    The European Space Agency Aurora Exploration Program aims to establish a European long-term programme for the exploration of space, culminating in a human mission to space in the 2030 timeframe. Two flagship missions, namely Mars Sample Return and ExoMars, have been proposed as recognised steps along the way. The ExoMars rover is the first of these flagship missions and includes a rover carrying the Pasteur Payload, a mobile exobiology instrumentation package, and the Beagle 2 arm. The primary objective is the search for evidence of past or present life on Mars, but the payload will also study the evolution of the planet and the atmosphere, look for evidence of seismological activity, and survey the environment in preparation for future missions. The operation of rovers in unknown environments is complicated, and requires large resources not only on the planet but also in ground-based operations. Currently, this can be very labour-intensive and costly if large teams of scientists and engineers are required to assess mission progress, plan mission scenarios, and construct a sequence of events or goals for uplink. Furthermore, the constraints on communication imposed by the time delay involved over such large distances, and the line-of-sight required, make autonomy paramount to mission success, affording the ability to operate in the event of communications outages and to be opportunistic with respect to scientific discovery. As part of this drive to reduce mission costs and increase autonomy, the Space Robotics group at the University of Wales, Aberystwyth is researching methods of autonomous site selection and instrument positioning, directly applicable to the ExoMars mission. The site selection technique used builds on the geometric reasoning algorithms used previously for localisation and navigation [Shaw 03]. It is proposed that a digital elevation model (DEM) of the local surface, generated during traverse and without interaction from ground based operators, can be

  4. Concurrent and Longitudinal Associations Among Temperament, Parental Feeding Styles, and Selective Eating in a Preschool Sample.

    Science.gov (United States)

    Kidwell, Katherine M; Kozikowski, Chelsea; Roth, Taylor; Lundahl, Alyssa; Nelson, Timothy D

    2018-06-01

    To examine the associations among negative/reactive temperament, feeding styles, and selective eating in a sample of preschoolers because preschool eating behaviors likely have lasting implications for children's health. A community sample of preschoolers aged 3-5 years (M = 4.49 years, 49.5% female, 75.7% European American) in the Midwest of the United States was recruited to participate in the study (N = 297). Parents completed measures of temperament and feeding styles at two time points 6 months apart. A series of regressions indicated that children who had temperaments high in negative affectivity were significantly more likely to experience instrumental and emotional feeding styles. They were also significantly more likely to be selective eaters. These associations were present when examined both concurrently and after 6 months. This study provides a novel investigation of child temperament and eating behaviors, allowing for a better understanding of how negative affectivity is associated with instrumental feeding, emotional feeding, and selective eating. These results inform interventions to improve child health.

  5. Sampling and measurement of long-lived radionuclides in environmental samples

    International Nuclear Information System (INIS)

    Brauer, F.P.; Goles, R.W.; Kaye, J.H.; Rieck, H.G. Jr.

    1977-01-01

    The volatile and semivolatile long-lived man-made radionuclides ³H, ¹⁴C, ⁷⁹Se, ⁸⁵Kr, ⁹⁹Tc, ¹²⁹I, ¹³⁵Cs, and ¹³⁷Cs are of concern in operation of nuclear facilities because they are difficult and expensive to contain and once emitted to the environment they become permanent ecological constituents with both local and global distributions. Species-selective sampling and analytical methods (radiochemical, neutron activation, and mass spectrometric) have been developed for many of these nuclides with sensitivities well below those required for radiation protection. These sampling and analytical methods have been applied to the measurement of current environmental levels of some of the more ecologically important radionuclides. The detection and tracing of long-lived radionuclides is being conducted in order to establish base-line values and to study environmental behavior. This paper describes detection and measurement techniques and summarizes current measurement results

  6. Sampling and sample processing in pesticide residue analysis.

    Science.gov (United States)

    Lehotay, Steven J; Cook, Jo Marie

    2015-05-13

    Proper sampling and sample processing in pesticide residue analysis of food and soil have always been essential to obtain accurate results, but the subject is becoming a greater concern as approximately 100 mg test portions are being analyzed with automated high-throughput analytical methods by agrochemical industry and contract laboratories. As global food trade and the importance of monitoring increase, the food industry and regulatory laboratories are also considering miniaturized high-throughput methods. In conjunction with a summary of the symposium "Residues in Food and Feed - Going from Macro to Micro: The Future of Sample Processing in Residue Analytical Methods" held at the 13th IUPAC International Congress of Pesticide Chemistry, this is an opportune time to review sampling theory and sample processing for pesticide residue analysis. If collected samples and test portions do not adequately represent the actual lot from which they came and provide meaningful results, then all costs, time, and efforts involved in implementing programs using sophisticated analytical instruments and techniques are wasted and can actually yield misleading results. This paper is designed to briefly review the often-neglected but crucial topic of sample collection and processing and put the issue into perspective for the future of pesticide residue analysis. It also emphasizes that analysts should demonstrate the validity of their sample processing approaches for the analytes/matrices of interest and encourages further studies on sampling and sample mass reduction to produce a test portion.

  7. Boat sampling

    International Nuclear Information System (INIS)

    Citanovic, M.; Bezlaj, H.

    1994-01-01

    This presentation describes essential boat sampling activities: on-site boat sampling process optimization and qualification; boat sampling of base material (beltline region); boat sampling of weld material (weld No. 4); and problems associated with weld crown variation, RPV shell inner radius tolerance, local corrosion pitting, and water clarity. The equipment used for boat sampling is also described. 7 pictures

  8. A Bayesian Justification for Random Sampling in Sample Survey

    Directory of Open Access Journals (Sweden)

    Glen Meeden

    2012-07-01

    Full Text Available In the usual Bayesian approach to survey sampling the sampling design plays a minimal role, at best. Although a close relationship between exchangeable prior distributions and simple random sampling has been noted, how to formally integrate simple random sampling into the Bayesian paradigm is not clear. Recently it has been argued that the sampling design can be thought of as part of a Bayesian's prior distribution. We will show here that under this scenario simple random sampling can be given a Bayesian justification in survey sampling.
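As a concrete reference point, simple random sampling without replacement and the standard expansion estimator it justifies can be sketched as follows (generic illustration, not code from the paper):

```python
import random

def srs_estimate_total(frame_values, n, seed=0):
    """Draw a simple random sample of size n without replacement from a
    sampling frame (every subset of size n equally likely) and apply the
    expansion estimator N/n * sum(sample) for the population total, which
    is design-unbiased under SRS."""
    rng = random.Random(seed)
    sample = rng.sample(frame_values, n)
    return len(frame_values) / n * sum(sample), sample
```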

  9. Stream sediment sampling and analysis. Final report

    International Nuclear Information System (INIS)

    Means, J.L.; Voris, P.V.; Headington, G.L.

    1986-04-01

    The objectives were to sample and analyze sediments from upstream and downstream locations (relative to the Goodyear Atomic plant site) of three streams for selected pollutants. The three streams sampled were the Scioto River, Big Beaver Creek, and Big Run Creek. Sediment samples were analyzed for EPA's 129 priority pollutants (Clean Water Act) as well as isotopic uranium (²³⁴U, ²³⁵U, and ²³⁸U) and technetium-99

  10. Recent results of the investigation of a micro-fluidic sampling chip and sampling system for hot cell aqueous processing streams

    International Nuclear Information System (INIS)

    Tripp, J.; Smith, T.; Law, J.

    2013-01-01

    A Fuel Cycle Research and Development project has investigated an innovative sampling method that could evolve into the next generation sampling and analysis system for metallic elements present in aqueous processing streams. Initially sampling technologies were evaluated and micro-fluidic sampling chip technology was selected and tested. A conceptual design for a fully automated microcapillary-based system was completed and a robotic automated sampling system was fabricated. The mechanical and sampling operation of the completed sampling system was investigated. Different sampling volumes have been tested. It appears that the 10 μl volume has produced data that had much smaller relative standard deviations than the 2 μl volume. In addition, the production of a less expensive, mass produced sampling chip was investigated to avoid chip reuse thus increasing sampling reproducibility/accuracy. The micro-fluidic-based robotic sampling system's mechanical elements were tested to ensure analytical reproducibility and the optimum robotic handling of micro-fluidic sampling chips. (authors)

  11. Graph sampling

    OpenAIRE

    Zhang, L.-C.; Patone, M.

    2017-01-01

    We synthesise the existing theory of graph sampling. We propose a formal definition of sampling in finite graphs, and provide a classification of potential graph parameters. We develop a general approach of Horvitz–Thompson estimation to T-stage snowball sampling, and present various reformulations of some common network sampling methods in the literature in terms of the outlined graph sampling theory.

  12. Progressive sampling-based Bayesian optimization for efficient and automatic machine learning model selection.

    Science.gov (United States)

    Zeng, Xueqiang; Luo, Gang

    2017-12-01

    Machine learning is broadly used for clinical data analysis. Before training a model, a machine learning algorithm must be selected. Also, the values of one or more model parameters termed hyper-parameters must be set. Selecting algorithms and hyper-parameter values requires advanced machine learning knowledge and many labor-intensive manual iterations. To lower the bar to machine learning, miscellaneous automatic selection methods for algorithms and/or hyper-parameter values have been proposed. Existing automatic selection methods are inefficient on large data sets. This poses a challenge for using machine learning in the clinical big data era. To address the challenge, this paper presents progressive sampling-based Bayesian optimization, an efficient and automatic selection method for both algorithms and hyper-parameter values. We report an implementation of the method. We show that compared to a state-of-the-art automatic selection method, our method can significantly reduce search time, classification error rate, and standard deviation of error rate due to randomization. This is major progress towards enabling fast turnaround in identifying high-quality solutions required by many machine learning-based clinical data analysis tasks.
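The progressive sampling idea — spending little data on poor candidates and reserving expensive full-data evaluation for survivors — can be sketched as a successive-halving loop. This is a simplified stand-in for the paper's Bayesian-optimization variant; the function names and `evaluate(config, n)` interface are illustrative assumptions.

```python
def progressive_selection(configs, evaluate, sample_sizes):
    """Progressive-sampling search (sketch): score every configuration on a
    small training sample, keep the better half, and repeat on larger
    samples, so most configurations are discarded cheaply.
    `evaluate(config, n)` returns an error estimate from n training rows."""
    pool = list(configs)
    for n in sample_sizes:
        pool.sort(key=lambda cfg: evaluate(cfg, n))  # lower error is better
        pool = pool[:max(1, len(pool) // 2)]         # halve the pool
    return pool[0]
```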

  13. A soil sampling intercomparison exercise for the ALMERA network

    Energy Technology Data Exchange (ETDEWEB)

    Belli, Maria [Istituto Superiore per la Protezione e la Ricerca Ambientale (ISPRA), Via di Castel Romano 100, I-00128 Roma (Italy)], E-mail: maria.belli@apat.it; Zorzi, Paolo de [Istituto Superiore per la Protezione e la Ricerca Ambientale (ISPRA), Via di Castel Romano 100, I-00128 Roma (Italy)], E-mail: paolo.dezorzi@isprambiente.it; Sansone, Umberto [International Atomic Energy Agency (IAEA), Agency' s Laboratories Seibersdorf, A-2444 Seibersdorf (Austria)], E-mail: u.sansone@iaea.org; Shakhashiro, Abduhlghani [International Atomic Energy Agency (IAEA), Agency' s Laboratories Seibersdorf, A-2444 Seibersdorf (Austria)], E-mail: a.shakhashiro@iaea.org; Gondin da Fonseca, Adelaide [International Atomic Energy Agency (IAEA), Agency' s Laboratories Seibersdorf, A-2444 Seibersdorf (Austria)], E-mail: a.gondin-da-fonseca-azeredo@iaea.org; Trinkl, Alexander [International Atomic Energy Agency (IAEA), Agency' s Laboratories Seibersdorf, A-2444 Seibersdorf (Austria)], E-mail: a.trinkl@iaea.org; Benesch, Thomas [International Atomic Energy Agency (IAEA), Agency' s Laboratories Seibersdorf, A-2444 Seibersdorf (Austria)], E-mail: t.benesch@iaea.org

    2009-11-15

    Soil sampling and analysis for radionuclides after an accidental or routine release is a key factor for the dose calculation to members of the public, and for the establishment of possible countermeasures. The IAEA organized a Soil Sampling Intercomparison Exercise (IAEA/SIE/01) for selected laboratories of the ALMERA (Analytical Laboratories for the Measurement of Environmental Radioactivity) network, with the objective of comparing soil sampling procedures used by different laboratories. The ALMERA network is a world-wide network of analytical laboratories located in IAEA member states capable of providing reliable and timely analysis of environmental samples in the event of an accidental or intentional release of radioactivity. Ten ALMERA laboratories were selected to participate in the sampling exercise. The soil sampling intercomparison exercise took place in November 2005 in an agricultural area qualified as a 'reference site', aimed at assessing the uncertainties associated with soil sampling in agricultural, semi-natural, urban and contaminated environments and suitable for performing sampling intercomparisons. In this paper, the laboratories' sampling performance was evaluated.

  14. A soil sampling intercomparison exercise for the ALMERA network

    International Nuclear Information System (INIS)

    Belli, Maria; Zorzi, Paolo de; Sansone, Umberto; Shakhashiro, Abduhlghani; Gondin da Fonseca, Adelaide; Trinkl, Alexander; Benesch, Thomas

    2009-01-01

    Soil sampling and analysis for radionuclides after an accidental or routine release is a key factor for the dose calculation to members of the public, and for the establishment of possible countermeasures. The IAEA organized a Soil Sampling Intercomparison Exercise (IAEA/SIE/01) for selected laboratories of the ALMERA (Analytical Laboratories for the Measurement of Environmental Radioactivity) network, with the objective of comparing soil sampling procedures used by different laboratories. The ALMERA network is a world-wide network of analytical laboratories located in IAEA member states capable of providing reliable and timely analysis of environmental samples in the event of an accidental or intentional release of radioactivity. Ten ALMERA laboratories were selected to participate in the sampling exercise. The soil sampling intercomparison exercise took place in November 2005 in an agricultural area qualified as a 'reference site', aimed at assessing the uncertainties associated with soil sampling in agricultural, semi-natural, urban and contaminated environments and suitable for performing sampling intercomparisons. In this paper, the laboratories' sampling performance was evaluated.

  15. The Quasar Fraction in Low-Frequency Selected Complete Samples and Implications for Unified Schemes

    Science.gov (United States)

    Willott, Chris J.; Rawlings, Steve; Blundell, Katherine M.; Lacy, Mark

    2000-01-01

    Low-frequency radio surveys are ideal for selecting orientation-independent samples of extragalactic sources because the sample members are selected by virtue of their isotropic steep-spectrum extended emission. We use the new 7C Redshift Survey along with the brighter 3CRR and 6C samples to investigate the fraction of objects with observed broad emission lines - the 'quasar fraction' - as a function of redshift and of radio and narrow emission line luminosity. We find that the quasar fraction is more strongly dependent upon luminosity (both narrow line and radio) than it is on redshift. Above a narrow [OII] emission line luminosity of log₁₀(L_[OII]/W) ≳ 35 [or radio luminosity log₁₀(L₁₅₁/(W/Hz·sr)) ≳ 26.5], the quasar fraction is virtually independent of redshift and luminosity; this is consistent with a simple unified scheme with an obscuring torus with a half-opening angle θ_trans ≈ 53°. For objects with less luminous narrow lines, the quasar fraction is lower. We show that this is not due to the difficulty of detecting lower-luminosity broad emission lines in a less luminous, but otherwise similar, quasar population. We discuss evidence which supports at least two probable physical causes for the drop in quasar fraction at low luminosity: (i) a gradual decrease in θ_trans and/or a gradual increase in the fraction of lightly-reddened lines of sight with decreasing quasar luminosity; and (ii) the emergence of a distinct second population of low-luminosity radio sources which, like M87, lack a well-fed quasar nucleus and may well lack a thick obscuring torus.

  16. INEL Sample Management Office

    International Nuclear Information System (INIS)

    Watkins, C.

    1994-01-01

    The Idaho National Engineering Laboratory (INEL) Sample Management Office (SMO) was formed as part of the EG&G Idaho Environmental Restoration Program (ERP) in June 1990. Since then, the SMO has been recognized and sought out by other prime contractors and programs at the INEL. Since December 1991, the DOE-ID Division Directors for the Environmental Restoration Division and Waste Management Division have supported the expansion of the INEL ERP SMO into the INEL site-wide SMO. The INEL SMO serves as a point of contact for multiple environmental analytical chemistry and laboratory issues (e.g., capacity, capability). The SMO chemists work with project managers during planning to help develop data quality objectives, select appropriate analytical methods, identify special analytical services needs, identify a source for the services, and ensure that requirements for sampling and analysis (e.g., preservations, sample volumes) are clear and technically accurate. The SMO chemists also prepare work scope statements for the laboratories performing the analyses

  17. Sampling and analysis plan for the consolidated sludge samples from the canisters and floor of the 105-K East basin

    International Nuclear Information System (INIS)

    BAKER, R.B.

    1999-01-01

    This Sampling and Analysis Plan (SAP) provides direction for sampling of fuel canister and floor sludge from the K East Basin to complete the inventory of samples needed for sludge treatment process testing. Sample volumes and sources take into account recent reviews made by the sludge treatment subproject. The representative samples will be characterized to the extent needed for the material to be used effectively for testing. The sampling equipment used allows drawing of large-volume sludge samples and consolidation of sample material from a number of basin locations into one container. Once filled, the containers will be placed in a cask and transported to Hanford laboratories for recovery and evaluation. Included in the present SAP are the logic for sample location selection, the laboratory analysis procedures required, and the reporting needed to meet the Data Quality Objectives (DQOs) for this initiative

  18. A comparison of attribute sampling plans

    International Nuclear Information System (INIS)

    Lanning, B.M.

    1997-05-01

    This report describes, compares, and provides sample size selection criteria for the most common sampling plans for attribute data (i.e., data that is qualitative in nature, such as Pass-Fail, Yes-No, Defect-Nondefect data). This report is being issued as a guide for prudently choosing the correct sampling plan to meet statistical plan objectives. The report discusses three types of sampling plans: AQL (Acceptable Quality Level, expressed as a percent), RQL (Rejectable Quality Level, as a percent), and the AQL/RQL plan, which emphasizes both risks simultaneously. These plans are illustrated with six examples, one of which is an inventory of UF₆ cans whose weight must agree within 100 grams of its listed weight to be acceptable
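The binomial model underlying attribute sampling plans can be made concrete: the probability that an (n, c) plan accepts a lot with true defect rate p is the binomial CDF at c. This is a generic sketch with our naming, not code from the report; AQL and RQL requirements pin this curve at two points (high acceptance probability at the AQL, low at the RQL), which jointly determine n and c.

```python
from math import comb

def acceptance_probability(n, c, p):
    """Operating-characteristic value of an (n, c) attribute sampling plan:
    probability of accepting a lot with defect rate p, i.e. of observing
    at most c defects in a random sample of n items (binomial model)."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(c + 1))
```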

  19. 40 CFR 1065.245 - Sample flow meter for batch sampling.

    Science.gov (United States)

    2010-07-01

    Sample flow meter for batch sampling. (a) Application. Use a sample flow meter to determine sample flow... difference between a diluted exhaust sample flow meter and a dilution air meter to calculate raw exhaust flow...

  20. Rational approach to solvent system selection for liquid-liquid extraction-assisted sample pretreatment in counter-current chromatography.

    Science.gov (United States)

    Wang, Jiajia; Gu, Dongyu; Wang, Miao; Guo, Xinfeng; Li, Haoquan; Dong, Yue; Guo, Hong; Wang, Yi; Fan, Mengqi; Yang, Yi

    2017-05-15

    A rational liquid-liquid extraction approach was established to pre-treat samples for high-speed counter-current chromatography (HSCCC). n-Hexane-ethyl acetate-methanol-water (4:5:4:5, v/v) and (1:5:1:5, v/v) were selected as solvent systems for liquid-liquid extraction by systematically screening the partition coefficients (K) of the target compounds, in order to remove low- and high-polarity impurities from the sample, respectively. After liquid-liquid extraction was performed, 1.4 g of crude sample II was obtained from 18.5 g of crude sample I, which had been extracted from the flowers of Robinia pseudoacacia L., and then separated with HSCCC using a solvent system composed of n-hexane-ethyl acetate-methanol-water (1:2:1:2, v/v). As a result, 31 mg of robinin and 37 mg of kaempferol 7-O-α-l-rhamnopyranoside were isolated from 200 mg of crude sample II in a single run of HSCCC. A scale-up separation was also performed, and 160 mg of robinin with 95% purity and 188 mg of kaempferol 7-O-α-l-rhamnopyranoside with 97% purity were produced from 1.2 g of crude sample II. Copyright © 2017 Elsevier B.V. All rights reserved.
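The K-screening step described above can be sketched as a filter over measured partition coefficients: systems whose K for the target compound falls in the range usually considered suitable for counter-current chromatography are kept for separation, while extreme-K systems are candidates for impurity removal by extraction. The system labels and the K window here are illustrative assumptions, not values from the paper.

```python
def screen_solvent_systems(measured_k, sweet_spot=(0.5, 2.0)):
    """Shake-flask K screening (sketch): keep solvent systems whose
    partition coefficient K for the target compound lies inside the
    window commonly used for counter-current chromatography."""
    lo, hi = sweet_spot
    return [name for name, k in measured_k.items() if lo <= k <= hi]
```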

  1. Ensemble Sampling

    OpenAIRE

    Lu, Xiuyuan; Van Roy, Benjamin

    2017-01-01

    Thompson sampling has emerged as an effective heuristic for a broad range of online decision problems. In its basic form, the algorithm requires computing and sampling from a posterior distribution over models, which is tractable only for simple special cases. This paper develops ensemble sampling, which aims to approximate Thompson sampling while maintaining tractability even in the face of complex models such as neural networks. Ensemble sampling dramatically expands on the range of applica...
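The tractable special case the abstract refers to — exact Thompson sampling for Bernoulli bandits with Beta posteriors — can be sketched as follows (generic illustration; ensemble sampling replaces the exact posterior with a finite ensemble of models, such as neural networks, when the posterior is intractable):

```python
import random

def thompson_choice(successes, failures, rng):
    """One Thompson-sampling step for Bernoulli bandits: draw a plausible
    win-rate from each arm's Beta(s+1, f+1) posterior and play the arm
    with the largest draw, so exploration follows posterior uncertainty."""
    draws = [rng.betavariate(s + 1, f + 1) for s, f in zip(successes, failures)]
    return max(range(len(draws)), key=draws.__getitem__)
```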

  2. A novel knot selection method for the error-bounded B-spline curve fitting of sampling points in the measuring process

    International Nuclear Information System (INIS)

    Liang, Fusheng; Zhao, Ji; Ji, Shijun; Zhang, Bing; Fan, Cheng

    2017-01-01

    The B-spline curve has been widely used in the reconstruction of measurement data. Error-bounded reconstruction of sampling points can be achieved by knot addition method (KAM) based B-spline curve fitting. In KAM, the selection of the initial knot vector determines the number of knots ultimately required. This paper provides a novel initial knot selection method to condense the knot vector required for error-bounded B-spline curve fitting. The initial knots are determined by the distribution of features, which include the chord length (arc length) and bending degree (curvature) contained in the discrete sampling points. Firstly, the sampling points are fitted into an approximate B-spline curve G_s with an intensively uniform knot vector to substitute for a direct description of the features of the sampling points. The feature integral of G_s is built as a monotone increasing function in an analytic form. Then, the initial knots are selected according to constant increments of the feature integral. After that, an iterative knot insertion (IKI) process starting from the initial knots is introduced to improve the fitting precision, and the ultimate knot vector for the error-bounded B-spline curve fitting is achieved. Lastly, two simulations and a measurement experiment are provided, and the results indicate that the proposed knot selection method can reduce the number of knots required. (paper)
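The constant-increment selection over the feature integral can be sketched on sampled values of a monotone cumulative feature: knots are placed wherever the integral crosses equal fractions of its total. The array-based API below is an illustrative assumption, not the paper's implementation.

```python
import bisect

def initial_knots(u, feature_cumsum, n_knots):
    """Knot selection by the feature-integral idea: given parameter values
    u and a monotone cumulative feature F(u) (arc length plus bending),
    place knots where F crosses equal increments of its total, so regions
    rich in features receive more knots."""
    total = feature_cumsum[-1]
    knots = []
    for i in range(1, n_knots + 1):
        target = total * i / (n_knots + 1)
        j = bisect.bisect_left(feature_cumsum, target)
        knots.append(u[min(j, len(u) - 1)])
    return knots
```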

  3. Efficient sampling over rough energy landscapes with high barriers: A combination of metadynamics with integrated tempering sampling

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Y. Isaac [Institute of Theoretical and Computational Chemistry, College of Chemistry and Molecular Engineering, Peking University, Beijing 100871 (China); Zhang, Jun; Che, Xing; Yang, Lijiang; Gao, Yi Qin, E-mail: gaoyq@pku.edu.cn [Institute of Theoretical and Computational Chemistry, College of Chemistry and Molecular Engineering, Peking University, Beijing 100871 (China); Biodynamic Optical Imaging Center, Peking University, Beijing 100871 (China)

    2016-03-07

    In order to efficiently overcome high free energy barriers embedded in a complex energy landscape and calculate overall thermodynamic properties using molecular dynamics simulations, we developed and implemented a sampling strategy combining metadynamics with the (selective) integrated tempering sampling (ITS/SITS) method. The dominant local minima on the potential energy surface (PES) are partially exalted by accumulating history-dependent potentials as in metadynamics, and the sampling over the entire PES is further enhanced by ITS/SITS. With this hybrid method, the simulated system can be rapidly driven across the dominant barrier along selected collective coordinates. Then, ITS/SITS ensures fast convergence of the sampling over the entire PES and an efficient calculation of the overall thermodynamic properties of the simulated system. To test the accuracy and efficiency of this method, we first benchmarked it in the calculation of the ϕ-ψ distribution of alanine dipeptide in explicit solvent. We further applied it to examine the design of template molecules for aromatic meta-C-H activation in solutions and to investigate solution conformations of the nonapeptide bradykinin, which involve slow cis-trans isomerizations of three proline residues.
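
    The "accumulating history-dependent potentials" step of metadynamics can be illustrated on a toy one-dimensional double well: Gaussian hills deposited at visited positions gradually fill the starting minimum and push the walker over the barrier. Everything below (potential, hill parameters, integrator settings) is an illustrative sketch, not the ITS/SITS implementation from the paper:

```python
import math
import random

def metadynamics_1d(force, n_steps, hill_height=0.5, hill_width=0.3,
                    deposit_every=50, dt=0.01, temp=1.0, seed=0):
    """Overdamped Langevin dynamics on a 1-D potential with a
    history-dependent bias: Gaussian hills are deposited at visited
    positions, gradually filling the well the walker starts in."""
    rng = random.Random(seed)
    x = -1.0       # start in the left well
    centers = []   # hill centers deposited so far

    def bias_force(pos):
        # minus the derivative of the summed Gaussian hills
        return sum(
            hill_height * (pos - c) / hill_width ** 2
            * math.exp(-((pos - c) ** 2) / (2 * hill_width ** 2))
            for c in centers
        )

    for step in range(n_steps):
        drift = force(x) + bias_force(x)
        x += drift * dt + math.sqrt(2 * temp * dt) * rng.gauss(0.0, 1.0)
        if step % deposit_every == 0:
            centers.append(x)
    return x, centers

# Double-well potential U(x) = (x^2 - 1)^2 with force -dU/dx; the
# accumulated bias helps the walker escape its starting minimum.
force = lambda x: -4.0 * x * (x ** 2 - 1.0)
x_final, centers = metadynamics_1d(force, n_steps=5000)
```

    In the hybrid method of the paper, this hill deposition handles the dominant barrier along chosen collective coordinates, while ITS/SITS accelerates sampling of the remaining landscape.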

  4. Modern survey sampling

    CERN Document Server

    Chaudhuri, Arijit

    2014-01-01

    Exposure to Sampling: Abstract; Introduction; Concepts of Population, Sample, and Sampling. Initial Ramifications: Abstract; Introduction; Sampling Design, Sampling Scheme; Random Numbers and Their Uses in Simple Random Sampling (SRS); Drawing Simple Random Samples with and without Replacement; Estimation of Mean, Total, Ratio of Totals/Means: Variance and Variance Estimation; Determination of Sample Sizes; Appendix to Chapter 2: More on Equal Probability Sampling; Horvitz-Thompson Estimator; Sufficiency; Likelihood; Non-Existence Theorem. More Intricacies: Abstract; Introduction; Unequal Probability Sampling Strategies; PPS Sampling. Exploring Improved Ways: Abstract; Introduction; Stratified Sampling; Cluster Sampling; Multi-Stage Sampling; Multi-Phase Sampling: Ratio and Regression Estimation; Controlled Sampling. Modeling: Introduction; Super-Population Modeling; Prediction Approach; Model-Assisted Approach; Bayesian Methods; Spatial Smoothing; Sampling on Successive Occasions: Panel Rotation; Non-Response and Not-at-Homes; Weighting Adj...
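
    Of the estimators listed in the contents above, the Horvitz-Thompson estimator admits a one-line sketch: each sampled value is inverse-weighted by its inclusion probability. The sample values and equal inclusion probabilities below are illustrative:

```python
def horvitz_thompson_total(sample_values, inclusion_probs):
    """Horvitz-Thompson estimator of a population total under
    (possibly unequal) probability sampling without replacement:
    each sampled value y_i is weighted by 1 / pi_i, where pi_i is
    that unit's inclusion probability."""
    return sum(y / p for y, p in zip(sample_values, inclusion_probs))

# Under simple random sampling of n = 3 from N = 6, every unit has
# inclusion probability n / N = 0.5, and the estimator reduces to
# N times the sample mean: here 6 * 4 = 24.
estimate = horvitz_thompson_total([2.0, 4.0, 6.0], [0.5, 0.5, 0.5])
```

    The estimator is design-unbiased provided every population unit has a strictly positive inclusion probability, which is why the book treats it alongside the non-existence results for equal-probability designs.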

  5. Respondent-Driven Sampling – Testing Assumptions: Sampling with Replacement

    Directory of Open Access Journals (Sweden)

    Barash Vladimir D.

    2016-03-01

    Classical Respondent-Driven Sampling (RDS) estimators are based on a Markov process model in which sampling occurs with replacement. Given that respondents generally cannot be interviewed more than once, this assumption is counterfactual. We join recent work by Gile and Handcock in exploring the implications of the sampling-with-replacement assumption for the bias of RDS estimators. We differ from previous studies in examining a wider range of sampling fractions and in using not only simulations but also formal proofs. One key finding is that RDS estimates are surprisingly stable even in the presence of substantial sampling fractions. Our analyses show that the sampling-with-replacement assumption is a minor contributor to bias for sampling fractions under 40%, and bias is negligible for the 20% or smaller sampling fractions typical of field applications of RDS.
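
    The with- versus without-replacement distinction is easy to probe in a toy Monte Carlo. The sketch below estimates a simple population mean rather than running the full RDS network-referral model, so it only illustrates why the replacement assumption matters more as the sampling fraction grows; population size, sample size, and replication count are all illustrative:

```python
import random
import statistics

def mean_estimates(pop, n, reps, replace, seed=0):
    """Monte Carlo distribution of the sample mean under sampling
    with or without replacement from a finite population."""
    rng = random.Random(seed)
    estimates = []
    for _ in range(reps):
        if replace:
            sample = [rng.choice(pop) for _ in range(n)]
        else:
            sample = rng.sample(pop, n)
        estimates.append(statistics.fmean(sample))
    return estimates

pop = list(range(100))  # toy population, true mean 49.5
with_r = mean_estimates(pop, n=40, reps=2000, replace=True)
without_r = mean_estimates(pop, n=40, reps=2000, replace=False, seed=1)
# Both designs are unbiased for the mean, but at a 40% sampling
# fraction the without-replacement draws have visibly smaller spread
# (the finite-population correction), so modelling a without-replacement
# process as with-replacement only stays harmless at small fractions.
```

    At the 20% or smaller sampling fractions typical of field RDS, the two variance profiles nearly coincide, which is consistent with the stability the abstract reports.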

  6. Structure and Mechanical Properties of the AlSi10Mg Alloy Samples Manufactured by Selective Laser Melting

    Science.gov (United States)

    Li, Xiaodan; Ni, Jiaqiang; Zhu, Qingfeng; Su, Hang; Cui, Jianzhong; Zhang, Yifei; Li, Jianzhong

    2017-11-01

    AlSi10Mg alloy samples with a size of 14 × 14 × 91 mm were produced by the selective laser melting (SLM) method in different building directions. The structures and the properties at -70 °C of the samples built in different directions were investigated. The results show that the structure exhibits a different morphology in each building direction: fish-scale structures appear on the side along the building direction, and oval structures on the side perpendicular to the building direction. Some pores with a maximum size of 100 μm exist in the structure. The build orientation has no major influence on the tensile properties: the tensile strength and elongation of the sample along the building direction are 340 MPa and 11.2%, respectively, while those of the sample perpendicular to the building direction are 350 MPa and 13.4%, respectively.

  7. SDSS-IV MaNGA: faint quenched galaxies - I. Sample selection and evidence for environmental quenching

    Science.gov (United States)

    Penny, Samantha J.; Masters, Karen L.; Weijmans, Anne-Marie; Westfall, Kyle B.; Bershady, Matthew A.; Bundy, Kevin; Drory, Niv; Falcón-Barroso, Jesús; Law, David; Nichol, Robert C.; Thomas, Daniel; Bizyaev, Dmitry; Brownstein, Joel R.; Freischlad, Gordon; Gaulme, Patrick; Grabowski, Katie; Kinemuchi, Karen; Malanushenko, Elena; Malanushenko, Viktor; Oravetz, Daniel; Roman-Lopes, Alexandre; Pan, Kaike; Simmons, Audrey; Wake, David A.

    2016-11-01

    Using kinematic maps from the Sloan Digital Sky Survey (SDSS) Mapping Nearby Galaxies at Apache Point Observatory (MaNGA) survey, we reveal that the majority of low-mass quenched galaxies exhibit coherent rotation in their stellar kinematics. Our sample includes all 39 quenched low-mass galaxies observed in the first year of MaNGA. The galaxies are selected with Mr > -19.1, stellar masses 109 M⊙ 1.9. They lie on the size-magnitude and σ-luminosity relations for previously studied dwarf galaxies. Just six (15 ± 5.7 per cent) are found to have rotation speeds ve, rot 5 × 1010 M⊙), supporting the hypothesis that galaxy-galaxy or galaxy-group interactions quench star formation in low-mass galaxies. The local bright galaxy density for our sample is ρproj = 8.2 ± 2.0 Mpc-2, compared to ρproj = 2.1 ± 0.4 Mpc-2 for a star-forming comparison sample, confirming that the quenched low-mass galaxies are preferentially found in higher density environments.

  8. Obscured AGN at z ∼ 1 from the zCOSMOS-Bright Survey: I. Selection and optical properties of a [Ne v]-selected sample

    NARCIS (Netherlands)

    Mignoli, M.; Vignali, C.; Gilli, R.; Comastri, A.; Zamorani, G.; Bolzonella, M.; Bongiorno, A.; Lamareille, F.; Nair, P.; Pozzetti, L.; Lilly, S. J.; Carollo, C. M.; Contini, T.; Kneib, J. -P.; Le Fevre, O.; Mainieri, V.; Renzini, A.; Scodeggio, M.; Bardelli, S.; Caputi, K.; Cucciati, O.; de la Torre, S.; de Ravel, L.; Franzetti, P.; Garilli, B.; Iovino, A.; Kampczyk, P.; Knobel, C.; Kovac, K.; Le Borgne, J. -F.; Le Brun, V.; Maier, C.; Pello, R.; Peng, Y.; Montero, E. Perez; Presotto, V.; Silverman, J. D.; Tanaka, M.; Tasca, L.; Tresse, L.; Vergani, D.; Zucca, E.; Bordoloi, R.; Cappi, A.; Cimatti, A.; Koekemoer, A. M.; McCracken, H. J.; Moresco, M.; Welikala, N.

    Aims. The application of multi-wavelength selection techniques is essential for obtaining a complete and unbiased census of active galactic nuclei (AGN). We present here a method for selecting z ∼ 1 obscured AGN from optical spectroscopic surveys. Methods. A sample of 94 narrow-line AGN

  9. Sampling pig farms at the abattoir in a cross-sectional study - Evaluation of a sampling method.

    Science.gov (United States)

    Birkegård, Anna Camilla; Halasa, Tariq; Toft, Nils

    2017-09-15

    procedure was reproducible with results comparable to the collected sample. However, the sampling procedure favoured sampling of large farms. Furthermore, both under-sampled and over-sampled areas were found using scan statistics. In conclusion, sampling conducted at abattoirs can provide a spatially representative sample. Hence it is a possible cost-effective alternative to simple random sampling. However, it is important to assess the properties of the resulting sample so that any potential selection bias can be addressed when reporting the findings. Copyright © 2017 Elsevier B.V. All rights reserved.

  10. Representative process sampling - in practice

    DEFF Research Database (Denmark)

    Esbensen, Kim; Friis-Pedersen, Hans Henrik; Julius, Lars Petersen

    2007-01-01

    Didactic data sets representing a range of real-world processes are used to illustrate "how to do" representative process sampling and process characterisation. The selected process data lead to diverse variogram expressions with different systematics (no range vs. important ranges; trends and/or periodicity; different nugget effects and process variations ranging from less than one lag to full variogram lag). Variogram data analysis leads to a fundamental decomposition into 0-D sampling vs. 1-D process variances, based on the three principal variogram parameters: range, sill and nugget effect...

  11. Sample preparation

    International Nuclear Information System (INIS)

    Anon.

    1992-01-01

    Sample preparation prior to HPLC analysis is certainly one of the most important steps to consider in trace or ultratrace analysis. For many years scientists have tried to simplify the sample preparation process. It is rarely possible to inject a neat liquid sample, and preparation is seldom as simple as dissolution of the sample in a given solvent. Dissolution alone can remove insoluble materials, which is especially helpful with samples in complex matrices, provided other interactions do not affect the extraction: a large number of components will not dissolve and are therefore eliminated by a simple filtration step. In most cases, however, the process of sample preparation is not as simple as dissolution of the component of interest. At times enrichment is necessary; that is, the component of interest is present in a very large volume or mass of material and needs to be concentrated in some manner so that a small volume of the concentrated or enriched sample can be injected into the HPLC system. 88 refs

  12. Selective trace enrichment of chlorotriazine pesticides from natural waters and sediment samples using terbuthylazine molecularly imprinted polymers

    Science.gov (United States)

    Ferrer, I.; Lanza, F.; Tolokan, A.; Horvath, V.; Sellergren, B.; Horvai, G.; Barcelo, D.

    2000-01-01

    Two molecularly imprinted polymers were synthesized using either dichloromethane or toluene as the porogen and terbuthylazine as the template and were used as solid-phase extraction cartridges for the enrichment of six chlorotriazines (deisopropylatrazine, deethylatrazine, simazine, atrazine, propazine, and terbuthylazine) in natural water and sediment samples. The extracted samples were analyzed by liquid chromatography/diode array detection (LC/DAD). Several washing solvents, as well as different volumes, were tested for their ability to remove the matrix components nonspecifically adsorbed on the sorbents. This cleanup step was shown to be of prime importance to the successful extraction of the pesticides from the aqueous samples. The optimal analytical conditions were obtained when the MIP imprinted using dichloromethane was the sorbent, 2 mL of dichloromethane was used in the washing step, and the preconcentrated analytes were eluted with 8 mL of methanol. The recoveries were higher than 80% for all the chlorotriazines except for propazine (53%) when 50- or 100-mL groundwater samples, spiked at the 1 μg/L level, were analyzed. The limits of detection varied from 0.05 to 0.2 μg/L when preconcentrating a 100-mL groundwater sample. Natural sediment samples from the Ebre Delta area (Tarragona, Spain) containing atrazine and deethylatrazine were Soxhlet extracted and analyzed by the methodology developed in this work. No significant interferences from the sample matrix were noticed, thus indicating good selectivity of the MIP sorbents used.

  13. Characteristics of selected-frequency luminescence for samples collected in deserts north of Beijing

    International Nuclear Information System (INIS)

    Li Dongxu; Wei Mingjian; Wang Junping; Pan Baolin; Zhao Shiyuan; Liu Zhaowen

    2009-01-01

    Surface sand samples were collected at eight sites in the Horqin and Otindag deserts located north of Beijing. A BG2003 luminescence spectrograph was used to analyze the emitted photons, and characteristic spectra of the selected-frequency luminescence were obtained. High intensities of emitted photons were found under thermal stimulation from 85 °C to 135 °C and from 350 °C to 400 °C, belonging to the traps at 4.13 eV (300 nm), 4.00 eV (310 nm), 3.88 eV (320 nm) and 2.70 eV (460 nm), while the emitted photons belonging to the traps at 4.00 eV (310 nm), 3.88 eV (320 nm) and 2.70 eV (460 nm) were stimulated by a green laser. The sand samples from all eight sites respond to increases in radiation dose at each wavelength, making the characteristic spectrum a radiation-dosimetry basis for dating. The characteristic spectra also show distinct regional features.

  14. Woody species diversity in forest plantations in a mountainous region of Beijing, China: effects of sampling scale and species selection.

    Directory of Open Access Journals (Sweden)

    Yuxin Zhang

    The role of forest plantations in biodiversity conservation has gained more attention in recent years. However, most work on evaluating the diversity of forest plantations focuses on only one spatial scale; thus, we examined the effects of sampling scale on diversity in forest plantations. We designed a hierarchical sampling strategy to collect data on woody species diversity in planted pine (Pinus tabuliformis Carr.), planted larch (Larix principis-rupprechtii Mayr.), and natural secondary deciduous broadleaf forests in a mountainous region of Beijing, China. Additive diversity partition analysis showed that, compared to the natural forests, the planted pine forests had a different woody species diversity partitioning pattern at multiple scales (except for Simpson diversity in the regeneration layer), while the larch plantations did not show multi-scale diversity partitioning patterns obviously different from those in the natural secondary broadleaf forest. Compared to the natural secondary broadleaf forests, the effects of planted pine forests on woody species diversity depend on the sampling scale and the layers selected for analysis. Diversity in the planted larch forest, however, was not significantly different from that in the natural forest for all diversity components at all sampling levels. Our work demonstrated that the species selected for afforestation and the sampling scales selected for data analysis alter conclusions about the levels of diversity supported by plantations. We suggest that a wide range of scales be considered when evaluating the role of forest plantations in biodiversity conservation.

  15. ACTIVE LEARNING TO OVERCOME SAMPLE SELECTION BIAS: APPLICATION TO PHOTOMETRIC VARIABLE STAR CLASSIFICATION

    Energy Technology Data Exchange (ETDEWEB)

    Richards, Joseph W.; Starr, Dan L.; Miller, Adam A.; Bloom, Joshua S.; Butler, Nathaniel R.; Berian James, J. [Astronomy Department, University of California, Berkeley, CA 94720-7450 (United States); Brink, Henrik [Dark Cosmology Centre, Juliane Maries Vej 30, 2100 Copenhagen O (Denmark); Long, James P.; Rice, John, E-mail: jwrichar@stat.berkeley.edu [Statistics Department, University of California, Berkeley, CA 94720-7450 (United States)

    2012-01-10

    Despite the great promise of machine-learning algorithms to classify and predict astrophysical parameters for the vast numbers of astrophysical sources and transients observed in large-scale surveys, the peculiarities of the training data often manifest as strongly biased predictions on the data of interest. Typically, training sets are derived from historical surveys of brighter, more nearby objects than those from more extensive, deeper surveys (testing data). This sample selection bias can cause catastrophic errors in predictions on the testing data because (1) standard assumptions for machine-learned model selection procedures break down and (2) dense regions of testing space might be completely devoid of training data. We explore possible remedies to sample selection bias, including importance weighting, co-training, and active learning (AL). We argue that AL-where the data whose inclusion in the training set would most improve predictions on the testing set are queried for manual follow-up-is an effective approach and is appropriate for many astronomical applications. For a variable star classification problem on a well-studied set of stars from Hipparcos and Optical Gravitational Lensing Experiment, AL is the optimal method in terms of error rate on the testing data, beating the off-the-shelf classifier by 3.4% and the other proposed methods by at least 3.0%. To aid with manual labeling of variable stars, we developed a Web interface which allows for easy light curve visualization and querying of external databases. Finally, we apply AL to classify variable stars in the All Sky Automated Survey, finding dramatic improvement in our agreement with the ASAS Catalog of Variable Stars, from 65.5% to 79.5%, and a significant increase in the classifier's average confidence for the testing set, from 14.6% to 42.9%, after a few AL iterations.
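
    The querying strategy described above (label the objects whose inclusion would most improve predictions) comes in several flavors; a minimal pool-based variant is uncertainty sampling, sketched below on one-dimensional data with a nearest-centroid classifier. All data values, the oracle, and the classifier choice are illustrative assumptions, not the paper's pipeline:

```python
def centroid_classifier(train):
    """Two-class nearest-centroid rule on 1-D labelled (x, y) pairs.
    Returns (predict, decision_boundary)."""
    c0 = [x for x, y in train if y == 0]
    c1 = [x for x, y in train if y == 1]
    m0, m1 = sum(c0) / len(c0), sum(c1) / len(c1)
    boundary = (m0 + m1) / 2.0
    def predict(x):
        return int(x > boundary) if m1 > m0 else int(x <= boundary)
    return predict, boundary

def active_learn(train, pool, oracle, n_queries):
    """Pool-based uncertainty sampling: repeatedly query the unlabelled
    point nearest the current decision boundary for manual follow-up."""
    train, pool = list(train), list(pool)
    for _ in range(n_queries):
        _, boundary = centroid_classifier(train)
        x = min(pool, key=lambda p: abs(p - boundary))  # most uncertain
        pool.remove(x)
        train.append((x, oracle(x)))                    # manual label
    return centroid_classifier(train)

# Biased initial training set: only "easy" points far from the true
# class boundary at x = 5, mimicking a bright/nearby training sample.
oracle = lambda x: int(x >= 5.0)
train = [(0.0, 0), (1.0, 0), (9.0, 1), (10.0, 1)]
pool = [3.0, 4.0, 4.5, 5.5, 6.0, 7.0]
clf, boundary = active_learn(train, pool, oracle, n_queries=4)
```

    The queried labels come precisely from the region the biased training set left empty, which is the mechanism by which AL repairs sample selection bias.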

  16. ACTIVE LEARNING TO OVERCOME SAMPLE SELECTION BIAS: APPLICATION TO PHOTOMETRIC VARIABLE STAR CLASSIFICATION

    International Nuclear Information System (INIS)

    Richards, Joseph W.; Starr, Dan L.; Miller, Adam A.; Bloom, Joshua S.; Butler, Nathaniel R.; Berian James, J.; Brink, Henrik; Long, James P.; Rice, John

    2012-01-01

    Despite the great promise of machine-learning algorithms to classify and predict astrophysical parameters for the vast numbers of astrophysical sources and transients observed in large-scale surveys, the peculiarities of the training data often manifest as strongly biased predictions on the data of interest. Typically, training sets are derived from historical surveys of brighter, more nearby objects than those from more extensive, deeper surveys (testing data). This sample selection bias can cause catastrophic errors in predictions on the testing data because (1) standard assumptions for machine-learned model selection procedures break down and (2) dense regions of testing space might be completely devoid of training data. We explore possible remedies to sample selection bias, including importance weighting, co-training, and active learning (AL). We argue that AL—where the data whose inclusion in the training set would most improve predictions on the testing set are queried for manual follow-up—is an effective approach and is appropriate for many astronomical applications. For a variable star classification problem on a well-studied set of stars from Hipparcos and Optical Gravitational Lensing Experiment, AL is the optimal method in terms of error rate on the testing data, beating the off-the-shelf classifier by 3.4% and the other proposed methods by at least 3.0%. To aid with manual labeling of variable stars, we developed a Web interface which allows for easy light curve visualization and querying of external databases. Finally, we apply AL to classify variable stars in the All Sky Automated Survey, finding dramatic improvement in our agreement with the ASAS Catalog of Variable Stars, from 65.5% to 79.5%, and a significant increase in the classifier's average confidence for the testing set, from 14.6% to 42.9%, after a few AL iterations.

  17. Active Learning to Overcome Sample Selection Bias: Application to Photometric Variable Star Classification

    Science.gov (United States)

    Richards, Joseph W.; Starr, Dan L.; Brink, Henrik; Miller, Adam A.; Bloom, Joshua S.; Butler, Nathaniel R.; James, J. Berian; Long, James P.; Rice, John

    2012-01-01

    Despite the great promise of machine-learning algorithms to classify and predict astrophysical parameters for the vast numbers of astrophysical sources and transients observed in large-scale surveys, the peculiarities of the training data often manifest as strongly biased predictions on the data of interest. Typically, training sets are derived from historical surveys of brighter, more nearby objects than those from more extensive, deeper surveys (testing data). This sample selection bias can cause catastrophic errors in predictions on the testing data because (1) standard assumptions for machine-learned model selection procedures break down and (2) dense regions of testing space might be completely devoid of training data. We explore possible remedies to sample selection bias, including importance weighting, co-training, and active learning (AL). We argue that AL—where the data whose inclusion in the training set would most improve predictions on the testing set are queried for manual follow-up—is an effective approach and is appropriate for many astronomical applications. For a variable star classification problem on a well-studied set of stars from Hipparcos and Optical Gravitational Lensing Experiment, AL is the optimal method in terms of error rate on the testing data, beating the off-the-shelf classifier by 3.4% and the other proposed methods by at least 3.0%. To aid with manual labeling of variable stars, we developed a Web interface which allows for easy light curve visualization and querying of external databases. Finally, we apply AL to classify variable stars in the All Sky Automated Survey, finding dramatic improvement in our agreement with the ASAS Catalog of Variable Stars, from 65.5% to 79.5%, and a significant increase in the classifier's average confidence for the testing set, from 14.6% to 42.9%, after a few AL iterations.

  18. Procedures for sampling radium-contaminated soils

    International Nuclear Information System (INIS)

    Fleischhauer, H.L.

    1985-10-01

    Two procedures for sampling the surface layer (0 to 15 centimeters) of radium-contaminated soil are recommended for use in remedial action projects. Both procedures adhere to the philosophy that soil samples should have constant geometry and constant volume in order to ensure uniformity. In the first procedure, a ''cookie cutter'' fashioned from pipe or steel plate, is driven to the desired depth by means of a slide hammer, and the sample extracted as a core or plug. The second procedure requires use of a template to outline the sampling area, from which the sample is obtained using a trowel or spoon. Sampling to the desired depth must then be performed incrementally. Selection of one procedure over the other is governed primarily by soil conditions, the cookie cutter being effective in nongravelly soils, and the template procedure appropriate for use in both gravelly and nongravelly soils. In any event, a minimum sample volume of 1000 cubic centimeters is recommended. The step-by-step procedures are accompanied by a description of the minimum requirements for sample documentation. Transport of the soil samples from the field is then addressed in a discussion of the federal regulations for shipping radioactive materials. Interpretation of those regulations, particularly in light of their application to remedial action soil-sampling programs, is provided in the form of guidance and suggested procedures. Due to the complex nature of the regulations, however, there is no guarantee that our interpretations of them are complete or entirely accurate. Preparation of soil samples for radium-226 analysis by means of gamma-ray spectroscopy is described

  19. Using lot quality-assurance sampling and area sampling to identify priority areas for trachoma control: Viet Nam.

    Science.gov (United States)

    Myatt, Mark; Mai, Nguyen Phuong; Quynh, Nguyen Quang; Nga, Nguyen Huy; Tai, Ha Huy; Long, Nguyen Hung; Minh, Tran Hung; Limburg, Hans

    2005-10-01

    To report on the use of lot quality-assurance sampling (LQAS) surveys undertaken within an area-sampling framework to identify priority areas for intervention with trachoma control activities in Viet Nam. The LQAS survey method for the rapid assessment of the prevalence of active trachoma was adapted for use in Viet Nam with the aim of classifying individual communes by the prevalence of active trachoma among children in primary school. School-based sampling was used; school sites to be sampled were selected using an area-sampling approach. A total of 719 communes in 41 districts in 18 provinces were surveyed. Survey staff found the LQAS survey method both simple and rapid to use after initial problems with area-sampling methods were identified and remedied. The method yielded a finer spatial resolution of prevalence than had been previously achieved in Viet Nam using semiquantitative rapid assessment surveys and multistage cluster-sampled surveys. When used with area-sampling techniques, the LQAS survey method has the potential to form the basis of survey instruments that can be used to efficiently target resources for interventions against active trachoma. With additional work, such methods could provide a generally applicable tool for effective programme planning and for the certification of the elimination of trachoma as a blinding disease.
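
    The decision logic behind an LQAS classification survey can be sketched with a binomial rule: flag a lot (here, a commune) when the number of positives in a sample of n children reaches a threshold d, with d chosen so that low-prevalence lots are rarely flagged and high-prevalence lots are rarely missed. The sample size, prevalence thresholds, and error rates below are illustrative, not those used in the Viet Nam surveys:

```python
from math import comb

def binom_cdf(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p ** i * (1 - p) ** (n - i) for i in range(k + 1))

def lqas_threshold(n, p_low, p_high, alpha=0.10, beta=0.10):
    """Smallest decision threshold d such that a lot with prevalence
    p_low is flagged with probability <= alpha while a lot at p_high
    is missed with probability <= beta; None if n is too small."""
    for d in range(1, n + 1):
        flag_low = 1.0 - binom_cdf(d - 1, n, p_low)   # false-alarm risk
        miss_high = binom_cdf(d - 1, n, p_high)       # consumer risk
        if flag_low <= alpha and miss_high <= beta:
            return d
    return None

def classify_lot(cases, d):
    """Flag the lot (e.g. a commune) when observed cases reach d."""
    return cases >= d

# With 50 children sampled per commune, distinguish <= 5% prevalence
# from >= 20% prevalence at roughly 10% risk on each side.
d = lqas_threshold(n=50, p_low=0.05, p_high=0.20)
```

    Because each lot needs only a small fixed sample and a yes/no count, the rule is fast to apply in the field, which matches the survey staff's experience reported above.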

  20. Sample summary report for ARG 1 pressure tube sample

    International Nuclear Information System (INIS)

    Belinco, C.

    2006-01-01

    The ARG 1 sample is made from an un-irradiated Zr-2.5% Nb pressure tube. The sample has a 103.4 mm ID, a 112 mm OD, and approximately 500 mm length. A punch mark was made very close to one end of the sample. The punch mark indicates the 12 o'clock position and also identifies the face of the tube for making all the measurements. ARG 1 contains flaws on the ID and OD surfaces. There was no intentional flaw within the wall of the pressure tube sample. Once the flaws were machined, the pressure tube sample was covered from the outside to hide the OD flaws. Approximately 50 mm of pressure tube length was left open at both ends to facilitate holding the sample in the fixtures for inspection. No flaw was machined in this 50 mm zone on either end of the pressure tube sample. A total of 20 flaws were machined in the ARG 1 sample. Of these, 16 flaws were on the OD surface and the remaining 4 on the ID surface of the pressure tube. The flaws were classified into various groups, such as axial flaws, circumferential flaws, etc.

  1. Characterization of Vadose Zone Sediment: Uncontaminated RCRA Borehole Core Samples and Composite Samples

    International Nuclear Information System (INIS)

    Serne, R. Jeffrey; Bjornstad, Bruce N.; Schaef, Herbert T.; Williams, Bruce A.; Lanigan, David C.; Horton, Duane G.; Clayton, Ray E.; Mitroshkov, Alexandre V.; Legore, Virginia L.; O'Hara, Matthew J.; Brown, Christopher F.; Parker, Kent E.; Kutnyakov, Igor V.; Serne, Jennifer N.; Last, George V.; Smith, Steven C.; Lindenmeier, Clark W.; Zachara, John M.; Burke, Deborah Sd.

    2001-01-01

    The overall goal of the Tank Farm Vadose Zone Project, led by CH2M HILL Hanford Group, Inc., is to define risks from past and future single-shell tank farm activities. To meet this goal, CH2M HILL Hanford Group, Inc. asked scientists from Pacific Northwest National Laboratory to perform detailed analyses on vadose zone sediment from within the S-SX Waste Management Area. This report is the first in a series of four reports to present the results of these analyses. Specifically, this report contains all the geologic, geochemical, and selected physical characterization data collected on vadose zone sediment recovered from RCRA borehole core samples and composite samples. Intact cores from two RCRA boreholes (299-W22-48 and 299-W22-50) near the SX Tank Farm and four large-quantity grab samples from outcrop sediment on and off the Hanford Site were sampled to better understand the fate of contaminants in the vadose zone beneath underground storage tanks at the Hanford Site. Borehole and outcrop samples analyzed for this report are located outside the tank farms, and therefore may be considered standard or background samples against which to compare contaminated sediments within the tank farms themselves. This report presents our interpretation of the physical, chemical, and mineralogical properties of the uncontaminated vadose zone sediments, and variations in the vertical distribution of these properties. The information presented in this report is intended to support preparation of the S-SX Field Investigation Report to be prepared by CH2M HILL Hanford Group, Inc. as well as future remediation actions at the S-SX Tank Farm.

  2. Semi-selective medium for Fusarium graminearum detection in seed samples

    Directory of Open Access Journals (Sweden)

    Marivane Segalin

    2010-12-01

    Fungi of the genus Fusarium cause a variety of difficult-to-control diseases in different crops, including winter cereals and maize. Among the species of this genus, Fusarium graminearum deserves attention. The aim of this work was to develop a semi-selective medium to study this fungus. In several experiments, substrates for fungal growth were tested, including the fungicides and antibiotics iprodione, nystatin and triadimenol, and the antibacterial agents streptomycin and neomycin sulfate. Five seed samples of wheat, barley, oat, black beans and soybeans were tested for F. graminearum detection using the media Nash and Snyder agar (NSA), Segalin & Reis agar (SRA) and one-quarter potato dextrose agar (1/4 PDA; potato 50 g, dextrose 5 g and agar 20 g), either unsupplemented or supplemented with various concentrations of the antimicrobial agents cited above. The selected components and concentrations (g/L) of the proposed medium, Segalin & Reis agar (SRA-FG), were: iprodione 0.05; nystatin 0.025; triadimenol 0.015; neomycin sulfate 0.05; and streptomycin sulfate 0.3, added to 1/4 potato sucrose agar. In the isolation from seeds of the cited plant species, the sensitivity of this medium was similar to that of NSA, but with the advantage of maintaining colony morphological aspects similar to those observed on potato dextrose agar medium.

  3. Sampling frequency affects ActiGraph activity counts

    DEFF Research Database (Denmark)

    Brønd, Jan Christian; Arvidsson, Daniel

    that is normally performed at frequencies higher than 2.5 Hz. With the ActiGraph model GT3X one has the option to select a sample frequency from 30 to 100 Hz. This study investigated the effect of the sampling frequency on the output of the bandpass filter. Methods: A synthetic frequency sweep of 0-15 Hz was generated in Matlab and sampled at frequencies of 30-100 Hz. Also, acceleration signals during indoor walking and running were sampled at 30 Hz using the ActiGraph GT3X and resampled in Matlab to frequencies of 40-100 Hz. All data were processed with the ActiLife software. Results: Acceleration frequencies between 5-15 Hz escaped the bandpass filter when sampled at 40, 50, 70, 80 and 100 Hz, while this was not the case when sampled at 30, 60 and 90 Hz. During the ambulatory activities this artifact resulted in different activity count output from the ActiLife software at different sampling frequencies...
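
    The frequency dependence of such artifacts can be understood through Nyquist folding: any signal component above half the sampling rate reappears at a folded (aliased) frequency after sampling or naive resampling. The helper below is a toy illustration of the folding rule only, not a model of ActiLife's actual filter chain:

```python
def aliased_frequency(f_signal, f_sample):
    """Apparent frequency of a sampled sinusoid after Nyquist folding:
    the spectrum folds back into the band [0, f_sample / 2]."""
    f = f_signal % f_sample
    return min(f, f_sample - f)

# A 13 Hz component is representable at a 30 Hz sampling rate
# (13 Hz is below the 15 Hz Nyquist limit), but a 20 Hz component
# sampled at 30 Hz masquerades as 10 Hz, folding back into the band
# an activity-count filter would pass.
```

    This is why resampling between rates that are not integer multiples of each other, without an anti-aliasing low-pass filter first, can push energy into or out of the counted band.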

  4. Where Will All Your Samples Go?

    Science.gov (United States)

    Lehnert, K.

    2017-12-01

    Even in the digital age, physical samples remain an essential component of Earth and space science research. Geoscientists collect samples, sometimes locally, often in remote locations during expensive field expeditions, or at sample repositories and museums. They take these samples to their labs to describe and analyze them. When the analyses are completed and the results are published, the samples get stored away in sheds, basements, or desk drawers, where they remain unknown and inaccessible to the broad science community. In some cases, they will get re-analyzed or shared with other researchers, who know of their existence through personal connections. The sad end comes when the researcher retires: There are many stories of samples and entire collections being discarded to free up space for new samples or other purposes, even though these samples may be unique and irreplaceable. Institutions do not feel obligated and do not have the resources to store samples in perpetuity. Only samples collected in large sampling campaigns such as the Ocean Discovery Program or cores taken on ships find a home in repositories that curate and preserve them for reuse in future science endeavors. In the era of open, transparent, and reproducible science, preservation and persistent access to samples must be considered a mandate. Policies need to be developed that guide investigators, institutions, and funding agencies to plan and implement solutions for reliably and persistently curating and providing access to samples. Registration of samples in online catalogs and use of persistent identifiers such as the International Geo Sample Number are first steps to ensure discovery and access of samples. But digital discovery and access loses its value if the physical objects are not preserved and accessible. It is unreasonable to expect that every sample ever collected can be archived. Selection of those samples that are worth preserving requires guidelines and policies. We also need to…

  5. Newly introduced sample preparation techniques: towards miniaturization.

    Science.gov (United States)

    Costa, Rosaria

    2014-01-01

    Sampling and sample preparation are of crucial importance in an analytical procedure, representing quite often a source of errors. The technique chosen for the isolation of analytes greatly affects the success of a chemical determination. On the other hand, growing concerns about environmental and human safety, along with the introduction of international regulations for quality control, have moved the interest of scientists towards specific needs. Newly introduced sample preparation techniques are challenged to meet new criteria: (i) miniaturization, (ii) higher sensitivity and selectivity, and (iii) automation. In this survey, the most recent techniques introduced in the field of sample preparation will be described and discussed, along with many examples of applications.

  6. Forward selection two sample binomial test

    Science.gov (United States)

    Wong, Kam-Fai; Wong, Weng-Kee; Lin, Miao-Shan

    2016-01-01

    Fisher’s exact test (FET) is a conditional method that is frequently used to analyze data in a 2 × 2 table for small samples. This test is conservative and attempts have been made to modify the test to make it less conservative. For example, Crans and Shuster (2008) proposed adding more points in the rejection region to make the test more powerful. We provide another way to modify the test to make it less conservative by using two independent binomial distributions as the reference distribution for the test statistic. We compare our new test with several methods and show that our test has advantages over existing methods in terms of control of the type 1 and type 2 errors. We reanalyze results from an oncology trial using our proposed method and our software which is freely available to the reader. PMID:27335577
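
    The idea of using two independent binomial distributions as the reference distribution can be sketched as an unconditional exact test that maximizes the tail probability over the common nuisance proportion (a Barnard-style sketch of the general approach, not the authors' exact procedure or their software; function names are ours):

```python
from math import comb

def binom_pmf(k, n, p):
    """Binomial(n, p) probability mass at k."""
    return comb(n, k) * p**k * (1.0 - p)**(n - k)

def unconditional_p_value(x1, n1, x2, n2, grid=99):
    """P-value for H0: p1 == p2 using two independent Binomial(n_i, p)
    reference distributions, maximized over a grid of the common p.

    The test statistic is the absolute difference in sample proportions;
    outcomes at least as extreme as the observed one contribute to the tail.
    """
    observed = abs(x1 / n1 - x2 / n2)
    p_value = 0.0
    for i in range(1, grid + 1):
        p = i / (grid + 1)
        tail = sum(
            binom_pmf(k1, n1, p) * binom_pmf(k2, n2, p)
            for k1 in range(n1 + 1)
            for k2 in range(n2 + 1)
            if abs(k1 / n1 - k2 / n2) >= observed - 1e-12
        )
        p_value = max(p_value, tail)
    return p_value

# 1/10 successes vs 8/10 successes: strong evidence against H0 (p < 0.05).
p = unconditional_p_value(1, 10, 8, 10)
```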

  7. Judgment sampling: a health care improvement perspective.

    Science.gov (United States)

    Perla, Rocco J; Provost, Lloyd P

    2012-01-01

    Sampling plays a major role in quality improvement work. Random sampling (assumed by most traditional statistical methods) is the exception in improvement situations. In most cases, some type of "judgment sample" is used to collect data from a system. Unfortunately, judgment sampling is not well understood. Judgment sampling relies upon those with process and subject matter knowledge to select useful samples for learning about process performance and the impact of changes over time. In many cases, where the goal is to learn about or improve a specific process or system, judgment samples are not merely the most convenient and economical approach, they are technically and conceptually the most appropriate approach. This is because improvement work is done in the real world in complex situations involving specific areas of concern and focus; in these situations, the assumptions of classical measurement theory neither can be met nor should an attempt be made to meet them. The purpose of this article is to describe judgment sampling and its importance in quality improvement work and studies with a focus on health care settings.

  8. 40 CFR Appendix A to Subpart F of... - Sampling Plans for Selective Enforcement Auditing of Small Nonroad Engines

    Science.gov (United States)

    2010-07-01

    Appendix A to Subpart F of Part 90: Sampling Plans for Selective Enforcement Auditing of Small Nonroad Engines (40 CFR Part 90, Protection of Environment, Spark-Ignition Engines At or Below 19 Kilowatts; Subpart F, Selective Enforcement Auditing).

  9. A review of sample preparation and its influence on pH determination in concrete samples

    International Nuclear Information System (INIS)

    Manso, S.; Aguado, A.

    2017-01-01

    If we are to monitor the chemical processes in cementitious materials, then pH assays in the pore solutions of cement pastes, mortars, and concretes are of key importance. However, there is no standard method that regulates the sample-preparation method for pH determination. The state-of-the-art of different methods for pH determination in cementitious materials is presented in this paper and the influence of sample preparation in each case. Moreover, an experimental campaign compares three different techniques for pH determination. Its results contribute to establishing a basic criterion to help researchers select the most suitable method, depending on the purpose of the research. A simple tool is described for selecting the easiest and the most economic pH determination method, depending on the objective; especially for researchers and those with limited experience in this field.

  10. Enhanced conformational sampling using enveloping distribution sampling.

    Science.gov (United States)

    Lin, Zhixiong; van Gunsteren, Wilfred F

    2013-10-14

    To lessen the problem of insufficient conformational sampling in biomolecular simulations is still a major challenge in computational biochemistry. In this article, an application of the method of enveloping distribution sampling (EDS) is proposed that addresses this challenge and its sampling efficiency is demonstrated in simulations of a hexa-β-peptide whose conformational equilibrium encompasses two different helical folds, i.e., a right-handed 2.7(10∕12)-helix and a left-handed 3(14)-helix, separated by a high energy barrier. Standard MD simulations of this peptide using the GROMOS 53A6 force field did not reach convergence of the free enthalpy difference between the two helices even after 500 ns of simulation time. The use of soft-core non-bonded interactions in the centre of the peptide did enhance the number of transitions between the helices, but at the same time led to neglect of relevant helical configurations. In the simulations of a two-state EDS reference Hamiltonian that envelops both the physical peptide and the soft-core peptide, sampling of the conformational space of the physical peptide ensures that physically relevant conformations can be visited, and sampling of the conformational space of the soft-core peptide helps to enhance the transitions between the two helices. The EDS simulations sampled many more transitions between the two helices and showed much faster convergence of the relative free enthalpy of the two helices compared with the standard MD simulations with only a slightly larger computational effort to determine optimized EDS parameters. Combined with various methods to smoothen the potential energy surface, the proposed EDS application will be a powerful technique to enhance the sampling efficiency in biomolecular simulations.
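
    The core of EDS is a reference Hamiltonian whose energy is a smoothed envelope of the end-state energies, E_R = -(βs)^(-1) ln Σ_i exp(-βs(E_i - ΔE_i)). A minimal numerically stable sketch of that combination rule (parameter names are ours; choosing the smoothness s and the offsets ΔE_i is the parameter-optimization step the abstract mentions):

```python
from math import exp, log

def eds_reference_energy(energies, offsets, beta=1.0, s=1.0):
    """E_R = -(beta*s)^-1 * ln( sum_i exp(-beta*s*(E_i - dE_i)) ).

    With small s the envelope is smooth, easing barrier crossings between
    the end states; as s grows, E_R approaches min_i(E_i - dE_i), i.e. the
    lower envelope of the offset end-state surfaces. Uses the log-sum-exp
    trick for numerical stability.
    """
    exponents = [-beta * s * (e - de) for e, de in zip(energies, offsets)]
    m = max(exponents)
    return -(m + log(sum(exp(x - m) for x in exponents))) / (beta * s)

# Two end states at E = 1.0 and E = 5.0 with no offsets: for large s the
# reference energy tracks the lower-energy state.
print(round(eds_reference_energy([1.0, 5.0], [0.0, 0.0], s=100.0), 6))  # 1.0
```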

  11. User-driven sampling strategies in image exploitation

    Science.gov (United States)

    Harvey, Neal; Porter, Reid

    2013-12-01

    Visual analytics and interactive machine learning both try to leverage the complementary strengths of humans and machines to solve complex data exploitation tasks. These fields overlap most significantly when training is involved: the visualization or machine learning tool improves over time by exploiting observations of the human-computer interaction. This paper focuses on one aspect of the human-computer interaction that we call user-driven sampling strategies. Unlike relevance feedback and active learning sampling strategies, where the computer selects which data to label at each iteration, we investigate situations where the user selects which data is to be labeled at each iteration. User-driven sampling strategies can emerge in many visual analytics applications but they have not been fully developed in machine learning. User-driven sampling strategies suggest new theoretical and practical research questions for both visualization science and machine learning. In this paper we identify and quantify the potential benefits of these strategies in a practical image analysis application. We find user-driven sampling strategies can sometimes provide significant performance gains by steering tools towards local minima that have lower error than tools trained with all of the data. In preliminary experiments we find these performance gains are particularly pronounced when the user is experienced with the tool and application domain.

  12. Standard methods for sampling and sample preparation for gamma spectroscopy

    International Nuclear Information System (INIS)

    Taskaeva, M.; Taskaev, E.; Nikolov, P.

    1993-01-01

    The strategy for sampling and sample preparation is outlined: necessary number of samples; analysis and treatment of the results received; quantity of the analysed material according to the radionuclide concentrations and analytical methods; the minimal quantity and kind of the data needed for making final conclusions and decisions on the base of the results received. This strategy was tested in gamma spectroscopic analysis of radionuclide contamination of the region of Eleshnitsa Uranium Mines. The water samples were taken and stored according to ASTM D 3370-82. The general sampling procedures were in conformity with the recommendations of ISO 5667. The radionuclides were concentrated by coprecipitation with iron hydroxide and ion exchange. The sampling of soil samples complied with the rules of ASTM C 998, and their sample preparation with ASTM C 999. After preparation the samples were sealed hermetically and measured. (author)

  13. The Internet of Samples in the Earth Sciences (iSamples)

    Science.gov (United States)

    Carter, M. R.; Lehnert, K. A.

    2015-12-01

    Across most Earth Science disciplines, research depends on the availability of samples collected above, at, and beneath Earth's surface, on the moon and in space, or generated in experiments. Many domains in the Earth Sciences have recently expressed the need for better discovery, access, and sharing of scientific samples and collections (EarthCube End-User Domain workshops, 2012 and 2013, http://earthcube.org/info/about/end-user-workshops), as has the US government (OSTP Memo, March 2014). The Internet of Samples in the Earth Sciences (iSamples) is an initiative funded as a Research Coordination Network (RCN) within the EarthCube program to address this need. iSamples aims to advance the use of innovative cyberinfrastructure to connect physical samples and sample collections across the Earth Sciences with digital data infrastructures to revolutionize their utility for science. iSamples strives to build, grow, and foster a new community of practice, in which domain scientists, curators of sample repositories and collections, computer and information scientists, software developers and technology innovators engage in and collaborate on defining, articulating, and addressing the needs and challenges of physical samples as a critical component of digital data infrastructure. A primary goal of iSamples is to deliver a community-endorsed set of best practices and standards for the registration, description, identification, and citation of physical specimens and define an actionable plan for implementation. iSamples conducted a broad community survey about sample sharing and has created 5 different working groups to address the different challenges of developing the internet of samples - from metadata schemas and unique identifiers to an architecture of a shared cyberinfrastructure for collections, to digitization of existing collections, to education, and ultimately to establishing the physical infrastructure that will ensure preservation and access of the physical

  14. Vibration and acoustic frequency spectra for industrial process modeling using selective fusion multi-condition samples and multi-source features

    Science.gov (United States)

    Tang, Jian; Qiao, Junfei; Wu, ZhiWei; Chai, Tianyou; Zhang, Jian; Yu, Wen

    2018-01-01

    Frequency spectral data of mechanical vibration and acoustic signals relate to difficult-to-measure production quality and quantity parameters of complex industrial processes. A selective ensemble (SEN) algorithm can be used to build a soft sensor model of these process parameters by fusing valued information selectively from different perspectives. However, a combination of several optimized ensemble sub-models with SEN cannot guarantee the best prediction model. In this study, we use several techniques to construct mechanical vibration and acoustic frequency spectra of a data-driven industrial process parameter model based on selective fusion multi-condition samples and multi-source features. Multi-layer SEN (MLSEN) strategy is used to simulate the domain expert cognitive process. Genetic algorithm and kernel partial least squares are used to construct the inside-layer SEN sub-model based on each mechanical vibration and acoustic frequency spectral feature subset. Branch-and-bound and adaptive weighted fusion algorithms are integrated to select and combine outputs of the inside-layer SEN sub-models. Then, the outside-layer SEN is constructed. Thus, "sub-sampling training examples"-based and "manipulating input features"-based ensemble construction methods are integrated, thereby realizing the selective information fusion process based on multi-condition history samples and multi-source input features. This novel approach is applied to a laboratory-scale ball mill grinding process. A comparison with other methods indicates that the proposed MLSEN approach effectively models mechanical vibration and acoustic signals.

  15. Sample vial inserts: A better approach for sampling heterogeneous slurry samples in the SRS Defense Waste Processing Facility

    International Nuclear Information System (INIS)

    Coleman, C.J.; Goode, S.R.

    1996-01-01

    A convenient and effective new approach for analyzing DWPF samples involves the use of inserts with volumes of 1.5--3 ml placed in the neck of 14 ml sample vials. The inserts have rims that conform to the rim of the vials so that they sit straight and stable in the vial. The DWPF tank sampling system fills the pre-weighed insert rather than the entire vial, so the vial functions only as the insert holder. The shielded cell operator then removes the vial cap and decants the insert containing the sample into a plastic bottle, crucible, etc., for analysis. Inert materials such as Teflon, plastic, and zirconium are used for the insert so it is unnecessary to separate the insert from the sample for most analyses. The key technique advantage of using inserts to take DWPF samples versus filling sample vials is that it provides a convenient and almost foolproof way of obtaining and handling small volumes of slurry samples in a shielded cell without corrupting the sample. Since the insert allows the entire sample to be analyzed, this approach eliminates the errors inherent with subsampling heterogeneous slurries that comprise DWPF samples. Slurry samples can then be analyzed with confidence. Analysis times are dramatically reduced by eliminating the drying and vitrification steps normally used to produce a homogeneous solid sample. Direct dissolution and elemental analysis of slurry samples are achieved in 8 hours or less compared with 40 hours for analysis of vitrified slurry samples. Comparison of samples taken in inserts versus full vials indicate that the insert does not significantly affect sample composition

  16. The NuSTAR  Extragalactic Surveys: X-Ray Spectroscopic Analysis of the Bright Hard-band Selected Sample

    Science.gov (United States)

    Zappacosta, L.; Comastri, A.; Civano, F.; Puccetti, S.; Fiore, F.; Aird, J.; Del Moro, A.; Lansbury, G. B.; Lanzuisi, G.; Goulding, A.; Mullaney, J. R.; Stern, D.; Ajello, M.; Alexander, D. M.; Ballantyne, D. R.; Bauer, F. E.; Brandt, W. N.; Chen, C.-T. J.; Farrah, D.; Harrison, F. A.; Gandhi, P.; Lanz, L.; Masini, A.; Marchesi, S.; Ricci, C.; Treister, E.

    2018-02-01

    We discuss the spectral analysis of a sample of 63 active galactic nuclei (AGN) detected above a limiting flux of S(8-24 keV) = 7 × 10^-14 erg s^-1 cm^-2 in the multi-tiered NuSTAR extragalactic survey program. The sources span a redshift range z = 0-2.1 (median z = 0.58). The spectral analysis is performed over the broad 0.5-24 keV energy range, combining NuSTAR with Chandra and/or XMM-Newton data and employing empirical and physically motivated models. This constitutes the largest sample of AGN selected at >10 keV to be homogeneously spectrally analyzed at these flux levels. We study the distribution of spectral parameters such as photon index, column density (N_H), reflection parameter (R), and 10-40 keV luminosity (L_X). Heavily obscured (log[N_H/cm^-2] ≥ 23) and Compton-thick (CT; log[N_H/cm^-2] ≥ 24) AGN constitute ~25% (15-17 sources) and ~2-3% (1-2 sources) of the sample, respectively. The observed N_H distribution agrees fairly well with predictions of cosmic X-ray background population-synthesis models (CXBPSM). We estimate the intrinsic fraction of AGN as a function of N_H, accounting for the bias against obscured AGN in a flux-selected sample. The fraction of CT AGN relative to log[N_H/cm^-2] = 20-24 AGN is poorly constrained, formally in the range 2-56% (90% upper limit of 66%). We derived a fraction (f_abs) of obscured AGN (log[N_H/cm^-2] = 22-24) as a function of L_X in agreement with CXBPSM and previous z values.

  17. Signal sampling circuit

    NARCIS (Netherlands)

    Louwsma, S.M.; Vertregt, Maarten

    2011-01-01

    A sampling circuit for sampling a signal is disclosed. The sampling circuit comprises a plurality of sampling channels adapted to sample the signal in time-multiplexed fashion, each sampling channel comprising a respective track-and-hold circuit connected to a respective analogue-to-digital converter.

  18. Signal sampling circuit

    NARCIS (Netherlands)

    Louwsma, S.M.; Vertregt, Maarten

    2010-01-01

    A sampling circuit for sampling a signal is disclosed. The sampling circuit comprises a plurality of sampling channels adapted to sample the signal in time-multiplexed fashion, each sampling channel comprising a respective track-and-hold circuit connected to a respective analogue-to-digital converter.

  19. Distribution-Preserving Stratified Sampling for Learning Problems.

    Science.gov (United States)

    Cervellera, Cristiano; Maccio, Danilo

    2017-06-09

    The need for extracting a small sample from a large amount of real data, possibly streaming, arises routinely in learning problems, e.g., for storage, to cope with computational limitations, obtain good training/test/validation sets, and select minibatches for stochastic gradient neural network training. Unless we have reasons to select the samples in an active way dictated by the specific task and/or model at hand, it is important that the distribution of the selected points is as similar as possible to the original data. This is obvious for unsupervised learning problems, where the goal is to gain insights on the distribution of the data, but it is also relevant for supervised problems, where the theory explains how the training set distribution influences the generalization error. In this paper, we analyze the technique of stratified sampling from the point of view of distances between probabilities. This allows us to introduce an algorithm, based on recursive binary partition of the input space, aimed at obtaining samples that are distributed as much as possible as the original data. A theoretical analysis is proposed, proving the (greedy) optimality of the procedure together with explicit error bounds. An adaptive version of the algorithm is also introduced to cope with streaming data. Simulation tests on various data sets and different learning tasks are also provided.
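
    The recursive binary partition idea can be sketched as follows: bisect the input space along alternating axes and allocate the sample budget to each cell in proportion to its point count, so the drawn sample mirrors the data's spatial distribution (a deterministic toy sketch of the general idea, not the paper's algorithm or its error bounds; names are ours):

```python
def stratified_sample(points, n, depth=0):
    """Select n of the given points by recursive binary partition.

    Each cell is split at the median along axis (depth mod dim), and the
    budget n is divided proportionally to the sub-cell sizes, so every
    region contributes points roughly in proportion to its mass.
    Assumes 0 <= n <= len(points); points are tuples of equal dimension.
    """
    if n >= len(points):
        return list(points)
    if n == 0:
        return []
    axis = depth % len(points[0])
    pts = sorted(points, key=lambda p: p[axis])
    mid = len(pts) // 2
    n_left = min(round(n * mid / len(pts)), mid)   # proportional budget
    n_right = min(n - n_left, len(pts) - mid)      # guard the remainder
    n_left = n - n_right
    return (stratified_sample(pts[:mid], n_left, depth + 1)
            + stratified_sample(pts[mid:], n_right, depth + 1))

# Thin 100 scattered 2-D points down to a spatially representative 10:
data = [(x / 99.0, (x * 37 % 100) / 99.0) for x in range(100)]
sample = stratified_sample(data, 10)
print(len(sample))  # 10
```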

  20. XRF analysis of mineralised samples

    International Nuclear Information System (INIS)

    Ahmedali, T.

    2002-01-01

    Full text: Software now supplied by instrument manufacturers has made it practical and convenient for users to analyse unusual samples routinely. Semiquantitative scanning software can be used for rapid preliminary screening of elements ranging from Carbon to Uranium, prior to assigning mineralised samples to an appropriate quantitative analysis routine. The general quality and precision of analytical results obtained from modern XRF spectrometers can be significantly enhanced by several means: a. Modifications in preliminary sample preparation can result in less contamination from crushing and grinding equipment. Optimised techniques of actual sample preparation can significantly increase the precision of results. b. Employment of automatic data recording balances and the use of catch weights during sample preparation reduces technician time as well as weighing errors. c. Consistency of results can be improved significantly by the use of appropriate stable drift monitors with a statistically significant content of the analyte. d. A judicious selection of kV/mA combinations, analysing crystals, primary beam filters, collimators, peak positions, accurate background correction and peak overlap corrections, followed by the use of appropriate matrix correction procedures. e. Preventative maintenance procedures for XRF spectrometers and ancillary equipment, which can also contribute significantly to reducing instrument down times, are described. Examples of various facets of sample processing routines are given from the XRF spectrometer component of a multi-instrument analytical university facility, which provides XRF data to 17 Canadian universities. Copyright (2002) Australian X-ray Analytical Association Inc

  1. Surface reconstruction through poisson disk sampling.

    Directory of Open Access Journals (Sweden)

    Wenguang Hou

    Full Text Available This paper generates an approximate Voronoi diagram in the geodesic metric for a set of unbiased samples selected from the original points. The mesh model of the seeds is then constructed on the basis of the Voronoi diagram. Rather than constructing the Voronoi diagram for all original points, the proposed strategy sidesteps the obstacle that geodesic distances among neighboring points are sensitive to the definition of nearest neighbors. The reconstructed model is thus a level of detail of the original points; hence, our main motivation is to deal with redundant scattered points. In the implementation, Poisson disk sampling is used to select seeds and helps to produce the Voronoi diagram. Adaptive reconstructions can be achieved by slightly changing the uniform strategy for selecting seeds. The behavior of this method is investigated and accuracy evaluations are performed. Experimental results show the proposed method is reliable and effective.
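
    Selecting Poisson-disk seeds from an existing point set can be sketched with greedy dart throwing: accept a point only if it keeps a minimum distance to every previously accepted seed. This is a toy sketch of the selection step only, using Euclidean distance in 2-D rather than the paper's geodesic metric on the surface; names are ours:

```python
import random

def poisson_disk_select(points, r, seed=0):
    """Greedily pick a subset of `points` with pairwise distance >= r."""
    rng = random.Random(seed)
    shuffled = points[:]
    rng.shuffle(shuffled)          # visit points in random order (unbiased)
    kept = []
    r2 = r * r
    for p in shuffled:
        if all((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2 >= r2 for q in kept):
            kept.append(p)
    return kept

# Thin a dense 2-D grid of scattered points down to well-separated seeds:
cloud = [(i / 10.0, j / 10.0) for i in range(11) for j in range(11)]
seeds = poisson_disk_select(cloud, r=0.25)
```

A uniform acceptance radius gives a uniform seed distribution; making r vary over the domain yields the adaptive reconstructions the abstract mentions.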

  2. Analysis of Selected Legacy 85Kr Samples

    Energy Technology Data Exchange (ETDEWEB)

    Jubin, Robert Thomas [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Bruffey, Stephanie H. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-09-02

    Legacy samples composed of 85Kr encapsulated in solid zeolite 5A material and five small metal tubes containing a mixture of the zeolite combined with a glass matrix resulting from hot isostatic pressing have been preserved. The samples were a result of krypton R&D encapsulation efforts in the late 1970s performed at the Idaho Chemical Processing Plant. These samples were shipped to Oak Ridge National Laboratory (ORNL) in mid-FY 2014. Upon receipt the outer shipping package was opened, and the inner package, removed and placed in a radiological hood. The individual capsules were double bagged as they were removed from the inner shipping pig and placed into individual glass sample bottles for further analysis. The five capsules were then x-ray imaged. Capsules 1 and 4 appear intact and to contain an amorphous mass within the capsules. Capsule 2 clearly shows the saw marks on the capsule and a quantity of loose pellet or bead-like material remaining in the capsule. Capsule 3 shows similar bead-like material within the intact capsule. Capsule 5 had been opened at an undetermined time in the past. The end of this capsule appears to have been cut off, and there are additional saw marks on the side of the capsule. X-ray tomography allowed the capsules to be viewed along the three axes. Of most interest was determining whether there was any residual material in the closed end of Capsule 5. The images confirmed the presence of residual material within this capsule. The material appears to be compacted but still retains some of the bead-like morphology. Based on the nondestructive analysis (NDA) results, a proposed path forward was formulated to advance this effort toward the original goals of understanding the effects of extended storage on the waste form and package. Based on the initial NDA and the fact that there are at least two breached samples, it was proposed that exploratory tests be conducted with the breached specimens before opening the three intact

  3. Microfabricated Devices for Sample Extraction, Concentrations, and Related Sample Processing Technologies

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Gang; Lin, Yuehe

    2006-12-01

    This is an invited book chapter. As with other analytical techniques, sample pretreatment, sample extraction, sample introduction, and related techniques are of extreme importance for micro-electro-mechanical systems (MEMS). Bio-MEMS devices and systems start with a sampling step. The biological sample then usually undergoes some kind of sample preparation before the actual analysis. These steps may involve extracting the target sample from its matrix, removing interferences from the sample, derivatizing the sample to detectable species, or performing a sample preconcentration step. The integration of components for sample pretreatment into microfluidic devices represents one of the remaining bottlenecks towards achieving true miniaturized total analysis systems (µTAS). This chapter provides a thorough state of the art of the developments in this field to date.

  4. The SDSS-IV MaNGA Sample: Design, Optimization, and Usage Considerations

    Science.gov (United States)

    Wake, David A.; Bundy, Kevin; Diamond-Stanic, Aleksandar M.; Yan, Renbin; Blanton, Michael R.; Bershady, Matthew A.; Sánchez-Gallego, José R.; Drory, Niv; Jones, Amy; Kauffmann, Guinevere; Law, David R.; Li, Cheng; MacDonald, Nicholas; Masters, Karen; Thomas, Daniel; Tinker, Jeremy; Weijmans, Anne-Marie; Brownstein, Joel R.

    2017-09-01

    We describe the sample design for the SDSS-IV MaNGA survey and present the final properties of the main samples along with important considerations for using these samples for science. Our target selection criteria were developed while simultaneously optimizing the size distribution of the MaNGA integral field units (IFUs), the IFU allocation strategy, and the target density to produce a survey defined in terms of maximizing signal-to-noise ratio, spatial resolution, and sample size. Our selection strategy makes use of redshift limits that only depend on I-band absolute magnitude (M_I), or, for a small subset of our sample, M_I and color (NUV - I). Such a strategy ensures that all galaxies span the same range in angular size irrespective of luminosity and are therefore covered evenly by the adopted range of IFU sizes. We define three samples: the Primary and Secondary samples are selected to have a flat number density with respect to M_I and are targeted to have spectroscopic coverage to 1.5 and 2.5 effective radii (R_e), respectively. The Color-Enhanced supplement increases the number of galaxies in the low-density regions of color-magnitude space by extending the redshift limits of the Primary sample in the appropriate color bins. The samples cover the stellar mass range 5 × 10^8 ≤ M* ≤ 3 × 10^11 M_⊙ h^-2 and are sampled at median physical resolutions of 1.37 and 2.5 kpc for the Primary and Secondary samples, respectively. We provide weights that will statistically correct for our luminosity and color-dependent selection function and IFU allocation strategy, thus correcting the observed sample to a volume-limited sample.
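
    A common way to realize such volume corrections is a 1/V_max weight: each galaxy is weighted by the inverse of the comoving volume within which it satisfies the survey's redshift limits. A schematic flat-ΛCDM sketch of the generic textbook estimator (assumed H0 and Ωm values; this is not the released MaNGA weighting scheme):

```python
from math import pi

C_KM_S = 299792.458  # speed of light, km/s

def comoving_distance(z, h0=70.0, omega_m=0.3, steps=1000):
    """Comoving distance (Mpc) in flat LCDM via trapezoidal integration."""
    ez_inv = lambda zz: (omega_m * (1.0 + zz) ** 3 + (1.0 - omega_m)) ** -0.5
    h = z / steps
    s = 0.5 * (ez_inv(0.0) + ez_inv(z)) + sum(ez_inv(i * h) for i in range(1, steps))
    return (C_KM_S / h0) * h * s

def vmax_weight(z_min, z_max, sky_fraction=1.0):
    """Inverse of the comoving volume of the shell z_min..z_max."""
    d_min, d_max = comoving_distance(z_min), comoving_distance(z_max)
    v_max = sky_fraction * (4.0 / 3.0) * pi * (d_max ** 3 - d_min ** 3)
    return 1.0 / v_max

# An intrinsically fainter galaxy passes the cuts over a narrower redshift
# window and therefore receives a larger weight:
w_faint = vmax_weight(0.01, 0.03)
w_bright = vmax_weight(0.01, 0.15)
```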

  5. The Swift Gamma-Ray Burst Host Galaxy Legacy Survey. I. Sample Selection and Redshift Distribution

    Science.gov (United States)

    Perley, D. A.; Kruhler, T.; Schulze, S.; Postigo, A. De Ugarte; Hjorth, J.; Berger, E.; Cenko, S. B.; Chary, R.; Cucchiara, A.; Ellis, R.; et al.

    2016-01-01

    We introduce the Swift Gamma-Ray Burst Host Galaxy Legacy Survey (SHOALS), a multi-observatory high redshift galaxy survey targeting the largest unbiased sample of long-duration gamma-ray burst (GRB) hosts yet assembled (119 in total). We describe the motivations of the survey and the development of our selection criteria, including an assessment of the impact of various observability metrics on the success rate of afterglow-based redshift measurement. We briefly outline our host galaxy observational program, consisting of deep Spitzer/IRAC imaging of every field supplemented by similarly deep, multicolor optical/near-IR photometry, plus spectroscopy of events without preexisting redshifts. Our optimized selection cuts combined with host galaxy follow-up have so far enabled redshift measurements for 110 targets (92%) and placed upper limits on all but one of the remainder. About 20% of GRBs in the sample are heavily dust obscured, and at most 2% originate from z > 5.5. Using this sample, we estimate the redshift-dependent GRB rate density, showing it to peak at z ≈ 2.5 and fall by at least an order of magnitude toward low (z = 0) redshift, while declining more gradually toward high (z ≈ 7) redshift. This behavior is consistent with a progenitor whose formation efficiency varies modestly over cosmic history. Our survey will permit the most detailed examination to date of the connection between the GRB host population and general star-forming galaxies, directly measure evolution in the host population over cosmic time and discern its causes, and provide new constraints on the fraction of cosmic star formation occurring in undetectable galaxies at all redshifts.

  6. A review of sample preparation and its influence on pH determination in concrete samples

    Directory of Open Access Journals (Sweden)

    S. Manso

    2017-01-01

    If we are to monitor the chemical processes in cementitious materials, then pH assays in the pore solutions of cement pastes, mortars, and concretes are of key importance. However, there is no standard method that regulates sample preparation for pH determination. This paper presents the state of the art of different methods for pH determination in cementitious materials and the influence of sample preparation in each case. Moreover, an experimental campaign compares three different techniques for pH determination. Its results contribute to establishing a basic criterion to help researchers select the most suitable method, depending on the purpose of the research. A simple tool is described for selecting the easiest and most economic pH determination method, depending on the objective, especially for researchers with limited experience in this field.

  7. Modular microfluidic system for biological sample preparation

    Science.gov (United States)

    Rose, Klint A.; Mariella, Jr., Raymond P.; Bailey, Christopher G.; Ness, Kevin Dean

    2015-09-29

    A reconfigurable modular microfluidic system for preparation of a biological sample, including a series of reconfigurable modules for automated sample preparation adapted to selectively include a) a microfluidic acoustic focusing filter module, b) a dielectrophoresis bacteria filter module, c) a dielectrophoresis virus filter module, d) an isotachophoresis nucleic acid filter module, e) a lysis module, and f) an isotachophoresis-based nucleic acid filter.

  8. Clinical impact of strict criteria for selectivity and lateralization in adrenal vein sampling.

    Science.gov (United States)

    Gasparetto, Alessandro; Angle, John F; Darvishi, Pasha; Freeman, Colbey W; Norby, Ray G; Carey, Robert M

    2015-04-01

    Selectivity index (SI) and lateralization index (LI) thresholds determine the adequacy of adrenal vein sampling (AVS) and the degree of lateralization. The purpose of this study was to investigate the clinical outcome of patients whose adrenal vein sampling was interpreted using "strict criteria" (SC) (SIpre-stimuli≥3, SIpost-stimuli≥5 and LIpre-stimuli≥4, LIpost-stimuli≥4). A retrospective review of 73 consecutive AVS procedures was performed and 67 were technically successful. Forty-three patients showed lateralization and underwent surgery, while 24 did not lateralize and were managed conservatively. Systolic blood pressure (SBP), diastolic blood pressure (DBP), kalemia (K(+)), and the change in number of blood pressure (BP) medications were recorded for each patient before and after AVS and potential surgery were performed. In the surgery group, BP and K(+) changed respectively from 160±5.3/100±2.0 mmHg to 127±3.3/80±1.9 (p blood pressure medications were six (14.0%) in the lateralized group and 22 (91.7%) in the non-lateralized group (p <0.001). AVS interpretation with SC leads to significant clinical improvement in both patients who underwent surgery and those managed conservatively.

  9. The development of a Martian atmospheric sample collection canister

    Science.gov (United States)

    Kulczycki, E.; Galey, C.; Kennedy, B.; Budney, C.; Bame, D.; Van Schilfgaarde, R.; Aisen, N.; Townsend, J.; Younse, P.; Piacentine, J.

    The collection of an atmospheric sample from Mars would provide significant insight into the elemental composition and sub-surface out-gassing rates of noble gases. A team of engineers at the Jet Propulsion Laboratory (JPL), California Institute of Technology, has developed an atmospheric sample collection canister for Martian application. The engineering strategy has two basic elements: first, to collect two separately sealed 50 cubic centimeter unpressurized atmospheric samples with minimal sensing and actuation in a self-contained pressure vessel; and second, to package this atmospheric sample canister in such a way that it can be easily integrated into the orbiting sample capsule for collection and return to Earth. Sample collection and integrity are demonstrated by emulating the atmospheric collection portion of the Mars Sample Return mission on a compressed timeline. The test results were achieved by varying the pressure inside a thermal vacuum chamber while opening and closing the valve on the sample canister at Mars ambient pressure. A commercial off-the-shelf medical-grade micro-valve is utilized in the first iteration of this design to enable rapid testing of the system. The valve has been independently leak tested at JPL to quantify and separate the leak rates associated with the canister. The results are factored into an overall system design that quantifies mass, power, and sensing requirements for a Martian Atmospheric Sample Collection (MASC) canister as outlined in the Mars Sample Return mission profile. Qualitative results include the selection of materials to minimize sample contamination, preliminary science requirements, priorities in sample composition, flight valve selection criteria, a storyboard from sample collection to loading in the orbiting sample capsule, and contributions to maintaining "Earth clean" exterior surfaces on the orbiting sample capsule.

  10. Getting DNA copy numbers without control samples.

    Science.gov (United States)

    Ortiz-Estevez, Maria; Aramburu, Ander; Rubio, Angel

    2012-08-16

    The selection of the reference used to scale the data in a copy number analysis is of paramount importance for achieving accurate estimates. Usually this reference is generated using control samples included in the study. However, control samples are not always available, and in these cases an artificial reference must be created. A proper generation of this signal is crucial in terms of both noise and bias. We propose NSA (Normality Search Algorithm), a scaling method that works with and without control samples. It is based on the assumption that genomic regions enriched in SNPs with identical copy numbers in both alleles are likely to be normal. These normal regions are predicted for each sample individually and used to calculate the final reference signal. NSA can be applied to any CN data regardless of the microarray technology and preprocessing method. It also finds an optimal weighting of the samples, minimizing possible batch effects. Five human datasets (a subset of HapMap samples, plus Glioblastoma Multiforme (GBM), Ovarian, Prostate and Lung Cancer experiments) have been analyzed. It is shown that, using only tumoral samples, NSA is able to remove the bias in the copy number estimation, to reduce the noise and, therefore, to increase the ability to detect copy number aberrations (CNAs). These improvements allow NSA to detect recurrent aberrations more accurately than other state-of-the-art methods. NSA provides a robust and accurate reference for scaling probe signal data to CN values without the need for control samples. It minimizes the problems of bias, noise and batch effects in the estimation of CNs. Therefore, the NSA scaling approach helps to detect recurrent CNAs better than current methods. The automatic selection of references makes it useful for bulk analysis of many GEO or ArrayExpress experiments without the need to develop a parser to find the normal samples or possible batches within the data. The method is available in the open-source R package
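The core idea, predicting per-sample normal regions and averaging only those into the reference, can be sketched in a few lines. This is a hypothetical illustration of the averaging and scaling steps only, not the published NSA code; the probe intensities and the per-sample normality masks are assumed inputs:

```python
import math

def build_reference(samples, normal_masks):
    """Average, probe by probe, only the samples whose region is
    predicted copy-neutral; fall back to all samples if none is."""
    n_probes = len(samples[0])
    reference = []
    for j in range(n_probes):
        vals = [s[j] for s, m in zip(samples, normal_masks) if m[j]]
        if not vals:
            vals = [s[j] for s in samples]
        reference.append(sum(vals) / len(vals))
    return reference

def log2_ratios(sample, reference):
    """Scale one sample's probe signals against the reference."""
    return [math.log2(s / r) for s, r in zip(sample, reference)]
```

With two toy samples where the first carries a gain on its last probe, that probe is excluded from the reference by the mask, so the gain survives as a log2 ratio of 1 instead of being averaged away.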

  11. Honest Importance Sampling with Multiple Markov Chains.

    Science.gov (United States)

    Tan, Aixin; Doss, Hani; Hobert, James P

    2015-01-01

    selection in linear regression, and for this application, importance sampling based on multiple chains enables an empirical Bayes approach to variable selection.
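The truncated abstract concerns importance sampling with draws from multiple Markov chains; the self-normalized estimator underneath any such scheme can be sketched generically. This toy uses an i.i.d. Gaussian proposal standing in for chain output, with densities and seed chosen purely for illustration:

```python
import math
import random

def snis_mean(target_logpdf, proposal_logpdf, proposal_sample, n=10000, seed=1):
    """Self-normalized importance sampling estimate of the target mean,
    using draws from a proposal (known only up to a constant)."""
    rng = random.Random(seed)
    xs = [proposal_sample(rng) for _ in range(n)]
    logw = [target_logpdf(x) - proposal_logpdf(x) for x in xs]
    m = max(logw)                          # stabilize the exponentials
    w = [math.exp(lw - m) for lw in logw]
    return sum(wi * xi for wi, xi in zip(w, xs)) / sum(w)

# Toy check: estimate the mean of N(1, 1) from samples of N(0, 2^2).
def t_logpdf(x):  # target N(1, 1), up to an additive constant
    return -0.5 * (x - 1.0) ** 2

def p_logpdf(x):  # proposal N(0, 4), up to an additive constant
    return -x * x / 8.0

est = snis_mean(t_logpdf, p_logpdf, lambda r: r.gauss(0.0, 2.0))
```

Self-normalization is what makes unnormalized densities usable; the "honesty" question the paper addresses is how to weight draws when several chains with different stationary laws contribute samples.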

  12. Environmental sampling

    International Nuclear Information System (INIS)

    Puckett, J.M.

    1998-01-01

    Environmental Sampling (ES) is a technology option that can have application in transparency in nuclear nonproliferation. The basic process is to take a sample from the environment, e.g., soil, water, vegetation, or dust and debris from a surface, and, through very careful sample preparation and analysis, determine the types, elemental concentrations, and isotopic composition of actinides in the sample. The sample is prepared and the analysis performed in a clean chemistry laboratory (CCL). This ES capability is part of the IAEA Strengthened Safeguards System. Such a laboratory is planned to be built by JAERI at Tokai and will give Japan an intrinsic ES capability. This paper presents options for the use of ES as a transparency measure for nuclear nonproliferation

  13. Correcting Classifiers for Sample Selection Bias in Two-Phase Case-Control Studies

    Science.gov (United States)

    Theis, Fabian J.

    2017-01-01

    Epidemiological studies often utilize stratified data in which rare outcomes or exposures are artificially enriched. This design can increase precision in association tests but distorts predictions when applying classifiers on nonstratified data. Several methods correct for this so-called sample selection bias, but their performance remains unclear especially for machine learning classifiers. With an emphasis on two-phase case-control studies, we aim to assess which corrections to perform in which setting and to obtain methods suitable for machine learning techniques, especially the random forest. We propose two new resampling-based methods to resemble the original data and covariance structure: stochastic inverse-probability oversampling and parametric inverse-probability bagging. We compare all techniques for the random forest and other classifiers, both theoretically and on simulated and real data. Empirical results show that the random forest profits from only the parametric inverse-probability bagging proposed by us. For other classifiers, correction is mostly advantageous, and methods perform uniformly. We discuss consequences of inappropriate distribution assumptions and reasons for the different behaviors between the random forest and other classifiers. In conclusion, we provide guidance for choosing correction methods when training classifiers on biased samples. For random forests, our method outperforms state-of-the-art procedures if distribution assumptions are roughly fulfilled. We provide our implementation in the R package sambia. PMID:29312464
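The inverse-probability idea behind these resampling corrections can be illustrated with a minimal sketch (this is not the sambia implementation): rows drawn into a case-control study with known phase-two selection probabilities are resampled with weights proportional to the inverse of those probabilities, so a bootstrap replicate approximates the source population's class balance:

```python
import random

def ip_resample(rows, labels, selection_prob, n_out, seed=7):
    """Weighted bootstrap: each row is drawn with probability
    proportional to 1 / P(row was selected into the study)."""
    rng = random.Random(seed)
    weights = [1.0 / selection_prob[y] for y in labels]
    idx = rng.choices(range(len(rows)), weights=weights, k=n_out)
    return [(rows[i], labels[i]) for i in idx]
```

A classifier is then trained on such replicates. The paper's parametric variant additionally aims to resemble the original covariance structure, which this sketch omits.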

  14. Soil Gas Sample Handling: Evaluation of Water Removal and Sample Ganging

    Energy Technology Data Exchange (ETDEWEB)

    Fritz, Brad G. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Abrecht, David G. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Hayes, James C. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Mendoza, Donaldo P. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2016-10-31

    Soil gas sampling is currently conducted in support of Nuclear Test Ban treaty verification. Soil gas samples are collected and analyzed for isotopes of interest. Some issues that can impact sampling and analysis of these samples are excess moisture and sample processing time. Here we discuss three potential improvements to the current sampling protocol: a desiccant for water removal, use of a molecular sieve to remove CO2 from the sample during collection, and a ganging manifold to allow composite analysis of multiple samples.

  15. The AlSi10Mg samples produced by selective laser melting: single track, densification, microstructure and mechanical behavior

    International Nuclear Information System (INIS)

    Wei, Pei; Wei, Zhengying; Chen, Zhen; Du, Jun; He, Yuyang; Li, Junfeng; Zhou, Yatong

    2017-01-01

    Highlights: • The thermal behavior of AlSi10Mg molten pool was analyzed. • The SLM-processed sample with a relatively low surface roughness was obtained. • Effects of parameters on surface topography of scan track were investigated. • Effects of parameters on microstructure of parts were investigated. • Optimum processing parameters for AlSi10Mg SLM were obtained. - Abstract: The densification behavior and attendant microstructural characteristics of the selective laser melting (SLM) processed AlSi10Mg alloy, as affected by the processing parameters, were systematically investigated. The samples with a single track were produced by SLM to study the influences of laser power and scanning speed on the surface morphologies of scan tracks. Additionally, the bulk samples were produced to investigate the influence of the laser power, scanning speed, and hatch spacing on the densification level and the resultant microstructure. The experimental results showed that the level of porosity of the SLM-processed samples was significantly governed by the energy density of the laser beam and the hatch spacing. The tensile properties of SLM-processed samples and the attendant fracture surface can be enhanced by decreasing the level of porosity. The microstructure of SLM-processed samples consists of a supersaturated Al-rich cellular structure along with eutectic Al/Si situated at the cellular boundaries. The Si content in the cellular boundaries increases with increasing laser power and decreasing scanning speed. The hardness of SLM-processed samples was significantly improved by this fine microstructure compared with the cast samples. Moreover, the hardness of SLM-processed samples at overlaps was lower than the hardness observed at track cores.

  16. The AlSi10Mg samples produced by selective laser melting: single track, densification, microstructure and mechanical behavior

    Energy Technology Data Exchange (ETDEWEB)

    Wei, Pei; Wei, Zhengying, E-mail: zywei@mail.xjtu.edu.cn; Chen, Zhen; Du, Jun; He, Yuyang; Li, Junfeng; Zhou, Yatong

    2017-06-30

    Highlights: • The thermal behavior of AlSi10Mg molten pool was analyzed. • The SLM-processed sample with a relatively low surface roughness was obtained. • Effects of parameters on surface topography of scan track were investigated. • Effects of parameters on microstructure of parts were investigated. • Optimum processing parameters for AlSi10Mg SLM were obtained. - Abstract: The densification behavior and attendant microstructural characteristics of the selective laser melting (SLM) processed AlSi10Mg alloy, as affected by the processing parameters, were systematically investigated. The samples with a single track were produced by SLM to study the influences of laser power and scanning speed on the surface morphologies of scan tracks. Additionally, the bulk samples were produced to investigate the influence of the laser power, scanning speed, and hatch spacing on the densification level and the resultant microstructure. The experimental results showed that the level of porosity of the SLM-processed samples was significantly governed by the energy density of the laser beam and the hatch spacing. The tensile properties of SLM-processed samples and the attendant fracture surface can be enhanced by decreasing the level of porosity. The microstructure of SLM-processed samples consists of a supersaturated Al-rich cellular structure along with eutectic Al/Si situated at the cellular boundaries. The Si content in the cellular boundaries increases with increasing laser power and decreasing scanning speed. The hardness of SLM-processed samples was significantly improved by this fine microstructure compared with the cast samples. Moreover, the hardness of SLM-processed samples at overlaps was lower than the hardness observed at track cores.

  17. Sample summary report for KOR1 pressure tube sample

    International Nuclear Information System (INIS)

    Lee, Hee Jong; Nam, Min Woo; Choi, Young Ha

    2006-01-01

    This summary report basically includes the following: - The FLAW CHARACTERIZATION TABLE of the KOR1 sample and supporting documentation. - The CROSS REFERENCE TABLES for each investigator, i.e., the SAMPLE INSPECTION TABLE that cross-references the FLAW CHARACTERIZATION TABLE. - Each Sample Inspection Report as Appendices

  18. Language sampling

    DEFF Research Database (Denmark)

    Rijkhoff, Jan; Bakker, Dik

    1998-01-01

    This article has two aims: [1] to present a revised version of the sampling method that was originally proposed in 1993 by Rijkhoff, Bakker, Hengeveld and Kahrel, and [2] to discuss a number of other approaches to language sampling in the light of our own method. We will also demonstrate how our sampling method is used with different genetic classifications (Voegelin & Voegelin 1977, Ruhlen 1987, Grimes ed. 1997) and argue that, on the whole, our sampling technique compares favourably with other methods, especially in the case of exploratory research.

  19. Sampling in Developmental Science: Situations, Shortcomings, Solutions, and Standards.

    Science.gov (United States)

    Bornstein, Marc H; Jager, Justin; Putnick, Diane L

    2013-12-01

    Sampling is a key feature of every study in developmental science. Although sampling has far-reaching implications, too little attention is paid to sampling. Here, we describe, discuss, and evaluate four prominent sampling strategies in developmental science: population-based probability sampling, convenience sampling, quota sampling, and homogeneous sampling. We then judge these sampling strategies by five criteria: whether they yield representative and generalizable estimates of a study's target population, whether they yield representative and generalizable estimates of subsamples within a study's target population, the recruitment efforts and costs they entail, whether they yield sufficient power to detect subsample differences, and whether they introduce "noise" related to variation in subsamples and whether that "noise" can be accounted for statistically. We use sample composition of gender, ethnicity, and socioeconomic status to illustrate and assess the four sampling strategies. Finally, we tally the use of the four sampling strategies in five prominent developmental science journals and make recommendations about best practices for sample selection and reporting.
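Of the four strategies, population-based probability sampling is the most mechanical to state. A proportional stratified draw, one common probability design, can be sketched as follows; the frame, strata (e.g. gender or SES groups), and sampling fraction are hypothetical:

```python
import random

def stratified_sample(frame, strata_of, frac=0.1, seed=3):
    """Proportional stratified sampling: draw the same fraction from
    every stratum so subgroup shares match the sampling frame's."""
    rng = random.Random(seed)
    by_stratum = {}
    for unit in frame:
        by_stratum.setdefault(strata_of(unit), []).append(unit)
    sample = []
    for stratum, units in sorted(by_stratum.items()):
        k = max(1, round(frac * len(units)))
        sample.extend(rng.sample(units, k))
    return sample
```

Convenience and quota sampling differ precisely in that inclusion probabilities like `frac` are unknown or uncontrolled, which is why they fare worse on the representativeness and generalizability criteria the article applies.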

  20. Polymer platforms for selective detection of cocaine in street samples adulterated with levamisole.

    Science.gov (United States)

    Florea, Anca; Cowen, Todd; Piletsky, Sergey; De Wael, Karolien

    2018-08-15

    Accurate drug detection is of utmost importance for fighting against drug abuse. With a high number of cutting agents and adulterants being added to cut or mask drugs in street powders, the number of false results is increasing. We demonstrate for the first time the usefulness of employing polymers readily synthesized by electrodeposition to selectively detect cocaine in the presence of the commonly used adulterant levamisole. The polymers were selected by computational modelling to exhibit high binding affinity towards cocaine and deposited directly on the surface of graphene-modified electrodes via electropolymerization. The resulting platforms allowed a distinct electrochemical signal for cocaine, which is otherwise suppressed by levamisole. Square wave voltammetry was used to quantify cocaine alone and in the presence of levamisole. The usefulness of the platforms was demonstrated in the screening of real street samples. Copyright © 2018 Elsevier B.V. All rights reserved.

  1. Legacy sample disposition project. Volume 2: Final report

    International Nuclear Information System (INIS)

    Gurley, R.N.; Shifty, K.L.

    1998-02-01

    This report describes the legacy sample disposition project at the Idaho Engineering and Environmental Laboratory (INEEL), which assessed Site-wide facilities/areas to locate legacy samples and owner organizations and then characterized and dispositioned these samples. This project resulted from an Idaho Department of Environmental Quality inspection of selected areas of the INEEL in January 1996, which identified some samples at the Test Reactor Area and Idaho Chemical Processing Plant that had not been characterized and dispositioned according to Resource Conservation and Recovery Act (RCRA) requirements. The objective of the project was to manage legacy samples in accordance with all applicable environmental and safety requirements. A systems engineering approach was used throughout the project, which included collecting the legacy sample information and developing a system for amending and retrieving the information. All legacy samples were dispositioned by the end of 1997. Closure of the legacy sample issue was achieved through these actions

  2. Alpha Matting with KL-Divergence Based Sparse Sampling.

    Science.gov (United States)

    Karacan, Levent; Erdem, Aykut; Erdem, Erkut

    2017-06-22

    In this paper, we present a new sampling-based alpha matting approach for the accurate estimation of foreground and background layers of an image. Previous sampling-based methods typically rely on certain heuristics in collecting representative samples from known regions, and thus their performance deteriorates if the underlying assumptions are not satisfied. To alleviate this, we take an entirely new approach and formulate sampling as a sparse subset selection problem where we propose to pick a small set of candidate samples that best explains the unknown pixels. Moreover, we describe a new dissimilarity measure for comparing two samples which is based on KL-divergence between the distributions of features extracted in the vicinity of the samples. The proposed framework is general and could be easily extended to video matting by additionally taking temporal information into account in the sampling process. Evaluation on standard benchmark datasets for image and video matting demonstrates that our approach provides more accurate results compared to the state-of-the-art methods.
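The dissimilarity measure is described only at a high level. Assuming, for illustration, that the features around each sample are summarized as discrete histograms, a KL-divergence comparison can be sketched as:

```python
import math

def kl_divergence(p, q, eps=1e-12):
    """KL(p || q) between two discrete feature distributions, e.g.
    color histograms of patches around two candidate samples.
    Inputs are unnormalized counts; eps smooths away empty bins."""
    assert len(p) == len(q)
    zp, zq = sum(p), sum(q)
    d = 0.0
    for pi, qi in zip(p, q):
        pi, qi = pi / zp + eps, qi / zq + eps
        d += pi * math.log(pi / qi)
    return d
```

KL is asymmetric; when a symmetric comparison between two samples is wanted, averaging both directions is a common choice (whether the paper symmetrizes is not stated in this record).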

  3. A Note on Information-Directed Sampling and Thompson Sampling

    OpenAIRE

    Zhou, Li

    2015-01-01

    This note introduces three Bayesian-style multi-armed bandit algorithms: Information-Directed Sampling, Thompson Sampling and Generalized Thompson Sampling. The goal is to give an intuitive explanation of these three algorithms and their regret bounds, and to provide some derivations that are omitted in the original papers.
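As a concrete illustration of the second of these, here is a minimal Thompson Sampling loop for Bernoulli bandits with Beta(1, 1) priors (a standard toy setup chosen for this sketch, not code from the note): each round, one mean is sampled from every arm's posterior and the arm with the largest draw is pulled.

```python
import random

def thompson_bernoulli(true_means, horizon=5000, seed=11):
    """Thompson Sampling for Bernoulli bandits with Beta(1,1) priors.
    Returns the number of pulls of each arm over the horizon."""
    rng = random.Random(seed)
    k = len(true_means)
    alpha, beta = [1] * k, [1] * k   # Beta posterior parameters
    pulls = [0] * k
    for _ in range(horizon):
        draws = [rng.betavariate(alpha[i], beta[i]) for i in range(k)]
        arm = max(range(k), key=lambda i: draws[i])
        reward = 1 if rng.random() < true_means[arm] else 0
        alpha[arm] += reward         # conjugate posterior update
        beta[arm] += 1 - reward
        pulls[arm] += 1
    return pulls
```

Over a long horizon the posterior of the best arm concentrates and it absorbs nearly all pulls, which is the behavior the regret bounds formalize.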

  4. How Sample Size Affects a Sampling Distribution

    Science.gov (United States)

    Mulekar, Madhuri S.; Siegel, Murray H.

    2009-01-01

    If students are to understand inferential statistics successfully, they must have a profound understanding of the nature of the sampling distribution. Specifically, they must comprehend the determination of the expected value and standard error of a sampling distribution as well as the meaning of the central limit theorem. Many students in a high…
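The two quantities the authors emphasize, the expected value and standard error of a sampling distribution, are easy to demonstrate by simulation. This toy (a uniform population and sample sizes chosen for the sketch) shows the standard error of the mean shrinking roughly as sigma divided by the square root of n:

```python
import random
import statistics

def se_of_sample_mean(pop, n, reps=2000, seed=2):
    """Empirical standard error of the sample mean at sample size n:
    repeatedly draw samples, record their means, take the spread."""
    rng = random.Random(seed)
    means = [statistics.fmean(rng.sample(pop, n)) for _ in range(reps)]
    return statistics.stdev(means)

rng0 = random.Random(0)
population = [rng0.uniform(0, 1) for _ in range(10000)]
se4 = se_of_sample_mean(population, 4)     # roughly sigma / 2
se64 = se_of_sample_mean(population, 64)   # roughly sigma / 8
```

Going from n = 4 to n = 64 multiplies n by 16, so the ratio se4 / se64 comes out near 4, the square root of 16, matching the central limit theorem's sigma over root n.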

  5. Comparisons of methods for generating conditional Poisson samples and Sampford samples

    OpenAIRE

    Grafström, Anton

    2005-01-01

    Methods for conditional Poisson sampling (CP-sampling) and Sampford sampling are compared and the focus is on the efficiency of the methods. The efficiency is investigated by simulation in different sampling situations. It was of interest to compare methods since new methods for both CP-sampling and Sampford sampling were introduced by Bondesson, Traat & Lundqvist in 2004. The new methods are acceptance-rejection methods that use the efficient Pareto sampling method. They are found to be ...
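For intuition about what CP-sampling is, a deliberately naive rejection sampler can be sketched: draw Poisson samples (independent Bernoulli inclusions) and keep the first one of the required size. This is illustrative only; the acceptance-rejection methods discussed above, built on Pareto sampling, are far more efficient.

```python
import random

def poisson_sample(p, rng):
    """Poisson sampling: independently include unit i with prob p[i]."""
    return [i for i, pi in enumerate(p) if rng.random() < pi]

def conditional_poisson_sample(p, n, seed=5, max_tries=100000):
    """Naive CP-sampling by rejection: repeat Poisson sampling until
    the realized sample size equals n."""
    rng = random.Random(seed)
    for _ in range(max_tries):
        s = poisson_sample(p, rng)
        if len(s) == n:
            return s
    raise RuntimeError("no fixed-size sample found")
```

Conditioning on the sample size is what distorts the marginal inclusion probabilities away from p, which is exactly the complication the CP and Sampford designs handle carefully.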

  6. Cushing's disease; inferior petrosal sinus venography and samplings

    International Nuclear Information System (INIS)

    Chung, Tae Sub; Suh, Jung Ho; Kim, Dong Ik; Lee, Hyun Chul; Huh, Kap Bum; Kim, Young Soo; Chung, Sang Sup

    1991-01-01

    Hypersecretion of ACTH in patients with Cushing's syndrome originates from either a pituitary tumor (Cushing's disease) or an ectopic ACTH-secreting tumor. These 2 entities may be clinically indistinguishable, and additional difficulties arise because pituitary microadenomas may be radiologically occult. Recently, bilateral selective venous sampling from the inferior petrosal sinuses became the procedure of choice for confirming pituitary ACTH hypersecretion when the combined hormonal test yields a false negative study. We performed selective venous catheterization and sampling for ACTH. The central location of the lesion was detected in 1 case (intersinus gradient = 1.1 : 1), and the remaining 5 cases revealed lateralization of the lesions (intersinus gradient = 3.7 - 20.1 : 1), which correlated well with transsphenoidal microadenomectomies in all the cases. We concluded that selective venous ACTH sampling from the inferior petrosal sinus is a reliable and useful aid in the diagnosis of Cushing's disease when standard clinical and biochemical studies are inconclusive

  7. Comparison of sampling methods for hard-to-reach francophone populations: yield and adequacy of advertisement and respondent-driven sampling.

    Science.gov (United States)

    Ngwakongnwi, Emmanuel; King-Shier, Kathryn M; Hemmelgarn, Brenda R; Musto, Richard; Quan, Hude

    2014-01-01

    Francophones who live outside the primarily French-speaking province of Quebec, Canada, risk being excluded from research by lack of a sampling frame. We examined the adequacy of random sampling, advertising, and respondent-driven sampling for recruitment of francophones for survey research. We recruited francophones residing in the city of Calgary, Alberta, through advertising and respondent-driven sampling. These 2 samples were then compared with a random subsample of Calgary francophones derived from the 2006 Canadian Community Health Survey (CCHS). We assessed the effectiveness of advertising and respondent-driven sampling in relation to the CCHS sample by comparing demographic characteristics and selected items from the CCHS (specifically self-reported general health status, perceived weight, and having a family doctor). We recruited 120 francophones through advertising and 145 through respondent-driven sampling; the random sample from the CCHS consisted of 259 records. The samples derived from advertising and respondent-driven sampling differed from the CCHS in terms of age (mean ages 41.0, 37.6, and 42.5 years, respectively), sex (proportion of males 26.1%, 40.6%, and 56.6%, respectively), education (college or higher 86.7%, 77.9%, and 59.1%, respectively), place of birth (immigrants accounting for 45.8%, 55.2%, and 3.7%, respectively), and not having a regular medical doctor (16.7%, 34.5%, and 16.6%, respectively). Differences were not tested statistically because of limitations on the analysis of CCHS data imposed by Statistics Canada. The samples generated exclusively through advertising and respondent-driven sampling were not representative of the gold standard sample from the CCHS. Use of such biased samples for research studies could generate misleading results.

  8. Dynamic Headspace Sampling as an Initial Step for Sample Preparation in Chromatographic Analysis.

    Science.gov (United States)

    Wojnowski, Wojciech; Majchrzak, Tomasz; Dymerski, Tomasz; Gębicki, Jacek; Namieśnik, Jacek

    2017-11-01

    This work represents a brief summary of the use of dynamic headspace (DHS) as a technique for sample preparation in chromatographic analysis. Despite numerous developments in the area of analyte isolation and enrichment, DHS remains one of the fundamental methods used with GC. In our opinion, interest in this technique will not diminish significantly because it conforms to stipulations of green analytical chemistry. Moreover, DHS fulfills the need for methods that facilitate detection and determination of analytes present at ultratrace levels in complex matrixes. The main focus of this work was placed on the theoretical fundamentals of this method. Also described herein were DHS development, the advantages and disadvantages of this technique compared with other headspace sampling techniques, and selected examples of its applications in food and environmental analyses.

  9. A large sample of Kohonen selected E+A (post-starburst) galaxies from the Sloan Digital Sky Survey

    Science.gov (United States)

    Meusinger, H.; Brünecke, J.; Schalldach, P.; in der Au, A.

    2017-01-01

    Context. The galaxy population in the contemporary Universe is characterised by a clear bimodality, blue galaxies with significant ongoing star formation and red galaxies with only a little. The migration between the blue and the red cloud of galaxies is an issue of active research. Post-starburst (PSB) galaxies are thought to be observed in the short-lived transition phase. Aims: We aim to create a large sample of local PSB galaxies from the Sloan Digital Sky Survey (SDSS) to study their characteristic properties, particularly morphological features indicative of gravitational distortions and indications for active galactic nuclei (AGNs). Another aim is to present a tool set for an efficient search in a large database of SDSS spectra based on Kohonen self-organising maps (SOMs). Methods: We computed a huge Kohonen SOM for ∼10^6 spectra from SDSS data release 7. The SOM is made fully available, in combination with an interactive user interface, for the astronomical community. We selected a large sample of PSB galaxies taking advantage of the clustering behaviour of the SOM. The morphologies of both PSB galaxies and randomly selected galaxies from a comparison sample in SDSS Stripe 82 (S82) were inspected on deep co-added SDSS images to search for indications of gravitational distortions. We used the Portsmouth galaxy property computations to study the evolutionary stage of the PSB galaxies and archival multi-wavelength data to search for hidden AGNs. Results: We compiled a catalogue of 2665 PSB galaxies with redshifts z 3 Å and z cloud, in agreement with the idea that PSB galaxies represent the transitioning phase between actively and passively evolving galaxies. The relative frequency of distorted PSB galaxies is at least 57% for EW(Hδ) > 5 Å, significantly higher than in the comparison sample. The search for AGNs based on conventional selection criteria in the radio and MIR results in a low AGN fraction of ∼2-3%. We confirm an MIR excess in the mean SED of

  10. Multiwavelength diagnostics of accretion in an X-ray selected sample of CTTSs

    Science.gov (United States)

    Curran, R. L.; Argiroffi, C.; Sacco, G. G.; Orlando, S.; Peres, G.; Reale, F.; Maggio, A.

    2011-02-01

    Context. High resolution X-ray spectroscopy has revealed soft X-rays from high density plasma in classical T Tauri stars (CTTSs), probably arising from the accretion shock region. However, the mass accretion rates derived from the X-ray observations are consistently lower than those derived from UV/optical/NIR studies. Aims: We aim to test the hypothesis that the high density soft X-ray emission originates from accretion by analysing, in a homogeneous manner, optical accretion indicators for an X-ray selected sample of CTTSs. Methods: We analyse optical spectra of the X-ray selected sample of CTTSs and calculate the accretion rates based on measuring the Hα, Hβ, Hγ, He ii 4686 Å, He i 5016 Å, He i 5876 Å, O i 6300 Å, and He i 6678 Å equivalent widths. In addition, we also calculate the accretion rates based on the full width at 10% maximum of the Hα line. The different optical tracers of accretion are compared and discussed. The derived accretion rates are then compared to the accretion rates derived from the X-ray spectroscopy. Results: We find that, for each CTTS in our sample, the different optical tracers predict mass-accretion rates that agree within the errors, albeit with a spread of ≈ 1 order of magnitude. Typically, mass-accretion rates derived from Hα and He i 5876 Å are larger than those derived from Hβ, Hγ, and O i. In addition, the Hα full width at 10%, whilst a good indicator of accretion, may not accurately measure the mass-accretion rate. When the optical mass-accretion rates are compared to the X-ray derived mass-accretion rates, we find that: a) the latter are always lower (but by varying amounts); b) the latter range within a factor of ≈ 2 around 2 × 10^-10 M⊙ yr^-1, despite the former spanning a range of ≈ 3 orders of magnitude. We suggest that the systematic underestimate of the X-ray derived mass-accretion rates could depend on the density distribution inside the accretion streams, where the densest part of the stream is

  11. Large Sample Neutron Activation Analysis of Heterogeneous Samples

    International Nuclear Information System (INIS)

    Stamatelatos, I.E.; Vasilopoulou, T.; Tzika, F.

    2018-01-01

A Large Sample Neutron Activation Analysis (LSNAA) technique was developed for non-destructive analysis of heterogeneous bulk samples. The technique incorporated collimated scanning and combined experimental measurements with Monte Carlo simulations to identify inhomogeneities in large-volume samples and to correct for their effect on the interpretation of gamma-spectrometry data. Corrections were applied for the effects of neutron self-shielding, gamma-ray attenuation, the geometrical factor, and heterogeneous activity distribution within the sample. A benchmark experiment was performed to investigate the effect of heterogeneity on the accuracy of LSNAA. Moreover, a ceramic vase was analyzed as a whole, demonstrating the feasibility of the technique. The LSNAA results were compared against results obtained by INAA and a satisfactory agreement between the two methods was observed. This study showed that LSNAA is a technique capable of performing accurate, non-destructive, multi-elemental compositional analysis of heterogeneous objects. It also revealed the great potential of the technique for the analysis of precious objects and artefacts that need to be preserved intact and cannot be damaged for sampling purposes. (author)

  12. Sampling in practice

    DEFF Research Database (Denmark)

    Esbensen, Kim Harry; Petersen, Lars

    2005-01-01

A basic knowledge of the Theory of Sampling (TOS) and a set of only eight sampling unit operations is all the practical sampler needs to ensure representativeness of samples extracted from all kinds of lots: production batches, truckloads, barrels, sub-division in the laboratory, sampling...... in nature and in the field (environmental sampling, forestry, geology, biology), from raw materials or manufacturing processes etc. We here can only give a brief introduction to the Fundamental Sampling Principle (FSP) and these eight Sampling Unit Operations (SUOs). Always respecting FSP and invoking only...... the necessary SUOs (depending on the practical situation) is the only prerequisite needed for eliminating all sampling bias and simultaneously minimizing sampling variance, and this is in addition a sure guarantee for making the final analytical results trustworthy. No reliable conclusions can be made unless...

  13. Improved sample treatment for the determination of insoluble soap in sewage sludge samples by liquid chromatography with fluorescence detection.

    Science.gov (United States)

    Cantarero, Samuel; Zafra-Gómez, A; Ballesteros, O; Navalón, A; Vílchez, J L; Crovetto, G; Verge, C; de Ferrer, J A

    2010-09-15

A new selective and sensitive method for the determination of insoluble fatty acid salts (soap) in sewage sludge samples is proposed. The method involves a clean-up of the sample with petroleum ether, conversion of the insoluble calcium and magnesium salts into soluble potassium salts, extraction of the potassium salts with methanol, and a derivatization step prior to analysis by liquid chromatography with fluorescence detection (LC-FLD). Three different extraction techniques (Soxhlet, microwave-assisted extraction and ultrasound) were compared, and microwave-assisted extraction (MAE) was selected as the most appropriate for our purpose, as it reduced the extraction time and solvent waste (50 mL of methanol versus 250 mL for the Soxhlet procedure). The absence of a matrix effect was demonstrated with two standards (C(13:0) and C(17:0)) that are not commercial and have not been detected in sewage sludge samples; evaluating the matrix effect with them was possible because both standards show environmental behaviour (adsorption and precipitation) similar to that of commercial soaps (C(10:0)-C(18:0)). The method was successfully applied to samples from different sources and, consequently, with different compositions. Copyright (c) 2010 Elsevier B.V. All rights reserved.

  14. Effective sample labeling

    International Nuclear Information System (INIS)

    Rieger, J.T.; Bryce, R.W.

    1990-01-01

Ground-water samples collected for hazardous-waste and radiological monitoring have come under strict regulatory and quality assurance requirements as a result of laws such as the Resource Conservation and Recovery Act. To comply with these laws, the labeling system used to identify environmental samples had to be upgraded to ensure proper handling and to protect collection personnel from exposure to sample contaminants and sample preservatives. The sample label now used at the Pacific Northwest Laboratory is a complete sample document: in the event that other paperwork on a labeled sample is lost, the necessary information can be found on the label.

  15. On the Sampling

    OpenAIRE

    Güleda Doğan

    2017-01-01

This editorial is on statistical sampling, one of the two most important reasons for editorial rejection from our journal, Turkish Librarianship. The stages of quantitative research, the stage at which sampling takes place, the importance of sampling for a research study, deciding on sample size, and sampling methods are summarised briefly.

  16. Pilot studies for the North American Soil Geochemical Landscapes Project - Site selection, sampling protocols, analytical methods, and quality control protocols

    Science.gov (United States)

    Smith, D.B.; Woodruff, L.G.; O'Leary, R. M.; Cannon, W.F.; Garrett, R.G.; Kilburn, J.E.; Goldhaber, M.B.

    2009-01-01

In 2004, the US Geological Survey (USGS) and the Geological Survey of Canada sampled and chemically analyzed soils along two transects across Canada and the USA in preparation for a planned soil geochemical survey of North America. This effort was a pilot study to test and refine sampling protocols, analytical methods, quality control protocols, and field logistics for the continental survey. A total of 220 sample sites were selected at approximately 40-km intervals along the two transects. The ideal sampling protocol at each site called for a sample from a depth of 0-5 cm and a composite of each of the O, A, and C horizons. Concentrations of Ca, Fe, K, Mg, Na, S, Ti, Ag, As, Ba, Be, Bi, Cd, Ce, Co, Cr, Cs, Cu, Ga, In, La, Li, Mn, Mo, Nb, Ni, P, Pb, Rb, Sb, Sc, Sn, Sr, Te, Th, Tl, U, V, W, Y, and Zn were determined by inductively coupled plasma-mass spectrometry and inductively coupled plasma-atomic emission spectrometry following a near-total digestion in a mixture of HCl, HNO3, HClO4, and HF. Separate methods were used for Hg, Se, total C, and carbonate-C on the same size fraction. Only Ag, In, and Te had a large percentage of concentrations below the detection limit. Quality control (QC) of the analyses was monitored at three levels: the laboratory performing the analysis, the USGS QC officer, and the principal investigator for the study. This level of review resulted in an average of one QC sample for every 20 field samples, which proved to be minimally adequate for such a large-scale survey. Additional QC samples should be added to monitor within-batch quality, so that no more than 10 field samples are analyzed between successive QC samples. Only Cr (77%), Y (82%), and Sb (80%) fell outside the acceptable limits of accuracy (recovery between 85% and 115%), most likely because these elements reside in mineral phases resistant to the acid digestion. A separate sample of 0-5-cm material was collected at each site for determination of organic compounds.
A subset of 73 of these samples was analyzed for a suite of
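The acceptance rule quoted in the record above (recovery between 85% and 115% of the certified value) is straightforward to encode. The element names and measured/certified concentrations below are invented placeholders, not the study's reference-material data:

```python
# Sketch: percent-recovery QC check of the kind described for the transect
# pilot study. All numeric values here are illustrative.

def percent_recovery(measured, certified):
    """Recovery of a certified reference value, in percent."""
    return 100.0 * measured / certified

qc_results = {
    "Cu": (28.5, 30.0),   # (measured, certified) concentrations, mg/kg
    "Cr": (46.0, 60.0),   # resistant mineral phases -> low recovery
    "Zn": (101.0, 100.0),
}

for element, (measured, certified) in qc_results.items():
    r = percent_recovery(measured, certified)
    ok = 85.0 <= r <= 115.0
    print(f"{element}: {r:.1f}% recovery -> {'accept' if ok else 'flag'}")
```

With the invented numbers, Cu and Zn pass while Cr is flagged, mirroring the low Cr recovery the abstract attributes to acid-resistant mineral phases.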

  17. Adaptive sampling program support for expedited site characterization

    International Nuclear Information System (INIS)

    Johnson, R.

    1993-01-01

Expedited site characterizations offer substantial savings in time and money when assessing hazardous waste sites. Key to some of these savings is the ability to adapt a sampling program to the "real-time" data generated by an expedited site characterization. This paper presents a two-pronged approach to supporting adaptive sampling programs: a specialized object-oriented database/geographical information system for data fusion, management, and display; and combined Bayesian/geostatistical methods for estimating the extent of contamination and selecting sample locations
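A toy sketch of the adaptive idea described above, not the paper's Bayesian/geostatistical machinery: each new sampling location is chosen where predictive uncertainty is highest, here crudely proxied by distance to the nearest existing sample rather than a true kriging variance. All coordinates are invented.

```python
import math
import random

# Greedy adaptive sample-location selection on a toy site.
# Proxy for uncertainty: distance to the nearest already-sampled point
# (a geostatistical workflow would use the kriging variance instead).

random.seed(0)
candidates = [(random.uniform(0, 100), random.uniform(0, 100)) for _ in range(200)]
sampled = [(50.0, 50.0)]  # initial sample at the site centre

def nearest_sample_distance(pt, sampled):
    return min(math.dist(pt, s) for s in sampled)

for _ in range(5):  # choose five follow-up locations adaptively
    nxt = max(candidates, key=lambda p: nearest_sample_distance(p, sampled))
    sampled.append(nxt)
    candidates.remove(nxt)

print(f"selected {len(sampled) - 1} adaptive locations")
```

Each iteration re-ranks the remaining candidates against the growing sample set, which is the essential feedback loop of an adaptive sampling program.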

  18. Sampling of ore

    International Nuclear Information System (INIS)

    Boehme, R.C.; Nicholas, B.L.

    1987-01-01

This invention relates to a method of and apparatus for ore sampling. The method includes the steps of periodically removing a sample of the output material of a sorting machine, weighing the samples so that each is of the same weight, measuring a characteristic such as the radioactivity or magnetic response of each sample, subjecting at least an equal portion of each sample to chemical analysis to determine its mineral content, and comparing the characteristic measurement with the mineral content of the chemically analysed portion to determine the characteristic/mineral ratio of the sample. The apparatus includes an ore sample collector; a deflector for deflecting a sample of ore particles from the output of an ore sorter into the collector; and means for moving the deflector, at predetermined time intervals and for predetermined time periods, from a first position clear of the particle path from the sorter to a second position in the particle path, so as to deflect the sample particles into the collector. The apparatus conveniently includes an ore crusher for comminuting the sample particles; a sample hopper; means for weighing the hopper; a detector in the hopper for measuring a characteristic such as radioactivity or magnetic response of the particles in the hopper; a discharge outlet from the hopper; and means for feeding the particles from the collector to the crusher and then to the hopper

  19. Probability Sampling - A Guideline for Quantitative Health Care ...

    African Journals Online (AJOL)

    A more direct definition is the method used for selecting a given ... description of the chosen population, the sampling procedure giving ... target population, precision, and stratification. The ... survey estimates, it is recommended that researchers first analyze a .... The optimum sample size has a relation to the type of planned ...

  20. Laser sampling

    International Nuclear Information System (INIS)

    Gorbatenko, A A; Revina, E I

    2015-01-01

    The review is devoted to the major advances in laser sampling. The advantages and drawbacks of the technique are considered. Specific features of combinations of laser sampling with various instrumental analytical methods, primarily inductively coupled plasma mass spectrometry, are discussed. Examples of practical implementation of hybrid methods involving laser sampling as well as corresponding analytical characteristics are presented. The bibliography includes 78 references

  1. Pesticides, selected elements, and other chemicals in adult total diet samples October 1979-September 1980

    International Nuclear Information System (INIS)

    Gartrell, M.J.; Craun, J.C.; Podrebarac, D.S.; Gunderson, E.L.

    1985-01-01

    The US Food and Drug Administration (FDA) conducts Total Diet Studies to determine the dietary intake of selected pesticides, industrial chemicals, and elements (including radionuclides). These studies involve the retail purchase and analysis of foods representative of the diets of infants, toddlers, and adults. The individual food items are separated into a number of food groups, each of which is analyzed as a composite. This report summarizes the results for adult Total Diet samples collected in 20 cities between October 1979 and September 1980. The average concentration, range of concentrations, and calculated average daily intake of each chemical found are presented by food group. The average daily intakes of the chemicals are similar to those found in the several preceding years and are within acceptable limits. The results for samples collected during the same period that represent the diets of infants and toddlers are reported separately

  2. A novel method of selective removal of human DNA improves PCR sensitivity for detection of Salmonella Typhi in blood samples.

    Science.gov (United States)

    Zhou, Liqing; Pollard, Andrew J

    2012-07-27

Enteric fever is a major public health problem, causing an estimated 21 million new cases and 216,000 or more deaths every year. Current diagnosis of the disease is inadequate: blood culture identifies only 45 to 70% of cases and is time-consuming, and serological tests have very low sensitivity and specificity. Clinical samples obtained for diagnosis of enteric fever in the field are generally blood samples, so even PCR-based methods, widely used for detection of other infectious diseases, are not a straightforward option in typhoid diagnosis. We developed a novel method to enrich target bacterial DNA by selectively removing human DNA from blood samples, enhancing the sensitivity of PCR tests. This method offers the possibility of improving PCR assays that use clinical specimens directly for diagnosis of this globally important infectious disease. Blood samples were mixed with ox bile for selective lysis of human blood cells, and the released human DNA was then digested by adding bile-resistant micrococcal nuclease. The intact Salmonella Typhi bacteria were collected from the specimen by centrifugation and the DNA extracted with the QIAamp DNA mini kit. The presence of Salmonella Typhi in blood samples was detected by PCR with the fliC-d gene of Salmonella Typhi as the target. Micrococcal nuclease retained activity against human blood DNA in the presence of up to 9% ox bile. Background human DNA was dramatically reduced in blood samples through the use of ox bile lysis and micrococcal nuclease for removal of mammalian DNA. Consequently, target Salmonella Typhi DNA was enriched in DNA preparations and the PCR sensitivity for detection of Salmonella Typhi in spiked blood samples was enhanced 1,000-fold.
Use of a combination of selective ox-bile blood cell lysis and removal of human DNA with micrococcal nuclease significantly improves PCR sensitivity and offers a better option for improved typhoid PCR assays directly using clinical specimens in diagnosis of

  3. Demystifying Theoretical Sampling in Grounded Theory Research

    Directory of Open Access Journals (Sweden)

Jenna Breckenridge, BSc (Hons), PhD Candidate

    2009-06-01

    Full Text Available Theoretical sampling is a central tenet of classic grounded theory and is essential to the development and refinement of a theory that is ‘grounded’ in data. While many authors appear to share concurrent definitions of theoretical sampling, the ways in which the process is actually executed remain largely elusive and inconsistent. As such, employing and describing the theoretical sampling process can present a particular challenge to novice researchers embarking upon their first grounded theory study. This article has been written in response to the challenges faced by the first author whilst writing a grounded theory proposal. It is intended to clarify theoretical sampling for new grounded theory researchers, offering some insight into the practicalities of selecting and employing a theoretical sampling strategy. It demonstrates that the credibility of a theory cannot be dissociated from the process by which it has been generated and seeks to encourage and challenge researchers to approach theoretical sampling in a way that is apposite to the core principles of the classic grounded theory methodology.

  4. Sample Transport for a European Sample Curation Facility

    Science.gov (United States)

    Berthoud, L.; Vrublevskis, J. B.; Bennett, A.; Pottage, T.; Bridges, J. C.; Holt, J. M. C.; Dirri, F.; Longobardo, A.; Palomba, E.; Russell, S.; Smith, C.

    2018-04-01

    This work has looked at the recovery of Mars Sample Return capsule once it arrives on Earth. It covers possible landing sites, planetary protection requirements, and transportation from the landing site to a European Sample Curation Facility.

  5. Sampling strategies in antimicrobial resistance monitoring: evaluating how precision and sensitivity vary with the number of animals sampled per farm.

    Directory of Open Access Journals (Sweden)

    Takehisa Yamamoto

    Full Text Available Because antimicrobial resistance in food-producing animals is a major public health concern, many countries have implemented antimicrobial monitoring systems at a national level. When designing a sampling scheme for antimicrobial resistance monitoring, it is necessary to consider both cost effectiveness and statistical plausibility. In this study, we examined how sampling scheme precision and sensitivity can vary with the number of animals sampled from each farm, while keeping the overall sample size constant to avoid additional sampling costs. Five sampling strategies were investigated. These employed 1, 2, 3, 4 or 6 animal samples per farm, with a total of 12 animals sampled in each strategy. A total of 1,500 Escherichia coli isolates from 300 fattening pigs on 30 farms were tested for resistance against 12 antimicrobials. The performance of each sampling strategy was evaluated by bootstrap resampling from the observational data. In the bootstrapping procedure, farms, animals, and isolates were selected randomly with replacement, and a total of 10,000 replications were conducted. For each antimicrobial, we observed that the standard deviation and 2.5-97.5 percentile interval of resistance prevalence were smallest in the sampling strategy that employed 1 animal per farm. The proportion of bootstrap samples that included at least 1 isolate with resistance was also evaluated as an indicator of the sensitivity of the sampling strategy to previously unidentified antimicrobial resistance. The proportion was greatest with 1 sample per farm and decreased with larger samples per farm. We concluded that when the total number of samples is pre-specified, the most precise and sensitive sampling strategy involves collecting 1 sample per farm.
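The bootstrap evaluation described in the record above can be sketched as a hierarchical resampling: farms are drawn with replacement, then animals within each farm, and the resistance prevalence is recomputed for each replicate. The data below are simulated, not the study's 1,500 E. coli isolates, and the farm count and prevalence are invented.

```python
import random

# Hierarchical bootstrap comparing sampling designs with a fixed total of
# 12 animals: many farms with few animals vs few farms with many animals.

random.seed(1)

# 30 simulated farms, each a list of per-animal resistance indicators (True = resistant)
farms = [[random.random() < 0.2 for _ in range(10)] for _ in range(30)]

def bootstrap_prevalence(farms, animals_per_farm, n_farms, reps=1000):
    """Bootstrap distribution of resistance prevalence for one sampling design."""
    estimates = []
    for _ in range(reps):
        sampled = []
        for _ in range(n_farms):
            farm = random.choice(farms)                          # farms with replacement
            sampled += random.choices(farm, k=animals_per_farm)  # animals with replacement
        estimates.append(sum(sampled) / len(sampled))
    return estimates

# Compare "12 farms x 1 animal" vs "3 farms x 4 animals" (12 samples either way)
sds = {}
for n_farms, per_farm in [(12, 1), (3, 4)]:
    est = bootstrap_prevalence(farms, per_farm, n_farms)
    mean = sum(est) / len(est)
    sds[(n_farms, per_farm)] = (sum((e - mean) ** 2 for e in est) / len(est)) ** 0.5
    print(f"{n_farms} farms x {per_farm} animals/farm: bootstrap sd = {sds[(n_farms, per_farm)]:.3f}")
```

Because animals on the same farm are correlated, spreading the fixed sample over more farms yields the smaller standard deviation, which is the paper's conclusion in miniature.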

  6. Sampling in Developmental Science: Situations, Shortcomings, Solutions, and Standards

    Science.gov (United States)

    Bornstein, Marc H.; Jager, Justin; Putnick, Diane L.

    2014-01-01

    Sampling is a key feature of every study in developmental science. Although sampling has far-reaching implications, too little attention is paid to sampling. Here, we describe, discuss, and evaluate four prominent sampling strategies in developmental science: population-based probability sampling, convenience sampling, quota sampling, and homogeneous sampling. We then judge these sampling strategies by five criteria: whether they yield representative and generalizable estimates of a study’s target population, whether they yield representative and generalizable estimates of subsamples within a study’s target population, the recruitment efforts and costs they entail, whether they yield sufficient power to detect subsample differences, and whether they introduce “noise” related to variation in subsamples and whether that “noise” can be accounted for statistically. We use sample composition of gender, ethnicity, and socioeconomic status to illustrate and assess the four sampling strategies. Finally, we tally the use of the four sampling strategies in five prominent developmental science journals and make recommendations about best practices for sample selection and reporting. PMID:25580049

  7. Large scale sample management and data analysis via MIRACLE

    DEFF Research Database (Denmark)

    Block, Ines; List, Markus; Pedersen, Marlene Lemvig

    Reverse-phase protein arrays (RPPAs) allow sensitive quantification of relative protein abundance in thousands of samples in parallel. In the past years the technology advanced based on improved methods and protocols concerning sample preparation and printing, antibody selection, optimization...... of staining conditions and mode of signal analysis. However, the sample management and data analysis still poses challenges because of the high number of samples, sample dilutions, customized array patterns, and various programs necessary for array construction and data processing. We developed...... a comprehensive and user-friendly web application called MIRACLE (MIcroarray R-based Analysis of Complex Lysate Experiments), which bridges the gap between sample management and array analysis by conveniently keeping track of the sample information from lysate preparation, through array construction and signal...

  8. Challenging genosensors in food samples: The case of gluten determination in highly processed samples.

    Science.gov (United States)

    Martín-Fernández, Begoña; de-los-Santos-Álvarez, Noemí; Martín-Clemente, Juan Pedro; Lobo-Castañón, María Jesús; López-Ruiz, Beatriz

    2016-01-01

Electrochemical genosensors have undergone enormous development in the last decades, but only very few have achieved quantification of target content in highly processed food samples. The detection of allergens, and particularly gluten, is challenging because legislation establishes a threshold of 20 ppm for labeling a product gluten-free, yet most genosensors express their results as DNA concentration or DNA copies. This paper describes the first attempt to correlate genosensor response with wheat content in real samples, even in the case of highly processed foods. A sandwich-based format was used, comprising a capture probe immobilized onto a screen-printed gold electrode and a signaling probe functionalized with fluorescein isothiocyanate (FITC), both hybridizing with the target. The hybridization event was monitored electrochemically by adding an anti-FITC peroxidase (antiFITC-HRP) and its substrate, tetramethylbenzidine. Binary model mixtures, as a reference material, and real samples were analyzed. DNA was extracted from food and a fragment encoding the immunodominant peptide of α2-gliadin amplified by a tailored PCR. The sensor selectively detected cereals toxic for celiac patients, such as different varieties of wheat, barley, rye and oats, as distinct from non-toxic plants. As little as 0.001% (10 mg/kg) of wheat flour in an inert matrix was reliably detected, which competes directly with the current method of choice for DNA detection, real-time PCR. A good correlation with the official immunoassay was found for highly processed food samples. Copyright © 2015 Elsevier B.V. All rights reserved.

  9. A simple vibrating sample magnetometer for macroscopic samples

    Science.gov (United States)

    Lopez-Dominguez, V.; Quesada, A.; Guzmán-Mínguez, J. C.; Moreno, L.; Lere, M.; Spottorno, J.; Giacomone, F.; Fernández, J. F.; Hernando, A.; García, M. A.

    2018-03-01

We present here a simple model of a vibrating sample magnetometer (VSM). The system allows recording magnetization curves at room temperature with a resolution of the order of 0.01 emu and is appropriate for macroscopic samples. The setup can be mounted in different configurations depending on the requirements of the sample to be measured (mass, saturation magnetization, saturation field, etc.). We also include examples of curves obtained with our setup, together with comparison curves measured with a standard commercial VSM, which confirm the reliability of our device.

  10. Antimicrobial and antibiofilm effects of selected food preservatives against Salmonella spp. isolated from chicken samples.

    Science.gov (United States)

    Er, Buket; Demirhan, Burak; Onurdag, Fatma Kaynak; Ozgacar, Selda Özgen; Oktem, Aysel Bayhan

    2014-03-01

    Salmonella spp. are widespread foodborne pathogens that contaminate egg and poultry meats. Attachment, colonization, as well as biofilm formation capacity of Salmonella spp. on food and contact surfaces of food may cause continuous contamination. Biofilm may play a crucial role in the survival of salmonellae under unfavorable environmental conditions, such as in animal slaughterhouses and processing plants. This could serve as a reservoir compromising food safety and human health. Addition of antimicrobial preservatives extends shelf lives of food products, but even when products are supplemented with adequate amounts of preservatives, it is not always possible to inhibit the microorganisms in a biofilm community. In this study, our aims were i) to determine the minimum inhibitory concentrations (MIC) and minimum biofilm inhibitory concentrations (MBIC) of selected preservatives against planktonic and biofilm forms of Salmonella spp. isolated from chicken samples and Salmonella Typhimurium SL1344 standard strain, ii) to show the differences in the susceptibility patterns of same strains versus the planktonic and biofilm forms to the same preservative agent, and iii) to determine and compare antimicrobial and antibiofilm effects of selected food preservatives against Salmonella spp. For this purpose, Salmonella Typhimurium SL1344 standard strain and 4 Salmonella spp. strains isolated from chicken samples were used. Investigation of antimicrobial and antibiofilm effects of selected food preservatives against Salmonella spp. was done according to Clinical and Laboratory Standards Institute M100-S18 guidelines and BioTimer assay, respectively. As preservative agents, pure ciprofloxacin, sodium nitrite, potassium sorbate, sodium benzoate, methyl paraben, and propyl paraben were selected. As a result, it was determined that MBIC values are greater than the MIC values of the preservatives. This result verified the resistance seen in a biofilm community to food

  11. A census-weighted, spatially-stratified household sampling strategy for urban malaria epidemiology

    Directory of Open Access Journals (Sweden)

    Slutsker Laurence

    2008-02-01

Full Text Available Abstract Background: Urban malaria is likely to become increasingly important as a consequence of the growing proportion of Africans living in cities. A novel sampling strategy was developed for urban areas to generate a sample simultaneously representative of population and inhabited environments. Such a strategy should facilitate analysis of important epidemiological relationships in this ecological context. Methods: Census maps and summary data for Kisumu, Kenya, were used to create a pseudo-sampling frame using the geographic coordinates of census-sampled structures. For every enumeration area (EA) designated as urban by the census (n = 535), a sample of structures equal to one-tenth the number of households was selected. In EAs designated as rural (n = 32), a geographically random sample totalling one-tenth the number of households was selected from a grid of points at 100 m intervals. The selected samples were cross-referenced to a geographic information system, and coordinates transferred to handheld global positioning units. Interviewers found the closest eligible household to the sampling point and interviewed the caregiver of an age-eligible child. Results: 4,336 interviews were completed in 473 of the 567 study area EAs from June 2002 through February 2003. EAs without completed interviews were randomly distributed, and non-response was approximately 2%. Mean distance from the assigned sampling point to the completed interview was 74.6 m, and was significantly less in urban than rural EAs, even when controlling for number of households. The selected sample had significantly more children and females of childbearing age than the general population, and fewer older individuals. Conclusion: This method selected a sample that was simultaneously population-representative and inclusive of important environmental variation. The use of a pseudo-sampling frame and pre-programmed handheld GPS units is more efficient and may yield a more complete sample than

  12. Source apportionment and location by selective wind sampling and Positive Matrix Factorization.

    Science.gov (United States)

    Venturini, Elisa; Vassura, Ivano; Raffo, Simona; Ferroni, Laura; Bernardi, Elena; Passarini, Fabrizio

    2014-10-01

In order to determine the pollution sources in a suburban area and identify the main directions of their origin, PM2.5 was collected with samplers coupled with a wind-selection sensor and then subjected to Positive Matrix Factorization (PMF) analysis. In each sample, soluble ions, organic carbon, elemental carbon, levoglucosan, metals, and Polycyclic Aromatic Hydrocarbons (PAHs) were determined. PMF results identified six main sources affecting the area: natural gas home appliances, motor vehicles, regional transport, biomass combustion, manufacturing activities, and secondary aerosol. The connection of factor temporal trends with other parameters (i.e., temperature, PM2.5 concentration, and photochemical processes) confirms the factor attributions. PMF analysis indicated that the main source of PM2.5 in the area is secondary aerosol, mainly attributable to regional contributions, owing both to the secondary nature of the source itself and to the higher concentrations registered in inland air masses. The contribution of the motor vehicle emission source is also important; this source likely has a predominantly local origin. The most toxic of the components determined, i.e., PAHs, Cd, Pb, and Ni, are mainly due to vehicular traffic. Even though this is not the main source in the study area, it is the one of greatest concern. The application of PMF analysis to PM2.5 collected with this new sampling technique made it possible to obtain more detailed results on the sources affecting the area than a classical PMF analysis.
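PMF, used above to resolve six sources from the PM2.5 chemistry, is at heart a non-negative matrix factorization: the samples-by-species matrix is approximated by source contributions times source profiles. The sketch below implements the shared numerical core with unweighted multiplicative updates on invented data; PMF proper additionally weights each residual by its measurement uncertainty.

```python
import random

# Minimal multiplicative-update non-negative matrix factorization.
# X (samples x species) is factored as X ≈ G @ F, with non-negative
# source contributions G and source profiles F.

def nmf(X, k, iters=2000, seed=0):
    rng = random.Random(seed)
    n, m = len(X), len(X[0])
    G = [[rng.random() for _ in range(k)] for _ in range(n)]
    F = [[rng.random() for _ in range(m)] for _ in range(k)]

    def matmul(A, B):
        return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

    def transpose(A):
        return [list(c) for c in zip(*A)]

    eps = 1e-9
    for _ in range(iters):
        GF = matmul(G, F)
        num = matmul(transpose(G), X)    # F update: F *= (G^T X) / (G^T G F)
        den = matmul(transpose(G), GF)
        F = [[F[i][j] * num[i][j] / (den[i][j] + eps) for j in range(m)] for i in range(k)]
        GF = matmul(G, F)
        num = matmul(X, transpose(F))    # G update: G *= (X F^T) / (G F F^T)
        den = matmul(GF, transpose(F))
        G = [[G[i][j] * num[i][j] / (den[i][j] + eps) for j in range(k)] for i in range(n)]
    return G, F

# Two synthetic "sources" mixed into six samples (illustrative data only)
profiles = [[1.0, 0.0, 0.5], [0.0, 1.0, 0.5]]
contribs = [[2, 0], [0, 3], [1, 1], [3, 1], [0.5, 2], [2, 2]]
X = [[sum(c * p for c, p in zip(row, col)) for col in zip(*profiles)] for row in contribs]

G, F = nmf(X, k=2)
recon = [[sum(g * f for g, f in zip(grow, fcol)) for fcol in zip(*F)] for grow in G]
err = max(abs(recon[i][j] - X[i][j]) for i in range(len(X)) for j in range(len(X[0])))
print(f"max reconstruction error: {err:.4f}")
```

Because the synthetic data are exactly rank two, the factorization recovers the mixture closely; choosing the number of factors k for real ambient data is the hard part of a PMF study.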

  13. An Optimization Study on Listening Experiments to Improve the Comparability of Annoyance Ratings of Noise Samples from Different Experimental Sample Sets.

    Science.gov (United States)

    Di, Guoqing; Lu, Kuanguang; Shi, Xiaofan

    2018-03-08

Annoyance ratings obtained from listening experiments are widely used in studies on the health effects of environmental noise. In listening experiments without reference sound samples, participants usually rate the annoyance of each noise sample according to its relative annoyance among all samples in the experimental set, which leads to poor comparability between results obtained from different experimental sample sets. To solve this problem, this study proposed adding several pink noise samples with certain loudness levels into experimental sample sets as reference sounds. On this basis, the standard curve between logarithmic mean annoyance and the loudness level of pink noise was used to calibrate the experimental results, and the calibration procedures are described in detail. Furthermore, as a case study, six different types of noise sample sets were selected for listening experiments using this method to examine its applicability. Results showed that the differences in the annoyance ratings of each identical noise sample across different experimental sample sets decreased markedly after calibration. The determination coefficient (R²) of the linear fits between psychoacoustic annoyance (PA) and mean annoyance (MA) of noise samples from different experimental sample sets increased clearly after calibration. The case study indicated that the method is applicable to calibrating annoyance ratings obtained from different types of noise sample sets; after calibration, the comparability of annoyance ratings of noise samples from different experimental sample sets is distinctly improved.
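One way to read the calibration idea above: fit the relation between loudness level and mean annoyance of the pink-noise anchors in each session, then map session ratings through a common standard curve. The sketch below assumes a linear anchor relation for simplicity (the paper uses a standard curve in logarithmic mean annoyance), and every number is invented.

```python
# Calibrating annoyance ratings across listening sessions via pink-noise anchors.

def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

# Pink-noise anchors: loudness level (phon) -> mean annoyance in this session
anchors_session = [(60, 2.1), (70, 4.0), (80, 6.2)]
# Standard curve from the reference listening experiment (assumed known)
anchors_standard = [(60, 2.5), (70, 4.5), (80, 6.5)]

s1, b1 = fit_line(*zip(*anchors_session))
s2, b2 = fit_line(*zip(*anchors_standard))

def calibrate(rating):
    """Map a session rating to the standard scale via the two anchor lines."""
    loudness_equiv = (rating - b1) / s1   # session rating -> equivalent loudness level
    return s2 * loudness_equiv + b2       # loudness level -> standard-scale rating

print(f"session rating 4.0 -> standard {calibrate(4.0):.2f}")
```

Because the anchors are rated inside every session, they absorb session-specific shifts in how participants use the scale, which is what makes ratings from different sample sets comparable.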

  14. Optimal sampling designs for large-scale fishery sample surveys in Greece

    Directory of Open Access Journals (Sweden)

    G. BAZIGOS

    2007-12-01

    The paper deals with the optimization of the following three large scale sample surveys: biological sample survey of commercial landings (BSCL, experimental fishing sample survey (EFSS, and commercial landings and effort sample survey (CLES.

  15. Feasibility of self-sampled dried blood spot and saliva samples sent by mail in a population-based study.

    Science.gov (United States)

    Sakhi, Amrit Kaur; Bastani, Nasser Ezzatkhah; Ellingjord-Dale, Merete; Gundersen, Thomas Erik; Blomhoff, Rune; Ursin, Giske

    2015-04-11

    In large epidemiological studies it is often challenging to obtain biological samples. Self-sampling by study participants using dried blood spots (DBS) technique has been suggested to overcome this challenge. DBS is a type of biosampling where blood samples are obtained by a finger-prick lancet, blotted and dried on filter paper. However, the feasibility and efficacy of collecting DBS samples from study participants in large-scale epidemiological studies is not known. The aim of the present study was to test the feasibility and response rate of collecting self-sampled DBS and saliva samples in a population-based study of women above 50 years of age. We determined response proportions, number of phone calls to the study center with questions about sampling, and quality of the DBS. We recruited women through a study conducted within the Norwegian Breast Cancer Screening Program. Invitations, instructions and materials were sent to 4,597 women. The data collection took place over a 3 month period in the spring of 2009. Response proportions for the collection of DBS and saliva samples were 71.0% (3,263) and 70.9% (3,258), respectively. We received 312 phone calls (7% of the 4,597 women) with questions regarding sampling. Of the 3,263 individuals that returned DBS cards, 3,038 (93.1%) had been packaged and shipped according to instructions. A total of 3,032 DBS samples were sufficient for at least one biomarker analysis (i.e. 92.9% of DBS samples received by the laboratory). 2,418 (74.1%) of the DBS cards received by the laboratory were filled with blood according to the instructions (i.e. 10 completely filled spots with up to 7 punches per spot for up to 70 separate analyses). To assess the quality of the samples, we selected and measured two biomarkers (carotenoids and vitamin D). The biomarker levels were consistent with previous reports. Collecting self-sampled DBS and saliva samples through the postal services provides a low cost, effective and feasible
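The response and quality proportions reported above follow directly from the stated counts; a quick arithmetic check:

```python
# Counts reported in the study
invited      = 4597
dbs_returned = 3263
shipped_ok   = 3038   # packaged and shipped according to instructions
usable       = 3032   # sufficient for at least one biomarker analysis
filled_ok    = 2418   # cards filled with blood according to instructions

response_pct = round(100 * dbs_returned / invited, 1)     # 71.0%
shipped_pct  = round(100 * shipped_ok / dbs_returned, 1)  # 93.1%
usable_pct   = round(100 * usable / dbs_returned, 1)      # 92.9%
filled_pct   = round(100 * filled_ok / dbs_returned, 1)   # 74.1%
```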

  16. Getting DNA copy numbers without control samples

    Directory of Open Access Journals (Sweden)

    Ortiz-Estevez Maria

    2012-08-01

Full Text Available Abstract Background The selection of the reference to scale the data in a copy number analysis is of paramount importance to achieve accurate estimates. Usually this reference is generated using control samples included in the study. However, these control samples are not always available, and in these cases an artificial reference must be created. A proper generation of this signal is crucial in terms of both noise and bias. We propose NSA (Normality Search Algorithm), a scaling method that works with and without control samples. It is based on the assumption that genomic regions enriched in SNPs with identical copy numbers in both alleles are likely to be normal. These normal regions are predicted for each sample individually and used to calculate the final reference signal. NSA can be applied to any CN data regardless of the microarray technology and preprocessing method. It also finds an optimal weighting of the samples, minimizing possible batch effects. Results Five human datasets (a subset of HapMap samples, Glioblastoma Multiforme (GBM), Ovarian, Prostate and Lung Cancer experiments) have been analyzed. It is shown that using only tumoral samples, NSA is able to remove the bias in the copy number estimation, to reduce the noise and, therefore, to increase the ability to detect copy number aberrations (CNAs). These improvements also allow NSA to detect recurrent aberrations more accurately than other state-of-the-art methods. Conclusions NSA provides a robust and accurate reference for scaling probe signal data to CN values without the need of control samples. It minimizes the problems of bias, noise and batch effects in the estimation of CNs. Therefore, the NSA scaling approach helps to better detect recurrent CNAs than current methods.
The automatic selection of references makes it useful to perform bulk analysis of many GEO or ArrayExpress experiments without the need of developing a parser to find the normal samples or possible batches within the
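A minimal sketch of the reference-building step described above, assuming the per-sample predicted-normal regions are already available (the SNP-based prediction itself is not shown); function names are hypothetical, not from the NSA software:

```python
import numpy as np

def nsa_like_reference(signals, normal_masks):
    """Sketch of an NSA-style reference built without control samples.

    signals      : (n_samples, n_probes) probe intensities
    normal_masks : (n_samples, n_probes) boolean, True where the sample is
                   predicted copy-number normal (e.g. from regions enriched
                   in SNPs with identical copy numbers in both alleles)
    """
    signals = np.asarray(signals, float)
    normal_masks = np.asarray(normal_masks, bool)
    ref = np.empty(signals.shape[1])
    for j in range(signals.shape[1]):
        # Average only the samples believed to be normal at this probe
        ref[j] = signals[normal_masks[:, j], j].mean()
    return ref

def log2_ratios(signals, ref):
    """Scale probe signals against the artificial reference."""
    return np.log2(np.asarray(signals, float) / ref)
```

Because amplified or deleted regions are masked out of the average, the reference is not biased toward the tumor's own aberrations, which is the point of predicting normal regions per sample.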

  17. Event dependent sampling of recurrent events

    DEFF Research Database (Denmark)

    Kvist, Tine Kajsa; Andersen, Per Kragh; Angst, Jules

    2010-01-01

    The effect of event-dependent sampling of processes consisting of recurrent events is investigated when analyzing whether the risk of recurrence increases with event count. We study the situation where processes are selected for study if an event occurs in a certain selection interval. Motivation...... retrospective and prospective disease course histories are used. We examine two methods to correct for the selection depending on which data are used in the analysis. In the first case, the conditional distribution of the process given the pre-selection history is determined. In the second case, an inverse...

  18. Modification of ion chromatograph for analyses of radioactive samples

    International Nuclear Information System (INIS)

    Curfman, L.L.; Johnson, S.J.

    1979-01-01

    In ion chromatographic analysis, the sample is injected through a sample loop onto an analytical column where separation occurs. The sample then passes through a suppressor column to remove or neutralize background ions. A flow-through conductivity cell is used as a detector. Depending upon column and eluent selection, ion chromatography can be used for anion or cation analyses. Ion chromatography has proven to be a versatile analytical tool for the analysis of anions in Hanford waste samples. These radioactive samples range from caustic high salt solutions to hydrochloric acid dissolutions of insoluble sludges. Instrument modifications which provide safe and convenient handling of these samples without lengthening analysis time or altering instrument performance are described

  19. Selection of reference genes for tissue/organ samples on day 3 fifth-instar larvae in silkworm, Bombyx mori.

    Science.gov (United States)

    Wang, Genhong; Chen, Yanfei; Zhang, Xiaoying; Bai, Bingchuan; Yan, Hao; Qin, Daoyuan; Xia, Qingyou

    2018-06-01

The silkworm, Bombyx mori, is one of the world's most economically important insects. Surveying variations in gene expression among multiple tissue/organ samples will provide clues for gene function assignments and will be helpful for identifying genes related to economic traits or specific cellular processes. To ensure their accuracy, commonly used gene expression quantification methods require a set of stable reference genes for data normalization. In this study, 24 candidate reference genes were assessed in 10 tissue/organ samples of day 3 fifth-instar B. mori larvae using geNorm and NormFinder. The results revealed that the combination of BGIBMGA003186 and BGIBMGA008209 expression was the optimal choice for normalizing the expression data of the B. mori tissue/organ samples. The most stable gene, BGIBMGA003186, is recommended if just one reference gene is used. Moreover, the commonly used reference gene encoding cytoplasmic actin was the least appropriate reference gene of the samples investigated. The reliability of the selected reference genes was further confirmed by evaluating the expression profiles of two cathepsin genes. Our results may be useful for future studies involving the quantification of relative gene expression levels of different tissue/organ samples in B. mori. © 2018 Wiley Periodicals, Inc.
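geNorm ranks candidate reference genes by a stability measure M: the average standard deviation of pairwise log-expression ratios with all other candidates. A compact sketch of that measure (on toy data, not the silkworm dataset):

```python
import numpy as np

def genorm_stability(expr):
    """geNorm-style stability measure M for candidate reference genes.

    expr : (n_samples, n_genes) relative expression quantities.
    M[j] is the mean, over all other genes k, of the standard deviation of
    log2(expr_j / expr_k) across samples; a lower M means a more stable gene.
    """
    log_expr = np.log2(np.asarray(expr, float))
    n_genes = log_expr.shape[1]
    M = np.zeros(n_genes)
    for j in range(n_genes):
        sds = [np.std(log_expr[:, j] - log_expr[:, k])
               for k in range(n_genes) if k != j]
        M[j] = np.mean(sds)
    return M

# The single best reference gene is the one with the smallest M.
```

The intuition: two genes that are both stable keep a constant expression ratio across samples, so the spread of their log-ratio is small; a gene regulated differently from the rest inflates every ratio it appears in.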

  20. Colorimetric biomimetic sensor systems based on molecularly imprinted polymer membranes for highly-selective detection of phenol in environmental samples

    Directory of Open Access Journals (Sweden)

    Sergeyeva T. A.

    2014-05-01

Full Text Available Aim. Development of an easy-to-use colorimetric sensor system for fast and accurate detection of phenol in environmental samples. Methods. Technique of molecular imprinting, method of in situ polymerization of molecularly imprinted polymer membranes. Results. The proposed sensor is based on free-standing molecularly imprinted polymer (MIP membranes, synthesized by in situ polymerization, and having in their structure artificial binding sites capable of selective phenol recognition. The quantitative detection of phenol, selectively adsorbed by the MIP membranes, is based on its reaction with 4-aminoantipyrine, which gives a pink-colored product. The intensity of staining of the MIP membrane is proportional to phenol concentration in the analyzed sample. Phenol can be detected within the range 50 nM–10 mM with limit of detection 50 nM, which corresponds to the concentrations that have to be detected in natural and waste waters in accordance with environmental protection standards. Stability of the MIP-membrane-based sensors was assessed during 12 months storage at room temperature. Conclusions. The sensor system provides highly-selective and sensitive detection of phenol in both model and real (drinking, natural, and waste water samples. As compared to traditional methods of phenol detection, the proposed system is characterized by simplicity of operation and can be used in non-laboratory conditions.
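Since staining intensity is proportional to phenol concentration, quantification reduces to a calibration line fitted on standards and inverted for unknowns. The standards below are invented for illustration only:

```python
import numpy as np

# Hypothetical calibration standards: phenol concentration vs measured
# staining intensity of the pink 4-aminoantipyrine reaction product.
conc_std      = np.array([0.0, 0.1, 0.5, 1.0, 5.0])       # concentration (a.u.)
intensity_std = np.array([0.01, 0.05, 0.21, 0.42, 2.05])  # staining intensity

# Intensity is taken as proportional to concentration: fit I = m*c + b
m, b = np.polyfit(conc_std, intensity_std, 1)

def phenol_conc(intensity):
    """Invert the calibration line to estimate phenol concentration."""
    return (intensity - b) / m
```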

  1. Spectral Characterization of H2020/PTAL Mineral Samples: Implications for In Situ Martian Exploration and Mars Sample Selection

    Science.gov (United States)

    Lantz, C.; Pilorget, C.; Poulet, F.; Riu, L.; Dypvik, H.; Hellevang, H.; Rull Perez, F.; Veneranda, M.; Cousin, A.; Viennet, J.-C.; Werner, S. C.

    2018-04-01

    We present combined analysis performed in the framework of the Planetary Terrestrial Analogues Library (H2020 project). XRD, NIR, Raman, and LIBS spectroscopies are used to characterise samples to prepare ExoMars/ESA and Mars2020/NASA observations.

  2. The subtyping of primary aldosteronism by adrenal vein sampling: sequential blood sampling causes factitious lateralization.

    Science.gov (United States)

    Rossitto, Giacomo; Battistel, Michele; Barbiero, Giulio; Bisogni, Valeria; Maiolino, Giuseppe; Diego, Miotto; Seccia, Teresa M; Rossi, Gian Paolo

    2018-02-01

The pulsatile secretion of adrenocortical hormones and a stress reaction occurring when starting adrenal vein sampling (AVS) can affect the selectivity and also the assessment of lateralization when sequential blood sampling is used. We therefore tested the hypothesis that a simulated sequential blood sampling could decrease the diagnostic accuracy of the lateralization index for identification of aldosterone-producing adenoma (APA), as compared with bilaterally simultaneous AVS. In 138 consecutive patients who underwent subtyping of primary aldosteronism, we compared the results obtained simultaneously bilaterally when starting AVS (t-15) and 15 min after (t0) with those gained with a simulated sequential right-to-left AVS technique (R ⇒ L) created by combining hormonal values obtained at t-15 and at t0. The concordance between simultaneously obtained values at t-15 and t0, and between simultaneously obtained values and values gained with a sequential R ⇒ L technique, was also assessed. We found a marked interindividual variability of lateralization index values in the patients with bilaterally selective AVS at both time points. However, overall the lateralization index simultaneously determined at t0 provided a more accurate identification of APA than the simulated sequential lateralization index (R ⇒ L) (P = 0.001). Moreover, regardless of which side was sampled first, the sequential AVS technique induced a sequence-dependent overestimation of the lateralization index. While in APA patients the concordance between simultaneous AVS at t0 and t-15 and between the simultaneous t0 and sequential techniques was moderate-to-good (K = 0.55 and 0.66, respectively), in non-APA patients it was poor (K = 0.12 and 0.13, respectively). Sequential AVS generates factitious between-sides gradients, which lower its diagnostic accuracy, likely because of the stress reaction arising upon starting AVS.
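The lateralization index is conventionally the ratio of cortisol-corrected aldosterone on the dominant side to that on the contralateral side; a sketch with illustrative hormone values (units cancel in the ratios):

```python
def lateralization_index(aldo_right, cort_right, aldo_left, cort_left):
    """Lateralization index from bilateral adrenal vein samples: the ratio
    of cortisol-corrected aldosterone on the dominant side to that on the
    contralateral side."""
    right = aldo_right / cort_right
    left = aldo_left / cort_left
    return max(right, left) / min(right, left)

# Simultaneous sampling: both sides drawn at the same moment (values illustrative)
li_simultaneous = lateralization_index(1200.0, 40.0, 150.0, 35.0)

# A sequential protocol samples the two sides minutes apart, so pulsatile
# secretion can catch each side at a different point of the cycle and
# inflate or deflate this apparent gradient, as the study demonstrates.
```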

  3. Fluidic sampling

    International Nuclear Information System (INIS)

    Houck, E.D.

    1992-01-01

This paper covers the development of the fluidic sampler and its testing in a fluidic transfer system. The major findings of this paper are as follows. Fluidic jet samplers can dependably produce unbiased samples of acceptable volume. The fluidic transfer system with a fluidic sampler in-line will transfer water to a net lift of 37.2--39.9 feet at an average rate of 0.02--0.05 gpm (77--192 cc/min). The fluidic sample system circulation rate compares very favorably with the normal 0.016--0.026 gpm (60--100 cc/min) circulation rate that is commonly produced for this lift and solution with the jet-assisted airlift sample system that is normally used at ICPP. The volume of the sample taken with a fluidic sampler depends on the motive pressure to the fluidic sampler, the sample bottle size, and the fluidic sampler jet characteristics. The fluidic sampler should be supplied with fluid at a motive pressure of 140--150 percent of the peak vacuum-producing motive pressure for the jet in the sampler. Fluidic transfer systems should be operated by emptying a full pumping chamber to nearly empty or empty during the pumping cycle; this maximizes the solution transfer rate.

  4. Defining And Characterizing Sample Representativeness For DWPF Melter Feed Samples

    Energy Technology Data Exchange (ETDEWEB)

    Shine, E. P.; Poirier, M. R.

    2013-10-29

    Representative sampling is important throughout the Defense Waste Processing Facility (DWPF) process, and the demonstrated success of the DWPF process to achieve glass product quality over the past two decades is a direct result of the quality of information obtained from the process. The objective of this report was to present sampling methods that the Savannah River Site (SRS) used to qualify waste being dispositioned at the DWPF. The goal was to emphasize the methodology, not a list of outcomes from those studies. This methodology includes proven methods for taking representative samples, the use of controlled analytical methods, and data interpretation and reporting that considers the uncertainty of all error sources. Numerous sampling studies were conducted during the development of the DWPF process and still continue to be performed in order to evaluate options for process improvement. Study designs were based on use of statistical tools applicable to the determination of uncertainties associated with the data needs. Successful designs are apt to be repeated, so this report chose only to include prototypic case studies that typify the characteristics of frequently used designs. Case studies have been presented for studying in-tank homogeneity, evaluating the suitability of sampler systems, determining factors that affect mixing and sampling, comparing the final waste glass product chemical composition and durability to that of the glass pour stream sample and other samples from process vessels, and assessing the uniformity of the chemical composition in the waste glass product. Many of these studies efficiently addressed more than one of these areas of concern associated with demonstrating sample representativeness and provide examples of statistical tools in use for DWPF. The time when many of these designs were implemented was in an age when the sampling ideas of Pierre Gy were not as widespread as they are today. Nonetheless, the engineers and

  5. Defining And Characterizing Sample Representativeness For DWPF Melter Feed Samples

    International Nuclear Information System (INIS)

    Shine, E. P.; Poirier, M. R.

    2013-01-01

    Representative sampling is important throughout the Defense Waste Processing Facility (DWPF) process, and the demonstrated success of the DWPF process to achieve glass product quality over the past two decades is a direct result of the quality of information obtained from the process. The objective of this report was to present sampling methods that the Savannah River Site (SRS) used to qualify waste being dispositioned at the DWPF. The goal was to emphasize the methodology, not a list of outcomes from those studies. This methodology includes proven methods for taking representative samples, the use of controlled analytical methods, and data interpretation and reporting that considers the uncertainty of all error sources. Numerous sampling studies were conducted during the development of the DWPF process and still continue to be performed in order to evaluate options for process improvement. Study designs were based on use of statistical tools applicable to the determination of uncertainties associated with the data needs. Successful designs are apt to be repeated, so this report chose only to include prototypic case studies that typify the characteristics of frequently used designs. Case studies have been presented for studying in-tank homogeneity, evaluating the suitability of sampler systems, determining factors that affect mixing and sampling, comparing the final waste glass product chemical composition and durability to that of the glass pour stream sample and other samples from process vessels, and assessing the uniformity of the chemical composition in the waste glass product. Many of these studies efficiently addressed more than one of these areas of concern associated with demonstrating sample representativeness and provide examples of statistical tools in use for DWPF. The time when many of these designs were implemented was in an age when the sampling ideas of Pierre Gy were not as widespread as they are today. Nonetheless, the engineers and

  6. Optimal time points sampling in pathway modelling.

    Science.gov (United States)

    Hu, Shiyan

    2004-01-01

Modelling cellular dynamics based on experimental data is at the heart of systems biology. Considerable progress has been made in dynamic pathway modelling as well as the related parameter estimation. However, few studies give consideration to the issue of optimal sampling time selection for parameter estimation. Time course experiments in molecular biology rarely produce large and accurate data sets, and the experiments involved are usually time consuming and expensive. Therefore, approximating parameters for models with only a few available sampling data points is of significant practical value. For signal transduction, the sampling intervals are usually not evenly distributed and are based on heuristics. In this paper, we investigate an approach to guide the process of selecting time points in an optimal way to minimize the variance of parameter estimates. In the method, we first formulate the problem as a nonlinear constrained optimization problem by maximum likelihood estimation. We then modify and apply a quantum-inspired evolutionary algorithm, which combines the advantages of both quantum computing and evolutionary computing, to solve the optimization problem. The new algorithm does not suffer from the morass of selecting good initial values or getting stuck in local optima, as usually accompanies conventional numerical optimization techniques. The simulation results indicate the soundness of the new method.
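One standard way to formalize "minimize the variance of parameter estimates" is D-optimal design: choose the time points that maximize the determinant of the Fisher information matrix built from the model's sensitivities. The sketch below does this exhaustively for a toy exponential-decay model; it is a baseline illustration of the objective, not the paper's quantum-inspired evolutionary algorithm:

```python
from itertools import combinations

import numpy as np

# Toy model: y(t) = A * exp(-k t), with sensitivities dy/dA and dy/dk
def sensitivities(t, A, k):
    dy_dA = np.exp(-k * t)
    dy_dk = -A * t * np.exp(-k * t)
    return np.column_stack([dy_dA, dy_dk])

def d_optimal_times(candidates, n_pick, A=1.0, k=0.5):
    """Exhaustively choose the n_pick time points whose Fisher information
    matrix J^T J has the largest determinant (smallest parameter variance)."""
    best, best_det = None, -np.inf
    for subset in combinations(candidates, n_pick):
        J = sensitivities(np.array(subset), A, k)
        det = np.linalg.det(J.T @ J)
        if det > best_det:
            best, best_det = subset, det
    return best

times = d_optimal_times(np.linspace(0.5, 10.0, 20), n_pick=2)
```

Exhaustive search is only viable for tiny designs; for realistic pathway models the same objective is handed to a global optimizer, which is where an evolutionary algorithm comes in.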

  7. AND/OR Importance Sampling

    OpenAIRE

    Gogate, Vibhav; Dechter, Rina

    2012-01-01

    The paper introduces AND/OR importance sampling for probabilistic graphical models. In contrast to importance sampling, AND/OR importance sampling caches samples in the AND/OR space and then extracts a new sample mean from the stored samples. We prove that AND/OR importance sampling may have lower variance than importance sampling; thereby providing a theoretical justification for preferring it over importance sampling. Our empirical evaluation demonstrates that AND/OR importance sampling is ...
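For reference, plain importance sampling, which AND/OR importance sampling refines by caching samples in the AND/OR space, can be sketched as:

```python
import math
import random

def importance_sampling_mean(f, target_pdf, proposal_pdf, draw, n=100_000):
    """Estimate E_p[f(X)] with samples from a proposal q, using the identity
    E_p[f] = E_q[f(X) * p(X) / q(X)]."""
    total = 0.0
    for _ in range(n):
        x = draw()
        total += f(x) * target_pdf(x) / proposal_pdf(x)
    return total / n

# Example: the mean of an Exp(1) variable, estimated through a Uniform(0, 10)
# proposal (the truncated tail beyond 10 contributes negligibly).
random.seed(0)
est = importance_sampling_mean(
    f=lambda x: x,
    target_pdf=lambda x: math.exp(-x),
    proposal_pdf=lambda x: 0.1,
    draw=lambda: random.uniform(0.0, 10.0),
)
# est is close to the true mean of 1
```

The variance of this estimator is what the AND/OR scheme attacks: by decomposing the graphical model and reusing cached sub-samples, one mean is extracted from many implicit full samples.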

  8. Hanford site transuranic waste sampling plan

    International Nuclear Information System (INIS)

    GREAGER, T.M.

    1999-01-01

    This sampling plan (SP) describes the selection of containers for sampling of homogeneous solids and soil/gravel and for visual examination of transuranic and mixed transuranic (collectively referred to as TRU) waste generated at the U.S. Department of Energy (DOE) Hanford Site. The activities described in this SP will be conducted under the Hanford Site TRU Waste Certification Program. This SP is designed to meet the requirements of the Transuranic Waste Characterization Quality Assurance Program Plan (CAO-94-1010) (DOE 1996a) (QAPP), site-specific implementation of which is described in the Hanford Site Transuranic Waste Characterization Program Quality Assurance Project Plan (HNF-2599) (Hanford 1998b) (QAPP). The QAPP defines the quality assurance (QA) requirements and protocols for TRU waste characterization activities at the Hanford Site. In addition, the QAPP identifies responsible organizations, describes required program activities, outlines sampling and analysis strategies, and identifies procedures for characterization activities. The QAPP identifies specific requirements for TRU waste sampling plans. Table 1-1 presents these requirements and indicates sections in this SP where these requirements are addressed

  9. Compressive sampling of polynomial chaos expansions: Convergence analysis and sampling strategies

    International Nuclear Information System (INIS)

    Hampton, Jerrad; Doostan, Alireza

    2015-01-01

Sampling orthogonal polynomial bases via Monte Carlo is of interest for uncertainty quantification of models with random inputs, using Polynomial Chaos (PC) expansions. It is known that bounding a probabilistic parameter, referred to as coherence, yields a bound on the number of samples necessary to identify coefficients in a sparse PC expansion via solution to an ℓ1-minimization problem. Utilizing results for orthogonal polynomials, we bound the coherence parameter for polynomials of Hermite and Legendre type under their respective natural sampling distribution. In both polynomial bases we identify an importance sampling distribution which yields a bound with weaker dependence on the order of the approximation. For more general orthonormal bases, we propose the coherence-optimal sampling: a Markov Chain Monte Carlo sampling, which directly uses the basis functions under consideration to achieve a statistical optimality among all sampling schemes with identical support. We demonstrate these different sampling strategies numerically in both high-order and high-dimensional, manufactured PC expansions. In addition, the quality of each sampling method is compared in the identification of solutions to two differential equations, one with a high-dimensional random input and the other with a high-order PC expansion. In both cases, the coherence-optimal sampling scheme leads to similar or considerably improved accuracy.
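A stripped-down illustration of the setting: sample a Hermite basis under its natural Gaussian distribution and recover the coefficients of a manufactured sparse expansion. Ordinary least squares stands in for the paper's ℓ1-minimization (which is what allows far fewer samples when the expansion is sparse); the construction is a sketch, not the paper's experiments:

```python
import math

import numpy as np

def hermite_basis(x, order):
    """Orthonormal probabilists' Hermite polynomials He_0..He_order,
    evaluated at x and normalized under the standard Gaussian weight."""
    H = np.zeros((len(x), order + 1))
    H[:, 0] = 1.0
    if order >= 1:
        H[:, 1] = x
    for n in range(1, order):
        H[:, n + 1] = x * H[:, n] - n * H[:, n - 1]  # three-term recurrence
    for n in range(order + 1):
        H[:, n] /= math.sqrt(math.factorial(n))      # ||He_n||^2 = n!
    return H

rng = np.random.default_rng(0)

# Manufactured sparse expansion: only He_1 and He_3 carry weight
true_c = np.array([0.0, 2.0, 0.0, 0.5, 0.0])

# Natural sampling distribution for the Hermite basis: standard Gaussian
x = rng.standard_normal(400)
Psi = hermite_basis(x, order=4)
u = Psi @ true_c

# Least-squares recovery of the coefficients; the paper instead solves an
# l1-minimization, which needs far fewer samples when the expansion is sparse
c_hat, *_ = np.linalg.lstsq(Psi, u, rcond=None)
```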

  10. Compressive sampling of polynomial chaos expansions: Convergence analysis and sampling strategies

    Science.gov (United States)

    Hampton, Jerrad; Doostan, Alireza

    2015-01-01

    Sampling orthogonal polynomial bases via Monte Carlo is of interest for uncertainty quantification of models with random inputs, using Polynomial Chaos (PC) expansions. It is known that bounding a probabilistic parameter, referred to as coherence, yields a bound on the number of samples necessary to identify coefficients in a sparse PC expansion via solution to an ℓ1-minimization problem. Utilizing results for orthogonal polynomials, we bound the coherence parameter for polynomials of Hermite and Legendre type under their respective natural sampling distribution. In both polynomial bases we identify an importance sampling distribution which yields a bound with weaker dependence on the order of the approximation. For more general orthonormal bases, we propose the coherence-optimal sampling: a Markov Chain Monte Carlo sampling, which directly uses the basis functions under consideration to achieve a statistical optimality among all sampling schemes with identical support. We demonstrate these different sampling strategies numerically in both high-order and high-dimensional, manufactured PC expansions. In addition, the quality of each sampling method is compared in the identification of solutions to two differential equations, one with a high-dimensional random input and the other with a high-order PC expansion. In both cases, the coherence-optimal sampling scheme leads to similar or considerably improved accuracy.

  11. Sampling trace organic compounds in water: a comparison of a continuous active sampler to continuous passive and discrete sampling methods.

    Science.gov (United States)

    Coes, Alissa L; Paretti, Nicholas V; Foreman, William T; Iverson, Jana L; Alvarez, David A

    2014-03-01

A continuous active sampling method was compared to continuous passive and discrete sampling methods for the sampling of trace organic compounds (TOCs) in water. Results from each method are compared and contrasted in order to provide information for future investigators to use while selecting appropriate sampling methods for their research. The continuous low-level aquatic monitoring (CLAM) sampler (C.I.Agent® Storm-Water Solutions) is a submersible, low flow-rate sampler that continuously draws water through solid-phase extraction media. CLAM samplers were deployed at two wastewater-dominated stream field sites in conjunction with the deployment of polar organic chemical integrative samplers (POCIS) and the collection of discrete (grab) water samples. All samples were analyzed for a suite of 69 TOCs. The CLAM and POCIS samples represent time-integrated samples that accumulate the TOCs present in the water over the deployment period (19-23 h for CLAM and 29 days for POCIS); the discrete samples represent only the TOCs present in the water at the time and place of sampling. Non-metric multi-dimensional scaling and cluster analysis were used to examine patterns in both TOC detections and relative concentrations between the three sampling methods. A greater number of TOCs were detected in the CLAM samples than in corresponding discrete and POCIS samples, but TOC concentrations in the CLAM samples were significantly lower than in the discrete and (or) POCIS samples. Thirteen TOCs of varying polarity were detected by all of the three methods. TOC detections and concentrations obtained by the three sampling methods, however, are dependent on multiple factors. This study found that stream discharge, constituent loading, and compound type all affected TOC concentrations detected by each method. In addition, TOC detections and concentrations were affected by the reporting limits, bias, recovery, and performance of each method. Published by Elsevier B.V.

  12. Sampling trace organic compounds in water: a comparison of a continuous active sampler to continuous passive and discrete sampling methods

    Science.gov (United States)

    Coes, Alissa L.; Paretti, Nicholas V.; Foreman, William T.; Iverson, Jana L.; Alvarez, David A.

    2014-01-01

A continuous active sampling method was compared to continuous passive and discrete sampling methods for the sampling of trace organic compounds (TOCs) in water. Results from each method are compared and contrasted in order to provide information for future investigators to use while selecting appropriate sampling methods for their research. The continuous low-level aquatic monitoring (CLAM) sampler (C.I.Agent® Storm-Water Solutions) is a submersible, low flow-rate sampler that continuously draws water through solid-phase extraction media. CLAM samplers were deployed at two wastewater-dominated stream field sites in conjunction with the deployment of polar organic chemical integrative samplers (POCIS) and the collection of discrete (grab) water samples. All samples were analyzed for a suite of 69 TOCs. The CLAM and POCIS samples represent time-integrated samples that accumulate the TOCs present in the water over the deployment period (19–23 h for CLAM and 29 days for POCIS); the discrete samples represent only the TOCs present in the water at the time and place of sampling. Non-metric multi-dimensional scaling and cluster analysis were used to examine patterns in both TOC detections and relative concentrations between the three sampling methods. A greater number of TOCs were detected in the CLAM samples than in corresponding discrete and POCIS samples, but TOC concentrations in the CLAM samples were significantly lower than in the discrete and (or) POCIS samples. Thirteen TOCs of varying polarity were detected by all of the three methods. TOC detections and concentrations obtained by the three sampling methods, however, are dependent on multiple factors. This study found that stream discharge, constituent loading, and compound type all affected TOC concentrations detected by each method. In addition, TOC detections and concentrations were affected by the reporting limits, bias, recovery, and performance of each method.

  13. Sampling soils for 137Cs using various field-sampling volumes

    International Nuclear Information System (INIS)

    Nyhan, J.W.; Schofield, T.G.; White, G.C.; Trujillo, G.

    1981-10-01

The sediments from a liquid effluent receiving area at the Los Alamos National Laboratory and soils from an intensive study area in the fallout pathway of Trinity were sampled for 137Cs using 25-, 500-, 2500-, and 12 500-cm³ field sampling volumes. A highly replicated sampling program was used to determine mean concentrations and inventories of 137Cs at each site, as well as estimates of spatial, aliquoting, and counting variance components of the radionuclide data. The sampling methods were also analyzed as a function of soil size fractions collected in each field sampling volume and of the total cost of the program for a given variation in the radionuclide survey results. Coefficients of variation (CV) of 137Cs inventory estimates ranged from 0.063 to 0.14 for Mortandad Canyon sediments, whereas CV values for Trinity soils ranged from 0.38 to 0.57. Spatial variance components of the 137Cs concentration data were usually found to be larger than either the aliquoting or counting variance estimates and were inversely related to field sampling volume at the Trinity intensive site. Subsequent optimization studies of the sampling schemes demonstrated that each aliquot should be counted once, and that only 2 to 4 aliquots out of as many as 30 collected need be assayed for 137Cs. The optimization studies showed that as sample costs increased to 45 man-hours of labor per sample, the variance of the mean 137Cs concentration decreased dramatically, but decreased very little with additional labor.
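The summary statistics above reduce to simple estimators; a sketch of the coefficient of variation and a one-way split of total variance into spatial and aliquoting components (illustrative, not the study's full nested design with counting variance):

```python
import numpy as np

def cv(values):
    """Coefficient of variation of replicate inventory estimates."""
    values = np.asarray(values, float)
    return values.std(ddof=1) / values.mean()

def variance_components(aliquots_by_location):
    """One-way ANOVA split of variance into spatial (between-location) and
    aliquoting (within-location) components.

    aliquots_by_location : (n_locations, n_aliquots) measurements
    """
    a = np.asarray(aliquots_by_location, float)
    n = a.shape[1]
    within = a.var(axis=1, ddof=1).mean()       # aliquoting variance
    between = a.mean(axis=1).var(ddof=1)        # variance of location means
    spatial = max(between - within / n, 0.0)    # method-of-moments estimator
    return spatial, within
```

When the spatial component dominates, as the study found, effort is better spent collecting more field locations than assaying more aliquots per sample, which is exactly the 2-to-4-aliquot recommendation above.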

  14. A gigahertz sampling transient analyzer

    International Nuclear Information System (INIS)

    Andrieu, F.; Balanca, C.; Bernet, J.M.; Lejeune, G.

    1975-01-01

The AN 800 equipment was designed for the purpose of digital conversion of fast signals. The recording device had to be located close to the sensor. The equipment had to be highly reliable and had to transmit its output signals before an electromagnetic pulse disturbance. The sampling approach, with its readiness for digitization, was selected as the more convenient.

  15. Choice of Sample Split in Out-of-Sample Forecast Evaluation

    DEFF Research Database (Denmark)

    Hansen, Peter Reinhard; Timmermann, Allan

    Out-of-sample tests of forecast performance depend on how a given data set is split into estimation and evaluation periods, yet no guidance exists on how to choose the split point. Empirical forecast evaluation results can therefore be difficult to interpret, particularly when several values ... while conversely the power of forecast evaluation tests is strongest with long out-of-sample periods. To deal with size distortions, we propose a test statistic that is robust to the effect of considering multiple sample split points. Empirical applications to predictability of stock returns and inflation demonstrate that out-of-sample forecast evaluation results can critically depend on how the sample split is determined.
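The sensitivity to the split point is easy to demonstrate numerically. The sketch below uses a hypothetical series and a simple recursive-mean forecast (not the authors' test statistic) to show that the out-of-sample MSE changes with the chosen split:

```python
def oos_mse(series, split):
    """Out-of-sample MSE of a recursive-mean forecast, with the first
    `split` observations forming the initial estimation window."""
    errors = []
    for t in range(split, len(series)):
        forecast = sum(series[:t]) / t  # estimated on all data before t
        errors.append((series[t] - forecast) ** 2)
    return sum(errors) / len(errors)

series = [1.0, 2.0, 1.5, 2.5, 2.0, 3.0, 2.5, 3.5]
for split in (3, 4, 5):  # different split points yield different MSEs
    print(split, round(oos_mse(series, split), 3))
```

Because each split point gives a different evaluation verdict, reporting results for a single, arbitrarily chosen split invites exactly the size distortions the paper addresses.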

  16. Spot the difference. Impact of different selection criteria on observed properties of passive galaxies in zCOSMOS-20k sample

    Science.gov (United States)

    Moresco, M.; Pozzetti, L.; Cimatti, A.; Zamorani, G.; Bolzonella, M.; Lamareille, F.; Mignoli, M.; Zucca, E.; Lilly, S. J.; Carollo, C. M.; Contini, T.; Kneib, J.-P.; Le Fèvre, O.; Mainieri, V.; Renzini, A.; Scodeggio, M.; Bardelli, S.; Bongiorno, A.; Caputi, K.; Cucciati, O.; de la Torre, S.; de Ravel, L.; Franzetti, P.; Garilli, B.; Iovino, A.; Kampczyk, P.; Knobel, C.; Kovač, K.; Le Borgne, J.-F.; Le Brun, V.; Maier, C.; Pelló, R.; Peng, Y.; Perez-Montero, E.; Presotto, V.; Silverman, J. D.; Tanaka, M.; Tasca, L.; Tresse, L.; Vergani, D.; Barnes, L.; Bordoloi, R.; Cappi, A.; Diener, C.; Koekemoer, A. M.; Le Floc'h, E.; López-Sanjuan, C.; McCracken, H. J.; Nair, P.; Oesch, P.; Scarlata, C.; Scoville, N.; Welikala, N.

    2013-10-01

    Aims: We present the analysis of photometric, spectroscopic, and morphological properties for differently selected samples of passive galaxies up to z = 1 extracted from the zCOSMOS-20k spectroscopic survey. This analysis intends to explore the dependence of galaxy properties on the selection criterion adopted, study the degree of contamination due to star-forming outliers, and provide a comparison between different commonly used selection criteria. This work is a first step to fully investigating the selection effects of passive galaxies for future massive surveys such as Euclid. Methods: We extracted from the zCOSMOS-20k catalog six different samples of passive galaxies, based on morphology (3336 "morphological" early-type galaxies), optical colors (4889 "red-sequence" galaxies and 4882 "red UVJ" galaxies), specific star-formation rate (2937 "quiescent" galaxies), a best fit to the observed spectral energy distribution (2603 "red SED" galaxies), and a criterion that combines morphological, spectroscopic, and photometric information (1530 "red & passive early-type galaxies"). For all the samples, we studied optical and infrared colors, morphological properties, specific star-formation rates (SFRs), and the equivalent widths of the residual emission lines; this analysis was performed as a function of redshift and stellar mass to inspect further possible dependencies. Results: We find that each passive galaxy sample displays a certain level of contamination due to blue/star-forming/nonpassive outliers. The morphological sample is the one with the highest percentage of contamination, with ~12-65% (depending on the mass range) of galaxies not located in the red sequence, ~25-80% of galaxies with a specific SFR up to ~25 times higher than the adopted definition of passive, and significant emission lines found in the median stacked spectra, at least for log (M/M⊙) contamination in color 10.25, very limited tails in sSFR, a median value ~20% higher than the

  17. Soil sampling

    International Nuclear Information System (INIS)

    Fortunati, G.U.; Banfi, C.; Pasturenzi, M.

    1994-01-01

    This study attempts to survey the problems associated with techniques and strategies of soil sampling. Keeping in mind the well defined objectives of a sampling campaign, the aim was to highlight the most important aspect of representativeness of samples as a function of the available resources. Particular emphasis was given to the techniques and particularly to a description of the many types of samplers which are in use. The procedures and techniques employed during the investigations following the Seveso accident are described. (orig.)

  18. On some common practices of systematic sampling

    OpenAIRE

    Zhang, Li-Chun

    2006-01-01

    With permission from Statistics Sweden. The original publication is available at http://www.jos.nu/Contents/jos_online.asp The article was also published in Statistics Norway's reprint series Særtrykk/Reprints no. 325. Systematic sampling is a widely used technique in survey sampling. It is easy to execute, whether the units are to be selected with equal probability or with probabilities proportional to auxiliary sizes. It can be very efficient if one manages to achieve a favourable stratification effect...

  19. Sample representativeness verification of the FADN CZ farm business sample

    Directory of Open Access Journals (Sweden)

    Marie Prášilová

    2011-01-01

    Full Text Available Sample representativeness verification is one of the key stages of statistical work. After joining the European Union, the Czech Republic also joined the Farm Accountancy Data Network (FADN) system of the Union. This is a sample of bodies and companies doing business in agriculture. Detailed production and economic data on the results of farming business are collected from that sample annually, and results for the entire population of the country's farms are then estimated and assessed. It is important, hence, that the sample be representative. Representativeness is to be assessed as to the number of farms included in the survey and also as to the degree to which the sample's measures and indices accord with those of the population. The paper deals with the special statistical techniques and methods of FADN CZ sample representativeness verification, including the necessary sample size statement procedure. The Czech farm population data have been obtained from the Czech Statistical Office data bank.

  20. Present status of NMCC and sample preparation method for bio-samples

    International Nuclear Information System (INIS)

    Futatsugawa, S.; Hatakeyama, S.; Saitou, S.; Sera, K.

    1993-01-01

    At NMCC (Nishina Memorial Cyclotron Center) we are conducting research on nuclear medicine PET (positron emission computed tomography) and PIXE (particle induced X-ray emission) analysis using a compactly designed small cyclotron. The NMCC facilities have been open to researchers of other institutions since April 1993. The present status of NMCC is described. Bio-samples (medical samples, plants, animals and environmental samples) have mainly been analyzed by PIXE at NMCC. Small amounts of bio-samples for PIXE are decomposed quickly and easily in a sealed PTFE (polytetrafluoroethylene) vessel with a microwave oven. This sample preparation method for bio-samples is also described. (author)

  1. Sampling methods for amphibians in streams in the Pacific Northwest.

    Science.gov (United States)

    R. Bruce Bury; Paul Stephen. Corn

    1991-01-01

    Methods describing how to sample aquatic and semiaquatic amphibians in small streams and headwater habitats in the Pacific Northwest are presented. We developed a technique that samples 10-meter stretches of selected streams, which was adequate to detect presence or absence of amphibian species and provided sample sizes statistically sufficient to compare abundance of...

  2. Mate-sampling costs and sexy sons.

    Science.gov (United States)

    Kokko, H; Booksmythe, I; Jennions, M D

    2015-01-01

    Costly female mating preferences for purely Fisherian male traits (i.e. sexual ornaments that are genetically uncorrelated with inherent viability) are not expected to persist at equilibrium. The indirect benefit of producing 'sexy sons' (Fisher process) disappears: in some models, the male trait becomes fixed; in others, a range of male trait values persist, but a larger trait confers no net fitness advantage because it lowers survival. Insufficient indirect selection to counter the direct cost of producing fewer offspring means that preferences are lost. The only well-cited exception assumes biased mutation on male traits. The above findings generally assume constant direct selection against female preferences (i.e. fixed costs). We show that if mate-sampling costs are instead derived based on an explicit account of how females acquire mates, an initially costly mating preference can coevolve with a male trait so that both persist in the presence or absence of biased mutation. Our models predict that empirically detecting selection at equilibrium will be difficult, even if selection was responsible for the location of the current equilibrium. In general, it appears useful to integrate mate sampling theory with models of genetic consequences of mating preferences: being explicit about the process by which individuals select mates can alter equilibria. © 2014 European Society For Evolutionary Biology.

  3. Reproducibility of NMR analysis of urine samples: impact of sample preparation, storage conditions, and animal health status.

    Science.gov (United States)

    Schreier, Christina; Kremer, Werner; Huber, Fritz; Neumann, Sindy; Pagel, Philipp; Lienemann, Kai; Pestel, Sabine

    2013-01-01

    Spectroscopic analysis of urine samples from laboratory animals can be used to predict the efficacy and side effects of drugs. This employs methods combining (1)H NMR spectroscopy with quantification of biomarkers or with multivariate data analysis. The most critical steps in data evaluation are analytical reproducibility of NMR data (collection, storage, and processing) and the health status of the animals, which may influence urine pH and osmolarity. We treated rats with a solvent, a diuretic, or a nephrotoxicant and collected urine samples. Samples were titrated to pH 3 to 9, or salt concentrations increased up to 20-fold. The effects of storage conditions and freeze-thaw cycles were monitored. Selected metabolites and multivariate data analysis were evaluated after (1)H NMR spectroscopy. We showed that variation of pH from 3 to 9 and increases in osmolarity up to 6-fold had no effect on the quantification of the metabolites or on multivariate data analysis. Storage led to changes after 14 days at 4°C or after 12 months at -20°C, independent of sample composition. Multiple freeze-thaw cycles did not affect data analysis. Reproducibility of NMR measurements is not dependent on sample composition under physiological or pathological conditions.

  4. Reproducibility of NMR Analysis of Urine Samples: Impact of Sample Preparation, Storage Conditions, and Animal Health Status

    Directory of Open Access Journals (Sweden)

    Christina Schreier

    2013-01-01

    Full Text Available Introduction. Spectroscopic analysis of urine samples from laboratory animals can be used to predict the efficacy and side effects of drugs. This employs methods combining 1H NMR spectroscopy with quantification of biomarkers or with multivariate data analysis. The most critical steps in data evaluation are analytical reproducibility of NMR data (collection, storage, and processing) and the health status of the animals, which may influence urine pH and osmolarity. Methods. We treated rats with a solvent, a diuretic, or a nephrotoxicant and collected urine samples. Samples were titrated to pH 3 to 9, or salt concentrations increased up to 20-fold. The effects of storage conditions and freeze-thaw cycles were monitored. Selected metabolites and multivariate data analysis were evaluated after 1H NMR spectroscopy. Results. We showed that variation of pH from 3 to 9 and increases in osmolarity up to 6-fold had no effect on the quantification of the metabolites or on multivariate data analysis. Storage led to changes after 14 days at 4°C or after 12 months at −20°C, independent of sample composition. Multiple freeze-thaw cycles did not affect data analysis. Conclusion. Reproducibility of NMR measurements is not dependent on sample composition under physiological or pathological conditions.

  5. Feasibility of self-sampled dried blood spot and saliva samples sent by mail in a population-based study

    International Nuclear Information System (INIS)

    Sakhi, Amrit Kaur; Bastani, Nasser Ezzatkhah; Ellingjord-Dale, Merete; Gundersen, Thomas Erik; Blomhoff, Rune; Ursin, Giske

    2015-01-01

    In large epidemiological studies it is often challenging to obtain biological samples. Self-sampling by study participants using dried blood spots (DBS) technique has been suggested to overcome this challenge. DBS is a type of biosampling where blood samples are obtained by a finger-prick lancet, blotted and dried on filter paper. However, the feasibility and efficacy of collecting DBS samples from study participants in large-scale epidemiological studies is not known. The aim of the present study was to test the feasibility and response rate of collecting self-sampled DBS and saliva samples in a population–based study of women above 50 years of age. We determined response proportions, number of phone calls to the study center with questions about sampling, and quality of the DBS. We recruited women through a study conducted within the Norwegian Breast Cancer Screening Program. Invitations, instructions and materials were sent to 4,597 women. The data collection took place over a 3 month period in the spring of 2009. Response proportions for the collection of DBS and saliva samples were 71.0% (3,263) and 70.9% (3,258), respectively. We received 312 phone calls (7% of the 4,597 women) with questions regarding sampling. Of the 3,263 individuals that returned DBS cards, 3,038 (93.1%) had been packaged and shipped according to instructions. A total of 3,032 DBS samples were sufficient for at least one biomarker analysis (i.e. 92.9% of DBS samples received by the laboratory). 2,418 (74.1%) of the DBS cards received by the laboratory were filled with blood according to the instructions (i.e. 10 completely filled spots with up to 7 punches per spot for up to 70 separate analyses). To assess the quality of the samples, we selected and measured two biomarkers (carotenoids and vitamin D). The biomarker levels were consistent with previous reports. Collecting self-sampled DBS and saliva samples through the postal services provides a low cost, effective and feasible

  6. The ATLAS3D project - I. A volume-limited sample of 260 nearby early-type galaxies: science goals and selection criteria

    Science.gov (United States)

    Cappellari, Michele; Emsellem, Eric; Krajnović, Davor; McDermid, Richard M.; Scott, Nicholas; Verdoes Kleijn, G. A.; Young, Lisa M.; Alatalo, Katherine; Bacon, R.; Blitz, Leo; Bois, Maxime; Bournaud, Frédéric; Bureau, M.; Davies, Roger L.; Davis, Timothy A.; de Zeeuw, P. T.; Duc, Pierre-Alain; Khochfar, Sadegh; Kuntschner, Harald; Lablanche, Pierre-Yves; Morganti, Raffaella; Naab, Thorsten; Oosterloo, Tom; Sarzi, Marc; Serra, Paolo; Weijmans, Anne-Marie

    2011-05-01

    The ATLAS3D project is a multiwavelength survey combined with a theoretical modelling effort. The observations span from the radio to the millimetre and optical, and provide multicolour imaging, two-dimensional kinematics of the atomic (H I), molecular (CO) and ionized gas (Hβ, [O III] and [N I]), together with the kinematics and population of the stars (Hβ, Fe5015 and Mg b), for a carefully selected, volume-limited (1.16 × 105 Mpc3) sample of 260 early-type (elliptical E and lenticular S0) galaxies (ETGs). The models include semi-analytic, N-body binary mergers and cosmological simulations of galaxy formation. Here we present the science goals for the project and introduce the galaxy sample and the selection criteria. The sample consists of nearby (D 15°) morphologically selected ETGs extracted from a parent sample of 871 galaxies (8 per cent E, 22 per cent S0 and 70 per cent spirals) brighter than MK statistically representative of the nearby galaxy population. We present the size-luminosity relation for the spirals and ETGs and show that the ETGs in the ATLAS3D sample define a tight red sequence in a colour-magnitude diagram, with few objects in the transition from the blue cloud. We describe the strategy of the SAURON integral field observations and the extraction of the stellar kinematics with the pPXF method. We find typical 1σ errors of ΔV≈ 6 km s-1, Δσ≈ 7 km s-1, Δh3≈Δh4≈ 0.03 in the mean velocity, the velocity dispersion and Gauss-Hermite (GH) moments for galaxies with effective dispersion σe≳ 120 km s-1. For galaxies with lower σe (≈40 per cent of the sample) the GH moments are gradually penalized by pPXF towards zero to suppress the noise produced by the spectral undersampling and only V and σ can be measured. We give an overview of the characteristics of the other main data sets already available for our sample and of the ongoing modelling projects.

  7. Evaluation of pump pulsation in respirable size-selective sampling: Part III. Investigation of European standard methods.

    Science.gov (United States)

    Soo, Jhy-Charm; Lee, Eun Gyung; Lee, Larry A; Kashon, Michael L; Harper, Martin

    2014-10-01

    Lee et al. (Evaluation of pump pulsation in respirable size-selective sampling: part I. Pulsation measurements. Ann Occup Hyg 2014a;58:60-73) introduced an approach to measure pump pulsation (PP) using a real-world sampling train, while the European Standards (EN) (EN 1232-1997 and EN 12919-1999) suggest measuring PP using a resistor in place of the sampler. The goal of this study is to characterize PP according to both EN methods and to determine the relationship of PP between the published method (Lee et al., 2014a) and the EN methods. Additional test parameters were investigated to determine whether the test conditions suggested by the EN methods were appropriate for measuring pulsations. Experiments were conducted using a factorial combination of personal sampling pumps (six medium- and two high-volumetric flow rate pumps), back pressures (six medium- and seven high-flow rate pumps), resistors (two types), tubing lengths between a pump and resistor (60 and 90 cm), and different flow rates (2 and 2.5 l min(-1) for the medium- and 4.4, 10, and 11.2 l min(-1) for the high-flow rate pumps). The selection of sampling pumps and the ranges of back pressure were based on measurements obtained in the previous study (Lee et al., 2014a). Among six medium-flow rate pumps, only the Gilian5000 and the Apex IS conformed to the 10% criterion specified in EN 1232-1997. Although the AirChek XR5000 exceeded the 10% limit, the average PP (10.9%) was close to the criterion. One high-flow rate pump, the Legacy (PP=8.1%), conformed to the 10% criterion in EN 12919-1999, while the Elite12 did not (PP=18.3%). Conducting supplemental tests with additional test parameters beyond those used in the two subject EN standards did not strengthen the characterization of PPs. 
For the selected test conditions, a linear regression model [PPEN=0.014+0.375×PPNIOSH (adjusted R2=0.871)] was developed to determine the PP relationship between the published method (Lee et al., 2014a) and the EN methods
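Assuming both pulsation values are expressed as fractions of mean flow (the abstract does not state the units explicitly), the reported regression can be applied directly to convert a real-world (NIOSH-style) pulsation measurement to its EN-method equivalent:

```python
def pp_en_from_niosh(pp_niosh):
    """EN-method pump pulsation predicted from a real-world sampling-train
    measurement, using the regression reported in the abstract:
    PP_EN = 0.014 + 0.375 * PP_NIOSH.
    Units assumed to be fractions (0.10 == 10%); adjusted R^2 = 0.871."""
    return 0.014 + 0.375 * pp_niosh

# e.g. the AirChek XR5000's 10.9% average pulsation under the published method
print(round(pp_en_from_niosh(0.109), 4))  # → 0.0549
```

The slope below unity suggests the resistor-based EN setup systematically understates the pulsation seen by an actual sampler, which is consistent with the authors' case for testing with a real-world sampling train.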

  8. Sample collection and sample analysis plan in support of the 105-C/190-C concrete and soil sampling activities

    International Nuclear Information System (INIS)

    Marske, S.G.

    1996-07-01

    This sampling and analysis plan describes the sample collection and sample analysis in support of the 105-C water tunnels and 190-C main pumphouse concrete and soil sampling activities. These analytical data will be used to identify the radiological contamination and presence of hazardous materials to support the decontamination and disposal activities

  9. Sample diversity and premise typicality in inductive reasoning: evidence for developmental change.

    Science.gov (United States)

    Rhodes, Marjorie; Brickman, Daniel; Gelman, Susan A

    2008-08-01

    Evaluating whether a limited sample of evidence provides a good basis for induction is a critical cognitive task. We hypothesized that whereas adults evaluate the inductive strength of samples containing multiple pieces of evidence by attending to the relations among the exemplars (e.g., sample diversity), six-year-olds would attend to the degree to which each individual exemplar in a sample independently appears informative (e.g., premise typicality). To test these hypotheses, participants were asked to select between diverse and non-diverse samples to help them learn about basic-level animal categories. Across various between-subject conditions (N=133), we varied the typicality present in the diverse and non-diverse samples. We found that adults reliably chose to examine diverse over non-diverse samples, regardless of exemplar typicality; six-year-olds preferred to examine samples containing typical exemplars, regardless of sample diversity; and nine-year-olds were somewhere in the midst of this developmental transition.

  10. Sample size of the reference sample in a case-augmented study.

    Science.gov (United States)

    Ghosh, Palash; Dewanji, Anup

    2017-05-01

    The case-augmented study, in which a case sample is augmented with a reference (random) sample from the source population with only covariates information known, is becoming popular in different areas of applied science such as pharmacovigilance, ecology, and econometrics. In general, the case sample is available from some source (for example, hospital database, case registry, etc.); however, the reference sample is required to be drawn from the corresponding source population. The required minimum size of the reference sample is an important issue in this regard. In this work, we address the minimum sample size calculation and discuss related issues. Copyright © 2017 John Wiley & Sons, Ltd.
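The paper's case-augmented derivation is not reproduced here; as a point of comparison, the standard textbook minimum-sample-size formula for estimating a population proportion within a given margin of error is a few lines of code:

```python
import math

def sample_size_proportion(p, margin, z=1.96):
    """Minimum n so that a proportion p is estimated within +/- margin
    at ~95% confidence: n = z^2 * p * (1 - p) / margin^2.
    (Standard textbook formula, not the paper's case-augmented method.)"""
    return math.ceil(z * z * p * (1 - p) / margin ** 2)

# Worst case p = 0.5, 5% margin of error
print(sample_size_proportion(0.5, 0.05))  # → 385
```

Halving the margin roughly quadruples the required n, which illustrates the general precision/cost trade-off that any reference-sample-size calculation must balance.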

  11. Water pollution screening by large-volume injection of aqueous samples and application to GC/MS analysis of a river Elbe sample

    Energy Technology Data Exchange (ETDEWEB)

    Mueller, S.; Efer, J.; Engewald, W. [Leipzig Univ. (Germany). Inst. fuer Analytische Chemie

    1997-03-01

    The large-volume injection of aqueous samples into a programmed temperature vaporizer (PTV) injector was used successfully for the target and non-target analysis of real samples. In this still rarely applied method, e.g., 1 mL of the water sample to be analyzed is slowly injected directly into the PTV. The vaporized water is eliminated through the split vent. The analytes are concentrated onto an adsorbent inside the insert and subsequently thermally desorbed. The capability of the method is demonstrated using a sample from the river Elbe. By coupling this method with a mass selective detector in SIM mode (target analysis), the method allows the determination of pollutants at concentrations down to 0.01 μg/L. Furthermore, PTV enrichment is an effective and time-saving method for non-target analysis in SCAN mode. In a sample from the river Elbe over 20 compounds were identified. (orig.) With 3 figs., 2 tabs.

  12. Effect of selective logging on genetic diversity and gene flow in Cariniana legalis sampled from a cacao agroforestry system.

    Science.gov (United States)

    Leal, J B; Santos, R P; Gaiotto, F A

    2014-01-28

    The fragments of the Atlantic Forest of southern Bahia have a long history of intense logging and selective cutting. Some tree species, such as jequitibá rosa (Cariniana legalis), have experienced a reduction in their populations with respect to both area and density. To evaluate the possible effects of selective logging on genetic diversity, gene flow, and spatial genetic structure, 51 C. legalis individuals were sampled, representing the total remaining population from the cacao agroforestry system. A total of 120 alleles were observed from the 11 microsatellite loci analyzed. The average observed heterozygosity (0.486) was less than the expected heterozygosity (0.721), indicating a loss of genetic diversity in this population. A high fixation index (FIS = 0.325) was found, which is possibly due to a reduction in population size, resulting in increased mating among relatives. The maximum (1055 m) and minimum (0.095 m) distances traveled by pollen or seeds were inferred based on paternity tests. We found 36.84% of unique parents among all sampled seedlings. The progenitors of the remaining seedlings (63.16%) were most likely out of the sampled area. Positive and significant spatial genetic structure was identified in this population among classes 10 to 30 m away with an average coancestry coefficient between pairs of individuals of 0.12. These results suggest that the agroforestry system of cacao cultivation is contributing to maintaining levels of diversity and gene flow in the studied population, thus minimizing the effects of selective logging.

  13. Distribution of Gas Phase Polycyclic Aromatic Hydrocarbons (PAHs in Selected Indoor and Outdoor Air Samples of Malaysia: a Case Study in Serdang, Selangor and Bachang, Malacca

    Directory of Open Access Journals (Sweden)

    Haris Hafizal Abd Hamid

    2017-07-01

    Full Text Available The distribution of 10 polycyclic aromatic hydrocarbons (PAHs) in the gas phase of air from selected indoor and outdoor areas of Selangor and Malacca, Malaysia, has been investigated. A locally designed Semi Permeable Membrane Device (SPMD) was applied for passive air sampling for 37 days at selected locations. Cleanup was carried out with Gas Purge - Micro Syringe Extraction (GP-MSE) and the final analysis used Gas Chromatography-Mass Spectrometry (GC-MS). In this study, 6 indoor and 12 outdoor locations were selected for air sampling. A total of 10 PAH compounds (Ʃ10PAHs) were determined in the range of 0.218 ng/m3 - 1.692 ng/m3 and 0.378 ng/m3 - 1.492 ng/m3 in outdoor and indoor samples, respectively. In the outdoor samples, locations such as near a petrol station and heavy traffic showed the maximum levels of Ʃ10PAHs, while rooftop samples showed the lowest Ʃ10PAHs. The distribution of gas-phase Ʃ10PAHs was influenced by vehicular emission. Low molecular weight (LMW) compounds (2-3 rings) were dominant in all samples (>70%), indicating that the SPMD successfully sampled the gas phase of the air.

  14. Generalization of the Ewens sampling formula to arbitrary fitness landscapes.

    Directory of Open Access Journals (Sweden)

    Pavel Khromov

    Full Text Available In considering evolution of transcribed regions, regulatory sequences, and other genomic loci, we are often faced with a situation in which the number of allelic states greatly exceeds the size of the population. In this limit, the population eventually adopts a steady state characterized by mutation-selection-drift balance. Although new alleles continue to be explored through mutation, the statistics of the population, and in particular the probabilities of seeing specific allelic configurations in samples taken from the population, do not change with time. In the absence of selection, the probabilities of allelic configurations are given by the Ewens sampling formula, widely used in population genetics to detect deviations from neutrality. Here we develop an extension of this formula to arbitrary fitness distributions. Although our approach is general, we focus on the class of fitness landscapes, inspired by recent high-throughput genotype-phenotype maps, in which alleles can be in several distinct phenotypic states. This class of landscapes yields sampling probabilities that are computationally more tractable and can form a basis for inference of selection signatures from genomic data. Using an efficient numerical implementation of the sampling probabilities, we demonstrate that, for a sizable range of mutation rates and selection coefficients, the steady-state allelic diversity is not neutral. Therefore, it may be used to infer selection coefficients, as well as other evolutionary parameters from population data. We also carry out numerical simulations to challenge various approximations involved in deriving our sampling formulas, such as the infinite-allele limit and the "full connectivity" assumption inherent in the Ewens theory, in which each allele can mutate into any other allele. We find that, at least for the specific numerical examples studied, our theory remains sufficiently accurate even if these assumptions are relaxed. 
Thus our
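For reference, the classical neutral-case Ewens sampling formula that the paper generalizes can be evaluated directly. A minimal sketch, where the configuration is given as a mapping from multiplicity j to the number of alleles seen exactly j times:

```python
import math

def ewens_probability(config, theta):
    """Ewens sampling formula: probability of the allelic configuration
    `config`, where config[j] = number of alleles observed exactly j times.
    P = n! / theta^(n) * prod_j theta^{a_j} / (j^{a_j} * a_j!),
    with theta^(n) the rising factorial theta(theta+1)...(theta+n-1)."""
    n = sum(j * aj for j, aj in config.items())
    rising = math.prod(theta + k for k in range(n))
    prob = math.factorial(n) / rising
    for j, aj in config.items():
        prob *= theta ** aj / (j ** aj * math.factorial(aj))
    return prob

# n = 3 samples, all carrying the same allele; for theta = 1 this equals 1/3
print(ewens_probability({3: 1}, theta=1.0))
```

At theta = 1 the formula reduces to the cycle-type distribution of a uniform random permutation, which provides a convenient sanity check: the three possible configurations of n = 3 have probabilities 1/6, 1/2, and 1/3, summing to one.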

  15. Large sample neutron activation analysis of a reference inhomogeneous sample

    International Nuclear Information System (INIS)

    Vasilopoulou, T.; Athens National Technical University, Athens; Tzika, F.; Stamatelatos, I.E.; Koster-Ammerlaan, M.J.J.

    2011-01-01

    A benchmark experiment was performed for Neutron Activation Analysis (NAA) of a large inhomogeneous sample. The reference sample was developed in-house and consisted of a SiO2 matrix and an Al-Zn alloy 'inhomogeneity' body. Monte Carlo simulations were employed to derive appropriate correction factors for neutron self-shielding during irradiation as well as self-attenuation of gamma rays and sample geometry during counting. The large sample neutron activation analysis (LSNAA) results were compared against reference values and the trueness of the technique was evaluated. An agreement within ±10% was observed between LSNAA and reference elemental mass values, for all matrix and inhomogeneity elements except samarium, provided that the inhomogeneity body was fully simulated. However, in cases where the inhomogeneity was treated as not known, the results showed a reasonable agreement for most matrix elements, while large discrepancies were observed for the inhomogeneity elements. This study provided a quantification of the uncertainties associated with inhomogeneity in large sample analysis and contributed to the identification of the needs for future development of LSNAA facilities for analysis of inhomogeneous samples. (author)

  16. Theory of sampling and its application in tissue based diagnosis

    Directory of Open Access Journals (Sweden)

    Kayser Gian

    2009-02-01

    Full Text Available Abstract Background A general theory of sampling and its application in tissue based diagnosis is presented. Sampling is defined as extraction of information from certain limited spaces and its transformation into a statement or measure that is valid for the entire (reference) space. The procedure should be reproducible in time and space, i.e. give the same results when applied under similar circumstances. Sampling includes two different aspects, the procedure of sample selection and the efficiency of its performance. The practical performance of sample selection focuses on search for localization of specific compartments within the basic space, and search for presence of specific compartments. Methods When a sampling procedure is applied in diagnostic processes, two different procedures can be distinguished: (I) the evaluation of the diagnostic significance of a certain object, which is the probability that the object can be grouped into a certain diagnosis, and (II) the probability to detect these basic units. Sampling can be performed without or with external knowledge, such as size of searched objects, neighbourhood conditions, spatial distribution of objects, etc. If the sample size is much larger than the object size, the application of a translation invariant transformation results in Krige's formula, which is widely used in search for ores. Usually, sampling is performed in a series of area (space) selections of identical size. The size can be defined in relation to the reference space or according to interspatial relationship. The first method is called random sampling, the second stratified sampling. Results Random sampling does not require knowledge about the reference space, and is used to estimate the number and size of objects. Estimated features include area (volume) fraction, numerical, boundary and surface densities. Stratified sampling requires the knowledge of objects (and their features) and evaluates spatial features in relation to
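The contrast between random and stratified sampling described in this abstract can be sketched in a few lines; the stratum labels below are hypothetical, and `random.sample` draws without replacement:

```python
import random

def simple_random_sample(population, n, seed=0):
    """Random sampling: draw n units with no knowledge of the reference space."""
    rng = random.Random(seed)
    return rng.sample(population, n)

def stratified_sample(strata, n_per_stratum, seed=0):
    """Stratified sampling: draw the same number of units from each
    predefined compartment (stratum) of the reference space."""
    rng = random.Random(seed)
    return {name: rng.sample(units, n_per_stratum)
            for name, units in strata.items()}

population = list(range(100))
# Hypothetical compartments of a tissue section
strata = {"tumor": list(range(0, 50)), "stroma": list(range(50, 100))}
print(simple_random_sample(population, 5))
print(stratified_sample(strata, 3))
```

As the abstract notes, the stratified variant requires prior knowledge of the compartments, but in exchange it guarantees that every compartment is represented in the sample.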

  17. Determination of selected quaternary ammonium compounds by liquid chromatography with mass spectrometry. Part II. Application to sediment and sludge samples in Austria

    International Nuclear Information System (INIS)

    Martinez-Carballo, Elena; Gonzalez-Barreiro, Carmen; Sitka, Andrea; Kreuzinger, Norbert; Scharf, Sigrid; Gans, Oliver

    2007-01-01

    Soxhlet extraction and high-performance liquid chromatography (HPLC) coupled to tandem mass spectrometry detection (MS/MS) were used for the determination of selected quaternary ammonium compounds (QACs) in solid samples. The method was applied to the determination of alkyl benzyl, dialkyl and trialkyl quaternary ammonium compounds in sediment and sludge samples in Austria. The overall method quantification limits range from 0.6 to 3 μg/kg for sediments and from 2 to 5 μg/kg for sewage sludges. Mean recoveries between 67% and 95% are achieved. In general, sediments were especially contaminated by the C12-chain benzalkonium chloride (BAC-C12) as well as by the long C-chain dialkyldimethylammonium chloride (DDAC-C18), with maximum concentrations of 3.6 mg/kg and 2.1 mg/kg, respectively. Maxima of 27 mg/kg for DDAC-C10, 25 mg/kg for BAC-C12 and 23 mg/kg for BAC-C14 were determined for sludge samples. The sums of the 12 selected target compounds range from 22 mg/kg to 103 mg/kg in the sludge samples. - The presence of quaternary ammonium surfactants in sediment and sludge samples is reported for the first time in Austria

  18. Influence of sampling depth and post-sampling analysis time on the ...

    African Journals Online (AJOL)

    Bacteriological analysis was carried out for samples taken at water depth and at 1, 6, 12 and 24 hours post-sampling. It was observed that the total and faecal coliform bacteria were significantly higher in the 3 m water depth samples than in the surface water samples (ANOVA, F = 59.41, 26.751, 9.82 (T.C); 46.41, 26.81, ...

  19. DNA Sampling Hook

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The DNA Sampling Hook is a significant improvement on a method of obtaining a tissue sample from a live fish in situ from an aquatic environment. A tissue sample...

  20. Accretion Properties of a Sample of Hard X-Ray (<60 keV) Selected Seyfert 1 Galaxies

    Science.gov (United States)

    Wang, J.; Mao, Y. F.; Wei, J. Y.

    2009-02-01

    We examine the accretion properties of a sample of 42 hard (3-60 keV) X-ray selected nearby broad-line active galactic nuclei (AGNs). The energy range of the sample is harder than that usually used in similar previous studies. These AGNs are mainly compiled from the RXTE All Sky Survey, and complemented by the released INTEGRAL AGN catalog. The black hole masses, AGN bolometric luminosities, and Eddington ratios are derived from their optical spectra in terms of the broad Hβ emission line. The tight correlation between the hard X-ray (3-20 keV) and bolometric/line luminosity is well identified in our sample. Also identified is a strong inverse Baldwin relationship of the Hβ emission line. In addition, all of these hard X-ray AGNs are biased toward luminous objects with a high Eddington ratio (mostly between 0.01 and 0.1) and a low column density (…Simbol-X, and NeXT). Finally, the correlation between RFe (= optical Fe II/Hβ) and disk temperature as assessed by T ∝ (L/L_Edd)M_BH^(-1) leads us to suggest that the strength of the Fe II emission is mainly determined by the shape of the ionizing spectrum.

  1. Why sample selection matters in exploratory factor analysis: implications for the 12-item World Health Organization Disability Assessment Schedule 2.0.

    Science.gov (United States)

    Gaskin, Cadeyrn J; Lambert, Sylvie D; Bowe, Steven J; Orellana, Liliana

    2017-03-11

    Sample selection can substantially affect the solutions generated using exploratory factor analysis. Validation studies of the 12-item World Health Organization (WHO) Disability Assessment Schedule 2.0 (WHODAS 2.0) have generally involved samples in which substantial proportions of people had no, or minimal, disability. With the WHODAS 2.0 oriented towards measuring disability across six life domains (cognition, mobility, self-care, getting along, life activities, and participation in society), performing factor analysis with samples of people with disability may be more appropriate. We determined the influence of the sampling strategy on (a) the number of factors extracted and (b) the factor structure of the WHODAS 2.0. Using data from adults aged 50+ from the six countries in Wave 1 of the WHO's longitudinal Study on global AGEing and adult health (SAGE), we repeatedly selected samples (n = 750) using two strategies: (1) simple random sampling that reproduced nationally representative distributions of WHODAS 2.0 summary scores for each country (i.e., positively skewed distributions with many zero scores indicating the absence of disability), and (2) stratified random sampling with weights designed to obtain approximately symmetric distributions of summary scores for each country (i.e., predominantly including people with varying degrees of disability). Samples with skewed distributions typically produced one-factor solutions, except for the two countries with the lowest percentages of zero scores, in which the majority of samples produced two factors. Samples with approximately symmetric distributions generally produced two- or three-factor solutions. In the two-factor solutions, the getting along domain items loaded on one factor (commonly with a cognition domain item), with the remaining items loading on a second factor. In the three-factor solutions, the getting along and self-care domain items loaded separately on two factors and three other domains
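The study's second strategy, reweighting toward a symmetric score distribution, can be mimicked with a small sketch. The strata cut-points and toy scores below are invented for illustration; they are not the SAGE weights or the actual WHODAS 2.0 strata.

```python
import random

def stratified_resample(scores, n, edges, seed=0):
    """Draw n observations with equal quotas per score stratum, flattening
    a skewed distribution (sampling with replacement inside each stratum)."""
    rng = random.Random(seed)
    strata = {}
    for s in scores:
        k = sum(s >= e for e in edges)   # index of the stratum s falls in
        strata.setdefault(k, []).append(s)
    per = n // len(strata)
    sample = []
    for st in strata.values():
        sample.extend(rng.choices(st, k=per))
    return sample

# skewed toy scores: many zeros (no disability), few high scores
scores = [0] * 500 + [10] * 200 + [30] * 100 + [60] * 50
flat = stratified_resample(scores, 750, edges=[1, 20, 45])
```

The skewed input is 50% zeros, but the resample contains equal counts from each stratum, analogous to the distributions that produced two- and three-factor solutions in the study.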

  2. Laboratory Sampling Guide

    Science.gov (United States)

    2012-05-11

    environment, and by ingestion of foodstuffs that have incorporated C-14 by photosynthesis. Like tritium, C-14 is a very low energy beta emitter and is... bacterial growth and to minimize development of solids in the sample. • Properly identify each sample container with name, SSN, and collection start and... sampling in the same cardboard carton. The sample may be kept cool or frozen during collection to control odor and bacterial growth. • Once

  3. The size selectivity of the main body of a sampling pelagic pair trawl in freshwater reservoirs during the night

    Czech Academy of Sciences Publication Activity Database

    Říha, Milan; Jůza, Tomáš; Prchalová, Marie; Mrkvička, Tomáš; Čech, Martin; Draštík, Vladislav; Muška, Milan; Kratochvíl, Michal; Peterka, Jiří; Tušer, Michal; Vašek, Mojmír; Kubečka, Jan

    2012-01-01

    Roč. 127, September (2012), s. 56-60 ISSN 0165-7836 R&D Projects: GA MZe(CZ) QH81046 Institutional support: RVO:60077344 Keywords : quantitative sampling * gear selectivity * trawl * reservoirs Subject RIV: GL - Fishing Impact factor: 1.695, year: 2012

  4. Dietary trace element intakes of a selected sample of Canadian elderly women

    International Nuclear Information System (INIS)

    Gibson, R.S.; MacDonald, A.C.; Martinez, O.B.

    1984-01-01

    Energy and selected trace element intakes of a sample of 90 noninstitutionalized Canadian women (mean age 66.2 +/- 6.2 years) living in a university community and consuming self-selected diets were assessed by chemical analysis of one-day duplicate diets and via 1-day dietary records collected by the subjects. Mean gross energy intake (determined via bomb calorimetry) was 6.0 +/- 2.4 MJ (1435 +/- 580 kcal), and mean intakes of Cu and Mn (determined via atomic absorption spectrophotometry) were 1.2 +/- 0.6 mg and 3.8 +/- 2.1 mg/day, respectively. Instrumental neutron activation analysis was used for Cr (median = 77.4 μg/day), Se (median = 69.6 μg/day), Zn (mean +/- SD = 7.7 +/- 3.6 mg/day), Ag (median = 26.9 μg/day), Cs (median = 4.8 μg/day), Rb (median = 1.6 mg/day), Sb (median = 1.8 μg/day), and Sc (median = 0.3 μg/day). Dietary intakes of Cr, Mn and Se for the majority of the subjects fell within the US safe and adequate range. In contrast, a high proportion of subjects had apparently low intakes of dietary Cu and Zn in relation to current US dietary recommendations

  5. Sampling Development

    Science.gov (United States)

    Adolph, Karen E.; Robinson, Scott R.

    2011-01-01

    Research in developmental psychology requires sampling at different time points. Accurate depictions of developmental change provide a foundation for further empirical studies and theories about developmental mechanisms. However, overreliance on widely spaced sampling intervals in cross-sectional and longitudinal designs threatens the validity of…

  6. Differences between Internet samples and conventional samples of men who have sex with men: implications for research and HIV interventions.

    Science.gov (United States)

    Ross, M W; Tikkanen, R; Månsson, S A

    2000-09-01

    The Internet is becoming a new erotic oasis for obtaining sex online or in person. We reviewed the literature on cybersex and compared differences in data from samples of homosexually active men obtained on identical questionnaires from a conventional written questionnaire, distributed through the mailing and contact lists of a large national gay organization in Sweden, and through the same organization's website and chat room. A total of 716 written questionnaires and 678 Internet questionnaires were obtained. The Internet sample was younger, more likely to live in small towns or cities, live with parents or a girlfriend, and have lower formal education. They are less likely to have previous sexual experience solely with other men (one in three of the Internet sample vs. 1 in 14 of the written sample defined themselves as bisexual) and more likely to visit erotic oases such as bathhouses, video clubs and erotic movie houses. They also visited Internet chat rooms more frequently (86% of the Internet sample vs. 50% of the written sample). One third of the Internet sample wanted the opportunity to talk with an expert about HIV compared with a quarter of the written sample. Sexual practices between the two samples were generally similar, although the Internet sample reported significantly less body contact, kissing, hugging, mutual masturbation, and more condom use for anal intercourse with steady partners. Over four times as many of the Internet samples reported sex with women in the past year as the written sample. These data indicate that Internet data collection is feasible and that this mode of data collection, despite the nonrandom and self-selected nature of both types of samples, is likely to be more significantly oriented toward the young, geographically more isolated, and more behaviorally and self-identified bisexual respondent than conventionally distributed written questionnaires.

  7. ExSample. A library for sampling Sudakov-type distributions

    Energy Technology Data Exchange (ETDEWEB)

    Plaetzer, Simon

    2011-08-15

    Sudakov-type distributions are at the heart of generating radiation in parton showers as well as contemporary NLO matching algorithms along the lines of the POWHEG algorithm. In this paper, the C++ library ExSample is introduced, which implements adaptive sampling of Sudakov-type distributions for splitting kernels which are in general only known numerically. Besides the evolution variable, the splitting kernels can depend on an arbitrary number of other degrees of freedom to be sampled, and any number of further parameters which are fixed on an event-by-event basis. (orig.)

  8. ExSample. A library for sampling Sudakov-type distributions

    International Nuclear Information System (INIS)

    Plaetzer, Simon

    2011-08-01

    Sudakov-type distributions are at the heart of generating radiation in parton showers as well as contemporary NLO matching algorithms along the lines of the POWHEG algorithm. In this paper, the C++ library ExSample is introduced, which implements adaptive sampling of Sudakov-type distributions for splitting kernels which are in general only known numerically. Besides the evolution variable, the splitting kernels can depend on an arbitrary number of other degrees of freedom to be sampled, and any number of further parameters which are fixed on an event-by-event basis. (orig.)

  9. Spherical sampling

    CERN Document Server

    Freeden, Willi; Schreiner, Michael

    2018-01-01

    This book presents, in a consistent and unified overview, results and developments in the field of today's spherical sampling, particularly arising in the mathematical geosciences. Although the book often refers to original contributions, the authors have made them accessible to (graduate) students and scientists not only from mathematics but also from the geosciences and geoengineering. Building a library of topics in spherical sampling theory, it shows how advances in this theory lead to new discoveries in mathematical, geodetic, and geophysical branches, as well as in other scientific fields such as neuro-medicine. A must-read for everybody working in the area of spherical sampling.

  10. The lack of selection bias in a snowball sampled case-control study on drug abuse.

    Science.gov (United States)

    Lopes, C S; Rodrigues, L C; Sichieri, R

    1996-12-01

    Friend controls in matched case-control studies can be a potential source of bias, based on the assumption that friends are more likely to share exposure factors. This study evaluates the role of selection bias in a case-control study that used the snowball sampling method, based on friendship, for the selection of cases and controls. The cases selected for the study were drug abusers located in the community. Exposure was defined by the presence of at least one psychiatric diagnosis. Psychiatric and drug abuse/dependence diagnoses were made according to the Diagnostic and Statistical Manual of Mental Disorders (DSM-III-R) criteria. Cases and controls were matched on sex, age and friendship. The measurement of selection bias was made by comparing the proportion of exposed controls selected by exposed cases (p1) with the proportion of exposed controls selected by unexposed cases (p2). If p1 = p2, then selection bias should not occur. The observed distribution of the 185 matched pairs having at least one psychiatric disorder showed a p1 value of 0.52 and a p2 value of 0.51, indicating no selection bias in this study. Our findings support the idea that the use of friend controls can produce a valid basis for a case-control study.
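The p1/p2 comparison described in this record is simple to compute. The matched-pair counts below are constructed to reproduce the reported 0.52/0.51 proportions and are not the study's raw data.

```python
def selection_bias_check(pairs):
    """pairs: list of (case_exposed, control_exposed) booleans for matched
    case-control pairs. Returns (p1, p2): the proportion of exposed controls
    recruited by exposed cases vs by unexposed cases. If p1 is close to p2,
    the friendship-based (snowball) selection shows no bias."""
    exposed_cases   = [c for e, c in pairs if e]
    unexposed_cases = [c for e, c in pairs if not e]
    p1 = sum(exposed_cases) / len(exposed_cases)
    p2 = sum(unexposed_cases) / len(unexposed_cases)
    return p1, p2

# illustrative counts for 185 matched pairs, not the study's raw data
pairs = ([(True, True)] * 52 + [(True, False)] * 48
         + [(False, True)] * 43 + [(False, False)] * 42)
p1, p2 = selection_bias_check(pairs)
```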

  11. Determination of Selected Polycyclic Aromatic Compounds in Particulate Matter Samples with Low Mass Loading: An Approach to Test Method Accuracy

    Directory of Open Access Journals (Sweden)

    Susana García-Alonso

    2017-01-01

    Full Text Available A miniaturized analytical procedure to determine selected polycyclic aromatic compounds (PACs) in low mass loadings (<10 mg) of particulate matter (PM) is evaluated. The proposed method is based on a simple sonication/agitation method using small amounts of solvent for extraction. A reduced sample size of particulate matter often limits the quantification of analytes; this also leads to the need to change analytical procedures and to evaluate their performance. The trueness and precision of the proposed method were tested using ambient air samples. Analytical results from the proposed method were compared with those of pressurized liquid and microwave extractions. Selected PACs (polycyclic aromatic hydrocarbons (PAHs) and nitro polycyclic aromatic hydrocarbons (NPAHs)) were determined by liquid chromatography with fluorescence detection (HPLC/FD). Taking results from pressurized liquid extractions as reference values, recovery rates of the sonication/agitation method were over 80% for the most abundant PAHs. Recovery rates of selected NPAHs were lower; enhanced rates were obtained when methanol was used as a modifier. Intermediate precision was estimated by data comparison from two mathematical approaches: normalized difference data and pooled relative deviations. Intermediate precision was in the range of 10–20%. The effectiveness of the proposed method was evaluated in PM aerosol samples collected with very low mass loadings (<0.2 mg) during characterization studies of turbofan engine exhausts.
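The two precision approaches named above (normalized difference data and pooled relative deviations) are not spelled out in the record; the following sketch uses common textbook definitions, which may differ in detail from the authors' formulas.

```python
from statistics import mean, stdev

def normalized_difference(x1, x2):
    """Difference of two paired results scaled by their mean (dimensionless)."""
    return (x1 - x2) / ((x1 + x2) / 2)

def pooled_rsd(groups):
    """Pool relative standard deviations over replicate groups, weighting
    each squared RSD by its degrees of freedom."""
    num = sum((len(g) - 1) * (stdev(g) / mean(g)) ** 2 for g in groups)
    den = sum(len(g) - 1 for g in groups)
    return (num / den) ** 0.5
```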

  12. Synthesis and application of magnetic molecularly imprinted polymers in sample preparation.

    Science.gov (United States)

    Huang, Shuyao; Xu, Jianqiao; Zheng, Jiating; Zhu, Fang; Xie, Lijun; Ouyang, Gangfeng

    2018-04-12

    Magnetic molecularly imprinted polymers (MMIPs) have superior advantages in sample pretreatment because of their high selectivity for target analytes and their fast and easy isolation from samples. To meet the demand for both good magnetic properties and good extraction performance, MMIPs with various structures, from traditional core-shell structures to novel composite structures with a larger specific surface area and more accessible binding sites, are fabricated by different preparation technologies. Moreover, as the molecularly imprinted polymer (MIP) layers determine the affinity, selectivity, and saturated adsorption amount of MMIPs, the development and innovation of the MIP layer are attracting attention and are reviewed here. Many studies that used MMIPs as sorbents in dispersive solid-phase extraction of complex samples, including environmental, food, and biofluid samples, are summarized. Graphical abstract: The application of magnetic molecularly imprinted polymers (MMIPs) in the sample preparation procedure improves the analytical performance for complex samples. MITs: molecular imprinting technologies.

  13. A sampling strategy for estimating plot average annual fluxes of chemical elements from forest soils

    NARCIS (Netherlands)

    Brus, D.J.; Gruijter, de J.J.; Vries, de W.

    2010-01-01

    A sampling strategy for estimating spatially averaged annual element leaching fluxes from forest soils is presented and tested in three Dutch forest monitoring plots. In this method sampling locations and times (days) are selected by probability sampling. Sampling locations were selected by

  14. New materials for sample preparation techniques in bioanalysis.

    Science.gov (United States)

    Nazario, Carlos Eduardo Domingues; Fumes, Bruno Henrique; da Silva, Meire Ribeiro; Lanças, Fernando Mauro

    2017-02-01

    The analysis of biological samples is a complex and difficult task owing to two basic and complementary issues: the high complexity of most biological matrices and the need to determine minute quantities of active substances and contaminants in such complex samples. To succeed in this endeavor, samples are usually subjected to the three steps of a comprehensive analytical methodological approach: sample preparation, analyte isolation (usually utilizing a chromatographic technique) and qualitative/quantitative analysis (usually with the aid of mass spectrometric tools). Owing to the complex nature of bio-samples, and the very low concentration of the target analytes to be determined, selective sample preparation techniques are mandatory in order to overcome the difficulties imposed by these two constraints. During the last decade, new chemical synthesis approaches have been developed and optimized, such as sol-gel and molecular imprinting technologies, allowing the preparation of novel materials for sample preparation, including graphene and its derivatives, magnetic materials, ionic liquids, molecularly imprinted polymers, and more. In this contribution we review these novel techniques and materials, as well as their application to the bioanalysis niche. Copyright © 2016 Elsevier B.V. All rights reserved.

  15. Approach-Induced Biases in Human Information Sampling.

    Directory of Open Access Journals (Sweden)

    Laurence T Hunt

    2016-11-01

    Full Text Available Information sampling is often biased towards seeking evidence that confirms one's prior beliefs. Despite such biases being a pervasive feature of human behavior, their underlying causes remain unclear. Many accounts of these biases appeal to limitations of human hypothesis testing and cognition, de facto evoking notions of bounded rationality, but neglect more basic aspects of behavioral control. Here, we investigated a potential role for Pavlovian approach in biasing which information humans will choose to sample. We collected a large novel dataset from 32,445 human subjects, making over 3 million decisions, who played a gambling task designed to measure the latent causes and extent of information-sampling biases. We identified three novel approach-related biases, formalized by comparing subject behavior to a dynamic programming model of optimal information gathering. These biases reflected the amount of information sampled ("positive evidence approach"), the selection of which information to sample ("sampling the favorite"), and the interaction between information sampling and subsequent choices ("rejecting unsampled options"). The prevalence of all three biases was related to a Pavlovian approach-avoid parameter quantified within an entirely independent economic decision task. Our large dataset also revealed that individual differences in the amount of information gathered are a stable trait across multiple gameplays and can be related to demographic measures, including age and educational attainment. As well as revealing limitations in cognitive processing, our findings suggest information sampling biases reflect the expression of primitive, yet potentially ecologically adaptive, behavioral repertoires. One such behavior is sampling from options that will eventually be chosen, even when other sources of information are more pertinent for guiding future action.

  16. The collection and field chemical analysis of water samples

    International Nuclear Information System (INIS)

    Korte, N.E.; Ealey, D.T.; Hollenbach, M.H.

    1984-01-01

    A successful water sampling program requires a clear understanding of appropriate measurement and sampling procedures in order to obtain reliable field data and representative samples. It is imperative that the personnel involved have a thorough knowledge of the limitations of the techniques being used. Though this seems self-evident, many sampling and field-chemical-analysis programs are still not properly conducted. Recognizing these problems, the Department of Energy contracted with Bendix Field Engineering Corporation through the Technical Measurements Center to develop and select procedures for water sampling and field chemical analysis at waste sites. The fundamental causes of poor field programs are addressed in this paper, largely through discussion of specific field-measurement techniques and their limitations. Recommendations for improvement, including quality-assurance measures, are also presented

  17. Perilymph sampling from the cochlear apex: a reliable method to obtain higher purity perilymph samples from scala tympani.

    Science.gov (United States)

    Salt, Alec N; Hale, Shane A; Plonkte, Stefan K R

    2006-05-15

    TMPA concentration of the fluid sample was compared with that measured in perilymph prior to taking the sample, using a TMPA-selective microelectrode sealed into ST. Data were interpreted with a finite element model of the cochlear fluids that was used to simulate each aspect of the experiment. The correction of sample concentration back to the perilymph concentration prior to sampling can be performed based on the known ST volume (4.7 μL in the guinea pig) and the sample volume. A more precise correction requires some knowledge of the profile of drug distribution along the cochlea prior to sampling. This method of sampling from the apex is technically simple and provides a larger sample volume with a greater proportion of perilymph compared to sampling through the RW.

  18. An integrated and accessible sample data library for Mars sample return science

    Science.gov (United States)

    Tuite, M. L., Jr.; Williford, K. H.

    2015-12-01

    Over the course of the next decade or more, many thousands of geological samples will be collected and analyzed in a variety of ways by researchers at the Jet Propulsion Laboratory (California Institute of Technology) in order to facilitate discovery and contextualize observations made of Mars rocks both in situ and here on Earth if samples are eventually returned. Integration of data from multiple analyses of samples, including petrography, thin section and SEM imaging, isotope and organic geochemistry, XRF, XRD, and Raman spectrometry, is a challenge and a potential obstacle to discoveries that require supporting lines of evidence. We report the development of a web-accessible repository, the Sample Data Library (SDL), for the sample-based data that are generated by the laboratories and instruments that comprise JPL's Center for Analysis of Returned Samples (CARS), in order to facilitate collaborative interpretation of potential biosignatures in Mars-analog geological samples. The SDL is constructed using low-cost, open-standards-based Amazon Web Services (AWS), including web-accessible storage, relational database services, and a virtual web server. The data structure is sample-centered, with a shared registry for assigning unique identifiers to all samples, including International Geo-Sample Numbers. Both raw and derived data produced by instruments and post-processing workflows are automatically uploaded to online storage and linked via the unique identifiers. Through the web interface, users are able to find all the analyses associated with a single sample or search across features shared by multiple samples, sample localities, and analysis types. Planned features include more sophisticated search and analytical interfaces as well as data discoverability through NSF's EarthCube program.
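The sample-centred data model described in this record can be sketched with an in-memory relational store. The table names, IGSN-style identifier, and URIs below are invented for illustration; this is not the SDL's actual schema.

```python
import sqlite3

# one registry row per sample; every analysis links back via the unique ID
db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE samples  (sample_id TEXT PRIMARY KEY, locality TEXT);
CREATE TABLE analyses (analysis_id INTEGER PRIMARY KEY,
                       sample_id   TEXT REFERENCES samples(sample_id),
                       method      TEXT, result_uri TEXT);
""")
db.execute("INSERT INTO samples VALUES ('IGSN-XYZ001', 'Mars-analog site A')")
db.executemany(
    "INSERT INTO analyses (sample_id, method, result_uri) VALUES (?, ?, ?)",
    [("IGSN-XYZ001", "Raman", "s3://sdl/raman/001.csv"),
     ("IGSN-XYZ001", "XRD",   "s3://sdl/xrd/001.csv")])
# find every analysis associated with a single sample
methods = [m for (m,) in db.execute(
    "SELECT method FROM analyses WHERE sample_id = ? ORDER BY analysis_id",
    ("IGSN-XYZ001",))]
```

The shared registry means any raw or derived result, whatever the instrument, can be joined back to its sample by the unique identifier alone.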

  19. Characterizing lentic freshwater fish assemblages using multiple sampling methods

    Science.gov (United States)

    Fischer, Jesse R.; Quist, Michael C.

    2014-01-01

    Characterizing fish assemblages in lentic ecosystems is difficult, and multiple sampling methods are almost always necessary to gain reliable estimates of indices such as species richness. However, most research focused on lentic fish sampling methodology has targeted recreationally important species, and little to no information is available regarding the influence of multiple methods and timing (i.e., temporal variation) on characterizing entire fish assemblages. Therefore, six lakes and impoundments (48–1,557 ha surface area) were sampled seasonally with seven gear types to evaluate the combined influence of sampling methods and timing on the number of species and individuals sampled. Probabilities of detection for species indicated strong selectivities and seasonal trends that provide guidance on optimal seasons to use gears when targeting multiple species. The evaluation of species richness and number of individuals sampled using multiple gear combinations demonstrated that appreciable benefits over relatively few gears (e.g., to four) used in optimal seasons were not present. Specifically, over 90% of the species encountered with all gear types and season combinations (N = 19) from six lakes and reservoirs were sampled with nighttime boat electrofishing in the fall and benthic trawling, modified-fyke, and mini-fyke netting during the summer. Our results indicated that the characterization of lentic fish assemblages was highly influenced by the selection of sampling gears and seasons, but did not appear to be influenced by waterbody type (i.e., natural lake, impoundment). The standardization of data collected with multiple methods and seasons to account for bias is imperative to monitoring of lentic ecosystems and will provide researchers with increased reliability in their interpretations and decisions made using information on lentic fish assemblages.
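The gear-combination evaluation above amounts to maximizing pooled species richness over subsets of method-season pairs. A small exhaustive sketch with invented catch data (species names and gears are illustrative, not the study's):

```python
from itertools import combinations

def best_gear_combo(catches, k):
    """catches maps (gear, season) -> set of species detected.
    Return the k-method combination with the largest pooled richness."""
    def richness(combo):
        return len(set().union(*(catches[c] for c in combo)))
    best = max(combinations(catches, k), key=richness)
    return best, richness(best)

# toy data: three gear/season combinations
catches = {
    ("electrofishing", "fall"): {"bass", "bluegill", "walleye", "pike"},
    ("fyke net", "summer"):     {"bluegill", "bullhead", "crappie"},
    ("trawl", "summer"):        {"darter", "shiner", "bluegill"},
}
combo, pooled = best_gear_combo(catches, 2)
```

Exhaustive search is feasible here because the number of gear/season combinations is small; with many methods a greedy or complementarity-based selection would be the usual shortcut.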

  20. THE USE OF RANKING SAMPLING METHOD WITHIN MARKETING RESEARCH

    Directory of Open Access Journals (Sweden)

    CODRUŢA DURA

    2011-01-01

    Full Text Available Marketing and statistical literature available to practitioners provides a wide range of sampling methods that can be implemented in the context of marketing research. The ranking sampling method is based on dividing the general population into several strata, namely several subdivisions which are relatively homogeneous regarding a certain characteristic. The sample is then composed by selecting, from each stratum, a certain number of components (which can be proportional or non-proportional to the size of the stratum) until the pre-established volume of the sample is reached. Using ranking sampling within marketing research requires the determination of some relevant statistical indicators - average, dispersion, sampling error, etc. To that end, the paper contains a case study which illustrates the actual approach used in order to apply the ranking sampling method within a marketing research study made by a company which provides Internet connection services, for a particular category of customers - small and medium enterprises.
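A proportional allocation of the kind described can be sketched as follows; the strata sizes and sample volume are invented (e.g., an SME customer base split by company size), not those of the case study.

```python
import random

def proportional_stratified_sample(strata, n, seed=0):
    """strata: dict name -> list of units. Allocates a sample of size n
    proportionally to stratum size (largest-remainder rounding), then
    draws without replacement inside each stratum."""
    rng = random.Random(seed)
    total = sum(len(u) for u in strata.values())
    quotas = {s: n * len(u) / total for s, u in strata.items()}
    alloc = {s: int(q) for s, q in quotas.items()}
    # distribute leftover units by largest fractional remainder
    leftover = n - sum(alloc.values())
    for s in sorted(quotas, key=lambda s: quotas[s] - alloc[s],
                    reverse=True)[:leftover]:
        alloc[s] += 1
    return {s: rng.sample(strata[s], alloc[s]) for s in strata}

# hypothetical SME customer base stratified by company size
strata = {"micro": list(range(600)), "small": list(range(300)),
          "medium": list(range(100))}
sample = proportional_stratified_sample(strata, 50)
```

With 600/300/100 units and n = 50, the allocation is 30/15/5, i.e., each stratum contributes in proportion to its share of the population.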

  1. Selective reduction of arsenic species by hydride generation - atomic absorption spectrometry. Part 2 - sample storage and arsenic determination in natural waters

    Directory of Open Access Journals (Sweden)

    Quináia Sueli P.

    2001-01-01

    Full Text Available Total arsenic, arsenite, arsenate and dimethylarsinic acid (DMA) were selectively determined in natural waters by hydride generation - atomic absorption spectrometry, using sodium tetrahydroborate(III) as reductant but in different reduction media. River water samples from the north region of Paraná State, Brazil, were analysed and showed arsenate as the principal arsenical form. Detection limits found for As(III) (citrate buffer), As(III) + DMA (acetic acid) and As(III) + As(V) (hydrochloric acid) were 0.6, 1.1 and 0.5 mg As L-1, respectively. Sample storage in the proper reaction media proved to be a useful way to preserve the water samples.

  2. Plasma metanephrine for assessing the selectivity of adrenal venous sampling

    NARCIS (Netherlands)

    Dekkers, T.; Deinum, J.; Schultze Kool, L.J.; Blondin, D.; Vonend, O.; Hermus, A.R.M.M.; Peitzsch, M.; Rump, L.C.; Antoch, G.; Sweep, F.C.; Bornstein, S.R.; Lenders, J.W.M.; Willenberg, H.S.; Eisenhofer, G.

    2013-01-01

    Adrenal vein sampling is used to establish the origins of excess production of adrenal hormones in primary aldosteronism. Correct catheter positioning is confirmed using adrenal vein measurements of cortisol, but this parameter is not always reliable. Plasma metanephrine represents an alternative

  3. Product-selective blot: a technique for measuring enzyme activities in large numbers of samples and in native electrophoresis gels

    International Nuclear Information System (INIS)

    Thompson, G.A.; Davies, H.M.; McDonald, N.

    1985-01-01

    A method termed product-selective blotting has been developed for screening large numbers of samples for enzyme activity. The technique is particularly well suited to detection of enzymes in native electrophoresis gels. The principle of the method was demonstrated by blotting samples from glutaminase or glutamate synthase reactions into an agarose gel embedded with ion-exchange resin under conditions favoring binding of product (glutamate) over substrates and other substances in the reaction mixture. After washes to remove these unbound substances, the product was measured using either fluorometric staining or radiometric techniques. Glutaminase activity in native electrophoresis gels was visualized by a related procedure in which substrates and products from reactions run in the electrophoresis gel were blotted directly into a resin-containing image gel. Considering the selective-binding materials available for use in the image gel, along with the possible detection systems, this method has potentially broad application

  4. Polycyclic aromatic hydrocarbons in air samples of meat smokehouses

    DEFF Research Database (Denmark)

    Hansen, Åse Marie; Olsen, I L; Poulsen, O M

    1992-01-01

    In a screening programme nine Danish meat smokehouses were randomly selected for measurements on concentration of airborne polycyclic aromatic hydrocarbons (PAH). A total of 23 stationary air samples were collected during the entire working period of the kiln either above the kiln doors or approx...

  5. Passive sampling of selected pesticides in aquatic environment using polar organic chemical integrative samplers.

    Science.gov (United States)

    Thomatou, Alphanna-Akrivi; Zacharias, Ierotheos; Hela, Dimitra; Konstantinou, Ioannis

    2011-08-01

    Polar organic chemical integrative samplers (POCIS) were examined for their sampling efficiency for 12 pesticides and one metabolite commonly detected in surface waters. Laboratory-based calibration experiments of POCIS were conducted, and the determined passive sampling rates were applied to the monitoring of pesticide levels in Lake Amvrakia, Western Greece. Spot sampling was also performed for comparison purposes. Calibration experiments were performed on the basis of static renewal exposure of POCIS under stirred conditions for different time periods of up to 28 days. The analytical procedures were based on the coupling of POCIS and solid phase extraction on Oasis HLB cartridges with gas chromatography-mass spectrometry. The recovery of the target pesticides from the POCIS was generally >79% with acceptable relative standard deviation (RSD). Pesticide levels were determined during the monitoring campaign using both passive and spot sampling, with higher concentrations measured by spot sampling in most cases. Passive sampling by POCIS provides a useful tool for the monitoring of pesticides in aquatic systems, since integrative sampling at rates sufficient for analytical quantitation of ambient levels was observed. Calibration data are needed for a greater number of compounds in order to extend its use in environmental monitoring.
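
    For integrative samplers such as POCIS, the ambient time-weighted average (TWA) concentration is typically back-calculated from the analyte mass accumulated on the sorbent and the laboratory-derived sampling rate. A minimal sketch of that calculation (the function name, sampling-rate value and analyte mass below are illustrative assumptions, not values from this study):

```python
def twa_concentration(mass_sorbed_ng, sampling_rate_l_per_day, days_deployed):
    """Time-weighted average water concentration (ng/L) from a passive sampler.

    C_TWA = M / (Rs * t), valid while uptake remains in the linear
    (integrative) regime, i.e. before the sorbent approaches equilibrium.
    """
    return mass_sorbed_ng / (sampling_rate_l_per_day * days_deployed)

# Illustrative values: 420 ng accumulated over a 28-day deployment
# with a calibration-derived sampling rate of 0.25 L/day.
c_twa = twa_concentration(420.0, 0.25, 28)
print(round(c_twa, 1))  # prints 60.0 (ng/L)
```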

  6. 21 CFR 203.38 - Sample lot or control numbers; labeling of sample units.

    Science.gov (United States)

    2010-04-01

    ... numbers; labeling of sample units. (a) Lot or control number required on drug sample labeling and sample... identifying lot or control number that will permit the tracking of the distribution of each drug sample unit... 21 Food and Drugs 4 2010-04-01 2010-04-01 false Sample lot or control numbers; labeling of sample...

  7. Remote sampling and analysis of highly radioactive samples in shielded boxes

    International Nuclear Information System (INIS)

    Kirpikov, D.A.; Miroshnichenko, I.V.; Pykhteev, O.Yu.

    2010-01-01

    The sampling procedure used for highly radioactive coolant water is associated with a high risk of personnel irradiation and uncontrolled radioactive contamination. Remote sample manipulation with provision for proper radiation shielding is intended to enhance the safety of the sampling procedure. The sampling lines are located in an isolated compartment, a shielded box. Various equipment which enables remote or automatic sample manipulation is used for this purpose. The main issues in developing shielded box equipment intended for a wider range of remote chemical analyses and manipulation techniques for highly radioactive water samples are considered in the paper. There were three principal directions of work: transfer of chemical analyses performed in the laboratory into the shielded box; prevalence of computer-aided and remote techniques for manipulating highly radioactive samples inside the shielded box; and increased control over sampling and determination of thermal-hydraulic parameters of the coolant water in the sampling lines. The developed equipment and solutions enable remote chemical analysis in the restricted volume of the shielded box using ion-chromatographic, amperometric, fluorimetric, flow injection, phototurbidimetric, conductometric and potentiometric methods. The extent of control performed in the shielded box is determined taking into account the requirements of the regulatory documents as well as the feasibility and cost of technically adapting the various methods to shielded-box conditions. The work resulted in highly precise determination of more than 15 indices of coolant water quality performed in on-line mode in the shielded box. This amounts to about 80% of the total extent of control performed at the prototype reactor plants. The novel solutions for highly radioactive sample handling are implemented in the shielded box (for example, packaging, sample transportation to the laboratory, volume measurement).
The shielded box is

  8. MAINTENANCE OF THE COAL SAMPLE BANK AND DATABASE

    Energy Technology Data Exchange (ETDEWEB)

    Alan W. Scaroni; Alan Davis; David C. Glick; Patrick G. Hatcher; Gareth D. Mitchell; Daniel Carson; Lei Hou

    1999-11-01

    This project generated and distributed coal samples and accompanying analytical data for use in research by DOE contractors and others. All activities specified under the five-year contract (as revised) and a six-month no-cost extension have been completed. Eleven DECS samples were collected, processed to a variety of particle sizes, heat-sealed in foil laminate bags under argon, and placed in refrigerated storage. All were analyzed for basic chemical composition, inorganic major and trace element composition including hazardous air pollutant elements, petrographic composition and characteristics, thermoplastic behavior (if applicable), and other properties relevant to research and commercial utilization. Most were also analyzed by NMR, py/gc/ms, and a standardized liquefaction test; trends and relationships observed were evaluated and summarized. Twenty-two DECS samples collected under the previous contract received further processing, and most of these were subjected to organic geochemical and standardized liquefaction tests as well. Selected DECS samples were monitored annually to evaluate the effectiveness of foil laminate bags for preserving samples in long-term storage. In addition to the 33 DECS samples, 23 PSOC samples collected under previous contracts and purged with argon before storage were also maintained and distributed, for a total of 56 samples covered by the contract. During the 5.5 years, 570 samples in 1,586 containers, 2,109 data printouts, and individual data items from 34,208 samples were distributed. All DECS samples are now available for distribution at minus 6 mm (-1/4 inch), minus 0.85 mm (-20 mesh U.S.), and minus 0.25 mm (-60 mesh U.S.).

  9. Novel Sample-handling Approach for XRD Analysis with Minimal Sample Preparation

    Science.gov (United States)

    Sarrazin, P.; Chipera, S.; Bish, D.; Blake, D.; Feldman, S.; Vaniman, D.; Bryson, C.

    2004-01-01

    Sample preparation and sample handling are among the most critical operations associated with X-ray diffraction (XRD) analysis. These operations require attention in a laboratory environment, but they become a major constraint in the deployment of XRD instruments for robotic planetary exploration. We are developing a novel sample handling system that dramatically relaxes the constraints on sample preparation by allowing characterization of coarse-grained material that would normally be impossible to analyze with conventional powder-XRD techniques.

  10. Types of non-probabilistic sampling used in marketing research. „Snowball” sampling

    OpenAIRE

    Manuela Rozalia Gabor

    2007-01-01

    A significant way of investigating a firm's market is statistical sampling. Sampling typology provides probabilistic and non-probabilistic models for gathering information, and this paper describes in detail network sampling, known as "snowball" sampling. This type of sampling enables the survey of occurrence forms concerning the decision power within an organisation and of the interpersonal relation network governing a certain collectivity, a certain consumer panel. The snowball s...

  11. Soybean yield modeling using bootstrap methods for small samples

    Energy Technology Data Exchange (ETDEWEB)

    Dalposso, G.A.; Uribe-Opazo, M.A.; Johann, J.A.

    2016-11-01

    One of the problems that occurs when working with regression models concerns the sample size: since the statistical methods used in inferential analyses are asymptotic, a small sample may compromise the analysis because the estimates will be biased. An alternative is the bootstrap methodology, which in its non-parametric version does not require guessing or knowing the probability distribution that generated the original sample. In this work we used a small set of soybean yield data together with physical and chemical soil properties to determine a multiple linear regression model. Bootstrap methods were used for variable selection, identification of influential points, and determination of confidence intervals for the model parameters. The results showed that the bootstrap methods enabled us to select the physical and chemical soil properties that were significant in the construction of the soybean yield regression model, to construct confidence intervals for the parameters, and to identify the points that had great influence on the estimated parameters. (Author)
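
    The case-resampling (non-parametric) bootstrap described above can be sketched as follows. The data here are simulated stand-ins for the soil/yield variables, and the percentile method is used for the confidence interval; none of the numbers come from the study:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated stand-in for a small agronomic data set (n = 20).
n = 20
x = rng.uniform(0, 10, n)
y = 2.0 + 0.5 * x + rng.normal(0, 1, n)

def fit_slope(xs, ys):
    # Ordinary least-squares slope via the normal equations.
    X = np.column_stack([np.ones_like(xs), xs])
    beta, *_ = np.linalg.lstsq(X, ys, rcond=None)
    return beta[1]

# Non-parametric (case-resampling) bootstrap: resample (x, y) pairs
# with replacement and refit the model each time.
B = 2000
slopes = np.empty(B)
for b in range(B):
    idx = rng.integers(0, n, n)
    slopes[b] = fit_slope(x[idx], y[idx])

# Percentile 95% confidence interval for the slope.
lo, hi = np.percentile(slopes, [2.5, 97.5])
print(f"slope CI: [{lo:.2f}, {hi:.2f}]")
```

    Because no distributional assumption is made, the same loop works for any statistic of the resampled data, which is what makes the approach attractive for small samples.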

  12. The use of laser-induced fluorescence or ultraviolet detectors for sensitive and selective analysis of tobramycin or erythropoietin in complex samples

    Science.gov (United States)

    Ahmed, Hytham M.; Ebeid, Wael B.

    2015-05-01

    The analysis of complex samples is a challenge in pharmaceutical and biopharmaceutical analysis. In this work, tobramycin (TOB) analysis in human urine samples and recombinant human erythropoietin (rhEPO) analysis in the presence of a similar protein were selected as representative examples of such analyses. Assays of TOB in urine samples are difficult because of poor detectability. Therefore, a laser-induced fluorescence (LIF) detector was combined with a separation technique, micellar electrokinetic chromatography (MEKC), to determine TOB through derivatization with fluorescein isothiocyanate (FITC). Borate was used as background electrolyte (BGE) with negatively charged mixed micelles as additive. The method was successfully applied to urine samples. The LOD and LOQ for tobramycin in urine were 90 and 200 ng/ml, respectively, and recovery was >98% (n = 5). All urine samples were analyzed by direct injection without sample pre-treatment. Another hyphenated analytical technique, capillary zone electrophoresis (CZE) coupled to an ultraviolet (UV) detector, was also used for sensitive analysis of rhEPO at low levels (2000 IU) in the presence of a large amount of human serum albumin (HSA). Analysis of rhEPO was achieved by the use of electrokinetic injection (EI) with discontinuous buffers. Phosphate buffer was used as BGE with metal ions as additive. The proposed method can be used for the estimation of a large number of quality control rhEPO samples in a short period.

  13. Effects of state and trait anxiety on selective attention to threatening stimuli in a non-clinical sample of school children

    Directory of Open Access Journals (Sweden)

    Jeniffer Ortega Marín

    2015-01-01

    Full Text Available Attentional biases, consisting of a preferential processing of threatening stimuli, have been found in anxious adults, as predicted by several cognitive models. However, studies with non-clinical samples of children have provided mixed results. Therefore, the aim of this research was to determine the effects of state and trait anxiety on selective attention towards threatening stimuli in a non-clinical sample of school children (age: 8 to 13, n = 110) using the dot-probe task. This study did not reveal an effect of trait anxiety on selective attention towards threatening stimuli. However, a significant difference was found between participants with low state anxiety and high state anxiety, although the effect size was small. Specifically, participants with low state anxiety showed a bias towards threatening stimuli. Overall, the findings of this research with a non-clinical sample of school children suggest that attentional biases towards threatening information, which have been repeatedly found in anxious adults, are not necessarily inherent to non-clinical anxiety in children; moreover, the relationship between attentional biases and anxiety in this population might be moderated by other cognitive processes.

  14. Sampling procedure, receipt and conservation of water samples to determine environmental radioactivity

    International Nuclear Information System (INIS)

    Herranz, M.; Navarro, E.; Payeras, J.

    2009-01-01

    The present document sets out the essential goals, processes and contents that the subgroups Sampling and Sample Preparation and Conservation believe should form part of the procedure for correct sampling, receipt, conservation and preparation of samples of continental, marine and waste water prior to determining their radioactive content.

  15. Mineralogy, petrology and whole-rock chemistry data compilation for selected samples of Yucca Mountain tuffs

    International Nuclear Information System (INIS)

    Connolly, J.R.

    1991-12-01

    Petrologic, bulk chemical, and mineralogic data are presented for 49 samples of tuffaceous rocks from core holes USW G-1 and UE-25a #1 at Yucca Mountain, Nevada. Included, in descending stratigraphic order, are 11 samples from the Topopah Spring Member of the Paintbrush Tuff, 12 samples from the Tuffaceous Beds of Calico Hills, 3 samples from the Prow Pass Member of the Crater Flat Tuff, 20 samples from the Bullfrog Member of the Crater Flat Tuff and 3 samples from the Tram Member of the Crater Flat Tuff. The suite of samples contains a wide variety of petrologic types, including zeolitized, glassy, and devitrified tuffs. Data vary considerably between groups of samples, and include thin section descriptions (some with modal analyses for which uncertainties are estimated), electron microprobe analyses of mineral phases and matrix, mineral identifications by X-ray diffraction, and major element analyses with uncertainty estimates

  16. Assessing Generative Braille Responding Following Training in a Matching-to-Sample Format

    Science.gov (United States)

    Putnam, Brittany C.; Tiger, Jeffrey H.

    2016-01-01

    We evaluated the effects of teaching sighted college students to select printed text letters given a braille sample stimulus in a matching-to-sample (MTS) format on the emergence of untrained (a) construction of print characters given braille samples, (b) construction of braille characters given print samples, (c) transcription of print characters…

  17. UMTRA project water sampling and analysis plan, Naturita, Colorado

    International Nuclear Information System (INIS)

    1994-04-01

    Surface remedial action is scheduled to begin at the Naturita UMTRA Project processing site in the spring of 1994. No water sampling was performed during 1993 at either the Naturita processing site (NAT-01) or the Dry Flats disposal site (NAT-12). Results of previous water sampling at the Naturita processing site indicate that ground water in the alluvium is contaminated as a result of uranium processing activities. Baseline ground water conditions have been established in the uppermost aquifer at the Dry Flats disposal site. Water sampling activities scheduled for April 1994 include preconstruction sampling of selected monitor wells at the processing site, surface water sampling of the San Miguel River, sampling of several springs/seeps in the vicinity of the disposal site, and sampling of two monitor wells in Coke Oven Valley. The monitor well locations provide sampling points to characterize ground water quality and flow conditions in the vicinity of the sites. The list of analytes has been updated to reflect constituents related to uranium processing activities and the parameters needed for geochemical evaluation. Water sampling will be conducted annually at minimum during the period of construction activities

  18. IAEA Sampling Plan

    Energy Technology Data Exchange (ETDEWEB)

    Geist, William H. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-15

    The objectives for this presentation are to describe the method that the IAEA uses to determine a sampling plan for nuclear material measurements; describe the terms detection probability and significant quantity; list the three nuclear materials measurement types; describe the sampling method applied to an item facility; and describe multiple method sampling.
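
    For an item facility, the number of items to sample for a given detection probability is commonly derived from the standard attribute-sampling approximation. The sketch below uses that widely quoted formula with illustrative numbers; the function name and the specific values are assumptions, not IAEA figures:

```python
import math

def sample_size(N, M, detection_prob):
    """Approximate number of items to sample so that at least one of M
    falsified items (out of N total) is found with the given probability.

    Uses the standard attribute-sampling approximation
        n = N * (1 - (1 - DP)**(1/M)),
    rounded up to the next whole item.
    """
    return math.ceil(N * (1 - (1 - detection_prob) ** (1.0 / M)))

# Illustrative: 100 items in inventory, 10 of which would need to be
# falsified to divert a significant quantity, 95% detection goal.
print(sample_size(100, 10, 0.95))  # prints 26
```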

  19. Sample Acquisition for Materials in Planetary Exploration (SAMPLE), Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — ORBITEC proposes to analyze, design, and develop a device for autonomous lunar surface/subsurface sampling and processing applications. The Sample Acquisition for...

  20. Selective extraction of chromium(VI) using a leaching procedure with sodium carbonate from some plant leaves, soil and sediment samples

    Energy Technology Data Exchange (ETDEWEB)

    Elci, Latif, E-mail: elci@pamukkale.edu.tr [Department of Chemistry, Pamukkale University, 20017 Denizli (Turkey); Divrikli, Umit; Akdogan, Abdullah; Hol, Aysen; Cetin, Ayse [Department of Chemistry, Pamukkale University, 20017 Denizli (Turkey); Soylak, Mustafa [Department of Chemistry, Erciyes University, 38039 Kayseri (Turkey)

    2010-01-15

    Speciation of chromium in some plant leaves, soil and sediment samples was carried out by selective leaching of Cr(VI) using a sodium carbonate leaching procedure. Total chromium from the samples was extracted using aqua regia and oxidative acid digestion. The concentrations of chromium species in the extracts were determined by graphite furnace atomic absorption spectrometry (GFAAS). Uncoated graphite furnace tubes were used as the atomizer. Due to the presence of relatively high amounts of Na2CO3 in the resulting samples, the possible influence of Na2CO3 on the absorbance signals was checked. There is no interference of Na2CO3 on the chromium absorbance up to 0.1 mol L-1 Na2CO3. The limit of detection (LOD) for determination of Cr(VI) in 0.1 mol L-1 Na2CO3 solution by GFAAS was found to be 0.93 µg L-1. The procedure was applied to environmental samples. The relative standard deviation (RSD, precision) for 10 replicate measurements of 20 µg L-1 Cr in a processed soil sample was 4.2%.
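
    A 3σ detection limit such as the one quoted above is conventionally computed from replicate blank measurements and the calibration slope. A small sketch of that convention (the blank readings and slope below are hypothetical, not data from this study):

```python
import statistics

def detection_limit(blank_signals, calibration_slope):
    """3-sigma limit of detection: three times the standard deviation of
    replicate blank signals divided by the calibration slope."""
    s_blank = statistics.stdev(blank_signals)
    return 3 * s_blank / calibration_slope

# Illustrative absorbance readings of 10 blank replicates and a
# hypothetical calibration slope (absorbance per ug/L).
blanks = [0.0021, 0.0019, 0.0023, 0.0020, 0.0022,
          0.0018, 0.0021, 0.0024, 0.0019, 0.0020]
print(round(detection_limit(blanks, 0.0006), 2))  # prints 0.94 (ug/L)
```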

  2. Mendelian breeding units versus standard sampling strategies: mitochondrial DNA variation in southwest Sardinia

    Directory of Open Access Journals (Sweden)

    Daria Sanna

    2011-01-01

    Full Text Available We report a sampling strategy based on Mendelian Breeding Units (MBUs), representing an interbreeding group of individuals sharing a common gene pool. The identification of MBUs is crucial for case-control experimental design in association studies. The aim of this work was to evaluate the possible existence of bias in terms of genetic variability and haplogroup frequencies in the MBU sample, due to severe sample selection. In order to reach this goal, the MBU sampling strategy was compared to a standard selection of individuals according to their surname and place of birth. We analysed mitochondrial DNA variation (first hypervariable segment and coding region) in unrelated healthy subjects from two different areas of Sardinia: the area around the town of Cabras and the western Campidano area. No statistically significant differences were observed when the two sampling methods were compared, indicating that the stringent sample selection needed to establish a MBU does not alter the original genetic variability and haplogroup distribution. Therefore, the MBU sampling strategy can be considered a useful tool in association studies of complex traits.

  3. LOCAL BENCHMARKS FOR THE EVOLUTION OF MAJOR-MERGER GALAXIES-SPITZER OBSERVATIONS OF A K-BAND SELECTED SAMPLE

    International Nuclear Information System (INIS)

    Xu, C. Kevin; Cheng Yiwen; Lu Nanyao; Mazzarella, Joseph M.; Cutri, Roc; Domingue, Donovan; Huang Jiasheng; Gao Yu; Sun, W.-H.; Surace, Jason

    2010-01-01

    We present Spitzer observations for a sample of close major-merger galaxy pairs (KPAIR sample) selected from cross-matches between the Two Micron All Sky Survey and Sloan Digital Sky Survey Data Release 3. The goals are to study the star formation activity in these galaxies and to set a local benchmark for the cosmic evolution of close major mergers. The Spitzer KPAIR sample (27 pairs, 54 galaxies) includes all spectroscopically confirmed spiral-spiral (S+S) and spiral-elliptical (S+E) pairs in a parent sample that is complete for primaries brighter than K = 12.5 mag, projected separations of 5 h^-1 kpc ≤ s ≤ 20 h^-1 kpc, and mass ratios ≤2.5. The Spitzer data, consisting of images in seven bands (3.6, 4.5, 5.8, 8, 24, 70, 160 μm), show very diversified IR emission properties. Compared to single spiral galaxies in a control sample, only spiral galaxies in S+S pairs show significantly enhanced specific star formation rate (sSFR = SFR/M), whereas spiral galaxies in S+E pairs do not. Furthermore, the SFR enhancement of spiral galaxies in S+S pairs is highly mass-dependent: only those with M ≳ 10^10.5 M_sun show significant enhancement. Relatively low-mass (M ∼ 10^10 M_sun) spirals in S+S pairs have about the same SFR/M as their counterparts in the control sample, while those with M ∼ 10^11 M_sun have on average a ∼3 times higher SFR/M than single spirals. There is evidence for a correlation between the global star formation activities (but not the nuclear activities) of the component galaxies in massive S+S major-merger pairs (the "Holmberg effect"). There is no significant difference in the SFR/M between the primaries and the secondaries, nor between spirals of different separations (SEP); the SFR density contributed by the KPAIR sample is 2.54 × 10^-4 M_sun yr^-1 Mpc^-3.

  4. A study on the weather sampling method for probabilistic consequence analysis

    International Nuclear Information System (INIS)

    Oh, Hae Cheol

    1996-02-01

    The main task of a probabilistic accident consequence analysis model is to predict the radiological situation and to provide a reliable quantitative data base for making decisions on countermeasures. The magnitude of an accident's consequences depends on the characteristics of the accident and the coincident weather. In probabilistic accident consequence analysis, it is necessary to repeat the atmospheric dispersion calculation with several hundred weather sequences to predict the full distribution of consequences which may occur following a postulated accidental release. It is desirable to select a representative sample of weather sequences from a meteorological record which is typical of the area over which the released radionuclides will disperse and which spans a sufficiently long period. The selection is done by means of sampling techniques applied to a full year of hourly weather data characteristic of the plant site. In this study, the proposed weighted importance sampling method selects sequences in proportion to each bin size, so as to closely approximate the true frequency distribution of weather conditions at the site. The weighted importance sampling method results in substantially less sampling uncertainty than the previous technique. The proposed technique can result in improved confidence in risk estimates
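
    The bin-proportional selection described above can be sketched as follows. The weather categories, bin structure and sample size are invented for illustration; a real consequence code would bin sequences by stability class, wind speed and rain:

```python
import random
from collections import defaultdict

random.seed(42)

# Hypothetical year of hourly start times, each pre-classified into a
# weather category; the labels are illustrative only.
sequences = [(hour, random.choice(["D-dry", "D-rain", "F-dry", "A-dry"]))
             for hour in range(8760)]

# Bin the candidate weather sequences by category.
bins = defaultdict(list)
for hour, category in sequences:
    bins[category].append(hour)

# Draw from each bin in proportion to its size, so the sample
# reproduces the annual frequency of each weather condition.
total = len(sequences)
n_sample = 144
sample = []
for category, members in bins.items():
    k = round(n_sample * len(members) / total)
    sample.extend(random.sample(members, k))

print(f"selected {len(sample)} of {total} hourly weather sequences")
```

    Sampling within each bin is without replacement, so every selected sequence is distinct; per-bin rounding means the realized sample size can differ from the target by a few sequences.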

  5. Network and adaptive sampling

    CERN Document Server

    Chaudhuri, Arijit

    2014-01-01

    Combining the two statistical techniques of network sampling and adaptive sampling, this book illustrates the advantages of using them in tandem to effectively capture sparsely located elements in unknown pockets. It shows how network sampling is a reliable guide in capturing inaccessible entities through linked auxiliaries. The text also explores how adaptive sampling is strengthened in information content through subsidiary sampling with devices to mitigate unmanageable expanding sample sizes. Empirical data illustrates the applicability of both methods.

  6. Sampling bee communities using pan traps: alternative methods increase sample size

    Science.gov (United States)

    Monitoring of the status of bee populations and inventories of bee faunas require systematic sampling. Efficiency and ease of implementation has encouraged the use of pan traps to sample bees. Efforts to find an optimal standardized sampling method for pan traps have focused on pan trap color. Th...

  7. On sampling and modeling complex systems

    International Nuclear Information System (INIS)

    Marsili, Matteo; Mastromatteo, Iacopo; Roudi, Yasser

    2013-01-01

    The study of complex systems is limited by the fact that only a few variables are accessible for modeling and sampling, which are not necessarily the most relevant ones to explain the system behavior. In addition, empirical data typically undersample the space of possible states. We study a generic framework where a complex system is seen as a system of many interacting degrees of freedom, which are known only in part, that optimize a given function. We show that the underlying distribution with respect to the known variables has the Boltzmann form, with a temperature that depends on the number of unknown variables. In particular, when the influence of the unknown degrees of freedom on the known variables is not too irregular, the temperature decreases as the number of variables increases. This suggests that models can be predictable only when the number of relevant variables is less than a critical threshold. Concerning sampling, we argue that the information that a sample contains on the behavior of the system is quantified by the entropy of the frequency with which different states occur. This allows us to characterize the properties of maximally informative samples: within a simple approximation, the most informative frequency size distributions have power-law behavior and Zipf’s law emerges at the crossover between the undersampled regime and the regime where the sample contains enough statistics to make inferences on the behavior of the system. These ideas are illustrated in some applications, showing that they can be used to identify relevant variables or to select the most informative representations of data, e.g. in data clustering. (paper)
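
    The central quantity of the abstract, the entropy of the frequency with which different states occur in a sample, can be sketched minimally as follows (a toy illustration of the idea, not the paper's full frequency-of-frequencies analysis):

```python
import math
from collections import Counter

def sample_entropy(observations):
    """Shannon entropy (in nats) of the empirical frequency with which
    distinct states occur in a sample."""
    counts = Counter(observations)
    n = len(observations)
    return -sum((c / n) * math.log(c / n) for c in counts.values())

# A maximally uninformative sample (every observation identical) has
# zero entropy; uniform occupancy of k states gives log(k).
print(sample_entropy(["a"] * 10))                    # prints 0.0
print(round(sample_entropy(["a", "b", "c", "d"] * 5), 3))  # prints 1.386
```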

  8. Synthesis and characterization of ion imprinted polymeric adsorbents for the selective recognition and removal of arsenic and selenium in wastewater samples

    Directory of Open Access Journals (Sweden)

    Lihle D. Mafu

    2016-09-01

    Full Text Available Arsenic (As) and selenium (Se) ion imprinted polymers, As-IIPs and Se-IIPs, were synthesized via bulk polymerization. The prepared materials were characterized using Fourier transform infrared (FT-IR) spectroscopy, scanning electron microscopy (SEM) and thermogravimetric analysis (TGA). These characterization methods confirmed the difference between the IIPs and non-imprinted polymers (NIPs). In the adsorption studies, the IIPs showed not only better adsorption than the NIPs but also better selectivity. As adsorption using As-IIPs reached a maximum of 482 μg g-1, whilst Se adsorption using Se-IIPs reached a maximum of 447 μg g-1, after optimization of the sample pH, adsorption time and sample temperature. These adsorption capacities increased to 568 μg g-1 and 530 μg g-1 for As and Se, respectively, when column experiments were done at the same sample temperature and pH. Against Pb, Cd and Hg, the As-IIPs and Se-IIPs showed selectivity towards As and Se, respectively.

  9. A Simple Analytical Method Using HPLC with Fluorescence Detection to Determine Selected Polycyclic Aromatic Compounds in Filter Samples

    International Nuclear Information System (INIS)

    Garcia, S.; Perez, R. M.

    2014-01-01

    A study on the comparison and evaluation of a miniaturized extraction method for the determination of selected PACs in sample filters is presented. The main objective was the optimization and development of simple, rapid and low-cost methods, minimizing the volume of extraction solvent used. The work also includes a study of the intermediate precision. (Author)

  10. Estimating HIES Data through Ratio and Regression Methods for Different Sampling Designs

    Directory of Open Access Journals (Sweden)

    Faqir Muhammad

    2007-01-01

    Full Text Available In this study, a comparison has been made between different sampling designs, using the HIES data of North West Frontier Province (NWFP) for 2001-02 and 1998-99 collected from the Federal Bureau of Statistics, Statistical Division, Government of Pakistan, Islamabad. The performance of the estimators has also been considered using the bootstrap and jackknife. A two-stage stratified random sample design is adopted by HIES: in the first stage, enumeration blocks and villages are treated as first-stage primary sampling units (PSUs), selected with probability proportional to size; secondary sampling units (SSUs), i.e. households, are selected by systematic sampling with a random start. HIES used a single study variable. We have compared the HIES technique with some other designs, namely stratified simple random sampling, stratified systematic sampling, stratified ranked set sampling, and stratified two-phase sampling. Ratio and regression methods were applied with two study variables: income (y) and household size (x). The jackknife and bootstrap were used for variance replication. Simple random sampling with sample sizes of 462 to 561 gave moderate variances by both the jackknife and the bootstrap. Applying systematic sampling, we obtained moderate variance with a sample size of 467. With the jackknife under systematic sampling, the variance of the regression estimator was greater than that of the ratio estimator for sample sizes of 467 to 631; at a sample size of 952 the variance of the ratio estimator became greater than that of the regression estimator. The most efficient design turned out to be ranked set sampling: with the jackknife and bootstrap it gives the minimum variance even at the smallest sample size (467). Two-phase sampling performed poorly. The multi-stage sampling applied by HIES gave large variances, especially when used with a single study variable.
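
    The ratio estimation with jackknife variance replication discussed above can be sketched on simulated data. All numbers here are illustrative stand-ins (HIES microdata are not reproduced), with household size as the auxiliary variable and income as the study variable:

```python
import random
import statistics

random.seed(1)

# Simulated stand-in for HIES-type data: household size (auxiliary x)
# and income (study variable y), plus an assumed known population mean of x.
x = [random.randint(2, 10) for _ in range(60)]
y = [1500 * xi + random.gauss(0, 2000) for xi in x]
x_bar_pop = 6.0  # hypothetical known population mean household size

def ratio_estimate(xs, ys, x_mean_pop):
    # Classical ratio estimator of the mean: y_bar_R = (sum y / sum x) * X_bar.
    return sum(ys) / sum(xs) * x_mean_pop

theta = ratio_estimate(x, y, x_bar_pop)

# Delete-one jackknife: recompute the estimator leaving out each
# household in turn, then form the jackknife variance.
n = len(x)
reps = [ratio_estimate(x[:i] + x[i + 1:], y[:i] + y[i + 1:], x_bar_pop)
        for i in range(n)]
mean_rep = statistics.mean(reps)
jk_var = (n - 1) / n * sum((r - mean_rep) ** 2 for r in reps)

print(f"ratio estimate: {theta:.0f}, jackknife SE: {jk_var ** 0.5:.0f}")
```

    The same delete-one loop, with the regression estimator substituted for `ratio_estimate`, yields the comparison of variances the study reports.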

  11. Representativeness-based sampling network design for the State of Alaska

    Science.gov (United States)

    Forrest M. Hoffman; Jitendra Kumar; Richard T. Mills; William W. Hargrove

    2013-01-01

    Resource and logistical constraints limit the frequency and extent of environmental observations, particularly in the Arctic, necessitating the development of a systematic sampling strategy to maximize coverage and objectively represent environmental variability at desired scales. A quantitative methodology for stratifying sampling domains, informing site selection,...

  12. SnagPRO: snag and tree sampling and analysis methods for wildlife

    Science.gov (United States)

    Lisa J. Bate; Michael J. Wisdom; Edward O. Garton; Shawn C. Clabough

    2008-01-01

    We describe sampling methods and provide software to accurately and efficiently estimate snag and tree densities at desired scales to meet a variety of research and management objectives. The methods optimize sampling effort by choosing a plot size appropriate for the specified forest conditions and sampling goals. Plot selection and data analyses are supported by...

  13. Spectrophotometric Determination of Boron in Environmental Water Samples

    International Nuclear Information System (INIS)

    San San; Khin Win Kyi; Kwaw Naing

    2002-02-01

    This paper presents a study of methods for the determination of boron in environmental water samples. The standard methods useful for this determination are discussed thoroughly. Among the standard methods approved by the American Public Health Association, the carmine method was selected for this study. Prior to the determination of boron in the water samples, the precision and accuracy of the method of choice were examined using standard boron solutions. The determination of boron was carried out on several water samples: waste water from the Aquaculture Research Centre, University of Yangon; water from the Ayeyarwady River near Magway Myathalon Pagoda in Magway Division; ground water from Sanchaung Township; and tap water from the Universities' Research Centre, University of Yangon. Analyses of these water samples were performed and statistical treatment of the results was carried out. (author)

  14. Quality Control Samples for the Radiological Determination of Tritium in Urine Samples

    International Nuclear Information System (INIS)

    Ost'pezuk, P.; Froning, M.; Laumen, S.; Richert, I.; Hill, P.

    2004-01-01

    The radioactive decay product of tritium is a low-energy beta particle that cannot penetrate the outer dead layer of human skin. Therefore, the main hazard associated with tritium is internal exposure. In addition, owing to its relatively long physical half-life and short biological half-life, tritium must be ingested in large amounts to pose a significant health risk. On the other hand, internal exposure should be kept as low as practical. For incorporation monitoring of professional radiation workers, quality control is of utmost importance. In the Research Centre Juelich GmbH (FZJ), a considerable fraction of monitoring by excretion analysis relates to the isotope tritium. Usually an aliquot of a urine sample is mixed with a liquid scintillator and measured in a liquid scintillation counter. Quality control samples in the form of three kinds of internal reference samples (blank, reference sample with low activity, and reference sample with elevated activity) were prepared from mixed, tritium-free urine samples. One millilitre of each sample was pipetted into a liquid scintillation vial, and known amounts of tritium were added to part of these vials. All samples were stored at 20 degrees. Based on long-term use of these reference samples it was possible to construct appropriate control charts with upper and lower alarm limits. Daily use of these reference samples significantly decreases the risk of false results in original urine samples, with no significant increase in determination time. (Author) 2 refs
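    The control-chart construction from the long-term reference-sample history can be sketched as follows: the historical mean sets the center line, and its standard deviation sets 3-sigma alarm limits. The count values are illustrative assumptions, not FZJ data.

    ```python
    # Shewhart-type control chart for the internal reference samples:
    # alarm limits from the long-term mean and standard deviation.
    # History values are illustrative (e.g. Bq/L of a low-activity reference).
    from statistics import mean, stdev

    history = [102.0, 98.5, 101.2, 99.8, 100.4, 97.9, 103.1, 100.0, 99.2, 101.5]

    center = mean(history)
    sigma = stdev(history)
    upper_alarm = center + 3 * sigma
    lower_alarm = center - 3 * sigma

    def in_control(result):
        """True if today's reference-sample result lies within the alarm limits."""
        return lower_alarm <= result <= upper_alarm
    ```

    Running the daily blank, low-activity, and elevated-activity references through such a check flags drift in the scintillation counter before any donor sample is misreported.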

  15. A Method for Choosing the Best Samples for Mars Sample Return.

    Science.gov (United States)

    Gordon, Peter R; Sephton, Mark A

    2018-05-01

    Success of a future Mars Sample Return mission will depend on the correct choice of samples. Pyrolysis-FTIR can be employed as a triage instrument for Mars Sample Return. The technique can thermally dissociate minerals and organic matter for detection. Identification of certain mineral types can determine the habitability of the depositional environment, past or present, while detection of organic matter may suggest past or present habitation. In Mars' history, the Theiikian era represents an attractive target for life search missions and the acquisition of samples. The acidic and increasingly dry Theiikian may have been habitable and followed a lengthy neutral and wet period in Mars' history during which life could have originated and proliferated to achieve relatively abundant levels of biomass with a wide distribution. Moreover, the sulfate minerals produced in the Theiikian are also known to be good preservers of organic matter. We have used pyrolysis-FTIR and samples from a Mars analog ferrous acid stream with a thriving ecosystem to test the triage concept. Pyrolysis-FTIR identified those samples with the greatest probability of habitability and habitation. A three-tier scoring system was developed based on the detection of (i) organic signals, (ii) carbon dioxide and water, and (iii) sulfur dioxide. The presence of each component was given a score of A, B, or C depending on whether the substance had been detected, tentatively detected, or not detected, respectively. Single-step (for greatest possible sensitivity) or multistep (for more diagnostic data) pyrolysis-FTIR methods informed the assignments. The system allowed the highest-priority samples to be categorized as AAA (or A*AA if the organic signal was complex), while the lowest-priority samples could be categorized as CCC. Our methods provide a mechanism with which to rank samples and identify those that should take the highest priority for return to Earth during a Mars Sample Return mission. 
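    The three-tier A/B/C scoring described above maps directly to a small categorization routine. This is a sketch of the scoring logic only; `triage_category` is a hypothetical helper name, and each component is passed as a (detected, tentatively detected) pair.

    ```python
    # Three-tier triage scoring: each component (organics, CO2/H2O, SO2) is
    # scored A (detected), B (tentatively detected) or C (not detected);
    # a complex organic signal upgrades the first score to A*.
    def score(detected, tentative):
        if detected:
            return "A"
        return "B" if tentative else "C"

    def triage_category(organics, co2_h2o, so2, complex_organics=False):
        s1 = score(*organics)
        if s1 == "A" and complex_organics:
            s1 = "A*"
        return s1 + score(*co2_h2o) + score(*so2)

    # Highest-priority sample: everything detected, complex organic signal
    top = triage_category((True, False), (True, False), (True, False),
                          complex_organics=True)                    # "A*AA"
    # Lowest-priority sample: nothing detected
    low = triage_category((False, False), (False, False), (False, False))  # "CCC"
    ```

    Sorting candidate samples by this category string (A* before A, A before B, B before C) reproduces the ranking the triage concept is meant to provide.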

  16. Solid Phase Microextraction and Related Techniques for Drugs in Biological Samples

    OpenAIRE

    Moein, Mohammad Mahdi; Said, Rana; Bassyouni, Fatma; Abdel-Rehim, Mohamed

    2014-01-01

    In drug discovery and development, the quantification of drugs in biological samples is an important task for the determination of the physiological performance of the investigated drugs. After sampling, the next step in the analytical process is sample preparation. Because of the low concentration levels of drug in plasma and the variety of the metabolites, the selected extraction technique should be virtually exhaustive. Recent developments of sample handling techniques are directed, from o...

  17. Enhanced AFCI Sampling, Analysis, and Safeguards Technology Review

    Energy Technology Data Exchange (ETDEWEB)

    John Svoboda

    2009-09-01

    introduce these samples, are discussed. A suggested path forward, based on an automated microchip capillary-based sampling system interfaced to the analysis spectrometer, is presented. The ability to obtain microliter-volume samples, coupled with remote automated means of sample tracking and transport to the instrument, would greatly improve analytical efficiency while reducing both personnel exposure and radioactive waste. Application of this sampling technique to new types of mass spectrometers for selective elemental isotopic analysis could also provide significant improvements in safeguards and security analyses.

  18. An overview of sample preparation procedures for LC-MS multiclass antibiotic determination in environmental and food samples.

    Science.gov (United States)

    Moreno-Bondi, María Cruz; Marazuela, María Dolores; Herranz, Sonia; Rodriguez, Erika

    2009-10-01

    Antibiotics are a class of pharmaceuticals that are of great interest due to the large volumes of these substances that are consumed in both human and veterinary medicine, and due to their status as the agents responsible for bacterial resistance. They can be present in foodstuffs and in environmental samples as multicomponent chemical mixtures that exhibit a wide range of mechanisms of action. Moreover, they can be transformed into different metabolites by the action of microorganisms, as well as by other physical or chemical means, resulting in mixtures with higher ecotoxicities and risks to human health than those of the individual compounds. Therefore, there is growing interest in the availability of multiclass methods for the analysis of antimicrobial mixtures in environmental and food samples at very low concentrations. Liquid chromatography (LC) has become the technique of choice for multiclass analysis, especially when coupled to mass spectrometry (LC-MS) and tandem MS (LC-MS(2)). However, due to the complexity of the matrix, in most cases an extraction step for sample clean-up and preconcentration is required before analysis in order to achieve the required sensitivities. This paper reviews the most recent developments and applications of multiclass antimicrobial determination in environmental and food matrices, emphasizing the practical aspects of sample preparation for the simultaneous extraction of antimicrobials from the selected samples. Future trends in the application of LC-MS-based techniques to multiclass antibiotic analysis are also presented.

  19. A COMPLETE SAMPLE OF BRIGHT SWIFT LONG GAMMA-RAY BURSTS. I. SAMPLE PRESENTATION, LUMINOSITY FUNCTION AND EVOLUTION

    International Nuclear Information System (INIS)

    Salvaterra, R.; Campana, S.; Vergani, S. D.; Covino, S.; D'Avanzo, P.; Fugazza, D.; Ghirlanda, G.; Ghisellini, G.; Melandri, A.; Sbarufatti, B.; Tagliaferri, G.; Nava, L.; Flores, H.; Piranomonte, S.

    2012-01-01

    We present a carefully selected sub-sample of Swift long gamma-ray bursts (GRBs) that is complete in redshift. The sample is constructed by considering only bursts with favorable observing conditions for ground-based follow-up searches, which are bright in the 15-150 keV Swift/BAT band, i.e., with 1-s peak photon fluxes in excess of 2.6 photons s⁻¹ cm⁻². The sample is composed of 58 bursts, 52 of them with redshift for a completeness level of 90%, while another two have a redshift constraint, reaching a completeness level of 95%. For only three bursts we have no constraint on the redshift. The high level of redshift completeness allows us for the first time to constrain the GRB luminosity function and its evolution with cosmic time in an unbiased way. We find that strong evolution in luminosity (δl = 2.3 ± 0.6) or in density (δd = 1.7 ± 0.5) is required in order to account for the observations. The derived redshift distributions in the two scenarios are consistent with each other, in spite of their different intrinsic redshift distributions. This calls for other indicators to distinguish among different evolution models. Complete samples are at the base of any population study. In future works we will use this unique sample of Swift bright GRBs to study the properties of the population of long GRBs.

  20. A COMPLETE SAMPLE OF BRIGHT SWIFT LONG GAMMA-RAY BURSTS. I. SAMPLE PRESENTATION, LUMINOSITY FUNCTION AND EVOLUTION

    Energy Technology Data Exchange (ETDEWEB)

    Salvaterra, R. [INAF, IASF Milano, via E. Bassini 15, I-20133 Milano (Italy); Campana, S.; Vergani, S. D.; Covino, S.; D' Avanzo, P.; Fugazza, D.; Ghirlanda, G.; Ghisellini, G.; Melandri, A.; Sbarufatti, B.; Tagliaferri, G. [INAF, Osservatorio Astronomico di Brera, via E. Bianchi 46, I-23807 Merate (Saint Lucia) (Italy); Nava, L. [SISSA, via Bonomea 265, I-34136 Trieste (Italy); Flores, H. [Laboratoire GEPI, Observatoire de Paris, CNRS-UMR8111, Univ. Paris-Diderot 5 place Jules Janssen, 92195 Meudon (France); Piranomonte, S., E-mail: ruben@lambrate.inaf.it [INAF, Osservatorio Astronomico di Roma, via Frascati 33, 00040 Monte Porzio Catone, Rome (Italy)

    2012-04-10

    We present a carefully selected sub-sample of Swift long gamma-ray bursts (GRBs) that is complete in redshift. The sample is constructed by considering only bursts with favorable observing conditions for ground-based follow-up searches, which are bright in the 15-150 keV Swift/BAT band, i.e., with 1-s peak photon fluxes in excess of 2.6 photons s{sup -1} cm{sup -2}. The sample is composed of 58 bursts, 52 of them with redshift for a completeness level of 90%, while another two have a redshift constraint, reaching a completeness level of 95%. For only three bursts we have no constraint on the redshift. The high level of redshift completeness allows us for the first time to constrain the GRB luminosity function and its evolution with cosmic time in an unbiased way. We find that strong evolution in luminosity ({delta}{sub l} = 2.3 {+-} 0.6) or in density ({delta}{sub d} = 1.7 {+-} 0.5) is required in order to account for the observations. The derived redshift distributions in the two scenarios are consistent with each other, in spite of their different intrinsic redshift distributions. This calls for other indicators to distinguish among different evolution models. Complete samples are at the base of any population study. In future works we will use this unique sample of Swift bright GRBs to study the properties of the population of long GRBs.

  1. Urine sample collection protocols for bioassay samples

    Energy Technology Data Exchange (ETDEWEB)

    MacLellan, J.A.; McFadden, K.M.

    1992-11-01

    In vitro radiobioassay analyses are used to measure the amount of radioactive material excreted by personnel exposed to the potential intake of radioactive material. The analytical results are then used with various metabolic models to estimate the amount of radioactive material in the subject's body and the original intake of radioactive material. Proper application of these metabolic models requires knowledge of the excretion period. It is normal practice to design the bioassay program based on a 24-hour excretion sample. The Hanford bioassay program simulates a total 24-hour urine excretion sample with urine collection periods lasting from one-half hour before retiring to one-half hour after rising on two consecutive days. Urine passed during the specified periods is collected in three 1-L bottles. Because the daily excretion volume given in Publication 23 of the International Commission on Radiological Protection (ICRP 1975, p. 354) for Reference Man is 1.4 L, it was proposed to use only two 1-L bottles as a cost-saving measure. This raised the broader question of what should be the design capacity of a 24-hour urine sample kit.

  2. Urine sample collection protocols for bioassay samples

    Energy Technology Data Exchange (ETDEWEB)

    MacLellan, J.A.; McFadden, K.M.

    1992-11-01

    In vitro radiobioassay analyses are used to measure the amount of radioactive material excreted by personnel exposed to the potential intake of radioactive material. The analytical results are then used with various metabolic models to estimate the amount of radioactive material in the subject's body and the original intake of radioactive material. Proper application of these metabolic models requires knowledge of the excretion period. It is normal practice to design the bioassay program based on a 24-hour excretion sample. The Hanford bioassay program simulates a total 24-hour urine excretion sample with urine collection periods lasting from one-half hour before retiring to one-half hour after rising on two consecutive days. Urine passed during the specified periods is collected in three 1-L bottles. Because the daily excretion volume given in Publication 23 of the International Commission on Radiological Protection (ICRP 1975, p. 354) for Reference Man is 1.4 L, it was proposed to use only two 1-L bottles as a cost-saving measure. This raised the broader question of what should be the design capacity of a 24-hour urine sample kit.

  3. APTIMA assay on SurePath liquid-based cervical samples compared to endocervical swab samples facilitated by a real time database

    Directory of Open Access Journals (Sweden)

    Khader Samer

    2010-01-01

    Full Text Available Background: Liquid-based cytology (LBC) cervical samples are increasingly being used to test for pathogens, including HPV, Chlamydia trachomatis (CT) and Neisseria gonorrhoeae (GC), using nucleic acid amplification tests. Several reports have shown the accuracy of such testing on ThinPrep (TP) LBC samples. Fewer studies have evaluated SurePath (SP) LBC samples, which utilize a different specimen preservative. This study was undertaken to assess the performance of the Aptima Combo 2 Assay (AC2) for CT and GC on SP versus endocervical swab samples in our laboratory. Materials and Methods: The live pathology database of Montefiore Medical Center was searched for patients with AC2 endocervical swab specimens and SP Paps taken the same day. SP samples from CT- and/or GC-positive endocervical swab patients and randomly selected negative patients were studied. In each case, 1.5 ml of the residual SP vial sample, which was in SP preservative and stored at room temperature, was transferred within seven days of collection to APTIMA specimen transfer tubes without any sample or patient identifiers. Blind testing with the AC2 assay was performed on the Tigris DTS System (Gen-Probe, San Diego, CA). Finalized SP results were compared with the previously reported endocervical swab results for the entire group and separately for patients 25 years and younger and patients over 25 years. Results: SP specimens from 300 patients were tested. This included 181 swab CT-positive, 12 swab GC-positive, 7 CT and GC positive and 100 randomly selected swab CT and GC negative patients. Using the endocervical swab results as the patient's infection status, AC2 assay of the SP samples showed: CT sensitivity 89.3%, CT specificity 100.0%; GC sensitivity and specificity 100.0%. CT sensitivity for patients 25 years or younger was 93.1%, versus 80.7% for patients over 25 years, a statistically significant difference (P = 0.02). Conclusions: Our results show that AC2 assay of 1.5 ml SP
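    The reported sensitivity and specificity figures follow from a standard confusion-table calculation against the swab result as reference status. The counts below are illustrative assumptions (the abstract does not give the per-cell breakdown), chosen only to show the arithmetic.

    ```python
    # Sensitivity/specificity from a confusion table, using the endocervical
    # swab result as the reference infection status. Counts are illustrative.
    def sensitivity(tp, fn):
        # fraction of reference-positive patients the assay detects
        return tp / (tp + fn)

    def specificity(tn, fp):
        # fraction of reference-negative patients the assay clears
        return tn / (tn + fp)

    # Illustrative CT counts: 188 swab-positive, 112 swab-negative patients
    tp, fn = 168, 20   # SP-positive / SP-negative among swab-positives
    tn, fp = 112, 0    # among swab-negatives

    sens = sensitivity(tp, fn)   # 168/188, about 0.894
    spec = specificity(tn, fp)   # 1.0
    ```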

  4. Plasma as an alternative sample to quantify tetanus antitoxin

    Directory of Open Access Journals (Sweden)

    Ariel Menéndez-Barrios

    2015-08-01

    Full Text Available Tetanus antitoxin is quantified in Cuba at blood banks, from the serum of immunized donors, to produce specific human gamma globulin. A heterogeneous indirect immunoenzymatic assay is used, with serum as the analytical sample. The possible use of plasma obtained from plasmapheresis as an alternative sample was evaluated in this research, to minimize the volume of whole blood drawn from donors. One hundred plasma donors who came to donate between October and November 2013 were selected by simple random sampling. The serum sample was obtained by drawing 5 mL of blood into a dry glass tube, while the plasma sample (1.5 mL) was taken in a covered plastic tube directly from the collected plasma unit at the end of the donation. Comparison of the difference between the means of both groups was done using SPSS for Windows. It was found that the values obtained in serum were higher than those obtained in plasma, and the difference between the means of both groups was statistically significant (p 0.00). It is therefore not advisable to use plasma obtained from plasmapheresis as the analytical sample in this assay.
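    A comparison of per-donor serum and plasma values of the kind reported can be sketched with a paired t statistic on the differences. The study itself used SPSS; the titre values below are illustrative assumptions, not the study's data.

    ```python
    # Paired t statistic for serum-vs-plasma antitoxin titres (illustrative).
    from statistics import mean, stdev
    from math import sqrt

    serum  = [10.2, 8.9, 11.4, 9.7, 10.8, 9.3, 10.1, 11.0]   # IU/mL
    plasma = [ 9.6, 8.4, 10.9, 9.1, 10.2, 8.8,  9.7, 10.3]

    diffs = [s - p for s, p in zip(serum, plasma)]
    t_stat = mean(diffs) / (stdev(diffs) / sqrt(len(diffs)))

    # |t| above the two-sided critical value (2.365 for df = 7 at the 5% level)
    # indicates a significant serum-plasma difference
    significant = abs(t_stat) > 2.365
    ```

    With 100 paired donors, as in the study, the same statistic would be compared against the df = 99 critical value (about 1.984).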

  5. Calibration model maintenance in melamine resin production: Integrating drift detection, smart sample selection and model adaptation.

    Science.gov (United States)

    Nikzad-Langerodi, Ramin; Lughofer, Edwin; Cernuda, Carlos; Reischer, Thomas; Kantner, Wolfgang; Pawliczek, Marcin; Brandstetter, Markus

    2018-07-12

    selection of samples by active learning (AL) used for subsequent model adaptation is advantageous compared to passive (random) selection in case that a drift leads to persistent prediction bias allowing more rapid adaptation at lower reference measurement rates. Fully unsupervised adaptation using FLEXFIS-PLS could improve predictive accuracy significantly for light drifts but was not able to fully compensate for prediction bias in case of significant lack of fit w.r.t. the latent variable space. Copyright © 2018 Elsevier B.V. All rights reserved.

  6. Estimation of Finite Population Mean in Multivariate Stratified Sampling under Cost Function Using Goal Programming

    Directory of Open Access Journals (Sweden)

    Atta Ullah

    2014-01-01

    Full Text Available In practical utilization of a stratified random sampling scheme, the investigator faces the problem of selecting a sample that maximizes the precision of a finite population mean under a cost constraint. The allocation of sample size becomes complicated when more than one characteristic is observed from each selected unit in a sample. In many real-life situations, a linear cost function of the sample size nh is not a good approximation to the actual cost of a sample survey when the traveling cost between selected units in a stratum is significant. In this paper, the sample allocation problem in multivariate stratified random sampling with the proposed cost function is formulated as an integer nonlinear multiobjective mathematical programming problem. A solution procedure is proposed using an extended lexicographic goal programming approach. A numerical example is presented to illustrate the computational details and to compare the efficiency of the proposed compromise allocation.
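    The compromise-allocation idea can be illustrated with a much simpler stand-in than the paper's goal-programming formulation: compute the Neyman allocation for each study variable separately, then average the two into one allocation. The stratum sizes and standard deviations below are illustrative assumptions.

    ```python
    # Compromise allocation sketch: per-variable Neyman allocations averaged
    # into one sample allocation (not the paper's lexicographic goal program).
    N = [400, 300, 300]            # stratum sizes (illustrative)
    S_income = [12.0, 20.0, 35.0]  # stratum SDs of income (y)
    S_hhsize = [1.5, 2.0, 2.5]     # stratum SDs of household size (x)
    n_total = 120                  # overall sample size allowed by the budget

    def neyman(N, S, n):
        # Allocate n proportionally to N_h * S_h (minimum-variance allocation
        # for a single variable under a linear cost function)
        w = [Nh * Sh for Nh, Sh in zip(N, S)]
        total = sum(w)
        return [n * wh / total for wh in w]

    a_income = neyman(N, S_income, n_total)
    a_hhsize = neyman(N, S_hhsize, n_total)
    compromise = [round((a + b) / 2) for a, b in zip(a_income, a_hhsize)]
    ```

    The two single-variable optima pull in different directions (income favors the high-variance stratum, household size is nearly uniform); the compromise splits the difference, which is the trade-off the goal-programming formulation resolves more rigorously.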

  7. Characterization of Vadose Zone Sediment: Uncontaminated RCRA Borehole Core Samples and Composite Samples

    International Nuclear Information System (INIS)

    Serne, R. Jeffrey; Bjornstad, Bruce N.; Schaef, Herbert T.; Williams, Bruce A.; Lanigan, David C.; Horton, Duane G.; Clayton, Ray E.; Mitroshkov, Alexandre V.; Legore, Virginia L.; O'Hara, Matthew J.; Brown, Christopher F.; Parker, Kent E.; Kutnyakov, Igor V.; Serne, Jennifer N.; Last, George V.; Smith, Steven C.; Lindenmeier, Clark W.; Zachara, John M.; Burke, Deborah S.

    2008-01-01

    This report was revised in September 2008 to remove acid-extractable sodium data from Tables 4.14, 4.16, 5.20, 5.22, 5.43, and 5.45. The sodium data was removed due to potential contamination introduced during the acid extraction process. The rest of the text remains unchanged from the original report issued in February 2002. The overall goal of the Tank Farm Vadose Zone Project, led by CH2M HILL Hanford Group, Inc., is to define risks from past and future single-shell tank farm activities. To meet this goal, CH2M HILL Hanford Group, Inc. asked scientists from Pacific Northwest National Laboratory to perform detailed analyses on vadose zone sediment from within the S-SX Waste Management Area. This report is one in a series of four reports to present the results of these analyses. Specifically, this report contains all the geologic, geochemical, and selected physical characterization data collected on vadose zone sediment recovered from Resource Conservation and Recovery Act (RCRA) borehole core samples and composite samples

  8. Characterization of Vadose Zone Sediment: Uncontaminated RCRA Borehole Core Samples and Composite Samples

    Energy Technology Data Exchange (ETDEWEB)

    Serne, R. Jeffrey; Bjornstad, Bruce N.; Schaef, Herbert T.; Williams, Bruce A.; Lanigan, David C.; Horton, Duane G.; Clayton, Ray E.; Mitroshkov, Alexandre V.; Legore, Virginia L.; O' Hara, Matthew J.; Brown, Christopher F.; Parker, Kent E.; Kutnyakov, Igor V.; Serne, Jennifer N.; Last, George V.; Smith, Steven C.; Lindenmeier, Clark W.; Zachara, John M.; Burke, Deborah S.

    2008-09-11

    This report was revised in September 2008 to remove acid-extractable sodium data from Tables 4.14, 4.16, 5.20, 5.22, 5.43, and 5.45. The sodium data was removed due to potential contamination introduced during the acid extraction process. The rest of the text remains unchanged from the original report issued in February 2002. The overall goal of the Tank Farm Vadose Zone Project, led by CH2M HILL Hanford Group, Inc., is to define risks from past and future single-shell tank farm activities. To meet this goal, CH2M HILL Hanford Group, Inc. asked scientists from Pacific Northwest National Laboratory to perform detailed analyses on vadose zone sediment from within the S-SX Waste Management Area. This report is one in a series of four reports to present the results of these analyses. Specifically, this report contains all the geologic, geochemical, and selected physical characterization data collected on vadose zone sediment recovered from Resource Conservation and Recovery Act (RCRA) borehole core samples and composite samples.

  9. Recent advances in applications of nanomaterials for sample preparation.

    Science.gov (United States)

    Xu, Linnan; Qi, Xiaoyue; Li, Xianjiang; Bai, Yu; Liu, Huwei

    2016-01-01

    Sample preparation is a key step in the qualitative and quantitative analysis of trace analytes in complicated matrices. Along with the rapid development of nanotechnology in material science, numerous nanomaterials have been developed with particularly useful applications in analytical chemistry. Benefiting from their high specific areas, increased surface activities, and unprecedented physical/chemical properties, the potential of nanomaterials for rapid and efficient sample preparation has been exploited extensively. In this review, recent progress on novel nanomaterials applied in sample preparation is summarized and discussed. Both nanoparticles and nanoporous materials are evaluated for their unusual performance in sample preparation. Various compositions and functionalizations have extended the applications of nanomaterials in sample preparation, and distinct size and shape selectivity is generated from the diversified pore structures of nanoporous materials. Such great variety makes nanomaterials versatile tools in sample preparation for almost all categories of analytes. Copyright © 2015 Elsevier B.V. All rights reserved.

  10. Methods for Sampling and Measurement of Compressed Air Contaminants

    Energy Technology Data Exchange (ETDEWEB)

    Stroem, L

    1976-10-15

    In order to improve the technique for measuring oil and water entrained in a compressed air stream, a laboratory study has been made of some methods for sampling and measurement. For this purpose water or oil as artificial contaminants were injected in thin streams into a test loop carrying dry compressed air. Sampling was performed in a vertical run, downstream of the injection point. Wall-attached liquid, coarse droplet flow, and fine droplet flow were sampled separately. The results were compared with two-phase flow theory and direct observation of liquid behaviour. In a study of sample transport through narrow tubes, it was observed that, below a certain liquid loading, the sample did not move, the liquid remaining stationary on the tubing wall. The basic analysis of the collected samples was made by gravimetric methods. Adsorption tubes were used with success to measure water vapour. A humidity meter with a sensor of the aluminium oxide type was found to be unreliable. Oil could be measured selectively by a flame ionization detector, the sample being pretreated in an evaporation-condensation unit

  11. Methods for Sampling and Measurement of Compressed Air Contaminants

    International Nuclear Information System (INIS)

    Stroem, L.

    1976-10-01

    In order to improve the technique for measuring oil and water entrained in a compressed air stream, a laboratory study has been made of some methods for sampling and measurement. For this purpose water or oil as artificial contaminants were injected in thin streams into a test loop carrying dry compressed air. Sampling was performed in a vertical run, downstream of the injection point. Wall-attached liquid, coarse droplet flow, and fine droplet flow were sampled separately. The results were compared with two-phase flow theory and direct observation of liquid behaviour. In a study of sample transport through narrow tubes, it was observed that, below a certain liquid loading, the sample did not move, the liquid remaining stationary on the tubing wall. The basic analysis of the collected samples was made by gravimetric methods. Adsorption tubes were used with success to measure water vapour. A humidity meter with a sensor of the aluminium oxide type was found to be unreliable. Oil could be measured selectively by a flame ionization detector, the sample being pretreated in an evaporation-condensation unit

  12. Improved sample size determination for attributes and variables sampling

    International Nuclear Information System (INIS)

    Stirpe, D.; Picard, R.R.

    1985-01-01

    Earlier INMM papers have addressed the attributes/variables problem and, under conservative/limiting approximations, have reported analytical solutions for the attributes and variables sample sizes. Through computer simulation of this problem, we have calculated attributes and variables sample sizes as a function of falsification, measurement uncertainties, and required detection probability without using approximations. Using realistic assumptions for uncertainty parameters of measurement, the simulation results support the conclusions: (1) previously used conservative approximations can be expensive because they lead to larger sample sizes than needed; and (2) the optimal verification strategy, as well as the falsification strategy, are highly dependent on the underlying uncertainty parameters of the measurement instruments. 1 ref., 3 figs
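    In the limiting error-free case, the attributes sample-size problem the simulation generalizes has a closed hypergeometric form: the smallest n that detects at least one of r falsified items among N with the required probability. The parameter values are illustrative.

    ```python
    # Smallest attributes sample size n such that, sampling without
    # replacement, at least one of r falsified items among N is found
    # with probability >= p_detect (no measurement error assumed).
    from math import comb

    def attributes_sample_size(N, r, p_detect):
        for n in range(1, N + 1):
            # Probability of drawing n items and missing all r falsified ones
            p_miss = comb(N - r, n) / comb(N, n)
            if 1 - p_miss >= p_detect:
                return n
        return N

    n_req = attributes_sample_size(N=500, r=20, p_detect=0.95)
    ```

    The simulation in the paper relaxes exactly the assumptions this closed form relies on, adding measurement uncertainty for the variables part of the problem; that is why the conservative analytical sizes come out larger than needed.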

  13. Lunar Sample Compendium

    Science.gov (United States)

    Meyer, Charles

    2005-01-01

    The purpose of the Lunar Sample Compendium will be to inform scientists, astronauts and the public about the various lunar samples that have been returned from the Moon. This Compendium will be organized rock by rock in the manner of a catalog, but will not be as comprehensive, nor as complete, as the various lunar sample catalogs that are available. Likewise, this Compendium will not duplicate the various excellent books and reviews on the subject of lunar samples (Cadogen 1981, Heiken et al. 1991, Papike et al. 1998, Warren 2003, Eugster 2003). However, it is thought that an online Compendium, such as this, will prove useful to scientists proposing to study individual lunar samples and should help provide backup information for lunar sample displays. This Compendium will allow easy access to the scientific literature by briefly summarizing the significant findings of each rock along with the documentation of where the detailed scientific data are to be found. In general, discussion and interpretation of the results is left to the formal reviews found in the scientific literature. An advantage of this Compendium will be that it can be updated, expanded and corrected as need be.

  14. Flora sampling in the vicinity of gamma greenhouse: As a flora sampling model for supporting the national nuclear power program (NPP)

    International Nuclear Information System (INIS)

    Affrida Abu Hassan; Zaiton Ahmad; Ros Anita Ahmad Ramli; Shakinah Salleh; Norazlina Noordin; Salmah Moosa; Sakinah Ariffin; Salahbiah Abdul Majid; Nur Humaira Lau Abdullah; Mohd Zaidan Kandar; Abdul Rahim Harun

    2012-01-01

    The Gamma Greenhouse was used as a model to study radiation effects on flora and ecosystems in support of the National Nuclear Power Programme (NPP). A task force was formed in BAB consisting of three main groups: flora, fauna and microorganisms. For the flora group, two sampling expeditions were carried out, on July 7, 2011 and March 2, 2012, with the assistance of experts from University Putra Malaysia. From these expeditions, preliminary data on the types and distribution of plants in selected quadrants close to the Gamma Greenhouse were successfully obtained, and the collected plant samples have been preserved as part of the herbarium collection. This paper describes the sampling activities and sample preservation. Knowledge gained from this study will be very useful as a model for flora distribution baseline data at the plant site. (author)

  15. Sampling free energy surfaces as slices by combining umbrella sampling and metadynamics.

    Science.gov (United States)

    Awasthi, Shalini; Kapil, Venkat; Nair, Nisanth N

    2016-06-15

    Metadynamics (MTD) is a very powerful technique to sample high-dimensional free energy landscapes, and due to its self-guiding property, the method has been successful in studying complex reactions and conformational changes. MTD sampling is based on filling the free energy basins by biasing potentials, and thus for cases with flat, broad, and unbound free energy wells, the computational time to sample them becomes very large. To alleviate this problem, we combine the standard Umbrella Sampling (US) technique with MTD to sample orthogonal collective variables (CVs) in a simultaneous way. Within this scheme, we construct the equilibrium distribution of CVs from biased distributions obtained from independent MTD simulations with umbrella potentials. Reweighting is carried out by a procedure that combines US reweighting and Tiwary-Parrinello MTD reweighting within the Weighted Histogram Analysis Method (WHAM). The approach is ideal for a controlled sampling of a CV in a MTD simulation, making it computationally efficient in sampling flat, broad, and unbound free energy surfaces. This technique also allows for a distributed sampling of a high-dimensional free energy surface, further increasing the computational efficiency in sampling. We demonstrate the application of this technique in sampling high-dimensional surfaces for various chemical reactions using ab initio and QM/MM hybrid molecular dynamics simulations. Further, to carry out MTD bias reweighting for computing forward reaction barriers in ab initio or QM/MM simulations, we propose a computationally affordable approach that does not require recrossing trajectories. © 2016 Wiley Periodicals, Inc.
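    The reweighting step in this record builds on the WHAM self-consistency equations for umbrella windows. The following is a minimal sketch of plain umbrella-sampling WHAM on synthetic data (the flat test potential, harmonic spring constant, window centers, and bin counts are all assumptions for illustration, and the Tiwary-Parrinello MTD reweighting component is omitted):

```python
import numpy as np

rng = np.random.default_rng(0)
K, nbins, nsamp = 3, 40, 4000          # windows, histogram bins, samples per window
centers = np.array([0.25, 0.50, 0.75])  # umbrella centers (assumed)
kspring = 20.0                          # harmonic bias strength (assumed)

def bias(x, c):
    return 0.5 * kspring * (x - c) ** 2

# Synthetic data: the unbiased potential is flat on [0, 1], so each window's
# samples follow exp(-bias); draw them by rejection from the uniform prior.
edges = np.linspace(0.0, 1.0, nbins + 1)
mids = 0.5 * (edges[:-1] + edges[1:])
hists = np.zeros((K, nbins))
for k, c in enumerate(centers):
    xs = []
    while len(xs) < nsamp:
        x = rng.random()
        if rng.random() < np.exp(-bias(x, c)):
            xs.append(x)
    hists[k] = np.histogram(xs, bins=edges)[0]
N = np.full(K, float(nsamp))

# WHAM self-consistent iteration (beta = 1):
#   P_i = sum_k n_ki / sum_k N_k exp(f_k - w_ki)
#   f_k = -ln sum_i P_i exp(-w_ki)
W = np.exp(-bias(mids[None, :], centers[:, None]))  # K x nbins, exp(-w_ki)
f = np.zeros(K)
for _ in range(5000):
    P = hists.sum(0) / ((N * np.exp(f))[:, None] * W).sum(0)
    f_new = -np.log((P[None, :] * W).sum(1))
    if np.max(np.abs(f_new - f)) < 1e-10:
        f = f_new
        break
    f = f_new

F = -np.log(P)  # recovered free energy profile; flat potential -> nearly constant
F -= F.min()
```

Because the underlying potential is flat, the recovered profile F should be constant up to statistical noise, which is a quick sanity check on the iteration.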

  16. Self-attenuation of gamma rays during radioactivity concentration analysis of environmental samples

    International Nuclear Information System (INIS)

    Watson, D.; Dharmasiri, J.; Akber, R.

    2001-01-01

    Gamma spectroscopy using HPGe detector systems is a widely used technique for routine analysis of radioactivity in environmental samples. The systems are generally calibrated using standards of known radioactivity and composition. Radioactivity in environmental samples is generally distributed in the bulk of the material. When a sample of finite thickness is analysed through gamma spectroscopy, a proportion of the gamma rays emitted from the sample is either stopped or scattered by the sample material itself. These processes of self-absorption and self-attenuation depend upon the physical and elemental composition of the sample and the energy of the gamma radiation. Since environmental samples vary in composition, instrument calibration using a fixed matrix composition may not be valid for a diversity of samples. We selected and analysed five sample matrices to investigate the influence of self-absorption and self-attenuation in environmental samples. Our selection consisted of bentonite and kaolin representing clay, quartz representing silica, ash representing prepared biota, and analytical grade MnO2 representing a co-precipitant used for extracting radioactivity from aqueous samples. Our findings show that, within 5% uncertainty, silica-based standards can be used to cover environmental samples of varying clay and silica content. The detection efficiency for ash and MnO2 could be different, particularly in the 30-100 keV energy range. The differences in sample behaviour can be explained on the basis of atomic number, mass number and density.

  17. Focused conformational sampling in proteins

    Science.gov (United States)

    Bacci, Marco; Langini, Cassiano; Vymětal, Jiří; Caflisch, Amedeo; Vitalis, Andreas

    2017-11-01

    A detailed understanding of the conformational dynamics of biological molecules is difficult to obtain by experimental techniques due to resolution limitations in both time and space. Computer simulations avoid these in theory but are often too short to sample rare events reliably. Here we show that the progress index-guided sampling (PIGS) protocol can be used to enhance the sampling of rare events in selected parts of biomolecules without perturbing the remainder of the system. The method is very easy to use as it only requires as essential input a set of several features representing the parts of interest sufficiently. In this feature space, new states are discovered by spontaneous fluctuations alone and in unsupervised fashion. Because there are no energetic biases acting on phase space variables or projections thereof, the trajectories PIGS generates can be analyzed directly in the framework of transition networks. We demonstrate the possibility and usefulness of such focused explorations of biomolecules with two loops that are part of the binding sites of bromodomains, a family of epigenetic "reader" modules. This real-life application uncovers states that are structurally and kinetically far away from the initial crystallographic structures and are also metastable. Representative conformations are intended to be used in future high-throughput virtual screening campaigns.

  18. Analysis of arsenical metabolites in biological samples.

    Science.gov (United States)

    Hernandez-Zavala, Araceli; Drobna, Zuzana; Styblo, Miroslav; Thomas, David J

    2009-11-01

    Quantitation of iAs and its methylated metabolites in biological samples provides dosimetric information needed to understand dose-response relations. Here, methods are described for separation of inorganic and mono-, di-, and trimethylated arsenicals by thin layer chromatography. This method has been extensively used to track the metabolism of the radionuclide [73As] in a variety of in vitro assay systems. In addition, a hydride generation-cryotrapping-gas chromatography-atomic absorption spectrometric method is described for the quantitation of arsenicals in biological samples. This method uses pH-selective hydride generation to differentiate among arsenicals containing trivalent or pentavalent arsenic.

  19. C-Arm Computed Tomography-Assisted Adrenal Venous Sampling Improved Right Adrenal Vein Cannulation and Sampling Quality in Primary Aldosteronism.

    Science.gov (United States)

    Park, Chung Hyun; Hong, Namki; Han, Kichang; Kang, Sang Wook; Lee, Cho Rok; Park, Sungha; Rhee, Yumie

    2018-05-04

    Adrenal venous sampling (AVS) is the gold standard for subtype classification of primary aldosteronism (PA). However, this procedure has a high failure rate because of the anatomical difficulties in accessing the right adrenal vein. We investigated whether C-arm computed tomography-assisted AVS (C-AVS) could improve the success rate of adrenal sampling. A total of 156 patients diagnosed with PA who underwent AVS from May 2004 through April 2017 were included. Based on the medical records, we retrospectively compared the overall, left, and right catheterization success rates of adrenal veins during the periods without C-AVS (2004 to 2010, n=32) and with C-AVS (2011 to 2016, n=134). The primary outcome was adequate bilateral sampling, defined as a selectivity index (SI) >5. With C-AVS, the rate of adequate bilateral AVS increased from 40.6% to 88.7%. C-AVS was an independent predictor of adequate bilateral sampling in the multivariate model (odds ratio, 9.01). C-AVS improved the overall success rate of AVS, possibly as a result of better catheterization of the right adrenal vein. Copyright © 2018 Korean Endocrine Society.

  20. Learning from Past Classification Errors: Exploring Methods for Improving the Performance of a Deep Learning-based Building Extraction Model through Quantitative Analysis of Commission Errors for Optimal Sample Selection

    Science.gov (United States)

    Swan, B.; Laverdiere, M.; Yang, L.

    2017-12-01

    In the past five years, deep Convolutional Neural Networks (CNN) have been increasingly favored for computer vision applications due to their high accuracy and ability to generalize well in very complex problems; however, details of how they function, and in turn how they may be optimized, are still imperfectly understood. In particular, their complex and highly nonlinear network architecture, including many hidden layers and self-learned parameters, as well as their mathematical implications, present open questions about how to effectively select training data. Without knowledge of the exact ways the model processes and transforms its inputs, intuition alone may fail as a guide to selecting highly relevant training samples. Working in the context of improving a CNN-based building extraction model used for the LandScan USA gridded population dataset, we have approached this problem by developing a semi-supervised, highly scalable approach to select training samples from a dataset of identified commission errors. Due to the large scope of this project, tens of thousands of potential samples could be derived from identified commission errors. To efficiently trim those samples down to a manageable and effective set for creating additional training samples, we statistically summarized the spectral characteristics of areas with rates of commission errors at the image tile level and grouped these tiles using affinity propagation. Highly representative members of each commission error cluster were then used to select sites for training sample creation. The model will be incrementally re-trained with the new training data to allow for an assessment of how the addition of different types of samples affects the model performance, such as precision and recall rates. 
By using quantitative analysis and data clustering techniques to select highly relevant training samples, we hope to improve model performance in a manner that is resource efficient, both in terms of training process
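    The tile-grouping step in the record above relies on affinity propagation to pick highly representative cluster members. Below is a from-scratch sketch of standard affinity propagation (responsibility/availability message passing with damping) on toy 2-D "tile signatures"; the data, preference value, damping factor, and iteration count are assumptions for illustration, not the authors' settings:

```python
import numpy as np

def affinity_propagation(S, damping=0.5, iters=200):
    """Standard affinity propagation on a similarity matrix S.
    Returns exemplar indices and, for each point, its exemplar label."""
    n = S.shape[0]
    R = np.zeros((n, n))
    A = np.zeros((n, n))
    for _ in range(iters):
        # Responsibilities: R(i,k) = S(i,k) - max_{k'!=k} (A(i,k') + S(i,k'))
        AS = A + S
        idx = np.argmax(AS, axis=1)
        first = AS[np.arange(n), idx]
        AS[np.arange(n), idx] = -np.inf
        second = AS.max(axis=1)
        R_new = S - first[:, None]
        R_new[np.arange(n), idx] = S[np.arange(n), idx] - second
        R = damping * R + (1 - damping) * R_new
        # Availabilities: A(i,k) = min(0, R(k,k) + sum_{i' not in {i,k}} max(0, R(i',k)))
        Rp = np.maximum(R, 0)
        np.fill_diagonal(Rp, R.diagonal())
        A_new = Rp.sum(axis=0)[None, :] - Rp
        dA = A_new.diagonal().copy()        # A(k,k) = sum_{i'!=k} max(0, R(i',k))
        A_new = np.minimum(A_new, 0)
        np.fill_diagonal(A_new, dA)
        A = damping * A + (1 - damping) * A_new
    exemplars = np.flatnonzero((A + R).diagonal() > 0)
    labels = exemplars[np.argmax(S[:, exemplars], axis=1)]
    return exemplars, labels

# Toy "tile signatures": two well-separated groups of five points each.
pts = np.array([[0, 0], [0.1, 0], [0, 0.1], [-0.1, 0], [0, -0.1],
                [10, 10], [10.1, 10], [10, 10.1], [9.9, 10], [10, 9.9]])
S = -((pts[:, None, :] - pts[None, :, :]) ** 2).sum(-1)  # negative squared distance
np.fill_diagonal(S, -20.0)  # preference (assumed): controls number of exemplars
exemplars, labels = affinity_propagation(S)
```

The exemplars play the role of the "highly representative members" of each commission-error cluster; in practice one would feed tile-level spectral summaries rather than toy coordinates.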

  1. Procedures for sampling and sample reduction within quality assurance systems for solid biofuels

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2005-07-01

    The objective of this experimental study on sampling was to determine the size and number of samples of biofuels required (taken at two sampling points in each case) and to compare two methods of sampling. The first objective of the sample-reduction exercise was to compare the reliability of various sampling methods, and the second objective was to measure the variations introduced as a result of reducing the sample size to form suitable test portions. The materials studied were sawdust, wood chips, wood pellets and bales of straw, and these were analysed for moisture, ash, particle size and chloride. The sampling procedures are described. The study was conducted in Scandinavia. The results of the study were presented in Leipzig in October 2004. The work was carried out as part of the UK's DTI Technology Programme: New and Renewable Energy.

  2. Design unbiased estimation in line intersect sampling using segmented transects

    Science.gov (United States)

    David L.R. Affleck; Timothy G. Gregoire; Harry T. Valentine; Harry T. Valentine

    2005-01-01

    In many applications of line intersect sampling, transects consist of multiple, connected segments in a prescribed configuration. The relationship between the transect configuration and the selection probability of a population element is illustrated and a consistent sampling protocol, applicable to populations composed of arbitrarily shaped elements, is proposed. It...

  3. Procedures for sampling and sample-reduction within quality assurance systems for solid biofuels

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2005-04-15

    The bias introduced when sampling solid biofuels from stockpiles or containers instead of from moving streams is assessed as well as the number and size of samples required to represent accurately the bulk sample, variations introduced when reducing bulk samples into samples for testing, and the usefulness of sample reduction methods. Details are given of the experimental work carried out in Sweden and Denmark using sawdust, wood chips, wood pellets, forestry residues and straw. The production of a model European Standard for quality assurance of solid biofuels is examined.

  4. Superposition Enhanced Nested Sampling

    Directory of Open Access Journals (Sweden)

    Stefano Martiniani

    2014-08-01

    Full Text Available The theoretical analysis of many problems in physics, astronomy, and applied mathematics requires an efficient numerical exploration of multimodal parameter spaces that exhibit broken ergodicity. Monte Carlo methods are widely used to deal with these classes of problems, but such simulations suffer from a ubiquitous sampling problem: The probability of sampling a particular state is proportional to its entropic weight. Devising an algorithm capable of sampling efficiently the full phase space is a long-standing problem. Here, we report a new hybrid method for the exploration of multimodal parameter spaces exhibiting broken ergodicity. Superposition enhanced nested sampling combines the strengths of global optimization with the unbiased or athermal sampling of nested sampling, greatly enhancing its efficiency with no additional parameters. We report extensive tests of this new approach for atomic clusters that are known to have energy landscapes for which conventional sampling schemes suffer from broken ergodicity. We also introduce a novel parallelization algorithm for nested sampling.
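    Nested sampling, the athermal component named in the record above, repeatedly discards the lowest-likelihood live point while the remaining prior mass shrinks geometrically. A minimal single-run sketch for a toy 1-D Gaussian likelihood with a uniform prior (all numerical choices here are assumptions; real applications replace the naive rejection step with a constrained walker):

```python
import math
import random

random.seed(1)
N_LIVE = 100  # number of live points (assumed)

def loglike(x):
    # Toy Gaussian likelihood; the prior is uniform on [-1, 1]
    return -0.5 * x * x

live = [random.uniform(-1.0, 1.0) for _ in range(N_LIVE)]
Z, X_prev = 0.0, 1.0
for j in range(1, 601):
    i_worst = min(range(N_LIVE), key=lambda i: loglike(live[i]))
    thr = loglike(live[i_worst])
    X = math.exp(-j / N_LIVE)          # expected remaining prior mass
    Z += math.exp(thr) * (X_prev - X)  # simple rectangle-rule quadrature
    X_prev = X
    # Replace the worst live point with a prior draw inside the
    # hard likelihood constraint L > L_worst (rejection sampling).
    while True:
        x = random.uniform(-1.0, 1.0)
        if loglike(x) > thr:
            live[i_worst] = x
            break

# Add the residual mass carried by the remaining live points.
Z += X_prev * sum(math.exp(loglike(x)) for x in live) / N_LIVE
# Analytic value: Z = (1/2) * integral_{-1}^{1} exp(-x^2/2) dx ~ 0.856
```

The superposition-enhanced variant in the record augments exactly this loop with global-optimization information to cope with broken ergodicity; the sketch shows only the plain algorithm.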

  5. Discrimination of handlebar grip samples by fourier transform infrared microspectroscopy analysis and statistics

    Directory of Open Access Journals (Sweden)

    Zeyu Lin

    2017-01-01

    Full Text Available In this paper, the authors present a study on the discrimination of handlebar grip samples, to provide an effective forensic science service for hit-and-run traffic cases. 50 bicycle handlebar grip samples, 49 electric bike handlebar grip samples, and 96 motorcycle handlebar grip samples were randomly collected by the local police in Beijing (China). Fourier transform infrared microspectroscopy (FTIR) was utilized as the analytical technology. Target absorption selection, data pretreatment, and discrimination of linked and unlinked samples were then carried out as three steps to improve the discrimination of FTIR spectra collected from different handlebar grip samples. Principal component analysis and receiver operating characteristic curves were utilized to evaluate different data selection methods and different data pretreatment methods, respectively. It is possible to explore the evidential value of handlebar grip residue evidence through instrumental analysis and statistical treatments. This will provide a universal discrimination method for other forensic science samples as well.

  6. Network Sampling with Memory: A proposal for more efficient sampling from social networks

    Science.gov (United States)

    Mouw, Ted; Verdery, Ashton M.

    2013-01-01

    Techniques for sampling from networks have grown into an important area of research across several fields. For sociologists, the possibility of sampling from a network is appealing for two reasons: (1) A network sample can yield substantively interesting data about network structures and social interactions, and (2) it is useful in situations where study populations are difficult or impossible to survey with traditional sampling approaches because of the lack of a sampling frame. Despite its appeal, methodological concerns about the precision and accuracy of network-based sampling methods remain. In particular, recent research has shown that sampling from a network using a random walk based approach such as Respondent Driven Sampling (RDS) can result in high design effects (DE)—the ratio of the sampling variance to the sampling variance of simple random sampling (SRS). A high design effect means that more cases must be collected to achieve the same level of precision as SRS. In this paper we propose an alternative strategy, Network Sampling with Memory (NSM), which collects network data from respondents in order to reduce design effects and, correspondingly, the number of interviews needed to achieve a given level of statistical power. NSM combines a “List” mode, where all individuals on the revealed network list are sampled with the same cumulative probability, with a “Search” mode, which gives priority to bridge nodes connecting the current sample to unexplored parts of the network. We test the relative efficiency of NSM compared to RDS and SRS on 162 school and university networks from Add Health and Facebook that range in size from 110 to 16,278 nodes. The results show that the average design effect for NSM on these 162 networks is 1.16, which is very close to the efficiency of a simple random sample (DE=1), and 98.5% lower than the average DE we observed for RDS. PMID:24159246
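    The design effect discussed above is the ratio of a method's sampling variance to that of simple random sampling. A small sketch using the standard Kish cluster-sample approximation (an assumed stand-in for illustration; the paper measures design effects empirically on network samples rather than with this formula):

```python
def design_effect(m, rho):
    """Kish approximation for cluster samples: DE = 1 + (m - 1) * rho,
    where m is the mean cluster size and rho the intra-class correlation.
    (Assumed formula; the paper defines DE directly as Var(method)/Var(SRS).)"""
    return 1.0 + (m - 1) * rho

def required_n(n_srs, de):
    """Cases needed to match the precision of a simple random sample of n_srs."""
    return n_srs * de
```

With the paper's reported NSM average of DE = 1.16, an analysis powered for an SRS of 500 cases would need about 580 network-sampled cases; a method with a design effect an order of magnitude larger inflates the requirement proportionally.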

  7. A simple highly sensitive and selective aptamer-based colorimetric sensor for environmental toxins microcystin-LR in water samples.

    Science.gov (United States)

    Li, Xiuyan; Cheng, Ruojie; Shi, Huijie; Tang, Bo; Xiao, Hanshuang; Zhao, Guohua

    2016-03-05

    A simple and highly sensitive aptamer-based colorimetric sensor was developed for selective detection of Microcystin-LR (MC-LR). The aptamer (ABA) was employed as the recognition element, which binds MC-LR with high affinity, while gold nanoparticles (AuNPs) served as the sensing material, whose plasmon resonance absorption peaks red-shift upon binding of the targets at a high concentration of sodium chloride. With the addition of MC-LR, the random-coil aptamer adsorbed on the AuNPs folded into a regulated structure to form MC-LR-aptamer complexes and detached from the surface of the AuNPs, leading to aggregation of the AuNPs, and the color converted from red to blue due to interparticle plasmon coupling. Results showed that the aptamer-based colorimetric sensor exhibited rapid and sensitive detection of MC-LR, with a linear range from 0.5 nM to 7.5 μM and a detection limit of 0.37 nM. Meanwhile, the pollutants usually coexisting with MC-LR in polluted water samples did not interfere with the detection of MC-LR. The proposed mechanism suggests that the high-affinity interaction between the aptamer and MC-LR significantly enhanced the sensitivity and selectivity of MC-LR detection. The established method was also utilized in analyzing real water samples, and excellent sensitivity and selectivity were obtained as well. Copyright © 2015 Elsevier B.V. All rights reserved.

  8. Sr-90 determination in aqueous and soils samples

    International Nuclear Information System (INIS)

    Gonzalez Sintas, Maria F.; Cerchietti, Maria L.; Arguelles, Maria G.

    2009-01-01

    The main objective of this paper is to evaluate the method for 90Sr determination in aqueous samples and soils. The Area and Personal Dosimetry laboratory (DPA) determines the presence of 90Sr by liquid scintillation counting (LSC), applying the double-window method and the corresponding adjustments. Calibration is performed with standard solutions of 90Sr/90Y, for which the 90Sr and 90Y spectral regions are optimized. The initial treatment of liquid samples includes concentration by evaporation, while solid samples are dissolved by microwave and acid digestion. The separation of the analyte involves a selective extraction chromatography step. An average efficiency for 90Sr of 77 ± 1% was obtained; the a/b factor was 0.85 ± 0.01 and recovery was 82 ± 8%. The resulting MDA was 0.10 Bq/L in aqueous samples and 0.10 Bq/g in solid samples. (author)

  9. Image Sampling with Quasicrystals

    Directory of Open Access Journals (Sweden)

    Mark Grundland

    2009-07-01

    Full Text Available We investigate the use of quasicrystals in image sampling. Quasicrystals produce space-filling, non-periodic point sets that are uniformly discrete and relatively dense, thereby ensuring the sample sites are evenly spread out throughout the sampled image. Their self-similar structure can be attractive for creating sampling patterns endowed with a decorative symmetry. We present a brief general overview of the algebraic theory of cut-and-project quasicrystals based on the geometry of the golden ratio. To assess the practical utility of quasicrystal sampling, we evaluate the visual effects of a variety of non-adaptive image sampling strategies on photorealistic image reconstruction and non-photorealistic image rendering used in multiresolution image representations. For computer visualization of point sets used in image sampling, we introduce a mosaic rendering technique.
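    The golden-ratio cut-and-project construction mentioned above is easiest to see in one dimension, where it yields the Fibonacci chain: points separated by two incommensurate spacings, 1 and φ, arranged non-periodically. A sketch using the closed-form Fibonacci-word letter formula (a standard result about this construction, not code from the record):

```python
import math

PHI = (1 + math.sqrt(5)) / 2  # golden ratio

def fibonacci_chain(n):
    """1-D golden-ratio quasicrystal sample sites: n spacings drawn from
    {1, PHI} in the non-periodic order given by the Fibonacci word."""
    # k-th letter of the Fibonacci word: floor((k+1)*PHI) - floor(k*PHI) - 1, in {0, 1}
    letters = [math.floor((k + 1) * PHI) - math.floor(k * PHI) - 1 for k in range(n)]
    spacings = [PHI if a == 1 else 1.0 for a in letters]
    positions, x = [0.0], 0.0
    for s in spacings:
        x += s
        positions.append(x)
    return positions, spacings

positions, spacings = fibonacci_chain(1000)
n_long = sum(1 for s in spacings if s == PHI)
n_short = len(spacings) - n_long
# The long/short spacing counts approach the golden ratio, and the point set
# is uniformly discrete and relatively dense, as the record describes.
```

Two-dimensional sampling patterns are built analogously by projecting a higher-dimensional lattice through a window; the 1-D chain shows the essential mechanism.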

  10. A policy of routine umbilical cord blood gas analysis decreased missing samples from high-risk births.

    Science.gov (United States)

    Ahlberg, M; Elvander, C; Johansson, S; Cnattingius, S; Stephansson, O

    2017-01-01

    This study compared obstetric units practicing routine or selective umbilical cord blood gas analysis, with respect to the risk of missing samples in high-risk deliveries and in infants with birth asphyxia. This was a Swedish population-based cohort study that used register data for 155 235 deliveries of live singleton infants between 2008 and 2014. Risk ratios and 95% confidence intervals were calculated to estimate the association between routine and selective umbilical cord blood gas sampling strategies and the risk of missing samples. Selective sampling increased the risk ratios when routine sampling was used as the reference, with a value of 1.0, and these were significant in high-risk deliveries and birth asphyxia. The risk ratios for selective sampling were large-for-gestational age (9.07), preterm delivery at up to 36 weeks of gestation (8.24), small-for-gestational age (7.94), two or more foetal scalp blood samples (5.96), an Apgar score of less than seven at one minute (2.36), emergency Caesarean section (1.67) and instrumental vaginal delivery (1.24). Compared with routine sampling, selective umbilical cord blood gas sampling significantly increased the risks of missing samples in high-risk deliveries and in infants with birth asphyxia. ©2016 Foundation Acta Paediatrica. Published by John Wiley & Sons Ltd.
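    Risk ratios like those reported above compare the proportion of missing samples under selective sampling against the routine-sampling reference. A generic sketch with a Wald confidence interval on the log scale (a standard textbook formula; the study's exact estimation method is not given in the abstract, and the counts below are hypothetical, not the study's data):

```python
import math

def risk_ratio(events_exposed, n_exposed, events_ref, n_ref, z=1.96):
    """Risk ratio with a Wald 95% CI computed on the log scale."""
    rr = (events_exposed / n_exposed) / (events_ref / n_ref)
    se = math.sqrt(1 / events_exposed - 1 / n_exposed
                   + 1 / events_ref - 1 / n_ref)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Hypothetical counts for illustration only: 45/500 missing samples under
# selective sampling vs. 5/500 under routine sampling.
rr, lo, hi = risk_ratio(45, 500, 5, 500)
```

A risk ratio of 9 with routine sampling as the reference (1.0) would mean missing samples are nine times as likely under selective sampling, the same reading as the study's large-for-gestational-age estimate of 9.07.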

  11. Sample preparation of environmental samples using benzene synthesis followed by high-performance LSC

    International Nuclear Information System (INIS)

    Filippis, S. De; Noakes, J.E.

    1991-01-01

    Liquid scintillation counting (LSC) techniques have been widely employed as the detection method for determining environmental levels of tritium and 14C. Since anthropogenic and nonanthropogenic inputs to the environment are a concern, sampling the environment surrounding a nuclear power facility or fuel reprocessing operation requires the collection of many different sample types, including agricultural products, water, biota, aquatic life, soil, and vegetation. These sample types are not suitable for the direct detection of tritium or 14C by liquid scintillation techniques. Each sample type must be initially prepared in order to obtain the carbon or hydrogen component of interest and present it in a chemical form that is compatible with common chemicals used in scintillation counting applications. Converting the sample of interest to chemically pure benzene as a sample preparation technique has been widely accepted for processing samples for radiocarbon age-dating applications. The synthesized benzene is composed of the carbon or hydrogen atoms from the original sample and is ideal as a solvent for LSC, with excellent photo-optical properties. Benzene synthesis followed by low-background scintillation counting can be applied to the preparation and measurement of environmental samples, yielding good detection sensitivities, high radionuclide counting efficiency, and shorter preparation time. The method of benzene synthesis provides a unique approach to the preparation of a wide variety of environmental sample types using similar chemistry for all samples.

  12. Quantitative portable gamma spectroscopy sample analysis for non-standard sample geometries

    International Nuclear Information System (INIS)

    Enghauser, M.W.; Ebara, S.B.

    1997-01-01

    Utilizing a portable spectroscopy system, a quantitative method was developed for the analysis of samples containing a mixture of fission and activation products in nonstandard geometries. The method can be used with various sample and shielding configurations where analysis on a laboratory-based gamma spectroscopy system is impractical. The portable gamma spectroscopy method involves calibration of the detector and modeling of the sample and shielding to identify and quantify the radionuclides present in the sample. The method utilizes the intrinsic efficiency of the detector and the unattenuated gamma fluence rate at the detector surface per unit activity from the sample to calculate the nuclide activity and Minimum Detectable Activity (MDA). For a complex geometry, a computer code written for shielding applications (MICROSHIELD) is utilized to determine the unattenuated gamma fluence rate per unit activity at the detector surface. Lastly, the method is only applicable to nuclides which emit gamma rays and cannot be used for pure beta emitters. In addition, if sample self-absorption and shielding are significant, the attenuation will result in high MDAs for nuclides which solely emit low-energy gamma rays. The following presents the analysis technique and verification results demonstrating the accuracy of the method.
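    The record computes an MDA from the detector's intrinsic efficiency and modeled fluence. As a hedged illustration of how such a number is typically obtained, here is the widely used Currie detection-limit expression (an assumption on our part; the record does not state which MDA formula it applies):

```python
import math

def currie_mda(background_counts, efficiency, gamma_yield, live_time_s):
    """Currie's detection-limit formula for a counting measurement:
        MDA [Bq] = (2.71 + 4.65 * sqrt(B)) / (eff * yield * t)
    where B is the background counts in the peak region, eff the absolute
    detection efficiency, yield the gamma emission probability, and t the
    live time in seconds. All example values below are hypothetical."""
    return (2.71 + 4.65 * math.sqrt(background_counts)) / (
        efficiency * gamma_yield * live_time_s)

mda = currie_mda(background_counts=400, efficiency=0.10,
                 gamma_yield=1.0, live_time_s=1000.0)
```

The formula makes the record's closing caveat concrete: heavy self-absorption lowers the effective efficiency in the denominator, which drives the MDA up for low-energy gamma emitters.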

  13. SNS Sample Activation Calculator Flux Recommendations and Validation

    Energy Technology Data Exchange (ETDEWEB)

    McClanahan, Tucker C. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Spallation Neutron Source (SNS); Gallmeier, Franz X. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Spallation Neutron Source (SNS); Iverson, Erik B. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Spallation Neutron Source (SNS); Lu, Wei [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Spallation Neutron Source (SNS)

    2015-02-01

    The Spallation Neutron Source (SNS) at Oak Ridge National Laboratory (ORNL) uses the Sample Activation Calculator (SAC) to calculate the activation of a sample after the sample has been exposed to the neutron beam in one of the SNS beamlines. The SAC webpage takes user inputs (choice of beamline, the mass, composition and area of the sample, irradiation time, decay time, etc.) and calculates the activation for the sample. In recent years, the SAC has been incorporated into the user proposal and sample handling process, and instrument teams and users have noticed discrepancies in the predicted activation of their samples. The Neutronics Analysis Team validated SAC by performing measurements on select beamlines and confirmed the discrepancies seen by the instrument teams and users. The conclusions were that the discrepancies were a result of a combination of faulty neutron flux spectra for the instruments, improper inputs supplied by SAC (1.12), and a mishandling of cross section data in the Sample Activation Program for Easy Use (SAPEU) (1.1.2). This report focuses on the conclusion that the SAPEU (1.1.2) beamline neutron flux spectra have errors and are a significant contributor to the activation discrepancies. The results of the analysis of the SAPEU (1.1.2) flux spectra for all beamlines will be discussed in detail. The recommendations for the implementation of improved neutron flux spectra in SAPEU (1.1.3) are also discussed.

  14. A Virtual Observatory Census to Address Dwarfs Origins (AVOCADO). I. Science goals, sample selection, and analysis tools

    Science.gov (United States)

    Sánchez-Janssen, R.; Amorín, R.; García-Vargas, M.; Gomes, J. M.; Huertas-Company, M.; Jiménez-Esteban, F.; Mollá, M.; Papaderos, P.; Pérez-Montero, E.; Rodrigo, C.; Sánchez Almeida, J.; Solano, E.

    2013-06-01

    Context. Even though they are by far the most abundant of all galaxy types, the detailed properties of dwarf galaxies are still only poorly characterised - especially because of the observational challenge that their intrinsic faintness and weak clustering properties represent. Aims: AVOCADO aims at establishing firm conclusions on the formation and evolution of dwarf galaxies by constructing and analysing a homogeneous, multiwavelength dataset for a statistically significant sample of approximately 6500 nearby dwarfs (Mi - 5 log h100 > -18 mag). The sample is selected to lie within the 20 < D < 60 h100^-1 Mpc volume covered by the SDSS-DR7 footprint, and is thus volume-limited for Mi - 5 log h100 < -16 mag dwarfs - but includes ≈1500 fainter systems. We will investigate the roles of mass and environment in determining the current properties of the different dwarf morphological types - including their structure, their star formation activity, their chemical enrichment history, and a breakdown of their stellar, dust, and gas content. Methods: We present the sample selection criteria and describe the suite of analysis tools, some of them developed in the framework of the Virtual Observatory. We use optical spectra and UV-to-NIR imaging of the dwarf sample to derive star formation rates, stellar masses, ages, and metallicities - which are supplemented with structural parameters that are used to classify them morphologically. This unique dataset, coupled with a detailed characterisation of each dwarf's environment, allows for a fully comprehensive investigation of their origins and enables us to track the (potential) evolutionary paths between the different dwarf types. Results: We characterise the local environment of all dwarfs in our sample, paying special attention to trends with current star formation activity. We find that virtually all quiescent dwarfs are located in the vicinity (projected distances ≲ 1.5 h100^-1 Mpc) of ≳ L∗ companions, consistent with

  15. The Current Role of Venous Sampling in the Localization of Endocrine Disease

    International Nuclear Information System (INIS)

    Lau, Jeshen H. G.; Drake, William; Matson, Matthew

    2007-01-01

    Endocrine venous sampling plays a specific role in the diagnosis of endocrine disorders. In this article, we cover inferior petrosal sinus sampling, selective parathyroid venous sampling, hepatic venous sampling with arterial stimulation, adrenal venous sampling, and ovarian venous sampling. We review their indications and the scientific evidence justifying these indications in the diagnosis and management of Cushing's syndrome, hyperparathyroidism, pancreatic endocrine tumors, Conn's syndrome, primary hyperaldosteronism, pheochromocytomas, and androgen-secreting ovarian tumors. For each sampling technique, we compare its diagnostic accuracy with that of other imaging techniques and, where possible, look at how it impacts patient management. Finally, we incorporate venous sampling into diagnostic algorithms used at our institution

  16. Yfiler® Plus population samples and dilution series

    DEFF Research Database (Denmark)

    Andersen, Mikkel Meyer; Mogensen, Helle Smidt; Eriksen, Poul Svante

    2017-01-01

    DNA complicated the analysis by causing drop-ins of characteristic female DNA artefacts. Even though the customised analytical threshold in combination with the custom-made artefact filters gave more alleles, crime scene samples still needed special attention from the forensic geneticist. … dynamics and performance. We determined dye-dependent analytical thresholds by receiver operating characteristics (ROC) and made a customised artefact filter that includes theoretically known artefacts by use of previously analysed population samples. Dilution series of known male DNA and a selection

  17. Sampling design for the Study of Cardiovascular Risks in Adolescents (ERICA)

    Directory of Open Access Journals (Sweden)

    Mauricio Teixeira Leite de Vasconcellos

    2015-05-01

    Full Text Available The Study of Cardiovascular Risks in Adolescents (ERICA) aims to estimate the prevalence of cardiovascular risk factors and metabolic syndrome in adolescents (12-17 years) enrolled in public and private schools of the 273 municipalities with over 100,000 inhabitants in Brazil. The study population was stratified into 32 geographical strata (27 capitals and five sets with other municipalities in each macro-region of the country) and a sample of 1,251 schools was selected with probability proportional to size. In each school, three combinations of shift (morning and afternoon) and grade were selected, and within each of these combinations, one class was selected. All eligible students in the selected classes were included in the study. The design sampling weights were calculated as the product of the reciprocals of the inclusion probabilities at each sampling stage, and were later calibrated considering the projections of the numbers of adolescents enrolled in schools located in the geographical strata, by sex and age.
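
    The weight construction described above (product of the reciprocals of the stage inclusion probabilities, followed by calibration to known enrollment totals) can be sketched as follows; all probabilities and totals here are invented for illustration:

```python
def design_weight(p_school, p_combination, p_class):
    """Base weight for a student: the product of the reciprocals of the
    inclusion probabilities at the school, shift-grade, and class stages."""
    return 1.0 / (p_school * p_combination * p_class)

# Invented probabilities: a school drawn with PPS probability 0.05, 3 of 6
# shift-grade combinations selected, 1 of 4 classes in the combination.
w = design_weight(0.05, 3 / 6, 1 / 4)
print(w)  # 160.0

# Calibration step: rescale so that the weights add up to a known enrollment
# projection for the stratum (both totals invented).
projected_enrollment = 480_000
sum_of_base_weights = 450_000
w_calibrated = w * projected_enrollment / sum_of_base_weights
```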

  18. Gravimetric dust sampling for control purposes and occupational dust sampling.

    CSIR Research Space (South Africa)

    Unsted, AD

    1997-02-01

    Full Text Available Prior to the introduction of gravimetric dust sampling, konimeters had been used for dust sampling, which was largely for control purposes. Whether or not absolute results were achievable was not an issue since relative results were used to evaluate...

  19. EXPLORING THE DIVERSITY OF GROUPS AT 0.1 < z < 0.8 WITH X-RAY AND OPTICALLY SELECTED SAMPLES

    Energy Technology Data Exchange (ETDEWEB)

    Connelly, J. L. [Max-Planck-Institut fuer Extraterrestrische Physik, Giessenbachstrasse, D-85748 Garching (Germany); Wilman, David J.; Finoguenov, Alexis; Saglia, Roberto [Max Planck Institute for Extraterrestrial Physics, P.O. Box 1312, Giessenbachstr., D-85741 Garching (Germany); Hou, Annie; Parker, Laura C.; Henderson, Robert D. E. [Department of Physics and Astronomy, McMaster University, Hamilton ON L8S4M1 (Canada); Mulchaey, John S. [Observatories of the Carnegie Institution, 813 Santa Barbara Street, Pasadena, CA 91101 (United States); McGee, Sean L.; Balogh, Michael L. [Department of Physics and Astronomy, University of Waterloo, Waterloo, Ontario N2L 3G1 (Canada); Bower, Richard G. [Department of Physics, University of Durham, Durham DH1 3LE (United Kingdom)

    2012-09-10

    We present the global group properties of two samples of galaxy groups containing 39 high-quality X-ray-selected systems and 38 optically (spectroscopically) selected systems in coincident spatial regions at 0.12 < z < 0.79. The total mass range of the combined sample is ≈10¹²-5 × 10¹⁴ M⊙. Only nine optical systems are associable with X-ray systems. We discuss the confusion inherent in the matching of both galaxies to extended X-ray emission and of X-ray emission to already identified optical systems. Extensive spectroscopy has been obtained and the resultant redshift catalog and group membership are provided here. X-ray, dynamical, and total stellar masses of the groups are also derived and presented. We explore the effects of utilizing different centers and applying three different kinds of radial cut to our systems: a constant cut of 1 Mpc and two r₂₀₀ cuts, one based on the velocity dispersion of the system and the other on the X-ray emission. We find that an X-ray-based r₂₀₀ results in less scatter in scaling relations and less dynamical complexity as evidenced by results of the Anderson-Darling and Dressler-Shectman tests, indicating that this radius tends to isolate the virialized part of the system. The constant and velocity-dispersion-based cuts can overestimate membership and can work to inflate velocity dispersion and dynamical and stellar mass. We find the L_X-σ and M_stellar-L_X scaling relations for X-ray and optically selected systems are not dissimilar. The mean fraction of mass found in stars, excluding intracluster light, for our systems is ≈0.014 with a logarithmic standard deviation of 0.398 dex. We also define and investigate a sample of groups which are X-ray underluminous given the total group stellar mass. For these systems the fraction of stellar mass contributed by the most massive galaxy is typically lower than that found for the total population of
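
    A velocity-dispersion-based r₂₀₀ of the kind used above is often computed with the convention r₂₀₀ ≈ √3 σ / (10 H(z)); the sketch below uses that common estimator with invented numbers, and the paper may adopt a different calibration:

```python
import math

def r200_from_sigma(sigma_kms, hz_kms_per_mpc):
    """r_200 in Mpc from a group's line-of-sight velocity dispersion,
    using the convention r_200 = sqrt(3) * sigma / (10 * H(z)).
    One common estimator, not necessarily the paper's exact calibration."""
    return math.sqrt(3.0) * sigma_kms / (10.0 * hz_kms_per_mpc)

# Invented example: sigma = 350 km/s at a redshift where H(z) = 80 km/s/Mpc.
print(round(r200_from_sigma(350.0, 80.0), 2))  # ~0.76 Mpc
```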

  20. Assessing the joint effect of population stratification and sample selection in studies of gene-gene (environment interactions

    Directory of Open Access Journals (Sweden)

    Cheng KF

    2012-01-01

    Full Text Available Abstract Background It is well known that the presence of population stratification (PS) may cause the usual test in case-control studies to produce spurious gene-disease associations. However, the joint impact of PS and sample selection (SS) is less well known. In this paper, we provide a systematic study of the joint effect of PS and SS under a more general risk model containing genetic and environmental factors. We provide simulation results to show the magnitude of the bias and its impact on the type I error rate of the usual chi-square test under a wide range of PS levels and selection bias. Results The biases in the estimation of the main and interaction effects are quantified and their bounds derived. The estimated bounds can be used to compute conservative p-values for the association test. If the conservative p-value is smaller than the significance level, we can safely claim that the association test is significant regardless of whether PS or selection bias is present. We also identify conditions under which the bias vanishes. The bias depends on the allele frequencies, exposure rates, gene-environment odds ratios and disease risks across subpopulations, and on the sampling of the cases and controls. Conclusion Our results show that the bias cannot be ignored even when the case and control data were matched on ethnicity. A real example is given to illustrate application of the conservative p-value. These results are useful to genetic association studies of main and interaction effects.
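
    The spurious-association mechanism described above can be illustrated with a small simulation: two subpopulations differ in carrier frequency, and cases are preferentially sampled from the higher-prevalence one, with no gene-disease effect inside either subpopulation. All numbers are invented:

```python
import random

random.seed(1)

def chi2_stat_2x2(a, b, c, d):
    """Pearson chi-square statistic for the 2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Invented setup: subpopulations differ in carrier frequency, and cases are
# oversampled from subpopulation 2 (higher disease prevalence). There is NO
# gene-disease effect within either subpopulation.
FREQ = {1: 0.2, 2: 0.6}           # carrier frequency by subpopulation
CASE_MIX, CONTROL_MIX = 0.7, 0.3  # P(subpopulation 2) among cases / controls

def draw_carrier(p_subpop2):
    subpop = 2 if random.random() < p_subpop2 else 1
    return random.random() < FREQ[subpop]

def one_null_study(n=500):
    a = sum(draw_carrier(CASE_MIX) for _ in range(n))     # carrier cases
    c = sum(draw_carrier(CONTROL_MIX) for _ in range(n))  # carrier controls
    return chi2_stat_2x2(a, n - a, c, n - c)

# Fraction of null studies the naive test calls significant (chi2_1 > 3.84):
reject = sum(one_null_study() > 3.84 for _ in range(200)) / 200
print(reject)  # far above the nominal 0.05 level
```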

  1. Sampling procedures and tables

    International Nuclear Information System (INIS)

    Franzkowski, R.

    1980-01-01

    Characteristics, defects, defectives - Sampling by attributes and by variables - Sample versus population - Frequency distributions for the number of defectives or the number of defects in the sample - Operating characteristic curve, producer's risk, consumer's risk - Acceptable quality level AQL - Average outgoing quality AOQ - Standard ISO 2859 - Fundamentals of sampling by variables for fraction defective. (RW)
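
    The operating characteristic curve, producer's and consumer's risks, and AOQ listed above can all be computed from the binomial distribution for a single attribute sampling plan. The plan parameters below are illustrative only, not taken from the ISO 2859 tables:

```python
from math import comb

def accept_prob(p, n, c):
    """Operating characteristic: probability of accepting a lot with true
    fraction defective p under a single sampling plan (inspect n items,
    accept if at most c are defective)."""
    return sum(comb(n, k) * p**k * (1 - p) ** (n - k) for k in range(c + 1))

# Illustrative plan (n = 80, c = 2); NOT taken from the ISO 2859 tables.
n, c = 80, 2
producer_risk = 1 - accept_prob(0.01, n, c)  # rejecting an AQL-quality lot
consumer_risk = accept_prob(0.08, n, c)      # accepting a clearly bad lot
aoq = 0.01 * accept_prob(0.01, n, c)         # average outgoing quality at p = 0.01
print(round(producer_risk, 3), round(consumer_risk, 3))
```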

  2. Eleven-Year Retrospective Report of Super-Selective Venous Sampling for the Evaluation of Recurrent or Persistent Hyperparathyroidism in 32 Patients.

    Science.gov (United States)

    Habibollahi, Peiman; Shin, Benjamin; Shamchi, Sara P; Wachtel, Heather; Fraker, Douglas L; Trerotola, Scott O

    2018-01-01

    Parathyroid venous sampling (PAVS) is usually reserved for patients with persistent or recurrent hyperparathyroidism after parathyroidectomy with inconclusive noninvasive imaging studies. A retrospective study was performed to evaluate the diagnostic efficacy of super-selective PAVS (SSVS) in patients needing revision neck surgery with inconclusive imaging. Patients undergoing PAVS between 2005 and 2016 due to persistent or recurrent hyperparathyroidism following surgery were reviewed. PAVS was performed in all patients using a super-selective technique. Single-value measurements within central neck veins, performed as part of super-selective PAVS, were used to simulate selective venous sampling (SVS) and allow comparison with the data that might be obtained in a non-super-selective approach. 32 patients (mean age 51 ± 15 years; 8 men and 24 women) met the inclusion and exclusion criteria. The sensitivity and positive predictive value (PPV) of SSVS for localizing the source of elevated PTH to a limited area in the neck or chest were 96 and 84%, respectively. Simulated SVS, on the other hand, had a sensitivity of 28% and a PPV of 89% based on the predefined gold standard. SSVS thus had a markedly and significantly higher sensitivity than simulated SVS, and is useful for localizing the source of hyperparathyroidism in patients undergoing revision surgery in whom noninvasive imaging studies are inconclusive.

  3. Sample processing, protocol, and statistical analysis of the time-of-flight secondary ion mass spectrometry (ToF-SIMS) of protein, cell, and tissue samples.

    Science.gov (United States)

    Barreto, Goncalo; Soininen, Antti; Sillat, Tarvo; Konttinen, Yrjö T; Kaivosoja, Emilia

    2014-01-01

    Time-of-flight secondary ion mass spectrometry (ToF-SIMS) is increasingly being used in the analysis of biological samples. For example, it has been applied to distinguish healthy and osteoarthritic human cartilage. This chapter discusses the ToF-SIMS principle and instrumentation, including the three modes of analysis in ToF-SIMS. ToF-SIMS sets certain requirements for the samples to be analyzed; for example, the samples have to be vacuum compatible. Accordingly, sample processing steps for the different biological samples, i.e., proteins, cells, frozen and paraffin-embedded tissues, and extracellular matrix, are presented for ToF-SIMS. Multivariate analysis of ToF-SIMS data and the necessary data preprocessing steps (peak selection, data normalization, mean-centering, and scaling and transformation) are also discussed in this chapter.
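
    The preprocessing pipeline named above (peak selection, normalization, mean-centering, scaling) followed by a multivariate projection can be sketched as follows; the peak-intensity matrix is invented stand-in data, and PCA is computed directly via the SVD:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy peak-intensity matrix: rows = samples, columns = selected peaks
# (invented data standing in for integrated ToF-SIMS peak areas).
X = rng.gamma(shape=2.0, scale=1.0, size=(12, 30))

# 1) Normalization: divide each spectrum by its total selected-ion count so
#    that differences in total signal do not dominate the comparison.
X_norm = X / X.sum(axis=1, keepdims=True)

# 2) Mean-centering and 3) scaling each peak to unit variance (autoscaling).
X_scaled = (X_norm - X_norm.mean(axis=0)) / X_norm.std(axis=0, ddof=1)

# 4) Multivariate analysis (PCA) via the singular value decomposition.
U, S, Vt = np.linalg.svd(X_scaled, full_matrices=False)
scores = U * S                    # sample coordinates on the components
explained = S**2 / np.sum(S**2)   # variance fraction per component
print(explained[:3])
```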

  4. Integrated sampling and analysis plan for samples measuring >10 mrem/hour

    International Nuclear Information System (INIS)

    Haller, C.S.

    1992-03-01

    This integrated sampling and analysis plan was prepared to assist in planning and scheduling of Hanford Site sampling and analytical activities for all waste characterization samples that measure greater than 10 mrem/hour. This report also satisfies the requirements of the renegotiated Interim Milestone M-10-05 of the Hanford Federal Facility Agreement and Consent Order (the Tri-Party Agreement). For purposes of comparing the various analytical needs with the Hanford Site laboratory capabilities, the analytical requirements of the various programs were normalized by converting required laboratory effort for each type of sample to a common unit of work, the standard analytical equivalency unit (AEU). The AEU approximates the amount of laboratory resources required to perform an extensive suite of analyses on five core segments individually plus one additional suite of analyses on a composite sample derived from a mixture of the five core segments and prepare a validated RCRA-type data package

  5. Evaluation of Respondent-Driven Sampling

    Science.gov (United States)

    McCreesh, Nicky; Frost, Simon; Seeley, Janet; Katongole, Joseph; Tarsh, Matilda Ndagire; Ndunguse, Richard; Jichi, Fatima; Lunel, Natasha L; Maher, Dermot; Johnston, Lisa G; Sonnenberg, Pam; Copas, Andrew J; Hayes, Richard J; White, Richard G

    2012-01-01

    Background Respondent-driven sampling is a novel variant of link-tracing sampling for estimating the characteristics of hard-to-reach groups, such as HIV prevalence in sex-workers. Despite its use by leading health organizations, the performance of this method in realistic situations is still largely unknown. We evaluated respondent-driven sampling by comparing estimates from a respondent-driven sampling survey with total-population data. Methods Total-population data on age, tribe, religion, socioeconomic status, sexual activity and HIV status were available on a population of 2402 male household-heads from an open cohort in rural Uganda. A respondent-driven sampling (RDS) survey was carried out in this population, employing current methods of sampling (RDS sample) and statistical inference (RDS estimates). Analyses were carried out for the full RDS sample and then repeated for the first 250 recruits (small sample). Results We recruited 927 household-heads. Full and small RDS samples were largely representative of the total population, but both samples under-represented men who were younger, of higher socioeconomic status, and with unknown sexual activity and HIV status. Respondent-driven-sampling statistical-inference methods failed to reduce these biases. Only 31%-37% (depending on method and sample size) of RDS estimates were closer to the true population proportions than the RDS sample proportions. Only 50%-74% of respondent-driven-sampling bootstrap 95% confidence intervals included the population proportion. Conclusions Respondent-driven sampling produced a generally representative sample of this well-connected non-hidden population. However, current respondent-driven-sampling inference methods failed to reduce bias when it occurred. Whether the data required to remove bias and measure precision can be collected in a respondent-driven sampling survey is unresolved. 
Respondent-driven sampling should be regarded as a (potentially superior) form of convenience-sampling
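
    The current RDS inference methods evaluated above typically weight each recruit by the inverse of their reported network degree. A minimal sketch of the RDS-II (Volz-Heckathorn) proportion estimator, with invented toy data:

```python
def rds_ii_proportion(traits, degrees):
    """RDS-II (Volz-Heckathorn) estimator: each recruit is weighted by the
    reciprocal of their self-reported network degree."""
    inv = [1.0 / d for d in degrees]
    return sum(t * w for t, w in zip(traits, inv)) / sum(inv)

# Invented toy recruits: trait carriers have larger networks, so the raw
# sample proportion overstates prevalence and RDS-II corrects it downward.
traits  = [1, 1, 1, 0, 0, 0, 0, 1]    # 0/1 indicator (e.g. HIV-positive)
degrees = [20, 15, 25, 5, 4, 6, 5, 18]

naive = sum(traits) / len(traits)
adjusted = rds_ii_proportion(traits, degrees)
print(naive, round(adjusted, 3))  # 0.5 vs roughly 0.206
```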

  6. Evaluation of respondent-driven sampling.

    Science.gov (United States)

    McCreesh, Nicky; Frost, Simon D W; Seeley, Janet; Katongole, Joseph; Tarsh, Matilda N; Ndunguse, Richard; Jichi, Fatima; Lunel, Natasha L; Maher, Dermot; Johnston, Lisa G; Sonnenberg, Pam; Copas, Andrew J; Hayes, Richard J; White, Richard G

    2012-01-01

    Respondent-driven sampling is a novel variant of link-tracing sampling for estimating the characteristics of hard-to-reach groups, such as HIV prevalence in sex workers. Despite its use by leading health organizations, the performance of this method in realistic situations is still largely unknown. We evaluated respondent-driven sampling by comparing estimates from a respondent-driven sampling survey with total population data. Total population data on age, tribe, religion, socioeconomic status, sexual activity, and HIV status were available on a population of 2402 male household heads from an open cohort in rural Uganda. A respondent-driven sampling (RDS) survey was carried out in this population, using current methods of sampling (RDS sample) and statistical inference (RDS estimates). Analyses were carried out for the full RDS sample and then repeated for the first 250 recruits (small sample). We recruited 927 household heads. Full and small RDS samples were largely representative of the total population, but both samples underrepresented men who were younger, of higher socioeconomic status, and with unknown sexual activity and HIV status. Respondent-driven sampling statistical inference methods failed to reduce these biases. Only 31%-37% (depending on method and sample size) of RDS estimates were closer to the true population proportions than the RDS sample proportions. Only 50%-74% of respondent-driven sampling bootstrap 95% confidence intervals included the population proportion. Respondent-driven sampling produced a generally representative sample of this well-connected nonhidden population. However, current respondent-driven sampling inference methods failed to reduce bias when it occurred. Whether the data required to remove bias and measure precision can be collected in a respondent-driven sampling survey is unresolved. Respondent-driven sampling should be regarded as a (potentially superior) form of convenience sampling method, and caution is required

  7. Sample Curation in Support of the OSIRIS-REx Asteroid Sample Return Mission

    Science.gov (United States)

    Righter, Kevin; Nakamura-Messenger, Keiko

    2017-01-01

    The OSIRIS-REx asteroid sample return mission launched to asteroid Bennu on Sept. 8, 2016. The spacecraft will arrive at Bennu in late 2019, orbit and map the asteroid, and perform a touch and go (TAG) sampling maneuver in July 2020. After the sample is stowed and confirmed, the spacecraft will return to Earth, and the sample return capsule (SRC) will land in Utah in September 2023. Samples will be recovered from Utah [2] and then transported and stored in a new sample cleanroom at NASA Johnson Space Center in Houston [3]. The materials curated for the mission are described here. a) Materials Archive and Witness Plate Collection: The SRC and TAGSAM were built between March 2014 and Summer of 2015, and instruments (OTES, OVIRS, OLA, OCAMS, REXIS) were integrated from Summer 2015 until May 2016. A total of 395 items were received for the materials archive at NASA-JSC, with archiving finishing 30 days after launch (the final archived items being related to launch operations) [4]. The materials fall into several general categories including metals (stainless steel, aluminum, titanium alloys, brass and BeCu alloy), epoxies, paints, polymers, lubricants, non-volatile-residue samples (NVR), sapphire, and various miscellaneous materials. Throughout the ATLO process (from March 2015 until late August 2016), contamination knowledge witness plates (Si wafer and Al foil) were deployed in the various cleanrooms in Denver and KSC to provide an additional record of particle counts and volatiles that is archived for current and future scientific studies. These plates were deployed in roughly monthly increments, with each unit containing 4 Si wafers and 4 Al foils. We archived 128 individual witness plates (64 Si wafers and 64 Al foils); one of each witness plate type (Si and Al) was analyzed immediately by the science team after archiving, while the remaining 3 of each are archived indefinitely. 
Information about each material archived is stored in an extensive database at NASA-JSC, and key

  8. PCR detection of Burkholderia multivorans in water and soil samples.

    Science.gov (United States)

    Peeters, Charlotte; Daenekindt, Stijn; Vandamme, Peter

    2016-08-12

    Although semi-selective growth media have been developed for the isolation of Burkholderia cepacia complex bacteria from the environment, thus far Burkholderia multivorans has rarely been isolated from such samples. Because environmental B. multivorans isolates mainly originate from water samples, we hypothesized that water rather than soil is its most likely environmental niche. The aim of the present study was to assess the occurrence of B. multivorans in water samples from Flanders (Belgium) using a fast, culture-independent PCR assay. A nested PCR approach was used to achieve high sensitivity, and specificity was confirmed by sequencing the resulting amplicons. B. multivorans was detected in 11 % of the water samples (n = 112) and 92 % of the soil samples (n = 25) tested. The percentage of false positives was higher for water samples compared to soil samples, showing that the presently available B. multivorans recA primers lack specificity when applied to the analysis of water samples. The results of the present study demonstrate that B. multivorans DNA is commonly present in soil samples and to a lesser extent in water samples in Flanders (Belgium).

  9. Vibronic Boson Sampling: Generalized Gaussian Boson Sampling for Molecular Vibronic Spectra at Finite Temperature.

    Science.gov (United States)

    Huh, Joonsuk; Yung, Man-Hong

    2017-08-07

    Molecular vibronic spectroscopy, where the transitions involve non-trivial bosonic correlations due to the Duschinsky rotation, is strongly believed to be in a similar complexity class as Boson Sampling. At finite temperature, the problem is represented as a Boson Sampling experiment with correlated Gaussian input states. This molecular problem with temperature effects is intimately related to the various versions of Boson Sampling, which share a similar computational complexity. Here we provide a full description of this relation in the context of Gaussian Boson Sampling. We find a hierarchical structure, which illustrates the relationship among the various Boson Sampling schemes. Specifically, we show that every instance of Gaussian Boson Sampling with an initial correlation can be simulated by an instance of Gaussian Boson Sampling without initial correlation, with only a polynomial overhead. Since every Gaussian state is associated with a thermal state, our result implies that every sampling problem in molecular vibronic transitions, at any temperature, can be simulated by Gaussian Boson Sampling associated with a product of vacuum modes. We refer to such a generalized Gaussian Boson Sampling, motivated by the molecular sampling problem, as Vibronic Boson Sampling.

  10. Sampling on Quasicrystals

    OpenAIRE

    Grepstad, Sigrid

    2011-01-01

    We prove that quasicrystals are universal sets of stable sampling in any dimension. Necessary and sufficient density conditions for stable sampling and interpolation sets in one dimension are studied in detail.
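
    For context, the classical one-dimensional density conditions referenced here are Landau's inequalities, sketched below under one common Fourier normalization (the paper's contribution is that quasicrystal sets Λ achieve stable sampling universally):

```latex
% Landau's necessary conditions for the Paley--Wiener space PW_S
% (|S| = measure of the spectrum; D^-, D^+ = lower/upper Beurling density):
\Lambda \text{ a set of stable sampling for } PW_S
  \;\Longrightarrow\; D^{-}(\Lambda) \ge \frac{|S|}{2\pi}, \qquad
\Lambda \text{ a set of interpolation for } PW_S
  \;\Longrightarrow\; D^{+}(\Lambda) \le \frac{|S|}{2\pi}.
```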

  11. Sample size determination for equivalence assessment with multiple endpoints.

    Science.gov (United States)

    Sun, Anna; Dong, Xiaoyu; Tsong, Yi

    2014-01-01

    Equivalence assessment between a reference and a test treatment is often conducted by two one-sided tests (TOST). The corresponding power function and sample size determination can be derived from the joint distribution of the sample mean and sample variance. When an equivalence trial is designed with multiple endpoints, it often involves several sets of two one-sided tests. A naive approach for sample size determination in this case would select the largest of the sample sizes required for the individual endpoints. However, such a method ignores the correlation among endpoints. With the objective of rejecting all endpoints, and when the endpoints are uncorrelated, the power function is the product of the power functions for the individual endpoints. With correlated endpoints, the sample size and power should be adjusted for the correlation. In this article, we propose the exact power function for the equivalence test with multiple endpoints adjusted for correlation under both crossover and parallel designs. We further discuss the differences in sample size between the naive method without and with correlation adjustment and illustrate with an in vivo bioequivalence crossover study with area under the curve (AUC) and maximum concentration (Cmax) as the two endpoints.
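
    For the uncorrelated case described above, the joint power is the product of the per-endpoint TOST powers, and a conservative sample size can be found by a simple search. The sketch below uses a normal approximation for a 2x2 crossover; the endpoint parameters (true log-difference, within-subject SD, margin ln 1.25) are invented, and the paper's exact method uses the joint distribution of mean and variance instead:

```python
from math import erf, sqrt

def phi(z):
    """Standard normal CDF."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

Z_ALPHA = 1.6449  # one-sided z quantile for alpha = 0.05

def tost_power(n, delta, sigma, theta):
    """Approximate TOST power for one endpoint in a 2x2 crossover:
    n subjects, true log-difference delta, within-subject SD sigma,
    equivalence margin +/- theta (normal approximation)."""
    se = sigma * sqrt(2.0 / n)
    return max(0.0, phi((theta - delta) / se - Z_ALPHA)
                  + phi((theta + delta) / se - Z_ALPHA) - 1.0)

def n_for_all_endpoints(endpoints, target=0.8):
    """Smallest n at which the PRODUCT of per-endpoint powers (the joint
    power for independent endpoints) reaches the target. Positive
    correlation raises the true joint power, so this n is conservative."""
    n = 4
    while True:
        joint = 1.0
        for delta, sigma, theta in endpoints:
            joint *= tost_power(n, delta, sigma, theta)
        if joint >= target:
            return n
        n += 1

# Invented AUC and Cmax parameters on the log scale; margin = ln(1.25).
endpoints = [(0.05, 0.25, 0.223), (0.05, 0.30, 0.223)]
print(n_for_all_endpoints(endpoints))
```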

  12. Chorionic villus sampling

    Science.gov (United States)

    Chorionic villus sampling (CVS) is a test some pregnant women have ...

  13. Additional Samples: Where They Should Be Located

    International Nuclear Information System (INIS)

    Pilger, G. G.; Costa, J. F. C. L.; Koppe, J. C.

    2001-01-01

    Information for mine planning needs to be closely spaced compared to the grid used for exploration and resource assessment. The additional samples collected during quasi-mining are usually located in the same pattern as the original diamond drillhole net, but more closely spaced. In a mathematical sense, this is not the best procedure for selecting a location. The impact of additional information in reducing the uncertainty about the parameter being modeled is not the same everywhere within the deposit; some locations are more effective in reducing the local and global uncertainty than others. This study introduces a methodology for selecting additional sample locations based on stochastic simulation. The procedure takes into account the variability of the data and their spatial location. Multiple equally probable models representing a geological attribute are generated via geostatistical simulation. These models share basically the same histogram and the same variogram obtained from the original data set. At each block belonging to the model a value is obtained from the n simulations, and their combination allows one to assess local variability. Variability is measured using a proposed uncertainty index. This index was used to map zones of high variability. A value extracted from a given simulation is added to the original data set in a zone identified as erratic in the previous maps. The process of adding samples and simulating is repeated, and the benefit of the additional sample is evaluated. The benefit in terms of uncertainty reduction is measured locally and globally. The procedure proved robust and theoretically sound, mapping zones where additional information is most beneficial. A case study in a coal mine using coal seam thickness illustrates the method.
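
    The simulation-based uncertainty mapping can be sketched as follows. The realizations are invented stand-ins for conditional geostatistical simulations, and the coefficient of variation across realizations is used only as an illustrative proxy for the paper's uncertainty index:

```python
import numpy as np

rng = np.random.default_rng(42)

# sims[k, i, j] = value of block (i, j) in the k-th equally probable model
# (invented stand-ins for conditional geostatistical simulations).
sims = rng.normal(loc=2.0, scale=0.3, size=(50, 20, 20))
sims[:, 5:9, 12:16] += rng.normal(0.0, 1.2, size=(50, 4, 4))  # an erratic zone

# Illustrative per-block uncertainty index: the coefficient of variation
# across the realizations (the paper defines its own index).
cv = sims.std(axis=0, ddof=1) / np.abs(sims.mean(axis=0))

# The most beneficial location for the next sample is where the equally
# probable models disagree the most.
i, j = np.unravel_index(np.argmax(cv), cv.shape)
print(i, j)  # falls inside the injected high-variability zone
```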

  14. Mineralogical, chemical, and petrographic analysis of selected rock samples

    International Nuclear Information System (INIS)

    Roy, D.M.

    1976-01-01

    I. The majority of rocks examined from the NTS were found to be siltstones, varying from coarse into the very fine siltstone range, and containing > 60% quartz, usually much higher. Samples of the UEIL series of cores, in contrast, had a large clay mineral fraction, as well as some carbonate present. A few were intermediate silty claystones or argillites. Microphotographs are included to illustrate the variations in texture observed, while most of the data obtained are summarized in tabular form. II. Seven Michigan Salina evaporite specimens were analyzed

  15. Nanostructured conducting molecularly imprinted polymer for selective extraction of salicylate from urine and serum samples by electrochemically controlled solid-phase micro-extraction

    Energy Technology Data Exchange (ETDEWEB)

    Ameli, Akram [Department of Chemistry, Faculty of Science, Tarbiat Modares University, P.O. Box 14115-175, Tehran (Iran, Islamic Republic of); Alizadeh, Naader, E-mail: alizaden@modares.ac.ir [Department of Chemistry, Faculty of Science, Tarbiat Modares University, P.O. Box 14115-175, Tehran (Iran, Islamic Republic of)

    2011-11-30

    Highlights: ► Overoxidized polypyrrole templated with salicylate has been utilized as a conducting molecularly imprinted polymer for EC-SPME. ► This is the first study in which a conducting molecularly imprinted polymer was used for EC-SPME of salicylate. ► The proposed method is particularly effective in sample clean-up and selective monitoring of salicylate in physiological samples. - Abstract: Overoxidized polypyrrole (OPPy) films templated with salicylate (SA) have been utilized as conducting molecularly imprinted polymers (CMIPs) for potential-induced selective solid-phase micro-extraction processes. Various important fabrication factors controlling the performance of the OPPy films have been investigated using fluorescence spectrometry. Several key parameters, such as the applied potentials for uptake and release and the pH of the uptake and release solutions, were varied to achieve the optimum micro-extraction procedure. The film templated with SA exhibited excellent selectivity over several interferents. The calibration graphs were linear in the ranges of 5 × 10⁻⁸ to 5 × 10⁻⁴ and 1.2 × 10⁻⁶ to 5 × 10⁻⁴ mol mL⁻¹, and the detection limit was 4 × 10⁻⁸ mol L⁻¹. The OPPy film as the solid-phase micro-extraction absorbent has been applied for the selective clean-up and quantification of trace amounts of SA from physiological samples. The results of scanning electron microscopy (SEM) have confirmed the nanostructured morphology of the films.

  16. Nanostructured conducting molecularly imprinted polymer for selective extraction of salicylate from urine and serum samples by electrochemically controlled solid-phase micro-extraction

    International Nuclear Information System (INIS)

    Ameli, Akram; Alizadeh, Naader

    2011-01-01

    Highlights: ► Overoxidized polypyrrole templated with salicylate has been utilized as a conducting molecularly imprinted polymer for EC-SPME. ► This is the first study in which a conducting molecularly imprinted polymer was used for EC-SPME of salicylate. ► The proposed method is particularly effective in sample clean-up and selective monitoring of salicylate in physiological samples. - Abstract: Overoxidized polypyrrole (OPPy) films templated with salicylate (SA) have been utilized as conducting molecularly imprinted polymers (CMIPs) for potential-induced selective solid-phase micro-extraction processes. Various important fabrication factors controlling the performance of the OPPy films have been investigated using fluorescence spectrometry. Several key parameters, such as the applied potentials for uptake and release and the pH of the uptake and release solutions, were varied to achieve the optimum micro-extraction procedure. The film templated with SA exhibited excellent selectivity over several interferents. The calibration graphs were linear in the ranges of 5 × 10⁻⁸ to 5 × 10⁻⁴ and 1.2 × 10⁻⁶ to 5 × 10⁻⁴ mol mL⁻¹, and the detection limit was 4 × 10⁻⁸ mol L⁻¹. The OPPy film as the solid-phase micro-extraction absorbent has been applied for the selective clean-up and quantification of trace amounts of SA from physiological samples. The results of scanning electron microscopy (SEM) have confirmed the nanostructured morphology of the films.

  17. Reliability of impingement sampling designs: An example from the Indian Point station

    International Nuclear Information System (INIS)

    Mattson, M.T.; Waxman, J.B.; Watson, D.A.

    1988-01-01

    A 4-year data base (1976-1979) of daily fish impingement counts at the Indian Point electric power station on the Hudson River was used to compare the precision and reliability of three random-sampling designs: (1) simple random, (2) seasonally stratified, and (3) empirically stratified. The precision of daily impingement estimates improved logarithmically for each design as more days in the year were sampled. Simple random sampling was the least, and empirically stratified sampling was the most precise design, and the difference in precision between the two stratified designs was small. Computer-simulated sampling was used to estimate the reliability of the two stratified-random-sampling designs. A seasonally stratified sampling design was selected as the most appropriate reduced-sampling program for Indian Point station because: (1) reasonably precise and reliable impingement estimates were obtained using this design for all species combined and for eight common Hudson River fish by sampling only 30% of the days in a year (110 d); and (2) seasonal strata may be more precise and reliable than empirical strata if future changes in annual impingement patterns occur. The seasonally stratified design applied to the 1976-1983 Indian Point impingement data showed that selection of sampling dates based on daily species-specific impingement variability gave results that were more precise, but not more consistently reliable, than sampling allocations based on the variability of all fish species combined. 14 refs., 1 fig., 6 tabs
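
    The precision gain from seasonal stratification reported above can be illustrated with a small simulation on synthetic daily impingement counts (invented data with a seasonal pulse; the study itself used the real 1976-1979 Indian Point counts):

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic year of daily impingement counts with a strong seasonal pulse
# (invented numbers standing in for the Indian Point data).
days = np.arange(365)
truth = rng.poisson(lam=20 + 480 * np.exp(-0.5 * ((days - 150) / 25) ** 2))
annual_total = truth.sum()

def srs_estimate(n=110):
    """Estimate the annual total from a simple random sample of n days."""
    pick = rng.choice(365, size=n, replace=False)
    return truth[pick].mean() * 365

def stratified_estimate():
    """Same ~110-day budget, allocated proportionally to 4 seasonal strata."""
    est = 0.0
    for stratum in np.array_split(days, 4):
        k = round(110 * len(stratum) / 365)
        pick = rng.choice(stratum, size=k, replace=False)
        est += truth[pick].mean() * len(stratum)
    return est

srs = np.array([srs_estimate() for _ in range(400)])
strat = np.array([stratified_estimate() for _ in range(400)])
print(srs.std(), strat.std())  # stratification gives the smaller sampling error here
```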

  18. The new Chalk River AMS ion source, sample changer and external sample magazine

    International Nuclear Information System (INIS)

    Koslowsky, V.T.; Bray, N.; Imahori, Y.; Andrews, H.R.; Davies, W.G.

    1997-01-01

    A new sample magazine, sample changer and ion source have been developed and are in routine use at Chalk River. The system features a readily accessible 40-sample magazine at ground potential that is external to the ion source and high-voltage cage. The samples are held in an inert atmosphere and can be individually examined or removed; they can be exchanged en masse as a complete magazine concurrent with an AMS measurement. On-line sample changing is done with a pneumatic rabbit transfer system employing two stages of differential pumping. At Chalk River this is routinely performed across a 200 kV potential. Sample positioning is precise, and hundreds of ³⁶Cl and ¹²⁹I samples have been measured over a period of several days without interruption or alteration of ion source operating conditions. (author)

  19. Genetic Sample Inventory

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This database archives genetic tissue samples from marine mammals collected primarily from the U.S. east coast. The collection includes samples from field programs,...

  20. Fluid sample collection and distribution system. [qualitative analysis of aqueous samples from several points

    Science.gov (United States)

    Brooks, R. L. (Inventor)

    1979-01-01

    A multipoint fluid sample collection and distribution system is provided wherein the sample inputs are made through one or more of a number of sampling valves to a progressive cavity pump which is not susceptible to damage by large unfiltered particles. The pump output is through a filter unit that can provide a filtered multipoint sample. An unfiltered multipoint sample is also provided. An effluent sample can be taken and applied to a second progressive cavity pump for pumping to a filter unit that can provide one or more filtered effluent samples. The second pump can also provide an unfiltered effluent sample. Means are provided to periodically back flush each filter unit without shutting off the whole system.