WorldWideScience

Sample records for sampling

  1. Sampling

    CERN Document Server

    Thompson, Steven K

    2012-01-01

    Praise for the Second Edition "This book has never had a competitor. It is the only book that takes a broad approach to sampling . . . any good personal statistics library should include a copy of this book." —Technometrics "Well-written . . . an excellent book on an important subject. Highly recommended." —Choice "An ideal reference for scientific researchers and other professionals who use sampling." —Zentralblatt Math Features new developments in the field combined with all aspects of obtaining, interpreting, and using sample data Sampling provides an up-to-date treat

  2. Balanced sampling

    NARCIS (Netherlands)

    Brus, D.J.

    2015-01-01

    In balanced sampling a linear relation between the soil property of interest and one or more covariates with known means is exploited in selecting the sampling locations. Recent developments make this sampling design attractive for statistical soil surveys. This paper introduces balanced sampling
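The core idea of this record — choosing sampling locations so that the sample is balanced on covariates with known means — can be illustrated with a deliberately naive rejection sketch. The designs Brus discusses (e.g. the cube method) achieve balance far more efficiently; the covariate values below are hypothetical.

```python
import random

def balanced_sample(covariate, n, tol, max_tries=10000, seed=0):
    """Rejection sketch of balanced sampling: draw simple random samples
    until the sample mean of the covariate is within `tol` of the known
    population mean. (The cube method does this without rejection.)"""
    rng = random.Random(seed)
    pop_mean = sum(covariate) / len(covariate)
    indices = list(range(len(covariate)))
    for _ in range(max_tries):
        s = rng.sample(indices, n)
        s_mean = sum(covariate[i] for i in s) / n
        if abs(s_mean - pop_mean) <= tol:
            return s
    raise RuntimeError("no balanced sample found")

# Hypothetical covariate, e.g. elevation at candidate sampling locations
elev = [float(i % 37) for i in range(200)]
sample = balanced_sample(elev, n=20, tol=0.5)
```

The exploited linear relation between soil property and covariate then makes the sample mean of the property a better estimator than under plain random sampling.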

  3. Language sampling

    DEFF Research Database (Denmark)

    Rijkhoff, Jan; Bakker, Dik

    1998-01-01

    This article has two aims: [1] to present a revised version of the sampling method that was originally proposed in 1993 by Rijkhoff, Bakker, Hengeveld and Kahrel, and [2] to discuss a number of other approaches to language sampling in the light of our own method. We will also demonstrate how our...... sampling method is used with different genetic classifications (Voegelin & Voegelin 1977, Ruhlen 1987, Grimes ed. 1997) and argue that —on the whole— our sampling technique compares favourably with other methods, especially in the case of exploratory research....

  4. Venous Sampling

    Science.gov (United States)

    ... neck to help locate abnormally functioning glands or pituitary adenoma . This test is most often used after an unsuccessful neck exploration. Inferior petrosal sinus sampling , in which blood samples are taken from veins that drain the pituitary gland to study disorders related to pituitary hormone ...

  5. Sampling Development

    Science.gov (United States)

    Adolph, Karen E.; Robinson, Scott R.

    2011-01-01

    Research in developmental psychology requires sampling at different time points. Accurate depictions of developmental change provide a foundation for further empirical studies and theories about developmental mechanisms. However, overreliance on widely spaced sampling intervals in cross-sectional and longitudinal designs threatens the validity of…


  7. Environmental sampling

    Energy Technology Data Exchange (ETDEWEB)

    Puckett, J.M.

    1998-12-31

    Environmental Sampling (ES) is a technology option that can have application in transparency in nuclear nonproliferation. The basic process is to take a sample from the environment, e.g., soil, water, vegetation, or dust and debris from a surface, and through very careful sample preparation and analysis, determine the types, elemental concentration, and isotopic composition of actinides in the sample. The sample is prepared and the analysis performed in a clean chemistry laboratory (CCL). This ES capability is part of the IAEA Strengthened Safeguards System. Such a Laboratory is planned to be built by JAERI at Tokai and will give Japan an intrinsic ES capability. This paper presents options for the use of ES as a transparency measure for nuclear nonproliferation.

  8. Elevating sampling

    Science.gov (United States)

    Labuz, Joseph M.; Takayama, Shuichi

    2014-01-01

    Sampling – the process of collecting, preparing, and introducing an appropriate volume element (voxel) into a system – is often under appreciated and pushed behind the scenes in lab-on-a-chip research. What often stands in the way between proof-of-principle demonstrations of potentially exciting technology and its broader dissemination and actual use, however, is the effectiveness of sample collection and preparation. The power of micro- and nanofluidics to improve reactions, sensing, separation, and cell culture cannot be accessed if sampling is not equally efficient and reliable. This perspective will highlight recent successes as well as assess current challenges and opportunities in this area. PMID:24781100

  9. Systematic sampling with errors in sample locations

    DEFF Research Database (Denmark)

    Ziegel, Johanna; Baddeley, Adrian; Dorph-Petersen, Karl-Anton

    2010-01-01

    Systematic sampling of points in continuous space is widely used in microscopy and spatial surveys. Classical theory provides asymptotic expressions for the variance of estimators based on systematic sampling as the grid spacing decreases. However, the classical theory assumes that the sample grid...... is exactly periodic; real physical sampling procedures may introduce errors in the placement of the sample points. This paper studies the effect of errors in sample positioning on the variance of estimators in the case of one-dimensional systematic sampling. First we sketch a general approach to variance...
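The effect studied here — placement errors inflating the variance of a one-dimensional systematic-sampling estimator — can be reproduced in a small simulation. This is a sketch with made-up parameters (sine integrand, grid spacing 0.05, Gaussian jitter), not the authors' analysis.

```python
import math
import random

def systematic_estimate(f, h, u, jitter_sd, rng):
    """Estimate the integral of f over [0, 1) from a systematic grid
    u, u+h, u+2h, ...; each point is optionally perturbed by a Gaussian
    placement error, and points jittered outside [0, 1) are lost."""
    total, k = 0.0, 0
    while u + k * h < 1.0:
        x = u + k * h + (rng.gauss(0.0, jitter_sd) if jitter_sd else 0.0)
        if 0.0 <= x < 1.0:
            total += f(x)
        k += 1
    return h * total

def mean_and_variance(f, h, jitter_sd, reps=2000, seed=1):
    """Monte Carlo mean and variance of the estimator over random grid starts."""
    rng = random.Random(seed)
    vals = [systematic_estimate(f, h, rng.random() * h, jitter_sd, rng)
            for _ in range(reps)]
    m = sum(vals) / reps
    return m, sum((v - m) ** 2 for v in vals) / reps

f = lambda x: math.sin(math.pi * x)                    # true integral: 2/pi
m_exact, v_exact = mean_and_variance(f, 0.05, 0.0)     # perfectly periodic grid
m_jitter, v_jitter = mean_and_variance(f, 0.05, 0.01)  # grid with placement errors
```

For a smooth integrand the exactly periodic grid has very small variance, and even modest placement errors dominate it, which is the phenomenon the paper quantifies.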

  10. An Abecedary of Sampling.

    Science.gov (United States)

    Doyle, Kenneth O., Jr.

    1979-01-01

    The vocabulary of sampling is examined in order to provide a clear understanding of basic sampling concepts. The basic vocabulary of sampling (population, probability sampling, precision and bias, stratification), the fundamental grammar of sampling (random sample), sample size and response rate, and cluster, multiphase, snowball, and panel…

  11. Modern survey sampling

    CERN Document Server

    Chaudhuri, Arijit

    2014-01-01

    Exposure to Sampling: Abstract; Introduction; Concepts of Population, Sample, and Sampling. Initial Ramifications: Abstract; Introduction; Sampling Design, Sampling Scheme; Random Numbers and Their Uses in Simple Random Sampling (SRS); Drawing Simple Random Samples with and without Replacement; Estimation of Mean, Total, Ratio of Totals/Means: Variance and Variance Estimation; Determination of Sample Sizes; A.2 Appendix to Chapter 2; A. More on Equal Probability Sampling; A. Horvitz-Thompson Estimator; A. Sufficiency; A. Likelihood; A. Non-Existence Theorem. More Intricacies: Abstract; Introduction; Unequal Probability Sampling Strategies; PPS Sampling. Exploring Improved Ways: Abstract; Introduction; Stratified Sampling; Cluster Sampling; Multi-Stage Sampling; Multi-Phase Sampling: Ratio and Regression Estimation; Controlled Sampling. Modeling: Introduction; Super-Population Modeling; Prediction Approach; Model-Assisted Approach; Bayesian Methods; Spatial Smoothing; Sampling on Successive Occasions: Panel Rotation; Non-Response and Not-at-Homes; Weighting Adj...

  12. Lunar Sample Atlas

    Data.gov (United States)

    National Aeronautics and Space Administration — The Lunar Sample Atlas provides pictures of the Apollo samples taken in the Lunar Sample Laboratory, full-color views of the samples in microscopic thin-sections,...

  13. Signal sampling circuit

    NARCIS (Netherlands)

    Louwsma, S.M.; Vertregt, Maarten

    2011-01-01

    A sampling circuit for sampling a signal is disclosed. The sampling circuit comprises a plurality of sampling channels adapted to sample the signal in time-multiplexed fashion, each sampling channel comprising a respective track-and-hold circuit connected to a respective analogue to digital
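The time-multiplexed channel scheme in this record can be sketched numerically: each track-and-hold channel sees every M-th sample, and round-robin merging of the channel streams recovers the full-rate sequence. This is a behavioral model only; channel gain, offset, and timing mismatch — the hard part in real time-interleaved converters — is ignored.

```python
import math

def interleave_sample(signal, n_channels):
    """Model of a time-interleaved sampler: channel c takes samples
    c, c+M, c+2M, ... (M = n_channels); merging the per-channel
    streams in round-robin order recovers the full-rate sequence."""
    channels = [signal[c::n_channels] for c in range(n_channels)]
    merged = []
    for i in range(len(signal)):
        merged.append(channels[i % n_channels][i // n_channels])
    return channels, merged

# A 64-sample sinusoid split across 4 channels and reassembled
sig = [math.sin(2 * math.pi * 3 * t / 64) for t in range(64)]
chans, recon = interleave_sample(sig, 4)
```

Each channel runs at 1/M of the aggregate rate, which is what lets each track-and-hold feed a slower analogue-to-digital converter.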

  15. A Mars Sample Return Sample Handling System

    Science.gov (United States)

    Wilson, David; Stroker, Carol

    2013-01-01

    We present a sample handling system, a subsystem of the proposed Dragon-landed Mars Sample Return (MSR) mission [1], that can return to Earth orbit a significant mass of frozen Mars samples potentially consisting of rock cores, subsurface drilled rock and ice cuttings, pebble-sized rocks, and soil scoops. The sample collection, storage, retrieval and packaging assumptions and concepts in this study are applicable to NASA's MPPG MSR mission architecture options [2]. Our study assumes a predecessor rover mission collects samples for return to Earth to address questions on past life, climate change, water history, age dating, understanding Mars interior evolution [3], and human safety and in-situ resource utilization. Hence the rover will have "integrated priorities for rock sampling" [3] that cover collection of subaqueous or hydrothermal sediments, low-temperature fluid-altered rocks, unaltered igneous rocks, regolith and atmosphere samples. Samples could include drilled rock cores, alluvial and fluvial deposits, subsurface ice and soils, clays, sulfates, salts including perchlorates, aeolian deposits, and concretions. Samples will thus have a broad range of bulk densities and will require, for Earth-based analysis where practical, in-situ characterization, management of degradation such as perchlorate deliquescence and volatile release, and contamination management. We propose to adopt a sample container with a set of cups, each holding a sample from a specific location. We considered two sample cup sizes: (1) a small cup sized for samples matching those submitted to in-situ characterization instruments, and (2) a larger cup for 100 mm rock cores [4] and pebble-sized rocks, thus providing diverse samples and optimizing the MSR sample mass payload fraction for a given payload volume. We minimize sample degradation by keeping samples frozen in the MSR payload sample canister using Peltier chip cooling. The cups are sealed by interference-fitted, heat-activated memory

  16. Information sampling behavior with explicit sampling costs

    Science.gov (United States)

    Juni, Mordechai Z.; Gureckis, Todd M.; Maloney, Laurence T.

    2015-01-01

    The decision to gather information should take into account both the value of information and its accrual costs in time, energy and money. Here we explore how people balance the monetary costs and benefits of gathering additional information in a perceptual-motor estimation task. Participants were rewarded for touching a hidden circular target on a touch-screen display. The target’s center coincided with the mean of a circular Gaussian distribution from which participants could sample repeatedly. Each “cue” — sampled one at a time — was plotted as a dot on the display. Participants had to repeatedly decide, after sampling each cue, whether to stop sampling and attempt to touch the hidden target or continue sampling. Each additional cue increased the participants’ probability of successfully touching the hidden target but reduced their potential reward. Two experimental conditions differed in the initial reward associated with touching the hidden target and the fixed cost per cue. For each condition we computed the optimal number of cues that participants should sample, before taking action, to maximize expected gain. Contrary to recent claims that people gather less information than they objectively should before taking action, we found that participants over-sampled in one experimental condition, and did not significantly under- or over-sample in the other. Additionally, while the ideal observer model ignores the current sample dispersion, we found that participants used it to decide whether to stop sampling and take action or continue sampling, a possible consequence of imperfect learning of the underlying population dispersion across trials. PMID:27429991
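The cost-benefit structure of this task can be sketched under a simplifying assumption: the participant aims at the sample mean of n cues drawn from a circular Gaussian, so the hit probability grows with n while the reward shrinks by a fixed cost per cue. The reward, cost, and dispersion values below are hypothetical, not the paper's fitted parameters.

```python
import math

def expected_gain(n, reward0, cost, sigma, radius):
    """Expected gain after sampling n cues: aiming at the mean of n draws
    from a circular Gaussian (sd sigma per axis), the aim point's distance
    from the target center is Rayleigh(sigma / sqrt(n)), so the chance of
    landing within `radius` is 1 - exp(-radius^2 * n / (2 sigma^2))."""
    p_hit = 1.0 - math.exp(-radius ** 2 * n / (2.0 * sigma ** 2))
    return p_hit * (reward0 - cost * n)

def optimal_cues(reward0, cost, sigma, radius):
    """Number of cues maximizing expected gain (searched exhaustively)."""
    return max(range(1, int(reward0 / cost)),
               key=lambda n: expected_gain(n, reward0, cost, sigma, radius))

n_star = optimal_cues(reward0=100.0, cost=2.0, sigma=10.0, radius=5.0)
```

Comparing participants' stopping points against such an optimum is how over- and under-sampling are diagnosed in this kind of study.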

  17. Sampling and P-sampling expansions

    Indian Academy of Sciences (India)

    Using the hyperfinite representation of functions and generalized functions this paper develops a rigorous version of the so-called `delta method' approach to sampling theory. This yields a slightly more general version of the classical WKS sampling theorem for band-limited functions.
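The classical WKS theorem this paper generalizes states that a function band-limited to |ν| < 1/(2T) is exactly determined by its samples f(kT) via sinc interpolation, f(t) = Σ_k f(kT) sinc((t − kT)/T). A truncated numerical check follows; it is illustrative only and has nothing of the paper's hyperfinite construction.

```python
import math

def sinc(x):
    """Normalized sinc: sin(pi x) / (pi x), with sinc(0) = 1."""
    return 1.0 if x == 0 else math.sin(math.pi * x) / (math.pi * x)

def wks_reconstruct(samples, T, t, k0=0):
    """Truncated WKS reconstruction from samples f((k0 + k) T)."""
    return sum(s * sinc((t - (k0 + k) * T) / T) for k, s in enumerate(samples))

# Band-limited test signal f(t) = sin(2 pi 0.1 t), sampled at T = 1
# (bandwidth 0.1 is well below the Nyquist limit 0.5 for this rate)
T = 1.0
samples = [math.sin(2 * math.pi * 0.1 * k * T) for k in range(-200, 201)]
approx = wks_reconstruct(samples, T, t=0.5, k0=-200)
exact = math.sin(2 * math.pi * 0.1 * 0.5)
```

With 401 samples the truncation error at t = 0.5 is already small; the exact theorem needs the full doubly infinite sum.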

  18. How Sample Size Affects a Sampling Distribution

    Science.gov (United States)

    Mulekar, Madhuri S.; Siegel, Murray H.

    2009-01-01

    If students are to understand inferential statistics successfully, they must have a profound understanding of the nature of the sampling distribution. Specifically, they must comprehend the determination of the expected value and standard error of a sampling distribution as well as the meaning of the central limit theorem. Many students in a high…
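The point about expected value and standard error can be demonstrated in a few lines: simulated sampling distributions of the mean center on the population mean, and their spread shrinks like σ/√n. This is a classroom-style sketch with an arbitrary population, not the authors' materials.

```python
import random
import statistics

def sampling_distribution(population, n, reps=4000, seed=42):
    """Draw `reps` samples of size n (with replacement) and record each
    sample mean; the spread of these means is the standard error."""
    rng = random.Random(seed)
    return [statistics.fmean(rng.choices(population, k=n)) for _ in range(reps)]

pop = list(range(10)) * 100          # population mean 4.5, sd ~ 2.87
means_4 = sampling_distribution(pop, 4)
means_64 = sampling_distribution(pop, 64)
se_4 = statistics.pstdev(means_4)    # should be near 2.87 / 2
se_64 = statistics.pstdev(means_64)  # should be near 2.87 / 8
```

Quadrupling n halves the standard error, and by the central limit theorem the histogram of `means_64` is close to normal even though the population is uniform.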

  19. Chorionic Villus Sampling (CVS)

    Science.gov (United States)

    Chorionic villus sampling (CVS) is a prenatal test . It’s used to ...

  20. Iowa Geologic Sampling Points

    Data.gov (United States)

    Iowa State University GIS Support and Research Facility — Point locations of geologic samples/files in the IGS repository. Types of samples include well cuttings, outcrop samples, cores, drillers logs, measured sections,...

  1. DNA Sampling Hook

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The DNA Sampling Hook is a significant improvement on a method of obtaining a tissue sample from a live fish in situ from an aquatic environment. A tissue sample...

  2. Samples 9 (2010)

    OpenAIRE

    Arbeitskreis Studium Populärer Musik e.V. (ASPM)

    2010-01-01

    FOCUS TOPIC: SAMPLING IN HIP-HOP. Guest editors: Oliver Kautny and Adam Krims. Oliver Kautny: Talkin´ All That Jazz - A Plea for the Analysis of Sampling in Hip-Hop (editorial). Adam Krims: Sampling in Scholarship (English editorial). Mark Katz: Sampling before Sampling. The Link Between DJ and Producer. Sascha Klammt aka Quasi Modo: The Sample - A Unique Snapshot as the Basis for a New Composition. Detlev Rick aka DJ Rick Ski: The Making of the A...

  3. Sampling in Practice

    DEFF Research Database (Denmark)

    Petersen, Lars; Esbensen, Kim Harry

    2005-01-01

    A basic knowledge of the Theory of Sampling (TOS) and a set of only eight sampling unit operations is all the practical sampler needs to ensure representativeness of samples extracted from all kinds of lots: production batches, truckloads, barrels, sub-division in the laboratory, sampling...... in nature and in the field (environmental sampling, forestry, geology, biology), from raw materials or manufacturing processes, etc. Here we can only give a brief introduction to the Fundamental Sampling Principle (FSP) and these eight Sampling Unit Operations (SUOs). Always respecting FSP and invoking only...

  4. Sampling and P-sampling expansions

    Indian Academy of Sciences (India)

    In this paper we consider instead a non-standard approach to sampling theory. The hyperfinite representation of functions and generalized functions has been studied in an earlier paper [2], and the same notation and conventions will be used here. In particular, [symbol garbled in extraction] denotes a given even infinite hypernatural number, ...

  5. Genetic Sample Inventory

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This database archives genetic tissue samples from marine mammals collected primarily from the U.S. east coast. The collection includes samples from field programs,...

  6. Genetic Sample Inventory - NRDA

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This database archives genetic tissue samples from marine mammals collected in the North-Central Gulf of Mexico from 2010-2015. The collection includes samples from...

  7. Superposition Enhanced Nested Sampling

    Science.gov (United States)

    Martiniani, Stefano; Stevenson, Jacob D.; Wales, David J.; Frenkel, Daan

    2014-07-01

    The theoretical analysis of many problems in physics, astronomy, and applied mathematics requires an efficient numerical exploration of multimodal parameter spaces that exhibit broken ergodicity. Monte Carlo methods are widely used to deal with these classes of problems, but such simulations suffer from a ubiquitous sampling problem: The probability of sampling a particular state is proportional to its entropic weight. Devising an algorithm capable of sampling efficiently the full phase space is a long-standing problem. Here, we report a new hybrid method for the exploration of multimodal parameter spaces exhibiting broken ergodicity. Superposition enhanced nested sampling combines the strengths of global optimization with the unbiased or athermal sampling of nested sampling, greatly enhancing its efficiency with no additional parameters. We report extensive tests of this new approach for atomic clusters that are known to have energy landscapes for which conventional sampling schemes suffer from broken ergodicity. We also introduce a novel parallelization algorithm for nested sampling.
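For readers unfamiliar with the nested sampling this method builds on, here is a minimal textbook sketch (after Skilling) on a toy one-dimensional problem — uniform prior on [0, 1], Gaussian likelihood — using naive rejection to sample the constrained prior, the very inefficiency that enhanced variants address. All parameter values are illustrative.

```python
import math
import random

def nested_sampling(loglike, n_live=100, n_iter=800, seed=3):
    """Textbook nested-sampling sketch: repeatedly replace the worst live
    point with a fresh prior draw above the current likelihood threshold;
    the remaining prior volume shrinks by ~exp(-1/n_live) per step.
    Prior is uniform on [0, 1]; values are modest, so no log-sum tricks."""
    rng = random.Random(seed)
    live = [rng.random() for _ in range(n_live)]
    live_ll = [loglike(x) for x in live]
    Z, X_prev = 0.0, 1.0
    for i in range(1, n_iter + 1):
        worst = min(range(n_live), key=lambda j: live_ll[j])
        ll_min = live_ll[worst]
        X = math.exp(-i / n_live)          # expected remaining prior volume
        Z += (X_prev - X) * math.exp(ll_min)
        X_prev = X
        while True:                        # rejection-sample the constrained prior
            x = rng.random()
            if loglike(x) > ll_min:
                break
        live[worst], live_ll[worst] = x, loglike(x)
    Z += X_prev * sum(math.exp(l) for l in live_ll) / n_live
    return Z

sigma = 0.1
loglike = lambda x: -((x - 0.5) ** 2) / (2 * sigma ** 2)
Z = nested_sampling(loglike)
Z_true = sigma * math.sqrt(2 * math.pi)    # tails outside [0, 1] are negligible
```

The rejection step is what breaks down on rugged, multimodal landscapes with broken ergodicity, motivating the superposition-enhanced replacement described in the abstract.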

  8. Mold Testing or Sampling

    Science.gov (United States)

    In most cases, if visible mold growth is present, sampling is unnecessary. Since no EPA or other federal limits have been set for mold or mold spores, sampling cannot be used to check a building's compliance with federal mold standards.

  9. Chorionic villus sampling

    Science.gov (United States)

    ... medlineplus.gov/ency/article/003406.htm Chorionic villus sampling (CVS) is a test some pregnant women have ...

  10. Sampling on Quasicrystals

    OpenAIRE

    Grepstad, Sigrid

    2011-01-01

    We prove that quasicrystals are universal sets of stable sampling in any dimension. Necessary and sufficient density conditions for stable sampling and interpolation sets in one dimension are studied in detail.

  12. Lunar Sample Compendium

    Data.gov (United States)

    National Aeronautics and Space Administration — The purpose of the Lunar Sample Compendium is to inform scientists, astronauts and the public about the various lunar samples that have been returned from the Moon....

  13. Developing Water Sampling Standards

    Science.gov (United States)

    Environmental Science and Technology, 1974

    1974-01-01

    Participants in the D-19 symposium on aquatic sampling and measurement for water pollution assessment were informed that determining the extent of waste-water stream pollution is not a cut-and-dried procedure. Topics discussed include field sampling, representative sampling from storm sewers, suggested sampler features and application of improved…

  14. IAEA Sampling Plan

    Energy Technology Data Exchange (ETDEWEB)

    Geist, William H. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-15

    The objectives for this presentation are to describe the method that the IAEA uses to determine a sampling plan for nuclear material measurements; describe the terms detection probability and significant quantity; list the three nuclear materials measurement types; describe the sampling method applied to an item facility; and describe multiple method sampling.

  15. Aerosol sampling system

    Science.gov (United States)

    Masquelier, Donald A.

    2004-02-10

    A system for sampling air and collecting particulate of a predetermined particle size range. A low pass section has an opening of a preselected size for gathering the air but excluding particles larger than the sample particles. An impactor section is connected to the low pass section and separates the air flow into a bypass air flow that does not contain the sample particles and a product air flow that does contain the sample particles. A wetted-wall cyclone collector, connected to the impactor section, receives the product air flow and traps the sample particles in a liquid.

  16. Sample Proficiency Test exercise

    Energy Technology Data Exchange (ETDEWEB)

    Alcaraz, A; Gregg, H; Koester, C

    2006-02-05

    The current format of the OPCW proficiency tests has multiple sets of 2 samples sent to an analysis laboratory. In each sample set, one is identified as a sample, the other as a blank. This method of conducting proficiency tests differs from how an OPCW designated laboratory would receive authentic samples (a set of three containers, each not identified, consisting of the authentic sample, a control sample, and a blank sample). This exercise was designed to test how reporting would work if the proficiency tests were conducted in that more realistic format. As such, this is not an official OPCW proficiency test, and the attached report is one method by which LLNL might report their analyses under a more realistic testing scheme. The title on the report, "Report of the Umpteenth Official OPCW Proficiency Test", is therefore meaningless, and provides a bit of whimsy for the analysts and readers of the report.

  17. The Lunar Sample Compendium

    Science.gov (United States)

    Meyer, Charles

    2009-01-01

    The Lunar Sample Compendium is a succinct summary of the data obtained from 40 years of study of Apollo and Luna samples of the Moon. Basic petrographic, chemical and age information is compiled, sample-by-sample, in the form of an advanced catalog in order to provide a basic description of each sample. The LSC can be found online using Google. The initial allocation of lunar samples was done sparingly, because it was realized that scientific techniques would improve over the years and new questions would be formulated. The LSC is important because it enables scientists to select samples within the context of the work that has already been done and facilitates better review of proposed allocations. It also provides back up material for public displays, captures information found only in abstracts, grey literature and curatorial databases and serves as a ready access to the now-vast scientific literature.

  18. Fast mixing hyperdynamic sampling

    OpenAIRE

    Sminchisescu, Cristian; Triggs, Bill

    2006-01-01

    Special issue on ECCV'02 papers; International audience; Sequential random sampling (‘Markov Chain Monte-Carlo') is a popular strategy for many vision problems involving multi-modal distributions over high-dimensional parameter spaces. It applies both to importance sampling (where one wants to sample points according to their ‘importance' for some calculation, but otherwise fairly) and to global-optimization (where one wants to find good minima, or at least good starting points for local mini...

  19. Hyperdynamics Importance Sampling

    OpenAIRE

    Sminchisescu, Cristian; Triggs, Bill

    2002-01-01

    International audience; Sequential random sampling (‘Markov Chain Monte-Carlo') is a popular strategy for many vision problems involving multimodal distributions over high-dimensional parameter spaces. It applies both to importance sampling (where one wants to sample points according to their ‘importance' for some calculation, but otherwise fairly) and to global optimization (where one wants to find good minima, or at least good starting points for local minimization, regardless of fairness)....

  20. Laboratory Sampling Guide

    Science.gov (United States)

    2012-05-11

    [Abstract not recoverable; the extracted text is table residue listing metal analytes (aluminum, antimony, arsenic, barium, beryllium, cadmium, chromium, cobalt, copper, iron, lead, magnesium, manganese, molybdenum, nickel, potassium) collected on paper or MCE filters, plus fragments of a sampling-equipment selection guide (matrix, sampling device, device-specific guidance, sample type, comments; e.g., automatic sampler for liquids per ASTM D...).]

  1. Sample size methodology

    CERN Document Server

    Desu, M M

    2012-01-01

    One of the most important problems in designing an experiment or a survey is sample size determination and this book presents the currently available methodology. It includes both random sampling from standard probability distributions and from finite populations. Also discussed is sample size determination for estimating parameters in a Bayesian setting by considering the posterior distribution of the parameter and specifying the necessary requirements. The determination of the sample size is considered for ranking and selection problems as well as for the design of clinical trials. Appropria
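The classical normal-approximation case the book covers first is compact enough to state: to estimate a mean to within margin E at confidence level z, take n = ⌈(zσ/E)²⌉, with a finite-population correction when sampling without replacement from a population of size N. These are standard formulas; the numbers below are illustrative.

```python
import math

def sample_size_for_mean(sigma, margin, z=1.96):
    """n = ceil((z * sigma / margin)^2): smallest n whose normal-theory
    confidence interval for the mean has half-width <= margin."""
    return math.ceil((z * sigma / margin) ** 2)

def fpc_adjust(n, N):
    """Finite-population correction: required n when sampling without
    replacement from a population of size N."""
    return math.ceil(n / (1 + (n - 1) / N))

n = sample_size_for_mean(sigma=15.0, margin=2.0)  # 217 for an infinite population
n_small_pop = fpc_adjust(n, N=1000)               # fewer needed from 1000 units
```

Bayesian and ranking-and-selection determinations replace the normal-theory interval with posterior or indifference-zone requirements, but the shape of the calculation is similar.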

  2. Sample Differentiation: Cocaine Example.

    Science.gov (United States)

    Baugh, L D; Liu, R H

    1991-12-01

    Since the analyses of drug samples in crime laboratories are often associated with investigations, potential differentiations of test samples are frequently requested and explored. Cocaine sample differentiation requires the determination of synthetic or natural origin. Synthetic samples are characterized by the presence of optical isomers, certain diastereoisomers and other by-products, and chemical residues used in synthesis. Samples derived from a natural origin (coca leaves) are characterized by the presence of certain natural products or their derivatives that are carried through the overall process and by residual chemicals reflecting the treatment procedures. Various approaches and analytical data available in the literature concerning the differentiation of cocaine samples are reviewed. Each sample must carry its own "signature"; however, true sample "individualization" cannot be accomplished using the technologies commonly available and used in crime laboratories, and is not usually needed. Alternatively, "classifying" cocaine samples in certain categories or groups can be accomplished routinely and often provides adequate information for investigatory purposes. Copyright © 1991 Central Police University.

  3. Industrial Hygiene Sampling Instructions

    Science.gov (United States)

    1987-03-01

    ... transport. g. Field blank tubes will be submitted with each set of samples. If the number of samples in a set exceeds 10, then submit at the rate of one... [Remainder is table residue listing filter and cassette specifications, e.g. Gelman PSPJ037 37 mm (for PAH), Membrana PVC 37 mm, Gelman 66467 37 mm, Nuclepore 361850 (filter only), Millipore Swinnex cassette 13 mm.]

  4. Simple street tree sampling

    Science.gov (United States)

    David J. Nowak; Jeffrey T. Walton; James Baldwin; Jerry. Bond

    2015-01-01

    Information on street trees is critical for management of this important resource. Sampling of street tree populations provides an efficient means to obtain street tree population information. Long-term repeat measures of street tree samples supply additional information on street tree changes and can be used to report damages from catastrophic events. Analyses of...

  5. Sampling system and method

    Science.gov (United States)

    Decker, David L.; Lyles, Brad F.; Purcell, Richard G.; Hershey, Ronald Lee

    2013-04-16

    The present disclosure provides an apparatus and method for coupling conduit segments together. A first pump obtains a sample and transmits it through a first conduit to a reservoir accessible by a second pump. The second pump further conducts the sample from the reservoir through a second conduit.

  6. Extraterrestrial Samples at JSC

    Science.gov (United States)

    Allen, Carlton C.

    2007-01-01

    A viewgraph presentation on the curation of extraterrestrial samples at NASA Johnson Space Center is shown. The topics include: 1) Apollo lunar samples; 2) Meteorites from Antarctica; 3) Cosmic dust from the stratosphere; 4) Genesis solar wind ions; 5) Stardust comet and interstellar grains; and 5) Space-Exposed Hardware.

  7. Gaussian Boson Sampling

    Science.gov (United States)

    Hamilton, Craig S.; Kruse, Regina; Sansoni, Linda; Barkhofen, Sonja; Silberhorn, Christine; Jex, Igor

    2017-10-01

    Boson sampling has emerged as a tool to explore the advantages of quantum over classical computers as it does not require universal control over the quantum system, which favors current photonic experimental platforms. Here, we introduce Gaussian Boson sampling, a classically hard-to-solve problem that uses squeezed states as a nonclassical resource. We relate the probability to measure specific photon patterns from a general Gaussian state in the Fock basis to a matrix function called the Hafnian, which answers the last remaining question of sampling from Gaussian states. Based on this result, we design Gaussian Boson sampling, a #P hard problem, using squeezed states. This demonstrates that Boson sampling from Gaussian states is possible, with significant advantages in the photon generation probability, compared to existing protocols.
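The Hafnian mentioned here is the sum, over all perfect matchings of the matrix's index set, of the products of the matched entries. For tiny matrices it can be computed by direct expansion — the brute force below is exponential in n, and that #P-hardness is exactly why sampling photon patterns from Gaussian states is believed classically hard. This evaluator is an illustration, not a scalable algorithm.

```python
def hafnian(A):
    """Hafnian by perfect-matching expansion: pair index 0 with each j,
    multiply by the hafnian of the matrix with rows/columns 0 and j
    removed. A must be symmetric with even dimension; haf of the empty
    matrix is 1."""
    n = len(A)
    if n == 0:
        return 1
    if n % 2:
        raise ValueError("hafnian needs an even-dimensional matrix")
    total = 0
    for j in range(1, n):
        rest = [[A[r][c] for c in range(1, n) if c != j]
                for r in range(1, n) if r != j]
        total += A[0][j] * hafnian(rest)
    return total

# For the all-ones 2n x 2n matrix the hafnian counts the perfect
# matchings of the complete graph K_{2n}, i.e. (2n - 1)!!
count4 = hafnian([[1] * 4 for _ in range(4)])  # 3
count6 = hafnian([[1] * 6 for _ in range(6)])  # 15
```

The permanent plays the corresponding role for standard Boson sampling with single photons; the Hafnian generalizes it to the squeezed-state case.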

  8. Sample pretreatment in microsystems

    DEFF Research Database (Denmark)

    Perch-Nielsen, Ivan R.

    2003-01-01

    When a sample, e.g. from a patient, is processed using conventional methods, the sample must be transported to the laboratory where it is analyzed, after which the results are sent back. By integrating the separate steps of the analysis in a micro total analysis system (μTAS), results can...... be obtained faster and better, preferably with all the processes from sample to signal moved to the bedside of the patient. Of course there is still much to learn and study in the process of miniaturization. DNA analysis is one process subject to integration. There are roughly three steps in a DNA analysis...: Sample preparation → DNA amplification → DNA analysis. The overall goal of the project is integration of as many as possible of these steps. This thesis covers mainly pretreatment in a microchip. Some methods for sample pretreatment have been tested. Most conventional is fluorescence activated cell sort......

  9. Biological sample collector

    Science.gov (United States)

    Murphy, Gloria A [French Camp, CA

    2010-09-07

    A biological sample collector is adapted to collect several biological samples in a plurality of filter wells. A biological sample collector may comprise a manifold plate for mounting a filter plate thereon, the filter plate having a plurality of filter wells therein; a hollow slider for engaging and positioning a tube that slides therethrough; and a slide case within which the hollow slider travels to allow the tube to be aligned with a selected filter well of the plurality of filter wells, wherein when the tube is aligned with the selected filter well, the tube is pushed through the hollow slider and into the selected filter well to sealingly engage the selected filter well and to allow the tube to deposit a biological sample onto a filter in the bottom of the selected filter well. The biological sample collector may be portable.

  10. Rapid Active Sampling Package

    Science.gov (United States)

    Peters, Gregory

    2010-01-01

    A field-deployable, battery-powered Rapid Active Sampling Package (RASP), originally designed for sampling strong materials during lunar and planetary missions, shows strong utility for terrestrial geological use. The technology is proving to be simple and effective for sampling and processing high-strength materials. Although it was originally intended for planetary and lunar applications, the RASP is very useful as a powered hand tool for geologists and the mining industry to quickly sample and process rocks in the field on Earth. The RASP allows geologists to surgically acquire samples of rock for later laboratory analysis. This tool, roughly the size of a wrench, allows the user to cut away swaths of weathering rinds, revealing pristine rock surfaces for observation and subsequent sampling with the same tool. RASPing deeper (3.5 cm) exposes single rock strata in situ. Where a geologist's hammer can only expose unweathered layers of rock, the RASP can do the same, and then has the added ability to capture and process samples into powder with particle sizes less than 150 microns, making them easier for XRD/XRF (X-ray diffraction/X-ray fluorescence) analysis. The tool uses a rotating rasp bit (or two counter-rotating bits) that resides inside or above the catch container. The container has an open slot to allow the bit to extend outside the container and to allow cuttings to enter and be caught. When the slot and rasp bit are in contact with a substrate, the bit is plunged into it in a matter of seconds to reach pristine rock. A user in the field may sample a rock multiple times at multiple depths in minutes, instead of having to cut out huge, heavy rock samples for transport back to a lab for analysis. Because of the speed and accuracy of the RASP, hundreds of samples can be taken in one day. RASP-acquired samples are small and easily carried, so a user can characterize more area in less time than by using conventional methods. The field-deployable RASP used a Ni...

  11. Contributions to sampling statistics

    CERN Document Server

    Conti, Pier; Ranalli, Maria

    2014-01-01

    This book contains a selection of the papers presented at the ITACOSM 2013 Conference, held in Milan in June 2013. ITACOSM is the bi-annual meeting of the Survey Sampling Group S2G of the Italian Statistical Society, intended as an international forum of scientific discussion on the developments of theory and application of survey sampling methodologies and applications in human and natural sciences. The book gathers research papers carefully selected from both invited and contributed sessions of the conference. The whole book appears to be a relevant contribution to various key aspects of sampling methodology and techniques; it deals with some hot topics in sampling theory, such as calibration, quantile-regression and multiple frame surveys, and with innovative methodologies in important topics of both sampling theory and applications. Contributions cut across current sampling methodologies such as interval estimation for complex samples, randomized responses, bootstrap, weighting, modeling, imputati...

  12. Sample quality criteria.

    Science.gov (United States)

    Ramsey, Charles A; Wagner, Claas

    2015-01-01

    The concept of Sample Quality Criteria (SQC) is the initial step in the scientific approach to representative sampling. It includes the establishment of sampling objectives, Decision Unit (DU), and confidence. Once fully defined, these criteria serve as input, in addition to material properties, to the Theory of Sampling for developing a representative sampling protocol. The first component of the SQC establishes these questions: What is the analyte(s) of concern? What is the concentration level of interest of the analyte(s)? How will inference(s) be made from the analytical data to the DU? The second component of the SQC establishes the DU, i.e., the scale at which decisions are to be made. On a large scale, a DU could be a ship or rail car; examples for small-scale DUs are individual beans, seeds, or kernels. A well-defined DU is critical because it defines the spatial and temporal boundaries of sample collection. SQC are not limited to a single DU; they can also include multiple DUs. The third SQC component, the confidence, establishes the desired probability that a correct inference (decision) can be made. The confidence level should typically correlate to the potential consequences of an incorrect decision (e.g., health or economic). The magnitude of combined errors in the sampling, sample processing and analytical protocols determines the likelihood of an incorrect decision. Thus, controlling error to a greater extent increases the probability of a correct decision. The required confidence level directly affects the sampling effort and QC measures.

  13. Inference for Noisy Samples

    Directory of Open Access Journals (Sweden)

    I. A. Ahmad

    2012-07-01

    In the current work, some well-known inference procedures, including testing and estimation, are adjusted to accommodate noisy data that lead to nonidentically distributed samples. The main two cases addressed are the Poisson and the normal distributions. Both one- and two-sample cases are addressed. Other cases, including the exponential and the Pareto distributions, are briefly mentioned. In the Poisson case, the situation when the sample size is random is mentioned.

  14. Lunar Sample Display Locations

    Data.gov (United States)

    National Aeronautics and Space Administration — NASA provides a number of lunar samples for display at museums, planetariums, and scientific expositions around the world. Lunar displays are open to the public....

  15. Sample Return Robot Project

    Data.gov (United States)

    National Aeronautics and Space Administration — This Challenge requires demonstration of an autonomous robotic system to locate and collect a set of specific sample types from a large planetary analog area and...

  16. Open port sampling interface

    Science.gov (United States)

    Van Berkel, Gary J

    2017-04-25

    A system for sampling a sample material includes a probe which can have an outer probe housing with an open end. A liquid supply conduit within the housing has an outlet positioned to deliver liquid to the open end of the housing. The liquid supply conduit can be connectable to a liquid supply for delivering liquid at a first volumetric flow rate to the open end of the housing. A liquid exhaust conduit within the housing is provided for removing liquid from the open end of the housing. A liquid exhaust system can be provided for removing liquid from the liquid exhaust conduit at a second volumetric flow rate, the first volumetric flow rate exceeding the second volumetric flow rate, wherein liquid at the open end will receive sample, liquid containing sample material will be drawn into and through the liquid exhaust conduit, and liquid will overflow from the open end.

  17. Open port sampling interface

    Energy Technology Data Exchange (ETDEWEB)

    Van Berkel, Gary J.

    2018-01-16

    A system for sampling a sample material includes a probe which can have an outer probe housing with an open end. A liquid supply conduit within the housing has an outlet positioned to deliver liquid to the open end of the housing. The liquid supply conduit can be connectable to a liquid supply for delivering liquid at a first volumetric flow rate to the open end of the housing. A liquid exhaust conduit within the housing is provided for removing liquid from the open end of the housing. A liquid exhaust system can be provided for removing liquid from the liquid exhaust conduit at a second volumetric flow rate, the first volumetric flow rate exceeding the second volumetric flow rate, wherein liquid at the open end will receive sample, liquid containing sample material will be drawn into and through the liquid exhaust conduit, and liquid will overflow from the open end.

  18. Stardust Sample Catalog

    Data.gov (United States)

    National Aeronautics and Space Administration — This Catalog summarizes the samples examined in the course of the Preliminary Examination (PE) Team (PET) of the Stardust Mission to comet Wild 2, and the results of...

  19. Roadway sampling evaluation.

    Science.gov (United States)

    2014-09-01

    The Florida Department of Transportation (FDOT) has traditionally required that all sampling and testing of asphalt mixtures be at the Contractor's production facility. With recent staffing cuts, as well as budget reductions, FDOT has been cons...

  20. UFA Auction Sampling Data

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Between January 1984 and June 2002, personnel from NMFS/PIFSC/FRMD/FMB/FMAP and Hawaii Department of Aquatic Resources (DAR) conducted port sampling at the United...

  1. Sample Encapsulation Device Project

    Data.gov (United States)

    National Aeronautics and Space Administration — NASA's Science Mission Directorate is currently considering various sample cache and return missions to the Moon, Mars and asteroids. These missions involve the use...

  2. Mini MAX - Medicaid Sample

    Data.gov (United States)

    U.S. Department of Health & Human Services — To facilitate wider use of MAX, CMS contracted with Mathematica to convene a technical expert panel (TEP) and determine the feasibility of creating a sample file for...

  3. Dissolution actuated sample container

    Science.gov (United States)

    Nance, Thomas A.; McCoy, Frank T.

    2013-03-26

    A sample collection vial and process of using a vial is provided. The sample collection vial has an opening secured by a dissolvable plug. When dissolved, liquids may enter into the interior of the collection vial passing along one or more edges of a dissolvable blocking member. As the blocking member is dissolved, a spring actuated closure is directed towards the opening of the vial which, when engaged, secures the vial contents against loss or contamination.

  4. Two phase sampling

    CERN Document Server

    Ahmad, Zahoor; Hanif, Muhammad

    2013-01-01

    The development of estimators of population parameters based on two-phase sampling schemes has seen a dramatic increase in the past decade. Various authors have developed estimators of population parameters using either one or two auxiliary variables. The present volume is a comprehensive collection of estimators available in single and two phase sampling. The book covers estimators which utilize information on single, two and multiple auxiliary variables of both quantitative and qualitative nature. Th...
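
    A canonical two-phase estimator of the kind this volume collects is the double-sampling ratio estimator: a large, cheap first-phase sample measures only an auxiliary variable x, a smaller second-phase subsample also measures the target y, and the first-phase mean of x calibrates the estimate. The sketch below is a minimal illustration with made-up data, not an estimator taken from the book; the linear relation y = 2x + noise is an assumption chosen so the answer is easy to check.

```python
import random

def two_phase_ratio_estimate(x_phase1, pairs_phase2):
    """Double-sampling ratio estimator of the population mean of y.

    x_phase1:     auxiliary values from the large first-phase sample
    pairs_phase2: (x, y) pairs from the second-phase subsample
    """
    xbar1 = sum(x_phase1) / len(x_phase1)
    xbar2 = sum(x for x, _ in pairs_phase2) / len(pairs_phase2)
    ybar2 = sum(y for _, y in pairs_phase2) / len(pairs_phase2)
    # Calibrate the subsample y-mean by the better first-phase x information.
    return ybar2 * xbar1 / xbar2

rng = random.Random(7)
# Phase 1: 1000 cheap measurements of x; y = 2x + noise is measured only
# on a phase-2 subsample of 100 of those units.
phase1 = [rng.uniform(10, 20) for _ in range(1000)]
phase2 = [(x, 2.0 * x + rng.gauss(0, 1)) for x in rng.sample(phase1, 100)]
estimate = two_phase_ratio_estimate(phase1, phase2)
```

    Because xbar1 rests on 1000 observations while ybar2 uses only 100, multiplying by xbar1/xbar2 corrects for the subsample's luck in which x values it happened to contain.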

  5. Sample size for beginners.

    OpenAIRE

    Florey, C D

    1993-01-01

    The common failure to include an estimation of sample size in grant proposals imposes a major handicap on applicants, particularly for those proposing work in any aspect of research in the health services. Members of research committees need evidence that a study is of adequate size for there to be a reasonable chance of a clear answer at the end. A simple illustrated explanation of the concepts in determining sample size should encourage the faint hearted to pay more attention to this increa...

  6. Wet gas sampling

    Energy Technology Data Exchange (ETDEWEB)

    Welker, T.F.

    1997-07-01

    The quality of gas has changed drastically in the past few years. Most gas is wet with hydrocarbons, water, and heavier contaminants that tend to condense if not handled properly. If a gas stream is contaminated with condensables, the sampling of that stream must be done in a manner that will ensure all of the components in the stream are introduced into the sample container as the composite. The sampling and handling of wet gas is extremely difficult under ideal conditions. There are no ideal conditions in the real world. The problems related to offshore operations and other wet gas systems, as well as the transportation of the sample, are additional problems that must be overcome if the analysis is to mean anything to the producer and gatherer. The sampling of wet gas systems is decidedly more difficult than sampling conventional dry gas systems. Wet gas systems were generally going to result in the measurement of one heating value at the inlet of the pipe and a drastic reduction in the heating value of the gas at the outlet end of the system. This is caused by the fallout or accumulation of the heavier products that, at the inlet, may be in the vapor state in the pipeline; hence, the high gravity and high BTU. But, in fact, because of pressure and temperature variances, these liquids condense and form a liquid that is actually running down the pipe as a stream or is accumulated in drips to be blown from the system. (author)

  7. Recommended protocols for sampling macrofungi

    Science.gov (United States)

    Gregory M. Mueller; John Paul Schmit; Sabine M. Huhndorf; Leif Ryvarden; Thomas E. O'Dell; D. Jean Lodge; Patrick R. Leacock; Milagro Mata; Loengrin Umania; Qiuxin (Florence) Wu; Daniel L. Czederpiltz

    2004-01-01

    This chapter discusses several issues regarding recommended protocols for sampling macrofungi: opportunistic sampling of macrofungi, sampling conspicuous macrofungi using fixed-size plots, sampling small Ascomycetes using microplots, and sampling a fixed number of downed logs.

  8. Nonuniform sampling by quantiles.

    Science.gov (United States)

    Craft, D Levi; Sonstrom, Reilly E; Rovnyak, Virginia G; Rovnyak, David

    2018-02-13

    A flexible strategy for choosing samples nonuniformly from a Nyquist grid using the concept of statistical quantiles is presented for broad classes of NMR experimentation. Quantile-directed scheduling is intuitive and flexible for any weighting function, promotes reproducibility and seed independence, and is generalizable to multiple dimensions. In brief, weighting functions are divided into regions of equal probability, which define the samples to be acquired. Quantile scheduling therefore achieves close adherence to a probability distribution function, thereby minimizing gaps for any given degree of subsampling of the Nyquist grid. A characteristic of quantile scheduling is that one-dimensional, weighted NUS schedules are deterministic; higher-dimensional schedules, however, are similar within a user-specified jittering parameter. To develop unweighted sampling, we investigated the minimum jitter needed to disrupt subharmonic tracts, and show that this criterion can be met in many cases by jittering within 25-50% of the subharmonic gap. For nD-NUS, three supplemental components to choosing samples by quantiles are proposed in this work: (i) forcing the corner samples to ensure sampling to specified maximum values in indirect evolution times, (ii) providing an option to triangular backfill sampling schedules to promote dense/uniform tracts at the beginning of signal evolution periods, and (iii) providing an option to force the edges of nD-NUS schedules to be identical to the 1D quantiles. Quantile-directed scheduling meets the diverse needs of current NUS experimentation, but can also be used for future NUS implementations such as off-grid NUS and more. A computer program implementing these principles (a.k.a. QSched) in 1D- and 2D-NUS is available under the general public license. Copyright © 2018 Elsevier Inc. All rights reserved.
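
    The core recipe in this abstract (divide the weighting function into regions of equal probability, then acquire one sample per region) can be sketched directly. This is a hedged illustration of the quantile idea only, not the QSched program; the exponential weighting function and the grid size are arbitrary choices for demonstration.

```python
import math

def quantile_schedule(weights, n_samples):
    """Pick n_samples grid indices so each chosen point sits at the midpoint
    quantile of a region of equal total weight (deterministic in 1D)."""
    total = sum(weights)
    # Cumulative distribution over the Nyquist grid.
    cdf, acc = [], 0.0
    for w in weights:
        acc += w
        cdf.append(acc / total)
    picks = []
    for k in range(n_samples):
        q = (k + 0.5) / n_samples  # midpoint quantile of region k
        # First grid index whose cumulative weight reaches q.
        idx = next(i for i, c in enumerate(cdf) if c >= q)
        picks.append(idx)
    return sorted(set(picks))  # duplicates removed, ascending order

# Exponentially decaying weighting over a 64-point grid, 16 samples.
grid = 64
weights = [math.exp(-3.0 * i / grid) for i in range(grid)]
schedule = quantile_schedule(weights, 16)
```

    With uniform weights the schedule reduces to evenly spaced grid points; with the decaying weighting above it is dense at early evolution times and sparse later, while leaving no large gaps relative to the weighting, which is the behavior the abstract describes.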

  9. AND/OR Importance Sampling

    OpenAIRE

    Gogate, Vibhav; Dechter, Rina

    2012-01-01

    The paper introduces AND/OR importance sampling for probabilistic graphical models. In contrast to importance sampling, AND/OR importance sampling caches samples in the AND/OR space and then extracts a new sample mean from the stored samples. We prove that AND/OR importance sampling may have lower variance than importance sampling; thereby providing a theoretical justification for preferring it over importance sampling. Our empirical evaluation demonstrates that AND/OR importance sampling is ...
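
    As background, the baseline method the abstract compares against is ordinary importance sampling, which reweights draws from a proposal density q by the ratio p/q to estimate an expectation under a target density p. The sketch below implements only this plain estimator (the AND/OR caching scheme over graphical models is not reproduced); the Gaussian target, proposal, and sample count are illustrative assumptions.

```python
import math
import random

def normal_pdf(x, mu, sigma):
    """Density of N(mu, sigma^2) at x."""
    return math.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def importance_sampling(f, n, seed=0):
    """Estimate E_p[f(X)] for target p = N(0,1) using proposal q = N(0,4)."""
    rng = random.Random(seed)
    num = den = 0.0
    for _ in range(n):
        x = rng.gauss(0.0, 2.0)  # draw from the wider proposal q
        w = normal_pdf(x, 0.0, 1.0) / normal_pdf(x, 0.0, 2.0)  # weight p/q
        num += w * f(x)
        den += w
    return num / den  # self-normalized importance-sampling mean

# E_p[X^2] = 1 for the standard normal target; the weighted mean should be close.
estimate = importance_sampling(lambda x: x * x, 100_000)
```

    The wider proposal keeps the weights p/q bounded, which controls the estimator's variance; a proposal narrower than the target would do the opposite.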

  10. Ethics and sample size.

    Science.gov (United States)

    Bacchetti, Peter; Wolf, Leslie E; Segal, Mark R; McCulloch, Charles E

    2005-01-15

    The belief is widespread that studies are unethical if their sample size is not large enough to ensure adequate power. The authors examine how sample size influences the balance that determines the ethical acceptability of a study: the balance between the burdens that participants accept and the clinical or scientific value that a study can be expected to produce. The average projected burden per participant remains constant as the sample size increases, but the projected study value does not increase as rapidly as the sample size if it is assumed to be proportional to power or inversely proportional to confidence interval width. This implies that the value per participant declines as the sample size increases and that smaller studies therefore have more favorable ratios of projected value to participant burden. The ethical treatment of study participants therefore does not require consideration of whether study power is less than the conventional goal of 80% or 90%. Lower power does not make a study unethical. The analysis addresses only ethical acceptability, not optimality; large studies may be desirable for other than ethical reasons.
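
    The abstract's key quantitative claim, that projected value per participant declines with sample size when value is taken proportional to power, is easy to verify numerically. The sketch below uses the textbook normal-approximation power formula for a one-sided one-sample z-test; the effect size of 0.3 standard deviations and the sample sizes are illustrative assumptions, not numbers from the paper.

```python
import math

def phi(z):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def power(n, effect=0.3, z_alpha=1.645):
    """Approximate power of a one-sided one-sample z-test for a
    standardized effect size (z_alpha = critical value at alpha = 0.05)."""
    return phi(effect * math.sqrt(n) - z_alpha)

# Power grows with n, but the ratio power/n (the paper's value-per-participant
# under the proportionality assumption) shrinks as n grows.
table = [(n, power(n), power(n) / n) for n in (50, 100, 200)]
```

    Power rises from roughly 0.68 at n = 50 to about 0.91 at n = 100 and 0.995 at n = 200, yet power/n falls at every step, which is exactly the declining value-per-participant ratio the authors describe.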

  11. Interactive Sample Book (ISB)

    DEFF Research Database (Denmark)

    Heimdal, Elisabeth Jacobsen; Lenau, Torben Anker; Guglielmi, Michel

    2009-01-01

    Interactive textiles are still quite an unknown phenomenon to many. It is thus often difficult to communicate what kind of potentials lie within these materials. This is why the ISB project was started, as a practice-based research project... and senses in relation to integrated decoration and function, primarily for indoor applications. The result of the project will be a number of interactive textiles, to be gathered in an interactive sample book (ISB), in a similar way as the sample books of wallpapers one can take home from the shop and choose from. In other words, it is a kind of display material, which in a simple manner can illustrate how different techniques and smart materials work. The sample book should display a number of possibilities where sensor technology, smart materials and textiles are mixed to such an extent that the textile...

  12. Radioactive air sampling methods

    CERN Document Server

    Maiello, Mark L

    2010-01-01

    Although the field of radioactive air sampling has matured and evolved over decades, it has lacked a single resource that assimilates technical and background information on its many facets. Edited by experts and with contributions from top practitioners and researchers, Radioactive Air Sampling Methods provides authoritative guidance on measuring airborne radioactivity from industrial, research, and nuclear power operations, as well as naturally occurring radioactivity in the environment. Designed for industrial hygienists, air quality experts, and heath physicists, the book delves into the applied research advancing and transforming practice with improvements to measurement equipment, human dose modeling of inhaled radioactivity, and radiation safety regulations. To present a wide picture of the field, it covers the international and national standards that guide the quality of air sampling measurements and equipment. It discusses emergency response issues, including radioactive fallout and the assets used ...

  13. Strategic Sample Selection

    DEFF Research Database (Denmark)

    Di Tillio, Alfredo; Ottaviani, Marco; Sørensen, Peter Norman

    2017-01-01

    What is the impact of sample selection on the inference payoff of an evaluator testing a simple hypothesis based on the outcome of a location experiment? We show that anticipated selection locally reduces noise dispersion and thus increases informativeness if and only if the noise distribution is double logconvex, as with normal noise. The results are applied to the analysis of strategic sample selection by a biased researcher and extended to the case of uncertain and unanticipated selection. Our theoretical analysis offers applied research a new angle on the problem of selection in empirical...

  14. Sample size for beginners.

    Science.gov (United States)

    Florey, C D

    1993-05-01

    The common failure to include an estimation of sample size in grant proposals imposes a major handicap on applicants, particularly for those proposing work in any aspect of research in the health services. Members of research committees need evidence that a study is of adequate size for there to be a reasonable chance of a clear answer at the end. A simple illustrated explanation of the concepts in determining sample size should encourage the faint hearted to pay more attention to this increasingly important aspect of grantsmanship.

  15. Quantum private data sampling

    Science.gov (United States)

    Fattal, David; Fiorentino, Marco; Beausoleil, Raymond G.

    2009-08-01

    We present a novel quantum communication protocol for "Private Data Sampling", where a player (Bob) obtains a random sample of limited size of a classical database, while the database owner (Alice) remains oblivious as to which bits were accessed. The protocol is efficient in the sense that the communication complexity per query scales at most linearly with the size of the database. It does not violate Lo's "no-go" theorem for one-sided two-party secure computation, since a given joint input by Alice and Bob can result in randomly different protocol outcomes. After outlining the main security features of the protocol, we present our first experimental results.

  16. Request for wood samples

    NARCIS (Netherlands)

    NN,

    1977-01-01

    In recent years the wood collection at the Rijksherbarium was greatly expanded following a renewed interest in wood anatomy as an aid for solving classification problems. Staff members of the Rijksherbarium added to the collection by taking interesting wood samples with them from their expeditions.

  17. Determination of Sample Size

    OpenAIRE

    Naing, Nyi Nyi

    2003-01-01

    There is a particular importance of determining a basic minimum required ‘n’ size of the sample to recognize a particular measurement of a particular population. This article has highlighted the determination of an appropriate size to estimate population parameters.

  18. Drafting Work Sample.

    Science.gov (United States)

    Shawsheen Valley Regional Vocational-Technical High School, Billerica, MA.

    This manual contains a work sample intended to assess a handicapped student's interest in and to screen interested students into a training program in basic mechanical drawing. (The course is based on the entry level of an assistant drafter.) Section 1 describes the assessment, correlates the work performed and worker traits required for…

  19. Optimal Sampling and Interpolation

    NARCIS (Netherlands)

    Shekhawat, Hanumant

    2012-01-01

    The main objective in this thesis is to design optimal samplers, downsamplers and interpolators (holds) which are required in signal processing. The sampled-data system theory is used to fulfill this objective in a generic setup. Signal processing, which includes signal transmission, storage and

  20. Biological Sampling Variability Study

    Energy Technology Data Exchange (ETDEWEB)

    Amidan, Brett G. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Hutchison, Janine R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2016-11-08

    There are many sources of variability that exist in the sample collection and analysis process. This paper addresses many, but not all, sources of variability. The main focus of this paper was to better understand and estimate variability due to differences between samplers. Variability between days was also studied, as well as random variability within each sampler. Experiments were performed using multiple surface materials (ceramic and stainless steel), multiple contaminant concentrations (10 spores and 100 spores), and with and without the presence of interfering material. All testing was done with sponge sticks using 10-inch by 10-inch coupons. Bacillus atrophaeus was used as the BA surrogate. Spores were deposited using wet deposition. Grime was coated on the coupons which were planned to include the interfering material (Section 3.3). Samples were prepared and analyzed at PNNL using CDC protocol (Section 3.4) and then cultured and counted. Five samplers were trained so that samples were taken using the same protocol. Each sampler randomly sampled eight coupons each day, four coupons with 10 spores deposited and four coupons with 100 spores deposited. Each day consisted of one material being tested. The clean samples (no interfering materials) were run first, followed by the dirty samples (coated with interfering material). There was a significant difference in recovery efficiency between the coupons with 10 spores deposited (mean of 48.9%) and those with 100 spores deposited (mean of 59.8%). There was no general significant difference between the clean and dirty (containing interfering material) coupons or between the two surface materials; however, there was a significant interaction between concentration amount and presence of interfering material. The recovery efficiency was close to the same for coupons with 10 spores deposited, but for the coupons with 100 spores deposited, the recovery efficiency for the dirty samples was significantly larger (65

  1. Sampling system and method

    Energy Technology Data Exchange (ETDEWEB)

    Decker, David L.; Lyles, Brad F.; Purcell, Richard G.; Hershey, Ronald Lee

    2017-03-07

    In one embodiment, the present disclosure provides an apparatus and method for supporting a tubing bundle during installation or removal. The apparatus includes a clamp for securing the tubing bundle to an external wireline. In various examples, the clamp is external to the tubing bundle or integral with the tubing bundle. According to one method, a tubing bundle and wireline are deployed together and the tubing bundle periodically secured to the wireline using a clamp. In another embodiment, the present disclosure provides an apparatus and method for coupling conduit segments together. A first pump obtains a sample and transmits it through a first conduit to a reservoir accessible by a second pump. The second pump further conducts the sample from the reservoir through a second conduit. In a specific example, one or more clamps are used to connect the first and/or second conduits to an external wireline.

  2. Steered transition path sampling.

    Science.gov (United States)

    Guttenberg, Nicholas; Dinner, Aaron R; Weare, Jonathan

    2012-06-21

    We introduce a path sampling method for obtaining statistical properties of an arbitrary stochastic dynamics. The method works by decomposing a trajectory in time, estimating the probability of satisfying a progress constraint, modifying the dynamics based on that probability, and then reweighting to calculate averages. Because the progress constraint can be formulated in terms of occurrences of events within time intervals, the method is particularly well suited for controlling the sampling of currents of dynamic events. We demonstrate the method for calculating transition probabilities in barrier crossing problems and survival probabilities in strongly diffusive systems with absorbing states, which are difficult to treat by shooting. We discuss the relation of the algorithm to other methods.

  3. Digital Microfluidics Sample Analyzer

    Science.gov (United States)

    Pollack, Michael G.; Srinivasan, Vijay; Eckhardt, Allen; Paik, Philip Y.; Sudarsan, Arjun; Shenderov, Alex; Hua, Zhishan; Pamula, Vamsee K.

    2010-01-01

    Three innovations address the needs of the medical world with regard to microfluidic manipulation and testing of physiological samples in ways that can benefit point-of-care needs for patients such as premature infants, for which drawing of blood for continuous tests can be life-threatening in their own right, and for expedited results. A chip with sample injection elements, reservoirs (and waste), droplet formation structures, fluidic pathways, mixing areas, and optical detection sites, was fabricated to test the various components of the microfluidic platform, both individually and in integrated fashion. The droplet control system permits a user to control droplet microactuator system functions, such as droplet operations and detector operations. Also, the programming system allows a user to develop software routines for controlling droplet microactuator system functions, such as droplet operations and detector operations. A chip is incorporated into the system with a controller, a detector, input and output devices, and software. A novel filler fluid formulation is used for the transport of droplets with high protein concentrations. Novel assemblies for detection of photons from an on-chip droplet are present, as well as novel systems for conducting various assays, such as immunoassays and PCR (polymerase chain reaction). The lab-on-a-chip (a.k.a. lab-on-a-printed-circuit board) processes physiological samples and comprises a system for automated, multi-analyte measurements using sub-microliter samples of human serum. The invention also relates to a diagnostic chip and system including the chip that performs many of the routine operations of a central lab-based chemistry analyzer, integrating, for example, colorimetric assays (e.g., for proteins), chemiluminescence/fluorescence assays (e.g., for enzymes, electrolytes, and gases), and/or conductometric assays (e.g., for hematocrit on plasma and whole blood) on a single chip platform.

  4. Samples and Sampling Protocols for Scientific Investigations | Joel ...

    African Journals Online (AJOL)

    ... from sampling, through sample preparation, calibration to final measurement and reporting. This paper, therefore offers useful information on practical guidance on sampling protocols in line with best practice and international standards. Keywords: Sampling, sampling protocols, chain of custody, analysis, documentation ...

  5. Repeated judgment sampling: Boundaries

    Directory of Open Access Journals (Sweden)

    Johannes Muller-Trede

    2011-06-01

    This paper investigates the boundaries of the recent result that eliciting more than one estimate from the same person and averaging these can lead to accuracy gains in judgment tasks. It first examines its generality, analysing whether the kind of question being asked has an effect on the size of potential gains. Experimental results show that the question type matters. Previous results reporting potential accuracy gains are reproduced for year-estimation questions, and extended to questions about percentage shares. On the other hand, no gains are found for general numerical questions. The second part of the paper tests repeated judgment sampling's practical applicability by asking judges to provide a third and final answer on the basis of their first two estimates. In an experiment, the majority of judges do not consistently average their first two answers. As a result, they do not realise the potential accuracy gains from averaging.
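
    The accuracy gain from averaging that this paper probes can be illustrated with a small simulation. The independent, zero-mean noise model below is an idealized assumption and represents the best case: the paper's point is precisely that real repeated judgments are correlated and that judges often fail to average, which shrinks or eliminates the gain.

```python
import random

def simulate(n_judges=10_000, truth=100.0, noise_sd=10.0, seed=1):
    """Mean absolute error of a single estimate vs. the average of two
    estimates, under independent zero-mean Gaussian judgment noise."""
    rng = random.Random(seed)
    err_single = err_avg = 0.0
    for _ in range(n_judges):
        first = truth + rng.gauss(0.0, noise_sd)
        second = truth + rng.gauss(0.0, noise_sd)
        err_single += abs(first - truth)
        err_avg += abs((first + second) / 2.0 - truth)
    return err_single / n_judges, err_avg / n_judges

single_mae, averaged_mae = simulate()
```

    Under independent noise the mean absolute error of the average shrinks by a factor of about 1/sqrt(2); correlation between a judge's two estimates moves that factor back toward 1, consistent with the boundaries the paper reports.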

  6. Sampling the Hydrogen Atom

    Directory of Open Access Journals (Sweden)

    Graves N.

    2013-01-01

    A model is proposed for the hydrogen atom in which the electron is an objectively real particle orbiting at very near to light speed. The model is based on the postulate that certain velocity terms associated with orbiting bodies can be considered as being affected by relativity. This leads to a model for the atom in which the stable electron orbits are associated with orbital velocities where Gamma is n/α, leading to the idea that it is Gamma that is quantized and not angular momentum as in the Bohr and other models. The model provides a mechanism which leads to quantization of energy levels within the atom and also provides a simple mechanical explanation for the Fine Structure Constant. The mechanism is closely associated with the Sampling theorem and the related phenomenon of aliasing developed in the mid-20th century by engineers at Bell Labs.

  7. Sample collection, biobanking, and analysis

    NARCIS (Netherlands)

    Ahsman, Maurice J.; Tibboel, Dick; Mathot, Ron A. A.; de Wildt, Saskia N.

    2011-01-01

    Pediatric pharmacokinetic studies require sampling of biofluids from neonates and children. Limitations on sampling frequency and sample volume complicate the design of these studies. In addition, strict guidelines, designed to guarantee patient safety, are in place. This chapter describes the

  8. Variable Sampling Mapping

    Science.gov (United States)

    Smith, Jeffrey, S.; Aronstein, David L.; Dean, Bruce H.; Lyon, Richard G.

    2012-01-01

    The performance of an optical system (for example, a telescope) is limited by the misalignments and manufacturing imperfections of the optical elements in the system. The impact of these misalignments and imperfections can be quantified by the phase variations imparted on light traveling through the system. Phase retrieval is a methodology for determining these variations. Phase retrieval uses images taken with the optical system, using a light source of known shape and characteristics. Unlike interferometric methods, which require an optical reference for comparison, and unlike Shack-Hartmann wavefront sensors that require special optical hardware at the optical system's exit pupil, phase retrieval is an in situ, image-based method for determining the phase variations of light at the system's exit pupil. Phase retrieval can be used both as an optical metrology tool (during fabrication of optical surfaces and assembly of optical systems) and as a sensor used in active, closed-loop control of an optical system, to optimize performance. One class of phase-retrieval algorithms is the iterative transform algorithm (ITA). ITAs estimate the phase variations by iteratively enforcing known constraints in the exit pupil and at the detector, determined from modeled or measured data. The Variable Sampling Mapping (VSM) technique is a new method for enforcing these constraints in ITAs. VSM is an open framework for addressing a wide range of issues that have previously been considered detrimental to high-accuracy phase retrieval, including undersampled images, broadband illumination, images taken at or near best focus, chromatic aberrations, jitter or vibration of the optical system or detector, and dead or noisy detector pixels. The VSM is a model-to-data mapping procedure. In VSM, fully sampled electric fields at multiple wavelengths are modeled inside the phase-retrieval algorithm, and then these fields are mapped to intensities on the light detector, using the properties

  9. Sample size estimation and sampling techniques for selecting a representative sample

    OpenAIRE

    Aamir Omair

    2014-01-01

    Introduction: The purpose of this article is to provide a general understanding of the concepts of sampling as applied to health-related research. Sample Size Estimation: It is important to select a representative sample in quantitative research in order to be able to generalize the results to the target population. The sample should be of the required sample size and must be selected using an appropriate probability sampling technique. There are many hidden biases which can adversely affect ...

  10. Two-Step Sequential Sampling

    NARCIS (Netherlands)

    Moors, J.J.A.; Strijbosch, L.W.G.

    2000-01-01

    Deciding upon the optimal sample size in advance is a difficult problem in general. Often, the investigator regrets not having drawn a larger sample; in many cases additional observations are done. This implies that the actual sample size is no longer deterministic; hence, even if all sample

  11. Sample Types: Advantages and Disadvantages (Jenis Sample: Keuntungan dan Kerugiannya)

    OpenAIRE

    Suprapto, Agus

    1994-01-01

    A sample is a part of a population that is used in a study for the purpose of making estimations about the nature of the total population; it is obtained with a sampling technique. Sampling is more advantageous than a census because it can reduce cost and time, and it can gather deeper information and more accurate data. It is useful to distinguish two major types of sampling techniques. First, probability sampling, e.g. simple random sampling. Second, non-probability sampling, e.g. systematic sampling...

  12. Sample design for Understanding Society

    OpenAIRE

    Lynn, Peter

    2009-01-01

    This paper describes the design of the sample for "Understanding Society". The sample consists of five components. The largest component is a newly-selected general population sample. The other four components are an ethnic minority 'boost' sample, a general population comparison sample, the ex-BHPS (British Household Panel Survey) sample, and the innovation panel sample. For each component, the paper outlines the design and explains the rationale behind the main features of the desig...

  13. Basic design of sample container for transport of extraterrestrial samples

    Science.gov (United States)

    Dirri, F.; Longobardo, A.; Palomba, E.; Hutzler, A.; Ferrière, L.

    2017-09-01

    The aim of this work is to provide, in the framework of the EURO-CARES (European Curation of Astromaterials Returned from Exploration of Space) project, a technical overview based on the sample container used in previous sample return missions (e.g., Hayabusa1, Stardust, etc.) and to define a basic design of a sample container aimed at transporting the extraterrestrial returned samples within a Sample Curation Facility (SCF) or from a SCF to another laboratory (and vice versa). The sample container structure and the transportation criticalities (such as contamination and mechanical stress) are discussed in detail in each scenario.

  14. Coordination of Conditional Poisson Samples

    Directory of Open Access Journals (Sweden)

    Grafström Anton

    2015-12-01

    Full Text Available Sample coordination seeks to maximize or to minimize the overlap of two or more samples. The former is known as positive coordination, and the latter as negative coordination. Positive coordination is mainly used for estimation purposes and to reduce data collection costs. Negative coordination is mainly performed to diminish the response burden of the sampled units. Poisson sampling design with permanent random numbers provides an optimum coordination degree of two or more samples. The size of a Poisson sample is, however, random. Conditional Poisson (CP) sampling is a modification of the classical Poisson sampling that produces a fixed-size πps sample. We introduce two methods to coordinate Conditional Poisson samples over time or simultaneously. The first one uses permanent random numbers and the list-sequential implementation of CP sampling. The second method uses a CP sample in the first selection and provides an approximate one in the second selection because the prescribed inclusion probabilities are not respected exactly. The methods are evaluated using the size of the expected sample overlap, and are compared with their competitors using Monte Carlo simulation. The new methods provide a good coordination degree of two samples, close to the performance of Poisson sampling with permanent random numbers.
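The permanent-random-number mechanism behind positive coordination can be sketched directly: each unit keeps one uniform draw across occasions, and a Poisson sample selects unit k whenever its number falls below its inclusion probability. The unit counts and probabilities below are illustrative assumptions:

```python
import random

def poisson_sample(prns, incl_probs):
    """Poisson sampling: unit k is selected iff its permanent random
    number falls below its inclusion probability."""
    return {k for k, u in prns.items() if u < incl_probs[k]}

rng = random.Random(1)
units = range(1000)
prns = {k: rng.random() for k in units}   # permanent random numbers, reused
pi1 = {k: 0.30 for k in units}            # inclusion probs, first occasion
pi2 = {k: 0.35 for k in units}            # inclusion probs, second occasion

s1 = poisson_sample(prns, pi1)
s2 = poisson_sample(prns, pi2)
```

Because each unit reuses its permanent number, a unit selected on the first occasion is automatically selected again whenever its inclusion probability has not decreased, which is what maximizes the expected overlap. Note also that the realized sample size is random, which is exactly the drawback Conditional Poisson sampling is designed to remove.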

  15. Sample size estimation and sampling techniques for selecting a representative sample

    Directory of Open Access Journals (Sweden)

    Aamir Omair

    2014-01-01

    Full Text Available Introduction: The purpose of this article is to provide a general understanding of the concepts of sampling as applied to health-related research. Sample Size Estimation: It is important to select a representative sample in quantitative research in order to be able to generalize the results to the target population. The sample should be of the required sample size and must be selected using an appropriate probability sampling technique. There are many hidden biases which can adversely affect the outcome of the study. Important factors to consider for estimating the sample size include the size of the study population, confidence level, expected proportion of the outcome variable (for categorical variables) or standard deviation of the outcome variable (for numerical variables), and the required precision (margin of accuracy) of the study. The greater the required precision, the larger the required sample size. Sampling Techniques: The probability sampling techniques applied for health-related research include simple random sampling, systematic random sampling, stratified random sampling, cluster sampling, and multistage sampling. These are recommended over nonprobability sampling techniques, because the results of the study can be generalized to the target population.
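For the categorical-outcome case described above, the usual sample-size formula is n = z²p(1−p)/d², optionally corrected for a finite population. A sketch (the function name, hard-coded z values, and defaults are illustrative assumptions):

```python
import math

def sample_size_proportion(p=0.5, margin=0.05, confidence=0.95, population=None):
    """Sample size for estimating a proportion: n = z^2 * p(1-p) / d^2,
    with an optional finite-population correction n / (1 + (n-1)/N).
    z is hard-coded for the three common confidence levels."""
    z = {0.90: 1.645, 0.95: 1.96, 0.99: 2.576}[confidence]
    n = z**2 * p * (1 - p) / margin**2
    if population is not None:
        n = n / (1 + (n - 1) / population)
    return math.ceil(n)

n = sample_size_proportion()                    # worst-case p = 0.5, 95% CI, +/-5%
n_small = sample_size_proportion(population=1000)
```

Using p = 0.5 gives the conservative (largest) sample size when the expected proportion is unknown; the finite-population correction matters only when the computed n is a substantial fraction of the population.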

  16. Sample Acquisition for Materials in Planetary Exploration (SAMPLE) Project

    Data.gov (United States)

    National Aeronautics and Space Administration — ORBITEC proposes to analyze, design, and develop a device for autonomous lunar surface/subsurface sampling and processing applications. The Sample Acquisition for...

  17. Sampling and chemical analysis in environmental samples around Nuclear Power Plants and some environmental samples

    Energy Technology Data Exchange (ETDEWEB)

    Cho, Yong Woo; Han, Man Jung; Cho, Seong Won; Cho, Hong Jun; Oh, Hyeon Kyun; Lee, Jeong Min; Chang, Jae Sook [KORTIC, Taejon (Korea, Republic of)

    2002-12-15

    Twelve kinds of environmental samples, such as soil, seawater, and underground water, were collected around Nuclear Power Plants (NPPs). Tritium chemical analysis was performed on samples of rain water, pine needles, air, seawater, underground water, chinese cabbage, rice grains and milk collected around NPPs, and on surface seawater and rain water sampled across the country. Strontium was analyzed in soil sampled at 60 districts in Korea. Tritium was analyzed in 21 samples of surface seawater around the Korean peninsula supplied by KFRDI (National Fisheries Research and Development Institute). Sampling and chemical analysis of environmental samples around the Kori, Woolsung, Youngkwang and Wooljin NPPs and the Taeduk science town for tritium and strontium analysis was managed according to plans. All samples were handed over to KINS after analysis.

  18. A Note on Information-Directed Sampling and Thompson Sampling

    OpenAIRE

    Zhou, Li

    2015-01-01

    This note introduces three Bayesian-style multi-armed bandit algorithms: Information-Directed Sampling, Thompson Sampling, and Generalized Thompson Sampling. The goal is to give an intuitive explanation for these three algorithms and their regret bounds, and to provide some derivations that are omitted in the original papers.
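Of the three algorithms, Thompson Sampling is the simplest to sketch: maintain a posterior per arm, sample one draw from each posterior, and play the arm with the highest draw. A minimal Bernoulli-bandit version (arm probabilities and parameters are illustrative assumptions, not from the note):

```python
import random

def thompson_bandit(true_probs, rounds=5000, seed=0):
    """Bernoulli Thompson Sampling: keep a Beta(a, b) posterior per arm,
    draw once from each posterior, and pull the arm with the largest draw."""
    rng = random.Random(seed)
    k = len(true_probs)
    a = [1.0] * k          # Beta posterior: 1 + observed successes
    b = [1.0] * k          # Beta posterior: 1 + observed failures
    pulls = [0] * k
    for _ in range(rounds):
        draws = [rng.betavariate(a[i], b[i]) for i in range(k)]
        arm = draws.index(max(draws))
        reward = 1 if rng.random() < true_probs[arm] else 0
        a[arm] += reward
        b[arm] += 1 - reward
        pulls[arm] += 1
    return pulls

pulls = thompson_bandit([0.3, 0.5, 0.7])
```

The posterior draws implement probability matching: an arm is pulled with exactly the posterior probability that it is the best arm, so exploration fades automatically as the posteriors concentrate and the best arm ends up with the vast majority of pulls.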

  19. Metadata, Identifiers, and Physical Samples

    Science.gov (United States)

    Arctur, D. K.; Lenhardt, W. C.; Hills, D. J.; Jenkyns, R.; Stroker, K. J.; Todd, N. S.; Dassie, E. P.; Bowring, J. F.

    2016-12-01

    Physical samples are integral to much of the research conducted by geoscientists. The samples used in this research are often obtained at significant cost and represent an important investment for future research. However, making information about samples - whether considered data or metadata - available for researchers to enable discovery is difficult: a number of key elements related to samples are difficult to characterize in common ways, such as classification, location, sample type, sampling method, repository information, subsample distribution, and instrumentation, because these differ from one domain to the next. Unifying these elements or developing metadata crosswalks is needed. The iSamples (Internet of Samples) NSF-funded Research Coordination Network (RCN) is investigating ways to develop these types of interoperability and crosswalks. Within the iSamples RCN, one of its working groups, WG1, has focused on the metadata related to physical samples. This includes identifying existing metadata standards and systems, and how they might interoperate with the International Geo Sample Number (IGSN) schema (schema.igsn.org) in order to help inform leading practices for metadata. For example, we are examining lifecycle metadata beyond the IGSN 'birth certificate'. As a first step, this working group is developing a list of relevant standards and comparing their various attributes. In addition, the working group is looking toward technical solutions to facilitate developing a linked set of registries to build the web of samples. Finally, the group is also developing a comparison of sample identifiers and locators. This paper will provide an overview and comparison of the standards identified thus far, as well as an update on the technical solutions examined for integration. We will discuss how various sample identifiers might work in complementary fashion with the IGSN to more completely describe samples, facilitate retrieval of contextual information, and

  20. New prior sampling methods for nested sampling - Development and testing

    Science.gov (United States)

    Stokes, Barrie; Tuyl, Frank; Hudson, Irene

    2017-06-01

    Nested Sampling is a powerful algorithm for fitting models to data in the Bayesian setting, introduced by Skilling [1]. The nested sampling algorithm proceeds by carrying out a series of compressive steps, involving successively nested iso-likelihood boundaries, starting with the full prior distribution of the problem parameters. The "central problem" of nested sampling is to draw at each step a sample from the prior distribution whose likelihood is greater than the current likelihood threshold, i.e., a sample falling inside the current likelihood-restricted region. For both flat and informative priors this ultimately requires uniform sampling restricted to the likelihood-restricted region. We present two new methods of carrying out this sampling step, and illustrate their use with the lighthouse problem [2], a bivariate likelihood used by Gregory [3] and a trivariate Gaussian mixture likelihood. All the algorithm development and testing reported here has been done with Mathematica® [4].
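The compressive loop and the "central problem" sampling step can be sketched for a one-dimensional uniform prior; here the constrained draw uses plain rejection sampling, i.e. the naive baseline that new prior-sampling methods aim to improve on. All names and parameter values are illustrative assumptions, not from the paper:

```python
import math
import random

def logaddexp(a, b):
    """log(exp(a) + exp(b)) without overflow."""
    if a == -math.inf:
        return b
    m = max(a, b)
    return m + math.log(math.exp(a - m) + math.exp(b - m))

def nested_sampling(loglike, n_live=100, iterations=600, seed=3):
    """Minimal nested sampling for a 1-D uniform prior on [0, 1].
    The likelihood-constrained prior draw uses simple rejection sampling."""
    rng = random.Random(seed)
    live = [rng.random() for _ in range(n_live)]
    logZ = -math.inf
    for i in range(1, iterations + 1):
        worst = min(range(n_live), key=lambda j: loglike(live[j]))
        Lstar = loglike(live[worst])
        # expected log prior volume shrinks by 1/n_live per iteration
        logw = -i / n_live - math.log(n_live)      # crude shell weight
        logZ = logaddexp(logZ, Lstar + logw)       # accumulate evidence
        # central problem: draw from the prior above the threshold
        while True:
            theta = rng.random()
            if loglike(theta) > Lstar:
                live[worst] = theta
                break
    return logZ

# evidence of a narrow Gaussian likelihood centred at 0.5 (true logZ ~ 0)
loglike = lambda t: (-0.5 * ((t - 0.5) / 0.1) ** 2
                     - math.log(0.1 * math.sqrt(2 * math.pi)))
logZ = nested_sampling(loglike)
```

The rejection step is exactly where the cost lives: its acceptance rate decays with the shrinking prior volume, which is why practical implementations replace it with smarter constrained samplers.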

  1. Public Use Microdata Samples (PUMS)

    Data.gov (United States)

    National Aeronautics and Space Administration — Public Use Microdata Samples (PUMS) are computer-accessible files containing records for a sample of housing units, with information on the characteristics of each...

  2. Graph Sampling for Visual Analytics

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Fangyan; Zhang, Song; Chung Wong, Pak

    2017-07-01

    Effectively visualizing large graphs and capturing the statistical properties are two challenging tasks. To aid in these two tasks, many sampling approaches for graph simplification have been proposed, falling into three categories: node sampling, edge sampling, and traversal-based sampling. It is still unknown which approach is the best. We evaluate commonly used graph sampling methods through a combined visual and statistical comparison of graphs sampled at various rates. We conduct our evaluation on three graph models: random graphs, small-world graphs, and scale-free graphs. Initial results indicate that the effectiveness of a sampling method is dependent on the graph model, the size of the graph, and the desired statistical property. This benchmark study can be used as a guideline in choosing the appropriate method for a particular graph sampling task, and the results presented can be incorporated into graph visualization and analysis tools.
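Two of the three sampling categories can be sketched directly; a comparison on a random (Erdős-Rényi) graph illustrates why the categories behave differently: at the same rate, node sampling retains roughly the square of the rate in edges, while edge sampling retains roughly the rate itself. This is an illustrative sketch, not the paper's benchmark code:

```python
import random

def random_graph(n, p, seed=0):
    """Erdos-Renyi G(n, p), represented as a set of edges (i, j), i < j."""
    rng = random.Random(seed)
    return {(i, j) for i in range(n) for j in range(i + 1, n)
            if rng.random() < p}

def node_sample(edges, nodes, rate, seed=1):
    """Node sampling: keep a random subset of nodes plus induced edges."""
    rng = random.Random(seed)
    kept = {v for v in nodes if rng.random() < rate}
    return kept, {(u, v) for (u, v) in edges if u in kept and v in kept}

def edge_sample(edges, rate, seed=2):
    """Edge sampling: keep each edge independently; keep incident nodes."""
    rng = random.Random(seed)
    kept = {e for e in edges if rng.random() < rate}
    return {v for e in kept for v in e}, kept

edges = random_graph(200, 0.05)
n_nodes, n_edges = node_sample(edges, set(range(200)), 0.5)
e_nodes, e_edges = edge_sample(edges, 0.5)
```

At rate 0.5, node sampling keeps an edge only when both endpoints survive (about 25% of edges), whereas edge sampling keeps about 50%, so the two methods distort density-related statistics in opposite directions.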

  3. Treat Medication Samples with Respect

    Science.gov (United States)

    A physician may give you samples of a particular medication at the time of your office or clinic visit. ...

  4. Representative mass reduction in sampling

    DEFF Research Database (Denmark)

    Petersen, Lars; Esbensen, Harry Kim; Dahl, Casper Kierulf

    2004-01-01

    dividers, the Boerner Divider, the 'spoon method', alternate/fractional shoveling and grab sampling. Only devices based on riffle splitting principles (static or rotational) pass the ultimate representativity test (with minor, but significant relative differences). Grab sampling, the overwhelmingly...... always be representative in the full Theory of Sampling (TOS) sense. This survey also allows empirical verification of the merits of the famous 'Gy's formula' for order-of-magnitude estimation of the Fundamental Sampling Error (FSE).

  5. The rise of survey sampling

    NARCIS (Netherlands)

    Bethlehem, J.

    2009-01-01

    This paper is about the history of survey sampling. It describes how sampling became an accepted scientific method. From the first ideas in 1895 it took some 50 years before the principles of probability sampling were widely accepted. This paper focuses on developments in official statistics in

  6. Sample inhomogeneity in PIXE analysis

    Science.gov (United States)

    Kajfosz, J.; Szymczyk, S.; Kornaś, G.

    1984-04-01

    The influence of sample inhomogeneity on the precision of analytical results obtained by PIXE was investigated. A simple method for the determination of sample inhomogeneity is proposed and its applicability is shown on a series of examples. Differences in the distribution of individual elements in samples were observed.

  7. Sampling and sample processing in pesticide residue analysis.

    Science.gov (United States)

    Lehotay, Steven J; Cook, Jo Marie

    2015-05-13

    Proper sampling and sample processing in pesticide residue analysis of food and soil have always been essential to obtain accurate results, but the subject is becoming a greater concern as approximately 100 mg test portions are being analyzed with automated high-throughput analytical methods by agrochemical industry and contract laboratories. As global food trade and the importance of monitoring increase, the food industry and regulatory laboratories are also considering miniaturized high-throughput methods. In conjunction with a summary of the symposium "Residues in Food and Feed - Going from Macro to Micro: The Future of Sample Processing in Residue Analytical Methods" held at the 13th IUPAC International Congress of Pesticide Chemistry, this is an opportune time to review sampling theory and sample processing for pesticide residue analysis. If collected samples and test portions do not adequately represent the actual lot from which they came and provide meaningful results, then all costs, time, and efforts involved in implementing programs using sophisticated analytical instruments and techniques are wasted and can actually yield misleading results. This paper is designed to briefly review the often-neglected but crucial topic of sample collection and processing and put the issue into perspective for the future of pesticide residue analysis. It also emphasizes that analysts should demonstrate the validity of their sample processing approaches for the analytes/matrices of interest and encourages further studies on sampling and sample mass reduction to produce a test portion.

  8. Mars Sample Quarantine Protocol Workshop

    Science.gov (United States)

    DeVincenzi, Donald L. (Editor); Bagby, John (Editor); Race, Margaret (Editor); Rummel, John (Editor)

    1999-01-01

    The Mars Sample Quarantine Protocol (QP) Workshop was convened to deal with three specific aspects of the initial handling of a returned Mars sample: 1) biocontainment, to prevent uncontrolled release of sample material into the terrestrial environment; 2) life detection, to examine the sample for evidence of live organisms; and 3) biohazard testing, to determine if the sample poses any threat to terrestrial life forms and the Earth's biosphere. During the first part of the Workshop, several tutorials were presented on topics related to the workshop in order to give all participants a common basis in the technical areas necessary to achieve the objectives of the Workshop.

  9. Sample processing device and method

    DEFF Research Database (Denmark)

    2011-01-01

    A sample processing device is disclosed, which sample processing device comprises a first substrate and a second substrate, where the first substrate has a first surface comprising two area types, a first area type with a first contact angle with water and a second area type with a second contact...... a sample liquid comprising the sample and the first preparation system is adapted to receive a receiving liquid. In a particular embodiment, a magnetic sample transport component, such as a permanent magnet or an electromagnet, is arranged to move magnetic beads in between the first and second substrates....

  10. Improve natural gas sampling techniques

    Energy Technology Data Exchange (ETDEWEB)

    Jiskoot, R.J.J. (Jiskoot Autocontrol, Kent (United Kingdom))

    1994-02-01

    Accurate and reliable sampling systems are imperative when confirming natural gas's commercial value. Buyers and sellers need accurate hydrocarbon-composition information to conduct fair sale transactions. Because poor sample extraction, preparation or analysis can invalidate a sale, more attention should be directed toward improving representative sampling. Consider all sampling components, i.e., gas types, line pressure and temperature, equipment maintenance and service needs, etc. The paper discusses gas sampling, design considerations (location, probe type, extraction devices, controller, and receivers), operating requirements, and system integration.

  11. Rotary Percussive Sample Acquisition Tool

    Science.gov (United States)

    Klein, K.; Badescu, M.; Haddad, N.; Shiraishi, L.; Walkemeyer, P.

    2012-01-01

    As part of a potential Mars Sample Return campaign NASA is studying a sample caching mission to Mars, with a possible 2018 launch opportunity. As such, a Sample Acquisition Tool (SAT) has been developed in support of the Integrated Mars Sample Acquisition and Handling (IMSAH) architecture as it relates to the proposed Mars Sample Return (MSR) campaign. The tool allows for core generation and capture directly into a sample tube. In doing so, the sample tube becomes the fundamental handling element within the IMSAH sample chain, reducing the risk associated with sample contamination as well as the need to handle a sample of unknown geometry. The tool's functionality was verified utilizing a proposed rock test suite that encompasses a series of rock types that have been utilized in the past to qualify Martian surface sampling hardware. The corresponding results have shown the tool can effectively generate, fracture, and capture rock cores while maintaining torque margins of no less than 50%, with an average power consumption of no greater than 90 W and a tool mass of less than 6 kg.

  12. Manual versus automated blood sampling

    DEFF Research Database (Denmark)

    Teilmann, A C; Kalliokoski, Otto; Sørensen, Dorte B

    2014-01-01

    Facial vein (cheek blood) and caudal vein (tail blood) phlebotomy are two commonly used techniques for obtaining blood samples from laboratory mice, while automated blood sampling through a permanent catheter is a relatively new technique in mice. The present study compared physiological parameters......, glucocorticoid dynamics as well as the behavior of mice sampled repeatedly for 24 h by cheek blood, tail blood or automated blood sampling from the carotid artery. Mice subjected to cheek blood sampling lost significantly more body weight, had elevated levels of plasma corticosterone, excreted more fecal...... corticosterone metabolites, and expressed more anxious behavior than did the mice of the other groups. Plasma corticosterone levels of mice subjected to tail blood sampling were also elevated, although less significantly. Mice subjected to automated blood sampling were less affected with regard to the parameters...

  13. Pierre Gy's sampling theory and sampling practice heterogeneity, sampling correctness, and statistical process control

    CERN Document Server

    Pitard, Francis F

    1993-01-01

    Pierre Gy's Sampling Theory and Sampling Practice, Second Edition is a concise, step-by-step guide for process variability management and methods. Updated and expanded, this new edition provides a comprehensive study of heterogeneity, covering the basic principles of sampling theory and its various applications. It presents many practical examples to allow readers to select appropriate sampling protocols and assess the validity of sampling protocols from others. The variability of dynamic process streams using variography is discussed to help bridge sampling theory with statistical process control. Many descriptions of good sampling devices, as well as descriptions of poor ones, are featured to educate readers on what to look for when purchasing sampling systems. The book uses its accessible, tutorial style to focus on professional selection and use of methods. The book will be a valuable guide for mineral processing engineers; metallurgists; geologists; miners; chemists; environmental scientists; and practit...
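Gy's order-of-magnitude formula for the Fundamental Sampling Error, central to the theory the book covers, relates the relative variance to the top particle size and the sample and lot masses: σ² = C·d³·(1/Ms − 1/Ml), where C is the product of shape, granulometric, mineralogical composition and liberation factors. A sketch with illustrative factor values (the defaults below are assumptions for demonstration, not recommended constants):

```python
def gy_fse_variance(d_cm, sample_mass_g, lot_mass_g,
                    c=3.0, f=0.5, g=0.25, l=1.0):
    """Relative variance of the Fundamental Sampling Error per Gy's
    formula: C * d^3 * (1/Ms - 1/Ml), with C = c*f*g*l, d the top
    particle size in cm, and masses in grams."""
    C = c * f * g * l
    return C * d_cm**3 * (1.0 / sample_mass_g - 1.0 / lot_mass_g)

# a 500 g sample of 1 cm top-size material from a 1-tonne lot
var = gy_fse_variance(d_cm=1.0, sample_mass_g=500.0, lot_mass_g=1e6)
rsd = var ** 0.5   # relative standard deviation of the FSE
```

The cubic dependence on particle size is the practical takeaway: crushing the material before mass reduction lowers the achievable sampling error far faster than simply taking a larger sample.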

  14. Sampling designs dependent on sample parameters of auxiliary variables

    CERN Document Server

    Wywiał, Janusz L

    2015-01-01

    The book offers a valuable resource for students and statisticians whose work involves survey sampling. An estimation of the population parameters in finite and fixed populations assisted by auxiliary variables is considered. New sampling designs dependent on moments or quantiles of auxiliary variables are presented on the background of the classical methods. Accuracies of the estimators based on original sampling design are compared with classical estimation procedures. Specific conditional sampling designs are applied to problems of small area estimation as well as to estimation of quantiles of variables under study.

  15. Comet coma sample return instrument

    Science.gov (United States)

    Albee, A. L.; Brownlee, Don E.; Burnett, Donald S.; Tsou, Peter; Uesugi, K. T.

    1994-01-01

    The sample collection technology and instrument concept for the Sample of Comet Coma Earth Return Mission (SOCCER) are described. The scientific goals of this Flyby Sample Return are to return coma dust and volatile samples from a known comet source, which will permit accurate elemental and isotopic measurements for thousands of individual solid particles and volatiles, detailed analysis of the dust structure, morphology, and mineralogy of the intact samples, and identification of the biogenic elements or compounds in the solid and volatile samples. Having these intact samples, morphologic, petrographic, and phase structural features can be determined. Information on dust particle size, shape, and density can be ascertained by analyzing penetration holes and tracks in the capture medium. Time and spatial data of dust capture will provide understanding of the flux dynamics of the coma and the jets. Additional information will include the identification of cosmic ray tracks in the cometary grains, which can provide a particle's process history and perhaps even the age of the comet. The measurements will be made with the same equipment used for studying micrometeorites for decades past; hence, the results can be directly compared without extrapolation or modification. The data will provide a powerful and direct technique for comparing the cometary samples with all known types of meteorites and interplanetary dust. This sample collection system will provide the first sample return from a specifically identified primitive body and will allow, for the first time, a direct method of matching meteoritic materials captured on Earth with known parent bodies.

  16. Optimization of sampling parameters for standardized exhaled breath sampling.

    Science.gov (United States)

    Doran, Sophie; Romano, Andrea; Hanna, George B

    2017-09-05

    The lack of standardization of breath sampling is a major contributing factor to the poor repeatability of results and hence represents a barrier to the adoption of breath tests in clinical practice. On-line and bag breath sampling have advantages but do not suit multicentre clinical studies, whereas storage and robust transport are essential for the conduct of wide-scale studies. Several devices have been developed to control sampling parameters and to concentrate volatile organic compounds (VOCs) onto thermal desorption (TD) tubes and subsequently transport those tubes for laboratory analysis. We conducted three experiments to investigate (i) the fraction of breath sampled (whole vs. lower expiratory exhaled breath); (ii) breath sample volume (125, 250, 500 and 1000 ml) and (iii) breath sample flow rate (400, 200, 100 and 50 ml/min). The target VOCs were acetone and potential volatile biomarkers for oesophago-gastric cancer belonging to the aldehyde, fatty acid and phenol chemical classes. We also examined the collection execution time and the impact of environmental contamination. The experiments showed that the use of exhaled breath-sampling devices requires the selection of optimum sampling parameters. The increase in sample volume improved the levels of VOCs detected. However, the influence of the fraction of exhaled breath and the flow rate depends on the target VOCs measured. The concentration of potential volatile biomarkers for oesophago-gastric cancer was not significantly different between the whole and lower airway exhaled breath. While the recovery of phenols and acetone from TD tubes was lower when breath sampling was performed at a higher flow rate, other VOCs were not affected. A dedicated 'clean air supply' overcomes the contamination from ambient air, but the breath collection device itself can be a source of contaminants. In clinical studies using VOCs to diagnose gastro-oesophageal cancer, the optimum parameters are a 500 ml sample volume

  17. Influence of sampling depth and post-sampling analysis time

    African Journals Online (AJOL)

    Paradise is released directly to the ocean at this point. ... cooled (0 °C - 0.2 °C) plastic bottle, sealed and labelled in the field. During mussel sampling, 10-12 animals (sufficient to yield 150-250 g of soft tissue) were taken from each shellfish stock sample, sealed in cool sterile plastic bags and kept in plastic containers.

  18. Quantum sampling problems, BosonSampling and quantum supremacy

    Science.gov (United States)

    Lund, A. P.; Bremner, Michael J.; Ralph, T. C.

    2017-04-01

    There is a large body of evidence for the potential of greater computational power using information carriers that are quantum mechanical over those governed by the laws of classical mechanics. But the question of the exact nature of the power contributed by quantum mechanics remains only partially answered. Furthermore, there exists doubt over the practicality of achieving a large enough quantum computation that definitively demonstrates quantum supremacy. Recently the study of computational problems that produce samples from probability distributions has added to both our understanding of the power of quantum algorithms and lowered the requirements for demonstration of fast quantum algorithms. The proposed quantum sampling problems do not require a quantum computer capable of universal operations and also permit physically realistic errors in their operation. This is an encouraging step towards an experimental demonstration of quantum algorithmic supremacy. In this paper, we will review sampling problems and the arguments that have been used to deduce when sampling problems are hard for classical computers to simulate. Two classes of quantum sampling problems that demonstrate the supremacy of quantum algorithms are BosonSampling and Instantaneous Quantum Polynomial-time Sampling. We will present the details of these classes and recent experimental progress towards demonstrating quantum supremacy in BosonSampling.

  19. Acceptance sampling using judgmental and randomly selected samples

    Energy Technology Data Exchange (ETDEWEB)

    Sego, Landon H.; Shulman, Stanley A.; Anderson, Kevin K.; Wilson, John E.; Pulsipher, Brent A.; Sieber, W. Karl

    2010-09-01

    We present a Bayesian model for acceptance sampling where the population consists of two groups, each with different levels of risk of containing unacceptable items. Expert opinion, or judgment, may be required to distinguish between the high- and low-risk groups. Hence, high-risk items are likely to be identified (and sampled) using expert judgment, while the remaining low-risk items are sampled randomly. We focus on the situation where all observed samples must be acceptable. Consequently, the objective of the statistical inference is to quantify the probability that a large percentage of the unsampled items in the population are also acceptable. We demonstrate that traditional (frequentist) acceptance sampling and simpler Bayesian formulations of the problem are essentially special cases of the proposed model. We explore the properties of the model in detail, and discuss the conditions necessary to ensure that required sample sizes are a non-decreasing function of the population size. The method is applicable to a variety of acceptance sampling problems, and, in particular, to environmental sampling where the objective is to demonstrate the safety of reoccupying a remediated facility that has been contaminated with a lethal agent.
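
    As a concrete illustration of the frequentist special case the abstract mentions, the sketch below (function name and parameters are ours, not the paper's) bounds the number of unacceptable items consistent with observing an all-acceptable random sample:

```python
from math import comb

def max_defects_consistent(N, n, alpha=0.05):
    """Largest number of unacceptable items D such that a random sample
    of n items (all of which were observed to be acceptable) still has
    probability >= alpha. This is the classic frequentist acceptance
    sampling bound; the Bayesian model in the paper generalizes it.
    N, n, alpha here are illustrative values, not the paper's."""
    D = 0
    # P(all n sampled items acceptable | D unacceptable in N) is
    # hypergeometric: C(N-D, n) / C(N, n); math.comb returns 0 if N-D < n
    while D <= N - n and comb(N - D, n) / comb(N, n) >= alpha:
        D += 1
    return D - 1  # last D that was still plausible at level alpha

# e.g. population of 1000 items, 50 sampled, all acceptable:
# with ~95% confidence at most d items are unacceptable
d = max_defects_consistent(1000, 50, 0.05)
```

If the entire population is sampled and found acceptable, the bound is zero, as expected: a single unacceptable item would necessarily have been seen.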

  20. Gravimetric dust sampling for control purposes and occupational dust sampling.

    CSIR Research Space (South Africa)

    Unsted, AD

    1997-02-01

    Full Text Available Prior to the introduction of gravimetric dust sampling, konimeters had been used for dust sampling, which was largely for control purposes. Whether or not absolute results were achievable was not an issue since relative results were used to evaluate...

  1. Sampling problems in twin research.

    Science.gov (United States)

    Torgersen, S

    1987-01-01

    Sampling problems related to twin research may be of at least three kinds: small samples, self-selection and unrepresentative ascertainment. The article mostly discusses the third type of sampling problem. Data from a nationwide Norwegian twin study show that the results of twin studies will be quite different depending upon the ascertainment procedure. According to both the ICD-9 and the DSM-III classification systems, only samples ascertained from mental hospitals treating severe cases are able to demonstrate hereditary factors of any strength.

  2. Neonatal blood gas sampling methods

    African Journals Online (AJOL)

    Blood gas sampling is part of everyday practice in the care of babies admitted to the neonatal intensive care unit, particularly for those receiving respiratory support. There is little published guidance that systematically evaluates the different methods of neonatal blood gas sampling, where each method has its individual ...

  3. Learning to Reason from Samples

    Science.gov (United States)

    Ben-Zvi, Dani; Bakker, Arthur; Makar, Katie

    2015-01-01

    The goal of this article is to introduce the topic of "learning to reason from samples," which is the focus of this special issue of "Educational Studies in Mathematics" on "statistical reasoning." Samples are data sets, taken from some wider universe (e.g., a population or a process) using a particular procedure…

  4. Learning to reason from samples

    NARCIS (Netherlands)

    Ben-Zvi, Dani; Bakker, Arthur; Makar, Katie

    2015-01-01

    The goal of this article is to introduce the topic of learning to reason from samples, which is the focus of this special issue of Educational Studies in Mathematics on statistical reasoning. Samples are data sets, taken from some wider universe (e.g., a population or a process) using a particular

  5. Simulated Sampling of Estuary Plankton

    Science.gov (United States)

    Fortner, Rosanne W.; Jenkins, Deborah Bainer

    2009-01-01

    To find out about the microscopic life in the valuable estuary environment, it is usually necessary to be near the water. This dry lab offers an alternative, using authentic data and a simulation of plankton sampling. From the types of organisms found in the sample, middle school students can infer relationships in the biological and physical…

  6. Sample-whitened matched filters

    DEFF Research Database (Denmark)

    Andersen, Ib

    1973-01-01

    A sample-whitened matched filter (SWMF) for a channel with intersymbol interference and additive white Gaussian noise is defined as a linear filter with the properties that its output samples are a sufficient statistic for the MAP estimation of the transmitted sequence and have uncorrelated noise...

  7. Sampling by Fluidics and Microfluidics

    Directory of Open Access Journals (Sweden)

    V. Tesař

    2002-01-01

    Full Text Available Selecting one from several available fluid samples is a procedure often performed especially in chemical engineering. It is usually done by an array of valves sequentially opened and closed. Not generally known is an advantageous alternative: fluidic sampling units without moving parts. In the absence of complete pipe closure, cross-contamination between samples cannot be ruled out. This is eliminated by arranging for small protective flows that clear the cavities and remove any contaminated fluid. Although this complicates the overall circuit layout, fluidic sampling units with these "guard" flows were successfully built and tested. Recent interest in microchemistry leads to additional problems due to very low operating Reynolds numbers. This necessitated the design of microfluidic sampling units based on new operating principles.

  8. Hermetic Seal Designs for Sample Return Sample Tubes

    Science.gov (United States)

    Younse, Paulo J.

    2013-01-01

    Prototypes have been developed of potential hermetic sample sealing techniques for encapsulating samples in a ~1-cm-diameter thin-walled sample tube that are compatible with IMSAH (Integrated Mars Sample Acquisition and Handling) architecture. Techniques include a heat-activated, finned, shape memory alloy plug; a contracting shape memory alloy activated cap; an expanding shape memory alloy plug; and an expanding torque plug. Initial helium leak testing of the shape memory alloy cap and finned shape memory alloy plug seals showed hermetic-seal capability compared against an industry standard of seal integrity after Martian diurnal cycles. Developmental testing is currently being done on the expanding torque plug and expanding shape memory alloy plug seal designs. The finned shape memory alloy (SMA) plug currently shows hermetic sealing capability based on preliminary tests.

  9. Sampling Theorem in Terms of the Bandwidth and Sampling Interval

    Science.gov (United States)

    Dean, Bruce H.

    2011-01-01

    An approach has been developed for interpolating non-uniformly sampled data, with applications in signal and image reconstruction. This innovation generalizes the Whittaker-Shannon sampling theorem by emphasizing two assumptions explicitly (definition of a band-limited function and construction by periodic extension). The Whittaker-Shannon sampling theorem is thus expressed in terms of two fundamental length scales that are derived from these assumptions. The result is more general than what is usually reported, and contains the Whittaker-Shannon form as a special case corresponding to Nyquist-sampled data. The approach also shows that the preferred basis set for interpolation is found by varying the frequency component of the basis functions in an optimal way.
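
    The Nyquist-sampled special case mentioned above is the classic Whittaker-Shannon sinc-series interpolation, which can be sketched as follows (a minimal textbook illustration, not the paper's generalized formulation; names are ours):

```python
import numpy as np

def sinc_interp(samples, t_samples, t_query):
    """Whittaker-Shannon reconstruction of a band-limited signal from
    uniform samples: x(t) = sum_n x[n] * sinc((t - n*T) / T), where
    np.sinc is the normalized sinc sin(pi x)/(pi x). Assumes uniform
    spacing T (the Nyquist-sampled special case)."""
    T = t_samples[1] - t_samples[0]  # uniform sampling interval
    # outer-difference matrix of shape (len(t_query), len(t_samples))
    M = np.sinc((t_query[:, None] - t_samples[None, :]) / T)
    return M @ samples

# reconstruct a 1 Hz sine sampled at 8 Hz (well above Nyquist)
t_s = np.arange(0, 4, 1 / 8)
x = np.sin(2 * np.pi * 1.0 * t_s)
t_q = np.arange(0.5, 3.5, 0.01)
x_hat = sinc_interp(x, t_s, t_q)
```

    With a finite record the series is truncated, so reconstruction error grows near the edges of the sampled window; interior points are recovered closely.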

  10. Defining sample size and sampling strategy for dendrogeomorphic rockfall reconstructions

    Science.gov (United States)

    Morel, Pauline; Trappmann, Daniel; Corona, Christophe; Stoffel, Markus

    2015-05-01

    Optimized sampling strategies have been recently proposed for dendrogeomorphic reconstructions of mass movements with a large spatial footprint, such as landslides, snow avalanches, and debris flows. Such guidelines have, by contrast, been largely missing for rockfalls and cannot be transposed owing to the sporadic nature of this process and the occurrence of individual rocks and boulders. Based on a data set of 314 European larch (Larix decidua Mill.) trees (i.e., 64 trees/ha), growing on an active rockfall slope, this study bridges this gap and proposes an optimized sampling strategy for the spatial and temporal reconstruction of rockfall activity. Using random extractions of trees, iterative mapping, and a stratified sampling strategy based on an arbitrary selection of trees, we investigate subsets of the full tree-ring data set to define optimal sample size and sampling design for the development of frequency maps of rockfall activity. Spatially, our results demonstrate that the sampling of only 6 representative trees per ha can be sufficient to yield a reasonable mapping of the spatial distribution of rockfall frequencies on a slope, especially if the oldest and most heavily affected individuals are included in the analysis. At the same time, however, sampling such a low number of trees risks causing significant errors especially if nonrepresentative trees are chosen for analysis. An increased number of samples therefore improves the quality of the frequency maps in this case. Temporally, we demonstrate that at least 40 trees/ha are needed to obtain reliable rockfall chronologies. These results will facilitate the design of future studies, decrease the cost-benefit ratio of dendrogeomorphic studies and thus will permit production of reliable reconstructions with reasonable temporal efforts.

  11. Curation of Samples from Mars

    Science.gov (United States)

    Lindstrom, D.; Allen, C.

    One of the strong scientific reasons for returning samples from Mars is to search for evidence of current or past life in the samples. Because of the remote possibility that the samples may contain life forms that are hazardous to the terrestrial biosphere, the National Research Council has recommended that all samples returned from Mars be kept under strict biological containment until tests show that they can safely be released to other laboratories. It is possible that Mars samples may contain only scarce or subtle traces of life or prebiotic chemistry that could readily be overwhelmed by terrestrial contamination. Thus, the facilities used to contain, process, and analyze samples from Mars must have a combination of high-level biocontainment and organic/inorganic chemical cleanliness that is unprecedented. We have been conducting feasibility studies and developing designs for a facility that would be at least as capable as current maximum containment BSL-4 (BioSafety Level 4) laboratories, while simultaneously maintaining cleanliness levels exceeding those of the cleanest electronics manufacturing labs. Unique requirements for the processing of Mars samples have inspired a program to develop handling techniques that are much more precise and reliable than the approach (currently used for lunar samples) of employing gloved human hands in nitrogen-filled gloveboxes. Individual samples from Mars are expected to be much smaller than lunar samples, the total mass of samples returned by each mission being 0.5-1 kg, compared with many tens of kg of lunar samples returned by each of the six Apollo missions. Smaller samples require much more of the processing to be done under microscopic observation. In addition, the requirements for cleanliness and high-level containment would be difficult to satisfy while using traditional gloveboxes. JSC has constructed a laboratory to test concepts and technologies important to future sample curation. The Advanced Curation…

  12. Sampling of illicit drugs for quantitative analysis--part III: sampling plans and sample preparations.

    Science.gov (United States)

    Csesztregi, T; Bovens, M; Dujourdy, L; Franc, A; Nagy, J

    2014-08-01

    The findings in this paper are based on the results of our drug homogeneity studies and particle size investigations. Using that information, a general sampling plan (depicted in the form of a flow-chart) was devised that could be applied to the quantitative instrumental analysis of the most common illicit drugs: namely heroin, cocaine, amphetamine, cannabis resin, MDMA tablets and herbal cannabis in 'bud' form (type I). Other more heterogeneous forms of cannabis (type II) were found to require alternative, more traditional sampling methods. A table was constructed which shows the sampling uncertainty expected when a particular number of random increments are taken and combined to form a single primary sample. It also includes a recommended increment size, which is 1 g for powdered drugs and cannabis resin, 1 tablet for MDMA and 1 bud for herbal cannabis in bud form (type I). By referring to that table, individual laboratories can ensure that the sampling uncertainty for a particular drug seizure is minimised, such that it lies in the same region as their analytical uncertainty for that drug. The table shows that, assuming a laboratory wishes to quantitatively analyse a seizure of powdered drug or cannabis resin with a 'typical' heterogeneity, a primary sample of 15×1 g increments is generally appropriate. The appropriate primary sample for MDMA tablets is 20 tablets, while for herbal cannabis (in bud form) 50 buds were found to be appropriate. Our study also showed that, for a suitably homogenised primary sample of the most common powdered drugs, an analytical sample size of between 20 and 35 mg was appropriate, and for herbal cannabis the appropriate amount was 200 mg. The need to ensure that the results from duplicate or multiple incremental sampling were compared, to demonstrate whether or not a particular seized material has a 'typical' heterogeneity and that the sampling procedure applied has resulted in a 'correct sample', was highlighted and the setting…
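
    The benefit of combining random increments described above can be sketched with a simple independence assumption: relative sampling uncertainty falls roughly as 1/√m with m increments (an illustrative simplification of our own, not the paper's empirically derived table):

```python
def primary_sample_rsd(increment_rsd, m):
    """Approximate relative sampling uncertainty (RSD, in %) of a
    primary sample built from m independent random increments:
    combining increments averages out heterogeneity, shrinking the
    RSD by a factor of sqrt(m). (Illustrative model only; the
    paper's table is derived from homogeneity measurements.)"""
    return increment_rsd / m ** 0.5

# e.g. increments with 30% RSD, combined 15x as in the flow-chart plan:
combined = primary_sample_rsd(30.0, 15)  # ~7.7%
```

    This is why a modest number of 1 g increments can bring sampling uncertainty down into the same region as a laboratory's analytical uncertainty.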

  13. Defining And Characterizing Sample Representativeness For DWPF Melter Feed Samples

    Energy Technology Data Exchange (ETDEWEB)

    Shine, E. P.; Poirier, M. R.

    2013-10-29

    Representative sampling is important throughout the Defense Waste Processing Facility (DWPF) process, and the demonstrated success of the DWPF process to achieve glass product quality over the past two decades is a direct result of the quality of information obtained from the process. The objective of this report was to present sampling methods that the Savannah River Site (SRS) used to qualify waste being dispositioned at the DWPF. The goal was to emphasize the methodology, not a list of outcomes from those studies. This methodology includes proven methods for taking representative samples, the use of controlled analytical methods, and data interpretation and reporting that considers the uncertainty of all error sources. Numerous sampling studies were conducted during the development of the DWPF process and still continue to be performed in order to evaluate options for process improvement. Study designs were based on use of statistical tools applicable to the determination of uncertainties associated with the data needs. Successful designs are apt to be repeated, so this report chose only to include prototypic case studies that typify the characteristics of frequently used designs. Case studies have been presented for studying in-tank homogeneity, evaluating the suitability of sampler systems, determining factors that affect mixing and sampling, comparing the final waste glass product chemical composition and durability to that of the glass pour stream sample and other samples from process vessels, and assessing the uniformity of the chemical composition in the waste glass product. Many of these studies efficiently addressed more than one of these areas of concern associated with demonstrating sample representativeness and provide examples of statistical tools in use for DWPF. The time when many of these designs were implemented was in an age when the sampling ideas of Pierre Gy were not as widespread as they are today. Nonetheless, the engineers and…

  14. Mars Science Laboratory Sample Acquisition, Sample Processing and Handling Subsystem: A Description of the Sampling Functionality

    Science.gov (United States)

    Jandura, L.; Burke, K.; Kennedy, B.; Melko, J.; Okon, A.; Sunshine, D.

    2009-12-01

    The Sample Acquisition/Sample Processing and Handling (SA/SPaH) subsystem for the Mars Science Laboratory (MSL) is a rover-based sampling system scheduled to launch in 2011. The SA/SPaH consists of a powdering drill and a scooping, sieving, and portioning device mounted on a turret at the end of a robotic arm. Also on the turret is a dust removal tool for clearing the surface of scientific targets, and two science instruments mounted on vibration isolators. The SA/SPaH can acquire powder from rocks at depths of 20 to 50 mm and can also pick up loose regolith with its scoop. The acquired sample is sieved and portioned and delivered to one of two instruments inside the rover for analysis. The functionality of the system will be described along with the targets the system can acquire and the sample that can be delivered. [Figure: top view of the SA/SPaH on the rover.]

  15. Continuous sampling from distributed streams

    DEFF Research Database (Denmark)

    Graham, Cormode; Muthukrishnan, S.; Yi, Ke

    2012-01-01

    A fundamental problem in data management is to draw and maintain a sample of a large data set, for approximate query answering, selectivity estimation, and query planning. With large, streaming data sets, this problem becomes particularly difficult when the data is shared across multiple...... distributed sites. The main challenge is to ensure that a sample is drawn uniformly across the union of the data while minimizing the communication needed to run the protocol on the evolving data. At the same time, it is also necessary to make the protocol lightweight, by keeping the space and time costs low...... for each participant. In this article, we present communication-efficient protocols for continuously maintaining a sample (both with and without replacement) from k distributed streams. These apply to the case when we want a sample from the full streams, and to the sliding window cases of only the W most...

  16. Sample size determination and power

    CERN Document Server

    Ryan, Thomas P, Jr

    2013-01-01

    THOMAS P. RYAN, PhD, teaches online advanced statistics courses for Northwestern University and The Institute for Statistics Education in sample size determination, design of experiments, engineering statistics, and regression analysis.

  17. Microfluidic Sample Preparation for Immunoassays

    Energy Technology Data Exchange (ETDEWEB)

    Visuri, S; Benett, W; Bettencourt, K; Chang, J; Fisher, K; Hamilton, J; Krulevitch, P; Park, C; Stockton, C; Tarte, L; Wang, A; Wilson, T

    2001-08-09

    Researchers at Lawrence Livermore National Laboratory are developing means to collect and identify fluid-based biological pathogens in the forms of proteins, viruses, and bacteria. To support detection instruments, they are developing a flexible fluidic sample preparation unit. The overall goal of this Microfluidic Module is to input a fluid sample, containing background particulates and potentially target compounds, and deliver a processed sample for detection. They are developing techniques for sample purification, mixing, and filtration that would be useful to many applications including immunologic and nucleic acid assays. Many of these fluidic functions are accomplished with acoustic radiation pressure or dielectrophoresis. They are integrating these technologies into packaged systems with pumps and valves to control fluid flow through the fluidic circuit.

  18. Subsurface Noble Gas Sampling Manual

    Energy Technology Data Exchange (ETDEWEB)

    Carrigan, C. R. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Sun, Y. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-09-18

    The intent of this document is to provide information about best available approaches for performing subsurface soil gas sampling during an On Site Inspection or OSI. This information is based on field sampling experiments, computer simulations and data from the NA-22 Noble Gas Signature Experiment Test Bed at the Nevada Nuclear Security Site (NNSS). The approaches should optimize the gas concentration from the subsurface cavity or chimney regime while simultaneously minimizing the potential for atmospheric radioxenon and near-surface Argon-37 contamination. Where possible, we quantitatively assess differences in sampling practices for the same sets of environmental conditions. We recognize that all sampling scenarios cannot be addressed. However, if this document helps to inform the intuition of the reader about addressing the challenges resulting from the inevitable deviations from the scenario assumed here, it will have achieved its goal.

  19. SWOT ANALYSIS ON SAMPLING METHOD

    National Research Council Canada - National Science Library

    CHIS ANCA OANA; BELENESI MARIOARA;

    2014-01-01

    … Our article aims to study audit sampling in the audit of financial statements. As an audit technique largely used, in both its statistical and nonstatistical forms, the method is very important for auditors...

  20. Biological Sample Monitoring Database (BSMDBS)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Biological Sample Monitoring Database System (BSMDBS) was developed for the Northeast Fisheries Regional Office and Science Center (NER/NEFSC) to record and...

  1. Tests on standard concrete samples

    CERN Multimedia

    CERN PhotoLab

    1973-01-01

    Compression and tensile tests on standard concrete samples. The use of centrifugal force in tensile testing has been developed by the SB Division and the instruments were built in the Central workshops.

  2. More practical critical height sampling.

    Science.gov (United States)

    Thomas B. Lynch; Jeffrey H. Gove

    2015-01-01

    Critical Height Sampling (CHS) (Kitamura 1964) can be used to predict cubic volumes per acre without using volume tables or equations. The critical height is defined as the height at which the tree stem appears to be in borderline condition using the point-sampling angle gauge (e.g. prism). An estimate of cubic volume per acre can be obtained from multiplication of the...

  3. Biological Environmental Sampling Technologies Assessment

    Science.gov (United States)

    2015-12-01

    …array-based measurements that are made on carbon ink surfaces. The instrument can process a wide variety of sample types and is targeted at… from all types of surfaces and absorb unknown liquids. The Aklus Shield system can also be used to sample debris, soil, or vegetation. For this… system is already part of the selected JBTDS system. Therefore, if the InnovaPrep system was selected, it would reduce the logistical footprint of…

  4. Sampling Operations on Big Data

    Science.gov (United States)

    2015-11-29

    …big data in Section II, followed by a description of the analytic environment D4M in Section III. We then describe the types of sampling methods and… signal reconstruction steps are used to do these operations. Big Data analytics, often characterized by analytics applied to datasets that strain available… Sampling Operations on Big Data, Vijay Gadepally, Taylor Herr, Luke Johnson, Lauren Milechin, Maja Milosavljevic, Benjamin A. Miller, Lincoln…

  5. Biofouling development on plasma treated samples versus layers coated samples

    Science.gov (United States)

    Hnatiuc, B.; Exnar, P.; Sabau, A.; Spatenka, P.; Dumitrache, C. L.; Hnatiuc, M.; Ghita, S.

    2016-12-01

    Biofouling is the most important cause of naval corrosion. In order to reduce biofouling development on naval materials such as steel or resin, different new methods have been tested. These methods could help to meet the new IMO environmental regulations, and they could replace a few classic operations before the painting of small ships. The replacement of these operations means a reduction in maintenance costs. Their action must influence especially the first two steps of biofouling development, called microfouling, which take about 24 hours. This work presents the comparative results of biofouling development on two different classic naval materials, steel and resin, for three treated samples immersed in sea water. Non-thermal plasma, produced by GlidArc technology, is applied to the first sample, called GD. The plasma treatment was set to 10 minutes. The last two samples, called AE9 and AE10, are covered by hydrophobic layers, prepared from a special organic-inorganic sol synthesized by the sol-gel method. Theoretically, because of the hydrophobic properties, biofouling formation should be delayed for AE9 and AE10. The biofouling development on each treated sample was compared with that on a non-treated control sample. The microbiological analyses were carried out over 24 hours by epifluorescence microscopy, available for a single layer.

  6. Are most samples of animals systematically biased? Consistent individual trait differences bias samples despite random sampling.

    Science.gov (United States)

    Biro, Peter A

    2013-02-01

    Sampling animals from the wild for study is something nearly every biologist has done, but despite our best efforts to obtain random samples of animals, 'hidden' trait biases may still exist. For example, consistent behavioral traits can affect trappability/catchability, independent of obvious factors such as size and gender, and these traits are often correlated with other repeatable physiological and/or life history traits. If so, systematic sampling bias may exist for any of these traits. The extent to which this is a problem, of course, depends on the magnitude of bias, which is presently unknown because the underlying trait distributions in populations are usually unknown, or unknowable. Indeed, our present knowledge about sampling bias comes from samples (not complete population censuses), which can possess bias to begin with. I had the unique opportunity to create naturalized populations of fish by seeding each of four small fishless lakes with equal densities of slow-, intermediate-, and fast-growing fish. Using sampling methods that are not size-selective, I observed that fast-growing fish were up to two times more likely to be sampled than slower-growing fish. This indicates substantial and systematic bias with respect to an important life history trait (growth rate). If correlations between behavioral, physiological and life-history traits are as widespread as the literature suggests, then many animal samples may be systematically biased with respect to these traits (e.g., when collecting animals for laboratory use), and affect our inferences about population structure and abundance. I conclude with a discussion on ways to minimize sampling bias for particular physiological/behavioral/life-history types within animal populations.
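
    The kind of hidden bias described above is easy to reproduce in a small Monte Carlo sketch (entirely illustrative; the distributions and the 2x catchability factor are our assumptions, loosely echoing the abstract, not the study's data):

```python
import random

def biased_sample_mean(pop_traits, catchability, n, rng):
    """Draw n animals where each animal's capture probability is
    proportional to its catchability weight. Even with 'random'
    trapping effort, trait-correlated catchability makes the
    sampled trait mean drift away from the population mean."""
    sample = rng.choices(pop_traits, weights=catchability, k=n)
    return sum(sample) / n

rng = random.Random(42)
# population: growth-rate trait uniform in [0, 1]
traits = [rng.random() for _ in range(10000)]
# fast growers up to two times as catchable as the slowest
weights = [1 + t for t in traits]
pop_mean = sum(traits) / len(traits)
samp_mean = biased_sample_mean(traits, weights, 2000, rng)
# samp_mean systematically exceeds pop_mean
```

    Under these assumptions the expected sampled mean is about 0.56 versus a true mean of 0.50, i.e., a substantial bias despite nominally random sampling.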

  7. SWOT ANALYSIS ON SAMPLING METHOD

    Directory of Open Access Journals (Sweden)

    CHIS ANCA OANA

    2014-07-01

    Full Text Available Audit sampling involves the application of audit procedures to less than 100% of items within an account balance or class of transactions. Our article aims to study audit sampling in audit of financial statements. As an audit technique largely used, in both its statistical and nonstatistical form, the method is very important for auditors. It should be applied correctly for a fair view of financial statements, to satisfy the needs of all financial users. In order to be applied correctly the method must be understood by all its users and mainly by auditors. Otherwise the risk of not applying it correctly would cause loose of reputation and discredit, litigations and even prison. Since there is not a unitary practice and methodology for applying the technique, the risk of incorrectly applying it is pretty high. The SWOT analysis is a technique used that shows the advantages, disadvantages, threats and opportunities. We applied SWOT analysis in studying the sampling method, from the perspective of three players: the audit company, the audited entity and users of financial statements. The study shows that by applying the sampling method the audit company and the audited entity both save time, effort and money. The disadvantages of the method are difficulty in applying and understanding its insight. Being largely used as an audit method and being a factor of a correct audit opinion, the sampling method’s advantages, disadvantages, threats and opportunities must be understood by auditors.

  8. Distance sampling methods and applications

    CERN Document Server

    Buckland, S T; Marques, T A; Oedekoven, C S

    2015-01-01

    In this book, the authors cover the basic methods and advances within distance sampling that are most valuable to practitioners and in ecology more broadly. This is the fourth book dedicated to distance sampling. In the decade since the last book published, there have been a number of new developments. The intervening years have also shown which advances are of most use. This self-contained book covers topics from the previous publications, while also including recent developments in method, software and application. Distance sampling refers to a suite of methods, including line and point transect sampling, in which animal density or abundance is estimated from a sample of distances to detected individuals. The book illustrates these methods through case studies; data sets and computer code are supplied to readers through the book’s accompanying website.  Some of the case studies use the software Distance, while others use R code. The book is in three parts.  The first part addresses basic methods, the ...

  9. Sample Results from Routine Salt Batch 7 Samples

    Energy Technology Data Exchange (ETDEWEB)

    Peters, T. [Savannah River Site (SRS), Aiken, SC (United States)

    2015-05-13

    Strip Effluent Hold Tank (SEHT) and Decontaminated Salt Solution Hold Tank (DSSHT) samples from several of the “microbatches” of Integrated Salt Disposition Project (ISDP) Salt Batch (“Macrobatch”) 7B have been analyzed for 238Pu, 90Sr, 137Cs, Inductively Coupled Plasma Emission Spectroscopy (ICPES), and Ion Chromatography Anions (IC-A). The results from the current microbatch samples are similar to those from earlier samples from this and previous macrobatches. The Actinide Removal Process (ARP) and the Modular Caustic-Side Solvent Extraction Unit (MCU) continue to show more than adequate Pu and Sr removal, and there is a distinct positive trend in Cs removal, due to the use of the Next Generation Solvent (NGS). The Savannah River National Laboratory (SRNL) notes that historically, most measured Concentration Factor (CF) values during salt processing have been in the 12-14 range. However, recent processing gives CF values closer to 11. This observation does not indicate that the solvent performance is suffering, as the Decontamination Factor (DF) has still maintained consistently high values. Nevertheless, SRNL will continue to monitor for indications of process upsets. The bulk chemistry of the DSSHT and SEHT samples do not show any signs of unusual behavior.

  10. Environmental surveillance master sampling schedule

    Energy Technology Data Exchange (ETDEWEB)

    Bisping, L.E.

    1997-01-01

    Environmental surveillance of the Hanford Site and surrounding areas is conducted by the Pacific Northwest National Laboratory (PNNL) for the US Department of Energy (DOE). This document contains the planned 1997 schedules for routine collection of samples for the Surface Environmental Surveillance Project (SESP) and Drinking Water Monitoring Project. In addition, Section 3.0, Biota, also reflects a rotating collection schedule identifying the year a specific sample is scheduled for collection. The purpose of these monitoring projects is to evaluate levels of radioactive and nonradioactive pollutants in the Hanford environs, as required in DOE Order 5400.1, General Environmental Protection Program, and DOE Order 5400.5, Radiation Protection of the Public and the Environment. The sampling methods will be the same as those described in the Environmental Monitoring Plan, US Department of Energy, Richland Operations Office, DOE/RL91-50, Rev. 1, US Department of Energy, Richland, Washington.

  11. β-NMR sample optimization

    CERN Document Server

    Zakoucka, Eva

    2013-01-01

    During my summer student programme I was working on sample optimization for a new β-NMR project at the ISOLDE facility. The β-NMR technique is well established in solid-state physics and has only recently been introduced to applications in biochemistry and the life sciences. The β-NMR collaboration will be applying for beam time to the INTC committee in September for three nuclei: Cu, Zn and Mg. Sample optimization for Mg was already performed last year during the summer student programme, so sample optimization for Cu and Zn had to be completed as well for the project proposal. My part in the project was to perform thorough literature research on techniques for studying Cu and Zn complexes in native conditions, to search for relevant binding candidates for Cu and Zn applicable to β-NMR, and eventually to evaluate selected binding candidates using UV-VIS spectrometry.

  12. Ball assisted device for analytical surface sampling

    Science.gov (United States)

    ElNaggar, Mariam S; Van Berkel, Gary J; Covey, Thomas R

    2015-11-03

    A system for sampling a surface includes a sampling probe having a housing and a socket, and a rolling sampling sphere within the socket. The housing has a sampling fluid supply conduit and a sampling fluid exhaust conduit. The sampling fluid supply conduit supplies sampling fluid to the sampling sphere. The sampling fluid exhaust conduit has an inlet opening for receiving sampling fluid carried from the surface by the sampling sphere. A surface sampling probe and a method for sampling a surface are also disclosed.

  13. The ocean sampling day consortium

    DEFF Research Database (Denmark)

    Kopf, Anna; Bicak, Mesude; Kottmann, Renzo

    2015-01-01

    Ocean Sampling Day was initiated by the EU-funded Micro B3 (Marine Microbial Biodiversity, Bioinformatics, Biotechnology) project to obtain a snapshot of the marine microbial biodiversity and function of the world’s oceans. It is a simultaneous global mega-sequencing campaign aiming to generate...... the largest standardized microbial data set in a single day. This will be achievable only through the coordinated efforts of an Ocean Sampling Day Consortium, supportive partnerships and networks between sites. This commentary outlines the establishment, function and aims of the Consortium and describes our...... vision for a sustainable study of marine microbial communities and their embedded functional traits....

  14. Sampling for stereology in lungs

    Directory of Open Access Journals (Sweden)

    J. R. Nyengaard

    2006-12-01

    The present article reviews the relevant stereological estimators for obtaining reliable quantitative structural data from the lungs. Stereological sampling achieves reliable quantitative information about either the whole lung or complete lobes while minimising the workload. Studies have used systematic random sampling, which has fixed and constant sampling probabilities on all blocks, sections and fields of view. For estimation of total lung or lobe volume, the Cavalieri principle can be used, but it is not useful for estimating individual cell volume due to various effects of over- or underprojection. If the number of certain structures is required, two methods can be used: the disector and the fractionator. The disector method is a three-dimensional stereological probe for sampling objects according to their number, but it may be affected by tissue deformation; therefore, the fractionator method is often the preferred sampling principle. In this method, a known and predetermined fraction of an object is sampled in one or more steps, with the final step estimating the number. Both methods can be performed in a physical and an optical manner, enabling numbers of cells and of larger lung structures (e.g. alveoli) to be estimated. Some estimators also require randomisation of orientation, so that all directions have an equal chance of being chosen. Using such isotropic sections, surface area, length and diameter can be estimated on a Cavalieri set of sections. Stereology can also illustrate the potential for transport between two compartments by analysing the barrier width. Estimating the individual volume of cells can be achieved by local stereology using a two-step procedure that first samples lung cells using the disector and then introduces individual volume estimation of the sampled cells. The coefficient of error of most unbiased stereological estimators is a combination of variance from blocks, sections, fields
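The Cavalieri and fractionator estimators described above reduce to simple arithmetic; a minimal sketch (function names and input values are illustrative, not from the article):

```python
def cavalieri_volume(section_areas, spacing):
    """Cavalieri principle: total volume = section spacing x summed profile
    areas, for equidistant sections with a uniform random start."""
    return spacing * sum(section_areas)

def fractionator_estimate(raw_count, sampling_fractions):
    """Fractionator: divide the count from the final sample by the product
    of the known, predetermined sampling fractions applied at each step."""
    estimate = float(raw_count)
    for f in sampling_fractions:
        estimate /= f
    return estimate
```

For example, counting 37 alveoli after sampling 1/4 of the blocks and then 1/10 of the sections would give an estimated total of 37 / (0.25 x 0.1) = 1480.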

  15. Sample Return Primer and Handbook

    Science.gov (United States)

    Barrow, Kirk; Cheuvront, Allan; Faris, Grant; Hirst, Edward; Mainland, Nora; McGee, Michael; Szalai, Christine; Vellinga, Joseph; Wahl, Thomas; Williams, Kenneth; hide

    2007-01-01

    This three-part Sample Return Primer and Handbook provides a road map for conducting the terminal phase of a sample return mission. The main chapters describe element-by-element analyses and trade studies, as well as required operations plans, procedures, contingencies, interfaces, and corresponding documentation. Based on the experiences of the lead Stardust engineers, the topics include systems engineering (in particular range safety compliance), mission design and navigation, spacecraft hardware and entry, descent, and landing certification, flight and recovery operations, mission assurance and system safety, test and training, and the very important interactions with external support organizations (non-NASA tracking assets, landing site support, and science curation).

  16. Succinct Sampling from Discrete Distributions

    DEFF Research Database (Denmark)

    Bringmann, Karl; Larsen, Kasper Green

    2013-01-01

    We revisit the classic problem of sampling from a discrete distribution: Given n non-negative w-bit integers x_1,...,x_n, the task is to build a data structure that allows sampling i with probability proportional to x_i. The classic solution is Walker's alias method that takes, when implemented on a Word RAM, O(n) preprocessing time, O(1) expected query time for one sample, and n(w+2 lg n+o(1)) bits of space. Using the terminology of succinct data structures, this solution has redundancy 2n lg n+o(n) bits, i.e., it uses 2n lg n+o(n) bits in addition to the information theoretic minimum required… …the space requirement of the classic solution for a fundamental sampling problem; on the other hand, they provide the strongest known separation between the systematic and non-systematic case for any data structure problem. Finally, we also believe our upper bounds are practically efficient and simpler than Walker's…
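The alias method referred to in this abstract can be sketched in a few lines. This illustrative Python version (the names `build_alias_table` and `alias_draw` are mine, not from the paper) shows the O(n) preprocessing and O(1) expected-time query:

```python
import random

def build_alias_table(weights):
    """Walker's alias method: O(n) preprocessing of non-negative weights
    into a probability table and an alias table."""
    n = len(weights)
    total = sum(weights)
    prob = [w * n / total for w in weights]   # scale so the mean is 1
    alias = [0] * n
    small = [i for i, p in enumerate(prob) if p < 1.0]
    large = [i for i, p in enumerate(prob) if p >= 1.0]
    while small and large:
        s, l = small.pop(), large.pop()
        alias[s] = l                          # l donates mass to fill column s
        prob[l] -= 1.0 - prob[s]
        (small if prob[l] < 1.0 else large).append(l)
    for i in small + large:                   # leftovers are full columns
        prob[i] = 1.0
    return prob, alias

def alias_draw(prob, alias, rng):
    """O(1) expected time: pick a column uniformly, then flip a biased coin."""
    i = rng.randrange(len(prob))
    return i if rng.random() < prob[i] else alias[i]
```

With weights [1, 2, 3], index 2 is drawn about half the time, matching its 3/6 share of the total weight.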

  17. Mahalanobis' Contributions to Sample Surveys

    Indian Academy of Sciences (India)

    has more than 60 research publications in various journals and is a Member of the International Statistical Institute. Rao is on the Governing Council of the National Sample Survey Organisation and is the Managing Editor of Sankhya, Series B, as well as a coeditor. Resonance, June 1999.

  18. Sampling Assumptions in Inductive Generalization

    Science.gov (United States)

    Navarro, Daniel J.; Dry, Matthew J.; Lee, Michael D.

    2012-01-01

    Inductive generalization, where people go beyond the data provided, is a basic cognitive capability, and it underpins theoretical accounts of learning, categorization, and decision making. To complete the inductive leap needed for generalization, people must make a key "sampling" assumption about how the available data were generated.…

  19. Environmental surveillance master sampling schedule

    Energy Technology Data Exchange (ETDEWEB)

    Bisping, L.E.

    1994-02-01

    This document contains the planned 1994 schedules for routine collection of samples for the Surface Environmental Surveillance Project (SESP), Drinking Water Project, and Ground-Water Surveillance Project. Samples are routinely collected for the SESP and analyzed to determine the quality of air, surface water, soil, sediment, wildlife, vegetation, foodstuffs, and farm products at Hanford Site and surrounding communities. The responsibility for monitoring the onsite drinking water falls outside the scope of the SESP. The Hanford Environmental Health Foundation is responsible for monitoring the nonradiological parameters as defined in the National Drinking Water Standards while PNL conducts the radiological monitoring of the onsite drinking water. PNL conducts the drinking water monitoring project concurrent with the SESP to promote efficiency and consistency, utilize the expertise developed over the years, and reduce costs associated with management, procedure development, data management, quality control and reporting. The ground-water sampling schedule identifies ground-water sampling events used by PNL for environmental surveillance of the Hanford Site.

  20. The Lyman alpha reference sample

    DEFF Research Database (Denmark)

    Hayes, M.; Östlin, G.; Schaerer, D.

    2013-01-01

    We report on new imaging observations of the Lyman alpha emission line (Lyα), performed with the Hubble Space Telescope, that comprise the backbone of the Lyman alpha Reference Sample. We present images of 14 starburst galaxies at redshifts 0.028

  1. The RECONS 10 Parsec Sample

    Science.gov (United States)

    Henry, Todd; Dieterich, Sergio; Finch, C.; Ianna, P. A.; Jao, W.-C.; Riedel, Adric; Subasavage, John; Winters, J.; RECONS Team

    2018-01-01

    The sample of stars, brown dwarfs, and exoplanets known within 10 parsecs of our Solar System as of January 1, 2017 is presented. The current census is comprised of 416 objects made up of 371 stars (including the Sun and white dwarfs) and 45 brown dwarfs. The stars are known to be orbited by 43 planets (eight in our Solar System and 35 exoplanets). There are 309 systems within 10 pc, including 275 with stellar primaries and 34 systems containing only brown dwarfs.Via a long-term astrometric effort at CTIO, the RECONS (REsearch Consortium On Nearby Stars, www.recons.org) team has added 44 stellar systems to the sample, accounting for one of every seven systems known within 10 pc. Overall, the 278 red dwarfs clearly dominate the sample, accounting for 75% of all stars known within 10 pc. The completeness of the sample is assessed, indicating that a few red, brown, and white dwarfs within 10 pc may be discovered, both as primaries and secondaries, although we estimate that 90% of the stellar systems have been identified. The evolution of the 10 pc sample over the past century is outlined to illustrate our growing knowledge of the solar neighborhood.The luminosity and mass functions for stars within 10 pc are described. In contrast to many studies, once all known close multiples are resolved into individual components, the true mass function rises to the end of the stellar main sequence, followed by a precipitous drop in the number of brown dwarfs, which are outnumbered 8.2 to 1 by stars. Of the 275 stellar primaries in the sample, 182 (66%) are single, 75 (27%) have at least one stellar companion, only 8 (3%) have a brown dwarf companion, and 19 (7%) systems are known to harbor planets. Searches for brown dwarf companions to stars in this sample have been quite rigorous, so the brown dwarf companion rate is unlikely to rise significantly. In contrast, searches for exoplanets, particularly terrestrial planets, have been limited. 
Thus, overall the solar neighborhood is

  2. Apparatus for Sampling Surface Contamination

    Science.gov (United States)

    Wells, Mark

    2008-01-01

    An apparatus denoted a swab device has been developed as a convenient means of acquiring samples of contaminants from surfaces and suspending the samples in liquids. (Thereafter, the liquids can be dispensed, in controlled volumes, into scientific instruments for analysis of the contaminants.) The swab device is designed so as not to introduce additional contamination and to facilitate, simplify, and systematize the dispensing of controlled volumes of liquid into analytical instruments. The swab device is a single apparatus into which are combined all the equipment and materials needed for sampling surface contamination. The swab device contains disposable components stacked together on a nondisposable dispensing head. One of the disposable components is a supply cartridge holding a sufficient volume of liquid for one complete set of samples. (The liquid could be clean water or another suitable solvent, depending on the application.) This supply of liquid is sealed by Luer valves. At the beginning of a sampling process, the user tears open a sealed bag containing the supply cartridge. A tip on the nondisposable dispensing head is engaged with a Luer valve on one end of the supply cartridge and rotated, locking the supply cartridge on the dispensing head and opening the valve. The swab tip includes a fabric swab that is wiped across the surface of interest to acquire a sample. A sealed bag containing a disposable dispensing tip is then opened, and the swab tip is pushed into the dispensing tip until seated. The dispensing head contains a piston that passes through a spring-loaded lip seal. The air volume displaced by this piston forces the liquid out of the supply cartridge, over the swab, and into the dispensing tip. The piston is manually cycled to enforce oscillation of the air volume and thereby to cause water to flow to wash contaminants from the swab and cause the resulting liquid suspension of contaminants to flow into the dispensing tip. 
After several cycles

  3. Authentication of forensic DNA samples.

    Science.gov (United States)

    Frumkin, Dan; Wasserstrom, Adam; Davidson, Ariane; Grafit, Arnon

    2010-02-01

    Over the past twenty years, DNA analysis has revolutionized forensic science, and has become a dominant tool in law enforcement. Today, DNA evidence is key to the conviction or exoneration of suspects of various types of crime, from theft to rape and murder. However, the disturbing possibility that DNA evidence can be faked has been overlooked. It turns out that standard molecular biology techniques such as PCR, molecular cloning, and recently developed whole genome amplification (WGA), enable anyone with basic equipment and know-how to produce practically unlimited amounts of in vitro synthesized (artificial) DNA with any desired genetic profile. This artificial DNA can then be applied to surfaces of objects or incorporated into genuine human tissues and planted in crime scenes. Here we show that the current forensic procedure fails to distinguish between such samples of blood, saliva, and touched surfaces with artificial DNA, and corresponding samples with in vivo generated (natural) DNA. Furthermore, genotyping of both artificial and natural samples with Profiler Plus® yielded full profiles with no anomalies. In order to effectively deal with this problem, we developed an authentication assay, which distinguishes between natural and artificial DNA based on methylation analysis of a set of genomic loci: in natural DNA, some loci are methylated and others are unmethylated, while in artificial DNA all loci are unmethylated. The assay was tested on natural and artificial samples of blood, saliva, and touched surfaces, with complete success. Adopting an authentication assay for casework samples as part of the forensic procedure is necessary for maintaining the high credibility of DNA evidence in the judicial system.

  4. Superfund Site Information - Site Sampling Data

    Data.gov (United States)

    U.S. Environmental Protection Agency — This asset includes Superfund site-specific sampling information including location of samples, types of samples, and analytical chemistry characteristics of...

  5. Adaptive Sampling in Hierarchical Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Knap, J; Barton, N R; Hornung, R D; Arsenlis, A; Becker, R; Jefferson, D R

    2007-07-09

    We propose an adaptive sampling methodology for hierarchical multi-scale simulation. The method utilizes a moving kriging interpolation to significantly reduce the number of evaluations of finer-scale response functions to provide essential constitutive information to a coarser-scale simulation model. The underlying interpolation scheme is unstructured and adaptive to handle the transient nature of a simulation. To handle the dynamic construction and searching of a potentially large set of finer-scale response data, we employ a dynamic metric tree database. We study the performance of our adaptive sampling methodology for a two-level multi-scale model involving a coarse-scale finite element simulation and a finer-scale crystal plasticity based constitutive law.
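The control flow of such an adaptive sampler — answer a coarse-scale query from a database of previous fine-scale evaluations when possible, and fall back to the expensive model otherwise — can be caricatured as follows. Note this sketch substitutes a nearest-neighbor cache for the paper's moving kriging interpolation and metric-tree database; all names are illustrative:

```python
class AdaptiveSampler:
    """Illustrative stand-in for adaptive hierarchical sampling: reuse a
    cached fine-scale evaluation when one lies within `tol` of the query,
    otherwise run the expensive finer-scale model and store the result.
    (The actual method uses moving kriging and a dynamic metric tree.)"""

    def __init__(self, fine_model, tol):
        self.fine_model = fine_model
        self.tol = tol
        self.cache = []        # (x, y) pairs from fine-scale evaluations
        self.fine_evals = 0    # number of calls to the expensive model

    def query(self, x):
        if self.cache:
            xn, yn = min(self.cache, key=lambda p: abs(p[0] - x))
            if abs(xn - x) <= self.tol:
                return yn      # served from the response database
        y = self.fine_model(x) # fall back to the finer-scale simulation
        self.fine_evals += 1
        self.cache.append((x, y))
        return y
```

After a first sweep over a query range populates the cache, repeated sweeps require no further fine-scale evaluations, which is the source of the method's speedup.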

  6. Network reconstruction via density sampling

    CERN Document Server

    Squartini, Tiziano; Gabrielli, Andrea; Garlaschelli, Diego

    2016-01-01

    Reconstructing weighted networks from partial information is necessary in many important circumstances, e.g. for a correct estimation of systemic risk. It has been shown that, in order to achieve an accurate reconstruction, it is crucial to reliably replicate the empirical degree sequence, which is however unknown in many realistic situations. More recently, it has been found that the knowledge of the degree sequence can be replaced by the knowledge of the strength sequence, which is typically accessible, complemented by that of the total number of links, thus considerably relaxing the observational requirements. Here we further relax these requirements and devise a procedure valid when even the total number of links is unavailable. We assume that, apart from the heterogeneity induced by the degree sequence itself, the network is homogeneous, so that its link density can be estimated by sampling subsets of nodes with representative density. We show that the best way of sampling nodes is the random selecti...
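The density-estimation step this abstract describes — sampling subsets of nodes and averaging the link density of the induced subgraphs — can be sketched as follows (an illustrative implementation, not the authors' code):

```python
import random

def estimate_density(adj, sample_size, trials, rng):
    """Estimate a network's link density by averaging the density of
    induced subgraphs on randomly sampled node subsets.
    `adj` is a symmetric 0/1 adjacency matrix (list of lists)."""
    n = len(adj)
    pairs = sample_size * (sample_size - 1) / 2
    total = 0.0
    for _ in range(trials):
        nodes = rng.sample(range(n), sample_size)
        links = sum(adj[u][v]
                    for i, u in enumerate(nodes)
                    for v in nodes[i + 1:])
        total += links / pairs
    return total / trials
```

Because a uniformly random subset induces a subgraph whose expected density equals the full network's density, the average converges to the true value as the number of trials grows.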

  7. Representative process sampling - in practice

    DEFF Research Database (Denmark)

    Esbensen, Kim; Friis-Pedersen, Hans Henrik; Julius, Lars Petersen

    2007-01-01

    Didactic data sets representing a range of real-world processes are used to illustrate "how to do" representative process sampling and process characterisation. The selected process data lead to diverse variogram expressions with different systematics (no range vs. important ranges; trends and/or...... presented cases of variography either solved the initial problems or served to understand the reasons and causes behind the specific process structures revealed in the variograms. Process Analytical Technologies (PAT) are not complete without process TOS....

  8. Pseudo-Marginal Slice Sampling

    OpenAIRE

    Murray, Iain; Graham, Matthew

    2015-01-01

    Markov chain Monte Carlo (MCMC) methods asymptotically sample from complex probability distributions. The pseudo-marginal MCMC framework only requires an unbiased estimator of the unnormalized probability distribution function to construct a Markov chain. However, the resulting chains are harder to tune to a target distribution than conventional MCMC, and the types of updates available are limited. We describe a general way to clamp and update the random numbers used in a pseudo-marginal meth...
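For background, the slice-sampling update that the pseudo-marginal variant builds on can be written compactly. This is Neal's univariate stepping-out/shrinkage scheme, shown here for context; it is not the paper's pseudo-marginal method:

```python
import math

def slice_sample(logp, x0, width, n, rng):
    """Univariate slice sampling with stepping-out and shrinkage (Neal, 2003).
    `logp` is the log of an unnormalized target density."""
    x, out = x0, []
    for _ in range(n):
        y = logp(x) + math.log(rng.random())   # auxiliary slice level
        left = x - width * rng.random()        # randomly positioned interval
        right = left + width
        while logp(left) > y:                  # step out until outside slice
            left -= width
        while logp(right) > y:
            right += width
        while True:                            # sample, shrinking on rejection
            x_new = rng.uniform(left, right)
            if logp(x_new) > y:
                x = x_new
                break
            if x_new < x:
                left = x_new
            else:
                right = x_new
        out.append(x)
    return out
```

Run on a standard normal log-density, the chain's sample mean and variance converge to 0 and 1.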

  9. Accurate sampling using Langevin dynamics

    CERN Document Server

    Bussi, Giovanni

    2008-01-01

    We show how to derive a simple integrator for the Langevin equation and illustrate how it is possible to check the accuracy of the obtained distribution on the fly, using the concept of effective energy introduced in a recent paper [J. Chem. Phys. 126, 014101 (2007)]. Our integrator leads to correct sampling also in the difficult high-friction limit. We also show how these ideas can be applied in practical simulations, using a Lennard-Jones crystal as a paradigmatic case.
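To illustrate the kind of sampler under discussion, here is a textbook Euler-Maruyama integrator for the overdamped Langevin equation. This generic sketch is not the specific integrator derived in the paper, and it omits the effective-energy accuracy check the authors describe:

```python
import math

def overdamped_langevin(force, x0, dt, kT, gamma, nsteps, rng):
    """Euler-Maruyama integration of overdamped Langevin dynamics:
    dx = F(x)/gamma * dt + sqrt(2 kT dt / gamma) * N(0, 1).
    Returns the trajectory, whose histogram samples exp(-U(x)/kT)."""
    x, traj = x0, []
    noise = math.sqrt(2.0 * kT * dt / gamma)
    for _ in range(nsteps):
        x += force(x) * dt / gamma + noise * rng.gauss(0.0, 1.0)
        traj.append(x)
    return traj
```

For a harmonic potential U(x) = x²/2 (force F(x) = -x) with kT = 1, the stationary variance of the trajectory should be close to kT = 1, up to the O(dt) discretization bias that motivates on-the-fly accuracy checks.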

  10. Model-based distance sampling

    OpenAIRE

    Buckland, Stephen Terrence; Oedekoven, Cornelia Sabrina; Borchers, David Louis

    2015-01-01

    CSO was part-funded by EPSRC/NERC Grant EP/1000917/1. Conventional distance sampling adopts a mixed approach, using model-based methods for the detection process, and design-based methods to estimate animal abundance in the study region, given estimated probabilities of detection. In recent years, there has been increasing interest in fully model-based methods. Model-based methods are less robust for estimating animal abundance than conventional methods, but offer several advantages: they ...

  11. Focused conformational sampling in proteins

    Science.gov (United States)

    Bacci, Marco; Langini, Cassiano; Vymětal, Jiří; Caflisch, Amedeo; Vitalis, Andreas

    2017-11-01

    A detailed understanding of the conformational dynamics of biological molecules is difficult to obtain by experimental techniques due to resolution limitations in both time and space. Computer simulations avoid these in theory but are often too short to sample rare events reliably. Here we show that the progress index-guided sampling (PIGS) protocol can be used to enhance the sampling of rare events in selected parts of biomolecules without perturbing the remainder of the system. The method is very easy to use as it only requires as essential input a set of several features representing the parts of interest sufficiently. In this feature space, new states are discovered by spontaneous fluctuations alone and in unsupervised fashion. Because there are no energetic biases acting on phase space variables or projections thereof, the trajectories PIGS generates can be analyzed directly in the framework of transition networks. We demonstrate the possibility and usefulness of such focused explorations of biomolecules with two loops that are part of the binding sites of bromodomains, a family of epigenetic "reader" modules. This real-life application uncovers states that are structurally and kinetically far away from the initial crystallographic structures and are also metastable. Representative conformations are intended to be used in future high-throughput virtual screening campaigns.

  12. Characterization of superconducting multilayers samples

    CERN Document Server

    Antoine, C Z; Berry, S; Bouat, S; Jacquot, J F; Villegier, J C; Lamura, G; Gurevich, A

    2009-01-01

    Best RF bulk niobium accelerating cavities have nearly reached their ultimate limits at RF equatorial magnetic field H ≈ 200 mT, close to the thermodynamic critical field Hc. In 2006 Gurevich proposed to use nanoscale layers of superconducting materials with high values of Hc > HcNb for magnetic shielding of bulk niobium to increase the breakdown magnetic field inside SC RF cavities [1]. Depositing good quality layers inside a whole cavity is rather difficult but we have sputtered high quality samples by applying the technique used for the preparation of superconducting electronics circuits and characterized these samples by X-ray reflectivity, dc resistivity (PPMS) and dc magnetization (SQUID). Dc magnetization curves of a 250 nm thick Nb film have been measured, with and without a magnetron sputtered coating of a single or multiple stack of 15 nm MgO and 25 nm NbN layers. The Nb samples with/without the coating clearly exhibit different behaviors. Because SQUID measurements are influenced by edge an...

  13. Synchronizing data from irregularly sampled sensors

    Science.gov (United States)

    Uluyol, Onder

    2017-07-11

    A system and method include receiving a set of sampled measurements for each of multiple sensors, wherein the sampled measurements are at irregular intervals or different rates, re-sampling the sampled measurements of each of the multiple sensors at a higher rate than one of the sensor's set of sampled measurements, and synchronizing the sampled measurements of each of the multiple sensors.
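Setting the patent's claim language aside, the core idea — re-sample each irregular series at a higher rate onto a shared grid so the sensors line up — can be illustrated with linear interpolation (all names are illustrative, and linear interpolation is my assumption; the patent does not prescribe a scheme):

```python
def lerp(ts, vs, t):
    """Linear interpolation of the series (ts, vs) at time t,
    assuming ts is sorted and t lies within [ts[0], ts[-1]]."""
    for i in range(1, len(ts)):
        if t <= ts[i]:
            w = (t - ts[i - 1]) / (ts[i] - ts[i - 1])
            return vs[i - 1] + w * (vs[i] - vs[i - 1])
    return vs[-1]

def synchronize(sensors, rate):
    """Re-sample each sensor's irregular (times, values) series onto a
    shared uniform grid at `rate` Hz, covering only the time window
    where every sensor has data."""
    t_start = max(ts[0] for ts, _ in sensors)
    t_end = min(ts[-1] for ts, _ in sensors)
    grid, t = [], t_start
    while t <= t_end:
        grid.append(t)
        t += 1.0 / rate
    return grid, [[lerp(ts, vs, t) for t in grid] for ts, vs in sensors]
```

Choosing `rate` above each sensor's native sampling rate, as the abstract specifies, avoids discarding information during the re-sampling step.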

  14. Sampling for Machine Translation Evaluation

    OpenAIRE

    de la Fuente, Rubén

    2014-01-01

    This paper intends to provide an overview of best practices developed within PayPal for designing and preparing samples for different tasks included in the process of machine translation evaluation.

  15. Bayesian Sampling using Condition Indicators

    DEFF Research Database (Denmark)

    Faber, Michael H.; Sørensen, John Dalsgaard

    2002-01-01

    The problem of control quality of components is considered for the special case where the acceptable failure rate is low, the test costs are high and where it may be difficult or impossible to test the condition of interest directly. Based on the classical control theory and the concept...... of condition indicators introduced by Benjamin and Cornell (1970) a Bayesian approach to quality control is formulated. The formulation is then extended to the case where the quality control is based on sampling of indirect information about the condition of the components, i.e. condition indicators...

  16. Biobanking and international interoperability: samples.

    Science.gov (United States)

    Kiehntopf, Michael; Krawczak, Michael

    2011-09-01

    In terms of sample exchange, international collaborations between biobanks, or between biobanks and their research partners, have two important aspects. First, the donors' consent usually implies that the scope and purpose of any sample transfer to third parties is subject to major constraints. Since the legal, ethical and political framework of biobanking may differ substantially, even between countries of comparable jurisdictional systems, general rules for the international sharing of biomaterial are difficult, if not impossible, to define. Issues of uncertainty include the right to transfer the material, the scope of research allowed, and intellectual property rights. Since suitable means of international law enforcement may not be available in the context of biobanking, collaborators are advised to clarify any residual uncertainty by means of bilateral contracts, for example, in the form of material transfer agreements. Second, biobank partners may rightly expect that the biomaterial they receive for further analysis attains a certain level of quality. This implies that a biobank has to implement stringent quality control measures covering, in addition to the material transfer itself, the whole process of material acquisition, transport, pre-analytical handling and storage. Again, it may be advisable for biobank partners to claim contractual warranties for the type and quality of the biomaterial they wish to acquire.

  17. Cold SQUIDs and hot samples

    Energy Technology Data Exchange (ETDEWEB)

    Lee, T.S.C. [Univ. of California, Berkeley, CA (United States). Dept. of Physics; Lawrence Berkeley National Lab., CA (United States). Materials Sciences Div.]

    1997-05-01

    Low transition temperature (low-Tc) and high-Tc Superconducting QUantum Interference Devices (SQUIDs) have been used to perform high-resolution magnetic measurements on samples whose temperatures are much higher than the operating temperatures of the devices. Part 1 of this work focuses on measurements of the rigidity of flux vortices in high-Tc superconductors using two low-Tc SQUIDs, one on either side of a thermally-insulated sample. The correlation between the signals of the SQUIDs is a direct measure of the extent of correlation between the movements of opposite ends of vortices. These measurements were conducted under the previously-unexplored experimental conditions of nominally-zero applied magnetic field, such that vortex-vortex interactions were unimportant, and with zero external current. At specific temperatures, the authors observed highly-correlated noise sources, suggesting that the vortices moved as rigid rods. At other temperatures, the noise was mostly uncorrelated, suggesting that the relevant vortices were pinned at more than one point along their length. Part 2 describes the design, construction, performance, and applications of a scanning high-Tc SQUID microscope optimized for imaging room-temperature objects with very high spatial resolution and magnetic source sensitivity.

  18. NASA's Aerosol Sampling Experiment Summary

    Science.gov (United States)

    Meyer, Marit E.

    2016-01-01

    In a spacecraft cabin environment, the size range of indoor aerosols is much larger and they persist longer than on Earth because they are not removed by gravitational settling. A previous aerosol experiment in 1991 documented that over 90% of the mass concentration of particles in the NASA Space Shuttle air were between 10 μm and 100 μm based on measurements with a multi-stage virtual impactor and a nephelometer (Liu et al. 1991). While the now-retired Space Shuttle had short duration missions (less than two weeks), the International Space Station (ISS) has been continually inhabited by astronauts for over a decade. High concentrations of inhalable particles on ISS are potentially responsible for crew complaints of respiratory and eye irritation and comments about 'dusty' air. Air filtration is the current control strategy for airborne particles on the ISS, and filtration modeling, performed for engineering and design validation of the air revitalization system in ISS, predicted that PM requirements would be met. However, aerosol monitoring has never been performed on the ISS to verify PM levels. A flight experiment is in preparation which will provide data on particulate matter in ISS ambient air. Particles will be collected with a thermophoretic sampler as well as with passive samplers which will extend the particle size range of sampling. Samples will be returned to Earth for chemical and microscopic analyses, providing the first aerosol data for ISS ambient air.

  19. TRU waste-sampling program

    Energy Technology Data Exchange (ETDEWEB)

    Warren, J.L.; Zerwekh, A.

    1985-08-01

    As part of a TRU waste-sampling program, Los Alamos National Laboratory retrieved and examined 44 drums of 238Pu- and 239Pu-contaminated waste. The drums ranged in age from 8 months to 9 years. The majority of drums were tested for pressure, and gas samples withdrawn from the drums were analyzed by a mass spectrometer. Real-time radiography and visual examination were used to determine both void volumes and waste content. Drum walls were measured for deterioration, and selected drum contents were reassayed for comparison with original assays and WIPP criteria. Each drum tested at atmospheric pressure. Mass spectrometry revealed no problem with 239Pu-contaminated waste, but three 8-month-old drums of 238Pu-contaminated waste contained a potentially hazardous gas mixture. Void volumes fell within the 81 to 97% range. Measurements of drum walls showed no significant corrosion or deterioration. All reassayed contents were within WIPP waste acceptance criteria. Five of the drums opened and examined (15%) could not be certified as packaged. Three contained free liquids, one had corrosive materials, and one had too much unstabilized particulate. Eleven drums had the wrong (or not the most appropriate) waste code. In many cases, disposal volumes had been inefficiently used. 2 refs., 23 figs., 7 tabs.

  20. Sampled-data controller implementation

    Science.gov (United States)

    Wang, Yu; Leduc, Ryan J.

    2012-09-01

    The setting of this article is the implementation of timed discrete-event systems (TDES) as sampled-data (SD) controllers. An SD controller is driven by a periodic clock and sees the system as a series of inputs and outputs. On each clock edge (tick event), it samples its inputs, changes states and updates its outputs. In this article, we establish a formal representation of an SD controller as a Moore synchronous finite state machine (FSM). We describe how to translate a TDES supervisor to an FSM, as well as necessary properties to be able to do so. We discuss how to construct a single centralised controller as well as a set of modular controllers, and show that they will produce equivalent output. We briefly discuss how the recently introduced SD controllability definition relates to our translation method. SD controllability is an extension of TDES controllability which captures several new properties that are useful in dealing with concurrency issues, as well as make it easier to translate a TDES supervisor into an SD controller. We next discuss the application of SD controllability to a small flexible manufacturing system (FMS) from the literature. The example demonstrates the successful application of the new SD properties. We describe the design of the system in detail to illustrate the new conditions and to provide designers with guidance on how to apply the properties. We also present some FSM translation issues encountered, as well as the FSM version of the system's supervisors.
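The Moore synchronous FSM at the heart of the sampled-data controller — on each clock tick, sample the inputs, take a transition, and emit the new state's output — can be modeled minimally as follows. This is a generic illustration of a Moore machine, not the authors' TDES translation procedure:

```python
class MooreFSM:
    """Moore synchronous FSM: the output depends only on the current state.
    On each clock edge (tick), the controller samples its inputs, changes
    state, and updates its output."""

    def __init__(self, transitions, outputs, initial):
        self.transitions = transitions   # (state, sampled_inputs) -> state
        self.outputs = outputs           # state -> output
        self.state = initial

    def tick(self, sampled_inputs):
        self.state = self.transitions[(self.state, sampled_inputs)]
        return self.outputs[self.state]
```

A toy controller toggling an actuator on a 'press' input shows the sample-transition-output cycle: two FSMs with the same transition and output maps will produce identical output sequences for identical input samples, which is the equivalence property discussed for the centralised versus modular constructions.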

  1. Towards Cost-efficient Sampling Methods

    CERN Document Server

    Peng, Luo; Chong, Wu

    2014-01-01

    Sampling methods have received much attention in the field of complex networks in general and statistical physics in particular. This paper presents two new sampling methods based on the observation that a small fraction of high-degree vertices can carry most of the structural information of a network. Both proposed methods are efficient at sampling high-degree nodes. The first builds on stratified random sampling and selects high-degree nodes with higher probability by classifying nodes according to the degree distribution. The second improves the existing snowball sampling method so that targeted nodes can be sampled selectively at every sampling step. In addition, the two proposed methods pick not only the nodes but also the edges directly connected to those nodes. In order to demonstrate the two methods' availability and accuracy, we compare them with the existing sampling methods in...
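
    The first idea (biasing selection toward high-degree vertices and keeping their incident edges) can be sketched as follows. This is a loose illustration under assumed details, not the paper's exact algorithm; the toy star graph is invented.

```python
# Degree-biased node sampling sketch: draw nodes with probability
# proportional to degree, then keep the edges incident to them.
import random

def degree_biased_sample(adj, k, seed=0):
    """adj: node -> set of neighbours. Returns (sampled nodes, incident edges)."""
    rng = random.Random(seed)
    nodes = list(adj)
    weights = [len(adj[n]) for n in nodes]          # degree-proportional weights
    sampled = set()
    while len(sampled) < k:                         # draw until k distinct nodes
        sampled.add(rng.choices(nodes, weights=weights)[0])
    # Keep every edge touching a sampled node (undirected, deduplicated).
    edges = {tuple(sorted((u, v))) for u in sampled for v in adj[u]}
    return sampled, edges

# Toy star graph: "hub" has degree 3, the leaves have degree 1.
adj = {"hub": {"a", "b", "c"}, "a": {"hub"}, "b": {"hub"}, "c": {"hub"}}
nodes, edges = degree_biased_sample(adj, k=2)
print(sorted(nodes), sorted(edges))
```

    Because the hub carries half the total degree, it is selected far more often than any leaf, which mirrors the paper's premise that a few high-degree vertices hold most of the structure.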

  2. 40 CFR 141.21 - Coliform sampling.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 22 2010-07-01 2010-07-01 false Coliform sampling. 141.21 Section 141... sampling. (a) Routine monitoring. (1) Public water systems must collect total coliform samples at sites... must collect at least one repeat sample from the sampling tap where the original total coliform...

  3. 40 CFR 1065.150 - Continuous sampling.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 32 2010-07-01 2010-07-01 false Continuous sampling. 1065.150 Section... ENGINE-TESTING PROCEDURES Equipment Specifications § 1065.150 Continuous sampling. You may use continuous sampling techniques for measurements that involve raw or dilute sampling. Make sure continuous sampling...

  4. 40 CFR 89.420 - Background sample.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Background sample. 89.420 Section 89... Procedures § 89.420 Background sample. (a) Background samples are produced by continuously drawing a sample... background samples may be produced and analyzed for each mode. Hence, a unique background value will be used...

  5. Methodology Series Module 5: Sampling Strategies.

    Science.gov (United States)

    Setia, Maninder Singh

    2016-01-01

    Once the research question and the research design have been finalised, it is important to select the appropriate sample for the study. The method by which the researcher selects the sample is the 'Sampling Method'. There are essentially two types of sampling methods: 1) probability sampling, based on chance events (such as random numbers, flipping a coin, etc.); and 2) non-probability sampling, based on the researcher's choice of a population that is accessible and available. Some of the non-probability sampling methods are purposive sampling, convenience sampling, and quota sampling. Random sampling methods (such as a simple random sample or a stratified random sample) are forms of probability sampling. It is important to understand the different sampling methods used in clinical studies and to state the method clearly in the manuscript. The researcher should not misrepresent the sampling method in the manuscript (such as using the term 'random sample' when a convenience sample was actually used). The sampling method will depend on the research question. For instance, the researcher may want to understand an issue in greater detail for one particular population rather than worry about the 'generalizability' of the results. In such a scenario, the researcher may want to use 'purposive sampling' for the study.
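
    The contrast between the two probability designs mentioned above, a simple random sample versus a stratified random sample, can be illustrated with a small sketch. The clinic strata and sizes below are invented for the example.

```python
# Simple random sampling vs. stratified random sampling (toy illustration).
import random

# Hypothetical population: 80 subjects at clinic_A, 20 at clinic_B.
population = [("clinic_A", i) for i in range(80)] + \
             [("clinic_B", i) for i in range(20)]

rng = random.Random(42)

# Simple random sample: every subject has the same chance of selection,
# so the small clinic may end up under-represented by chance.
srs = rng.sample(population, 10)

# Stratified random sample: sample each clinic (stratum) separately,
# guaranteeing a fixed number of subjects from each.
strata = {}
for site, pid in population:
    strata.setdefault(site, []).append((site, pid))
stratified = [s for site in strata for s in rng.sample(strata[site], 5)]

print(len(srs), len(stratified))  # 10 10
```

    Both are probability designs: the selection is driven by chance, not by the researcher's convenience, which is what separates them from the non-probability methods listed above.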

  6. Methodology series module 5: Sampling strategies

    Directory of Open Access Journals (Sweden)

    Maninder Singh Setia

    2016-01-01

    Full Text Available Once the research question and the research design have been finalised, it is important to select the appropriate sample for the study. The method by which the researcher selects the sample is the 'Sampling Method'. There are essentially two types of sampling methods: 1) probability sampling – based on chance events (such as random numbers, flipping a coin, etc.); and 2) non-probability sampling – based on the researcher's choice of a population that is accessible and available. Some of the non-probability sampling methods are purposive sampling, convenience sampling, and quota sampling. Random sampling methods (such as a simple random sample or a stratified random sample) are forms of probability sampling. It is important to understand the different sampling methods used in clinical studies and to state the method clearly in the manuscript. The researcher should not misrepresent the sampling method in the manuscript (such as using the term 'random sample' when a convenience sample was actually used). The sampling method will depend on the research question. For instance, the researcher may want to understand an issue in greater detail for one particular population rather than worry about the 'generalizability' of the results. In such a scenario, the researcher may want to use 'purposive sampling' for the study.

  7. Sample Return Systems for Extreme Environments Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Since the Apollo era, sample return missions have been primarily limited to asteroid sampling. More comprehensive sampling could yield critical information on the...

  8. Methodology series module 5: Sampling strategies

    OpenAIRE

    Maninder Singh Setia

    2016-01-01

    Once the research question and the research design have been finalised, it is important to select the appropriate sample for the study. The method by which the researcher selects the sample is the 'Sampling Method'. There are essentially two types of sampling methods: 1) probability sampling – based on chance events (such as random numbers, flipping a coin, etc.); and 2) non-probability sampling – based on the researcher's choice of a population that is accessible and available. Some of the non-probabilit...

  9. Zamak samples analyses using EDXRF

    Energy Technology Data Exchange (ETDEWEB)

    Assis, J.T. de; Lima, I.; Monin, V., E-mail: joaquim@iprj.uerj.b, E-mail: inaya@iprj.uerj.b, E-mail: monin@iprj.uerj.b [Universidade do Estado do Rio de Janeiro (UERJ), Nova Friburgo, RJ (Brazil). Inst. Politecnico. Dept. de Engenharia Mecanica e Energia; Anjos, M. dos; Lopes, R.T., E-mail: ricardo@lin.ufrj.b [Coordenacao dos Programas de Pos-graduacao de Engenharia (COPPE/UFRJ), Rio de Janeiro, RJ (Brazil). Lab. de Instrumentacao Nuclear; Alves, H., E-mail: marcelin@uerj.b, E-mail: haimon.dlafis@gmail.co [Universidade do Estado do Rio de Janeiro (UERJ), Rio de Janeiro, RJ (Brazil). Inst. de Fisica. Dept. de Fisica Aplicada e Termodinamica

    2009-07-01

    Zamak is a family of alloys with a base metal of zinc and alloying elements of aluminium, magnesium and copper. Among non-ferrous metal alloys, Zamak is one of the most widely applied, owing to its physical and mechanical properties and the ease with which it can be electrodeposited. It has good resistance to corrosion, traction, shock and wear. Its low melting point (approximately 400 °C) extends mold life, permitting larger production runs of die-cast parts. Zamak is used in many areas, for example in residential and industrial locks, construction and carpentry components, and refrigerator hinges. It is observed that in some cases the quality of these products is not very good; the problem may lie in the quality of the Zamak alloy purchased by the industries. One technique that can be used to investigate the quality of these alloys is energy-dispersive X-ray fluorescence. In this paper we present results for eight samples of Zamak alloy obtained with this technique; it was possible to classify the Zamak alloys and to verify some irregularities in them. (author)

  10. Graph Sampling for Covariance Estimation

    KAUST Repository

    Chepuri, Sundeep Prabhakar

    2017-04-25

    In this paper the focus is on subsampling as well as reconstructing the second-order statistics of signals residing on nodes of arbitrary undirected graphs. Second-order stationary graph signals may be obtained by graph filtering zero-mean white noise and they admit a well-defined power spectrum whose shape is determined by the frequency response of the graph filter. Estimating the graph power spectrum forms an important component of stationary graph signal processing and related inference tasks such as Wiener prediction or inpainting on graphs. The central result of this paper is that by sampling a significantly smaller subset of vertices and using simple least squares, we can reconstruct the second-order statistics of the graph signal from the subsampled observations, and more importantly, without any spectral priors. To this end, both a nonparametric approach as well as parametric approaches including moving average and autoregressive models for the graph power spectrum are considered. The results specialize for undirected circulant graphs in that the graph nodes leading to the best compression rates are given by the so-called minimal sparse rulers. A near-optimal greedy algorithm is developed to design the subsampling scheme for the non-parametric and the moving average models, whereas a particular subsampling scheme that allows linear estimation for the autoregressive model is proposed. Numerical experiments on synthetic as well as real datasets related to climatology and processing handwritten digits are provided to demonstrate the developed theory.
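
    The core idea, that the second-order statistics of a subsampled graph signal remain linear in the power-spectrum parameters and can therefore be fit by least squares without spectral priors, can be shown on a toy parametric model. The ring-graph basis, subsampling pattern, and sizes below are assumptions for illustration, not the paper's setup.

```python
# Toy instance: if the full covariance is a known linear combination
# C = p0*B0 + p1*B1, the covariance of a subsampled signal is still linear
# in (p0, p1), so least squares on the subsampled sample covariance
# recovers the parameters.
import numpy as np

rng = np.random.default_rng(0)
n, keep = 8, [0, 2, 5, 7]                       # observe 4 of 8 nodes

# Assumed "spectral" basis: identity plus a ring-graph adjacency.
A = np.zeros((n, n))
for i in range(n):
    A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1.0
B = [np.eye(n), A]
p_true = np.array([2.0, 0.5])
C = p_true[0] * B[0] + p_true[1] * B[1]         # positive definite here

# Simulate graph signals, keep only the subsampled nodes, form sample cov.
L = np.linalg.cholesky(C)
x = L @ rng.standard_normal((n, 200000))
Cs = np.cov(x[keep])

# Least squares: vec(Cs) ~ [vec(Bk restricted to kept nodes)] @ p
M = np.column_stack([Bk[np.ix_(keep, keep)].ravel() for Bk in B])
p_hat, *_ = np.linalg.lstsq(M, Cs.ravel(), rcond=None)
print(np.allclose(p_hat, p_true, atol=0.05))  # True
```

    Only 4 of the 8 vertices are ever observed, yet both covariance parameters are recovered, which is the compression-with-reconstruction effect the abstract describes.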

  11. Bilateral inferior petrosal sinus sampling.

    Science.gov (United States)

    Zampetti, Benedetta; Grossrubatscher, Erika; Dalino Ciaramella, Paolo; Boccardi, Edoardo; Loli, Paola

    2016-07-01

    Simultaneous bilateral inferior petrosal sinus sampling (BIPSS) plays a crucial role in the diagnostic work-up of Cushing's syndrome. It is the most accurate procedure in the differential diagnosis of hypercortisolism of pituitary or ectopic origin, as compared with clinical, biochemical and imaging analyses, with a sensitivity and specificity of 88-100% and 67-100%, respectively. In the setting of hypercortisolemia, ACTH levels obtained from venous drainage of the pituitary are expected to be higher than the levels of peripheral blood, thus suggesting pituitary ACTH excess as the cause of hypercortisolism. Direct stimulation of the pituitary corticotroph with corticotrophin-releasing hormone enhances the sensitivity of the procedure. The procedure must be undertaken in the presence of hypercortisolemia, which suppresses both the basal and stimulated secretory activity of normal corticotrophic cells: ACTH measured in the sinus is, therefore, the result of the secretory activity of the tumor tissue. The poor accuracy in lateralization of BIPSS (positive predictive value of 50-70%) makes interpetrosal ACTH gradient alone not sufficient for the localization of the tumor. An accurate exploration of the gland is recommended if a tumor is not found in the predicted area. Despite the fact that BIPSS is an invasive procedure, the occurrence of adverse events is extremely rare, particularly if it is performed by experienced operators in referral centres. © 2016 The authors.
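
    Numerically, the test reduces to comparing ACTH sampled from the petrosal sinuses against ACTH in peripheral blood. The sketch below computes that gradient; the cut-offs used (a central-to-peripheral ratio of at least 2 basally, or at least 3 after CRH stimulation) are widely quoted in the literature but are stated here as an assumption, not taken from the abstract, and the ACTH values are invented.

```python
# Hedged numeric sketch of the interpetrosal-to-peripheral ACTH gradient.
# All ACTH values (pg/mL) and the thresholds are illustrative assumptions.

def ips_to_peripheral_ratio(ips_left, ips_right, peripheral):
    """Use the higher of the two sinus levels; lateralization alone is unreliable."""
    return max(ips_left, ips_right) / peripheral

basal = ips_to_peripheral_ratio(ips_left=120.0, ips_right=45.0, peripheral=40.0)
post_crh = ips_to_peripheral_ratio(ips_left=600.0, ips_right=150.0, peripheral=80.0)

# Assumed criteria: ratio >= 2 basally or >= 3 after CRH suggests a
# pituitary (rather than ectopic) source of ACTH.
pituitary_source = basal >= 2.0 or post_crh >= 3.0
print(round(basal, 1), round(post_crh, 1), pituitary_source)  # 3.0 7.5 True
```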

  12. Bilateral inferior petrosal sinus sampling

    Directory of Open Access Journals (Sweden)

    Benedetta Zampetti

    2016-08-01

    Full Text Available Simultaneous bilateral inferior petrosal sinus sampling (BIPSS) plays a crucial role in the diagnostic work-up of Cushing's syndrome. It is the most accurate procedure in the differential diagnosis of hypercortisolism of pituitary or ectopic origin, as compared with clinical, biochemical and imaging analyses, with a sensitivity and specificity of 88–100% and 67–100%, respectively. In the setting of hypercortisolemia, ACTH levels obtained from venous drainage of the pituitary are expected to be higher than the levels of peripheral blood, thus suggesting pituitary ACTH excess as the cause of hypercortisolism. Direct stimulation of the pituitary corticotroph with corticotrophin-releasing hormone enhances the sensitivity of the procedure. The procedure must be undertaken in the presence of hypercortisolemia, which suppresses both the basal and stimulated secretory activity of normal corticotrophic cells: ACTH measured in the sinus is, therefore, the result of the secretory activity of the tumor tissue. The poor accuracy in lateralization of BIPSS (positive predictive value of 50–70%) makes the interpetrosal ACTH gradient alone not sufficient for localization of the tumor. An accurate exploration of the gland is recommended if a tumor is not found in the predicted area. Despite the fact that BIPSS is an invasive procedure, the occurrence of adverse events is extremely rare, particularly if it is performed by experienced operators in referral centres.

  13. 21 CFR 203.38 - Sample lot or control numbers; labeling of sample units.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 4 2010-04-01 2010-04-01 false Sample lot or control numbers; labeling of sample... SERVICES (CONTINUED) DRUGS: GENERAL PRESCRIPTION DRUG MARKETING Samples § 203.38 Sample lot or control numbers; labeling of sample units. (a) Lot or control number required on drug sample labeling and sample...

  14. Tank 12H residuals sample analysis report

    Energy Technology Data Exchange (ETDEWEB)

    Oji, L. N. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Shine, E. P. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Diprete, D. P. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Coleman, C. J. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Hay, M. S. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2015-06-11

    The Savannah River National Laboratory (SRNL) was requested by Savannah River Remediation (SRR) to provide sample preparation and analysis of the Tank 12H final characterization samples to determine the residual tank inventory prior to grouting. Eleven Tank 12H floor and mound residual material samples and three cooling coil scrape samples were collected and delivered to SRNL between May and August of 2014.

  15. Credit in Acceptance Sampling on Attributes

    NARCIS (Netherlands)

    Klaassen, Chris A.J.

    2000-01-01

    Credit is introduced in acceptance sampling on attributes and a Credit Based Acceptance sampling system is developed that is very easy to apply in practice. The credit of a producer is defined as the total number of items accepted since the last rejection. In our sampling system the sample size for a

  16. 45 CFR 1356.84 - Sampling.

    Science.gov (United States)

    2010-10-01

    ... 45 Public Welfare 4 2010-10-01 2010-10-01 false Sampling. 1356.84 Section 1356.84 Public Welfare....84 Sampling. (a) The State agency may collect and report the information required in section 1356.83(e) of this part on a sample of the baseline population consistent with the sampling requirements...

  17. 30 CFR 90.208 - Bimonthly sampling.

    Science.gov (United States)

    2010-07-01

    ... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Bimonthly sampling. 90.208 Section 90.208... MANDATORY HEALTH STANDARDS-COAL MINERS WHO HAVE EVIDENCE OF THE DEVELOPMENT OF PNEUMOCONIOSIS Sampling Procedures § 90.208 Bimonthly sampling. (a) Each operator shall take one valid respirable dust sample for...

  18. 30 CFR 90.207 - Compliance sampling.

    Science.gov (United States)

    2010-07-01

    ... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Compliance sampling. 90.207 Section 90.207... MANDATORY HEALTH STANDARDS-COAL MINERS WHO HAVE EVIDENCE OF THE DEVELOPMENT OF PNEUMOCONIOSIS Sampling Procedures § 90.207 Compliance sampling. (a) The operator shall take five valid respirable dust samples for...

  19. 42 CFR 402.109 - Statistical sampling.

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health 2 2010-10-01 2010-10-01 false Statistical sampling. 402.109 Section 402.109... Statistical sampling. (a) Purpose. CMS or OIG may introduce the results of a statistical sampling study to... or caused to be presented. (b) Prima facie evidence. The results of the statistical sampling study...

  20. 40 CFR 61.34 - Air sampling.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 8 2010-07-01 2010-07-01 false Air sampling. 61.34 Section 61.34... sampling. (a) Stationary sources subject to § 61.32(b) shall locate air sampling sites in accordance with a... concentrations calculated within 30 days after filters are collected. Records of concentrations at all sampling...

  1. 7 CFR 51.17 - Official sampling.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 2 2010-01-01 2010-01-01 false Official sampling. 51.17 Section 51.17 Agriculture... Inspection Service § 51.17 Official sampling. Samples may be officially drawn by any duly authorized... time and place of the sampling and the brands or other identifying marks of the containers from which...

  2. 40 CFR 90.422 - Background sample.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Background sample. 90.422 Section 90... Procedures § 90.422 Background sample. (a) Background samples are produced by drawing a sample of the dilution air during the exhaust collection phase of each test cycle mode. (1) An individual background...

  3. Improved variance estimation along sample eigenvectors

    NARCIS (Netherlands)

    Hendrikse, A.J.; Veldhuis, Raymond N.J.; Spreeuwers, Lieuwe Jan

    Second-order statistics estimates in the form of sample eigenvalues and sample eigenvectors give a suboptimal description of the population density. So far, attempts have been made only to reduce the bias in the sample eigenvalues. However, because the sample eigenvectors differ from the population

  4. Sampling wild species to conserve genetic diversity

    Science.gov (United States)

    Sampling seed from natural populations of crop wild relatives requires choice of the locations to sample from and the amount of seed to sample. While this may seem like a simple choice, in fact careful planning of a collector’s sampling strategy is needed to ensure that a crop wild collection will ...

  5. Nanopipettes: probes for local sample analysis.

    Science.gov (United States)

    Saha-Shah, Anumita; Weber, Anna E; Karty, Jonathan A; Ray, Steven J; Hieftje, Gary M; Baker, Lane A

    2015-06-01

    Nanopipettes (pipettes with nanoscale tip diameters) were investigated as probes for local sample analysis. The nanopipette shank was studied to optimize sampling volume and probe geometry, and the method was utilized to collect nanoliter sample volumes. The use of nanopipettes for surface sampling of mouse brain tissue sections was also explored. Lipid analyses were performed on mouse brain tissues with a sampling spatial resolution as small as 50 μm. Nanopipettes were shown to be a versatile tool that will find further application in studies of sample heterogeneity and population analysis for a wide range of samples.

  6. Separating Interviewer and Sampling-Point Effects

    OpenAIRE

    Rainer Schnell; Frauke Kreuter

    2003-01-01

    "Data used in nationwide face-to-face surveys are almost always collected in multistage cluster samples. The relative homogeneity of the clusters selected in this way can lead to design effects at the sampling stage. Interviewers can further homogenize answers within the small geographic clusters that form the sampling points. The study presented here was designed to distinguish between interviewer effects and sampling-point effects using interpenetrated samples for conducting ...

  7. Concrete samples for organic samples, data package and 222-S validation summary report. Addendum 1A

    Energy Technology Data Exchange (ETDEWEB)

    Vogel, R.E.

    1994-11-01

    This document is in two parts: the first is the data package entitled "Concrete Samples for Organic Samples" and the second, entitled "Concrete Samples for Organic Samples -- Addendum 1A", is the 222-S validation summary report.

  8. Using Inverse Probability Bootstrap Sampling to Eliminate Sample Induced Bias in Model Based Analysis of Unequal Probability Samples.

    Science.gov (United States)

    Nahorniak, Matthew; Larsen, David P; Volk, Carol; Jordan, Chris E

    2015-01-01

    In ecology, as in other research fields, efficient sampling for population estimation often drives sample designs toward unequal probability sampling, such as in stratified sampling. Design-based statistical analysis tools are appropriate for seamless integration of the sample design into the statistical analysis. However, it is also common and necessary, after a sampling design has been implemented, to use datasets to address questions that, in many cases, were not considered during the sampling design phase. Questions may arise requiring the use of model-based statistical tools such as multiple regression, quantile regression, or regression tree analysis. However, such model-based tools may require, for unbiased estimation, data from simple random samples, which can be problematic when analyzing data from unequal probability designs. Despite the numerous method-specific tools available to properly account for sampling design, sample design is too often ignored in the analysis of ecological data, and the consequences are not properly considered. We demonstrate here that violation of this assumption can lead to biased parameter estimates in ecological research. In addition to the set of tools available for researchers to properly account for sampling design in model-based analysis, we introduce inverse probability bootstrapping (IPB). Inverse probability bootstrapping is an easily implemented method for obtaining equal probability re-samples from a probability sample, from which unbiased model-based estimates can be made. We demonstrate the potential for bias in model-based analyses that ignore sample inclusion probabilities, and the effectiveness of IPB sampling in eliminating this bias, using both simulated and actual ecological data. For illustration, we considered three model-based analysis tools: linear regression, quantile regression, and boosted regression tree analysis. In all models, using both simulated and actual ecological data, we found inferences to be
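
    The IPB step itself is simple to sketch: resample the observed data with selection weights proportional to the inverse of each unit's inclusion probability, so that the resamples behave approximately like simple random samples. The two-stratum example below is invented for illustration and is not the authors' code.

```python
# Inverse probability bootstrap (IPB) sketch: units sampled with unequal
# inclusion probabilities pi_i are resampled with weight 1/pi_i.
import random

def ipb_resample(sample, incl_prob, size, rng):
    """Draw one equal-probability bootstrap resample from an unequal-probability sample."""
    weights = [1.0 / p for p in incl_prob]
    return rng.choices(sample, weights=weights, k=size)

rng = random.Random(1)
# Hypothetical design: stratum A was oversampled (pi = 0.8),
# stratum B undersampled (pi = 0.2); the population is half A, half B.
sample    = ["A"] * 40 + ["B"] * 10
incl_prob = [0.8] * 40 + [0.2] * 10

resamples = [ipb_resample(sample, incl_prob, 50, rng) for _ in range(200)]
mean_b = sum(r.count("B") for r in resamples) / (200 * 50)
print(round(mean_b, 2))  # close to 0.5: the strata are rebalanced
```

    The raw sample is 80% stratum A, but the inverse-probability weights undo the design bias, so downstream model-based tools see data that look like a simple random sample.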

  9. Sampling a guide for internal auditors

    CERN Document Server

    Apostolou, Barbara

    2004-01-01

    While it is possible to examine 100 percent of an audit customer's data, the time and cost associated with such a study are often prohibitive. To obtain sufficient, reliable, and relevant information with a limited data set, sampling is an efficient and effective tool. It can help you evaluate the customer's assertions, as well as reach audit conclusions and provide reasonable assurance to your organization. This handbook will help you understand sampling. It also serves as a guide for auditors and students preparing for certification. Topics include: An overview of sampling. Statistical and nonstatistical sampling issues. Sampling selection methods and risks. The pros and cons of popular sampling plans.

  10. Dynamic Method for Identifying Collected Sample Mass

    Science.gov (United States)

    Carson, John

    2008-01-01

    G-Sample is designed for sample collection missions to identify the presence and quantity of sample material gathered by spacecraft equipped with end effectors. The software method uses a maximum-likelihood estimator to identify the collected sample's mass based on onboard force-sensor measurements, thruster firings, and a dynamics model of the spacecraft. This makes sample mass identification a computation rather than a process requiring additional hardware. Simulation examples of G-Sample are provided for spacecraft model configurations with a sample collection device mounted on the end of an extended boom. In the absence of thrust knowledge errors, the results indicate that G-Sample can identify the amount of collected sample mass to within 10 grams (with 95-percent confidence) by using a force sensor with a noise and quantization floor of 50 micrometers. These results hold even in the presence of realistic parametric uncertainty in actual spacecraft inertia, center-of-mass offset, and first flexibility modes. Thrust profile knowledge is shown to be a dominant sensitivity for G-Sample, entering in a nearly one-to-one relationship with the final mass estimation error. This means thrust profiles should be well characterized with onboard accelerometers prior to sample collection. An overall sample-mass estimation error budget has been developed to approximate the effect of model uncertainty, sensor noise, data rate, and thrust profile error on the expected estimate of collected sample mass.
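
    A stripped-down version of the estimation idea (not NASA's implementation) can be written as a closed-form least-squares fit: during thruster firings the measured force is F = (m0 + dm) * a, so with known base mass m0 and accelerations a_i, the collected mass dm falls out of noisy force readings. All numbers below are invented for illustration.

```python
# Simplified sketch of identifying collected sample mass from force
# measurements via least squares (hypothetical values throughout).
import random

def estimate_added_mass(forces, accels, m0):
    """Closed-form least squares for the model F_i = (m0 + dm) * a_i."""
    num = sum(a * (f - m0 * a) for f, a in zip(forces, accels))
    den = sum(a * a for a in accels)
    return num / den

rng = random.Random(0)
m0, dm_true = 250.0, 0.035                 # kg: base mass, collected sample
accels = [0.05 + 0.01 * i for i in range(100)]           # m/s^2 per firing
forces = [(m0 + dm_true) * a + rng.gauss(0, 0.002) for a in accels]

dm_hat = estimate_added_mass(forces, accels, m0)
print(abs(dm_hat - dm_true) < 0.01)  # True: estimate within 10 g
```

    As the abstract notes, the estimate degrades roughly one-to-one with thrust (acceleration) knowledge error, which is why the thrust profile must be well characterized before sample collection.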

  11. Small Sample Whole-Genome Amplification

    Energy Technology Data Exchange (ETDEWEB)

    Hara, C A; Nguyen, C P; Wheeler, E K; Sorensen, K J; Arroyo, E S; Vrankovich, G P; Christian, A T

    2005-09-20

    Many challenges arise when trying to amplify and analyze human samples collected in the field due to limitations in sample quantity and contamination of the starting material. Tests such as DNA fingerprinting and mitochondrial typing require a certain sample size and are carried out in large-volume reactions; in cases where insufficient sample is present, whole genome amplification (WGA) can be used. WGA allows very small quantities of DNA to be amplified in a way that enables subsequent DNA-based tests to be performed. A limiting step in WGA is sample preparation. To minimize the necessary sample size, we have developed two modifications of WGA: the first allows for an increase in amplified product from small, nanoscale, purified samples with the use of carrier DNA, while the second is a single-step method for cleaning and amplifying samples all in one column. Conventional DNA cleanup involves binding the DNA to silica, washing away impurities, and then releasing the DNA for subsequent testing. We have eliminated losses associated with incomplete sample release, thereby decreasing the required amount of starting template for DNA testing. Both techniques address the limitations of sample size by providing ample copies of genomic samples. Carrier DNA, included in our WGA reactions, can be used when amplifying samples with the standard purification method, or can be used in conjunction with our single-step DNA purification technique to potentially further decrease the amount of starting sample necessary for future forensic DNA-based assays.

  12. Aerobot Sampling and Handling System Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Honeybee Robotics proposes to derive and document the functional and technical requirements for Aerobot surface sampling and sample handling across a range of...

  13. 1990 sampling of treated aspen stands

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — In mid-August, 1990, sampling of aspen stand exclosures were conducted at the National Elk Refuge. This sampling is part of a study to monitor aspen regeneration on...

  14. Revised Total Coliform Rule Lab Sampling Form

    Science.gov (United States)

    This form should be completed when a water system collects any required Revised Total Coliform Rule (RTCR) samples. It should also be used when collecting “Special” non-compliance samples for the RTCR.

  15. Extreme Environment Sampling System Deployment Mechanism Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Future Venus or Comet mission architectures may feature robotic sampling systems comprised of a Sampling Tool and Deployment Mechanism. Since 2005, Honeybee has been...

  16. AFSC/ABL: 2009 Chinook Excluder Samples

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This project genetically analyzed 1,620 chinook salmon samples from the 2009 spring salmon excluder device test. These samples were collected over a short period of...

  17. ISCO Grab Sample Ion Chromatography Analytical Data

    Data.gov (United States)

    U.S. Environmental Protection Agency — ISCO grab samples were collected from river, wastewater treatment plant discharge, and public drinking water intakes. Samples were analyzed for major ions (ppb)...

  18. Commutability of food microbiology proficiency testing samples.

    Science.gov (United States)

    Abdelmassih, M; Polet, M; Goffaux, M-J; Planchon, V; Dierick, K; Mahillon, J

    2014-03-01

    Food microbiology proficiency testing (PT) is a useful tool to assess the analytical performances among laboratories. PT items should be close to routine samples to accurately evaluate the acceptability of the methods. However, most PT providers distribute exclusively artificial samples such as reference materials or irradiated foods. This raises the issue of the suitability of these samples because the equivalence, or 'commutability', between results obtained on artificial vs. authentic food samples has not been demonstrated. In the clinical field, the use of noncommutable PT samples has led to erroneous evaluation of the performances when different analytical methods were used. This study aimed to provide a first assessment of the commutability of samples distributed in food microbiology PT. REQUASUD and IPH organized 13 food microbiology PTs including 10-28 participants. Three types of PT items were used: genuine food samples, sterile food samples and reference materials. The commutability of the artificial samples (reference material or sterile samples) was assessed by plotting the distribution of the results on natural and artificial PT samples. This comparison highlighted matrix-correlated issues when nonfood matrices, such as reference materials, were used. Artificially inoculated food samples, on the other hand, raised only isolated commutability issues. In the organization of a PT scheme, authentic or artificially inoculated food samples are necessary to accurately evaluate the analytical performances. Reference materials, used as PT items because of their convenience, may present commutability issues leading to inaccurate penalizing conclusions for methods that would have provided accurate results on food samples. For the first time, the commutability of food microbiology PT samples was investigated. The nature of the samples provided by the organizer turned out to be an important factor because matrix effects can impact on the analytical results. © 2013

  19. Optimal Design in Geostatistics under Preferential Sampling

    OpenAIRE

    Ferreira, Gustavo da Silva; Gamerman, Dani

    2015-01-01

    This paper analyses the effect of preferential sampling in Geostatistics when the choice of new sampling locations is the main interest of the researcher. A Bayesian criterion based on maximizing utility functions is used. Simulated studies are presented and highlight the strong influence of preferential sampling in the decisions. The computational complexity is faced by treating the new local sampling locations as a model parameter and the optimal choice is then made by analysing its posteri...

  20. Representative process sampling for reliable data analysis

    DEFF Research Database (Denmark)

    Julius, Lars Petersen; Esbensen, Kim

    2005-01-01

    Process sampling of moving streams of particulate matter, fluids and slurries (over time or space) or stationary one-dimensional (1-D) lots is often carried out according to existing tradition or protocol not taking the theory of sampling (TOS) into account. In many situations, sampling errors...

  1. 7 CFR 275.11 - Sampling.

    Science.gov (United States)

    2010-01-01

    ... FOOD STAMP AND FOOD DISTRIBUTION PROGRAM PERFORMANCE REPORTING SYSTEM Quality Control (QC) Reviews... two samples for the food stamp quality control review process, an active case sample and a negative... quality control review has an equal or known chance of being selected in the sample. Since the food stamp...

  2. Spur reduction technique for sampling PLLs

    NARCIS (Netherlands)

    Gao, X.; Bahai, A.; Klumperink, Eric A.M.; Nauta, Bram; Bohsali, M.; Djabbari, A.; Socci, G.

    2010-01-01

    Control circuitry and method of controlling a sampling phase locked loop (PLL). By controlling the duty cycle of a sampling control signal, in accordance with the PLL reference and output signals, spurious output signals from the sampling PLL being controlled can be reduced.

  3. Low power and low spur sampling PLL

    NARCIS (Netherlands)

    Gao, X.; Klumperink, Eric A.M.; Bahai, A.; Bohsali, M.; Nauta, Bram; Djabbari, A.; Socci, G.

    2010-01-01

    Control circuitry and method of controlling a sampling phase locked loop (PLL). By controlling the duty cycle of one or more sampling control signals, power consumption by the reference signal buffer and spurious output signals from the sampling PLL being controlled can be reduced.

  4. 45 CFR 160.536 - Statistical sampling.

    Science.gov (United States)

    2010-10-01

    ... 45 Public Welfare 1 2010-10-01 2010-10-01 false Statistical sampling. 160.536 Section 160.536... REQUIREMENTS GENERAL ADMINISTRATIVE REQUIREMENTS Procedures for Hearings § 160.536 Statistical sampling. (a) In... statistical sampling study as evidence of the number of violations under § 160.406 of this part, or the...

  5. 40 CFR 761.348 - Contemporaneous sampling.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 30 2010-07-01 2010-07-01 false Contemporaneous sampling. 761.348... PROHIBITIONS Sampling Non-Liquid, Non-Metal PCB Bulk Product Waste for Purposes of Characterization for PCB Disposal in Accordance With § 761.62, and Sampling PCB Remediation Waste Destined for Off-Site Disposal...

  6. 10 CFR 430.63 - Sampling.

    Science.gov (United States)

    2010-01-01

    ... 10 Energy 3 2010-01-01 2010-01-01 false Sampling. 430.63 Section 430.63 Energy DEPARTMENT OF... Enforcement § 430.63 Sampling. (a) For purposes of a certification of compliance, the determination that a... the case of faucets, showerheads, water closets, and urinals) shall be based upon the sampling...

  7. 40 CFR 61.44 - Stack sampling.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 8 2010-07-01 2010-07-01 false Stack sampling. 61.44 Section 61.44 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) NATIONAL... Firing § 61.44 Stack sampling. (a) Sources subject to § 61.42(b) shall be continuously sampled, during...

  8. Spur reduction technique for sampling PLLs

    NARCIS (Netherlands)

    Gao, X.; Bahai, A.; Klumperink, Eric A.M.; Nauta, Bram; Bohsali, M.; Djabbari, A.; Socci, G.

    2012-01-01

    Control circuitry and method of controlling a sampling phase locked loop (PLL). By controlling the duty cycle of a sampling control signal, in accordance with the PLL reference and output signals, spurious output signals from the sampling PLL being controlled can be reduced.

  9. Spur reduction technique for sampling PLLs

    NARCIS (Netherlands)

    Gao, X.; Bahai, Ahmad; Bohsali, Mounhir; Djabbari, Ali; Klumperink, Eric A.M.; Nauta, Bram; Socci, Gerard

    2013-01-01

    Control circuitry and method of controlling a sampling phase locked loop (PLL). By controlling the duty cycle of a sampling control signal, in accordance with the PLL reference and output signals, spurious output signals from the sampling PLL being controlled can be reduced.

  10. 19 CFR 151.10 - Sampling.

    Science.gov (United States)

    2010-04-01

    ... 19 Customs Duties 2 2010-04-01 2010-04-01 false Sampling. 151.10 Section 151.10 Customs Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY; DEPARTMENT OF THE TREASURY (CONTINUED) EXAMINATION, SAMPLING, AND TESTING OF MERCHANDISE General § 151.10 Sampling. When necessary, the port director...

  11. 7 CFR 75.18 - Sampling.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 3 2010-01-01 2010-01-01 false Sampling. 75.18 Section 75.18 Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (Standards, Inspections... CERTIFICATION OF QUALITY OF AGRICULTURAL AND VEGETABLE SEEDS Inspection § 75.18 Sampling. Sampling, when...

  12. 42 CFR 1003.133 - Statistical sampling.

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health 5 2010-10-01 2010-10-01 false Statistical sampling. 1003.133 Section 1003.133... AUTHORITIES CIVIL MONEY PENALTIES, ASSESSMENTS AND EXCLUSIONS § 1003.133 Statistical sampling. (a) In meeting... statistical sampling study as evidence of the number and amount of claims and/or requests for payment as...

  13. Sample normalization methods in quantitative metabolomics.

    Science.gov (United States)

    Wu, Yiman; Li, Liang

    2016-01-22

    To reveal metabolomic changes caused by a biological event in quantitative metabolomics, it is critical to use an analytical tool that can perform accurate and precise quantification to examine the true concentration differences of individual metabolites found in different samples. A number of steps are involved in metabolomic analysis, including pre-analytical work (e.g., sample collection and storage), analytical work (e.g., sample analysis) and data analysis (e.g., feature extraction and quantification). Each of them can influence the quantitative results significantly and thus should be performed with great care. Among them, the total sample amount or concentration of metabolites can differ significantly from one sample to another. Thus, it is critical to reduce or eliminate the effect of total sample amount variation on the quantification of individual metabolites. In this review, we describe the importance of sample normalization in the analytical workflow with a focus on mass spectrometry (MS)-based platforms, discuss a number of methods recently reported in the literature and comment on their applicability in real-world metabolomics applications. Sample normalization has sometimes been ignored in metabolomics, partially due to the lack of a convenient means of performing it. We show that several methods are now available and that sample normalization should be performed in quantitative metabolomics whenever the analyzed samples have significant variations in total sample amounts.
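    The record above argues that variation in total sample amount must be removed before metabolite concentrations are compared. As a generic illustration (not from the cited review), one of the simplest approaches is total-sum normalization, where each sample's metabolite intensities are scaled to sum to one; the metabolite names and intensity values below are invented for the sketch.

```python
def total_sum_normalize(samples):
    """Scale each sample so its metabolite intensities sum to 1."""
    normalized = {}
    for name, intensities in samples.items():
        total = sum(intensities.values())
        normalized[name] = {m: v / total for m, v in intensities.items()}
    return normalized

# Hypothetical data: the "treated" sample was loaded at twice the total amount,
# so its raw intensities are uniformly doubled.
samples = {
    "control": {"glucose": 200.0, "lactate": 50.0, "alanine": 250.0},
    "treated": {"glucose": 400.0, "lactate": 100.0, "alanine": 500.0},
}

norm = total_sum_normalize(samples)
# After normalization both samples give glucose the same relative
# abundance (0.4), so the loading difference no longer masquerades
# as a biological change.
```

    Real metabolomics workflows offer more robust alternatives (e.g., normalization to a reference compound or median-based scaling); this sketch only shows why some normalization step is needed at all.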

  14. Mixture model analysis of complex samples

    NARCIS (Netherlands)

    Wedel, M; ter Hofstede, F; Steenkamp, JBEM

    1998-01-01

    We investigate the effects of a complex sampling design on the estimation of mixture models. An approximate or pseudo likelihood approach is proposed to obtain consistent estimates of class-specific parameters when the sample arises from such a complex design. The effects of ignoring the sample

  15. Lunar and Meteorite Sample Disk for Educators

    Science.gov (United States)

    Foxworth, Suzanne; Luckey, M.; McInturff, B.; Allen, J.; Kascak, A.

    2015-01-01

    NASA Johnson Space Center (JSC) has the unique responsibility to curate NASA's extraterrestrial samples from past and future missions. Curation includes documentation, preservation, preparation and distribution of samples for research, education and public outreach. Between 1969 and 1972, six Apollo missions brought back 382 kilograms of lunar rocks, core and regolith samples from the lunar surface. JSC also curates meteorites collected through a US cooperative effort among NASA, the National Science Foundation (NSF) and the Smithsonian Institution that funds expeditions to Antarctica. The meteorites collected include rocks from the Moon, Mars, and many asteroids, including Vesta. The sample disks for educational use include these different samples. Active, relevant learning has always been important to teachers, and the Lunar and Meteorite Sample Disk Program provides this active style of learning for students and the general public. The Lunar and Meteorite Sample Disks permit students to conduct investigations comparable to those of actual scientists. The Lunar Sample Disk contains 6 samples: Basalt, Breccia, Highland Regolith, Anorthosite, Mare Regolith and Orange Soil. The Meteorite Sample Disk contains 6 samples: Chondrite L3, Chondrite H5, Carbonaceous Chondrite, Basaltic Achondrite, Iron and Stony-Iron. Teachers are given different activities that adhere to their standards with the disks. During a Sample Disk Certification Workshop, teachers participate in the activities as students and gain insight into the history, formation and geologic processes of the Moon, asteroids and meteorites.

  16. Multivariate stratified sampling by stochastic multiobjective optimisation

    OpenAIRE

    Diaz-Garcia, Jose A.; Ramos-Quiroga, Rogelio

    2011-01-01

    This work considers the allocation problem for multivariate stratified random sampling as a problem of integer non-linear stochastic multiobjective mathematical programming. With this goal in mind the asymptotic distribution of the vector of sample variances is studied. Two alternative approaches are suggested for solving the allocation problem for multivariate stratified random sampling. An example is presented by applying the different proposed techniques.
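    The record above treats allocation for multivariate stratified sampling as a stochastic multiobjective program. As background for what is being generalized, the classical single-variable case is Neyman allocation, which assigns sample size to each stratum in proportion to the product of stratum size and stratum standard deviation. The sketch below is a generic illustration with invented stratum figures, not the method of the cited paper.

```python
def neyman_allocation(strata_sizes, strata_stds, n):
    """Allocate a total sample of n across strata proportional to N_h * S_h.

    Note: per-stratum rounding may make the allocations sum to slightly
    more or less than n; production code would repair the rounding.
    """
    weights = [N * S for N, S in zip(strata_sizes, strata_stds)]
    total = sum(weights)
    return [round(n * w / total) for w in weights]

# Three hypothetical strata: sizes 1000, 2000, 500 and standard
# deviations 4.0, 1.0, 8.0. The small but highly variable third
# stratum receives as many samples as the large first one.
alloc = neyman_allocation([1000, 2000, 500], [4.0, 1.0, 8.0], n=100)
# alloc == [40, 20, 40]
```

    The multivariate problem in the record arises because each survey variable would demand a different Neyman allocation, so the allocations must be traded off against each other.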

  17. Illustration of Launching Samples Home from Mars

    Science.gov (United States)

    2005-01-01

    One crucial step in a Mars sample return mission would be to launch the collected sample away from the surface of Mars. This artist's concept depicts a Mars ascent vehicle for starting a sample of Mars rocks on their trip to Earth.

  18. Self-Digitization of Sample Volumes

    Science.gov (United States)

    Cohen, Dawn E.; Schneider, Thomas; Wang, Michelle; Chiu, Daniel T.

    2010-01-01

    This paper describes a very simple and robust microfluidic device for digitizing samples into an array of discrete volumes. The device is based on an inherent fluidic phenomenon, where an incoming aqueous sample divides itself into an array of chambers that have been primed with an immiscible phase. Self-digitization of sample volumes results from the interplay between fluidic forces, interfacial tension, channel geometry, and the final stability of the digitized samples in the chambers. Here we describe experiments and simulations that were used to characterize these parameters and the conditions under which the self-digitization process occurred. Unlike existing methods used to partition samples into arrays, our method is able to digitize 100% of a sample into a localized array without any loss of sample volume. The final volume of the discretized sample at each location is defined by the geometry and size of each chamber. Thus, we can form an array of samples with varying but predefined volumes. We exploited this feature to separate the crystal growth of otherwise concomitant polymorphs from a single solution. Additionally, we demonstrated the removal of the digitized samples from the chambers for downstream analysis, as well as the addition of reagents to the digitized samples. We believe this simple method will be useful in a broad range of applications where a large array of discretized samples is required, including digital PCR, single-cell analysis, and cell-based drug screening. PMID:20550137

  19. 7 CFR 28.908 - Samples.

    Science.gov (United States)

    2010-01-01

    .... Samples may be drawn in gins equipped with mechanical samplers approved by the Division and operated... that were drawn by a mechanical sampler at the gin may be transported with the bales to the warehouse... sample from a bale for review classification if the producer so desires. (b) Drawing of samples manual...

  20. Social network sampling using spanning trees

    Science.gov (United States)

    Jalali, Zeinab S.; Rezvanian, Alireza; Meybodi, Mohammad Reza

    2016-12-01

    Due to the large scales and limitations in accessing most online social networks, it is hard or infeasible to directly access them in a reasonable amount of time for studying and analysis. Hence, network sampling has emerged as a suitable technique to study and analyze real networks. The main goal of sampling online social networks is constructing a small scale sampled network which preserves the most important properties of the original network. In this paper, we propose two sampling algorithms for sampling online social networks using spanning trees. The first proposed sampling algorithm finds several spanning trees from randomly chosen starting nodes; then the edges in these spanning trees are ranked according to the number of times that each edge has appeared in the set of found spanning trees in the given network. The sampled network is then constructed as a sub-graph of the original network which contains a fraction of nodes that are incident on highly ranked edges. In order to avoid traversing the entire network, the second sampling algorithm is proposed using partial spanning trees. The second sampling algorithm is similar to the first algorithm except that it uses partial spanning trees. Several experiments are conducted to examine the performance of the proposed sampling algorithms on well-known real networks. The obtained results in comparison with other popular sampling methods demonstrate the efficiency of the proposed sampling algorithms in terms of Kolmogorov-Smirnov distance (KSD), skew divergence distance (SDD) and normalized distance (ND).
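    The record above sketches the algorithm in words: grow several spanning trees from random start nodes, rank edges by how often they appear across the trees, and keep the nodes incident on the top-ranked edges. The following is a minimal, generic sketch of that idea (the graph, tree-growing rule, and parameter values are my own simplifications, not the authors' exact procedure).

```python
import random
from collections import Counter

def random_spanning_tree(adj, start):
    """Grow a randomized spanning tree of the component containing start
    (Prim-style growth with random edge selection)."""
    visited = {start}
    frontier = [(start, nbr) for nbr in adj[start]]
    tree_edges = []
    while frontier:
        u, v = frontier.pop(random.randrange(len(frontier)))
        if v in visited:
            continue
        visited.add(v)
        tree_edges.append(frozenset((u, v)))
        frontier.extend((v, w) for w in adj[v] if w not in visited)
    return tree_edges

def spanning_tree_sample(adj, n_trees=30, fraction=0.5, seed=1):
    """Rank edges by appearance count over n_trees spanning trees and
    return the nodes incident on the top `fraction` of ranked edges."""
    random.seed(seed)
    counts = Counter()
    nodes = list(adj)
    for _ in range(n_trees):
        counts.update(random_spanning_tree(adj, random.choice(nodes)))
    ranked = [e for e, _ in counts.most_common()]
    keep_edges = ranked[:max(1, int(fraction * len(ranked)))]
    sampled_nodes = set().union(*keep_edges)
    return sampled_nodes, keep_edges

# Toy 5-node graph; edge 4-5 is a bridge, so it appears in every
# spanning tree and is always ranked at (or near) the top.
adj = {1: [2, 3], 2: [1, 3, 4], 3: [1, 2, 4], 4: [2, 3, 5], 5: [4]}
nodes_s, edges_s = spanning_tree_sample(adj)
```

    The paper's second algorithm replaces full spanning trees with partial ones to avoid traversing the whole network; the same ranking step would then operate on partial-tree edge counts.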

  1. Optimization of environmental sampling using interactive GIS.

    NARCIS (Netherlands)

    Groenigen, van J.W.; Stein, A.; Zuurbier, R.

    1997-01-01

    An interactive sampling procedure is proposed to optimize environmental risk assessment. Subsequent sampling stages were used as quantitative pre-information. With this pre-information, probability maps were made using indicator kriging to direct subsequent sampling. In this way, optimal use of the

  2. Radar Doppler Processing with Nonuniform Sampling.

    Energy Technology Data Exchange (ETDEWEB)

    Doerry, Armin W. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-07-01

    Conventional signal processing to estimate radar Doppler frequency often assumes uniform pulse/sample spacing. This is for the convenience of the processing. More recent performance enhancements in processor capability allow optimally processing nonuniform pulse/sample spacing, thereby overcoming some of the baggage that attends uniform sampling, such as Doppler ambiguity and SNR losses due to sidelobe control measures.
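    One way to process nonuniformly spaced pulses, illustrated generically below (this is a textbook matched-filter frequency search, not the report's specific algorithm), is to evaluate the Fourier sum directly at each candidate Doppler frequency using the actual pulse times, rather than assuming a uniform grid and using an FFT. The pulse times, jitter, and 40 Hz Doppler shift are invented for the example.

```python
import cmath
import math
import random

def doppler_estimate(times, samples, freqs):
    """Pick the candidate frequency whose direct Fourier sum over the
    (possibly nonuniform) sample times has the largest magnitude."""
    best_f, best_mag = None, -1.0
    for f in freqs:
        acc = sum(s * cmath.exp(-2j * math.pi * f * t)
                  for s, t in zip(samples, times))
        if abs(acc) > best_mag:
            best_f, best_mag = f, abs(acc)
    return best_f

# Simulated noiseless echo with a 40 Hz Doppler shift, sampled at
# jittered (nonuniform) pulse times around a 1 ms nominal spacing.
random.seed(0)
times = [0.001 * n + random.uniform(-2e-4, 2e-4) for n in range(64)]
samples = [cmath.exp(2j * math.pi * 40.0 * t) for t in times]
freqs = [float(f) for f in range(-200, 201)]
estimate = doppler_estimate(times, samples, freqs)  # recovers 40.0 Hz
```

    The brute-force search costs O(NF) instead of the FFT's O(N log N), which is part of why the report ties this capability to processor performance improvements.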

  3. METHODOLOGICAL ASPECTS OF STRATIFICATION OF AUDIT SAMPLING

    OpenAIRE

    Vilena A. Yakimova

    2013-01-01

    The article presents the methodological foundations for constructing a stratified audit sample for attribute-based sampling. The sampling techniques of Russian and foreign practice are studied and stratified. The role of stratification in the audit is described. Approaches to the construction of the stratification are revealed on the basis of professional judgment (qualitative methods), statistical groupings (quantitative methods) and combinatory ones (complex qualitative stratifications). Gro...

  4. Global Unique Identification of Geoscience Samples: The International Geo Sample Number (IGSN) and the System for Earth Sample Registration (SESAR)

    Science.gov (United States)

    Lehnert, K. A.; Goldstein, S. L.; Vinayagamoorthy, S.; Lenhardt, W. C.

    2005-12-01

    Data on samples represent a primary foundation of Geoscience research across disciplines, ranging from the study of climate change, to biogeochemical cycles, to mantle and continental dynamics and are key to our knowledge of the Earth's dynamical systems and evolution. Different data types are generated for individual samples by different research groups, published in different papers, and stored in different databases on a global scale. The utility of these data is critically dependent on their integration. Such integration can be achieved within a Geoscience Cyberinfrastructure, but requires unambiguous identification of samples. Currently, naming of samples is arbitrary and inconsistent and therefore severely limits our ability to share, link, and integrate sample-based data. Major problems include name duplication, and changing of names as a sample is passed along over many years to different investigators. SESAR, the System for Earth Sample Registration (http://www.geosamples.org), addresses this problem by building a registry that generates and administers globally unique identifiers for Geoscience samples: the International Geo Sample Number (IGSN). Implementation of the IGSN in data publication and digital data management will dramatically advance interoperability among information systems for sample-based data, opening an extensive range of new opportunities for discovery and for interdisciplinary approaches in research. The IGSN will also facilitate the ability of investigators to build on previously collected data on samples as new measurements are made or new techniques are developed. With potentially broad application to all types of Geoscience samples, SESAR is global in scope. It is a web-based system that can be easily accessed by individual users through an interactive web interface and by distributed client systems via standard web services. Samples can be registered individually or in batches and at various levels of granularity from entire cores

  5. [Variance estimation considering multistage sampling design in multistage complex sample analysis].

    Science.gov (United States)

    Li, Yichong; Zhao, Yinjun; Wang, Limin; Zhang, Mei; Zhou, Maigeng

    2016-03-01

    Multistage sampling is a frequently used method in random sampling surveys in public health. Clustering, or non-independence, between observations often exists in the samples generated by multistage sampling, which are therefore called complex samples. Sampling error may be underestimated and the probability of type I error may be increased if the multistage sample design is not taken into consideration in the analysis. As the variance (error) estimator for a complex sample is often complicated, statistical software usually adopts the ultimate cluster variance estimate (UCVE) to approximate it, which simply assumes that the sample comes from one-stage sampling. However, as the sampling fraction of the primary sampling unit increases, the contribution from subsequent sampling stages is no longer trivial, and the ultimate cluster variance estimate may therefore lead to invalid variance estimation. This paper summarizes a method of variance estimation that takes the multistage sampling design into consideration. Its performance is compared with UCVE by simulating random sampling under different sampling schemes using real-world data. Simulation showed that as the primary sampling unit (PSU) sampling fraction increased, UCVE tended to generate increasingly biased estimates, whereas accurate estimates were obtained by using the method considering the multistage sampling design.
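    For concreteness, the UCVE mentioned in the record can be written down in a few lines: given the weighted PSU-level totals, it treats them as if they came from a single-stage with-replacement sample of PSUs. This is a standard textbook form, shown here as a generic sketch with invented numbers, not the paper's own code.

```python
def ucve(psu_totals):
    """Ultimate cluster variance estimate for T_hat = sum of weighted
    PSU totals z_i:

        var(T_hat) ~ m/(m-1) * sum_i (z_i - z_bar)^2

    where m is the number of sampled PSUs. All between- and within-PSU
    variation is absorbed into the spread of the PSU totals, which is
    what makes the estimator biased when the PSU sampling fraction is
    large (as the record discusses).
    """
    m = len(psu_totals)
    zbar = sum(psu_totals) / m
    return m / (m - 1) * sum((z - zbar) ** 2 for z in psu_totals)

# Three hypothetical weighted PSU totals:
v = ucve([10.0, 12.0, 14.0])  # -> 3/2 * (4 + 0 + 4) = 12.0
```

    A design-based alternative, like the method the paper compares against, adds explicit second-stage (and later-stage) variance components instead of collapsing everything to the PSU level.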

  6. Handling missing data in ranked set sampling

    CERN Document Server

    Bouza-Herrera, Carlos N

    2013-01-01

    The existence of missing observations is a very important aspect to be considered in the application of survey sampling, for example. In human populations they may be caused by a refusal of some interviewees to give the true value for the variable of interest. Traditionally, simple random sampling is used to select samples. Most statistical models are supported by the use of samples selected by means of this design. In recent decades, an alternative design has started being used, which, in many cases, shows an improvement in terms of accuracy compared with traditional sampling. It is called ranked set sampling.

  7. Statistical aspects of food safety sampling.

    Science.gov (United States)

    Jongenburger, I; den Besten, H M W; Zwietering, M H

    2015-01-01

    In food safety management, sampling is an important tool for verifying control. Sampling by nature is a stochastic process. However, uncertainty regarding results is made even greater by the uneven distribution of microorganisms in a batch of food. This article reviews statistical aspects of sampling and describes the impact of distributions on the sampling results. Five different batch contamination scenarios are illustrated: a homogeneous batch, a heterogeneous batch with high- or low-level contamination, and a batch with localized high- or low-level contamination. These batch contamination scenarios showed that sampling results have to be interpreted carefully, especially when heterogeneous and localized contamination in food products is expected.
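    The record's point that distribution matters can be made quantitative with a small Monte Carlo sketch (generic, not from the cited article): at the same number of random samples, contamination confined to a small fraction of units is far more likely to be missed than contamination spread across many units. Batch size, prevalences, and trial counts below are invented for illustration.

```python
import random

def detection_probability(prevalence, n_samples, batch_size=1000,
                          trials=20000, seed=7):
    """Monte Carlo estimate of the chance that at least one of n_samples
    randomly drawn units from the batch is contaminated."""
    rng = random.Random(seed)
    n_bad = int(prevalence * batch_size)
    batch = [True] * n_bad + [False] * (batch_size - n_bad)
    hits = sum(any(rng.sample(batch, n_samples)) for _ in range(trials))
    return hits / trials

# Localized contamination: only 2% of units affected. Ten random
# samples miss it most of the time (~0.18 detection probability).
p_localized = detection_probability(0.02, n_samples=10)

# Widespread contamination: 20% of units affected. The same sampling
# plan detects it reliably (~0.89).
p_spread = detection_probability(0.20, n_samples=10)
```

    This matches the closed-form hypergeometric result, roughly 1 - (1 - p)^n; the simulation form is shown because it extends directly to the heterogeneous within-unit concentration scenarios the article describes.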

  8. Rotary Mode Core Sample System availability improvement

    Energy Technology Data Exchange (ETDEWEB)

    Jenkins, W.W.; Bennett, K.L.; Potter, J.D. [Westinghouse Hanford Co., Richland, WA (United States); Cross, B.T.; Burkes, J.M.; Rogers, A.C. [Southwest Research Institute (United States)

    1995-02-28

    The Rotary Mode Core Sample System (RMCSS) is used to obtain stratified samples of the waste deposits in single-shell and double-shell waste tanks at the Hanford Site. The samples are used to characterize the waste in support of ongoing and future waste remediation efforts. Four sampling trucks have been developed to obtain these samples. Truck 1 was the first in operation and is currently used to obtain samples where the push mode is appropriate (i.e., no rotation of the drill). Truck 2 is similar to truck 1, except for added safety features, and is in operation to obtain samples using either a push mode or rotary drill mode. Trucks 3 and 4 are now being fabricated to be essentially identical to truck 2.

  9. Quality evaluation of processed clay soil samples.

    Science.gov (United States)

    Steiner-Asiedu, Matilda; Harrison, Obed Akwaa; Vuvor, Frederick; Tano-Debrah, Kwaku

    2016-01-01

    This study assessed the microbial quality of clay samples sold in two of the major Ghanaian markets. The study was a cross-sectional evaluation of processed clay and the effects it has on the nutrition of consumers in the capital of Ghana. The items examined were processed clay soil samples. Staphylococcus spp. and fecal coliforms, including Klebsiella, Escherichia, Shigella and Enterobacter spp., were isolated from the clay samples. Samples from the Kaneshie market in Accra recorded the highest total viable count, 6.5 log cfu/g, and staphylococcal count, 5.8 log cfu/g. For fecal coliforms, Madina market samples had the highest count, 6.5 log cfu/g, and also recorded the highest levels of yeast and mould. For Koforidua, the total viable count was highest in the samples from the Zongo market, 6.3 log cfu/g. Central market samples had the highest counts of fecal coliforms, 4.6 log cfu/g, and yeasts and moulds, 6.5 log cfu/g. The "Small" market recorded the highest staphylococcal count, 6.2 log cfu/g. The water activity of the clay samples was low, ranging between 0.65±0.01 and 0.66±0.00 for samples collected from Koforidua and Accra, respectively. The clay samples were found to contain Klebsiella spp., Escherichia, Enterobacter, Shigella spp., Staphylococcus spp., yeast and mould. These have health implications when consumed.

  10. Static versus dynamic sampling for data mining

    Energy Technology Data Exchange (ETDEWEB)

    John, G.H.; Langley, P. [Stanford Univ., CA (United States)

    1996-12-31

    As data warehouses grow to the point where one hundred gigabytes is considered small, the computational efficiency of data-mining algorithms on large databases becomes increasingly important. Using a sample from the database can speed up the data-mining process, but this is only acceptable if it does not reduce the quality of the mined knowledge. To this end, we introduce the "Probably Close Enough" criterion to describe the desired properties of a sample. Sampling usually refers to the use of static statistical tests to decide whether a sample is sufficiently similar to the large database, in the absence of any knowledge of the tools the data miner intends to use. We discuss dynamic sampling methods, which take into account the mining tool being used and can thus give better samples. We describe dynamic schemes that observe a mining tool's performance on training samples of increasing size and use these results to determine when a sample is sufficiently large. We evaluate these sampling methods on data from the UCI repository and conclude that dynamic sampling is preferable.
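    The dynamic scheme described in the record — grow the training sample and watch the mining tool's score until it stops improving — can be sketched generically as follows. The stopping threshold, growth factor, toy dataset, and quality function are all invented for the illustration; they are not the paper's actual criterion.

```python
import random

def progressive_sample_size(data, evaluate, start=100, growth=2,
                            eps=0.005, seed=3):
    """Grow the training sample geometrically until the mining tool's
    score gain from one growth step falls below eps.

    `evaluate(sample)` stands in for training the tool on `sample` and
    returning a quality score (hypothetical interface for the sketch).
    """
    rng = random.Random(seed)
    n = start
    prev = evaluate(rng.sample(data, n))
    while n * growth <= len(data):
        n *= growth
        score = evaluate(rng.sample(data, n))
        if score - prev < eps:
            return n  # sample is "probably close enough" for this tool
        prev = score
    return len(data)

# Toy example: binary labels with 70% majority class; the "tool" is a
# majority-class predictor whose score is its estimated accuracy.
data = [1] * 7000 + [0] * 3000
def accuracy(sample):
    p = sum(sample) / len(sample)
    return max(p, 1 - p)

n_needed = progressive_sample_size(data, accuracy)
```

    The dynamic aspect is that `evaluate` embodies the specific mining tool, so a tool whose quality keeps climbing with data gets a larger sample than one that saturates early.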

  11. Uncertainty and sampling issues in tank characterization

    Energy Technology Data Exchange (ETDEWEB)

    Liebetrau, A.M.; Pulsipher, B.A.; Kashporenko, D.M. [and others

    1997-06-01

    A defensible characterization strategy must recognize that uncertainties are inherent in any measurement or estimate of interest and must employ statistical methods for quantifying and managing those uncertainties. Estimates of risk and therefore key decisions must incorporate knowledge about uncertainty. This report focuses on statistical methods that should be employed to ensure confident decision making and appropriate management of uncertainty. Sampling is a major source of uncertainty that deserves special consideration in the tank characterization strategy. The question of whether sampling will ever provide the reliable information needed to resolve safety issues is explored. The issue of sample representativeness must be resolved before sample information is reliable. Representativeness is a relative term but can be defined in terms of bias and precision. Currently, precision can be quantified and managed through an effective sampling and statistical analysis program. Quantifying bias is more difficult and is not being addressed under the current sampling strategies. Bias could be bounded by (1) employing new sampling methods that can obtain samples from other areas in the tanks, (2) putting in new risers on some worst case tanks and comparing the results from existing risers with new risers, or (3) sampling tanks through risers under which no disturbance or activity has previously occurred. With some bound on bias and estimates of precision, various sampling strategies could be determined and shown to be either cost-effective or infeasible.

  12. Soil Gas Sample Handling: Evaluation of Water Removal and Sample Ganging

    Energy Technology Data Exchange (ETDEWEB)

    Fritz, Brad G. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Abrecht, David G. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Hayes, James C. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Mendoza, Donaldo P. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2016-10-31

    Soil gas sampling is currently conducted in support of Nuclear Test Ban Treaty verification. Soil gas samples are collected and analyzed for isotopes of interest. Some issues that can impact sampling and analysis of these samples are excess moisture and sample processing time. Here we discuss three potential improvements to the current sampling protocol: a desiccant for water removal, use of a molecular sieve to remove CO2 from the sample during collection, and a ganging manifold to allow composite analysis of multiple samples.

  13. Distributed MIMO radar using compressive sampling

    CERN Document Server

    Petropulu, Athina P; Poor, H Vincent

    2009-01-01

    A distributed MIMO radar is considered, in which the transmit and receive antennas belong to nodes of a small scale wireless network. The transmit waveforms could be uncorrelated, or correlated in order to achieve a desirable beampattern. The concept of compressive sampling is employed at the receive nodes in order to perform direction of arrival (DOA) estimation. According to the theory of compressive sampling, a signal that is sparse in some domain can be recovered based on far fewer samples than required by the Nyquist sampling theorem. The DOAs of targets form a sparse vector in the angle space, and therefore, compressive sampling can be applied for DOA estimation. The proposed approach achieves the superior resolution of MIMO radar with far fewer samples than other approaches. This is particularly useful in a distributed scenario, in which the results at each receive node need to be transmitted to a fusion center.

  14. Automatic polarization control in optical sampling system

    Science.gov (United States)

    Zhao, Zhao; Yang, Aiying; Feng, Lihui

    2015-08-01

    In an optical sampling system for high-speed optical communications, polarization control is one of the most important parts of the system, regardless of whether nonlinear or linear optical sampling is used. A simple method based on the variance of the sampled data is proposed in this paper to tune the wave plates in a motor-driven polarization controller. In the experiment, an optical sampling system based on SFG in PPLN is carried out for a 10 Gbit/s or faster optical data signal. The results demonstrate that, with the proposed method, the error of the Q factor estimated from the sampled data is minimized, and the tuning time to the optimized polarization state is less than 30 seconds with an accuracy of +/-1°.

  15. Reconstruction of Intensity From Covered Samples

    Energy Technology Data Exchange (ETDEWEB)

    Barabash, Rozaliya [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Watkins, Thomas R [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Meisner, Roberta Ann [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Burchell, Timothy D [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Rosseel, Thomas M [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2015-01-01

    The safe handling of activated samples requires containment and covering the sample to eliminate any potential for contamination. Subsequent characterization of the surface with x-rays ideally necessitates a thin film. While many films appear visually transparent, they are not necessarily x-ray transparent. Each film material has a unique beam attenuation and sometimes has amorphous peaks that can superimpose on those of the sample. To reconstruct the intensity of the underlying activated sample, the x-ray attenuation and signal due to the film need to be removed from those of the sample. This requires the calculation of unique deconvolution parameters for the film. The development of a reconstruction procedure for a contained/covered sample is described.

  16. Subsurface Sample Acquisition and Transfer Systems (SSATS)

    Science.gov (United States)

    Rafeek, S.; Gorevan, S. P.; Kong, K. Y.

    2001-01-01

    In the exploration of planets and small bodies, scientists will need the services of a deep drilling and material handling system not only to obtain the samples necessary for analyses but also to precisely transfer and deposit those samples in in-situ instruments on board a landed craft or rover. The technology for such a deep sampling system as the SSATS is currently being developed by Honeybee Robotics through a PIDDP effort. The SSATS has its foundation in a one-meter prototype (SATM) drill that was developed under the New Millennium Program for ST4/Champollion. Additionally, the SSATS includes relevant coring technology from a coring drill (Athena Mini-Corer) developed for the Mars Sample Return mission. These highly developed technologies, along with the current PIDDP effort, are combined to produce a sampling system that can acquire and transfer samples from various depths. Additional information is contained in the original extended abstract.

  17. Sample size in qualitative interview studies

    DEFF Research Database (Denmark)

    Malterud, Kirsti; Siersma, Volkert Dirk; Guassora, Ann Dorrit Kristiane

    2016-01-01

Sample sizes must be determined in qualitative studies as in quantitative studies, but not by the same means. The prevailing concept for sample size in qualitative studies is “saturation.” Saturation is closely tied to a specific methodology, and the term is inconsistently applied. We propose...... the concept “information power” to guide adequate sample size for qualitative studies. Information power indicates that the more information the sample holds, relevant for the actual study, the fewer participants are needed. We suggest that the size of a sample with sufficient information power...... depends on (a) the aim of the study, (b) sample specificity, (c) use of established theory, (d) quality of dialogue, and (e) analysis strategy. We present a model where these elements of information and their relevant dimensions are related to information power. Application of this model in the planning...

  18. Numerical simulations of regolith sampling processes

    Science.gov (United States)

    Schäfer, Christoph M.; Scherrer, Samuel; Buchwald, Robert; Maindl, Thomas I.; Speith, Roland; Kley, Wilhelm

    2017-07-01

We present recent improvements in the simulation of regolith sampling processes in microgravity using the numerical particle method smooth particle hydrodynamics (SPH). We use an elastic-plastic soil constitutive model for large deformation and failure flows to capture the dynamical behaviour of regolith. In the context of projected sample return missions to small bodies (asteroids or small moons), we investigate the efficiency and feasibility of a particular material sampling method: brushes sweep material from the asteroid's surface into a collecting tray. We analyze the influence of different material parameters of regolith, such as cohesion and angle of internal friction, on the sampling rate. Furthermore, we study the sampling process in two environments by varying the surface gravity (Earth's and Phobos') and we apply different rotation rates for the brushes. We find good agreement of our sampling simulations on Earth with experiments and provide estimates of the influence of the material properties on the collecting rate.

  19. Diagnostic herd sensitivity using environmental samples

    DEFF Research Database (Denmark)

    Vigre, Håkan; Josefsen, Mathilde Hartmann; Seyfarth, Anne Mette

. In our example, the prevalence of infected pigs in each herd was estimated from the pooled samples of nasal swabs. Logistic regression was used to estimate the effect of animal prevalence on the probability of detecting MRSA in the dust and air samples at herd level. The results show a significant increase...... of the within-herd prevalence, and performed almost perfectly at a prevalence of 25% infected pigs (sensitivity=99%). In general, the dependence on within-herd prevalence should be considered in designing surveillance programs based on environmental samples.......Due to logistic and economic benefits, the animal industry has an increased interest in using environmental samples to classify herds as free of infections. For a valid interpretation of results obtained from environmental samples, the performance of the diagnostic method using these samples must...
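The herd-level detection model described above can be sketched with a logistic curve. The coefficients below are hypothetical, chosen only so that herd sensitivity reaches roughly 99% at 25% within-herd prevalence as reported in the abstract; the actual fitted coefficients are not given:

```python
import math

def herd_sensitivity(prevalence, intercept=-2.0, slope=26.4):
    """Probability of detecting MRSA in an environmental (dust/air)
    sample as a function of within-herd prevalence.  Intercept and
    slope are illustrative assumptions, not fitted values."""
    logit = intercept + slope * prevalence
    return 1.0 / (1.0 + math.exp(-logit))

for prev in (0.05, 0.10, 0.25):
    print(f"within-herd prevalence {prev:.0%}: "
          f"herd sensitivity {herd_sensitivity(prev):.3f}")
```

With such a curve, a surveillance program can be designed backwards: pick the minimum prevalence that must be detected, then read off the implied herd sensitivity.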

  20. Sample transport with thermocapillary force for microfluidics

    Energy Technology Data Exchange (ETDEWEB)

    Nguyen, N-T; Pang, W W; Huang, X [School of Mechanical and Production Engineering, Nanyang Technological University 50 Nanyang Avenue, Singapore 639798 (Singapore)

    2006-04-01

This paper presents a novel concept for the transport of aqueous samples in capillaries. The concept is based on the thermocapillary effect, which utilizes the temperature dependence of surface tension to drive a sample droplet. To date, the major problem with this concept has been evaporation of the aqueous sample. In our approach, a liquid-liquid system is used to deliver the sample: the aqueous sample is protected by silicone oil, so evaporation is avoided. A transient temperature field drives both liquids away from a heater. The paper first presents a theoretical model for the coupled thermocapillary problem. Next, it compares and discusses experimental results for different capillary sizes. The results show the considerable potential of this concept for handling sample droplets dispersed in oil, which are often created by droplet-based microfluidics.

  1. Direct impact aerosol sampling by electrostatic precipitation

    Energy Technology Data Exchange (ETDEWEB)

    Braden, Jason D.; Harter, Andrew G.; Stinson, Brad J.; Sullivan, Nicholas M.

    2016-02-02

The present disclosure provides apparatuses for collecting aerosol samples by ionizing an air sample to different degrees. An air flow is generated through a cavity in which at least one corona wire is disposed and electrically charged to form a corona around it. At least one grounded sample collection plate is provided downstream of the corona wire so that aerosol ions generated within the corona are deposited on the plate. A plurality of aerosol samples ionized to different degrees can be generated. The corona wire may be perpendicular or parallel to the direction of the flow. The apparatus can include a serial connection of a plurality of stages such that each stage is capable of generating at least one aerosol sample, with the air flow passing through the stages serially.

  2. Incremental Sampling Methodology (ISM) for Metallic Residues

    Science.gov (United States)

    2013-08-01

result in improved precision for Cu or if other changes, such as increasing the digestion aliquot mass or digestion interval or increasing the number...200 g of material. The soil samples were air-dried at ambient temperature, sieved to remove the greater-than-2-mm fraction, and the less-than-2-mm...yielding a 25-kg sample. The incremental sample was air-dried at ambient temperature and passed through a 2-mm sieve. A rotary splitter was

  3. Representative Sampling for reliable data analysis

    DEFF Research Database (Denmark)

    Petersen, Lars; Esbensen, Kim Harry

    2005-01-01

regime in order to secure the necessary reliability of: samples (which must be representative, from the primary sampling onwards), analysis (which will not mean anything outside the minuscule analytical volume without representativity ruling all mass reductions involved, also in the laboratory) and data......) that fully cover all practical aspects of sampling and provides a handy “toolbox” for samplers, engineers, laboratory and scientific personnel....

  4. Field Sampling from a Segmented Image

    CSIR Research Space (South Africa)

    Debba, Pravesh

    2008-06-01

Full Text Available. Presentation outline for "Field Sampling from a Segmented Image" by P. Debba (The Council for Scientific and Industrial Research (CSIR), Logistics and Quantitative Methods, CSIR Built Environment), A. Stein, F.D. van der Meer, E.J.M. Carranza, and A. Lucieer. Sections: Objective; Study Site; Methods (the ICM algorithm, sampling per category, sample size per category, fitness function per category, simulated annealing per category); Results; Experiment; Case Study; Conclusions.

  5. Efficient Monte Carlo sampling by parallel marginalization

    OpenAIRE

    Weare, Jonathan

    2007-01-01

    Markov chain Monte Carlo sampling methods often suffer from long correlation times. Consequently, these methods must be run for many steps to generate an independent sample. In this paper, a method is proposed to overcome this difficulty. The method utilizes information from rapidly equilibrating coarse Markov chains that sample marginal distributions of the full system. This is accomplished through exchanges between the full chain and the auxiliary coarse chains. Results of numerical tests o...

  6. Techniques for geothermal liquid sampling and analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kindle, C.H.; Woodruff, E.M.

    1981-07-01

    A methodology has been developed that is particularly suited to liquid-dominated resources and adaptable to a variety of situations. It is intended to be a base methodology upon which variations can be made to meet specific needs or situations. The approach consists of recording flow conditions at the time of sampling, a specific insertable probe sampling system, a sample stabilization procedure, commercially available laboratory instruments, and data quality check procedures.

  7. METALLOGRAPHIC SAMPLE PREPARATION STATION-CONSTRUCTIVE CONCEPT

    Directory of Open Access Journals (Sweden)

    AVRAM Florin Timotei

    2016-11-01

Full Text Available This paper presents the issues involved in the constructive design of a station for metallographic sample preparation, intended for laboratory work. The metallographic station is composed of an ABB IRB1600 robot, a metallographic microscope, a gripping device, a manipulator, and a laboratory grinding and polishing machine. The robot is used to manipulate the prepared samples, and the manipulator takes the samples for processing.

  8. Microfabricated Devices for Sample Extraction, Concentrations, and Related Sample Processing Technologies

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Gang; Lin, Yuehe

    2006-12-01

This is an invited book chapter. As with other analytical techniques, sample pretreatment, sample extraction, sample introduction, and related techniques are of extreme importance for micro-electro-mechanical systems (MEMS). Bio-MEMS devices and systems start with a sampling step. The biological sample then usually undergoes some kind of sample preparation before the actual analysis. These steps may involve extracting the target sample from its matrix, removing interferences from the sample, derivatizing the sample to detectable species, or performing a sample preconcentration step. The integration of components for sample pretreatment into microfluidic devices remains one of the bottlenecks on the way to achieving true miniaturized total analysis systems (µTAS). This chapter provides a thorough state-of-the-art review of developments in this field to date.

  9. It's in the Sample: The Effects of Sample Size and Sample Diversity on the Breadth of Inductive Generalization

    Science.gov (United States)

    Lawson, Chris A.; Fisher, Anna V.

    2011-01-01

    Developmental studies have provided mixed evidence with regard to the question of whether children consider sample size and sample diversity in their inductive generalizations. Results from four experiments with 105 undergraduates, 105 school-age children (M = 7.2 years), and 105 preschoolers (M = 4.9 years) showed that preschoolers made a higher…

  10. Investigation of Hardened Filling Grout Samples

    DEFF Research Database (Denmark)

    Sørensen, Eigil V.

Suzlon Wind Energy A/S requested on August 28, 2007 an investigation of 2 samples of a hardened filling grout to be carried out, comprising drilling and strength determination of 4 test cylinders, and description of the surface characteristics of the samples.

  11. Unbiased sampling and meshing of isosurfaces

    KAUST Repository

    Yan, Dongming

    2014-11-01

    In this paper, we present a new technique to generate unbiased samples on isosurfaces. An isosurface, F(x,y,z) = c , of a function, F , is implicitly defined by trilinear interpolation of background grid points. The key idea of our approach is that of treating the isosurface within a grid cell as a graph (height) function in one of the three coordinate axis directions, restricted to where the slope is not too high, and integrating / sampling from each of these three. We use this unbiased sampling algorithm for applications in Monte Carlo integration, Poisson-disk sampling, and isosurface meshing.
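The graph-function idea can be illustrated for a single cell: because the trilinear interpolant is linear in each coordinate separately, the isosurface height above a point (x, y) follows from one linear solve. This is a minimal sketch for a unit cell with z chosen as the height direction; the paper's slope restriction and unbiased area weighting are omitted:

```python
def height_on_isosurface(corners, c, x, y):
    """Solve F(x, y, z) = c for z inside a unit grid cell.

    corners[i][j][k] holds the grid value at corner (i, j, k) with
    i, j, k in {0, 1}.  For fixed (x, y) the trilinear interpolant
    is linear in z, so one linear solve gives the graph height.
    Returns None when the isosurface misses this (x, y) column.
    """
    def bilerp(k):  # bilinear interpolation in the z = k corner plane
        return ((1 - x) * (1 - y) * corners[0][0][k]
                + x * (1 - y) * corners[1][0][k]
                + (1 - x) * y * corners[0][1][k]
                + x * y * corners[1][1][k])

    f0, f1 = bilerp(0), bilerp(1)  # F at z = 0 and z = 1
    if f0 == f1:                    # constant in z: no unique crossing
        return None
    z = (c - f0) / (f1 - f0)
    return z if 0.0 <= z <= 1.0 else None
```

Sampling (x, y) uniformly and evaluating this height is biased toward shallow regions, which is exactly why the paper restricts each direction to low-slope regions and reweights.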

  12. How to calculate sample size and why.

    Science.gov (United States)

    Kim, Jeehyoung; Seo, Bong Soo

    2013-09-01

Calculating the sample size is essential for reducing the cost of a study and for testing the hypothesis effectively. By referring to pilot studies and previous research, we can choose a proper hypothesis and simplify the work by using a website or Microsoft Excel sheet that contains formulas for calculating sample size at the beginning stage of the study. There are numerous formulas for calculating the sample size for complicated statistics and studies, but most studies can use basic methods for sample size calculation.
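As an illustration of the basic calculating methods mentioned above, a textbook normal-approximation formula for comparing two means fits in a few lines. This is a generic formula, not one attributed to the paper:

```python
import math
from statistics import NormalDist

def sample_size_two_means(delta, sigma, alpha=0.05, power=0.80):
    """Per-group sample size to detect a mean difference `delta`
    between two groups with common standard deviation `sigma`,
    using a two-sided normal-approximation test."""
    z = NormalDist().inv_cdf
    n = 2.0 * ((z(1.0 - alpha / 2.0) + z(power)) * sigma / delta) ** 2
    return math.ceil(n)  # round up to a whole subject

# Example: detect a 5-point difference, SD 10, alpha 0.05, 80% power
n = sample_size_two_means(delta=5.0, sigma=10.0)  # 63 per group
```

The same quantile-based pattern underlies most spreadsheet and web calculators; exact-test or t-based corrections add a few subjects per group.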

  13. Sampling in the Linear Canonical Transform Domain

    Directory of Open Access Journals (Sweden)

    Bing-Zhao Li

    2012-01-01

    Full Text Available This paper investigates the interpolation formulae and the sampling theorem for bandpass signals in the linear canonical transform (LCT domain. Firstly, one of the important relationships between the bandpass signals in the Fourier domain and the bandpass signals in the LCT domain is derived. Secondly, two interpolation formulae from uniformly sampled points at half of the sampling rate associated with the bandpass signals and their generalized Hilbert transform or the derivatives in the LCT domain are obtained. Thirdly, the interpolation formulae from nonuniform samples are investigated. The simulation results are also proposed to verify the correctness of the derived results.

  14. Efficient Monte Carlo sampling by parallel marginalization.

    Science.gov (United States)

    Weare, Jonathan

    2007-07-31

    Markov chain Monte Carlo sampling methods often suffer from long correlation times. Consequently, these methods must be run for many steps to generate an independent sample. In this paper, a method is proposed to overcome this difficulty. The method utilizes information from rapidly equilibrating coarse Markov chains that sample marginal distributions of the full system. This is accomplished through exchanges between the full chain and the auxiliary coarse chains. Results of numerical tests on the bridge sampling and filtering/smoothing problems for a stochastic differential equation are presented.
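A toy sketch of the exchange idea, under simplifying assumptions not taken from the paper: the full chain targets a two-dimensional Gaussian whose x-marginal is known exactly, so the coarse chain samples that marginal directly and periodically proposes swapping its state with the full chain's x-coordinate via a Metropolis-Hastings exchange on the product space:

```python
import math
import random

random.seed(0)
SIGMA = 0.5  # coupling strength between x and y (assumed)

def log_phi(u):
    """Log density of a standard normal."""
    return -0.5 * u * u - 0.5 * math.log(2.0 * math.pi)

def log_pi(x, y):
    """Full target: x ~ N(0, 1) and y | x ~ N(x, SIGMA^2)."""
    return log_phi(x) + log_phi((y - x) / SIGMA) - math.log(SIGMA)

def log_marg(x):
    """Coarse-chain target: here, the exact marginal of x."""
    return log_phi(x)

def mh_step(state, logp, step):
    """One random-walk Metropolis update."""
    prop = state + random.uniform(-step, step)
    if math.log(random.random()) < logp(prop) - logp(state):
        return prop
    return state

x, y = 0.0, 0.0  # full-chain state
xc = 0.0         # coarse-chain state
xs = []
for it in range(20000):
    x = mh_step(x, lambda v: log_pi(v, y), 0.5)  # local moves
    y = mh_step(y, lambda v: log_pi(x, v), 0.5)
    xc = mh_step(xc, log_marg, 2.0)              # fast coarse chain
    if it % 10 == 0:
        # Exchange move: propose swapping x with the coarse state;
        # accept with the MH ratio for the product target.
        log_ratio = (log_pi(xc, y) + log_marg(x)
                     - log_pi(x, y) - log_marg(xc))
        if math.log(random.random()) < log_ratio:
            x, xc = xc, x
    xs.append(x)

mean = sum(xs) / len(xs)
```

Because the coarse chain decorrelates quickly, accepted exchanges inject nearly independent x-values into the full chain, which is the mechanism the paper exploits (with approximate, rather than exact, marginals).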

  15. Sample size determination for the fluctuation experiment.

    Science.gov (United States)

    Zheng, Qi

    2017-01-01

    The Luria-Delbrück fluctuation experiment protocol is increasingly employed to determine microbial mutation rates in the laboratory. An important question raised at the planning stage is "How many cultures are needed?" For over 70 years sample sizes have been determined either by intuition or by following published examples where sample sizes were chosen intuitively. This paper proposes a practical method for determining the sample size. The proposed method relies on existing algorithms for computing the expected Fisher information under two commonly used mutant distributions. The role of partial plating in reducing sample size is discussed. Copyright © 2016 Elsevier B.V. All rights reserved.
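The experiment being planned above can be sketched in simulation. A common shortcut (the Lea-Coulson formulation, with illustrative numbers not taken from the paper) draws a Poisson number of mutations per culture and gives each mutant clone a final size distributed like 1/U for U uniform on (0, 1):

```python
import math
import random
import statistics

random.seed(1)

def poisson(m):
    """Poisson draw via Knuth's inversion (the stdlib lacks one)."""
    L, k, p = math.exp(-m), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

def mutants_per_culture(m):
    """One culture's final mutant count: Poisson(m) mutations, each
    founding a clone of size ~ 1/U (Lea-Coulson approximation)."""
    return sum(int(1.0 / random.random()) for _ in range(poisson(m)))

# Candidate design: 1000 parallel cultures, an expected 2 mutations
# per culture (illustrative numbers only).
counts = [mutants_per_culture(2.0) for _ in range(1000)]
```

The heavy "jackpot" tail (median far below the mean) is what makes intuition a poor guide to the number of cultures; repeating this simulation for several candidate culture counts shows how estimator stability grows, complementing the paper's Fisher-information approach.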

  16. Drone inflight mixing of biochemical samples.

    Science.gov (United States)

    Katariya, Mayur; Chung, Dwayne Chung Kim; Minife, Tristan; Gupta, Harshit; Zahidi, Alifa Afiah Ahmad; Liew, Oi Wah; Ng, Tuck Wah

    2018-01-04

    Autonomous systems for sample transport to the laboratory for analysis can be improved in terms of timeliness, cost and error mitigation in the pre-analytical testing phase. Drones have been reported for outdoor sample transport but incorporating devices on them to attain homogenous mixing of reagents during flight to enhance sample processing timeliness is limited by payload issues. It is shown here that flipping maneuvers conducted with quadcopters are able to facilitate complete and gentle mixing. This capability incorporated during automated sample transport serves to address an important factor contributing to pre-analytical variability which ultimately impacts on test result reliability. Copyright © 2018 Elsevier Inc. All rights reserved.

  17. Sampling Criterion for EMC Near Field Measurements

    DEFF Research Database (Denmark)

    Franek, Ondrej; Sørensen, Morten; Ebert, Hans

    2012-01-01

An alternative, quasi-empirical sampling criterion for EMC near field measurements intended for close coupling investigations is proposed. The criterion is based on maximum error caused by sub-optimal sampling of near fields in the vicinity of an elementary dipole, which is suggested as a worst-case representative of a signal trace on a typical printed circuit board. It has been found that the sampling density derived in this way is in fact very similar to that given by the antenna near field sampling theorem, if an error less than 1 dB is required. The principal advantage of the proposed formulation is its...

  18. DXC'13 Industrial Track Sample Data

    Data.gov (United States)

    National Aeronautics and Space Administration — The sample scenarios provided here are competition scenarios from previous DXC competitions. They are identical to the competition data associated with previous...

  19. Collecting Wipe Samples for VX Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Koester, C; Hoppes, W G

    2010-02-11

This standard operating procedure (SOP) provides uniform procedures for the collection of wipe samples of VX residues from surfaces. Personnel may use this procedure to collect and handle wipe samples in the field. Various surfaces, including building materials (wood, metal, tile, vinyl, etc.) and equipment, may be sampled based on this procedure. The purpose of such sampling is to determine whether or not the relevant surfaces are contaminated, to determine the extent of their contamination, to evaluate the effectiveness of decontamination procedures, and to determine the amount of contaminant that might be present as a contact hazard.

  20. Method and apparatus for sampling atmospheric mercury

    Science.gov (United States)

    Trujillo, Patricio E.; Campbell, Evan E.; Eutsler, Bernard C.

    1976-01-20

    A method of simultaneously sampling particulate mercury, organic mercurial vapors, and metallic mercury vapor in the working and occupational environment and determining the amount of mercury derived from each such source in the sampled air. A known volume of air is passed through a sampling tube containing a filter for particulate mercury collection, a first adsorber for the selective adsorption of organic mercurial vapors, and a second adsorber for the adsorption of metallic mercury vapor. Carbon black molecular sieves are particularly useful as the selective adsorber for organic mercurial vapors. The amount of mercury adsorbed or collected in each section of the sampling tube is readily quantitatively determined by flameless atomic absorption spectrophotometry.

  1. Colloid characterization and quantification in groundwater samples

    Energy Technology Data Exchange (ETDEWEB)

    K. Stephen Kung

    2000-06-01

This report describes work conducted at Los Alamos National Laboratory to study groundwater colloids for the Yucca Mountain Project in conjunction with the Hydrologic Resources Management Program (HRMP) and the Underground Test Area (UGTA) Project. Colloidal particle size distributions and total particle concentrations in groundwater samples are quantified and characterized. Colloid materials from cavity waters collected near underground nuclear explosion sites by HRMP field sampling personnel at the Nevada Test Site (NTS) were quantified. Selected colloid samples were further characterized by electron microscopy to evaluate colloid shapes, elemental compositions, and mineral phases. The authors evaluated the colloid size and concentration in a natural groundwater sample that was collected from the ER-20-5 well and stored in a 50-gallon (about 200-liter) barrel for several months. This groundwater sample was studied because HRMP personnel had identified trace levels of radionuclides in the water. Colloid results show that even though the water sample had been filtered through a series of Millipore filters, high colloid concentrations were identified in all unfiltered and filtered samples. Samples diluted with distilled water contained more colloids than undiluted ones, implying that the colloids are probably not stable under the storage conditions. Furthermore, the results demonstrate that undesired colloids were introduced into the samples during the storage, filtration, and dilution processes. The authors evaluated possible sources of colloid contamination associated with sample collection, filtration, storage, and analysis of natural groundwaters. The effects of container type and sample storage time on colloid size distribution and total concentration were studied to evaluate colloid stability using J13 groundwater. The data suggest that groundwater samples

  2. Comet Odyssey: Comet Surface Sample Return

    Science.gov (United States)

    Weissman, Paul R.; Bradley, J.; Smythe, W. D.; Brophy, J. R.; Lisano, M. E.; Syvertson, M. L.; Cangahuala, L. A.; Liu, J.; Carlisle, G. L.

    2010-10-01

Comet Odyssey is a proposed New Frontiers mission that would return the first samples from the surface of a cometary nucleus. Stardust demonstrated the tremendous power of analysis of returned samples in terrestrial laboratories versus what can be accomplished in situ with robotic missions. But Stardust collected only 1 milligram of coma dust, and the 6.1 km/s flyby speed heated samples up to 2000 K. Comet Odyssey would collect two independent 800 cc samples directly from the surface in a far more benign manner, preserving the primitive composition. Given a minimum surface density of 0.2 g/cm3, this would return two 160 g surface samples to Earth. Comet Odyssey employs solar-electric propulsion to rendezvous with the target comet. After 180 days of reconnaissance and site selection, the spacecraft performs a "touch-and-go" maneuver with surface contact lasting 3 seconds. A brush-wheel sampler on a remote arm collects up to 800 cc of sample. A duplicate second arm and sampler collects the second sample. The samples are placed in a return capsule and maintained at colder than -70 C during the return flight and at colder than -30 C during re-entry and for up to six hours after landing. The entire capsule is then refrigerated and transported to the Astromaterials Curatorial Facility at NASA/JSC for initial inspection and sample analysis by the Comet Odyssey team. Comet Odyssey's planned target was comet 9P/Tempel 1, with launch in December 2017 and comet arrival in June 2022. After a stay of 300 days at the comet, the spacecraft departs and arrives at Earth in May 2027. Comet Odyssey is a forerunner to a flagship Cryogenic Comet Sample Return mission that would return samples from deep below the nucleus surface, including volatile ices. This work was supported by internal funds from the Jet Propulsion Laboratory.

  3. Sample-Clock Phase-Control Feedback

    Science.gov (United States)

    Quirk, Kevin J.; Gin, Jonathan W.; Nguyen, Danh H.; Nguyen, Huy

    2012-01-01

To demodulate a communication signal, a receiver must recover and synchronize to the symbol timing of a received waveform. In a system that utilizes digital sampling, the fidelity of synchronization is limited by the time between the symbol boundary and closest sample time location. To reduce this error, one typically uses a sample clock in excess of the symbol rate in order to provide multiple samples per symbol, thereby lowering the error limit to a fraction of a symbol time. For systems with a large modulation bandwidth, the required sample clock rate is prohibitive due to current technological barriers and processing complexity. With precise control of the phase of the sample clock, one can sample the received signal at times arbitrarily close to the symbol boundary, thus obviating the need, from a synchronization perspective, for multiple samples per symbol. Sample-clock phase-control feedback was developed for use in the demodulation of an optical communication signal, where multi-GHz modulation bandwidths would require prohibitively large sample clock frequencies for rates in excess of the symbol rate. A custom mixed-signal (RF/digital) offset phase-locked loop circuit was developed to control the phase of the 6.4-GHz clock that samples the photon-counting detector output. The offset phase-locked loop is driven by a feedback mechanism that continuously corrects for variation in the symbol time due to motion between the transmitter and receiver as well as oscillator instability. This innovation will allow significant improvements in receiver throughput; for example, the throughput of a pulse-position modulation (PPM) scheme with 16 slots can increase from 188 Mb/s to 1.5 Gb/s.

  4. Corrosion of metal samples rapidly measured

    Science.gov (United States)

    Maskell, C. E.

    1966-01-01

    Corrosion of a large number of metal samples that have been exposed to controlled environment is accurately and rapidly measured. Wire samples of the metal are embedded in clear plastic and sectioned for microexamination. Unexposed wire can be included in the matrix as a reference.

  5. METABOLITE CHARACTERIZATION IN SERUM SAMPLES FROM ...

    African Journals Online (AJOL)

    take advantage of larger chemical shift spread of 13C resonances allowing a more detailed identification of ... fingerprints of various metabolites of serum samples of normal healthy control have been obtained which can ... fasting 10 mL of blood sample from each individual was taken and was allowed to clot in plastic.

  6. 7 CFR 29.34 - Sample seal.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 2 2010-01-01 2010-01-01 false Sample seal. 29.34 Section 29.34 Agriculture Regulations of the Department of Agriculture AGRICULTURAL MARKETING SERVICE (Standards, Inspections, Marketing... INSPECTION Regulations Definitions § 29.34 Sample seal. A seal approved by the Director for sealing official...

  7. K-Median: Random Sampling Procedure

    Indian Academy of Sciences (India)

K-Median: Random Sampling Procedure. Sample a set of 1/ε + 1 points from P. Let Q = the first 1/ε points and p = the last point. Let T = the average 1-median cost of P and c = the 1-median. Let B1 = B(c, T/2), B2 = B(p, T). Let P' = the points in B1.

  8. Sampling Lesbian, Gay, and Bisexual Populations

    Science.gov (United States)

    Meyer, Ilan H.; Wilson, Patrick A.

    2009-01-01

    Sampling has been the single most influential component of conducting research with lesbian, gay, and bisexual (LGB) populations. Poor sampling designs can result in biased results that will mislead other researchers, policymakers, and practitioners. Investigators wishing to study LGB populations must therefore devote significant energy and…

  9. 40 CFR 61.33 - Stack sampling.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 8 2010-07-01 2010-07-01 false Stack sampling. 61.33 Section 61.33 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) NATIONAL... sampling. (a) Unless a waiver of emission testing is obtained under § 61.13, each owner or operator...

  10. Sampling depth confounds soil acidification outcomes

    Science.gov (United States)

    In the northern Great Plains (NGP) of North America, surface sampling depths of 0-15 or 0-20 cm are suggested for testing soil characteristics such as pH. However, acidification is often most pronounced near the soil surface. Thus, sampling deeper can potentially dilute (increase) pH measurements an...

  11. Sampled Noise in Switched Current Circuits

    DEFF Research Database (Denmark)

    Jørgensen, Ivan Herald Holger; Bogason, Gudmundur

    1997-01-01

    The understanding of noise in analog sampled data systems is vital for the design of high resolution circuitry. In this paper a general description of sampled and held noise is presented. The noise calculations are verified by measurements on an analog delay line implemented using switched...

  12. Sampling scheme optimization from hyperspectral data

    NARCIS (Netherlands)

    Debba, P.

    2006-01-01

This thesis presents statistical sampling scheme optimization for geo-environmental purposes on the basis of hyperspectral data. It integrates derived products of the hyperspectral remote sensing data into individual sampling schemes. Five different issues are dealt with. First, the optimized

  13. 40 CFR 1065.805 - Sampling system.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 32 2010-07-01 2010-07-01 false Sampling system. 1065.805 Section 1065.805 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR POLLUTION CONTROLS ENGINE-TESTING PROCEDURES Testing With Oxygenated Fuels § 1065.805 Sampling system. (a) Dilute engine...

  14. Sampling for validation of digital soil maps

    NARCIS (Netherlands)

    Brus, D.J.; Kempen, B.; Heuvelink, G.B.M.

    2011-01-01

    The increase in digital soil mapping around the world means that appropriate and efficient sampling strategies are needed for validation. Data used for calibrating a digital soil mapping model typically are non-random samples. In such a case we recommend collection of additional independent data and

  15. Additional Considerations in Determining Sample Size.

    Science.gov (United States)

    Levin, Joel R.; Subkoviak, Michael J.

Levin's (1975) sample-size determination procedure for completely randomized analysis of variance designs is extended to designs in which information on antecedent or blocking variables is considered. In particular, a researcher's choice of designs is framed in terms of determining the respective sample sizes necessary to detect specified contrasts…

  16. Determining Sample Size for Research Activities

    Science.gov (United States)

    Krejcie, Robert V.; Morgan, Daryle W.

    1970-01-01

    A formula for determining sample size, which originally appeared in 1960, has lacked a table for easy reference. This article supplies a graph of the function and a table of values which permits easy determination of the size of sample needed to be representative of a given population. (DG)

  17. Sampling of temporal networks: Methods and biases

    Science.gov (United States)

    Rocha, Luis E. C.; Masuda, Naoki; Holme, Petter

    2017-11-01

    Temporal networks have been increasingly used to model a diversity of systems that evolve in time; for example, human contact structures over which dynamic processes such as epidemics take place. A fundamental aspect of real-life networks is that they are sampled within temporal and spatial frames. Furthermore, one might wish to subsample networks to reduce their size for better visualization or to perform computationally intensive simulations. The sampling method may affect the network structure and thus caution is necessary to generalize results based on samples. In this paper, we study four sampling strategies applied to a variety of real-life temporal networks. We quantify the biases generated by each sampling strategy on a number of relevant statistics such as link activity, temporal paths and epidemic spread. We find that some biases are common in a variety of networks and statistics, but one strategy, uniform sampling of nodes, shows improved performance in most scenarios. Given the particularities of temporal network data and the variety of network structures, we recommend that the choice of sampling methods be problem oriented to minimize the potential biases for the specific research questions on hand. Our results help researchers to better design network data collection protocols and to understand the limitations of sampled temporal network data.
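Uniform node sampling, the strategy the abstract singles out as performing best, can be sketched on a toy event list. The data below are synthetic; the paper's networks and bias statistics are not reproduced:

```python
import random

random.seed(42)

# Toy temporal network: time-stamped contacts (t, u, v) among 50 nodes
events = [(t, random.randrange(50), random.randrange(50))
          for t in range(500)]
events = [(t, u, v) for (t, u, v) in events if u != v]  # drop self-loops

def sample_nodes(events, fraction):
    """Uniform node sampling: choose a uniformly random subset of nodes
    and keep only the contacts with both endpoints in that subset."""
    nodes = sorted({n for (_, u, v) in events for n in (u, v)})
    keep = set(random.sample(nodes, int(fraction * len(nodes))))
    return [(t, u, v) for (t, u, v) in events if u in keep and v in keep]

sub = sample_nodes(events, 0.5)
# Statistics such as link activity or simulated epidemic spread can now
# be computed on `sub` and compared against the full event list to
# quantify the bias this sampling strategy introduces.
```

Note that keeping half the nodes retains only about a quarter of the contacts, since both endpoints must survive; this kind of systematic thinning is exactly the bias the paper quantifies.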

  18. Writing for Distance Education. Samples Booklet.

    Science.gov (United States)

    International Extension Coll., Cambridge (England).

    Approaches to the format, design, and layout of printed instructional materials for distance education are illustrated in 36 samples designed to accompany the manual, "Writing for Distance Education." Each sample is presented on a single page with a note pointing out its key features. Features illustrated include use of typescript layout, a comic…

  19. Statistical aspects of food safety sampling

    NARCIS (Netherlands)

    Jongenburger, I.; Besten, den H.M.W.; Zwietering, M.H.

    2015-01-01

    In food safety management, sampling is an important tool for verifying control. Sampling by nature is a stochastic process. However, uncertainty regarding results is made even greater by the uneven distribution of microorganisms in a batch of food. This article reviews statistical aspects of

  20. Accuracy assessment with complex sampling designs

    Science.gov (United States)

    Raymond L. Czaplewski

    2010-01-01

    A reliable accuracy assessment of remotely sensed geospatial data requires a sufficiently large probability sample of expensive reference data. Complex sampling designs reduce cost or increase precision, especially with regional, continental and global projects. The General Restriction (GR) Estimator and the Recursive Restriction (RR) Estimator separate a complex...

  1. Methodological Choices in Rating Speech Samples

    Science.gov (United States)

    O'Brien, Mary Grantham

    2016-01-01

    Much pronunciation research critically relies upon listeners' judgments of speech samples, but researchers have rarely examined the impact of methodological choices. In the current study, 30 German native listeners and 42 German L2 learners (L1 English) rated speech samples produced by English-German L2 learners along three continua: accentedness,…

  2. Analysing designed experiments in distance sampling

    Science.gov (United States)

    Stephen T. Buckland; Robin E. Russell; Brett G. Dickson; Victoria A. Saab; Donal N. Gorman; William M. Block

    2009-01-01

    Distance sampling is a survey technique for estimating the abundance or density of wild animal populations. Detection probabilities of animals inherently differ by species, age class, habitats, or sex. By incorporating the change in an observer's ability to detect a particular class of animals as a function of distance, distance sampling leads to density estimates...
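
The core of line-transect distance sampling is a detection function g(x) that declines with perpendicular distance; density is then n / (2μL), where μ is the effective strip half-width. A hedged sketch assuming a half-normal detection function with a known scale σ (in practice σ is fitted by maximum likelihood to the observed distances):

```python
import math

def effective_half_width(sigma, w):
    """mu = integral from 0 to w of exp(-x^2 / (2 sigma^2)) dx, closed form via erf."""
    return sigma * math.sqrt(math.pi / 2) * math.erf(w / (sigma * math.sqrt(2)))

def density_estimate(n_detections, length, sigma, w):
    """Line-transect density estimate: n / (2 * mu * L)."""
    return n_detections / (2.0 * effective_half_width(sigma, w) * length)
```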

  3. Sampling low-density gypsy moth populations

    Science.gov (United States)

    William E. Wallner; Clive G. Jones; Joseph S. Elkinton; Bruce L. Parker

    1991-01-01

    The techniques and methodology for sampling gypsy moth, Lymantria dispar L., at low densities, less than 100 egg masses/ha (EM/ha), are compared. Forest managers have constraints of time and cost, and need a useful, simple predictable means to assist them in sampling gypsy moth populations. A comparison of various techniques coupled with results of...

  4. Quantum algorithm for exact Monte Carlo sampling

    OpenAIRE

    Destainville, Nicolas; Georgeot, Bertrand; Giraud, Olivier

    2010-01-01

    We build a quantum algorithm which uses the Grover quantum search procedure in order to sample the exact equilibrium distribution of a wide range of classical statistical mechanics systems. The algorithm is based on recently developed exact Monte Carlo sampling methods, and yields a polynomial gain compared to classical procedures.

  5. Multilingualism remixed: Sampling, braggadocio and the stylisation ...

    African Journals Online (AJOL)

    Multilingualism remixed: Sampling, braggadocio and the stylisation of local voice. ... is the question of how multilingual voice may carry across media, modalities and context. In this paper, we ... Specifically, we ask how emcees sample local varieties of language, texts and registers to stage their particular stylisation of voice.

  6. Chemical fingerprinting of unevaporated automotive gasoline samples.

    Science.gov (United States)

    Sandercock, P M L; Du Pasquier, E

    2003-06-24

    The comparison of two or more samples of liquid gasoline (petrol) to establish a common origin is a difficult problem in the forensic investigation of arsons and suspicious fires. A total of 35 randomly collected samples of unevaporated gasoline, covering three different grades (regular unleaded, premium unleaded and lead replacement), were examined. The high-boiling fraction of the gasoline was targeted with a view to apply the techniques described herein to evaporated gasoline samples in the future. A novel micro solid phase extraction (SPE) technique using activated alumina was developed to isolate the polar compounds and the polycyclic aromatic hydrocarbons (PAHs) from a 200 µl sample of gasoline. Samples were analysed using full-scan gas chromatography-mass spectrometry (GC-MS) and potential target compounds identified. Samples were then re-analysed directly, without prior treatment, using GC-MS in selected ion monitoring (SIM) mode for target compounds that exhibited variation between gasoline samples. Principal component analysis (PCA) was applied to the chromatographic data. The first two principal components (PCs) accounted for 91.5% of the variation in the data. Linear discriminant analysis (LDA) performed on the PCA results showed that the 35 samples tested could be classified into 32 different groups.
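
The dimensionality-reduction step described (PCA on target-compound peak areas, ahead of discriminant analysis) can be sketched with a plain SVD; the data matrix below is synthetic, standing in for the 35-sample peak-area table:

```python
import numpy as np

def pca(X, k=2):
    """Principal components via SVD of the mean-centred data matrix."""
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = Xc @ Vt[:k].T            # sample coordinates on the first k PCs
    explained = s**2 / np.sum(s**2)   # fraction of variance per component
    return scores, explained[:k]

rng = np.random.default_rng(0)
X = rng.normal(size=(35, 10))         # 35 samples x 10 target-compound areas
scores, explained = pca(X, k=2)
```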

  7. Evaluation of diesel particulate matter sampling techniques

    CSIR Research Space (South Africa)

    Pretorius, CJ

    2011-09-01

    Full Text Available The study evaluated diesel particulate matter (DPM) sampling methods used in the South African mining industry. The three-piece cassette respirable, open face and stopper sampling methods were compared with the SKC DPM cassette method to find a...

  8. Personal gravimetric dust sampling and risk assessment.

    CSIR Research Space (South Africa)

    Unsted, AD

    1996-03-01

    Full Text Available . At all the sampling sites extremely large variation in dust concentrations were measured on a day to day and shift basis. Correlation of dust concentrations between personal and stationary samples was very poor as was the correlation between quartz...

  9. k-Means: Random Sampling Procedure

    Indian Academy of Sciences (India)

    k-Means: Random Sampling Procedure. The optimal 1-mean is approximated by the centroid of a random sample (Inaba et al.): if S is a random sample of size O(1/ε), then the centroid of S is a (1+ε)-approximate centroid of P with constant probability.
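
The claim from Inaba et al. — that the centroid of a small random sample is, with constant probability, a (1+ε)-approximation to the optimal 1-mean — can be checked empirically with a short numpy sketch:

```python
import numpy as np

def one_mean_cost(P, c):
    """Sum of squared distances from the points P to a candidate centre c."""
    return float(np.sum((P - c) ** 2))

rng = np.random.default_rng(1)
P = rng.normal(size=(1000, 2))
S = P[rng.choice(len(P), size=50, replace=False)]   # random sample of P

opt = one_mean_cost(P, P.mean(axis=0))      # the centroid minimizes this cost
approx = one_mean_cost(P, S.mean(axis=0))   # cost using the sample centroid
ratio = approx / opt                         # close to 1 with high probability
```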

  10. Decisions from Experience: Why Small Samples?

    Science.gov (United States)

    Hertwig, Ralph; Pleskac, Timothy J.

    2010-01-01

    In many decisions we cannot consult explicit statistics telling us about the risks involved in our actions. In lieu of such data, we can arrive at an understanding of our dicey options by sampling from them. The size of the samples that we take determines, ceteris paribus, how good our choices will be. Studies of decisions from experience have…

  11. 27 CFR 6.91 - Samples.

    Science.gov (United States)

    2010-04-01

    ... TREASURY LIQUORS "Tied-House" Exceptions § 6.91 Samples. The act by an industry member of furnishing or giving a sample of distilled spirits, wine, or malt beverages to a retailer who has not purchased the brand from that industry member within the last 12 months does not constitute a means to induce within...

  12. Statistical Literacy and Sample Survey Results

    Science.gov (United States)

    McAlevey, Lynn; Sullivan, Charles

    2010-01-01

    Sample surveys are widely used in the social sciences and business. The news media almost daily quote from them, yet they are widely misused. Using students with prior managerial experience embarking on an MBA course, we show that common sample survey results are misunderstood even by those managers who have previously done a statistics course. In…

  13. 40 CFR 61.54 - Sludge sampling.

    Science.gov (United States)

    2010-07-01

    ..., preparation, and analysis of sludge samples shall be accomplished according to Method 105 in appendix B of... may use Method 105 of appendix B and the procedures specified in this section. (1) A sludge test shall... be sampled according to paragraph (c)(1) of this section, sludge charging rate for the plant shall be...

  14. Gamma-ray spectrometry of LDEF samples

    Energy Technology Data Exchange (ETDEWEB)

    Winn, W.G.

    1991-01-01

    A total of 31 samples from the Long Duration Exposure Facility (LDEF), including materials of aluminum, vanadium, and steel trunnions were analyzed by ultra-low-level gamma spectroscopy. The study quantified particle induced activations of {sup 22}Na, {sup 46}Sc, {sup 51}Cr, {sup 54}Mn, {sup 56}Co, {sup 57}Co, {sup 58}Co, and {sup 60}Co. The samples of trunnion sections exhibited increasing activity toward the outer end of the trunnion and decreasing activity toward its radial center. The trunnion sections did not include end pieces, which have been reported to collect noticeable {sup 7}Be on their leading surfaces. No significant {sup 7}Be was detected in the samples analyzed. The Underground Counting Facility at Savannah River Laboratory (SRL) was used in this work. The facility is 50 ft. underground, constructed with low-background shielding materials, and operated as a clean room. The most sensitive analyses were performed with a 90%-efficient HPGe gamma-ray detector, which is enclosed in a purged active/passive shield. Each sample was counted for one to six days in two orientations to yield more representative average activities for the sample. The non-standard geometries of the LDEF samples prompted the development of a novel calibration method, whereby the efficiency about the samples surfaces (measured with point sources) predicted the efficiency for the bulk sample.

  16. Network Sampling with Memory: A proposal for more efficient sampling from social networks

    OpenAIRE

    Mouw, Ted; Verdery, Ashton M.

    2012-01-01

    Techniques for sampling from networks have grown into an important area of research across several fields. For sociologists, the possibility of sampling from a network is appealing for two reasons: (1) A network sample can yield substantively interesting data about network structures and social interactions, and (2) it is useful in situations where study populations are difficult or impossible to survey with traditional sampling approaches because of the lack of a sampling frame. Despite its ...

  17. A sampling algorithm for segregation analysis

    Directory of Open Access Journals (Sweden)

    Henshall John

    2001-11-01

    Full Text Available Abstract Methods for detecting Quantitative Trait Loci (QTL without markers have generally used iterative peeling algorithms for determining genotype probabilities. These algorithms have considerable shortcomings in complex pedigrees. A Monte Carlo Markov chain (MCMC method which samples the pedigree of the whole population jointly is described. Simultaneous sampling of the pedigree was achieved by sampling descent graphs using the Metropolis-Hastings algorithm. A descent graph describes the inheritance state of each allele and provides pedigrees guaranteed to be consistent with Mendelian sampling. Sampling descent graphs overcomes most, if not all, of the limitations incurred by iterative peeling algorithms. The algorithm was able to find the QTL in most of the simulated populations. However, when the QTL was not modeled or found then its effect was ascribed to the polygenic component. No QTL were detected when they were not simulated.
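
Descent-graph sampling itself is pedigree-specific, but the Metropolis-Hastings step it relies on can be shown on a toy discrete target. This is a generic illustration of the algorithm, not the paper's genotype sampler:

```python
import random

def metropolis_hastings(target, states, steps, seed=0):
    """Generic Metropolis-Hastings with a symmetric uniform proposal over states."""
    rng = random.Random(seed)
    x = states[0]
    counts = {s: 0 for s in states}
    for _ in range(steps):
        y = rng.choice(states)                       # symmetric proposal
        if rng.random() < min(1.0, target[y] / target[x]):
            x = y                                    # accept the move
        counts[x] += 1
    return {s: c / steps for s, c in counts.items()}  # empirical frequencies

freq = metropolis_hastings({0: 0.2, 1: 0.3, 2: 0.5}, [0, 1, 2], 50000)
```

The empirical visit frequencies converge to the target distribution, which is what makes the approach usable when genotype probabilities cannot be peeled exactly.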

  18. East Mountain Area 1995 air sampling results

    Energy Technology Data Exchange (ETDEWEB)

    Deola, R.A. [Sandia National Labs., Albuquerque, NM (United States). Air Quality Dept.

    1996-09-01

    Ambient air samples were taken at two locations in the East Mountain Area in conjunction with thermal testing at the Lurance Canyon Burn Site (LCBS). The samples were taken to provide measurements of particulate matter with a diameter less than or equal to 10 micrometers (PM{sub 10}) and volatile organic compounds (VOCs). This report summarizes the results of the sampling performed in 1995. The results from small-scale testing performed to determine the potentially produced air pollutants in the thermal tests are included in this report. Analytical results indicate few samples produced measurable concentrations of pollutants believed to be produced by thermal testing. Recommendations for future air sampling in the East Mountain Area are also noted.

  19. Importance sampling the Rayleigh phase function

    DEFF Research Database (Denmark)

    Frisvad, Jeppe Revall

    2011-01-01

    Rayleigh scattering is used frequently in Monte Carlo simulation of multiple scattering. The Rayleigh phase function is quite simple, and one might expect that it should be simple to importance sample it efficiently. However, there seems to be no one good way of sampling it in the literature. This paper provides the details of several different techniques for importance sampling the Rayleigh phase function, and it includes a comparison of their performance as well as hints toward efficient implementation.
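
One standard technique of the kind compared here inverts the Rayleigh CDF in closed form via Cardano's formula. A sketch, using the normalized phase function p(cos θ) = (3/8)(1 + cos²θ) on [−1, 1]:

```python
import math

def sample_rayleigh_cos_theta(xi):
    """Invert F(x) = (x**3 + 3*x)/8 + 1/2 = xi for x = cos(theta), xi in [0, 1]."""
    t = 2.0 * (2.0 * xi - 1.0)
    u = (t + math.sqrt(t * t + 1.0)) ** (1.0 / 3.0)  # Cardano; base is always > 0
    return u - 1.0 / u

def rayleigh_cdf(x):
    """CDF of p(x) = (3/8)*(1 + x**2) on [-1, 1]."""
    return (x**3 + 3.0 * x) / 8.0 + 0.5
```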

  20. Custom sample environments at the ALBA XPEEM

    Energy Technology Data Exchange (ETDEWEB)

    Foerster, Michael, E-mail: mfoerster@cells.es; Prat, Jordi; Massana, Valenti; Gonzalez, Nahikari; Fontsere, Abel; Molas, Bernat; Matilla, Oscar; Pellegrin, Eric; Aballe, Lucia

    2016-12-15

    A variety of custom-built sample holders offer users a wide range of non-standard measurements at the ALBA synchrotron PhotoEmission Electron Microscope (PEEM) experimental station. Some of the salient features are: an ultrahigh vacuum (UHV) suitcase compatible with many offline deposition and characterization systems, built-in electromagnets for uni- or biaxial in-plane (IP) and out-of-plane (OOP) fields, as well as the combination of magnetic fields with electric fields or current injection. Electronics providing a synchronized sinusoidal signal for sample excitation enable time-resolved measurements at the 500 MHz storage ring RF frequency. - Highlights: • Custom sample environment for XPEEM at ALBA. • Sample holders with electromagnets, in-plane dipole, in-plane quadrupole and out-of-plane. • Sample holders with printed circuit boards for electric contacts including electromagnets. • UHV suitcase adapter. • Synchronized 500 MHz electrical excitation for time resolved measurements.

  1. A Geology Sampling System for Small Bodies

    Science.gov (United States)

    Naids, Adam J.; Hood, Anthony D.; Abell, Paul; Graff, Trevor; Buffington, Jesse

    2016-01-01

    Human exploration of microgravity bodies is being investigated as a precursor to a Mars surface mission. Asteroids, comets, dwarf planets, and the moons of Mars all fall into this microgravity category and some are being discussed as potential mission targets. Obtaining geological samples for return to Earth will be a major objective for any mission to a small body. Currently, the knowledge base for geology sampling in microgravity is in its infancy. Humans interacting with non-engineered surfaces in microgravity environment pose unique challenges. In preparation for such missions a team at the NASA Johnson Space Center has been working to gain experience on how to safely obtain numerous sample types in such an environment. This paper describes the type of samples the science community is interested in, highlights notable prototype work, and discusses an integrated geology sampling solution.

  2. A Geology Sampling System for Microgravity Bodies

    Science.gov (United States)

    Hood, Anthony; Naids, Adam

    2016-01-01

    Human exploration of microgravity bodies is being investigated as a precursor to a Mars surface mission. Asteroids, comets, dwarf planets, and the moons of Mars all fall into this microgravity category and some are being discussed as potential mission targets. Obtaining geological samples for return to Earth will be a major objective for any mission to a microgravity body. Currently, the knowledge base for geology sampling in microgravity is in its infancy. Humans interacting with non-engineered surfaces in microgravity environment pose unique challenges. In preparation for such missions a team at the NASA Johnson Space Center has been working to gain experience on how to safely obtain numerous sample types in such an environment. This paper describes the type of samples the science community is interested in, highlights notable prototype work, and discusses an integrated geology sampling solution.

  3. Recommended Maximum Temperature For Mars Returned Samples

    Science.gov (United States)

    Beaty, D. W.; McSween, H. Y.; Czaja, A. D.; Goreva, Y. S.; Hausrath, E.; Herd, C. D. K.; Humayun, M.; McCubbin, F. M.; McLennan, S. M.; Hays, L. E.

    2016-01-01

    The Returned Sample Science Board (RSSB) was established in 2015 by NASA to provide expertise from the planetary sample community to the Mars 2020 Project. The RSSB's first task was to address the effect of heating during acquisition and storage of samples on scientific investigations that could be expected to be conducted if the samples are returned to Earth. Sample heating may cause changes that could adversely affect scientific investigations. Previous studies of temperature requirements for returned martian samples fall within a wide range (-73 to 50 degrees Centigrade) and, for mission concepts that have a life detection component, the recommended threshold was less than or equal to -20 degrees Centigrade. The RSSB was asked by the Mars 2020 project to determine whether or not a temperature requirement was needed within the range of 30 to 70 degrees Centigrade. There are eight expected temperature regimes to which the samples could be exposed, from the moment that they are drilled until they are placed into a temperature-controlled environment on Earth. Two of those - heating during sample acquisition (drilling) and heating while cached on the Martian surface - potentially subject samples to the highest temperatures. The RSSB focused on the upper temperature limit that Mars samples should be allowed to reach. We considered 11 scientific investigations where thermal excursions may have an adverse effect on the science outcome. Those are: (T-1) organic geochemistry, (T-2) stable isotope geochemistry, (T-3) prevention of mineral hydration/dehydration and phase transformation, (T-4) retention of water, (T-5) characterization of amorphous materials, (T-6) putative Martian organisms, (T-7) oxidation/reduction reactions, (T-8) {sup 4}He thermochronometry, (T-9) radiometric dating using fission, cosmic-ray or solar-flare tracks, (T-10) analyses of trapped gasses, and (T-11) magnetic studies.

  4. CHARACTERIZATION OF TANK 19F SAMPLES

    Energy Technology Data Exchange (ETDEWEB)

    Oji, L.; Diprete, D.; Click, D.

    2009-12-17

    The Savannah River National Laboratory (SRNL) was asked by Liquid Waste Operations to characterize Tank 19F closure samples. Tank 19F slurry samples analyzed included the liquid and solid fractions derived from the slurry materials along with the floor scrape bottom Tank 19F wet solids. These samples were taken from Tank 19F in April 2009 and made available to SRNL in the same month. Because of limited amounts of solids observed in Tank 19F samples, the samples from the north quadrants of the tank were combined into one Tank 19F North Hemisphere sample and similarly the south quadrant samples were combined into one Tank 19F South Hemisphere sample. These samples were delivered to the SRNL shielded cell. The Tank 19F samples were analyzed for radiological, chemical and elemental components. Where analytical methods yielded additional contaminants other than those requested by the customer, these results were also reported. The target detection limits for isotopes analyzed were based on detection values of 1E-04 {micro}Ci/g for most radionuclides and customer desired detection values of 1E-05 {micro}Ci/g for I-129, Pa-231, Np-237, and Ra-226. While many of the target detection limits, as specified in the technical task request and task technical and quality assurance plans were met for the species characterized for Tank 19F, some were not met. In a number of cases, the relatively high levels of radioactive species of the same element or a chemically similar element precluded the ability to measure some isotopes to low levels. SRNL, in conjunction with the plant customer, reviewed all these cases and determined that the impacts were negligible.

  5. CHARACTERIZATION OF THE TANK 18F SAMPLES

    Energy Technology Data Exchange (ETDEWEB)

    Oji, L.; Click, D.; Diprete, D.

    2009-12-17

    The Savannah River National Laboratory (SRNL) was asked by Liquid Waste Operations to characterize Tank 18F closure samples. Tank 18F slurry samples analyzed included the liquid and solid fractions derived from the 'as-received' slurry materials along with the floor scrape bottom Tank 18F wet solids. These samples were taken from Tank 18F in March 2009 and made available to SRNL in the same month. Because of limited amounts of solids observed in Tank 18F samples, the samples from the north quadrants of the tank were combined into one North Tank 18F Hemisphere sample and similarly the south quadrant samples were combined into one South Tank 18F Hemisphere sample. These samples were delivered to the SRNL shielded cell. The Tank 18F samples were analyzed for radiological, chemical and elemental components. Where analytical methods yielded additional contaminants other than those requested by the customer, these results were also reported. The target detection limits for isotopes analyzed were 1E-04 {micro}Ci/g for most radionuclides and customer desired detection values of 1E-05 {micro}Ci/g for I-129, Pa-231, Np-237, and Ra-226. While many of the minimum detection limits, as specified in the technical task request and task technical and quality assurance plans were met for the species characterized for Tank 18F, some were not met due to spectral interferences. In a number of cases, the relatively high levels of radioactive species of the same element or a chemically similar element precluded the ability to measure some isotopes to low levels. SRNL, in conjunction with the plant customer, reviewed all these cases and determined that the impacts were negligible.

  6. Cooled membrane for high sensitivity gas sampling.

    Science.gov (United States)

    Jiang, Ruifen; Pawliszyn, Janusz

    2014-04-18

    A novel sample preparation method that combines the advantages of high surface area geometry and cold surface effect was proposed to achieve high sensitivity gas sampling. To accomplish this goal, a device that enables the membrane to be cooled down was developed for sampling, and a gas chromatograph-mass spectrometer was used for separation and quantification analysis. Method development included investigation of the effect of membrane temperature, membrane size, gas flow rate and humidity. Results showed that high sensitivity for equilibrium sampling, such as limonene sampling in the current study, could be achieved by either cooling down the membrane and/or using a large volume extraction phase. On the other hand, for pre-equilibrium extraction, in which the extracted amount was mainly determined by membrane surface area and diffusion coefficient, high sensitivity could be obtained by using thinner membranes with a larger surface and/or a higher sampling flow rate. In addition, humidity showed no significant influence on extraction efficiency, due to the absorption property of the liquid extraction phase. Next, the limit of detection (LOD) was found, and the reproducibility of the developed cooled membrane gas sampling method was evaluated. Results showed that LODs with a membrane diameter of 19 mm at room temperature sampling were 9.2 ng/L, 0.12 ng/L, 0.10 ng/L for limonene, cinnamaldehyde and 2-pentadecanone, respectively. Intra- and inter-membrane sampling reproducibility revealed RSD% lower than 8% and 13%, respectively. Results uniformly demonstrated that the proposed cooled membrane device could serve as an alternative powerful tool for future gas sampling. Copyright © 2014 Elsevier B.V. All rights reserved.

  7. The LITA Drill and Sample Delivery System

    Science.gov (United States)

    Paulsen, G.; Yoon, S.; Zacny, K.; Wettergreeng, D.; Cabrol, N. A.

    2013-12-01

    The Life in the Atacama (LITA) project has a goal of demonstrating autonomous roving, sample acquisition, delivery and analysis operations in Atacama, Chile. To enable the sample handling requirement, Honeybee Robotics developed a rover-deployed, rotary-percussive, autonomous drill, called the LITA Drill, capable of penetrating to ~80 cm in various formations, capturing and delivering subsurface samples to a 20 cup carousel. The carousel has a built-in capability to press the samples within each cup, and position target cups underneath instruments for analysis. The drill and sample delivery system had to have mass and power requirements consistent with a flight system. The drill weighs 12 kg and uses less than 100 watts of power to penetrate ~80 cm. The LITA Drill auger has been designed with two distinct stages. The lower part has deep and gently sloping flutes for retaining powdered sample, while the upper section has shallow and steep flutes for preventing borehole collapse and for efficient movement of cuttings and fall back material out of the hole. The drill uses the so-called 'bite-sampling' approach; that is, samples are taken in short, 5-10 cm bites. To take the first bite, the drill is lowered onto the ground and upon drilling of the first bite it is then retracted into an auger tube. The auger with the auger tube are then lifted off the ground and positioned next to the carousel. To deposit the sample, the auger is rotated and retracted above the auger tube. The cuttings retained on the flutes are either gravity fed or are brushed off by a passive side brush into the cup. After the sample from the first bite has been deposited, the drill is lowered back into the same hole to take the next bite. This process is repeated until a target depth is reached. The bite sampling is analogous to peck drilling in the machining process where a bit is periodically retracted to clear chips. If there is some fall back into the hole once the auger has cleared the hole, this…

  8. Biostatistics Series Module 5: Determining Sample Size.

    Science.gov (United States)

    Hazra, Avijit; Gogtay, Nithya

    2016-01-01

    Determining the appropriate sample size for a study, whatever be its type, is a fundamental aspect of biomedical research. An adequate sample ensures that the study will yield reliable information, regardless of whether the data ultimately suggests a clinically important difference between the interventions or elements being studied. The probability of Type 1 and Type 2 errors, the expected variance in the sample and the effect size are the essential determinants of sample size in interventional studies. Any method for deriving a conclusion from experimental data carries with it some risk of drawing a false conclusion. Two types of false conclusion may occur, called Type 1 and Type 2 errors, whose probabilities are denoted by the symbols α and β. A Type 1 error occurs when one concludes that a difference exists between the groups being compared when, in reality, it does not. This is akin to a false positive result. A Type 2 error occurs when one concludes that difference does not exist when, in reality, a difference does exist, and it is equal to or larger than the effect size defined by the alternative to the null hypothesis. This may be viewed as a false negative result. When considering the risk of Type 2 error, it is more intuitive to think in terms of power of the study or (1 - β). Power denotes the probability of detecting a difference when a difference does exist between the groups being compared. Smaller α or larger power will increase sample size. Conventional acceptable values for power and α are 80% or above and 5% or below, respectively, when calculating sample size. Increasing variance in the sample tends to increase the sample size required to achieve a given power level. The effect size is the smallest clinically important difference that is sought to be detected and, rather than statistical convention, is a matter of past experience and clinical judgment. Larger samples are required if smaller differences are to be detected. Although the…
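
For the common case of comparing two means, the determinants listed above (α, power, variance σ², effect size Δ) combine in a standard closed form. A sketch using only the standard library; this generic two-sample z-approximation is an illustration, not the module's own worked example:

```python
import math
from statistics import NormalDist

def n_per_group(sigma, delta, alpha=0.05, power=0.80):
    """Two-sample z-approximation: n = 2 * (z_{1-a/2} + z_{1-b})^2 * sigma^2 / delta^2."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)   # e.g. 1.96 for alpha = 0.05
    z_beta = z.inv_cdf(power)            # e.g. 0.84 for 80% power
    return math.ceil(2 * (z_alpha + z_beta) ** 2 * sigma ** 2 / delta ** 2)

# Detecting a half-SD difference at alpha = 0.05 with 80% power:
n = n_per_group(sigma=1.0, delta=0.5)
```

As the text says, halving the detectable difference roughly quadruples the required sample.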

  9. 1997 Baseline Sampling and Analysis Sample Locations, Geographic NAD83, LOSCO (2004) [BSA_1997_sample_locations_LOSCO_2004

    Data.gov (United States)

    Louisiana Geographic Information Center — The monitor point data set was produced as a part of the Baseline Sampling and Analysis (BSA) program coordinated by the Louisiana Oil Spill Coordinator's Office....

  10. 1999 Baseline Sampling and Analysis Sampling Locations, Geographic NAD83, LOSCO (2004) [BSA_1999_sample_locations_LOSCO_2004

    Data.gov (United States)

    Louisiana Geographic Information Center — The monitor point data set was produced as a part of the Baseline Sampling and Analysis program coordinated by the Louisiana Oil Spill Coordinator's Office. This...

  11. 1998 Baseline Sampling and Analysis Sampling Locations, Geographic NAD83, LOSCO (2004) [BSA_1998_sample_locations_LOSCO_2004

    Data.gov (United States)

    Louisiana Geographic Information Center — The monitor point data set was produced as a part of the Baseline Sampling and Analysis program coordinated by the Louisiana Oil Spill Coordinator's Office. This...

  12. Galahad: medium class asteroid sample return mission

    Science.gov (United States)

    Cheng, Andrew; Rivkin, Andrew; Adler, Mark

    The Galahad asteroid sample return mission proposal to the NASA New Frontiers solicitation met all of the objectives for the Asteroid Rover/Sample Return mission as defined in that announcement. Galahad is in many ways similar to the Marco Polo and the OSIRIS-REx proposals. All three missions plan bulk sample returns from primitive, C or B class Near Earth asteroids. Galahad in particular will rendezvous with and orbit the binary C-asteroid 1996 FG3, making extensive orbital measurements. It will then land and collect over 60 g of well-documented samples with geologic context for return to Earth. The samples are expected to provide abundant materials from the early solar system, including chondrules and CAIs, as well as a primitive assemblage of organics, presolar grains and probably hydrated minerals. Analyses of these samples will yield new understanding of the early solar system, planetary accretion, and the nature and origins of prebiotic organic material. We will discuss scientific and technical approaches to characterization of, landing on, and sample collection from small primitive bodies.

  13. Downsampling Non-Uniformly Sampled Data

    Directory of Open Access Journals (Sweden)

    Fredrik Gustafsson

    2007-10-01

    Full Text Available Decimating a uniformly sampled signal by a factor D involves low-pass antialias filtering with normalized cutoff frequency 1/D followed by picking out every Dth sample. Alternatively, decimation can be done in the frequency domain using the fast Fourier transform (FFT) algorithm, after zero-padding the signal and truncating the FFT. We outline three approaches to decimate non-uniformly sampled signals, which are all based on interpolation. The interpolation is done in different domains, and the inter-sample behavior does not need to be known. The first approach interpolates the signal onto a uniform sampling grid, after which standard decimation can be applied. The second interpolates a continuous-time convolution integral, which implements the antialias filter, after which every Dth sample can be picked out. The third, frequency domain, approach computes an approximate Fourier transform, after which truncation and IFFT give the desired result. Simulations indicate that the second approach is particularly useful. A thorough analysis is therefore performed for this case, using the assumption that the non-uniformly distributed sampling instants are generated by a stochastic process.
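
    The first of the three approaches (interpolate to a uniform grid, then decimate as usual) can be sketched as follows. The linear interpolation and the crude moving-average antialias filter are illustrative simplifications, not the filters analyzed in the paper:

```python
import numpy as np

def decimate_nonuniform(t, x, D, fs):
    """Resample the non-uniform pairs (t, x) onto a uniform grid at
    rate fs by linear interpolation, low-pass filter with a length-D
    moving average (a crude antialias filter with cutoff near 1/D),
    then keep every D-th sample."""
    t_uniform = np.arange(t[0], t[-1], 1.0 / fs)
    x_uniform = np.interp(t_uniform, t, x)      # step 1: uniform sampling
    kernel = np.ones(D) / D
    x_filtered = np.convolve(x_uniform, kernel, mode="same")
    return t_uniform[::D], x_filtered[::D]      # step 2: standard decimation

# Jittered samples of a 2 Hz sine, decimated by D = 4 from a 200 Hz grid:
rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0.0, 1.0, 400))
x = np.sin(2 * np.pi * 2 * t)
td, xd = decimate_nonuniform(t, x, D=4, fs=200)
```

    For a slowly varying signal the decimated samples track the underlying sine closely; a proper design would replace the moving average with a windowed-sinc or IIR antialias filter.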

  14. Sampling of Complex Networks: A Datamining Approach

    Science.gov (United States)

    Loecher, Markus; Dohrmann, Jakob; Bauer, Gernot

    2007-03-01

    Efficient and accurate sampling of big complex networks is still an unsolved problem. As the degree distribution is one of the most commonly used attributes to characterize a network, there have been many attempts in recent papers to derive the original degree distribution from the data obtained during a traceroute-like sampling process. This talk describes a strategy for predicting the original degree of a node from the data obtained by traceroute-like sampling of a network, making use of datamining techniques. Only local quantities (the sampled degree k, the redundancy of node detection r, the time of the first discovery of a node t and the distance to the sampling source d) are used as input for the datamining models. Global properties like the betweenness centrality are ignored. These local quantities are examined theoretically and in simulations to increase their value for the predictions. The accuracy of the models is discussed as a function of the number of sources used in the sampling process and the underlying topology of the network. The purpose of this work is to introduce the techniques of the relatively young field of datamining to the discussion on network sampling.

  15. Enzymatic Purification of Microplastics in Environmental Samples.

    Science.gov (United States)

    Löder, Martin G J; Imhof, Hannes K; Ladehoff, Maike; Löschel, Lena A; Lorenz, Claudia; Mintenig, Svenja; Piehl, Sarah; Primpke, Sebastian; Schrank, Isabella; Laforsch, Christian; Gerdts, Gunnar

    2017-12-19

    Micro-Fourier transform infrared (micro-FTIR) spectroscopy and Raman spectroscopy enable the reliable identification and quantification of microplastics (MPs) in the lower micron range. Since concentrations of MPs in the environment are usually low, the large sample volumes required for these techniques lead to an excess of coenriched organic or inorganic materials. While inorganic materials can be separated from MPs using density separation, the organic fraction impedes the ability to conduct reliable analyses. Hence, the purification of MPs from organic materials is crucial prior to conducting an identification via spectroscopic techniques. Strong acidic or alkaline treatments bear the danger of degrading sensitive synthetic polymers. We suggest an alternative method, which uses a series of technical grade enzymes for purifying MPs in environmental samples. A basic enzymatic purification protocol (BEPP) proved to be efficient while reducing 98.3 ± 0.1% of the sample matrix in surface water samples. After showing a high recovery rate (84.5 ± 3.3%), the BEPP was successfully applied to environmental samples from the North Sea where numbers of MPs range from 0.05 to 4.42 items m⁻³. Experiences with different environmental sample matrices were considered in an improved and universally applicable version of the BEPP, which is suitable for focal plane array detector (FPA)-based micro-FTIR analyses of water, wastewater, sediment, biota, and food samples.

  16. Thermal probe design for Europa sample acquisition

    Science.gov (United States)

    Horne, Mera F.

    2018-01-01

    The planned lander missions to the surface of Europa will access samples from the subsurface of the ice in a search for signs of life. A small thermal drill (probe) is proposed to meet the sample requirement of the Science Definition Team's (SDT) report for the Europa mission. The probe is 2 cm in diameter and 16 cm in length and is designed to access the subsurface to 10 cm deep and to collect five ice samples of 7 cm3 each, approximately. The energy required to penetrate the top 10 cm of ice in a vacuum is 26 Wh, approximately, and to melt 7 cm3 of ice is 1.2 Wh, approximately. The requirement stated in the SDT report of collecting samples from five different sites can be accommodated with repeated use of the same thermal drill. For smaller sample sizes, a smaller probe of 1.0 cm in diameter with the same length of 16 cm could be utilized that would require approximately 6.4 Wh to penetrate the top 10 cm of ice, and 0.02 Wh to collect 0.1 g of sample. The thermal drill has the advantage of simplicity of design and operations and the ability to penetrate ice over a range of densities and hardness while maintaining sample integrity.

  17. Apollo Lunar Sample Photograph Digitization Project Update

    Science.gov (United States)

    Todd, N. S.; Lofgren, G. E.

    2012-01-01

    This is an update on the progress of a 4-year data restoration project, funded by the LASER program and undertaken by the Astromaterials Acquisition and Curation Office at JSC, to digitize photographs of the Apollo lunar rock samples and create high resolution digital images [1]. The project is currently in its last year of funding. We also provide an update on the derived products that make use of the digitized photos, including the Lunar Sample Catalog and Photo Database [2] and the Apollo Sample data files for GoogleMoon [3].

  18. Sampling the Uppermost Surface of Airless Bodies

    Science.gov (United States)

    Noble, S. K.; Keller, L. P.; Christoffersen, R.

    2011-01-01

    The uppermost surface of an airless body is a critical source of ground-truth information for the various remote sensing techniques that only penetrate nanometers to micrometers into the surface. Such samples will also be vital for understanding conditions at the surface and acquiring information about how the body interacts with its environment, including solar wind interaction, grain charging and levitation [1]. Sampling the uppermost surface while preserving its structure (e.g. porosity, grain-to-grain contacts) however, is a daunting task that has not been achieved on any sample return mission to date.

  19. Basic Statistical Concepts for Sample Size Estimation

    Directory of Open Access Journals (Sweden)

    Vithal K Dhulkhed

    2008-01-01

    Full Text Available For grant proposals the investigator has to include an estimation of sample size. The sample should be large enough that there are sufficient data to reliably answer the research question being addressed by the study. The investigator has to involve the statistician at the very planning stage of the study, and to have a meaningful dialogue with the statistician every research worker should be familiar with the basic concepts of statistics. This paper is concerned with simple principles of sample size calculation. Concepts are explained based on logic rather than rigorous mathematical calculations, to help the reader assimilate the fundamentals.

  20. Fluidics platform and method for sample preparation

    Science.gov (United States)

    Benner, Henry W.; Dzenitis, John M.

    2016-06-21

    Provided herein are fluidics platforms and related methods for performing integrated sample collection and solid-phase extraction of a target component of the sample all in one tube. The fluidics platform comprises a pump, particles for solid-phase extraction and a particle-holding means. The method comprises contacting the sample with one or more reagents in a pump, coupling a particle-holding means to the pump and expelling the waste out of the pump while the particle-holding means retains the particles inside the pump. The fluidics platform and methods herein described allow solid-phase extraction without pipetting and centrifugation.

  1. Visual Sample Plan (VSP) - FIELDS Integration

    Energy Technology Data Exchange (ETDEWEB)

    Pulsipher, Brent A.; Wilson, John E.; Gilbert, Richard O.; Hassig, Nancy L.; Carlson, Deborah K.; Bing-Canar, John; Cooper, Brian; Roth, Chuck

    2003-04-19

    Two software packages, VSP 2.1 and FIELDS 3.5, are being used by environmental scientists to plan the number and type of samples required to meet project objectives, display those samples on maps, query a database of past sample results, produce spatial models of the data, and analyze the data in order to arrive at defensible decisions. VSP 2.0 is an interactive tool to calculate optimal sample size and optimal sample location based on user goals, risk tolerance, and variability in the environment and in lab methods. FIELDS 3.0 is a set of tools to explore the sample results in a variety of ways to make defensible decisions with quantified levels of risk and uncertainty. However, FIELDS 3.0 has a small sample design module. VSP 2.0, on the other hand, has over 20 sampling goals, allowing the user to input site-specific assumptions such as non-normality of sample results, separate variability between field and laboratory measurements, make two-sample comparisons, perform confidence interval estimation, use sequential search sampling methods, and much more. Over 1,000 copies of VSP are in use today. FIELDS is used in nine of the ten U.S. EPA regions, by state regulatory agencies, and most recently by several international countries. Both software packages have been peer-reviewed, enjoy broad usage, and have been accepted by regulatory agencies as well as site project managers as key tools to help collect data and make environmental cleanup decisions. Recently, the two software packages were integrated, allowing the user to take advantage of the many design options of VSP, and the analysis and modeling options of FIELDS. The transition between the two is simple for the user – VSP can be called from within FIELDS, automatically passing a map to VSP and automatically retrieving sample locations and design information when the user returns to FIELDS. This paper will describe the integration, give a demonstration of the integrated package, and give users download

  2. Sampling system for in vivo ultrasound images

    DEFF Research Database (Denmark)

    Jensen, Jorgen Arendt; Mathorne, Jan

    1991-01-01

    Newly developed algorithms for processing medical ultrasound images use the high frequency sampled transducer signal. This paper describes the demands imposed on a sampling system suitable for acquiring such data and gives details about a prototype constructed. It acquires full clinical images at a sampling frequency of 20 MHz with a resolution of 12 bits. The prototype can be used for real time image processing. An example of a clinical in vivo image is shown and various aspects of the data acquisition process are discussed.

  3. METHODOLOGICAL ASPECTS OF STRATIFICATION OF AUDIT SAMPLING

    Directory of Open Access Journals (Sweden)

    Vilena A. Yakimova

    2013-01-01

    Full Text Available The article presents the methodological foundations for constructing a stratified audit sample for attribute-based sampling. Sampling techniques from Russian and foreign practice are studied and systematized, and the role of stratification in the audit is described. Approaches to constructing the stratification are distinguished on the basis of professional judgment (qualitative methods), statistical groupings (quantitative methods) and combinatory ones (complex qualitative stratifications). A grouping of accounting information for the purpose of constructing an optimal stratification, together with its criteria, is proposed. The stratification methods are worked out and tested on the example of ABC-analysis.

  4. A Comet Surface Sample Return System Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The proposed Phase II investigation will focus on the development of spacecraft systems required to obtain a sample from the nucleus of a comet, hermetically seal...

  5. Fetal scalp blood sampling during labor

    DEFF Research Database (Denmark)

    Chandraharan, Edwin; Wiberg, Nana

    2014-01-01

    Fetal cardiotocography is characterized by low specificity; therefore, in an attempt to ensure fetal well-being, fetal scalp blood sampling has been recommended by most obstetric societies in the case of a non-reassuring cardiotocography. The scientific agreement on the evidence for using fetal scalp blood sampling to decrease the rate of operative delivery for fetal distress is ambiguous. Based on the same studies, a Cochrane review states that fetal scalp blood sampling increases the rate of instrumental delivery while decreasing neonatal acidosis, whereas the National Institute for Health and Clinical Excellence guideline considers that fetal scalp blood sampling decreases instrumental delivery without differences in other outcome variables. The fetal scalp is supplied by vessels outside the skull below the level of the cranial vault, which is likely to be compressed during contractions...

  6. On Invertible Sampling and Adaptive Security

    DEFF Research Database (Denmark)

    Ishai, Yuval; Kumarasubramanian, Abishek; Orlandi, Claudio

    2011-01-01

    Secure multiparty computation (MPC) is one of the most general and well studied problems in cryptography. We focus on MPC protocols that are required to be secure even when the adversary can adaptively corrupt parties during the protocol, and under the assumption that honest parties cannot reliably erase their secrets. Whether adaptive security is achievable for all functionalities was left open. We provide the first convincing evidence that the answer to this question is negative, namely that some (randomized) functionalities cannot be realized with adaptive security. We obtain this result by studying the following related invertible sampling problem: given an efficient sampling algorithm A, obtain another sampling algorithm B such that the output of B is computationally indistinguishable from the output of A, but B can be efficiently inverted (even if A cannot). This invertible sampling problem is independently motivated by other cryptographic applications.

  7. GeoLab Sample Handling System Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Develop  a robotic sample handling/ manipulator system for the GeoLab glovebox. This work leverages from earlier GeoLab work and a 2012 collaboration with a...

  8. Guam Commercial Fisheries BioSampling (CFBS)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Guam Commercial Fisheries Biosampling program, which collects length and weight frequency data for whole commercial catches, and samples 4-8 species for in-depth...

  9. Silicon based ultrafast optical waveform sampling

    DEFF Research Database (Denmark)

    Ji, Hua; Galili, Michael; Pu, Minhao

    2010-01-01

    A 300 nm × 450 nm × 5 mm silicon nanowire is designed and fabricated for a four-wave-mixing-based non-linear optical gate. Based on this silicon nanowire, an ultra-fast optical sampling system is successfully demonstrated using a free-running fiber laser with a carbon nanotube-based mode-locker as the sampling source. A clear eye-diagram of a 320 Gbit/s data signal is obtained. The temporal resolution of the sampling system is estimated to be 360 fs.

  10. A Comet Surface Sample Return System Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The proposed Phase I investigation will focus on the development of spacecraft systems required to obtain a sample from the nucleus of a comet, hermetically seal the...

  11. Statistical sampling method, used in the audit

    Directory of Open Access Journals (Sweden)

    Gabriela-Felicia UNGUREANU

    2010-05-01

    Full Text Available The rapid increase in the size of U.S. companies from the early twentieth century created the need for audit procedures based on the selection of a part of the total population audited to obtain reliable audit evidence that characterizes the entire population, consisting of account balances or classes of transactions. Sampling is not used only in auditing; it is also used in sample surveys, market analysis and medical research in which someone wants to reach a conclusion about a large number of data by examining only a part of them. The difference is the “population” from which the sample is selected, i.e., the set of data about which a conclusion is to be drawn. Audit sampling applies only to certain types of audit procedures.

  12. Sample preparation in biological mass spectrometry

    CERN Document Server

    Ivanov, Alexander R

    2011-01-01

    The aim of this book is to provide the researcher with important sample preparation strategies for a wide variety of analyte molecules, specimens, methods, and biological applications requiring mass spectrometric analysis as a detection end-point.

  13. ROE Gulf of Mexico Hypoxia Sample Locations

    Data.gov (United States)

    U.S. Environmental Protection Agency — This dataset describes dissolved oxygen levels in the Gulf of Mexico. Individual sampling sites are represented by point data. The background polygon shows areas...

  14. National Sample Survey of Registered Nurses

    Data.gov (United States)

    U.S. Department of Health & Human Services — The National Sample Survey of Registered Nurses (NSSRN) Download makes data from the survey readily available to users in a one-stop download. The Survey has been...

  15. DXC'09 Industrial Track Sample Data

    Data.gov (United States)

    National Aeronautics and Space Administration — Sample data, including nominal and faulty scenarios, for Tier 1 and Tier 2 of the First International Diagnostic Competition. Three file formats are provided,...

  16. Bisphenol A levels in multimedia samples

    Data.gov (United States)

    U.S. Environmental Protection Agency — Levels of bisphenol A in multimedia samples. This dataset is associated with the following publication: Morgan, M., M. Nash, D. Boyd Barr, J. Starr, M. Clifton, and...

  17. Surface sampling concentration and reaction probe

    Science.gov (United States)

    Van Berkel, Gary J; Elnaggar, Mariam S

    2013-07-16

    A method of analyzing a chemical composition of a specimen is described. The method can include providing a probe comprising an outer capillary tube and an inner capillary tube disposed co-axially within the outer capillary tube, where the inner and outer capillary tubes define a solvent capillary and a sampling capillary in fluid communication with one another at a distal end of the probe; contacting a target site on a surface of a specimen with a solvent in fluid communication with the probe; maintaining a plug volume proximate a solvent-specimen interface, wherein the plug volume is in fluid communication with the probe; draining plug sampling fluid from the plug volume through the sampling capillary; and analyzing a chemical composition of the plug sampling fluid with an analytical instrument. A system for performing the method is also described.

  18. NMFS Menhaden Biostatistical (Port Samples) Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Data set consists of port samples of gulf and Atlantic menhaden from the reduction purse-seine fisheries: data include specimen fork length, weight and age (yrs), as...

  19. Optimising uncertainty in physical sample preparation.

    Science.gov (United States)

    Lyn, Jennifer A; Ramsey, Michael H; Damant, Andrew P; Wood, Roger

    2005-11-01

    Uncertainty associated with the result of a measurement can be dominated by the physical sample preparation stage of the measurement process. In view of this, the Optimised Uncertainty (OU) methodology has been further developed to allow the optimisation of the uncertainty from this source, in addition to that from the primary sampling and the subsequent chemical analysis. This new methodology for the optimisation of physical sample preparation uncertainty (u(prep), estimated as s(prep)) is applied for the first time, to a case study of myclobutanil in retail strawberries. An increase in expenditure (+7865%) on the preparatory process was advised in order to reduce the s(prep) by the 69% recommended. This reduction is desirable given the predicted overall saving, under optimised conditions, of 33,000 pounds Sterling per batch. This new methodology has been shown to provide guidance on the appropriate distribution of resources between the three principle stages of a measurement process, including physical sample preparation.

  20. DXC'11 Industrial Track Sample Data

    Data.gov (United States)

    National Aeronautics and Space Administration — The sample scenarios provided here are competition scenarios from Diagnostic Problems I and II of DXC'10. The zip file has a spreadsheet (and pdf) that lists the...

  1. Nitrate Waste Treatment Sampling and Analysis Plan

    Energy Technology Data Exchange (ETDEWEB)

    Vigil-Holterman, Luciana R. [Los Alamos National Laboratory; Martinez, Patrick Thomas [Los Alamos National Laboratory; Garcia, Terrence Kerwin [Los Alamos National Laboratory

    2017-07-05

    This plan is designed to outline the collection and analysis of nitrate salt-bearing waste samples required by the New Mexico Environment Department- Hazardous Waste Bureau in the Los Alamos National Laboratory (LANL) Hazardous Waste Facility Permit (Permit).

  2. Optimizing sampling approaches along ecological gradients

    DEFF Research Database (Denmark)

    Schweiger, Andreas; Irl, Severin D. H.; Steinbauer, Manuel

    2016-01-01

    1. Natural scientists and especially ecologists use manipulative experiments or field observations along gradients to differentiate patterns driven by processes from those caused by random noise. A well-conceived sampling design is essential for identifying, analysing and reporting underlying...

  3. Particle size distribution in ground biological samples.

    Science.gov (United States)

    Koglin, D; Backhaus, F; Schladot, J D

    1997-05-01

    Modern trace and retrospective analysis of Environmental Specimen Bank (ESB) samples requires surplus material prepared and characterized as reference materials. Before the biological samples can be analyzed and stored for long periods at cryogenic temperatures, the materials have to be pre-crushed; a milling and homogenization procedure then follows. For this preparation, a grinding device is cooled with liquid nitrogen to a temperature of −190 °C. A significant condition for homogeneous samples is that at least 90% of the particles should be smaller than 200 microns. In the German ESB the particle size distribution of the processed material is determined by means of a laser particle sizer. The decrease in particle sizes of deer liver and bream muscle after different grinding procedures, as well as the consequences of ultrasonic treatment of the sample before particle size measurements, have been investigated.

  4. Two-stage sampling for acceptance testing

    Energy Technology Data Exchange (ETDEWEB)

    Atwood, C.L.; Bryan, M.F.

    1992-09-01

    Sometimes a regulatory requirement or a quality-assurance procedure sets an allowed maximum on a confidence limit for a mean. If the sample mean of the measurements is below the allowed maximum, but the confidence limit is above it, a very widespread practice is to increase the sample size and recalculate the confidence bound. The confidence level of this two-stage procedure is rarely found correctly, but instead is typically taken to be the nominal confidence level, found as if the final sample size had been specified in advance. In typical settings, the correct nominal α should be between the desired P(Type I error) and half that value. This note gives tables for the correct α to use, some plots of power curves, and an example of correct two-stage sampling.
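
    The inflation described above is easy to see by simulation. A minimal Monte Carlo sketch with known σ and a one-sided z-based bound (all parameters hypothetical, not from the note): with the true mean sitting exactly at the allowed maximum, the naive two-stage procedure accepts noticeably more often than the nominal α = 0.05.

```python
import numpy as np

rng = np.random.default_rng(1)
z = 1.645                      # one-sided 95% confidence bound
limit, sigma = 10.0, 1.0       # allowed maximum and known std. dev.
n1, n2 = 20, 80                # stage-1 size and enlarged final size
trials = 50_000

false_accepts = 0
for _ in range(trials):
    x1 = rng.normal(limit, sigma, n1)          # true mean is at the limit
    m1 = x1.mean()
    if m1 + z * sigma / np.sqrt(n1) <= limit:
        false_accepts += 1                     # accepted at stage 1
    elif m1 < limit:                           # mean below limit, bound above:
        x2 = np.concatenate([x1, rng.normal(limit, sigma, n2 - n1)])
        if x2.mean() + z * sigma / np.sqrt(n2) <= limit:
            false_accepts += 1                 # accepted after naive recompute

rate = false_accepts / trials
print(f"nominal alpha = 0.05, realized Type I error ~ {rate:.3f}")
```

    Each stage alone is a valid 95% bound, but giving the data a second chance pushes the realized error rate above the nominal 5% (to around 8% with these illustrative settings), which is why the note derives a corrected nominal α.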

  6. Designing optimal sampling schemes for field visits

    CSIR Research Space (South Africa)

    Debba, Pravesh

    2008-10-01

    Full Text Available This is a presentation of a statistical method for deriving optimal spatial sampling schemes. The research focuses on ground verification of minerals derived from hyperspectral data. Spectral angle mapper (SAM) and spectral feature fitting (SFF...

  7. Sample Return Systems for Extreme Environments Project

    Data.gov (United States)

    National Aeronautics and Space Administration — In Phase I we were able to demonstrate that sample return missions utilizing high velocity penetrators (0.1- 1 km/s) could provide substantial new capabilities for...

  8. BioSampling Data from LHP Cruises

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This data set includes separate bioSampling logs from each LHP Bottomfishing cruise both within and outside of the Main Hawaiian Islands, as well as a master file...

  9. Guidelines for sampling fish in inland waters

    National Research Council Canada - National Science Library

    Backiel, Tadeusz; Welcomme, R. L

    1980-01-01

    The book is addressed mainly to Fishery Biologists but it is hoped that Fishing Gear Technologists also can acquire some basic knowledge of sampling problems and procedures which, in turn, can result...

  10. Sampling: Making Electronic Discovery More Cost Effective

    Directory of Open Access Journals (Sweden)

    Milton Luoma

    2011-06-01

    Full Text Available With the huge volumes of electronic data subject to discovery in virtually every instance of litigation, time and costs of conducting discovery have become exceedingly important when litigants plan their discovery strategies. Rather than incurring the costs of having lawyers review every document produced in response to a discovery request in search of relevant evidence, a cost effective strategy for document review planning is to use statistical sampling of the database of documents to determine the likelihood of finding relevant evidence by reviewing additional documents. This paper reviews and discusses how sampling can be used to make document review more cost effective by considering issues such as an appropriate sample size, how to develop a sampling strategy, and taking into account the potential value of the litigation in relation to the costs of additional discovery efforts.
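
    For the sample-size question the article raises, the usual starting point is the normal-approximation formula for estimating a proportion, here the prevalence of relevant documents. A minimal sketch with illustrative numbers (the function name and defaults are this sketch's, not the article's):

```python
import math
from statistics import NormalDist

def review_sample_size(margin, confidence=0.95, p=0.5):
    """Number of documents to review so the estimated prevalence of
    relevant material falls within +/- margin of the true rate at the
    given confidence level: n = z^2 * p * (1 - p) / margin^2.
    p = 0.5 is the worst case when the prevalence is unknown."""
    z = NormalDist().inv_cdf(1 - (1 - confidence) / 2)
    return math.ceil(z ** 2 * p * (1 - p) / margin ** 2)

# +/- 5% at 95% confidence, essentially independent of collection size:
print(review_sample_size(0.05))  # 385 documents
```

    Tightening the margin is expensive: ±3% at the same confidence already requires reviewing roughly three times as many documents, which is exactly the cost trade-off the article weighs against the value of the litigation.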

  11. Boson sampling on a photonic chip.

    Science.gov (United States)

    Spring, Justin B; Metcalf, Benjamin J; Humphreys, Peter C; Kolthammer, W Steven; Jin, Xian-Min; Barbieri, Marco; Datta, Animesh; Thomas-Peter, Nicholas; Langford, Nathan K; Kundys, Dmytro; Gates, James C; Smith, Brian J; Smith, Peter G R; Walmsley, Ian A

    2013-02-15

    Although universal quantum computers ideally solve problems such as factoring integers exponentially more efficiently than classical machines, the formidable challenges in building such devices motivate the demonstration of simpler, problem-specific algorithms that still promise a quantum speedup. We constructed a quantum boson-sampling machine (QBSM) to sample the output distribution resulting from the nonclassical interference of photons in an integrated photonic circuit, a problem thought to be exponentially hard to solve classically. Unlike universal quantum computation, boson sampling merely requires indistinguishable photons, linear state evolution, and detectors. We benchmarked our QBSM with three and four photons and analyzed sources of sampling inaccuracy. Scaling up to larger devices could offer the first definitive quantum-enhanced computation.

  12. Hydraulically controlled discrete sampling from open boreholes

    Science.gov (United States)

    Harte, Philip T.

    2013-01-01

    Groundwater sampling from open boreholes in fractured-rock aquifers is particularly challenging because of mixing and dilution of fluid within the borehole from multiple fractures. This note presents an alternative to traditional sampling in open boreholes with packer assemblies. The alternative system, called ZONFLO (zonal flow), is based on hydraulic control of borehole flow conditions. Fluid from discrete fracture zones is hydraulically isolated, allowing for the collection of representative samples. In rough-faced open boreholes and formations with less competent rock, hydraulic containment may offer an attractive alternative to physical containment with packers. Preliminary test results indicate that a discrete zone can be effectively hydraulically isolated from other zones within a borehole for the purpose of groundwater sampling using this new method.

  13. Water Sample Points, Navajo Nation, 2000, USACE

    Data.gov (United States)

    U.S. Environmental Protection Agency — This point shapefile presents the locations and results for water samples collected on the Navajo Nation by the US Army Corps of Engineers (USACE) for the US...

  14. CNMI Commercial Fisheries BioSampling (CFBS)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The main market sampling program in the Commonwealth of the Northern Mariana Islands (CNMI) is the new biosampling program implemented in late 2010 on the island of...

  15. Sampling, Probability Models and Statistical Reasoning Statistical ...

    Indian Academy of Sciences (India)

    Resonance – Journal of Science Education, Volume 1, Issue 5. Sampling, Probability Models and Statistical Reasoning: Statistical Inference. Mohan Delampady and V R Padmawar. General Article, May 1996, pp. 49-58.

  16. Spatial-dependence recurrence sample entropy

    Science.gov (United States)

    Pham, Tuan D.; Yan, Hong

    2018-03-01

    Measuring complexity in terms of the predictability of time series is a major area of research in science and engineering, and its applications are spreading throughout many scientific disciplines; the analysis of physiological signals is perhaps the most widely reported in the literature. Sample entropy is a popular measure for quantifying signal irregularity. However, sample entropy does not take sequential information, which is inherently useful, into account when calculating sample similarity. Here, we develop a method that is based on the mathematical principle of sample entropy and captures the sequential information of a time series through the spatial dependence provided by the binary-level co-occurrence matrix of a recurrence plot. Experimental results on time-series data of the Lorenz system, physiological signals of gait maturation in healthy children, and gait dynamics in Huntington's disease show the potential of the proposed method.
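
    The sample entropy that this record builds on admits a compact implementation. The sketch below is a generic NumPy version of the standard algorithm (template matching under the Chebyshev distance, self-matches excluded); the function name and defaults are illustrative, not taken from the paper, which extends the idea with recurrence-plot co-occurrence information.

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """SampEn(m, r): -log(A / B), where B counts template pairs of
    length m within tolerance r * std(x) (Chebyshev distance) and A
    counts the pairs that still match when extended to length m + 1.
    Generic sketch, not the paper's recurrence-based extension."""
    x = np.asarray(x, dtype=float)
    tol = r * np.std(x)

    def count_matches(length):
        # Overlapping templates of the given length.
        t = np.array([x[i:i + length] for i in range(len(x) - length)])
        count = 0
        for i in range(len(t)):
            # Chebyshev distance to every later template (no self-matches).
            d = np.max(np.abs(t[i + 1:] - t[i]), axis=1)
            count += int(np.sum(d < tol))
        return count

    return -np.log(count_matches(m + 1) / count_matches(m))
```

    A regular signal such as a sine wave yields a lower value than white noise of the same length, which is the sense in which the measure quantifies irregularity.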

  17. Importance Sampling Variance Reduction in GRESS ATMOSIM

    Energy Technology Data Exchange (ETDEWEB)

    Wakeford, Daniel Tyler [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-04-26

    This document is intended to introduce the importance sampling method of variance reduction to a Geant4 user for application to neutral particle Monte Carlo transport through the atmosphere, as implemented in GRESS ATMOSIM.
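
    For readers unfamiliar with the technique, importance sampling reweights draws from a proposal distribution by the likelihood ratio so that rare but important regions are visited often. The toy below is a generic Python/NumPy illustration, estimating a Gaussian tail probability; it is not the GRESS ATMOSIM or Geant4 implementation, and every name in it is invented for the example.

```python
import numpy as np

# Toy importance-sampling estimate of the rare-event probability
# P(X > 4) for X ~ N(0, 1). Naive Monte Carlo almost never sees the
# event; drawing from a shifted proposal N(4, 1) and reweighting by
# the likelihood ratio recovers it with far lower variance.
rng = np.random.default_rng(1)
n = 100_000

def phi(x, mu):
    # Density of N(mu, 1).
    return np.exp(-0.5 * (x - mu) ** 2) / np.sqrt(2 * np.pi)

# Naive estimator: average of the indicator under the nominal distribution.
x_naive = rng.standard_normal(n)
p_naive = np.mean(x_naive > 4)

# Importance sampling: draw from the proposal, weight by p(x) / q(x).
x_is = rng.normal(4.0, 1.0, n)
weights = phi(x_is, 0.0) / phi(x_is, 4.0)
p_is = np.mean((x_is > 4) * weights)
```

    With the same sample budget, the naive estimator sees the event a handful of times at best, while the reweighted estimator concentrates every draw near the region of interest; this is the variance reduction the record refers to.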

  18. Commercial Fisheries Database Biological Sample (CFDBS)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Age and length frequency data for finfish and invertebrate species collected during commercial fishing vessels. Samples are collected by fisheries reporting...

  19. Comparison of metagenomic samples using sequence signatures

    Directory of Open Access Journals (Sweden)

    Jiang Bai

    2012-12-01

    Background: Sequence signatures, as defined by the frequencies of k-tuples (or k-mers, k-grams), have been used extensively to compare genomic sequences of individual organisms, to identify cis-regulatory modules, and to study the evolution of regulatory sequences. Recently, many next-generation sequencing (NGS) read data sets of metagenomic samples from a variety of different environments have been generated. The assembly of these reads can be difficult, and analysis methods based on mapping reads to genes or pathways are also restricted by the availability and completeness of existing databases. Sequence-signature-based methods, however, do not need complete genomes or existing databases and thus can potentially be very useful for the comparison of metagenomic samples using NGS read data. Still, the application of sequence-signature methods to the comparison of metagenomic samples has not been well studied. Results: We studied several dissimilarity measures for the comparison of metagenomic samples using sequence signatures, including d2, d2* and d2S, recently developed by our group; a measure (hereafter denoted Hao) used in CVTree, developed by Hao's group (Qi et al., 2004); measures based on relative di-, tri-, and tetra-nucleotide frequencies as in Willner et al. (2009); as well as standard lp measures between the frequency vectors. We compared their performance using a series of extensive simulations and three real NGS metagenomic datasets: 39 fecal samples from 33 mammalian host species, 56 marine samples across the world, and 13 fecal samples from human individuals. Results showed that the dissimilarity measure d2S can achieve superior performance when comparing metagenomic samples, both in clustering them into different groups and in recovering environmental gradients affecting microbial samples. New insights into the environmental factors affecting microbial compositions in metagenomic samples
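
    Of the measures compared in this record, the plain d2 dissimilarity is the simplest: one minus the cosine similarity of the k-mer count vectors, rescaled to [0, 1]. A minimal sketch for DNA sequences follows; the function names are illustrative, and the d2* and d2S variants, which additionally center the counts by a background model, are not shown.

```python
from collections import Counter
from itertools import product
import math

def kmer_counts(seq, k):
    """Counts of all overlapping k-tuples in a DNA sequence."""
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

def d2_dissimilarity(seq_a, seq_b, k=4):
    """Plain d2 dissimilarity: 0.5 * (1 - cosine similarity) of the
    two k-mer count vectors. 0 means identical signatures, 1 means
    the signatures share no k-mers at all."""
    kmers = [''.join(p) for p in product('ACGT', repeat=k)]
    a, b = kmer_counts(seq_a, k), kmer_counts(seq_b, k)
    x = [a[w] for w in kmers]
    y = [b[w] for w in kmers]
    dot = sum(p * q for p, q in zip(x, y))
    norm = math.sqrt(sum(p * p for p in x)) * math.sqrt(sum(q * q for q in y))
    return 0.5 * (1.0 - dot / norm)
```

    Two sequences with disjoint k-mer inventories sit at the maximum dissimilarity of 0.5 under this convention, while a sequence compared with itself scores 0.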

  20. Streaming Gibbs Sampling for LDA Model

    OpenAIRE

    Gao, Yang; Chen, Jianfei; Zhu, Jun

    2016-01-01

    Streaming variational Bayes (SVB) is successful in learning LDA models in an online manner. However, previous attempts to develop online Monte Carlo methods for LDA have had little success, often yielding much worse perplexity than their batch counterparts. We present a streaming Gibbs sampling (SGS) method, an online extension of collapsed Gibbs sampling (CGS). Our empirical study shows that SGS can reach perplexity similar to CGS, much better than SVB. Our distributed version of SGS,...

  1. Exact sampling hardness of Ising spin models

    Science.gov (United States)

    Fefferman, B.; Foss-Feig, M.; Gorshkov, A. V.

    2017-09-01

    We study the complexity of classically sampling from the output distribution of an Ising spin model, which can be implemented naturally in a variety of atomic, molecular, and optical systems. In particular, we construct a specific example of an Ising Hamiltonian that, after time evolution starting from a trivial initial state, produces a particular output configuration with probability very nearly proportional to the square of the permanent of a matrix with arbitrary integer entries. In a similar spirit to boson sampling, the ability to sample classically from the probability distribution induced by time evolution under this Hamiltonian would imply unlikely complexity theoretic consequences, suggesting that the dynamics of such a spin model cannot be efficiently simulated with a classical computer. Physical Ising spin systems capable of achieving problem-size instances (i.e., qubit numbers) large enough so that classical sampling of the output distribution is classically difficult in practice may be achievable in the near future. Unlike boson sampling, our current results only imply hardness of exact classical sampling, leaving open the important question of whether a much stronger approximate-sampling hardness result holds in this context. The latter is most likely necessary to enable a convincing experimental demonstration of quantum supremacy. As referenced in a recent paper [A. Bouland, L. Mancinska, and X. Zhang, in Proceedings of the 31st Conference on Computational Complexity (CCC 2016), Leibniz International Proceedings in Informatics (Schloss Dagstuhl-Leibniz-Zentrum für Informatik, Dagstuhl, 2016)], our result completes the sampling hardness classification of two-qubit commuting Hamiltonians.
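
    The hardness argument above runs through the matrix permanent, a #P-hard quantity. As a concrete reference point, the standard exact algorithm is Ryser's inclusion-exclusion formula, sketched here in plain Python (illustrative code, not from the paper):

```python
from itertools import combinations

def permanent(a):
    """Permanent of an n x n matrix via Ryser's formula:
    perm(A) = (-1)^n * sum over nonempty column subsets S of
              (-1)^{|S|} * prod_i sum_{j in S} a[i][j].
    Cost is O(2^n * n^2): exponential, as expected for #P-hardness."""
    n = len(a)
    total = 0
    for k in range(1, n + 1):
        for cols in combinations(range(n), k):
            prod = 1
            for row in a:
                prod *= sum(row[j] for j in cols)
            total += (-1) ** k * prod
    return (-1) ** n * total
```

    The 2^n subsets make the cost explicit: even exact evaluation of the target quantity is exponential, which is what lends the sampling-hardness reduction its force.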

  2. Statistical literacy and sample survey results

    Science.gov (United States)

    McAlevey, Lynn; Sullivan, Charles

    2010-10-01

    Sample surveys are widely used in the social sciences and business. The news media almost daily quote from them, yet they are widely misused. Using students with prior managerial experience embarking on an MBA course, we show that common sample survey results are misunderstood even by those managers who have previously done a statistics course. In general, they fare no better than managers who have never studied statistics. There are implications for teaching, especially in business schools, as well as for consulting.

  3. Object Detection with Active Sample Harvesting

    OpenAIRE

    Canévet, Olivier

    2017-01-01

    The work presented in this dissertation lies in the domains of image classification, object detection, and machine learning. Whether training image classifiers or object detectors, the learning phase consists of finding an optimal boundary between populations of samples. In practice, not all samples are equally important: some examples are trivially classified and contribute little to training, while others, close to the boundary or misclassified, are the ones that truly matter. S...

  4. Applicability of passive sampling to groundwater monitoring

    OpenAIRE

    Berho, Catherine; Togola, Anne; Ghestem, Jean Philippe

    2011-01-01

    Passive sampling technology has become increasingly important in the field of environmental monitoring over the past several years, owing to its well-known advantages (low perturbation of the sample, estimation of time-weighted average concentrations, ...). Although passive samplers have been used successfully in a variety of field studies in surface waters, only a few studies have tested their applicability in groundwater. Indeed, groundwater has specific characteristics, such as a low water velocity, which might affe...

  5. Sampling for assurance of future reliability

    Science.gov (United States)

    Klauenberg, Katy; Elster, Clemens

    2017-02-01

    Ensuring measurement trueness, compliance with regulations and conformity with standards are key tasks in metrology which are often considered at the time of an inspection. Current practice does not always verify quality after or between inspections, calibrations, laboratory comparisons, conformity assessments, etc. Statistical models describing behavior over time may ensure reliability, i.e. they may give the probability of functioning, compliance or survival until some future point in time. It may not always be possible or economic to inspect a whole population of measuring devices or other units. Selecting a subset of the population according to statistical sampling plans and inspecting only these units allows conclusions about the quality of the whole population with a certain confidence. Combining these issues of sampling and aging raises questions such as: How many devices need to be inspected, and at least how many of them must conform, so that one can be sure that more than 100p % of the population will comply until the next inspection? This research aims to raise awareness of, and to offer a simple answer to, such time- and sample-based quality statements in metrology and beyond. Reliability demonstration methods, such as the prevailing Weibull binomial model, quantify the confidence in future reliability on the basis of a sample. We adapt the binomial model to be applicable to sampling without replacement and simplify the Weibull model so that sampling plans may be determined on the basis of existing ISO standards. Provided the model is suitable, no additional information and no software are needed; and yet, the consumer is protected against future failure. We establish new sampling plans for utility meter surveillance, which are required by a recent modification of German law. These sampling plans are presented in tables similar to the previous ones, which demonstrates their suitability for everyday use.
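
    The kind of question posed above ("how many devices, allowing how many nonconformities?") reduces, in the binomial model the authors start from, to a search for the smallest sample size meeting a confidence constraint. A minimal sketch, assuming the classical binomial reliability-demonstration setup (sampling with replacement; the paper's without-replacement and Weibull refinements are not shown, and the function name is invented for the example):

```python
import math

def binomial_sampling_plan(reliability, confidence, c=0):
    """Smallest sample size n such that, if at most c of the n
    inspected units fail, one may state with the given confidence that
    population reliability exceeds `reliability`.

    Solves: P(X <= c) <= 1 - confidence, X ~ Binomial(n, 1 - reliability).
    """
    p_fail = 1.0 - reliability
    n = c + 1
    while True:
        accept_prob = sum(
            math.comb(n, i) * p_fail ** i * reliability ** (n - i)
            for i in range(c + 1)
        )
        if accept_prob <= 1.0 - confidence:
            return n
        n += 1
```

    For example, demonstrating 90% reliability with 95% confidence and zero allowed failures requires 29 units, the familiar success-run result; allowing one failure raises the required sample to 46.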

  6. Harpoon-based sample Acquisition System

    Science.gov (United States)

    Bernal, Javier; Nuth, Joseph; Wegel, Donald

    2012-02-01

    Acquiring information about the composition of comets, asteroids, and other near Earth objects is very important because they may contain the primordial ooze of the solar system and the origins of life on Earth. Sending a spacecraft is the obvious answer, but once it gets there it needs to collect and analyze samples. Conceptually, a drill or a shovel would work, but both require something extra to anchor it to the comet, adding to the cost and complexity of the spacecraft. Since comets and asteroids are very low gravity objects, drilling becomes a problem. If you do not provide a grappling mechanism, the drill would push the spacecraft off the surface. Harpoons have been proposed as grappling mechanisms in the past and are currently flying on missions such as ROSETTA. We propose to use a hollow, core sampling harpoon, to act as the anchoring mechanism as well as the sample collecting device. By combining these two functions, mass is reduced, more samples can be collected and the spacecraft can carry more propellant. Although challenging, returning the collected samples to Earth allows them to be analyzed in laboratories with much greater detail than possible on a spacecraft. Also, bringing the samples back to Earth allows future generations to study them.

  7. High-efficiency multiphoton boson sampling

    Science.gov (United States)

    Wang, Hui; He, Yu; Li, Yu-Huai; Su, Zu-En; Li, Bo; Huang, He-Liang; Ding, Xing; Chen, Ming-Cheng; Liu, Chang; Qin, Jian; Li, Jin-Peng; He, Yu-Ming; Schneider, Christian; Kamp, Martin; Peng, Cheng-Zhi; Höfling, Sven; Lu, Chao-Yang; Pan, Jian-Wei

    2017-06-01

    Boson sampling is considered as a strong candidate to demonstrate 'quantum computational supremacy' over classical computers. However, previous proof-of-principle experiments suffered from small photon number and low sampling rates owing to the inefficiencies of the single-photon sources and multiport optical interferometers. Here, we develop two central components for high-performance boson sampling: robust multiphoton interferometers with 99% transmission rate and actively demultiplexed single-photon sources based on a quantum dot-micropillar with simultaneously high efficiency, purity and indistinguishability. We implement and validate three-, four- and five-photon boson sampling, and achieve sampling rates of 4.96 kHz, 151 Hz and 4 Hz, respectively, which are over 24,000 times faster than previous experiments. Our architecture can be scaled up for a larger number of photons and with higher sampling rates to compete with classical computers, and might provide experimental evidence against the extended Church-Turing thesis.

  8. Micro contactor based on isotachophoretic sample transport.

    Science.gov (United States)

    Goet, Gabriele; Baier, Tobias; Hardt, Steffen

    2009-12-21

    It is demonstrated how isotachophoresis (ITP) in a microfluidic device may be utilized to bring two small sample volumes into contact in a well-controlled manner. The ITP contactor serves a similar purpose as micromixers that are designed to mix two species rapidly in a microfluidic channel. In contrast to many micromixers, the ITP contactor does not require complex channel architectures and allows a sample processing in the spirit of "digital microfluidics", i.e. the samples always remain in a compact volume. It is shown that the ITP zone transport through microchannels proceeds in a reproducible and predictable manner, and that the sample trajectories follow simple relationships obtained from Ohm's law. Firstly, the micro contactor can be used to synchronize two ITP zones having reached a channel at different points in time. Secondly, fulfilling its actual purpose it is capable of bringing two samples in molecular contact via an interpenetration of ITP zones. It is demonstrated that the contacting time is proportional to the ITP zone extension. This opens up the possibility of using that type of device as a special type of micromixer with "mixing times" significantly below one second and an option to regulate the duration of contact through specific parameters such as the sample volume. Finally, it is shown how the micro contactor can be utilized to conduct a hybridization reaction between two ITP zones containing complementary DNA strands.

  9. On sampling fractions and electron shower shapes

    Energy Technology Data Exchange (ETDEWEB)

    Peryshkin, Alexander; Raja, Rajendran [Fermilab, Batavia, IL (United States)]

    2011-12-01

    We study the usage of various definitions of sampling fractions in understanding electron shower shapes in a sampling multilayer electromagnetic calorimeter. We show that the sampling fractions obtained by the conventional definition (I) of (average observed energy in layer)/(average deposited energy in layer) will not give the best energy resolution for the calorimeter. The reason for this is shown to be the presence of layer-by-layer correlations in an electromagnetic shower. The best resolution is obtained by minimizing the deviation from the total input energy using a least-squares algorithm. The 'sampling fractions' obtained by this method (II) are shown to give the best resolution for overall energy. We further show that the method (II) sampling fractions are obtained by summing the columns of a non-local λ tensor that incorporates the correlations. We establish that the sampling fractions (II) cannot be used to predict the layer-by-layer energies and that one needs to employ the full λ tensor for this purpose. This effect is again a result of the correlations.
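
    The contrast between the two definitions can be made concrete with a toy linear-calibration example. The simulation below is illustrative only (synthetic correlated layer deposits, not the paper's calorimeter data; all parameter values are invented): method (I) weights each layer by its average sampling fraction, while method (II) fits the weights by least squares against the known input energy, which by construction can do no worse in-sample.

```python
import numpy as np

rng = np.random.default_rng(2)
n_events, n_layers = 5000, 4
e_in = 100.0                                   # known input energy

# Deposited layer energies are negatively correlated layer to layer,
# since each event shares the same total e_in (Dirichlet split).
deposited = e_in * rng.dirichlet([8.0, 6.0, 4.0, 2.0], size=n_events)
f_true = np.array([0.03, 0.025, 0.02, 0.015])  # true sampling fractions
observed = deposited * f_true + rng.normal(0, 0.02, (n_events, n_layers))

# Method (I): per-layer sampling fraction from averages.
f_avg = observed.mean(axis=0) / deposited.mean(axis=0)
recon_i = (observed / f_avg).sum(axis=1)

# Method (II): layer weights from least squares against the input energy.
w, *_ = np.linalg.lstsq(observed, np.full(n_events, e_in), rcond=None)
recon_ii = observed @ w

rms_i = np.sqrt(np.mean((recon_i - e_in) ** 2))
rms_ii = np.sqrt(np.mean((recon_ii - e_in) ** 2))
```

    Because method (I)'s weights are one candidate in the space over which method (II) optimizes, the least-squares reconstruction always achieves an in-sample RMS deviation at most as large, which is the resolution argument of the record in miniature.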

  10. Electrostatic sampling of trace DNA from clothing.

    Science.gov (United States)

    Zieger, Martin; Defaux, Priscille Merciani; Utz, Silvia

    2016-05-01

    During acts of physical aggression, offenders frequently come into contact with clothes of the victim, thereby leaving traces of DNA-bearing biological material on the garments. Since tape-lifting and swabbing, the currently established methods for non-destructive trace DNA sampling from clothing, both have their shortcomings in collection efficiency and handling, we thought about a new collection method for these challenging samples. Testing two readily available electrostatic devices for their potential to sample biological material from garments made of different fabrics, we found one of them, the electrostatic dust print lifter (DPL), to perform comparable to well-established sampling with wet cotton swabs. In simulated aggression scenarios, we had the same success rate for the establishment of single aggressor profiles, suitable for database submission, with both the DPL and wet swabbing. However, we lost a substantial amount of information with electrostatic sampling, since almost no mixed aggressor-victim profiles suitable for database entry could be established, compared to conventional swabbing. This study serves as a proof of principle for electrostatic DNA sampling from items of clothing. The technique still requires optimization before it might be used in real casework. But we are confident that in the future it could be an efficient and convenient contribution to the toolbox of forensic practitioners.

  11. Quota sampling in internet research: practical issues.

    Science.gov (United States)

    Im, Eun-Ok; Chee, Wonshik

    2011-07-01

    Quota sampling has been suggested as a potentially good method for Internet-based research and has been used by several researchers working with Internet samples. However, very little is known about the issues or concerns in using a quota sampling method in Internet research. The purpose of this article was to present the practical issues using quota sampling in an Internet-based study. During the Internet study, the research team recorded all recruitment issues that arose and made written notes indicating the possible reasons for the problems. In addition, biweekly team discussions were conducted for which written records were kept. Overall, quota sampling was effective in ensuring that an adequate number of midlife women were recruited from the targeted ethnic groups. However, during the study process, we encountered the following practical issues using quota sampling: (1) difficulty reaching out to women in lower socioeconomic classes, (2) difficulty ensuring authenticity of participants' identities, (3) participants giving inconsistent answers for the screening questions versus the Internet survey questions, (4) potential problems with a question on socioeconomic status, (5) resentment toward the research project and/or researchers because of rejection, and (6) a longer time and more expense than anticipated.

  12. Automated sampling and control of gaseous simulations

    KAUST Repository

    Huang, Ruoguan

    2013-05-04

    In this work, we describe a method that automates the sampling and control of gaseous fluid simulations. Several recent approaches have provided techniques for artists to generate high-resolution simulations based on a low-resolution simulation. However, often in applications the overall flow in the low-resolution simulation that an animator observes and intends to preserve is composed of even lower frequencies than the low resolution itself. In such cases, attempting to match the low-resolution simulation precisely is unnecessarily restrictive. We propose a new sampling technique to efficiently capture the overall flow of a fluid simulation, at the scale of the user's choice, in such a way that the sampled information is sufficient to represent what is virtually perceived and no more. Thus, by applying control based on the sampled data, we ensure that in the resulting high-resolution simulation, the overall flow is matched to the low-resolution simulation and the fine details on the high resolution are preserved. The samples we obtain have both spatial and temporal continuity that allows smooth keyframe matching and direct manipulation of visible elements such as smoke density through temporal blending of samples. We demonstrate that a user can easily configure a simulation with our system to achieve desired results. © 2013 Springer-Verlag Berlin Heidelberg.

  13. Demystifying Theoretical Sampling in Grounded Theory Research

    Directory of Open Access Journals (Sweden)

    Jenna Breckenridge, BSc (Hons), PhD Candidate

    2009-06-01

    Theoretical sampling is a central tenet of classic grounded theory and is essential to the development and refinement of a theory that is ‘grounded’ in data. While many authors appear to share concurrent definitions of theoretical sampling, the ways in which the process is actually executed remain largely elusive and inconsistent. As such, employing and describing the theoretical sampling process can present a particular challenge to novice researchers embarking upon their first grounded theory study. This article has been written in response to the challenges faced by the first author whilst writing a grounded theory proposal. It is intended to clarify theoretical sampling for new grounded theory researchers, offering some insight into the practicalities of selecting and employing a theoretical sampling strategy. It demonstrates that the credibility of a theory cannot be dissociated from the process by which it has been generated and seeks to encourage and challenge researchers to approach theoretical sampling in a way that is apposite to the core principles of the classic grounded theory methodology.

  14. Collecting Samples in Gale Crater, Mars; an Overview of the Mars Science Laboratory Sample Acquisition, Sample Processing and Handling System

    Science.gov (United States)

    Anderson, R. C.; Jandura, L.; Okon, A. B.; Sunshine, D.; Roumeliotis, C.; Beegle, L. W.; Hurowitz, J.; Kennedy, B.; Limonadi, D.; McCloskey, S.; Robinson, M.; Seybold, C.; Brown, K.

    2012-09-01

    The Mars Science Laboratory Mission (MSL), scheduled to land on Mars in the summer of 2012, consists of a rover and a scientific payload designed to identify and assess the habitability, geological, and environmental histories of Gale crater. Unraveling the geologic history of the region and providing an assessment of present and past habitability requires an evaluation of the physical and chemical characteristics of the landing site; this includes providing an in-depth examination of the chemical and physical properties of Martian regolith and rocks. The MSL Sample Acquisition, Processing, and Handling (SA/SPaH) subsystem will be the first in-situ system designed to acquire interior rock and soil samples from Martian surface materials. These samples are processed and separated into fine particles and distributed to two onboard analytical science instruments SAM (Sample Analysis at Mars Instrument Suite) and CheMin (Chemistry and Mineralogy) or to a sample analysis tray for visual inspection. The SA/SPaH subsystem is also responsible for the placement of the two contact instruments, Alpha Particle X-Ray Spectrometer (APXS), and the Mars Hand Lens Imager (MAHLI), on rock and soil targets. Finally, there is a Dust Removal Tool (DRT) to remove dust particles from rock surfaces for subsequent analysis by the contact and or mast mounted instruments (e.g. Mast Cameras (MastCam) and the Chemistry and Micro-Imaging instruments (ChemCam)).

  15. UNLABELED SELECTED SAMPLES IN FEATURE EXTRACTION FOR CLASSIFICATION OF HYPERSPECTRAL IMAGES WITH LIMITED TRAINING SAMPLES

    Directory of Open Access Journals (Sweden)

    A. Kianisarkaleh

    2015-12-01

    Feature extraction plays a key role in the classification of hyperspectral images. Using unlabeled samples, which are often available in unlimited quantity, unsupervised and semisupervised feature extraction methods show better performance when only a limited number of training samples exists. This paper illustrates the importance of selecting the appropriate unlabeled samples used in feature extraction methods, and proposes a new method for unlabeled sample selection using spectral and spatial information. The proposed method has four parts: PCA, prior classification, posterior classification, and sample selection. As a hyperspectral image passes through these parts, the selected unlabeled samples can be used in arbitrary feature extraction methods. The effectiveness of the proposed unlabeled sample selection in unsupervised and semisupervised feature extraction is demonstrated using two real hyperspectral datasets. Results show that, by selecting appropriate unlabeled samples, the proposed method can improve the performance of feature extraction methods and increase classification accuracy.

  16. Determining the Mineralogy of Lunar Samples Using Micro Raman Spectroscopy: Comparisons Between Polished and Unpolished Samples

    Science.gov (United States)

    Bower, D. M.; Curran, N. M.; Cohen, B. A.

    2017-10-01

    Raman spectroscopy is a versatile non-destructive analytical technique that provides compositional and contextual information for geologic samples, including lunar rocks. We have analyzed a suite of Apollo 16 samples using micro Raman spectroscopy.

  17. Sample results from the interim salt disposition program macrobatch 9 tank 21H qualification samples

    Energy Technology Data Exchange (ETDEWEB)

    Peters, T. B. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2015-11-01

    Savannah River National Laboratory (SRNL) analyzed samples from Tank 21H in support of qualification of Macrobatch (Salt Batch) 9 for the Interim Salt Disposition Program (ISDP). This document reports characterization data on the samples of Tank 21H.

  18. Fluid sample collection and distribution system. [qualitative analysis of aqueous samples from several points

    Science.gov (United States)

    Brooks, R. L. (Inventor)

    1979-01-01

    A multipoint fluid sample collection and distribution system is provided wherein the sample inputs are made through one or more of a number of sampling valves to a progressive cavity pump which is not susceptible to damage by large unfiltered particles. The pump output is through a filter unit that can provide a filtered multipoint sample. An unfiltered multipoint sample is also provided. An effluent sample can be taken and applied to a second progressive cavity pump for pumping to a filter unit that can provide one or more filtered effluent samples. The second pump can also provide an unfiltered effluent sample. Means are provided to periodically back flush each filter unit without shutting off the whole system.

  19. Procedures for sampling and sample reduction within quality assurance systems for solid biofuels

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2005-07-01

    The objective of this experimental study on sampling was to determine the size and number of samples of biofuels required (taken at two sampling points in each case) and to compare two methods of sampling. The first objective of the sample-reduction exercise was to compare the reliability of various sampling methods, and the second objective was to measure the variations introduced as a result of reducing the sample size to form suitable test portions. The materials studied were sawdust, wood chips, wood pellets and bales of straw, and these were analysed for moisture, ash, particle size and chloride. The sampling procedures are described. The study was conducted in Scandinavia. The results of the study were presented in Leipzig in October 2004. The work was carried out as part of the UK's DTI Technology Programme: New and Renewable Energy.

  20. GSAMPLE: Stata module to draw a random sample

    OpenAIRE

    Jann, Ben

    2006-01-01

    gsample draws a random sample from the data in memory. Simple random sampling (SRS) is supported, as well as unequal-probability sampling (UPS), of which sampling with probabilities proportional to size (PPS) is a special case. Both methods, SRS and UPS/PPS, support sampling with replacement and sampling without replacement. Furthermore, stratified sampling and cluster sampling are supported.
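
    As background for this record, the sampling families it names can be sketched in a few lines. The Python below is a generic illustration of SRS without replacement, PPS with replacement, and systematic PPS without replacement; it is not the gsample implementation, and the function names are invented for the example.

```python
import random

def srs_without_replacement(population, n, seed=None):
    """Simple random sample of n units without replacement."""
    rng = random.Random(seed)
    return rng.sample(population, n)

def pps_with_replacement(population, sizes, n, seed=None):
    """Draw n units with probability proportional to size, with replacement."""
    rng = random.Random(seed)
    return rng.choices(population, weights=sizes, k=n)

def pps_systematic(population, sizes, n, seed=None):
    """PPS without replacement via systematic sampling on cumulated sizes.
    Caveat: a unit whose size exceeds the step can be selected twice;
    production tools must handle such 'certainty' units separately."""
    rng = random.Random(seed)
    step = sum(sizes) / n
    points = [rng.uniform(0, step) + i * step for i in range(n)]
    sample, cum, idx = [], 0.0, 0
    points = [points[0] + i * step for i in range(n)]  # evenly spaced marks
    for unit, size in zip(population, sizes):
        cum += size
        while idx < n and points[idx] <= cum:
            sample.append(unit)
            idx += 1
    return sample
```

    With equal sizes, systematic PPS degenerates to an equal-probability systematic sample, so all selected units are distinct; unequal sizes tilt selection toward the larger units, which is the design the record describes.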

  1. Novel Sample-handling Approach for XRD Analysis with Minimal Sample Preparation

    Science.gov (United States)

    Sarrazin, P.; Chipera, S.; Bish, D.; Blake, D.; Feldman, S.; Vaniman, D.; Bryson, C.

    2004-01-01

    Sample preparation and sample handling are among the most critical operations associated with X-ray diffraction (XRD) analysis. These operations require attention in a laboratory environment, but they become a major constraint in the deployment of XRD instruments for robotic planetary exploration. We are developing a novel sample handling system that dramatically relaxes the constraints on sample preparation by allowing characterization of coarse-grained material that would normally be impossible to analyze with conventional powder-XRD techniques.

  2. Influence of sampling depth and post-sampling analysis time on the ...

    African Journals Online (AJOL)

    Bacteriological analysis was carried out for samples taken at water depth and at 1, 6, 12 and 24 hours post-sampling. It was observed that the total and faecal coliform bacteria were significantly higher in the 3 m water depth samples than in the surface water samples (ANOVA, F = 59.41, 26.751, 9.82 (T.C); 46.41, 26.81, ...

  3. Replicating studies in which samples of participants respond to samples of stimuli.

    Science.gov (United States)

    Westfall, Jacob; Judd, Charles M; Kenny, David A

    2015-05-01

    In a direct replication, the typical goal is to reproduce a prior experimental result with a new but comparable sample of participants in a high-powered replication study. Often in psychology, the research to be replicated involves a sample of participants responding to a sample of stimuli. In replicating such studies, we argue that the same criteria should be used in sampling stimuli as are used in sampling participants. Namely, a new but comparable sample of stimuli should be used to ensure that the original results are not due to idiosyncrasies of the original stimulus sample, and the stimulus sample must often be enlarged to ensure high statistical power. In support of the latter point, we discuss the fact that in experiments involving samples of stimuli, statistical power typically does not approach 1 as the number of participants goes to infinity. As an example of the importance of sampling new stimuli, we discuss the bygone literature on the risky shift phenomenon, which was almost entirely based on a single stimulus sample that was later discovered to be highly unrepresentative. We discuss the use of both resampled and expanded stimulus sets, that is, stimulus samples that include the original stimuli plus new stimuli. © The Author(s) 2015.
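
    The claim that power does not approach 1 as the number of participants grows follows from the variance decomposition under crossed random effects. A minimal illustration (textbook formula with symbols chosen for the example, not taken from the paper):

```python
# Sampling variance of a condition mean when both participants (n) and
# stimuli (m) are treated as random samples:
#   Var = var_p / n + var_s / m + var_e / (n * m)
# The stimulus term var_s / m survives no matter how large n becomes.
def var_condition_mean(n_participants, n_stimuli,
                       var_p=1.0, var_s=1.0, var_e=1.0):
    """Variance of a condition-mean estimate under crossed random
    effects for participants (var_p), stimuli (var_s) and residual
    error (var_e)."""
    return (var_p / n_participants
            + var_s / n_stimuli
            + var_e / (n_participants * n_stimuli))
```

    With var_s = 1 and 20 stimuli, the variance can never fall below 1/20 regardless of how many participants are added, so enlarging the stimulus sample is the only remedy, which is precisely the authors' argument for resampled and expanded stimulus sets.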

  4. Single versus duplicate blood samples in ACTH stimulated adrenal vein sampling

    NARCIS (Netherlands)

    Dekkers, T.; Arntz, M.; Wilt, G.J. van der; Schultze Kool, L.J.; Sweep, F.C.; Hermus, A.R.M.M.; Lenders, J.W.M.; Deinum, J.

    2013-01-01

    BACKGROUND: Adrenal vein sampling (AVS) is the preferred test for subtyping primary aldosteronism. However, the procedure is technically demanding and costly. In AVS it is common practice to take duplicate blood samples at each location. In this paper we explore whether a single sample procedure

  5. Perpendicular distance sampling: an alternative method for sampling downed coarse woody debris

    Science.gov (United States)

    Michael S. Williams; Jeffrey H. Gove

    2003-01-01

    Coarse woody debris (CWD) plays an important role in many forest ecosystem processes. In recent years, a number of new methods have been proposed to sample CWD. These methods select individual logs into the sample using some form of unequal probability sampling. One concern with most of these methods is the difficulty in estimating the volume of each log. A new method...

  6. Why sampling scheme matters: the effect of sampling scheme on landscape genetic results

    Science.gov (United States)

    Michael K. Schwartz; Kevin S. McKelvey

    2008-01-01

    There has been a recent trend in genetic studies of wild populations where researchers have changed their sampling schemes from sampling pre-defined populations to sampling individuals uniformly across landscapes. This reflects the fact that many species under study are continuously distributed rather than clumped into obvious "populations". Once individual...

  7. Vibronic Boson Sampling: Generalized Gaussian Boson Sampling for Molecular Vibronic Spectra at Finite Temperature.

    Science.gov (United States)

    Huh, Joonsuk; Yung, Man-Hong

    2017-08-07

    Molecular vibronic spectroscopy, where the transitions involve non-trivial Bosonic correlation due to the Duschinsky Rotation, is strongly believed to be in a similar complexity class as Boson Sampling. At finite temperature, the problem is represented as a Boson Sampling experiment with correlated Gaussian input states. This molecular problem with temperature effect is intimately related to the various versions of Boson Sampling, which share a similar computational complexity. Here we provide a full description of this relation in the context of Gaussian Boson Sampling. We find a hierarchical structure, which illustrates the relationship among various Boson Sampling schemes. Specifically, we show that every instance of Gaussian Boson Sampling with an initial correlation can be simulated by an instance of Gaussian Boson Sampling without initial correlation, with only a polynomial overhead. Since every Gaussian state is associated with a thermal state, our result implies that every sampling problem in molecular vibronic transitions, at any temperature, can be simulated by Gaussian Boson Sampling associated with a product of vacuum modes. We refer to such a generalized Gaussian Boson Sampling motivated by the molecular sampling problem as Vibronic Boson Sampling.

  8. Sample Size for Measuring Grammaticality in Preschool Children from Picture-Elicited Language Samples

    Science.gov (United States)

    Eisenberg, Sarita L.; Guo, Ling-Yu

    2015-01-01

    Purpose: The purpose of this study was to investigate whether a shorter language sample elicited with fewer pictures (i.e., 7) would yield a percent grammatical utterances (PGU) score similar to that computed from a longer language sample elicited with 15 pictures for 3-year-old children. Method: Language samples were elicited by asking forty…

  9. Lunar Samples: Apollo Collection Tools, Curation Handling, Surveyor III and Soviet Luna Samples

    Science.gov (United States)

    Allton, J.H.

    2009-01-01

    The 6 Apollo missions that landed on the lunar surface returned 2196 samples with a total mass of 382 kg. The 58 samples weighing 21.5 kg collected on Apollo 11 expanded to 741 samples weighing 110.5 kg by the time of Apollo 17. The main goal on Apollo 11 was to obtain some material and return it safely to Earth. As we gained experience, the sampling tools and a more specific sampling strategy evolved. A summary of the sample types returned is shown in Table 1. By 1989, some statistics on allocation by sample type were compiled [2]. The "scientific interest index" is based on the assumption that the more allocations per gram of sample, the higher the scientific interest. It is basically a reflection of the amount of diversity within a given sample type. Samples were also set aside for biohazard testing. The samples set aside and used for biohazard testing were representative, as opposed to diverse. They tended to be larger and be comprised of less scientifically valuable material, such as dust and debris in the bottom of sample containers.

  10. Systematic sampling of discrete and continuous populations: sample selection and the choice of estimator

    Science.gov (United States)

    Harry T. Valentine; David L. R. Affleck; Timothy G. Gregoire

    2009-01-01

    Systematic sampling is easy, efficient, and widely used, though it is not generally recognized that a systematic sample may be drawn from the population of interest with or without restrictions on randomization. The restrictions or the lack of them determine which estimators are unbiased, when using the sampling design as the basis for inference. We describe the...
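    A minimal sketch of the randomization point (illustrative code, not from the paper): with a random start, the 1-in-k systematic sample mean is design-unbiased for the population mean even when the population has a trend, because averaging over the k equally likely starts recovers the population mean exactly.

```python
import random

def systematic_sample(population, k):
    """Draw a 1-in-k systematic sample with a random start in [0, k)."""
    start = random.randrange(k)
    return population[start::k]

# Toy population with a linear trend -- the classic case where the
# presence or absence of randomization matters for unbiasedness.
pop = [float(i) for i in range(100)]

# Design-unbiasedness: the sample mean, averaged over all k possible
# random starts, equals the population mean (49.5 here).
means = [sum(pop[s::10]) / 10 for s in range(10)]
expected_estimate = sum(means) / 10
```

A fixed (unrandomized) start would instead pin the estimate to one of the individual `means`, which for this trending population range from 45.0 to 54.0.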

  11. Biological Sterilization of Returned Mars Samples

    Science.gov (United States)

    Allen, C. C.; Albert, F. G.; Combie, J.; Bodnar, R. J.; Hamilton, V. E.; Jolliff, B. L.; Kuebler, K.; Wang, A.; Lindstrom, D. J.; Morris, P. A.

    1999-01-01

    Martian rock and soil, collected by robotic spacecraft, will be returned to terrestrial laboratories early in the next century. Current plans call for the samples to be immediately placed into biological containment and tested for signs of present or past life and biological hazards. It is recommended that "Controlled distribution of unsterilized materials from Mars should occur only if rigorous analyses determine that the materials do not constitute a biological hazard. If any portion of the sample is removed from containment prior to completion of these analyses it should first be sterilized." While sterilization of Mars samples may not be required, an acceptable method must be available before the samples are returned to Earth. The sterilization method should be capable of destroying a wide range of organisms with minimal effects on the geologic samples. A variety of biological sterilization techniques and materials are currently in use, including dry heat, high pressure steam, gases, plasmas and ionizing radiation. Gamma radiation is routinely used to inactivate viruses and destroy bacteria in medical research. Many commercial sterilizers use Co-60 , which emits gamma photons of 1.17 and 1.33 MeV. Absorbed doses of approximately 1 Mrad (10(exp 8) ergs/g) destroy most bacteria. This study investigates the effects of lethal doses of Co-60 gamma radiation on materials similar to those anticipated to be returned from Mars. The goals are to determine the gamma dose required to kill microorganisms in rock and soil samples and to determine the effects of gamma sterilization on the samples' isotopic, chemical and physical properties. Additional information is contained in the original extended abstract.

  12. Predicting sample size required for classification performance

    Directory of Open Access Journals (Sweden)

    Figueroa Rosa L

    2012-02-01

    Full Text Available Abstract Background Supervised learning methods need annotated data in order to generate efficient models. Annotated data, however, is a relatively scarce resource and can be expensive to obtain. For both passive and active learning methods, there is a need to estimate the size of the annotated sample required to reach a performance target. Methods We designed and implemented a method that fits an inverse power law model to points of a given learning curve created using a small annotated training set. Fitting is carried out using nonlinear weighted least squares optimization. The fitted model is then used to predict the classifier's performance and confidence interval for larger sample sizes. For evaluation, the nonlinear weighted curve fitting method was applied to a set of learning curves generated using clinical text and waveform classification tasks with active and passive sampling methods, and predictions were validated using standard goodness of fit measures. As a control we used an un-weighted fitting method. Results A total of 568 models were fitted and the model predictions were compared with the observed performances. Depending on the data set and sampling method, it took between 80 and 560 annotated samples to achieve mean average and root mean squared error below 0.01. Results also show that our weighted fitting method outperformed the baseline un-weighted method (p Conclusions This paper describes a simple and effective sample size prediction algorithm that conducts weighted fitting of learning curves. The algorithm outperformed an un-weighted algorithm described in previous literature. It can help researchers determine annotation sample size for supervised machine learning.
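    The fitting step can be sketched with standard-library code. The three-parameter inverse power law y = a - b·x^(-c) is a common form for such learning curves; the data points, weights, and grid-search optimizer below are illustrative stand-ins for the paper's nonlinear weighted least squares procedure.

```python
def fit_inverse_power(xs, ys, weights):
    """Fit y ~ a - b * x**(-c) by weighted least squares.

    For each candidate exponent c the model is linear in (a, -b), so we
    solve a 2x2 weighted normal system and keep the c with the smallest
    weighted SSE.  (A coarse grid search stands in for the nonlinear
    optimizer used in the paper.)
    """
    best = None
    for c in [i / 100 for i in range(1, 201)]:
        zs = [x ** (-c) for x in xs]
        sw = sum(weights)
        swz = sum(w * z for w, z in zip(weights, zs))
        swzz = sum(w * z * z for w, z in zip(weights, zs))
        swy = sum(w * y for w, y in zip(weights, ys))
        swzy = sum(w * z * y for w, z, y in zip(weights, zs, ys))
        det = sw * swzz - swz * swz
        if abs(det) < 1e-12:
            continue
        # Solve [sw, swz; swz, swzz] @ [a, nb] = [swy, swzy], where nb = -b.
        a = (swy * swzz - swzy * swz) / det
        nb = (sw * swzy - swz * swy) / det
        sse = sum(w * (y - (a + nb * z)) ** 2
                  for w, z, y in zip(weights, zs, ys))
        if best is None or sse < best[0]:
            best = (sse, a, -nb, c)
    _, a, b, c = best
    return a, b, c

# Hypothetical learning-curve points; later points get higher weight,
# mimicking the paper's weighted fitting of the curve's tail.
xs = [50, 100, 200, 400, 800]
ys = [0.70, 0.78, 0.84, 0.88, 0.905]
ws = [1.0, 2.0, 3.0, 4.0, 5.0]
a, b, c = fit_inverse_power(xs, ys, ws)
# a estimates asymptotic performance; extrapolate to a larger sample:
pred_1600 = a - b * 1600 ** (-c)
```

The extrapolated value `pred_1600` is what lets a researcher ask how many more annotated samples are needed before the curve flattens near its asymptote `a`.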

  13. Blast sampling for structural and functional analyses.

    Science.gov (United States)

    Friedrich, Anne; Ripp, Raymond; Garnier, Nicolas; Bettler, Emmanuel; Deléage, Gilbert; Poch, Olivier; Moulinier, Luc

    2007-02-23

    The post-genomic era is characterised by a torrent of biological information flooding the public databases. As a direct consequence, similarity searches starting with a single query sequence frequently lead to the identification of hundreds, or even thousands of potential homologues. The huge volume of data renders the subsequent structural, functional and evolutionary analyses very difficult. It is therefore essential to develop new strategies for efficient sampling of this large sequence space, in order to reduce the number of sequences to be processed. At the same time, it is important to retain the most pertinent sequences for structural and functional studies. An exhaustive analysis on a large scale test set (284 protein families) was performed to compare the efficiency of four different sampling methods aimed at selecting the most pertinent sequences. These four methods sample the proteins detected by BlastP searches and can be divided into two categories: two customisable methods where the user defines either the maximal number or the percentage of sequences to be selected; two automatic methods in which the number of sequences selected is determined by the program. We focused our analysis on the potential information content of the sampled sets of sequences using multiple alignment of complete sequences as the main validation tool. The study considered two criteria: the total number of sequences in BlastP and their associated E-values. The subsequent analyses investigated the influence of the sampling methods on the E-value distributions, the sequence coverage, the final multiple alignment quality and the active site characterisation at various residue conservation thresholds as a function of these criteria. The comparative analysis of the four sampling methods allows us to propose a suitable sampling strategy that significantly reduces the number of homologous sequences required for alignment, while at the same time maintaining the relevant information

  14. Blast sampling for structural and functional analyses

    Directory of Open Access Journals (Sweden)

    Friedrich Anne

    2007-02-01

    Full Text Available Abstract Background The post-genomic era is characterised by a torrent of biological information flooding the public databases. As a direct consequence, similarity searches starting with a single query sequence frequently lead to the identification of hundreds, or even thousands of potential homologues. The huge volume of data renders the subsequent structural, functional and evolutionary analyses very difficult. It is therefore essential to develop new strategies for efficient sampling of this large sequence space, in order to reduce the number of sequences to be processed. At the same time, it is important to retain the most pertinent sequences for structural and functional studies. Results An exhaustive analysis on a large scale test set (284 protein families) was performed to compare the efficiency of four different sampling methods aimed at selecting the most pertinent sequences. These four methods sample the proteins detected by BlastP searches and can be divided into two categories: two customisable methods where the user defines either the maximal number or the percentage of sequences to be selected; two automatic methods in which the number of sequences selected is determined by the program. We focused our analysis on the potential information content of the sampled sets of sequences using multiple alignment of complete sequences as the main validation tool. The study considered two criteria: the total number of sequences in BlastP and their associated E-values. The subsequent analyses investigated the influence of the sampling methods on the E-value distributions, the sequence coverage, the final multiple alignment quality and the active site characterisation at various residue conservation thresholds as a function of these criteria. 
Conclusion The comparative analysis of the four sampling methods allows us to propose a suitable sampling strategy that significantly reduces the number of homologous sequences required for alignment, while

  15. [Staphylococcus aureus in bulk milk samples].

    Science.gov (United States)

    Benda, P; Vyletĕlová, M

    1995-07-01

    In the years 1993-1994 the occurrence of Staphylococcus aureus was investigated in bulk milk samples in the area where a Baby Food Factory at Zábreh in Moravia is located, and in Bruntál, Zlín and Policka districts. Evaluation of the results was based on EEC Directive 92/46, while the dynamics of S. aureus presence was followed for the whole period of observation as well as in the particular seasons. A total of 4,485 samples was processed. Out of these, 50.7% contained less than 100 CFU/ml of S. aureus, 41.4% contained 100-500 CFU/ml, 6.73% 500-2,000 CFU/ml and 1.14% contained more than 2,000 CFU/ml (Fig. 1). The samples were divided into three categories: newly established private farms, cooperative and State-owned enterprises in the area of the Zábreh Factory, and others (Zlín, Bruntál and Policka districts). There were highly significant differences in the content of staphylococci (P = 0.01%) between the three categories of samples. Ninety-eight percent of samples from private farms, 96% of samples from the Zábreh Factory area and 85% of the other samples comply with EEC Directive 92/46 (Tab. I) for raw cow's milk for the manufacture of products "made with raw milk" whose manufacturing process does not involve any heat treatment (Fig. 2). The occurrence of S. aureus in the Zábreh Factory area shows pronounced seasonal dynamics (P = 0.005%) with maximum values in winter months (December-March) and minimum values in summer months (July-October) (Fig. 3). The same relationship can be seen on more extensive data files for the particular producers (Fig. 4).(ABSTRACT TRUNCATED AT 250 WORDS)

  16. Preservative solution for skeletal muscle biopsy samples

    Directory of Open Access Journals (Sweden)

    Yasemin Gulcan Kurt

    2015-01-01

    Full Text Available Context: Muscle biopsy samples must be frozen with liquid nitrogen immediately after excision and maintained at -80 °C until analysis. Because of this requirement for tissue processing, patients with neuromuscular diseases often have to travel to centers with on-site muscle pathology laboratories for muscle biopsy sample excision to ensure that samples are properly preserved. Aim: Here, we developed a preservative solution and examined its protectiveness on striated muscle tissues for a minimum of the length of time that would be required to reach a specific muscle pathology laboratory. Materials and Methods: A preservative solution called Kurt-Ozcan (KO) solution was prepared. Eight healthy Sprague-Dawley rats were sacrificed; striated muscle tissue samples were collected and divided into six different groups. Muscle tissue samples were separated into groups for morphological, enzyme histochemical, molecular, and biochemical analysis. Statistical methods used: chi-square and Kruskal-Wallis tests. Results: Samples kept in the KO and University of Wisconsin (UW) solutions exhibited very good morphological scores at 3, 6, and 18 hours, but artificial changes were observed at 24 hours. Similar findings were observed for the evaluated enzyme activities. There were no differences between the control group and the samples kept in the KO or UW solution at 3, 6, and 18 hours for morphological, enzyme histochemical, and biochemical features. The messenger ribonucleic acid (mRNA) of the β-actin gene was protected for up to 6 hours in the KO and UW solutions. Conclusion: The KO solution protects the morphological, enzyme histochemical, and biochemical features of striated muscle tissue of healthy rats for 18 hours and preserves the mRNA for 6 hours.

  17. Network Sampling with Memory: A proposal for more efficient sampling from social networks

    Science.gov (United States)

    Mouw, Ted; Verdery, Ashton M.

    2013-01-01

    Techniques for sampling from networks have grown into an important area of research across several fields. For sociologists, the possibility of sampling from a network is appealing for two reasons: (1) A network sample can yield substantively interesting data about network structures and social interactions, and (2) it is useful in situations where study populations are difficult or impossible to survey with traditional sampling approaches because of the lack of a sampling frame. Despite its appeal, methodological concerns about the precision and accuracy of network-based sampling methods remain. In particular, recent research has shown that sampling from a network using a random walk based approach such as Respondent Driven Sampling (RDS) can result in high design effects (DE)—the ratio of the sampling variance to the sampling variance of simple random sampling (SRS). A high design effect means that more cases must be collected to achieve the same level of precision as SRS. In this paper we propose an alternative strategy, Network Sampling with Memory (NSM), which collects network data from respondents in order to reduce design effects and, correspondingly, the number of interviews needed to achieve a given level of statistical power. NSM combines a “List” mode, where all individuals on the revealed network list are sampled with the same cumulative probability, with a “Search” mode, which gives priority to bridge nodes connecting the current sample to unexplored parts of the network. We test the relative efficiency of NSM compared to RDS and SRS on 162 school and university networks from Add Health and Facebook that range in size from 110 to 16,278 nodes. The results show that the average design effect for NSM on these 162 networks is 1.16, which is very close to the efficiency of a simple random sample (DE=1), and 98.5% lower than the average DE we observed for RDS. PMID:24159246

  18. Network Sampling with Memory: A proposal for more efficient sampling from social networks.

    Science.gov (United States)

    Mouw, Ted; Verdery, Ashton M

    2012-08-01

    Techniques for sampling from networks have grown into an important area of research across several fields. For sociologists, the possibility of sampling from a network is appealing for two reasons: (1) A network sample can yield substantively interesting data about network structures and social interactions, and (2) it is useful in situations where study populations are difficult or impossible to survey with traditional sampling approaches because of the lack of a sampling frame. Despite its appeal, methodological concerns about the precision and accuracy of network-based sampling methods remain. In particular, recent research has shown that sampling from a network using a random walk based approach such as Respondent Driven Sampling (RDS) can result in high design effects (DE)-the ratio of the sampling variance to the sampling variance of simple random sampling (SRS). A high design effect means that more cases must be collected to achieve the same level of precision as SRS. In this paper we propose an alternative strategy, Network Sampling with Memory (NSM), which collects network data from respondents in order to reduce design effects and, correspondingly, the number of interviews needed to achieve a given level of statistical power. NSM combines a "List" mode, where all individuals on the revealed network list are sampled with the same cumulative probability, with a "Search" mode, which gives priority to bridge nodes connecting the current sample to unexplored parts of the network. We test the relative efficiency of NSM compared to RDS and SRS on 162 school and university networks from Add Health and Facebook that range in size from 110 to 16,278 nodes. The results show that the average design effect for NSM on these 162 networks is 1.16, which is very close to the efficiency of a simple random sample (DE=1), and 98.5% lower than the average DE we observed for RDS.
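    The design-effect idea can be illustrated with a toy Monte Carlo simulation. This is not the NSM algorithm, just a hedged sketch of why chain-referral samples inflate variance: a community-level random walk that rarely leaves its community produces correlated observations, so the variance of its sample mean, and hence the DE, sits well above the SRS baseline of 1.

```python
import random

random.seed(7)

# Toy population: two homophilous communities with different trait means.
community = [0] * 500 + [1] * 500
value = [random.gauss(0.0, 1.0) + 2.0 * c for c in community]

def walk_sample(n, p_stay=0.95):
    """Community-level random walk: usually resample within the current
    community, occasionally cross over (a crude stand-in for RDS)."""
    idx = random.randrange(1000)
    out = []
    for _ in range(n):
        out.append(value[idx])
        if random.random() < p_stay:
            base = 0 if community[idx] == 0 else 500
        else:
            base = 500 if community[idx] == 0 else 0
        idx = base + random.randrange(500)
    return out

def srs_sample(n):
    return [value[i] for i in random.sample(range(1000), n)]

def var_of_mean(sampler, reps=2000, n=50):
    """Monte Carlo estimate of the sampling variance of the sample mean."""
    means = [sum(s) / n for s in (sampler(n) for _ in range(reps))]
    grand = sum(means) / reps
    return sum((m - grand) ** 2 for m in means) / (reps - 1)

# DE = Var(mean under the walk design) / Var(mean under SRS)
de = var_of_mean(walk_sample) / var_of_mean(srs_sample)
```

Because a 50-step walk mostly stays in one community, its sample mean is dominated by between-community variance, and `de` comes out far above 1; NSM's "List" and "Search" modes are designed precisely to push this ratio back toward 1.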

  19. Oral processing of two milk chocolate samples.

    Science.gov (United States)

    Carvalho-da-Silva, Ana Margarida; Van Damme, Isabella; Taylor, Will; Hort, Joanne; Wolf, Bettina

    2013-02-26

    Oral processing of two milk chocolates, identical in composition and viscosity, was investigated to understand the textural behaviour. Previous studies had shown differences in mouthcoating and related attributes such as time of clearance from the oral cavity to be most discriminating between the samples. Properties of panellists' saliva, with regard to protein concentration and profile before and after eating the two chocolates, were included in the analysis but did not reveal any correlation with texture perception. The microstructure of the chocolate samples following oral processing, which resembled an emulsion as the chocolate phase inverts in-mouth, was clearly different and the sample that was found to be more mouthcoating appeared less flocculated after 20 chews. The differences in flocculation behaviour were mirrored in the volume based particle size distributions acquired with a laser diffraction particle size analyser. The less mouthcoating and more flocculated sample showed a clear bimodal size distribution with peaks at around 40 and 500 μm, for 10 and 20 chews, compared to a smaller and then diminishing second peak for the other sample following 10 and 20 chews, respectively. The corresponding mean particle diameters after 20 chews were 184 ± 23 and 141 ± 10 μm for the less and more mouthcoating samples, respectively. Also, more of the mouthcoating sample had melted after both 10 and 20 chews (80 ± 8% compared to 72 ± 10% for 20 chews). Finally, the friction behaviour between a soft and hard surface (elastopolymer/steel) and at in-mouth temperature was investigated using a commercial tribology attachment on a rotational rheometer. Complex material behaviour was revealed. Observations included an unusual increase in friction coefficient at very low sliding speeds, initially overlapping for both samples, to a threefold higher value for the more mouthcoating sample. This was followed by a commonly observed decrease in friction coefficient with

  20. Experimental and Sampling Design for the INL-2 Sample Collection Operational Test

    Energy Technology Data Exchange (ETDEWEB)

    Piepel, Gregory F.; Amidan, Brett G.; Matzke, Brett D.

    2009-02-16

    This report describes the experimental and sampling design developed to assess sampling approaches and methods for detecting contamination in a building and clearing the building for use after decontamination. An Idaho National Laboratory (INL) building will be contaminated with BG (Bacillus globigii, renamed Bacillus atrophaeus), a simulant for Bacillus anthracis (BA). The contamination, sampling, decontamination, and re-sampling will occur per the experimental and sampling design. This INL-2 Sample Collection Operational Test is being planned by the Validated Sampling Plan Working Group (VSPWG). The primary objectives are: 1) Evaluate judgmental and probabilistic sampling for characterization as well as probabilistic and combined (judgment and probabilistic) sampling approaches for clearance, 2) Conduct these evaluations for gradient contamination (from low or moderate down to absent or undetectable) for different initial concentrations of the contaminant, 3) Explore judgment composite sampling approaches to reduce sample numbers, 4) Collect baseline data to serve as an indication of the actual levels of contamination in the tests. A combined judgmental and random (CJR) approach uses Bayesian methodology to combine judgmental and probabilistic samples to make clearance statements of the form "X% confidence that at least Y% of an area does not contain detectable contamination” (X%/Y% clearance statements). The INL-2 experimental design has five test events, which 1) vary the floor of the INL building on which the contaminant will be released, 2) provide for varying the amount of contaminant released to obtain desired concentration gradients, and 3) investigate overt as well as covert release of contaminants. Desirable contaminant gradients would have moderate to low concentrations of contaminant in rooms near the release point, with concentrations down to zero in other rooms. Such gradients would provide a range of contamination levels to challenge the sampling
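    The X%/Y% clearance statement has a simple closed form in the purely probabilistic case. This is a hedged sketch: the report's CJR method additionally folds judgmental samples into the Bayesian prior, which is omitted here. With a uniform Beta(1,1) prior on the contaminated fraction and n random samples all negative, the posterior is Beta(1, n+1), and the confidence that at least a fraction Y of the area is clean is 1 - Y^(n+1):

```python
import math

def clearance_confidence(n_negative, y_fraction):
    """P(clean fraction >= Y | n negative random samples).

    With theta ~ Beta(1, n+1) as the posterior on the contaminated
    fraction, P(theta <= 1 - Y) = 1 - Y**(n+1).
    """
    return 1.0 - y_fraction ** (n_negative + 1)

def samples_needed(x_conf, y_fraction):
    """Smallest n of all-negative samples achieving X% / Y% clearance."""
    n = math.ceil(math.log(1.0 - x_conf) / math.log(y_fraction)) - 1
    while clearance_confidence(n, y_fraction) < x_conf:
        n += 1
    return n

# "95% confidence that at least 99% of the area is uncontaminated":
n = samples_needed(0.95, 0.99)
```

Under these simplifying assumptions, roughly 300 all-negative random samples are needed for a 95%/99% statement, which is why the report explores judgmental and composite sampling to reduce sample numbers.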

  1. Getting DNA copy numbers without control samples.

    Science.gov (United States)

    Ortiz-Estevez, Maria; Aramburu, Ander; Rubio, Angel

    2012-08-16

    The selection of the reference to scale the data in a copy number analysis has paramount importance to achieve accurate estimates. Usually this reference is generated using control samples included in the study. However, these control samples are not always available and in these cases, an artificial reference must be created. A proper generation of this signal is crucial in terms of both noise and bias. We propose NSA (Normality Search Algorithm), a scaling method that works with and without control samples. It is based on the assumption that genomic regions enriched in SNPs with identical copy numbers in both alleles are likely to be normal. These normal regions are predicted for each sample individually and used to calculate the final reference signal. NSA can be applied to any CN data regardless of the microarray technology and preprocessing method. It also finds an optimal weighting of the samples minimizing possible batch effects. Five human datasets (a subset of HapMap samples, Glioblastoma Multiforme (GBM), Ovarian, Prostate and Lung Cancer experiments) have been analyzed. It is shown that using only tumoral samples, NSA is able to remove the bias in the copy number estimation, to reduce the noise and therefore, to increase the ability to detect copy number aberrations (CNAs). These improvements allow NSA to also detect recurrent aberrations more accurately than other state of the art methods. NSA provides a robust and accurate reference for scaling probe signals data to CN values without the need of control samples. It minimizes the problems of bias, noise and batch effects in the estimation of CNs. Therefore, the NSA scaling approach helps to better detect recurrent CNAs than current methods. The automatic selection of references makes it useful to perform bulk analysis of many GEO or ArrayExpress experiments without the need of developing a parser to find the normal samples or possible batches within the data. 
The method is available in the open-source R package

  2. Getting DNA copy numbers without control samples

    Directory of Open Access Journals (Sweden)

    Ortiz-Estevez Maria

    2012-08-01

    Full Text Available Abstract Background The selection of the reference to scale the data in a copy number analysis has paramount importance to achieve accurate estimates. Usually this reference is generated using control samples included in the study. However, these control samples are not always available and in these cases, an artificial reference must be created. A proper generation of this signal is crucial in terms of both noise and bias. We propose NSA (Normality Search Algorithm), a scaling method that works with and without control samples. It is based on the assumption that genomic regions enriched in SNPs with identical copy numbers in both alleles are likely to be normal. These normal regions are predicted for each sample individually and used to calculate the final reference signal. NSA can be applied to any CN data regardless of the microarray technology and preprocessing method. It also finds an optimal weighting of the samples minimizing possible batch effects. Results Five human datasets (a subset of HapMap samples, Glioblastoma Multiforme (GBM), Ovarian, Prostate and Lung Cancer experiments) have been analyzed. It is shown that using only tumoral samples, NSA is able to remove the bias in the copy number estimation, to reduce the noise and therefore, to increase the ability to detect copy number aberrations (CNAs). These improvements allow NSA to also detect recurrent aberrations more accurately than other state of the art methods. Conclusions NSA provides a robust and accurate reference for scaling probe signals data to CN values without the need of control samples. It minimizes the problems of bias, noise and batch effects in the estimation of CNs. Therefore, the NSA scaling approach helps to better detect recurrent CNAs than current methods. 
The automatic selection of references makes it useful to perform bulk analysis of many GEO or ArrayExpress experiments without the need of developing a parser to find the normal samples or possible batches within the

  3. Importance of sampling frequency when collecting diatoms

    Science.gov (United States)

    Wu, Naicheng; Faber, Claas; Sun, Xiuming; Qu, Yueming; Wang, Chao; Ivetic, Snjezana; Riis, Tenna; Ulrich, Uta; Fohrer, Nicola

    2016-11-01

    There has been increasing interest in diatom-based bio-assessment but we still lack a comprehensive understanding of how to capture diatoms’ temporal dynamics with an appropriate sampling frequency (ASF). To cover this research gap, we collected and analyzed daily riverine diatom samples over a 1-year period (25 April 2013-30 April 2014) at the outlet of a German lowland river. The samples were classified into five clusters (1-5) by a Kohonen Self-Organizing Map (SOM) method based on similarity between species compositions over time. ASFs were determined to be 25 days at Cluster 2 (June-July 2013) and 13 days at Cluster 5 (February-April 2014), whereas no specific ASFs were found at Cluster 1 (April-May 2013), 3 (August-November 2013) (>30 days) and Cluster 4 (December 2013 - January 2014) (<1 day). ASFs showed dramatic seasonality and were negatively related to hydrological wetness conditions, suggesting that sampling interval should be reduced with increasing catchment wetness. A key implication of our findings for freshwater management is that long-term bio-monitoring protocols should be developed with the knowledge of tracking algal temporal dynamics with an appropriate sampling frequency.

  4. Raman spectroscopy of selected carbonaceous samples

    Energy Technology Data Exchange (ETDEWEB)

    Kwiecinska, Barbara [University of Science and Technology-AGH, Faculty of Geology, Geophysics and Environmental Protection, Krakow (Poland); Suarez-Ruiz, Isabel [Instituto Nacional del Carbon, (INCAR-CSIC), Oviedo (Spain); Paluszkiewicz, Czeslawa [University of Science and Technology-AGH, Faculty of Materials Science and Technology, Krakow (Poland); Rodriques, Sandra [Universidade do Porto, Faculdade de Ciencias, Dept. de Geologia (Portugal)

    2010-12-01

    This paper presents the results of Raman spectra measured on carbonaceous materials ranging from greenschist-facies to granulite-facies graphite (anchimetamorphism and epimetamorphism zones). Raman spectroscopy has come to be regarded as a more appropriate tool than X-ray diffraction for the study of highly ordered carbon materials, including chondritic matter, soot, polycyclic aromatic hydrocarbons and evolved coal samples. This work demonstrates the usefulness of Raman spectroscopy analysis in determining internal crystallographic structure (disordered lattice, heterogeneity). Moreover, this methodology permits the detection of differences within the meta-anthracite, semi-graphite and graphite stages for the samples included in this study. In the first-order Raman spectra, the bands located near ca. 1350 cm⁻¹ (defect and disorder mode, A1g) and 1580 cm⁻¹ (in-plane E2g zone-centre mode) contribute to the characterization and determination of the degree of structural evolution and graphitization of the carbonaceous samples. The data from Raman spectroscopy were compared with parameters obtained by means of structural, chemical and optical microscopic analysis carried out on the same carbonaceous samples. The results revealed some positive and significant relationships, although the use of reflectance as a parameter for following the increase in structural order in natural graphitized samples was subject to limitations. (author)

  5. Statistical Symbolic Execution with Informed Sampling

    Science.gov (United States)

    Filieri, Antonio; Pasareanu, Corina S.; Visser, Willem; Geldenhuys, Jaco

    2014-01-01

    Symbolic execution techniques have been proposed recently for the probabilistic analysis of programs. These techniques seek to quantify the likelihood of reaching program events of interest, e.g., assert violations. They have many promising applications but have scalability issues due to high computational demand. To address this challenge, we propose a statistical symbolic execution technique that performs Monte Carlo sampling of the symbolic program paths and uses the obtained information for Bayesian estimation and hypothesis testing with respect to the probability of reaching the target events. To speed up the convergence of the statistical analysis, we propose Informed Sampling, an iterative symbolic execution that first explores the paths that have high statistical significance, prunes them from the state space, and guides the execution towards less likely paths. The technique combines Bayesian estimation with a partial exact analysis for the pruned paths, leading to provably improved convergence of the statistical analysis. We have implemented statistical symbolic execution with informed sampling in the Symbolic PathFinder tool. We show experimentally that informed sampling obtains more precise results and converges faster than a purely statistical analysis, and may also be more efficient than an exact symbolic analysis. When the latter does not terminate, symbolic execution with informed sampling can give meaningful results under the same time and memory limits.
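The estimation step described above can be illustrated with a minimal, hypothetical sketch in plain Python (not Symbolic PathFinder): symbolic paths are replaced here by random draws of a program input, and the hit count updates a conjugate Beta posterior for the probability of reaching the target event.

```python
import random

def mc_event_probability(predicate, sample_input, n_samples, prior=(1.0, 1.0), seed=0):
    """Monte Carlo sampling plus Bayesian estimation: hits are Bernoulli
    trials, so a Beta(a, b) prior yields the closed-form posterior
    Beta(a + hits, b + misses) for the event probability."""
    rng = random.Random(seed)
    hits = sum(1 for _ in range(n_samples) if predicate(sample_input(rng)))
    a, b = prior
    a_post, b_post = a + hits, b + (n_samples - hits)
    return hits, (a_post, b_post), a_post / (a_post + b_post)

# Toy "target event": x * x > 0.25 for x uniform on [0, 1]; true probability 0.5.
hits, posterior, estimate = mc_event_probability(
    predicate=lambda x: x * x > 0.25,
    sample_input=lambda rng: rng.random(),
    n_samples=10000,
)
```

The posterior also supports the hypothesis tests mentioned in the abstract, e.g. checking whether the reach probability exceeds a threshold with a given credibility.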

  6. Importance of sampling frequency when collecting diatoms

    KAUST Repository

    Wu, Naicheng

    2016-11-14

    There has been increasing interest in diatom-based bio-assessment, but we still lack a comprehensive understanding of how to capture diatoms’ temporal dynamics with an appropriate sampling frequency (ASF). To fill this research gap, we collected and analyzed daily riverine diatom samples over a 1-year period (25 April 2013–30 April 2014) at the outlet of a German lowland river. The samples were classified into five clusters (1–5) by a Kohonen Self-Organizing Map (SOM) method based on similarity between species compositions over time. ASFs were determined to be 25 days at Cluster 2 (June–July 2013) and 13 days at Cluster 5 (February–April 2014), whereas no specific ASFs were found at Cluster 1 (April–May 2013) or Cluster 3 (August–November 2013) (>30 days), or at Cluster 4 (December 2013–January 2014) (<1 day). ASFs showed dramatic seasonality and were negatively related to hydrological wetness conditions, suggesting that the sampling interval should be reduced with increasing catchment wetness. A key implication of our findings for freshwater management is that long-term bio-monitoring protocols should be developed with the knowledge of tracking algal temporal dynamics with an appropriate sampling frequency.

  7. Wilsonville wastewater sampling program. Final report

    Energy Technology Data Exchange (ETDEWEB)

    None

    1983-10-01

    As part of its contract to design, build and operate the SRC-1 Demonstration Plant in cooperation with the US Department of Energy (DOE), International Coal Refining Company (ICRC) was required to collect and evaluate data related to wastewater streams and wastewater treatment procedures at the SRC-1 Pilot Plant facility. The pilot plant is located at Wilsonville, Alabama and is operated by Catalytic, Inc. under the direction of Southern Company Services. The plant is funded in part by the Electric Power Research Institute and the DOE. ICRC contracted with Catalytic, Inc. to conduct wastewater sampling. Tasks 1 through 5 included sampling and analysis of various wastewater sources and points at different steps in the biological treatment facility at the plant. The sampling program ran from May 1 to July 31, 1982. Also included in the sampling program was the generation and analysis of leachate from SRC product using standard laboratory leaching procedures. For Task 6, available plant wastewater data covering the period from February 1978 to December 1981 were analyzed to gain information that might be useful for a demonstration plant design basis. This report contains a tabulation of the analytical data, a summary tabulation of the historical operating data that were evaluated, and comments concerning the data. The procedures used during the sampling program are also documented.

  8. Amchitka Island, Alaska, special sampling project 1997

    Energy Technology Data Exchange (ETDEWEB)

    U.S. Department of Energy, Nevada Operations Office

    2000-06-28

    This 1997 special sampling project represents a special radiobiological sampling effort to augment the 1996 Long-Term Hydrological Monitoring Program (LTHMP) for Amchitka Island in Alaska. Lying in the western portion of the Aleutian Islands arc, near the International Date Line, Amchitka Island is one of the southernmost islands of the Rat Island Chain. Between 1965 and 1971, the U.S. Atomic Energy Commission conducted three underground nuclear tests on Amchitka Island. In 1996, Greenpeace collected biota samples and speculated that several long-lived, man-made radionuclides detected (i.e., americium-241, plutonium-239 and -240, beryllium-7, and cesium-137) leaked into the surface environment from underground cavities created during the testing. The nuclides of interest are detected at extremely low concentrations throughout the environment. The objectives of this special sampling project were to scientifically refute the Greenpeace conclusions that the underground cavities were leaking contaminants to the surface. This was achieved by first confirming the presence of these radionuclides in the Amchitka Island surface environment and, second, if the radionuclides were present, determining if the source is the underground cavity or worldwide fallout. This special sampling and analysis determined that the only nonfallout-related radionuclide detected was a low level of tritium from the Long Shot test, which had been previously documented. The tritium contamination is monitored and continues a decreasing trend due to radioactive decay and dilution.

  9. Modeling abundance using hierarchical distance sampling

    Science.gov (United States)

    Royle, Andy; Kery, Marc

    2016-01-01

    In this chapter, we provide an introduction to classical distance sampling ideas for point and line transect data, and for continuous and binned distance data. We introduce the conditional and the full likelihood, and we discuss Bayesian analysis of these models in BUGS using the idea of data augmentation, which we discussed in Chapter 7. We then extend the basic ideas to the problem of hierarchical distance sampling (HDS), where we have multiple point or transect sample units in space (or possibly in time). The benefit of HDS in practice is that it allows us to directly model spatial variation in population size among these sample units. This is a preeminent concern of most field studies that use distance sampling methods, but it is not a problem that has received much attention in the literature. We show how to analyze HDS models in both the unmarked package and in the BUGS language for point and line transects, and for continuous and binned distance data. We provide a case study of HDS applied to a survey of the island scrub-jay on Santa Cruz Island, California.
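The detection-function machinery that HDS builds on can be sketched with the common half-normal model (an illustrative sketch, not the authors' BUGS/unmarked code; sigma and the truncation distance are made-up values).

```python
import math

def halfnormal_detection(d, sigma):
    """Half-normal detection function g(d) = exp(-d^2 / (2 sigma^2)):
    the probability of detecting an animal at perpendicular distance d."""
    return math.exp(-d * d / (2.0 * sigma * sigma))

def average_detection_prob(truncation, sigma, n_grid=10000):
    """Average detection probability over [0, truncation] for a line
    transect, approximated by a midpoint Riemann sum of g(d)."""
    h = truncation / n_grid
    total = sum(halfnormal_detection((i + 0.5) * h, sigma) for i in range(n_grid))
    return total * h / truncation

# With sigma = 40 m and a 100 m truncation distance, roughly half the
# animals in the covered strip are detected; a raw count n is then
# corrected to an abundance estimate as n / p per sample unit.
p = average_detection_prob(truncation=100.0, sigma=40.0)
```

In the hierarchical setting, a separate abundance model (e.g. Poisson with covariates) is layered on top of this per-unit correction, which is what allows spatial variation among sample units to be modeled directly.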

  10. Sampling approaches for extensive surveys in nematology.

    Science.gov (United States)

    Prot, J C; Ferris, H

    1992-12-01

    Extensive surveys of the frequency and abundance of plant-parasitic nematodes over large geographic areas provide useful data of unknown reliability. Time, cost, and logistical constraints may limit the sampling intensity that can be invested at any survey site. We developed a computer program to evaluate the probability of detection and the reliability of population estimates obtained by different strategies for collecting one sample of 10 cores from a field. We used data from two fields that had been sampled systematically and extensively as the basis for our analyses. Our analyses indicate that, at least for those two fields, it is possible to have a high probability of detecting the presence of nematode species and to reliably estimate abundance, with a single 10-core soil sample from a field. When species were rare or not uniformly distributed in a field, the probability of detection and reliability of the population estimate were correlated with the distance between core removal sites. Increasing the prescribed distance between cores resulted in the composite sample representing a wider range of microenvironments in the field.
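The probability-of-detection calculation can be sketched with a toy simulation (an assumption-laden illustration, not the authors' program: cores are treated as independent here, ignoring the spatial-distance effects the study emphasizes).

```python
import random

def detection_probability(p_core, n_cores, n_trials=20000, seed=1):
    """Simulate the chance that a composite sample of n_cores cores detects
    a nematode species, when each core independently hits an infested spot
    with probability p_core.  Analytically: 1 - (1 - p_core)**n_cores."""
    rng = random.Random(seed)
    detected = sum(
        any(rng.random() < p_core for _ in range(n_cores))
        for _ in range(n_trials)
    )
    return detected / n_trials

# A species present in 20% of potential core locations is detected by a
# single 10-core composite sample with high probability.
sim = detection_probability(p_core=0.2, n_cores=10)
exact = 1 - (1 - 0.2) ** 10
```

Spatial aggregation of nematodes makes cores positively correlated, which lowers detection below this independent-core bound; prescribing a minimum distance between cores, as the abstract notes, pushes the composite sample back toward independence.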

  11. Energy Preserved Sampling for Compressed Sensing MRI

    Directory of Open Access Journals (Sweden)

    Yudong Zhang

    2014-01-01

    The sampling patterns, cost functions, and reconstruction algorithms play important roles in optimizing compressed sensing magnetic resonance imaging (CS-MRI). Simple random sampling patterns do not take into account the energy distribution in k-space and result in suboptimal reconstruction of MR images. Therefore, a variety of variable density (VD) based sampling patterns have been developed. To further improve on these, we propose a novel energy-preserving sampling (ePRESS) method. In addition, we improve the cost function by introducing phase correction and a region-of-support matrix, and we propose an iterative thresholding algorithm (ITA) to solve the improved cost function. We evaluate the proposed ePRESS sampling method, improved cost function, and ITA reconstruction algorithm on a 2D digital phantom and 2D in vivo MR brains of healthy volunteers. These assessments demonstrate that the proposed ePRESS method performs better than VD, POWER, and BKO; the improved cost function achieves better reconstruction quality than the conventional cost function; and the ITA is faster than SISTA and competitive with FISTA in terms of computation time.
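The variable-density idea can be sketched in one dimension (a hypothetical illustration, not the ePRESS method; the decay exponent and fractions are arbitrary choices): fully sample a low-frequency core of k-space and keep outer lines with a probability that decays with distance from the centre.

```python
import random

def variable_density_mask(n, center_frac=0.1, decay=2.0, outer_frac=0.35, seed=0):
    """1-D variable-density k-space mask: lines within the central
    `center_frac` band (low frequencies, where MR image energy
    concentrates) are always kept; outer lines are kept with probability
    outer_frac * (1 - d)**decay, where d is the normalized distance
    from the centre."""
    rng = random.Random(seed)
    centre = n // 2
    half_keep = max(1, int(n * center_frac / 2))
    mask = []
    for i in range(n):
        d = abs(i - centre) / (n / 2)
        if abs(i - centre) <= half_keep:
            mask.append(True)   # fully sampled low-frequency core
        else:
            mask.append(rng.random() < outer_frac * (1.0 - d) ** decay)
    return mask

mask = variable_density_mask(256)
```

The reconstruction side (cost function plus iterative thresholding) then recovers the image from the subset of k-space lines selected by such a mask.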

  12. Relevance of sample preparation for flow cytometry.

    Science.gov (United States)

    Muccio, V E; Saraci, E; Gilestro, M; Oddolo, D; Ruggeri, M; Caltagirone, S; Bruno, B; Boccadoro, M; Omedè, P

    2017-10-06

    Flow cytometry is a useful tool for diagnosis and minimal residual disease (MRD) study of hematological diseases. Standard sample preparation protocols are characterized by stain-lyse-wash (SLW). To prevent nonspecific binding and achieve high sensitivity in MRD studies, lyse-wash-stain-wash (LWSW) is required. To our knowledge, no comparison between the two methods has been performed. We compared mean fluorescence intensity (MFI), stain index, signal-to-noise ratio, and percentage of positive cells for 104 antibodies and for 13 selected antibodies tested in 10 samples prepared simultaneously with the two methods. MFI and percentages of positive cells obtained by the two methods did not show significant differences and showed a very high correlation. Stain index and signal-to-noise ratio presented higher values for kappa and lambda surface chains in LWSW samples and a trend of higher values for the other antibodies in SLW samples. We suggest using the LWSW method also at diagnosis, to obtain more comparable antibody intensity expressions when samples from the same patient are processed for MRD evaluation after bulk lysis. Moreover, LWSW can prevent nonspecific binding, shows no differences in the identification and quantitation of the populations of interest, and reduces acquisition of cell debris. © 2017 John Wiley & Sons Ltd.

  13. Large sample neutron activation analysis avoids representative sub-sampling and sample preparation difficulties : An added value for forensic analysis

    NARCIS (Netherlands)

    Bode, P.; Romanò, Sabrina; Romolo, Francesco Saverio

    2017-01-01

    A crucial part of any chemical analysis is the degree of representativeness of the measurand(s) in the test portion for the same measurands in the object, originally collected for investigation. Such an object usually may have either to be homogenized and sub-sampled, or digested/dissolved. Any

  14. Monitoring of Extraction Efficiency by a Sample Process Control Virus Added Immediately Upon Sample Receipt.

    Science.gov (United States)

    Ruhanya, Vurayai; Diez-Valcarce, Marta; D'Agostino, Martin; Cook, Nigel; Hernández, Marta; Rodríguez-Lázaro, David

    2015-12-01

    When analysing food samples for enteric viruses, a sample process control virus (SPCV) must be added at the commencement of the analytical procedure, to verify that the analysis has been performed correctly. Samples can on occasion arrive at the laboratory late in the working day or week. The analyst may consequently have insufficient time to commence and complete the complex procedure, and the samples must consequently be stored. To maintain the validity of the analytical result, it will be necessary to consider storage as part of the process, and the analytical procedure as commencing on sample receipt. The aim of this study was to verify that an SPCV can be recovered after sample storage, and thus indicate the effective recovery of enteric viruses. Two types of samples (fresh and frozen raspberries) and two types of storage (refrigerated and frozen) were studied using Mengovirus vMC0 as SPCV. SPCV recovery was not significantly different (P > 0.5) regardless of sample type or duration of storage (up to 14 days at -20 °C). Accordingly, samples can be stored without a significant effect on the performance of the analysis. The results of this study should assist the analyst by demonstrating that they can verify that viruses can be extracted from food samples even if samples have been stored.

  15. Laser-matter Interaction with Submerged Samples

    Energy Technology Data Exchange (ETDEWEB)

    Mariella, R; Rubenchik, A; Norton, M; Donohue, G; Roberts, K

    2010-03-25

    With the long-term goal in mind of investigating whether one could design a 'universal solid-sample comminution technique' for debris and rubble, we have studied pulsed-laser ablation of solid samples contained within a surrounding fluid. Using pulses with energies between 0.3 J and 2 J, wavelengths of 351 and 527 nm, and samples of rock, concrete, and red brick, each submerged in water, we have observed conditions in which μm-scale particles can be preferentially generated in a controlled manner during the laser ablation process. Others have studied laser peening of metals, where their attention has been on the substrate. Our study uses non-metallic substrates and analyzes the particles that are ablated in the process. The immediate impact of our investigation is that the laser-comminution portion of a new systems concept for chemical analysis has been verified as feasible.

  16. Geometric and Texture Inpainting by Gibbs Sampling

    DEFF Research Database (Denmark)

    Gustafsson, David Karl John; Pedersen, Kim Steenstrup; Nielsen, Mads

    2007-01-01

    This paper discusses a method suitable for inpainting both large-scale geometric structures and more stochastic texture components. Image inpainting concerns the problem of reconstructing the intensity contents inside regions of missing data. In this paper we use the well-known FRAME (Filters, Random Fields and Maximum Entropy) model for inpainting. We introduce a temperature term in the learned FRAME Gibbs distribution. By sampling using different temperatures in the FRAME Gibbs distribution, different contents of the image are reconstructed. We propose a two-step method for inpainting using FRAME: first the geometric structure of the image is reconstructed by sampling from a cooled Gibbs distribution, then the stochastic component is reconstructed by sampling from a heated Gibbs distribution. Both steps in the reconstruction process are necessary.
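The role of the temperature term can be illustrated with a toy, hypothetical sketch (not the FRAME model itself): sampling from a tempered distribution p_T(x) ∝ exp(log p(x) / T) over three abstract states shows how a cooled distribution locks onto the dominant structure while a heated one reproduces stochastic variation.

```python
import math
import random

def tempered_sample(log_p, states, temperature, rng):
    """Draw one state from p_T(x) ∝ exp(log_p(x) / T).  Low T concentrates
    probability mass on the mode; high T flattens the distribution."""
    weights = [math.exp(log_p(s) / temperature) for s in states]
    r = rng.random() * sum(weights)
    acc = 0.0
    for s, w in zip(states, weights):
        acc += w
        if r <= acc:
            return s
    return states[-1]

rng = random.Random(7)
states = [0, 1, 2]
log_p = lambda s: math.log([0.7, 0.2, 0.1][s])  # toy base distribution
cold = [tempered_sample(log_p, states, 0.1, rng) for _ in range(1000)]   # "geometry" pass
hot = [tempered_sample(log_p, states, 10.0, rng) for _ in range(1000)]   # "texture" pass
```

The cold pass almost always returns the modal state, mirroring the deterministic-looking geometric reconstruction; the hot pass spreads across all states, mirroring the stochastic texture fill.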

  17. Rapid surface sampling and archival record system

    Energy Technology Data Exchange (ETDEWEB)

    Barren, E.; Penney, C.M.; Sheldon, R.B. [GE Corporate Research and Development Center, Schenectady, NY (United States)]; and others

    1995-10-01

    A number of contamination sites exist in this country where the area and volume of material to be remediated are very large, approaching or exceeding 10⁶ m² and 10⁶ m³. Typically, only a small fraction of this material is actually contaminated. In such cases there is a strong economic motivation to test the material with a sufficient density of measurements to identify which portions are uncontaminated, so that they can either be left in place or be disposed of as uncontaminated waste. Unfortunately, since contamination often varies rapidly from position to position, this procedure can involve upwards of one million measurements per site. The situation is complicated further in many cases by the difficulties of sampling porous surfaces, such as concrete. This report describes a method for sampling concrete in which an immediate distinction can be made between contaminated and uncontaminated surfaces. Sample acquisition and analysis will be automated.

  18. Regression Estimator Using Double Ranked Set Sampling

    Directory of Open Access Journals (Sweden)

    Hani M. Samawi

    2002-06-01

    The performance of a regression estimator based on the double ranked set sample (DRSS) scheme, introduced by Al-Saleh and Al-Kadiri (2000), is investigated when the mean of the auxiliary variable X is unknown. Our primary analysis and simulation indicate that using the DRSS regression estimator for estimating the population mean substantially increases relative efficiency compared to the regression estimators based on simple random sampling (SRS) or ranked set sampling (RSS) (Yu and Lam, 1997). Moreover, the regression estimator using DRSS is also more efficient than the naïve estimators of the population mean using SRS, RSS (when the correlation coefficient is at least 0.4), and DRSS (when the correlation coefficient is at least 0.91). The theory is illustrated using a real data set of trees.

  19. Sampling bias on cup anemometer mean winds

    Energy Technology Data Exchange (ETDEWEB)

    Kristensen, L.; Hansen, O.F.; Hoejstrup, J. [Risoe National Laboratory, Roskilde (Denmark)

    2003-07-01

    The cup anemometer signal can be sampled in several ways to obtain the mean wind speed. Here we discuss the sampling of series of mean wind speeds from consecutive rotor rotations, followed by unweighted and weighted averaging. It is shown that the unweighted averaging creates a positive bias on the long-term mean wind speed, which is at least one order of magnitude larger than the positive bias from the weighted averaging, also known as the sample-and-hold method. For a homogeneous, neutrally stratified flow the first biases are 1%-2%. For comparison the biases due to fluctuations of the three wind velocity components and due to calibration non-linearity are determined under the same conditions. The largest of these is the v-bias from direction fluctuations. The calculations pertain to the Risoe P2546A model cup anemometer. (author)
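The origin of the unweighted-averaging bias can be sketched with a toy simulation (an illustration, not the Risoe P2546A analysis): because the number of rotations per unit time scales with wind speed, an unweighted per-rotation mean estimates E[v²]/E[v] rather than the time mean E[v].

```python
import random

def rotation_sampled_means(time_series):
    """In each equal-length time step the rotor completes a number of
    rotations proportional to the speed v, so a per-rotation series
    over-represents high speeds: its unweighted mean estimates
    E[v^2]/E[v], while weighting each rotation sample by its duration
    (∝ 1/v), as in sample-and-hold, recovers the time mean E[v]."""
    unweighted = sum(v * v for v in time_series) / sum(time_series)
    time_mean = sum(time_series) / len(time_series)
    return unweighted, time_mean

rng = random.Random(0)
speeds = [max(0.5, rng.gauss(8.0, 1.5)) for _ in range(50000)]
biased, true_mean = rotation_sampled_means(speeds)
relative_bias = (biased - true_mean) / true_mean  # ≈ (sigma/mean)^2 ≈ 3.5%
```

The relative bias equals the squared turbulence intensity in this toy model, consistent in magnitude with the 1%-2% figures quoted for neutrally stratified flow.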

  20. Nicotine Contamination in Particulate Matter Sampling

    Directory of Open Access Journals (Sweden)

    Eric Garshick

    2009-02-01

    We have addressed potential contamination of PM2.5 filter samples by nicotine from cigarette smoke. We collected two nicotine samples: one nicotine sampling filter was placed in-line after the collection of PM2.5 and the other stood alone. The overall correlation between the two nicotine filter levels was 0.99. The nicotine collected on the “stand-alone” filter was slightly greater than that on the “in-line” filter (mean difference = 1.10 μg/m³), but the difference was statistically significant only when PM2.5 was low (≤50 μg/m³). It is therefore important to account for personal and secondhand smoke exposure when assessing occupational and environmental PM.

  1. On small sample experiments in neuroimaging

    DEFF Research Database (Denmark)

    Goutte, Cyril; Hansen, Lars Kai

    1998-01-01

    Most human brain imaging experiments involve a number of subjects that is unusually low by accepted statistical standards. Although there are a number of practical reasons for using small samples in neuroimaging, we need to face the question of whether results obtained with only a few subjects will generalise to a larger population. In this contribution we address this issue using a Bayesian framework, derive confidence intervals for small-sample experiments, and discuss the issue of the prior.
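Under a normal likelihood with the Jeffreys prior, the Bayesian credible interval for the mean coincides with the classical t-interval; this can be sketched as follows (an illustration with hypothetical effect sizes, not the authors' derivation; the t quantile is supplied by hand since the Python standard library has no t-distribution quantile function).

```python
import statistics

def small_sample_ci(data, t_crit):
    """95% interval for the mean: with a normal likelihood and the
    Jeffreys prior, the posterior of the mean is a Student-t centred at
    the sample mean, so the Bayesian credible interval equals the
    classical t-interval.  t_crit is the two-sided 97.5% t quantile for
    df = len(data) - 1."""
    n = len(data)
    m = statistics.mean(data)
    se = statistics.stdev(data) / n ** 0.5
    return m - t_crit * se, m + t_crit * se

# Five hypothetical subject-level effect sizes; df = 4, t_0.975 ≈ 2.776.
lo, hi = small_sample_ci([1.2, 0.8, 1.5, 0.9, 1.1], t_crit=2.776)
```

The heavy t tails (t_0.975 ≈ 2.776 for df = 4, versus 1.96 for a large sample) are exactly the penalty for the small number of subjects; a more informative prior would tighten the interval, which is why the choice of prior matters here.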

  2. Rapid Automated Sample Preparation for Biological Assays

    Energy Technology Data Exchange (ETDEWEB)

    Shusteff, M

    2011-03-04

    Our technology utilizes acoustic, thermal, and electric fields to separate out contaminants such as debris or pollen from environmental samples, lyse open cells, and extract the DNA from the lysate. The objective of the project is to optimize the system described for a forensic sample, and demonstrate its performance for integration with downstream assay platforms (e.g. MIT-LL's ANDE). We intend to increase the quantity of DNA recovered from the sample beyond the approximately 80% currently achieved using solid phase extraction methods. Task 1: Develop and test an acoustic filter for cell extraction. Task 2: Develop and test lysis chip. Task 3: Develop and test DNA extraction chip. All chips have been fabricated based on the designs laid out in last month's report.

  3. Standard guide for sampling radioactive tank waste

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2011-01-01

    1.1 This guide addresses techniques used to obtain grab samples from tanks containing high-level radioactive waste created during the reprocessing of spent nuclear fuels. Guidance on selecting appropriate sampling devices for waste covered by the Resource Conservation and Recovery Act (RCRA) is also provided by the United States Environmental Protection Agency (EPA) (1). Vapor sampling of the head-space is not included in this guide because it does not significantly affect slurry retrieval, pipeline transport, plugging, or mixing. 1.2 The values stated in inch-pound units are to be regarded as standard. No other units of measurement are included in this standard. 1.3 This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of the user of this standard to establish appropriate safety and health practices and determine the applicability of regulatory limitations prior to use.

  4. Personality and Frailty: Evidence From Four Samples.

    Science.gov (United States)

    Stephan, Yannick; Sutin, Angelina R; Canada, Brice; Terracciano, Antonio

    2017-02-01

    Frailty is a prevalent geriatric syndrome. Little is known about the psychological factors associated with this syndrome. Based on four large samples of older adults aged 65 to 104 years, the present study examined whether personality traits are related to frailty. High neuroticism, low conscientiousness, low extraversion, low openness, and low agreeableness were related to higher frailty across samples. Longitudinal analysis conducted in one sample revealed that high neuroticism was associated with worsening frailty over an 8-year period. Higher frailty at baseline and over time was related to maladaptive personality changes. This study extends existing knowledge on the link between personality and health in older adults by identifying the personality traits associated with frailty, a complex geriatric syndrome.

  5. Exact and Efficient Sampling of Conditioned Walks

    Science.gov (United States)

    Adorisio, Matteo; Pezzotta, Alberto; de Mulatier, Clélia; Micheletti, Cristian; Celani, Antonio

    2017-11-01

    A computationally challenging and open problem is how to efficiently generate equilibrated samples of conditioned walks. We present here a general stochastic approach that allows one to produce these samples with their correct statistical weight and without rejections. The method is illustrated for a jump process conditioned to evolve within a cylindrical channel and forced to reach one of its ends. We obtain analytically the exact probability density function of the jumps and offer a direct method for gathering equilibrated samples of a random walk conditioned to stay in a channel with suitable boundary conditions. Unbiased walks of arbitrary length can thus be generated with linear computational complexity—even when the channel width is much smaller than the typical bond length of the unconditioned walk. By profiling the metric properties of the generated walks for various bond lengths we characterize the crossover between weak and strong confinement regimes in great detail.

  6. Pressure Stimulated Currents (PSC) in marble samples

    Directory of Open Access Journals (Sweden)

    F. Vallianatos

    2004-06-01

    The electrical behaviour of marble samples from Penteli Mountain was studied while they were subjected to uniaxial stress. The application of consecutive impulsive variations of uniaxial stress to thirty connatural samples produced Pressure Stimulated Currents (PSC). The linear relationship between the recorded PSC and the applied variation rate was investigated. The main results are the following: as long as the samples were under pressure corresponding to their elastic region, the maximum PSC value obeyed a linear law with respect to pressure variation. In the plastic region, deviations were observed which were due to variations of Young's modulus. Furthermore, a special burst form of PSC recordings during failure is presented. The latter is emitted when irregular longitudinal splitting is observed during failure.

  7. Photonic boson sampling in a tunable circuit.

    Science.gov (United States)

    Broome, Matthew A; Fedrizzi, Alessandro; Rahimi-Keshari, Saleh; Dove, Justin; Aaronson, Scott; Ralph, Timothy C; White, Andrew G

    2013-02-15

    Quantum computers are unnecessary for exponentially efficient computation or simulation if the Extended Church-Turing thesis is correct. The thesis would be strongly contradicted by physical devices that efficiently perform tasks believed to be intractable for classical computers. Such a task is boson sampling: sampling the output distributions of n bosons scattered by some passive, linear unitary process. We tested the central premise of boson sampling, experimentally verifying that three-photon scattering amplitudes are given by the permanents of submatrices generated from a unitary describing a six-mode integrated optical circuit. We find the protocol to be robust, working even with the unavoidable effects of photon loss, non-ideal sources, and imperfect detection. Scaling this to large numbers of photons should be a much simpler task than building a universal quantum computer.
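The scattering amplitudes in question are matrix permanents, which classically can only be computed efficiently for small matrices, e.g. with Ryser's inclusion-exclusion formula; a minimal sketch of that classical computation (the intractability of scaling it up is precisely the point of boson sampling):

```python
def permanent(a):
    """Permanent of a square matrix via Ryser's formula:
    per(A) = (-1)^n * sum over non-empty subsets S of columns of
    (-1)^|S| * prod_i sum_{j in S} a[i][j].  O(2^n * n^2) time."""
    n = len(a)
    total = 0
    for subset in range(1, 1 << n):          # bitmask over column subsets
        size = bin(subset).count("1")
        prod = 1
        for row in a:
            prod *= sum(row[j] for j in range(n) if subset >> j & 1)
        total += (-1) ** size * prod
    return (-1) ** n * total

# Three-photon amplitudes correspond to permanents of 3x3 submatrices of
# the circuit's unitary; for the all-ones 3x3 matrix the permanent is 3! = 6.
per3 = permanent([[1, 1, 1], [1, 1, 1], [1, 1, 1]])
```

For complex unitary submatrices the same formula applies entrywise with complex arithmetic; the exponential cost in n is what makes large-scale boson sampling a candidate counterexample to the Extended Church-Turing thesis.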

  8. Power Spectrum Estimation of Randomly Sampled Signals

    DEFF Research Database (Denmark)

    Velte, Clara M.; Buchhave, Preben; K. George, William

    2014-01-01

    A number of alternative methods attempting to produce correct power spectra from randomly sampled signals have been invented and tested. The objective of the current study is to create a simple computer-generated signal for baseline testing of residence-time weighting and some of the most commonly proposed algorithms (or algorithms which most modern algorithms ultimately are based on), sample-and-hold and the direct spectral estimator without residence-time weighting, and to compare how they perform in relation to power spectra based on the equidistantly sampled reference signal. Sample-and-hold and the direct spectral estimator perform best with high data rate and low inherent bias, respectively, while residence-time weighting provides non-biased estimates regardless of setting. The free-running processor was also tested and compared to residence-time weighting using actual LDA measurements in a turbulent round jet. The computer-generated signal is a Poisson process with a sample rate…

  9. HAMMER: Reweighting tool for simulated data samples

    CERN Document Server

    Duell, Stephan; Ligeti, Zoltan; Papucci, Michele; Robinson, Dean

    2016-01-01

    Modern flavour physics experiments, such as Belle II or LHCb, require large samples of generated Monte Carlo events. Monte Carlo events are often processed in a sophisticated chain that includes a simulation of the detector response. The generation and reconstruction of large samples is resource-intensive and in principle would need to be repeated if, e.g., parameters of the underlying models change due to new measurements or new insights. To avoid having to regenerate large samples, we are working on a tool, the Helicity Amplitude Module for Matrix Element Reweighting (HAMMER), which allows one to easily reweight existing events in the context of semileptonic b → q ℓ ν̄ analyses to new model parameters or new physics scenarios.
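The underlying reweighting idea can be sketched generically (a toy illustration, not the HAMMER API): each existing event receives a weight equal to the ratio of the new model's density to the density under which the event was generated, so expectations under the new model become weighted averages over the old sample.

```python
import random

def reweight(events, old_pdf, new_pdf):
    """Per-event weights w = p_new(x) / p_old(x); any observable's
    expectation under the new model is then the weighted average of the
    observable over the existing (old-model) sample."""
    return [new_pdf(x) / old_pdf(x) for x in events]

# Hypothetical toy: events generated uniformly on [0, 1] ("old model"),
# reweighted to the linear density p_new(x) = 2x ("new model"),
# whose true mean is 2/3.
rng = random.Random(3)
events = [rng.random() for _ in range(100000)]
weights = reweight(events, old_pdf=lambda x: 1.0, new_pdf=lambda x: 2.0 * x)
mean_new = sum(w * x for w, x in zip(weights, events)) / sum(weights)
```

In the semileptonic setting the densities are matrix-element-squared values built from helicity amplitudes, so the weights can be recomputed for new form-factor parameters or new-physics couplings without regenerating or re-simulating any events.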

  10. A personal ammonia monitor utilizing permeation sampling

    Energy Technology Data Exchange (ETDEWEB)

    Benedict, A.F. (Occupational Safety and Health Administration, Baton Rouge, LA); Reiszner, K.D.; West, P.W.

    1983-01-01

    A method has been developed for the determination of the time-weighted-average personal exposure to ammonia. Sample collection was achieved by permeation through a silicone membrane into a boric acid solution. The trapped ammonia was then determined spectrophotometrically with Nessler's reagent or potentiometrically with an ion-selective electrode. The device may be used for sampling periods as short as 5 minutes and was not affected by changes in the environmental parameters normally encountered at industrial locations. The detection limit is 0.4 ppm for an 8 hr sampling period and the monitor responds linearly to at least 150 ppm. The Nessler's method may be utilized in industrial environments containing monoethanolamine in conjunction with ammonia with no significant interference. Although some interference was observed from ethylenediamine with the Nessler's technique, little interference was found with the potentiometric determination.

  11. Microfluidic Wheatstone bridge for rapid sample analysis.

    Science.gov (United States)

    Tanyeri, Melikhan; Ranka, Mikhil; Sittipolkul, Natawan; Schroeder, Charles M

    2011-12-21

    We developed a microfluidic analogue of the classic Wheatstone bridge circuit for automated, real-time sampling of solutions in a flow-through device format. We demonstrate precise control of flow rate and flow direction in the "bridge" microchannel using an on-chip membrane valve, which functions as an integrated "variable resistor". We implement an automated feedback control mechanism in order to dynamically adjust valve opening, thereby manipulating the pressure drop across the bridge and precisely controlling fluid flow in the bridge channel. At a critical valve opening, the flow in the bridge channel can be completely stopped by balancing the flow resistances in the Wheatstone bridge device, which facilitates rapid, on-demand fluid sampling in the bridge channel. In this article, we present the underlying mechanism for device operation and report key design parameters that determine device performance. Overall, the microfluidic Wheatstone bridge represents a new and versatile method for on-chip flow control and sample manipulation.
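
    The balance condition that makes the bridge flow vanish can be derived from the hydraulic analogue of Kirchhoff's node equations. The sketch below is a lumped-resistance illustration, not the authors' device model, and the channel resistances R1-R4 and Rb are hypothetical: flow in the bridge stops exactly when R1·R4 = R2·R3, which is why tuning a single "variable resistor" (the valve) suffices to stop or reverse flow in the bridge channel.

```python
def bridge_flow(R1, R2, R3, R4, Rb, P0=1.0):
    """Flow through the bridge channel of a hydraulic Wheatstone bridge.

    Node A sits between R1 (inlet -> A) and R2 (A -> outlet); node B sits
    between R3 (inlet -> B) and R4 (B -> outlet); the bridge resistance Rb
    connects A and B. Solves the two node-pressure equations (hydraulic
    analogue of Kirchhoff's current law) and returns the A -> B flow.
    """
    g1, g2, g3, g4, gb = 1.0 / R1, 1.0 / R2, 1.0 / R3, 1.0 / R4, 1.0 / Rb
    # Node equations: [[g1+g2+gb, -gb], [-gb, g3+g4+gb]] @ [pA, pB] = [g1*P0, g3*P0]
    a11, a12 = g1 + g2 + gb, -gb
    a21, a22 = -gb, g3 + g4 + gb
    det = a11 * a22 - a12 * a21
    p_a = (g1 * P0 * a22 - a12 * g3 * P0) / det
    p_b = (a11 * g3 * P0 - a21 * g1 * P0) / det
    return (p_a - p_b) * gb

# Balanced bridge (R1*R4 == R2*R3): no flow in the bridge channel
print(abs(bridge_flow(1.0, 2.0, 2.0, 4.0, 1.0)) < 1e-12)  # True
# Raising R2 (valve closing) unbalances the bridge and drives flow A -> B
print(bridge_flow(1.0, 3.0, 2.0, 4.0, 1.0) > 0)           # True
```

    Moving R2 to the other side of the balance point reverses the sign of the bridge flow, mirroring the on-chip flow-direction control described above.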

  12. Validation of Statistical Sampling Algorithms in Visual Sample Plan (VSP): Summary Report

    Energy Technology Data Exchange (ETDEWEB)

    Nuffer, Lisa L; Sego, Landon H.; Wilson, John E.; Hassig, Nancy L.; Pulsipher, Brent A.; Matzke, Brett D.

    2009-02-18

    The U.S. Department of Homeland Security, Office of Technology Development (OTD) contracted with a set of U.S. Department of Energy national laboratories, including the Pacific Northwest National Laboratory (PNNL), to write a Remediation Guidance for Major Airports After a Chemical Attack. The report identifies key activities and issues that should be considered by a typical major airport following an incident involving release of a toxic chemical agent. Four experimental tasks were identified that would require further research in order to supplement the Remediation Guidance. One of the tasks, Task 4, OTD Chemical Remediation Statistical Sampling Design Validation, dealt with statistical sampling algorithm validation. This report documents the results of the sampling design validation conducted for Task 4. In 2005, the Government Accountability Office (GAO) performed a review of the past U.S. responses to Anthrax terrorist cases. Part of the motivation for this PNNL report was a major GAO finding that there was a lack of validated sampling strategies in the U.S. response to Anthrax cases. The report (GAO 2005) recommended that probability-based methods be used for sampling design in order to address confidence in the results, particularly when all sample results showed no remaining contamination. The GAO also expressed a desire that the methods be validated, which is the main purpose of this PNNL report. The objective of this study was to validate probability-based statistical sampling designs and the algorithms pertinent to within-building sampling that allow the user to prescribe or evaluate confidence levels of conclusions based on data collected as guided by the statistical sampling designs. Specifically, the designs found in the Visual Sample Plan (VSP) software were evaluated. VSP was used to calculate the number of samples and the sample location for a variety of sampling plans applied to an actual release site. Most of the sampling designs validated are
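
    A concrete example of the probability-based reasoning behind such designs (an illustrative acceptance-sampling rule, not necessarily the exact algorithm validated in VSP) addresses the GAO's "confidence when all samples are clean" concern: if all n randomly placed samples come back clean, the number of samples needed to state with confidence C that at least a fraction P of the area is uncontaminated is n ≥ ln(1−C)/ln(P).

```python
import math

def n_samples(confidence, clean_fraction):
    """Smallest n such that, if all n randomly located samples are clean,
    one can claim with `confidence` that at least `clean_fraction` of the
    area is uncontaminated (illustrative acceptance-sampling rule)."""
    return math.ceil(math.log(1.0 - confidence) / math.log(clean_fraction))

# 95% confidence that at least 99% of the area is clean
print(n_samples(0.95, 0.99))  # 299
```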

  13. Discovering biological progression underlying microarray samples.

    Directory of Open Access Journals (Sweden)

    Peng Qiu

    2011-04-01

    Full Text Available In biological systems that undergo processes such as differentiation, a clear concept of progression exists. We present a novel computational approach, called Sample Progression Discovery (SPD), to discover patterns of biological progression underlying microarray gene expression data. SPD assumes that the individual samples of a microarray dataset are related by an unknown biological process (i.e., differentiation, development, cell cycle, or disease progression) and that each sample represents one unknown point along the progression of that process. SPD aims to organize the samples in a manner that reveals the underlying progression and to simultaneously identify subsets of genes that are responsible for that progression. We demonstrate the performance of SPD on a variety of microarray datasets that were generated by sampling a biological process at different points along its progression, without providing SPD any information about the underlying process. When applied to a cell cycle time series microarray dataset, SPD was not provided any prior knowledge of the samples' time order or of which genes are cell-cycle regulated, yet it recovered the correct time order and identified many genes that have been associated with the cell cycle. When applied to B-cell differentiation data, SPD recovered the correct order of the stages of normal B-cell differentiation and the linkage between preB-ALL tumor cells and their cell of origin, preB. When applied to mouse embryonic stem cell differentiation data, SPD uncovered a landscape of ESC differentiation into various lineages, together with genes that represent both generic and lineage-specific processes. When applied to a prostate cancer microarray dataset, SPD identified gene modules that reflect a progression consistent with disease stages.
SPD may be best viewed as a novel tool for synthesizing biological hypotheses because it provides a likely biological progression underlying a microarray dataset and, perhaps more importantly, the
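
    SPD's full algorithm (gene-module selection plus minimum-spanning-tree construction) is not reproduced here, but the core idea, recovering an unknown sample order from expression data alone, can be illustrated with a deliberately simple stand-in: projecting samples onto their first principal component. All data below are synthetic.

```python
import math, random

def pc1_order(X, iters=100):
    """Rank samples by their score on the first principal component,
    found by power iteration. This is only a toy stand-in for progression
    recovery; SPD itself builds minimum spanning trees over gene modules."""
    n, m = len(X), len(X[0])
    mu = [sum(col) / n for col in zip(*X)]
    C = [[x - u for x, u in zip(row, mu)] for row in X]   # centered data
    v = [1.0 / math.sqrt(m)] * m
    for _ in range(iters):
        s = [sum(cij * vj for cij, vj in zip(row, v)) for row in C]       # C v
        v = [sum(s[i] * C[i][j] for i in range(n)) for j in range(m)]     # C^T (C v)
        norm = math.sqrt(sum(x * x for x in v))
        v = [x / norm for x in v]
    scores = [sum(cij * vj for cij, vj in zip(row, v)) for row in C]
    return sorted(range(n), key=lambda i: scores[i])

# Synthetic 'progression': 20 samples of 30 genes drifting upward with time
rng = random.Random(0)
X = [[t + rng.gauss(0.0, 0.5) for _ in range(30)] for t in range(20)]
order = pc1_order(X)
print(order == list(range(20)) or order == list(range(19, -1, -1)))  # True
```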

  14. Designing an enhanced groundwater sample collection system

    Energy Technology Data Exchange (ETDEWEB)

    Schalla, R.

    1994-10-01

    As part of an ongoing technical support mission to achieve excellence and efficiency in environmental restoration activities at the Laboratory for Energy and Health-Related Research (LEHR), Pacific Northwest Laboratory (PNL) provided guidance on the design and construction of monitoring wells and identified the most suitable type of groundwater sampling pump and accessories for monitoring wells. The goal was to utilize a monitoring well design that would allow for hydrologic testing and reduce turbidity to minimize the impact of sampling. The sampling results of the newly designed monitoring wells were clearly superior to those of the previously installed monitoring wells. The new wells exhibited reduced turbidity, in addition to improved access for instrumentation and hydrologic testing. The variable frequency submersible pump was selected as the best choice for obtaining groundwater samples. The literature references are listed at the end of this report. Despite some initial difficulties, the actual performance of the variable frequency, submersible pump and its accessories was effective in reducing sampling time and labor costs, and its ease of use was preferred over the previously used bladder pumps. The surface seal system, called the Dedicator, proved to be a useful accessory to prevent surface contamination while providing easy access for water-level measurements and for connecting the pump. Cost savings resulted from the use of the pre-production pumps (beta units) donated by the manufacturer for the demonstration. However, larger savings resulted from shortened field time due to the ease in using the submersible pumps and the surface seal access system. Proper deployment of the monitoring wells also resulted in cost savings and ensured representative samples.

  15. Bayesian stratified sampling to assess corpus utility

    Energy Technology Data Exchange (ETDEWEB)

    Hochberg, J.; Scovel, C.; Thomas, T.; Hall, S.

    1998-12-01

    This paper describes a method for asking statistical questions about a large text corpus. The authors exemplify the method by addressing the question, ``What percentage of Federal Register documents are real documents, of possible interest to a text researcher or analyst?`` They estimate an answer to this question by evaluating 200 documents selected from a corpus of 45,820 Federal Register documents. Bayesian analysis and stratified sampling are used to reduce the sampling uncertainty of the estimate from over 3,100 documents to fewer than 1,000. A possible application of the method is to establish baseline statistics used to estimate recall rates for information retrieval systems.
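
    The combination of stratified sampling and Bayesian inference sketched above can be made concrete as follows. The stratum sizes and counts below are hypothetical, and the flat Beta(1, 1) prior per stratum is an assumption; the posterior for the overall proportion of "real" documents is obtained by weighting per-stratum posterior draws by stratum size.

```python
import random

def stratified_posterior(strata, draws=20000, seed=0):
    """Monte Carlo posterior for an overall proportion under stratified
    sampling. strata: list of (N_h, n_h, k_h) = stratum size, sample
    size, and number of 'real' documents found in the sample. Each
    stratum proportion gets an independent Beta(k+1, n-k+1) posterior
    (uniform prior); draws are combined with weights N_h / N."""
    rng = random.Random(seed)
    N = sum(s[0] for s in strata)
    sims = sorted(
        sum(Nh * rng.betavariate(k + 1, n - k + 1) for Nh, n, k in strata) / N
        for _ in range(draws))
    mean = sum(sims) / draws
    return mean, (sims[int(0.025 * draws)], sims[int(0.975 * draws)])

# Two hypothetical strata: a large mostly-real one and a small noisier one
mean, ci = stratified_posterior([(40000, 150, 135), (5820, 50, 10)])
print(round(mean, 2), round(ci[1] - ci[0], 3))
```

    Because each stratum is estimated separately, sampling effort can be concentrated where the uncertainty is largest, which is how the authors shrink the credible interval at a fixed total sample size.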

  16. Importance Sampling for Stochastic Timed Automata

    DEFF Research Database (Denmark)

    Jegourel, Cyrille; Larsen, Kim Guldstrand; Legay, Axel

    2016-01-01

    We present an importance sampling framework that combines symbolic analysis and simulation to estimate the probability of rare reachability properties in stochastic timed automata. By means of symbolic exploration, our framework first identifies states that cannot reach the goal. A state-wise cha...
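
    The change-of-measure idea at the heart of any importance sampling scheme, shown here on a generic rare event rather than on timed automata, is to simulate under a proposal distribution that hits the rare event often and to reweight each hit by the likelihood ratio:

```python
import math, random

def rare_tail_is(threshold=4.0, n=100_000, seed=1):
    """Estimate P(X > threshold) for X ~ N(0,1) by importance sampling.
    Draws come from the shifted proposal N(threshold, 1), so nearly every
    draw lands in the rare region; each hit is reweighted by the
    likelihood ratio f(x)/q(x) = exp(-threshold*x + threshold^2/2)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.gauss(threshold, 1.0)
        if x > threshold:
            total += math.exp(-threshold * x + threshold ** 2 / 2)
    return total / n

est = rare_tail_is()
exact = 0.5 * math.erfc(4.0 / math.sqrt(2.0))   # P(N(0,1) > 4) ~ 3.2e-5
print(est, exact)
```

    Naive simulation with the same budget would see only a handful of hits; the tilted proposal makes every draw informative, which is the same rationale as biasing simulation toward goal-reaching states in the automata setting.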

  17. Leaky Rayleigh wave investigation on mortar samples.

    Science.gov (United States)

    Neuenschwander, J; Schmidt, Th; Lüthi, Th; Romer, M

    2006-12-01

    Aggressive mineralized ground water may harm the concrete cover of tunnels and other underground constructions. Within a current research project mortar samples are used to study the effects of sulfate interaction in accelerated laboratory experiments. A nondestructive test method based on ultrasonic surface waves was developed to investigate the topmost layer of mortar samples. A pitch and catch arrangement is introduced for the generation and reception of leaky Rayleigh waves in an immersion technique allowing the measurement of their propagation velocity. The technique has been successfully verified for the reference materials aluminium, copper, and stainless steel. First measurements performed on mortar specimens demonstrate the applicability of this new diagnostic tool.
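
    For reference-material checks of this kind, the expected Rayleigh speed can be estimated from the shear wave speed and Poisson's ratio via Viktorov's approximation, c_R ≈ c_S (0.87 + 1.12ν)/(1 + ν). The material constants below are nominal handbook values assumed for illustration, not the paper's measured data.

```python
def rayleigh_speed(shear_speed, poisson):
    """Viktorov's approximation for the Rayleigh wave speed (m/s) of an
    isotropic solid: c_R ~ c_S * (0.87 + 1.12*nu) / (1 + nu)."""
    return shear_speed * (0.87 + 1.12 * poisson) / (1.0 + poisson)

# Nominal shear speeds (m/s) and Poisson ratios, assumed for illustration
for name, c_s, nu in [("aluminium", 3100.0, 0.33),
                      ("copper", 2325.0, 0.34),
                      ("stainless steel", 3100.0, 0.29)]:
    print(f"{name}: c_R ~ {rayleigh_speed(c_s, nu):.0f} m/s")
```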

  18. Rapid determination of radiostrontium in seawater samples

    Energy Technology Data Exchange (ETDEWEB)

    Maxwell, Sherrod L.; Culligan, Brian K.; Utsey, Robin C.

    2013-03-12

    A new method for the determination of radiostrontium in seawater samples has been developed at the Savannah River National Laboratory (SRNL) that allows rapid preconcentration and separation of strontium and yttrium isotopes in seawater samples for measurement. The new SRNL method employs a novel and effective pre-concentration step that utilizes a blend of calcium phosphate with iron hydroxide to collect both strontium and yttrium rapidly from the seawater matrix with enhanced chemical yields. The pre-concentration steps, in combination with rapid Sr Resin and DGA Resin cartridge separation options using vacuum box technology, allow seawater samples up to 10 liters to be analyzed. The total 89Sr + 90Sr activity may be determined by gas flow proportional counting and recounted after ingrowth of 90Y to differentiate 89Sr from 90Sr. Gas flow proportional counting provides a lower method detection limit than liquid scintillation or Cerenkov counting and allows simultaneous counting of samples. Simultaneous counting allows for longer count times and lower method detection limits without handling very large aliquots of seawater. Seawater samples up to 6 liters may be analyzed using Sr Resin for 89Sr and 90Sr with a Minimum Detectable Activity (MDA) of 1-10 mBq/L, depending on count times. Seawater samples up to 10 liters may be analyzed for 90Sr using a DGA Resin method via collection and purification of 90Y only. If 89Sr and other fission products are present, then 91Y (beta energy 1.55 MeV, 58.5 day half-life) is also likely to be present. 91Y interferes with attempts to collect 90Y directly from the seawater sample without first purifying the Sr isotopes and allowing 90Y ingrowth. The DGA Resin option can be used to determine 90Sr, and if 91Y is also present, an ingrowth option using DGA
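
    The recount-after-ingrowth step relies on the standard parent-daughter Bateman relation: starting from freshly purified 90Sr, the 90Y activity grows back toward secular equilibrium with its 64.1 h half-life. A minimal sketch, using nominal literature half-lives:

```python
import math

T_HALF_SR90_H = 28.8 * 365.25 * 24.0   # 90Sr half-life in hours (~28.8 y)
T_HALF_Y90_H = 64.1                    # 90Y half-life in hours

def y90_activity_fraction(t_hours):
    """A(90Y)/A(90Sr) after t hours of ingrowth from initially pure 90Sr
    (two-member Bateman equation)."""
    lam_sr = math.log(2.0) / T_HALF_SR90_H
    lam_y = math.log(2.0) / T_HALF_Y90_H
    return lam_y / (lam_y - lam_sr) * (1.0 - math.exp(-(lam_y - lam_sr) * t_hours))

# After two weeks the daughter is essentially in equilibrium with the parent
print(round(y90_activity_fraction(14 * 24), 3))  # 0.974
```

    This is why a recount after roughly two weeks cleanly separates the 90Sr contribution (via its 90Y daughter) from any 89Sr present in the first count.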

  19. Distributed Capacitive Sensor for Sample Mass Measurement

    Science.gov (United States)

    Toda, Risaku; McKinney, Colin; Jackson, Shannon P.; Mojarradi, Mohammad; Manohara, Harish; Trebi-Ollennu, Ashitey

    2011-01-01

    Previous robotic sample return missions lacked in situ sample verification/ quantity measurement instruments. Therefore, the outcome of the mission remained unclear until spacecraft return. In situ sample verification systems such as this Distributed Capacitive (DisC) sensor would enable an unmanned spacecraft system to re-attempt the sample acquisition procedures until the capture of the desired sample quantity is positively confirmed, thereby maximizing the prospect for scientific reward. The DisC device contains a 10-cm-diameter pressure-sensitive elastic membrane placed at the bottom of a sample canister. The membrane deforms under the weight of accumulating planetary sample. The membrane is positioned in close proximity to an opposing rigid substrate with a narrow gap. The deformation of the membrane makes the gap narrower, resulting in increased capacitance between the two parallel plates (elastic membrane and rigid substrate). C-V conversion circuits on a nearby PCB (printed circuit board) provide capacitance readout via an LVDS (low-voltage differential signaling) interface. The capacitance method was chosen over other potential approaches such as the piezoelectric method because of its inherent temperature stability advantage. A reference capacitor and temperature sensor are embedded in the system to compensate for temperature effects. The pressure-sensitive membranes are aluminum 6061, stainless steel (SUS) 403, and metal-coated polyimide plates. The thicknesses of these membranes range from 250 to 500 µm. The rigid substrate is made with a 1- to 2-mm-thick wafer of one of the following materials, depending on the application requirements: glass, silicon, polyimide, or PCB substrate. The glass substrate is fabricated by a microelectromechanical systems (MEMS) fabrication approach. Several concentric electrode patterns are printed on the substrate.
    The initial gap between the two plates, 100 µm, is defined by a silicon spacer ring that is anodically bonded to the glass
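
    The readout principle reduces to parallel-plate capacitance, C = ε0·A/d, so a small reduction in the gap d gives a measurable rise in C. The first-order sketch below assumes an air gap and ignores membrane curvature and fringing fields:

```python
import math

EPS0 = 8.854e-12  # vacuum permittivity, F/m

def plate_capacitance(gap_m, diameter_m=0.10):
    """Ideal parallel-plate capacitance for a 10-cm-diameter electrode
    pair with an air gap (first-order model; membrane bending and
    fringing fields are ignored)."""
    area = math.pi * (diameter_m / 2.0) ** 2
    return EPS0 * area / gap_m

c_empty = plate_capacitance(100e-6)    # 100-um gap, empty canister
c_loaded = plate_capacitance(90e-6)    # sample weight closes the gap by 10 um
print(round(c_empty * 1e12), round((c_loaded - c_empty) * 1e12))  # 695 77
```

    Roughly 695 pF at rest with a ~77 pF increase for a 10 µm deflection, comfortably within the range of standard C-V readout circuits.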

  20. Analytical laboratory and mobile sampling platform

    Energy Technology Data Exchange (ETDEWEB)

    Stetzenbach, K.; Smiecinski, A.

    1996-04-30

    This is the final report for the Analytical Laboratory and Mobile Sampling Platform project. This report contains only major findings and conclusions resulting from this project. Detailed reports of all activities performed for this project were provided to the Project Office every quarter since the beginning of the project. This report contains water chemistry data for samples collected in the Nevada section of Death Valley National Park (Triangle Area Springs), Nevada Test Site springs, Pahranagat Valley springs, Nevada Test Site wells, Spring Mountain springs and Crater Flat and Amargosa Valley wells.

  1. Analysis of lunar samples for carbon compounds.

    Science.gov (United States)

    Kvenvolden, K. A.

    1971-01-01

    Description of one approach to the analysis for carbon compounds in lunar materials from the Apollo 11 mission. The sequential scheme followed generally accepted organic geochemical practices, but was unusual in its application to a single sample. The procedures of the scheme were designed to minimize handling of the solids and extracts or hydrolysates. The solid lunar sample was retained in all steps of the sequential analysis in the vessel in which it was originally placed. Centrifugation was used to separate solid and liquid phases after extraction or refluxing. Liquids were recovered from solids by decantation.

  2. OSIRIS-REx Asteroid Sample Return Mission

    Science.gov (United States)

    Drake, M. J.; Lauretta, D. S.; Team, O.

    2011-12-01

    OSIRIS-REx is an asteroid sample return mission to the organic-rich asteroid (101955) 1999 RQ36. The mission seeks to address deep questions: where did we come from; what is our destiny? Earth sterilized itself during its formation, yet here we are today. Where did the organics come from? To find out, we will return at least 60 g of pristine, uncontaminated, organic-rich regolith for study on Earth by advanced analytical equipment. Because it is relatively easy for us to get to RQ36, it is relatively easy for it to get to us, making it the most potentially hazardous asteroid known to humanity, with a 1:1800 probability of impacting the Earth in 2180. We will study the Yarkovsky effect, the thermal forces that cause small objects to deviate from Keplerian orbits, with the goal of understanding how to mitigate a civilization-ending or species-ending impact catastrophe. The mission launches in September 2016, arrives at RQ36 in November 2019, and spends about a year conducting detailed studies of RQ36 in order to select the best sampling site. Sampling is achieved by approaching the surface at 10 cm/sec and agitating the regolith with nitrogen gas on contact. The agitated regolith is collected in a sample head, which is stowed in the Sample Return Capsule for return to Earth at the UTTR Test Range in Utah in September 2023. Two years of funded studies are carried out by the U.S. and world community before end of mission in 2025, after which samples will still be available through the NASA-JSC Curation Facility. OSIRIS-REx will return samples never before available for study on Earth, probably to be analyzed using some instruments yet to be invented. In addition, OSIRIS-REx will provide "ground truth" for telescope observations of airless bodies by returning a pristine sample of the surface of RQ36. OSIRIS-REx will evaluate resources available to future human missions, both materials and technologies such as proximity operations. And we will learn how to mitigate against impact

  3. OSIRIS-REx, Returning the Asteroid Sample

    Science.gov (United States)

    Ajluni, Thomas, M.; Everett, David F.; Linn, Timothy; Mink, Ronald; Willcockson, William; Wood, Joshua

    2015-01-01

    This paper addresses the technical aspects of the sample return system for the upcoming Origins, Spectral Interpretation, Resource Identification, and Security-Regolith Explorer (OSIRIS-REx) asteroid sample return mission. The overall mission design and current implementation are presented as an overview to establish a context for the technical description of the reentry and landing segment of the mission. The prime objective of the OSIRIS-REx mission is to sample a primitive, carbonaceous asteroid and to return that sample to Earth in pristine condition for detailed laboratory analysis. Targeting the near-Earth asteroid Bennu, the mission launches in September 2016 with an Earth reentry date of September 24, 2023. OSIRIS-REx will thoroughly characterize asteroid Bennu, providing knowledge of the nature of near-Earth asteroids that is fundamental to understanding planet formation and the origin of life. The return to Earth of pristine samples with known geologic context will enable precise analyses that cannot be duplicated by spacecraft-based instruments, revolutionizing our understanding of the early Solar System. Bennu is both the most accessible carbonaceous asteroid and one of the most potentially Earth-hazardous asteroids known. Study of Bennu addresses multiple NASA objectives to understand the origin of the Solar System and the origin of life, and will provide a greater understanding of both the hazards and resources in near-Earth space, serving as a precursor to future human missions to asteroids. This paper focuses on the technical aspects of the Sample Return Capsule (SRC) design and concept of operations, including trajectory design and reentry retrieval. Highlights of the mission are included below. The OSIRIS-REx spacecraft provides the essential functions for an asteroid characterization and sample return mission: attitude control, propulsion, power, thermal control, telecommunications, command and data handling, and structural support to ensure successful

  4. [Respondent-Driven Sampling: a new sampling method to study visible and hidden populations].

    Science.gov (United States)

    Mantecón, Alejandro; Juan, Montse; Calafat, Amador; Becoña, Elisardo; Román, Encarna

    2008-01-01

    The paper introduces a variant of chain-referral sampling: respondent-driven sampling (RDS). This sampling method shows that methods based on network analysis can be combined with the statistical validity of standard probability sampling methods. In this sense, RDS appears to be a mathematical improvement of snowball sampling oriented to the study of hidden populations. However, we try to prove its validity with populations that are not within a sampling frame but can nonetheless be contacted without difficulty. The basics of RDS are explained through our research on young people (aged 14 to 25) who go clubbing, consume alcohol and other drugs, and have sex. Fieldwork was carried out between May and July 2007 in three Spanish regions: Baleares, Galicia and Comunidad Valenciana. The presentation of the study shows the utility of this type of sampling when the population is accessible but there is a difficulty deriving from the lack of a sampling frame. However, the sample obtained is not a random representative one in statistical terms of the target population. It must be acknowledged that the final sample is representative of a 'pseudo-population' that approximates to the target population but is not identical to it.
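
    The inverse-degree weighting that gives RDS its statistical footing can be illustrated with the Volz-Heckathorn (RDS-II) estimator: respondents with larger personal networks are more likely to be recruited, so each is down-weighted by 1/degree. The sample below is entirely hypothetical.

```python
def rds_proportion(in_group, degrees):
    """Volz-Heckathorn (RDS-II) estimate of a group's population share:
    weight each respondent by the inverse of their reported network
    degree to correct for unequal inclusion probabilities in a
    chain-referral sample."""
    weights = [1.0 / d for d in degrees]
    hits = sum(w for member, w in zip(in_group, weights) if member)
    return hits / sum(weights)

# Hypothetical respondents: flags mark members of the subgroup of
# interest, degrees are self-reported network sizes
flags = [True, True, False, True, False, False]
degrees = [10, 20, 5, 8, 4, 25]
print(round(rds_proportion(flags, degrees), 3))  # 0.359
```

    The naive sample proportion here would be 3/6 = 0.5; inverse-degree weighting pulls the estimate down because the well-connected respondents, who are over-sampled by referral chains, count for less.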

  5. Comparative Study of element composition of some honey samples ...

    African Journals Online (AJOL)

    The study was carried out at the Federal College of Forestry, Ibadan with seven honey samples were randomly selected within Ibadan metropolis, labeled as: Sample A (Forestry Honey), Sample B(Pure Honey), Sample C (Mr. Honey), Sample D (Taraba Honey), Sample E (Sokoto Honey), Sample F (Saki Honey), and ...

  6. Analysis of physical and chemical composition of honey samples in ...

    African Journals Online (AJOL)

    The study analyzed the physical and chemical compositions of seven honey samples, which were obtained from selected markets in Ibadan metropolis. Seven samples of honey were obtained namely from sample A (Forestry honey Ibadan), Sample B (Pure honey), Sample C (Mr. honey), Sample D (Taraba honey), sample ...

  7. Evaluation of common methods for sampling invertebrate pollinator assemblages: net sampling out-perform pan traps.

    Directory of Open Access Journals (Sweden)

    Tony J Popic

    Full Text Available Methods for sampling ecological assemblages strive to be efficient, repeatable, and representative. Unknowingly, common methods may be limited in terms of revealing species function and so be of less value for comparative studies. The global decline in pollination services has stimulated surveys of flower-visiting invertebrates, using pan traps and net sampling. We explore the relative merits of these two methods in terms of species discovery, quantifying abundance, function, and composition, and responses of species to changing floral resources. Using a spatially-nested design we sampled across a 5000 km² area of arid grasslands, including 432 hours of net sampling and 1296 pan trap-days, between June 2010 and July 2011. Net sampling yielded 22% more species and 30% higher abundance than pan traps, and better reflected the spatio-temporal variation of floral resources. Species composition differed significantly between methods; of 436 total species, 25% were sampled by both methods, 50% only by nets, and the remaining 25% only by pans. Apart from being less comprehensive, if pan traps do not sample flower-visitors, the link to pollination is questionable. By contrast, net sampling functionally linked species to pollination through behavioural observations of flower-visitation interaction frequency. Netted specimens are also necessary for evidence of pollen transport. Benefits of net-based sampling outweighed minor differences in overall sampling effort. As pan traps and net sampling methods are not equivalent for sampling invertebrate-flower interactions, we recommend net sampling of invertebrate pollinator assemblages, especially if datasets are intended to document declines in pollination and guide measures to retain this important ecosystem service.

  8. Comparison of initial stream urine samples and cervical samples for detection of human papillomavirus.

    Science.gov (United States)

    Hagihara, Mao; Yamagishi, Yuka; Izumi, Koji; Miyazaki, Narimi; Suzuki, Takayoshi; Kato, Hideo; Nishiyama, Naoya; Koizumi, Yusuke; Suematsu, Hiroyuki; Mikamo, Hiroshige

    2016-08-01

    Uterine cervical cancer is a treatable and preventable cancer. Medical efforts to reduce rates of cervical cancer focus on the promotion of human papillomavirus (HPV) vaccination and the promotion of routine cervical cancer screening by cervical cytology and cervical HPV testing. Urine-based HPV testing would be a simple and noninvasive approach to screening for cervical cancer. Two biospecimens (a clinician-taken sample from the cervix and an initial stream urine sample) were provided by each of 240 healthy women attending for cancer screening and submitted for HPV testing. We assessed HPV detection rates in cervical samples and in the pellet fraction of urine samples using an HPV test (Anyplex™ II HPV28 Detection kit, Seegene, Korea). Among the 240 samples screened, HPV prevalence was 42.9% in pellet fractions of urine samples. The agreement between the two kinds of samples was 98.4%, κ = 0.792. Discordant results were observed in 27 cases; 5 were positive only in urine samples and 22 were positive only in smear samples. The sensitivity and specificity for all HPV DNA in pellet fractions of urine, using cervical samples as the reference, were 68.4% and 99.9%, respectively. Comparing the two sample-collection methods for HPV detection, agreement was high for almost all genotypes between cervical samples and pellet fractions of urine samples. These results suggest that urine could be a good noninvasive tool to monitor HPV infection in women. Additional research in a larger, general screening population would be needed. Copyright © 2016 Japanese Society of Chemotherapy and The Japanese Association for Infectious Diseases. Published by Elsevier Ltd. All rights reserved.
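
    A κ statistic of this kind is computed from a 2×2 concordance table. The counts below are hypothetical, chosen only to illustrate the computation, not the study's actual table:

```python
def cohens_kappa(a, b, c, d):
    """Cohen's kappa for a 2x2 agreement table:
    a = positive by both tests, b = positive only by test 1,
    c = positive only by test 2, d = negative by both."""
    n = a + b + c + d
    p_obs = (a + d) / n                                      # observed agreement
    p_chance = ((a + b) * (a + c) + (c + d) * (b + d)) / n ** 2  # chance agreement
    return (p_obs - p_chance) / (1.0 - p_chance)

# Hypothetical counts for 240 paired samples: 22 positive only on the
# cervical smear, 5 positive only in urine
print(round(cohens_kappa(98, 22, 5, 115), 3))  # 0.775
```

    Kappa discounts the agreement expected by chance alone, which is why it is lower than the raw percent agreement even when the two tests rarely disagree.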

  9. Outgassing tests on iras solar panel samples

    Science.gov (United States)

    Premat, G.; Zwaal, A.; Pennings, N. H.

    1980-01-01

    Several outgassing tests were carried out on representative solar panel samples in order to determine the extent of contamination that could be expected from this source. The materials for the construction of the solar panels had been selected on the basis of the contamination results obtained in micro volatile condensable materials (micro-VCM) tests.

  10. A soil sampling program for the Netherlands

    NARCIS (Netherlands)

    Visschers, R.; Finke, P.A.; Gruijter, de J.J.

    2007-01-01

    Soil data users in The Netherlands were inventoried for current and future data needs. Prioritized data needs were used to design the Netherlands Soil Sampling Program (NSSP) as a framework containing 3 groups of related projects: map upgrading, map updating and upgrading of pedotransfer functions.

  11. Using Music Sampling to Teach Research Skills

    Science.gov (United States)

    Wakefield, Sarah R.

    2006-01-01

    One way to teach the research paper is by first discussing sampling, the musical practice of using other artists' work. By studying the lyrics of Sean "P. Diddy" Combs, a widely known hip-hop sampler, students gain an understanding of quoting, paraphrasing, and summarizing sources.

  12. 40 CFR 763.86 - Sampling.

    Science.gov (United States)

    2010-07-01

    ... insulation is fiberglass, foam glass, rubber, or other non-ACBM. (c) Miscellaneous material. In a manner... system insulation. (1) Except as provided in paragraphs (b) (2) through (4) of this section and § 763.87... samples from each homogeneous area of thermal system insulation that is not assumed to be ACM. (2) Collect...

  13. Automated system for fractionation of blood samples

    Energy Technology Data Exchange (ETDEWEB)

    Lee, N. E.; Genung, R. K.; Johnson, W. F.; Mrochek, J. E.; Scott, C. D.

    1978-01-01

    A prototype system for preparing multiple fractions of blood components (plasma, washed red cells, and hemolysates) using automated techniques has been developed. The procedure is based on centrifugal separation and differential pressure-induced transfer in a rotor that has been designed to process numerous samples simultaneously. Red cells are sedimented against the outer walls of the sample chamber, and plasma is siphoned, by imposition of either a slight positive or negative pressure, into individual reservoirs in a collection ring. Washing of cells is performed in situ; samples of washed cells, either packed or in saline solution, can be recovered. Cellular hemolysates are prepared and automatically transferred to individual, commercially available collection vials ready for storage in liquid nitrogen or immediate analysis. The system has potential application in any biomedical area which requires high sample throughput and in which one or more of the blood fractions will be used. A separate unit has been designed and developed for the semiautomated cleaning of the blood processing vessel.

  14. LOGISTICS OF ECOLOGICAL SAMPLING ON LARGE RIVERS

    Science.gov (United States)

    The objectives of this document are to provide an overview of the logistical problems associated with the ecological sampling of boatable rivers and to suggest solutions to those problems. It is intended to be used as a resource for individuals preparing to collect biological dat...

  15. Upper Atmospheric Particulate Monitoring and Sample Return

    Science.gov (United States)

    Liddell, Alan; Sohl, John E.

    2010-10-01

    H.A.R.B.O.R. (High Altitude Reconnaissance Balloon for Outreach and Research) is a student-run program in which high-altitude balloon systems are designed, constructed, and flown by students conducting individual or group research projects. One area of interest is the sampling of particles in the upper atmosphere. Collecting airborne particulates and studying them under an SEM can answer questions on the origins of airborne particulate matter. We could find explanations for climate change or directly measure pollution caused by smokestacks. The SEM has the capacity to capture images of particulates and determine their composition. I am building a system capable of sampling air up to 30 km (100,000 ft). The system will contain a servo-controlled filter system for sampling air captured by the ascent of the balloon. Currently, filter types are being evaluated for capture rate and air flow resistance. A circuit has been built to test the mass throughput of the airflow as the balloon travels its course. A vacuum chamber is being built to simulate the near-space environment. Testing and simulation should be complete in time to fly a finalized sample return mission in spring 2011.

  16. Proceedings of the wellbore sampling workshop

    Energy Technology Data Exchange (ETDEWEB)

    Traeger, R.K. (ed.); Harding, B.W.

    1987-11-01

    Representatives from academia, industry and research laboratories participated in an intensive two-day review to identify major technological limitations in obtaining solid and fluid samples from wellbores. Top priorities identified for further development include: coring of hard and unconsolidated materials; flow-through fluid samplers with in-borehole measurement of temperature, pressure, and pH; and nonintrusive interrogation of pressure cores.

  17. 21 CFR 211.170 - Reserve samples.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 4 2010-04-01 2010-04-01 false Reserve samples. 211.170 Section 211.170 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) DRUGS: GENERAL CURRENT GOOD MANUFACTURING PRACTICE FOR FINISHED PHARMACEUTICALS Laboratory Controls § 211.170 Reserve...

  18. Characterization of Soil Samples of Enzyme Activity

    Science.gov (United States)

    Freeland, P. W.

    1977-01-01

Described are nine enzyme assays for distinguishing soil samples. Colorimetric methods are used to compare enzyme levels in soils from different sites. Each soil tested had its own spectrum of activity. Attention is drawn to applications of this technique in forensic science and in studies of soil fertility. (Author/AJ)

  19. Using Ancient Samples in Projection Analysis

    Directory of Open Access Journals (Sweden)

    Melinda A. Yang

    2016-01-01

Projection analysis is a tool that extracts information from the joint allele frequency spectrum to better understand the relationship between two populations. In projection analysis, a test genome is compared to a set of genomes from a reference population. The projection’s shape depends on the historical relationship of the test genome’s population to the reference population. Here, we explore in greater depth the effects on the projection when ancient samples are included in the analysis. First, we conduct a series of simulations in which the ancient sample is directly ancestral to a present-day population (one-population model), or the ancient sample is ancestral to a sister population that diverged before the time of sampling (two-population model). We find that there are characteristic differences between the projections for the one-population and two-population models, which indicate that the projection can be used to determine whether a test genome is directly ancestral to a present-day population or not. Second, we compute projections for several published ancient genomes. We compare two Neanderthals and three ancient human genomes to European, Han Chinese and Yoruba reference panels. We use a previously constructed demographic model and insert these five ancient genomes to assess how well the observed projections are recovered.

  20. bacteriological quality of water samples in

    African Journals Online (AJOL)

...a saprophyte encountered in the soil (10) and could have been carried along with soil that sticks to the containers used for fetching water. CONCLUSION: The well water samples were observed to fall below the WHO recommendation, which states that water should contain no microorganism known...

  1. Transabdominal Chorionic Villous Sampling in Nigeria: Correlation ...

    African Journals Online (AJOL)

BACKGROUND: transabdominal chorionic villous sampling is generally preferred to the transvaginal approach. The procedure may, however, be associated with complications due to a number of factors. OBJECTIVES: to review the relationship between the number of cases and other variables in transabdominal chorionic ...

  2. Variability Study of the S5 Sample

    Indian Academy of Sciences (India)

    2016-01-27

We present the results of flux density monitoring of the S5 sample at 5 GHz with the Urumqi 25-m radio telescope between Dec. 2008 and Nov. 2009. Most sources exhibited > 2% rms variation in our one-year monitoring. Twenty-five highly variable sources were detected at a confidence level of 99%. Weaker ...

  3. Event dependent sampling of recurrent events

    DEFF Research Database (Denmark)

    Kvist, Tine Kajsa; Andersen, Per Kragh; Angst, Jules

    2010-01-01

    The effect of event-dependent sampling of processes consisting of recurrent events is investigated when analyzing whether the risk of recurrence increases with event count. We study the situation where processes are selected for study if an event occurs in a certain selection interval. Motivation...

  4. Comparison of transition-matrix sampling procedures

    DEFF Research Database (Denmark)

    Yevick, D.; Reimer, M.; Tromborg, Bjarne

    2009-01-01

We compare the accuracy of the multicanonical procedure with that of transition-matrix models of static and dynamic communication system properties incorporating different acceptance rules. We find that for appropriate ranges of the underlying numerical parameters, algorithmically simple yet highly accurate procedures can be employed in place of the standard multicanonical sampling algorithm.
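
The acceptance rules this abstract compares all build on the Metropolis idea of accepting a proposed state with probability min(1, ratio of target weights). A minimal, self-contained sketch of that building block follows; the toy target and proposal are chosen for illustration and are not taken from the paper:

```python
import random

def metropolis(weights, steps, seed=0):
    """Sample state indices with stationary probabilities proportional to weights."""
    rng = random.Random(seed)
    n = len(weights)
    state = 0
    counts = [0] * n
    for _ in range(steps):
        proposal = rng.randrange(n)  # uniform, hence symmetric, proposal
        # Metropolis rule: accept with probability min(1, w_new / w_old).
        if rng.random() < weights[proposal] / weights[state]:
            state = proposal
        counts[state] += 1
    return [c / steps for c in counts]

freqs = metropolis([1, 2, 3, 4], steps=200_000)
# Empirical frequencies approach the normalized weights 0.1, 0.2, 0.3, 0.4.
```

Roughly speaking, multicanonical and transition-matrix methods replace the fixed target weights with iteratively estimated ones, so that rarely visited states are sampled far more often than under the plain rule above.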

  5. Airflow Test of Acoustic Board Samples

    DEFF Research Database (Denmark)

    Jensen, Rasmus Lund; Jensen, Lise Mellergaard

In the laboratory of Indoor Environmental Engineering, Department of Civil Engineering, Aalborg University, an airflow test on 2x10 samples of acoustic board was carried out on 2 June 2012. The tests were carried out for Rambøll and STO AG. The test includes connected values of volume flow...

  6. Estimating Aquatic Insect Populations. Introduction to Sampling.

    Science.gov (United States)

    Chihuahuan Desert Research Inst., Alpine, TX.

    This booklet introduces high school and junior high school students to the major groups of aquatic insects and to population sampling techniques. Chapter 1 consists of a short field guide which can be used to identify five separate orders of aquatic insects: odonata (dragonflies and damselflies); ephemeroptera (mayflies); diptera (true flies);…

  7. Hierarchical Knowledge Gradient for Sequential Sampling

    NARCIS (Netherlands)

    Mes, Martijn R.K.; Powell, Warren B.; Frazier, Peter I.

    2011-01-01

    We propose a sequential sampling policy for noisy discrete global optimization and ranking and selection, in which we aim to efficiently explore a finite set of alternatives before selecting an alternative as best when exploration stops. Each alternative may be characterized by a multidimensional

  8. BIOGENIC AMINES CONTENT IN DIFFERENT WINE SAMPLES

    Directory of Open Access Journals (Sweden)

    Attila Kántor

    2015-02-01

Twenty-five samples of different Slovak wines, before and after filtration, were analysed in order to determine the content of eight biogenic amines (tryptamine, phenylalanine, putrescine, cadaverine, histamine, tyramine, spermidine and spermine). The method involves extraction of biogenic amines from wine samples with dansyl chloride derivatization. Ultra-high performance liquid chromatography (UHPLC) equipped with Rapid Resolution High Definition (RRHD) DAD detectors and an Extend-C18 LC column (50 mm x 3.0 mm ID, 1.8 μm particle size) was used for the determination of the biogenic amines. In this study the biogenic amine present at the highest level across the wine samples was tryptamine (TRM), with a maximum content of 170.9±5.3 mg/L in Pinot Blanc wine. Phenylalanine (PHE), cadaverine (CAD), histamine (HIS) and spermidine (SPD) were not detected in all wines: SPD was not detected in 16 wines, HIS in 14, and PHE and CAD in 2 each. Tyramine (TYR), spermine (SPN) and putrescine (PUT) were detected in all wines, but PUT and SPN only at very low concentrations. The wine samples with the highest biogenic amine content were Saint Laurent (BF), Pinot Blanc (S) and Pinot Noir (AF).

  9. 7 CFR 58.227 - Sampling device.

    Science.gov (United States)

    2010-01-01

    ... AGRICULTURAL MARKETING ACT OF 1946 AND THE EGG PRODUCTS INSPECTION ACT (CONTINUED) GRADING AND INSPECTION, GENERAL SPECIFICATIONS FOR APPROVED PLANTS AND STANDARDS FOR GRADES OF DAIRY PRODUCTS 1 General... 7 Agriculture 3 2010-01-01 2010-01-01 false Sampling device. 58.227 Section 58.227 Agriculture...

  10. Determining sample size for tree utilization surveys

    Science.gov (United States)

    Stanley J. Zarnoch; James W. Bentley; Tony G. Johnson

    2004-01-01

    The U.S. Department of Agriculture Forest Service has conducted many studies to determine what proportion of the timber harvested in the South is actually utilized. This paper describes the statistical methods used to determine required sample sizes for estimating utilization ratios for a required level of precision. The data used are those for 515 hardwood and 1,557...

  11. Improving your Hypothesis Testing: Determining Sample Sizes.

    Science.gov (United States)

    Luftig, Jeffrey T.; Norton, Willis P.

    1982-01-01

    This article builds on an earlier discussion of the importance of the Type II error (beta) and power to the hypothesis testing process (CE 511 484), and illustrates the methods by which sample size calculations should be employed so as to improve the research process. (Author/CT)
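
For the common case of a two-sided one-sample z-test, the sample size needed to reach a given power can be sketched with the normal approximation n = ((z_{1-α/2} + z_{power}) / d)², where d is the standardized effect size. The following is a minimal illustration of that textbook formula, not the authors' own worked method:

```python
import math
from statistics import NormalDist

def sample_size(alpha, power, effect_size):
    """Smallest n giving the requested power for a two-sided one-sample z-test."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # critical value for level alpha
    z_beta = NormalDist().inv_cdf(power)           # quantile for the target power
    return math.ceil(((z_alpha + z_beta) / effect_size) ** 2)

# A "medium" standardized effect (d = 0.5) at alpha = 0.05 and 80% power:
print(sample_size(0.05, 0.80, 0.5))  # -> 32
```

Raising the required power from 80% to 90% at the same effect size increases the answer to 43, which illustrates the article's point that ignoring beta (the Type II error rate) leads to underpowered studies.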

  12. From samples to populations in retinex models

    Science.gov (United States)

    Gianini, Gabriele

    2017-05-01

    Some spatial color algorithms, such as Brownian Milano retinex (MI-retinex) and random spray retinex (RSR), are based on sampling. In Brownian MI-retinex, memoryless random walks (MRWs) explore the neighborhood of a pixel and are then used to compute its output. Considering the relative redundancy and inefficiency of MRW exploration, the algorithm RSR replaced the walks by samples of points (the sprays). Recent works point to the fact that a mapping from the sampling formulation to the probabilistic formulation of the corresponding sampling process can offer useful insights into the models, at the same time featuring intrinsically noise-free outputs. The paper continues the development of this concept and shows that the population-based versions of RSR and Brownian MI-retinex can be used to obtain analytical expressions for the outputs of some test images. The comparison of the two analytic expressions from RSR and from Brownian MI-retinex demonstrates not only that the two outputs are, in general, different but also that they depend in a qualitatively different way upon the features of the image.

  13. Algorithms for the Sample Mean of Graphs

    Science.gov (United States)

    Jain, Brijnesh J.; Obermayer, Klaus

Measures of central tendency for graphs are important for prototype construction, frequent substructure mining, and multiple alignment of protein structures. This contribution proposes subgradient-based methods for determining a sample mean of graphs. We assess the performance of the proposed algorithms in a comparative empirical study.

  14. Extracting Periodic Signals From Irregularly Sampled Data

    Science.gov (United States)

    Wilcox, Jaroslava Z.

    1995-01-01

    Successive approximations formed in Fourier space. Algorithm extracts periodic signals from sparse, irregularly sampled sets of measurement data. Pertains to data processed via fast Fourier transforms (FFTs). Data represents signal components with initially unknown frequencies spanning large spectral range and includes frequencies not integer multiples of minimum FFT frequency.
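
As a simplified stand-in for the successive-approximation scheme described here, a single periodic component can be located in irregularly sampled data by projecting the samples onto candidate sinusoids and keeping the frequency with the largest power. All names and parameters below are illustrative, not taken from the brief:

```python
import math
import random

def best_frequency(times, values, candidates):
    """Return the candidate frequency whose sinusoid projection
    captures the most signal power."""
    best_f, best_power = None, -1.0
    for f in candidates:
        # Project the data onto cos/sin components at frequency f.
        c = sum(v * math.cos(2 * math.pi * f * t) for t, v in zip(times, values))
        s = sum(v * math.sin(2 * math.pi * f * t) for t, v in zip(times, values))
        power = c * c + s * s
        if power > best_power:
            best_f, best_power = f, power
    return best_f

rng = random.Random(1)
times = sorted(rng.uniform(0.0, 10.0) for _ in range(200))  # irregular sampling
values = [math.sin(2 * math.pi * 2.5 * t) for t in times]   # true frequency 2.5 Hz
candidates = [0.1 * k for k in range(1, 50)]                # grid from 0.1 to 4.9 Hz
print(best_frequency(times, values, candidates))            # strongest candidate near 2.5
```

The irregular sampling avoids the aliasing that a regular grid would introduce; the FFT-based method in the brief refines estimates like this one iteratively, including frequencies that are not integer multiples of the minimum FFT frequency.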

  15. ANTIBIOTIC RESISTANT BACTERIA IN FAECAL SAMPLES

    African Journals Online (AJOL)

The occurrence of antibiotic-resistant bacteria in the faeces of apparently healthy individual volunteers was investigated. Faecal samples were collected from 216 individuals comprising 138 adults (70 males and 68 females) and 78 children aged between 4 months and 42 years (mean age 30.2 months).

  16. LTRM Vegetation Sampling Strata, UMRS Pool 4

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — The data set includes delineation of sampling strata for the six study reaches of the UMRR Program’s LTRM element. Separate strata coverages exist for each of the...

  17. METABOLITE CHARACTERIZATION IN SERUM SAMPLES FROM ...

    African Journals Online (AJOL)

Metabolites, the end products of cellular processes, reflect the system-level biological stress response. Hence, any enzymatic ... gradient HSQC adiabatic pulses. The experiments were performed in ..... Expansion of 1H-13C HSQC spectra of serum samples from healthy controls highlighting the resonance assignments in the region ...

  18. Dielectric characterisation of human tissue samples

    NARCIS (Netherlands)

    Rossum, W.L. van; Nennie, F.; Deiana, D.; Veen, A.J. van der; Monni, S.

    2014-01-01

    The electrical properties of tissues samples are required for investigation and simulation purposes in biomedical applications of EM sensors. While available open literature mostly deals with ex-vivo characterization of isolated tissues, knowledge on dielectric properties of these tissues in their

  19. Calibrators and control samples for bilirubinometers.

    Science.gov (United States)

    Blijenberg, B G; Brügmann, G; Geilenkeuser, W J; Kusyschyn, R; Röhle, G; Schlebusch, H; Schneider, C

    1993-06-01

The different matrix properties of neonatal serum and commercial control samples can lead to considerable errors in the calibration and control of bilirubinometers. These difficulties can be avoided by calibration with serum from healthy adults which is supplemented with unconjugated bilirubin. But this procedure is impracticable for most routine laboratories. Under certain preconditions, control samples, with bilirubin concentrations determined with correctly calibrated bilirubinometers or spectrophotometers, are also suitable as calibrators. This was established by determination of the bilirubin concentration of 16 different control samples, using both the reference method and correctly calibrated bilirubinometers or spectrophotometers in three or four specialist laboratories. This was also confirmed in several interlaboratory surveys, some involving up to 72 laboratories. The results of these investigations show that a control sample should be used for the calibration of a bilirubinometer only if it meets the following preconditions: 1. There should be no significant difference between the bilirubin values determined with the reference method and with a correctly calibrated spectrophotometer or bilirubinometer. 2. The bilirubin concentration should lie in the range 230-300 μmol/l. The photometric response of bilirubinometers has a limited linear range, so that analytical results greater than 300 μmol/l must be rated as basically unreliable.

  20. Contemporary sample stacking in analytical electrophoresis

    Czech Academy of Sciences Publication Activity Database

    Malá, Zdeňka; Šlampová, Andrea; Křivánková, Ludmila; Gebauer, Petr; Boček, Petr

    2015-01-01

Vol. 36, No. 1 (2015), pp. 15-35. ISSN 0173-0835 R&D Projects: GA ČR(CZ) GA13-05762S Institutional support: RVO:68081715 Keywords: biological samples * stacking * trace analysis * zone electrophoresis Subject RIV: CB - Analytical Chemistry, Separation Impact factor: 2.482, year: 2015