WorldWideScience

Sample records for sample-to-sample conductance fluctuations

  1. Sampling rare fluctuations of discrete-time Markov chains

    Science.gov (United States)

    Whitelam, Stephen

    2018-03-01

    We describe a simple method that can be used to sample the rare fluctuations of discrete-time Markov chains. We focus on the case of Markov chains with well-defined steady-state measures, and derive expressions for the large-deviation rate functions (and upper bounds on such functions) for dynamical quantities extensive in the length of the Markov chain. We illustrate the method using a series of simple examples, and use it to study the fluctuations of a lattice-based model of active matter that can undergo motility-induced phase separation.
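    For a toy two-state chain, the large-deviation quantities named above can be computed directly: the scaled cumulant generating function is the log of the largest eigenvalue of the exponentially tilted transition matrix, and a Legendre transform yields the rate function. The sketch below uses a hypothetical transition matrix and observable (not from the paper):

```python
import numpy as np

# Hypothetical two-state chain; observable A_N counts time spent in state 1.
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])
obs = np.array([0.0, 1.0])

def scgf(s, P, obs):
    """Scaled cumulant generating function lambda(s): log of the largest
    eigenvalue of the tilted matrix P[i, j] * exp(s * obs[j])."""
    tilted = P * np.exp(s * obs)[None, :]
    return np.log(np.max(np.abs(np.linalg.eigvals(tilted))))

# Legendre transform: I(a) = s*a - lambda(s), evaluated at a = lambda'(s).
s_grid = np.linspace(-3.0, 3.0, 601)
lam = np.array([scgf(s, P, obs) for s in s_grid])
a_grid = np.gradient(lam, s_grid)
rate = s_grid * a_grid - lam

# lambda(0) = 0, and lambda'(0) equals the stationary fraction of time
# spent in state 1 (here 1/3), where the rate function vanishes.
```

    Sampling methods such as the one the record describes become necessary when the state space is too large for this direct eigenvalue computation.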

  2. Universal conductance fluctuations in disordered metals

    International Nuclear Information System (INIS)

    Lee, P.A.

    1987-01-01

    The author argues that the observed and theoretical fluctuations in the electrical conductance of disordered metals, induced by variations in the magnetic field or the chemical potential, are not time-dependent noise; rather, the conductance is a deterministic, albeit fluctuating, function for a given realization of the impurity configuration. A method is constructed for representing the sensitivity of the conductance of a given metal to a small change in the impurity configuration as a function of variables such as sample size, impurity concentration, and mean free path. This sensitivity helps explain the size of the 1/f noise due to defect motion in disordered metals.

  3. Nonequilibrium Gyrokinetic Fluctuation Theory and Sampling Noise in Gyrokinetic Particle-in-cell Simulations

    International Nuclear Information System (INIS)

    Krommes, John A.

    2007-01-01

    The present state of the theory of fluctuations in gyrokinetic (GK) plasmas and especially its application to sampling noise in GK particle-in-cell (PIC) simulations is reviewed. Topics addressed include the Δf method, the fluctuation-dissipation theorem for both classical and GK many-body plasmas, the Klimontovich formalism, sampling noise in PIC simulations, statistical closure for partial differential equations, the theoretical foundations of spectral balance in the presence of arbitrary noise sources, and the derivation of Kadomtsev-type equations from the general formalism.

  4. Nonequilibrium Gyrokinetic Fluctuation Theory and Sampling Noise in Gyrokinetic Particle-in-cell Simulations

    Energy Technology Data Exchange (ETDEWEB)

    John A. Krommes

    2007-10-09

    The present state of the theory of fluctuations in gyrokinetic (GK) plasmas and especially its application to sampling noise in GK particle-in-cell (PIC) simulations is reviewed. Topics addressed include the Δf method, the fluctuation-dissipation theorem for both classical and GK many-body plasmas, the Klimontovich formalism, sampling noise in PIC simulations, statistical closure for partial differential equations, the theoretical foundations of spectral balance in the presence of arbitrary noise sources, and the derivation of Kadomtsev-type equations from the general formalism.

  5. Non-Contact Conductivity Measurement for Automated Sample Processing Systems

    Science.gov (United States)

    Beegle, Luther W.; Kirby, James P.

    2012-01-01

    A new method has been developed for monitoring and control of automated sample processing and preparation, focusing especially on desalting of samples before analysis (described in more detail in Automated Desalting Apparatus, (NPO-45428), NASA Tech Briefs, Vol. 34, No. 8 (August 2010), page 44). The use of non-contact conductivity probes, one at the inlet and one at the outlet of the solid-phase sample preparation media, allows monitoring of the process and acts as a trigger for the start of the next step in the sequence (see figure). At each step of the multi-step process, the system is flushed with low-conductivity water, which sets the system back to an overall low-conductivity state. This measurement then triggers the next stage of the sample processing protocol and greatly minimizes the use of consumables. In the case of amino acid sample preparation for desalting, the conductivity measurement defines three key conditions for the sample preparation process: first, when the system is neutralized (low conductivity, by washing with excess de-ionized water); second, when the system is acidified by washing with a strong acid (high conductivity); and third, when the system is at a basic condition of high pH (high conductivity). Taken together, this non-contact conductivity measurement will not only facilitate automation of sample preparation and processing, but will also act as a way to optimize operational time and the use of consumables.

  6. Rescaled Range Analysis and Detrended Fluctuation Analysis: Finite Sample Properties and Confidence Intervals

    Czech Academy of Sciences Publication Activity Database

    Krištoufek, Ladislav

    4/2010, č. 3 (2010), s. 236-250 ISSN 1802-4696 R&D Projects: GA ČR GD402/09/H045; GA ČR GA402/09/0965 Grant - others:GA UK(CZ) 118310 Institutional research plan: CEZ:AV0Z10750506 Keywords : rescaled range analysis * detrended fluctuation analysis * Hurst exponent * long-range dependence Subject RIV: AH - Economics http://library.utia.cas.cz/separaty/2010/E/kristoufek-rescaled range analysis and detrended fluctuation analysis finite sample properties and confidence intervals.pdf
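    As an illustration of the second estimator named in this record, the sketch below (a minimal implementation, not the authors' code) computes the detrended fluctuation analysis scaling exponent of a time series; for uncorrelated noise the exponent should come out close to 0.5:

```python
import numpy as np

def dfa_exponent(x, scales=(8, 16, 32, 64, 128)):
    """Detrended fluctuation analysis: integrate the demeaned series,
    remove a linear trend in non-overlapping windows at each scale,
    and fit the slope of log F(s) versus log s."""
    y = np.cumsum(x - np.mean(x))          # the 'profile'
    F = []
    for s in scales:
        n = len(y) // s
        segments = y[:n * s].reshape(n, s)
        t = np.arange(s)
        sq = [np.mean((seg - np.polyval(np.polyfit(t, seg, 1), t)) ** 2)
              for seg in segments]
        F.append(np.sqrt(np.mean(sq)))
    return np.polyfit(np.log(scales), np.log(F), 1)[0]

# White noise has no long-range dependence: exponent near 0.5.
rng = np.random.default_rng(0)
h = dfa_exponent(rng.standard_normal(4096))
```

    As the record's title suggests, the finite-sample spread of such estimates around the true exponent is exactly what confidence intervals for this estimator must capture.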

  7. Method for Measuring Thermal Conductivity of Small Samples Having Very Low Thermal Conductivity

    Science.gov (United States)

    Miller, Robert A.; Kuczmarski, Maria A.

    2009-01-01

    This paper describes the development of a hot plate method capable of using air as a standard reference material for the steady-state measurement of the thermal conductivity of very small test samples having thermal conductivity on the order of air. As with other approaches, care is taken to ensure that the heat flow through the test sample is essentially one-dimensional. However, unlike other approaches, no attempt is made to use heated guards to block the flow of heat from the hot plate to the surroundings. It is argued that since large correction factors must be applied to account for guard imperfections when sample dimensions are small, it may be preferable to simply measure and correct for the heat that flows from the heater disc to directions other than into the sample. Experimental measurements taken in a prototype apparatus, combined with extensive computational modeling of the heat transfer in the apparatus, show that sufficiently accurate measurements can be obtained to allow determination of the thermal conductivity of low thermal conductivity materials. Suggestions are made for further improvements in the method based on results from regression analyses of the generated data.
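    The correction strategy described above amounts to a Fourier's-law estimate in which the measured parasitic heat flow is subtracted from the heater power. A minimal sketch, with purely illustrative numbers (not values from the paper):

```python
def thermal_conductivity(q_heater_w, q_loss_w, thickness_m, area_m2, dT_k):
    """Fourier's-law estimate k = Q*L/(A*dT), with the measured heat
    lost to the surroundings subtracted from the heater power."""
    q_sample_w = q_heater_w - q_loss_w
    return q_sample_w * thickness_m / (area_m2 * dT_k)

# Illustrative numbers only: most of the unguarded heater power is lost,
# and the remainder crosses the low-conductivity sample.
k = thermal_conductivity(q_heater_w=0.050, q_loss_w=0.035,
                         thickness_m=0.003, area_m2=0.0025, dT_k=0.72)
# k comes out on the order of air's conductivity (~0.026 W/(m*K))
```

    The difficulty the paper addresses is that for small, low-conductivity samples the loss term dominates the heater power, so it must be measured (or modeled) accurately rather than suppressed with guards.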

  8. Conducting Clinical Research Using Crowdsourced Convenience Samples.

    Science.gov (United States)

    Chandler, Jesse; Shapiro, Danielle

    2016-01-01

    Crowdsourcing has had a dramatic impact on the speed and scale at which scientific research can be conducted. Clinical scientists have particularly benefited from readily available research study participants and the streamlined recruiting and payment systems afforded by Amazon Mechanical Turk (MTurk), a popular labor market for crowdsourcing workers. MTurk has been used in this capacity for more than five years. The popularity and novelty of the platform have spurred numerous methodological investigations, making it the most studied nonprobability sample available to researchers. This article summarizes what is known about MTurk sample composition and data quality, with an emphasis on findings relevant to clinical psychological research. It then addresses methodological issues with using MTurk, many of which are common to other nonprobability samples but unfamiliar to clinical science researchers, and suggests concrete steps to avoid these issues or minimize their impact.

  9. Monthly Fluctuations of Insomnia Symptoms in a Population-Based Sample

    Science.gov (United States)

    Morin, Charles M.; LeBlanc, M.; Ivers, H.; Bélanger, L.; Mérette, Chantal; Savard, Josée; Jarrin, Denise C.

    2014-01-01

    interval of 3 months proved the most reliable for defining chronic insomnia. Conclusions: Monthly assessment of insomnia and sleep patterns revealed significant variability over the course of a 12-month period. These findings highlight the importance for future epidemiological studies of conducting repeated assessment at shorter than the typical yearly interval in order to reliably capture the natural course of insomnia over time. Citation: Morin CM; LeBlanc M; Ivers H; Bélanger L; Mérette C; Savard J; Jarrin DC. Monthly fluctuations of insomnia symptoms in a population-based sample. SLEEP 2014;37(2):319-326. PMID:24497660

  10. Note: Development of a microfabricated sensor to measure thermal conductivity of picoliter scale liquid samples.

    Science.gov (United States)

    Park, Byoung Kyoo; Yi, Namwoo; Park, Jaesung; Kim, Dongsik

    2012-10-01

    This paper presents a thermal analysis device that can measure the thermal conductivity of picoliter-scale liquid samples. We employ the three omega method with a microfabricated AC thermal sensor with a nanometer-width heater. The liquid sample is confined by a micro-well structure fabricated on the sensor surface. The performance of the instrument was verified by measuring the thermal conductivity of 27-picoliter samples of de-ionized (DI) water, ethanol, methanol, and DI water-ethanol mixtures with accuracies better than 3%. Furthermore, another analytical scheme allows real-time thermal conductivity measurement with 5% accuracy. To the best of our knowledge, this technique requires the smallest sample volume of any thermal property measurement reported to date.

  11. Scanning Ion Conductance Microscopy for Studying Biological Samples

    Directory of Open Access Journals (Sweden)

    Irmgard D. Dietzel

    2012-11-01

    Scanning ion conductance microscopy (SICM) is a scanning probe technique that utilizes the increase in access resistance that occurs when an electrolyte-filled glass micropipette is approached towards a poorly conducting surface. Since an increase in resistance can be monitored before physical contact between the scanning probe tip and the sample, this technique is particularly useful for investigating the topography of delicate samples such as living cells. SICM has shown its potential in various applications such as high-resolution and long-time imaging of living cells or the determination of local changes in cellular volume. Furthermore, SICM has been combined with various techniques such as fluorescence microscopy or patch clamping to reveal localized information about proteins or protein functions. This review details the various advantages and pitfalls of SICM and provides an overview of the recent developments and applications of SICM in biological imaging. Furthermore, we show that in principle, a combination of SICM and ion-selective micro-electrodes enables one to monitor the local ion activity surrounding a living cell.

  12. Comparative fluctuating asymmetry of spotted barb (Puntius binotatus sampled from the Rivers of Wawa and Tubay, Mindanao, Philippines

    Directory of Open Access Journals (Sweden)

    C.C. Cabuga Jr.

    2017-03-01

    Fluctuating asymmetry (FA) is commonly used to evaluate environmental stress and the developmental variability of different biotic elements. This study describes the possible effects of pollutants on the body shape of spotted barb (Puntius binotatus), with notes on the physico-chemical parameters of the Wawa River, Bayugan City, Agusan del Sur and the Tubay River, Tubay, Agusan del Norte, Philippines. A total of 80 samples (40 females and 40 males) were collected from each sampling area. Digital images were acquired and loaded into the tpsDig2 program, and standard landmarks for fish morphometrics were employed. Using thin-plate spline (TPS) series, landmark analysis was completed and subjected to the symmetry and asymmetry in geometric data (SAGE) software. Procrustes ANOVA showed highly significant differences (P<0.0001) in the three factors analyzed (individuals, sides, and the individual-by-side interaction), indicating high fluctuating asymmetry. In the Tubay River the level of asymmetry was 79.06% in females and 71.69% in males, while in the Wawa River it was 76.60% in females and 62.64% in males; such high levels of asymmetry denote environmental alteration. Physico-chemical parameters were also determined in the two sampling areas: one-way ANOVA showed a significant difference (P<0.0001) among the mean parameters in the Wawa River, but none in the Tubay River. Pearson correlation between fluctuating asymmetry and the physico-chemical parameters showed no relationship, suggesting that the asymmetry is not directly driven by the measured water components. The combined approach of FA and physico-chemical parameters was useful for evaluating environmental condition as well as the species' state of well-being.

  13. Universal mesoscopic conductance fluctuations

    International Nuclear Information System (INIS)

    Evangelou, S.N.

    1992-01-01

    The theory of conductance fluctuations in disordered metallic systems whose size is large compared with the electron mean free path but small compared with the localization length is considered. It is demonstrated that the fluctuations have a universal character and are due to repulsion between levels and spectral rigidity. The basic fluctuation measures for the energy spectrum in the mesoscopic regime of disordered systems are consistent with the predictions of the Gaussian random matrix ensemble. Although the disordered-electron random matrix ensemble does not belong to the Gaussian ensemble, the two ensembles turn out to be essentially similar. The level repulsion and spectral rigidity found in nuclear spectra should also be observed in the metallic regime of Anderson localization. 7 refs. (orig.)

  14. Fluctuation conductivity of thin superconductive vanadium films

    International Nuclear Information System (INIS)

    Dmitrenko, I.M.; Sidorenko, A.S.; Fogel, N.Y.

    1982-01-01

    Resistive transitions into the superconductive state are studied in thin superconductive vanadium films. For T >> Tc the experimental data on the excess conductivity of the films agree qualitatively and quantitatively with the Aslamazov-Larkin theory; there is no Maki-Thompson contribution to the fluctuation conductivity. Near Tc the excess conductivity σ' changes exponentially with temperature, in accordance with the predictions of the theory of critical fluctuations of the order parameter. The values of the effective charge-carrier mass determined from the σ' data differ markedly between the low-fluctuation and critical-fluctuation regions; this difference is within the spread of effective masses already known for the various charge-carrier groups in vanadium. Causes of the difference in resistive behavior in the regions above and below Tc are considered.

  15. Influence of high-conductivity buffer composition on field-enhanced sample injection coupled to sweeping in CE.

    Science.gov (United States)

    Anres, Philippe; Delaunay, Nathalie; Vial, Jérôme; Thormann, Wolfgang; Gareil, Pierre

    2013-02-01

    The aim of this work was to clarify the mechanism taking place in field-enhanced sample injection coupled to sweeping and micellar EKC (FESI-Sweep-MEKC), with the utilization of two acidic high-conductivity buffers (HCBs), phosphoric acid or sodium phosphate buffer, in view of maximizing sensitivity enhancements. Using cationic model compounds in acidic media, a chemometric approach and simulations with SIMUL5 were implemented. Experimental design first enabled identification of the significant factors and their potential interactions. Simulation demonstrates the formation of moving boundaries during sample injection, which originate at the initial sample/HCB and HCB/buffer discontinuities and gradually change the compositions of the HCB and BGE. With sodium phosphate buffer, the HCB conductivity increased during the injection, leading to a more efficient preconcentration by stacking (about 1.6 times) than with phosphoric acid alone, for which conductivity decreased during injection. For the same injection time at constant voltage, however, a lower amount of analytes was injected with sodium phosphate buffer than with phosphoric acid. Consequently, sensitivity enhancements were lower for the whole FESI-Sweep-MEKC process. This is why, in order to maximize sensitivity enhancements, it is proposed to work with sodium phosphate buffer as the HCB and to use constant current during sample injection. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  16. Impact of frequent cerebrospinal fluid sampling on Aβ levels: systematic approach to elucidate influencing factors.

    Science.gov (United States)

    Van Broeck, Bianca; Timmers, Maarten; Ramael, Steven; Bogert, Jennifer; Shaw, Leslie M; Mercken, Marc; Slemmon, John; Van Nueten, Luc; Engelborghs, Sebastiaan; Streffer, Johannes Rolf

    2016-05-19

    Cerebrospinal fluid (CSF) amyloid-beta (Aβ) peptides are predictive biomarkers for Alzheimer's disease and are proposed as pharmacodynamic markers for amyloid-lowering therapies. However, frequent sampling results in fluctuating CSF Aβ levels that have a tendency to increase compared with baseline. The impact of sampling frequency, volume, catheterization procedure, and ibuprofen pretreatment on CSF Aβ levels was assessed using continuous sampling over 36 h. In this open-label biomarker study, healthy participants (n = 18; either sex, age 55-85 years) were randomized into one of three cohorts (n = 6/cohort; high-frequency sampling). In all cohorts except cohort 2 (sampling started 6 h post catheterization), sampling through lumbar catheterization started immediately post catheterization. Cohort 3 received ibuprofen (800 mg) before catheterization. Following an interim data review, an additional cohort 4 (n = 6) with an optimized sampling scheme (low frequency and lower volume) was included. CSF Aβ(1-37), Aβ(1-38), Aβ(1-40), and Aβ(1-42) levels were analyzed. Increases and fluctuations in mean CSF Aβ levels occurred in cohorts 1-3 at times of high-frequency sampling. Some outliers in which this effect was extremely pronounced were observed (cohorts 2 and 3). Cohort 4 demonstrated minimal fluctuation of CSF Aβ both on a group and an individual level. Intersubject variability in CSF Aβ profiles over time was observed in all cohorts. CSF Aβ level fluctuation upon catheterization primarily depends on the sampling frequency and volume, but not on the catheterization procedure or inflammatory reaction. An optimized low-frequency sampling protocol minimizes or eliminates fluctuation of CSF Aβ levels, which will improve the capability of accurately measuring the pharmacodynamic read-out for amyloid-lowering therapies. ClinicalTrials.gov NCT01436188. Registered 15 September 2011.

  17. Non-Gaussian conductivity fluctuations in semiconductors

    International Nuclear Information System (INIS)

    Melkonyan, S.V.

    2010-01-01

    A theoretical study is presented on the statistical properties of conductivity fluctuations caused by concentration and mobility fluctuations of the current carriers. It is established that mobility fluctuations result from random deviations in the thermal equilibrium distribution of the carriers. It is shown that the mobility fluctuations have generation-recombination and shot components which do not satisfy the requirements of the central limit theorem, in contrast to the carrier-concentration fluctuation and the intraband component of the mobility fluctuation. In general, therefore, the mobility fluctuation consists of a thermal (or intraband) Gaussian component and non-thermal (generation-recombination, shot, etc.) non-Gaussian components. Analysis of the theoretical results and of experimental data from the literature shows that the statistical properties of the mobility fluctuation and of 1/f noise fully coincide. The deviation of the mobility or 1/f fluctuations from Gaussian statistics goes hand in hand with the magnitude of the non-thermal noise (generation-recombination, shot, burst, and pulse noise, etc.).

  18. Synthesis, structural characterization and fluctuation conductivity of HoBa2Cu3O7-δ-SrTiO3 composites

    International Nuclear Information System (INIS)

    Uribe Laverde, M.A.; Landinez Tellez, D.A.; Roa-Rojas, J.

    2010-01-01

    Single-phase polycrystalline samples of the HoBa2Cu3O7-δ superconductor and the SrTiO3 insulator were produced by means of the solid-state reaction technique. After structural characterization of both materials, superconductor-insulator composites were produced with nominal insulator volume percentages between 0% and 10%. Resistivity measurements for the composites and the HoBa2Cu3O7-δ sample with different currents evidenced a superconducting transition with critical temperature TC = 92 K, with wider transitions with increasing either insulator content or measurement current. Fluctuation conductivity analyses were carried out to obtain the exponents characterizing the conductivity divergence. Above TC, apart from the typical Gaussian and critical fluctuations, an atypical regime with critical exponent of about 0.14 is observed as a precursor of the transition. Below TC, the characteristic exponent of the coherence transition increases rapidly with increasing insulator percentage in the composites and does not change appreciably when the current is modified in the pure superconductor sample.

  19. Convergence of sampling in protein simulations

    NARCIS (Netherlands)

    Hess, B

    With molecular dynamics protein dynamics can be simulated in atomic detail. Current computers are not fast enough to probe all available conformations, but fluctuations around one conformation can be sampled to a reasonable extent. The motions with the largest fluctuations can be filtered out of a

  20. Fluctuation-enhanced electric conductivity in electrolyte solutions.

    Science.gov (United States)

    Péraud, Jean-Philippe; Nonaka, Andrew J; Bell, John B; Donev, Aleksandar; Garcia, Alejandro L

    2017-10-10

    We analyze the effects of an externally applied electric field on thermal fluctuations for a binary electrolyte fluid. We show that the fluctuating Poisson-Nernst-Planck (PNP) equations for charged multispecies diffusion coupled with the fluctuating fluid momentum equation result in enhanced charge transport via a mechanism distinct from the well-known enhancement of mass transport that accompanies giant fluctuations. Although the mass and charge transport occurs by advection by thermal velocity fluctuations, it can macroscopically be represented as electrodiffusion with renormalized electric conductivity and a nonzero cation-anion diffusion coefficient. Specifically, we predict a nonzero cation-anion Maxwell-Stefan coefficient proportional to the square root of the salt concentration, a prediction that agrees quantitatively with experimental measurements. The renormalized or effective macroscopic equations are different from the starting PNP equations, which contain no cross-diffusion terms, even for rather dilute binary electrolytes. At the same time, for infinitely dilute solutions the renormalized electric conductivity and renormalized diffusion coefficients are consistent and the classical PNP equations with renormalized coefficients are recovered, demonstrating the self-consistency of the fluctuating hydrodynamics equations. Our calculations show that the fluctuating hydrodynamics approach recovers the electrophoretic and relaxation corrections obtained by Debye-Huckel-Onsager theory, while elucidating the physical origins of these corrections and generalizing straightforwardly to more complex multispecies electrolytes. Finally, we show that strong applied electric fields result in anisotropically enhanced "giant" velocity fluctuations and reduced fluctuations of salt concentration.

  1. Monthly fluctuations of insomnia symptoms in a population-based sample.

    Science.gov (United States)

    Morin, Charles M; Leblanc, M; Ivers, H; Bélanger, L; Mérette, Chantal; Savard, Josée; Jarrin, Denise C

    2014-02-01

    To document the monthly changes in sleep/insomnia status over a 12-month period; to determine the optimal time intervals to reliably capture new incident cases and recurrent episodes of insomnia and the likelihood of its persistence over time. Participants were 100 adults (mean age = 49.9 years; 66% women) randomly selected from a larger population-based sample enrolled in a longitudinal study of the natural history of insomnia. They completed 12 monthly telephone interviews assessing insomnia, use of sleep aids, stressful life events, and physical and mental health problems in the previous month. A total of 1,125 interviews of a potential 1,200 were completed. Based on data collected at each assessment, participants were classified into one of three subgroups: good sleepers, insomnia symptoms, and insomnia syndrome. At baseline, 42 participants were classified as good sleepers, 34 met criteria for insomnia symptoms, and 24 for an insomnia syndrome. There were significant fluctuations of insomnia over time, with 66% of the participants changing sleep status at least once over the 12 monthly assessments (51.5% for good sleepers, 59.5% for insomnia syndrome, and 93.4% for insomnia symptoms). Changes of status were more frequent among individuals with insomnia symptoms at baseline (mean = 3.46, SD = 2.36) than among those initially classified as good sleepers (mean = 2.12, SD = 2.70). Among the subgroup with insomnia symptoms at baseline, 88.3% reported improved sleep (i.e., became good sleepers) at least once over the 12 monthly assessments compared to 27.7% whose sleep worsened (i.e., met criteria for an insomnia syndrome) during the same period. Among individuals classified as good sleepers at baseline, risks of developing insomnia symptoms and syndrome over the subsequent months were, respectively, 48.6% and 14.5%. Monthly assessment over an interval of 6 months was found most reliable to estimate incidence rates, while an interval of 3 months proved the most

  2. Fast temporal fluctuations in single-molecule junctions.

    Science.gov (United States)

    Ochs, Rolf; Secker, Daniel; Elbing, Mark; Mayor, Marcel; Weber, Heiko B

    2006-01-01

    The noise within the electrical current through single-molecule junctions is studied at cryogenic temperature. The organic sample molecules were contacted with the mechanically controlled break-junction technique. The noise spectra refer to a regime where only a few Lorentzian fluctuators occur in the conductance. The frequency dependence shows qualitative variations from sample to sample.

  3. Short-term fluctuations in motivation to quit smoking in a sample of smokers in Hawaii.

    Science.gov (United States)

    Herzog, Thaddeus; Pokhrel, Pallav; Kawamoto, Crissy T

    2015-01-01

    Despite its potential usefulness in informing the development of smoking cessation interventions, short-term fluctuation in motivation to quit is a relatively understudied topic. The aims were to assess the prevalence of smokers' day-to-day fluctuations in motivation to quit, and to assess associations of these fluctuations with several established cessation-related variables. A cross-sectional survey was administered to smokers in Hawaii (N = 1,567). To assess short-term fluctuations in motivation to quit smoking, participants were asked to respond "True" or "False" to the statement: "My motivation to quit smoking changes from one day to the next." Other items measured desire to quit smoking, intention to quit, confidence in quitting, cigarette dependence, and other cessation-related variables. The statement was endorsed as true by 64.7% of smokers and false by 35.3%. Analyses revealed that smokers who indicated fluctuating motivation were significantly more interested in quitting than smokers without fluctuations. Fluctuations in motivation to quit were also associated with greater confidence in quitting, lesser cigarette dependence, and more recent quitting activity (all statistically significant). Short-term fluctuations in motivation to quit are common, and day-to-day fluctuations are strongly associated with higher motivation to quit, greater confidence in future quitting, and other positive cessation-relevant trends.

  4. A passive guard for low thermal conductivity measurement of small samples by the hot plate method

    International Nuclear Information System (INIS)

    Jannot, Yves; Godefroy, Justine; Degiovanni, Alain; Grigorova-Moutiers, Veneta

    2017-01-01

    Hot plate methods under steady-state conditions are based on a 1D model to estimate the thermal conductivity, using measurements of the temperatures T0 and T1 of the two sides of the sample and of the heat flux crossing it. To be consistent with the hypothesis of 1D heat flux, either a guarded hot plate apparatus is used, or the temperature is measured at the centre of the sample. The latter method can be used only if the thickness/width ratio of the sample is sufficiently low, while the guarded hot plate method requires large-width samples (typical cross-section 0.6 × 0.6 m2); neither method is therefore suitable for small-width samples. The method presented in this paper is based on an optimal choice of the temperatures T0 and T1 relative to the ambient temperature Ta, enabling the estimation of the thermal conductivity with a centered hot plate method by applying the 1D heat flux model. It is shown that these optimal values do not depend on the size or the thermal conductivity of the samples (in the range 0.015-0.2 W m-1 K-1), but only on Ta. The experimental results obtained validate the method on several reference samples for thickness/width ratios up to 0.3, thus enabling the measurement of the thermal conductivity of samples with a cross-section as small as 0.045 × 0.045 m2. (paper)

  5. Soil Gas Sample Handling: Evaluation of Water Removal and Sample Ganging

    Energy Technology Data Exchange (ETDEWEB)

    Fritz, Brad G. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Abrecht, David G. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Hayes, James C. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Mendoza, Donaldo P. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2016-10-31

    Soil gas sampling is currently conducted in support of Nuclear Test Ban Treaty verification. Soil gas samples are collected and analyzed for isotopes of interest. Some issues that can impact sampling and analysis of these samples are excess moisture and sample processing time. Here we discuss three potential improvements to the current sampling protocol: a desiccant for water removal, use of a molecular sieve to remove CO2 from the sample during collection, and a ganging manifold to allow composite analysis of multiple samples.

  6. Fluctuation conductivity in cuprate superconductors

    Indian Academy of Sciences (India)

    CaCu2O8+ single crystals in the temperature range 70–300 K. The thermodynamic fluctuations in the conductivity of both the samples start around ∼ 125 K. We find the Lawrence and Doniach [1] model to be inadequate to describe the ...

  7. Synthesis, structural characterization and fluctuation conductivity of HoBa2Cu3O7-δ-SrTiO3 composites

    Energy Technology Data Exchange (ETDEWEB)

    Uribe Laverde, M.A., E-mail: mauribel@bt.unal.edu.c [Grupo de Fisica de Nuevos Materiales, Departamento de Fisica, Universidad Nacional de Colombia, Bogota (Colombia); Landinez Tellez, D.A.; Roa-Rojas, J. [Grupo de Fisica de Nuevos Materiales, Departamento de Fisica, Universidad Nacional de Colombia, Bogota (Colombia)

    2010-12-15

    Single-phase polycrystalline samples of the HoBa{sub 2}Cu{sub 3}O{sub 7-{delta}} superconductor and the SrTiO{sub 3} isolator were produced by means of the solid-state reaction technique. After structural characterization of both materials, superconductor-isolator composites were produced with nominal isolator volume percentages between 0% and 10%. Resistivity measurements for the composites and the HoBa{sub 2}Cu{sub 3}O{sub 7-{delta}} sample with different currents evidenced a superconducting transition with critical temperature T{sub C} = 92 K, the transition widening as either the isolator content or the measurement current increased. Fluctuation conductivity analyses were carried out to obtain the exponents characterizing the conductivity divergence. Above T{sub C}, apart from the typical Gaussian and critical fluctuations, an atypical regime with a critical exponent of about 0.14 is observed as a precursor of the transition. Below T{sub C}, the characteristic exponent of the coherence transition increases rapidly with increasing isolator percentage in the composites and does not change significantly when the current is modified in the pure superconductor sample.

  8. Assessing the precision of a time-sampling-based study among GPs: balancing sample size and measurement frequency.

    Science.gov (United States)

    van Hassel, Daniël; van der Velden, Lud; de Bakker, Dinny; van der Hoek, Lucas; Batenburg, Ronald

    2017-12-04

    Our research is based on a time-sampling technique, an innovative method for measuring the working hours of Dutch general practitioners (GPs), which was deployed in an earlier study. In that study, 1051 GPs were questioned about their activities in real time by sending them one SMS text message every 3 h during 1 week. The required sample size for this method is important for health workforce planners to know if they want to apply it to target groups who are hard to reach or if fewer resources are available. In this time-sampling method, however, a standard power analysis is not sufficient for calculating the required sample size, as it accounts only for sample fluctuation and not for the fluctuation of measurements taken from every participant. We investigated the impact of the number of participants and the frequency of measurements per participant on the confidence intervals (CIs) for the hours worked per week. Statistical analyses of the time-use data we obtained from GPs were performed. Ninety-five percent CIs were calculated, using equations and simulation techniques, for various numbers of GPs included in the dataset and for various frequencies of measurements per participant. Our results showed that the one-tailed CI, including sample and measurement fluctuation, decreased from 21 to 3 h as the number of GPs increased from one to 50. Beyond that, precision continued to improve, but by smaller increments for the same additional number of GPs. Likewise, the analyses showed how the number of participants required decreases if more measurements per participant are taken. For example, one measurement per 3-h time slot during the week requires 300 GPs to achieve a CI of 1 h, while one measurement per hour requires 100 GPs to obtain the same result. The sample size needed for time-use research based on a time-sampling technique depends on the design and aim of the study. In this paper, we showed how the precision of the
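    The point a standard power analysis misses can be illustrated with a simple two-level variance model, in which the CI half-width depends on both the number of participants and the number of measurements per participant. A sketch under assumed spread values (the σ values below are illustrative, not the paper's):

```python
import math

def ci_halfwidth(sigma_between, sigma_within, n_participants, m_measurements, z=1.96):
    """95% CI half-width for the grand mean when each of n participants
    contributes m noisy measurements (two-level variance model)."""
    var = sigma_between**2 / n_participants \
        + sigma_within**2 / (n_participants * m_measurements)
    return z * math.sqrt(var)

# Illustrative spreads: extra measurements per GP substitute for extra GPs,
# as in the paper's 300-vs-100 GP example.
wide = ci_halfwidth(10.0, 12.0, 100, 5)
narrow = ci_halfwidth(10.0, 12.0, 100, 25)
print(wide > narrow)  # True: more measurements per participant shrink the CI
```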

  9. Structural Origins of Conductance Fluctuations in Gold–Thiolate Molecular Transport Junctions

    KAUST Repository

    French, William R.; Iacovella, Christopher R.; Rungger, Ivan; Souza, Amaury Melo; Sanvito, Stefano; Cummings, Peter T.

    2013-03-21

    We report detailed atomistic simulations combined with high-fidelity conductance calculations to probe the structural origins of conductance fluctuations in thermally evolving Au-benzene-1,4-dithiolate-Au junctions. We compare the behavior of structurally ideal junctions (where the electrodes are modeled as flat surfaces) to structurally realistic, experimentally representative junctions resulting from break-junction simulations. The enhanced mobility of metal atoms in structurally realistic junctions results in significant changes to the magnitude and origin of the conductance fluctuations. Fluctuations are larger by a factor of 2-3 in realistic junctions compared to ideal junctions. Moreover, in junctions with highly deformed electrodes, the conductance fluctuations arise primarily from changes in the Au geometry, in contrast to results for junctions with nondeformed electrodes, where the conductance fluctuations are dominated by changes in the molecule geometry. These results provide important guidance to experimentalists developing strategies to control molecular conductance, and also to theoreticians invoking simplified structural models of junctions to predict their behavior. © 2013 American Chemical Society.

  11. Conducting a respondent-driven sampling survey with the use of existing resources in Sydney, Australia.

    Science.gov (United States)

    Paquette, Dana M; Bryant, Joanne; Crawford, Sione; de Wit, John B F

    2011-07-01

    Respondent-driven sampling (RDS) is a form of chain-referral sampling that is increasingly being used for HIV behavioural surveillance. When used for surveillance purposes, a sampling method should be relatively inexpensive and simple to operate. This study examined whether an RDS survey of people who inject drugs (PWID) in Sydney, Australia, could be successfully conducted through the use of minimal and existing resources. The RDS survey was conducted on the premises of a local needle and syringe program (NSP) with some adjustments to take into account the constraints of existing resources. The impact of the survey on clients and on staff was examined by summarizing NSP service data and by conducting post-survey discussions with NSP staff. From November 2009 till March 2010, 261 participants were recruited in 16 waves. A significant increase was found in the number of services provided by the NSP during and after data collection. Generally, staff felt that the survey had a positive impact by exposing a broader group of people to the NSP. However, conducting the survey may have led to privacy issues for NSP clients due to an increased number of people gathering around the NSP. This study shows that RDS can be conducted with the use of minimal and existing resources under certain conditions (e.g., use of a self-administered questionnaire and no biological samples taken). A more detailed cost-utility analysis is needed to determine whether RDS' advantages outweigh potential challenges when compared to simpler and less costly convenience methods. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  12. Optimization of sampling parameters for standardized exhaled breath sampling.

    Science.gov (United States)

    Doran, Sophie; Romano, Andrea; Hanna, George B

    2017-09-05

    The lack of standardization of breath sampling is a major contributing factor to the poor repeatability of results and hence represents a barrier to the adoption of breath tests in clinical practice. On-line and bag breath sampling have advantages but do not suit multicentre clinical studies, whereas storage and robust transport are essential for the conduct of wide-scale studies. Several devices have been developed to control sampling parameters and to concentrate volatile organic compounds (VOCs) onto thermal desorption (TD) tubes and subsequently transport those tubes for laboratory analysis. We conducted three experiments to investigate (i) the fraction of breath sampled (whole vs. lower expiratory exhaled breath); (ii) breath sample volume (125, 250, 500 and 1000 ml) and (iii) breath sample flow rate (400, 200, 100 and 50 ml/min). The target VOCs were acetone and potential volatile biomarkers for oesophago-gastric cancer belonging to the aldehyde, fatty acid and phenol chemical classes. We also examined the collection execution time and the impact of environmental contamination. The experiments showed that the use of exhaled breath-sampling devices requires the selection of optimum sampling parameters. An increase in sample volume improved the levels of VOCs detected. However, the influence of the fraction of exhaled breath and of the flow rate depends on the target VOCs measured. The concentration of potential volatile biomarkers for oesophago-gastric cancer was not significantly different between the whole and lower airway exhaled breath. While the recovery of phenols and acetone from TD tubes was lower when breath sampling was performed at a higher flow rate, other VOCs were not affected. A dedicated 'clean air supply' overcomes the contamination from ambient air, but the breath collection device itself can be a source of contaminants. In clinical studies using VOCs to diagnose gastro-oesophageal cancer, the optimum parameters are a 500 ml sample volume

  13. Fluctuation-induced conductivity in melt-textured Pr-doped YBa2Cu3O7-δ composite superconductor

    DEFF Research Database (Denmark)

    Opata, Yuri Aparecido; Monteiro, João Frederico Haas Leandro; Siqueira, Ezequiel Costa

    2018-01-01

    In this study, the effects of thermal fluctuations on the electrical conductivity of melt-textured YBa2Cu3O7-δ, Y0.95Pr0.05Ba2Cu3O7-δ and (YBa2Cu3O7-δ)0.95–(PrBa2Cu3O7-δ)0.05 composite superconductors were considered. The composite superconductor samples were prepared through the top-seeding method... using melt-textured NdBa2Cu3O7-δ seeds. The resistivity measurements were performed with a low-frequency, low-current AC technique in order to extract the temperature derivative and analyze the influence of the praseodymium ion on the normal–superconductor transition and consequently on the fluctuation...

  14. Reexamination of basal plane thermal conductivity of suspended graphene samples measured by electro-thermal micro-bridge methods

    Directory of Open Access Journals (Sweden)

    Insun Jo

    2015-05-01

    Thermal transport in suspended graphene samples has been measured in prior works and in this work with the use of a suspended electro-thermal micro-bridge method. These measurement results are analyzed here to evaluate and eliminate the errors caused by the extrinsic thermal contact resistance. It is noted that the room-temperature thermal resistance measured in a recent work increases linearly with the suspended length of the single-layer graphene samples synthesized by chemical vapor deposition (CVD), and that such a feature does not reveal the failure of Fourier's law despite the increase in the reported apparent thermal conductivity with length. The re-analyzed apparent thermal conductivity of a single-layer CVD graphene sample reaches about 1680 ± 180 W m−1 K−1 at room temperature, which is close to the highest value reported for highly oriented pyrolytic graphite. In comparison, the apparent thermal conductivity values measured for two suspended exfoliated bi-layer graphene samples are about 880 ± 60 and 730 ± 60 W m−1 K−1 at room temperature, and approach that of the natural graphite source above room temperature. However, the low-temperature thermal conductivities of these suspended graphene samples are still considerably lower than the graphite values, with the peak thermal conductivities shifted to much higher temperatures. Analysis of the thermal conductivity data reveals that the low-temperature behavior is dominated by phonon scattering by polymer residue instead of by the lateral boundary.
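    The linear length dependence of the measured thermal resistance is what allows the contact resistance to be separated out: fitting R = R_contact + L/(κA) and taking the slope yields the intrinsic conductivity. A minimal sketch with synthetic data (the sample dimensions and the 1680 W m−1 K−1 value are used here only to generate the test data):

```python
def kappa_from_length_series(lengths_m, resistances_k_per_w, width_m, thickness_m):
    """Least-squares fit of R = R_contact + L/(kappa*A): the slope of
    measured thermal resistance vs suspended length gives the intrinsic
    conductivity, with the intercept absorbing the contact resistance."""
    n = len(lengths_m)
    mean_l = sum(lengths_m) / n
    mean_r = sum(resistances_k_per_w) / n
    slope = sum((l - mean_l) * (r - mean_r)
                for l, r in zip(lengths_m, resistances_k_per_w)) \
        / sum((l - mean_l) ** 2 for l in lengths_m)
    area = width_m * thickness_m
    return 1.0 / (slope * area)

# Hypothetical single-layer samples: 0.335 nm thick, 2 um wide
lengths = [2e-6, 5e-6, 9e-6]
area_kappa = 1680.0 * 2e-6 * 0.335e-9          # synthesize data with kappa = 1680
resist = [1e5 + l / area_kappa for l in lengths]  # 1e5 K/W contact offset
print(round(kappa_from_length_series(lengths, resist, 2e-6, 0.335e-9)))  # 1680
```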

  15. Precision of quantization of the hall conductivity in a finite-size sample: Power law

    International Nuclear Information System (INIS)

    Greshnov, A. A.; Kolesnikova, E. N.; Zegrya, G. G.

    2006-01-01

    A microscopic calculation of the conductivity in the integer quantum Hall effect (IQHE) mode is carried out. The precision of quantization is analyzed for finite-size samples. The precision of quantization shows a power-law dependence on the sample size. A new scaling parameter describing this dependence is introduced. It is also demonstrated that the precision of quantization linearly depends on the ratio between the amplitude of the disorder potential and the cyclotron energy. The data obtained are compared with the results of magnetotransport measurements in mesoscopic samples

  16. Thermoelectric coefficient L(T) of polycrystalline silver doped BSCCO samples

    International Nuclear Information System (INIS)

    Rodriguez, J.E.; Marino, A.

    1998-01-01

    We present a study of the thermoelectric coefficient L(T) of polycrystalline silver doped BSCCO samples. The quantity L(T) relates the thermoelectric coefficient S(T) with the electrical conductivity σ (T) and gives an indication of the influence of the order parameter fluctuations (OPF) on S(T) in the mean field region (Mfr). The results of L(T) indicate that the critical behavior of S(T) above the superconducting transition is not only driven by σ (T). These results suggest that in the Mfr, L(T) is affected by thermodynamic fluctuations of the superconducting order parameter (OPF). The OPF effects show a two-dimensional (2D) character in the entire Mfr. (Author)

  17. Analysis of non-contact and contact probe-to-sample thermal exchange for quantitative measurements of thin film and nanostructure thermal conductivity by the scanning hot probe method

    Science.gov (United States)

    Wilson, Adam A.

    The ability to measure thermal properties of thin films and nanostructured materials is an important aspect of many fields of academic study. A strategy especially well-suited for nanoscale investigations of these properties is the scanning hot probe technique, which is unique in its ability to non-destructively interrogate the thermal properties with high resolution, both laterally as well as through the thickness of the material. Strategies to quantitatively determine sample thermal conductivity depend on probe calibration. State of the art calibration strategies assume that the area of thermal exchange between probe and sample does not vary with sample thermal conductivity. However, little investigation has gone into determining whether or not that assumption is valid. This dissertation provides a rigorous study into the probe-to-sample heat transfer through the air gap at diffusive distances for a variety of values of sample thermal conductivity. It is demonstrated that the thermal exchange radius and gap/contact thermal resistance varies with sample thermal conductivity as well as tip-to-sample clearance in non-contact mode. In contact mode, it is demonstrated that higher thermal conductivity samples lead to a reduction in thermal exchange radius for Wollaston probe tips. Conversely, in non-contact mode and in contact mode for sharper probe tips where air contributes the most to probe-to-sample heat transfer, the opposite trend occurs. This may be attributed to the relatively strong solid-to-solid conduction occurring between probe and sample for the Wollaston probes. A three-dimensional finite element (3DFE) model was developed to investigate how the calibrated thermal exchange parameters vary with sample thermal conductivity when calibrating the probe via the intersection method in non-contact mode at diffusive distances. The 3DFE model was then used to explore the limits of sensitivity of the experiment for a range of simulated experimental conditions. 

  18. Role of electrostatic fluctuations in doped semiconductors upon the transition from band to hopping conduction (by the example of p-Ge:Ga)

    Energy Technology Data Exchange (ETDEWEB)

    Poklonski, N. A., E-mail: poklonski@bsu.by; Vyrko, S. A.; Poklonskaya, O. N. [Belarusian State University (Belarus); Zabrodskii, A. G. [Russian Academy of Sciences, Ioffe Physical–Technical Institute (Russian Federation)

    2016-06-15

    The electrostatic model of ionization equilibrium between hydrogen-like acceptors and v-band holes in crystalline covalent p-type semiconductors is developed. The range of applicability of the model is the entire insulator side of the insulator–metal (Mott) phase transition. The density of the spatial distribution of acceptor- and donor-impurity atoms and holes over a crystal was assumed to be Poissonian and the fluctuations of their electrostatic potential energy, to be Gaussian. The model takes into account the effect of a decrease in the energy of affinity of an ionized acceptor to a v-band hole due to Debye–Hückel ion screening by both free v-band holes and localized holes hopping over charge states (0) and (–1) of acceptors in the acceptor band. All donors are in charge state (+1) and are not directly involved in the screening, but ensure the total electroneutrality of a sample. In the quasiclassical approximation, analytical expressions for the root-mean-square fluctuation of the v-band hole energy W{sub p} and effective acceptor bandwidth W{sub a} are obtained. In calculating W{sub a}, only fluctuations caused by the Coulomb interaction between two nearest point charges (impurity ions and holes) are taken into account. It is shown that W{sub p} is lower than W{sub a}, since electrostatic fluctuations do not manifest themselves on scales smaller than the average de Broglie wavelength of a free hole. The delocalization threshold for v-band holes is determined as the sum of the diffusive-percolation threshold and exchange energy of holes. The concentration of free v-band holes is calculated at the temperature T{sub j} of the transition from dc band conductivity to conductivity implemented via hopping over acceptor states, which is determined from the virial theorem. The dependence of the differential energy of the thermal ionization of acceptors at the temperature 3T{sub j}/2 on their concentration N and degree of compensation K (the ratio between the

  19. Statistical sampling techniques as applied to OSE inspections

    International Nuclear Information System (INIS)

    Davis, J.J.; Cote, R.W.

    1987-01-01

    The need has been recognized for statistically valid methods for gathering information during OSE inspections and for interpretation of results, both from performance testing and from records reviews, interviews, etc. Battelle Columbus Division, under contract to DOE OSE, has performed and continues to perform work in the area of statistical methodology for OSE inspections. This paper presents some of the sampling methodology currently being developed for use during OSE inspections. Topics include population definition, sample size requirements, level of confidence, and practical logistical constraints associated with the conduct of an inspection based on random sampling. Sequential sampling schemes and sampling from finite populations are also discussed. The methods described are applicable to various data-gathering activities, ranging from the sampling and examination of classified documents to the sampling of Protective Force security inspectors for skill testing

  20. Lightweight link dimensioning using sFlow sampling

    DEFF Research Database (Denmark)

    de Oliviera Schmidt, Ricardo; Sadre, Ramin; Sperotto, Anna

    2013-01-01

    not be trivial in high-speed links. Aiming for scalability, operators often deploy packet sampling for monitoring, but little is known about how it affects link dimensioning. In this paper we assess the feasibility of lightweight link dimensioning using sFlow, which is a widely deployed traffic monitoring tool. We... implement the sFlow sampling algorithm and use a previously proposed and validated dimensioning formula that needs traffic variance. We validate our approach using packet captures from real networks. Results show that the proposed procedure is successful for a range of sampling rates and that, due to the randomness... of the sampling algorithm, the error introduced by scaling the traffic variance yields more conservative results that cope with short-term traffic fluctuations....
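    The 1-in-N packet sampling that sFlow-style monitors perform, and the scaling of sampled counts back to an estimate of the original traffic, can be sketched as follows (a simplified model, not the sFlow implementation itself):

```python
import random

def sample_packets(packet_sizes, rate_n, seed=42):
    """Simple 1-in-N packet sampling: each packet is kept with probability
    1/N, then byte counts are scaled back up by N to estimate the
    original traffic volume."""
    rng = random.Random(seed)
    kept = [s for s in packet_sizes if rng.randrange(rate_n) == 0]
    return [s * rate_n for s in kept]  # scaled-up estimate per kept packet

packets = [1500] * 9000 + [64] * 1000   # synthetic traffic mix
estimate = sum(sample_packets(packets, 100))
truth = sum(packets)
print(f"true bytes={truth}, estimated bytes={estimate}")
```

With rate 1 every packet is kept and the estimate is exact; larger rates trade accuracy for monitoring cost, which is the error the paper accounts for when scaling the traffic variance.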

  1. Sensor-triggered sampling to determine instantaneous airborne vapor exposure concentrations.

    Science.gov (United States)

    Smith, Philip A; Simmons, Michael K; Toone, Phillip

    2018-06-01

    It is difficult to measure transient airborne exposure peaks by means of integrated sampling for organic chemical vapors, even with very short-duration sampling. Selection of an appropriate time to measure an exposure peak through integrated sampling is problematic, and short-duration time-weighted average (TWA) values obtained with integrated sampling are not likely to accurately determine the actual peak concentrations attained when concentrations fluctuate rapidly. Laboratory analysis of integrated exposure samples is preferred from a certainty standpoint over results derived in the field from a sensor, as a sensor user typically must overcome specificity issues and a number of potential interfering factors to obtain similarly reliable data. However, sensors are currently needed to measure intra-exposure-period concentration variations (i.e., exposure peaks). In this article, the digitized signal from a photoionization detector (PID) sensor triggered collection of whole-air samples when toluene or trichloroethylene vapors attained pre-determined levels in a laboratory atmosphere generation system. Analysis by gas chromatography-mass spectrometry of whole-air samples (at both 37 and 80% relative humidity) collected using the triggering mechanism with rapidly increasing vapor concentrations showed good agreement with the triggering set-point values. Whole-air samples (80% relative humidity) in canisters demonstrated acceptable 17-day storage recoveries, and acceptable precision and bias were obtained. The ability to determine exceedance of a ceiling or peak exposure standard by laboratory analysis of an instantaneously collected sample, and to simultaneously provide a calibration point to verify the correct operation of a sensor, was demonstrated. This latter capability may increase confidence in the reliability of sensor data obtained across an entire exposure period.
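    The triggering mechanism reduces to watching a digitized sensor trace for the first crossing of a set point. A minimal sketch (the PID trace below is hypothetical):

```python
def find_trigger(readings, set_point):
    """Return the index at which a rising signal first reaches the set
    point (None if never), mimicking a sensor-triggered grab sample."""
    for i, value in enumerate(readings):
        if value >= set_point:
            return i
    return None

pid_signal = [0.2, 0.5, 1.1, 3.8, 9.6, 12.4, 7.0]  # hypothetical ppm trace
print(find_trigger(pid_signal, 5.0))  # 4: the canister fires at the 9.6 ppm reading
```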

  2. PFP Wastewater Sampling Facility

    International Nuclear Information System (INIS)

    Hirzel, D.R.

    1995-01-01

    This test report documents the results obtained while conducting operational testing of the sampling equipment in the 225-WC building, the PFP Wastewater Sampling Facility. The Wastewater Sampling Facility houses equipment to sample and monitor the PFP's liquid effluents before the stream is discharged to the 200 Area Treated Effluent Disposal Facility (TEDF). The majority of the streams are not radioactive and consist of discharges from the PFP heating, ventilation, and air conditioning (HVAC) systems. The streams that might be contaminated are processed through the Low Level Waste Treatment Facility (LLWTF) before discharging to TEDF. The sampling equipment consists of two flow-proportional composite samplers, an ultrasonic flowmeter, pH and conductivity monitors, a chart recorder, and associated relays and current isolators to interconnect the equipment and allow proper operation. Data signals from the monitors are received in the 234-5Z Shift Office, which contains a chart recorder and an alarm annunciator panel. The data signals are also duplicated and sent to the TEDF control room through the Local Control Unit (LCU). Performing the OTP verified the operability of the PFP wastewater sampling system. This Operability Test Report documents the acceptance of the sampling system for use

  3. Smoothing the redshift distributions of random samples for the baryon acoustic oscillations: applications to the SDSS-III BOSS DR12 and QPM mock samples

    Science.gov (United States)

    Wang, Shao-Jiang; Guo, Qi; Cai, Rong-Gen

    2017-12-01

    We investigate the impact of different redshift distributions of random samples on the baryon acoustic oscillation (BAO) measurements of D_V(z) r_d^{fid}/r_d from the two-point correlation functions of galaxies in the Data Release 12 of the Baryon Oscillation Spectroscopic Survey (BOSS). Big surveys, such as BOSS, usually assign redshifts to the random samples by randomly drawing values from the measured redshift distributions of the data, which necessarily introduces fiducial signals of fluctuations into the random samples, weakening the BAO signals, if the cosmic variance cannot be ignored. We propose a smooth function of redshift distribution that fits the data well to populate the random galaxy samples. The resulting cosmological parameters match the input parameters of the mock catalogue very well. The significance of the BAO signals has been improved by 0.33σ for a low-redshift sample and by 0.03σ for a constant-stellar-mass sample, though the absolute values do not change significantly. Given the precision of the measurements of current cosmological parameters, such improvements would be valuable for future measurements of galaxy clustering.
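    Populating randoms from a smooth n(z) rather than from the data's noisy histogram amounts to inverse-CDF sampling of the fitted distribution. A minimal sketch (the grid and smooth values below are illustrative, not the BOSS fit):

```python
import bisect
import random

def draw_from_smooth_nz(z_grid, smooth_nz, n_randoms, seed=1):
    """Draw random-sample redshifts from a smooth n(z) via inverse-CDF
    sampling, instead of resampling the data's noisy histogram."""
    cdf, total = [], 0.0
    for nz in smooth_nz:
        total += max(nz, 0.0)
        cdf.append(total)
    cdf = [c / total for c in cdf]          # normalized cumulative distribution
    rng = random.Random(seed)
    return [z_grid[bisect.bisect_left(cdf, rng.random())] for _ in range(n_randoms)]

# Hypothetical smooth fit on a coarse grid over 0.2 < z < 0.5
z_grid = [0.2, 0.25, 0.3, 0.35, 0.4, 0.45, 0.5]
smooth = [1.0, 3.0, 6.0, 7.0, 5.0, 2.0, 0.5]
randoms = draw_from_smooth_nz(z_grid, smooth, 10000)
print(min(randoms) >= 0.2 and max(randoms) <= 0.5)  # True
```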

  4. 40 CFR 761.283 - Determination of the number of samples to collect and sample collection locations.

    Science.gov (United States)

    2010-07-01

    ...) MANUFACTURING, PROCESSING, DISTRIBUTION IN COMMERCE, AND USE PROHIBITIONS Sampling To Verify Completion of Self... cleanup verification conducted in accordance with § 761.61(a)(6), follow the procedures in paragraph (b... verification conducted in accordance with § 761.61(a)(6), follow the procedures in this section for locating...

  5. Development of a sampling strategy and sample size calculation to estimate the distribution of mammographic breast density in Korean women.

    Science.gov (United States)

    Jun, Jae Kwan; Kim, Mi Jin; Choi, Kui Son; Suh, Mina; Jung, Kyu-Won

    2012-01-01

    Mammographic breast density is a known risk factor for breast cancer. To design a survey estimating the distribution of mammographic breast density in Korean women, appropriate sampling strategies for a representative and efficient sampling design were evaluated through simulation. Using the target population from the National Cancer Screening Programme (NCSP) for breast cancer in 2009, we verified the distribution estimate by repeating the simulation 1,000 times, using stratified random sampling to investigate the distribution of breast density of the 1,340,362 women. According to the simulation results, a sampling design stratifying the nation into three groups (metropolitan, urban, and rural), with a total sample size of 4,000, estimates the distribution of breast density in Korean women at a level of 0.01% tolerance. Based on the results of our study, a nationwide survey for estimating the distribution of mammographic breast density among Korean women can be conducted efficiently.
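    Stratified random sampling with proportional allocation, as in the three-group design described above, can be sketched as follows (the stratum population counts are hypothetical; only their total matches the paper's 1,340,362):

```python
def proportional_allocation(stratum_sizes, total_sample):
    """Allocate a fixed total sample across strata proportionally to
    stratum population size, using largest-remainder rounding so the
    allocations sum exactly to the requested total."""
    total_pop = sum(stratum_sizes)
    raw = [total_sample * s / total_pop for s in stratum_sizes]
    alloc = [int(r) for r in raw]
    # Hand the leftover units to the strata with the largest remainders
    order = sorted(range(len(raw)), key=lambda i: raw[i] - alloc[i], reverse=True)
    for i in order[: total_sample - sum(alloc)]:
        alloc[i] += 1
    return alloc

# Hypothetical metropolitan/urban/rural population counts
print(proportional_allocation([700_000, 450_000, 190_362], 4000))  # sums to 4000
```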

  6. Sampling pig farms at the abattoir in a cross-sectional study - Evaluation of a sampling method.

    Science.gov (United States)

    Birkegård, Anna Camilla; Halasa, Tariq; Toft, Nils

    2017-09-15

    A cross-sectional study design is relatively inexpensive, fast and easy to conduct when compared to other study designs. Careful planning is essential to obtaining a representative sample of the population, and the recommended approach is to use simple random sampling from an exhaustive list of units in the target population. This approach is rarely feasible in practice, and other sampling procedures must often be adopted. For example, when slaughter pigs are the target population, sampling the pigs on the slaughter line may be an alternative to on-site sampling at a list of farms. However, it is difficult to sample a large number of farms from an exact predefined list, due to the logistics and workflow of an abattoir. Therefore, it is necessary to have a systematic sampling procedure and to evaluate the obtained sample with respect to the study objective. We propose a method for 1) planning, 2) conducting, and 3) evaluating the representativeness and reproducibility of a cross-sectional study when simple random sampling is not possible. We used an example of a cross-sectional study with the aim of quantifying the association of antimicrobial resistance and antimicrobial consumption in Danish slaughter pigs. It was not possible to visit farms within the designated timeframe. Therefore, it was decided to use convenience sampling at the abattoir. Our approach was carried out in three steps: 1) planning: using data from meat inspection to plan at which abattoirs and how many farms to sample; 2) conducting: sampling was carried out at five abattoirs; 3) evaluation: representativeness was evaluated by comparing sampled and non-sampled farms, and the reproducibility of the study was assessed through simulated sampling based on meat inspection data from the period where the actual data collection was carried out. In the cross-sectional study samples were taken from 681 Danish pig farms, during five weeks from February to March 2015. 
The evaluation showed that the sampling

  7. The Internet of Samples in the Earth Sciences (iSamples)

    Science.gov (United States)

    Carter, M. R.; Lehnert, K. A.

    2015-12-01

    Across most Earth Science disciplines, research depends on the availability of samples collected above, at, and beneath Earth's surface, on the moon and in space, or generated in experiments. Many domains in the Earth Sciences have recently expressed the need for better discovery, access, and sharing of scientific samples and collections (EarthCube End-User Domain workshops, 2012 and 2013, http://earthcube.org/info/about/end-user-workshops), as has the US government (OSTP Memo, March 2014). The Internet of Samples in the Earth Sciences (iSamples) is an initiative funded as a Research Coordination Network (RCN) within the EarthCube program to address this need. iSamples aims to advance the use of innovative cyberinfrastructure to connect physical samples and sample collections across the Earth Sciences with digital data infrastructures to revolutionize their utility for science. iSamples strives to build, grow, and foster a new community of practice, in which domain scientists, curators of sample repositories and collections, computer and information scientists, software developers and technology innovators engage in and collaborate on defining, articulating, and addressing the needs and challenges of physical samples as a critical component of digital data infrastructure. A primary goal of iSamples is to deliver a community-endorsed set of best practices and standards for the registration, description, identification, and citation of physical specimens and define an actionable plan for implementation. iSamples conducted a broad community survey about sample sharing and has created 5 different working groups to address the different challenges of developing the internet of samples - from metadata schemas and unique identifiers to an architecture of a shared cyberinfrastructure for collections, to digitization of existing collections, to education, and ultimately to establishing the physical infrastructure that will ensure preservation and access of the physical

  8. Thermal conductivity measurements of impregnated Nb3Sn coil samples in the temperature range of 3.5 K to 100 K

    Science.gov (United States)

    Koettig, T.; Maciocha, W.; Bermudez, S.; Rysti, J.; Tavares, S.; Cacherat, F.; Bremer, J.

    2017-02-01

    In the framework of the luminosity upgrade of the LHC, high-field magnets are under development. Magnetic flux densities of up to 13 T require the use of Nb3Sn superconducting coils. Quench protection becomes challenging due to the high stored energy density and the low stabilizer fraction. The thermal conductivity and diffusivity of the combined insulating layers and Nb3Sn based cables are important thermodynamic input parameters for quench protection systems and superfluid helium cooling studies. A two-stage cryocooler based test stand is used to measure the thermal conductance of the coil sample in two different heat-flow directions with respect to the coil-package geometry. Variable base temperatures of the experimental platform at the cryocooler allow for a steady-state heat-flux method up to 100 K. The heat is applied at wedge-style copper interfaces of the Rutherford cables, and the resulting temperature difference yields the absolute thermal conductance of the sample arrangement. We report on the measurement methodology applied to this kind of non-uniform sample composition and on the evaluation of the resin composite materials used.
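    The steady-state heat-flux method described above reduces to a simple relation: the applied heater power and the measured temperature difference give the absolute conductance G = Q/ΔT, and together with the sample geometry an effective conductivity. A minimal sketch, with illustrative numbers rather than values from the experiment:

```python
# Steady-state heat-flux evaluation: heater power and the measured
# temperature difference give the absolute thermal conductance; with the
# geometry, an effective conductivity. All numbers are illustrative.

def thermal_conductance(power_w, delta_t_k):
    """Absolute thermal conductance G = Q / dT, in W/K."""
    return power_w / delta_t_k

def thermal_conductivity(power_w, length_m, area_m2, delta_t_k):
    """Effective thermal conductivity k = Q * L / (A * dT), in W/(m K)."""
    return power_w * length_m / (area_m2 * delta_t_k)

power = 0.5        # applied heater power, W
delta_t = 2.0      # measured temperature difference, K
length = 0.01      # heat-flow path length, m
area = 4.0e-4      # cross-sectional area, m^2

print(thermal_conductance(power, delta_t))                 # 0.25 W/K
print(thermal_conductivity(power, length, area, delta_t))  # 6.25 W/(m K)
```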

  9. An Electromagnetic Gauge Technique for Measuring Shocked Particle Velocity in Electrically Conductive Samples

    Science.gov (United States)

    Cheng, David; Yoshinaka, Akio

    2014-11-01

    Electromagnetic velocity (EMV) gauges are a class of film gauges which permit the direct in-situ measurement of shocked material flow velocity. The active sensing element, typically a metallic foil, requires exposure to a known external magnetic field in order to produce motional electromotive force (emf). Due to signal distortion caused by mutual inductance between sample and EMV gauge, this technique is typically limited to shock waves in non-conductive materials. In conductive samples, motional emf generated in the EMV gauge has to be extracted from the measured signal which results from the combined effects of both motional emf and voltage changes from induced currents. An electromagnetic technique is presented which analytically models the dynamics of induced current between a copper disk moving as a rigid body with constant 1D translational velocity toward an EMV gauge, where both disk and gauge are exposed to a uniform external static magnetic field. The disk is modelled as a magnetic dipole loop where its Foucault current is evaluated from the characteristics of the fields, whereas the EMV gauge is modelled as a circuit loop immersed in the field of the magnetic dipole loop, the intensity of which is calculated as a function of space and, implicitly, time. Equations of mutual induction are derived and the current induced in the EMV gauge loop is solved, allowing discrimination of the motional emf. Numerical analysis is provided for the step response of the induced EMV gauge current with respect to the Foucault current in the moving copper sample.

  10. Predicting sample lifetimes in creep fracture of heterogeneous materials

    Science.gov (United States)

    Koivisto, Juha; Ovaska, Markus; Miksic, Amandine; Laurson, Lasse; Alava, Mikko J.

    2016-08-01

    Materials flow under creep or constant loads and finally fail. Predicting sample lifetimes is an important and highly challenging problem, because the inherently heterogeneous nature of most materials results in large sample-to-sample lifetime fluctuations, even under identical conditions. We study creep deformation of paper sheets as an example of a heterogeneous material and show how to predict the lifetimes of individual samples by exploiting the "universal" features of the sample-inherent creep curves, in particular the passage to an accelerating creep rate. Using simulations of a viscoelastic fiber bundle model, we illustrate how deformation localization controls the shape of the creep curve and thus the degree of lifetime predictability.

  11. Different realizations of Cooper-Frye sampling with conservation laws

    Science.gov (United States)

    Schwarz, C.; Oliinychenko, D.; Pang, L.-G.; Ryu, S.; Petersen, H.

    2018-01-01

    Approaches based on viscous hydrodynamics for the hot and dense stage and hadronic transport for the final dilute rescattering stage are successfully applied to the dynamic description of heavy ion reactions at high beam energies. One crucial step in such hybrid approaches is the so-called particlization, which is the transition between the hydrodynamic description and the microscopic degrees of freedom. For this purpose, individual particles are sampled on the Cooper-Frye hypersurface. In this work, four different realizations of the sampling algorithms are compared, with three of them incorporating the global conservation laws of quantum numbers in each event. The algorithms are compared within two types of scenarios: a simple ‘box’ hypersurface consisting of only one static cell and a typical particlization hypersurface for Au+Au collisions at √s_NN = 200 GeV. For all algorithms the mean multiplicities (or particle spectra) remain unaffected by global conservation laws in the case of large volumes. In contrast, the fluctuations of the particle numbers are affected considerably. The fluctuations of the newly developed SPREW algorithm based on the exponential weight, and the recently suggested SER algorithm based on ensemble rejection, are smaller than those without conservation laws and agree with the expectation from the canonical ensemble. The previously applied mode sampling algorithm produces dramatically larger fluctuations than expected in the corresponding microcanonical ensemble, and therefore should be avoided in fluctuation studies. This study might be of interest for the investigation of particle fluctuations and correlations, e.g. the suggested signatures for a phase transition or a critical endpoint, in hybrid approaches that are affected by global conservation laws.

  12. Sample size estimation and sampling techniques for selecting a representative sample

    Directory of Open Access Journals (Sweden)

    Aamir Omair

    2014-01-01

    Introduction: The purpose of this article is to provide a general understanding of the concepts of sampling as applied to health-related research. Sample Size Estimation: It is important to select a representative sample in quantitative research in order to be able to generalize the results to the target population. The sample should be of the required sample size and must be selected using an appropriate probability sampling technique. There are many hidden biases which can adversely affect the outcome of the study. Important factors to consider for estimating the sample size include the size of the study population, the confidence level, the expected proportion of the outcome variable (for categorical variables) or the standard deviation of the outcome variable (for numerical variables), and the required precision (margin of error) of the study. The greater the precision required, the larger the required sample size. Sampling Techniques: The probability sampling techniques applied in health-related research include simple random sampling, systematic random sampling, stratified random sampling, cluster sampling, and multistage sampling. These are recommended over nonprobability sampling techniques, because their results can be generalized to the target population.
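    The factors listed above combine in the standard formula for estimating a proportion, n = z²·p·(1−p)/d², optionally adjusted for a finite study population. A minimal sketch; the helper name and example values are illustrative:

```python
import math

def sample_size_proportion(p, margin, z=1.96, population=None):
    """Required sample size for estimating a proportion.

    n0 = z^2 * p * (1 - p) / margin^2, optionally adjusted with the
    finite population correction n = n0 / (1 + (n0 - 1) / N).
    """
    n0 = z**2 * p * (1 - p) / margin**2
    if population is not None:
        n0 = n0 / (1 + (n0 - 1) / population)
    return math.ceil(n0)

# 50% expected proportion, 5% margin of error, 95% confidence (z = 1.96):
print(sample_size_proportion(0.5, 0.05))                   # -> 385
# Same precision in a study population of only 1000 people:
print(sample_size_proportion(0.5, 0.05, population=1000))  # -> 278
```

Note how halving the margin of error roughly quadruples the required sample size, which is what "the greater the precision required, the larger the required sample size" means quantitatively.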

  13. Space charge and steady state current in LDPE samples containing a permittivity/conductivity gradient

    DEFF Research Database (Denmark)

    Holbøll, Joachim; Bambery, K. R.; Fleming, R. J.

    2000-01-01

    Electromagnetic theory predicts that a dielectric sample in which a steady DC current of density j is flowing, and in which the ratio of permittivity ε to conductivity σ varies with position, will acquire a space charge density j·grad(ε/σ). A simple and convenient way to generate an ε/σ gradient...... in a homogeneous sample is to establish a temperature gradient across it. The resulting spatial variation in ε is usually small in polymeric insulators, but the variation in σ can be appreciable. Laser induced pressure pulse (LIPP) measurements were made on 1.5 mm thick plaques of ultra pure LDPE equipped...... with vacuum-evaporated aluminium electrodes. Temperature differences up to 27°C were maintained across the samples, which were subjected to DC fields up to 20 kV/mm. Current density was measured as a function of temperature and field. Negligible thermally generated space charge was observed. The charge...
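    The quoted charge density follows from Gauss's law combined with steady-state current continuity. In one dimension, with constant current density j:

```latex
% Steady state: dj/dx = 0, so j is constant; the field is E = j / \sigma.
\rho \;=\; \frac{\mathrm{d}D}{\mathrm{d}x}
     \;=\; \frac{\mathrm{d}}{\mathrm{d}x}\left(\varepsilon E\right)
     \;=\; \frac{\mathrm{d}}{\mathrm{d}x}\left(\varepsilon\,\frac{j}{\sigma}\right)
     \;=\; j\,\frac{\mathrm{d}}{\mathrm{d}x}\left(\frac{\varepsilon}{\sigma}\right),
```

which generalizes to ρ = j·grad(ε/σ) in three dimensions; a temperature gradient makes ε/σ position dependent and thereby seeds the space charge.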

  14. Contributions to sampling statistics

    CERN Document Server

    Conti, Pier; Ranalli, Maria

    2014-01-01

    This book contains a selection of the papers presented at the ITACOSM 2013 Conference, held in Milan in June 2013. ITACOSM is the biennial meeting of the Survey Sampling Group S2G of the Italian Statistical Society, intended as an international forum for scientific discussion of developments in the theory and application of survey sampling methodologies in the human and natural sciences. The book gathers research papers carefully selected from both invited and contributed sessions of the conference. As a whole, the book is a relevant contribution to various key aspects of sampling methodology and techniques; it deals with hot topics in sampling theory, such as calibration, quantile regression and multiple-frame surveys, and with innovative methodologies in important areas of both sampling theory and applications. Contributions cut across current sampling methodologies such as interval estimation for complex samples, randomized responses, bootstrap, weighting, modeling, imputati...

  15. Procedures for sampling and sample reduction within quality assurance systems for solid biofuels

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2005-07-01

    The objective of this experimental study on sampling was to determine the size and number of samples of biofuels required (taken at two sampling points in each case) and to compare two methods of sampling. The first objective of the sample-reduction exercise was to compare the reliability of various sampling methods, and the second objective was to measure the variations introduced as a result of reducing the sample size to form suitable test portions. The materials studied were sawdust, wood chips, wood pellets and bales of straw, and these were analysed for moisture, ash, particle size and chloride. The sampling procedures are described. The study was conducted in Scandinavia. The results of the study were presented in Leipzig in October 2004. The work was carried out as part of the UK's DTI Technology Programme: New and Renewable Energy.

  16. Electron quantum interferences and universal conductance fluctuations

    International Nuclear Information System (INIS)

    Benoit, A.; Pichard, J.L.

    1988-05-01

    Quantum interferences yield corrections to the classical ohmic behaviour predicted by Boltzmann transport theory: for instance, the well-known ''weak localization'' effects. Furthermore, very recently, quantum interference effects have been shown to be responsible for statistically distinct phenomena, associated with Universal Conductance Fluctuations and observed in very small devices.

  17. The impact of different blood sampling methods on laboratory rats under different types of anaesthesia

    DEFF Research Database (Denmark)

    Toft, Martin Fitzner; Petersen, Mikke Haxø; Dragsted, Nils

    2006-01-01

    Rats with implanted telemetry transponders were blood sampled by jugular puncture, periorbital puncture or tail vein puncture, or sampled by jugular puncture under carbon dioxide (CO2) or isoflurane anaesthesia or without anaesthesia, in a crossover design. Heart rate, blood pressure and body temperature were registered for three days after sampling. Initially blood pressure increased, but shortly after sampling it decreased, which led to increased heart rate. Sampling induced rapid fluctuations in body temperature, and an increase in body temperature. Generally, rats recovered from sampling within 2-3 h, except for rats sampled from the tail vein, which showed fluctuations in body temperature in excess of 30 h after sampling. Increases in heart rate and blood pressure within the first hours after sampling indicated that periorbital puncture was the method that had the largest acute impact on the rats.

  18. Two sampling techniques for game meat

    OpenAIRE

    van der Merwe, Maretha; Jooste, Piet J.; Hoffman, Louw C.; Calitz, Frikkie J.

    2013-01-01

    A study was conducted to compare the excision sampling technique used by the export market with the sampling technique preferred by European countries, namely the Biotrace cattle and swine test. The measuring unit for excision sampling was grams (g), and square centimetres (cm²) for the swabbing technique. The two techniques were compared after a pilot test was conducted on spiked approved beef carcasses (n = 12), which statistically proved that the two measuring units correlated. The two sampling...

  19. First Sample Delivery to Mars Microscope

    Science.gov (United States)

    2008-01-01

    The Robotic Arm on NASA's Phoenix Mars Lander has just delivered the first sample of dug-up soil to the spacecraft's microscope station in this image taken by the Surface Stereo Imager during the mission's Sol 17 (June 12), or 17th Martian day after landing. The scoop is positioned above the box containing key parts of Phoenix's Microscopy, Electrochemistry and Conductivity Analyzer, or MECA, instrument suite. It has sprinkled a small amount of soil into a notch in the MECA box where the microscope's sample wheel is exposed. The wheel turns to present sample particles on various substrates to the Optical Microscope for viewing. The scoop is about 8.5 centimeters (3.3 inches) wide. The top of the MECA box is 20 centimeters (7.9 inches) wide. This image has been lightened to make details more visible. The Phoenix Mission is led by the University of Arizona, Tucson, on behalf of NASA. Project management of the mission is by NASA's Jet Propulsion Laboratory, Pasadena, Calif. Spacecraft development is by Lockheed Martin Space Systems, Denver.

  20. Simultaneous Determination of Different Anions in Milk Samples Using Ion Chromatography with Conductivity Detection

    Directory of Open Access Journals (Sweden)

    Gülçin Gümüş Yılmaz

    2016-12-01

    A simple method for the simultaneous determination of chloride, nitrate, sulfate, iodide, phosphate, thiocyanate, perchlorate, and orotic acid in milk samples is described. The method uses dialysis cassettes for matrix elimination, followed by ion chromatography on a high-capacity anion exchange column with suppressed conductivity detection. The novelty of the dialysis step is that it requires no chemical or organic solvent to eliminate macromolecules such as fat, carbohydrates and proteins from the milk samples. External standard calibration curves for these analytes were linear, with high correlation coefficients. The relative standard deviations of the analyte concentrations were acceptable in both inter-day and intra-day evaluations. Under optimized conditions, the limits of detection (signal-to-noise ratio = 3) for chloride, phosphate, thiocyanate, perchlorate, iodide, nitrate, sulfate, and orotate were found to be 0.012, 0.112, 0.140, 0.280, 0.312, 0.516, 0.520, and 0.840 mg L−1, respectively. Good results were obtained for various spiked milk samples, with recoveries in the range of 93.88-109.75 %. The proposed method was successfully applied to milk samples collected from Istanbul markets. The method described herein is reagent-free, simple, and reliable.
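    The detection limits above use the signal-to-noise = 3 criterion, which in terms of baseline noise and calibration slope is a one-line calculation. A sketch; the function name and numbers are illustrative, not values from the study:

```python
# Limit of detection via the S/N = 3 criterion: the smallest concentration
# whose signal is k times the baseline noise. Numbers are illustrative.

def detection_limit(noise_sd, calibration_slope, k=3.0):
    """LOD = k * sd(baseline noise) / calibration slope (k = 3 for S/N = 3)."""
    return k * noise_sd / calibration_slope

# e.g. baseline noise sd of 0.02 signal units, slope of 0.5 units per mg/L:
print(detection_limit(0.02, 0.5))  # -> 0.12 mg/L
```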

  1. Weak antilocalization and conductance fluctuation in a single crystalline Bi nanowire

    International Nuclear Information System (INIS)

    Kim, Jeongmin; Lee, Seunghyun; Kim, MinGin; Lee, Wooyoung; Brovman, Yuri M.; Kim, Philip

    2014-01-01

    We present the low temperature transport properties of an individual single-crystalline Bi nanowire grown by the on-film formation of nanowire method. The temperature dependent resistance and magnetoresistance of Bi nanowires were investigated. The phase coherence length was obtained from the fluctuation pattern of the magnetoresistance below 40 K using universal conductance fluctuation theory. The obtained temperature dependence of phase coherence length and the fluctuation amplitude indicates that the transport of electrons shows 2-dimensional characteristics originating from the surface states. The temperature dependence of the coherence length derived from the weak antilocalization effect using the Hikami–Larkin–Nagaoka model is consistent with that from the universal conductance fluctuations theory

  2. Environmental surveillance master sampling schedule

    International Nuclear Information System (INIS)

    Bisping, L.E.

    1991-01-01

    Environmental surveillance of the Hanford Site and surrounding areas is conducted by the Pacific Northwest Laboratory (PNL) for the US Department of Energy (DOE). This document contains the planned schedule for routine sample collection for the Surface Environmental Surveillance Project (SESP) and Ground-Water Monitoring Project. The routine sampling plan for the SESP has been revised this year to reflect changing site operations and priorities. Some sampling previously performed at least annually has been reduced in frequency, and some new sampling to be performed at a less than annual frequency has been added. Therefore, the SESP schedule reflects sampling to be conducted in calendar year 1991 as well as future years. The ground-water sampling schedule is for 1991. This schedule is subject to modification during the year in response to changes in Site operation, program requirements, and the nature of the observed results. Operational limitations such as weather, mechanical failures, sample availability, etc., may also require schedule modifications. Changes will be documented in the respective project files, but this plan will not be reissued. The purpose of these monitoring projects is to evaluate levels of radioactive and nonradioactive pollutants in the Hanford environs.


  4. Sampling system and method

    Science.gov (United States)

    Decker, David L.; Lyles, Brad F.; Purcell, Richard G.; Hershey, Ronald Lee

    2013-04-16

    The present disclosure provides an apparatus and method for coupling conduit segments together. A first pump obtains a sample and transmits it through a first conduit to a reservoir accessible by a second pump. The second pump further conducts the sample from the reservoir through a second conduit.

  5. Environmental surveillance master sampling schedule

    International Nuclear Information System (INIS)

    Bisping, L.E.

    1995-02-01

    Environmental surveillance of the Hanford Site and surrounding areas is conducted by the Pacific Northwest Laboratory (PNL) for the U.S. Department of Energy (DOE). This document contains the planned 1994 schedules for routine collection of samples for the Surface Environmental Surveillance Project (SESP), Drinking Water Project, and Ground-Water Surveillance Project. Samples are routinely collected for the SESP and analyzed to determine the quality of air, surface water, soil, sediment, wildlife, vegetation, foodstuffs, and farm products at the Hanford Site and surrounding communities. The responsibility for monitoring onsite drinking water falls outside the scope of the SESP. PNL conducts the drinking water monitoring project concurrent with the SESP to promote efficiency and consistency, utilize expertise developed over the years, and reduce costs associated with management, procedure development, data management, quality control, and reporting. The ground-water sampling schedule identifies ground-water sampling events used by PNL for environmental surveillance of the Hanford Site. Sampling is indicated as annual, semi-annual, quarterly, or monthly in the sampling schedule. Some samples are collected and analyzed as part of ground-water monitoring and characterization programs at Hanford (e.g., Resource Conservation and Recovery Act (RCRA), Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA), or Operational). The number of samples planned by other programs are identified in the sampling schedule by a number in the analysis column and a project designation in the Cosample column. Well sampling events may be merged to avoid redundancy in cases where sampling is planned by both environmental surveillance and another program.


  7. Sampling procedure, receipt and conservation of water samples to determine environmental radioactivity

    International Nuclear Information System (INIS)

    Herranz, M.; Navarro, E.; Payeras, J.

    2009-01-01

    This document describes the essential goals, processes and contents that the subgroups on Sampling and on Sample Preparation and Conservation consider necessary for correct sampling, receipt, conservation and preparation of samples of continental, marine and waste water prior to determining their radioactive content.

  8. Effects of limited spatial resolution on fluctuation measurements

    International Nuclear Information System (INIS)

    Bravenec, R.V.; Wootton, A.J.

    1994-01-01

    The finite sample volumes of fluctuation diagnostics distort the measurements not only by averaging the gross fluctuation parameters over the sample volumes but, more importantly (except for collective scattering), by attenuating the shorter-wavelength components. In this work the response of various sample-volume sizes and orientations to a model fluctuation power spectrum S(k,ω) is examined. The model spectrum is fashioned after observations by far-infrared scattering on TEXT. The sample-volume extent in the direction of propagation of the turbulence is shown to be the most critical: not only does it reduce the measured fluctuation amplitude and correlation length (as does an extent perpendicular to the propagation direction), but it also reduces the measured mean frequency and increases the apparent average phase velocity of the fluctuations. The differing sizes, shapes, and orientations of the sample volumes among fluctuation diagnostics, as well as deliberate variations within a single diagnostic, provide information on the form of the underlying turbulence and can be exploited to refine the model.

  9. A Simple Device for Lens-to-Sample Distance Adjustment in Laser-Induced Breakdown Spectroscopy (LIBS).

    Science.gov (United States)

    Cortez, Juliana; Farias Filho, Benedito B; Fontes, Laiane M; Pasquini, Celio; Raimundo, Ivo M; Pimentel, Maria Fernanda; de Souza Lins Borba, Flávia

    2017-04-01

    A simple device based on two commercial laser pointers is described to assist in the analysis of samples that present uneven surfaces and/or irregular shapes using laser-induced breakdown spectroscopy (LIBS). The device allows for easy positioning of the sample surface at a reproducible distance from the focusing lens that conveys the laser pulse to generate the micro-plasma in a LIBS system, with reproducibility better than ±0.2 mm. In this way, fluctuations in the fluence (J cm⁻²) are minimized and the LIBS analytical signals can be obtained with better precision even when samples with irregular surfaces are probed.

  10. An Optimization Study on Listening Experiments to Improve the Comparability of Annoyance Ratings of Noise Samples from Different Experimental Sample Sets.

    Science.gov (United States)

    Di, Guoqing; Lu, Kuanguang; Shi, Xiaofan

    2018-03-08

    Annoyance ratings obtained from listening experiments are widely used in studies on the health effects of environmental noise. In listening experiments, participants usually rate the annoyance of each noise sample according to its relative annoyance among all samples in the experimental sample set if there are no reference sound samples, which leads to poor comparability between results obtained from different experimental sample sets. To address this problem, this study proposed adding several pink noise samples with defined loudness levels to experimental sample sets as reference sound samples. On this basis, the standard curve between logarithmic mean annoyance and loudness level of pink noise was used to calibrate the experimental results, and the calibration procedure is described in detail. Furthermore, as a case study, six different types of noise sample sets were selected for listening experiments using this method to examine its applicability. Results showed that the differences in the annoyance ratings of each identical noise sample from different experimental sample sets decreased markedly after calibration. The determination coefficient (R²) of the linear fitting functions between psychoacoustic annoyance (PA) and mean annoyance (MA) of noise samples from different experimental sample sets increased obviously after calibration. The case study indicated that the method is applicable to calibrating annoyance ratings obtained from different types of noise sample sets. After calibration, the comparability of annoyance ratings of noise samples from different experimental sample sets can be distinctly improved.

  11. Mental health problems are associated with low-frequency fluctuations in reaction time in a large general population sample. The TRAILS study.

    Science.gov (United States)

    Bastiaansen, J A; van Roon, A M; Buitelaar, J K; Oldehinkel, A J

    2015-02-01

    Increased intra-subject reaction time variability (RT-ISV), as coarsely measured by the standard deviation of RT (RT-SD), has been associated with many forms of psychopathology. Low-frequency RT fluctuations, which have been associated with intrinsic brain rhythms occurring approximately every 15-40 s, have been shown to add unique information for ADHD. In this study, we investigated whether these fluctuations also relate to attentional problems in the general population, and contribute to the two major domains of psychopathology: externalizing and internalizing problems. RT was monitored throughout a self-paced sustained attention task (duration: 9.1 ± 1.2 min) in a Dutch population cohort of young adults (n=1455, mean age: 19.0 ± 0.6 years, 55.1% girls). To characterize temporal fluctuations in RT, we performed a direct Fourier transform on externally validated frequency bands based on the frequency ranges of neuronal oscillations: Slow-5 (0.010-0.027 Hz), Slow-4 (0.027-0.073 Hz), and three additional higher-frequency bands. The relative magnitude of Slow-4 fluctuations was the primary predictor in regression models for attentional, internalizing and externalizing problems (measured by the Adult Self-Report questionnaire). Additionally, stepwise regression models were created to investigate (a) whether Slow-4 significantly improved the prediction of problem behaviors beyond RT-SD and (b) whether the other frequency bands provided important additional information. The magnitude of Slow-4 fluctuations significantly predicted attentional and externalizing problems and even improved model fit after modeling RT-SD first (R² change = 0.6%, P < …); none of the other frequency bands provided additional information. Low-frequency RT fluctuations have added predictive value for attentional and externalizing, but not internalizing, problems beyond global differences in variability. This study extends previous findings in clinical samples of children with ADHD to adolescents from the general population and
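    The relative band magnitude used as the predictor above can be computed from an RT series via the discrete Fourier transform: sum the spectral magnitudes falling inside the band and divide by the total. A sketch only; the study's exact detrending and normalization steps are not specified here, and the example series is synthetic:

```python
import numpy as np

def relative_band_magnitude(rt, dt, band=(0.027, 0.073)):
    """Relative magnitude of RT fluctuations in one frequency band (default Slow-4).

    rt : reaction-time series; dt : mean inter-response interval in seconds.
    Returns band magnitude / total magnitude (zero-frequency bin excluded).
    """
    rt = np.asarray(rt, dtype=float) - np.mean(rt)  # remove the DC component
    mag = np.abs(np.fft.rfft(rt))                   # one-sided spectrum
    freqs = np.fft.rfftfreq(len(rt), d=dt)
    in_band = (freqs >= band[0]) & (freqs < band[1])
    return mag[in_band].sum() / mag[1:].sum()

# Synthetic example: ~10 min of responses at 1/s with an injected 0.05 Hz
# (Slow-4) oscillation on top of white noise.
rng = np.random.default_rng(0)
t = np.arange(600) * 1.0
rt = 0.45 + 0.05 * np.sin(2 * np.pi * 0.05 * t) + 0.02 * rng.standard_normal(600)
print(relative_band_magnitude(rt, dt=1.0))
```

A pure 0.05 Hz oscillation yields a ratio near 1, while white noise spreads its magnitude across all bands, so the ratio indexes how much of the variability is concentrated in the Slow-4 range.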

  12. Automatic remote sampling and delivery system incorporating decontamination and disposal of sample bottles

    International Nuclear Information System (INIS)

    Savarkar, V.K.; Mishra, A.K.; Bajpai, D.D.; Nair, M.K.T.

    1990-01-01

    The present generation of reprocessing plants has sampling and delivery systems that must be operated manually, with the associated problems. Complete automation and remotisation of the sampling system has therefore been considered, to reduce manual intervention and personnel exposure. As part of this scheme, an attempt has been made to automate and remotise the various steps in the sampling system. This paper discusses in detail the development work carried out in this area, as well as the tests conducted to incorporate the same in the existing plants. (author). 3 figs

  13. Feasibility study of parallel conduction cooling of NbTi magnet and sample probe in a cryogen-free magnet system

    Science.gov (United States)

    Catarino, I.; Soni, V.; Barreto, J.; Martins, D.; Kar, S.

    2017-02-01

    The conduction cooling of both a 6 T superconducting magnet and a sample probe in a parallel configuration is addressed in this work. A Gifford-McMahon (GM) cryocooler directly cools the NbTi magnet, which is to be kept at 4 K, while a gas-gap heat switch (GGHS) manages the cooling power diverted to the sample probe, which may be swept from 4 K up to 300 K. A first prototype of a GGHS was customized and validated for this purpose. A sample probe assembly has been designed and assembled with the existing cryogen-free magnet system. The whole test setup and its components are described, and the preliminary experimental results on the integration are presented and discussed. The magnet was charged up to 3 T with a 4 K sample space, and up to 1 T while sweeping the sample space temperature up to 300 K by acting on the GGHS. Despite some thermal insulation problems identified during this first test, the overall results demonstrate the feasibility of the cryogen-free parallel conduction cooling under study.
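
    The parallel-cooling arrangement hinges on the on/off conductance ratio of the gas-gap heat switch. A back-of-envelope sketch of the steady conductive load Q = kAΔT/L across the gap, with entirely illustrative geometry and conductivity values (not taken from this paper):

```python
def gap_heat_flow(k, area, gap, dT):
    """Steady conductive heat flow Q = k*A*dT/L across a gas gap (W)."""
    return k * area * dT / gap

area = 1e-3   # m^2, switch contact area (hypothetical)
gap = 0.5e-3  # m, gap width (hypothetical)
dT = 1.0      # K across the switch

# Representative conductivities (order of magnitude only):
k_on = 0.02   # W/m/K, helium gas filling the gap ("on" state)
k_off = 2e-5  # W/m/K, residual gas after evacuation ("off" state)

on = gap_heat_flow(k_on, area, gap, dT)    # ~40 mW conducted to the probe
off = gap_heat_flow(k_off, area, gap, dT)  # ~0.04 mW leak when isolated
```

    With these numbers the switch passes roughly a thousand times more heat when "on" than when "off", which is what lets the probe be warmed to 300 K without dragging the 4 K magnet with it.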

  14. Genetic Contribution to Alcohol Dependence: Investigation of a Heterogeneous German Sample of Individuals with Alcohol Dependence, Chronic Alcoholic Pancreatitis, and Alcohol-Related Cirrhosis

    Science.gov (United States)

    Treutlein, Jens; Streit, Fabian; Juraeva, Dilafruz; Degenhardt, Franziska; Rietschel, Liz; Forstner, Andreas J.; Ridinger, Monika; Dukal, Helene; Foo, Jerome C.; Soyka, Michael; Maier, Wolfgang; Gaebel, Wolfgang; Dahmen, Norbert; Scherbaum, Norbert; Müller-Myhsok, Bertram; Lucae, Susanne; Ising, Marcus; Stickel, Felix; Berg, Thomas; Roggenbuck, Ulla; Jöckel, Karl-Heinz; Scholz, Henrike; Zimmermann, Ulrich S.; Buch, Stephan; Sommer, Wolfgang H.; Spanagel, Rainer; Brors, Benedikt; Cichon, Sven; Mann, Karl; Kiefer, Falk; Hampe, Jochen; Rosendahl, Jonas; Nöthen, Markus M.; Rietschel, Marcella

    2017-01-01

    The present study investigated the genetic contribution to alcohol dependence (AD) using genome-wide association data from three German samples. These comprised patients with: (i) AD; (ii) chronic alcoholic pancreatitis (ACP); and (iii) alcohol-related liver cirrhosis (ALC). Single marker, gene-based, and pathway analyses were conducted. A significant association was detected for the ADH1B locus in a gene-based approach (p(uncorrected) = 1.2 × 10−6; p(corrected) = 0.020). This was driven by the AD subsample. No association with ADH1B was found in the combined ACP + ALC sample. On first inspection, this seems surprising, since ADH1B is a robustly replicated risk gene for AD and may therefore be expected to be associated also with subgroups of AD patients. The negative finding in the ACP + ALC sample, however, may reflect genetic stratification as well as random fluctuation of allele frequencies in the cases and controls, demonstrating the importance of large samples in which the phenotype is well assessed. PMID:28714907

  15. Superconducting fluctuations and pseudogap in high-Tc cuprates

    Directory of Open Access Journals (Sweden)

    Alloul H.

    2012-03-01

    Large pulsed magnetic fields up to 60 Tesla are used to suppress the contribution of superconducting fluctuations (SCF) to the ab-plane conductivity above Tc in a series of YBa2Cu3O6+x samples. These experiments allow us to determine the field Hc'(T) and the temperature Tc' above which the SCFs are fully suppressed. A careful investigation near optimal doping shows that Tc' is higher than the pseudogap temperature T*, which is unambiguous evidence that the pseudogap cannot be assigned to preformed pairs. Accurate determinations of the SCF contribution to the conductivity versus temperature and magnetic field have been achieved. They can be accounted for by thermal fluctuations following the Ginzburg-Landau scheme for nearly optimally doped samples. A phase-fluctuation contribution might be invoked for the most underdoped samples in a temperature range which increases when controlled disorder is introduced by electron irradiation. Quantitative analysis of the fluctuating magnetoconductance allows us to determine the critical field Hc2(0), which is found to be quite similar to Hc'(0) and to increase with hole doping. Studies of the incidence of disorder on both Tc' and T* allow us to propose a three-dimensional phase diagram including a disorder axis, which explains most observations made in other cuprate families.
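
    For orientation, the "Ginzburg-Landau scheme" invoked above reduces, in its simplest Gaussian form, to the standard Aslamazov-Larkin expressions for the fluctuation (para)conductivity; these are textbook results quoted here for context, not taken from this paper. With reduced temperature ε = ln(T/Tc) ≈ (T − Tc)/Tc,

```latex
\Delta\sigma^{\mathrm{2D}}_{\mathrm{AL}} = \frac{e^{2}}{16\,\hbar\, d}\,\varepsilon^{-1},
\qquad
\Delta\sigma^{\mathrm{3D}}_{\mathrm{AL}} = \frac{e^{2}}{32\,\hbar\,\xi(0)}\,\varepsilon^{-1/2},
```

    where d is the interlayer spacing of the layered compound and ξ(0) the zero-temperature Ginzburg-Landau coherence length. The different ε exponents are what make the dimensionality of the fluctuation regime experimentally distinguishable from conductivity data.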

  16. Implementation of conduct of operations at Paducah uranium hexafluoride (UF{sub 6}) sampling and transfer facility

    Energy Technology Data Exchange (ETDEWEB)

    Penrod, S.R. [Martin Marietta Energy Systems, Inc., KY (United States)

    1991-12-31

    This paper describes the initial planning and actual field activities associated with the implementation of "Conduct of Operations". Conduct of Operations is an operating philosophy that was developed through the Institute of Nuclear Power Operations (INPO). Conduct of Operations covers many operating practices and is intended to provide formality and discipline to all aspects of plant operation. The implementation of these operating principles at the UF{sub 6} Sampling and Transfer Facility resulted in significant improvements in facility operations.

  18. Focused conformational sampling in proteins

    Science.gov (United States)

    Bacci, Marco; Langini, Cassiano; Vymětal, Jiří; Caflisch, Amedeo; Vitalis, Andreas

    2017-11-01

    A detailed understanding of the conformational dynamics of biological molecules is difficult to obtain by experimental techniques due to resolution limitations in both time and space. Computer simulations avoid these in theory but are often too short to sample rare events reliably. Here we show that the progress index-guided sampling (PIGS) protocol can be used to enhance the sampling of rare events in selected parts of biomolecules without perturbing the remainder of the system. The method is very easy to use as it only requires as essential input a set of several features representing the parts of interest sufficiently. In this feature space, new states are discovered by spontaneous fluctuations alone and in unsupervised fashion. Because there are no energetic biases acting on phase space variables or projections thereof, the trajectories PIGS generates can be analyzed directly in the framework of transition networks. We demonstrate the possibility and usefulness of such focused explorations of biomolecules with two loops that are part of the binding sites of bromodomains, a family of epigenetic "reader" modules. This real-life application uncovers states that are structurally and kinetically far away from the initial crystallographic structures and are also metastable. Representative conformations are intended to be used in future high-throughput virtual screening campaigns.

  19. Green approaches in sample preparation of bioanalytical samples prior to chromatographic analysis.

    Science.gov (United States)

    Filippou, Olga; Bitas, Dimitrios; Samanidou, Victoria

    2017-02-01

    Sample preparation is considered the most challenging step of the analytical procedure: it affects the whole analytical methodology and therefore contributes significantly to the greenness, or lack of it, of the entire process. Greening sample preparation and analytical methods rests on eliminating sample treatment steps, reducing the amount of sample required, sharply cutting the consumption of hazardous reagents, organic solvents, and energy, and maximizing safety for operators and the environment. In the last decade, the development and utilization of greener and more sustainable microextraction techniques has emerged as an alternative to classical sample preparation procedures. In this review, the main green microextraction techniques (solid-phase microextraction, stir bar sorptive extraction, hollow-fiber liquid-phase microextraction, dispersive liquid-liquid microextraction, etc.) are presented, with special attention to bioanalytical applications of these environment-friendly sample preparation techniques, which comply with the green analytical chemistry principles. Copyright © 2016 Elsevier B.V. All rights reserved.

  20. Discrimination of lymphoma using laser-induced breakdown spectroscopy conducted on whole blood samples

    Science.gov (United States)

    Chen, Xue; Li, Xiaohui; Yang, Sibo; Yu, Xin; Liu, Aichun

    2018-01-01

    Lymphoma is a significant cancer that affects the human lymphatic and hematopoietic systems. In this work, discrimination of lymphoma using laser-induced breakdown spectroscopy (LIBS) conducted on whole blood samples is presented. Whole blood samples collected from lymphoma patients and healthy controls are deposited onto standard quantitative filter papers and ablated with a 1064 nm Q-switched Nd:YAG laser. Sixteen atomic and ionic emission lines of calcium (Ca), iron (Fe), magnesium (Mg), potassium (K) and sodium (Na) are selected to discriminate the disease. Chemometric methods, including principal component analysis (PCA), linear discriminant analysis (LDA) classification, and k nearest neighbor (kNN) classification, are used to build the discrimination models. Both the LDA and kNN models achieved very good discrimination performance for lymphoma, with an accuracy of over 99.7%, a sensitivity of over 0.996, and a specificity of over 0.997. These results demonstrate that the whole-blood-based LIBS technique in combination with chemometric methods can serve as a fast, less invasive, and accurate method for detection and discrimination of human malignancies. PMID:29541503
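
    The kNN step of such a pipeline is easy to sketch in plain numpy. Everything below is a toy reconstruction: the "line intensity" features are simulated, and the leave-one-out evaluation stands in for the paper's actual validation scheme.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic feature matrix: 16 emission-line intensities per spectrum
# (the paper uses Ca, Fe, Mg, K and Na lines; values here are made up).
n_per_class = 40
healthy = rng.normal(1.0, 0.1, size=(n_per_class, 16))
lymphoma = rng.normal(1.3, 0.1, size=(n_per_class, 16))
X = np.vstack([healthy, lymphoma])
y = np.array([0] * n_per_class + [1] * n_per_class)

def knn_predict(X_train, y_train, x, k=5):
    """Majority vote among the k nearest training spectra (Euclidean)."""
    d = np.linalg.norm(X_train - x, axis=1)
    nearest = y_train[np.argsort(d)[:k]]
    return np.bincount(nearest).argmax()

# Leave-one-out evaluation, common for small clinical sample sets
correct = 0
for i in range(len(y)):
    mask = np.arange(len(y)) != i
    correct += knn_predict(X[mask], y[mask], X[i]) == y[i]
accuracy = correct / len(y)
```

    With well-separated synthetic classes the leave-one-out accuracy is essentially perfect; on real spectra the separation, and hence the reported >99.7% accuracy, has to come from genuine elemental differences between patient and control blood.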

  1. 'Intelligent' approach to radioimmunoassay sample counting employing a microprocessor controlled sample counter

    International Nuclear Information System (INIS)

    Ekins, R.P.; Sufi, S.; Malan, P.G.

    1977-01-01

    The enormous impact on medical science in the last two decades of microanalytical techniques employing radioisotopic labels has, in turn, generated a large demand for automatic radioisotopic sample counters. Such instruments frequently comprise the most important item of capital equipment required in the use of radioimmunoassay and related techniques and often form a principal bottleneck in the flow of samples through a busy laboratory. It is therefore particularly imperative that such instruments be used 'intelligently' and in an optimal fashion to avoid both the very large capital expenditure involved in the unnecessary proliferation of instruments and the time delays arising from their sub-optimal use. The majority of the current generation of radioactive sample counters nevertheless rely on primitive control mechanisms based on a simplistic statistical theory of radioactive sample counting, which precludes their efficient and rational use. The fundamental principle upon which this approach is based is that it is useless to continue counting a radioactive sample for a time longer than that required to yield a significant increase in the precision of the measurement. Since substantial experimental errors occur during sample preparation, these errors should be assessed and related to the counting errors for that sample. It is the objective of this presentation to demonstrate that combining a realistic statistical assessment of radioactive sample measurement with the more sophisticated control mechanisms that modern microprocessor technology makes possible may often enable savings in counter usage of the order of 5-10 fold. (orig.)
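
    The stopping principle above can be made concrete: Poisson counting statistics give a relative counting error of 1/√N after N counts, so once that is small compared with the sample-preparation error, further counting buys nothing. A minimal sketch with illustrative numbers (the factor-of-3 margin is an assumption, not from the paper):

```python
import math

def required_counts(prep_cv, ratio=3.0):
    """Counts needed so the Poisson counting CV (1/sqrt(N)) is a factor
    `ratio` smaller than the sample-preparation CV."""
    target_cv = prep_cv / ratio
    return math.ceil(1.0 / target_cv**2)

def counting_time(activity_cps, prep_cv, ratio=3.0):
    """Seconds of counting beyond which longer counting no longer adds
    meaningful precision for this sample."""
    return required_counts(prep_cv, ratio) / activity_cps

# A 5% pipetting/preparation error dominates once the counting CV is ~1.7%:
n = required_counts(0.05)       # ~3600 counts suffice
t = counting_time(100.0, 0.05)  # ~36 s at 100 counts/s
```

    A microprocessor-controlled counter applying this per-sample cut-off, instead of a fixed count time for every tube, is exactly where the claimed 5-10 fold savings in counter usage would come from.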

  2. Environmental surveillance master sampling schedule

    International Nuclear Information System (INIS)

    Bisping, L.E.

    1994-02-01

    This document contains the planned 1994 schedules for routine collection of samples for the Surface Environmental Surveillance Project (SESP), Drinking Water Project, and Ground-Water Surveillance Project. Samples are routinely collected for the SESP and analyzed to determine the quality of air, surface water, soil, sediment, wildlife, vegetation, foodstuffs, and farm products at Hanford Site and surrounding communities. The responsibility for monitoring the onsite drinking water falls outside the scope of the SESP. The Hanford Environmental Health Foundation is responsible for monitoring the nonradiological parameters as defined in the National Drinking Water Standards while PNL conducts the radiological monitoring of the onsite drinking water. PNL conducts the drinking water monitoring project concurrent with the SESP to promote efficiency and consistency, utilize the expertise developed over the years, and reduce costs associated with management, procedure development, data management, quality control and reporting. The ground-water sampling schedule identifies ground-water sampling events used by PNL for environmental surveillance of the Hanford Site

  4. Catch me if you can: Comparing ballast water sampling skids to traditional net sampling

    Science.gov (United States)

    Bradie, Johanna; Gianoli, Claudio; Linley, Robert Dallas; Schillak, Lothar; Schneider, Gerd; Stehouwer, Peter; Bailey, Sarah

    2018-03-01

    With the recent ratification of the International Convention for the Control and Management of Ships' Ballast Water and Sediments, 2004, it will soon be necessary to assess ships for compliance with ballast water discharge standards. Sampling skids that allow the efficient collection of ballast water samples in a compact space have been developed for this purpose. We ran 22 trials on board the RV Meteor from June 4-15, 2015 to evaluate the performance of three ballast water sampling devices (traditional plankton net, Triton sampling skid, SGS sampling skid) for three organism size classes: ≥ 50 μm, ≥ 10 μm to < 50 μm, and < 10 μm. Natural sea water was run through the ballast water system and untreated samples were collected using paired sampling devices. Collected samples were analyzed in parallel by multiple analysts using several different analytic methods to quantify organism concentrations. To determine whether there were differences in the number of viable organisms collected across sampling devices, results were standardized and statistically treated to filter out other sources of variability, resulting in an outcome variable representing the mean difference in measurements that can be attributed to sampling devices. These results were tested for significance using pairwise Tukey contrasts. Differences in organism concentrations were found in 50% of comparisons between sampling skids and the plankton net for ≥ 50 μm, and ≥ 10 μm to < 50 μm size classes, with net samples containing either higher or lower densities. There were no differences for < 10 μm organisms. Future work will be required to explicitly examine the potential effects of flow velocity, sampling duration, sampled volume, and organism concentrations on sampling device performance.

  5. [A comparison of convenience sampling and purposive sampling].

    Science.gov (United States)

    Suen, Lee-Jen Wu; Huang, Hui-Man; Lee, Hao-Hsien

    2014-06-01

    Convenience sampling and purposive sampling are two different sampling methods. This article first explains sampling terms such as target population, accessible population, simple random sampling, intended sample, actual sample, and statistical power analysis. These terms are then used to explain the difference between "convenience sampling" and "purposive sampling." Convenience sampling is a non-probabilistic sampling technique applicable to qualitative or quantitative studies, although it is most frequently used in quantitative studies. In convenience samples, subjects more readily accessible to the researcher are more likely to be included. Thus, in quantitative studies, opportunity to participate is not equal for all qualified individuals in the target population and study results are not necessarily generalizable to this population. As in all quantitative studies, increasing the sample size increases the statistical power of the convenience sample. In contrast, purposive sampling is typically used in qualitative studies. Researchers who use this technique carefully select subjects based on study purpose with the expectation that each participant will provide unique and rich information of value to the study. As a result, members of the accessible population are not interchangeable and sample size is determined by data saturation not by statistical power analysis.
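
    The claim that a larger convenience sample raises statistical power can be illustrated with a standard normal-approximation power calculation for a two-sided two-sample comparison. This is a generic textbook sketch (Cohen's d and the group sizes below are hypothetical), not an analysis from the article:

```python
from math import sqrt, erf

def norm_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def power_two_sample(n_per_group, effect_size):
    """Approximate power of a two-sided two-sample z-test at alpha = 0.05
    for a standardized effect size (Cohen's d), equal group sizes."""
    z_alpha = 1.959964  # two-sided 5% critical value
    se = sqrt(2.0 / n_per_group)
    z = effect_size / se
    return norm_cdf(z - z_alpha) + norm_cdf(-z - z_alpha)

# Doubling the sample raises power for a modest effect (d = 0.3):
p50 = power_two_sample(50, 0.3)    # ~0.32
p100 = power_two_sample(100, 0.3)  # ~0.56
```

    For small effects, power grows quickly with n, which is why enlarging a convenience sample helps detection, even though it does nothing to cure the non-random selection the article warns about.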

  6. Predictive Sampling of Rare Conformational Events in Aqueous Solution: Designing a Generalized Orthogonal Space Tempering Method.

    Science.gov (United States)

    Lu, Chao; Li, Xubin; Wu, Dongsheng; Zheng, Lianqing; Yang, Wei

    2016-01-12

    In aqueous solution, solute conformational transitions are governed by an intimate interplay of the fluctuations of solute-solute, solute-water, and water-water interactions. To promote molecular fluctuations to enhance sampling of essential conformational changes, a common strategy is to construct an expanded Hamiltonian through a series of Hamiltonian perturbations and thereby broaden the distribution of certain interactions of focus. Due to a lack of active sampling of the configuration response to Hamiltonian transitions, it is challenging for common expanded-Hamiltonian methods to robustly explore solvent-mediated rare conformational events. The orthogonal space sampling (OSS) scheme, as exemplified by the orthogonal space random walk and orthogonal space tempering methods, provides a general framework for synchronous acceleration of slow configuration responses. To more effectively sample conformational transitions in aqueous solution, in this work we devised a generalized orthogonal space tempering (gOST) algorithm. Specifically, in the Hamiltonian perturbation part, a solvent-accessible-surface-area-dependent term is introduced to implicitly perturb near-solute water-water fluctuations; more importantly, in the orthogonal space response part, the generalized force order parameter is extended to a two-dimensional order parameter set, in which essential solute-solvent and solute-solute components are treated separately. The gOST algorithm is evaluated through a molecular dynamics simulation study of the explicitly solvated deca-alanine (Ala10) peptide. On the basis of a fully automated sampling protocol, the gOST simulation enabled repetitive folding and unfolding of the solvated peptide within a single continuous trajectory and allowed detailed construction of Ala10 folding/unfolding free energy surfaces. The gOST result reveals that solvent cooperative fluctuations play a pivotal role in Ala10 folding/unfolding transitions.

  7. Two sampling techniques for game meat

    Directory of Open Access Journals (Sweden)

    Maretha van der Merwe

    2013-03-01

    A study was conducted to compare the excision sampling technique used by the export market and the sampling technique preferred by European countries, namely the biotrace cattle and swine test. The measuring unit for the excision sampling was grams (g) and square centimetres (cm2) for the swabbing technique. The two techniques were compared after a pilot test was conducted on spiked approved beef carcasses (n = 12) that statistically proved the two measuring units correlated. The two sampling techniques were conducted on the same game carcasses (n = 13) and analyses performed for aerobic plate count (APC), Escherichia coli and Staphylococcus aureus, for both techniques. A more representative result was obtained by swabbing and no damage was caused to the carcass. Conversely, the excision technique yielded fewer organisms and caused minor damage to the carcass. The recovery ratio from the swabbing technique improved 5.4 times for APC, 108.0 times for E. coli and 3.4 times for S. aureus over the results obtained from the excision technique. It was concluded that the sampling methods of excision and swabbing can be used to obtain bacterial profiles from both export and local carcasses and could be used to indicate whether game carcasses intended for the local market are possibly on par with game carcasses intended for the export market and therefore safe for human consumption.

  9. Systematic sampling with errors in sample locations

    DEFF Research Database (Denmark)

    Ziegel, Johanna; Baddeley, Adrian; Dorph-Petersen, Karl-Anton

    2010-01-01

    Systematic sampling of points in continuous space is widely used in microscopy and spatial surveys. Classical theory provides asymptotic expressions for the variance of estimators based on systematic sampling as the grid spacing decreases. However, the classical theory assumes that the sample grid is exactly periodic; real physical sampling procedures may introduce errors in the placement of the sample points. This paper studies the effect of errors in sample positioning on the variance of estimators in the case of one-dimensional systematic sampling. First we sketch a general approach to variance analysis using point process methods. We then analyze three different models for the error process, calculate exact expressions for the variances, and derive asymptotic variances. Errors in the placement of sample points can lead to substantial inflation of the variance and dampening of zitterbewegung.
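
    The variance inflation caused by placement errors is easy to see in a Monte Carlo experiment. This is an illustrative simulation under assumptions of my own (a smooth periodic test function, Gaussian jitter), not one of the paper's three analytical error models:

```python
import numpy as np

rng = np.random.default_rng(1)

def f(x):
    """Smooth test function on [0, 1)."""
    return np.sin(2 * np.pi * x) + 1.0

def systematic_estimate(n_points, jitter_sd):
    """Estimate the integral of f over [0, 1) from a systematic grid with a
    uniform random start; each point is then independently displaced by a
    N(0, jitter_sd) placement error (wrapped back into [0, 1))."""
    start = rng.uniform(0, 1.0 / n_points)
    x = start + np.arange(n_points) / n_points
    x = (x + rng.normal(0, jitter_sd, n_points)) % 1.0
    return f(x).mean()

reps = 2000
exact = np.var([systematic_estimate(10, 0.0) for _ in range(reps)])
jittered = np.var([systematic_estimate(10, 0.02) for _ in range(reps)])
```

    For this periodic integrand an exact 10-point systematic grid has essentially zero estimator variance, while even a 2% jitter in point placement produces a clearly nonzero variance, matching the paper's qualitative conclusion.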

  10. Transition conductivity study of high temperature superconductor compounds: the role of fluctuations

    International Nuclear Information System (INIS)

    Pagnon, V.

    1991-04-01

    This thesis is devoted to the study of the conductivity transition of high-temperature superconductors in correlation with their anisotropy. Systematic conductivity measurements were made on YBaCuO and BaSrCaCuO as a function of temperature from 4.2 K to 1200 K, and in magnetic fields up to 8 T applied in several directions. Oxygen order affects the characteristics of the YBaCuO conductivity transition; the activation energy for oxygen absorption is about 0.5 eV. A method of analysis of the conductivity fluctuations near the transition temperature is proposed. Two distinct regimes are noticeable in the YBaCuO compound: the 3D fluctuation regime in the immediate neighbourhood of the transition gives way to a 2D fluctuation regime at higher temperature. The transition temperatures governing each regime are different, which is incompatible with the formula proposed by Lawrence and Doniach; on the other hand, the analogy with quasi-2D magnetic systems seems more relevant. Applying a magnetic field or lowering the oxygen concentration removes the 3D fluctuation regime. Non-ohmic effects observed at the foot of the conductivity transition are analysed as a manifestation of non-linear 2D excitations of the superconducting phase. Finally, measurements on strontium-doped YBaCuO crystals confirm a metal-insulator transition along the c-axis when the oxygen concentration is reduced; this is connected with the specific-heat jump. All these results highlight the fundamentally two-dimensional character of high-transition-temperature superconductivity.

  11. An algorithm to improve sampling efficiency for uncertainty propagation using sampling based method

    International Nuclear Information System (INIS)

    Campolina, Daniel; Lima, Paulo Rubens I.; Pereira, Claubia; Veloso, Maria Auxiliadora F.

    2015-01-01

    Sample size and computational uncertainty were varied in order to investigate sampling efficiency and convergence of the sampling-based method for uncertainty propagation. The transport code MCNPX was used to simulate an LWR model and allow the mapping from uncertain inputs of the benchmark experiment to uncertain outputs. Random sampling efficiency was improved through the use of an algorithm for selecting distributions. Mean range, standard deviation range and skewness were verified in order to obtain a better representation of the uncertainty figures. A standard deviation of 5 pcm in the propagated uncertainties over 10 replicates of n samples was adopted as the convergence criterion for the method. Estimation of a 75 pcm uncertainty on the reactor k(eff) was accomplished by using a sample of size 93 and a computational uncertainty of 28 pcm to propagate the 1σ uncertainty of the burnable poison radius. For a fixed computational time, in order to reduce the variance of the propagated uncertainty, it was found, for the example under investigation, that it is preferable to double the sample size rather than to double the number of particles followed in the Monte Carlo process of the MCNPX code. (author)
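
    The replicate-based convergence criterion described above is generic and can be sketched without a transport code. Below, a cheap fictitious model stands in for the MCNPX calculation; the input uncertainty, noise level, and coefficients are all invented for illustration, only the structure (sample, propagate, compare replicate spread against a tolerance) follows the abstract:

```python
import numpy as np

rng = np.random.default_rng(42)

def model(radius):
    """Stand-in for the transport calculation: maps the uncertain input
    (burnable-poison radius) to an output 'k_eff' (entirely fictitious),
    with a small computational noise term."""
    return 1.0 - 0.08 * radius + rng.normal(0.0, 3e-5)

def propagated_sd(sample_size):
    """One replicate: sample the input, run the model for each draw,
    and return the standard deviation of the outputs."""
    radii = rng.normal(1.0, 0.01, sample_size)  # 1-sigma input uncertainty
    return np.std([model(r) for r in radii])

# Convergence check: spread of the propagated sd across replicates,
# to be compared against a tolerance (the paper uses 5 pcm = 5e-5).
replicates = [propagated_sd(93) for _ in range(10)]
mean_sd = np.mean(replicates)  # the propagated uncertainty estimate
spread = np.std(replicates)    # replicate-to-replicate variability
```

    If `spread` exceeds the tolerance, either the sample size or the per-run computational precision must increase; the abstract's finding is that, per unit of computer time, growing the sample size is the better investment for reducing this spread.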

  12. 40 CFR 89.507 - Sample selection.

    Science.gov (United States)

    2010-07-01

    ... Auditing § 89.507 Sample selection. (a) Engines comprising a test sample will be selected at the location...). However, once the manufacturer ships any test engine, it relinquishes the prerogative to conduct retests...

  13. Tracer gas diffusion sampling test plan

    International Nuclear Information System (INIS)

    Rohay, V.J.

    1993-01-01

    Efforts are under way to employ active and passive vapor extraction to remove carbon tetrachloride from the soil in the 200 West Area of the Hanford Site as part of the 200 West Area Carbon Tetrachloride Expedited Response Action. In the active approach, a vacuum is applied to a well, which causes soil gas surrounding the well to be drawn up to the surface. The contaminated air is cleaned by passage through a granular activated carbon bed. There are questions concerning the radius of influence associated with application of the vacuum system and related uncertainties about the soil-gas diffusion rates with and without the vacuum system present. To address these questions, a series of tracer gas diffusion sampling tests is proposed in which an inert, nontoxic tracer gas, sulfur hexafluoride (SF 6 ), will be injected into a well, and the rates of SF 6 diffusion through the surrounding soil horizon will be measured by sampling in nearby wells. Tracer gas tests will be conducted at sites very near the active vacuum extraction system and also at sites beyond the radius of influence of the active vacuum system. In the passive vapor extraction approach, barometric pressure fluctuations cause soil gas to be drawn to the surface through the well. At the passive sites, the effects of barometric "pumping" due to changes in atmospheric pressure will be investigated. Application of tracer gas testing to both the active and passive vapor extraction methods is described in the wellfield enhancement work plan (Rohay and Cameron 1993)
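
    The expected tracer signal at a nearby well can be roughed out with the textbook instantaneous point-source diffusion solution. This is a hedged sketch only: the plan's actual geometry is more complex, and the effective diffusivity and release mass below are illustrative guesses, not values from the plan:

```python
from math import pi, exp

def point_source_conc(mass, D, r, t):
    """Concentration (kg/m^3) at distance r (m) and time t (s) after an
    instantaneous release of `mass` (kg) into a uniform infinite medium
    with effective diffusivity D (m^2/s): Gaussian point-source solution
    C(r, t) = M / (4*pi*D*t)^(3/2) * exp(-r^2 / (4*D*t))."""
    return mass / (4.0 * pi * D * t) ** 1.5 * exp(-r * r / (4.0 * D * t))

D = 1e-6  # m^2/s, effective soil-gas diffusivity (illustrative)
m = 0.1   # kg of SF6 released (illustrative)

# Concentration at a well 2 m away, one day vs one week after injection:
c_day = point_source_conc(m, D, 2.0, 86400)
c_week = point_source_conc(m, D, 2.0, 7 * 86400)
```

    With these numbers the concentration 2 m from the injection well is still rising a week after release, the diffusive front having barely arrived, which is why the tests sample repeatedly over time rather than once.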

  14. Effects of limited spatial resolution on fluctuation measurements (invited)

    International Nuclear Information System (INIS)

    Bravenec, R.V.; Wootton, A.J.

    1995-01-01

    The finite sample volumes of fluctuation diagnostics distort the measurements not only by averaging the gross fluctuation parameters over the sample volumes, but more importantly (except for collective scattering), by attenuating the shorter wavelength components. In this work, the response of various sample volume sizes and orientations to a model fluctuation power spectrum S(k,ω) is examined. The model spectrum is fashioned after observations by far-infrared scattering on TEXT. The sample-volume extent in the direction of propagation of the turbulence is shown to be the most critical: not only does it reduce the measured fluctuation amplitude and increase the correlation length (as does an extent perpendicular to the propagation direction), but it also reduces the measured mean frequency and increases the apparent average phase velocity of the fluctuations. The differing sizes, shapes, and orientations of the sample volumes among fluctuation diagnostics, as well as deliberate variations within a single diagnostic, provide information on the form of the underlying turbulence and can be exploited to refine the model

  15. Experimental and Sampling Design for the INL-2 Sample Collection Operational Test

    Energy Technology Data Exchange (ETDEWEB)

    Piepel, Gregory F.; Amidan, Brett G.; Matzke, Brett D.

    2009-02-16

    This report describes the experimental and sampling design developed to assess sampling approaches and methods for detecting contamination in a building and clearing the building for use after decontamination. An Idaho National Laboratory (INL) building will be contaminated with BG (Bacillus globigii, renamed Bacillus atrophaeus), a simulant for Bacillus anthracis (BA). The contamination, sampling, decontamination, and re-sampling will occur per the experimental and sampling design. This INL-2 Sample Collection Operational Test is being planned by the Validated Sampling Plan Working Group (VSPWG). The primary objectives are: 1) Evaluate judgmental and probabilistic sampling for characterization as well as probabilistic and combined (judgment and probabilistic) sampling approaches for clearance, 2) Conduct these evaluations for gradient contamination (from low or moderate down to absent or undetectable) for different initial concentrations of the contaminant, 3) Explore judgment composite sampling approaches to reduce sample numbers, 4) Collect baseline data to serve as an indication of the actual levels of contamination in the tests. A combined judgmental and random (CJR) approach uses Bayesian methodology to combine judgmental and probabilistic samples to make clearance statements of the form "X% confidence that at least Y% of an area does not contain detectable contamination" (X%/Y% clearance statements). The INL-2 experimental design has five test events, which 1) vary the floor of the INL building on which the contaminant will be released, 2) provide for varying the amount of contaminant released to obtain desired concentration gradients, and 3) investigate overt as well as covert release of contaminants. Desirable contaminant gradients would have moderate to low concentrations of contaminant in rooms near the release point, with concentrations down to zero in other rooms. Such gradients would provide a range of contamination levels to challenge the sampling
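The form of an X%/Y% clearance statement can be illustrated with a minimal Bayesian calculation. This is a sketch only, not the VSPWG's actual CJR model (which also incorporates judgmental samples): assume a uniform Beta(1, 1) prior on the contaminated fraction p of an area, so that after n clean random samples the posterior is Beta(1, n + 1), whose CDF has the closed form 1 - (1 - q)^(n+1).

```python
from math import ceil, log

def clearance_confidence(n_clean: int, y: float) -> float:
    """P(contaminated fraction <= 1 - y) under a Beta(1, n_clean + 1)
    posterior, i.e. a uniform prior updated with n_clean negative samples.
    The Beta(1, b) CDF at q is 1 - (1 - q)**b."""
    q = 1.0 - y  # maximum tolerable contaminated fraction
    return 1.0 - (1.0 - q) ** (n_clean + 1)

def samples_needed(x: float, y: float) -> int:
    """Smallest number of clean samples supporting an x/y clearance
    statement: solve 1 - y**(n + 1) >= x for n."""
    return ceil(log(1.0 - x) / log(y) - 1)

# e.g. 95%/99%: "95% confidence that at least 99% of the area is clean"
n = samples_needed(0.95, 0.99)
print(n, clearance_confidence(n, 0.99))
```

The steep growth of n as Y approaches 100% is one reason composite and judgmental sampling are explored to reduce sample numbers.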

  16. 40 CFR 90.507 - Sample selection.

    Science.gov (United States)

    2010-07-01

    ... Auditing § 90.507 Sample selection. (a) Engines comprising a test sample will be selected at the location... manufacturer ships any test engine, it relinquishes the prerogative to conduct retests as provided in § 90.508...

  17. Information sampling behavior with explicit sampling costs

    Science.gov (United States)

    Juni, Mordechai Z.; Gureckis, Todd M.; Maloney, Laurence T.

    2015-01-01

    The decision to gather information should take into account both the value of information and its accrual costs in time, energy and money. Here we explore how people balance the monetary costs and benefits of gathering additional information in a perceptual-motor estimation task. Participants were rewarded for touching a hidden circular target on a touch-screen display. The target’s center coincided with the mean of a circular Gaussian distribution from which participants could sample repeatedly. Each “cue” — sampled one at a time — was plotted as a dot on the display. Participants had to repeatedly decide, after sampling each cue, whether to stop sampling and attempt to touch the hidden target or continue sampling. Each additional cue increased the participants’ probability of successfully touching the hidden target but reduced their potential reward. Two experimental conditions differed in the initial reward associated with touching the hidden target and the fixed cost per cue. For each condition we computed the optimal number of cues that participants should sample, before taking action, to maximize expected gain. Contrary to recent claims that people gather less information than they objectively should before taking action, we found that participants over-sampled in one experimental condition, and did not significantly under- or over-sample in the other. Additionally, while the ideal observer model ignores the current sample dispersion, we found that participants used it to decide whether to stop sampling and take action or continue sampling, a possible consequence of imperfect learning of the underlying population dispersion across trials. PMID:27429991
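The cost-benefit trade-off in this task can be sketched with an ideal-observer calculation. All parameters below (reward, cost per cue, scatter, target radius) are hypothetical, and the hit-probability model is an assumption: the mean of n i.i.d. circular Gaussian cues has radial error distributed as Rayleigh with scale sigma/sqrt(n), giving hit probability 1 - exp(-r^2 n / (2 sigma^2)).

```python
from math import exp

def hit_probability(n: int, sigma: float, radius: float) -> float:
    """P(mean of n cues lands within `radius` of the target center),
    assuming i.i.d. circular Gaussian cues with per-axis sd `sigma`."""
    return 1.0 - exp(-(radius ** 2) * n / (2.0 * sigma ** 2))

def optimal_cues(reward: float, cost: float, sigma: float, radius: float,
                 n_max: int = 100) -> int:
    """Number of cues maximizing expected gain (reward - n*cost) * p_hit(n)."""
    gains = {n: (reward - n * cost) * hit_probability(n, sigma, radius)
             for n in range(1, n_max + 1)}
    return max(gains, key=gains.get)

# Hypothetical condition: reward 10, cost 0.25 per cue, sigma = 3 x radius
print(optimal_cues(reward=10.0, cost=0.25, sigma=3.0, radius=1.0))
```

Comparing participants' stopping points against such a maximum-expected-gain benchmark is how over- and under-sampling are diagnosed.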

  18. Hanford Site Environmental Surveillance Master Sampling Schedule

    International Nuclear Information System (INIS)

    Bisping, L.E.

    1999-01-01

    Environmental surveillance of the Hanford Site and surrounding areas is conducted by the Pacific Northwest National Laboratory (PNNL) for the U.S. Department of Energy (DOE). Sampling is conducted to evaluate levels of radioactive and nonradioactive pollutants in the Hanford environs, as required in DOE Order 5400.1, ''General Environmental Protection Program,'' and DOE Order 5400.5, ''Radiation Protection of the Public and the Environment.'' The sampling methods are described in the Environmental Monitoring Plan, United States Department of Energy, Richland Operations Office, DOE/RL-91-50, Rev. 2, U.S. Department of Energy, Richland, Washington. This document contains the CY1999 schedules for the routine collection of samples for the Surface Environmental Surveillance Project (SESP) and Drinking Water Monitoring Project. Each section includes the sampling location, sample type, and analyses to be performed on the sample. In some cases, samples are scheduled on a rotating basis and may not be collected in 1999, in which case the anticipated year for collection is provided. In addition, a map is included for each media showing approximate sampling locations

  19. Genetic Contribution to Alcohol Dependence: Investigation of a Heterogeneous German Sample of Individuals with Alcohol Dependence, Chronic Alcoholic Pancreatitis, and Alcohol-Related Cirrhosis

    Directory of Open Access Journals (Sweden)

    Jens Treutlein

    2017-07-01

    The present study investigated the genetic contribution to alcohol dependence (AD) using genome-wide association data from three German samples. These comprised patients with: (i) AD; (ii) chronic alcoholic pancreatitis (ACP); and (iii) alcohol-related liver cirrhosis (ALC). Single marker, gene-based, and pathway analyses were conducted. A significant association was detected for the ADH1B locus in a gene-based approach (uncorrected p = 1.2 × 10−6; corrected p = 0.020). This was driven by the AD subsample. No association with ADH1B was found in the combined ACP + ALC sample. On first inspection, this seems surprising, since ADH1B is a robustly replicated risk gene for AD and might therefore be expected to be associated with subgroups of AD patients as well. The negative finding in the ACP + ALC sample, however, may reflect genetic stratification as well as random fluctuation of allele frequencies in the cases and controls, demonstrating the importance of large samples in which the phenotype is well assessed.

  20. Defining And Characterizing Sample Representativeness For DWPF Melter Feed Samples

    Energy Technology Data Exchange (ETDEWEB)

    Shine, E. P.; Poirier, M. R.

    2013-10-29

    Representative sampling is important throughout the Defense Waste Processing Facility (DWPF) process, and the demonstrated success of the DWPF process to achieve glass product quality over the past two decades is a direct result of the quality of information obtained from the process. The objective of this report was to present sampling methods that the Savannah River Site (SRS) used to qualify waste being dispositioned at the DWPF. The goal was to emphasize the methodology, not a list of outcomes from those studies. This methodology includes proven methods for taking representative samples, the use of controlled analytical methods, and data interpretation and reporting that considers the uncertainty of all error sources. Numerous sampling studies were conducted during the development of the DWPF process and still continue to be performed in order to evaluate options for process improvement. Study designs were based on use of statistical tools applicable to the determination of uncertainties associated with the data needs. Successful designs are apt to be repeated, so this report chose only to include prototypic case studies that typify the characteristics of frequently used designs. Case studies have been presented for studying in-tank homogeneity, evaluating the suitability of sampler systems, determining factors that affect mixing and sampling, comparing the final waste glass product chemical composition and durability to that of the glass pour stream sample and other samples from process vessels, and assessing the uniformity of the chemical composition in the waste glass product. Many of these studies efficiently addressed more than one of these areas of concern associated with demonstrating sample representativeness and provide examples of statistical tools in use for DWPF. The time when many of these designs were implemented was in an age when the sampling ideas of Pierre Gy were not as widespread as they are today. Nonetheless, the engineers and

  2. Rapid Gamma Screening of Shipments of Analytical Samples to Meet DOT Regulations

    International Nuclear Information System (INIS)

    Wojtaszek, P.A.; Remington, D.L.; Ideker-Mulligan, V.

    2006-01-01

    The accelerated closure program at Rocky Flats required the capacity to ship up to 1000 analytical samples per week to off-site commercial laboratories, and to conduct such shipment within 24 hours of sample collection. During a period of near peak activity in the closure project, a regulatory change significantly increased the level of radionuclide data required for shipment of each package. In order to meet these dual challenges, a centralized and streamlined sample management program was developed which channeled analytical samples through a single, high-throughput radiological screening facility. This trailerized facility utilized high purity germanium (HPGe) gamma spectrometers to conduct screening measurements of entire packages of samples at once, greatly increasing throughput compared to previous methods. The In Situ Object Counting System (ISOCS) was employed to calibrate the HPGe systems to accommodate the widely varied sample matrices and packing configurations encountered. Optimum modeling and configuration parameters were determined. Accuracy of the measurements of grouped sample jars was confirmed with blind samples in multiple configurations. Levels of radionuclides not observable by gamma spectroscopy were calculated utilizing a spreadsheet program that can accommodate isotopic ratios for large numbers of different waste streams based upon acceptable knowledge. This program integrated all radionuclide data and output all information required for shipment, including the shipping class of the package. (authors)

  3. The fluctuation Hall conductivity and the Hall angle in type-II superconductor under magnetic field

    Energy Technology Data Exchange (ETDEWEB)

    Tinh, Bui Duc, E-mail: tinhbd@hnue.edu.vn [Institute of Research and Development, Duy Tan University, K7/25 Quang Trung, Danang (Viet Nam); Department of Physics, Hanoi National University of Education, 136 Xuan Thuy, Cau Giay, Hanoi (Viet Nam); Hoc, Nguyen Quang; Thu, Le Minh [Department of Physics, Hanoi National University of Education, 136 Xuan Thuy, Cau Giay, Hanoi (Viet Nam)

    2016-02-15

    Highlights: • The time-dependent Ginzburg–Landau theory was used to calculate the fluctuation Hall conductivity and Hall angle in a type-II superconductor in 2D and 3D. • We obtain analytical expressions for the fluctuation Hall conductivity and the Hall angle by summing over all Landau levels, without the need to cut off higher Landau levels, so that arbitrary magnetic fields can be treated. • The results were compared to experimental data on YBCO. - Abstract: The fluctuation Hall conductivity and the Hall angle, which describe the Hall effect, are calculated for an arbitrary value of the imaginary part of the relaxation time within the framework of time-dependent Ginzburg–Landau theory for a type-II superconductor, with thermal noise representing strong thermal fluctuations. The self-consistent Gaussian approximation is used to treat the nonlinear interaction term in the dynamics. We obtain analytical expressions for the fluctuation Hall conductivity and the Hall angle by summing over all Landau levels, without the need to cut off higher Landau levels, so that arbitrary magnetic fields can be treated. The results are compared with experimental data on a high-T{sub c} superconductor.

  4. Intermittent Contact Alternating Current Scanning Electrochemical Microscopy: A Method for Mapping Conductivities in Solid Li Ion Conducting Electrolyte Samples

    Energy Technology Data Exchange (ETDEWEB)

    Catarelli, Samantha Raisa; Lonsdale, Daniel [Uniscan Instruments Ltd., Macclesfield (United Kingdom); Cheng, Lei [Energy Storage and Distribution Resources Division, Lawrence Berkeley National Laboratory, Berkeley, CA (United States); Materials Sciences and Engineering Department, University of California Berkeley, Berkeley, CA (United States); Syzdek, Jaroslaw [Bio-Logic USA LLC, Knoxville, TN (United States); Doeff, Marca, E-mail: mmdoeff@lbl.gov [Energy Storage and Distribution Resources Division, Lawrence Berkeley National Laboratory, Berkeley, CA (United States)

    2016-03-31

    Intermittent contact alternating current scanning electrochemical microscopy (ic-ac-SECM) has been used to determine the electrochemical response to an ac signal of several types of materials. A conductive gold foil and insulating Teflon sheet were first used to demonstrate that the intermittent contact function allows the topography and conductivity to be mapped simultaneously and independently in a single experiment. Then, a dense pellet of an electronically insulating but Li ion conducting garnet phase, Al-substituted Li{sub 7}La{sub 3}Zr{sub 2}O{sub 12} (LLZO), was characterized using the same technique. The polycrystalline pellet was prepared by classical ceramic sintering techniques and was comprised of large (~150 μm) grains. Critical information regarding the contributions of grain and grain boundary resistances to the total conductivity of the garnet phase was lacking due to ambiguities in the impedance data. In contrast, the use of the ic-ac-SECM technique allowed spatially resolved information regarding local conductivities to be measured directly. Impedance mapping of the pellet showed that the grain boundary resistance, while generally higher than that of grains, varied considerably, revealing the complex nature of the LLZO sample.

  5. Samples and Sampling Protocols for Scientific Investigations | Joel ...

    African Journals Online (AJOL)

    ... from sampling, through sample preparation, calibration to final measurement and reporting. This paper, therefore offers useful information on practical guidance on sampling protocols in line with best practice and international standards. Keywords: Sampling, sampling protocols, chain of custody, analysis, documentation ...

  6. Groundwater sampling: Chapter 5

    Science.gov (United States)

    Wang, Qingren; Munoz-Carpena, Rafael; Foster, Adam; Migliaccio, Kati W.; Li, Yuncong; Migliaccio, Kati

    2011-01-01

    About the book: As water quality becomes a leading concern for people and ecosystems worldwide, it must be properly assessed in order to protect water resources for current and future generations. Water Quality Concepts, Sampling, and Analyses supplies practical information for planning, conducting, or evaluating water quality monitoring programs. It presents the latest information and methodologies for water quality policy, regulation, monitoring, field measurement, laboratory analysis, and data analysis. The book addresses water quality issues, water quality regulatory development, monitoring and sampling techniques, best management practices, and laboratory methods related to the water quality of surface and ground waters. It also discusses basic concepts of water chemistry and hydrology related to water sampling and analysis; instrumentation; water quality data analysis; and evaluation and reporting results.

  7. An 'intelligent' approach to radioimmunoassay sample counting employing a microprocessor-controlled sample counter

    International Nuclear Information System (INIS)

    Ekins, R.P.; Sufi, S.; Malan, P.G.

    1978-01-01

    The enormous impact on medical science in the last two decades of microanalytical techniques employing radioisotopic labels has, in turn, generated a large demand for automatic radioisotopic sample counters. Such instruments frequently comprise the most important item of capital equipment required in the use of radioimmunoassay and related techniques and often form a principal bottleneck in the flow of samples through a busy laboratory. It is therefore imperative that such instruments should be used 'intelligently' and in an optimal fashion to avoid both the very large capital expenditure involved in the unnecessary proliferation of instruments and the time delays arising from their sub-optimal use. Most of the current generation of radioactive sample counters nevertheless rely on primitive control mechanisms based on a simplistic statistical theory of radioactive sample counting which preclude their efficient and rational use. The fundamental principle upon which this approach is based is that it is useless to continue counting a radioactive sample for a time longer than that required to yield a significant increase in precision of the measurement. Thus, since substantial experimental errors occur during sample preparation, these errors should be assessed and must be related to the counting errors for that sample. The objective of the paper is to demonstrate that the combination of a realistic statistical assessment of radioactive sample measurement, together with the more sophisticated control mechanisms that modern microprocessor technology make possible, may often enable savings in counter usage of the order of 5- to 10-fold to be made. (author)
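The stopping principle can be sketched numerically. This is an illustration of the general counting-statistics argument, not the instrument's actual control algorithm: the relative variance of a Poisson count of N events is 1/N, so once N is large enough that this term barely inflates the total error beyond the sample-preparation CV, further counting is wasted.

```python
from math import sqrt

def counts_required(prep_cv: float, max_degradation: float = 0.05) -> int:
    """Counts N such that the Poisson counting error (relative variance
    1/N) inflates the total relative error by no more than
    max_degradation over the preparation CV alone:
        sqrt(prep_cv**2 + 1/N) <= (1 + max_degradation) * prep_cv."""
    limit = ((1 + max_degradation) ** 2 - 1) * prep_cv ** 2
    return int(1 / limit) + 1

# With a 5% sample-preparation CV, this is the point of diminishing returns:
print(counts_required(0.05))
```

For a 5% preparation CV only a few thousand counts are needed; counting a low-activity sample far beyond that, as a naive fixed-precision rule would, is exactly the waste the 'intelligent' counter avoids.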

  8. A solid phase extraction-ion chromatography with conductivity detection procedure for determining cationic surfactants in surface water samples.

    Science.gov (United States)

    Olkowska, Ewa; Polkowska, Żaneta; Namieśnik, Jacek

    2013-11-15

    A new analytical procedure for the simultaneous determination of individual cationic surfactants (alkyl benzyl dimethyl ammonium chlorides) in surface water samples has been developed. We describe this methodology for the first time: it involves the application of solid phase extraction (SPE, for sample preparation) coupled with ion chromatography-conductivity detection (IC-CD, for the final determination). Mean recoveries of analytes between 79% and 93%, and overall method quantification limits in the range from 0.0018 to 0.038 μg/mL, were achieved for surface water and CRM samples. The methodology was applied to the determination of individual alkyl benzyl quaternary ammonium compounds in environmental samples (reservoir water) and enables their presence in such waters to be confirmed. In addition, it is simpler, less time-consuming and labour-intensive, avoids the use of toxic chloroform, and is significantly less expensive than previously described approaches (liquid-liquid extraction coupled with liquid chromatography-mass spectrometry). Copyright © 2013 Elsevier B.V. All rights reserved.

  9. Hanford Site Environmental Surveillance Master Sampling Schedule

    International Nuclear Information System (INIS)

    Bisping, L.E.

    2000-01-01

    Environmental surveillance of the Hanford Site and surrounding areas is conducted by the Pacific Northwest National Laboratory (PNNL) for the U.S. Department of Energy (DOE). Sampling is conducted to evaluate levels of radioactive and nonradioactive pollutants in the Hanford environs, as required in DOE Order 5400.1, General Environmental Protection Program, and DOE Order 5400.5, Radiation Protection of the Public and the Environment. The sampling design is described in the Environmental Monitoring Plan, United States Department of Energy, Richland Operations Office, DOE/RL-91-50, Rev. 2, U.S. Department of Energy, Richland, Washington. This document contains the CY 2000 schedules for the routine collection of samples for the Surface Environmental Surveillance Project (SESP) and Drinking Water Monitoring Project. Each section includes sampling locations, sample types, and analyses to be performed. In some cases, samples are scheduled on a rotating basis and may not be collected in 2000, in which case the anticipated year for collection is provided. In addition, a map showing approximate sampling locations is included for each media scheduled for collection

  10. Pierre Gy's sampling theory and sampling practice heterogeneity, sampling correctness, and statistical process control

    CERN Document Server

    Pitard, Francis F

    1993-01-01

    Pierre Gy's Sampling Theory and Sampling Practice, Second Edition is a concise, step-by-step guide for process variability management and methods. Updated and expanded, this new edition provides a comprehensive study of heterogeneity, covering the basic principles of sampling theory and its various applications. It presents many practical examples to allow readers to select appropriate sampling protocols and assess the validity of sampling protocols from others. The variability of dynamic process streams using variography is discussed to help bridge sampling theory with statistical process control. Many descriptions of good sampling devices, as well as descriptions of poor ones, are featured to educate readers on what to look for when purchasing sampling systems. The book uses its accessible, tutorial style to focus on professional selection and use of methods. The book will be a valuable guide for mineral processing engineers; metallurgists; geologists; miners; chemists; environmental scientists; and practit...

  11. The design of high-temperature thermal conductivity measurements apparatus for thin sample size

    Directory of Open Access Journals (Sweden)

    Hadi Syamsul

    2017-01-01

    This study presents the design, construction, and validation of a thermal conductivity apparatus that uses steady-state heat-transfer techniques and is capable of testing materials at high temperatures. The design improves on the ASTM D5470 standard, in which meter-bars of equal cross-sectional area are used to extrapolate surface temperatures and measure heat transfer across a sample. The apparatus has two meter-bars, each fitted with three thermocouples. It uses a 1,000-watt heater and cooling water to reach steady conditions. A pressure of 3.4 MPa was applied over the 113.09 mm2 meter-bar cross-section, and thermal grease was used to minimize interfacial thermal contact resistance. To determine its performance, the apparatus was validated by comparing its results with thermal conductivities obtained with the THB 500 made by LINSEIS. The tests gave thermal conductivities of 15.28 Wm-1K-1 for stainless steel and 38.01 Wm-1K-1 for bronze, differing from the THB 500 values by −2.55% and 2.49%, respectively. Furthermore, the apparatus can measure thermal conductivity at temperatures up to 400°C, where the result for stainless steel is 19.21 Wm-1K-1 and the difference is 7.93%.
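The meter-bar data reduction behind this kind of apparatus can be sketched as follows. This is a generic ASTM D5470-style calculation, not the authors' own code, and every number in the example (reference-bar conductivity, thermocouple positions, temperatures) is illustrative: fit the three thermocouple readings in each bar, obtain the heat flow from Fourier's law in the bars, extrapolate both profiles to the sample faces, and apply k = q·t/(A·ΔT).

```python
def linear_fit(x, y):
    """Least-squares slope and intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    return slope, my - slope * mx

def sample_conductivity(z_hot, T_hot, z_cold, T_cold,
                        k_bar, area, thickness, z_hot_face, z_cold_face):
    """Meter-bar reduction: heat flow from the hot bar's temperature
    gradient (Fourier's law, bar conductivity k_bar), surface temperatures
    extrapolated from both bars, then k_sample = q*t / (A*dT)."""
    s_hot, b_hot = linear_fit(z_hot, T_hot)
    s_cold, b_cold = linear_fit(z_cold, T_cold)
    q = -k_bar * s_hot * area                    # heat flow through hot bar, W
    dT = (s_hot * z_hot_face + b_hot) - (s_cold * z_cold_face + b_cold)
    return q * thickness / (area * dT)

# Illustrative numbers only: 50 W/m-K reference bars, thermocouples 10 mm
# apart, 113.09 mm^2 cross-section, 5 mm thick sample.
k = sample_conductivity([0.0, 0.01, 0.02], [100.0, 90.0, 80.0],
                        [0.035, 0.045, 0.055], [57.5, 47.5, 37.5],
                        k_bar=50.0, area=113.09e-6, thickness=0.005,
                        z_hot_face=0.025, z_cold_face=0.03)
print(round(k, 3))
```

In practice the hot-bar and cold-bar heat flows are also compared to check for lateral losses, which matter increasingly at high temperature.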

  12. Hanford site environmental surveillance master sampling schedule

    International Nuclear Information System (INIS)

    Bisping, L.E.

    1998-01-01

    Environmental surveillance of the Hanford Site and surrounding areas is conducted by the Pacific Northwest National Laboratory (PNNL) for the U.S. Department of Energy (DOE). Sampling is conducted to evaluate levels of radioactive and nonradioactive pollutants in the Hanford environs, as required in DOE Order 5400.1, ''General Environmental Protection Program,'' and DOE Order 5400.5, ''Radiation Protection of the Public and the Environment.'' The sampling methods are described in the Environmental Monitoring Plan, United States Department of Energy, Richland Operations Office, DOE/RL-91-50, Rev. 2, U.S. Department of Energy, Richland, Washington. This document contains the 1998 schedules for routine collection of samples for the Surface Environmental Surveillance Project (SESP) and Drinking Water Monitoring Project. Each section of this document describes the planned sampling schedule for a specific media (air, surface water, biota, soil and vegetation, sediment, and external radiation). Each section includes the sample location, sample type, and analyses to be performed on the sample. In some cases, samples are scheduled on a rotating basis and may not be planned for 1998, in which case the anticipated year for collection is provided. In addition, a map is included for each media showing sample locations

  13. Determination of gamma-hydroxybutyric acid in clinical samples using capillary electrophoresis with contactless conductivity detection

    Czech Academy of Sciences Publication Activity Database

    Gong, X.Y.; Kubáň, Pavel; Scholer, A.; Hauser, P.C.

    2008-01-01

    Roč. 1213, 1-2 (2008), s. 100-104 ISSN 0021-9673 R&D Projects: GA AV ČR IAA400310609; GA AV ČR IAA400310703; GA ČR GA203/08/1536 Institutional research plan: CEZ:AV0Z40310501 Keywords : clinical samples * capillary electrophoresis * contactless conductivity detection Subject RIV: CB - Analytical Chemistry, Separation Impact factor: 3.756, year: 2008

  14. Statistical sampling method for releasing decontaminated vehicles

    International Nuclear Information System (INIS)

    Lively, J.W.; Ware, J.A.

    1996-01-01

    Earth moving vehicles (e.g., dump trucks, belly dumps) commonly haul radiologically contaminated materials from a site being remediated to a disposal site. Traditionally, each vehicle must be surveyed before being released. The logistical difficulties of implementing the traditional approach on a large scale demand that an alternative be devised. A statistical method (MIL-STD-105E, ''Sampling Procedures and Tables for Inspection by Attributes'') for assessing product quality from a continuous process was adapted to the vehicle decontamination process. This method produced a sampling scheme that automatically compensates and accommodates fluctuating batch sizes and changing conditions without the need to modify or rectify the sampling scheme in the field. Vehicles are randomly selected (sampled) upon completion of the decontamination process to be surveyed for residual radioactive surface contamination. The frequency of sampling is based on the expected number of vehicles passing through the decontamination process in a given period and the confidence level desired. This process has been successfully used for 1 year at the former uranium mill site in Monticello, Utah (a CERCLA regulated clean-up site). The method forces improvement in the quality of the decontamination process and results in a lower likelihood that vehicles exceeding the surface contamination standards are offered for survey. Implementation of this statistical sampling method at the Monticello project has resulted in more efficient processing of vehicles through decontamination and radiological release, saved hundreds of hours of processing time, provided a high level of confidence that release limits are met, and improved the radiological cleanliness of vehicles leaving the controlled site
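The statistical leverage of attribute sampling can be illustrated with an operating-characteristic (OC) calculation. This is a generic binomial sketch, not the MIL-STD-105E tables themselves, and the sample size and fractions below are hypothetical: for a zero-acceptance plan, the lot is released only if none of the n surveyed vehicles show contamination.

```python
from math import comb

def accept_probability(n: int, c: int, p: float) -> float:
    """One point on the OC curve: probability of finding at most c
    contaminated vehicles in a random sample of n when a fraction p of
    all vehicles is contaminated (binomial model)."""
    return sum(comb(n, k) * p ** k * (1 - p) ** (n - k)
               for k in range(c + 1))

# Hypothetical zero-acceptance plan: survey 13 vehicles per period.
for p in (0.01, 0.05, 0.20):
    print(p, round(accept_probability(13, 0, p), 3))
```

A process with a high contamination rate is very unlikely to pass, which is the mechanism by which the scheme "forces improvement" in decontamination quality.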

  15. Sampling rare fluctuations of height in the Oslo ricepile model

    International Nuclear Information System (INIS)

    Pradhan, Punyabrata; Dhar, Deepak

    2007-01-01

    We describe a new Monte Carlo algorithm for studying large deviations of the height of the pile from its mean value in the Oslo ricepile model. We have used it to generate events which have probabilities of order 10^-1000. These simulations check our qualitative argument (Pradhan P and Dhar D 2006 Phys. Rev. E 73 021303) that in the steady state of the Oslo ricepile model the probability of large negative height fluctuations Δh = -αL about the mean varies as exp(-κα^4 L^3) as L → ∞ with α held fixed and κ > 0
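For context, the Oslo ricepile dynamics themselves are simple to simulate directly. The sketch below is an unbiased simulation (assumed conventions: grains dropped at site 1, critical slopes redrawn uniformly from {1, 2} after each toppling, open right boundary); it samples only typical height fluctuations, which is precisely why events with probabilities of order 10^-1000 require the biased Monte Carlo algorithm of the paper.

```python
import random

def oslo_steady_heights(L: int, grains: int, seed: int = 0):
    """Direct simulation of the Oslo ricepile model in the slope
    representation. A site topples when its slope exceeds its critical
    slope; each toppling redraws that site's critical slope from {1, 2}.
    Returns the height of site 1 (the sum of slopes) after each grain."""
    rng = random.Random(seed)
    z = [0] * L                                   # local slopes
    zc = [rng.choice((1, 2)) for _ in range(L)]   # critical slopes
    heights = []
    for _ in range(grains):
        z[0] += 1                                 # drop a grain at site 1
        unstable = True
        while unstable:                           # relax until stable
            unstable = False
            for i in range(L):
                if z[i] > zc[i]:
                    z[i] -= 2 if i < L - 1 else 1  # grain leaves at the edge
                    if i > 0:
                        z[i - 1] += 1
                    if i < L - 1:
                        z[i + 1] += 1
                    zc[i] = rng.choice((1, 2))
                    unstable = True
        heights.append(sum(z))
    return heights

h = oslo_steady_heights(L=8, grains=500)
print(min(h), max(h), sum(h[-100:]) / 100)
```

Histogramming these heights recovers the bulk of the steady-state distribution; the exp(-κα^4 L^3) tail lies far beyond the reach of such direct sampling.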

  16. Sampling and chemical analysis in environmental samples around Nuclear Power Plants and some environmental samples

    Energy Technology Data Exchange (ETDEWEB)

    Cho, Yong Woo; Han, Man Jung; Cho, Seong Won; Cho, Hong Jun; Oh, Hyeon Kyun; Lee, Jeong Min; Chang, Jae Sook [KORTIC, Taejon (Korea, Republic of)

    2002-12-15

    Twelve kinds of environmental samples, such as soil, seawater, and underground water, were collected around Nuclear Power Plants (NPPs). Tritium analysis was performed on samples of rain water, pine needles, air, seawater, underground water, chinese cabbage, rice grains, and milk collected around the NPPs, and on surface seawater and rain water sampled across the country. Strontium and tritium in soil sampled at 60 districts in Korea were analyzed. Tritium was also analyzed in 21 samples of surface seawater around the Korean peninsula supplied by KFRDI (National Fisheries Research and Development Institute). Sampling and chemical analysis of environmental samples around the Kori, Woolsung, Youngkwang, and Wooljin NPPs and the Taeduk science town for tritium and strontium analysis were managed according to plan. The work was handed over to KINS after all samples had been analyzed.

  17. Choosing a suitable sample size in descriptive sampling

    International Nuclear Information System (INIS)

    Lee, Yong Kyun; Choi, Dong Hoon; Cha, Kyung Joon

    2010-01-01

    Descriptive sampling (DS) is an alternative to crude Monte Carlo sampling (CMCS) in finding solutions to structural reliability problems. It is known to be an effective sampling method in approximating the distribution of a random variable because it uses the deterministic selection of sample values and their random permutation. However, because this method is difficult to apply to complex simulations, the sample size is occasionally determined without thorough consideration. Input sample variability may cause the sample size to change between runs, leading to poor simulation results. This paper proposes a numerical method for choosing a suitable sample size for use in DS. Using this method, one can estimate a more accurate probability of failure in a reliability problem while running a minimal number of simulations. The method is then applied to several examples and compared with CMCS and conventional DS to validate its usefulness and efficiency
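    The core of DS, deterministic selection of sample values followed by their random permutation, can be sketched in a few lines. The exponential target distribution, sample size, and midpoint quantile rule below are illustrative assumptions, not the paper's reliability examples.

```python
import math
import random
import statistics

def descriptive_sample(inv_cdf, n, rng):
    """Descriptive sampling (DS): deterministic, evenly spaced quantiles
    of the target distribution, shuffled into a random order."""
    values = [inv_cdf((i + 0.5) / n) for i in range(n)]  # deterministic selection
    rng.shuffle(values)                                  # random permutation
    return values

def crude_mc_sample(inv_cdf, n, rng):
    """Crude Monte Carlo sampling (CMCS): inverse-CDF transform of i.i.d. uniforms."""
    return [inv_cdf(rng.random()) for _ in range(n)]

# Illustration with an exponential(1) target: its inverse CDF is -ln(1 - u).
inv_cdf = lambda u: -math.log(1.0 - u)
ds = descriptive_sample(inv_cdf, 1000, random.Random(0))
mc = crude_mc_sample(inv_cdf, 1000, random.Random(0))
ds_error = abs(statistics.mean(ds) - 1.0)  # deviation from the true mean, 1.0
mc_error = abs(statistics.mean(mc) - 1.0)
```

    Because the DS values are fixed quantiles, every run reproduces the same empirical distribution; only the ordering is random, which is why DS estimates fluctuate less between runs than CMCS estimates.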

  18. Comparing fixed sampling with minimizer sampling when using k-mer indexes to find maximal exact matches.

    Science.gov (United States)

    Almutairy, Meznah; Torng, Eric

    2018-01-01

    Bioinformatics applications and pipelines increasingly use k-mer indexes to search for similar sequences. The major problem with k-mer indexes is that they require lots of memory. Sampling is often used to reduce index size and query time. Most applications use one of two major types of sampling: fixed sampling and minimizer sampling. It is well known that fixed sampling will produce a smaller index, typically by roughly a factor of two, whereas it is generally assumed that minimizer sampling will produce faster query times since query k-mers can also be sampled. However, no direct comparison of fixed and minimizer sampling has been performed to verify these assumptions. We systematically compare fixed and minimizer sampling using the human genome as our database. We use the resulting k-mer indexes for fixed sampling and minimizer sampling to find all maximal exact matches between our database, the human genome, and three separate query sets, the mouse genome, the chimp genome, and an NGS data set. We reach the following conclusions. First, using larger k-mers reduces query time for both fixed sampling and minimizer sampling at a cost of requiring more space. If we use the same k-mer size for both methods, fixed sampling requires typically half as much space whereas minimizer sampling processes queries only slightly faster. If we are allowed to use any k-mer size for each method, then we can choose a k-mer size such that fixed sampling both uses less space and processes queries faster than minimizer sampling. The reason is that although minimizer sampling is able to sample query k-mers, the number of shared k-mer occurrences that must be processed is much larger for minimizer sampling than fixed sampling. In conclusion, we argue that for any application where each shared k-mer occurrence must be processed, fixed sampling is the right sampling method.
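    The two sampling schemes compared in this record can be sketched as follows (a minimal illustration on a toy sequence; real k-mer indexes store occurrence lists, not tuples). The structural difference that the paper's query-time argument rests on is that minimizer sampling guarantees every window of w consecutive k-mers contributes a sampled k-mer, so query k-mers can be sampled the same way, whereas fixed sampling offers no such window guarantee.

```python
def fixed_sampling(seq, k, step):
    """Fixed sampling: keep the k-mers starting at every step-th position."""
    return [(i, seq[i:i + k]) for i in range(0, len(seq) - k + 1, step)]

def minimizer_sampling(seq, k, w):
    """Minimizer sampling: from every window of w consecutive k-mers,
    keep the lexicographically smallest one (leftmost on ties)."""
    kmers = [seq[i:i + k] for i in range(len(seq) - k + 1)]
    chosen = set()
    for start in range(len(kmers) - w + 1):
        window = kmers[start:start + w]
        best = min(range(w), key=lambda j: (window[j], j))
        chosen.add(start + best)
    return sorted((i, kmers[i]) for i in chosen)

fixed = fixed_sampling("ACGTACGT", k=3, step=2)
mins = minimizer_sampling("ACGTACGT", k=3, w=3)
```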

  19. Comparing fixed sampling with minimizer sampling when using k-mer indexes to find maximal exact matches.

    Directory of Open Access Journals (Sweden)

    Meznah Almutairy

    Full Text Available Bioinformatics applications and pipelines increasingly use k-mer indexes to search for similar sequences. The major problem with k-mer indexes is that they require lots of memory. Sampling is often used to reduce index size and query time. Most applications use one of two major types of sampling: fixed sampling and minimizer sampling. It is well known that fixed sampling will produce a smaller index, typically by roughly a factor of two, whereas it is generally assumed that minimizer sampling will produce faster query times since query k-mers can also be sampled. However, no direct comparison of fixed and minimizer sampling has been performed to verify these assumptions. We systematically compare fixed and minimizer sampling using the human genome as our database. We use the resulting k-mer indexes for fixed sampling and minimizer sampling to find all maximal exact matches between our database, the human genome, and three separate query sets, the mouse genome, the chimp genome, and an NGS data set. We reach the following conclusions. First, using larger k-mers reduces query time for both fixed sampling and minimizer sampling at a cost of requiring more space. If we use the same k-mer size for both methods, fixed sampling requires typically half as much space whereas minimizer sampling processes queries only slightly faster. If we are allowed to use any k-mer size for each method, then we can choose a k-mer size such that fixed sampling both uses less space and processes queries faster than minimizer sampling. The reason is that although minimizer sampling is able to sample query k-mers, the number of shared k-mer occurrences that must be processed is much larger for minimizer sampling than fixed sampling. In conclusion, we argue that for any application where each shared k-mer occurrence must be processed, fixed sampling is the right sampling method.

  20. Comparing fixed sampling with minimizer sampling when using k-mer indexes to find maximal exact matches

    Science.gov (United States)

    Torng, Eric

    2018-01-01

    Bioinformatics applications and pipelines increasingly use k-mer indexes to search for similar sequences. The major problem with k-mer indexes is that they require lots of memory. Sampling is often used to reduce index size and query time. Most applications use one of two major types of sampling: fixed sampling and minimizer sampling. It is well known that fixed sampling will produce a smaller index, typically by roughly a factor of two, whereas it is generally assumed that minimizer sampling will produce faster query times since query k-mers can also be sampled. However, no direct comparison of fixed and minimizer sampling has been performed to verify these assumptions. We systematically compare fixed and minimizer sampling using the human genome as our database. We use the resulting k-mer indexes for fixed sampling and minimizer sampling to find all maximal exact matches between our database, the human genome, and three separate query sets, the mouse genome, the chimp genome, and an NGS data set. We reach the following conclusions. First, using larger k-mers reduces query time for both fixed sampling and minimizer sampling at a cost of requiring more space. If we use the same k-mer size for both methods, fixed sampling requires typically half as much space whereas minimizer sampling processes queries only slightly faster. If we are allowed to use any k-mer size for each method, then we can choose a k-mer size such that fixed sampling both uses less space and processes queries faster than minimizer sampling. The reason is that although minimizer sampling is able to sample query k-mers, the number of shared k-mer occurrences that must be processed is much larger for minimizer sampling than fixed sampling. In conclusion, we argue that for any application where each shared k-mer occurrence must be processed, fixed sampling is the right sampling method. PMID:29389989

  1. Systematic Sampling and Cluster Sampling of Packet Delays

    OpenAIRE

    Lindh, Thomas

    2006-01-01

    Based on experiences of a traffic flow performance meter this paper suggests and evaluates cluster sampling and systematic sampling as methods to estimate average packet delays. Systematic sampling facilitates for example time analysis, frequency analysis and jitter measurements. Cluster sampling with repeated trains of periodically spaced sampling units separated by random starting periods, and systematic sampling, are evaluated with respect to accuracy and precision. Packet delay traces have been ...
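    Systematic sampling as described, a random starting offset followed by periodically spaced sampling units, can be sketched as follows. The synthetic exponential-jitter delay trace is an assumption for illustration, not data from the paper's performance meter.

```python
import random
import statistics

def systematic_sample(trace, period, rng):
    """Systematic sampling: one random starting offset, then every
    period-th observation of the packet-delay trace."""
    offset = rng.randrange(period)
    return trace[offset::period]

# Synthetic delay trace in milliseconds: 5 ms baseline plus exponential jitter.
rng = random.Random(42)
trace = [5.0 + rng.expovariate(1.0) for _ in range(10000)]
sample = systematic_sample(trace, period=100, rng=rng)
est_mean = statistics.mean(sample)    # estimate from 1% of the observations
true_mean = statistics.mean(trace)    # full-trace mean delay
```

    Because the sampled instants are evenly spaced, the same sample also supports the time and frequency analyses mentioned in the abstract, which purely random sampling would not.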

  2. Sampling pig farms at the abattoir in a cross-sectional study − Evaluation of a sampling method

    DEFF Research Database (Denmark)

    Birkegård, Anna Camilla; Hisham Beshara Halasa, Tariq; Toft, Nils

    2017-01-01

    A cross-sectional study design is relatively inexpensive, fast and easy to conduct when compared to other study designs. Careful planning is essential to obtaining a representative sample of the population, and the recommended approach is to use simple random sampling from an exhaustive list... of units in the target population. This approach is rarely feasible in practice, and other sampling procedures must often be adopted. For example, when slaughter pigs are the target population, sampling the pigs on the slaughter line may be an alternative to on-site sampling at a list of farms. However... slaughter pigs. It was not possible to visit farms within the designated timeframe. Therefore, it was decided to use convenience sampling at the abattoir. Our approach was carried out in three steps: 1) planning: using data from meat inspection to plan at which abattoirs and how many farms to sample; 2...

  3. Sampling

    CERN Document Server

    Thompson, Steven K

    2012-01-01

    Praise for the Second Edition "This book has never had a competitor. It is the only book that takes a broad approach to sampling . . . any good personal statistics library should include a copy of this book." —Technometrics "Well-written . . . an excellent book on an important subject. Highly recommended." —Choice "An ideal reference for scientific researchers and other professionals who use sampling." —Zentralblatt Math Features new developments in the field combined with all aspects of obtaining, interpreting, and using sample data Sampling provides an up-to-date treat

  4. Sampling Lesbian, Gay, and Bisexual Populations

    Science.gov (United States)

    Meyer, Ilan H.; Wilson, Patrick A.

    2009-01-01

    Sampling has been the single most influential component of conducting research with lesbian, gay, and bisexual (LGB) populations. Poor sampling designs can result in biased results that will mislead other researchers, policymakers, and practitioners. Investigators wishing to study LGB populations must therefore devote significant energy and…

  5. Sampling and examination methods used for TMI-2 samples

    International Nuclear Information System (INIS)

    Marley, A.W.; Akers, D.W.; McIsaac, C.V.

    1988-01-01

    The purpose of this paper is to summarize the sampling and examination techniques that were used in the collection and analysis of TMI-2 samples. Samples ranging from auxiliary building air to core debris were collected and analyzed. Handling of the larger samples and many of the smaller samples had to be done remotely and many standard laboratory analytical techniques were modified to accommodate the extremely high radiation fields associated with these samples. The TMI-2 samples presented unique problems with sampling and the laboratory analysis of prior molten fuel debris. 14 refs., 8 figs

  6. Data Stewardship in the Ocean Sciences Needs to Include Physical Samples

    Science.gov (United States)

    Carter, M.; Lehnert, K.

    2016-02-01

    Across the Ocean Sciences, research involves the collection and study of samples collected above, at, and below the seafloor, including but not limited to rocks, sediments, fluids, gases, and living organisms. Many domains in the Earth Sciences have recently expressed the need for better discovery, access, and sharing of scientific samples and collections (EarthCube End-User Domain workshops, 2012 and 2013, http://earthcube.org/info/about/end-user-workshops), as has the US government (OSTP Memo, March 2014). iSamples (Internet of Samples in the Earth Sciences) is a Research Coordination Network within the EarthCube program that aims to advance the use of innovative cyberinfrastructure to support and advance the utility of physical samples and sample collections for science and ensure reproducibility of sample-based data and research results. iSamples strives to build, grow, and foster a new community of practice, in which domain scientists, curators of sample repositories and collections, computer and information scientists, software developers and technology innovators engage in and collaborate on defining, articulating, and addressing the needs and challenges of physical samples as a critical component of digital data infrastructure. A primary goal of iSamples is to deliver a community-endorsed set of best practices and standards for the registration, description, identification, and citation of physical specimens and define an actionable plan for implementation. iSamples conducted a broad community survey about sample sharing and has created 5 different working groups to address the different challenges of developing the internet of samples - from metadata schemas and unique identifiers to an architecture for a shared cyberinfrastructure to manage collections, to digitization of existing collections, to education, and ultimately to establishing the physical infrastructure that will ensure preservation and access of the physical samples. 
Repositories that curate

  7. Effects of (α,n) contaminants and sample multiplication on statistical neutron correlation measurements

    International Nuclear Information System (INIS)

    Dowdy, E.J.; Hansen, G.E.; Robba, A.A.; Pratt, J.C.

    1980-01-01

    The complete formalism for the use of statistical neutron fluctuation measurements for the nondestructive assay of fissionable materials has been developed. This formalism includes the effect of detector deadtime, neutron multiplicity, random neutron pulse contributions from (α,n) contaminants in the sample, and the sample multiplication of both fission-related and background neutrons

  8. Preferential sampling in veterinary parasitological surveillance

    Directory of Open Access Journals (Sweden)

    Lorenzo Cecconi

    2016-04-01

    Full Text Available In parasitological surveillance of livestock, prevalence surveys are conducted on a sample of farms using several sampling designs. For example, opportunistic surveys or informative sampling designs are very common. Preferential sampling refers to any situation in which the spatial process and the sampling locations are not independent. Most examples of preferential sampling in the spatial statistics literature are in environmental statistics with focus on pollutant monitors, and it has been shown that, if preferential sampling is present and is not accounted for in the statistical modelling and data analysis, statistical inference can be misleading. In this paper, working in the context of veterinary parasitology, we propose and use geostatistical models to predict the continuous and spatially-varying risk of a parasite infection. Specifically, breaking with the common practice in veterinary parasitological surveillance to ignore preferential sampling even though informative or opportunistic samples are very common, we specify a two-stage hierarchical Bayesian model that adjusts for preferential sampling and we apply it to data on Fasciola hepatica infection in sheep farms in Campania region (Southern Italy) in the years 2013-2014.

  9. Experimental evaluation of the detection threshold of uranium in urine samples

    International Nuclear Information System (INIS)

    Ferreyra, M. D.; Suarez Mendez, Sebastian; Tossi, Mirta H.

    1999-01-01

    The routine internal dosimetric tests for nuclear installation workers include the determination of uranium in urine. The analysis is carried out, after chemical treatment, by UV fluorometry, comparing the results with urine blank samples from workers not occupationally exposed to contamination. The fluctuation of the results of the uranium content in the blank samples greatly affects the determinations. The uranium content was determined in 30 blank samples and the results were evaluated by three calculation methods: 1) the procedure recommended by IUPAC; 2) the graphical method; and 3) the error propagation method. The last one has been adopted for the calculation of the detection threshold. (authors)
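    A common way to turn blank-sample fluctuations into a detection threshold is the mean-plus-k-sigma rule (k = 3 is the commonly quoted IUPAC choice). The sketch below uses hypothetical blank values and stands in for the general idea only; it is not the authors' error propagation procedure.

```python
import statistics

def detection_threshold(blank_values, k=3.0):
    """Decision threshold from replicate blank measurements: blank mean
    plus k times the blank sample standard deviation."""
    return statistics.mean(blank_values) + k * statistics.stdev(blank_values)

# Hypothetical uranium-in-urine blank results (arbitrary units), standing in
# for the 30 blanks analyzed in the study:
blanks = [0.21, 0.18, 0.25, 0.22, 0.19, 0.24, 0.20, 0.23]
threshold = detection_threshold(blanks)
```

    Any measurement below this threshold is indistinguishable from blank fluctuation at the chosen k; larger blank scatter directly raises the detection threshold, which is why the blank variability dominates these determinations.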

  10. On incomplete sampling under birth-death models and connections to the sampling-based coalescent.

    Science.gov (United States)

    Stadler, Tanja

    2009-11-07

    The constant rate birth-death process is used as a stochastic model for many biological systems, for example phylogenies or disease transmission. As the biological data are usually not fully available, it is crucial to understand the effect of incomplete sampling. In this paper, we analyze the constant rate birth-death process with incomplete sampling. We derive the density of the bifurcation events for trees on n leaves which evolved under this birth-death-sampling process. This density is used for calculating prior distributions in Bayesian inference programs and for efficiently simulating trees. We show that the birth-death-sampling process can be interpreted as a birth-death process with reduced rates and complete sampling. This shows that joint inference of birth rate, death rate and sampling probability is not possible. The birth-death-sampling process is compared to the sampling-based population genetics model, the coalescent. It is shown that despite many similarities between these two models, the distribution of bifurcation times remains different even in the case of very large population sizes. We illustrate these findings on a Hepatitis C virus dataset from Egypt. We show that the transmission time estimates are significantly different: the widely used Gamma statistic even changes its sign from negative to positive when switching from the coalescent to the birth-death process.
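    The birth-death-sampling process analyzed in this paper can be illustrated by simulating population sizes with the Gillespie algorithm and then binomially sampling the survivors. The rates, sampling probability, and time horizon below are arbitrary illustrative choices; this sketch demonstrates only the incomplete-sampling step, not the paper's tree densities or the reduced-rate equivalence.

```python
import random

def birth_death_with_sampling(birth, death, rho, t_max, rng):
    """Gillespie simulation of a constant-rate birth-death process started
    from one lineage, followed by binomial sampling of the survivors with
    probability rho (incomplete sampling at the present)."""
    n, t = 1, 0.0
    while n > 0:
        t += rng.expovariate(n * (birth + death))  # time to next event
        if t >= t_max:
            break
        n += 1 if rng.random() < birth / (birth + death) else -1
    sampled = sum(1 for _ in range(n) if rng.random() < rho)
    return n, sampled

rng = random.Random(1)
runs = [birth_death_with_sampling(1.0, 0.5, 0.5, 5.0, rng) for _ in range(2000)]
mean_n = sum(n for n, _ in runs) / len(runs)
mean_s = sum(s for _, s in runs) / len(runs)
```

    Averaged over many runs, the sampled count is close to rho times the true population size, but any single dataset cannot reveal rho separately from the rates, in line with the non-identifiability result above.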

  11. Drone inflight mixing of biochemical samples.

    Science.gov (United States)

    Katariya, Mayur; Chung, Dwayne Chung Kim; Minife, Tristan; Gupta, Harshit; Zahidi, Alifa Afiah Ahmad; Liew, Oi Wah; Ng, Tuck Wah

    2018-03-15

    Autonomous systems for sample transport to the laboratory for analysis can be improved in terms of timeliness, cost and error mitigation in the pre-analytical testing phase. Drones have been reported for outdoor sample transport but incorporating devices on them to attain homogenous mixing of reagents during flight to enhance sample processing timeliness is limited by payload issues. It is shown here that flipping maneuvers conducted with quadcopters are able to facilitate complete and gentle mixing. This capability incorporated during automated sample transport serves to address an important factor contributing to pre-analytical variability which ultimately impacts on test result reliability. Copyright © 2018 Elsevier Inc. All rights reserved.

  12. Co-Occurrence of Conduct Disorder and Depression in a Clinic-Based Sample of Boys with ADHD

    Science.gov (United States)

    Drabick, Deborah A. G.; Gadow, Kenneth D.; Sprafkin, Joyce

    2006-01-01

    Background: Children with attention-deficit/hyperactivity disorder (ADHD) are at risk for the development of comorbid conduct disorder (CD) and depression. The current study examined potential psychosocial risk factors for CD and depression in a clinic-based sample of 203 boys (aged 6-10 years) with ADHD. Methods: The boys and their mothers…

  13. Water born pollutants sampling using porous suction samplers

    International Nuclear Information System (INIS)

    Baig, M.A.

    1997-01-01

    The common standard method of sampling water-borne pollutants in the vadose zone is core sampling followed by extraction of the pore fluid. This method does not allow repeated sampling at the same location at a later time. An alternative approach for sampling fluids (water-borne pollutants) from both the saturated and unsaturated regions of the vadose zone uses porous suction samplers. There are three types of porous suction samplers: vacuum-operated samplers, pressure-vacuum lysimeters, and high pressure-vacuum samplers. The suction samplers are operated in the range of 0-70 centibars and usually consist of ceramic or polytetrafluoroethylene (PTFE) cups. The operating range of PTFE cups is higher than that of ceramic cups. These samplers are well suited for in situ and repeated sampling from the same location. This paper discusses the physical properties and operating conditions of such samplers to be utilized in environmental sampling. (author)

  14. 40 CFR 86.607-84 - Sample selection.

    Science.gov (United States)

    2010-07-01

    ... Auditing of New Light-Duty Vehicles, Light-Duty Trucks, and Heavy-Duty Vehicles § 86.607-84 Sample..., once a manufacturer ships any vehicle from the test sample, it relinquishes the prerogative to conduct...

  15. Conductance fluctuations in high mobility monolayer graphene: Nonergodicity, lack of determinism and chaotic behavior.

    Science.gov (United States)

    da Cunha, C R; Mineharu, M; Matsunaga, M; Matsumoto, N; Chuang, C; Ochiai, Y; Kim, G-H; Watanabe, K; Taniguchi, T; Ferry, D K; Aoki, N

    2016-09-09

    We have fabricated a high mobility device, composed of a monolayer graphene flake sandwiched between two sheets of hexagonal boron nitride. Conductance fluctuations as functions of a back gate voltage and magnetic field were obtained to check for ergodicity. Non-linear dynamics concepts were used to study the nature of these fluctuations. The distribution of eigenvalues was estimated from the conductance fluctuations with Gaussian kernels and it indicates that the carrier motion is chaotic at low temperatures. We argue that a two-phase dynamical fluid model best describes the transport in this system and can be used to explain the violation of the so-called ergodic hypothesis found in graphene.

  16. Sampling and sample processing in pesticide residue analysis.

    Science.gov (United States)

    Lehotay, Steven J; Cook, Jo Marie

    2015-05-13

    Proper sampling and sample processing in pesticide residue analysis of food and soil have always been essential to obtain accurate results, but the subject is becoming a greater concern as approximately 100 mg test portions are being analyzed with automated high-throughput analytical methods by agrochemical industry and contract laboratories. As global food trade and the importance of monitoring increase, the food industry and regulatory laboratories are also considering miniaturized high-throughput methods. In conjunction with a summary of the symposium "Residues in Food and Feed - Going from Macro to Micro: The Future of Sample Processing in Residue Analytical Methods" held at the 13th IUPAC International Congress of Pesticide Chemistry, this is an opportune time to review sampling theory and sample processing for pesticide residue analysis. If collected samples and test portions do not adequately represent the actual lot from which they came and provide meaningful results, then all costs, time, and efforts involved in implementing programs using sophisticated analytical instruments and techniques are wasted and can actually yield misleading results. This paper is designed to briefly review the often-neglected but crucial topic of sample collection and processing and put the issue into perspective for the future of pesticide residue analysis. It also emphasizes that analysts should demonstrate the validity of their sample processing approaches for the analytes/matrices of interest and encourages further studies on sampling and sample mass reduction to produce a test portion.

  17. Non-equilibrium umbrella sampling applied to force spectroscopy of soft matter.

    Science.gov (United States)

    Gao, Y X; Wang, G M; Williams, D R M; Williams, Stephen R; Evans, Denis J; Sevick, E M

    2012-02-07

    Physical systems often respond on a timescale which is longer than that of the measurement. This is particularly true in soft matter where direct experimental measurement, for example in force spectroscopy, drives the soft system out of equilibrium and provides a non-equilibrium measure. Here we demonstrate experimentally for the first time that equilibrium physical quantities (such as the mean square displacement) can be obtained from non-equilibrium measurements via umbrella sampling. Our model experimental system is a bead fluctuating in a time-varying optical trap. We also show this for simulated force spectroscopy on a complex soft molecule--a piston-rotaxane.

  18. Pressure-assisted introduction of urine samples into a short capillary for electrophoretic separation with contactless conductivity and UV spectrometry detection.

    Science.gov (United States)

    Makrlíková, Anna; Opekar, František; Tůma, Petr

    2015-08-01

    A computer-controlled hydrodynamic sample introduction method has been proposed for short-capillary electrophoresis. In the method, the BGE flushes sample from the loop of a six-way sampling valve and is carried to the injection end of the capillary. A short pressure impulse is generated in the electrolyte stream at the time when the sample zone is at the capillary, leading to injection of the sample into the capillary. Then the electrolyte flow is stopped and the separation voltage is turned on. This way of sample introduction does not involve movement of the capillary and both of its ends remain constantly in the solution during both sample injection and separation. The amount of sample introduced to the capillary is controlled by the duration of the pressure pulse. The new sample introduction method was tested in the determination of ammonia, creatinine, uric acid, and hippuric acid in human urine. The determination was performed in a capillary with an overall length of 10.5 cm, in two BGEs with compositions 50 mM MES + 5 mM NaOH (pH 5.1) and 1 M acetic acid + 1.5 mM crown ether 18-crown-6 (pH 2.4). A dual contactless conductivity/UV spectrometric detector was used for the detection. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  19. Acceptance sampling using judgmental and randomly selected samples

    Energy Technology Data Exchange (ETDEWEB)

    Sego, Landon H.; Shulman, Stanley A.; Anderson, Kevin K.; Wilson, John E.; Pulsipher, Brent A.; Sieber, W. Karl

    2010-09-01

    We present a Bayesian model for acceptance sampling where the population consists of two groups, each with different levels of risk of containing unacceptable items. Expert opinion, or judgment, may be required to distinguish between the high and low-risk groups. Hence, high-risk items are likely to be identified (and sampled) using expert judgment, while the remaining low-risk items are sampled randomly. We focus on the situation where all observed samples must be acceptable. Consequently, the objective of the statistical inference is to quantify the probability that a large percentage of the unsampled items in the population are also acceptable. We demonstrate that traditional (frequentist) acceptance sampling and simpler Bayesian formulations of the problem are essentially special cases of the proposed model. We explore the properties of the model in detail, and discuss the conditions necessary to ensure that required sample sizes are a non-decreasing function of the population size. The method is applicable to a variety of acceptance sampling problems, and, in particular, to environmental sampling where the objective is to demonstrate the safety of reoccupying a remediated facility that has been contaminated with a lethal agent.
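    The "simpler Bayesian formulations" that the authors describe as special cases of their model can be sketched for a single group with a uniform prior: if all n randomly sampled items are acceptable, the posterior on the acceptable fraction p is Beta(n + 1, 1). The helper names and the 99%/95% targets below are illustrative assumptions, not the paper's two-group model.

```python
import math

def prob_fraction_exceeds(n_all_acceptable, q):
    """With a uniform Beta(1,1) prior on the acceptable fraction p, and n
    observed items all acceptable, the posterior is Beta(n + 1, 1), whose
    CDF is q**(n + 1); hence P(p > q | data) = 1 - q**(n + 1)."""
    return 1.0 - q ** (n_all_acceptable + 1)

def samples_needed(q, confidence):
    """Smallest n of all-acceptable observations such that
    P(p > q | data) >= confidence under the uniform prior."""
    return max(0, math.ceil(math.log(1.0 - confidence) / math.log(q) - 1))

# Illustration: how many all-acceptable samples are needed to be 95% sure
# that at least 99% of the population is acceptable?
n = samples_needed(0.99, 0.95)
```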

  20. Comparison between thaw-mounting and use of conductive tape for sample preparation in ToF-SIMS imaging of lipids in Drosophila microRNA-14 model.

    Science.gov (United States)

    Le, Minh Uyen Thi; Son, Jin Gyeong; Shon, Hyun Kyoung; Park, Jeong Hyang; Lee, Sung Bae; Lee, Tae Geol

    2018-03-30

    Time-of-flight secondary ion mass spectrometry (ToF-SIMS) imaging elucidates molecular distributions in tissue sections, providing useful information about the metabolic pathways linked to diseases. However, delocalization of the analytes and inadequate tissue adherence during sample preparation are among some of the unfortunate phenomena associated with this technique due to their role in the reduction of the quality, reliability, and spatial resolution of the ToF-SIMS images. For these reasons, ToF-SIMS imaging requires a more rigorous sample preparation method in order to preserve the natural state of the tissues. The traditional thaw-mounting method is particularly vulnerable to altered distributions of the analytes due to thermal effects, as well as to tissue shrinkage. In the present study, the authors made comparisons of different tissue mounting methods, including the thaw-mounting method. The authors used conductive tape as the tissue-mounting material on the substrate because it does not require heat from the finger for the tissue section to adhere to the substrate and can reduce charge accumulation during data acquisition. With the conductive-tape sampling method, they were able to acquire reproducible tissue sections and high-quality images without redistribution of the molecules. Also, the authors were successful in preserving the natural states and chemical distributions of the different components of fat metabolites such as diacylglycerol and fatty acids by using the tape-supported sampling in microRNA-14 (miR-14) deleted Drosophila models. The method highlighted here shows an improvement in the accuracy of mass spectrometric imaging of tissue samples.

  1. A Bayesian Justification for Random Sampling in Sample Survey

    Directory of Open Access Journals (Sweden)

    Glen Meeden

    2012-07-01

    Full Text Available In the usual Bayesian approach to survey sampling, the sampling design plays a minimal role, at best. Although a close relationship between exchangeable prior distributions and simple random sampling has been noted, how to formally integrate simple random sampling into the Bayesian paradigm is not clear. Recently it has been argued that the sampling design can be thought of as part of a Bayesian's prior distribution. We will show here that under this scenario simple random sampling can be given a Bayesian justification in survey sampling.

  2. Somatic Cells in Bulk Samples and Purchase Prices of Cow Milk

    Directory of Open Access Journals (Sweden)

    Jindřich Kvapilík

    2017-01-01

    Full Text Available The somatic cell count (SCC) of 209 (36 to 468) x 10^3 ml^-1, the total count of microorganisms (TCM) of 25 x 10^3 ml^-1 (from 5 to 377), fat content of 3.84 % (from 3.23 to 4.46), protein content of 3.39 % (from 3.04 to 3.75) and milk freezing point (MFP) of -0.525 °C (from -0.534 to -0.395) were calculated for the 522 monthly bulk milk samples from 11 experimental stables during the period from 2012 to 2015. Residues of inhibitory substances were not detected in any sample. Milk sales reached 7,999 liters (l) per cow, fluctuating between 6,150 and 10,532 l. From the regression coefficients it can be deduced that an increase in the SCC by 100 x 10^3 ml^-1 raised the TCM by 2.9 to 4.2 x 10^3 ml^-1 and decreased the fat content by 0.09 to 0.13 % and the protein content by 0.01 to 0.05 %. The influence of SCC, TCM and the fat and protein content calculated from monthly samples for individual stables can be estimated at -0.12 CZC, with fluctuations between the stables from +0.46 to -0.84 CZC per l of milk. The increase in milk price by 0.17 CZC in the range of -0.92 to +0.92 CZC per l of milk corresponds to averages of the indicators calculated from 522 samples.

  3. Small polaron hopping conduction in samples of ceramic La1.4Sr1.6Mn2O7.06

    International Nuclear Information System (INIS)

    Nakatsugawa, H.; Iguchi, E.; Jung, W.H.; Munakata, F.

    1999-01-01

    The ceramic sample of La1.4Sr1.6Mn2O7.06 exhibits the metal-insulator transition and a negative magnetoresistance in the vicinity of the Curie temperature (T_C ∼ 100 K). The dc magnetic susceptibility between 100 K and 280 K is nearly constant and decreases gradually with increasing temperature above 280 K. The measurements of dc resistivity and the thermoelectric power indicate that small polaron hopping conduction takes place at T > 280 K. The spin ordering due to the two-dimensional d(x^2-y^2) state occurring at T > 280 K is directly related to the hopping conduction above 280 K, although the spin ordering due to the one-dimensional d(3z^2-r^2) state takes place at T > T_C. The two-dimensional d(x^2-y^2) state extending within the MnO2 sheets starts to narrow and leads to the carrier localisation at 280 K. The effective number of holes in this sample estimated from the thermoelectric power is considerably smaller than the nominal value. This indicates that the small polaron hopping conduction takes place predominantly within the in-plane MnO2 sheets. A discussion is given of the experimental results of the ceramic sample of La2/3Ca1/3MnO2.98. Copyright (1999) CSIRO Australia
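    Small polaron hopping is conventionally identified by fitting resistivity to the adiabatic form rho(T) = A * T * exp(Ea / (kB * T)), i.e. a straight line of ln(rho/T) against 1/(kB * T). The sketch below fits synthetic data; the activation energy and prefactor are assumed values for illustration, not the measured parameters of La1.4Sr1.6Mn2O7.06.

```python
import math

K_B = 8.617333262e-5  # Boltzmann constant in eV/K

def fit_activation_energy(temps, rhos):
    """Least-squares slope of ln(rho/T) against 1/(kB*T): the activation
    energy Ea in the adiabatic small-polaron hopping form
    rho(T) = A * T * exp(Ea / (kB * T))."""
    xs = [1.0 / (K_B * t) for t in temps]
    ys = [math.log(r / t) for t, r in zip(temps, rhos)]
    mx = sum(xs) / len(xs)
    my = sum(ys) / len(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den  # Ea in eV

# Synthetic resistivities generated with assumed Ea = 0.20 eV and A = 1e-6:
temps = [300.0, 320.0, 340.0, 360.0, 380.0, 400.0]
rhos = [1e-6 * t * math.exp(0.20 / (K_B * t)) for t in temps]
ea = fit_activation_energy(temps, rhos)
```

    A linear ln(rho/T) vs. 1/T plot above 280 K (rather than linear ln(rho) vs. 1/T) is the usual experimental signature distinguishing small-polaron hopping from simple activated conduction.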

  4. Trench sampling report Salmon Site Lamar County, Mississippi

    Energy Technology Data Exchange (ETDEWEB)

    1994-07-01

    This report describes trench excavation and sample-collection activities conducted by IT Corporation (IT) as part of the ongoing Remedial Investigation and Feasibility Study at the Salmon Site, Lamar County, Mississippi (DOE, 1992). During construction, operation, and closure of the site, wastes of unknown composition were buried in pits on site. Surface-geophysical field investigations were conducted intermittently between November 1992 and October 1993 to identify potential waste-burial sites and buried metallic materials. The geophysical investigations included vertical magnetic gradient, electromagnetic conductivity, electromagnetic in-phase component, and ground-penetrating radar surveys. A number of anomalies identified by the magnetic gradiometer survey in the Reynolds Electrical & Engineering Co., Inc., (REECo) pits area indicated buried metallic objects. All of the anomalies were field checked to determine if any were caused by surface features or debris. After field checking, 17 anomalies were still unexplained; trenching was planned to attempt to identify their sources. Between December 8, 1993, and December 17, 1993, 15 trenches were excavated and soil samples were collected at the anomalies. Samples were collected, placed in 250- and 500-milliliter (ml) amber glass containers, and shipped on ice to IT Analytical Services (ITAS) in St. Louis, Missouri, using standard IT chain-of-custody procedures. The samples were analyzed for various chemical and radiological parameters. Data validation has not been conducted on any of the samples. During excavation and sampling, soil samples were also collected by IT for the MSDEQ and the Mississippi Department of Radiological Health, in accordance with their instructions, and delivered into their custody.

  6. Learning to reason from samples

    NARCIS (Netherlands)

    Ben-Zvi, Dani; Bakker, Arthur; Makar, Katie

    2015-01-01

    The goal of this article is to introduce the topic of learning to reason from samples, which is the focus of this special issue of Educational Studies in Mathematics on statistical reasoning. Samples are data sets, taken from some wider universe (e.g., a population or a process) using a particular

  7. Fluctuations of Attentional Networks and Default Mode Network during the Resting State Reflect Variations in Cognitive States: Evidence from a Novel Resting-state Experience Sampling Method.

    Science.gov (United States)

    Van Calster, Laurens; D'Argembeau, Arnaud; Salmon, Eric; Peters, Frédéric; Majerus, Steve

    2017-01-01

    Neuroimaging studies have revealed the recruitment of a range of neural networks during the resting state, which might reflect a variety of cognitive experiences and processes occurring in an individual's mind. In this study, we focused on the default mode network (DMN) and attentional networks and investigated their association with distinct mental states when participants are not performing an explicit task. To investigate the range of possible cognitive experiences more directly, this study proposes a novel method of resting-state fMRI experience sampling, informed by a phenomenological investigation of the fluctuation of mental states during the resting state. We hypothesized that DMN activity would increase as a function of internal mentation and that the activity of dorsal and ventral networks would indicate states of top-down versus bottom-up attention at rest. Results showed that dorsal attention network activity fluctuated as a function of subjective reports of attentional control, providing evidence that activity of this network reflects the perceived recruitment of controlled attentional processes during spontaneous cognition. Activity of the DMN increased when participants reported to be in a subjective state of internal mentation, but not when they reported to be in a state of perception. This study provides direct evidence for a link between fluctuations of resting-state neural activity and fluctuations in specific cognitive processes.

  8. On Using a Pilot Sample Variance for Sample Size Determination in the Detection of Differences between Two Means: Power Consideration

    Science.gov (United States)

    Shieh, Gwowen

    2013-01-01

    The a priori determination of a proper sample size necessary to achieve some specified power is an important problem encountered frequently in practical studies. To establish the needed sample size for a two-sample "t" test, researchers may conduct the power analysis by specifying scientifically important values as the underlying population means…

  9. Sampling and measurement of long-lived radionuclides in environmental samples

    International Nuclear Information System (INIS)

    Brauer, F.P.; Goles, R.W.; Kaye, J.H.; Rieck, H.G. Jr.

    1977-01-01

    The volatile and semivolatile long-lived man-made radionuclides 3H, 14C, 79Se, 85Kr, 99Tc, 129I, 135Cs, and 137Cs are of concern in the operation of nuclear facilities because they are difficult and expensive to contain, and once emitted to the environment they become permanent ecological constituents with both local and global distributions. Species-selective sampling and analytical methods (radiochemical, neutron activation, and mass spectrometric) have been developed for many of these nuclides with sensitivities well below those required for radiation protection. These sampling and analytical methods have been applied to the measurement of current environmental levels of some of the more ecologically important radionuclides. The detection and tracing of long-lived radionuclides is being conducted in order to establish baseline values and to study environmental behavior. This paper describes detection and measurement techniques and summarizes current measurement results.

  10. Graph sampling

    OpenAIRE

    Zhang, L.-C.; Patone, M.

    2017-01-01

    We synthesise the existing theory of graph sampling. We propose a formal definition of sampling in finite graphs, and provide a classification of potential graph parameters. We develop a general approach of Horvitz–Thompson estimation to T-stage snowball sampling, and present various reformulations of some common network sampling methods in the literature in terms of the outlined graph sampling theory.
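The Horvitz–Thompson approach mentioned above weights each sampled unit by the inverse of its inclusion probability. A minimal sketch of the estimator for a population total, assuming known inclusion probabilities (the values below are illustrative, not from the paper):

```python
# Horvitz-Thompson estimator for a population total:
# T_hat = sum over sampled units of y_i / pi_i, where pi_i is the
# unit's inclusion probability under the sampling design.
def horvitz_thompson_total(sample_values, inclusion_probs):
    """Unbiased estimate of the population total from an
    unequal-probability sample with known inclusion probabilities."""
    return sum(y / pi for y, pi in zip(sample_values, inclusion_probs))

# Example: three sampled units with values 10, 20, 30 drawn with
# inclusion probabilities 0.5, 0.25, 0.1.
estimate = horvitz_thompson_total([10, 20, 30], [0.5, 0.25, 0.1])
print(estimate)  # 10/0.5 + 20/0.25 + 30/0.1 = 400.0
```

In graph sampling, the difficulty the paper addresses is computing those inclusion probabilities for designs such as T-stage snowball sampling; the estimator itself keeps this simple inverse-probability form.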

  11. 40 CFR 1065.245 - Sample flow meter for batch sampling.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 32 2010-07-01 2010-07-01 false Sample flow meter for batch sampling... Sample flow meter for batch sampling. (a) Application. Use a sample flow meter to determine sample flow... difference between a diluted exhaust sample flow meter and a dilution air meter to calculate raw exhaust flow...

  12. Lead-position dependent regular oscillations and random fluctuations of conductance in graphene quantum dots

    International Nuclear Information System (INIS)

    Huang Liang; Yang Rui; Lai Yingcheng; Ferry, David K

    2013-01-01

    Quantum interference causes a wavefunction to have sensitive spatial dependence, and this has a significant effect on quantum transport. For example, in a quantum-dot system, the conductance can depend on the lead positions. We investigate, for graphene quantum dots, the conductance variations with the lead positions. Since for graphene the types of boundaries, e.g., zigzag and armchair, can fundamentally affect the quantum transport characteristics, we focus on rectangular graphene quantum dots, for which the effects of boundaries can be studied systematically. For both zigzag and armchair horizontal boundaries, we find that changing the positions of the leads can induce significant conductance variations. Depending on the Fermi energy, the variations can be either regular oscillations or random conductance fluctuations. We develop a physical theory to elucidate the origin of the conductance oscillation/fluctuation patterns. In particular, quantum interference leads to standing-wave-like patterns in the quantum dot which, in the absence of leads, are regulated by the energy-band structure of the corresponding vertical graphene ribbon. The observed 'coexistence' of regular oscillations and random fluctuations in the conductance can be exploited for the development of graphene-based nanodevices. (paper)

  13. Analytical solutions to sampling effects in drop size distribution measurements during stationary rainfall: Estimation of bulk rainfall variables

    NARCIS (Netherlands)

    Uijlenhoet, R.; Porrà, J.M.; Sempere Torres, D.; Creutin, J.D.

    2006-01-01

    A stochastic model of the microstructure of rainfall is used to derive explicit expressions for the magnitude of the sampling fluctuations in rainfall properties estimated from raindrop size measurements in stationary rainfall. The model is a marked point process, in which the points represent the

  14. Liquid waste sampling device

    International Nuclear Information System (INIS)

    Kosuge, Tadashi

    1998-01-01

    A liquid pumping pressure regulator is disposed midway along a pressure control tube which connects the upper portion of a sampling pot and the upper portion of a liquid waste storage vessel. With this configuration, when the pressure in the sampling pot is made negative and liquid wastes are drawn into the liquid pumping tube passing through the sampling pot, the difference between the pressure at the inlet of the liquid pumping pressure regulator on the pressure regulating tube and the pressure at the bottom of the liquid waste storage vessel is kept constant. An opening-degree controller is provided to control the opening of a pressure regulating valve that sends actuating pressurized air to the liquid pumping pressure regulator. Accordingly, even if the level of the liquid wastes in the liquid waste storage vessel changes, the suction height of the liquid wastes in the liquid pumping tube can be kept constant. In this way, sampling can be conducted correctly, and discharge of the liquid wastes to the outside can be prevented. (T.M.)

  15. Evaluation of pump pulsation in respirable size-selective sampling: part II. Changes in sampling efficiency.

    Science.gov (United States)

    Lee, Eun Gyung; Lee, Taekhee; Kim, Seung Won; Lee, Larry; Flemmer, Michael M; Harper, Martin

    2014-01-01

    This second, and concluding, part of this study evaluated changes in sampling efficiency of respirable size-selective samplers due to air pulsations generated by the selected personal sampling pumps characterized in Part I (Lee E, Lee L, Möhlmann C et al. Evaluation of pump pulsation in respirable size-selective sampling: Part I. Pulsation measurements. Ann Occup Hyg 2013). Nine particle sizes of monodisperse ammonium fluorescein (from 1 to 9 μm mass median aerodynamic diameter) were generated individually by a vibrating orifice aerosol generator from dilute solutions of fluorescein in aqueous ammonia and then injected into an environmental chamber. To collect these particles, 10-mm nylon cyclones, also known as Dorr-Oliver (DO) cyclones, were used with five medium volumetric flow rate pumps. Those were the Apex IS, HFS513, GilAir5, Elite5, and Basic5 pumps, which were found in Part I to generate pulsations of 5% (the lowest), 25%, 30%, 56%, and 70% (the highest), respectively. GK2.69 cyclones were used with the Legacy [pump pulsation (PP) = 15%] and Elite12 (PP = 41%) pumps for collection at high flows. The DO cyclone was also used to evaluate changes in sampling efficiency due to pulse shape. The HFS513 pump, which generates a more complex pulse shape, was compared to a single sine wave fluctuation generated by a piston. The luminescent intensity of the fluorescein extracted from each sample was measured with a luminescence spectrometer. Sampling efficiencies were obtained by dividing the intensity of the fluorescein extracted from the filter placed in a cyclone with the intensity obtained from the filter used with a sharp-edged reference sampler. Then, sampling efficiency curves were generated using a sigmoid function with three parameters and each sampling efficiency curve was compared to that of the reference cyclone by constructing bias maps. In general, no change in sampling efficiency (bias under ±10%) was observed until pulsations exceeded 25% for the
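The three-parameter sigmoid fit used above to build sampling efficiency curves can be sketched with SciPy. The abstract does not give the exact functional form or data, so the logistic-type curve and the synthetic efficiencies below are assumptions:

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical three-parameter sigmoid for sampling efficiency versus
# aerodynamic diameter d: E(d) = a / (1 + (d / d50)**k), where d50 is
# the 50%-efficiency cut-point. This form is an assumption; the paper
# only states that a three-parameter sigmoid was used.
def sigmoid(d, a, d50, k):
    return a / (1.0 + (d / d50) ** k)

# Synthetic efficiency data for particle sizes 1-9 um (illustrative).
d = np.arange(1.0, 10.0)
eff = sigmoid(d, 1.0, 4.0, 3.0) + np.random.default_rng(0).normal(0, 0.01, d.size)

params, _ = curve_fit(sigmoid, d, eff, p0=[1.0, 4.0, 3.0])
a_fit, d50_fit, k_fit = params
print(f"a={a_fit:.2f}, d50={d50_fit:.2f} um, k={k_fit:.2f}")
```

Bias maps as in the paper would then be built by comparing fitted curves for each pump against the reference sampler's curve over particle size.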

  16. Modern survey sampling

    CERN Document Server

    Chaudhuri, Arijit

    2014-01-01

    Exposure to Sampling: Concepts of Population, Sample, and Sampling; Initial Ramifications: Sampling Design, Sampling Scheme; Random Numbers and Their Uses in Simple Random Sampling (SRS); Drawing Simple Random Samples with and without Replacement; Estimation of Mean, Total, Ratio of Totals/Means: Variance and Variance Estimation; Determination of Sample Sizes; Appendix to Chapter 2: More on Equal Probability Sampling, Horvitz-Thompson Estimator, Sufficiency, Likelihood, Non-Existence Theorem; More Intricacies: Unequal Probability Sampling Strategies, PPS Sampling; Exploring Improved Ways: Stratified Sampling, Cluster Sampling, Multi-Stage Sampling, Multi-Phase Sampling: Ratio and Regression Estimation, Controlled Sampling; Modeling: Super-Population Modeling, Prediction Approach, Model-Assisted Approach, Bayesian Methods, Spatial Smoothing; Sampling on Successive Occasions: Panel Rotation; Non-Response and Not-at-Homes; Weighting Adj...

  17. Sample preparation

    International Nuclear Information System (INIS)

    Anon.

    1992-01-01

    Sample preparation prior to HPLC analysis is certainly one of the most important steps to consider in trace or ultratrace analysis. For many years scientists have tried to simplify the sample preparation process. It is rarely possible to inject a neat liquid sample or a sample where preparation may not be any more complex than dissolution of the sample in a given solvent. The last process alone can remove insoluble materials, which is especially helpful with the samples in complex matrices if other interactions do not affect extraction. Here, it is very likely a large number of components will not dissolve and are, therefore, eliminated by a simple filtration process. In most cases, the process of sample preparation is not as simple as dissolution of the component interest. At times, enrichment is necessary, that is, the component of interest is present in very large volume or mass of material. It needs to be concentrated in some manner so a small volume of the concentrated or enriched sample can be injected into HPLC. 88 refs

  18. Ensemble Sampling

    OpenAIRE

    Lu, Xiuyuan; Van Roy, Benjamin

    2017-01-01

    Thompson sampling has emerged as an effective heuristic for a broad range of online decision problems. In its basic form, the algorithm requires computing and sampling from a posterior distribution over models, which is tractable only for simple special cases. This paper develops ensemble sampling, which aims to approximate Thompson sampling while maintaining tractability even in the face of complex models such as neural networks. Ensemble sampling dramatically expands on the range of applica...

  19. Contribution to the assessment of sampling data of coke. Beitrag zur Auswertung von Probenahme-ergebnissen von Koks

    Energy Technology Data Exchange (ETDEWEB)

    Goell, G; Helfricht, R; Mueller, H [Forschungsinstitut fuer Aufbereitung, Freiberg (Germany, F.R.)

    1991-03-01

    Controlling the quality characteristics of disperse material systems by means of test procedures, in the plants where these materials are produced and applied, presupposes that samples are taken from mass flows. In this contribution, a report is presented on research into the causes of fluctuations and differences between supplier and customer test results for coke. It is shown that an appraisal must consider the prehistory and material behaviour of the test material as well as an analysis of the actual condition of the sampling and processing systems of both plants. The magnitude of each individual error relative to the total sampling error is explored, and conclusions are drawn to rule out discrepancies as well as unfavourable developments in the production, utilization, and appraisal of coke products. 15 refs., 7 figs.

  20. Communication: Multiple atomistic force fields in a single enhanced sampling simulation

    Energy Technology Data Exchange (ETDEWEB)

    Hoang Viet, Man [Department of Physics, North Carolina State University, Raleigh, North Carolina 27695-8202 (United States); Derreumaux, Philippe, E-mail: philippe.derreumaux@ibpc.fr [Laboratoire de Biochimie Théorique, UPR 9080, CNRS, Université Denis Diderot, Sorbonne Paris Cité IBPC, 13 rue Pierre et Marie Curie, 75005 Paris (France); Institut Universitaire de France, 103 Bvd Saint-Germain, 75005 Paris (France); Nguyen, Phuong H., E-mail: phuong.nguyen@ibpc.fr [Laboratoire de Biochimie Théorique, UPR 9080, CNRS, Université Denis Diderot, Sorbonne Paris Cité IBPC, 13 rue Pierre et Marie Curie, 75005 Paris (France)

    2015-07-14

    The main concerns of biomolecular dynamics simulations are the convergence of the conformational sampling and the dependence of the results on the force fields. While the first issue can be addressed by employing enhanced sampling techniques such as simulated tempering or replica exchange molecular dynamics, repeating these simulations with different force fields is very time consuming. Here, we propose an automatic method that includes different force fields in a single advanced sampling simulation. Conformational sampling using three all-atom force fields is enhanced by simulated tempering; by formulating the weight parameters of the simulated tempering method in terms of the energy fluctuations, the system is able to perform a random walk in both temperature and force-field spaces. The method is first demonstrated on a 1D system and then validated by the folding of the 10-residue chignolin peptide in explicit water.

  1. Diffusion probe for gas sampling in undisturbed soil

    DEFF Research Database (Denmark)

    Petersen, Søren O

    2014-01-01

    Soil-atmosphere fluxes of trace gases such as methane (CH4) and nitrous oxide (N2O) are determined by complex interactions between biological activity and soil conditions. Soil gas concentration profiles may, in combination with other information about soil conditions, help to understand emission controls. This note describes a simple and robust diffusion probe for soil gas sampling as part of flux monitoring programs. It can be deployed with minimum disturbance of in-situ conditions, also at sites with a high or fluctuating water table. Separate probes are used for each sampling depth... on peat soils used for grazing showed soil gas concentrations of CH4 and N2O as influenced by topography, site conditions, and season. The applicability of the diffusion probe for trace gas monitoring is discussed.

  2. Two-sample discrimination of Poisson means

    Science.gov (United States)

    Lampton, M.

    1994-01-01

    This paper presents a statistical test for detecting significant differences between two random count accumulations. The null hypothesis is that the two samples share a common random arrival process with a mean count proportional to each sample's exposure. The model represents the partition of N total events into two counts, A and B, as a sequence of N independent Bernoulli trials whose partition fraction, f, is determined by the ratio of the exposures of A and B. The detection of a significant difference is claimed when the background (null) hypothesis is rejected, which occurs when the observed sample falls in a critical region of (A, B) space. The critical region depends on f and the desired significance level, alpha. The model correctly takes into account the fluctuations in both the signals and the background data, including the important case of small numbers of counts in the signal, the background, or both. The significance can be exactly determined from the cumulative binomial distribution, which in turn can be inverted to determine the critical A(B) or B(A) contour. This paper gives efficient implementations of these tests, based on lookup tables. Applications include the detection of clustering of astronomical objects, the detection of faint emission or absorption lines in photon-limited spectroscopy, the detection of faint emitters or absorbers in photon-limited imaging, and dosimetry.
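The partition test described above reduces to an exact binomial test on the split of the N total events. A minimal sketch using SciPy (the counts and exposures are illustrative, and the lookup-table optimizations of the paper are omitted):

```python
from scipy.stats import binomtest

# Under the null hypothesis, the A-counts among N = A + B total events
# follow Binomial(N, f), where the partition fraction f is fixed by the
# ratio of the exposures of A and B.
def poisson_two_sample_test(count_a, count_b, exposure_a, exposure_b):
    f = exposure_a / (exposure_a + exposure_b)
    return binomtest(count_a, n=count_a + count_b, p=f).pvalue

# Example: 15 counts in sample A versus 40 in sample B with equal
# exposures (f = 0.5); a small p-value signals different Poisson means.
p = poisson_two_sample_test(15, 40, 1.0, 1.0)
print(f"two-sided p-value: {p:.4g}")
```

Because the test conditions on the total count N, it correctly handles small numbers of counts in the signal, the background, or both, as the abstract emphasizes.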

  3. Conductance fluctuations and distribution in disordered chains in presence of an electric field

    International Nuclear Information System (INIS)

    Senouci, K.

    1995-07-01

    A simple Kronig-Penney model for 1D mesoscopic systems with disordered δ-peak and finite-width potentials under an electric field is used to study the conductance fluctuations and distributions in different phase states. The electric field allows us to obtain the insulating, transition, and metallic regimes. In the superlocalized electron states found previously near the Brillouin zone edges of the corresponding periodic system, the conductance fluctuations are smaller than those of the insulating regime corresponding to vanishing field, but the conductance probability distribution behaves similarly. Extensive results are compared with previous work on higher-dimensional and quasi-1D mesoscopic systems in each regime and found to be in good agreement. Further discussions are also included. (author). 33 refs, 11 figs
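The conductance of such a chain is usually obtained from a transfer-matrix product. The sketch below is a generic zero-field chain of δ-potentials with unit spacing and ℏ²/2m = 1 (so E = k²), not the paper's exact model, which also includes finite-width potentials and an electric field; the dimensionless Landauer transmission T = 1/|M₂₂|² plays the role of the conductance:

```python
import numpy as np

# Transfer matrix across one delta potential of strength v at wave
# number k, in the plane-wave basis (A e^{ikx}, B e^{-ikx}); the free
# segment between scatterers contributes a diagonal phase matrix.
def transmission(strengths, k, spacing=1.0):
    m_total = np.eye(2, dtype=complex)
    prop = np.diag([np.exp(1j * k * spacing), np.exp(-1j * k * spacing)])
    for v in strengths:
        m_delta = np.array([[1 - 1j * v / (2 * k), -1j * v / (2 * k)],
                            [1j * v / (2 * k),     1 + 1j * v / (2 * k)]])
        m_total = prop @ m_delta @ m_total
    # det M = 1 and |M22|^2 = 1 + |M21|^2, so 0 < T <= 1.
    return 1.0 / abs(m_total[1, 1]) ** 2

rng = np.random.default_rng(1)
chain = rng.uniform(-0.5, 0.5, size=200)   # weak random delta strengths
print(f"T = {transmission(chain, k=1.3):.3e}")
```

A single scatterer reproduces the textbook result T = 1/(1 + v²/4k²); sampling many disorder realizations of `chain` gives the conductance distribution studied in the paper.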

  4. Environmental surveillance master sampling schedule

    Energy Technology Data Exchange (ETDEWEB)

    Bisping, L.E.

    1993-01-01

    Environmental surveillance of the Hanford Site and surrounding areas is conducted by the Pacific Northwest Laboratory (PNL) for the US Department of Energy (DOE). Samples are routinely collected and analyzed to determine the quality of air, surface water, ground water, soil, sediment, wildlife, vegetation, foodstuffs, and farm products at the Hanford Site and surrounding communities. This document contains the planned schedule for routine sample collection for the Surface Environmental Surveillance Project (SESP), the Drinking Water Project, and the Ground-Water Surveillance Project.

  5. Sample summary report for ARG 1 pressure tube sample

    International Nuclear Information System (INIS)

    Belinco, C.

    2006-01-01

    The ARG 1 sample is made from an un-irradiated Zr-2.5% Nb pressure tube. The sample has a 103.4 mm ID, a 112 mm OD, and approximately 500 mm length. A punch mark was made very close to one end of the sample; it indicates the 12 o'clock position and also identifies the face of the tube for making all the measurements. The ARG 1 sample contains flaws on the ID and OD surfaces. There was no intentional flaw within the wall of the pressure tube sample. Once the flaws were machined, the pressure tube sample was covered from the outside to hide the OD flaws. Approximately 50 mm of the pressure tube length was left open at both ends to facilitate holding the sample in fixtures for inspection. No flaw was machined in this 50 mm zone at either end of the pressure tube sample. A total of 20 flaws were machined in the ARG 1 sample: 16 on the OD surface and the remaining 4 on the ID surface of the pressure tube. The flaws were characterized into various groups, such as axial flaws, circumferential flaws, etc.

  6. Sampling wild species to conserve genetic diversity

    Science.gov (United States)

    Sampling seed from natural populations of crop wild relatives requires choice of the locations to sample from and the amount of seed to sample. While this may seem like a simple choice, in fact careful planning of a collector’s sampling strategy is needed to ensure that a crop wild collection will ...

  7. Respondent-Driven Sampling – Testing Assumptions: Sampling with Replacement

    Directory of Open Access Journals (Sweden)

    Barash Vladimir D.

    2016-03-01

    Classical Respondent-Driven Sampling (RDS) estimators are based on a Markov process model in which sampling occurs with replacement. Given that respondents generally cannot be interviewed more than once, this assumption is counterfactual. We join recent work by Gile and Handcock in exploring the implications of the sampling-with-replacement assumption for the bias of RDS estimators. We differ from previous studies in examining a wider range of sampling fractions and in using not only simulations but also formal proofs. One key finding is that RDS estimates are surprisingly stable even in the presence of substantial sampling fractions. Our analyses show that the sampling-with-replacement assumption is a minor contributor to bias for sampling fractions under 40%, and bias is negligible for the 20% or smaller sampling fractions typical of field applications of RDS.
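Why the sampling fraction matters can be illustrated with a toy simulation: sampling with replacement ignores the finite-population correction (N − n)/(N − 1), and the gap between the two designs grows with the fraction sampled. This sketch uses plain random samples of a synthetic population, not the RDS chain-referral process itself:

```python
import numpy as np

rng = np.random.default_rng(7)
population = rng.normal(50, 10, size=1000)
N, n = population.size, 400          # a substantial 40% sampling fraction
reps = 5000

# Monte Carlo variance of the sample mean under each design.
var_with = np.var([rng.choice(population, n, replace=True).mean()
                   for _ in range(reps)])
var_without = np.var([rng.choice(population, n, replace=False).mean()
                      for _ in range(reps)])

# Without replacement the variance shrinks by the finite-population
# correction (N - n)/(N - 1), about 0.6 at this sampling fraction.
print(f"with: {var_with:.4f}  without: {var_without:.4f}  "
      f"ratio: {var_without / var_with:.2f}")
```

Both estimators remain unbiased here; the with-replacement model mainly mis-states uncertainty, which is consistent with the paper's finding that the assumption contributes little bias at moderate sampling fractions.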

  8. Sampling for Beryllium Surface Contamination using Wet, Dry and Alcohol Wipe Sampling

    Energy Technology Data Exchange (ETDEWEB)

    Kerr, Kent [Central Missouri State Univ., Warrensburg, MO (United States)

    2004-12-01

    This research project was conducted at the National Nuclear Security Administration's Kansas City Plant, operated by Honeywell Federal Manufacturing and Technologies, in conjunction with the Safety Sciences Department of Central Missouri State University, to compare the relative removal efficiencies of three wipe sampling techniques currently used at Department of Energy facilities. Efficiencies of removal of beryllium contamination from typical painted surfaces were tested by wipe sampling with dry Whatman 42 filter paper, with water-moistened (Ghost Wipe) materials, and with methanol-moistened wipes. Test plates were prepared using 100 mm × 15 mm Pyrex Petri dishes with interior surfaces spray-painted with a bond coat primer. To achieve uniform deposition over the test plate surface, 10 ml aliquots of solution containing 1 beryllium and 0.1 ml of metal working fluid were transferred to the test plates and subsequently evaporated. Metal working fluid was added to simulate the slight oiliness common on surfaces in metal working shops where fugitive oil mist accumulates over time. Sixteen test plates for each wipe method (dry, water, and methanol) were processed and sampled using a modification of the wiping patterns recommended by OSHA Method 125G. Laboratory and statistical analysis showed that methanol-moistened wipe sampling removed significantly more (about twice as much) beryllium/oil-film surface contamination as water-moistened wipes (p < 0.001), which in turn removed significantly more (about twice as much) residue as dry wipes (p < 0.001). Evidence for beryllium sensitization via skin exposure argues in favor of wipe sampling with wetting agents that provide enhanced residue-removal efficiency.

  9. Quenched disorder and thermopower fluctuations in high temperature superconductors

    International Nuclear Information System (INIS)

    Khalil, A.E.

    1997-01-01

    Thermopower behavior in high temperature superconducting YBa2Cu3O7−δ single crystals near the transition temperature was examined. An expression for the thermoelectric power containing the divergent term (1 − T/T_c)^(−s), where s is a scaling exponent that does not appear in Maki's calculations, was derived. This divergent term is the result of contributions due to the flow of currents across disordered conduction paths in the sample. These currents are driven by the density gradients of the conductivity fluctuations as a result of the increased disorder due to the existence of amorphous regions in the two-dimensional lattice. The present calculations include the most divergent effects to the thermopower due to the conductivity fluctuations near the transition temperature. The model predictions are in good agreement with recent experimental measurements reported in the literature. (orig.)

  10. High priority tank sampling and analysis report

    International Nuclear Information System (INIS)

    Brown, T.M.

    1998-01-01

    In July 1993, the Defense Nuclear Facilities Board issued Recommendation 93-5 (Conway 1993), which noted that there was insufficient tank waste technical information and that the pace of obtaining it was too slow to ensure that Hanford Site wastes could be safely stored, that associated operations could be conducted safely, and that future disposal data requirements could be met. In response, the US Department of Energy, in May 1996, issued Revision 1 of the Recommendation 93-5 Implementation Plan (DOE-RL 1996). The Implementation Plan presented a modified approach to achieve the original plan's objectives, concentrating on actions necessary to ensure that wastes can be safely stored, that operations can be safely conducted, and that timely characterization information for the tank waste Disposal Program could be obtained. The Implementation Plan proposed 28 High Priority tanks for near-term core sampling and analysis, which, along with sampling and analysis of other non-High Priority tanks, could provide the scientific and technical data to confirm assumptions, calibrate models, and measure safety-related phenomenology of the waste. When the analysis results of the High Priority and other-tank sampling were reviewed, it was expected that a series of 12 questions, 9 related to safety issues and 3 related to planning for the disposal process, should be answered, allowing key decisions to be made. This report discusses the execution of the Implementation Plan and the results achieved in addressing the questions. Through sampling and analysis, all nine safety-related questions have been answered, and extensive data for the three disposal-planning questions have been collected, allowing for key decision making. Many more tanks than the original 28 High Priority tanks identified in the Implementation Plan were sampled and analyzed. Twenty-one High Priority tanks and 85 other tanks were core sampled and used to address the questions. Thirty-eight additional tanks were auger

  11. Estimating Sample Size for Usability Testing

    Directory of Open Access Journals (Sweden)

    Alex Cazañas

    2017-02-01

    Full Text Available One strategy used to assure that an interface meets user requirements is to conduct usability testing. When conducting such testing one of the unknowns is sample size. Since extensive testing is costly, minimizing the number of participants can contribute greatly to successful resource management of a project. Even though a significant number of models have been proposed to estimate sample size in usability testing, there is still no consensus on the optimal size. Several studies claim that 3 to 5 users suffice to uncover 80% of problems in a software interface. However, many other studies challenge this assertion. This study analyzed data collected from the user testing of a web application to verify the rule of thumb, commonly known as the “magic number 5”. The outcomes of the analysis showed that the 5-user rule significantly underestimates the required sample size to achieve reasonable levels of problem detection.

  12. Sampling efficiency for species composition assessments using the ...

    African Journals Online (AJOL)

    A pilot survey to determine the sampling efficiency of the wheel-point method, using the nearest plant method, to assess species composition (using replicate similarity related to sampling intensity, and total sampling time) was conducted on three plot sizes (20 x 20m, 30 x 30m, 40 x 40m) at two sites in a semi-arid savanna.

  13. Conductance fluctuations and distribution at metal-insulator transition induced by electric field in disordered chain

    International Nuclear Information System (INIS)

    Senouci, Khaled

    2000-08-01

    A simple Kronig-Penney model for 1D mesoscopic systems with δ peak potentials is used to study numerically the influence of a constant electric field on the conductance fluctuations and distribution at the transition. We found that the conductance probability distribution has a system-size independent form with large fluctuations, in good agreement with previous work on 2D and 3D systems. (author)

  14. Efficient sampling of complex network with modified random walk strategies

    Science.gov (United States)

    Xie, Yunya; Chang, Shuhua; Zhang, Zhipeng; Zhang, Mi; Yang, Lei

    2018-02-01

    We present two novel random walk strategies: choosing seed node (CSN) random walk and no-retracing (NR) random walk. Unlike classical random walk sampling, the CSN and NR strategies focus on the influence of the seed node choice and of path overlap, respectively. The three random walk samplings are applied to the Erdös-Rényi (ER), Barabási-Albert (BA), Watts-Strogatz (WS), and weighted USAir networks, and the major properties of the sampled subnets, such as sampling efficiency, degree distribution, average degree, and average clustering coefficient, are studied. Similar conclusions can be reached with all three random walk strategies. First, networks with small scales and simple structures are conducive to sampling. Second, the average degree and the average clustering coefficient of the sampled subnet tend to the corresponding values of the original networks within a limited number of steps. Third, all the degree distributions of the subnets are slightly biased to the high-degree side. The NR strategy, however, performs better for the average clustering coefficient of the subnet. In the real weighted USAir networks, obvious characteristics such as the larger clustering coefficient and the fluctuation of the degree distribution are reproduced well by these random walk strategies.
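
    The no-retracing idea can be sketched as follows: the walker never immediately steps back along the edge it just traversed, falling back to an unrestricted choice only at a dead end. The toy graph and step count below are hypothetical, for illustration only.

```python
import random

def nr_random_walk(adj, seed, steps, rng=random):
    """No-retracing random walk sampling: collect the set of visited
    nodes while forbidding an immediate step back along the edge just
    traversed (retracing is allowed only at a dead end)."""
    visited = {seed}
    current, previous = seed, None
    for _ in range(steps):
        choices = [n for n in adj[current] if n != previous]
        if not choices:            # dead end: permit retracing
            choices = adj[current]
        nxt = rng.choice(choices)
        visited.add(nxt)
        previous, current = current, nxt
    return visited

# toy 10-node ring with one chord (a made-up example network)
adj = {i: [(i - 1) % 10, (i + 1) % 10] for i in range(10)}
adj[0].append(5)
adj[5].append(0)
random.seed(1)
subnet = nr_random_walk(adj, seed=0, steps=30)
print(len(subnet), "of", len(adj), "nodes visited")
```

    On a ring, forbidding retracing forces the walker to keep circulating, so coverage grows much faster than under a classical random walk, which often oscillates between two neighbours.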

  15. Enhanced conformational sampling using enveloping distribution sampling.

    Science.gov (United States)

    Lin, Zhixiong; van Gunsteren, Wilfred F

    2013-10-14

    Lessening the problem of insufficient conformational sampling in biomolecular simulations remains a major challenge in computational biochemistry. In this article, an application of the method of enveloping distribution sampling (EDS) is proposed that addresses this challenge, and its sampling efficiency is demonstrated in simulations of a hexa-β-peptide whose conformational equilibrium encompasses two different helical folds, i.e., a right-handed 2.7(10∕12)-helix and a left-handed 3(14)-helix, separated by a high energy barrier. Standard MD simulations of this peptide using the GROMOS 53A6 force field did not reach convergence of the free enthalpy difference between the two helices even after 500 ns of simulation time. The use of soft-core non-bonded interactions in the centre of the peptide did enhance the number of transitions between the helices, but at the same time led to neglect of relevant helical configurations. In the simulations of a two-state EDS reference Hamiltonian that envelops both the physical peptide and the soft-core peptide, sampling of the conformational space of the physical peptide ensures that physically relevant conformations can be visited, and sampling of the conformational space of the soft-core peptide helps to enhance the transitions between the two helices. The EDS simulations sampled many more transitions between the two helices and showed much faster convergence of the relative free enthalpy of the two helices compared with the standard MD simulations, with only a slightly larger computational effort to determine optimized EDS parameters. Combined with various methods to smoothen the potential energy surface, the proposed EDS application will be a powerful technique for enhancing the sampling efficiency in biomolecular simulations.
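
    The EDS reference potential that "envelops" several end states is a log-sum-exp combination, V_R = -(1/(s*beta)) * ln( sum_i exp(-s*beta*(V_i - E_i^R)) ), where s is a smoothness parameter and the E_i^R are energy offsets. A minimal numerical sketch (not GROMOS code; the parameter values are arbitrary):

```python
import math

def eds_reference_energy(energies, beta=1.0, s=1.0, offsets=None):
    """EDS reference potential for a list of end-state energies V_i:
    V_R = -(1/(s*beta)) * ln( sum_i exp(-s*beta*(V_i - E_i^R)) ).
    The smoothness s and offsets E_i^R are the tunable EDS parameters."""
    if offsets is None:
        offsets = [0.0] * len(energies)
    terms = [math.exp(-s * beta * (v - e)) for v, e in zip(energies, offsets)]
    return -math.log(sum(terms)) / (s * beta)

# The reference potential closely follows the lower of the two end-state
# energies, so low-energy regions of *either* state remain accessible:
print(round(eds_reference_energy([1.0, 10.0]), 3))   # ~1.0 (state 1 dominates)
print(round(eds_reference_energy([5.0, 5.0]), 3))    # 5 - ln 2, ~4.307
```

    Lowering s below 1 flattens the barrier between the states, which is what accelerates the transitions described in the abstract.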

  16. Boat sampling

    International Nuclear Information System (INIS)

    Citanovic, M.; Bezlaj, H.

    1994-01-01

    This presentation describes essential boat sampling activities: on-site boat sampling process optimization and qualification; boat sampling of base material (beltline region); boat sampling of weld material (weld No. 4); and problems associated with weld crown variation, RPV shell inner radius tolerance, local corrosion pitting, and water clarity. The equipment used for boat sampling is described too. 7 pictures

  17. Barriers to acceptance of self-sampling for human papillomavirus across ethnolinguistic groups of women.

    Science.gov (United States)

    Howard, Michelle; Lytwyn, Alice; Lohfeld, Lynne; Redwood-Campbell, Lynda; Fowler, Nancy; Karwalajtys, Tina

    2009-01-01

    Immigrant and low socio-economic (SES) women in North America underutilize Papanicolaou screening. Vaginal swab self-sampling for oncogenic human papillomavirus (HPV) has the potential to increase cervical cancer screening participation. The purpose of this qualitative study was to understand the perceptions of lower SES and immigrant women regarding self-sampling for HPV. Eleven focus-group interviews were conducted: one with Canadian-born English-speaking lower SES women, and two groups each with Arabic, Cantonese, Dari (Afghani), Somali and Spanish (Latino)-speaking women (one group conducted in English, the other in the native language) recently immigrated to Canada. Five to nine women aged 35 to 65 years and married with children participated in each group. Themes included 1) who might use self-sampling and why; 2) aversion to self-sampling and reasons to prefer a physician; 3) ways to improve the appeal of self-sampling. Women generally perceived benefits of self-sampling and a small number felt they might use the method, but all groups had some reservations. Reasons included: uncertainty over performing the sampling correctly; fear of hurting themselves; concern about obtaining appropriate material; and concerns about test accuracy. Women preferred testing by a health care professional because they were accustomed to pelvic examinations, it was more convenient, or they trusted the results. Perceptions of self-sampling for HPV were similar across cultures and pertained to issues of confidence in self-sampling and the need for physician involvement in care. These findings can inform programs and studies planning to employ self-sampling as a screening modality for cervical cancer.

  18. Sample Preparation Report of the Fourth OPCW Confidence Building Exercise on Biomedical Sample Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Udey, R. N. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Corzett, T. H. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Alcaraz, A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2014-07-03

    Following the successful completion of the 3rd biomedical confidence building exercise (February 2013 – March 2013), which included the analysis of plasma and urine samples spiked at low ppb levels as part of the exercise scenario, another confidence building exercise was targeted to be conducted in 2014. In this 4th exercise, it was desired to focus specifically on the analysis of plasma samples. The scenario was designed as an investigation of an alleged use of chemical weapons where plasma samples were collected, as plasma has been reported to contain CWA adducts which remain present in the human body for several weeks (Solano et al. 2008). In the 3rd exercise most participants used the fluoride regeneration method to analyze for the presence of nerve agents in plasma samples. For the 4th biomedical exercise it was decided to evaluate the analysis of human plasma samples for the presence/absence of the VX adducts and aged adducts to blood proteins (e.g., VX-butyrylcholinesterase (BuChE) and aged BuChE adducts using a pepsin digest technique to yield nonapeptides; or equivalent). As the aging of VX-BuChE adducts is relatively slow (t1/2 = 77 hr at 37 °C [Aurbek et al. 2009]), soman (GD), which ages much more quickly (t1/2 = 9 min at 37 °C [Masson et al. 2010]), was used to simulate an aged VX sample. Additional objectives of this exercise included having laboratories assess novel OP-adducted plasma sample preparation techniques and analytical instrumentation methodologies, as well as refining/designating the reporting formats for these new techniques.

  19. Signal sampling circuit

    NARCIS (Netherlands)

    Louwsma, S.M.; Vertregt, Maarten

    2011-01-01

    A sampling circuit for sampling a signal is disclosed. The sampling circuit comprises a plurality of sampling channels adapted to sample the signal in time-multiplexed fashion, each sampling channel comprising a respective track-and-hold circuit connected to a respective analogue to digital

  20. Signal sampling circuit

    NARCIS (Netherlands)

    Louwsma, S.M.; Vertregt, Maarten

    2010-01-01

    A sampling circuit for sampling a signal is disclosed. The sampling circuit comprises a plurality of sampling channels adapted to sample the signal in time-multiplexed fashion, each sampling channel comprising a respective track-and-hold circuit connected to a respective analogue to digital

  1. Sample vial inserts: A better approach for sampling heterogeneous slurry samples in the SRS Defense Waste Processing Facility

    International Nuclear Information System (INIS)

    Coleman, C.J.; Goode, S.R.

    1996-01-01

    A convenient and effective new approach for analyzing DWPF samples involves the use of inserts with volumes of 1.5--3 ml placed in the neck of 14 ml sample vials. The inserts have rims that conform to the rim of the vials so that they sit straight and stable in the vial. The DWPF tank sampling system fills the pre-weighed insert rather than the entire vial, so the vial functions only as the insert holder. The shielded cell operator then removes the vial cap and decants the insert containing the sample into a plastic bottle, crucible, etc., for analysis. Inert materials such as Teflon, plastic, and zirconium are used for the insert, so it is unnecessary to separate the insert from the sample for most analyses. The key advantage of using inserts to take DWPF samples, rather than filling sample vials, is that they provide a convenient and almost foolproof way of obtaining and handling small volumes of slurry samples in a shielded cell without corrupting the sample. Since the insert allows the entire sample to be analyzed, this approach eliminates the errors inherent in subsampling the heterogeneous slurries that comprise DWPF samples. Slurry samples can then be analyzed with confidence. Analysis times are dramatically reduced by eliminating the drying and vitrification steps normally used to produce a homogeneous solid sample. Direct dissolution and elemental analysis of slurry samples are achieved in 8 hours or less, compared with 40 hours for analysis of vitrified slurry samples. Comparison of samples taken in inserts versus full vials indicates that the insert does not significantly affect sample composition

  2. Sparse covariance estimation in heterogeneous samples.

    Science.gov (United States)

    Rodríguez, Abel; Lenkoski, Alex; Dobra, Adrian

    Standard Gaussian graphical models implicitly assume that the conditional independence among variables is common to all observations in the sample. However, in practice, observations are usually collected from heterogeneous populations where such an assumption is not satisfied, leading in turn to nonlinear relationships among variables. To address such situations we explore mixtures of Gaussian graphical models; in particular, we consider both infinite mixtures and infinite hidden Markov models where the emission distributions correspond to Gaussian graphical models. Such models allow us to divide a heterogeneous population into homogenous groups, with each cluster having its own conditional independence structure. As an illustration, we study the trends in foreign exchange rate fluctuations in the pre-Euro era.
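
    The core idea of dividing a heterogeneous sample into homogeneous groups, each with its own dependence structure, can be illustrated in a drastically simplified form: EM for a two-component univariate Gaussian mixture. The paper's models use Gaussian graphical models as emission distributions and infer the number of components nonparametrically; this sketch assumes neither, and the data are synthetic.

```python
import math, random

def em_two_gaussians(xs, iters=200):
    """EM for a two-component 1D Gaussian mixture: a toy analogue of
    clustering a heterogeneous sample into homogeneous groups."""
    mu = [min(xs), max(xs)]          # crude initialisation
    var = [1.0, 1.0]
    w = [0.5, 0.5]
    for _ in range(iters):
        # E-step: responsibilities of each component for each point
        resp = []
        for x in xs:
            dens = [w[k] / math.sqrt(2 * math.pi * var[k])
                    * math.exp(-(x - mu[k]) ** 2 / (2 * var[k]))
                    for k in range(2)]
            total = sum(dens)
            resp.append([d / total for d in dens])
        # M-step: re-estimate weights, means, and variances
        for k in range(2):
            nk = sum(r[k] for r in resp)
            w[k] = nk / len(xs)
            mu[k] = sum(r[k] * x for r, x in zip(resp, xs)) / nk
            var[k] = sum(r[k] * (x - mu[k]) ** 2
                         for r, x in zip(resp, xs)) / nk + 1e-9
    return w, mu, var

random.seed(2)
xs = ([random.gauss(0, 1) for _ in range(300)]
      + [random.gauss(5, 1) for _ in range(300)])
w, mu, var = em_two_gaussians(xs)
print([round(m, 1) for m in sorted(mu)])  # component means near 0 and 5
```

    In the paper's setting each recovered group would additionally carry its own conditional independence graph rather than a single variance.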

  3. Standard methods for sampling and sample preparation for gamma spectroscopy

    International Nuclear Information System (INIS)

    Taskaeva, M.; Taskaev, E.; Nikolov, P.

    1993-01-01

    The strategy for sampling and sample preparation is outlined: the necessary number of samples; analysis and treatment of the results received; the quantity of analysed material according to the radionuclide concentrations and analytical methods; and the minimal quantity and kind of data needed for making final conclusions and decisions on the basis of the results received. This strategy was tested in gamma spectroscopic analysis of radionuclide contamination of the region of the Eleshnitsa Uranium Mines. The water samples were taken and stored according to ASTM D 3370-82. The general sampling procedures were in conformity with the recommendations of ISO 5667. The radionuclides were concentrated by coprecipitation with iron hydroxide and ion exchange. The sampling of soil samples complied with ASTM C 998, and their sample preparation with ASTM C 999. After preparation the samples were sealed hermetically and measured. (author)

  4. Field evaluation of personal sampling methods for multiple bioaerosols.

    Science.gov (United States)

    Wang, Chi-Hsun; Chen, Bean T; Han, Bor-Cheng; Liu, Andrew Chi-Yeu; Hung, Po-Chen; Chen, Chih-Yong; Chao, Hsing Jasmine

    2015-01-01

    Ambient bioaerosols are ubiquitous in the daily environment and can affect health in various ways. However, few studies have been conducted to comprehensively evaluate personal bioaerosol exposure in occupational and indoor environments because of the complex composition of bioaerosols and the lack of standardized sampling/analysis methods. We conducted a study to determine the most efficient collection/analysis method for the personal exposure assessment of multiple bioaerosols. The sampling efficiencies of three filters and four samplers were compared. According to our results, polycarbonate (PC) filters had the highest relative efficiency, particularly for bacteria. Side-by-side sampling was conducted to evaluate the three filter samplers (with PC filters) and the NIOSH Personal Bioaerosol Cyclone Sampler. According to the results, the Button Aerosol Sampler and the IOM Inhalable Dust Sampler had the highest relative efficiencies for fungi and bacteria, followed by the NIOSH sampler. Personal sampling was performed in a pig farm to assess occupational bioaerosol exposure and to evaluate the sampling/analysis methods. The Button and IOM samplers yielded a similar performance for personal bioaerosol sampling at the pig farm. However, the Button sampler is more likely to be clogged at high airborne dust concentrations because of its higher flow rate (4 L/min). Therefore, the IOM sampler is a more appropriate choice for performing personal sampling in environments with high dust levels. In summary, the Button and IOM samplers with PC filters are efficient sampling/analysis methods for the personal exposure assessment of multiple bioaerosols.

  5. An antithetic variate to facilitate upper-stem height measurements for critical height sampling with importance sampling

    Science.gov (United States)

    Thomas B. Lynch; Jeffrey H. Gove

    2013-01-01

    Critical height sampling (CHS) estimates cubic volume per unit area by multiplying the sum of critical heights measured on trees tallied in a horizontal point sample (HPS) by the HPS basal area factor. One of the barriers to practical application of CHS is the fact that trees near the field location of the point-sampling sample point have critical heights that occur...

  6. Balanced sampling

    NARCIS (Netherlands)

    Brus, D.J.

    2015-01-01

    In balanced sampling a linear relation between the soil property of interest and one or more covariates with known means is exploited in selecting the sampling locations. Recent developments make this sampling design attractive for statistical soil surveys. This paper introduces balanced sampling

  7. Experimental study of glass sampling devices

    International Nuclear Information System (INIS)

    Jouan, A.; Moncouyoux, J.P.; Meyere, A.

    1992-01-01

    Two high-level liquid waste containment glass sampling systems have been designed and built. The first device fits entirely inside a standard glass storage canister, and may thus be used in facilities not initially designed for this function. It has been tested successfully in the nonradioactive prototype unit at Marcoule. The work primarily covered the design and construction of an articulated arm supporting the sampling vessel, and the mechanisms necessary for filling the vessel and recovering the sample. System actuation and operation are fully automatic, and the resulting sample is representative of the glass melt. Implementation of the device is delicate however, and its reliability is estimated at about 75%. A second device was designed specifically for new vitrification facilities. It is installed directly on the glass melting furnace, and meets process operating and quality control requirements. Tests conducted at the Marcoule prototype vitrification facility demonstrated the feasibility of the system. Special attention was given to the sampling vessel transfer mechanisms, with two filling and controlled sample cooling options

  8. Microfabricated Devices for Sample Extraction, Concentrations, and Related Sample Processing Technologies

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Gang; Lin, Yuehe

    2006-12-01

    This is an invited book chapter. As with other analytical techniques, sample pretreatment, sample extraction, sample introduction, and related techniques are of extreme importance for micro-electro-mechanical systems (MEMS). Bio-MEMS devices and systems start with a sampling step. The biological sample then usually undergoes some kind of sample preparation before the actual analysis. These steps may involve extracting the target sample from its matrix, removing interferences from the sample, derivatizing the sample to detectable species, or performing a sample preconcentration step. The integration of components for sample pretreatment into microfluidic devices represents one of the remaining bottlenecks towards achieving true miniaturized total analysis systems (µTAS). This chapter provides a thorough state-of-the-art review of developments in this field to date.

  9. APTIMA assay on SurePath liquid-based cervical samples compared to endocervical swab samples facilitated by a real time database

    Directory of Open Access Journals (Sweden)

    Khader Samer

    2010-01-01

    samples transferred to APTIMA specimen transfer medium within seven days is sufficiently sensitive and specific to be used to screen for CT and GC. CT sensitivity may be somewhat reduced in samples from patients over 25 years. SP specimens retained in the original SP fixative for longer time intervals also may have decreased sensitivity, due to deterioration of RNA, but this was not assessed in this study. The ability to tap the live pathology database is a valuable tool that can be used to conduct clinical studies without a costly prospective clinical trial.

  10. Obtaining Self-Samples to Diagnose Curable Sexually Transmitted Infections: A Systematic Review of Patients’ Experiences

    Science.gov (United States)

    Paudyal, Priyamvada; Llewellyn, Carrie; Lau, Jason; Mahmud, Mohammad; Smith, Helen

    2015-01-01

    Background Routine screening is key to sexually transmitted infection (STI) prevention and control. Previous studies suggest that clinic-based screening programmes capture only a small proportion of people with STIs. Self-sampling using non- or minimally invasive techniques may be beneficial for those reluctant to actively engage with conventional sampling methods. We systematically reviewed studies of patients’ experiences of obtaining self-samples to diagnose curable STIs. Methods We conducted an electronic search of MEDLINE, EMBASE, CINAHL, PsychINFO, BNI, and the Cochrane Database of Systematic Reviews to identify relevant articles published in English between January 1980 and March 2014. Studies were included if participants self-sampled for the diagnosis of a curable STI and had specifically sought participants’ opinions of their experience, acceptability, preferences, or willingness to self-sample. Results The initial search yielded 558 references. Of these, 45 studies met the inclusion criteria. Thirty-six studies assessed patients’ acceptability and experiences of self-sampling. Pooled results from these studies show that self-sampling is a highly acceptable method, with 85% of patients reporting the method to be well received and acceptable. Twenty-eight studies reported on ease of self-sampling; the majority of patients (88%) in these studies found self-sampling an “easy” procedure. Self-sampling was favoured over clinician sampling, and home sampling was preferred to clinic-based sampling. Females and older participants were more accepting of self-sampling. Only a small minority of participants (13%) reported pain during self-sampling. Participants were willing to undergo self-sampling and to recommend it to others. Privacy and safety were the most common concerns. Conclusion Self-sampling for diagnostic testing is well accepted, with the majority having a positive experience and willingness to use it again. Standardization of self-sampling procedures

  11. Randomized branch sampling to estimate fruit production in Pecan trees cv. ‘Barton’

    Directory of Open Access Journals (Sweden)

    Filemom Manoel Mokochinski

    Full Text Available ABSTRACT: Sampling techniques to quantify the production of fruits are still very scarce and create a gap in crop development research. This study was conducted on a rural property in the county of Cachoeira do Sul - RS to estimate the efficiency of randomized branch sampling (RBS) in quantifying the production of pecan fruit at three different ages (5, 7, and 10 years). Two selection techniques were tested: the probability proportional to the diameter (PPD) and the uniform probability (UP) techniques, which were performed on nine trees, three from each age and randomly chosen. The RBS underestimated fruit production for all ages, and its main drawback was the high sampling error (125.17% for PPD and 111.04% for UP). The UP was regarded as more efficient than the PPD, though both techniques estimated similar production and similar experimental errors. In conclusion, we reported that branch sampling was inaccurate for this case study, requiring new studies to produce estimates with smaller sampling error.
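
    The mechanics of randomized branch sampling can be sketched as follows: descend from the trunk, selecting one branch at each fork either with probability proportional to a size measure (PPD) or uniformly (UP); the fruit count on the terminal shoot divided by the path probability is an unbiased estimate of the tree total. The tree structure and numbers below are hypothetical, and the sketch assumes all fruit is borne on terminal shoots.

```python
import random

def rbs_estimate(tree, proportional=True, rng=random):
    """One randomized-branch-sampling path through a nested-dict tree.
    Returns fruit count on the reached terminal shoot divided by the
    probability of the path, an unbiased estimator of the tree total."""
    node, prob = tree, 1.0
    while node["children"]:
        kids = node["children"]
        if proportional:                      # PPD selection
            sizes = [c["size"] for c in kids]
            total = sum(sizes)
            r, acc = rng.random() * total, 0.0
            chosen, p = kids[-1], sizes[-1] / total   # fallback for rounding
            for c, s in zip(kids, sizes):
                acc += s
                if r <= acc:
                    chosen, p = c, s / total
                    break
            node, prob = chosen, prob * p
        else:                                 # UP selection
            node, prob = rng.choice(kids), prob / len(kids)
    return node["fruit"] / prob

# hypothetical two-level tree; true total fruit = 10 + 30 + 60 = 100
tree = {"size": 10, "fruit": 0, "children": [
    {"size": 4, "fruit": 10, "children": []},
    {"size": 6, "fruit": 0, "children": [
        {"size": 3, "fruit": 30, "children": []},
        {"size": 3, "fruit": 60, "children": []}]}]}

random.seed(3)
estimates = [rbs_estimate(tree) for _ in range(20000)]
print(round(sum(estimates) / len(estimates)))  # unbiased: mean near 100
```

    Note how individual path estimates (here 25, 100, or 200) scatter widely around the true total, which is exactly the large sampling error the study reports.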

  12. Enhancement of tunnel conductivity by Cooper pair fluctuations in electron-hole bilayer

    International Nuclear Information System (INIS)

    Efimkin, D K; Lozovik, Yu E

    2012-01-01

    The influence of Cooper pair fluctuations, precursors of the pairing of electrons and holes located on opposite surfaces of a topological insulator film, on the tunnel conductivity between the surfaces is investigated. Owing to restrictions imposed by momentum and energy conservation, the dependence of the tunnel conductivity on the external bias voltage exhibits a peak that becomes more prominent with decreasing disorder and temperature. We show that Cooper pair fluctuations considerably enhance tunneling, and that the height of the peak diverges in the vicinity of the critical temperature with critical index ν = 2, while the width of the peak tends to zero. The pairing of electrons and holes can be suppressed by disorder, and in the vicinity of the quantum critical point the height of the peak also diverges as a function of the Cooper pair damping, with critical index μ = 2.

  13. Sample management implementation plan: Salt Repository Project

    International Nuclear Information System (INIS)

    1987-01-01

    The purpose of the Sample Management Implementation Plan is to define management controls and building requirements for handling materials collected during the site characterization of the Deaf Smith County, Texas, site. This work will be conducted for the US Department of Energy Salt Repository Project Office (SRPO). The plan provides for controls mandated by the US Nuclear Regulatory Commission and the US Environmental Protection Agency. Salt Repository Project (SRP) Sample Management will interface with program participants who request, collect, and test samples. SRP Sample Management will be responsible for the following: (1) preparing samples; (2) ensuring documentation control; (3) providing for uniform forms, labels, data formats, and transportation and storage requirements; and (4) identifying sample specifications to ensure sample quality. The SRP Sample Management Facility will be operated under a set of procedures that will impact numerous program participants. Requesters of samples will be responsible for definition of requirements in advance of collection. Sample requests for field activities will be approved by the SRPO, aided by an advisory group, the SRP Sample Allocation Committee. This document details the staffing, building, storage, and transportation requirements for establishing an SRP Sample Management Facility. Materials to be managed in the facility include rock core and rock discontinuities, soils, fluids, biota, air particulates, cultural artifacts, and crop and food stuffs. 39 refs., 3 figs., 11 tabs

  14. Rational Arithmetic Mathematica Functions to Evaluate the Two-Sided One Sample K-S Cumulative Sampling Distribution

    Directory of Open Access Journals (Sweden)

    J. Randall Brown

    2007-06-01

    Full Text Available One of the most widely used goodness-of-fit tests is the two-sided one sample Kolmogorov-Smirnov (K-S) test, which has been implemented by many computer statistical software packages. To calculate a two-sided p value (evaluate the cumulative sampling distribution), these packages use various methods including recursion formulae, limiting distributions, and approximations of unknown accuracy developed over thirty years ago. Based on an extensive literature search for the two-sided one sample K-S test, this paper identifies an exact formula for sample sizes up to 31, six recursion formulae, and one matrix formula that can be used to calculate a p value. To ensure accurate calculation by avoiding catastrophic cancellation and eliminating rounding error, each of these formulae is implemented in rational arithmetic. For the six recursion formulae and the matrix formula, computational experience for sample sizes up to 500 shows that computational times are increasing functions of both the sample size and the number of digits in the numerator and denominator integers of the rational number test statistic. The computational times of the seven formulae vary immensely, but the Durbin recursion formula is almost always the fastest. Linear search is used to calculate the inverse of the cumulative sampling distribution (find the confidence interval half-width), and tables of calculated half-widths are presented for sample sizes up to 500. Using calculated half-widths as input, computational times for the fastest formula, the Durbin recursion formula, are given for sample sizes up to two thousand.
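
    One common form of the matrix formula (Durbin's result in the form popularized by Marsaglia, Tsang, and Wang) evaluates P(D_n < d) as (n!/n^n) times an entry of the n-th power of a band matrix H. Implemented with Python's `fractions.Fraction`, the computation is exact rational arithmetic, which is precisely how catastrophic cancellation and rounding error are avoided; this sketch is for small n only, without the paper's optimized recursions.

```python
from fractions import Fraction
from math import factorial, ceil

def ks_cdf(n, d):
    """Exact P(D_n < d) for the two-sided one-sample K-S statistic via
    the Durbin/Marsaglia-Tsang-Wang matrix formula, in rational
    arithmetic: P = (n!/n^n) * (H^n)[k-1][k-1]."""
    d = Fraction(d)
    k = ceil(n * d)
    h = k - n * d
    m = 2 * k - 1
    # band matrix of reciprocal factorials ...
    H = [[Fraction(1, factorial(i - j + 1)) if i - j + 1 >= 0 else Fraction(0)
          for j in range(m)] for i in range(m)]
    # ... with adjusted first column, last row, and corner entry
    for i in range(m):
        H[i][0] = (1 - h ** (i + 1)) / factorial(i + 1)
    for j in range(m):
        H[m - 1][j] = (1 - h ** (m - j)) / factorial(m - j)
    H[m - 1][0] = (1 - 2 * h ** m
                   + max(Fraction(0), 2 * h - 1) ** m) / factorial(m)

    def matmul(A, B):
        return [[sum(A[i][t] * B[t][j] for t in range(m)) for j in range(m)]
                for i in range(m)]

    P = H
    for _ in range(n - 1):          # H^n by repeated multiplication
        P = matmul(P, H)
    return P[k - 1][k - 1] * factorial(n) / Fraction(n) ** n

# exact rational values, free of rounding error:
print(ks_cdf(2, Fraction(3, 4)))   # 7/8, matching 1 - 2(1-d)^n for d >= 1/2
print(ks_cdf(1, Fraction(1, 2)))   # 0, matching 2d - 1 for n = 1
```

    The growth of the numerator and denominator digits in these Fractions with n is exactly the cost driver the paper measures.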

  15. Mechanical Conversion for High-Throughput TEM Sample Preparation

    International Nuclear Information System (INIS)

    Kendrick, Anthony B; Moore, Thomas M; Zaykova-Feldman, Lyudmila

    2006-01-01

    This paper presents a novel method of direct mechanical conversion from a lift-out sample to a TEM sample holder. The lift-out sample is prepared in the FIB using the in-situ lift-out Total Release™ method. The mechanical conversion is conducted using a mechanical press and one of a variety of TEM coupons, including coupons for both top-side and back-side thinning. The press joins a probe tip point with the attached TEM sample to the sample coupon and separates the complete assembly as a 3 mm diameter TEM grid, compatible with commercially available TEM sample holder rods. This mechanical conversion process lends itself well to the high-throughput requirements of in-line process control and to materials characterization labs where instrument utilization and sample security are critically important

  16. Language sampling

    DEFF Research Database (Denmark)

    Rijkhoff, Jan; Bakker, Dik

    1998-01-01

    This article has two aims: [1] to present a revised version of the sampling method that was originally proposed in 1993 by Rijkhoff, Bakker, Hengeveld and Kahrel, and [2] to discuss a number of other approaches to language sampling in the light of our own method. We will also demonstrate how our sampling method is used with different genetic classifications (Voegelin & Voegelin 1977, Ruhlen 1987, Grimes ed. 1997) and argue that, on the whole, our sampling technique compares favourably with other methods, especially in the case of exploratory research.

  17. Environmental sampling

    International Nuclear Information System (INIS)

    Puckett, J.M.

    1998-01-01

    Environmental Sampling (ES) is a technology option that can have application in transparency in nuclear nonproliferation. The basic process is to take a sample from the environment, e.g., soil, water, vegetation, or dust and debris from a surface, and through very careful sample preparation and analysis, determine the types, elemental concentration, and isotopic composition of actinides in the sample. The sample is prepared and the analysis performed in a clean chemistry laboratory (CCL). This ES capability is part of the IAEA Strengthened Safeguards System. Such a Laboratory is planned to be built by JAERI at Tokai and will give Japan an intrinsic ES capability. This paper presents options for the use of ES as a transparency measure for nuclear nonproliferation

  18. 40 CFR 141.23 - Inorganic chemical sampling and analytical requirements.

    Science.gov (United States)

    2010-07-01

    ... may allow a groundwater system to reduce the sampling frequency to annually after four consecutive... this section. (a) Monitoring shall be conducted as follows: (1) Groundwater systems shall take a... system shall take each sample at the same sampling point unless conditions make another sampling point...

  19. Maintaining continuity of knowledge on safeguards samples

    International Nuclear Information System (INIS)

    Franssen, F.; Islam, A.B.M.N.; Sonnier, C.; Schoeneman, J.L.; Baumann, M.

    1992-01-01

    The conclusions of the vulnerability test on VOPAN (Verification of Operator's Analysis) conducted at the Safeguards Analytical Laboratory (SAL) at Seibersdorf, Austria in October 1990 and documented in STR-266 indicate that ''whenever samples are taken for safeguards purposes extreme care must be taken to ensure that they have not been interfered with during the sample taking, transportation, storage or sample preparation process.'' Indeed, a number of possibilities exist to alter the content of a safeguards sample vial from the moment of sampling up to the arrival of the treated (or untreated) sample at SAL. The time lapse between these two events can range from a few days up to months. The sample history over this period can be subdivided into three main sub-periods: (1) from when the sampling activities are commenced up to the treatment in the operator's laboratory, (2) during treatment of samples in the operator's laboratory, and (3) between that treatment and the arrival of the sample at SAL. A combined effort between the Agency and the United States Support Program to the Agency (POTAS) has resulted in two active tasks and one proposed task to investigate improving the maintenance of continuity of knowledge on safeguards samples during the entire period of their existence. This paper describes the use of the Sample Vial Secure Container (SVSC), the Authenticated Secure Container System (ASCS), and the Secure Container for Storage and Transportation of samples (SCST) to guarantee that a representative portion of the solution sample will be received at SAL

  20. Cluster-sample surveys and lot quality assurance sampling to evaluate yellow fever immunisation coverage following a national campaign, Bolivia, 2007.

    Science.gov (United States)

    Pezzoli, Lorenzo; Pineda, Silvia; Halkyer, Percy; Crespo, Gladys; Andrews, Nick; Ronveaux, Olivier

    2009-03-01

    To estimate the yellow fever (YF) vaccine coverage for the endemic and non-endemic areas of Bolivia and to determine whether selected districts had acceptable levels of coverage (>70%). We conducted two surveys of 600 individuals (25 x 12 clusters) to estimate coverage in the endemic and non-endemic areas. We assessed 11 districts using lot quality assurance sampling (LQAS). The lot (district) sample was 35 individuals with six as the decision value (alpha error 6% if true coverage is 70%; beta error 6% if true coverage is 90%). To increase feasibility, we divided the lots into five clusters of seven individuals; to investigate the effect of clustering, we calculated alpha and beta by conducting simulations where each cluster's true coverage was sampled from a normal distribution with a mean of 70% or 90% and standard deviations of 5% or 10%. Estimated coverage was 84.3% (95% CI: 78.9-89.7) in endemic areas, 86.8% (82.5-91.0) in non-endemic and 86.0% (82.8-89.1) nationally. LQAS showed that four lots had unacceptable coverage levels. In six lots, results were inconsistent with the estimated administrative coverage. The simulations suggested that the effect of clustering the lots is unlikely to have significantly increased the risk of making incorrect accept/reject decisions. Estimated YF coverage was high. Discrepancies between administrative coverage and LQAS results may be due to incorrect population data. Even allowing for clustering in LQAS, the statistical errors would remain low. Catch-up campaigns are recommended in districts with unacceptable coverage.
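The clustering simulation described above can be sketched as follows. The sampling plan (five clusters of seven individuals, decision value six) comes from the record; the function names and the exact accept/reject error definitions are illustrative assumptions, not the authors' code:

```python
import random

def lot_decision(mean_cov, sd, n_clusters=5, per_cluster=7, decision=6, rng=random):
    """Simulate one LQAS lot of 5 clusters x 7 individuals.

    Each cluster's true coverage is drawn from a normal distribution
    (clipped to [0, 1]); the lot is accepted if the number of
    unvaccinated individuals found does not exceed the decision value."""
    unvaccinated = 0
    for _ in range(n_clusters):
        p = min(1.0, max(0.0, rng.gauss(mean_cov, sd)))  # cluster-level coverage
        unvaccinated += sum(1 for _ in range(per_cluster) if rng.random() >= p)
    return unvaccinated <= decision  # True -> lot accepted

def error_rates(trials=5000, seed=1):
    rng = random.Random(seed)
    # alpha: probability of accepting a lot whose true mean coverage is only 70%
    alpha = sum(lot_decision(0.70, 0.05, rng=rng) for _ in range(trials)) / trials
    # beta: probability of rejecting a lot whose true mean coverage is 90%
    beta = sum(not lot_decision(0.90, 0.05, rng=rng) for _ in range(trials)) / trials
    return alpha, beta
```

Running `error_rates()` with cluster-level standard deviation 5% gives error probabilities close to the nominal 6% of the unclustered binomial plan, which is the paper's point: clustering the lot does not greatly inflate the decision errors.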

  1. Evaluation of sampling strategies to estimate crown biomass

    Directory of Open Access Journals (Sweden)

    Krishna P Poudel

    2015-01-01

    Full Text Available Background Depending on tree and site characteristics, crown biomass accounts for a significant portion of the total aboveground biomass of a tree. Crown biomass estimation is useful for several purposes, including evaluating the economic feasibility of crown utilization for energy production or forest products, fuel load assessments and fire management strategies, and wildfire modeling. However, crown biomass is difficult to predict because of the variability within and among species and sites. Thus the allometric equations used for predicting crown biomass should be based on data collected with precise and unbiased sampling strategies. In this study, we evaluate the performance of different sampling strategies for estimating crown biomass and assess the effect of sample size on those estimates. Methods Using data collected from 20 destructively sampled trees, we evaluated 11 different sampling strategies using six evaluation statistics: bias, relative bias, root mean square error (RMSE), relative RMSE, amount of biomass sampled, and relative biomass sampled. We also evaluated the performance of the selected sampling strategies when different numbers of branches (3, 6, 9, and 12) are selected from each tree. A tree-specific log-linear model with branch diameter and branch length as covariates was used to obtain individual branch biomass. Results Compared to all other methods, stratified sampling with the probability-proportional-to-size estimation technique produced better results when three or six branches per tree were sampled. However, systematic sampling with the ratio estimation technique was best when at least nine branches per tree were sampled. Under the stratified sampling strategy, selecting an unequal number of branches per stratum produced results approximately similar to simple random sampling, but it further decreased RMSE when information on branch diameter was used in the design and estimation phases. Conclusions Use of
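The probability-proportional-to-size estimation mentioned above can be illustrated with a Hansen-Hurwitz-style sketch, using branch diameter (or any auxiliary size measure) to weight the draws. The function name and setup are illustrative assumptions, not the study's actual procedure:

```python
import random

def pps_total_estimate(sizes, biomasses, m, seed=0):
    """Hansen-Hurwitz estimator under PPS sampling with replacement.

    Draw m branches with probability proportional to an auxiliary size
    measure (e.g. branch diameter) and estimate total crown biomass as
    the mean of y_i / p_i over the drawn branches; this is unbiased for
    the true total."""
    rng = random.Random(seed)
    total_size = sum(sizes)
    probs = [s / total_size for s in sizes]          # draw probabilities
    draws = rng.choices(range(len(sizes)), weights=probs, k=m)
    return sum(biomasses[i] / probs[i] for i in draws) / m
```

When biomass is exactly proportional to the size measure, every draw returns the same expansion `y_i / p_i`, so the estimator has zero variance; the closer the auxiliary variable tracks biomass, the better PPS performs, which matches the record's finding that using branch diameter in design and estimation reduced RMSE.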

  2. Inverse heat conduction estimation of inner wall temperature fluctuations under turbulent penetration

    Science.gov (United States)

    Guo, Zhouchao; Lu, Tao; Liu, Bo

    2017-04-01

    Turbulent penetration can occur when hot and cold fluids mix in a horizontal T-junction pipe at nuclear plants. Caused by the unstable turbulent penetration, temperature fluctuations with large amplitude and high frequency can lead to time-varying wall thermal stress and even thermal fatigue on the inner wall. Numerous cases exist, however, where inner wall temperatures cannot be measured and only outer wall temperature measurements are feasible. Estimating temperature fluctuations on the inner wall from measurements of outer wall temperatures, without damaging the structure of the pipe, is therefore an active research area in nuclear science and engineering. In this study, both the one-dimensional (1D) and the two-dimensional (2D) inverse heat conduction problem (IHCP) were solved to estimate the temperature fluctuations on the inner wall. First, numerical models of both the 1D and the 2D direct heat conduction problem (DHCP) were constructed in MATLAB, based on the finite difference method with an implicit scheme. Second, both the 1D IHCP and the 2D IHCP were solved by the steepest descent method (SDM), and the DHCP results of temperatures on the outer wall were used to estimate the temperature fluctuations on the inner wall. Third, we compared the temperature fluctuations on the inner wall estimated by the 1D IHCP with those estimated by the 2D IHCP in four cases: (1) when the maximum disturbance of temperature of fluid inside the pipe was 3°C, (2) when the maximum disturbance of temperature of fluid inside the pipe was 30°C, (3) when the maximum disturbance of temperature of fluid inside the pipe was 160°C, and (4) when the fluid temperatures inside the pipe were random from 50°C to 210°C.
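The direct problem (the first step above, an implicit finite-difference scheme) can be sketched for the 1D case. This is a minimal stand-in for the paper's MATLAB models: the grid, boundary temperatures, and function names are chosen for illustration only, and the walls are held at fixed (Dirichlet) temperatures rather than driven by fluid fluctuations:

```python
def thomas(a, b, c, d):
    """Solve a tridiagonal system (a = sub-, b = main, c = super-diagonal)."""
    n = len(d)
    cp, dp = [0.0] * n, [0.0] * n
    cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
    for i in range(1, n):
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

def step_implicit(T, r, T_left, T_right):
    """One backward-Euler (implicit) step of 1D heat conduction.

    T holds interior node temperatures; r = alpha*dt/dx**2 is the grid
    Fourier number; the wall temperatures enter through the right-hand side."""
    n = len(T)
    a = [-r] * n
    b = [1.0 + 2.0 * r] * n
    c = [-r] * n
    d = list(T)
    d[0] += r * T_left
    d[-1] += r * T_right
    a[0], c[-1] = 0.0, 0.0
    return thomas(a, b, c, d)
```

Because the scheme is implicit, it is unconditionally stable, so `r` can exceed the explicit limit of 0.5; marching to steady state reproduces the expected linear temperature profile between the two wall temperatures (the 50°C to 210°C range is borrowed from case 4 of the record).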

  3. Effective sample labeling

    International Nuclear Information System (INIS)

    Rieger, J.T.; Bryce, R.W.

    1990-01-01

    Ground-water samples collected for hazardous-waste and radiological monitoring have come under strict regulatory and quality assurance requirements as a result of laws such as the Resource Conservation and Recovery Act. To comply with these laws, the labeling system used to identify environmental samples had to be upgraded to ensure proper handling and to protect collection personnel from exposure to sample contaminants and sample preservatives. The sample label now used at Pacific Northwest Laboratory is a complete sample document. In the event other paperwork on a labeled sample were lost, the necessary information could be found on the label

  4. Size effects in many-valley fluctuations in semiconductors

    International Nuclear Information System (INIS)

    Sokolov, V.N.; Kochelap, V.A.

    1995-08-01

    We present the results of theoretical investigations of nonhomogeneous fluctuations in submicron active regions of many-valley semiconductors with equivalent valleys (Ge, Si-type), where the dimension 2d of the region is comparable to or less than the intervalley diffusion relaxation length L_iv. It is shown that for arbitrary orientations of the valley axes (the crystal axes) with respect to lateral sample surfaces, the fluctuation spectra depend on the bias voltage applied to the layer in the region of weak nonheating electric fields. A new physical phenomenon is reported: the fluctuation spectra depend on the sample thickness; for 2d < L_iv, suppression of fluctuations arises for fluctuation frequencies ω < τ_iv^(-1), where τ_iv is the characteristic intervalley relaxation time. (author). 43 refs, 5 figs

  5. Field evaluation of personal sampling methods for multiple bioaerosols.

    Directory of Open Access Journals (Sweden)

    Chi-Hsun Wang

    Full Text Available Ambient bioaerosols are ubiquitous in the daily environment and can affect health in various ways. However, few studies have been conducted to comprehensively evaluate personal bioaerosol exposure in occupational and indoor environments because of the complex composition of bioaerosols and the lack of standardized sampling/analysis methods. We conducted a study to determine the most efficient collection/analysis method for the personal exposure assessment of multiple bioaerosols. The sampling efficiencies of three filters and four samplers were compared. According to our results, polycarbonate (PC filters had the highest relative efficiency, particularly for bacteria. Side-by-side sampling was conducted to evaluate the three filter samplers (with PC filters and the NIOSH Personal Bioaerosol Cyclone Sampler. According to the results, the Button Aerosol Sampler and the IOM Inhalable Dust Sampler had the highest relative efficiencies for fungi and bacteria, followed by the NIOSH sampler. Personal sampling was performed in a pig farm to assess occupational bioaerosol exposure and to evaluate the sampling/analysis methods. The Button and IOM samplers yielded a similar performance for personal bioaerosol sampling at the pig farm. However, the Button sampler is more likely to be clogged at high airborne dust concentrations because of its higher flow rate (4 L/min. Therefore, the IOM sampler is a more appropriate choice for performing personal sampling in environments with high dust levels. In summary, the Button and IOM samplers with PC filters are efficient sampling/analysis methods for the personal exposure assessment of multiple bioaerosols.

  6. Electric Field Fluctuations in Water

    Science.gov (United States)

    Thorpe, Dayton; Limmer, David; Chandler, David

    2013-03-01

    Charge transfer in solution, such as autoionization and ion pair dissociation in water, is governed by rare electric field fluctuations of the solvent. Knowing the statistics of such fluctuations can help explain the dynamics of these rare events. Trajectories short enough to be tractable by computer simulation are virtually certain not to sample the large fluctuations that promote rare events. Here, we employ importance sampling techniques with classical molecular dynamics simulations of liquid water to study statistics of electric field fluctuations far from their means. We find that the distributions of electric fields located on individual water molecules are not in general gaussian. Near the mean this non-gaussianity is due to the internal charge distribution of the water molecule. Further from the mean, however, there is a previously unreported Bjerrum-like defect that stabilizes certain large fluctuations out of equilibrium. As expected, differences in electric fields acting between molecules are gaussian to a remarkable degree. By studying these differences, though, we are able to determine what configurations result not only in large electric fields, but also in electric fields with long spatial correlations that may be needed to promote charge separation.
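The importance-sampling idea above, biasing the simulation toward the rare large fluctuations and then reweighting, can be shown on a toy one-dimensional analogue. The Gaussian target and the mean-shifted proposal are illustrative assumptions, not the molecular-dynamics machinery of the study:

```python
import math
import random

def tail_prob_importance(threshold=4.0, n=100_000, seed=7):
    """Estimate P(X > threshold) for X ~ N(0,1) by importance sampling.

    Draws come from a proposal N(threshold, 1) shifted into the rare-event
    region; each hit is reweighted by the likelihood ratio p(x)/q(x).
    A naive estimator would almost never see such an event, which is the
    point the abstract makes about unbiased trajectories."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.gauss(threshold, 1.0)                  # biased draw
        if x > threshold:
            # weight = exp(-x^2/2) / exp(-(x-t)^2/2) for target N(0,1)
            total += math.exp(-0.5 * x * x + 0.5 * (x - threshold) ** 2)
    return total / n
```

The exact answer is about 3.17e-5; the reweighted estimate recovers it with a modest number of samples, whereas an unbiased simulation of the same length would typically observe a handful of events at best.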

  7. Comparisons of methods for generating conditional Poisson samples and Sampford samples

    OpenAIRE

    Grafström, Anton

    2005-01-01

    Methods for conditional Poisson sampling (CP-sampling) and Sampford sampling are compared and the focus is on the efficiency of the methods. The efficiency is investigated by simulation in different sampling situations. It was of interest to compare methods since new methods for both CP-sampling and Sampford sampling were introduced by Bondesson, Traat & Lundqvist in 2004. The new methods are acceptance rejection methods that use the efficient Pareto sampling method. They are found to be ...
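A minimal sketch of conditional Poisson sampling by rejection, the baseline that the Pareto-based acceptance-rejection methods improve upon, might look like this; it is not the Bondesson, Traat & Lundqvist algorithm itself, and the function name is an assumption:

```python
import random

def conditional_poisson_sample(p, n, seed=None):
    """Conditional Poisson sampling by brute-force rejection.

    Draw each unit independently with its probability p[i] (ordinary
    Poisson sampling) and reject the outcome unless exactly n units were
    selected. Correct but potentially slow when sum(p) is far from n,
    which is why more efficient acceptance-rejection schemes exist."""
    rng = random.Random(seed)
    while True:
        s = [i for i, pi in enumerate(p) if rng.random() < pi]
        if len(s) == n:
            return s
```

The expected number of rejections grows as the realized Poisson sample size concentrates away from `n`, so in practice the probabilities are first scaled so that `sum(p)` is close to `n`.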

  8. Cast Stone Oxidation Front Evaluation: Preliminary Results For Samples Exposed To Moist Air

    International Nuclear Information System (INIS)

    Langton, C. A.; Almond, P. M.

    2013-01-01

    The rate of oxidation is important to the long-term performance of reducing salt waste forms because the solubility of some contaminants, e.g., technetium, is a function of oxidation state. TcO4- in the salt solution is reduced to Tc(IV) and has been shown to react with ingredients in the waste form to precipitate low-solubility sulfide and/or oxide phases. Upon exposure to oxygen, the compounds containing Tc(IV) oxidize to the pertechnetate ion, Tc(VII)O4-, which is very soluble. Consequently the rate of technetium oxidation front advancement into a monolith and the technetium leaching profile as a function of depth from an exposed surface are important to waste form performance and ground water concentration predictions. An approach for measuring contaminant oxidation rate (effective contaminant-specific oxidation rate) based on leaching of select contaminants of concern is described in this report. In addition, the relationship between reduction capacity and contaminant oxidation is addressed. Chromate (Cr(VI)) was used as a non-radioactive surrogate for pertechnetate, Tc(VII), in Cast Stone samples prepared with 5 M Simulant. Cast Stone spiked with pertechnetate was also prepared and tested. Depth-discrete subsamples spiked with Cr were cut from Cast Stone exposed to Savannah River Site (SRS) outdoor ambient temperature fluctuations and moist air. Depth-discrete subsamples spiked with Tc-99 were cut from Cast Stone exposed to laboratory ambient temperature fluctuations and moist air. Similar conditions are expected to be encountered in the Cast Stone curing container. The leachability of Cr and Tc-99 and the reduction capacities, measured by the Angus-Glasser method, were determined for each subsample as a function of depth from the exposed surface. The results obtained to date were focused on continued method development; they are preliminary and apply to the sample composition and curing/exposure conditions described in this report.
The Cr oxidation front

  9. Hanford Site Environmental Surveillance Master Sampling Schedule for Calendar Year 2010

    Energy Technology Data Exchange (ETDEWEB)

    Bisping, Lynn E.

    2010-01-08

    Environmental surveillance of the Hanford Site and surrounding areas is conducted by Pacific Northwest National Laboratory for the U.S. Department of Energy (DOE). Sampling is conducted to evaluate levels of radioactive and nonradioactive pollutants in the Hanford Site environs per regulatory requirements. This document contains the calendar year 2010 schedule for the routine collection of samples for the Surface Environmental Surveillance Project and the Drinking Water Monitoring Project. Each section includes sampling locations, sampling frequencies, sample types, and analyses to be performed. In some cases, samples are scheduled on a rotating basis. If a sample will not be collected in 2010, the anticipated year for collection is provided. Maps showing approximate sampling locations are included for media scheduled for collection in 2010.

  10. Sample summary report for KOR1 pressure tube sample

    International Nuclear Information System (INIS)

    Lee, Hee Jong; Nam, Min Woo; Choi, Young Ha

    2006-01-01

    This summary report basically includes the following:
    - The FLAW CHARACTERIZATION TABLE of the KOR1 sample and supporting documentation.
    - The CROSS REFERENCE TABLES for each investigator, i.e., the SAMPLE INSPECTION TABLE that cross-references the FLAW CHARACTERIZATION TABLE.
    - Each Sample Inspection Report, as appendices

  11. Are most samples of animals systematically biased? Consistent individual trait differences bias samples despite random sampling.

    Science.gov (United States)

    Biro, Peter A

    2013-02-01

    Sampling animals from the wild for study is something nearly every biologist has done, but despite our best efforts to obtain random samples of animals, 'hidden' trait biases may still exist. For example, consistent behavioral traits can affect trappability/catchability, independent of obvious factors such as size and gender, and these traits are often correlated with other repeatable physiological and/or life history traits. If so, systematic sampling bias may exist for any of these traits. The extent to which this is a problem, of course, depends on the magnitude of bias, which is presently unknown because the underlying trait distributions in populations are usually unknown, or unknowable. Indeed, our present knowledge about sampling bias comes from samples (not complete population censuses), which can possess bias to begin with. I had the unique opportunity to create naturalized populations of fish by seeding each of four small fishless lakes with equal densities of slow-, intermediate-, and fast-growing fish. Using sampling methods that are not size-selective, I observed that fast-growing fish were up to two times more likely to be sampled than slower-growing fish. This indicates substantial and systematic bias with respect to an important life history trait (growth rate). If correlations between behavioral, physiological and life-history traits are as widespread as the literature suggests, then many animal samples may be systematically biased with respect to these traits (e.g., when collecting animals for laboratory use), and this may affect our inferences about population structure and abundance. I conclude with a discussion on ways to minimize sampling bias for particular physiological/behavioral/life-history types within animal populations.
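The trappability effect can be illustrated with a small simulation, assuming a hypothetical population in which fast-growing individuals are twice as catchable as slow-growing ones; all names and numbers are illustrative, not the study's fish data:

```python
import random
import statistics

def biased_sample_mean(trait_values, catchability, n, seed=5):
    """Sample n animals without replacement, with relative catch
    probability given by `catchability`, and return the mean trait value
    of the catch. If catchability correlates with the trait, the sample
    mean is systematically biased relative to the population mean."""
    rng = random.Random(seed)
    pool = list(range(len(trait_values)))
    caught = []
    while len(caught) < n:
        weights = [catchability[i] for i in pool]
        pick = rng.choices(pool, weights=weights, k=1)[0]
        pool.remove(pick)
        caught.append(pick)
    return statistics.fmean(trait_values[i] for i in caught)
```

With a population that is half fast-growers (trait 1, catchability 2) and half slow-growers (trait 0, catchability 1), the population mean is 0.5, yet the sampled mean comes out well above it, mirroring the "random sampling, biased sample" result of the abstract.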

  12. Sampling of ore

    International Nuclear Information System (INIS)

    Boehme, R.C.; Nicholas, B.L.

    1987-01-01

    This invention relates to a method of and an apparatus for ore sampling. The method includes the steps of periodically removing a sample of the output material of a sorting machine, weighing each sample so that each is of the same weight, measuring a characteristic such as the radioactivity, magnetivity or the like of each sample, subjecting at least an equal portion of each sample to chemical analysis to determine the mineral content of the sample, and comparing the characteristic measurement with the mineral content of the chemically analysed portion of the sample to determine the characteristic/mineral ratio of the sample. The apparatus includes an ore sample collector, a deflector for deflecting a sample of ore particles from the output of an ore sorter into the collector, and means for moving the deflector from a first position, in which it is clear of the particle path from the sorter, to a second position, in which it is in the particle path, at predetermined time intervals and for predetermined time periods to deflect the sample particles into the collector. The apparatus conveniently includes an ore crusher for comminuting the sample particles, a sample hopper, means for weighing the hopper, a detector in the hopper for measuring a characteristic such as radioactivity, magnetivity or the like of particles in the hopper, a discharge outlet from the hopper, and means for feeding the particles from the collector to the crusher and then to the hopper

  13. Optimum sample size allocation to minimize cost or maximize power for the two-sample trimmed mean test.

    Science.gov (United States)

    Guo, Jiin-Huarng; Luh, Wei-Ming

    2009-05-01

    When planning a study, sample size determination is one of the most important tasks facing the researcher. The size will depend on the purpose of the study, the cost limitations, and the nature of the data. By specifying the standard deviation ratio and/or the sample size ratio, the present study considers the problem of heterogeneous variances and non-normality for Yuen's two-group test and develops sample size formulas to minimize the total cost or maximize the power of the test. For a given power, the sample size allocation ratio can be manipulated so that the proposed formulas can minimize the total cost, the total sample size, or the sum of total sample size and total cost. On the other hand, for a given total cost, the optimum sample size allocation ratio can maximize the statistical power of the test. After the sample size is determined, the present simulation applies Yuen's test to the sample generated, and then the procedure is validated in terms of Type I errors and power. Simulation results show that the proposed formulas can control Type I errors and achieve the desired power under the various conditions specified. Finally, the implications for determining sample sizes in experimental studies and future research are discussed.
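The classical optimum-allocation result behind such formulas (not the paper's exact trimmed-mean versions, which additionally handle trimming proportions and non-normality) can be sketched:

```python
import math

def optimal_allocation_ratio(sd1, sd2, cost1, cost2):
    """Classical optimum allocation for a two-group comparison: to minimize
    total cost c1*n1 + c2*n2 at a fixed precision of the mean difference,
    choose the sample size ratio n1/n2 = (sd1/sd2) * sqrt(cost2/cost1)."""
    return (sd1 / sd2) * math.sqrt(cost2 / cost1)

def sample_sizes_for_precision(sd1, sd2, cost1, cost2, target_se):
    """Smallest (n1, n2) on the optimal ray giving
    SE(mean1 - mean2) = sqrt(sd1^2/n1 + sd2^2/n2) <= target_se."""
    r = optimal_allocation_ratio(sd1, sd2, cost1, cost2)
    # Var = sd1^2/(r*n2) + sd2^2/n2  =>  n2 = (sd1^2/r + sd2^2) / target_se^2
    n2 = (sd1 ** 2 / r + sd2 ** 2) / target_se ** 2
    return math.ceil(r * n2), math.ceil(n2)
```

The ratio shows the qualitative behavior the abstract describes: more observations go to the group with the larger standard deviation and to the group that is cheaper to sample, and with equal variances and equal costs the rule reduces to equal allocation.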

  14. Latex Rubber Gloves as a Sampling Dosimeter Using a Novel Surrogate Sampling Device.

    Science.gov (United States)

    Sankaran, Gayatri; Lopez, Terry; Ries, Steve; Ross, John; Vega, Helen; Eastmond, David A; Krieger, Robert I

    2015-01-01

    Pesticide exposure during harvesting of crops occurs primarily to the workers' hands. When harvesters wear latex rubber gloves for personal safety and hygiene reasons, the gloves accumulate pesticide residues. Hence, characterization of the gloves' properties may be useful for pesticide exposure assessments. Controlled field studies were conducted using latex rubber gloves to define the factors that influence the transfer of pesticides to the glove and that would affect their use as a residue monitoring device. A novel sampling device called the Brinkman Contact Transfer Unit (BCTU) was constructed to study the glove characteristics and residue transfer and accumulation under controlled conditions on turf. The effectiveness of latex rubber gloves as sampling dosimeters was evaluated by measuring the transferable pesticide residues as a function of time. The validation of latex rubber gloves as a residue sampling dosimeter was performed by comparing pesticide transfer and dissipation from the gloves with the turf transferable residues sampled using the validated California (CA) Roller, a standard measure of residue transfer. The observed correlation (Pearson's correlation coefficient R²) between the two methods was 0.84 for malathion and 0.96 for fenpropathrin, indicating that the BCTU is a useful, reliable surrogate tool for studying available residue transfer to latex rubber gloves under experimental conditions. Perhaps more importantly, these data demonstrate that latex gloves worn by workers may be useful quantifiable matrices for measuring pesticide exposure.

  15. Northern Marshall Islands radiological survey: sampling and analysis summary

    Energy Technology Data Exchange (ETDEWEB)

    Robison, W.L.; Conrado, C.L.; Eagle, R.J.; Stuart, M.L.

    1981-07-23

    A radiological survey was conducted in the Northern Marshall Islands to document remaining external gamma exposures from nuclear tests conducted at Enewetak and Bikini Atolls. An additional program was later included to obtain terrestrial and marine samples for radiological dose assessment for current or potential atoll inhabitants. This report is the first of a series summarizing the results from the terrestrial and marine surveys. The sample collection and processing procedures and the general survey methodology are discussed; a summary of the collected samples and radionuclide analyses is presented. Over 5400 samples were collected from the 12 atolls and 2 islands and prepared for analysis, including 3093 soil, 961 vegetation, 153 animal, 965 fish composite samples (average of 30 fish per sample), 101 clam, 50 lagoon water, 15 cistern water, 17 groundwater, and 85 lagoon sediment samples. A complete breakdown by sample type, atoll, and island is given here. The total number of analyses by radionuclide are 8840 for 241Am, 6569 for 137Cs, 4535 for 239+240Pu, 4431 for 90Sr, 1146 for 238Pu, 269 for 241Pu, and 114 each for 239Pu and 240Pu. A complete breakdown by sample category, atoll or island, and radionuclide is also included.

  16. Northern Marshall Islands radiological survey: sampling and analysis summary

    International Nuclear Information System (INIS)

    Robison, W.L.; Conrado, C.L.; Eagle, R.J.; Stuart, M.L.

    1981-01-01

    A radiological survey was conducted in the Northern Marshall Islands to document remaining external gamma exposures from nuclear tests conducted at Enewetak and Bikini Atolls. An additional program was later included to obtain terrestrial and marine samples for radiological dose assessment for current or potential atoll inhabitants. This report is the first of a series summarizing the results from the terrestrial and marine surveys. The sample collection and processing procedures and the general survey methodology are discussed; a summary of the collected samples and radionuclide analyses is presented. Over 5400 samples were collected from the 12 atolls and 2 islands and prepared for analysis, including 3093 soil, 961 vegetation, 153 animal, 965 fish composite samples (average of 30 fish per sample), 101 clam, 50 lagoon water, 15 cistern water, 17 groundwater, and 85 lagoon sediment samples. A complete breakdown by sample type, atoll, and island is given here. The total number of analyses by radionuclide are 8840 for 241Am, 6569 for 137Cs, 4535 for 239+240Pu, 4431 for 90Sr, 1146 for 238Pu, 269 for 241Pu, and 114 each for 239Pu and 240Pu. A complete breakdown by sample category, atoll or island, and radionuclide is also included

  17. Large Sample Neutron Activation Analysis of Heterogeneous Samples

    International Nuclear Information System (INIS)

    Stamatelatos, I.E.; Vasilopoulou, T.; Tzika, F.

    2018-01-01

    A Large Sample Neutron Activation Analysis (LSNAA) technique was developed for non-destructive analysis of heterogeneous bulk samples. The technique incorporated collimated scanning, combining experimental measurements and Monte Carlo simulations for the identification of inhomogeneities in large volume samples and the correction of their effect on the interpretation of gamma-spectrometry data. Corrections were applied for the effects of neutron self-shielding, gamma-ray attenuation, the geometrical factor, and heterogeneous activity distribution within the sample. A benchmark experiment was performed to investigate the effect of heterogeneity on the accuracy of LSNAA. Moreover, a ceramic vase was analyzed as a whole, demonstrating the feasibility of the technique. The LSNAA results were compared against results obtained by INAA and a satisfactory agreement between the two methods was observed. This study showed that LSNAA is a technique capable of performing accurate, non-destructive, multi-elemental compositional analysis of heterogeneous objects. It also revealed the great potential of the technique for the analysis of precious objects and artefacts that need to be preserved intact and cannot be damaged for sampling purposes. (author)

  18. Research Note Pilot survey to assess sample size for herbaceous ...

    African Journals Online (AJOL)

    A pilot survey to determine sub-sample size (number of point observations per plot) for herbaceous species composition assessments, using a wheel-point apparatus applying the nearest-plant method, was conducted. Three plots differing in species composition on the Zululand coastal plain were selected, and on each plot ...

  19. A Note on Information-Directed Sampling and Thompson Sampling

    OpenAIRE

    Zhou, Li

    2015-01-01

    This note introduces three Bayesian-style multi-armed bandit algorithms: Information-Directed Sampling, Thompson Sampling, and Generalized Thompson Sampling. The goal is to give an intuitive explanation of these three algorithms and their regret bounds, and to provide some derivations that are omitted in the original papers.
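Of the three algorithms, Thompson Sampling is the simplest to sketch. A minimal Bernoulli-bandit version with Beta(1,1) priors (the standard textbook setup, not code from the note):

```python
import random

def thompson_bernoulli(true_probs, horizon=2000, seed=3):
    """Thompson Sampling for a Bernoulli bandit.

    Each arm keeps a Beta(alpha, beta) posterior over its success rate.
    Every round: sample one rate from each posterior, play the arm with
    the largest sample, observe a Bernoulli reward, and update that
    arm's posterior. Returns the pull counts per arm."""
    rng = random.Random(seed)
    k = len(true_probs)
    alpha, beta, pulls = [1] * k, [1] * k, [0] * k
    for _ in range(horizon):
        draws = [rng.betavariate(alpha[i], beta[i]) for i in range(k)]
        arm = draws.index(max(draws))
        reward = rng.random() < true_probs[arm]   # Bernoulli reward
        alpha[arm] += reward
        beta[arm] += 1 - reward
        pulls[arm] += 1
    return pulls
```

Because posterior sampling naturally balances exploration and exploitation, the pull counts concentrate on the best arm as the posteriors sharpen, which is the mechanism behind the regret bounds the note discusses.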

  20. How Sample Size Affects a Sampling Distribution

    Science.gov (United States)

    Mulekar, Madhuri S.; Siegel, Murray H.

    2009-01-01

    If students are to understand inferential statistics successfully, they must have a profound understanding of the nature of the sampling distribution. Specifically, they must comprehend the determination of the expected value and standard error of a sampling distribution as well as the meaning of the central limit theorem. Many students in a high…
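The key property, that the standard error of the sample mean shrinks as sigma/sqrt(n), is easy to demonstrate by simulation; this classroom-style sketch (not from the article) uses uniform(0, 1) data, for which sigma = sqrt(1/12) ≈ 0.2887:

```python
import random
import statistics

def sampling_distribution_sd(n, draws=4000, seed=11):
    """Empirical standard error of the sample mean.

    Simulate many samples of size n from uniform(0, 1) and take the
    standard deviation of their means; theory predicts sigma/sqrt(n)."""
    rng = random.Random(seed)
    means = [statistics.fmean(rng.random() for _ in range(n))
             for _ in range(draws)]
    return statistics.stdev(means)
```

Quadrupling the sample size from 25 to 100 halves the empirical standard error, making the 1/sqrt(n) behavior, and by extension the central limit theorem's scaling, concrete for students.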

  1. Fluctuation of zonulin levels in blood vs stability of antibodies.

    Science.gov (United States)

    Vojdani, Aristo; Vojdani, Elroy; Kharrazian, Datis

    2017-08-21

    To evaluate the measurement of zonulin levels and of antibodies against zonulin and other tight junction proteins in the blood of controls and celiac disease patients. This study was conducted to assess the variability or stability of zonulin levels vs IgA and IgG antibodies against zonulin in blood samples from 18 controls at 0, 6, 24 and 30 h after blood draw. We also measured zonulin levels as well as zonulin, occludin, vinculin, aquaporin 4 and glial fibrillary acidic protein antibodies in the sera of 30 patients with celiac disease and 30 controls using enzyme-linked immunosorbent assay methodology. The serum zonulin level in 6 out of 18 subjects was low or undetectable; the remaining subjects had zonulin levels of >2.8 ng/mL and showed significant fluctuation from sample to sample. Comparatively, zonulin antibody measured in all samples was highly stable and reproducible from sample to sample. Celiac disease patients showed zonulin levels with a mean of 8.5 ng/mL compared to 3.7 ng/mL in controls, a statistically significant difference. An elevated zonulin level at 2 SD above the mean was demonstrated in 37% of celiac disease patients, while antibodies against zonulin, occludin and other tight junction proteins were detected in up to 86% of patients with celiac disease. Due to its fluctuation, a single measurement of zonulin level is not recommended for assessment of intestinal barrier integrity. Measurement of IgG and IgA antibodies against zonulin, occludin, and other tight junction proteins is proposed instead for evaluating the loss of intestinal barrier integrity.

  2. Sampling in practice

    DEFF Research Database (Denmark)

    Esbensen, Kim Harry; Petersen, Lars

    2005-01-01

    A basic knowledge of the Theory of Sampling (TOS) and a set of only eight sampling unit operations is all the practical sampler needs to ensure representativeness of samples extracted from all kinds of lots: production batches, truckloads, barrels, sub-division in the laboratory, sampling in nature and in the field (environmental sampling, forestry, geology, biology), from raw materials or manufacturing processes, etc. Here we can only give a brief introduction to the Fundamental Sampling Principle (FSP) and these eight Sampling Unit Operations (SUOs). Always respecting FSP and invoking only the necessary SUOs (dependent on the practical situation) is the only prerequisite needed for eliminating all sampling bias and simultaneously minimizing sampling variance, and it is in addition a sure guarantee for making the final analytical results trustworthy. No reliable conclusions can be made unless...

  3. Simulated tempering distributed replica sampling: A practical guide to enhanced conformational sampling

    Energy Technology Data Exchange (ETDEWEB)

    Rauscher, Sarah; Pomes, Regis, E-mail: pomes@sickkids.ca

    2010-11-01

    Simulated tempering distributed replica sampling (STDR) is a generalized-ensemble method designed specifically for simulations of large molecular systems on shared and heterogeneous computing platforms [Rauscher, Neale and Pomes (2009) J. Chem. Theor. Comput. 5, 2640]. The STDR algorithm consists of an alternation of two steps: (1) a short molecular dynamics (MD) simulation; and (2) a stochastic temperature jump. Repeating these steps thousands of times results in a random walk in temperature, which allows the system to overcome energetic barriers, thereby enhancing conformational sampling. The aim of the present paper is to provide a practical guide to applying STDR to complex biomolecular systems. We discuss the details of our STDR implementation, which is a highly parallel algorithm designed to maximize computational efficiency while simultaneously minimizing network communication and data storage requirements. Using a 35-residue disordered peptide in explicit water as a test system, we characterize the efficiency of the STDR algorithm with respect to both diffusion in temperature space and statistical convergence of structural properties. Importantly, we show that STDR provides a dramatic enhancement of conformational sampling compared to a canonical MD simulation.
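The MD/temperature-jump alternation described above can be illustrated with a minimal sketch of the stochastic jump step. This is a generic simulated-tempering update, not the exact STDR implementation; the function name, the temperature-ladder representation, and the assumption that the per-rung weights are already known are all illustrative.

```python
import math
import random

def attempt_temperature_jump(energy, i, betas, weights, rng=random):
    """One stochastic temperature-jump step of simulated tempering.

    energy  : current potential energy of the configuration
    i       : index of the current rung in the temperature ladder
    betas   : inverse temperatures (1/kT), one per rung
    weights : per-rung weight factors (free-energy estimates),
              assumed already known here for simplicity
    Returns the (possibly unchanged) rung index.
    """
    j = i + rng.choice((-1, 1))           # propose a neighboring rung
    if not 0 <= j < len(betas):
        return i                          # proposal fell off the ladder
    # Metropolis acceptance for simulated tempering:
    #   min(1, exp(-(beta_j - beta_i) * E + (w_j - w_i)))
    log_acc = -(betas[j] - betas[i]) * energy + (weights[j] - weights[i])
    if log_acc >= 0 or rng.random() < math.exp(log_acc):
        return j
    return i
```

Alternating a short MD segment with this jump yields the random walk in temperature that the abstract describes.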

  4. Calculating Confidence, Uncertainty, and Numbers of Samples When Using Statistical Sampling Approaches to Characterize and Clear Contaminated Areas

    Energy Technology Data Exchange (ETDEWEB)

    Piepel, Gregory F.; Matzke, Brett D.; Sego, Landon H.; Amidan, Brett G.

    2013-04-27

    This report discusses the methodology, formulas, and inputs needed to make characterization and clearance decisions for Bacillus anthracis-contaminated and uncontaminated (or decontaminated) areas using a statistical sampling approach. Specifically, the report includes the methods and formulas for calculating (1) the number of samples required to achieve a specified confidence in characterization and clearance decisions, and (2) the confidence in making characterization and clearance decisions for a specified number of samples, for two common statistically based environmental sampling approaches. In particular, the report addresses an issue raised by the Government Accountability Office by providing methods and formulas to calculate the confidence that a decision area is uncontaminated (or successfully decontaminated) if all samples collected according to a statistical sampling approach have negative results. Key to addressing this topic is the probability that an individual sample result is a false negative, commonly referred to as the false negative rate (FNR). The two statistical sampling approaches discussed in this report are 1) hotspot sampling to detect small isolated contaminated locations during the characterization phase, and 2) combined judgment and random (CJR) sampling during the clearance phase. Typically, if contamination is widely distributed in a decision area, it will be detectable via judgment sampling during the characterization phase. Hotspot sampling is appropriate for characterization situations where contamination is not widely distributed and may not be detected by judgment sampling. CJR sampling is appropriate during the clearance phase when it is desired to augment judgment samples with statistical (random) samples. The hotspot and CJR statistical sampling approaches are discussed in the report for four situations: 1. qualitative data (detect and non-detect) when the FNR = 0 or when using statistical sampling methods that account
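In the idealized FNR = 0 case, the relationship between confidence, number of all-negative samples, and the smallest detectable contaminated fraction has a well-known closed form, (1 − p)^n ≤ 1 − C. The sketch below uses that textbook formula; the function names are illustrative, and the report's actual formulas may include corrections (e.g. for nonzero FNR) beyond this.

```python
import math

def samples_for_clearance(confidence, hotspot_fraction):
    """Number of random samples, all negative with FNR = 0, needed to be
    `confidence` sure that less than `hotspot_fraction` of the decision
    area is contaminated.  Solves (1 - p)**n <= 1 - C for n.
    """
    return math.ceil(math.log(1.0 - confidence)
                     / math.log(1.0 - hotspot_fraction))

def clearance_confidence(n, hotspot_fraction):
    """Confidence achieved by n all-negative random samples."""
    return 1.0 - (1.0 - hotspot_fraction) ** n

# e.g. 95% confidence that less than 1% of the area is contaminated:
n = samples_for_clearance(0.95, 0.01)   # 299 samples
```

The classic "59 samples for 95% confidence at 5% prevalence" rule drops out of the same formula.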

  5. Hanford Site Environmental Surveillance Master Sampling Schedule for Calendar Year 2009

    Energy Technology Data Exchange (ETDEWEB)

    Bisping, Lynn E.

    2009-01-20

    Environmental surveillance of the Hanford Site and surrounding areas is conducted by the Pacific Northwest National Laboratory for the U.S. Department of Energy. Sampling is conducted to evaluate levels of radioactive and nonradioactive pollutants in the Hanford environs, as required in DOE Order 450.1 and DOE Order 5400.5. This document contains the calendar year 2009 schedule for the routine collection of samples for the Surface Environmental Surveillance Project and Drinking Water Monitoring Project. Each section includes sampling locations, sampling frequencies, sample types, and analyses to be performed. In some cases, samples are scheduled on a rotating basis. If a sample will not be collected in 2009, the anticipated year for collection is provided. Maps showing approximate sampling locations are included for media scheduled for collection in 2009.

  6. Sampling and Analysis Plan for the 221-U Facility

    International Nuclear Information System (INIS)

    Rugg, J.E.

    1998-02-01

    This sampling and analysis plan (SAP) presents the rationale and strategy for the sampling and analysis activities proposed to support the evaluation of alternatives for the final disposition of the 221-U Facility. The SAP describes general sample locations and the minimum number of samples required, and identifies the specific contaminants of potential concern (COPCs) and the required analyses. This SAP does not define the exact sample locations and equipment to be used in the field, owing to the unknowns associated with the 221-U Facility.

  7. Results of Macroinvertebrate Sampling Conducted at 33 SRS Stream Locations, July--August 1993

    Energy Technology Data Exchange (ETDEWEB)

    Specht, W.L.

    1994-12-01

    In order to assess the health of the macroinvertebrate communities of SRS streams, the macroinvertebrate communities at 30 stream locations on SRS were sampled during the summer of 1993, using Hester-Dendy multiplate samplers. In addition, three off-site locations in the Upper Three Runs drainage were sampled in order to assess the potential for impact from off-site activities. In interpreting the data, it is important to recognize that these data were from a single set of collections. Macroinvertebrate communities often undergo considerable temporal variation, and are also greatly influenced by such factors as water depth, water velocity, and available habitat. These stations were selected with the intent of developing an on-going sampling program at a smaller number of stations, with the selection of the stations to be based largely upon the results of this preliminary sampling program. When stations within a given stream showed similar results, fewer stations would be sampled in the future. Similarly, if a stream appeared to be perturbed, additional stations or chemical analyses might be added so that the source of the perturbation could be identified. In general, unperturbed streams will contain more taxa than perturbed streams, and the distribution of taxa among orders or families will differ. Some groups of macroinvertebrates, such as Ephemeroptera (mayflies), Plecoptera (stoneflies) and Trichoptera (caddisflies), which are collectively called EPT taxa, are considered to be relatively sensitive to most kinds of stream perturbation; therefore a reduced number of EPT taxa generally indicates that the stream has been subject to chemical or physical stressors. In coastal plain streams, EPT taxa are generally less dominant than in streams with rocky substrates, while Chironomidae (midges) are more abundant. (Abstract Truncated)

  8. Laser sampling

    International Nuclear Information System (INIS)

    Gorbatenko, A A; Revina, E I

    2015-01-01

    The review is devoted to the major advances in laser sampling. The advantages and drawbacks of the technique are considered. Specific features of combinations of laser sampling with various instrumental analytical methods, primarily inductively coupled plasma mass spectrometry, are discussed. Examples of practical implementation of hybrid methods involving laser sampling as well as corresponding analytical characteristics are presented. The bibliography includes 78 references

  9. Hanford Sampling Quality Management Plan (HSQMP)

    International Nuclear Information System (INIS)

    Hyatt, J.E.

    1995-01-01

    This document provides a management tool for evaluating and designing the appropriate elements of a field sampling program. It discusses the elements of such a program and is to be used as a guidance document during the preparation of project- and/or function-specific documentation. This document does not specify how a sampling program shall be organized. The HSQMP is to be used as a companion document to the Hanford Analytical Services Quality Assurance Plan (HASQAP), DOE/RL-94-55. The generation of this document was enhanced by conducting baseline evaluations of current sampling organizations. Valuable input was received from members of field and Quality Assurance organizations. The HSQMP is expected to be a living document. Revisions will be made as regulations and/or Hanford Site conditions warrant changes in the best management practices. Appendices include a summary of the sampling and analysis work flow process, a user's guide to the Data Quality Objective process, and a self-assessment checklist.

  10. Sample Transport for a European Sample Curation Facility

    Science.gov (United States)

    Berthoud, L.; Vrublevskis, J. B.; Bennett, A.; Pottage, T.; Bridges, J. C.; Holt, J. M. C.; Dirri, F.; Longobardo, A.; Palomba, E.; Russell, S.; Smith, C.

    2018-04-01

    This work has looked at the recovery of a Mars Sample Return capsule once it arrives on Earth. It covers possible landing sites, planetary protection requirements, and transportation from the landing site to a European Sample Curation Facility.

  11. Bespoke Bias for Obtaining Free Energy Differences within Variationally Enhanced Sampling.

    Science.gov (United States)

    McCarty, James; Valsson, Omar; Parrinello, Michele

    2016-05-10

    Obtaining efficient sampling of multiple metastable states through molecular dynamics and hence determining free energy differences is central for understanding many important phenomena. Here we present a new biasing strategy, which employs the recent variationally enhanced sampling approach (Valsson and Parrinello Phys. Rev. Lett. 2014, 113, 090601). The bias is constructed from an intuitive model of the local free energy surface describing fluctuations around metastable minima and depends on only a few parameters which are determined variationally such that efficient sampling between states is obtained. The bias constructed in this manner largely reduces the need of finding a set of collective variables that completely spans the conformational space of interest, as they only need to be a locally valid descriptor of the system about its local minimum. We introduce the method and demonstrate its power on two representative examples.

  12. Using lot quality-assurance sampling and area sampling to identify priority areas for trachoma control: Viet Nam.

    Science.gov (United States)

    Myatt, Mark; Mai, Nguyen Phuong; Quynh, Nguyen Quang; Nga, Nguyen Huy; Tai, Ha Huy; Long, Nguyen Hung; Minh, Tran Hung; Limburg, Hans

    2005-10-01

    To report on the use of lot quality-assurance sampling (LQAS) surveys undertaken within an area-sampling framework to identify priority areas for intervention with trachoma control activities in Viet Nam. The LQAS survey method for the rapid assessment of the prevalence of active trachoma was adapted for use in Viet Nam with the aim of classifying individual communes by the prevalence of active trachoma among children in primary school. School-based sampling was used; school sites to be sampled were selected using an area-sampling approach. A total of 719 communes in 41 districts in 18 provinces were surveyed. Survey staff found the LQAS survey method both simple and rapid to use after initial problems with area-sampling methods were identified and remedied. The method yielded a finer spatial resolution of prevalence than had been previously achieved in Viet Nam using semiquantitative rapid assessment surveys and multistage cluster-sampled surveys. When used with area-sampling techniques, the LQAS survey method has the potential to form the basis of survey instruments that can be used to efficiently target resources for interventions against active trachoma. With additional work, such methods could provide a generally applicable tool for effective programme planning and for the certification of the elimination of trachoma as a blinding disease.
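The commune-level classification at the heart of an LQAS survey reduces to a decision rule on the number of positive children in a fixed-size sample. A minimal sketch under binomial sampling follows; the function names, the 5% misclassification level, and the example prevalence thresholds are illustrative assumptions, not the thresholds used in the Viet Nam survey.

```python
from math import comb

def binom_tail(n, d, p):
    """P(X >= d) for X ~ Binomial(n, p)."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(d, n + 1))

def decision_threshold(n, p_low, alpha=0.05):
    """Smallest count d such that a commune whose true prevalence is only
    p_low would be misclassified as high prevalence with probability <= alpha."""
    for d in range(n + 1):
        if binom_tail(n, d, p_low) <= alpha:
            return d
    return n + 1

def lqas_classify(positives, threshold):
    """Classify one commune from its LQAS sample."""
    return "high prevalence" if positives >= threshold else "low prevalence"
```

For example, with `n = 50` children per commune and a low-prevalence benchmark of 5%, `decision_threshold(50, 0.05)` gives the smallest positive count that triggers the high-prevalence classification.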

  13. Environmental surveillance master sampling schedule

    International Nuclear Information System (INIS)

    Bisping, L.E.

    1997-01-01

    Environmental surveillance of the Hanford Site and surrounding areas is conducted by the Pacific Northwest National Laboratory (PNNL)(a) for the US Department of Energy (DOE). This document contains the planned 1997 schedules for routine collection of samples for the Surface Environmental Surveillance Project (SESP) and Drinking Water Monitoring Project. In addition, Section 3.0, Biota, also reflects a rotating collection schedule identifying the year a specific sample is scheduled for collection. The purpose of these monitoring projects is to evaluate levels of radioactive and nonradioactive pollutants in the Hanford environs, as required in DOE Order 5400.1, General Environmental Protection Program, and DOE Order 5400.5, Radiation Protection of the Public and the Environment. The sampling methods will be the same as those described in the Environmental Monitoring Plan, US Department of Energy, Richland Operations Office, DOE/RL91-50, Rev. 1, US Department of Energy, Richland, Washington

  14. Characterization of Volatiles Loss from Soil Samples at Lunar Environments

    Science.gov (United States)

    Kleinhenz, Julie; Smith, Jim; Roush, Ted; Colaprete, Anthony; Zacny, Kris; Paulsen, Gale; Wang, Alex; Paz, Aaron

    2017-01-01

    Resource Prospector (RP) Integrated Thermal Vacuum Test Program: a series of ground-based dirty thermal vacuum tests is being conducted to better understand subsurface sampling operations for RP, including volatiles loss during sampling operations, hardware performance, sample removal and transfer, concept of operations, and instrumentation. Five test campaigns over 5 years have been conducted with RP hardware, with advancing hardware designs and additional RP subsystems; volatiles sampling has been studied for 4 years. Using flight-forward regolith sampling hardware, the tests empirically determine volatile retention at lunar-relevant conditions, use the data to improve theoretical predictions, determine the driving variables for retention, and bound the water loss potential to define measurement uncertainties. The main goal of this talk is to introduce our approach to characterizing volatiles loss for RP: introduce the facility and its capabilities, give an overview of the RP hardware used in integrated testing (most recent iteration), summarize the test variables used thus far, and review a sample of the results.

  15. Optimizing sampling approaches along ecological gradients

    DEFF Research Database (Denmark)

    Schweiger, Andreas; Irl, Severin D. H.; Steinbauer, Manuel

    2016-01-01

    1. Natural scientists and especially ecologists use manipulative experiments or field observations along gradients to differentiate patterns driven by processes from those caused by random noise. A well-conceived sampling design is essential for identifying, analysing and reporting underlying patterns in a statistically solid and reproducible manner, given the normal restrictions in labour, time and money. However, a technical guideline about an adequate sampling design to maximize prediction success under restricted resources is lacking. This study aims at developing such a solid and reproducible guideline for sampling along gradients in all fields of ecology and science in general. 2. We conducted simulations with artificial data for five common response types known in ecology, each represented by a simple function (no response, linear, exponential, symmetric unimodal and asymmetric…

  16. Perilymph sampling from the cochlear apex: a reliable method to obtain higher purity perilymph samples from scala tympani.

    Science.gov (United States)

    Salt, Alec N; Hale, Shane A; Plonkte, Stefan K R

    2006-05-15

    Measurements of drug levels in the fluids of the inner ear are required to establish kinetic parameters and to determine the influence of specific local delivery protocols. For most substances, this requires cochlear fluid samples to be obtained for analysis. When auditory function is of primary interest, the drug level in the perilymph of scala tympani (ST) is most relevant, since drug in this scala has ready access to the auditory sensory cells. In many prior studies, ST perilymph samples have been obtained from the basal turn, either by aspiration through the round window membrane (RWM) or through an opening in the bony wall. A number of studies have demonstrated that such samples are likely to be contaminated with cerebrospinal fluid (CSF). CSF enters the basal turn of ST through the cochlear aqueduct when the bony capsule is perforated or when fluid is aspirated. The degree of sample contamination has, however, not been widely appreciated. Recent studies have shown that perilymph samples taken through the round window membrane are highly contaminated with CSF, with samples greater than 2 µL in volume containing more CSF than perilymph. In spite of this knowledge, many groups continue to sample from the base of the cochlea, as it is a well-established method. We have developed an alternative, technically simple method to increase the proportion of ST perilymph in a fluid sample. The sample is taken from the apex of the cochlea, a site that is distant from the cochlear aqueduct. A previous problem with sampling through a perforation in the bone was that the native perilymph, driven by CSF pressure, rapidly leaked out and was lost to the middle ear space. We therefore developed a procedure to collect all the fluid that emerged from the apex after perforation. We evaluated the method using the marker ion trimethylphenylammonium (TMPA). TMPA was applied to the perilymph of guinea pigs either by RW irrigation or by microinjection into the apical turn.

  17. Low conductive support for thermal insulation of a sample holder of a variable temperature scanning tunneling microscope.

    Science.gov (United States)

    Hanzelka, Pavel; Vonka, Jakub; Musilova, Vera

    2013-08-01

    We have designed a supporting system to fix a sample holder of a scanning tunneling microscope in a UHV chamber at room temperature. The microscope will operate down to a temperature of 20 K. Low thermal conductance, high mechanical stiffness, and small dimensions are the main features of the supporting system. Three sets of four glass balls placed at the vertices of a tetrahedron are used for thermal insulation, which relies on the small contact areas between the glass balls. We have analyzed the thermal conductivity of the contacts between the balls mutually and between a ball and a metallic plate, and applied the results to the entire support. The calculation, based on a simple model of the setup, has been verified with experimental measurements. In comparison with other feasible supporting structures, the designed support has the lowest thermal conductance.

  18. Descriptions of sampling practices within five approaches to qualitative research in education and the health sciences

    OpenAIRE

    Guetterman, Timothy C.

    2015-01-01

    Although recommendations exist for determining qualitative sample sizes, the literature appears to contain few instances of research on the topic. Practical guidance is needed for determining sample sizes to conduct rigorous qualitative research, to develop proposals, and to budget resources. The purpose of this article is to describe qualitative sample size and sampling practices within published studies in education and the health sciences by research design: case study, ethnography, ground...

  19. HICOSMO - cosmology with a complete sample of galaxy clusters - I. Data analysis, sample selection and luminosity-mass scaling relation

    Science.gov (United States)

    Schellenberger, G.; Reiprich, T. H.

    2017-08-01

    The X-ray regime, where the most massive visible component of galaxy clusters, the intracluster medium, is visible, offers directly measured quantities, like the luminosity, and derived quantities, like the total mass, to characterize these objects. The aim of this project is to analyse a complete sample of galaxy clusters in detail and constrain cosmological parameters, like the matter density, Ωm, or the amplitude of initial density fluctuations, σ8. The purely X-ray flux-limited sample (HIFLUGCS) consists of the 64 X-ray brightest galaxy clusters, which are excellent targets to study the systematic effects, that can bias results. We analysed in total 196 Chandra observations of the 64 HIFLUGCS clusters, with a total exposure time of 7.7 Ms. Here, we present our data analysis procedure (including an automated substructure detection and an energy band optimization for surface brightness profile analysis) that gives individually determined, robust total mass estimates. These masses are tested against dynamical and Planck Sunyaev-Zeldovich (SZ) derived masses of the same clusters, where good overall agreement is found with the dynamical masses. The Planck SZ masses seem to show a mass-dependent bias to our hydrostatic masses; possible biases in this mass-mass comparison are discussed including the Planck selection function. Furthermore, we show the results for the (0.1-2.4) keV luminosity versus mass scaling relation. The overall slope of the sample (1.34) is in agreement with expectations and values from literature. Splitting the sample into galaxy groups and clusters reveals, even after a selection bias correction, that galaxy groups exhibit a significantly steeper slope (1.88) compared to clusters (1.06).

  20. A simple vibrating sample magnetometer for macroscopic samples

    Science.gov (United States)

    Lopez-Dominguez, V.; Quesada, A.; Guzmán-Mínguez, J. C.; Moreno, L.; Lere, M.; Spottorno, J.; Giacomone, F.; Fernández, J. F.; Hernando, A.; García, M. A.

    2018-03-01

    We present here a simple model of a vibrating sample magnetometer (VSM). The system allows recording magnetization curves at room temperature with a resolution of the order of 0.01 emu and is appropriate for macroscopic samples. The setup can be mounted in different configurations depending on the requirements of the sample to be measured (mass, saturation magnetization, saturation field, etc.). We also include examples of curves obtained with our setup and compare them with curves measured with a standard commercial VSM, which confirms the reliability of our device.

  1. Fluidic sampling

    International Nuclear Information System (INIS)

    Houck, E.D.

    1992-01-01

    This paper covers the development of the fluidic sampler and its testing in a fluidic transfer system. The major findings are as follows. Fluidic jet samplers can dependably produce unbiased samples of acceptable volume. The fluidic transfer system with a fluidic sampler in-line will transfer water to a net lift of 37.2--39.9 feet at an average rate of 0.02--0.05 gpm (77--192 cc/min). The fluidic sample system circulation rate compares very favorably with the normal 0.016--0.026 gpm (60--100 cc/min) circulation rate that is commonly produced for this lift and solution with the jet-assisted airlift sample system normally used at ICPP. The volume of the sample taken with a fluidic sampler depends on the motive pressure to the fluidic sampler, the sample bottle size, and the fluidic sampler jet characteristics. The fluidic sampler should be supplied with fluid at a motive pressure of 140--150 percent of the peak vacuum-producing motive pressure for the jet in the sampler. Fluidic transfer systems should be operated by emptying a full pumping chamber to nearly empty or empty during the pumping cycle; this maximizes the solution transfer rate.

  2. Feasibility of self-sampled dried blood spot and saliva samples sent by mail in a population-based study.

    Science.gov (United States)

    Sakhi, Amrit Kaur; Bastani, Nasser Ezzatkhah; Ellingjord-Dale, Merete; Gundersen, Thomas Erik; Blomhoff, Rune; Ursin, Giske

    2015-04-11

    In large epidemiological studies it is often challenging to obtain biological samples. Self-sampling by study participants using dried blood spots (DBS) technique has been suggested to overcome this challenge. DBS is a type of biosampling where blood samples are obtained by a finger-prick lancet, blotted and dried on filter paper. However, the feasibility and efficacy of collecting DBS samples from study participants in large-scale epidemiological studies is not known. The aim of the present study was to test the feasibility and response rate of collecting self-sampled DBS and saliva samples in a population-based study of women above 50 years of age. We determined response proportions, number of phone calls to the study center with questions about sampling, and quality of the DBS. We recruited women through a study conducted within the Norwegian Breast Cancer Screening Program. Invitations, instructions and materials were sent to 4,597 women. The data collection took place over a 3 month period in the spring of 2009. Response proportions for the collection of DBS and saliva samples were 71.0% (3,263) and 70.9% (3,258), respectively. We received 312 phone calls (7% of the 4,597 women) with questions regarding sampling. Of the 3,263 individuals that returned DBS cards, 3,038 (93.1%) had been packaged and shipped according to instructions. A total of 3,032 DBS samples were sufficient for at least one biomarker analysis (i.e. 92.9% of DBS samples received by the laboratory). 2,418 (74.1%) of the DBS cards received by the laboratory were filled with blood according to the instructions (i.e. 10 completely filled spots with up to 7 punches per spot for up to 70 separate analyses). To assess the quality of the samples, we selected and measured two biomarkers (carotenoids and vitamin D). The biomarker levels were consistent with previous reports. Collecting self-sampled DBS and saliva samples through the postal services provides a low cost, effective and feasible

  3. Sampling or gambling

    Energy Technology Data Exchange (ETDEWEB)

    Gy, P.M.

    1981-12-01

    Sampling can be compared to no other technique. A mechanical sampler must above all be selected according to its aptitude for suppressing or reducing all components of the sampling error. Sampling is said to be correct when it gives all elements making up the batch of matter submitted to sampling a uniform probability of being selected. A sampler must be correctly designed, built, installed, operated and maintained. When the conditions of sampling correctness are not strictly respected, the sampling error can no longer be controlled and can, unknown to the user, be unacceptably large: the sample is no longer representative. The implementation of an incorrect sampler is a form of gambling, and this paper intends to show that at this game the user is nearly always the loser in the long run. The users' and the manufacturers' interests may diverge, and the standards which should safeguard the users' interests very often fail to do so by tolerating or even recommending incorrect techniques, such as the implementation of too-narrow cutters traveling too fast through the stream to be sampled.

  4. Choice of Sample Split in Out-of-Sample Forecast Evaluation

    DEFF Research Database (Denmark)

    Hansen, Peter Reinhard; Timmermann, Allan

    Out-of-sample tests of forecast performance depend on how a given data set is split into estimation and evaluation periods, yet no guidance exists on how to choose the split point. Empirical forecast evaluation results can therefore be difficult to interpret, particularly when several values… while conversely the power of forecast evaluation tests is strongest with long out-of-sample periods. To deal with size distortions, we propose a test statistic that is robust to the effect of considering multiple sample split points. Empirical applications to predictability of stock returns and inflation demonstrate that out-of-sample forecast evaluation results can critically depend on how the sample split is determined.
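The sensitivity of out-of-sample results to the split point can be seen with a toy expanding-window forecast evaluated at several splits. This is purely an illustration of the phenomenon; the mean forecast and the function name are assumptions, not the authors' test statistic.

```python
def oos_mse_by_split(y, splits):
    """Out-of-sample MSE of an expanding-window mean forecast,
    computed for several choices of the estimation/evaluation split.

    y      : the full series of observations
    splits : candidate split points m; observations y[0:m] start the
             estimation window, y[m:] are forecast one step at a time
    Returns {m: MSE over the evaluation period}.
    """
    out = {}
    for m in splits:
        errors = []
        for t in range(m, len(y)):
            forecast = sum(y[:t]) / t          # expanding-window mean
            errors.append((y[t] - forecast) ** 2)
        out[m] = sum(errors) / len(errors)
    return out
```

Running this on the same series with different `splits` typically yields different MSE rankings, which is exactly the interpretation problem the abstract describes.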

  5. Compressive sampling of polynomial chaos expansions: Convergence analysis and sampling strategies

    International Nuclear Information System (INIS)

    Hampton, Jerrad; Doostan, Alireza

    2015-01-01

    Sampling orthogonal polynomial bases via Monte Carlo is of interest for uncertainty quantification of models with random inputs, using Polynomial Chaos (PC) expansions. It is known that bounding a probabilistic parameter, referred to as coherence, yields a bound on the number of samples necessary to identify coefficients in a sparse PC expansion via solution to an ℓ1-minimization problem. Utilizing results for orthogonal polynomials, we bound the coherence parameter for polynomials of Hermite and Legendre type under their respective natural sampling distribution. In both polynomial bases we identify an importance sampling distribution which yields a bound with weaker dependence on the order of the approximation. For more general orthonormal bases, we propose the coherence-optimal sampling: a Markov Chain Monte Carlo sampling, which directly uses the basis functions under consideration to achieve a statistical optimality among all sampling schemes with identical support. We demonstrate these different sampling strategies numerically in both high-order and high-dimensional, manufactured PC expansions. In addition, the quality of each sampling method is compared in the identification of solutions to two differential equations, one with a high-dimensional random input and the other with a high-order PC expansion. In both cases, the coherence-optimal sampling scheme leads to similar or considerably improved accuracy.
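The coherence parameter for a Legendre basis under its natural (uniform) sampling distribution can be probed directly from the three-term recurrence. The sketch below (function names are illustrative; it is a Monte Carlo probe, not the paper's analytical bound) shows why the coherence grows with the polynomial order: the orthonormal Legendre polynomials peak at the interval endpoints, where the sum of squares approaches (order + 1)².

```python
import math
import random

def legendre_orthonormal(x, order):
    """Values psi_0..psi_order of the Legendre polynomials at x,
    orthonormalized w.r.t. the uniform density on [-1, 1]."""
    p = [1.0, x]
    for k in range(1, order):
        # Bonnet recurrence: (k+1) P_{k+1} = (2k+1) x P_k - k P_{k-1}
        p.append(((2 * k + 1) * x * p[k] - k * p[k - 1]) / (k + 1))
    return [math.sqrt(2 * k + 1) * p[k] for k in range(order + 1)]

def empirical_coherence(order, n_samples=10_000, rng=random):
    """Monte Carlo estimate of mu = sup_x sum_k psi_k(x)**2 under
    uniform sampling of x in [-1, 1]."""
    best = 0.0
    for _ in range(n_samples):
        x = rng.uniform(-1.0, 1.0)
        best = max(best, sum(v * v for v in legendre_orthonormal(x, order)))
    return best
```

Since sum_k psi_k(±1)² = sum_k (2k + 1) = (order + 1)², the estimate approaches that ceiling as more samples land near the endpoints, which is the order-dependence that the importance-sampling and coherence-optimal strategies in the abstract are designed to weaken.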

  6. Compressive sampling of polynomial chaos expansions: Convergence analysis and sampling strategies

    Science.gov (United States)

    Hampton, Jerrad; Doostan, Alireza

    2015-01-01

    Sampling orthogonal polynomial bases via Monte Carlo is of interest for uncertainty quantification of models with random inputs, using Polynomial Chaos (PC) expansions. It is known that bounding a probabilistic parameter, referred to as coherence, yields a bound on the number of samples necessary to identify coefficients in a sparse PC expansion via solution to an ℓ1-minimization problem. Utilizing results for orthogonal polynomials, we bound the coherence parameter for polynomials of Hermite and Legendre type under their respective natural sampling distribution. In both polynomial bases we identify an importance sampling distribution which yields a bound with weaker dependence on the order of the approximation. For more general orthonormal bases, we propose the coherence-optimal sampling: a Markov Chain Monte Carlo sampling, which directly uses the basis functions under consideration to achieve a statistical optimality among all sampling schemes with identical support. We demonstrate these different sampling strategies numerically in both high-order and high-dimensional, manufactured PC expansions. In addition, the quality of each sampling method is compared in the identification of solutions to two differential equations, one with a high-dimensional random input and the other with a high-order PC expansion. In both cases, the coherence-optimal sampling scheme leads to similar or considerably improved accuracy.

  7. An introduction to automatic radioactive sample counters

    International Nuclear Information System (INIS)

    1980-01-01

The subject is covered in chapters entitled: the detection of radiation in sample counters; nucleonic equipment; liquid scintillation counting; basic features of automatic sample counters; statistics of counting; data analysis; and purchase, installation, calibration and maintenance of automatic sample counters. (U.K.)

  8. Quantitative NMR measurements on core samples

    International Nuclear Information System (INIS)

    Olsen, Dan

    1997-01-01

Within the frame of an EFP-95 project, NMR methods for porosity determination in 2D, and for fluid saturation determination in 1D and 2D, have been developed. The three methods have been developed and tested on cleaned core samples of chalk from the Danish North Sea. The main restriction on the use of the methods is the inherently short T2 relaxation constants of rock samples. Referring to measurements conducted at 200 MHz, the 2D porosity determination method is applicable to sample material with T2 relaxation constants down to 5 ms. The 1D fluid saturation determination method is applicable to sample material with T2 relaxation constants down to 3 ms, while the 2D fluid saturation determination method is applicable to material with T2 relaxation constants down to 8 ms. In the case of the 2D methods, these constraints at a minimum enable work on the majority of chalk samples of Maastrichtian age. The 1D fluid saturation determination method is in addition applicable to at least some chalk samples of Danian and pre-Maastrichtian age. The spatial resolution of the 2D porosity determination method, the 1D fluid saturation method, and the 2D fluid saturation method is 0.8 mm, 0.8 mm, and 2 mm, respectively. Reproducibility of pixel values is 2%-points for all three methods. (au)

  9. Sampling soils for 137Cs using various field-sampling volumes

    International Nuclear Information System (INIS)

    Nyhan, J.W.; Schofield, T.G.; White, G.C.; Trujillo, G.

    1981-10-01

The sediments from a liquid effluent receiving area at the Los Alamos National Laboratory and soils from an intensive study area in the fallout pathway of Trinity were sampled for 137Cs using 25-, 500-, 2500-, and 12 500-cm3 field sampling volumes. A highly replicated sampling program was used to determine mean concentrations and inventories of 137Cs at each site, as well as estimates of the spatial, aliquoting, and counting variance components of the radionuclide data. The sampling methods were also analyzed as a function of the soil size fractions collected in each field sampling volume and of the total cost of the program for a given variation in the radionuclide survey results. Coefficients of variation (CV) of 137Cs inventory estimates ranged from 0.063 to 0.14 for Mortandad Canyon sediments, while CV values for Trinity soils ranged from 0.38 to 0.57. Spatial variance components of the 137Cs concentration data were usually found to be larger than either the aliquoting or counting variance estimates and were inversely related to field sampling volume at the Trinity intensive site. Subsequent optimization studies of the sampling schemes demonstrated that each aliquot should be counted once, and that only 2 to 4 aliquots out of as many as 30 collected need be assayed for 137Cs. The optimization studies also showed that as sample costs increased to 45 man-hours of labor per sample, the variance of the mean 137Cs concentration decreased dramatically, but decreased very little with additional labor.
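The coefficients of variation reported above are simple ratios of the standard deviation to the mean of replicated inventory estimates; a minimal illustration with made-up replicate values:

```python
import statistics

def coefficient_of_variation(values):
    # CV = sample standard deviation / sample mean
    return statistics.stdev(values) / statistics.mean(values)

# hypothetical replicate inventory estimates from one site (arbitrary units)
replicates = [10.2, 9.8, 11.0, 10.5, 9.5]
cv = coefficient_of_variation(replicates)
```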

  10. Delineating sampling procedures: Pedagogical significance of analysing sampling descriptions and their justifications in TESL experimental research reports

    Directory of Open Access Journals (Sweden)

    Jason Miin-Hwa Lim

    2011-04-01

Full Text Available Teaching second language learners how to write research reports constitutes a crucial component in programmes on English for Specific Purposes (ESP) in institutions of higher learning. One of the rhetorical segments in research reports that merit attention has to do with the descriptions and justifications of sampling procedures. This genre-based study looks into sampling delineations in the Method-related sections of research articles on the teaching of English as a second language (TESL) written by expert writers and published in eight reputed international refereed journals. Using Swales's (1990 & 2004) framework, I conducted a quantitative analysis of the rhetorical steps and a qualitative investigation into the language resources employed in delineating sampling procedures. This investigation has considerable relevance to ESP students and instructors as it has yielded pertinent findings on how samples can be appropriately described to meet the expectations of dissertation examiners, reviewers, and supervisors. The findings of this study have furnished insights into how supervisors and instructors can possibly teach novice writers ways of using specific linguistic mechanisms to lucidly describe and convincingly justify the sampling procedures in the Method sections of experimental research reports.

  11. Soil sampling

    International Nuclear Information System (INIS)

    Fortunati, G.U.; Banfi, C.; Pasturenzi, M.

    1994-01-01

    This study attempts to survey the problems associated with techniques and strategies of soil sampling. Keeping in mind the well defined objectives of a sampling campaign, the aim was to highlight the most important aspect of representativeness of samples as a function of the available resources. Particular emphasis was given to the techniques and particularly to a description of the many types of samplers which are in use. The procedures and techniques employed during the investigations following the Seveso accident are described. (orig.)

  12. Hanford Site Environmental Surveillance Master Sampling Schedule for Calendar Year 2008

    Energy Technology Data Exchange (ETDEWEB)

    Bisping, Lynn E.

    2008-01-01

    Environmental surveillance of the Hanford Site and surrounding areas is conducted by Pacific Northwest National Laboratory for the U.S. Department of Energy. Sampling is conducted to evaluate levels of radioactive and nonradioactive pollutants in the Hanford environs, as required in DOE Order 450.1, "Environmental Protection Program," and DOE Order 5400.5, "Radiation Protection of the Public and the Environment." The environmental surveillance sampling design is described in the "Hanford Site Environmental Monitoring Plan, United States Department of Energy, Richland Operations Office." This document contains the calendar year 2008 schedule for the routine collection of samples for the Surface Environmental Surveillance Project and Drinking Water Monitoring Project. Each section includes sampling locations, sampling frequencies, sample types, and analyses to be performed. In some cases, samples are scheduled on a rotating basis. If a sample will not be collected in 2008, the anticipated year for collection is provided. Maps showing approximate sampling locations are included for media scheduled for collection in 2008.

  13. Evaluation of Sampling Methods for Bacillus Spore ...

    Science.gov (United States)

    Journal Article Following a wide area release of biological materials, mapping the extent of contamination is essential for orderly response and decontamination operations. HVAC filters process large volumes of air and therefore collect highly representative particulate samples in buildings. HVAC filter extraction may have great utility in rapidly estimating the extent of building contamination following a large-scale incident. However, until now, no studies have been conducted comparing the two most appropriate sampling approaches for HVAC filter materials: direct extraction and vacuum-based sampling.

  14. Sampling strategies in antimicrobial resistance monitoring: evaluating how precision and sensitivity vary with the number of animals sampled per farm.

    Directory of Open Access Journals (Sweden)

    Takehisa Yamamoto

    Full Text Available Because antimicrobial resistance in food-producing animals is a major public health concern, many countries have implemented antimicrobial monitoring systems at a national level. When designing a sampling scheme for antimicrobial resistance monitoring, it is necessary to consider both cost effectiveness and statistical plausibility. In this study, we examined how sampling scheme precision and sensitivity can vary with the number of animals sampled from each farm, while keeping the overall sample size constant to avoid additional sampling costs. Five sampling strategies were investigated. These employed 1, 2, 3, 4 or 6 animal samples per farm, with a total of 12 animals sampled in each strategy. A total of 1,500 Escherichia coli isolates from 300 fattening pigs on 30 farms were tested for resistance against 12 antimicrobials. The performance of each sampling strategy was evaluated by bootstrap resampling from the observational data. In the bootstrapping procedure, farms, animals, and isolates were selected randomly with replacement, and a total of 10,000 replications were conducted. For each antimicrobial, we observed that the standard deviation and 2.5-97.5 percentile interval of resistance prevalence were smallest in the sampling strategy that employed 1 animal per farm. The proportion of bootstrap samples that included at least 1 isolate with resistance was also evaluated as an indicator of the sensitivity of the sampling strategy to previously unidentified antimicrobial resistance. The proportion was greatest with 1 sample per farm and decreased with larger samples per farm. We concluded that when the total number of samples is pre-specified, the most precise and sensitive sampling strategy involves collecting 1 sample per farm.
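The bootstrap evaluation described above (resampling farms, then animals, with the total sample size held constant) can be sketched roughly as follows. The data and seed are invented, but the qualitative outcome, that fewer animals per farm spread over more farms gives a smaller standard deviation when prevalence varies between farms, matches the abstract:

```python
import random, statistics

def bootstrap_prevalence_sd(farm_data, animals_per_farm, n_farms, reps=2000, seed=1):
    # farm_data: dict farm -> list of 0/1 resistance results per animal;
    # resample farms, then animals within farms, both with replacement
    rng = random.Random(seed)
    farms = list(farm_data)
    estimates = []
    for _ in range(reps):
        picked = rng.choices(farms, k=n_farms)
        results = []
        for f in picked:
            results += rng.choices(farm_data[f], k=animals_per_farm)
        estimates.append(sum(results) / len(results))
    return statistics.stdev(estimates)

# toy data: 12 farms with between-farm variation in resistance prevalence
rng = random.Random(0)
data = {f: [1 if rng.random() < p else 0 for _ in range(10)]
        for f, p in enumerate(rng.uniform(0.05, 0.6) for _ in range(12))}

# 12 total samples either way: 12 farms x 1 animal vs 3 farms x 4 animals
sd_one = bootstrap_prevalence_sd(data, animals_per_farm=1, n_farms=12)
sd_four = bootstrap_prevalence_sd(data, animals_per_farm=4, n_farms=3)
```

Because the between-farm variance component shrinks with the number of farms visited, not the number of animals per farm, the one-animal-per-farm strategy yields the tighter estimate.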

  15. Mapping the structural organization of the brain in conduct disorder: replication of findings in two independent samples.

    Science.gov (United States)

    Fairchild, Graeme; Toschi, Nicola; Sully, Kate; Sonuga-Barke, Edmund J S; Hagan, Cindy C; Diciotti, Stefano; Goodyer, Ian M; Calder, Andrew J; Passamonti, Luca

    2016-09-01

    Neuroimaging methods that allow researchers to investigate structural covariance between brain regions are increasingly being used to study psychiatric disorders. Structural covariance analyses are particularly well suited for studying disorders with putative neurodevelopmental origins as they appear sensitive to changes in the synchronized maturation of different brain regions. We assessed interregional correlations in cortical thickness as a measure of structural covariance, and applied this method to investigate the coordinated development of different brain regions in conduct disorder (CD). We also assessed whether structural covariance measures could differentiate between the childhood-onset (CO-CD) and adolescence-onset (AO-CD) subtypes of CD, which may differ in terms of etiology and adult outcomes. We examined interregional correlations in cortical thickness in male youths with CO-CD or AO-CD relative to healthy controls (HCs) in two independent datasets. The age range in the Cambridge sample was 16-21 years (mean: 18.0), whereas the age range of the Southampton sample was 13-18 years (mean: 16.7). We used FreeSurfer to perform segmentations and applied structural covariance methods to the resulting parcellations. In both samples, CO-CD participants displayed a strikingly higher number of significant cross-cortical correlations compared to HC or AO-CD participants, whereas AO-CD participants presented fewer significant correlations than HCs. Group differences in the strength of the interregional correlations were observed in both samples, and each set of results remained significant when controlling for IQ and comorbid attention-deficit/hyperactivity disorder symptoms. This study provides new evidence for quantitative differences in structural brain organization between the CO-CD and AO-CD subtypes, and supports the hypothesis that both subtypes of CD have neurodevelopmental origins. © 2016 The Authors. Journal of Child Psychology and Psychiatry
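Structural covariance of the kind analysed above reduces to interregional Pearson correlations of a morphometric measure across subjects; a toy sketch with hypothetical cortical-thickness values (region names and numbers invented):

```python
import math

def pearson(xs, ys):
    # Pearson correlation coefficient between two equal-length series
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    vx = sum((a - mx) ** 2 for a in xs)
    vy = sum((b - my) ** 2 for b in ys)
    return cov / math.sqrt(vx * vy)

# hypothetical cortical thickness (mm), one value per subject, per region
thickness = {
    "dlPFC": [2.9, 3.1, 2.7, 3.0, 2.8, 3.2],
    "ACC":   [3.0, 3.2, 2.8, 3.1, 2.9, 3.3],  # tracks dlPFC closely
    "V1":    [2.1, 1.9, 2.2, 2.0, 2.3, 1.8],  # anti-correlated here
}
regions = list(thickness)
covariance_matrix = {
    (a, b): pearson(thickness[a], thickness[b])
    for a in regions for b in regions
}
```

Group comparisons then count (or compare the strength of) the significant off-diagonal entries of this matrix, which is how the study contrasts CO-CD, AO-CD, and control participants.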

  16. Control Charts for Processes with an Inherent Between-Sample Variation

    Directory of Open Access Journals (Sweden)

    Eva Jarošová

    2018-06-01

Full Text Available A number of processes to which statistical control is applied are subject to various effects that cause random changes in the mean value. The removal of these fluctuations is either technologically impossible or economically disadvantageous under current conditions. The frequent occurrence of signals in the Shewhart chart due to these fluctuations is then undesirable, and therefore the conventional control limits need to be extended. Several approaches to the design of control charts with extended limits are presented in the paper and applied to data from a real production process. The methods assume samples of size greater than 1. The performance of the charts is examined using the operating characteristic and average run length. The study reveals that in many cases, reducing the risk of false alarms is insufficient.
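One common way to extend Shewhart limits for an inherent between-sample variation is to add the between-sample variance component to the usual within-sample term, giving limits of the form grand mean ± 3·sqrt(σ_b² + σ_w²/n). A sketch under that assumption (the paper may use a different estimator; the subgroup data are invented):

```python
import math, statistics

def extended_limits(subgroups):
    # subgroups: list of equal-size samples; widen the Shewhart limits for
    # subgroup means with the between-sample variance component sigma_b^2
    n = len(subgroups[0])
    means = [statistics.mean(g) for g in subgroups]
    grand = statistics.mean(means)
    within_var = statistics.mean(statistics.variance(g) for g in subgroups)
    # var(means) estimates sigma_b^2 + sigma_w^2 / n; back out sigma_b^2
    between_var = max(statistics.variance(means) - within_var / n, 0.0)
    sigma = math.sqrt(between_var + within_var / n)
    return grand - 3 * sigma, grand + 3 * sigma

# hypothetical subgroups whose means wander from sample to sample
groups = [[10.1, 9.9, 10.3], [11.0, 11.2, 10.8],
          [9.2, 9.5, 9.1], [10.6, 10.4, 10.7]]
lcl, ucl = extended_limits(groups)
```

With conventional limits (within-sample variance only) these subgroups would signal constantly; the extended limits absorb the random changes in the mean.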

  17. An Improvement to Interval Estimation for Small Samples

    Directory of Open Access Journals (Sweden)

    SUN Hui-Ling

    2017-02-01

Full Text Available Because it is difficult and complex to determine the probability distribution of small samples, traditional probability theory is ill-suited to parameter estimation for them. The Bayes Bootstrap method is commonly used in engineering projects, but it has its own limitations. This article presents an improvement to the Bayes Bootstrap method: the method extends the number of samples by numerical simulation without changing the circumstances of the original small sample, and can give accurate interval estimates for small samples. Finally, Monte Carlo simulation is applied to specific small-sample problems, and the effectiveness and practicability of the improved Bootstrap method are demonstrated.
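The classical Bayes Bootstrap that the article takes as its starting point reweights the observed values with Dirichlet(1,…,1) weights (normalised unit exponentials) rather than resampling them; a minimal sketch with invented data (this is the baseline method, not the article's improvement):

```python
import random

def bayes_bootstrap_interval(data, reps=5000, level=0.95, seed=2):
    # Bayesian bootstrap: draw Dirichlet(1,...,1) weights for the n
    # observations and record the weighted mean of each replication
    rng = random.Random(seed)
    means = []
    for _ in range(reps):
        w = [rng.expovariate(1.0) for _ in data]
        total = sum(w)
        means.append(sum(wi * xi for wi, xi in zip(w, data)) / total)
    means.sort()
    lo = means[int((1 - level) / 2 * reps)]
    hi = means[int((1 + level) / 2 * reps) - 1]
    return lo, hi

small_sample = [4.8, 5.1, 5.3, 4.9, 5.6]  # small sample, distribution unknown
lo, hi = bayes_bootstrap_interval(small_sample)
```

Unlike the classical bootstrap, no replication ever assigns an observation zero weight, which smooths the interval for very small n.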

  18. OUTPACE long duration stations: physical variability, context of biogeochemical sampling, and evaluation of sampling strategy

    Directory of Open Access Journals (Sweden)

    A. de Verneil

    2018-04-01

    Full Text Available Research cruises to quantify biogeochemical fluxes in the ocean require taking measurements at stations lasting at least several days. A popular experimental design is the quasi-Lagrangian drifter, often mounted with in situ incubations or sediment traps that follow the flow of water over time. After initial drifter deployment, the ship tracks the drifter for continuing measurements that are supposed to represent the same water environment. An outstanding question is how to best determine whether this is true. During the Oligotrophy to UlTra-oligotrophy PACific Experiment (OUTPACE cruise, from 18 February to 3 April 2015 in the western tropical South Pacific, three separate stations of long duration (five days over the upper 500 m were conducted in this quasi-Lagrangian sampling scheme. Here we present physical data to provide context for these three stations and to assess whether the sampling strategy worked, i.e., that a single body of water was sampled. After analyzing tracer variability and local water circulation at each station, we identify water layers and times where the drifter risks encountering another body of water. While almost no realization of this sampling scheme will be truly Lagrangian, due to the presence of vertical shear, the depth-resolved observations during the three stations show most layers sampled sufficiently homogeneous physical environments during OUTPACE. By directly addressing the concerns raised by these quasi-Lagrangian sampling platforms, a protocol of best practices can begin to be formulated so that future research campaigns include the complementary datasets and analyses presented here to verify the appropriate use of the drifter platform.

  19. CHOMIK -Sampling Device of Penetrating Type for Russian Phobos Sample Return Mission

    Science.gov (United States)

    Seweryn, Karol; Grygorczuk, Jerzy; Rickmann, Hans; Morawski, Marek; Aleksashkin, Sergey; Banaszkiewicz, Marek; Drogosz, Michal; Gurgurewicz, Joanna; Kozlov, Oleg E.; Krolikowska-Soltan, Malgorzata; Sutugin, Sergiej E.; Wawrzaszek, Roman; Wisniewski, Lukasz; Zakharov, Alexander

Measurements of the physical properties of planetary bodies allow many important parameters to be determined for scientists working in different fields of research. For example, the effective heat conductivity of the regolith can help with better understanding of processes occurring in the body's interior. Chemical and mineralogical composition gives us a chance to better understand the origin and evolution of the moons. In principle, such parameters of planetary bodies can be determined based on three different measurement techniques: (i) in situ measurements, (ii) measurements of the samples in laboratory conditions on Earth, and (iii) remote sensing measurements. Scientific missions which allow us to perform all types of measurements give us a chance not only for parameter determination but also for cross-calibration of the instruments. The Russian Phobos Sample Return (PhSR) mission is one of the few which allows for all types of such measurements. The spacecraft will be equipped with remote sensing instruments such as spectrometers, a long-wave radar and a dust counter; instruments for in-situ measurements, including a gas chromatograph, a seismometer, a thermodetector and others; and also a robotic arm and a sampling device. The PhSR mission will be launched in November 2011 on board a Zenit launch vehicle. About a year later (11 months) the vehicle will reach Martian orbit. It is anticipated that it will land on Phobos in the beginning of 2013. Take-off back will take place a month later, and the re-entry module containing a capsule that will hold the soil sample enclosed in a container will be on its way back to Earth. The 11 kg re-entry capsule with the container will land in Kazakhstan in mid-2014. A unique geological penetrator, CHOMIK, dedicated to the Phobos Sample Return space mission, will be designed and manufactured at the Space Mechatronics and Robotics Laboratory, Space Research Centre Polish Academy of Sciences (SRC PAS) in Warsaw. Functionally CHOMIK is based on the well-known MUPUS

  20. Crowdsourcing for large-scale mosquito (Diptera: Culicidae) sampling

    Science.gov (United States)

    Sampling a cosmopolitan mosquito (Diptera: Culicidae) species throughout its range is logistically challenging and extremely resource intensive. Mosquito control programmes and regional networks operate at the local level and often conduct sampling activities across much of North America. A method f...

  1. AND/OR Importance Sampling

    OpenAIRE

    Gogate, Vibhav; Dechter, Rina

    2012-01-01

The paper introduces AND/OR importance sampling for probabilistic graphical models. In contrast to importance sampling, AND/OR importance sampling caches samples in the AND/OR space and then extracts a new sample mean from the stored samples. We prove that AND/OR importance sampling may have lower variance than importance sampling, thereby providing a theoretical justification for preferring it over importance sampling. Our empirical evaluation demonstrates that AND/OR importance sampling is ...
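For contrast with the AND/OR variant, plain importance sampling estimates an expectation by drawing from a proposal distribution and reweighting by the density ratio; a minimal sketch estimating a Gaussian tail probability (example values are illustrative, and this is the baseline estimator the paper improves on, not the AND/OR scheme itself):

```python
import math, random

def importance_sampling_tail(threshold, n=50000, seed=3):
    # estimate P(Z > threshold) for Z ~ N(0,1) by sampling from the
    # shifted proposal N(threshold, 1), where the rare event is common
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        y = rng.gauss(threshold, 1.0)
        if y > threshold:
            # weight = target density / proposal density (constants cancel)
            total += math.exp(-0.5 * y * y) / math.exp(-0.5 * (y - threshold) ** 2)
    return total / n

est = importance_sampling_tail(3.0)  # true value is about 1.35e-3
```

Naive Monte Carlo would need millions of draws to see this event reliably; the shifted proposal makes every second draw informative.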

  2. Sample representativeness verification of the FADN CZ farm business sample

    Directory of Open Access Journals (Sweden)

    Marie Prášilová

    2011-01-01

Full Text Available Sample representativeness verification is one of the key stages of statistical work. After having joined the European Union, the Czech Republic also joined the Farm Accountancy Data Network system of the Union. This is a sample of bodies and companies doing business in agriculture. Detailed production and economic data on the results of farming business are collected from that sample annually, and results for the entire population of the country's farms are then estimated and assessed. It is important, hence, that the sample be representative. Representativeness is to be assessed as to the number of farms included in the survey and also as to the degree of accordance of the measures and indices as related to the population. The paper deals with the special statistical techniques and methods of the FADN CZ sample representativeness verification, including the necessary sample size statement procedure. The Czech farm population data have been obtained from the Czech Statistical Office data bank.

  3. Interval sampling methods and measurement error: a computer simulation.

    Science.gov (United States)

    Wirth, Oliver; Slaven, James; Taylor, Matthew A

    2014-01-01

    A simulation study was conducted to provide a more thorough account of measurement error associated with interval sampling methods. A computer program simulated the application of momentary time sampling, partial-interval recording, and whole-interval recording methods on target events randomly distributed across an observation period. The simulation yielded measures of error for multiple combinations of observation period, interval duration, event duration, and cumulative event duration. The simulations were conducted up to 100 times to yield measures of error variability. Although the present simulation confirmed some previously reported characteristics of interval sampling methods, it also revealed many new findings that pertain to each method's inherent strengths and weaknesses. The analysis and resulting error tables can help guide the selection of the most appropriate sampling method for observation-based behavioral assessments. © Society for the Experimental Analysis of Behavior.
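The three interval sampling methods compared in the simulation can be sketched as scoring rules over fixed intervals; the toy simulation below reproduces their well-known biases (partial-interval recording overestimates, whole-interval recording underestimates, momentary time sampling is unbiased on average). All parameters are invented and much simpler than the study's design:

```python
import random

def simulate(method, event_times, obs_len=600, interval=10):
    # score each interval under one of three interval-sampling rules;
    # event_times is the set of seconds during which the behaviour occurred
    scored = 0
    for start in range(0, obs_len, interval):
        occupied = [t in event_times for t in range(start, start + interval)]
        if method == "momentary":    # look only at the interval's last moment
            hit = occupied[-1]
        elif method == "partial":    # any occurrence scores the interval
            hit = any(occupied)
        else:                        # "whole": behaviour must span the interval
            hit = all(occupied)
        scored += hit
    return scored / (obs_len // interval)

# one random placement of events totalling 30% of the observation period
rng = random.Random(4)
events = set(rng.sample(range(600), k=180))
true_prop = len(events) / 600
est = {m: simulate(m, events) for m in ("momentary", "partial", "whole")}
```

For scattered brief events, partial-interval recording saturates near 1.0 while whole-interval recording collapses toward 0, bracketing the true proportion, which is the pattern the study's error tables quantify.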

  4. Modification of ion chromatograph for analyses of radioactive samples

    International Nuclear Information System (INIS)

    Curfman, L.L.; Johnson, S.J.

    1979-01-01

    In ion chromatographic analysis, the sample is injected through a sample loop onto an analytical column where separation occurs. The sample then passes through a suppressor column to remove or neutralize background ions. A flow-through conductivity cell is used as a detector. Depending upon column and eluent selection, ion chromatography can be used for anion or cation analyses. Ion chromatography has proven to be a versatile analytical tool for the analysis of anions in Hanford waste samples. These radioactive samples range from caustic high salt solutions to hydrochloric acid dissolutions of insoluble sludges. Instrument modifications which provide safe and convenient handling of these samples without lengthening analysis time or altering instrument performance are described

  5. Optimized preparation of urine samples for two-dimensional electrophoresis and initial application to patient samples

    DEFF Research Database (Denmark)

    Lafitte, Daniel; Dussol, Bertrand; Andersen, Søren

    2002-01-01

OBJECTIVE: We optimized the preparation of urinary samples to obtain a comprehensive map of the urinary proteins of healthy subjects and then compared this map with the ones obtained with patient samples to show that the pattern was specific to their kidney disease. DESIGN AND METHODS: The urinary

  6. Two methods of self-sampling compared to clinician sampling to detect reproductive tract infections in Gugulethu, South Africa

    NARCIS (Netherlands)

    van de Wijgert, Janneke; Altini, Lydia; Jones, Heidi; de Kock, Alana; Young, Taryn; Williamson, Anna-Lise; Hoosen, Anwar; Coetzee, Nicol

    2006-01-01

    To assess the validity, feasibility, and acceptability of 2 methods of self-sampling compared to clinician sampling during a speculum examination. To improve screening for reproductive tract infections (RTIs) in resource-poor settings. In a public clinic in Cape Town, 450 women underwent a speculum

  7. Sampling strategies to capture single-cell heterogeneity

    OpenAIRE

    Satwik Rajaram; Louise E. Heinrich; John D. Gordan; Jayant Avva; Kathy M. Bonness; Agnieszka K. Witkiewicz; James S. Malter; Chloe E. Atreya; Robert S. Warren; Lani F. Wu; Steven J. Altschuler

    2017-01-01

    Advances in single-cell technologies have highlighted the prevalence and biological significance of cellular heterogeneity. A critical question is how to design experiments that faithfully capture the true range of heterogeneity from samples of cellular populations. Here, we develop a data-driven approach, illustrated in the context of image data, that estimates the sampling depth required for prospective investigations of single-cell heterogeneity from an existing collection of samples. ...

  8. Sample preparation combined with electroanalysis to improve simultaneous determination of antibiotics in animal derived food samples.

    Science.gov (United States)

    da Silva, Wesley Pereira; de Oliveira, Luiz Henrique; Santos, André Luiz Dos; Ferreira, Valdir Souza; Trindade, Magno Aparecido Gonçalves

    2018-06-01

A procedure based on liquid-liquid extraction (LLE) and phase separation using magnetically stirred salt-induced high-temperature liquid-liquid extraction (PS-MSSI-HT-LLE) was developed to extract and pre-concentrate ciprofloxacin (CIPRO) and enrofloxacin (ENRO) from animal food samples before electroanalysis. Firstly, simple LLE was used to extract the fluoroquinolones (FQs) from animal food samples, in which dilution was performed to reduce interference effects to below a tolerable threshold. Then, adapted PS-MSSI-HT-LLE protocols allowed re-extraction and further pre-concentration of target analytes in the diluted acid samples for simultaneous electrochemical quantification at low concentration levels. To improve the peak separation in simultaneous detection, a baseline-corrected second-order derivative approach was applied. These approaches allowed quantification of target FQs from animal food samples spiked at levels of 0.80 to 2.00 μmol L⁻¹ in chicken meat, with recovery values always higher than 80.5%, as well as in milk samples spiked at 4.00 μmol L⁻¹, with recovery values close to 70.0%. Copyright © 2018 Elsevier Ltd. All rights reserved.
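A baseline-corrected second-order derivative of the kind mentioned above can be approximated with a central second difference, which removes any linear baseline exactly and sharpens overlapping peaks; a sketch on synthetic data (peak positions, widths, and the baseline slope are invented):

```python
import math

def second_derivative(signal):
    # central second difference (unscaled second derivative); negative-going
    # dips mark peak centres, and any linear baseline cancels exactly
    return [signal[i-1] - 2*signal[i] + signal[i+1]
            for i in range(1, len(signal) - 1)]

def gaussian(x, centre, width, height):
    return height * math.exp(-((x - centre) / width) ** 2)

# two heavily overlapping peaks on a sloping linear baseline
xs = [i * 0.01 for i in range(200)]
raw = [gaussian(x, 0.9, 0.12, 1.0) + gaussian(x, 1.2, 0.12, 0.8) + 0.3 * x
       for x in xs]
d2 = second_derivative(raw)

# local minima of the second derivative resolve the two peak positions
minima = [i + 1 for i in range(1, len(d2) - 1)
          if d2[i] < d2[i-1] and d2[i] < d2[i+1]]
```

The two dips land at the sample indices of the peak centres (about 90 and 120 here) even though the raw signal shows only a shallow valley between them.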

  9. Research-Grade 3D Virtual Astromaterials Samples: Novel Visualization of NASA's Apollo Lunar Samples and Antarctic Meteorite Samples to Benefit Curation, Research, and Education

    Science.gov (United States)

    Blumenfeld, E. H.; Evans, C. A.; Oshel, E. R.; Liddle, D. A.; Beaulieu, K. R.; Zeigler, R. A.; Righter, K.; Hanna, R. D.; Ketcham, R. A.

    2017-01-01

    NASA's vast and growing collections of astromaterials are both scientifically and culturally significant, requiring unique preservation strategies that need to be recurrently updated to contemporary technological capabilities and increasing accessibility demands. New technologies have made it possible to advance documentation and visualization practices that can enhance conservation and curation protocols for NASA's Astromaterials Collections. Our interdisciplinary team has developed a method to create 3D Virtual Astromaterials Samples (VAS) of the existing collections of Apollo Lunar Samples and Antarctic Meteorites. Research-grade 3D VAS will virtually put these samples in the hands of researchers and educators worldwide, increasing accessibility and visibility of these significant collections. With new sample return missions on the horizon, it is of primary importance to develop advanced curation standards for documentation and visualization methodologies.

  10. Mixing and sampling tests for Radiochemical Plant

    International Nuclear Information System (INIS)

    Ehinger, M.N.; Marfin, H.R.; Hunt, B.

    1999-01-01

The paper describes results and test procedures used to evaluate uncertainty and bias effects introduced by the sampler systems of a radiochemical plant, and similar parameters associated with mixing. This report concentrates on experiences at the Barnwell Nuclear Fuels Plant. Mixing and sampling tests can be conducted to establish the statistical parameters for those activities related to overall measurement uncertainties. Density measurement by state-of-the-art, commercially available equipment is the key to conducting those tests. Experience in the U.S. suggests the statistical contribution of mixing and sampling can be controlled to less than 0.01% and, with new equipment and new tests in operating facilities, might be controlled to better accuracy [ru]

  11. Sample design for the residential energy consumption survey

    Energy Technology Data Exchange (ETDEWEB)

    1994-08-01

The purpose of this report is to provide detailed information about the multistage area-probability sample design used for the Residential Energy Consumption Survey (RECS). It is intended as a technical report, for use by statisticians, to better understand the theory and procedures followed in the creation of the RECS sample frame. For a more cursory overview of the RECS sample design, refer to the appendix entitled "How the Survey was Conducted," which is included in the statistical reports produced for each RECS survey year.

  12. Sample collection and sample analysis plan in support of the 105-C/190-C concrete and soil sampling activities

    International Nuclear Information System (INIS)

    Marske, S.G.

    1996-07-01

    This sampling and analysis plan describes the sample collection and sample analysis in support of the 105-C water tunnels and 190-C main pumphouse concrete and soil sampling activities. These analytical data will be used to identify the radiological contamination and presence of hazardous materials to support the decontamination and disposal activities

  13. On the Sampling

    OpenAIRE

    Güleda Doğan

    2017-01-01

This editorial is on statistical sampling, which is one of the two most common reasons for editorial rejection from our journal, Turkish Librarianship. The stages of quantitative research, where sampling fits among them, the importance of sampling for a research study, deciding on sample size, and sampling methods are summarised briefly.

  14. Sample preparation strategies for food and biological samples prior to nanoparticle detection and imaging

    DEFF Research Database (Denmark)

    Larsen, Erik Huusfeldt; Löschner, Katrin

    2014-01-01

    microscopy (TEM) proved to be necessary for trouble shooting of results obtained from AFFF-LS-ICP-MS. Aqueous and enzymatic extraction strategies were tested for thorough sample preparation aiming at degrading the sample matrix and to liberate the AgNPs from chicken meat into liquid suspension. The resulting...... AFFF-ICP-MS fractograms, which corresponded to the enzymatic digests, showed a major nano-peak (about 80 % recovery of AgNPs spiked to the meat) plus new smaller peaks that eluted close to the void volume of the fractograms. Small, but significant shifts in retention time of AFFF peaks were observed...... for the meat sample extracts and the corresponding neat AgNP suspension, and rendered sizing by way of calibration with AgNPs as sizing standards inaccurate. In order to gain further insight into the sizes of the separated AgNPs, or their possible dissolved state, fractions of the AFFF eluate were collected...

  15. Sampling efficiency of modified 37-mm sampling cassettes using computational fluid dynamics.

    Science.gov (United States)

    Anthony, T Renée; Sleeth, Darrah; Volckens, John

    2016-01-01

In the U.S., most industrial hygiene practitioners continue to rely on the closed-face cassette (CFC) to assess worker exposures to hazardous dusts, primarily because of ease of use, cost, and familiarity. However, mass concentrations measured with this classic sampler underestimate exposures to larger particles throughout the inhalable particulate mass (IPM) size range (up to aerodynamic diameters of 100 μm). To investigate whether the current 37-mm inlet cap can be redesigned to better meet the IPM sampling criterion, computational fluid dynamics (CFD) models were developed, and particle sampling efficiencies associated with various modifications to the CFC inlet cap were determined. Simulations of fluid flow (standard k-epsilon turbulence model) and particle transport (laminar trajectories, 1-116 μm) were conducted using sampling flow rates of 10 L min(-1) in slow moving air (0.2 m s(-1)) in the facing-the-wind orientation. Combinations of seven inlet shapes and three inlet diameters were evaluated as candidates to replace the current 37-mm inlet cap. For a given inlet geometry, differences in sampler efficiency between inlet diameters averaged less than 1% for particles through 100 μm, but the largest opening was found to increase the efficiency for 116 μm particles by 14% for the flat inlet cap. A substantial reduction in sampler efficiency was identified for sampler inlets with side walls extending beyond the dimension of the external lip of the current 37-mm CFC. The inlet cap based on the 37-mm CFC dimensions with an expanded 15-mm entry provided the best agreement with facing-the-wind human aspiration efficiency. Sampler efficiency was increased with a flat entry or with a thin central lip adjacent to the new enlarged entry. This work provides a substantial body of sampling efficiency estimates as a function of particle size and inlet geometry for personal aerosol samplers.

  16. Reproducibility of Serum Potassium Values in Serum From Blood Samples Stored for Increasing Times Prior to Centrifugation and Analysis.

    Science.gov (United States)

    Harper, Aaron; Lu, Chuanyong; Sun, Yi; Garcia, Rafael; Rets, Anton; Alexis, Herol; Saad, Heba; Eid, Ikram; Harris, Loretta; Marshall, Barbara; Tafani, Edlira; Pincus, Matthew R

    2016-05-01

    The goal of this work was to determine if immediate versus postponed centrifugation of samples affects the levels of serum potassium. Twenty participants donated normal venous blood that was collected in four serum separator tubes per donor, each of which was analyzed at 0, 1, 2, or 4 hr on the Siemens Advia 1800 autoanalyzer. Coefficients of variation (CVs) for potassium levels ranged from 0% to 7.6%, with a mean of 3 ± 2%. ANOVA testing of the means for all 20 samples showed a P-value of 0.72 (>0.05), indicating that there was no statistically significant difference between the means of the samples at the four time points. Sixteen samples were found to have CVs of ≤5%. Two samples showed increases of potassium from the reference range to levels higher than the upper reference limit, one of which had a 4-hr value that was within the reference (normal) range (3.5-5 mEq/l). Overall, most samples were found to have reproducible levels of serum potassium. Serum potassium levels from stored whole blood collected in serum separator tubes are, for the most part, stable at room temperature for at least 4 hr prior to analysis. However, some samples can exhibit significant fluctuations of values. © 2015 Wiley Periodicals, Inc.
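    The 5% reproducibility cut-off used above reduces to a plain coefficient of variation across the four time points. A minimal sketch of the calculation, with invented potassium readings (the values are hypothetical, not the study's data):

    ```python
    import statistics

    def cv_percent(values):
        """Coefficient of variation (%): 100 * sample SD / mean."""
        return 100.0 * statistics.stdev(values) / statistics.mean(values)

    # Invented potassium readings (mEq/L) for one donor at 0, 1, 2, and 4 hr.
    donor = [4.1, 4.2, 4.0, 4.3]
    cv = cv_percent(donor)
    assert cv <= 5.0  # this donor would fall in the "reproducible" group
    ```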

  17. Sampling and Analysis Plan for the 216-A-29 Ditch

    International Nuclear Information System (INIS)

    Petersen, S.W.

    1998-06-01

    This sampling and analysis plan defines procedures to be used for collecting and handling samples to be obtained from the 216-A-29 Ditch, and identifies requirements for field and laboratory measurements. The sampling strategy described here is derived from a Data Quality Objectives workshop conducted in January 1997 to support sampling to assure worker safety during construction and to assess the validity of a 1988 ditch sampling campaign and the effectiveness of subsequent stabilization. The purpose of the proposed sampling and analysis activities is to characterize soil contamination in the vicinity of a proposed road over the 216-A-29 Ditch

  18. Present status of NMCC and sample preparation method for bio-samples

    International Nuclear Information System (INIS)

    Futatsugawa, S.; Hatakeyama, S.; Saitou, S.; Sera, K.

    1993-01-01

    In NMCC (Nishina Memorial Cyclotron Center) we are conducting research on nuclear-medicine PET (positron emission computed tomography) and PIXE (particle-induced X-ray emission) analysis using a compactly designed small cyclotron. The NMCC facilities have been open to researchers of other institutions since April 1993. The present status of NMCC is described. Bio-samples (medical samples, plants, animals and environmental samples) have mainly been analyzed by PIXE at NMCC. Small amounts of bio-samples for PIXE are decomposed quickly and easily in a sealed PTFE (polytetrafluoroethylene) vessel with a microwave oven. This sample preparation method is also described. (author)

  19. Feasibility of self-sampled dried blood spot and saliva samples sent by mail in a population-based study

    International Nuclear Information System (INIS)

    Sakhi, Amrit Kaur; Bastani, Nasser Ezzatkhah; Ellingjord-Dale, Merete; Gundersen, Thomas Erik; Blomhoff, Rune; Ursin, Giske

    2015-01-01

    In large epidemiological studies it is often challenging to obtain biological samples. Self-sampling by study participants using dried blood spots (DBS) technique has been suggested to overcome this challenge. DBS is a type of biosampling where blood samples are obtained by a finger-prick lancet, blotted and dried on filter paper. However, the feasibility and efficacy of collecting DBS samples from study participants in large-scale epidemiological studies is not known. The aim of the present study was to test the feasibility and response rate of collecting self-sampled DBS and saliva samples in a population-based study of women above 50 years of age. We determined response proportions, number of phone calls to the study center with questions about sampling, and quality of the DBS. We recruited women through a study conducted within the Norwegian Breast Cancer Screening Program. Invitations, instructions and materials were sent to 4,597 women. The data collection took place over a 3 month period in the spring of 2009. Response proportions for the collection of DBS and saliva samples were 71.0% (3,263) and 70.9% (3,258), respectively. We received 312 phone calls (7% of the 4,597 women) with questions regarding sampling. Of the 3,263 individuals that returned DBS cards, 3,038 (93.1%) had been packaged and shipped according to instructions. A total of 3,032 DBS samples were sufficient for at least one biomarker analysis (i.e. 92.9% of DBS samples received by the laboratory). 2,418 (74.1%) of the DBS cards received by the laboratory were filled with blood according to the instructions (i.e. 10 completely filled spots with up to 7 punches per spot for up to 70 separate analyses). To assess the quality of the samples, we selected and measured two biomarkers (carotenoids and vitamin D). The biomarker levels were consistent with previous reports. Collecting self-sampled DBS and saliva samples through the postal services provides a low cost, effective and feasible

  20. Assessment of the impact of sampler change on the uncertainty related to geothermal water sampling

    Science.gov (United States)

    Wątor, Katarzyna; Mika, Anna; Sekuła, Klaudia; Kmiecik, Ewa

    2018-02-01

    The aim of this study is to assess the impact of sampler change on the uncertainty associated with the process of geothermal water sampling. The study was carried out on geothermal water exploited in the Podhale region, southern Poland (Małopolska province). To estimate the uncertainty associated with sampling, the results of determinations of metasilicic acid (H2SiO3) in normal and duplicate samples collected in two series were used (in each series the samples were collected by a qualified sampler). Chemical analyses were performed using the ICP-OES method in the certified Hydrogeochemical Laboratory of the Hydrogeology and Engineering Geology Department at the AGH University of Science and Technology in Krakow (Certificate of Polish Centre for Accreditation No. AB 1050). To evaluate the uncertainty arising from sampling, the empirical approach was implemented, based on double analysis of normal and duplicate samples taken from the same well in the series of testing. The analyses of the results were done using ROBAN software, based on the robust analysis of variance (rANOVA) technique. The research showed that, with qualified and experienced samplers, the uncertainty connected with sampling can be reduced, resulting in a small overall measurement uncertainty.
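    The empirical duplicate approach can be sketched with the classical (non-robust) duplicate-pair estimate; the study itself used robust ANOVA (rANOVA) in the ROBAN software, and the H2SiO3 values below are invented for illustration:

    ```python
    import math

    def duplicate_sd(pairs):
        """Classical duplicate-pair estimate of the combined sampling +
        analysis standard deviation: s^2 = sum(d_i^2) / (2 n), where
        d_i = normal - duplicate. (The paper itself uses robust ANOVA.)"""
        n = len(pairs)
        return math.sqrt(sum((a - b) ** 2 for a, b in pairs) / (2 * n))

    # Invented H2SiO3 results (mg/L): (normal, duplicate) from the same well.
    pairs = [(55.2, 54.8), (56.1, 56.4), (54.9, 55.3), (55.7, 55.6)]
    s = duplicate_sd(pairs)
    mean = sum(a + b for a, b in pairs) / (2 * len(pairs))
    rsd_percent = 100.0 * s / mean  # relative sampling+analysis uncertainty
    ```

    Small duplicate differences, as here, translate into a sub-percent relative uncertainty, consistent with the study's conclusion that experienced samplers keep the sampling contribution small.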

  1. Technical note: Alternatives to reduce adipose tissue sampling bias.

    Science.gov (United States)

    Cruz, G D; Wang, Y; Fadel, J G

    2014-10-01

    Understanding the mechanisms by which nutritional and pharmaceutical factors can manipulate adipose tissue growth and development in production animals has direct and indirect effects on the profitability of an enterprise. Adipocyte cellularity (number and size) is a key biological response that is commonly measured in animal science research. The variability and sampling of adipocyte cellularity within a muscle have been noted in previous studies, but no attempt to critically investigate these issues has been made in the literature. The present study evaluated 2 sampling techniques (random and systematic) in an attempt to minimize sampling bias and to determine the minimum number of samples, from 1 to 15, needed to represent the overall adipose tissue in the muscle. Both sampling procedures were applied to adipose tissue samples dissected from 30 longissimus muscles from cattle finished either on grass or grain. Briefly, adipose tissue samples were fixed with osmium tetroxide, and size and number of adipocytes were determined by a Coulter Counter. These results were then fit to a finite mixture model to obtain distribution parameters of each sample. To evaluate the benefits of increasing the number of samples and the advantage of the new sampling technique, the concept of acceptance ratio was used; simply stated, the higher the acceptance ratio, the better the representation of the overall population. As expected, a great improvement in the estimation of the overall adipocyte cellularity parameters was observed with both sampling techniques when the sample size increased from 1 to 15 samples, the acceptance ratio for both techniques increasing from approximately 3% to 25%. When comparing sampling techniques, the systematic procedure slightly improved parameter estimation. The results suggest that more detailed research using other sampling techniques may provide better estimates for minimum sampling.
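    The two schemes compared above can be sketched generically (a textbook construction, not the authors' code; the list of locations is a stand-in for sampling sites along a muscle section):

    ```python
    import random

    def systematic_sample(population, n):
        """Systematic sampling: a random start within the first interval,
        then a fixed step through the ordered population."""
        step = len(population) // n
        start = random.randrange(step)
        return [population[start + i * step] for i in range(n)]

    def random_sample(population, n):
        """Simple random sampling without replacement."""
        return random.sample(population, n)

    # Stand-in for 100 ordered sampling locations along a muscle section.
    random.seed(1)
    locations = list(range(100))
    sys_picks = systematic_sample(locations, 15)   # evenly spread by design
    rand_picks = random_sample(locations, 15)      # may cluster by chance
    ```

    The even spacing of the systematic picks is the intuition behind its slightly better parameter estimation: it cannot, by chance, concentrate all samples in one region of the muscle.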

  2. Hanford site transuranic waste sampling plan

    International Nuclear Information System (INIS)

    GREAGER, T.M.

    1999-01-01

    This sampling plan (SP) describes the selection of containers for sampling of homogeneous solids and soil/gravel and for visual examination of transuranic and mixed transuranic (collectively referred to as TRU) waste generated at the U.S. Department of Energy (DOE) Hanford Site. The activities described in this SP will be conducted under the Hanford Site TRU Waste Certification Program. This SP is designed to meet the requirements of the Transuranic Waste Characterization Quality Assurance Program Plan (CAO-94-1010) (DOE 1996a) (QAPP), site-specific implementation of which is described in the Hanford Site Transuranic Waste Characterization Program Quality Assurance Project Plan (HNF-2599) (Hanford 1998b) (QAPP). The QAPP defines the quality assurance (QA) requirements and protocols for TRU waste characterization activities at the Hanford Site. In addition, the QAPP identifies responsible organizations, describes required program activities, outlines sampling and analysis strategies, and identifies procedures for characterization activities. The QAPP identifies specific requirements for TRU waste sampling plans. Table 1-1 presents these requirements and indicates sections in this SP where these requirements are addressed

  3. Spherical sampling

    CERN Document Server

    Freeden, Willi; Schreiner, Michael

    2018-01-01

    This book presents, in a consistent and unified overview, results and developments in the field of today's spherical sampling, particularly arising in the mathematical geosciences. Although the book often refers to original contributions, the authors have made them accessible to (graduate) students and scientists not only from mathematics but also from the geosciences and geoengineering. Building a library of topics in spherical sampling theory, it shows how advances in this theory lead to new discoveries in mathematical, geodetic, and geophysical branches as well as in other scientific fields such as neuro-medicine. A must-read for everybody working in the area of spherical sampling.

  4. Laboratory Sampling Guide

    Science.gov (United States)

    2012-05-11

    environment, and by ingestion of foodstuffs that have incorporated C-14 by photosynthesis. Like tritium, C-14 is a very low energy beta emitter and is... bacterial growth and to minimize development of solids in the sample. • Properly identify each sample container with name, SSN, and collection start and... sampling in the same cardboard carton. The sample may be kept cool or frozen during collection to control odor and bacterial growth. • Once...

  5. Quota sampling in internet research: practical issues.

    Science.gov (United States)

    Im, Eun-Ok; Chee, Wonshik

    2011-07-01

    Quota sampling has been suggested as a potentially good method for Internet-based research and has been used by several researchers working with Internet samples. However, very little is known about the issues or concerns in using a quota sampling method in Internet research. The purpose of this article was to present the practical issues encountered when using quota sampling in an Internet-based study. During the Internet study, the research team recorded all recruitment issues that arose and made written notes indicating the possible reasons for the problems. In addition, biweekly team discussions were conducted, for which written records were kept. Overall, quota sampling was effective in ensuring that an adequate number of midlife women were recruited from the targeted ethnic groups. However, during the study process, we encountered the following practical issues using quota sampling: (1) difficulty reaching out to women in lower socioeconomic classes, (2) difficulty ensuring authenticity of participants' identities, (3) participants giving inconsistent answers for the screening questions versus the Internet survey questions, (4) potential problems with a question on socioeconomic status, (5) resentment toward the research project and/or researchers because of rejection, and (6) a longer time and more expense than anticipated.

  6. Large sample neutron activation analysis of a reference inhomogeneous sample

    International Nuclear Information System (INIS)

    Vasilopoulou, T.; Athens National Technical University, Athens; Tzika, F.; Stamatelatos, I.E.; Koster-Ammerlaan, M.J.J.

    2011-01-01

    A benchmark experiment was performed for Neutron Activation Analysis (NAA) of a large inhomogeneous sample. The reference sample was developed in-house and consisted of a SiO2 matrix and an Al-Zn alloy 'inhomogeneity' body. Monte Carlo simulations were employed to derive appropriate correction factors for neutron self-shielding during irradiation as well as self-attenuation of gamma rays and sample geometry during counting. The large sample neutron activation analysis (LSNAA) results were compared against reference values and the trueness of the technique was evaluated. An agreement within ±10% was observed between LSNAA and reference elemental mass values, for all matrix and inhomogeneity elements except samarium, provided that the inhomogeneity body was fully simulated. However, in cases where the inhomogeneity was treated as unknown, the results showed reasonable agreement for most matrix elements, while large discrepancies were observed for the inhomogeneity elements. This study provided a quantification of the uncertainties associated with inhomogeneity in large sample analysis and contributed to the identification of the needs for future development of LSNAA facilities for analysis of inhomogeneous samples. (author)

  7. Sample size of the reference sample in a case-augmented study.

    Science.gov (United States)

    Ghosh, Palash; Dewanji, Anup

    2017-05-01

    The case-augmented study, in which a case sample is augmented with a reference (random) sample from the source population with only covariates information known, is becoming popular in different areas of applied science such as pharmacovigilance, ecology, and econometrics. In general, the case sample is available from some source (for example, hospital database, case registry, etc.); however, the reference sample is required to be drawn from the corresponding source population. The required minimum size of the reference sample is an important issue in this regard. In this work, we address the minimum sample size calculation and discuss related issues. Copyright © 2017 John Wiley & Sons, Ltd.

  8. Noise and pile-up in liquid sampling calorimeters

    International Nuclear Information System (INIS)

    Franzini, P.

    1988-01-01

    The design and construction of detectors for the SSC presents new challenges and requires electronics of previously unavailable performance. Most detector elements produce minute signals which can be lost in the ever-present noise from thermal fluctuations and the finite charge of the electron. The author presents in these notes a pedagogical introduction to noise and pile-up, as applicable to liquid sampling calorimeters, beginning with a brief, purely descriptive introduction to amplifiers and the physical origins of noise. He then studies the particular case of noise in charge measurements, in particular for calorimeters, where parallel noise is usually negligible. By a physical example he discusses optimal filtering, proving that gaussian filters are not optimal. The scaling laws of noise versus source capacitance and filter bandwidth or shaping time are emphasized. An explicit example for pulse shapes peaking at 1 μs is computed and extrapolated to 0.1 μs, more appropriate for an SSC detector. Solutions for the several problems arising at short shaping times are discussed, and the conditions for optimal preamp-detector matching (for minimum noise) are derived. He also briefly discusses pile-up and its scaling laws. Correlations between signal samples at different times are considered and computed for an example. Correlations are necessary to compute the noise in signals reconstructed from sampling.
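    The scaling laws mentioned (noise versus source capacitance and shaping time) can be sketched with the textbook equivalent-noise-charge form ENC^2 = a·C^2/τ + b·τ, where the series term dominates when parallel noise is negligible. The coefficients a and b below are arbitrary illustrative units, not the values from these notes:

    ```python
    import math

    def enc(C, tau, a=1.0, b=1.0):
        """Equivalent noise charge (arbitrary units): the series term
        scales as C^2/tau, the parallel term as tau."""
        return math.sqrt(a * C**2 / tau + b * tau)

    # Series-noise scaling: shortening the shaping time from 1 us to 0.1 us
    # raises the series contribution by sqrt(10); noise is linear in C.
    assert abs(enc(1.0, 0.1, b=0.0) / enc(1.0, 1.0, b=0.0) - math.sqrt(10)) < 1e-9
    # With both terms present the optimum sits at tau* = C * sqrt(a/b).
    tau_star = 1.0 * math.sqrt(1.0 / 1.0)
    assert all(enc(1.0, tau_star) <= enc(1.0, t / 100) for t in range(1, 500))
    ```

    The first assertion is the extrapolation made in the notes: moving from 1 μs to 0.1 μs shaping costs roughly a factor sqrt(10) in series noise, which is why short shaping times at the SSC require careful preamp-detector matching.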

  9. Risks for Conduct Disorder Symptoms Associated with Parental Alcoholism in Stepfather Families versus Intact Families from a Community Sample

    Science.gov (United States)

    Foley, Debra L.; Pickles, Andrew; Rutter, Michael; Gardner, Charles O.; Maes, Hermine H.; Silberg, Judy L.; Eaves, Lindon J.

    2004-01-01

    Background: It is not known if the prevalence of parental psychiatric disorders is higher in stepfather than intact families, or if parental alcoholism is differentially associated with risk for conduct disorder (CD) symptoms in stepfather families versus intact families. Method: The sample comprised 839 girls and 741 boys from 792 intact families…

  10. Fluctuation conductivity in cuprate superconductors

    Indian Academy of Sciences (India)

    We have measured the in-plane resistivity of Bi2Sr2CaCu2O8+δ and Tl2Ba2CaCu2O8+δ ... assumed to be Josephson coupled, the interaction was treated in terms of an effective mass tensor. ... Further details of the sample preparation.

  11. A stochastic optimisation method to estimate the spatial distribution of a pathogen from a sample.

    Science.gov (United States)

    Sampling is of central importance in plant pathology. It facilitates our understanding of how epidemics develop in space and time and can also be used to inform disease management decisions. Making inferences from a sample is necessary because we rarely have the resources to conduct a complete censu...

  12. Second NATO/SIBCA Exercise on Sampling of Chemical Warfare Agents

    National Research Council Canada - National Science Library

    Wils, E

    1999-01-01

    In order to practise the sampling of chemical warfare agents under realistic conditions, the Netherlands participated successfully in the second NATO/SIBCA sampling exercise conducted in Poland on 1-3...

  13. A Hybrid Monte Carlo importance sampling of rare events in Turbulence and in Turbulent Models

    Science.gov (United States)

    Margazoglou, Georgios; Biferale, Luca; Grauer, Rainer; Jansen, Karl; Mesterhazy, David; Rosenow, Tillmann; Tripiccione, Raffaele

    2017-11-01

    Extreme and rare events are a challenging topic in the field of turbulence. Investigating such instances with traditional numerical tools turns out to be a notoriously difficult task, as they fail to systematically sample the fluctuations around them. We propose instead that an importance sampling Monte Carlo method can selectively highlight extreme events in remote areas of the phase space and induce their occurrence. We present a new computational approach, based on the path integral formulation of stochastic dynamics, and employ an accelerated Hybrid Monte Carlo (HMC) algorithm for this purpose. Through the paradigm of the stochastic one-dimensional Burgers equation, subjected to random noise that is white in time and power-law correlated in Fourier space, we prove our concept and benchmark our results against standard CFD methods. Furthermore, we present our first results of constrained sampling around saddle-point instanton configurations (optimal fluctuations). The research leading to these results has received funding from the EU Horizon 2020 research and innovation programme under Grant Agreement No. 642069, and from the EU Seventh Framework Programme (FP7/2007-2013) under ERC Grant Agreement No. 339032.
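    Setting aside the path-integral and HMC machinery of the abstract, the core idea of importance sampling of rare events can be shown with a minimal example: estimating a Gaussian tail probability by tilting the proposal toward the rare region and reweighting by the likelihood ratio. The numbers and the tilt rule are illustrative only, not the authors' method:

    ```python
    import math
    import random

    def tail_prob_is(threshold, n=100_000, seed=0):
        """Estimate P(X > threshold) for X ~ N(0, 1) by importance
        sampling: draw from the tilted proposal N(threshold, 1) and
        reweight each hit by the likelihood ratio exp(-mu*x + mu^2/2)."""
        rng = random.Random(seed)
        mu = threshold  # tilt the proposal into the rare region
        acc = 0.0
        for _ in range(n):
            x = rng.gauss(mu, 1.0)
            if x > threshold:
                acc += math.exp(-mu * x + 0.5 * mu * mu)
        return acc / n

    # Plain Monte Carlo would need on the order of 10^7 draws to see
    # P(X > 4) at all; the tilted estimator resolves it with 10^5 draws.
    estimate = tail_prob_is(4.0)
    exact = 0.5 * math.erfc(4.0 / math.sqrt(2.0))
    ```

    The same principle, biasing the sampling measure so that the rare configurations are no longer rare and then correcting with exact weights, is what the path-integral HMC approach applies to whole noise histories of the Burgers equation.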

  14. Comparison of sampling methods for hard-to-reach francophone populations: yield and adequacy of advertisement and respondent-driven sampling.

    Science.gov (United States)

    Ngwakongnwi, Emmanuel; King-Shier, Kathryn M; Hemmelgarn, Brenda R; Musto, Richard; Quan, Hude

    2014-01-01

    Francophones who live outside the primarily French-speaking province of Quebec, Canada, risk being excluded from research by lack of a sampling frame. We examined the adequacy of random sampling, advertising, and respondent-driven sampling for recruitment of francophones for survey research. We recruited francophones residing in the city of Calgary, Alberta, through advertising and respondent-driven sampling. These 2 samples were then compared with a random subsample of Calgary francophones derived from the 2006 Canadian Community Health Survey (CCHS). We assessed the effectiveness of advertising and respondent-driven sampling in relation to the CCHS sample by comparing demographic characteristics and selected items from the CCHS (specifically self-reported general health status, perceived weight, and having a family doctor). We recruited 120 francophones through advertising and 145 through respondent-driven sampling; the random sample from the CCHS consisted of 259 records. The samples derived from advertising and respondent-driven sampling differed from the CCHS in terms of age (mean ages 41.0, 37.6, and 42.5 years, respectively), sex (proportion of males 26.1%, 40.6%, and 56.6%, respectively), education (college or higher 86.7%, 77.9%, and 59.1%, respectively), place of birth (immigrants accounting for 45.8%, 55.2%, and 3.7%, respectively), and not having a regular medical doctor (16.7%, 34.5%, and 16.6%, respectively). Differences were not tested statistically because of limitations on the analysis of CCHS data imposed by Statistics Canada. The samples generated exclusively through advertising and respondent-driven sampling were not representative of the gold standard sample from the CCHS. Use of such biased samples for research studies could generate misleading results.

  15. Optimal sampling designs for large-scale fishery sample surveys in Greece

    Directory of Open Access Journals (Sweden)

    G. BAZIGOS

    2007-12-01

    The paper deals with the optimization of the following three large-scale sample surveys: biological sample survey of commercial landings (BSCL), experimental fishing sample survey (EFSS), and commercial landings and effort sample survey (CLES).

  16. Wide-range bipolar pulse conductance instrument employing current and voltage modes with sampled or integrated signal acquisition

    Energy Technology Data Exchange (ETDEWEB)

    Calhoun, R K; Holler, F J [Kentucky Univ., Lexington, KY (United States). Dept. of Chemistry; Geiger, jr, R F; Nieman, T A [Illinois Univ., Urbana, IL (United States). Dept. of Chemistry; Caserta, K J [Procter and Gamble Co., Cincinnati, OH (United States)

    1991-11-05

    An instrument for measuring solution conductance using the bipolar pulse technique is described. The instrument is capable of measuring conductances in the range of 5×10⁻⁹ to 10 Ω⁻¹ with 1% accuracy or better in as little as 32 μs. Accuracy of 0.001-0.01% is achievable over the range 1×10⁻⁶ to 1 Ω⁻¹. Circuitry and software are described that allow the instrument to automatically adjust the pulse height, pulse duration, excitation mode (current or voltage pulse) and data acquisition mode (sampled or integrated) to acquire data of optimum accuracy and precision. The urease-catalyzed decomposition of urea is used to illustrate the versatility of the instrument, and other applications are cited. (author). 60 refs.; 7 figs.; 2 tabs.

  17. X-ray intensity fluctuation spectroscopy in the energy range from 1 to 4 keV

    Energy Technology Data Exchange (ETDEWEB)

    Retsch, C.C.

    2001-06-01

    X-ray intensity fluctuation spectroscopy was developed in the energy range of 1 to 4 keV and was used to study complex sample structures and dynamics in a liquid-crystal-aerosil dispersion. The advantages of a focusing versus a nonfocusing setup were explored, and the effects of using X-ray energies near absorption edges were investigated to enhance the capabilities of the method. It was found that even though excellent real-space resolution and an increase in flux density can be gained from a Fresnel zone plate focusing setup, this usually comes at the expense of speckle contrast. At absorption edges, the speckle contrast is dominated by the imaginary part of the sample's index of refraction and therefore varies in a way similar to the total transmitted intensity. Employing these results, the dynamics of a dispersion of low-density silica aerosil in octylcyanobiphenyl (8CB) were studied. It was found that the known cross-over behavior of 8CB-aerosil samples towards the 3d-XY universality class should be understood as the coupling of the aerosil-gel dynamics to the dynamics of the director fluctuations in the liquid crystal. This work indicates that the aerosil-gel mimics and dampens these director fluctuations and thus, by suppressing the director fluctuations, achieves a pure 3d-XY system. (orig.)

  18. Plasma phenylalanine and tyrosine responses to different nutritional conditions (fasting/postprandial) in patients with phenylketonuria: effect of sample timing.

    Science.gov (United States)

    van Spronsen, F J; van Rijn, M; van Dijk, T; Smit, G P; Reijngoud, D J; Berger, R; Heymans, H S

    1993-10-01

    To evaluate the adequacy of dietary treatment in patients with phenylketonuria, the monitoring of plasma phenylalanine and tyrosine concentrations is of great importance. However, the preferred timing of blood sampling in relation to the nutritional state during the day is not known. The aim of this study was to define guidelines for the timing of blood sampling with a minimal burden for the patient. Plasma concentrations of phenylalanine and tyrosine were measured in nine patients with phenylketonuria who had no clinical evidence of tyrosine deficiency. These values were measured during the day both after a prolonged overnight fast and before and after breakfast. Phenylalanine showed a small rise during prolonged fasting, while tyrosine decreased slightly. After an individually tailored breakfast, phenylalanine remained stable, while tyrosine showed large fluctuations. It is concluded that the patient's nutritional condition (fasting/postprandial) is not important in the evaluation of the phenylalanine intake. To detect a possible tyrosine deficiency, however, a single blood sample is not sufficient, and a combination of a preprandial and postprandial blood sample on the same day is advocated.

  19. A Rover Mobility Platform with Autonomous Capability to Enable Mars Sample Return

    Science.gov (United States)

    Fulford, P.; Langley, C.; Shaw, A.

    2018-04-01

    The next step in understanding Mars is sample return. In Fall 2016, the CSA conducted an analogue deployment using the Mars Exploration Science Rover. An objective was to demonstrate the maturity of the rover's guidance, navigation, and control.

  20. An integrated and accessible sample data library for Mars sample return science

    Science.gov (United States)

    Tuite, M. L., Jr.; Williford, K. H.

    2015-12-01

    Over the course of the next decade or more, many thousands of geological samples will be collected and analyzed in a variety of ways by researchers at the Jet Propulsion Laboratory (California Institute of Technology) in order to facilitate discovery and contextualize observations made of Mars rocks both in situ and here on Earth if samples are eventually returned. Integration of data from multiple analyses of samples, including petrography, thin section and SEM imaging, isotope and organic geochemistry, XRF, XRD, and Raman spectrometry, is a challenge and a potential obstacle to discoveries that require supporting lines of evidence. We report the development of a web-accessible repository, the Sample Data Library (SDL), for the sample-based data that are generated by the laboratories and instruments that comprise JPL's Center for Analysis of Returned Samples (CARS), in order to facilitate collaborative interpretation of potential biosignatures in Mars-analog geological samples. The SDL is constructed using low-cost, open-standards-based Amazon Web Services (AWS), including web-accessible storage, relational database services, and a virtual web server. The data structure is sample-centered, with a shared registry for assigning unique identifiers to all samples, including International Geo-Sample Numbers. Both raw and derived data produced by instruments and post-processing workflows are automatically uploaded to online storage and linked via the unique identifiers. Through the web interface, users are able to find all the analyses associated with a single sample or search across features shared by multiple samples, sample localities, and analysis types. Planned features include more sophisticated search and analytical interfaces as well as data discoverability through NSF's EarthCube program.
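    A minimal sketch of a sample-centered registry of the kind described: one table keyed by a unique sample identifier (e.g. an IGSN) and one analyses table linking every measurement back to it. This is a hypothetical schema, not the SDL's actual design; sqlite stands in for the AWS relational service, and all identifiers and URIs are invented:

    ```python
    import sqlite3

    db = sqlite3.connect(":memory:")
    db.executescript("""
    CREATE TABLE samples (igsn TEXT PRIMARY KEY, locality TEXT);
    CREATE TABLE analyses (
        id INTEGER PRIMARY KEY,
        igsn TEXT REFERENCES samples(igsn),
        method TEXT,        -- e.g. 'XRD', 'Raman', 'SEM'
        result_uri TEXT     -- pointer to raw/derived data in object storage
    );
    """)
    db.execute("INSERT INTO samples VALUES ('IEMLT0001', 'Mars-analog site A')")
    db.executemany(
        "INSERT INTO analyses (igsn, method, result_uri) VALUES (?, ?, ?)",
        [("IEMLT0001", "XRD", "s3://sdl/xrd/0001"),
         ("IEMLT0001", "Raman", "s3://sdl/raman/0001")])
    # "find all the analyses associated with a single sample"
    methods = [row[0] for row in db.execute(
        "SELECT method FROM analyses WHERE igsn = ? ORDER BY id",
        ("IEMLT0001",))]
    ```

    Keying everything to one shared identifier is what makes the cross-sample and cross-method searches described above a simple join rather than a data-integration project.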

  1. A user-friendly robotic sample preparation program for fully automated biological sample pipetting and dilution to benefit the regulated bioanalysis.

    Science.gov (United States)

    Jiang, Hao; Ouyang, Zheng; Zeng, Jianing; Yuan, Long; Zheng, Naiyu; Jemal, Mohammed; Arnold, Mark E

    2012-06-01

    Biological sample dilution is a rate-limiting step in bioanalytical sample preparation when the concentrations of samples are beyond standard curve ranges, especially when multiple dilution factors are needed in an analytical run. We have developed and validated a Microsoft Excel-based robotic sample preparation program (RSPP) that automatically transforms Watson worklist sample information (identification, sequence and dilution factor) to comma-separated value (CSV) files. The Freedom EVO liquid handler software imports and transforms the CSV files to executable worklists (.gwl files), allowing the robot to perform sample dilutions at variable dilution factors. The dynamic dilution range is 1- to 1000-fold and divided into three dilution steps: 1- to 10-, 11- to 100-, and 101- to 1000-fold. The whole process, including pipetting samples, diluting samples, and adding internal standard(s), is accomplished within 1 h for two racks of samples (96 samples/rack). This platform also supports online sample extraction (liquid-liquid extraction, solid-phase extraction, protein precipitation, etc.) using 96 multichannel arms. This fully automated and validated sample dilution and preparation process has been applied to several drug development programs. The results demonstrate that application of the RSPP for fully automated sample processing is efficient and rugged. The RSPP not only saved more than 50% of the time in sample pipetting and dilution but also reduced human errors. The generated bioanalytical data are accurate and precise; therefore, this application can be used in regulated bioanalysis.
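    The tiered dilution logic can be sketched as a decomposition of a requested factor into serial steps of at most 10-fold each. This is a hypothetical rule consistent with the three tiers described (1- to 10-, 11- to 100-, and 101- to 1000-fold), not the actual RSPP implementation:

    ```python
    def dilution_steps(factor):
        """Decompose a requested dilution factor (1- to 1000-fold) into
        serial steps of at most 10-fold each (hypothetical tier rule)."""
        if not 1 <= factor <= 1000:
            raise ValueError("factor outside the 1- to 1000-fold dynamic range")
        steps = []
        remaining = float(factor)
        while remaining > 10:
            steps.append(10.0)
            remaining /= 10.0
        steps.append(remaining)
        return steps

    # A 250-fold dilution becomes two 10-fold steps plus one 2.5-fold step.
    assert dilution_steps(250) == [10.0, 10.0, 2.5]
    ```

    Serial small steps keep each individual pipetting volume within the liquid handler's accurate range, which is why a 1000-fold dilution is not done in a single transfer.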

  2. Optimal sampling strategies for detecting zoonotic disease epidemics.

    Directory of Open Access Journals (Sweden)

    Jake M Ferguson

    2014-06-01

    Full Text Available The early detection of disease epidemics reduces the chance of successful introductions into new locales, minimizes the number of infections, and reduces the financial impact. We develop a framework to determine the optimal sampling strategy for disease detection in zoonotic host-vector epidemiological systems when a disease goes from below detectable levels to an epidemic. We find that if the time of disease introduction is known then the optimal sampling strategy can switch abruptly between sampling only from the vector population to sampling only from the host population. We also construct time-independent optimal sampling strategies when conducting periodic sampling that can involve sampling both the host and the vector populations simultaneously. Both time-dependent and -independent solutions can be useful for sampling design, depending on whether the time of introduction of the disease is known or not. We illustrate the approach with West Nile virus, a globally-spreading zoonotic arbovirus. Though our analytical results are based on a linearization of the dynamical systems, the sampling rules appear robust over a wide range of parameter space when compared to nonlinear simulation models. Our results suggest some simple rules that can be used by practitioners when developing surveillance programs. These rules require knowledge of transition rates between epidemiological compartments, which population was initially infected, and of the cost per sample for serological tests.

  3. Optimal sampling strategies for detecting zoonotic disease epidemics.

    Science.gov (United States)

    Ferguson, Jake M; Langebrake, Jessica B; Cannataro, Vincent L; Garcia, Andres J; Hamman, Elizabeth A; Martcheva, Maia; Osenberg, Craig W

    2014-06-01

    The early detection of disease epidemics reduces the chance of successful introductions into new locales, minimizes the number of infections, and reduces the financial impact. We develop a framework to determine the optimal sampling strategy for disease detection in zoonotic host-vector epidemiological systems when a disease goes from below detectable levels to an epidemic. We find that if the time of disease introduction is known then the optimal sampling strategy can switch abruptly between sampling only from the vector population to sampling only from the host population. We also construct time-independent optimal sampling strategies when conducting periodic sampling that can involve sampling both the host and the vector populations simultaneously. Both time-dependent and -independent solutions can be useful for sampling design, depending on whether the time of introduction of the disease is known or not. We illustrate the approach with West Nile virus, a globally-spreading zoonotic arbovirus. Though our analytical results are based on a linearization of the dynamical systems, the sampling rules appear robust over a wide range of parameter space when compared to nonlinear simulation models. Our results suggest some simple rules that can be used by practitioners when developing surveillance programs. These rules require knowledge of transition rates between epidemiological compartments, which population was initially infected, and of the cost per sample for serological tests.
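The abrupt switching between host-only and vector-only sampling can be illustrated with the elementary at-least-one-positive detection probability. This is a toy sketch only: the all-or-nothing budget rule is a simplification of the authors' analysis, and the cost and prevalence figures are invented:

```python
def detection_probability(prevalence, n_samples, test_sensitivity=1.0):
    """Probability that at least one of n independent samples tests positive."""
    p = prevalence * test_sensitivity
    return 1.0 - (1.0 - p) ** n_samples

def best_population(budget, cost_host, cost_vector, prev_host, prev_vector):
    """Spend the whole budget on whichever population yields the higher
    detection probability (an all-or-nothing rule, echoing the abrupt
    switching described in the abstract)."""
    p_host = detection_probability(prev_host, budget // cost_host)
    p_vector = detection_probability(prev_vector, budget // cost_vector)
    return ("host", p_host) if p_host >= p_vector else ("vector", p_vector)
```

With a budget of 100 units, host tests at 10 units each and vector tests at 2, a 5% host prevalence beats a 1% vector prevalence despite the fivefold cost difference, showing why the cost per serological sample enters the rule.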

  4. Apparatus for surface treatment of U-Pu carbide fuel samples

    International Nuclear Information System (INIS)

    Fukushima, Susumu; Arai, Yasuo; Handa, Muneo; Ohmichi, Toshihiko; Shiozawa, Ken-ichi.

    1979-05-01

An apparatus has been constructed for treating the surface of U-Pu carbide fuel samples for EPMA. The treatment cleans the oxide layer off the surface and then coats it with an electrically conductive material. The apparatus, safe in handling plutonium, operates as follows. (1) To avoid oxidation of the analyzing surface by oxygen and water in the air, the cleaning and coating sequence, i.e. ion-etching followed by ion-coating or vacuum evaporation, is carried out together in an inert gas atmosphere. (2) Ion-etching is possible on samples embedded in electrically non-conductive resin of low thermal conductivity. (3) Since the temperature rise in (2) is negligible, the samples do not deteriorate. (author)

  5. Nonequilibrium electron-vibration coupling and conductance fluctuations in a C60 junction

    DEFF Research Database (Denmark)

    Ulstrup, Søren; Frederiksen, Thomas; Brandbyge, Mads

    2012-01-01

displacement. Combined with a vibrational heating mechanism, we construct a model from our results that explains the polarity-dependent two-level conductance fluctuations observed in recent scanning tunneling microscopy (STM) experiments [N. Néel et al., Nano Lett. 11, 3593 (2011)]. These findings highlight...

  6. Wilsonville wastewater sampling program. Final report

    Energy Technology Data Exchange (ETDEWEB)

    None

    1983-10-01

As part of its contract to design, build and operate the SRC-1 Demonstration Plant in cooperation with the US Department of Energy (DOE), International Coal Refining Company (ICRC) was required to collect and evaluate data related to wastewater streams and wastewater treatment procedures at the SRC-1 Pilot Plant facility. The pilot plant is located at Wilsonville, Alabama and is operated by Catalytic, Inc. under the direction of Southern Company Services. The plant is funded in part by the Electric Power Research Institute and the DOE. ICRC contracted with Catalytic, Inc. to conduct wastewater sampling. Tasks 1 through 5 included sampling and analysis of various wastewater sources and points of different steps in the biological treatment facility at the plant. The sampling program ran from May 1 to July 31, 1982. Also included in the sampling program was the generation and analysis of leachate from SRC product using standard laboratory leaching procedures. For Task 6, available plant wastewater data covering the period from February 1978 to December 1981 were analyzed to gain information that might be useful for a demonstration plant design basis. This report contains a tabulation of the analytical data, a summary tabulation of the historical operating data that was evaluated, and comments concerning the data. The procedures used during the sampling program are also documented.

  7. ExSample. A library for sampling Sudakov-type distributions

    Energy Technology Data Exchange (ETDEWEB)

    Plaetzer, Simon

    2011-08-15

    Sudakov-type distributions are at the heart of generating radiation in parton showers as well as contemporary NLO matching algorithms along the lines of the POWHEG algorithm. In this paper, the C++ library ExSample is introduced, which implements adaptive sampling of Sudakov-type distributions for splitting kernels which are in general only known numerically. Besides the evolution variable, the splitting kernels can depend on an arbitrary number of other degrees of freedom to be sampled, and any number of further parameters which are fixed on an event-by-event basis. (orig.)

  8. ExSample. A library for sampling Sudakov-type distributions

    International Nuclear Information System (INIS)

    Plaetzer, Simon

    2011-08-01

    Sudakov-type distributions are at the heart of generating radiation in parton showers as well as contemporary NLO matching algorithms along the lines of the POWHEG algorithm. In this paper, the C++ library ExSample is introduced, which implements adaptive sampling of Sudakov-type distributions for splitting kernels which are in general only known numerically. Besides the evolution variable, the splitting kernels can depend on an arbitrary number of other degrees of freedom to be sampled, and any number of further parameters which are fixed on an event-by-event basis. (orig.)
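Sudakov-type densities of this kind are commonly drawn with the veto algorithm, which needs only an analytically invertible overestimate of the splitting kernel. The sketch below shows that generic, non-adaptive technique; it is not ExSample's own adaptive implementation, and the constant-kernel example is a toy:

```python
import math
import random

def veto_sample(t_start, t_cut, kernel, overestimate, inv_integral):
    """Draw the next emission scale t below t_start from a Sudakov-type density
    f(t) = kernel(t) * exp(-int_t^{t_start} kernel(s) ds)
    via the veto algorithm, given an overestimate g(t) >= kernel(t).
    inv_integral(t_hi, r) must solve int_t^{t_hi} g(s) ds = -log(r) for t.
    Returns None when the evolution drops below the cutoff (no emission)."""
    t = t_start
    while True:
        t = inv_integral(t, random.random())
        if t <= t_cut:
            return None
        if random.random() < kernel(t) / overestimate(t):  # accept with ratio f/g
            return t

# Toy check: constant kernel a(t) = a with constant overestimate g >= a.
a, g = 0.5, 1.0
inv = lambda t_hi, r: t_hi + math.log(r) / g  # solves g * (t_hi - t) = -log(r)
random.seed(1)
scales = [veto_sample(10.0, 0.0, lambda t: a, lambda t: g, inv)
          for _ in range(20000)]
```

For the constant kernel, the gap t_start − t is exponentially distributed with rate a (truncated at the cutoff), which gives a simple statistical check on the sampler.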

  9. Remote sampling and analysis of highly radioactive samples in shielded boxes

    International Nuclear Information System (INIS)

    Kirpikov, D.A.; Miroshnichenko, I.V.; Pykhteev, O.Yu.

    2010-01-01

The sampling procedure used for highly radioactive coolant water is associated with a high risk of personnel irradiation and uncontrolled radioactive contamination. Remote sample manipulation, with provision for proper radiation shielding, is intended to enhance the safety of the sampling procedure. The sampling lines are located in an isolated compartment, a shielded box. Various equipment which enables remote or automatic sample manipulation is used for this purpose. The main issues in developing shielded-box equipment intended for a wider range of remote chemical analyses and manipulation techniques for highly radioactive water samples are considered in the paper. There were three principal directions of work: transfer of chemical analysis performed in the laboratory into the shielded box; prevalence of computer-aided and remote techniques of highly radioactive sample manipulation inside the shielded box; and increase in control over sampling and determination of thermal-hydraulic parameters of the coolant water in the sampling lines. The developed equipment and solutions enable remote chemical analysis in the restricted volume of the shielded box using ion-chromatographic, amperometric, fluorimetric, flow-injection, phototurbidimetric, conductometric and potentiometric methods. The extent of control performed in the shielded box is determined taking into account the requirements of the regulatory documents as well as the feasibility and cost of the technical adaptation of the various methods to the shielded-box conditions. The work resulted in highly precise determination of more than 15 indexes of coolant water quality performed in on-line mode in the shielded box, averaging 80% of the total extent of control performed at the prototype reactor plants. Novel solutions for highly radioactive sample handling are implemented in the shielded box (for example, packaging, sample transportation to the laboratory, volume measurement). The shielded box is

  10. Applicability of neutron activation analysis to geological samples

    Energy Technology Data Exchange (ETDEWEB)

    Ebihara, Mitsuru [Tokyo Metropolitan Univ., Graduate School of Science, Tokyo (Japan)

    2003-03-01

    The applicability of neutron activation analysis (NAA) to geological samples in space is discussed by referring to future space mission programs, by which the extraterrestrial samples are to be delivered to the earth for scientific inspections. It is concluded that both destructive and non-destructive NAA are highly effective in analyzing these samples. (author)

  11. Applicability of neutron activation analysis to geological samples

    International Nuclear Information System (INIS)

    Ebihara, Mitsuru

    2003-01-01

    The applicability of neutron activation analysis (NAA) to geological samples in space is discussed by referring to future space mission programs, by which the extraterrestrial samples are to be delivered to the earth for scientific inspections. It is concluded that both destructive and non-destructive NAA are highly effective in analyzing these samples. (author)

  12. Descriptions of Sampling Practices Within Five Approaches to Qualitative Research in Education and the Health Sciences

    Directory of Open Access Journals (Sweden)

    Timothy C. Guetterman

    2015-05-01

    Full Text Available Although recommendations exist for determining qualitative sample sizes, the literature appears to contain few instances of research on the topic. Practical guidance is needed for determining sample sizes to conduct rigorous qualitative research, to develop proposals, and to budget resources. The purpose of this article is to describe qualitative sample size and sampling practices within published studies in education and the health sciences by research design: case study, ethnography, grounded theory methodology, narrative inquiry, and phenomenology. I analyzed the 51 most highly cited studies using predetermined content categories and noteworthy sampling characteristics that emerged. In brief, the findings revealed a mean sample size of 87. Less than half of the studies identified a sampling strategy. I include a description of findings by approach and recommendations for sampling to assist methodologists, reviewers, program officers, graduate students, and other qualitative researchers in understanding qualitative sampling practices in recent studies. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs1502256

  13. USAGE OF PRESSURE OSCILLATIONS OF FLUCTUATING GAS FLOW FOR HANDLING OF 40X HARDENED STEEL SAMPLES

    Directory of Open Access Journals (Sweden)

    E. E. Il'ina

    2016-07-01

    Full Text Available Subject of Research. The paper deals with experience in the use of advanced technology of aeroacoustic treatment of materials for impact toughness improvement of the 40X type constructional steel samples. The method is based on the influence of pulsating air stream with oscillating shock-wave structures on the sample. As a result, the so-called Maxwell's waves are generated in the sample, that can lead to a beneficial transformation in the micro- and substructure and also in the phase structure of hardened steels. Obtained changes may be enough to improve impact toughness and decrease the residual stresses that arise in the course of previous treatments. Distortion of components decreases in this case, and failure probability becomes lower at the further treatment and operation. The advantage of technology is elimination of the additional heat treatment, for example, of the relaxation annealing that serves to reduce the residual stresses. This can be useful, particularly, for the preservation of high hardness and wear resistance, obtained by hardening and low-temperature tempering (about 200 ° C, as the relaxation annealing has typically a higher temperature and will result in their reduction. The toughness increase of the samples is assumed as an indicator of the positive impact of the considered treatment. Main Results. We have defined characteristics and modes of experimental acoustic transducer implementing the aeroacoustic processing. Experiments have been carried out on the impact assessment of aeroacoustic effects on the toughness of widely used 40X type steel. The obtained results enable to suggest that the application of aeroacoustic treatment for samples hardened by heat treatment leads to the toughness increasing of the investigated material. In this case an increased value of hardness obtained after heat treatment is maintained. Practical Relevance. The results supplement previously obtained experimental data for aeroacoustic

  14. IAEA Sampling Plan

    Energy Technology Data Exchange (ETDEWEB)

    Geist, William H. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-15

    The objectives for this presentation are to describe the method that the IAEA uses to determine a sampling plan for nuclear material measurements; describe the terms detection probability and significant quantity; list the three nuclear materials measurement types; describe the sampling method applied to an item facility; and describe multiple method sampling.
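Attribute sampling plans of this kind are often sized from the textbook approximation DP ≈ 1 − (1 − n/N)^M, solved for the sample size n, where M is the number of items that would have to be defective to amount to a significant quantity. A sketch assuming that common form, not the IAEA's exact procedure:

```python
import math

def attribute_sample_size(N, M, detection_probability):
    """Smallest sample size n, per the common approximation
    DP = 1 - (1 - n/N)**M, that gives at least the requested probability
    of drawing one of M defective items from a population of N."""
    n = N * (1.0 - (1.0 - detection_probability) ** (1.0 / M))
    return math.ceil(n)

print(attribute_sample_size(100, 10, 0.95))  # -> 26
```

The formula makes the trade-off visible: a larger significant quantity (more items M needed to divert it) allows a smaller sample for the same detection probability.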

  15. Testing a groundwater sampling tool: Are the samples representative?

    International Nuclear Information System (INIS)

    Kaback, D.S.; Bergren, C.L.; Carlson, C.A.; Carlson, C.L.

    1989-01-01

A ground water sampling tool, the HydroPunch trademark, was tested at the Department of Energy's Savannah River Site in South Carolina to determine if representative ground water samples could be obtained without installing monitoring wells. Chemical analyses of ground water samples collected with the HydroPunch trademark from various depths within a borehole were compared with chemical analyses of ground water from nearby monitoring wells. The site selected for the test was in the vicinity of a large coal storage pile and a coal pile runoff basin that was constructed to collect the runoff from the coal storage pile. Existing monitoring wells in the area indicate the presence of a ground water contaminant plume that: (1) contains elevated concentrations of trace metals; (2) has an extremely low pH; and (3) contains elevated concentrations of major cations and anions. Ground water samples collected with the HydroPunch trademark provide an excellent estimate of ground water quality at discrete depths. Groundwater chemical data collected from various depths using the HydroPunch trademark can be averaged to simulate what a screen zone in a monitoring well would sample. The averaged depth-discrete data compared favorably with the data obtained from the nearby monitoring wells

  16. Bioremediation of PAH contaminated soil samples

    International Nuclear Information System (INIS)

    Joshi, M.M.; Lee, S.

    1994-01-01

Soils contaminated with polynuclear aromatic hydrocarbons (PAHs) pose a hazard to life. The remediation of such sites can be done using physical, chemical, and biological treatment methods or a combination of them. It is of interest to study the decontamination of soil using bioremediation. The experiments were conducted using Acinetobacter (ATCC 31012) at room temperature without pH or temperature control. In the first series of experiments, contaminated soil samples obtained from the Alberta Research Council were analyzed to determine the toxic contaminants and their composition in the soil. These samples were then treated using aerobic fermentation, and the removal efficiency for each contaminant was determined. In the second series of experiments, a single contaminant was used to prepare a synthetic soil sample. This sample of known composition was then treated using aerobic fermentation in continuously stirred flasks. In one set of flasks, the contaminant was the only carbon source; in the other set, starch was an additional carbon source. In the third series of experiments, the synthetic contaminated soil sample was treated in continuously stirred flasks in the first set and in a fixed bed in the second set, and the removal efficiencies were compared. The removal efficiencies obtained indicated the extent of biodegradation for various contaminants, the effect of an additional carbon source, and the performance in a fixed bed without external aeration

  17. System for Packaging Planetary Samples for Return to Earth

    Science.gov (United States)

Badescu, Mircea; Bar-Cohen, Yoseph; Backes, Paul G.; Sherrit, Stewart; Bao, Xiaoqi; Scott, James S.

    2010-01-01

A system is proposed for packaging material samples on a remote planet (especially Mars) in sealed sample tubes in preparation for later return to Earth. The sample tubes (Figure 1) would comprise (1) tubes initially having open tops and closed bottoms; (2) small, bellows-like collapsible bodies inside the tubes at their bottoms; and (3) plugs to be eventually used to close the tops of the tubes. The top inner surface of each tube would be coated with solder. The side of each plug, which would fit snugly into a tube, would feature a solder-filled ring groove. The system would include equipment for storing, manipulating, filling, and sealing the tubes. The containerization system (see Figure 2) will be organized in stations and will include: the storage station, the loading station, and the heating station. These stations can be structured in a circular or linear pattern to minimize the manipulator complexity, allowing for compact design and mass efficiency. The manipulation of the sample tube between stations is done by a simple manipulator arm. The storage station contains the unloaded sample tubes and the plugs before sealing as well as the sealed sample tubes with samples after loading and sealing. The chambers at the storage station also allow for plug insertion into the sample tube. At the loading station the sample is poured or inserted into the sample tube and then the tube is topped off. At the heating station the plug is heated so the solder ring melts and seals the plug to the sample tube. The process is performed as follows: Each tube is filled or slightly overfilled with sample material and the excess sample material is wiped off the top. Then, the plug is inserted into the top section of the tube, packing the sample material against the collapsible bellows-like body, allowing the accommodation of the sample volume. The plug and the top of the tube are heated momentarily to melt the solder in order to seal the tube.

  18. Sampling trace organic compounds in water: a comparison of a continuous active sampler to continuous passive and discrete sampling methods.

    Science.gov (United States)

    Coes, Alissa L; Paretti, Nicholas V; Foreman, William T; Iverson, Jana L; Alvarez, David A

    2014-03-01

A continuous active sampling method was compared to continuous passive and discrete sampling methods for the sampling of trace organic compounds (TOCs) in water. Results from each method are compared and contrasted in order to provide information for future investigators to use while selecting appropriate sampling methods for their research. The continuous low-level aquatic monitoring (CLAM) sampler (C.I.Agent® Storm-Water Solutions) is a submersible, low-flow-rate sampler that continuously draws water through solid-phase extraction media. CLAM samplers were deployed at two wastewater-dominated stream field sites in conjunction with the deployment of polar organic chemical integrative samplers (POCIS) and the collection of discrete (grab) water samples. All samples were analyzed for a suite of 69 TOCs. The CLAM and POCIS samples represent time-integrated samples that accumulate the TOCs present in the water over the deployment period (19-23 h for CLAM and 29 days for POCIS); the discrete samples represent only the TOCs present in the water at the time and place of sampling. Non-metric multi-dimensional scaling and cluster analysis were used to examine patterns in both TOC detections and relative concentrations between the three sampling methods. A greater number of TOCs were detected in the CLAM samples than in corresponding discrete and POCIS samples, but TOC concentrations in the CLAM samples were significantly lower than in the discrete and (or) POCIS samples. Thirteen TOCs of varying polarity were detected by all of the three methods. TOC detections and concentrations obtained by the three sampling methods, however, are dependent on multiple factors. This study found that stream discharge, constituent loading, and compound type all affected TOC concentrations detected by each method. In addition, TOC detections and concentrations were affected by the reporting limits, bias, recovery, and performance of each method. Published by Elsevier B.V.

  19. Sampling trace organic compounds in water: a comparison of a continuous active sampler to continuous passive and discrete sampling methods

    Science.gov (United States)

    Coes, Alissa L.; Paretti, Nicholas V.; Foreman, William T.; Iverson, Jana L.; Alvarez, David A.

    2014-01-01

A continuous active sampling method was compared to continuous passive and discrete sampling methods for the sampling of trace organic compounds (TOCs) in water. Results from each method are compared and contrasted in order to provide information for future investigators to use while selecting appropriate sampling methods for their research. The continuous low-level aquatic monitoring (CLAM) sampler (C.I.Agent® Storm-Water Solutions) is a submersible, low-flow-rate sampler that continuously draws water through solid-phase extraction media. CLAM samplers were deployed at two wastewater-dominated stream field sites in conjunction with the deployment of polar organic chemical integrative samplers (POCIS) and the collection of discrete (grab) water samples. All samples were analyzed for a suite of 69 TOCs. The CLAM and POCIS samples represent time-integrated samples that accumulate the TOCs present in the water over the deployment period (19–23 h for CLAM and 29 days for POCIS); the discrete samples represent only the TOCs present in the water at the time and place of sampling. Non-metric multi-dimensional scaling and cluster analysis were used to examine patterns in both TOC detections and relative concentrations between the three sampling methods. A greater number of TOCs were detected in the CLAM samples than in corresponding discrete and POCIS samples, but TOC concentrations in the CLAM samples were significantly lower than in the discrete and (or) POCIS samples. Thirteen TOCs of varying polarity were detected by all of the three methods. TOC detections and concentrations obtained by the three sampling methods, however, are dependent on multiple factors. This study found that stream discharge, constituent loading, and compound type all affected TOC concentrations detected by each method. In addition, TOC detections and concentrations were affected by the reporting limits, bias, recovery, and performance of each method.
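For an active sampler like the CLAM, the accumulated analyte mass converts to a time-weighted average concentration by dividing by the total volume of water drawn over the deployment. The sketch below shows only that generic mass-balance relationship, not the study's own data reduction, and the numbers in the example are invented:

```python
def time_weighted_concentration(mass_ng, flow_ml_per_min, hours):
    """Time-weighted average concentration over a deployment: accumulated
    analyte mass divided by the total water volume drawn (flow x time).
    With mass in ng and flow in mL/min, the result is in ng/mL."""
    volume_ml = flow_ml_per_min * hours * 60.0
    return mass_ng / volume_ml

# e.g. 600 ng accumulated at 1 mL/min over a 10 h deployment -> 1.0 ng/mL
print(time_weighted_concentration(600.0, 1.0, 10.0))
```

This averaging is exactly why a time-integrated sampler can detect more compounds yet report lower concentrations than a grab sample taken during a loading pulse.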

  20. Bacterial diversity of surface sand samples from the Gobi and Taklamaken deserts.

    Science.gov (United States)

    An, Shu; Couteau, Cécile; Luo, Fan; Neveu, Julie; DuBow, Michael S

    2013-11-01

    Arid regions represent nearly 30 % of the Earth's terrestrial surface, but their microbial biodiversity is not yet well characterized. The surface sands of deserts, a subset of arid regions, are generally subjected to large temperature fluctuations plus high UV light exposure and are low in organic matter. We examined surface sand samples from the Taklamaken (China, three samples) and Gobi (Mongolia, two samples) deserts, using pyrosequencing of PCR-amplified 16S V1/V2 rDNA sequences from total extracted DNA in order to gain an assessment of the bacterial population diversity. In total, 4,088 OTUs (using ≥97 % sequence similarity levels), with Chao1 estimates varying from 1,172 to 2,425 OTUs per sample, were discernable. These could be grouped into 102 families belonging to 15 phyla, with OTUs belonging to the Firmicutes, Proteobacteria, Bacteroidetes, and Actinobacteria phyla being the most abundant. The bacterial population composition was statistically different among the samples, though members from 30 genera were found to be common among the five samples. An increase in phylotype numbers with increasing C/N ratio was noted, suggesting a possible role in the bacterial richness of these desert sand environments. Our results imply an unexpectedly large bacterial diversity residing in the harsh environment of these two Asian deserts, worthy of further investigation.
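The Chao1 richness estimates quoted above are computed from the observed OTU count plus a correction based on singletons and doubletons. A sketch using the common bias-corrected form of the estimator (the counts in the test are made-up):

```python
from collections import Counter

def chao1(otu_counts):
    """Bias-corrected Chao1 richness estimate:
    S_obs + F1 * (F1 - 1) / (2 * (F2 + 1)),
    where F1 and F2 are the numbers of OTUs seen exactly once (singletons)
    and exactly twice (doubletons)."""
    counts = [c for c in otu_counts if c > 0]
    s_obs = len(counts)
    freq = Counter(counts)
    f1, f2 = freq.get(1, 0), freq.get(2, 0)
    return s_obs + f1 * (f1 - 1) / (2 * (f2 + 1))
```

Because the correction grows with the number of singletons, Chao1 estimates (1,172 to 2,425 OTUs per sample here) exceed the observed richness whenever many OTUs are seen only once, as is typical for undersampled desert communities.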

  1. Sample volume and alignment analysis for an optical particle counter sizer, and other applications

    International Nuclear Information System (INIS)

    Holve, D.J.; Davis, G.W.

    1985-01-01

    Optical methods for particle size distribution measurements in practical high temperature environments are approaching feasibility and offer significant advantages over conventional sampling methods. A key requirement of single particle counting techniques is the need to know features of the sample volume intensity distribution which in general are a function of the particle scattering properties and optical system geometry. In addition, the sample volume intensity distribution is sensitive to system alignment and thus calculations of alignment sensitivity are required for assessment of practical alignment tolerances. To this end, an analysis of sample volume characteristics for single particle counters in general has been developed. Results from the theory are compared with experimental measurements and shown to be in good agreement. A parametric sensitivity analysis is performed and a criterion for allowable optical misalignment is derived for conditions where beam steering caused by fluctuating refractive-index gradients is significant

  2. Environmental surveillance master sampling schedule

    Energy Technology Data Exchange (ETDEWEB)

    Bisping, L E

    1992-01-01

Environmental surveillance of the Hanford Site and surrounding areas is conducted by the Pacific Northwest Laboratory (PNL) for the US Department of Energy (DOE). This document contains the planned schedule for routine sample collection for the Surface Environmental Surveillance Project (SESP) and Ground-Water Monitoring Project. Samples for radiological analyses include Air-Particulate Filter, gases and vapor; Water/Columbia River, Onsite Pond, Spring, Irrigation, and Drinking; Foodstuffs/Animal Products including Whole Milk, Poultry and Eggs, and Beef; Foodstuffs/Produce including Leafy Vegetables, Vegetables, and Fruit; Foodstuffs/Farm Products including Wine, Wheat and Alfalfa; Wildlife; Soil; Vegetation; and Sediment. Direct Radiation Measurements include Terrestrial Locations, Columbia River Shoreline Locations, and Onsite Roadway, Railway, and Aerial Radiation Surveys.

  3. Nanostructured conducting molecularly imprinted polymer for selective extraction of salicylate from urine and serum samples by electrochemically controlled solid-phase micro-extraction

    Energy Technology Data Exchange (ETDEWEB)

    Ameli, Akram [Department of Chemistry, Faculty of Science, Tarbiat Modares University, P.O. Box 14115-175, Tehran (Iran, Islamic Republic of); Alizadeh, Naader, E-mail: alizaden@modares.ac.ir [Department of Chemistry, Faculty of Science, Tarbiat Modares University, P.O. Box 14115-175, Tehran (Iran, Islamic Republic of)

    2011-11-30

Highlights: ► Overoxidized polypyrrole templated with salicylate has been utilized as a conducting molecularly imprinted polymer for EC-SPME. ► This is the first study in which a conducting molecularly imprinted polymer was used for EC-SPME of salicylate. ► The proposed method is particularly effective for sample clean-up and selective monitoring of salicylate in physiological samples. - Abstract: Overoxidized polypyrrole (OPPy) films templated with salicylate (SA) have been utilized as conducting molecularly imprinted polymers (CMIPs) for potential-induced selective solid-phase micro-extraction processes. Various important fabrication factors controlling the performance of the OPPy films have been investigated using fluorescence spectrometry. Several key parameters, such as the applied potentials for uptake and release and the pH of the uptake and release solutions, were varied to achieve the optimum micro-extraction procedure. The film templated with SA exhibited excellent selectivity over several interferences. The calibration graphs were linear in the ranges of 5 × 10⁻⁸ to 5 × 10⁻⁴ and 1.2 × 10⁻⁶ to 5 × 10⁻⁴ mol mL⁻¹, and the detection limit was 4 × 10⁻⁸ mol L⁻¹. The OPPy film has been applied as the solid-phase micro-extraction absorbent for the selective clean-up and quantification of trace amounts of SA in physiological samples. The results of scanning electron microscopy (SEM) have confirmed the nanostructured morphology of the films.

  4. Nanostructured conducting molecularly imprinted polymer for selective extraction of salicylate from urine and serum samples by electrochemically controlled solid-phase micro-extraction

    International Nuclear Information System (INIS)

    Ameli, Akram; Alizadeh, Naader

    2011-01-01

Highlights: ► Overoxidized polypyrrole templated with salicylate has been utilized as a conducting molecularly imprinted polymer for EC-SPME. ► This is the first study in which a conducting molecularly imprinted polymer was used for EC-SPME of salicylate. ► The proposed method is particularly effective for sample clean-up and selective monitoring of salicylate in physiological samples. - Abstract: Overoxidized polypyrrole (OPPy) films templated with salicylate (SA) have been utilized as conducting molecularly imprinted polymers (CMIPs) for potential-induced selective solid-phase micro-extraction processes. Various important fabrication factors controlling the performance of the OPPy films have been investigated using fluorescence spectrometry. Several key parameters, such as the applied potentials for uptake and release and the pH of the uptake and release solutions, were varied to achieve the optimum micro-extraction procedure. The film templated with SA exhibited excellent selectivity over several interferences. The calibration graphs were linear in the ranges of 5 × 10⁻⁸ to 5 × 10⁻⁴ and 1.2 × 10⁻⁶ to 5 × 10⁻⁴ mol mL⁻¹, and the detection limit was 4 × 10⁻⁸ mol L⁻¹. The OPPy film has been applied as the solid-phase micro-extraction absorbent for the selective clean-up and quantification of trace amounts of SA in physiological samples. The results of scanning electron microscopy (SEM) have confirmed the nanostructured morphology of the films.

  5. Sampling bee communities using pan traps: alternative methods increase sample size

    Science.gov (United States)

Monitoring of the status of bee populations and inventories of bee faunas require systematic sampling. Efficiency and ease of implementation have encouraged the use of pan traps to sample bees. Efforts to find an optimal standardized sampling method for pan traps have focused on pan trap color. Th...

  6. New Development on Modelling Fluctuations and Fragmentation in Heavy-Ion Collisions

    Science.gov (United States)

    Lin, Hao; Danielewicz, Pawel

    2017-09-01

During heavy-ion collisions (HIC), the colliding nuclei form an excited composite system. Instabilities in the system may deform its shape exotically, leading to break-up into fragments. Many experimental efforts have been devoted to this nuclear multifragmentation phenomenon, while traditional HIC models, lacking a proper treatment of fluctuations, fall short of explaining it. In view of this, we are developing a new model that implements realistic fluctuations in transport simulations. The model is motivated by the Brownian-motion description of colliding particles: the effects of two-body collisions are recast as one-body diffusion processes, and vastly different dynamical paths are sampled by solving Langevin equations in momentum space. It is this stochastic sampling of dynamical paths that leads to a wide spread of exit channels. In addition, the nucleon degree of freedom is used to enhance the fluctuations. The model has been tested in reactions such as 112Sn + 112Sn and 58Ni + 58Ni, yielding reasonable results. An exploratory comparison of the 112Sn + 112Sn reaction at 50 MeV/nucleon with two other models, the stochastic mean-field (SMF) and antisymmetrized molecular dynamics (AMD) models, has also been conducted. Work supported by the NSF Grant No. PHY-1403906.
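The momentum-space Langevin sampling described above can be illustrated with a minimal one-particle sketch. This is not the authors' transport model: the drag coefficient `gamma`, diffusion strength `D`, and time step are illustrative placeholders, and the code only shows how an Euler-Maruyama update of a Langevin equation produces a stationary spread of momenta.

```python
import math
import random

def langevin_step(p, gamma, D, dt, rng):
    """One Euler-Maruyama update of momentum p: drift -gamma*p plus Gaussian noise."""
    return p - gamma * p * dt + rng.gauss(0.0, math.sqrt(2.0 * D * dt))

rng = random.Random(1)
p, samples = 0.0, []
for step in range(20000):
    p = langevin_step(p, gamma=1.0, D=0.5, dt=0.01, rng=rng)
    if step >= 5000:            # discard the initial transient
        samples.append(p)

# The stationary momentum variance of this process approaches D/gamma = 0.5.
var = sum(x * x for x in samples) / len(samples)
```

Running many such trajectories from the same initial state yields an ensemble of distinct dynamical paths, which is the mechanism the abstract invokes to populate different exit channels.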

  7. Sample Acquisition for Materials in Planetary Exploration (SAMPLE), Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — ORBITEC proposes to analyze, design, and develop a device for autonomous lunar surface/subsurface sampling and processing applications. The Sample Acquisition for...

  8. Biological sample collector

    Science.gov (United States)

Murphy, Gloria A [French Camp, CA]

    2010-09-07

A biological sample collector is adapted to collect several biological samples in a plurality of filter wells. A biological sample collector may comprise a manifold plate for mounting a filter plate thereon, the filter plate having a plurality of filter wells therein; a hollow slider for engaging and positioning a tube that slides therethrough; and a slide case within which the hollow slider travels to allow the tube to be aligned with a selected filter well of the plurality of filter wells, wherein when the tube is aligned with the selected filter well, the tube is pushed through the hollow slider and into the selected filter well to sealingly engage the selected filter well and to allow the tube to deposit a biological sample onto a filter in the bottom of the selected filter well. The biological sample collector may be portable.

  9. Statistical sampling strategies

    International Nuclear Information System (INIS)

    Andres, T.H.

    1987-01-01

    Systems assessment codes use mathematical models to simulate natural and engineered systems. Probabilistic systems assessment codes carry out multiple simulations to reveal the uncertainty in values of output variables due to uncertainty in the values of the model parameters. In this paper, methods are described for sampling sets of parameter values to be used in a probabilistic systems assessment code. Three Monte Carlo parameter selection methods are discussed: simple random sampling, Latin hypercube sampling, and sampling using two-level orthogonal arrays. Three post-selection transformations are also described: truncation, importance transformation, and discretization. Advantages and disadvantages of each method are summarized
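The Monte Carlo selection methods above can be contrasted in a short sketch. The code below is an illustrative comparison of simple random and Latin hypercube sampling on the unit cube; the function names are invented for the example and are not from any particular assessment code.

```python
import random

def simple_random_sample(n, k, seed=0):
    """n points drawn uniformly and independently in the k-dimensional unit cube."""
    rng = random.Random(seed)
    return [[rng.random() for _ in range(k)] for _ in range(n)]

def latin_hypercube_sample(n, k, seed=0):
    """n points such that each of n equal strata is hit exactly once in every dimension."""
    rng = random.Random(seed)
    columns = []
    for _ in range(k):
        strata = list(range(n))
        rng.shuffle(strata)   # random pairing of strata across dimensions
        columns.append([(s + rng.random()) / n for s in strata])
    return [[columns[d][i] for d in range(k)] for i in range(n)]

pts = latin_hypercube_sample(10, 2)
# Each coordinate hits every decile [j/10, (j+1)/10) exactly once:
for d in range(2):
    assert sorted(int(p[d] * 10) for p in pts) == list(range(10))
```

The stratification guarantee in the final assertion is exactly what simple random sampling lacks: with few realizations, independent uniform draws can leave whole regions of parameter space unsampled.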

  10. Novel Sample-handling Approach for XRD Analysis with Minimal Sample Preparation

    Science.gov (United States)

    Sarrazin, P.; Chipera, S.; Bish, D.; Blake, D.; Feldman, S.; Vaniman, D.; Bryson, C.

    2004-01-01

    Sample preparation and sample handling are among the most critical operations associated with X-ray diffraction (XRD) analysis. These operations require attention in a laboratory environment, but they become a major constraint in the deployment of XRD instruments for robotic planetary exploration. We are developing a novel sample handling system that dramatically relaxes the constraints on sample preparation by allowing characterization of coarse-grained material that would normally be impossible to analyze with conventional powder-XRD techniques.

  11. Network and adaptive sampling

    CERN Document Server

    Chaudhuri, Arijit

    2014-01-01

    Combining the two statistical techniques of network sampling and adaptive sampling, this book illustrates the advantages of using them in tandem to effectively capture sparsely located elements in unknown pockets. It shows how network sampling is a reliable guide in capturing inaccessible entities through linked auxiliaries. The text also explores how adaptive sampling is strengthened in information content through subsidiary sampling with devices to mitigate unmanageable expanding sample sizes. Empirical data illustrates the applicability of both methods.

  12. Types of non-probabilistic sampling used in marketing research. „Snowball” sampling

    OpenAIRE

    Manuela Rozalia Gabor

    2007-01-01

A significant way of investigating a firm's market is statistical sampling. The sampling typology includes non-probabilistic models of gathering information, and this paper describes in detail a form of network sampling known as "snowball" sampling. This type of sampling enables the survey of occurrence forms concerning the decision power within an organisation and of the interpersonal relation network governing a certain collectivity, a certain consumer panel. The snowball s...

  13. Environmental monitoring master sampling schedule: January--December 1989

    International Nuclear Information System (INIS)

    Bisping, L.E.

    1989-01-01

    Environmental monitoring of the Hanford Site is conducted by the Pacific Northwest Laboratory (PNL) for the US Department of Energy (DOE). This document contains the planned schedule for routine sample collection for calendar year 1989 for the Surface and Ground-Water Environmental Monitoring Projects. This schedule is subject to modification during the year in response to changes in Site operations, program requirements, and the nature of the observed results. Operational limitations such as weather, mechanical failures, sample availability, etc., may also require schedule modifications. Changes will be documented in the respective project files, but this plan will not be reissued. This schedule includes routine ground-water sampling performed by PNL for Westinghouse Hanford Company, but does not include samples that may be collected in 1989 to support special studies or special contractor projects, or for quality control. The sampling schedule for Site-wide chemical monitoring is not included here, because it varies each quarter as needed, based on past results and operating needs. This schedule does not include Resource Conservation and Recovery Act ground-water sampling performed by PNL for Hanford Site contractors, nor does it include sampling that may be done by other DOE Hanford contractors

  14. Atmospheric Sampling of Aerosols to Stratospheric Altitudes using High Altitude Balloons

    Science.gov (United States)

    Jerde, E. A.; Thomas, E.

    2010-12-01

    burst and the modules return to the surface. The second module will contain instrumentation recording temperature, pressure, and humidity, plus a radio beacon to track the location, facilitating recovery. Another instrument we are planning is a small, lightweight optical aerosol spectrometer probe. This would provide a valuable secondary set of data to compare with the actual sampling. The aerosol particle population will be assessed using the SEM at Morehead State University. Over the next several years, sampling is planned at locations both near and far from urban areas, and at intermediate locations. Sampling will be conducted at four times during the year to assess seasonal variations and, at some sites, repeated short-term samplings (e.g., 5 flights in 10 days) will be undertaken to assess short-term variations. In addition, the SEM should permit the assessment of the ratio of BC to organic carbon (OC). Like BC, organic carbon species are produced through biomass burning, but are not as effective as light absorbers, so are not responsible for as much forcing as black carbon. The atmosphere is sampled at a known volumetric rate, resulting in a picture of the atmospheric column density for both BC and OC, information of great use in modeling of the aerosol contribution to climate change.

  15. Digital Microfluidics Sample Analyzer

    Science.gov (United States)

    Pollack, Michael G.; Srinivasan, Vijay; Eckhardt, Allen; Paik, Philip Y.; Sudarsan, Arjun; Shenderov, Alex; Hua, Zhishan; Pamula, Vamsee K.

    2010-01-01

    Three innovations address the needs of the medical world with regard to microfluidic manipulation and testing of physiological samples in ways that can benefit point-of-care needs for patients such as premature infants, for which drawing of blood for continuous tests can be life-threatening in their own right, and for expedited results. A chip with sample injection elements, reservoirs (and waste), droplet formation structures, fluidic pathways, mixing areas, and optical detection sites, was fabricated to test the various components of the microfluidic platform, both individually and in integrated fashion. The droplet control system permits a user to control droplet microactuator system functions, such as droplet operations and detector operations. Also, the programming system allows a user to develop software routines for controlling droplet microactuator system functions, such as droplet operations and detector operations. A chip is incorporated into the system with a controller, a detector, input and output devices, and software. A novel filler fluid formulation is used for the transport of droplets with high protein concentrations. Novel assemblies for detection of photons from an on-chip droplet are present, as well as novel systems for conducting various assays, such as immunoassays and PCR (polymerase chain reaction). The lab-on-a-chip (a.k.a., lab-on-a-printed-circuit board) processes physiological samples and comprises a system for automated, multi-analyte measurements using sub-microliter samples of human serum. The invention also relates to a diagnostic chip and system including the chip that performs many of the routine operations of a central labbased chemistry analyzer, integrating, for example, colorimetric assays (e.g., for proteins), chemiluminescence/fluorescence assays (e.g., for enzymes, electrolytes, and gases), and/or conductometric assays (e.g., for hematocrit on plasma and whole blood) on a single chip platform.

  16. The effect of sampling rate on interpretation of the temporal characteristics of radiative and convective heating in wildland flames

    Science.gov (United States)

    David Frankman; Brent W. Webb; Bret W. Butler; Daniel Jimenez; Michael Harrington

    2012-01-01

    Time-resolved radiative and convective heating measurements were collected on a prescribed burn in coniferous fuels at a sampling frequency of 500 Hz. Evaluation of the data in the time and frequency domain indicate that this sampling rate was sufficient to capture the temporal fluctuations of radiative and convective heating. The convective heating signal contained...

  17. Sparse sampling and reconstruction for electron and scanning probe microscope imaging

    Science.gov (United States)

    Anderson, Hyrum; Helms, Jovana; Wheeler, Jason W.; Larson, Kurt W.; Rohrer, Brandon R.

    2015-07-28

    Systems and methods for conducting electron or scanning probe microscopy are provided herein. In a general embodiment, the systems and methods for conducting electron or scanning probe microscopy with an undersampled data set include: driving an electron beam or probe to scan across a sample and visit a subset of pixel locations of the sample that are randomly or pseudo-randomly designated; determining actual pixel locations on the sample that are visited by the electron beam or probe; and processing data collected by detectors from the visits of the electron beam or probe at the actual pixel locations and recovering a reconstructed image of the sample.
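The randomly designated pixel-visit step can be sketched in a few lines. This is a hypothetical illustration of undersampled scan planning, not the patented system's code; `choose_scan_subset` and its parameters are invented for the example.

```python
import random

def choose_scan_subset(width, height, fraction, seed=42):
    """Pseudo-randomly designate the subset of pixel locations the beam will visit."""
    rng = random.Random(seed)
    pixels = [(x, y) for y in range(height) for x in range(width)]
    return rng.sample(pixels, int(fraction * len(pixels)))

# Visit only 20% of a 64 x 64 pixel grid; the remaining pixels are
# left for the reconstruction step to fill in.
visits = choose_scan_subset(64, 64, 0.2)
```

In a real instrument the actual visited positions would then be measured (beam placement is imperfect) and fed, with the detector data, to a reconstruction algorithm that recovers the full image from the undersampled set.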

  18. A topological analysis of large-scale structure, studied using the CMASS sample of SDSS-III

    International Nuclear Information System (INIS)

    Parihar, Prachi; Gott, J. Richard III; Vogeley, Michael S.; Choi, Yun-Young; Kim, Juhan; Kim, Sungsoo S.; Speare, Robert; Brownstein, Joel R.; Brinkmann, J.

    2014-01-01

We study the three-dimensional genus topology of large-scale structure using the northern region of the CMASS Data Release 10 (DR10) sample of the SDSS-III Baryon Oscillation Spectroscopic Survey. We select galaxies with redshift 0.452 < z < 0.625 and with a stellar mass M_stellar > 10^11.56 M_☉. We study the topology at two smoothing lengths: R_G = 21 h^-1 Mpc and R_G = 34 h^-1 Mpc. The genus topology studied at the R_G = 21 h^-1 Mpc scale results in the highest genus amplitude observed to date. The CMASS sample yields a genus curve that is characteristic of one produced by Gaussian random phase initial conditions. The data thus support the standard model of inflation where random quantum fluctuations in the early universe produced Gaussian random phase initial conditions. Modest deviations in the observed genus from random phase are as expected from shot noise effects and the nonlinear evolution of structure. We suggest the use of a fitting formula motivated by perturbation theory to characterize the shift and asymmetries in the observed genus curve with a single parameter. We construct 54 mock SDSS CMASS surveys along the past light cone from the Horizon Run 3 (HR3) N-body simulations, where gravitationally bound dark matter subhalos are identified as the sites of galaxy formation. We study the genus topology of the HR3 mock surveys with the same geometry and sampling density as the observational sample and find the observed genus topology to be consistent with ΛCDM as simulated by the HR3 mock samples. We conclude that the topology of the large-scale structure in the SDSS CMASS sample is consistent with cosmological models having primordial Gaussian density fluctuations growing in accordance with general relativity to form galaxies in massive dark matter halos.

  19. Spin fluctuations in iron based superconductors probed by NMR relaxation rate

    Energy Technology Data Exchange (ETDEWEB)

Graefe, Uwe; Kuehne, Tim; Wurmehl, Sabine; Buechner, Bernd; Grafe, Hans-Joachim [IFW Dresden, Institute for Solid State Research, PF 270116, 01171 Dresden (Germany); Hammerath, Franziska [IFW Dresden, Institute for Solid State Research, PF 270116, 01171 Dresden (Germany); Department of Physics 'A. Volta', University of Pavia-CNISM, I-27100 Pavia (Italy); Lang, Guillaume [LPEM-UPR5, CNRS, ESPCI Paris Tech, 10 Rue Vauquelin, 75005 Paris (France)]

    2013-07-01

We present ⁷⁵As nuclear magnetic resonance (NMR) results in F-doped LaOFeAs iron pnictides. In the underdoped superconducting samples, pronounced spin fluctuations lead to a peak in the NMR spin-lattice relaxation rate, (T₁T)⁻¹. The peak shows a typical field dependence that indicates a critical slowing of spin fluctuations: it is reduced in height and shifted to higher temperatures. In contrast, a similar peak in the underdoped magnetic samples at the ordering temperature of the spin density wave does not show such a field dependence. Furthermore, the peak is absent in optimally and overdoped samples, suggesting the absence of strong spin fluctuations. Our results indicate a glassy magnetic ordering in the underdoped samples, in contrast to the often reported Curie-Weiss-like increase of spin fluctuations towards Tc. Additional measurements of the linewidth and the spin-spin relaxation rate are in agreement with such a glassy magnetic ordering, which is most likely competing with superconductivity. Our results will be compared to Co-doped BaFe₂As₂, where a similar peak in (T₁T)⁻¹ has been observed.

  20. Geostatistical Sampling Methods for Efficient Uncertainty Analysis in Flow and Transport Problems

    Science.gov (United States)

    Liodakis, Stylianos; Kyriakidis, Phaedon; Gaganis, Petros

    2015-04-01

In hydrogeological applications involving flow and transport in heterogeneous porous media, the spatial distribution of hydraulic conductivity is often parameterized in terms of a lognormal random field based on a histogram and variogram model inferred from data and/or synthesized from relevant knowledge. Realizations of simulated conductivity fields are then generated using geostatistical simulation involving simple random (SR) sampling and are subsequently used as inputs to physically based simulators of flow and transport in a Monte Carlo framework for evaluating the uncertainty in the spatial distribution of solute concentration due to the uncertainty in the spatial distribution of hydraulic conductivity [1]. Realistic uncertainty analysis, however, calls for a large number of simulated concentration fields and hence can become expensive in terms of both time and computer resources. A more efficient alternative to SR sampling is Latin hypercube (LH) sampling, a special case of stratified random sampling, which yields a more representative distribution of simulated attribute values with fewer realizations [2]. Here, the term representative implies realizations that efficiently span the range of possible conductivity values corresponding to the lognormal random field. In this work we investigate the efficiency of alternatives to classical LH sampling within the context of simulation of flow and transport in a heterogeneous porous medium. More precisely, we consider the stratified likelihood (SL) sampling method of [3], in which attribute realizations are generated using the polar simulation method by exploring the geometrical properties of the multivariate Gaussian distribution function.
In addition, we propose a more efficient version of the above method, here termed minimum energy (ME) sampling, whereby a set of N representative conductivity realizations at M locations is constructed by: (i) generating a representative set of N points distributed on the
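The Latin hypercube construction central to this comparison can be sketched for a single lognormal conductivity value (the univariate case, before any spatial correlation is imposed). The log-space parameters below are assumed for illustration only; `NormalDist.inv_cdf` from the Python standard library supplies the inverse Gaussian CDF.

```python
import math
import random
from statistics import NormalDist

def lh_lognormal_conductivity(n, mu_lnK=-5.0, sigma_lnK=1.0, seed=0):
    """Latin-hypercube draw of n lognormal hydraulic conductivity values.

    One uniform draw per probability stratum spans the whole distribution,
    so far fewer realizations are needed than with simple random sampling.
    mu_lnK and sigma_lnK are illustrative log-space parameters.
    """
    rng = random.Random(seed)
    strata = list(range(n))
    rng.shuffle(strata)
    log_normal = NormalDist(mu_lnK, sigma_lnK)
    u = [(s + rng.random()) / n for s in strata]          # one u per stratum
    return [math.exp(log_normal.inv_cdf(ui)) for ui in u]  # back-transform to K

K = lh_lognormal_conductivity(100)
```

Extending this to a full conductivity field requires honoring the variogram as well, which is where the SL and ME strategies discussed in the abstract come in.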

  1. Measurement of radioactivity in the environment - Soil - Part 2: Guidance for the selection of the sampling strategy, sampling and pre-treatment of samples

    International Nuclear Information System (INIS)

    2007-01-01

    This part of ISO 18589 specifies the general requirements, based on ISO 11074 and ISO/IEC 17025, for all steps in the planning (desk study and area reconnaissance) of the sampling and the preparation of samples for testing. It includes the selection of the sampling strategy, the outline of the sampling plan, the presentation of general sampling methods and equipment, as well as the methodology of the pre-treatment of samples adapted to the measurements of the activity of radionuclides in soil. This part of ISO 18589 is addressed to the people responsible for determining the radioactivity present in soil for the purpose of radiation protection. It is applicable to soil from gardens, farmland, urban or industrial sites, as well as soil not affected by human activities. This part of ISO 18589 is applicable to all laboratories regardless of the number of personnel or the range of the testing performed. When a laboratory does not undertake one or more of the activities covered by this part of ISO 18589, such as planning, sampling or testing, the corresponding requirements do not apply. Information is provided on scope, normative references, terms and definitions and symbols, principle, sampling strategy, sampling plan, sampling process, pre-treatment of samples and recorded information. Five annexes inform about selection of the sampling strategy according to the objectives and the radiological characterization of the site and sampling areas, diagram of the evolution of the sample characteristics from the sampling site to the laboratory, example of sampling plan for a site divided in three sampling areas, example of a sampling record for a single/composite sample and example for a sample record for a soil profile with soil description. A bibliography is provided

  2. The effect of clustering on lot quality assurance sampling: a probabilistic model to calculate sample sizes for quality assessments.

    Science.gov (United States)

    Hedt-Gauthier, Bethany L; Mitsunaga, Tisha; Hund, Lauren; Olives, Casey; Pagano, Marcello

    2013-10-26

Traditional Lot Quality Assurance Sampling (LQAS) designs assume observations are collected using simple random sampling. Alternatively, randomly sampling clusters of observations and then individuals within clusters reduces costs but decreases the precision of the classifications. In this paper, we develop a general framework for designing the cluster(C)-LQAS system and illustrate the method with the design of data quality assessments for the community health worker program in Rwanda. To determine sample size and decision rules for C-LQAS, we use the beta-binomial distribution to account for the inflated risk of errors introduced by sampling clusters at the first stage. We present general theory and code for sample size calculations. The C-LQAS sample sizes provided in this paper constrain misclassification risks below user-specified limits. Multiple C-LQAS systems meet the specified risk requirements, but numerous considerations, including per-cluster versus per-individual sampling costs, help identify optimal systems for distinct applications. We show the utility of C-LQAS for data quality assessments, but the method generalizes to numerous applications. This paper provides the necessary technical detail and supplemental code to support the design of C-LQAS for specific programs.
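A minimal sketch of the beta-binomial tail calculation underlying such designs is shown below, assuming illustrative parameters: the sample size n, decision rule d, and Beta shape values are invented for the example and are not the paper's numbers.

```python
from math import comb, exp, lgamma

def log_beta(a, b):
    """log of the Beta function B(a, b), via log-gamma for numerical stability."""
    return lgamma(a) + lgamma(b) - lgamma(a + b)

def beta_binomial_pmf(k, n, a, b):
    """P(K = k) when K ~ Binomial(n, p) with p ~ Beta(a, b): overdispersed counts."""
    return comb(n, k) * exp(log_beta(k + a, n - k + b) - log_beta(a, b))

def upper_tail(n, d, a, b):
    """Probability of observing more than d 'defective' records in a sample of n."""
    return sum(beta_binomial_pmf(k, n, a, b) for k in range(d + 1, n + 1))

# Illustrative decision rule: n = 50 records, accept if at most d = 5 have errors;
# Beta(2, 18) encodes a 10% mean error prevalence with cluster-induced overdispersion.
risk = upper_tail(50, 5, 2.0, 18.0)
```

Scanning such tail probabilities over candidate (n, d) pairs, for both the acceptable and unacceptable prevalence levels, is how a design constrains both misclassification risks below user-specified limits.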

  3. Psychometric properties of a German parent rating scale for oppositional defiant and conduct disorder (FBB-SSV) in clinical and community samples.

    Science.gov (United States)

    Görtz-Dorten, Anja; Ise, Elena; Hautmann, Christopher; Walter, Daniel; Döpfner, Manfred

    2014-08-01

    The Fremdbeurteilungsbogen für Störungen des Sozialverhaltens (FBB-SSV) is a commonly used DSM- and ICD-based rating scale for disruptive behaviour problems in Germany. This study examined the psychometric properties of the FBB-SSV rated by parents in both a clinical sample (N = 596) and a community sample (N = 720) of children aged 4-17 years. Results indicate that the FBB-SSV is internally consistent (α = .69-.90). Principal component analyses produced two-factor structures that are largely consistent with the distinction between oppositional defiant disorder (ODD) and conduct disorder (CD). Diagnostic accuracy was examined using receiver operating characteristic analyses, which showed that the FBB-SSV is excellent at discriminating children with ODD/CD from those in the community sample (AUC = .91). It has satisfactory diagnostic accuracy for detecting ODD/CD in the clinical sample (AUC = .76). Overall, the results show that the FBB-SSV is a reliable and valid instrument. This finding provides further support for the clinical utility of DSM- and ICD-based rating scales.
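The reported ROC analysis rests on the standard Mann-Whitney interpretation of AUC, which can be sketched directly; the scores below are invented for illustration and are not FBB-SSV data.

```python
def roc_auc(cases, controls):
    """AUC via the Mann-Whitney identity: P(case score > control score), ties count half."""
    wins = sum((c > k) + 0.5 * (c == k) for c in cases for k in controls)
    return wins / (len(cases) * len(controls))

# Illustrative symptom scores for 5 clinical cases and 5 community controls:
auc = roc_auc(cases=[8, 9, 7, 10, 6], controls=[3, 5, 2, 6, 4])
print(auc)  # -> 0.98
```

An AUC of .91, as found for discriminating ODD/CD children from the community sample, means a randomly chosen case outscores a randomly chosen control 91% of the time.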

  4. Urine sample collection protocols for bioassay samples

    Energy Technology Data Exchange (ETDEWEB)

    MacLellan, J.A.; McFadden, K.M.

    1992-11-01

In vitro radiobioassay analyses are used to measure the amount of radioactive material excreted by personnel exposed to the potential intake of radioactive material. The analytical results are then used with various metabolic models to estimate the amount of radioactive material in the subject's body and the original intake of radioactive material. Proper application of these metabolic models requires knowledge of the excretion period. It is normal practice to design the bioassay program based on a 24-hour excretion sample. The Hanford bioassay program simulates a total 24-hour urine excretion sample with urine collection periods lasting from one-half hour before retiring to one-half hour after rising on two consecutive days. Urine passed during the specified periods is collected in three 1-L bottles. Because the daily excretion volume given in Publication 23 of the International Commission on Radiological Protection (ICRP 1975, p. 354) for Reference Man is 1.4 L, it was proposed to use only two 1-L bottles as a cost-saving measure. This raised the broader question of what should be the design capacity of a 24-hour urine sample kit.

  6. The collection and field chemical analysis of water samples

    International Nuclear Information System (INIS)

    Korte, N.E.; Ealey, D.T.; Hollenbach, M.H.

    1984-01-01

A successful water sampling program requires a clear understanding of appropriate measurement and sampling procedures in order to obtain reliable field data and representative samples. It is imperative that the personnel involved have a thorough knowledge of the limitations of the techniques being used. Though this seems self-evident, many sampling and field-chemical-analysis programs are still not properly conducted. Recognizing these problems, the Department of Energy contracted with Bendix Field Engineering Corporation through the Technical Measurements Center to develop and select procedures for water sampling and field chemical analysis at waste sites. The fundamental causes of poor field programs are addressed in this paper, largely through discussion of specific field-measurement techniques and their limitations. Recommendations for improvement, including quality-assurance measures, are also presented

  7. A Method for Choosing the Best Samples for Mars Sample Return.

    Science.gov (United States)

    Gordon, Peter R; Sephton, Mark A

    2018-05-01

    Success of a future Mars Sample Return mission will depend on the correct choice of samples. Pyrolysis-FTIR can be employed as a triage instrument for Mars Sample Return. The technique can thermally dissociate minerals and organic matter for detection. Identification of certain mineral types can determine the habitability of the depositional environment, past or present, while detection of organic matter may suggest past or present habitation. In Mars' history, the Theiikian era represents an attractive target for life search missions and the acquisition of samples. The acidic and increasingly dry Theiikian may have been habitable and followed a lengthy neutral and wet period in Mars' history during which life could have originated and proliferated to achieve relatively abundant levels of biomass with a wide distribution. Moreover, the sulfate minerals produced in the Theiikian are also known to be good preservers of organic matter. We have used pyrolysis-FTIR and samples from a Mars analog ferrous acid stream with a thriving ecosystem to test the triage concept. Pyrolysis-FTIR identified those samples with the greatest probability of habitability and habitation. A three-tier scoring system was developed based on the detection of (i) organic signals, (ii) carbon dioxide and water, and (iii) sulfur dioxide. The presence of each component was given a score of A, B, or C depending on whether the substance had been detected, tentatively detected, or not detected, respectively. Single-step (for greatest possible sensitivity) or multistep (for more diagnostic data) pyrolysis-FTIR methods informed the assignments. The system allowed the highest-priority samples to be categorized as AAA (or A*AA if the organic signal was complex), while the lowest-priority samples could be categorized as CCC. Our methods provide a mechanism with which to rank samples and identify those that should take the highest priority for return to Earth during a Mars Sample Return mission. 
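The three-tier A/B/C scoring described above maps naturally onto a small lookup sketch; the function name and input labels are assumptions for illustration, not the authors' implementation.

```python
def triage_score(organic, co2_h2o, so2, complex_organic=False):
    """Grade each pyrolysis-FTIR component: detected -> A, tentative -> B, absent -> C."""
    grade = {"detected": "A", "tentative": "B", "not detected": "C"}
    score = grade[organic] + grade[co2_h2o] + grade[so2]
    if organic == "detected" and complex_organic:
        score = "A*" + score[1:]   # a complex organic signal upgrades the first grade
    return score

print(triage_score("detected", "detected", "detected"))                        # -> AAA
print(triage_score("detected", "detected", "detected", complex_organic=True))  # -> A*AA
print(triage_score("not detected", "not detected", "not detected"))            # -> CCC
```

Sorting candidate samples by such a score string (AAA/A*AA first, CCC last) yields the priority ranking for return to Earth described in the abstract.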

8. Plasma as alternative sample to quantify tetanus antitoxin

    Directory of Open Access Journals (Sweden)

    Ariel Menéndez-Barrios

    2015-08-01

Tetanus antitoxin is quantified in Cuba at blood banks, from the serum of immunized donors, to produce specific human gamma globulin. A heterogeneous indirect immunoenzymatic assay is used, with serum as the analytical sample. This study evaluated the possible use of plasma obtained from plasmapheresis as an alternative sample, to minimize the volume of whole blood drawn from donors. One hundred plasma donors who came to donate between October and November 2013 were selected by simple random sampling. The serum sample was obtained by drawing 5 mL of blood into a dry glass tube, while the plasma sample of 1.5 mL was taken in a covered plastic tube at the end of the donation, directly from the collected plasma unit. The difference between the means of the two groups was compared using SPSS for Windows. Values obtained in serum were higher than those obtained in plasma, and the difference between the means of the two groups was statistically significant (p 0.00). It is therefore not advisable to use plasma obtained from plasmapheresis as the analytical sample in this assay.

  9. Quality Control Samples for the Radiological Determination of Tritium in Urine Samples

    International Nuclear Information System (INIS)

    Ost'pezuk, P.; Froning, M.; Laumen, S.; Richert, I.; Hill, P.

    2004-01-01

The radioactive decay product of tritium is a low-energy beta particle that cannot penetrate the outer dead layer of human skin. Therefore, the main hazard associated with tritium is internal exposure. In addition, due to its relatively long physical half-life and short biological half-life, tritium must be ingested in large amounts to pose a significant health risk. On the other hand, internal exposure should be kept as low as practical. At the Research Centre Juelich GmbH (FZJ), a considerable fraction of the incorporation monitoring by excretion analysis relates to tritium. Usually an aliquot of a urine sample is mixed with a liquid scintillator and measured in a liquid scintillation counter. Quality control samples in the form of three kinds of internal reference samples (a blank, a reference sample with low activity, and a reference sample with elevated activity) were prepared from mixed, tritium-free urine samples. One millilitre of each of these samples was pipetted into a liquid scintillation vial, and known amounts of tritium were added to some of the vials. All samples were stored at 20 degrees. Based on long-term use of these reference samples it was possible to construct appropriate control charts with upper and lower alarm limits. Daily use of these reference samples significantly decreases the risk of false results in original urine samples, with no significant increase in determination time. (Author) 2 refs
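The control charts mentioned above are typically built from the running history of reference-sample results. A minimal Shewhart-style sketch, with invented activity values (not FZJ data), is:

```python
import statistics

def control_limits(history, n_sigma=3.0):
    """Upper and lower alarm limits as mean +/- n_sigma standard deviations."""
    mean = statistics.fmean(history)
    sd = statistics.stdev(history)
    return mean - n_sigma * sd, mean + n_sigma * sd

# Illustrative daily results for one reference sample (arbitrary count units):
history = [100.0, 98.5, 101.2, 99.8, 100.4, 97.9, 102.1, 100.9]
lo, hi = control_limits(history)
# A new reference-sample result outside [lo, hi] triggers an alarm and flags
# that day's urine measurements for review.
```

Running all three reference samples (blank, low activity, elevated activity) through such charts each day checks the counter across its whole working range.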

  10. Lunar Sample Compendium

    Science.gov (United States)

    Meyer, Charles

    2005-01-01

    The purpose of the Lunar Sample Compendium will be to inform scientists, astronauts and the public about the various lunar samples that have been returned from the Moon. This Compendium will be organized rock by rock in the manner of a catalog, but will not be as comprehensive, nor as complete, as the various lunar sample catalogs that are available. Likewise, this Compendium will not duplicate the various excellent books and reviews on the subject of lunar samples (Cadogen 1981, Heiken et al. 1991, Papike et al. 1998, Warren 2003, Eugster 2003). However, it is thought that an online Compendium, such as this, will prove useful to scientists proposing to study individual lunar samples and should help provide backup information for lunar sample displays. This Compendium will allow easy access to the scientific literature by briefly summarizing the significant findings of each rock along with the documentation of where the detailed scientific data are to be found. In general, discussion and interpretation of the results is left to the formal reviews found in the scientific literature. An advantage of this Compendium will be that it can be updated, expanded and corrected as need be.

  11. Transactional effects among maternal depression, neighborhood deprivation, and child conduct problems from early childhood through adolescence: A tale of two low-income samples.

    Science.gov (United States)

    Shaw, Daniel S; Sitnick, Stephanie L; Reuben, Julia; Dishion, Thomas J; Wilson, Melvin N

    2016-08-01

    The current study sought to advance our understanding of transactional processes among maternal depression, neighborhood deprivation, and child conduct problems (CP) using two samples of low-income families assessed repeatedly from early childhood to early adolescence. After accounting for initial levels of negative parenting, independent and reciprocal effects between maternal depressive symptoms and child CP were evident across both samples, beginning in early childhood and continuing through middle childhood and adolescence. In addition, neighborhood effects were consistently found in both samples after children reached age 5, with earlier neighborhood effects on child CP and maternal depression found in the one exclusively urban sample of families with male children. The results confirm prior research on the independent contribution of maternal depression and child CP to the maintenance of both problem behaviors. The findings also have implications for designing preventative and clinical interventions to address child CP for families living in high-risk neighborhoods.

  12. Curiosity analyzes Martian soil samples

    Science.gov (United States)

    Showstack, Randy; Balcerak, Ernie

    2012-12-01

    NASA's Mars Curiosity rover has conducted its first analysis of Martian soil samples using multiple instruments, the agency announced at a 3 December news briefing at the AGU Fall Meeting in San Francisco. "These results are an unprecedented look at the chemical diversity in the area," said NASA's Michael Meyer, program scientist for Curiosity.

  13. Methods of sampling airborne fungi in working environments of waste treatment facilities.

    Science.gov (United States)

    Černá, Kristýna; Wittlingerová, Zdeňka; Zimová, Magdaléna; Janovský, Zdeněk

    2016-01-01

    The objective of the present study was to evaluate and compare the efficiency of a filter-based sampling method and a high-volume sampling method for sampling airborne culturable fungi present in waste sorting facilities. The membrane filters method was compared with the surface air system method. The selected sampling methods were modified and tested in 2 plastic waste sorting facilities. The total number of colony-forming units (CFU)/m³ of airborne fungi depended on the type of sampling device, on the time of sampling (carried out every hour from the beginning of the work shift), and on the type of cultivation medium. Concentrations of airborne fungi ranged from 2×10² to 1.7×10⁶ CFU/m³ when using the membrane filters (MF) method, and from 3×10² to 6.4×10⁴ CFU/m³ when using the surface air system (SAS) method. Both methods showed comparable sensitivity to fluctuations in the concentrations of airborne fungi during the work shifts. The SAS method is adequate for a fast, indicative determination of the concentration of airborne fungi. The MF method is suitable for a thorough assessment of working-environment contamination by airborne fungi. We therefore recommend the MF method for the implementation of a uniform standard methodology of airborne fungi sampling in working environments of waste treatment facilities. This work is available in Open Access model and licensed under a CC BY-NC 3.0 PL license.

  14. UMTRA project water sampling and analysis plan, Naturita, Colorado

    International Nuclear Information System (INIS)

    1994-04-01

    Surface remedial action is scheduled to begin at the Naturita UMTRA Project processing site in the spring of 1994. No water sampling was performed during 1993 at either the Naturita processing site (NAT-01) or the Dry Flats disposal site (NAT-12). Results of previous water sampling at the Naturita processing site indicate that ground water in the alluvium is contaminated as a result of uranium processing activities. Baseline ground water conditions have been established in the uppermost aquifer at the Dry Flats disposal site. Water sampling activities scheduled for April 1994 include preconstruction sampling of selected monitor wells at the processing site, surface water sampling of the San Miguel River, sampling of several springs/seeps in the vicinity of the disposal site, and sampling of two monitor wells in Coke Oven Valley. The monitor well locations provide sampling points to characterize ground water quality and flow conditions in the vicinity of the sites. The list of analytes has been updated to reflect constituents related to uranium processing activities and the parameters needed for geochemical evaluation. Water sampling will be conducted annually at minimum during the period of construction activities

  15. Re-sampling of the KLX02 deep borehole at Laxemar

    International Nuclear Information System (INIS)

    Laaksoharju, M.; Andersson, Cecilia; Tullborg, E.L.; Wallin, B.; Ekwall, K.; Pedersen, K.

    1999-01-01

    The project focuses on the origin and changes of deep groundwaters, which are important for understanding the stability of the groundwater surrounding the final repository. The results from the sampling campaign in 1997, down to a depth of 1500 m, are compared with the results from 1993, sampled in the same borehole. The analytical results and some preliminary calculations are presented. The changes since the last sampling campaign 4 years earlier indicate a high degree of mixing and dynamics in the system. The following conclusions are drawn: more changes in the water composition than expected compared with the results from the sampling campaign in 1993; larger portions of meteoric water in the upper part of the borehole; less glacial water in the intermediate part of the borehole; more brine water in the lower part of the borehole. The overall conclusion is that there has been a relatively large change in the groundwater system during the last 4 years in the Laxemar deep borehole. The disturbance removed the effect of the last glaciation and pulled in groundwater, resulting in a mixture consisting mainly of meteoric and brine waters. The most probable reason is that the annual fluctuation and flow in the open borehole play an important role as a modifier, especially for the isotopes. The results show the sensitivity of deep groundwater to changes in the prevailing hydrogeological situation

  16. Sampling free energy surfaces as slices by combining umbrella sampling and metadynamics.

    Science.gov (United States)

    Awasthi, Shalini; Kapil, Venkat; Nair, Nisanth N

    2016-06-15

    Metadynamics (MTD) is a very powerful technique for sampling high-dimensional free energy landscapes, and due to its self-guiding property, the method has been successful in studying complex reactions and conformational changes. MTD sampling is based on filling free energy basins with biasing potentials; thus, for cases with flat, broad, and unbound free energy wells, the computational time needed to sample them becomes very large. To alleviate this problem, we combine the standard Umbrella Sampling (US) technique with MTD to sample orthogonal collective variables (CVs) simultaneously. Within this scheme, we construct the equilibrium distribution of CVs from biased distributions obtained from independent MTD simulations with umbrella potentials. Reweighting is carried out by a procedure that combines US reweighting and Tiwary-Parrinello MTD reweighting within the Weighted Histogram Analysis Method (WHAM). The approach is ideal for controlled sampling of a CV in an MTD simulation, making it computationally efficient in sampling flat, broad, and unbound free energy surfaces. This technique also allows for distributed sampling of a high-dimensional free energy surface, further increasing the computational efficiency of sampling. We demonstrate the application of this technique to sampling high-dimensional surfaces for various chemical reactions using ab initio and QM/MM hybrid molecular dynamics simulations. Further, to carry out MTD bias reweighting for computing forward reaction barriers in ab initio or QM/MM simulations, we propose a computationally affordable approach that does not require recrossing trajectories. © 2016 Wiley Periodicals, Inc.
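
    The US part of the reweighting can be illustrated with a minimal one-dimensional WHAM self-consistency loop. The toy data (harmonic umbrella windows over a flat underlying potential) and all numerical values are illustrative; the full US+MTD scheme additionally folds a Tiwary-Parrinello metadynamics reweighting factor into the histogram weights, which is omitted here.

    ```python
    # Minimal 1D WHAM over harmonic umbrella windows on a flat potential
    # (all values illustrative; the metadynamics reweighting factor of the
    # full US+MTD scheme is omitted).
    import math, random

    random.seed(1)
    kT, k_spring = 1.0, 10.0
    centers = [0.0, 0.5, 1.0, 1.5, 2.0]       # umbrella window centers
    nbins, lo, hi = 40, -1.0, 3.0
    width = (hi - lo) / nbins
    mids = [lo + (i + 0.5) * width for i in range(nbins)]

    def bias(j, x):                           # umbrella potential w_j(x)
        return 0.5 * k_spring * (x - centers[j]) ** 2

    # With a flat underlying potential, each window's biased samples are
    # Gaussian around its center with variance kT/k_spring.
    sigma = math.sqrt(kT / k_spring)
    nsamp = 20000
    hists = []
    for c in centers:
        h = [0] * nbins
        for _ in range(nsamp):
            x = random.gauss(c, sigma)
            if lo <= x < hi:
                h[int((x - lo) / width)] += 1
        hists.append(h)

    # Self-consistent WHAM equations:
    #   P(x) = sum_j n_j(x) / sum_j N_j exp((f_j - w_j(x)) / kT)
    #   f_j  = -kT ln sum_x P(x) exp(-w_j(x) / kT)
    f = [0.0] * len(centers)
    for _ in range(200):
        P = []
        for i, x in enumerate(mids):
            num = sum(h[i] for h in hists)
            den = sum(nsamp * math.exp((f[j] - bias(j, x)) / kT)
                      for j in range(len(centers)))
            P.append(num / den)
        Z = sum(P)
        P = [p / Z for p in P]
        f = [-kT * math.log(sum(P[i] * math.exp(-bias(j, mids[i]) / kT)
                                for i in range(nbins)))
             for j in range(len(centers))]
    # P now approximates the unbiased distribution: flat where windows overlap.
    ```

    Because the underlying potential is flat, the recovered P should be approximately uniform across the region covered by the windows; the same loop applied to real biased MTD histograms yields the equilibrium CV distribution the paper reconstructs.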

  17. Conductance fluctuations in a macroscopic 3-dimensional Anderson insulator

    International Nuclear Information System (INIS)

    Sanquer, M.

    1990-01-01

    We report magnetoconductance experiments on an amorphous YxSi1-x alloy (x ≈ 0.3), which is an Anderson insulator with strong spin-orbit scattering. Two principal new features emerge from the data. The first is a halving of the localization length upon application of a magnetic field of about 2.5 T. This effect is predicted by a new approach to transport in Anderson insulators in which basic symmetry considerations are the most important ingredient. The second is the observation of reproducible conductance fluctuations at very low temperature in this macroscopic 3D amorphous material

  18. Influence of sampling depth and post-sampling analysis time on the ...

    African Journals Online (AJOL)

    Bacteriological analysis was carried out for samples taken at water depth and at 1, 6, 12 and 24 hours post-sampling. It was observed that the total and faecal coliform bacteria were significantly higher in the 3 m water depth samples than in the surface water samples (ANOVA, F = 59.41, 26.751, 9.82 (T.C); 46.41, 26.81, ...

  19. Improvements in and relating to the incubation of samples

    International Nuclear Information System (INIS)

    Bagshawe, K.D.

    1978-01-01

    Apparatus is described for incubating a plurality of biological samples, particularly as part of an analysis, e.g. radioimmunoassay or enzyme assay, of the samples. The apparatus comprises an incubation station with a plurality of containers to which samples together with diluent and reagents are supplied. The containers are arranged in rows in two side-by-side columns and are circulated sequentially. Sample removal means is provided either at a fixed location or at a movable point relative to the incubator. Circulation of the containers and the length of sample incubation time are controlled by a computer. The incubation station may include a plurality of sections with the columns in communication, so that rows of samples can be moved from the column of one section to the column of an adjacent section, providing alternative paths for circulation of the samples. (author)

  20. Mahalanobis' Contributions to Sample Surveys

    Indian Academy of Sciences (India)

    Sample Survey started its operations in October 1950 under the ... and adopted random cuts for estimating the acreage under jute ... demographic factors relating to indebtedness, unemployment, ... traffic surveys, demand for currency coins and average life of .... Mahalanobis derived the optimum allocation in stratified.

  1. Network Sampling with Memory: A proposal for more efficient sampling from social networks

    Science.gov (United States)

    Mouw, Ted; Verdery, Ashton M.

    2013-01-01

    Techniques for sampling from networks have grown into an important area of research across several fields. For sociologists, the possibility of sampling from a network is appealing for two reasons: (1) A network sample can yield substantively interesting data about network structures and social interactions, and (2) it is useful in situations where study populations are difficult or impossible to survey with traditional sampling approaches because of the lack of a sampling frame. Despite its appeal, methodological concerns about the precision and accuracy of network-based sampling methods remain. In particular, recent research has shown that sampling from a network using a random walk based approach such as Respondent Driven Sampling (RDS) can result in high design effects (DE)—the ratio of the sampling variance to the sampling variance of simple random sampling (SRS). A high design effect means that more cases must be collected to achieve the same level of precision as SRS. In this paper we propose an alternative strategy, Network Sampling with Memory (NSM), which collects network data from respondents in order to reduce design effects and, correspondingly, the number of interviews needed to achieve a given level of statistical power. NSM combines a “List” mode, where all individuals on the revealed network list are sampled with the same cumulative probability, with a “Search” mode, which gives priority to bridge nodes connecting the current sample to unexplored parts of the network. We test the relative efficiency of NSM compared to RDS and SRS on 162 school and university networks from Add Health and Facebook that range in size from 110 to 16,278 nodes. The results show that the average design effect for NSM on these 162 networks is 1.16, which is very close to the efficiency of a simple random sample (DE=1), and 98.5% lower than the average DE we observed for RDS. PMID:24159246
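
    The design effect DE that motivates NSM can be estimated empirically as the ratio of the estimator's variance under a given design to its variance under SRS. A toy Monte Carlo sketch follows, using cluster sampling on a synthetic clustered population as a stand-in for network homophily; it is not the NSM algorithm itself, and all numbers are illustrative.

    ```python
    # Design effect DE = Var(estimator | design) / Var(estimator | SRS),
    # estimated by Monte Carlo on a toy clustered population (illustrative;
    # cluster sampling stands in for network homophily here, not NSM).
    import random, statistics

    random.seed(7)
    # 100 clusters of 50 people; trait prevalence differs by cluster.
    clusters = [[1 if random.random() < (0.1 if c < 50 else 0.5) else 0
                 for _ in range(50)] for c in range(100)]
    pop = [x for cl in clusters for x in cl]
    n = 200                                   # total sample size

    def srs_mean():
        return statistics.mean(random.sample(pop, n))

    def cluster_mean():                       # draw 4 whole clusters of 50
        chosen = random.sample(clusters, n // 50)
        return statistics.mean([x for cl in chosen for x in cl])

    reps = 2000
    v_srs = statistics.variance([srs_mean() for _ in range(reps)])
    v_clu = statistics.variance([cluster_mean() for _ in range(reps)])
    de = v_clu / v_srs                        # >> 1: clustering inflates variance
    ```

    With these numbers DE comes out well above 1, i.e. many more interviews are needed for the same precision as SRS; reducing exactly this inflation (RDS designs can behave like the clustered case) is what NSM's List and Search modes aim at.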

  2. DNA Sampling Hook

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The DNA Sampling Hook is a significant improvement on a method of obtaining a tissue sample from a live fish in situ from an aquatic environment. A tissue sample...

  3. Representative mass reduction in sampling

    DEFF Research Database (Denmark)

    Petersen, Lars; Esbensen, Harry Kim; Dahl, Casper Kierulf

    2004-01-01

    We here present a comprehensive survey of current mass reduction principles and hardware available in the current market. We conduct a rigorous comparison study of the performance of 17 field and/or laboratory instruments or methods which are quantitatively characterized (and ranked) for accuracy... dividers, the Boerner Divider, the 'spoon method', alternate/fractional shoveling and grab sampling. Only devices based on riffle splitting principles (static or rotational) pass the ultimate representativity test (with minor, but significant relative differences). Grab sampling, the overwhelmingly... most often used mass reduction method, performs appallingly; its use must be discontinued (with the singular exception of completely homogenized fine powders). Only proper mass reduction (i.e. carried out in complete compliance with all appropriate design principles, maintenance and cleaning rules) can...

  4. The electrical signature of rock samples exposed to hydrostatic and triaxial pressures

    Energy Technology Data Exchange (ETDEWEB)

    Heikamp, S.; Nover, G. [Bonn Univ., Bonn (Germany). Mineralogical Institute

    2001-04-01

    The electrical signature of sedimentary (carbonate) and crystalline rock samples was studied in hydrostatic and triaxial pressure experiments up to 300 MPa. The aim was to establish a relation between an electrical signal stimulated by an external pressure acting on the sample and the mechanical stability of the rock. Natural open fractures tend to close under hydrostatic pressure conditions, whereas in triaxial pressure experiments new fractures are generated. These contrary processes of decrease or increase in crack density and geometry cause a corresponding decrease or increase in the inner surface of the sample. Such pressure-induced variations in pore geometry were investigated by interpreting and modelling the frequency dependence of the complex electrical conductivity. In a series of hydrostatic pressure experiments, crack closure appeared in the electrical signature as a decrease of the model capacitor C, which is related to crack geometry. This capacitor increases in the triaxial experiments, where new fractures were formed.

  5. Sampling Development

    Science.gov (United States)

    Adolph, Karen E.; Robinson, Scott R.

    2011-01-01

    Research in developmental psychology requires sampling at different time points. Accurate depictions of developmental change provide a foundation for further empirical studies and theories about developmental mechanisms. However, overreliance on widely spaced sampling intervals in cross-sectional and longitudinal designs threatens the validity of…

  6. Sample preparation of environmental samples using benzene synthesis followed by high-performance LSC

    International Nuclear Information System (INIS)

    Filippis, S. De; Noakes, J.E.

    1991-01-01

    Liquid scintillation counting (LSC) techniques have been widely employed as the detection method for determining environmental levels of tritium and 14C. Since anthropogenic and nonanthropogenic inputs to the environment are a concern, sampling the environment surrounding a nuclear power facility or fuel reprocessing operation requires the collection of many different sample types, including agricultural products, water, biota, aquatic life, soil, and vegetation. These sample types are not suitable for the direct detection of tritium or 14C by liquid scintillation techniques. Each sample type must first be prepared in order to obtain the carbon or hydrogen component of interest and present it in a chemical form that is compatible with common chemicals used in scintillation counting applications. Converting the sample of interest to chemically pure benzene as a sample preparation technique has been widely accepted for processing samples for radiocarbon age-dating applications. The synthesized benzene is composed of the carbon or hydrogen atoms from the original sample and is ideal as a solvent for LSC, with excellent photo-optical properties. Benzene synthesis followed by low-background scintillation counting can be applied to the preparation and measurement of environmental samples, yielding good detection sensitivities, high radionuclide counting efficiency, and shorter preparation time. The method of benzene synthesis provides a unique approach to the preparation of a wide variety of environmental sample types using similar chemistry for all samples

  7. Superposition Enhanced Nested Sampling

    Directory of Open Access Journals (Sweden)

    Stefano Martiniani

    2014-08-01

    Full Text Available The theoretical analysis of many problems in physics, astronomy, and applied mathematics requires an efficient numerical exploration of multimodal parameter spaces that exhibit broken ergodicity. Monte Carlo methods are widely used to deal with these classes of problems, but such simulations suffer from a ubiquitous sampling problem: The probability of sampling a particular state is proportional to its entropic weight. Devising an algorithm capable of sampling efficiently the full phase space is a long-standing problem. Here, we report a new hybrid method for the exploration of multimodal parameter spaces exhibiting broken ergodicity. Superposition enhanced nested sampling combines the strengths of global optimization with the unbiased or athermal sampling of nested sampling, greatly enhancing its efficiency with no additional parameters. We report extensive tests of this new approach for atomic clusters that are known to have energy landscapes for which conventional sampling schemes suffer from broken ergodicity. We also introduce a novel parallelization algorithm for nested sampling.
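
    The basic nested sampling loop that the method builds on can be sketched in a few lines. This toy estimates the evidence Z = ∫ L(x) dx for a Gaussian likelihood on a uniform prior; the superposition enhancement and parallelization of the paper are beyond this sketch, and new live points are drawn by naive rejection sampling, which is only feasible in a low-dimensional toy like this.

    ```python
    # Minimal nested sampling estimate of the evidence Z = int L(x) dx for
    # a Gaussian likelihood on a uniform prior over [0, 1]. Illustrative
    # only: the constrained-prior draw uses naive rejection sampling.
    import math, random

    random.seed(3)
    sigma = 0.2
    def L(x):
        return math.exp(-x * x / (2.0 * sigma * sigma))

    nlive, steps = 500, 3000
    live = [random.random() for _ in range(nlive)]
    Z, X_prev = 0.0, 1.0
    for i in range(1, steps + 1):
        worst = min(live, key=L)
        Lstar = L(worst)
        X = math.exp(-i / nlive)          # expected prior-volume shrinkage
        Z += Lstar * (X_prev - X)         # accumulate evidence
        X_prev = X
        while True:                       # fresh prior draw above the floor
            x = random.random()
            if L(x) > Lstar:
                live[live.index(worst)] = x
                break
    Z += X_prev * sum(L(x) for x in live) / nlive   # remaining live points

    # Analytic value for comparison.
    exact = (sigma * math.sqrt(math.pi / 2.0)
             * math.erf(1.0 / (sigma * math.sqrt(2.0))))
    ```

    The estimate typically lands within a few percent of the analytic value; the statistical error in ln Z scales roughly as the square root of the information gain divided by the number of live points, which is why the rejection step, not the evidence quadrature, dominates the cost as the constraint tightens.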

  8. Direct sampling methods for inverse elastic scattering problems

    Science.gov (United States)

    Ji, Xia; Liu, Xiaodong; Xi, Yingxia

    2018-03-01

    We consider the inverse elastic scattering of incident plane compressional and shear waves from the knowledge of the far field patterns. Specifically, three direct sampling methods for location and shape reconstruction are proposed using different components of the far field patterns. Only inner products are involved in the computation; thus the novel sampling methods are very simple and fast to implement. With the help of the factorization of the far field operator, we give a lower bound on the proposed indicator functionals for sampling points inside the scatterers. For sampling points outside the scatterers, we show that the indicator functionals decay like Bessel functions as the sampling point moves away from the boundary of the scatterers. We also show that the proposed indicator functionals depend continuously on the far field patterns, which further implies that the novel sampling methods are extremely stable with respect to data error. For the case when the observation directions are restricted to a limited aperture, we first introduce some data retrieval techniques to obtain those data that cannot be measured directly and then use the proposed direct sampling methods for location and shape reconstruction. Finally, some numerical simulations in two dimensions are conducted with noisy data, and the results further verify the effectiveness and robustness of the proposed sampling methods, even for multiple multiscale cases and limited-aperture problems.
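
    The inner-product indicator idea can be illustrated in the simpler scalar (acoustic) setting rather than the paper's elastic formulation. Below, synthetic far-field data of a point-like scatterer at an assumed location z0 are generated, and the indicator I(z) = |Σ_j u∞(d_j) e^{ik d_j·z}| is evaluated on a grid; it peaks at the scatterer and decays like the Bessel function J0(k|z − z0|) away from it. All names and values are illustrative assumptions.

    ```python
    # Direct sampling indicator in the scalar (acoustic) setting: synthetic
    # far-field data of a point-like scatterer at z0, indicator
    # I(z) = |sum_j u_inf(d_j) exp(i k d_j . z)| evaluated on a grid.
    # Illustrative sketch, not the paper's elastic formulation.
    import cmath, math

    k = 10.0
    z0 = (0.3, -0.2)                      # "true" scatterer location
    ndir = 64                             # observation directions on the circle
    dirs = [(math.cos(2 * math.pi * j / ndir), math.sin(2 * math.pi * j / ndir))
            for j in range(ndir)]
    u_inf = [cmath.exp(-1j * k * (d[0] * z0[0] + d[1] * z0[1])) for d in dirs]

    def indicator(z):
        return abs(sum(u * cmath.exp(1j * k * (d[0] * z[0] + d[1] * z[1]))
                       for u, d in zip(u_inf, dirs)))

    # Sample on a coarse grid; the peak marks the reconstructed location.
    grid = [(x / 10, y / 10) for x in range(-10, 11) for y in range(-10, 11)]
    best = max(grid, key=indicator)
    ```

    Only one inner product per sampling point is needed, which is why these methods are so cheap compared with iterative reconstruction.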

  9. Quantitative portable gamma spectroscopy sample analysis for non-standard sample geometries

    International Nuclear Information System (INIS)

    Enghauser, M.W.; Ebara, S.B.

    1997-01-01

    Utilizing a portable spectroscopy system, a quantitative method was developed for the analysis of samples containing a mixture of fission and activation products in nonstandard geometries. The method can be used with various sample and shielding configurations where analysis on a laboratory-based gamma spectroscopy system is impractical. The portable gamma spectroscopy method involves calibration of the detector and modeling of the sample and shielding to identify and quantify the radionuclides present in the sample. The method utilizes the intrinsic efficiency of the detector and the unattenuated gamma fluence rate at the detector surface per unit activity from the sample to calculate the nuclide activity and Minimum Detectable Activity (MDA). For a complex geometry, a computer code written for shielding applications (MICROSHIELD) is utilized to determine the unattenuated gamma fluence rate per unit activity at the detector surface. Lastly, the method is applicable only to nuclides which emit gamma rays and cannot be used for pure beta emitters. In addition, if sample self-absorption and shielding are significant, the attenuation will result in high MDAs for nuclides which emit only low-energy gamma rays. The following presents the analysis technique and verification results demonstrating the accuracy of the method
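
    As an illustration of how an MDA is typically computed once the efficiency is calibrated, here is a Currie-style estimate. This is a common convention, not the report's stated formula; the function name and all parameter values are assumptions.

    ```python
    # Currie-style MDA for a gamma line (a common convention; the report
    # does not state its exact formula, so names and numbers here are
    # illustrative assumptions).
    import math

    def mda_bq(background_counts, efficiency, live_time_s, gamma_yield):
        """Minimum detectable activity in Bq at the 95%/95% level."""
        return ((2.71 + 4.65 * math.sqrt(background_counts))
                / (efficiency * live_time_s * gamma_yield))

    # Example: 400 background counts under the peak region, 1% absolute
    # efficiency, 600 s count, 85% emission probability (e.g. the 662 keV
    # line of Cs-137).
    mda = mda_bq(400, 0.01, 600, 0.85)
    ```

    The dependence on efficiency and live time in the denominator shows directly why heavy self-absorption or shielding, which suppresses the effective efficiency for low-energy lines, drives the MDA up, as noted in the abstract.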

  10. Assessing representativeness of sampling methods for reaching men who have sex with men: a direct comparison of results obtained from convenience and probability samples.

    Science.gov (United States)

    Schwarcz, Sandra; Spindler, Hilary; Scheer, Susan; Valleroy, Linda; Lansky, Amy

    2007-07-01

    Convenience samples are used to determine HIV-related behaviors among men who have sex with men (MSM) without measuring the extent to which the results are representative of the broader MSM population. We compared results from a cross-sectional survey of MSM recruited from gay bars between June and October 2001 to a random digit dial telephone survey conducted between June 2002 and January 2003. The men in the probability sample were older, better educated, and had higher incomes than men in the convenience sample; the convenience sample enrolled more employed men and men of color. Substance use around the time of sex was higher in the convenience sample, but other sexual behaviors were similar. HIV testing was common among men in both samples. Periodic validation, through comparison of data collected by different sampling methods, may be useful when relying on survey data for program and policy development.

  11. Anomalous conductivity noise in gapped bilayer graphene heterostructure

    Science.gov (United States)

    Aamir, Mohammed Ali; Karnatak, Paritosh; Sai, T. Phanindra; Ghosh, Arindam

    Bilayer graphene has unique electronic properties: it has a tunable band gap and, like its single-layer counterpart, valley symmetry and a pseudospin degree of freedom. In this work, we present a study of conductance fluctuations in dual-gated bilayer graphene heterostructures in which the Fermi energy and the band gap are varied independently. At a fixed band gap, we find that the conductance fluctuations obtained by Fermi-energy ensemble sampling increase rapidly as the Fermi energy is tuned toward the charge neutrality point (CNP), whereas the time-dependent conductance fluctuations diminish rapidly. This discrepancy is completely absent at higher number densities, where transport is expected to occur through the 2D bulk of the bilayer system. This observation indicates that near the CNP, electrical transport is highly sensitive to the Fermi energy but becomes progressively immune to time-varying disorder. A possible explanation involves transport via edge states, which become the dominant conduction mechanism when the bilayer graphene is gapped and the Fermi energy is close to the CNP, causing a dimensional crossover from 2D to 1D transport. Our experiment outlines a possible experimental protocol for probing intrinsic topological states in gapped bilayer graphene.

  12. Improved sample size determination for attributes and variables sampling

    International Nuclear Information System (INIS)

    Stirpe, D.; Picard, R.R.

    1985-01-01

    Earlier INMM papers have addressed the attributes/variables problem and, under conservative/limiting approximations, have reported analytical solutions for the attributes and variables sample sizes. Through computer simulation of this problem, we have calculated attributes and variables sample sizes as a function of falsification, measurement uncertainties, and required detection probability without using approximations. Using realistic assumptions for the uncertainty parameters of measurement, the simulation results support two conclusions: (1) the previously used conservative approximations can be expensive because they lead to larger sample sizes than needed; and (2) the optimal verification strategy, as well as the falsification strategy, is highly dependent on the underlying uncertainty parameters of the measurement instruments. 1 ref., 3 figs
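
    For the attributes case, the exact (approximation-free) sample size can be computed directly from the hypergeometric non-detection probability P(miss) = C(N−d, n)/C(N, n). A minimal sketch with illustrative numbers, not the paper's simulation:

    ```python
    # Exact attributes sample size: smallest n that draws at least one of
    # the d falsified items from a population of N with probability
    # >= p_detect.  P(miss) = C(N-d, n) / C(N, n); numbers illustrative.
    from math import comb

    def attributes_sample_size(N, d, p_detect):
        p_miss_max = 1.0 - p_detect
        for n in range(1, N + 1):
            # comb(N - d, n) is 0 once n > N - d, so the loop terminates.
            if comb(N - d, n) / comb(N, n) <= p_miss_max:
                return n
        return N

    n = attributes_sample_size(1000, 10, 0.95)
    ```

    For N = 1000 and d = 10 the exact answer lands noticeably below the n the common binomial approximation (1 − d/N)^n ≤ 0.05 suggests (about 298), illustrating the abstract's point that conservative approximations inflate sample sizes.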

  13. 40 CFR Appendix I to Part 261 - Representative Sampling Methods

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 25 2010-07-01 2010-07-01 false Representative Sampling Methods I...—Representative Sampling Methods The methods and equipment used for sampling waste materials will vary with the form and consistency of the waste materials to be sampled. Samples collected using the sampling...

  14. Gravimetric dust sampling for control purposes and occupational dust sampling.

    CSIR Research Space (South Africa)

    Unsted, AD

    1997-02-01

    Full Text Available Prior to the introduction of gravimetric dust sampling, konimeters had been used for dust sampling, which was largely for control purposes. Whether or not absolute results were achievable was not an issue since relative results were used to evaluate...

  15. Forensic Tools to Track and Connect Physical Samples to Related Data

    Science.gov (United States)

    Molineux, A.; Thompson, A. C.; Baumgardner, R. W.

    2016-12-01

    Identifiers, such as local sample numbers, are critical to successfully connecting physical samples and related data; however, identifiers must be globally unique. The International Geo Sample Number (IGSN), generated when registering a sample in the System for Earth Sample Registration (SESAR), provides a globally unique alphanumeric code associated with basic metadata, related samples, and the sample's current physical storage location. When registered samples are published, users can link the figured samples to the basic metadata held at SESAR. The use cases we discuss include plant specimens from a Permian core, Holocene corals and derived powders, and thin sections with SEM stubs. Much of this material is now published. The plant taxonomic study from the core is a digital PDF, and samples can be directly linked from the captions to the SESAR record. The study of stable isotopes from the corals is not yet digitally available, but individual samples are accessible. Full data and media records for both studies are located in our database, where higher-quality images, field notes, and section diagrams may exist. Georeferences permit mapping in current and deep-time plate configurations. Several lessons emerged during this study. First, ensure that adequate and consistent details are registered with SESAR. Second, educate and encourage researchers to obtain IGSNs. Third, publish the archive numbers, assigned prior to publication, alongside the IGSN; this provides access to further data through an Integrated Publishing Toolkit (IPT), aggregators, or online repository databases, placing the initial sample in a much richer context for future studies. Fourth, encourage software developers to customize community software to extract data from a database and use it to register samples in bulk; this would improve workflow and provide a path for registering large legacy collections.

  16. Fluctuation Flooding Method (FFM) for accelerating conformational transitions of proteins

    Science.gov (United States)

    Harada, Ryuhei; Takano, Yu; Shigeta, Yasuteru

    2014-03-01

    A powerful conformational sampling method for accelerating structural transitions of proteins, the "Fluctuation Flooding Method" (FFM), is proposed. In FFM, cycles of the following steps enhance the transitions: (i) extraction of largely fluctuating snapshots along anisotropic modes obtained from trajectories of multiple independent molecular dynamics (MD) simulations, and (ii) conformational re-sampling of the snapshots via re-generation of initial velocities when re-starting the MD simulations. In an application to bacteriophage T4 lysozyme, FFM successfully accelerated the open-closed transition within 6 ns of simulation starting solely from the open state, whereas a 1-μs canonical MD simulation failed to sample such a rare event.

  17. Improvements to robotics-inspired conformational sampling in rosetta.

    Directory of Open Access Journals (Sweden)

    Amelie Stein

    Full Text Available To accurately predict protein conformations in atomic detail, a computational method must be capable of sampling models sufficiently close to the native structure. All-atom sampling is difficult because of the vast number of possible conformations and extremely rugged energy landscapes. Here, we test three sampling strategies to address these difficulties: conformational diversification, intensification of torsion- and omega-angle sampling, and parameter annealing. We evaluate these strategies in the context of the robotics-based kinematic closure (KIC) method for local conformational sampling in Rosetta on an established benchmark set of 45 12-residue protein segments without regular secondary structure. We quantify performance as the fraction of sub-Angstrom models generated. While improvements with individual strategies are only modest, the combination of intensification and annealing strategies into a new "next-generation KIC" method yields a four-fold increase over standard KIC in the median percentage of sub-Angstrom models across the dataset. Such improvements enable progress on more difficult problems, as demonstrated on longer segments, several of which could not be accurately remodeled with previous methods. Given its improved sampling capability, next-generation KIC should allow advances in other applications such as local conformational remodeling of multiple segments simultaneously, flexible backbone sequence design, and development of more accurate energy functions.

  18. New and conventional evaporative systems in concentrating nitrogen samples prior to isotope-ratio analysis

    International Nuclear Information System (INIS)

    Lober, R.W.; Reeder, J.D.; Porter, L.K.

    1987-01-01

    Studies were conducted to quantify and compare the efficiencies of various evaporative systems used in evaporating ¹⁵N samples prior to mass spectrometric analysis. Two new forced-air systems were designed and compared with a conventional forced-air system and with an open-air dry bath technique for effectiveness in preventing atmospheric contamination of evaporating samples. The forced-air evaporative systems significantly reduced the time needed to evaporate samples as compared to the open-air dry bath technique; samples were evaporated to dryness in 2.5 h with the forced-air systems, as compared to 8 to 10 h on the open-air dry bath. The effectiveness of a given forced-air system in preventing atmospheric contamination of evaporating samples was significantly affected by the flow rate of the air stream flowing over the samples. The average atmospheric contaminant N found in samples evaporated on the open-air dry bath was 0.3 μg N, indicating very low concentrations of atmospheric NH₃ during this study. However, in previous studies the authors have experienced significant contamination of ¹⁵N samples evaporated on an open-air dry bath because the level of contaminant N in the laboratory atmosphere varied and could not be adequately controlled. Average cross-contamination levels of 0.28, 0.20, and 1.01 μg of N were measured between samples evaporated on the open-air dry bath, the newly designed forced-air system, and the conventional forced-air system, respectively. The cross-contamination level is significantly higher on the conventional forced-air system than on the other two systems, and could significantly alter the atom % ¹⁵N of highly enriched, low-[N] evaporating samples

  19. Comparison of a continuous working level monitor for radon daughters with conventional grab-sampling

    International Nuclear Information System (INIS)

    Bigu, J.; Grenier, M.

    1982-08-01

    An evaluation of a radon daughter monitor was carried out under laboratory-controlled conditions. The monitor operates on continuous sampling and time-integrating principles and was tested in conjunction with a newly designed, large radon/thoron room calibration facility. The monitor was tested under constant and rapidly fluctuating radiation conditions. Experimental data obtained with the monitor were compared with data obtained by conventional grab-sampling and with an automated radon daughter/thoron daughter 'grab-sampler'. The Working Levels used in the tests ranged from less than 0.01 WL to approximately 10 WL. The measurements were carried out under low aerosol concentrations (approximately 1 × 10³ to 2 × 10³ cm⁻³) to study plate-out effects in the sampling head. Good agreement (within about 10%) was found between the monitor, conventional grab-sampling and the automated grab-sampler. The monitor should prove quite flexible, useful and reliable for monitoring underground and surface environments in the uranium mining industry

  20. Sample Curation in Support of the OSIRIS-REx Asteroid Sample Return Mission

    Science.gov (United States)

    Righter, Kevin; Nakamura-Messenger, Keiko

    2017-01-01

    The OSIRIS-REx asteroid sample return mission launched to asteroid Bennu Sept. 8, 2016. The spacecraft will arrive at Bennu in late 2019, orbit and map the asteroid, and perform a touch-and-go (TAG) sampling maneuver in July 2020. After the sample is stowed and confirmed, the spacecraft will return to Earth, and the sample return capsule (SRC) will land in Utah in September 2023. Samples will be recovered from Utah [2] and then transported and stored in a new sample cleanroom at NASA Johnson Space Center in Houston [3]. The materials curated for the mission are described here. a) Materials Archive and Witness Plate Collection: The SRC and TAGSAM were built between March 2014 and the summer of 2015, and instruments (OTES, OVIRS, OLA, OCAMS, REXIS) were integrated from summer 2015 until May 2016. A total of 395 items were received for the materials archive at NASA-JSC, with archiving finishing 30 days after launch (the final archived items being related to launch operations) [4]. The materials fall into several general categories, including metals (stainless steel, aluminum, titanium alloys, brass and BeCu alloy), epoxies, paints, polymers, lubricants, non-volatile-residue (NVR) samples, sapphire, and various miscellaneous materials. Throughout the ATLO process (from March 2015 until late August 2016), contamination knowledge witness plates (Si wafer and Al foil) were deployed in the various cleanrooms in Denver and KSC to provide an additional record of particle counts and volatiles that is archived for current and future scientific studies. These plates were deployed in roughly monthly increments, with each unit containing 4 Si wafers and 4 Al foils. We archived 128 individual witness plates (64 Si wafers and 64 Al foils); one of each witness plate type (Si and Al) was analyzed immediately by the science team after archiving, while the remaining 3 of each are archived indefinitely.
Information about each material archived is stored in an extensive database at NASA-JSC, and key

  1. Remedial investigation sampling and analysis plan for J-Field, Aberdeen Proving Ground, Maryland. Volume 1: Field Sampling Plan

    Energy Technology Data Exchange (ETDEWEB)

    Benioff, P.; Biang, R.; Dolak, D.; Dunn, C.; Martino, L.; Patton, T.; Wang, Y.; Yuen, C.

    1995-03-01

    The Environmental Management Division (EMD) of Aberdeen Proving Ground (APG), Maryland, is conducting a remedial investigation and feasibility study (RI/FS) of the J-Field area at APG pursuant to the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA), as amended. J-Field is within the Edgewood Area of APG in Harford County, Maryland (Figure 1.1). Since World War II, activities in the Edgewood Area have included the development, manufacture, testing, and destruction of chemical agents and munitions. These materials were destroyed at J-Field by open burning and open detonation (OB/OD). Considerable archival information about J-Field exists as a result of efforts by APG staff to characterize the hazards associated with the site. Contamination of J-Field was first detected during an environmental survey of the Edgewood Area conducted in 1977 and 1978 by the US Army Toxic and Hazardous Materials Agency (USATHAMA) (predecessor to the US Army Environmental Center [AEC]). As part of a subsequent USATHAMA environmental survey, 11 wells were installed and sampled at J-Field. Contamination at J-Field was also detected during a munitions disposal survey conducted by Princeton Aqua Science in 1983. The Princeton Aqua Science investigation involved the installation and sampling of nine wells and the collection and analysis of surficial and deep composite soil samples. In 1986, a Resource Conservation and Recovery Act (RCRA) permit (MD3-21-002-1355) requiring a basewide RCRA Facility Assessment (RFA) and a hydrogeologic assessment of J-Field was issued by the US Environmental Protection Agency (EPA). In 1987, the US Geological Survey (USGS) began a two-phased hydrogeologic assessment in which data were collected to model groundwater flow at J-Field.
Soil gas investigations were conducted, several well clusters were installed, a groundwater flow model was developed, and groundwater and surface water monitoring programs were established that continue today.

  2. Procedures for sampling and sample-reduction within quality assurance systems for solid biofuels

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2005-04-15

    The bias introduced when sampling solid biofuels from stockpiles or containers instead of from moving streams is assessed as well as the number and size of samples required to represent accurately the bulk sample, variations introduced when reducing bulk samples into samples for testing, and the usefulness of sample reduction methods. Details are given of the experimental work carried out in Sweden and Denmark using sawdust, wood chips, wood pellets, forestry residues and straw. The production of a model European Standard for quality assurance of solid biofuels is examined.

  3. 21 CFR 203.38 - Sample lot or control numbers; labeling of sample units.

    Science.gov (United States)

    2010-04-01

    ... numbers; labeling of sample units. (a) Lot or control number required on drug sample labeling and sample... identifying lot or control number that will permit the tracking of the distribution of each drug sample unit...

  4. Evaluation of Respondent-Driven Sampling

    Science.gov (United States)

    McCreesh, Nicky; Frost, Simon; Seeley, Janet; Katongole, Joseph; Tarsh, Matilda Ndagire; Ndunguse, Richard; Jichi, Fatima; Lunel, Natasha L; Maher, Dermot; Johnston, Lisa G; Sonnenberg, Pam; Copas, Andrew J; Hayes, Richard J; White, Richard G

    2012-01-01

    Background Respondent-driven sampling is a novel variant of link-tracing sampling for estimating the characteristics of hard-to-reach groups, such as HIV prevalence in sex-workers. Despite its use by leading health organizations, the performance of this method in realistic situations is still largely unknown. We evaluated respondent-driven sampling by comparing estimates from a respondent-driven sampling survey with total-population data. Methods Total-population data on age, tribe, religion, socioeconomic status, sexual activity and HIV status were available on a population of 2402 male household-heads from an open cohort in rural Uganda. A respondent-driven sampling (RDS) survey was carried out in this population, employing current methods of sampling (RDS sample) and statistical inference (RDS estimates). Analyses were carried out for the full RDS sample and then repeated for the first 250 recruits (small sample). Results We recruited 927 household-heads. Full and small RDS samples were largely representative of the total population, but both samples under-represented men who were younger, of higher socioeconomic status, and with unknown sexual activity and HIV status. Respondent-driven-sampling statistical-inference methods failed to reduce these biases. Only 31%-37% (depending on method and sample size) of RDS estimates were closer to the true population proportions than the RDS sample proportions. Only 50%-74% of respondent-driven-sampling bootstrap 95% confidence intervals included the population proportion. Conclusions Respondent-driven sampling produced a generally representative sample of this well-connected non-hidden population. However, current respondent-driven-sampling inference methods failed to reduce bias when it occurred. Whether the data required to remove bias and measure precision can be collected in a respondent-driven sampling survey is unresolved. 
Respondent-driven sampling should be regarded as a (potentially superior) form of convenience-sampling
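
    The statistical-inference step evaluated above reweights recruits to correct for the fact that well-connected people are over-sampled. As a hedged illustration (the survey's own estimators and data are not reproduced here), the widely used Volz-Heckathorn (RDS-II) estimator weights each respondent by the inverse of their reported network degree:

```python
import numpy as np

def rds_ii_estimate(outcomes, degrees):
    """Volz-Heckathorn (RDS-II) estimator: weight each recruit by the inverse
    of their reported network degree, since high-degree people are more likely
    to be reached through their contacts."""
    outcomes = np.asarray(outcomes, dtype=float)
    w = 1.0 / np.asarray(degrees, dtype=float)
    return float(np.sum(w * outcomes) / np.sum(w))

# Toy data: outcome status (1 = positive) and reported degree for 4 recruits.
status  = [1, 0, 1, 0]
degrees = [10, 2, 10, 2]

naive = sum(status) / len(status)          # raw sample proportion
adjusted = rds_ii_estimate(status, degrees)

print(f"naive = {naive:.3f}, RDS-II = {adjusted:.3f}")
```

    Because the positive cases in this toy data report high degrees, the adjusted estimate falls well below the raw sample proportion; whether such reweighting actually removes bias in practice is exactly what the study above calls into question.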

  5. Evaluation of respondent-driven sampling.

    Science.gov (United States)

    McCreesh, Nicky; Frost, Simon D W; Seeley, Janet; Katongole, Joseph; Tarsh, Matilda N; Ndunguse, Richard; Jichi, Fatima; Lunel, Natasha L; Maher, Dermot; Johnston, Lisa G; Sonnenberg, Pam; Copas, Andrew J; Hayes, Richard J; White, Richard G

    2012-01-01

    Respondent-driven sampling is a novel variant of link-tracing sampling for estimating the characteristics of hard-to-reach groups, such as HIV prevalence in sex workers. Despite its use by leading health organizations, the performance of this method in realistic situations is still largely unknown. We evaluated respondent-driven sampling by comparing estimates from a respondent-driven sampling survey with total population data. Total population data on age, tribe, religion, socioeconomic status, sexual activity, and HIV status were available on a population of 2402 male household heads from an open cohort in rural Uganda. A respondent-driven sampling (RDS) survey was carried out in this population, using current methods of sampling (RDS sample) and statistical inference (RDS estimates). Analyses were carried out for the full RDS sample and then repeated for the first 250 recruits (small sample). We recruited 927 household heads. Full and small RDS samples were largely representative of the total population, but both samples underrepresented men who were younger, of higher socioeconomic status, and with unknown sexual activity and HIV status. Respondent-driven sampling statistical inference methods failed to reduce these biases. Only 31%-37% (depending on method and sample size) of RDS estimates were closer to the true population proportions than the RDS sample proportions. Only 50%-74% of respondent-driven sampling bootstrap 95% confidence intervals included the population proportion. Respondent-driven sampling produced a generally representative sample of this well-connected nonhidden population. However, current respondent-driven sampling inference methods failed to reduce bias when it occurred. Whether the data required to remove bias and measure precision can be collected in a respondent-driven sampling survey is unresolved. Respondent-driven sampling should be regarded as a (potentially superior) form of convenience sampling method, and caution is required

  6. Index to Marine and Lacustrine Geological Samples (IMLGS)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Index to Marine and Lacustrine Geological Samples (IMLGS) describes and provides access to ocean floor and lakebed rock and sediment samples curated by...

  7. Integrated sampling and analysis plan for samples measuring >10 mrem/hour

    International Nuclear Information System (INIS)

    Haller, C.S.

    1992-03-01

    This integrated sampling and analysis plan was prepared to assist in planning and scheduling of Hanford Site sampling and analytical activities for all waste characterization samples that measure greater than 10 mrem/hour. This report also satisfies the requirements of the renegotiated Interim Milestone M-10-05 of the Hanford Federal Facility Agreement and Consent Order (the Tri-Party Agreement). For purposes of comparing the various analytical needs with the Hanford Site laboratory capabilities, the analytical requirements of the various programs were normalized by converting required laboratory effort for each type of sample to a common unit of work, the standard analytical equivalency unit (AEU). The AEU approximates the amount of laboratory resources required to perform an extensive suite of analyses on five core segments individually plus one additional suite of analyses on a composite sample derived from a mixture of the five core segments and prepare a validated RCRA-type data package

  8. Vibronic Boson Sampling: Generalized Gaussian Boson Sampling for Molecular Vibronic Spectra at Finite Temperature.

    Science.gov (United States)

    Huh, Joonsuk; Yung, Man-Hong

    2017-08-07

    Molecular vibronic spectroscopy, where the transitions involve non-trivial bosonic correlations due to the Duschinsky rotation, is strongly believed to be in a similar complexity class as Boson Sampling. At finite temperature, the problem is represented as a Boson Sampling experiment with correlated Gaussian input states. This molecular problem with temperature effects is intimately related to various versions of Boson Sampling that share a similar computational complexity. Here we provide a full description of this relation in the context of Gaussian Boson Sampling. We find a hierarchical structure, which illustrates the relationship among the various Boson Sampling schemes. Specifically, we show that every instance of Gaussian Boson Sampling with an initial correlation can be simulated by an instance of Gaussian Boson Sampling without initial correlation, with only a polynomial overhead. Since every Gaussian state is associated with a thermal state, our result implies that every sampling problem in molecular vibronic transitions, at any temperature, can be simulated by Gaussian Boson Sampling associated with a product of vacuum modes. We refer to such a generalized Gaussian Boson Sampling, motivated by the molecular sampling problem, as Vibronic Boson Sampling.

  9. Sample size methodology

    CERN Document Server

    Desu, M M

    2012-01-01

    One of the most important problems in designing an experiment or a survey is sample size determination and this book presents the currently available methodology. It includes both random sampling from standard probability distributions and from finite populations. Also discussed is sample size determination for estimating parameters in a Bayesian setting by considering the posterior distribution of the parameter and specifying the necessary requirements. The determination of the sample size is considered for ranking and selection problems as well as for the design of clinical trials. Appropria

  10. Fluid sample collection and distribution system. [qualitative analysis of aqueous samples from several points

    Science.gov (United States)

    Brooks, R. L. (Inventor)

    1979-01-01

    A multipoint fluid sample collection and distribution system is provided wherein the sample inputs are made through one or more of a number of sampling valves to a progressive cavity pump which is not susceptible to damage by large unfiltered particles. The pump output is through a filter unit that can provide a filtered multipoint sample. An unfiltered multipoint sample is also provided. An effluent sample can be taken and applied to a second progressive cavity pump for pumping to a filter unit that can provide one or more filtered effluent samples. The second pump can also provide an unfiltered effluent sample. Means are provided to periodically back flush each filter unit without shutting off the whole system.

  11. The Lunar Sample Compendium

    Science.gov (United States)

    Meyer, Charles

    2009-01-01

    The Lunar Sample Compendium is a succinct summary of the data obtained from 40 years of study of Apollo and Luna samples of the Moon. Basic petrographic, chemical and age information is compiled, sample-by-sample, in the form of an advanced catalog in order to provide a basic description of each sample. The LSC can be found online using Google. The initial allocation of lunar samples was done sparingly, because it was realized that scientific techniques would improve over the years and new questions would be formulated. The LSC is important because it enables scientists to select samples within the context of the work that has already been done and facilitates better review of proposed allocations. It also provides back up material for public displays, captures information found only in abstracts, grey literature and curatorial databases and serves as a ready access to the now-vast scientific literature.

  12. Image Sampling with Quasicrystals

    Directory of Open Access Journals (Sweden)

    Mark Grundland

    2009-07-01

    Full Text Available We investigate the use of quasicrystals in image sampling. Quasicrystals produce space-filling, non-periodic point sets that are uniformly discrete and relatively dense, thereby ensuring the sample sites are evenly spread out throughout the sampled image. Their self-similar structure can be attractive for creating sampling patterns endowed with a decorative symmetry. We present a brief general overview of the algebraic theory of cut-and-project quasicrystals based on the geometry of the golden ratio. To assess the practical utility of quasicrystal sampling, we evaluate the visual effects of a variety of non-adaptive image sampling strategies on photorealistic image reconstruction and non-photorealistic image rendering used in multiresolution image representations. For computer visualization of point sets used in image sampling, we introduce a mosaic rendering technique.
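
    The cut-and-project construction mentioned above can be sketched in one dimension, where it yields the Fibonacci chain. The strip choice and projection below follow the standard textbook setup based on the golden ratio, not the paper's own code:

```python
import math

TAU = (1 + math.sqrt(5)) / 2          # golden ratio, ~1.618

def fibonacci_chain(n_points):
    """1-D cut-and-project quasicrystal: accept the integer lattice points
    (n, round(n/TAU)) of Z^2 lying nearest the cut line of slope 1/TAU, and
    project them onto that line. The projected spacings take exactly two
    values ("long" and "short") whose ratio is TAU, in a non-periodic order."""
    norm = math.sqrt(TAU * TAU + 1.0)
    pts = []
    for n in range(n_points):
        m = math.floor(n / TAU + 0.5)          # lattice row nearest the cut line
        pts.append((n * TAU + m) / norm)       # projection onto the physical line
    return pts

pts = fibonacci_chain(30)
gaps = sorted({round(b - a, 6) for a, b in zip(pts, pts[1:])})
print(gaps)        # exactly two distinct spacings, with gaps[1] / gaps[0] ~ TAU
```

    The resulting point set is uniformly discrete and relatively dense, as the abstract requires of a sampling pattern, yet it never repeats periodically.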

  13. Membrane voltage fluctuations reduce spike frequency adaptation and preserve output gain in CA1 pyramidal neurons in a high conductance state

    Science.gov (United States)

    Fernandez, Fernando R.; Broicher, Tilman; Truong, Alan; White, John A.

    2011-01-01

    Modulating the gain of the input-output function of neurons is critical for processing of stimuli and network dynamics. Previous gain control mechanisms have suggested that voltage fluctuations play a key role in determining neuronal gain in vivo. Here we show that, under increased membrane conductance, voltage fluctuations restore Na+ current and reduce spike frequency adaptation in rat hippocampal CA1 pyramidal neurons in vitro. As a consequence, membrane voltage fluctuations produce a leftward shift in the f-I relationship without a change in gain, relative to an increase in conductance alone. Furthermore, we show that these changes have important implications for the integration of inhibitory inputs. Due to the ability to restore Na+ current, hyperpolarizing membrane voltage fluctuations mediated by GABAA-like inputs can increase firing rate in a high conductance state. Finally, our data show that the effects on gain and synaptic integration are mediated by voltage fluctuations within a physiologically relevant range of frequencies (10–40 Hz). PMID:21389243

  14. Personal gravimetric dust sampling and risk assessment.

    CSIR Research Space (South Africa)

    Unsted, AD

    1996-03-01

    Full Text Available The origins of the SIMGAP project GAP046 were rooted in the industry’s need to establish the viability of such a simplification of sampling procedures. Extensive investigations were conducted at three underground sites and one surface...

  15. Sampling and characterization of radioactive liquid wastes

    International Nuclear Information System (INIS)

    Zepeda R, C.; Monroy G, F.; Reyes A, T.; Lizcano, D.; Cruz C, A. C.

    2017-09-01

    To define the management of radioactive liquid wastes stored in 200 L drums, their isotopic and physicochemical characterization is essential. Adequate sampling, that is, representative and homogeneous sampling, is fundamental to obtaining reliable analytical results; therefore, in this work the use of a sampling mechanism that allows collecting homogeneous aliquots safely while minimizing the generation of secondary waste is proposed. With this mechanism, 56 drums of radioactive liquid wastes were sampled and characterized by gamma spectrometry and liquid scintillation, and the following physicochemical properties were determined: pH, conductivity, viscosity, density and chemical composition by gas chromatography. 67.86% of the radioactive liquid wastes contain H-3, and of these, 47.36% can be released unconditionally, since they present activities lower than 100 Bq/g. 94% of the wastes are acidic and 48% have viscosities <50 mPa·s. (Author)

  16. How iSamples (Internet of Samples in the Earth Sciences) Improves Sample and Data Stewardship in the Next Generation of Geoscientists

    Science.gov (United States)

    Hallett, B. W.; Dere, A. L. D.; Lehnert, K.; Carter, M.

    2016-12-01

    Vast numbers of physical samples are routinely collected by geoscientists to probe key scientific questions related to global climate change, biogeochemical cycles, magmatic processes, mantle dynamics, etc. Despite their value as irreplaceable records of nature, the majority of these samples remain undiscoverable by the broader scientific community because they lack a digital presence or are not well-documented enough to facilitate their discovery and reuse for future scientific and educational use. The NSF EarthCube iSamples Research Coordination Network seeks to develop a unified approach across all Earth Science disciplines for the registration, description, identification, and citation of physical specimens in order to take advantage of the new opportunities that cyberinfrastructure offers. Even as consensus around best practices begins to emerge, such as the use of the International Geo Sample Number (IGSN), more work is needed to communicate these practices to investigators to encourage widespread adoption. Recognizing the importance of students and early career scientists in particular to transforming data and sample management practices, the iSamples Education and Training Working Group is developing training modules for sample collection, documentation, and management workflows. These training materials are made available to educators/research supervisors online at http://earthcube.org/group/isamples and can be modularized for supervisors to create a customized research workflow. This study details the design and development of several sample management tutorials, created by early career scientists and documented in collaboration with undergraduate research students in field and lab settings. Modules under development focus on rock outcrops, rock cores, soil cores, and coral samples, with an emphasis on sample management throughout the collection, analysis and archiving process.
We invite others to share their sample management/registration workflows and to

  17. Coordination of Conditional Poisson Samples

    Directory of Open Access Journals (Sweden)

    Grafström Anton

    2015-12-01

    Full Text Available Sample coordination seeks to maximize or to minimize the overlap of two or more samples. The former is known as positive coordination, and the latter as negative coordination. Positive coordination is mainly used for estimation purposes and to reduce data collection costs. Negative coordination is mainly performed to diminish the response burden of the sampled units. Poisson sampling design with permanent random numbers provides an optimum coordination degree of two or more samples. The size of a Poisson sample is, however, random. Conditional Poisson (CP) sampling is a modification of the classical Poisson sampling that produces a fixed-size πps sample. We introduce two methods to coordinate Conditional Poisson samples over time or simultaneously. The first one uses permanent random numbers and the list-sequential implementation of CP sampling. The second method uses a CP sample in the first selection and provides an approximate one in the second selection because the prescribed inclusion probabilities are not respected exactly. The methods are evaluated using the size of the expected sample overlap, and are compared with their competitors using Monte Carlo simulation. The new methods provide a good coordination degree of two samples, close to the performance of Poisson sampling with permanent random numbers.
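
    The permanent-random-number mechanism behind positive coordination is easy to sketch for ordinary (unconditional) Poisson sampling; this toy shows only the PRN idea and the random sample size, not the list-sequential CP implementation the article builds on:

```python
import random

def poisson_sample(prn, incl_prob):
    """Poisson sampling: unit i enters the sample iff its permanent random
    number u_i falls below its inclusion probability pi_i. Reusing the same
    u_i across occasions maximizes the overlap of the samples
    (positive coordination)."""
    return [i for i, (u, p) in enumerate(zip(prn, incl_prob)) if u < p]

rng = random.Random(42)
N = 1000
prn = [rng.random() for _ in range(N)]        # permanent random numbers, drawn once

pi1 = [0.10] * N                              # occasion 1: 10% inclusion probability
pi2 = [0.12] * N                              # occasion 2: slightly larger probabilities

s1 = set(poisson_sample(prn, pi1))
s2 = set(poisson_sample(prn, pi2))

# With shared PRNs and pi1 <= pi2, the first sample is nested in the second.
print(len(s1), len(s2), len(s1 & s2))
```

    The realized sample sizes are random, as the abstract notes; conditioning on a fixed size is precisely what CP sampling adds on top of this mechanism.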

  18. Predicting sample size required for classification performance

    Directory of Open Access Journals (Sweden)

    Figueroa Rosa L

    2012-02-01

    Full Text Available Abstract Background Supervised learning methods need annotated data in order to generate efficient models. Annotated data, however, is a relatively scarce resource and can be expensive to obtain. For both passive and active learning methods, there is a need to estimate the size of the annotated sample required to reach a performance target. Methods We designed and implemented a method that fits an inverse power law model to points of a given learning curve created using a small annotated training set. Fitting is carried out using nonlinear weighted least squares optimization. The fitted model is then used to predict the classifier's performance and confidence interval for larger sample sizes. For evaluation, the nonlinear weighted curve fitting method was applied to a set of learning curves generated using clinical text and waveform classification tasks with active and passive sampling methods, and predictions were validated using standard goodness of fit measures. As control we used an un-weighted fitting method. Results A total of 568 models were fitted and the model predictions were compared with the observed performances. Depending on the data set and sampling method, it took between 80 and 560 annotated samples to achieve mean absolute and root mean squared error below 0.01. Results also show that our weighted fitting method outperformed the baseline un-weighted method (p Conclusions This paper describes a simple and effective sample size prediction algorithm that conducts weighted fitting of learning curves. The algorithm outperformed an un-weighted algorithm described in previous literature. It can help researchers determine annotation sample size for supervised machine learning.
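
    A weighted inverse-power-law fit of the kind described can be sketched as follows. The model form y ≈ a − b·x⁻ᶜ, the grid search over the exponent, and the synthetic accuracies below are illustrative assumptions rather than the authors' exact algorithm:

```python
import numpy as np

def fit_inverse_power_law(x, y, w):
    """Fit y ~ a - b * x**(-c) by weighted least squares: for each candidate
    exponent c the model is linear in (a, b), so solve that two-parameter
    problem exactly and grid-search over c for the lowest weighted SSE."""
    best = None
    for c in np.linspace(0.05, 2.0, 400):
        X = np.column_stack([np.ones_like(x), -x ** (-c)])
        sw = np.sqrt(w)                                  # row weights for lstsq
        coef, *_ = np.linalg.lstsq(sw[:, None] * X, sw * y, rcond=None)
        sse = float(np.sum(w * (y - X @ coef) ** 2))
        if best is None or sse < best[0]:
            best = (sse, coef[0], coef[1], c)
    _, a, b, c = best
    return a, b, c

rng = np.random.default_rng(0)
x = np.array([25, 50, 100, 200, 400, 800], dtype=float)   # annotated-set sizes
true = 0.95 - 1.2 * x ** (-0.7)                           # underlying learning curve
y = true + rng.normal(scale=0.005, size=x.size)           # observed accuracies
w = x / x.sum()                                           # weight later points more

a, b, c = fit_inverse_power_law(x, y, w)
print(f"predicted accuracy at n=5000: {a - b * 5000 ** (-c):.3f} (asymptote ~{a:.3f})")
```

    The fitted asymptote a is the predicted performance ceiling, and extrapolating the curve tells you roughly how many more annotated samples a target accuracy would require.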

  19. Reverse sample genome probing, a new technique for identification of bacteria in environmental samples by DNA hybridization, and its application to the identification of sulfate-reducing bacteria in oil field samples

    International Nuclear Information System (INIS)

    Voordouw, G.; Voordouw, J.K.; Karkhoff-Schweizer, R.R.; Fedorak, P.M.; Westlake, D.W.S.

    1991-01-01

    A novel method for identification of bacteria in environmental samples by DNA hybridization is presented. It is based on the fact that, even within a genus, the genomes of different bacteria may have little overall sequence homology. This allows the use of the labeled genomic DNA of a given bacterium (referred to as a standard) to probe for its presence and that of bacteria with highly homologous genomes in total DNA obtained from an environmental sample. Alternatively, total DNA extracted from the sample can be labeled and used to probe filters on which denatured chromosomal DNA from relevant bacterial standards has been spotted. The latter technique is referred to as reverse sample genome probing, since it is the reverse of the usual practice of deriving probes from reference bacteria for analyzing a DNA sample. Reverse sample genome probing allows identification of bacteria in a sample in a single step once a master filter with suitable standards has been developed. Application of reverse sample genome probing to the identification of sulfate-reducing bacteria in 31 samples obtained primarily from oil fields in the province of Alberta has indicated that there are at least 20 genotypically different sulfate-reducing bacteria in these samples

  20. Development of bull trout sampling protocols

    Science.gov (United States)

    R. F. Thurow; J. T. Peterson; J. W. Guzevich

    2001-01-01

    This report describes results of research conducted in Washington in 2000 through Interagency Agreement #134100H002 between the U.S. Fish and Wildlife Service (USFWS) and the U.S. Forest Service Rocky Mountain Research Station (RMRS). The purpose of this agreement is to develop a bull trout (Salvelinus confluentus) sampling protocol by integrating...

  1. Chorionic villus sampling

    Science.gov (United States)

    Chorionic villus sampling (CVS) is a test some pregnant women have ...

  2. Role of adsorbates on current fluctuations in DC field emission

    International Nuclear Information System (INIS)

    Luong, M.; Bonin, B.; Long, H.; Safa, H.

    1996-01-01

    Field emission experiments in the DC regime usually show significant current fluctuations at a fixed electric field. These fluctuations are attributed to adsorbed layers (molecules or atoms), liable to affect the work function and the height and shape of the potential barrier binding the electrons in the metal. The role of these adsorbed species is investigated by showing that the field emission from a well-desorbed sample is stable and reproducible, and by comparing the emission from the same sample before and after desorption. (author)

  3. Study of the resistive transition in high critical temperature superconductor compounds: the role of fluctuations; Etude de la transition resistive sur des composes supraconducteurs a haute temperature critique le role des fluctuations

    Energy Technology Data Exchange (ETDEWEB)

    Pagnon, V

    1991-04-01

    This thesis is devoted to the study of the resistive transition of high-temperature superconductors in correlation with their anisotropy. Systematic conductivity measurements were made on YBaCuO and BaSrCaCuO as a function of temperature from 4.2 K to 1200 K and in magnetic fields up to 8 T applied in several directions. Oxygen ordering affects the characteristics of the resistive transition of YBaCuO; the activation energy for oxygen absorption is about 0.5 eV. A method for analyzing the conductivity fluctuations near the transition temperature is proposed. Two distinct regimes are noticeable in the YBaCuO compound: the 3D fluctuation regime in the immediate neighbourhood of the transition gives way to a 2D fluctuation regime at higher temperature. The transition temperatures governing the two regimes are different, which is incompatible with the formula proposed by Lawrence and Doniach. On the other hand, the analogy with quasi-2D magnetic systems seems more relevant. Applying a magnetic field or lowering the oxygen concentration suppresses the 3D fluctuation regime. Non-ohmic effects observed at the foot of the resistive transition are analyzed as a manifestation of non-linear 2D excitations of the superconducting phase. Finally, measurements on strontium-doped YBaCuO crystals confirm a metal-insulator transition along the c-axis when the oxygen concentration is reduced; this is connected with the specific-heat jump. All these results highlight the fundamentally two-dimensional character of high-temperature superconductivity.

  4. Sharing, samples, and generics: an antitrust framework.

    Science.gov (United States)

    Carrier, Michael A

    Rising drug prices are in the news. By increasing price, drug companies have placed vital, even life-saving, medicines out of the reach of consumers. In a recent development, brand firms have prevented generics even from entering the market. The ruse for this strategy involves risk-management programs known as Risk Evaluation and Mitigation Strategies ("REMS"). Pursuant to legislation enacted in 2007, the FDA requires REMS when a drug's risks (such as death or injury) outweigh its rewards. Brands have used this regime, intended to bring drugs to the market, to block generic competition. Regulations such as the federal Hatch-Waxman Act and state substitution laws foster widespread generic competition. But these regimes can only be effectuated through generic entry. And that entry can take place only if a generic can use a brand's sample to show that its product is equivalent. More than 100 generic firms have complained that they have not been able to access needed samples. One study of 40 drugs subject to restricted access programs found that generics' inability to enter cost more than $5 billion a year. Brand firms have contended that antitrust law does not compel them to deal with their competitors and have highlighted concerns related to safety and product liability in justifying their refusals. This Article rebuts these claims. It highlights the importance of samples in the regulatory regime and the FDA's inability to address the issue. It shows how a sharing requirement in this setting is consistent with Supreme Court caselaw. And it demonstrates that the brands' behavior fails the defendant-friendly "no economic sense" test because the conduct literally makes no sense other than by harming generics. Brands' denial of samples offers a textbook case of monopolization. In the universe of pharmaceutical antitrust behavior, other conduct--such as "pay for delay" settlements between brands and generics and "product hopping" from one drug to a slightly modified

  5. Estimation after classification using lot quality assurance sampling: corrections for curtailed sampling with application to evaluating polio vaccination campaigns.

    Science.gov (United States)

    Olives, Casey; Valadez, Joseph J; Pagano, Marcello

    2014-03-01

    To assess the bias incurred when curtailment of Lot Quality Assurance Sampling (LQAS) is ignored, to present unbiased estimators, to consider the impact of cluster sampling by simulation and to apply our method to published polio immunization data from Nigeria. We present estimators of coverage when using two kinds of curtailed LQAS strategies: semicurtailed and curtailed. We study the proposed estimators with independent and clustered data using three field-tested LQAS designs for assessing polio vaccination coverage, with samples of size 60 and decision rules of 9, 21 and 33, and compare them to biased maximum likelihood estimators. Lastly, we present estimates of polio vaccination coverage from previously published data in 20 local government authorities (LGAs) from five Nigerian states. Simulations illustrate substantial bias if one ignores the curtailed sampling design. Proposed estimators show no bias. Clustering does not affect the bias of these estimators. Across simulations, standard errors show signs of inflation as clustering increases. Neither sampling strategy nor LQAS design influences estimates of polio vaccination coverage in 20 Nigerian LGAs. When coverage is low, semicurtailed LQAS strategies considerably reduce the sample size required to make a decision. Curtailed LQAS designs further reduce the sample size when coverage is high. Results presented dispel the misconception that curtailed LQAS data are unsuitable for estimation. These findings augment the utility of LQAS as a tool for monitoring vaccination efforts by demonstrating that unbiased estimation using curtailed designs is not only possible but these designs also reduce the sample size. © 2014 John Wiley & Sons Ltd.
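
The bias from ignoring curtailment can be seen in a small simulation. This is a hedged sketch, not the paper's estimator: it assumes one common curtailed stopping rule (stop as soon as the accept/reject decision is certain, for sample size n = 60 and decision rule d = 9 as in the abstract), and the naive estimate successes/observations is the biased quantity the authors correct.

```python
import random

def curtailed_lqas(p, n=60, d=9, rng=random):
    """Draw Bernoulli(p) subjects, stopping as soon as the LQAS decision is
    certain: 'accept' once successes exceed d, 'reject' once enough failures
    accrue that successes can no longer exceed d."""
    succ = fail = 0
    while succ <= d and fail < n - d:
        if rng.random() < p:
            succ += 1
        else:
            fail += 1
    return succ, succ + fail  # successes, number actually sampled

rng = random.Random(1)
p_true = 0.3
trials = [curtailed_lqas(p_true, rng=rng) for _ in range(20000)]
naive = sum(s / m for s, m in trials) / len(trials)  # biased if curtailment ignored
avg_n = sum(m for _, m in trials) / len(trials)      # curtailment shrinks sample size
```

With these settings the naive estimator overshoots p_true while the average number of subjects examined falls well below 60, matching the abstract's two observations: curtailed data mislead naive estimation, and curtailed designs reduce the required sample size.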

  6. Sampling methods

    International Nuclear Information System (INIS)

    Loughran, R.J.; Wallbrink, P.J.; Walling, D.E.; Appleby, P.G.

    2002-01-01

    Methods for the collection of soil samples to determine levels of 137Cs and other fallout radionuclides, such as excess 210Pb and 7Be, will depend on the purposes (aims) of the project, site and soil characteristics, analytical capacity, the total number of samples that can be analysed and the sample mass required. The latter two will depend partly on detector type and capabilities. A variety of field methods have been developed for different field conditions and circumstances over the past twenty years, many of them inherited or adapted from soil science and sedimentology. The use of 137Cs in erosion studies has been widely developed, while the application of fallout 210Pb and 7Be is still developing. Although it is possible to measure these nuclides simultaneously, it is common for experiments to be designed around the use of 137Cs alone. Caesium studies typically involve comparison of the inventories found at eroded or sedimentation sites with that of a 'reference' site. An accurate characterization of the depth distribution of these fallout nuclides is often required in order to apply and/or calibrate the conversion models. However, depending on the tracer involved, the depth distribution, and thus the sampling resolution required to define it, differs. For example, a depth resolution of 1 cm is often adequate when using 137Cs. However, fallout 210Pb and 7Be commonly have very strong surface maxima that decrease exponentially with depth, and fine depth increments are required at or close to the soil surface. Consequently, different depth-incremental sampling methods are required when using different fallout radionuclides. Geomorphic investigations also frequently require determination of the depth distribution of fallout nuclides on slopes and depositional sites as well as their total inventories.
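
Why the surface-peaked tracers demand finer increments can be quantified with an assumed exponential profile C(z) = C0·exp(−z/h0); the relaxation depths below are illustrative values, not figures from the text.

```python
import math

def inventory_fraction(depth_cm, h0_cm):
    """Fraction of total inventory lying above `depth_cm` for an exponential
    profile C(z) = C0*exp(-z/h0); integrating the profile gives 1 - exp(-z/h0)."""
    return 1.0 - math.exp(-depth_cm / h0_cm)

# Illustrative relaxation depths: a deeply distributed 137Cs-like profile
# versus a sharply surface-peaked 7Be-like profile.
cs_top1cm = inventory_fraction(1.0, 10.0)  # ~10% of inventory in the top 1 cm
be_top1cm = inventory_fraction(1.0, 0.5)   # ~86% of inventory in the top 1 cm
```

A 1 cm increment thus resolves a 137Cs-like profile adequately but lumps most of a 7Be-like inventory into a single slice, which is why millimetre-scale increments are needed near the surface for that tracer.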

  7. Hand drawing of pencil electrodes on paper platforms for contactless conductivity detection of inorganic cations in human tear samples using electrophoresis chips.

    Science.gov (United States)

    Chagas, Cyro L S; Costa Duarte, Lucas; Lobo-Júnior, Eulício O; Piccin, Evandro; Dossi, Nicolò; Coltro, Wendell K T

    2015-08-01

    This paper describes for the first time the fabrication of pencil drawn electrodes (PDE) on paper platforms for capacitively coupled contactless conductivity detection (C(4)D) on electrophoresis microchips. PDE-C(4)D devices were attached on PMMA electrophoresis chips and used for detection of K(+) and Na(+) in human tear samples. PDE-C(4)D devices were produced on office paper and chromatographic paper platforms and their performance was thoroughly investigated using a model mixture containing K(+), Na(+), and Li(+). In comparison with chromatographic paper, PDE-C(4)D fabricated on office paper has exhibited better performance due to its higher electrical conductivity. Furthermore, the detector response was similar to that recorded with electrodes prepared with copper adhesive tape. The fabrication of PDE-C(4)D on office paper has offered great advantages, including the extremely low cost of paper. The proposed electrodes demonstrated excellent analytical performance with good reproducibility. For an inter-PDE comparison (n = 7), the RSD values for migration time, peak area, and separation efficiency were lower than 2.5, 10.5, and 14%, respectively. The LODs achieved for K(+), Na(+), and Li(+) were 4.9, 6.8, and 9.0 μM, respectively. The clinical feasibility of the proposed approach was successfully demonstrated with the quantitative analysis of K(+) and Na(+) in tear samples. The concentration levels found for K(+) and Na(+) were, respectively, 20.8 ± 0.1 mM and 101.2 ± 0.1 mM for sample #1, and 20.4 ± 0.1 mM and 111.4 ± 0.1 mM for sample #2. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  8. Applications of Liquid-Phase Microextraction in the Sample Preparation of Environmental Solid Samples

    Directory of Open Access Journals (Sweden)

    Helena Prosen

    2014-05-01

    Full Text Available Solvent extraction remains one of the fundamental sample preparation techniques in the analysis of environmental solid samples, but organic solvents are toxic and environmentally harmful, therefore one of the possible greening directions is its miniaturization. The present review covers the relevant research from the field of application of microextraction to the sample preparation of environmental solid samples (soil, sediments, sewage sludge, dust etc.) published in the last decade. Several innovative liquid-phase microextraction (LPME) techniques that have emerged recently have also been applied as an aid in sample preparation of these samples: single-drop microextraction (SDME), hollow fiber-liquid phase microextraction (HF-LPME), dispersive liquid-liquid microextraction (DLLME). Besides the common organic solvents, surfactants and ionic liquids are also used. However, these techniques have to be combined with another technique to release the analytes from the solid sample into an aqueous solution. In the present review, the published methods were categorized into three groups: LPME in combination with a conventional solvent extraction; LPME in combination with an environmentally friendly extraction; LPME without previous extraction. The applicability of these approaches to the sample preparation for the determination of pollutants in solid environmental samples is discussed, with emphasis on their strengths, weak points and environmental impact.

  9. Applications of liquid-phase microextraction in the sample preparation of environmental solid samples.

    Science.gov (United States)

    Prosen, Helena

    2014-05-23

    Solvent extraction remains one of the fundamental sample preparation techniques in the analysis of environmental solid samples, but organic solvents are toxic and environmentally harmful, therefore one of the possible greening directions is its miniaturization. The present review covers the relevant research from the field of application of microextraction to the sample preparation of environmental solid samples (soil, sediments, sewage sludge, dust etc.) published in the last decade. Several innovative liquid-phase microextraction (LPME) techniques that have emerged recently have also been applied as an aid in sample preparation of these samples: single-drop microextraction (SDME), hollow fiber-liquid phase microextraction (HF-LPME), dispersive liquid-liquid microextraction (DLLME). Besides the common organic solvents, surfactants and ionic liquids are also used. However, these techniques have to be combined with another technique to release the analytes from the solid sample into an aqueous solution. In the present review, the published methods were categorized into three groups: LPME in combination with a conventional solvent extraction; LPME in combination with an environmentally friendly extraction; LPME without previous extraction. The applicability of these approaches to the sample preparation for the determination of pollutants in solid environmental samples is discussed, with emphasis on their strengths, weak points and environmental impact.

  10. High speed network sampling

    OpenAIRE

    Rindalsholt, Ole Arild

    2005-01-01

    Master's thesis in network and system administration. Classical sampling methods play an important role in the current practice of Internet measurement. With today's high speed networks, routers cannot manage to generate complete Netflow data for every packet. They have to perform restricted sampling. This thesis summarizes some of the most important sampling schemes and their applications before diving into an analysis on the effect of sampling Netflow records.

  11. High-resolution neutron spectroscopy on protein solution samples

    International Nuclear Information System (INIS)

    Grimaldo, M.; Henning, M.; Roosen-Runge, F.; Seydel, T.; Jalarvo, N.; Zamponi, M.; Zanini, F.; Zhang, F.; Schreiber, F.

    2015-01-01

    Proteins in solution are subject to a complex superposition of global translational and rotational diffusion as well as internal relaxations covering a wide range of time scales. With the advent of new high-flux neutron spectrometers in combination with enhanced analysis frameworks it has become possible to separate these different contributions. We discuss new approaches to the analysis by presenting example spectra and fits from data recorded on the backscattering spectrometers IN16, IN16B, and BASIS on the same protein solution sample. We illustrate the separation of the rotational and translational diffusion contribution, the accurate treatment of the solvent contribution, and the extraction of information on internal fluctuations. We also highlight the progress made in passing from second- to third-generation backscattering spectrometers. (authors)

  12. Low-sampling-rate ultra-wideband channel estimation using equivalent-time sampling

    KAUST Repository

    Ballal, Tarig

    2014-09-01

    In this paper, a low-sampling-rate scheme for ultra-wideband channel estimation is proposed. The scheme exploits multiple observations generated by transmitting multiple pulses. In the proposed scheme, P pulses are transmitted to produce channel impulse response estimates at a desired sampling rate, while the ADC samples at a rate that is P times slower. To avoid loss of fidelity, the number of sampling periods (based on the desired rate) in the inter-pulse interval is restricted to be co-prime with P. This condition is affected when clock drift is present and the transmitted pulse locations change. To handle this case, and to achieve an overall good channel estimation performance, without using prior information, we derive an improved estimator based on the bounded data uncertainty (BDU) model. It is shown that this estimator is related to the Bayesian linear minimum mean squared error (LMMSE) estimator. Channel estimation performance of the proposed sub-sampling scheme combined with the new estimator is assessed in simulation. The results show that high reduction in sampling rate can be achieved. The proposed estimator outperforms the least squares estimator in almost all cases, while in the high SNR regime it also outperforms the LMMSE estimator. In addition to channel estimation, a synchronization method is also proposed that utilizes the same pulse sequence used for channel estimation. © 2014 IEEE.
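
The co-prime restriction can be checked with a short number-theoretic sketch (parameter values are illustrative): with P pulses, an ADC running P times slower, and pulses spaced L desired-rate periods apart, the sub-sample phase advances by L mod P from pulse to pulse, so all P phases of the desired rate are covered exactly when gcd(L, P) = 1.

```python
def phases_covered(P, L):
    """Set of sub-sample phases (desired-rate periods mod P) visited over
    P pulses when the ADC samples every P periods and pulses are L apart."""
    return {(k * L) % P for k in range(P)}

P = 8                    # slow-down factor = number of transmitted pulses
coprime_L, bad_L = 5, 6  # inter-pulse spacings in desired-rate periods

full = phases_covered(P, coprime_L)  # gcd(5, 8) = 1 -> all 8 phases visited
gappy = phases_covered(P, bad_L)     # gcd(6, 8) = 2 -> only 4 phases visited
```

This is why clock drift is problematic: shifting pulse locations perturb the effective spacing L, and the interleaved observations no longer tile the fast-rate grid, motivating the bounded-data-uncertainty estimator.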

  13. Fingerprinting analysis of oil samples for inter-laboratory Round Robin, 2007

    Energy Technology Data Exchange (ETDEWEB)

    Yang, C.; Wang, Z.; Hollebone, B.; Brown, C.E.; Landriault, M. [Environment Canada, Ottawa, ON (Canada). Emergencies Science and Technology Division, Science and Technology Branch, Environmental Science and Technology Centre; Shang, D. [Environment Canada, North Vancouver, BC (Canada). Pacific Environmental Science Centre; Losier, R.; Cook, A. [Environment Canada, Moncton, NB (Canada). Environmental Science Centre

    2008-07-01

    The oil from an oil spill must undergo a complete chemical characterization in order to determine the source of the oil, to distinguish the spilled oil from background hydrocarbons and to evaluate the extent of impact. A study was conducted to determine the ability of international analytical laboratories to independently conduct forensic oil analysis and identification. A Round Robin study was conducted in which advanced chemical fingerprinting and data interpretation techniques were used to differentiate the types and sources of spilled oils. The participants of the Round Robin exercise were the Institute of Inland Water Management and Waste Water Treatment (RIZA) in the Netherlands and the Federal Maritime and Hydrographic Agency (BSH) in Germany. In May 2007, 6 oil samples were distributed to the participants. In the artificial oil spill scenario, 2 oil samples were considered as candidate sources and the other 4 samples were labeled as spilled oils. No other information about these oils was provided before submission of final results. Chemical fingerprinting was carried out using gas chromatography, flame ionization detection and mass spectrometry along with statistical data to determine the source of the spill. N-alkanes, alkylated polyaromatic hydrocarbons, biomarker terpanes and steranes and triaromatic steranes were normalized to C30 17α(H)21β(H)-hopane and then semi-quantitated. Thirty diagnostic ratios of target compounds were calculated from their peak heights and areas at selected ions. Results of the 2 source samples were compared with 4 spill samples. Tiered fingerprinting analysis revealed that source oil 1 was a non-match with spill samples 3 and 4, but a probable match with spill samples 5 and 6. Source sample 2 did not match any of the 4 spilled oils. A lack of background information essential to oil spill identification made it impossible to draw an unambiguous conclusion. 14 refs., 4 tabs., 6 figs.
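
The normalization-and-ratio step described above can be sketched in a few lines. All compound names, peak areas, and the matching tolerance here are hypothetical; real identifications use around thirty diagnostic ratios and formal statistical criteria.

```python
def normalized(peaks, ref="C30_hopane"):
    """Normalize peak areas to the C30 hopane internal reference peak,
    cancelling overall dilution or weathering of the sample."""
    r = peaks[ref]
    return {k: v / r for k, v in peaks.items() if k != ref}

def ratio_match(a, b, tol=0.15):
    """Fraction of shared diagnostic ratios agreeing within a relative tolerance."""
    keys = a.keys() & b.keys()
    ok = sum(1 for k in keys if abs(a[k] - b[k]) <= tol * max(a[k], b[k]))
    return ok / len(keys)

# Hypothetical peak areas for a candidate source and a spill sample.
source = {"C30_hopane": 1000.0, "C29_sterane": 420.0, "Ts": 130.0, "Tm": 95.0}
spill  = {"C30_hopane": 800.0,  "C29_sterane": 350.0, "Ts": 101.0, "Tm": 78.0}

score = ratio_match(normalized(source), normalized(spill))
```

Because every compound is expressed relative to the same conserved biomarker, two samples of the same oil should agree on the ratios even when their absolute peak areas differ.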

  14. Fingerprinting analysis of oil samples for inter-laboratory Round Robin, 2007

    International Nuclear Information System (INIS)

    Yang, C.; Wang, Z.; Hollebone, B.; Brown, C.E.; Landriault, M.; Shang, D.; Losier, R.; Cook, A.

    2008-01-01

    The oil from an oil spill must undergo a complete chemical characterization in order to determine the source of the oil, to distinguish the spilled oil from background hydrocarbons and to evaluate the extent of impact. A study was conducted to determine the ability of international analytical laboratories to independently conduct forensic oil analysis and identification. A Round Robin study was conducted in which advanced chemical fingerprinting and data interpretation techniques were used to differentiate the types and sources of spilled oils. The participants of the Round Robin exercise were the Institute of Inland Water Management and Waste Water Treatment (RIZA) in the Netherlands and the Federal Maritime and Hydrographic Agency (BSH) in Germany. In May 2007, 6 oil samples were distributed to the participants. In the artificial oil spill scenario, 2 oil samples were considered as candidate sources and the other 4 samples were labeled as spilled oils. No other information about these oils was provided before submission of final results. Chemical fingerprinting was carried out using gas chromatography, flame ionization detection and mass spectrometry along with statistical data to determine the source of the spill. N-alkanes, alkylated polyaromatic hydrocarbons, biomarker terpanes and steranes and triaromatic steranes were normalized to C 30 17α(H)21β(H)-hopane and then semi-quantitated. Thirty diagnostic ratios of target compounds were calculated from their peak heights and areas at selected ions. Results of the 2 source samples were compared with 4 spill samples. Tiered fingerprinting analysis revealed that source oil 1 was a non-match with spill samples 3 and 4, but a probable match with spill samples 5 and 6. Source sample 2 did not match any of the 4 spilled oils. A lack of background information essential to oil spill identification made it impossible to draw an unambiguous conclusion. 14 refs., 4 tabs., 6 figs

  15. SAMPLING IN EXTERNAL AUDIT - THE MONETARY UNIT SAMPLING METHOD

    Directory of Open Access Journals (Sweden)

    E. Dascalu

    2016-12-01

    Full Text Available This article approaches the general issue of diminishing the evidence investigation space in audit activities, by means of sampling techniques, given that in the instance of a significant data volume an exhaustive examination of the assessed population is not possible and/or effective. The general perspective of the presentation involves dealing with sampling risk, in essence, the risk that a selected sample may not be representative for the overall population, in correlation with the audit risk model and with the component parts of this model (inherent risk, control risk and non-detection risk), and highlights the inter-conditionings between these two models.
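
As a concrete illustration of the monetary unit sampling method the title refers to, here is a hedged sketch with invented book values (real audits draw the starting point at random and layer in materiality thresholds): every (total/n)-th monetary unit is selected, so each item's chance of selection is proportional to its recorded value, and an item larger than the sampling interval is certain to be hit, possibly more than once.

```python
def mus_select(book_values, n, start=0.5):
    """Systematic monetary-unit sampling: select every (total/n)-th monetary
    unit. `start` in [0, 1) places the first target as a fraction of the
    interval (fixed here for reproducibility; audits randomize it)."""
    total = sum(book_values)
    interval = total / n
    targets = [(start + k) * interval for k in range(n)]
    picks, cum, t = [], 0.0, 0
    for i, v in enumerate(book_values):
        cum += v
        while t < n and targets[t] < cum:
            picks.append(i)  # the target monetary unit falls inside item i
            t += 1
    return picks

# Hypothetical ledger of recorded balances.
ledger = [1200.0, 50.0, 3000.0, 75.0, 640.0, 9000.0, 30.0, 410.0]
sample = mus_select(ledger, n=4)
```

Note how the 9000.0 item absorbs three of the four sampling units: weighting selection by value is exactly how MUS concentrates audit effort where monetary misstatement risk is largest.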

  16. New approaches to wipe sampling methods for antineoplastic and other hazardous drugs in healthcare settings.

    Science.gov (United States)

    Connor, Thomas H; Smith, Jerome P

    2016-09-01

    At the present time, the method of choice to determine surface contamination of the workplace with antineoplastic and other hazardous drugs is surface wipe sampling and subsequent sample analysis with a variety of analytical techniques. The purpose of this article is to review current methodology for determining the level of surface contamination with hazardous drugs in healthcare settings and to discuss recent advances in this area. In addition it will provide some guidance for conducting surface wipe sampling and sample analysis for these drugs in healthcare settings. Published studies on the use of wipe sampling to measure hazardous drugs on surfaces in healthcare settings were reviewed. These studies include the use of well-documented chromatographic techniques for sample analysis in addition to newly evolving technology that provides rapid analysis of specific antineoplastic drugs. Methodology for the analysis of surface wipe samples for hazardous drugs is reviewed, including the purposes, technical factors, sampling strategy, materials required, and limitations. The use of lateral flow immunoassay (LFIA) and fluorescence covalent microbead immunosorbent assay (FCMIA) for surface wipe sample evaluation is also discussed. Current recommendations are that all healthcare settings where antineoplastic and other hazardous drugs are handled include surface wipe sampling as part of a comprehensive hazardous drug-safe handling program. Surface wipe sampling may be used as a method to characterize potential occupational dermal exposure risk and to evaluate the effectiveness of implemented controls and the overall safety program. New technology, although currently limited in scope, may make wipe sampling for hazardous drugs more routine, less costly, and provide a shorter response time than classical analytical techniques now in use.

  17. A simulative comparison of respondent driven sampling with incentivized snowball sampling--the "strudel effect".

    Science.gov (United States)

    Gyarmathy, V Anna; Johnston, Lisa G; Caplinskiene, Irma; Caplinskas, Saulius; Latkin, Carl A

    2014-02-01

    Respondent driven sampling (RDS) and incentivized snowball sampling (ISS) are two sampling methods that are commonly used to reach people who inject drugs (PWID). We generated a set of simulated RDS samples on an actual sociometric ISS sample of PWID in Vilnius, Lithuania ("original sample") to assess if the simulated RDS estimates were statistically significantly different from the original ISS sample prevalences for HIV (9.8%), Hepatitis A (43.6%), Hepatitis B (Anti-HBc 43.9% and HBsAg 3.4%), Hepatitis C (87.5%), syphilis (6.8%) and Chlamydia (8.8%) infections and for selected behavioral risk characteristics. The original sample consisted of a large component of 249 people (83% of the sample) and 13 smaller components with 1-12 individuals. Generally, as long as all seeds were recruited from the large component of the original sample, the simulation samples simply recreated the large component. There were no significant differences between the large component and the entire original sample for the characteristics of interest. Altogether 99.2% of 360 simulation sample point estimates were within the confidence interval of the original prevalence values for the characteristics of interest. When population characteristics are reflected in large network components that dominate the population, RDS and ISS may produce samples that have statistically non-different prevalence values, even though some isolated network components may be under-sampled and/or statistically significantly different from the main groups. This so-called "strudel effect" is discussed in the paper. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  18. 27 CFR 6.95 - Consumer tasting or sampling at retail establishments.

    Science.gov (United States)

    2010-04-01

    27 CFR 6.95 (2010-04-01): Consumer tasting or sampling at retail establishments. An industry member may conduct tasting or sampling at retail establishments...

  19. Comparison of sampling strategies for tobacco retailer inspections to maximize coverage in vulnerable areas and minimize cost.

    Science.gov (United States)

    Lee, Joseph G L; Shook-Sa, Bonnie E; Bowling, J Michael; Ribisl, Kurt M

    2017-06-23

    In the United States, tens of thousands of inspections of tobacco retailers are conducted each year. Various sampling choices can reduce travel costs, emphasize enforcement in areas with greater non-compliance, and allow for comparability between states and over time. We sought to develop a model sampling strategy for state tobacco retailer inspections. Using a 2014 list of 10,161 North Carolina tobacco retailers, we compared results from simple random sampling; stratified, clustered at the ZIP code sampling; and, stratified, clustered at the census tract sampling. We conducted a simulation of repeated sampling and compared approaches for their comparative level of precision, coverage, and retailer dispersion. While maintaining an adequate design effect and statistical precision appropriate for a public health enforcement program, both stratified, clustered ZIP- and tract-based approaches were feasible. Both ZIP and tract strategies yielded improvements over simple random sampling, with relative improvements, respectively, of average distance between retailers (reduced 5.0% and 1.9%), percent Black residents in sampled neighborhoods (increased 17.2% and 32.6%), percent Hispanic residents in sampled neighborhoods (reduced 2.2% and increased 18.3%), percentage of sampled retailers located near schools (increased 61.3% and 37.5%), and poverty rate in sampled neighborhoods (increased 14.0% and 38.2%). States can make retailer inspections more efficient and targeted with stratified, clustered sampling. Use of statistically appropriate sampling strategies like these should be considered by states, researchers, and the Food and Drug Administration to improve program impact and allow for comparisons over time and across states. The authors present a model tobacco retailer sampling strategy for promoting compliance and reducing costs that could be used by U.S. states and the Food and Drug Administration (FDA). The design is feasible to implement in North Carolina. Use of
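
A minimal sketch of the stratified, clustered selection the authors compare (the frame, strata, and cluster labels are invented; the real design stratifies on vulnerability measures and clusters at the ZIP code or census tract level):

```python
import random
from collections import defaultdict

def stratified_cluster_sample(retailers, clusters_per_stratum, per_cluster, seed=7):
    """retailers: list of (retailer_id, stratum, cluster) tuples.
    Draw `clusters_per_stratum` clusters within each stratum, then up to
    `per_cluster` retailers from each chosen cluster."""
    rng = random.Random(seed)
    by_cluster = defaultdict(list)
    strata = defaultdict(set)
    for rid, stratum, cluster in retailers:
        by_cluster[(stratum, cluster)].append(rid)
        strata[stratum].add(cluster)
    chosen = []
    for stratum, clusters in sorted(strata.items()):
        picked = rng.sample(sorted(clusters), min(clusters_per_stratum, len(clusters)))
        for c in picked:
            ids = by_cluster[(stratum, c)]
            chosen.extend(rng.sample(ids, min(per_cluster, len(ids))))
    return chosen

# Hypothetical frame: (retailer id, stratum, ZIP-code cluster).
frame = [(f"R{i}", "urban" if i % 2 else "rural", f"Z{i % 5}") for i in range(40)]
inspection_list = stratified_cluster_sample(frame, clusters_per_stratum=2, per_cluster=3)
```

Clustering keeps inspected retailers geographically close (reducing travel cost), while stratification guarantees each stratum its quota; the trade-off is the design effect on precision that the abstract mentions.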

  20. Initial Reliability and Validity of the Life Satisfaction Scale for Problem Youth in a Sample of Drug Abusing and Conduct Disordered Youth

    Science.gov (United States)

    Donohue, Brad; Teichner, Gordon; Azrin, Nathan; Weintraub, Noah; Crum, Thomas A.; Murphy, Leah; Silver, N. Clayton

    2003-01-01

    Responses to Life Satisfaction Scale for Problem Youth (LSSPY) items were examined in a sample of 193 substance abusing and conduct disordered adolescents. In responding to the LSSPY, youth endorse their percentage of happiness (0 to 100%) in twelve domains (i.e., friendships, family, school, employment/work, fun activities, appearance, sex…

  1. Generalized sampling in Julia

    DEFF Research Database (Denmark)

    Jacobsen, Christian Robert Dahl; Nielsen, Morten; Rasmussen, Morten Grud

    2017-01-01

    Generalized sampling is a numerically stable framework for obtaining reconstructions of signals in different bases and frames from their samples. For example, one can use wavelet bases for reconstruction given frequency measurements. In this paper, we will introduce a carefully documented toolbox for performing generalized sampling in Julia. Julia is a new language for technical computing with focus on performance, which is ideally suited to handle the large size problems often encountered in generalized sampling. The toolbox provides specialized solutions for the setup of Fourier bases and wavelets. The performance of the toolbox is compared to existing implementations of generalized sampling in MATLAB.

  2. Stability of carboxyhemoglobin in stored and mailed blood samples.

    Science.gov (United States)

    Hampson, Neil B

    2008-02-01

    Elevated blood carboxyhemoglobin (COHb) levels are used to confirm a clinical diagnosis of exposure to carbon monoxide (CO) and, in some instances, assess severity of poisoning. However, many hospital laboratories cannot measure COHb because they do not have CO-oximeters. In such instances, blood samples are often sent to outside laboratories or with a transported patient for measurement at the receiving hospital. This study was conducted to assess the stability of COHb in stored and mailed blood samples anticoagulated with heparin. Adult human blood was drawn into standard sample tubes anticoagulated with sodium heparin. Carbon monoxide gas was infused to raise the COHb level to 25% to 35%. Samples were then refrigerated or stored at room temperature, and serial COHb determinations were performed for 28 days. Additional samples were measured after being mailed locally or across the United States and back. No significant changes in COHb levels were seen in samples stored either in refrigeration or at room temperature over a period of 28 days or in samples shipped without refrigeration locally or across the United States. Carboxyhemoglobin levels in whole blood samples anticoagulated with heparin are stable with or without refrigeration for up to 4 weeks. If COHb measurement capability is not available, such samples may be shipped or transported with patients with confidence that the COHb level will be stable when measured at a later time.

  3. Conduct Disorder and Oppositional Defiant Disorder in a National Sample: Developmental Epidemiology

    Science.gov (United States)

    Maughan, Barbara; Rowe, Richard; Messer, Julie; Goodman, Robert; Meltzer, Howard

    2004-01-01

    Background: Despite an expanding epidemiological evidence base, uncertainties remain over key aspects of the epidemiology of the "antisocial" disorders in childhood and adolescence. Methods: We used cross-sectional data on a nationally representative sample of 10,438 5-15-year-olds drawn from the 1999 British Child Mental Health Survey…

  4. The Effect of Asymmetrical Sample Training on Retention Functions for Hedonic Samples in Rats

    Science.gov (United States)

    Simmons, Sabrina; Santi, Angelo

    2012-01-01

    Rats were trained in a symbolic delayed matching-to-sample task to discriminate sample stimuli that consisted of the presence of food or the absence of food. Asymmetrical sample training was provided in which one group was initially trained with only the food sample and the other group was initially trained with only the no-food sample. In…

  5. Ophthalmologic changes related to radiation exposure and age in the adult health study sample, Hiroshima and Nagasaki

    International Nuclear Information System (INIS)

    Choshi, Kanji; Mishima, Hiromu; Takaku, Isao; Takase, Tomoko; Neriishi, Shotaro.

    1983-11-01

    A two-year ophthalmologic study of age- and radiation-related ophthalmologic lesions among the Adult Health Study (AHS) population of Hiroshima and Nagasaki was conducted at RERF in 1978-80. The study population in both cities was composed of all persons exposed to 100+ rad in the AHS, their controls, and all other persons in the AHS sample with a previous record of axial opacities or posterior subcapsular changes, and the in utero clinical sample. The ophthalmologic examination was conducted on 1,582 persons in Hiroshima and 719 persons in Nagasaki belonging to the AHS sample, and 67 persons in Hiroshima and 17 persons in Nagasaki belonging to the in utero clinical sample. Participation in the study was 42% of the eligible AHS sample in Hiroshima and 21% in Nagasaki, and 24% of the eligible in utero sample in Hiroshima and 26% in Nagasaki. Increased lenticular opacities, other lens changes, and loss of visual acuity and accommodation occurred with increasing age in both exposed and control subjects as manifestations of the normal aging process. A highly significant excess risk for all ages in the 300+ rad group in comparison to those in the control group was observed for both axial opacities and posterior subcapsular changes in Hiroshima, but not in Nagasaki. (J.P.N.)

  6. Final Sampling and Analysis Plan for Background Sampling, Fort Sheridan, Illinois

    National Research Council Canada - National Science Library

    1995-01-01

    .... This Background Sampling and Analysis Plan (BSAP) is designed to address this issue through the collection of additional background samples at Fort Sheridan to support the statistical analysis and the Baseline Risk Assessment (BRA...

  7. Simulation of the Sampling Distribution of the Mean Can Mislead

    Science.gov (United States)

    Watkins, Ann E.; Bargagliotti, Anna; Franklin, Christine

    2014-01-01

    Although the use of simulation to teach the sampling distribution of the mean is meant to provide students with sound conceptual understanding, it may lead them astray. We discuss a misunderstanding that can be introduced or reinforced when students who intuitively understand that "bigger samples are better" conduct a simulation to…
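
    A minimal version of such a classroom simulation (a hedged sketch; the article's own example may differ) shows what these simulations do demonstrate correctly: the spread of the sample mean shrinks like 1/sqrt(n).

```python
import random
import statistics

# Hedged sketch of a classroom simulation (not the article's code):
# repeatedly draw samples from a skewed population and record the spread
# of the sample mean, which shrinks like 1/sqrt(n) as n grows.

def spread_of_means(n, reps=2000, seed=7):
    rng = random.Random(seed)
    means = [statistics.fmean([rng.expovariate(1.0) for _ in range(n)])
             for _ in range(reps)]
    return statistics.stdev(means)

for n in (4, 16, 64):
    print(n, round(spread_of_means(n), 3))
# the spread roughly halves each time the sample size quadruples
```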

  8. The Sample Analysis at Mars Investigation and Instrument Suite

    Science.gov (United States)

    Mahaffy, Paul; Webster, Christopher R.; Conrad, Pamela G.; Arvey, Robert; Bleacher, Lora; Brinckerhoff, William B.; Eigenbrode, Jennifer L.; Chalmers, Robert A.; Dworkin, Jason P.; Errigo, Therese; hide

    2012-01-01

    The Sample Analysis at Mars (SAM) investigation of the Mars Science Laboratory (MSL) addresses the chemical and isotopic composition of the atmosphere and of volatiles extracted from solid samples. The SAM investigation is designed to contribute substantially to the mission goal of quantitatively assessing the habitability of Mars as an essential step in the search for past or present life on Mars. SAM is a 40 kg instrument suite located in the interior of MSL's Curiosity rover. The SAM instruments are a quadrupole mass spectrometer, a tunable laser spectrometer, and a 6-column gas chromatograph, all coupled through solid and gas processing systems to provide complementary information on the same samples. The SAM suite is able to measure a suite of light isotopes and to analyze volatiles directly from the atmosphere or thermally released from solid samples. In addition to measurements of simple inorganic compounds and noble gases, SAM will conduct a sensitive search for organic compounds with either thermal or chemical extraction from sieved samples delivered by the sample processing system on the Curiosity rover's robotic arm.

  9. Effective sampling strategy to detect food and feed contamination

    NARCIS (Netherlands)

    Bouzembrak, Yamine; Fels, van der Ine

    2018-01-01

    Sampling plans for food safety hazards are aimed to be used to determine whether a lot of food is contaminated (with microbiological or chemical hazards) or not. One of the components of sampling plans is the sampling strategy. The aim of this study was to compare the performance of three

  10. Developing Students' Reasoning about Samples and Sampling Variability as a Path to Expert Statistical Thinking

    Science.gov (United States)

    Garfield, Joan; Le, Laura; Zieffler, Andrew; Ben-Zvi, Dani

    2015-01-01

    This paper describes the importance of developing students' reasoning about samples and sampling variability as a foundation for statistical thinking. Research on expert-novice thinking as well as statistical thinking is reviewed and compared. A case is made that statistical thinking is a type of expert thinking, and as such, research…

  11. Sampling device for withdrawing a representative sample from single and multi-phase flows

    Science.gov (United States)

    Apley, Walter J.; Cliff, William C.; Creer, James M.

    1984-01-01

    A fluid stream sampling device has been developed for the purpose of obtaining a representative sample from a single- or multi-phase fluid flow. This objective is carried out by means of a probe which may be inserted into the fluid stream. Individual samples are withdrawn from the fluid flow by sampling ports with particular spacings, and the sampling ports are coupled to various analytical systems for characterization of the physical, thermal, and chemical properties of the fluid flow as a whole and also individually.

  12. ISOLOK VALVE ACCEPTANCE TESTING FOR DWPF SME SAMPLING PROCESS

    Energy Technology Data Exchange (ETDEWEB)

    Edwards, T.; Hera, K.; Coleman, C.; Jones, M.; Wiedenman, B.

    2011-12-05

    Evaluation of the Defense Waste Processing Facility (DWPF) Chemical Process Cell (CPC) cycle time identified several opportunities to improve the CPC processing time. Of the opportunities, a focus area related to optimizing the equipment and efficiency of the sample turnaround time for DWPF Analytical Laboratory was identified. The Mechanical Systems & Custom Equipment Development (MS&CED) Section of the Savannah River National Laboratory (SRNL) evaluated the possibility of using an Isolok® sampling valve as an alternative to the Hydragard® valve for taking process samples. Previous viability testing was conducted with favorable results using the Isolok sampler and reported in SRNL-STI-2010-00749 (1). This task has the potential to improve operability, reduce maintenance time and decrease CPC cycle time. This report summarizes the results from acceptance testing which was requested in Task Technical Request (TTR) HLW-DWPF-TTR-2010-0036 (2) and which was conducted as outlined in Task Technical and Quality Assurance Plan (TTQAP) SRNL-RP-2011-00145 (3). The Isolok to be tested is the same model which was tested, qualified, and installed in the Sludge Receipt Adjustment Tank (SRAT) sample system. RW-0333P QA requirements apply to this task. This task was to qualify the Isolok sampler for use in the DWPF Slurry Mix Evaporator (SME) sampling process. The Hydragard, which is the current baseline sampling method, was used for comparison to the Isolok sampling data. The Isolok sampler is an air powered grab sampler used to 'pull' a sample volume from a process line. The operation of the sampler is shown in Figure 1. The image on the left shows the Isolok's spool extended into the process line and the image on the right shows the sampler retracted and then dispensing the liquid into the sampling container. To determine tank homogeneity, a Coliwasa sampler was used to grab samples at a high and low location within the mixing tank. Data from

  13. The new Chalk River AMS ion source, sample changer and external sample magazine

    International Nuclear Information System (INIS)

    Koslowsky, V.T.; Bray, N.; Imahori, Y.; Andrews, H.R.; Davies, W.G.

    1997-01-01

    A new sample magazine, sample changer and ion source have been developed and are in routine use at Chalk River. The system features a readily accessible 40-sample magazine at ground potential that is external to the ion source and high-voltage cage. The samples are held in an inert atmosphere and can be individually examined or removed; they can be exchanged en masse as a complete magazine concurrent with an AMS measurement. On-line sample changing is done with a pneumatic rabbit transfer system employing two stages of differential pumping. At Chalk River this is routinely performed across a 200 kV potential. Sample positioning is precise, and hundreds of ³⁶Cl and ¹²⁹I samples have been measured over a period of several days without interruption or alteration of ion source operating conditions. (author)

  14. How to take environmental samples for stable isotope analyses

    International Nuclear Information System (INIS)

    Rogers, K.M.

    2009-01-01

    It is possible to analyse a diverse range of samples for environmental investigations. The main types are soil/sediments, vegetation, fauna, shellfish, waste and water. Each type of samples requires different storage and collection methods. Outlined here are the preferred methods of collection to ensure maximum sample integrity and reliability. (author).

  15. How to take environmental samples for stable isotope analyses

    International Nuclear Information System (INIS)

    Rogers, K.M.

    2013-01-01

    It is possible to analyse a diverse range of samples for environmental investigations. The main types are soil/sediments, vegetation, fauna, shellfish, waste and water. Each type of samples requires different storage and collection methods. Outlined here are the preferred methods of collection to ensure maximum sample integrity and reliability. (author).

  16. How to take environmental samples for stable isotope analyses

    International Nuclear Information System (INIS)

    Rogers, K.M.

    2012-01-01

    It is possible to analyse a diverse range of samples for environmental investigations. The main types are soil/sediments, vegetation, fauna, shellfish, waste and water. Each type of samples requires different storage and collection methods. Outlined here are the preferred methods of collection to ensure maximum sample integrity and reliability. (author).

  17. How to take environmental samples for stable isotope analyses

    International Nuclear Information System (INIS)

    Rogers, K.M.

    2009-01-01

    It is possible to analyse a diverse range of samples for environmental investigations. The main types are soil/sediments, vegetation, fauna, shellfish, waste and water. Each type of samples requires different storage and collection methods. Outlined here are the preferred methods of collection to ensure maximum sample integrity and reliability. (author)

  18. Permeability and compression characteristics of municipal solid waste samples

    Science.gov (United States)

    Durmusoglu, Ertan; Sanchez, Itza M.; Corapcioglu, M. Yavuz

    2006-08-01

    Four series of laboratory tests were conducted to evaluate the permeability and compression characteristics of municipal solid waste (MSW) samples. Two series of tests were conducted using a conventional small-scale consolidometer, and the other two in a large-scale consolidometer specially constructed for this study. In each consolidometer, the MSW samples were tested at two different moisture contents, i.e., original moisture content and field capacity. A scale effect between the two consolidometers of different sizes was investigated. The tests were carried out on samples reconsolidated to pressures of 123, 246, and 369 kPa. Time-settlement data gathered from each load increment were used to plot strain versus log-time graphs. The data acquired from the compression tests were used to back-calculate primary and secondary compression indices. The consolidometers were later adapted for permeability experiments. The values of the indices and the coefficient of compressibility for the MSW samples tested fell within a relatively narrow range despite the different consolidometer sizes and the different moisture contents of the specimens tested. The values of the coefficient of permeability fell within a band of two orders of magnitude (10⁻⁶-10⁻⁴ m/s). The data presented in this paper agree very well with data reported by previous researchers. It was concluded that the scale effect on the compression behavior was significant; however, there was usually no linear relationship between the results obtained in the tests.
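
    The back-calculation of a compression index from strain versus log-stress data can be sketched as follows (illustrative strain values, not the paper's measurements):

```python
import math

# Illustrative back-calculation (assumed strain values, not the paper's data):
# the strain-based primary compression index is the slope of the strain vs
# log10(stress) line between two load increments.

def compression_index(stress1, strain1, stress2, strain2):
    return (strain2 - strain1) / math.log10(stress2 / stress1)

# e.g. strain growing from 0.20 to 0.29 as stress doubles from 123 to 246 kPa
cc = compression_index(123.0, 0.20, 246.0, 0.29)
print(round(cc, 3))  # 0.299
```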

  19. A whole-air relaxed eddy accumulation measurement system for sampling vertical vapour exchange of elemental mercury

    Directory of Open Access Journals (Sweden)

    Jonas Sommar

    2013-11-01

    An apparatus relying on relaxed eddy accumulation (REA) methodology has been designed and developed for continuous field measurements of vertical Hg0 fluxes over cropland ecosystems. This micro-meteorological technique requires sampling of turbulent eddies into updraught and downdraught channels at a constant flow rate and with accurate timing, based on a threshold involving the sign of the vertical wind component (w). The fully automated system is of a whole-air type, drawing air at high velocity to the REA sampling apparatus and allowing for the rejection of samples associated with w-fluctuations around zero. Conditional sampling was executed at 10 Hz resolution on a sub-stream by two fast-response three-way solenoid switching valves connected in parallel to a zero-Hg0 air supply through their normally open ports. To suppress flow transients resulting from switching, pressure differentials across the two upstream ports of the conditional valves were minimised using a control unit. The Hg0 concentrations of the updraught and downdraught channels were determined sequentially (each by two consecutive 5-minute gas samples) after collection onto gold traps, using an automated cold vapour atomic fluorescence spectrophotometer (CVAFS) instrument. A protocol of regular reference sampling periods was implemented during field campaigns to continuously adjust for bias that may exist between the two conditional sampling channels. Using a 5-minute running average as the conditional threshold, nearly constant relaxation coefficients (β) of ~0.56 were determined during two bi-weekly field deployments for which good-quality turbulence statistics were assured, in accordance with previously reported estimates. The fully developed REA-CVAFS system underwent Hg0 flux field trials at a winter wheat cropland located in the North China Plain.
    Over a 15-day period during early May 2012, dynamic, often bi-directional, fluxes were observed during the course of a day with a tendency of

  20. Validation of a new HPV self-sampling device for cervical cancer screening: The Cervical and Self-Sample In Screening (CASSIS) study.

    Science.gov (United States)

    El-Zein, Mariam; Bouten, Sheila; Louvanto, Karolina; Gilbert, Lucy; Gotlieb, Walter; Hemmings, Robert; Behr, Marcel A; Franco, Eduardo L; Liang, Victoria; Martins, Claudia; Duarte, Silvy; Sarban, Natalia; Geddes, Patricia; Massa, Ana; Samios, Kathrin; Aboufadl, Siham; Verdon, Sophie; Pereria, Cynthia; Lacroix, Isabelle

    2018-04-17

    We compared the self-sampling performance of the newly designed HerSwab™ device with a physician-collected cervical sample and another self-sample using the cobas® PCR Female swab for the detection of cervical intraepithelial neoplasia (CIN) and cancer. Women referred for colposcopy at McGill University affiliated hospital clinics collected two consecutive self-samples, one with HerSwab™ and one with cobas® swab, after receiving instructions. The order of sampling was randomized. The colposcopist then collected a cervical sample and conducted a colposcopic examination. Samples were tested for human papillomavirus (HPV) DNA. Sensitivity and specificity to detect CIN2+ and respective 95% confidence intervals (CI) were calculated to compare sampling approaches. The HPV testing agreement between samples was measured using the Kappa statistic. Of 1217 women enrolled, 1076 had complete results for HPV and cytology; 148 (13.8%) had CIN1, 147 (13.7%) had CIN2/3, and 5 (0.5%) had cancer. There was very good agreement between methods for HPV detection (HerSwab™ versus physician: kappa=0.84; cobas® swabs versus physician: kappa=0.81; HerSwab™ versus cobas® swabs: kappa=0.87). The sensitivity of HPV detection for CIN2+ was 87.6% (95%CI: 79.8-93.2) with self-sampling using HerSwab™, 88.6% (95%CI: 80.9-94.0) with self-sampling using the cobas® swab, and 92.4% (95%CI: 85.5-96.7) with physician sampling. Corresponding estimates of specificity were 58.1% (95%CI: 54.1-62.1), 55.0% (95%CI: 50.9-59.0) and 58.7% (95%CI: 54.6-62.6). Cytology (ASC-US or more severe) done on the physician-collected specimen was 80.2% (95%CI: 70.8-87.6) sensitive and 61.4% (95%CI: 57.2-65.5) specific for CIN2+. The HerSwab™ had good agreement with physician sampling in detecting HPV, and adequate performance in detecting high-grade lesions among women referred to colposcopy for abnormal cytology. Copyright © 2018 Elsevier Inc. All rights reserved.
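
    The agreement and accuracy statistics quoted above can be computed from 2×2 counts; a minimal sketch with illustrative helper functions (not the study's code):

```python
# Illustrative helpers (not the study's code) for the statistics quoted above.

def sens_spec(tp, fn, tn, fp):
    """Sensitivity and specificity from true/false positives and negatives."""
    return tp / (tp + fn), tn / (tn + fp)

def cohens_kappa(a, b, c, d):
    # 2x2 agreement table: a = both tests +, b = first + / second -,
    # c = first - / second +, d = both tests -
    n = a + b + c + d
    po = (a + d) / n                                        # observed agreement
    pe = ((a + b) * (a + c) + (c + d) * (b + d)) / n ** 2   # chance agreement
    return (po - pe) / (1 - pe)

sens, spec = sens_spec(13, 1, 4, 1)      # toy counts, not the CASSIS data
print(round(sens, 3), round(spec, 3))    # 0.929 0.8
print(cohens_kappa(50, 0, 0, 50))        # perfect agreement -> 1.0
```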

  1. The use of importance sampling in a trial assessment to obtain converged estimates of radiological risk

    International Nuclear Information System (INIS)

    Johnson, K.; Lucas, R.

    1986-12-01

    In developing a methodology for assessing potential sites for the disposal of radioactive wastes, the Department of the Environment has conducted a series of trial assessment exercises. In order to produce converged estimates of radiological risk using the SYVAC A/C simulation system, an efficient sampling procedure is required. Previous work has demonstrated that importance sampling can substantially increase sampling efficiency. This study used importance sampling to produce converged estimates of risk for the first DoE trial assessment. Four major nuclide chains were analysed. In each case importance sampling produced converged risk estimates with between 10 and 170 times fewer runs of the SYVAC A/C model. This increase in sampling efficiency can reduce the total elapsed time required to obtain a converged estimate of risk from one nuclide chain by a factor of 20. The results of this study suggest that the use of importance sampling could reduce the elapsed time required to perform a risk assessment of a potential site by a factor of ten. (author)
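
    The mechanics of importance sampling can be illustrated with a toy rare-event estimate (a hedged sketch, unrelated to the SYVAC A/C model): shifting the sampling density toward the rare region and reweighting by the likelihood ratio gives an unbiased estimate with far fewer runs.

```python
import math
import random

# Toy rare-event estimate (hedged sketch, unrelated to SYVAC A/C):
# P(Z > 4) for Z ~ N(0, 1) is about 3.17e-5, so naive Monte Carlo wastes
# almost every run. Sampling from the shifted density N(4, 1) and
# reweighting by the likelihood ratio phi(x)/phi(x - 4) = exp(8 - 4x)
# concentrates the runs where the rare event actually occurs.

def importance_estimate(n, seed=1):
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.gauss(4.0, 1.0)               # draw from the importance density
        if x > 4.0:
            total += math.exp(8.0 - 4.0 * x)  # likelihood-ratio weight
    return total / n

est = importance_estimate(100_000)
exact = 0.5 * math.erfc(4.0 / math.sqrt(2.0))  # true tail probability
print(est, exact)  # both near 3.17e-5
```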

  2. Elucidation of the fluctuation history of cosmic radiation and global environmental using AMS

    International Nuclear Information System (INIS)

    Horiuchi, Kazuho

    2008-01-01

    Recently, the accuracy of AMS for trace amounts of sample has been further improved. Besides the application of ¹⁴C to age estimation, it has become possible to reconstruct in detail past fluctuations of cosmic-ray intensity using other radioactive isotopes (¹⁰Be, ³⁶Cl, etc.) in environmental samples, and to elucidate their correlation with fluctuations of climate and environment. In this report, attempts to elucidate the fluctuation history of cosmic radiation and the global environment from ice cores using AMS are presented. (M.H.)

  3. Direct detection of Leishmania from clinical samples.

    Science.gov (United States)

    Waitumbi, John N; Bast, Joshua; Nyakoe, Nancy; Magiri, Charles; Quintana, Miguel; Takhampunya, Ratree; Schuster, Anthony L; Van de Wyngaerde, Marshall T; McAvin, James C; Coleman, Russell E

    2017-01-01

    The ability to rapidly and accurately diagnose leishmaniasis is a military priority. Testing was conducted to evaluate diagnostic sensitivity and specificity of field-expedient Leishmania genus and visceral Leishmania specific dual-fluorogenic, hydrolysis probe (TaqMan), polymerase chain reaction assays previously established for use in vector surveillance. Blood samples of patients with confirmed visceral leishmaniasis and controls without the disease from Baringo District, Kenya, were tested. Leishmania genus assay sensitivity was 100% (14/14) and specificity was 84% (16/19). Visceral Leishmania assay sensitivity was 93% (13/14) and specificity 80% (4/5). Cutaneous leishmaniasis (CL) skin scrapes of patients from Honduras were also evaluated. Leishmania genus assay sensitivity was 100% (10/10). Visceral Leishmania assay specificity was 100% (10/10) from cutaneous leishmaniasis samples; no fluorescence above background was reported. These results show promise in a rapid, sensitive, and specific method for Leishmania direct detection from clinical samples.

  4. Test sample handling apparatus

    International Nuclear Information System (INIS)

    1981-01-01

    A test sample handling apparatus using automatic scintillation counting for gamma detection, for use in such fields as radioimmunoassay, is described. The apparatus automatically and continuously counts large numbers of samples rapidly and efficiently by the simultaneous counting of two samples. By means of sequential ordering of non-sequential counting data, it is possible to obtain precisely ordered data while utilizing sample carrier holders having a minimum length. (U.K.)

  5. [Saarland Growth Study: sampling design].

    Science.gov (United States)

    Danker-Hopfe, H; Zabransky, S

    2000-01-01

    The use of reference data to evaluate the physical development of children and adolescents is part of the daily routine in the paediatric outpatient clinic. The construction of such references requires the collection of extensive reference data. There are different kinds of reference data: cross-sectional references, which are based on data collected from a large representative cross-sectional sample of the population; longitudinal references, which are based on follow-up surveys of usually smaller samples of individuals from birth to maturity; and mixed-longitudinal references, which are a combination of longitudinal and cross-sectional reference data. The advantages and disadvantages of the different methods of data collection and the resulting reference data are discussed. The Saarland Growth Study was conducted for several reasons: growth processes are subject to secular changes, there are no specific reference data for children and adolescents from this part of the country, and the growth charts in use in paediatric practice are possibly no longer appropriate. The Saarland Growth Study therefore served two purposes: (a) to create current regional reference data, and (b) to create a database for future studies on secular trends in the growth processes of children and adolescents from Saarland. The present contribution focuses on general remarks on the sampling design of (cross-sectional) growth surveys and its implications for the design of the present study.

  6. Aerosol particle losses in sampling systems

    International Nuclear Information System (INIS)

    Fan, B.J.; Wong, F.S.; Ortiz, C.A.; Anand, N.K.; McFarland, A.R.

    1993-01-01

    When aerosols are sampled from stacks and ducts, it is usually necessary to transport them from the point of sampling to a location for collection or analysis. Losses of aerosol particles can occur in the inlet region of the probe, in straight horizontal and vertical tubes, and in elbows. For probes in laminar flow, the Saffman lift force can cause substantial losses of particles in a short inlet region. An empirical model has been developed to predict probe inlet losses, which are often on the order of 40% for 10 μm AED particles. A user-friendly PC computer code, DEPOSITION, has been set up to model losses in transport systems. Experiments have been conducted to compare the actual aerosol particle losses in transport systems with those predicted by the DEPOSITION code

  7. Radioactivity in environmental samples

    International Nuclear Information System (INIS)

    Fornaro, Laura

    2001-01-01

    The objective of this practical work is to familiarize the student with radioactivity measurements in environmental samples. For that purpose, the samples chosen were a salt of natural potassium, a salt of uranium or thorium, and a sample of drinking water

  8. Sampling and sample handling procedures for priority pollutants in surface coal mining wastewaters. [Detailed list to be analyzed for

    Energy Technology Data Exchange (ETDEWEB)

    Hayden, R. S.; Johnson, D. O.; Henricks, J. D.

    1979-03-01

    The report describes the procedures used by Argonne National Laboratory to sample surface coal mine effluents in order to obtain field and laboratory data on 110 organic compounds or classes of compounds and 14 metals and minerals that are known as priority pollutants, plus 5-day biochemical oxygen demand (BOD/sub 5/), total organic carbon (TOC), chemical oxygen demand (COD), total dissolved solids (TDS), and total suspended solids (TSS). Included are directions for preparation of sampling containers and equipment, methods of sampling and sample preservation, and field and laboratory protocols, including chain-of-custody procedures. Actual analytical procedures are not described, but their sources are referenced.

  9. Wet gas sampling

    Energy Technology Data Exchange (ETDEWEB)

    Welker, T.F.

    1997-07-01

    The quality of gas has changed drastically in the past few years. Most gas is wet with hydrocarbons, water, and heavier contaminants that tend to condense if not handled properly. If a gas stream is contaminated with condensables, the sampling of that stream must be done in a manner that ensures all of the components in the stream are introduced into the sample container as a composite. The sampling and handling of wet gas is extremely difficult under ideal conditions, and there are no ideal conditions in the real world. The problems related to offshore operations and other wet gas systems, as well as the transportation of the sample, are additional difficulties that must be overcome if the analysis is to mean anything to the producer and gatherer. The sampling of wet gas systems is decidedly more difficult than sampling conventional dry gas systems. Wet gas systems generally result in the measurement of one heating value at the inlet of the pipe and a drastically reduced heating value of the gas at the outlet end of the system. This is caused by the fallout or accumulation of the heavier products that, at the inlet, may be in the vapor state in the pipeline; hence, the high gravity and high BTU. But, in fact, because of pressure and temperature variances, these liquids condense and form a liquid that is actually running down the pipe as a stream or is accumulated in drips to be blown from the system. (author)

  10. 43 CFR Appendix A to Part 10 - Sample Summary

    Science.gov (United States)

    2010-10-01

    Appendix A to Part 10—Sample Summary. The following is a generic sample and should be used as a guideline for preparation of summaries tailoring the information to the...

  11. A method of language sampling

    DEFF Research Database (Denmark)

    Rijkhoff, Jan; Bakker, Dik; Hengeveld, Kees

    1993-01-01

    In recent years more attention has been paid to the quality of language samples in typological work. Without an adequate sampling strategy, samples may suffer from various kinds of bias. In this article we propose a sampling method in which the genetic criterion is taken as the most important: it determines how many languages from each phylum should be selected, given any required sample size.
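
    The genetic-criterion idea can be sketched as a proportional allocation over phyla (an assumed rule for illustration, not the authors' exact procedure):

```python
# Sketch of a genetic-criterion allocation (an assumed rule for illustration,
# not the authors' exact procedure): apportion the sample over phyla in
# proportion to the number of languages each phylum contains, using the
# largest-remainder method so the parts sum exactly to the sample size.

def allocate(phyla, sample_size):
    total = sum(phyla.values())
    quotas = {p: sample_size * n / total for p, n in phyla.items()}
    alloc = {p: int(q) for p, q in quotas.items()}   # floor of each quota
    leftover = sample_size - sum(alloc.values())
    # hand the remaining slots to the largest fractional remainders
    for p in sorted(quotas, key=lambda p: quotas[p] - alloc[p], reverse=True)[:leftover]:
        alloc[p] += 1
    return alloc

# language counts per phylum are rough illustrative figures
phyla = {"Niger-Congo": 1540, "Austronesian": 1260, "Indo-European": 445, "Uralic": 38}
print(allocate(phyla, 50))
```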

  12. Hanford Site Environmental Surveillance Master Sampling Schedule for Calendar Year 2005

    Energy Technology Data Exchange (ETDEWEB)

    Bisping, Lynn E.

    2005-01-19

    Environmental surveillance of the Hanford Site and surrounding areas is conducted by the Pacific Northwest National Laboratory (PNNL) for the U.S. Department of Energy (DOE). Sampling is conducted to evaluate levels of radioactive and nonradioactive pollutants in the Hanford environs. This document contains the calendar year 2005 schedules for the routine and non-routine collection of samples for the Surface Environmental Surveillance Project (SESP) and Drinking Water Monitoring Project.

  13. Big Data, Small Sample.

    Science.gov (United States)

    Gerlovina, Inna; van der Laan, Mark J; Hubbard, Alan

    2017-05-20

    Multiple comparisons and small sample size, common characteristics of many types of "Big Data" including those produced by genomic studies, present specific challenges that affect the reliability of inference. Use of multiple testing procedures necessitates calculation of very small tail probabilities of a test statistic distribution. Results based on large deviation theory provide a formal condition that is necessary to guarantee error rate control given practical sample sizes, linking the number of tests and the sample size; this condition, however, is rarely satisfied. Using methods based on Edgeworth expansions (relying especially on the work of Peter Hall), we explore the impact of departures of sampling distributions from typical assumptions on actual error rates. Our investigation illustrates how far actual error rates can be from the declared nominal levels, suggesting potentially widespread problems with error rate control, specifically excessive false positives. This is an important factor contributing to the "reproducibility crisis". We also review some other commonly used methods (such as permutation and methods based on finite sampling inequalities) in their application to multiple testing/small sample data. We point out that Edgeworth expansions, providing higher order approximations to the sampling distribution, offer a promising direction for data analysis that could improve the reliability of studies relying on large numbers of comparisons with modest sample sizes.

  14. Pre-Mission Input Requirements to Enable Successful Sample Collection by a Remote Field/EVA Team

    Science.gov (United States)

    Cohen, B. A.; Young, K. E.; Lim, D. S.

    2015-01-01

    objectives than samples collected in the absence of this pre-mission information. We conducted three tests of this hypothesis. Our investigation was designed to document processes, tools and procedures for crew sampling of planetary targets. This is not meant to be a blind, controlled test of crew efficacy, but rather an effort to recognize the relevant variables that enter into sampling protocol and to develop recommendations for crew and backroom training in future endeavors. Methods: One of the primary FINESSE field deployment objectives was to collect impact melt rocks and impact melt-bearing breccias from a number of locations around the WCIS structure to enable high-precision geochronology of the crater to be performed [1]. We conducted three tests at WCIS after two full days of team participation in field site activities, including using remote sensing data and geologic maps, hiking overland to become familiar with the terrain, and examining previously collected samples from other islands. In addition, the team members shared their projects and techniques with the entire team. We chose our "crew members" as volunteers from the team, all of whom had had moderate training in geologic fieldwork and had become familiar with the general field setting. The first two tests were short, focused tests of our hypothesis. Test A was to obtain hydrothermal vugs; Test B was to obtain impact melt and intrusive rock as well as the contact between the two to check for contact metamorphism and age differences. In both cases, the test director had prior knowledge of the site geology and had developed a study-specific objective for sampling prior to deployment. Prior to the field deployment, the crewmember was briefed on the sampling objective and the laboratory techniques that would be used on the samples. At the field sites (Fig. 2), the crewmember was given 30 minutes to survey a small section of outcrop (10-15 m) and acquire a suite of three samples.
The crewmember talked through his

  15. Weak antilocalization and universal conductance fluctuations in bismuth telluro-sulfide topological insulators

    Energy Technology Data Exchange (ETDEWEB)

    Trivedi, Tanuj, E-mail: tanuj@utexas.edu; Sonde, Sushant; Movva, Hema C. P.; Banerjee, Sanjay K., E-mail: banerjee@ece.utexas.edu [Microelectronics Research Center, The University of Texas at Austin, Austin, Texas 78758 (United States)

    2016-02-07

    We report on van der Waals epitaxial growth, materials characterization, and magnetotransport experiments in crystalline nanosheets of Bismuth Telluro-Sulfide (BTS). Highly layered, good-quality crystalline nanosheets of BTS are obtained on SiO₂ and muscovite mica. Weak-antilocalization (WAL), electron-electron interaction-driven insulating ground state and universal conductance fluctuations are observed in magnetotransport experiments on BTS devices. Temperature, thickness, and magnetic field dependence of the transport data indicate the presence of two-dimensional surface states along with bulk conduction, in agreement with theoretical models. An extended-WAL model is proposed and utilized in conjunction with a two-channel conduction model to analyze the data, revealing a surface component and evidence of multiple conducting channels. A facile growth method and detailed magnetotransport results indicating BTS as an alternative topological insulator material system are presented.

  16. The Study on Mental Health at Work: Design and sampling.

    Science.gov (United States)

    Rose, Uwe; Schiel, Stefan; Schröder, Helmut; Kleudgen, Martin; Tophoven, Silke; Rauch, Angela; Freude, Gabriele; Müller, Grit

    2017-08-01

    The Study on Mental Health at Work (S-MGA) generates the first nationwide representative survey enabling the exploration of the relationship between working conditions, mental health and functioning. This paper describes the study design, sampling procedures and data collection, and presents a summary of the sample characteristics. S-MGA is a representative study of German employees aged 31-60 years subject to social security contributions. The sample was drawn from the employment register based on a two-stage cluster sampling procedure. Firstly, 206 municipalities were randomly selected from a pool of 12,227 municipalities in Germany. Secondly, 13,590 addresses were drawn from the selected municipalities for the purpose of conducting 4500 face-to-face interviews. The questionnaire covers psychosocial working and employment conditions, measures of mental health, work ability and functioning. Data from personal interviews were combined with employment histories from register data. Descriptive statistics of socio-demographic characteristics and logistic regression analyses were used for comparing population, gross sample and respondents. In total, 4511 face-to-face interviews were conducted. A test for sampling bias revealed that individuals in older cohorts participated more often, while individuals with an unknown educational level, residing in major cities or with a non-German ethnic background were slightly underrepresented. There is no indication of major deviations in characteristics between the basic population and the sample of respondents. Hence, S-MGA provides representative data for research on work and health, designed as a cohort study with plans to rerun the survey 5 years after the first assessment.
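
    The two-stage cluster draw described above can be sketched in a few lines. This is a toy illustration with hypothetical municipality address counts and an equal-allocation second stage; the study's actual allocation scheme is not specified in the abstract.

```python
import random

random.seed(42)

# Hypothetical frame: 12,227 municipalities with assumed address counts.
sizes = {f"M{i:05d}": random.randint(50, 500) for i in range(12227)}

# Stage 1: draw 206 municipalities (primary sampling units) at random.
psu = random.sample(sorted(sizes), 206)

# Stage 2: draw addresses within each selected municipality until the
# gross sample of 13,590 addresses is reached (equal allocation of
# ceil(13590 / 206) = 66 addresses per municipality, an assumption).
gross_sample = []
for m in psu:
    n_m = min(-(-13590 // 206), sizes[m])
    gross_sample.extend((m, a) for a in random.sample(range(sizes[m]), n_m))
gross_sample = gross_sample[:13590]

print(len(psu), len(gross_sample))
```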

  18. Sampling Strategies and Processing of Biobank Tissue Samples from Porcine Biomedical Models.

    Science.gov (United States)

    Blutke, Andreas; Wanke, Rüdiger

    2018-03-06

    In translational medical research, porcine models have steadily become more popular. Considering the high value of individual animals, particularly of genetically modified pig models, and the often-limited number of available animals of these models, establishment of (biobank) collections of adequately processed tissue samples suited for a broad spectrum of subsequent analysis methods, including analyses not specified at the time point of sampling, represents a meaningful approach to take full advantage of the translational value of the model. With respect to the peculiarities of porcine anatomy, comprehensive guidelines have recently been established for standardized generation of representative, high-quality samples from different porcine organs and tissues. These guidelines are essential prerequisites for the reproducibility of results and their comparability between different studies and investigators. The recording of basic data, such as organ weights and volumes, the determination of the sampling locations and of the numbers of tissue samples to be generated, as well as their orientation, size, processing and trimming directions, are relevant factors determining the generalizability and usability of the specimen for molecular, qualitative, and quantitative morphological analyses. Here, an illustrative, practical, step-by-step demonstration of the most important techniques for generation of representative, multi-purpose biobank specimen from porcine tissues is presented. The methods described here include determination of organ/tissue volumes and densities, the application of a volume-weighted systematic random sampling procedure for parenchymal organs by point-counting, determination of the extent of tissue shrinkage related to histological embedding of samples, and generation of randomly oriented samples for quantitative stereological analyses, such as isotropic uniform random (IUR) sections generated by the "Orientator" and "Isector" methods, and vertical

  19. Biochemical and microstructural characteristics of meat samples ...

    African Journals Online (AJOL)

    This study was conducted to compare the efficiency of different plant proteases for changing biochemical and microstructural characteristics in muscle foods. The meat samples from chicken, giant catfish, pork and beef were treated with four types of proteolytic enzymes: Calotropis procera latex proteases, papaya latex ...

  20. Using machine learning to accelerate sampling-based inversion

    Science.gov (United States)

    Valentine, A. P.; Sambridge, M.

    2017-12-01

    In most cases, a complete solution to a geophysical inverse problem (including robust understanding of the uncertainties associated with the result) requires a sampling-based approach. However, the computational burden is high, and proves intractable for many problems of interest. There is therefore considerable value in developing techniques that can accelerate sampling procedures. The main computational cost lies in evaluation of the forward operator (e.g. calculation of synthetic seismograms) for each candidate model. Modern machine learning techniques, such as Gaussian Processes, offer a route for constructing a computationally cheap approximation to this calculation, which can replace the accurate solution during sampling. Importantly, the accuracy of the approximation can be refined as inversion proceeds, to ensure high-quality results. In this presentation, we describe and demonstrate this approach, which can be seen as an extension of popular current methods, such as the Neighbourhood Algorithm, and which bridges the gap between prior- and posterior-sampling frameworks.
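
    As an illustration of the general idea, not of the authors' Gaussian-Process implementation, here is a minimal surrogate-accelerated sampler: exact forward evaluations are cached, and a cheap interpolant stands in for them whenever a nearby exact evaluation exists, so the approximation refines itself as sampling proceeds. The forward operator, tolerance, and proposal scale are all invented for the sketch.

```python
import bisect, math, random

def forward(m):
    """'Expensive' forward operator (a hypothetical stand-in for, e.g.,
    a synthetic-seismogram calculation)."""
    return math.sin(3.0 * m) + 0.5 * m

d_obs = forward(0.4)   # observed datum for a known true model m = 0.4

# Cache of exact evaluations, kept sorted by model parameter m.
cache_m, cache_f = [], []
exact_calls = 0

def surrogate(m, tol=0.05):
    """Piecewise-linear surrogate refined on demand: if no cached exact
    evaluation lies within `tol` of m, compute one and store it."""
    global exact_calls
    i = bisect.bisect_left(cache_m, m)
    near = [j for j in (i - 1, i) if 0 <= j < len(cache_m)]
    if not near or min(abs(cache_m[j] - m) for j in near) > tol:
        f = forward(m); exact_calls += 1
        cache_m.insert(i, m); cache_f.insert(i, f)
        return f
    if len(near) == 2:
        m0, m1 = cache_m[near[0]], cache_m[near[1]]
        f0, f1 = cache_f[near[0]], cache_f[near[1]]
        w = (m - m0) / (m1 - m0) if m1 != m0 else 0.0
        return f0 + w * (f1 - f0)           # linear interpolation
    return cache_f[near[0]]

random.seed(1)
m, n_steps = 0.0, 2000
for _ in range(n_steps):                    # random-walk Metropolis sampler
    m_new = m + random.gauss(0.0, 0.2)
    logp_old = -0.5 * ((surrogate(m) - d_obs) / 0.1) ** 2
    logp_new = -0.5 * ((surrogate(m_new) - d_obs) / 0.1) ** 2
    if math.log(random.random()) < logp_new - logp_old:
        m = m_new

print(exact_calls, n_steps)   # far fewer exact evaluations than proposals
```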

  1. Low-sampling-rate ultra-wideband digital receiver using equivalent-time sampling

    KAUST Repository

    Ballal, Tarig; Al-Naffouri, Tareq Y.

    2014-09-01

    In this paper, we propose an all-digital scheme for ultra-wideband symbol detection. In the proposed scheme, the received symbols are sampled many times below the Nyquist rate. It is shown that when the number of symbol repetitions, P, is co-prime with the symbol duration given in Nyquist samples, the receiver can sample the received data P times below the Nyquist rate, without loss of fidelity. The proposed scheme is applied to perform channel estimation and binary pulse position modulation (BPPM) detection. Results are presented for two receivers operating at two different sampling rates that are 10 and 20 times below the Nyquist rate. The feasibility of the proposed scheme is demonstrated in different scenarios, with reasonable bit error rates obtained in most of the cases.
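
    The co-prime condition is easy to verify numerically. The sketch below (our illustration, not the authors' receiver) shows that taking one sample per repeated symbol while advancing P Nyquist intervals each time visits every one of the N sample positions exactly when gcd(P, N) = 1:

```python
from math import gcd

def equivalent_time_positions(N, P):
    """Distinct sample positions (mod the symbol length N) visited when
    one sample is taken per repeated symbol, advancing P Nyquist
    intervals between samples.

    N : symbol duration in Nyquist samples
    P : number of symbol repetitions (= undersampling factor)
    """
    return sorted({(k * P) % N for k in range(N)})

N, P = 20, 7                     # gcd(20, 7) = 1: full coverage
assert gcd(N, P) == 1
print(equivalent_time_positions(N, P))    # every position 0..19 appears

N2, P2 = 20, 10                  # gcd(20, 10) = 10: coverage collapses
print(equivalent_time_positions(N2, P2))  # only positions 0 and 10
```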

  3. Small sample whole-genome amplification

    Science.gov (United States)

    Hara, Christine; Nguyen, Christine; Wheeler, Elizabeth; Sorensen, Karen; Arroyo, Erin; Vrankovich, Greg; Christian, Allen

    2005-11-01

    Many challenges arise when trying to amplify and analyze human samples collected in the field due to limitations in sample quantity and contamination of the starting material. Tests such as DNA fingerprinting and mitochondrial typing require a certain sample size and are carried out in large volume reactions; in cases where insufficient sample is present, whole genome amplification (WGA) can be used. WGA allows very small quantities of DNA to be amplified in a way that enables subsequent DNA-based tests to be performed. A limiting step to WGA is sample preparation. To minimize the necessary sample size, we have developed two modifications of WGA: the first allows for an increase in amplified product from small, nanoscale, purified samples with the use of carrier DNA while the second is a single-step method for cleaning and amplifying samples all in one column. Conventional DNA cleanup involves binding the DNA to silica, washing away impurities, and then releasing the DNA for subsequent testing. We have eliminated losses associated with incomplete sample release, thereby decreasing the required amount of starting template for DNA testing. Both techniques address the limitations of sample size by providing ample copies of genomic samples. Carrier DNA, included in our WGA reactions, can be used when amplifying samples with the standard purification method, or can be used in conjunction with our single-step DNA purification technique to potentially further decrease the amount of starting sample necessary for future forensic DNA-based assays.

  4. Reactor water sampling device

    International Nuclear Information System (INIS)

    Sakamaki, Kazuo.

    1992-01-01

    The present invention concerns a reactor water sampling device for sampling reactor water in an in-core monitor (neutron measuring tube) housing in a BWR type reactor. The upper end portion of a drain pipe of the reactor water sampling device is attached detachably to an in-core monitor flange. A push-up rod is inserted in the drain pipe vertically movably. A sampling vessel and a vacuum pump are connected to the lower end of the drain pipe. A vacuum pump is operated to depressurize the inside of the device and move the push-up rod upwardly. Reactor water in the in-core monitor housing flows between the drain pipe and the push-up rod and flows into the sampling vessel. With such a constitution, reactor water in the in-core monitor housing can be sampled rapidly with neither opening the lid of the reactor pressure vessel nor being in contact with air. Accordingly, operator's exposure dose can be reduced. (I.N.)

  5. Quantitative portable gamma-spectroscopy sample analysis for non-standard sample geometries

    International Nuclear Information System (INIS)

    Ebara, S.B.

    1998-01-01

    Utilizing a portable spectroscopy system, a quantitative method for analysis of samples containing a mixture of fission and activation products in nonstandard geometries was developed. This method was not developed to replace other methods such as Monte Carlo or Discrete Ordinates but rather to offer an alternative rapid solution. The method can be used with various sample and shielding configurations where analysis on a laboratory-based gamma-spectroscopy system is impractical. The portable gamma-spectroscopy method involves calibration of the detector and modeling of the sample and shielding to identify and quantify the radionuclides present in the sample. The method utilizes the intrinsic efficiency of the detector and the unattenuated gamma fluence rate at the detector surface per unit activity from the sample to calculate the nuclide activity and Minimum Detectable Activity (MDA). For a complex geometry, a computer code written for shielding applications (MICROSHIELD) is utilized to determine the unattenuated gamma fluence rate per unit activity at the detector surface. Lastly, the method is only applicable to nuclides which emit gamma-rays and cannot be used for pure beta or alpha emitters. In addition, if sample self-absorption and shielding are significant, the attenuation will result in high MDAs for nuclides which solely emit low-energy gamma-rays. The following presents the analysis technique and verification results using actual experimental data, rather than comparisons to other approximations such as Monte Carlo techniques, to demonstrate the accuracy of the method given a known geometry and source term. (author)
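
    The activity and MDA calculations described can be sketched with the widely used Currie detection-limit formula. This is a generic formulation, not the report's exact convention: the efficiency argument below lumps together the intrinsic efficiency and the MICROSHIELD-derived geometry/attenuation factor, and all numbers are illustrative.

```python
import math

def activity(net_counts, live_time_s, efficiency, gamma_yield):
    """Nuclide activity (Bq) from net peak counts."""
    return net_counts / (live_time_s * efficiency * gamma_yield)

def currie_mda(bkg_counts, live_time_s, efficiency, gamma_yield):
    """Minimum Detectable Activity (Bq) via the Currie formula.

    bkg_counts  : background counts in the peak region during live_time_s
    efficiency  : overall detection efficiency at the gamma energy
                  (intrinsic efficiency x geometry/attenuation factors)
    gamma_yield : gamma emissions per decay for the line of interest
    """
    ld = 2.71 + 4.65 * math.sqrt(bkg_counts)   # detection limit, counts
    return ld / (live_time_s * efficiency * gamma_yield)

# Example: Cs-137 662 keV line, 600 s count, 1% overall efficiency.
print(activity(5000, 600, 0.01, 0.851))
print(currie_mda(400, 600, 0.01, 0.851))
```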

  6. Perpendicular distance sampling: an alternative method for sampling downed coarse woody debris

    Science.gov (United States)

    Michael S. Williams; Jeffrey H. Gove

    2003-01-01

    Coarse woody debris (CWD) plays an important role in many forest ecosystem processes. In recent years, a number of new methods have been proposed to sample CWD. These methods select individual logs into the sample using some form of unequal probability sampling. One concern with most of these methods is the difficulty in estimating the volume of each log. A new method...
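
    The unequal-probability selection underlying such methods can be illustrated with a generic Horvitz-Thompson sketch (a hypothetical log population and inclusion probabilities, not the perpendicular-distance geometry itself). When the inclusion probability is exactly proportional to log volume, each selected log contributes a constant to the estimate, which is the property these size-biased designs exploit:

```python
import random

random.seed(7)

# Hypothetical population of 500 downed logs with known volumes (m^3).
volumes = [random.uniform(0.05, 1.5) for _ in range(500)]
total = sum(volumes)

# Each log enters the sample independently with probability proportional
# to its volume, targeting about 40 selections in expectation.
c = 40.0 / total
pi = [min(1.0, c * v) for v in volumes]
sample = [(v, p) for v, p in zip(volumes, pi) if random.random() < p]

# Horvitz-Thompson estimator of the total volume: sum of v_i / pi_i.
# Because pi_i is proportional to v_i, every selected log contributes the
# same amount (total / 40), so volume estimation reduces to counting logs.
ht_total = sum(v / p for v, p in sample)
print(round(total, 1), round(ht_total, 1))
```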

  7. Applications of Liquid-Phase Microextraction in the Sample Preparation of Environmental Solid Samples

    OpenAIRE

    Helena Prosen

    2014-01-01

    Solvent extraction remains one of the fundamental sample preparation techniques in the analysis of environmental solid samples, but organic solvents are toxic and environmentally harmful, therefore one of the possible greening directions is its miniaturization. The present review covers the relevant research from the field of application of microextraction to the sample preparation of environmental solid samples (soil, sediments, sewage sludge, dust etc.) published in the last decade. Several...

  8. Operability test procedure for PFP wastewater sampling facility

    International Nuclear Information System (INIS)

    Hirzel, D.R.

    1995-01-01

    This document provides instructions for performing the Operability Test of the 225-WC Wastewater Sampling Station, which monitors the discharge to the Treated Effluent Disposal Facility from the Plutonium Finishing Plant. This Operability Test Procedure (OTP) has been prepared to verify correct configuration and performance of the PFP Wastewater sampling system installed in Building 225-WC located outside the perimeter fence southeast of the Plutonium Finishing Plant (PFP). The objective of this test is to ensure the equipment in the sampling facility operates in a safe and reliable manner. The sampler consists of two Manning Model S-5000 units which are rate-controlled by the Milltronics ultrasonic flowmeter at manhole No. C4 and by a pH-measuring system with the sensor in the stream adjacent to the sample point. The intent of the dual sampling system is to utilize one unit to sample continuously at a rate proportional to the wastewater flow rate so that the aggregate tests are related to the overall flow, thereby eliminating isolated analyses. The second unit will only operate during a high or low pH excursion of the stream (hence the need for pH control). The major items in this OTP include testing of the Manning Sampler System and associated equipment including the pH measuring and control system, the conductivity monitor, and the flow meter.

  9. Sample preparation prior to the LC-MS-based metabolomics/metabonomics of blood-derived samples.

    Science.gov (United States)

    Gika, Helen; Theodoridis, Georgios

    2011-07-01

    Blood represents a very important biological fluid and has been the target of continuous and extensive research for diagnostic, or health and drug monitoring reasons. Recently, metabonomics/metabolomics have emerged as a new and promising 'omics' platform that shows potential in biomarker discovery, especially in areas such as disease diagnosis, assessment of drug efficacy or toxicity. Blood is collected in various establishments in conditions that are not standardized. Next, the samples are prepared and analyzed using different methodologies or tools. When targeted analysis of key molecules (e.g., a drug or its metabolite[s]) is the aim, enforcement of certain measures or additional analyses may correct and harmonize these discrepancies. In omics fields such as those performed by holistic analytical approaches, no such rules or tools are available. As a result, comparison or correlation of results or data fusion becomes impractical. However, it becomes evident that such obstacles should be overcome in the near future to allow for large-scale studies that involve the assaying of samples from hundreds of individuals. In this case the effect of sample handling and preparation becomes critical, if months of expert work and expensive instrument time are not to be wasted. The present review aims to cover the different methodologies applied to the pretreatment of blood prior to LC-MS metabolomic/metabonomic studies. The article tries to critically compare the methods and highlight issues that need to be addressed.

  10. Measuring Happiness: From Fluctuating Happiness to Authentic–Durable Happiness

    Science.gov (United States)

    Dambrun, Michaël; Ricard, Matthieu; Després, Gérard; Drelon, Emilie; Gibelin, Eva; Gibelin, Marion; Loubeyre, Mélanie; Py, Delphine; Delpy, Aurore; Garibbo, Céline; Bray, Elise; Lac, Gérard; Michaux, Odile

    2012-01-01

    On the basis of the theoretical distinction between self-centeredness and selflessness (Dambrun and Ricard, 2011), the main goal of this research was to develop two new scales assessing distinct dimensions of happiness. By trying to maximize pleasures and to avoid displeasures, we propose that a self-centered functioning induces a fluctuating happiness in which phases of pleasure and displeasure alternate repeatedly (i.e., Fluctuating Happiness). In contrast, a selfless psychological functioning postulates the existence of a state of durable plenitude that is less dependent upon circumstances but rather is related to a person’s inner resources and abilities to deal with whatever comes his way in life (i.e., Authentic–Durable Happiness). Using various samples (n = 735), we developed a 10-item Scale measuring Subjective Fluctuating Happiness (SFHS) and a 13-item scale assessing Subjective Authentic–Durable Happiness (SA–DHS). Results indicated high internal consistencies, satisfactory test–retest validities, and adequate convergent and discriminant validities with various constructs including a biological marker of stress (salivary cortisol). Consistent with our theoretical framework, while self-enhancement values were related only to fluctuating happiness, self-transcendence values were related only to authentic–durable happiness. Support for the distinction between contentment and inner-peace, two related markers of authentic happiness, also was found. PMID:22347202

  11. Mars Sample Handling Functionality

    Science.gov (United States)

    Meyer, M. A.; Mattingly, R. L.

    2018-04-01

    The final leg of a Mars Sample Return campaign would be an entity that we have referred to as Mars Returned Sample Handling (MRSH). This talk will address our current view of the functional requirements on MRSH, focused on the Sample Receiving Facility (SRF).

  12. An evaluation of soil sampling for 137Cs using various field-sampling volumes.

    Science.gov (United States)

    Nyhan, J W; White, G C; Schofield, T G; Trujillo, G

    1983-05-01

    The sediments from a liquid effluent receiving area at the Los Alamos National Laboratory and soils from an intensive study area in the fallout pathway of Trinity were sampled for 137Cs using 25-, 500-, 2500- and 12,500-cm3 field sampling volumes. A highly replicated sampling program was used to determine mean concentrations and inventories of 137Cs at each site, as well as estimates of spatial, aliquoting, and counting variance components of the radionuclide data. The sampling methods were also analyzed as a function of soil size fractions collected in each field sampling volume and of the total cost of the program for a given variation in the radionuclide survey results. Coefficients of variation (CV) of 137Cs inventory estimates ranged from 0.063 to 0.14 for Mortandad Canyon sediments, whereas CV values for Trinity soils were observed from 0.38 to 0.57. Spatial variance components of 137Cs concentration data were usually found to be larger than either the aliquoting or counting variance estimates and were inversely related to field sampling volume at the Trinity intensive site. Subsequent optimization studies of the sampling schemes demonstrated that each aliquot should be counted once, and that only 2-4 aliquots out of as many as 30 collected need be assayed for 137Cs. The optimization studies showed that as sample costs increased to 45 man-hours of labor per sample, the variance of the mean 137Cs concentration decreased dramatically, but decreased very little with additional labor.
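
    The cost-variance optimization described (count each aliquot once; use only a few of the collected aliquots) can be sketched as a grid search over a nested design. The variance components and unit costs below are illustrative assumptions, not the study's fitted values:

```python
# Nested design for soil 137Cs surveys: n field samples, a aliquots per
# sample, c radionuclide counts per aliquot. Variance components and unit
# costs (man-hours) are invented for illustration.
s2_spatial, s2_aliquot, s2_count = 1.00, 0.30, 0.05
cost_sample, cost_aliquot, cost_count = 5.0, 0.2, 0.5

def var_mean(n, a, c):
    """Variance of the mean concentration under the nested design."""
    return s2_spatial / n + s2_aliquot / (n * a) + s2_count / (n * a * c)

def total_cost(n, a, c):
    return n * (cost_sample + a * (cost_aliquot + c * cost_count))

# For a fixed labor budget, grid-search the allocation minimizing variance.
budget = 120.0
best = None
for a in range(1, 11):
    for c in range(1, 6):
        unit = cost_sample + a * (cost_aliquot + c * cost_count)
        n = int(budget // unit)
        if n < 1:
            continue
        v = var_mean(n, a, c)
        if best is None or v < best[0]:
            best = (v, n, a, c)

print(best)   # with these inputs the optimum counts each aliquot once (c = 1)
```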

  13. Waste classification sampling plan

    International Nuclear Information System (INIS)

    Landsman, S.D.

    1998-01-01

    The purpose of this sampling is to explain the method used to collect and analyze data necessary to verify and/or determine the radionuclide content of the B-Cell decontamination and decommissioning waste stream so that the correct waste classification for the waste stream can be made, and to collect samples for studies of decontamination methods that could be used to remove fixed contamination present on the waste. The scope of this plan is to establish the technical basis for collecting samples and compiling quantitative data on the radioactive constituents present in waste generated during deactivation activities in B-Cell. Sampling and radioisotopic analysis will be performed on the fixed layers of contamination present on structural material and internal surfaces of process piping and tanks. In addition, dose rate measurements on existing waste material will be performed to determine the fraction of dose rate attributable to both removable and fixed contamination. Samples will also be collected to support studies of decontamination methods that are effective in removing the fixed contamination present on the waste. Sampling performed under this plan will meet criteria established in BNF-2596, Data Quality Objectives for the B-Cell Waste Stream Classification Sampling, J. M. Barnett, May 1998

  14. Sequential effects in pigeon delayed matching-to-sample performance.

    Science.gov (United States)

    Roitblat, H L; Scopatz, R A

    1983-04-01

    Pigeons were tested in a three-alternative delayed matching-to-sample task in which second-choices were permitted following first-choice errors. Sequences of responses both within and between trials were examined in three experiments. The first experiment demonstrates that the sample information contained in first-choice errors is not sufficient to account for the observed pattern of second choices. This result implies that second-choices following first-choice errors are based on a second examination of the contents of working memory. Proactive interference was found in the second experiment in the form of a dependency, beyond that expected on the basis of trial independent response bias, of first-choices from one trial on the first-choice emitted on the previous trial. Samples from the previous trial were not found to exert a significant influence on later trials. The magnitude of the intertrial association (Experiment 3) did not depend on the duration of the intertrial interval. In contrast, longer intertrial intervals and longer sample durations did facilitate choice accuracy, by strengthening the association between current samples and choices. These results are incompatible with a trace-decay and competition model; they suggest strongly that multiple influences act simultaneously and independently to control delayed matching-to-sample responding. These multiple influences include memory for the choice occurring on the previous trial, memory for the sample, and general effects of trial spacing.

  15. Toward cost-efficient sampling methods

    Science.gov (United States)

    Luo, Peng; Li, Yongli; Wu, Chong; Zhang, Guijie

    2015-09-01

    Sampling methods have received much attention in the field of complex networks in general and statistical physics in particular. This paper proposes two new sampling methods based on the idea that a small fraction of vertices with high node degree can carry most of the structural information of a complex network. The two proposed methods are efficient at sampling high-degree nodes, so they remain useful even when the sampling rate is low, which makes them cost-efficient. The first method builds on the widely used stratified random sampling (SRS) method and the second improves the well-known snowball sampling (SBS) method. To demonstrate the validity and accuracy of the two new methods, we compare them with existing sampling methods on three commonly used synthetic networks (scale-free, random, and small-world) and on two real networks. The experimental results illustrate that the two proposed methods perform much better than the existing ones at recovering true network structure characteristics, as reflected by clustering coefficient, Bonacich centrality, and average path length, especially when the sampling rate is low.
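
    The second idea, snowball sampling biased toward high-degree nodes, can be sketched as follows (our toy construction on a preferential-attachment graph, not the authors' exact algorithm): the frontier is kept in a priority queue so hubs enter the sample early, and the sampled nodes end up with a much higher average degree than the network as a whole.

```python
import heapq, random

random.seed(0)

# Build a small preferential-attachment graph (scale-free-like).
adj = {0: {1}, 1: {0}}
targets = [0, 1]                 # degree-proportional attachment pool
for v in range(2, 500):
    picks = set(random.choices(targets, k=2))
    adj[v] = set()
    for u in picks:
        adj[v].add(u); adj[u].add(v)
        targets += [u, v]

def degree_snowball(adj, seed, budget):
    """Snowball sampling that always expands the highest-degree frontier
    node first, so high-degree hubs enter the sample early."""
    frontier = [(-len(adj[seed]), seed)]
    sampled, seen = [], {seed}
    while frontier and len(sampled) < budget:
        _, v = heapq.heappop(frontier)
        sampled.append(v)
        for w in adj[v]:
            if w not in seen:
                seen.add(w)
                heapq.heappush(frontier, (-len(adj[w]), w))
    return sampled

sample = degree_snowball(adj, seed=2, budget=50)   # 10% sampling rate
avg_all = sum(len(adj[v]) for v in adj) / len(adj)
avg_smp = sum(len(adj[v]) for v in sample) / len(sample)
print(round(avg_all, 2), round(avg_smp, 2))  # sample skews to high degree
```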

  16. Quantum fluctuations of the Coulomb potential as a source of flicker noise: the influence of external electric field

    International Nuclear Information System (INIS)

    Kazakov, Kirill A

    2006-01-01

    Fluctuations of the electromagnetic field produced by quantized matter in an external electric field are investigated. A general expression for the power spectrum of fluctuations is derived within the long-range expansion. It is found that in the whole measured frequency band, the power spectrum of fluctuations exhibits an inverse frequency dependence. A general argument is given showing that for all practically relevant values of the electric field, the power spectrum of induced fluctuations is proportional to the field strength squared. As an illustration, the power spectrum is calculated explicitly using a kinetic model with a relaxation-type collision term. Finally, it is shown that the magnitude of fluctuations produced by a sample generally has a Gaussian distribution around its mean value, and its dependence on the sample geometry is determined. In particular, it is demonstrated that for geometrically similar samples the power spectrum is inversely proportional to the sample volume. Application of the results obtained to the problem of flicker noise is discussed
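
    The scaling statements above can be collected into one schematic relation (our paraphrase of the abstract, with C a geometry-dependent constant, E the applied field strength, f the frequency, and V the sample volume):

```latex
S(f) \;\propto\; \frac{E^{2}}{f},
\qquad
S_{\mathrm{sample}}(f) \;\simeq\; \frac{C\,E^{2}}{V\,f}
\quad \text{for geometrically similar samples.}
```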

  17. A method to combine non-probability sample data with probability sample data in estimating spatial means of environmental variables

    NARCIS (Netherlands)

    Brus, D.J.; Gruijter, de J.J.

    2003-01-01

    In estimating spatial means of environmental variables of a region from data collected by convenience or purposive sampling, validity of the results can be ensured by collecting additional data through probability sampling. The precision of the pi estimator that uses the probability sample can be

  18. Efficiently sampling conformations and pathways using the concurrent adaptive sampling (CAS) algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Ahn, Surl-Hee; Grate, Jay W.; Darve, Eric F.

    2017-08-21

    Molecular dynamics (MD) simulations are useful in obtaining thermodynamic and kinetic properties of bio-molecules but are limited by the timescale barrier, i.e., we may be unable to efficiently obtain properties because we need to run microseconds or longer simulations using femtosecond time steps. While there are several existing methods to overcome this timescale barrier and efficiently sample thermodynamic and/or kinetic properties, problems remain in regard to being able to sample unknown systems, deal with high-dimensional space of collective variables, and focus the computational effort on slow timescales. Hence, a new sampling method, called the "Concurrent Adaptive Sampling (CAS) algorithm," has been developed to tackle these three issues and efficiently obtain conformations and pathways. The method is not constrained to use only one or two collective variables, unlike most reaction coordinate-dependent methods. Instead, it can use a large number of collective variables and uses macrostates (a partition of the collective variable space) to enhance the sampling. The exploration is done by running a large number of short simulations, and a clustering technique is used to accelerate the sampling. In this paper, we introduce the new methodology and show results from two-dimensional models and bio-molecules, such as penta-alanine and triazine polymer.
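
    The macrostate-based exploration can be caricatured in one dimension. This is a deliberately simplified sketch with uniform weights and an invented double-well "MD" kernel; the actual CAS algorithm tracks statistical weights through the split/merge steps and uses clustering.

```python
import random

random.seed(3)

def short_md(x, steps=10, dt=0.02):
    """Stand-in for a short MD burst: overdamped dynamics of the 1D
    collective variable x in the double-well U(x) = (x^2 - 1)^2, plus noise."""
    for _ in range(steps):
        grad = 4.0 * x * (x * x - 1.0)
        x += -grad * dt + random.gauss(0.0, 0.3)
    return x

def macrostate(x, width=0.25):
    """Partition of the collective-variable axis into macrostates."""
    return int(x // width)

# CAS-flavored loop, simplified to uniform weights: run many short
# simulations, bin walkers by macrostate, then resample so every occupied
# macrostate holds the same number of walkers. Sparsely visited regions
# therefore receive extra sampling effort.
walkers = [-1.0] * 8           # all walkers start in the left well
per_state = 4                  # target walkers per occupied macrostate
visited = set()
for _ in range(40):
    walkers = [short_md(x) for x in walkers]
    bins = {}
    for x in walkers:
        bins.setdefault(macrostate(x), []).append(x)
    visited |= set(bins)
    walkers = [random.choice(xs) for xs in bins.values()
               for _ in range(per_state)]

print(len(visited))   # number of distinct macrostates explored
```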

  19. Sample size determination for equivalence assessment with multiple endpoints.

    Science.gov (United States)

    Sun, Anna; Dong, Xiaoyu; Tsong, Yi

    2014-01-01

    Equivalence assessment between a reference and test treatment is often conducted by two one-sided tests (TOST). The corresponding power function and sample size determination can be derived from a joint distribution of the sample mean and sample variance. When an equivalence trial is designed with multiple endpoints, it often involves several sets of two one-sided tests. A naive approach for sample size determination in this case would select the largest sample size required for each endpoint. However, such a method ignores the correlation among endpoints. When the objective is to reject all endpoints and the endpoints are uncorrelated, the power function is the product of the power functions for the individual endpoints. With correlated endpoints, the sample size and power should be adjusted for such a correlation. In this article, we propose the exact power function for the equivalence test with multiple endpoints adjusted for correlation under both crossover and parallel designs. We further discuss the differences in sample size for the naive method without and with correlation adjusted methods and illustrate with an in vivo bioequivalence crossover study with area under the curve (AUC) and maximum concentration (Cmax) as the two endpoints.
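The effect of the naive rule can be sketched with a normal-approximation power function for TOST (a simplified parallel-design, known-variance version; the margins, sigmas, and true differences below are invented for illustration, not taken from the article):

```python
from math import erf, sqrt

def norm_cdf(x):
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def z_quantile(p):
    """Standard normal quantile by bisection on the CDF."""
    lo, hi = -10.0, 10.0
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if norm_cdf(mid) < p:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

def tost_power(n, true_diff, margin, sigma, alpha=0.05):
    """Approximate power of TOST for one endpoint, parallel design with
    n subjects per arm, known sigma, equivalence margins +/- margin."""
    se = sigma * sqrt(2.0 / n)
    z = z_quantile(1.0 - alpha)
    power = (norm_cdf((margin - true_diff) / se - z)
             + norm_cdf((margin + true_diff) / se - z) - 1.0)
    return max(0.0, power)

# naive multi-endpoint rule: with independent endpoints the joint power is the
# product of the marginal powers, so it is below either endpoint's own power
p_auc = tost_power(60, 0.05, 0.20, 0.35)
p_cmax = tost_power(60, 0.02, 0.20, 0.40)
p_joint = p_auc * p_cmax
```

Because `p_joint` falls below both marginal powers, sizing each endpoint separately underpowers the trial when all endpoints must be rejected; the correlation-adjusted exact power proposed in the article corrects this.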

  20. Obtaining Samples Representative of Contaminant Distribution in an Aquifer

    International Nuclear Information System (INIS)

    Schalla, Ronald; Spane, Frank A.; Narbutovskih, Susan M.; Conley, Scott F.; Webber, William D.

    2002-01-01

    Historically, groundwater samples collected from monitoring wells have been assumed to provide average indications of contaminant concentrations within the aquifer over the well-screen interval. In-well flow circulation, heterogeneity in the surrounding aquifer, and the sampling method used, however, can significantly affect how representative samples are of actual conditions within the surrounding aquifer. This paper identifies the needs and approaches essential for providing cost-effective and technically meaningful groundwater-monitoring results. Proper design of the well-screen interval is critical. An accurate understanding of ambient (non-pumping) flow conditions within the monitoring well is essential for determining the contaminant distribution within the aquifer. The ambient in-well flow velocity, flow direction, and volumetric flux rate are key to this understanding. Ambient flow conditions need to be identified not only for preferential flow zones, but also for the changes likely to be imposed under the dynamic conditions that occur during groundwater sampling. Once the in-well flow conditions are understood, effective sampling can be conducted to obtain representative samples for specific depth zones or zones of interest. The question of sample representativeness has become an important issue as waste-minimization techniques such as low-flow purging and sampling are implemented to combat the increasing cost of well purging and sampling at many hazardous waste sites. Several technical approaches (e.g., well tracer techniques and flowmeter surveys) can be used to determine in-well flow conditions, and these are discussed with respect to both their usefulness and their limitations. Proper fluid extraction methods using minimal (low) volume and no-purge sampling to obtain representative samples of aquifer conditions are presented

  1. Tomographic imaging of 12 fracture samples selected from Olkiluoto deep drillholes

    International Nuclear Information System (INIS)

    Kuva, J.; Voutilainen, M.; Timonen, J.; Aaltonen, I.

    2010-06-01

    Rock samples from Olkiluoto were imaged with X-ray tomography to analyze the distributions of mineral components and the alteration of rock around different fracture types. Twelve samples containing three types of fractures were analyzed, and each sample was scanned at two different resolutions. Three-dimensional reconstructions of the samples with four or five distinct mineral components displayed changes in the mineral distribution around previously water-conducting fractures, extending several millimeters from the fracture surfaces. In addition, the structure of fracture-filling minerals is depicted. (orig.)

  2. Jenis Sample: Keuntungan dan Kerugiannya

    OpenAIRE

    Suprapto, Agus

    1994-01-01

    A sample is a part of a population used in a study to make estimates about the nature of the total population, obtained with a sampling technique. Sampling is more advantageous than a census because it reduces cost and time and can gather deeper information and more accurate data. It is useful to distinguish two major types of sampling techniques: first, probability sampling, e.g., simple random sampling; second, non-probability sampling, e.g., systematic sampling...

  3. Environmental and emergency response capabilities of Los Alamos Scientific Laboratory's radiological air sampling program

    International Nuclear Information System (INIS)

    Gunderson, T.C.

    1980-05-01

    Environmental and emergency response radiological air sampling capabilities of the Environmental Surveillance Group at Los Alamos Scientific Laboratory are described. The air sampling program provides a supplementary check on the adequacy of containment and effluent controls, determines compliance with applicable protection guides and standards, and assesses potential environmental impacts on site environs. It also allows evaluation of potential individual and total population doses from airborne radionuclides that may be inhaled or serve as a source of external radiation. The environmental program is sufficient in scope to detect fluctuations and long-term trends in atmospheric levels of radioactivity originating onsite. The emergency response capabilities are designed to respond to both onsite unplanned releases and atmospheric nuclear tests

  4. Are samples drawn from Mechanical Turk valid for research on political ideology?

    Directory of Open Access Journals (Sweden)

    Scott Clifford

    2015-12-01

    Full Text Available Amazon’s Mechanical Turk (MTurk is an increasingly popular tool for the recruitment of research subjects. While there has been much focus on the demographic differences between MTurk samples and the national public, we know little about whether liberals and conservatives recruited from MTurk share the same psychological dispositions as their counterparts in the mass public. In the absence of such evidence, some have argued that the selection process involved in joining MTurk invalidates the subject pool for studying questions central to political science. In this paper, we evaluate this claim by comparing a large MTurk sample to two benchmark national samples – one conducted online and one conducted face-to-face. We examine the personality and value-based motivations of political ideology across the three samples. All three samples produce substantively identical results with only minor variation in effect sizes. In short, liberals and conservatives in our MTurk sample closely mirror the psychological divisions of liberals and conservatives in the mass public, though MTurk liberals hold more characteristically liberal values and attitudes than liberals from representative samples. Overall, our results suggest that MTurk is a valid recruitment tool for psychological research on political ideology.

  5. Development and Demonstration of a Method to Evaluate Bio-Sampling Strategies Using Building Simulation and Sample Planning Software.

    Science.gov (United States)

    Dols, W Stuart; Persily, Andrew K; Morrow, Jayne B; Matzke, Brett D; Sego, Landon H; Nuffer, Lisa L; Pulsipher, Brent A

    2010-01-01

    In an effort to validate and demonstrate response and recovery sampling approaches and technologies, the U.S. Department of Homeland Security (DHS), along with several other agencies, has simulated a biothreat agent release within a facility at Idaho National Laboratory (INL) on two separate occasions in the fall of 2007 and the fall of 2008. Because these events constitute only two realizations of many possible scenarios, increased understanding of sampling strategies can be obtained by virtually examining a wide variety of release and dispersion scenarios using computer simulations. This research effort demonstrates the use of two software tools, CONTAM, developed by the National Institute of Standards and Technology (NIST), and Visual Sample Plan (VSP), developed by Pacific Northwest National Laboratory (PNNL). The CONTAM modeling software was used to virtually contaminate a model of the INL test building under various release and dissemination scenarios as well as a range of building design and operation parameters. The results of these CONTAM simulations were then used to investigate the relevance and performance of various sampling strategies using VSP. One of the fundamental outcomes of this project was the demonstration of how CONTAM and VSP can be used together to effectively develop sampling plans to support the various stages of response to an airborne chemical, biological, radiological, or nuclear event. Following such an event (or prior to an event), incident details and the conceptual site model could be used to create an ensemble of CONTAM simulations which model contaminant dispersion within a building. These predictions could then be used to identify priority area zones within the building, and sampling designs and strategies could then be developed based on those zones.

  6. Environmental monitoring master sampling schedule, January--December 1990

    Energy Technology Data Exchange (ETDEWEB)

    Bisping, L.E.

    1990-01-01

    Environmental monitoring of the Hanford Site is conducted by the Pacific Northwest Laboratory (PNL) for the US Department of Energy (DOE). This document contains the planned schedule for routine sample collection for calendar year 1990 for the Environment Surveillance and Ground-Water Monitoring Projects. This schedule is subject to modification during the year in response to changes in Site operations, program requirements, and the nature of the observed results. Operational limitations such as weather, mechanical failures, sample availability, etc., may also require schedule modifications. Changes will be documented in the respective project files, but this plan will not be reissued. The purpose of these monitoring projects is to evaluate levels of radioactive and nonradioactive pollutants in the Hanford environs. This schedule includes ground-water sampling performed by PNL for environmental surveillance of the Hanford Site.

  7. Uniform Sampling Table Method and its Applications II--Evaluating the Uniform Sampling by Experiment.

    Science.gov (United States)

    Chen, Yibin; Chen, Jiaxi; Chen, Xuan; Wang, Min; Wang, Wei

    2015-01-01

    A new method of uniform sampling is evaluated in this paper. Items and indexes were adopted to evaluate the rationality of the uniform sampling. The evaluation items included convenience of operation, uniformity of sampling site distribution, and accuracy and precision of measured results. The evaluation indexes included operational complexity, occupation rate of sampling sites in a row and column, relative accuracy of pill weight, and relative deviation of pill weight. They were obtained from three kinds of drugs with different shapes and sizes by four kinds of sampling methods. Gray correlation analysis was adopted to make the comprehensive evaluation by comparing it with the standard method. The experimental results showed that the convenience of the uniform sampling method was 1 (100%), the odds ratio of the occupation rate in a row and column was infinity, relative accuracy was 99.50-99.89%, and reproducibility RSD was 0.45-0.89%, and the weighted incidence degree exceeded that of the standard method. Hence, the uniform sampling method is easy to operate, and the selected samples are distributed uniformly. The experimental results demonstrated that the uniform sampling method has good accuracy and reproducibility, and it can be put into use in drug analysis.

  8. Sampling and Analysis Plan for Verification Sampling of LANL-Derived Residual Radionuclides in Soils within Tract A-18-2 for Land Conveyance

    Energy Technology Data Exchange (ETDEWEB)

    Ruedig, Elizabeth [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-08-30

    Public Law 105-119 directs the U.S. Department of Energy (DOE) to convey or transfer parcels of land to the Incorporated County of Los Alamos or their designees and to the Department of Interior, Bureau of Indian Affairs, in trust for the Pueblo de San Ildefonso. Los Alamos National Security is tasked to support DOE in conveyance and/or transfer of identified land parcels no later than September 2022. Under DOE Order 458.1, Radiation Protection of the Public and the Environment (O458.1, 2013) and Los Alamos National Laboratory (LANL or the Laboratory) implementing Policy 412 (P412, 2014), real property with the potential to contain residual radioactive material must meet the criteria for clearance and release to the public. This Sampling and Analysis Plan (SAP) describes a second investigation of Tract A-18-2 to verify the previous sampling results (LANL 2017). The plan requires 18 project-specific soil samples for use in radiological clearance decisions consistent with LANL Procedure ENV-ES-TP-238 (2015a) and guidance in the Multi-Agency Radiation Survey and Site Investigation Manual (MARSSIM, 2000). The sampling work will be conducted by LANL, and samples will be evaluated by a LANL-contracted independent lab. However, there will be federal review (verification) of all steps of the sampling process.

  9. System to determine present elements in oily samples

    International Nuclear Information System (INIS)

    Mendoza G, Y.

    2004-11-01

    The Chemistry Department of the National Institute of Nuclear Investigations of Mexico analyzes samples of oleaginous and other materials to determine which elements of the periodic table are present, using the neutron activation analysis (NAA) technique. This technique has been developed to determine the major elements in any solid, aqueous, industrial, or environmental sample: a sample is irradiated with neutrons from the TRIGA Mark III reactor, the gamma spectra it emits are acquired and analyzed, and the information is then processed. The quantification step of the analysis is currently carried out manually and requires a great quantity of calculations. The main objective of this project is to develop software that performs the quantitative NAA analysis for multielemental determination of samples automatically. The work is divided into four chapters. The first chapter briefly presents the history of radioactivity and the basic concepts needed to follow this work. The second chapter explains the NAA technique used in the sample analysis, describes the process to be carried out, lists the characteristics of the devices used, and illustrates the process with an example. The third chapter describes the development of the algorithm and the selection of the programming language. The fourth chapter shows the structure of the system, its general mode of operation, the execution of processes, and how results are obtained. Finally, the results produced during this project are presented. (Author)

  10. Enhanced Sampling in Free Energy Calculations: Combining SGLD with the Bennett's Acceptance Ratio and Enveloping Distribution Sampling Methods.

    Science.gov (United States)

    König, Gerhard; Miller, Benjamin T; Boresch, Stefan; Wu, Xiongwu; Brooks, Bernard R

    2012-10-09

    One of the key requirements for the accurate calculation of free energy differences is proper sampling of conformational space. Especially in biological applications, molecular dynamics simulations are often confronted with rugged energy surfaces and high energy barriers, leading to insufficient sampling and, in turn, poor convergence of the free energy results. In this work, we address this problem by employing enhanced sampling methods. We explore the possibility of using self-guided Langevin dynamics (SGLD) to speed up the exploration process in free energy simulations. To obtain improved free energy differences from such simulations, it is necessary to account for the effects of the bias due to the guiding forces. We demonstrate how this can be accomplished for Bennett's acceptance ratio (BAR) and the enveloping distribution sampling (EDS) methods. While BAR is considered among the most efficient methods available for free energy calculations, the EDS method developed by Christ and van Gunsteren is a promising development that reduces the computational costs of free energy calculations by simulating a single reference state. To evaluate the accuracy of both approaches in connection with enhanced sampling, EDS was implemented in CHARMM. For testing, we employ benchmark systems with analytical reference results and the mutation of alanine to serine. We find that SGLD with reweighting can provide accurate results for BAR and EDS where conventional molecular dynamics simulations fail. In addition, we compare the performance of EDS with other free energy methods. We briefly discuss the implications of our results and provide practical guidelines for conducting free energy simulations with SGLD.
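The BAR estimator itself is compact to sketch: with equal numbers of forward and reverse work values (and beta = 1), the free energy difference solves a self-consistent equation in the Fermi function. The sketch below solves it by bisection on synthetic Gaussian work values (a toy stand-in for real simulation output, not the paper's data):

```python
import math
import random

def bar_free_energy(w_f, w_r, lo=-20.0, hi=20.0, iters=60):
    """Self-consistent BAR for equal sample sizes (beta = 1):
    find dF such that sum_i f(W_F,i - dF) = sum_j f(W_R,j + dF),
    with f(x) = 1/(1 + exp(x)); the difference g(dF) is increasing in dF,
    so bisection converges to the unique root."""
    def g(dF):
        s = sum(1.0 / (1.0 + math.exp(w - dF)) for w in w_f)
        return s - sum(1.0 / (1.0 + math.exp(w + dF)) for w in w_r)
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if g(mid) < 0.0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# synthetic Gaussian work distributions satisfying the Crooks relation, dF = 2
rng = random.Random(3)
dF_true, sigma = 2.0, 1.0
w_f = [rng.gauss(dF_true + 0.5 * sigma**2, sigma) for _ in range(10000)]
w_r = [rng.gauss(-dF_true + 0.5 * sigma**2, sigma) for _ in range(10000)]
dF_est = bar_free_energy(w_f, w_r)
```

The reweighting step for SGLD described in the paper would enter through modified weights on the work values; this sketch shows only the plain BAR solve.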

  11. Sampling procedures and tables

    International Nuclear Information System (INIS)

    Franzkowski, R.

    1980-01-01

    Characteristics, defects, defectives - Sampling by attributes and by variables - Sample versus population - Frequency distributions for the number of defectives or the number of defects in the sample - Operating characteristic curve, producer's risk, consumer's risk - Acceptable quality level AQL - Average outgoing quality AOQ - Standard ISQ 2859 - Fundamentals of sampling by variables for fraction defective. (RW)
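The operating characteristic curve mentioned here is simply the binomial probability of acceptance under a single-sampling-by-attributes plan; a small sketch (the plan parameters n = 80, c = 2 and the quality levels are invented for illustration):

```python
from math import comb

def p_accept(n, c, p):
    """Operating characteristic: probability that a lot with true fraction
    defective p is accepted under the plan (sample n, accept if <= c defectives)."""
    return sum(comb(n, k) * p**k * (1.0 - p)**(n - k) for k in range(c + 1))

# plan n=80, c=2: evaluate the two classical risks at two quality levels
producers_risk = 1.0 - p_accept(80, 2, 0.01)  # rejecting a lot at AQL = 1%
consumers_risk = p_accept(80, 2, 0.05)        # accepting a lot at 5% defective
```

Sweeping `p` traces the full OC curve, which is how the producer's and consumer's risks of a given (n, c) plan are read off against the AQL.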

  12. Constraining Unsaturated Hydraulic Parameters Using the Latin Hypercube Sampling Method and Coupled Hydrogeophysical Approach

    Science.gov (United States)

    Farzamian, Mohammad; Monteiro Santos, Fernando A.; Khalil, Mohamed A.

    2017-12-01

    The coupled hydrogeophysical approach has proved to be a valuable tool for improving the use of geoelectrical data for hydrological model parameterization. In the coupled approach, hydrological parameters are directly inferred from geoelectrical measurements in a forward manner to eliminate the uncertainty connected to the independent inversion of electrical resistivity data. Several numerical studies have been conducted to demonstrate the advantages of a coupled approach; however, only a few attempts have been made to apply the coupled approach to actual field data. In this study, we developed a 1D coupled hydrogeophysical code to estimate the van Genuchten-Mualem model parameters K_s, n, θ_r, and α from time-lapse vertical electrical sounding data collected during a constant inflow infiltration experiment. The van Genuchten-Mualem parameters were sampled using the Latin hypercube sampling method to provide full coverage of the range of each parameter from their distributions. By applying the coupled approach, vertical electrical sounding data were coupled to hydrological models inferred from van Genuchten-Mualem parameter samples to investigate the feasibility of constraining the hydrological model. The key approaches taken in the study are to (1) integrate electrical resistivity and hydrological data while avoiding data inversion, (2) estimate the total water mass recovery of the electrical resistivity data and consider it in the evaluation of the van Genuchten-Mualem parameters, and (3) correct for the influence of subsurface temperature fluctuations on the electrical resistivity data during the infiltration experiment. The results of the study revealed that the coupled hydrogeophysical approach can improve the value of geophysical measurements in hydrological model parameterization. However, the approach cannot overcome the technical limitations of the geoelectrical method associated with resolution and with water mass recovery.
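Latin hypercube sampling itself is compact to sketch: split each parameter's range into n equal strata, draw exactly one value per stratum, and shuffle the stratum order independently per parameter so every marginal range is fully covered. The parameter ranges below are illustrative placeholders, not values from the paper:

```python
import random

def latin_hypercube(n, bounds, seed=0):
    """Draw n samples of len(bounds) parameters; each parameter's range is
    split into n equal strata and every stratum is sampled exactly once."""
    rng = random.Random(seed)
    cols = []
    for lo, hi in bounds:
        # one uniform draw inside each stratum, then shuffle the stratum order
        col = [lo + (hi - lo) * (k + rng.random()) / n for k in range(n)]
        rng.shuffle(col)
        cols.append(col)
    return list(zip(*cols))  # n tuples, one value per parameter

# hypothetical van Genuchten-Mualem ranges (for illustration only)
bounds = [(1e-6, 1e-4),  # K_s
          (1.1, 3.0),    # n
          (0.0, 0.1),    # theta_r
          (0.5, 5.0)]    # alpha
samples = latin_hypercube(50, bounds)
```

Each hydrological forward model in the coupled scheme would then be run once per sampled parameter tuple.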

  13. Using Group Projects to Assess the Learning of Sampling Distributions

    Science.gov (United States)

    Neidigh, Robert O.; Dunkelberger, Jake

    2012-01-01

    In an introductory business statistics course, student groups used sample data to compare a set of sample means to the theoretical sampling distribution. Each group was given a production measurement with a population mean and standard deviation. The groups were also provided an excel spreadsheet with 40 sample measurements per week for 52 weeks…

  14. System for Earth Sample Registration SESAR: Services for IGSN Registration and Sample Metadata Management

    Science.gov (United States)

    Chan, S.; Lehnert, K. A.; Coleman, R. J.

    2011-12-01

    SESAR, the System for Earth Sample Registration, is an online registry for physical samples collected for Earth and environmental studies. SESAR generates and administers the International Geo Sample Number IGSN, a unique identifier for samples that is dramatically advancing interoperability amongst information systems for sample-based data. SESAR was developed to provide the complete range of registry services, including definition of IGSN syntax and metadata profiles, registration and validation of name spaces requested by users, tools for users to submit and manage sample metadata, validation of submitted metadata, generation and validation of the unique identifiers, archiving of sample metadata, and public or private access to the sample metadata catalog. With the development of SESAR v3, we placed particular emphasis on creating enhanced tools that make metadata submission easier and more efficient for users, and that provide superior functionality for users to manage metadata of their samples in their private workspace MySESAR. For example, SESAR v3 includes a module where users can generate custom spreadsheet templates to enter metadata for their samples, then upload these templates online for sample registration. Once the content of the template is uploaded, it is displayed online in an editable grid format. Validation rules are executed in real-time on the grid data to ensure data integrity. Other new features of SESAR v3 include the capability to transfer ownership of samples to other SESAR users, the ability to upload and store images and other files in a sample metadata profile, and the tracking of changes to sample metadata profiles. In the next version of SESAR (v3.5), we will further improve the discovery, sharing, and registration of samples. For example, we are developing a more comprehensive suite of web services that will allow discovery and registration access to SESAR from external systems. Both batch and individual registrations will be possible.

  15. Identification of chemical contamination in an aqueous sample using liquid chromatography with mass spectrometry during 2nd NATO mixed samples laboratory exercise

    International Nuclear Information System (INIS)

    Grolmusova, K.; Tkac, M.

    2010-01-01

    Biological and radiological screening was conducted to determine the type of biological and radiological contamination in a sample and a reference sample. Biological screening confirmed the presence of biological contamination. Radiological screening confirmed the presence of 235U. Preliminary chemical screening confirmed the presence of volatile chemicals (chemical warfare agents, CWA), but ruled out the presence of non-volatile CWA and their degradation products and precursors (Schedules 1, 2, and 3 of the Organisation for the Prohibition of Chemical Weapons, OPCW). Before further analysis, the aqueous sample had to be treated so as to minimize the possibility of radiological contamination while preserving the chemical contamination. To remove 235U from the water sample, SCX (strong cation exchange) cartridges were used for selective solid-phase extraction of the chemical contamination. GC-MS and LC-MS were used to identify the chemical contamination (from the OPCW Schedule 1, 2, and 3 lists of substances). LC-ESI-MS analysis demonstrated the presence of an unknown substance, designated Chemical A, in the aqueous sample. The LC-ESI-MS chromatograms of the reference sample, the water sample, and a standard were compared. On the basis of matching retention times and MS spectra between the unknown substance Chemical A and the standard, it was identified as triethanolamine (TEA), a breakdown product of nitrogen mustard (HN-3), a substance from OPCW Schedule 3 (3B17Y).

  16. Effect of Disorder on the Conductance of Spin Field Effect Transistors (SPINFET)

    OpenAIRE

    Cahay, M.; Bandyopadhyay, S.

    2003-01-01

    We show that the conductance of Spin Field Effect Transistors (SPINFET) [Datta and Das, Appl. Phys. Lett., Vol. 56, 665 (1990)] is affected by a single (non-magnetic) impurity in the transistor's channel. The extreme sensitivity of the amplitude and phase of the transistor's conductance oscillations to the location of a single impurity in the channel is reminiscent of the phenomenon of universal conductance fluctuations in mesoscopic samples and is extremely problematic as far as device imple...

  17. Detection of Campylobacter in human faecal samples in Fiji.

    Science.gov (United States)

    Devi, Aruna; Wilkinson, Jenny; Mahony, Timothy; Vanniasinkam, Thiru

    2014-01-01

    Data on campylobacteriosis in developed countries are well documented; in contrast, few studies on campylobacteriosis have been conducted in developing countries. This study was undertaken to test for Campylobacter in human faecal samples sent to the two major pathology laboratories in Fiji. A total of 408 diarrhoeal faecal samples were collected from the two major hospital pathology laboratories in Central Fiji (Suva) and Western Fiji (Lautoka) between December 2012 and February 2013 and from June to July 2013. Samples were analysed for the presence of Campylobacter using polymerase chain reaction (PCR) based methods. Campylobacter was detected in 241/408 (59.1%) of samples tested using PCR. Samples from children aged less than five accounted for 21.6% of positive cases. Campylobacter was detected in 59.1% of diarrhoeal samples collected from the two main laboratories in Fiji. A high proportion of children under five years with Campylobacter has been reported in other countries and could be due to parents being more likely to seek medical attention. Further studies are required to confirm the species of Campylobacter that are predominantly associated with gastroenteritis in Fiji.

  18. Some connections between importance sampling and enhanced sampling methods in molecular dynamics.

    Science.gov (United States)

    Lie, H C; Quer, J

    2017-11-21

    In molecular dynamics, enhanced sampling methods enable the collection of better statistics of rare events from a reference or target distribution. We show that a large class of these methods is based on the idea of importance sampling from mathematical statistics. We illustrate this connection by comparing the Hartmann-Schütte method for rare event simulation (J. Stat. Mech. Theor. Exp. 2012, P11004) and the Valsson-Parrinello method of variationally enhanced sampling [Phys. Rev. Lett. 113, 090601 (2014)]. We use this connection in order to discuss how recent results from the Monte Carlo methods literature can guide the development of enhanced sampling methods.
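The importance-sampling idea underlying this connection can be shown on a textbook rare event: estimating the Gaussian tail probability P(X > 4) by sampling from a proposal shifted into the rare region and reweighting by the likelihood ratio (a standard illustration, not an example from the paper):

```python
import math
import random

def is_tail_probability(n=200_000, seed=7):
    """Estimate P(X > 4) for X ~ N(0,1) by importance sampling from the
    shifted proposal N(4,1); the likelihood ratio is p(x)/q(x) = exp(8 - 4x)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.gauss(4.0, 1.0)               # proposal mass sits on the rare event
        if x > 4.0:
            total += math.exp(8.0 - 4.0 * x)  # reweight back to the target N(0,1)
    return total / n

est = is_tail_probability()
```

Plain Monte Carlo with the same budget would see only a handful of exceedances (the true probability is about 3.2e-5), whereas the shifted proposal gives sub-percent relative error; the same change-of-measure-plus-reweighting logic underlies the enhanced sampling methods the paper discusses.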

  19. A Quantitative Proteomics Approach to Clinical Research with Non-Traditional Samples

    Directory of Open Access Journals (Sweden)

    Rígel Licier

    2016-10-01

    Full Text Available The proper handling of samples to be analyzed by mass spectrometry (MS) can guarantee excellent results and a greater depth of analysis when working in quantitative proteomics. This is critical when trying to assess non-traditional sources such as ear wax, saliva, vitreous humor, aqueous humor, tears, nipple aspirate fluid, breast milk/colostrum, cervical-vaginal fluid, nasal secretions, bronchoalveolar lavage fluid, and stools. We intend to provide the investigator with relevant aspects of quantitative proteomics and to recognize the most recent clinical research work conducted with atypical samples and analyzed by quantitative proteomics. Having as reference the most recent and different approaches used with non-traditional sources allows us to compare new strategies in the development of novel experimental models. On the other hand, these references help us to contribute significantly to the understanding of the proportions of proteins in different proteomes of clinical interest and may lead to potential advances in the emerging field of precision medicine.

  20. A Quantitative Proteomics Approach to Clinical Research with Non-Traditional Samples.

    Science.gov (United States)

    Licier, Rígel; Miranda, Eric; Serrano, Horacio

    2016-10-17

    The proper handling of samples to be analyzed by mass spectrometry (MS) can guarantee excellent results and a greater depth of analysis when working in quantitative proteomics. This is critical when trying to assess non-traditional sources such as ear wax, saliva, vitreous humor, aqueous humor, tears, nipple aspirate fluid, breast milk/colostrum, cervical-vaginal fluid, nasal secretions, bronchoalveolar lavage fluid, and stools. We intend to provide the investigator with relevant aspects of quantitative proteomics and to recognize the most recent clinical research work conducted with atypical samples and analyzed by quantitative proteomics. Having as reference the most recent and different approaches used with non-traditional sources allows us to compare new strategies in the development of novel experimental models. On the other hand, these references help us to contribute significantly to the understanding of the proportions of proteins in different proteomes of clinical interest and may lead to potential advances in the emerging field of precision medicine.