WorldWideScience

Sample records for random sampling procedures

  1. k-Means: Random Sampling Procedure

    Indian Academy of Sciences (India)

k-Means: Random Sampling Procedure. Approximation of the optimal 1-mean by the centroid of a sample (Inaba et al.): let S be a random sample of size O(1/ε); the centroid of S is a (1+ε)-approximate centroid of P with constant probability.
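Below is a minimal NumPy sketch (not from the record) of the Inaba et al. idea: the centroid of a small random sample of size on the order of 1/ε already gives a near-optimal 1-means cost. The point set, ε, and sample size are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic point set P (values and sizes are illustrative only).
P = rng.normal(size=(100_000, 2))

eps = 0.1
m = int(np.ceil(1.0 / eps))          # sample size on the order of O(1/eps)

S = P[rng.choice(len(P), size=m, replace=False)]
c_S, c_P = S.mean(axis=0), P.mean(axis=0)

# 1-means cost of a candidate centre c: sum of squared distances to c.
cost = lambda c: np.sum((P - c) ** 2)

print("cost ratio (sample centroid / true centroid):", cost(c_S) / cost(c_P))
```

With constant probability the printed ratio stays below 1 + ε, which is the sense in which the sample centroid is a (1+ε)-approximate centroid.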

  2. A random sampling procedure for anisotropic distributions

    International Nuclear Information System (INIS)

    Nagrajan, P.S.; Sethulakshmi, P.; Raghavendran, C.P.; Bhatia, D.P.

    1975-01-01

    A procedure is described for sampling the scattering angle of neutrons as per specified angular distribution data. The cosine of the scattering angle is written as a double Legendre expansion in the incident neutron energy and a random number. The coefficients of the expansion are given for C, N, O, Si, Ca, Fe and Pb and these elements are of interest in dosimetry and shielding. (author)

  3. 42 CFR 431.814 - Sampling plan and procedures.

    Science.gov (United States)

    2010-10-01

    ... reliability of the reduced sample. (4) The sample selection procedure. Systematic random sampling is... sampling, and yield estimates with the same or better precision than achieved in systematic random sampling... 42 Public Health 4 2010-10-01 2010-10-01 false Sampling plan and procedures. 431.814 Section 431...

  4. An alternative procedure for estimating the population mean in simple random sampling

    Directory of Open Access Journals (Sweden)

    Housila P. Singh

    2012-03-01

Full Text Available This paper deals with the problem of estimating the finite population mean using auxiliary information in simple random sampling. First, we suggest a correction to the mean squared error of the estimator proposed by Gupta and Shabbir [On improvement in estimating the population mean in simple random sampling. Jour. Appl. Statist. 35(5) (2008), pp. 559-566]. We then propose a ratio-type estimator and study its properties in simple random sampling. Numerically, we show that the proposed class of estimators is more efficient than several known estimators, including the Gupta and Shabbir (2008) estimator.
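For context, here is a hedged sketch of the classical textbook ratio estimator of a finite population mean under simple random sampling with a known auxiliary mean (this is the generic form, not the specific Gupta-Shabbir estimator or the class proposed in the paper; the population is simulated).

```python
import numpy as np

rng = np.random.default_rng(1)

# Finite population: study variable y correlated with auxiliary variable x.
N = 2_000
x = rng.gamma(shape=4.0, scale=10.0, size=N)
y = 2.5 * x + rng.normal(scale=5.0, size=N)
X_bar = x.mean()                              # population mean of x assumed known

n = 100
idx = rng.choice(N, size=n, replace=False)    # simple random sample without replacement
y_bar, x_bar = y[idx].mean(), x[idx].mean()

mean_srs = y_bar                              # usual SRS estimator of the mean
mean_ratio = y_bar * X_bar / x_bar            # classical ratio estimator

print(f"true mean {y.mean():.2f}  SRS {mean_srs:.2f}  ratio {mean_ratio:.2f}")
```

When y and x are strongly positively correlated, the ratio estimator typically has smaller mean squared error than the plain sample mean, which is the efficiency comparison the paper pursues for its extended class.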

  5. Estimates of Inequality Indices Based on Simple Random, Ranked Set, and Systematic Sampling

    OpenAIRE

    Bansal, Pooja; Arora, Sangeeta; Mahajan, Kalpana K.

    2013-01-01

    Gini index, Bonferroni index, and Absolute Lorenz index are some popular indices of inequality showing different features of inequality measurement. In general simple random sampling procedure is commonly used to estimate the inequality indices and their related inference. The key condition that the samples must be drawn via simple random sampling procedure though makes calculations much simpler but this assumption is often violated in practice as the data does not always yield simple random ...

  6. Generating Random Samples of a Given Size Using Social Security Numbers.

    Science.gov (United States)

    Erickson, Richard C.; Brauchle, Paul E.

    1984-01-01

    The purposes of this article are (1) to present a method by which social security numbers may be used to draw cluster samples of a predetermined size and (2) to describe procedures used to validate this method of drawing random samples. (JOW)

  7. BWIP-RANDOM-SAMPLING, Random Sample Generation for Nuclear Waste Disposal

    International Nuclear Information System (INIS)

    Sagar, B.

    1989-01-01

    1 - Description of program or function: Random samples for different distribution types are generated. Distribution types as required for performance assessment modeling of geologic nuclear waste disposal are provided. These are: - Uniform, - Log-uniform (base 10 or natural), - Normal, - Lognormal (base 10 or natural), - Exponential, - Bernoulli, - User defined continuous distribution. 2 - Method of solution: A linear congruential generator is used for uniform random numbers. A set of functions is used to transform the uniform distribution to the other distributions. Stratified, rather than random, sampling can be chosen. Truncated limits can be specified on many distributions, whose usual definition has an infinite support. 3 - Restrictions on the complexity of the problem: Generation of correlated random variables is not included
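The following is a hedged sketch of the general approach described: a linear congruential generator for uniform variates plus transformation functions to obtain other distributions. The constants and the two transforms shown (inverse-transform exponential, Box-Muller normal) are illustrative and are not the BWIP code itself.

```python
import math

class LCG:
    """Minimal linear congruential generator producing U(0,1) variates."""
    def __init__(self, seed=12345, a=1103515245, c=12345, m=2**31):
        self.state, self.a, self.c, self.m = seed, a, c, m

    def uniform(self):
        self.state = (self.a * self.state + self.c) % self.m
        return self.state / self.m

def exponential(u, rate=1.0):
    # Inverse-transform sampling: F^-1(u) = -ln(1 - u) / rate.
    return -math.log(1.0 - u) / rate

def normal(u1, u2, mu=0.0, sigma=1.0):
    # Box-Muller transform of two independent uniforms.
    z = math.sqrt(-2.0 * math.log(u1)) * math.cos(2.0 * math.pi * u2)
    return mu + sigma * z

gen = LCG(seed=2024)
print("uniform    :", gen.uniform())
print("exponential:", exponential(gen.uniform()))
print("normal     :", normal(gen.uniform(), gen.uniform()))
```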

  8. Specified assurance level sampling procedure

    International Nuclear Information System (INIS)

    Willner, O.

    1980-11-01

    In the nuclear industry design specifications for certain quality characteristics require that the final product be inspected by a sampling plan which can demonstrate product conformance to stated assurance levels. The Specified Assurance Level (SAL) Sampling Procedure has been developed to permit the direct selection of attribute sampling plans which can meet commonly used assurance levels. The SAL procedure contains sampling plans which yield the minimum sample size at stated assurance levels. The SAL procedure also provides sampling plans with acceptance numbers ranging from 0 to 10, thus, making available to the user a wide choice of plans all designed to comply with a stated assurance level

  9. Procedures for sampling radium-contaminated soils

    International Nuclear Information System (INIS)

    Fleischhauer, H.L.

    1985-10-01

Two procedures for sampling the surface layer (0 to 15 centimeters) of radium-contaminated soil are recommended for use in remedial action projects. Both procedures adhere to the philosophy that soil samples should have constant geometry and constant volume in order to ensure uniformity. In the first procedure, a 'cookie cutter' fashioned from pipe or steel plate is driven to the desired depth by means of a slide hammer, and the sample is extracted as a core or plug. The second procedure requires use of a template to outline the sampling area, from which the sample is obtained using a trowel or spoon. Sampling to the desired depth must then be performed incrementally. Selection of one procedure over the other is governed primarily by soil conditions, the cookie cutter being effective in nongravelly soils, and the template procedure appropriate for use in both gravelly and nongravelly soils. In any event, a minimum sample volume of 1000 cubic centimeters is recommended. The step-by-step procedures are accompanied by a description of the minimum requirements for sample documentation. Transport of the soil samples from the field is then addressed in a discussion of the federal regulations for shipping radioactive materials. Interpretation of those regulations, particularly in light of their application to remedial action soil-sampling programs, is provided in the form of guidance and suggested procedures. Due to the complex nature of the regulations, however, there is no guarantee that our interpretations of them are complete or entirely accurate. Preparation of soil samples for radium-226 analysis by means of gamma-ray spectroscopy is described

  10. A Bayesian Justification for Random Sampling in Sample Survey

    Directory of Open Access Journals (Sweden)

    Glen Meeden

    2012-07-01

Full Text Available In the usual Bayesian approach to survey sampling, the sampling design plays a minimal role, at best. Although a close relationship between exchangeable prior distributions and simple random sampling has been noted, how to formally integrate simple random sampling into the Bayesian paradigm is not clear. Recently it has been argued that the sampling design can be thought of as part of a Bayesian's prior distribution. We show here that under this scenario simple random sampling can be given a Bayesian justification in survey sampling.

  11. The study of combining Latin Hypercube Sampling method and LU decomposition method (LULHS method) for constructing spatial random field

    Science.gov (United States)

    WANG, P. T.

    2015-12-01

Groundwater modeling requires hydrogeological properties to be assigned to every numerical grid cell. Due to the lack of detailed information and the inherent spatial heterogeneity, geological properties can be treated as random variables. A hydrogeological property is assumed to follow a multivariate distribution with spatial correlations. By sampling random numbers from a given statistical distribution and assigning a value to each grid cell, a random field for modeling can be completed. Therefore, statistical sampling plays an important role in the efficiency of the modeling procedure. Latin Hypercube Sampling (LHS) is a stratified random sampling procedure that provides an efficient way to sample variables from their multivariate distributions. This study combines the stratified random procedure from LHS with simulation by LU decomposition to form LULHS. Both conditional and unconditional simulations of LULHS were developed. The simulation efficiency and spatial correlation of LULHS are compared to three other simulation methods. The results show that, for both conditional and unconditional simulation, the LULHS method is more efficient in terms of computational effort: fewer realizations are required to achieve the required statistical accuracy and spatial correlation.
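As a point of reference, here is a hedged sketch of basic Latin Hypercube Sampling for independent marginals (the LU-decomposition step that imposes spatial correlation in LULHS is not shown; the porosity mapping is an invented illustration).

```python
import numpy as np

def latin_hypercube(n, d, rng):
    """One point per stratum in each dimension; stratum order is randomly permuted."""
    samples = np.empty((n, d))
    for j in range(d):
        perm = rng.permutation(n)
        samples[:, j] = (perm + rng.random(n)) / n   # one draw inside each of n strata
    return samples

rng = np.random.default_rng(0)
u = latin_hypercube(n=10, d=2, rng=rng)

# Map the stratified uniforms onto a physical parameter range, e.g. porosity in [0.2, 0.4].
porosity = 0.2 + 0.2 * u[:, 0]
print(u)
print(porosity)
```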

  12. Sampling procedures for inventory of commercial volume tree species in Amazon Forest.

    Science.gov (United States)

    Netto, Sylvio P; Pelissari, Allan L; Cysneiros, Vinicius C; Bonazza, Marcelo; Sanquetta, Carlos R

    2017-01-01

The spatial distribution of tropical tree species can affect the consistency of the estimators in commercial forest inventories; therefore, appropriate sampling procedures are required to survey species with different spatial patterns in the Amazon Forest. The present study aims to evaluate conventional sampling procedures and to introduce adaptive cluster sampling for volumetric inventories of Amazonian tree species, considering the hypotheses that density, spatial distribution and zero-plots affect the consistency of the estimators, and that adaptive cluster sampling allows more accurate volumetric estimation. We use data from a census carried out in Jamari National Forest, Brazil, where trees with diameters equal to or greater than 40 cm were measured in 1,355 plots. Species with different spatial patterns were selected and sampled with simple random sampling, systematic sampling, linear cluster sampling and adaptive cluster sampling, and the accuracy of the volumetric estimation and the presence of zero-plots were evaluated. The sampling procedures applied to the species were affected by the low density of trees and the large number of zero-plots; the adaptive clusters allowed the sampling effort to be concentrated in plots containing trees and thus gathered more representative samples for estimating the commercial volume.

  13. An unbiased estimator of the variance of simple random sampling using mixed random-systematic sampling

    OpenAIRE

    Padilla, Alberto

    2009-01-01

    Systematic sampling is a commonly used technique due to its simplicity and ease of implementation. The drawback of this simplicity is that it is not possible to estimate the design variance without bias. There are several ways to circumvent this problem. One method is to suppose that the variable of interest has a random order in the population, so the sample variance of simple random sampling without replacement is used. By means of a mixed random - systematic sample, an unbiased estimator o...

  14. STATISTICAL LANDMARKS AND PRACTICAL ISSUES REGARDING THE USE OF SIMPLE RANDOM SAMPLING IN MARKET RESEARCHES

    Directory of Open Access Journals (Sweden)

    CODRUŢA DURA

    2010-01-01

Full Text Available The sample represents a particular segment of the statistical population chosen to represent it as a whole. The representativeness of the sample determines the accuracy of estimations made on the basis of calculating the research indicators and the inferential statistics. The method of random sampling is part of the probabilistic methods which can be used within marketing research, and it is characterized by the fact that it imposes the requirement that each unit belonging to the statistical population should have an equal chance of being selected for the sampling process. When simple random sampling is meant to be rigorously put into practice, it is recommended to use the technique of random number tables in order to configure the sample which will provide the information that the marketer needs. The paper also details the practical procedure implemented in order to create a sample for a marketing research by generating random numbers using the facilities offered by Microsoft Excel.

  15. Systematic versus random sampling in stereological studies.

    Science.gov (United States)

    West, Mark J

    2012-12-01

The sampling that takes place at all levels of an experimental design must be random if the estimate is to be unbiased in a statistical sense. There are two fundamental ways by which one can make a random sample of the sections and positions to be probed on the sections. Using a card-sampling analogy, one can pick any card at all out of a deck of cards. This is referred to as independent random sampling because the sampling of any one card is made without reference to the position of the other cards. The other approach to obtaining a random sample is to pick a card at random within a set number of cards and then take the others at equal intervals within the deck. Systematic sampling along one axis of many biological structures is more efficient than independent random sampling, because most biological structures are not randomly organized. This article discusses the merits of systematic versus random sampling in stereological studies.
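The card-deck contrast can be written down directly; the following hedged sketch (illustrative deck and sample sizes) draws one independent random sample and one systematic random sample with a single random start.

```python
import numpy as np

rng = np.random.default_rng(3)

N, n = 52, 4                 # "deck" of N items, sample of n
k = N // n                   # sampling interval for systematic sampling

# Independent (simple) random sampling: any n distinct positions.
independent = rng.choice(N, size=n, replace=False)

# Systematic random sampling: one random start, then equal intervals.
start = rng.integers(k)
systematic = np.arange(start, N, k)[:n]

print("independent random sample:", sorted(independent))
print("systematic random sample :", systematic)
```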

  16. Generation and Analysis of Constrained Random Sampling Patterns

    DEFF Research Database (Denmark)

    Pierzchlewski, Jacek; Arildsen, Thomas

    2016-01-01

Random sampling is a technique for signal acquisition which is gaining popularity in practical signal processing systems. Nowadays, event-driven analog-to-digital converters make random sampling feasible in practical applications. A process of random sampling is defined by a sampling pattern, which indicates the signal sampling points in time. Practical random sampling patterns are constrained by ADC characteristics and application requirements. In this paper, we introduce statistical methods which evaluate random sampling pattern generators with emphasis on practical applications. Furthermore, we propose an algorithm that generates random sampling patterns dedicated to event-driven ADCs better than existing sampling pattern generators. Finally, implementation issues of random sampling patterns are discussed.

  17. An improved ashing procedure for biologic sample

    Energy Technology Data Exchange (ETDEWEB)

Zongmei, Wu [Zhejiang Province Environmental Radiation Monitoring Centre (China)

    1992-07-01

The classical ashing procedure in a muffle furnace was modified for biologic samples. In the modified procedure the door of the muffle furnace was kept open during the ashing process; the ashing was accelerated and the quality of the ashing product was comparable to that of the classical procedure. The modified procedure is suitable for ashing biologic samples in large batches.

  18. An improved ashing procedure for biologic sample

    International Nuclear Information System (INIS)

    Wu Zongmei

    1992-01-01

The classical ashing procedure in a muffle furnace was modified for biologic samples. In the modified procedure the door of the muffle furnace was kept open during the ashing process; the ashing was accelerated and the quality of the ashing product was comparable to that of the classical procedure. The modified procedure is suitable for ashing biologic samples in large batches

  19. Sampling problems for randomly broken sticks

    Energy Technology Data Exchange (ETDEWEB)

    Huillet, Thierry [Laboratoire de Physique Theorique et Modelisation, CNRS-UMR 8089 et Universite de Cergy-Pontoise, 5 mail Gay-Lussac, 95031, Neuville sur Oise (France)

    2003-04-11

Consider the random partitioning model of a population (represented by a stick of length 1) into n species (fragments) with identically distributed random weights (sizes). Upon ranking the fragments' weights according to ascending sizes, let S_{m:n} be the size of the mth smallest fragment. Assume that some observer is sampling such populations as follows: drop at random k points (the sample size) onto this stick and record the corresponding numbers of visited fragments. We shall investigate the following sampling problems: (1) what is the sample size if the sampling is carried out until the first visit of the smallest fragment (size S_{1:n})? (2) For a given sample size, have all the fragments of the stick been visited at least once or not? This question is related to Feller's random coupon collector problem. (3) In what order are new fragments being discovered, and what is the random number of samples separating the discovery of consecutive new fragments until exhaustion of the list? For this problem, the distribution of the size-biased permutation of the species' weights, i.e. the sequence of their weights in order of appearance, is needed and studied.
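A hedged simulation sketch of questions (1) and (2): the stick is broken into n fragments with normalized i.i.d. weights, points are dropped uniformly at random, and we record when the smallest fragment is first visited and whether a fixed sample size visits every fragment (the weight distribution and sizes are illustrative).

```python
import numpy as np

rng = np.random.default_rng(7)

n = 10                                     # number of fragments (species)
w = rng.exponential(size=n)
w /= w.sum()                               # identically distributed weights, normalized
edges = np.concatenate(([0.0], np.cumsum(w)))
smallest = np.argmin(w)

def fragment_of(x):
    """Index of the fragment containing point(s) x in [0, 1)."""
    return np.searchsorted(edges, x, side="right") - 1

# (1) Sample size until the first visit of the smallest fragment.
draws = 0
while True:
    draws += 1
    if fragment_of(rng.random()) == smallest:
        break
print("samples until smallest fragment visited:", draws)

# (2) For a fixed sample size k, have all fragments been visited? (coupon-collector flavour)
k = 50
visited = np.unique(fragment_of(rng.random(k)))
print("all fragments visited with k = 50:", visited.size == n)
```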

  20. Soil Sampling Operating Procedure

    Science.gov (United States)

    EPA Region 4 Science and Ecosystem Support Division (SESD) document that describes general and specific procedures, methods, and considerations when collecting soil samples for field screening or laboratory analysis.

  1. Random selection of items. Selection of n1 samples among N items composing a stratum

    International Nuclear Information System (INIS)

    Jaech, J.L.; Lemaire, R.J.

    1987-02-01

    STR-224 provides generalized procedures to determine required sample sizes, for instance in the course of a Physical Inventory Verification at Bulk Handling Facilities. The present report describes procedures to generate random numbers and select groups of items to be verified in a given stratum through each of the measurement methods involved in the verification. (author). 3 refs
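A hedged illustration of the kind of selection described (equal-probability selection of n1 items among the N items of a stratum using generated random numbers); it is not the STR-224 procedure itself, and the stratum size and sample size are invented.

```python
import numpy as np

def select_items(N, n1, seed=None):
    """Return n1 distinct item positions (1..N) chosen with equal probability."""
    rng = np.random.default_rng(seed)
    return np.sort(rng.choice(np.arange(1, N + 1), size=n1, replace=False))

# Example: a stratum of 480 items, of which 25 are to be verified.
print(select_items(N=480, n1=25, seed=42))
```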

  2. Independent random sampling methods

    CERN Document Server

    Martino, Luca; Míguez, Joaquín

    2018-01-01

    This book systematically addresses the design and analysis of efficient techniques for independent random sampling. Both general-purpose approaches, which can be used to generate samples from arbitrary probability distributions, and tailored techniques, designed to efficiently address common real-world practical problems, are introduced and discussed in detail. In turn, the monograph presents fundamental results and methodologies in the field, elaborating and developing them into the latest techniques. The theory and methods are illustrated with a varied collection of examples, which are discussed in detail in the text and supplemented with ready-to-run computer code. The main problem addressed in the book is how to generate independent random samples from an arbitrary probability distribution with the weakest possible constraints or assumptions in a form suitable for practical implementation. The authors review the fundamental results and methods in the field, address the latest methods, and emphasize the li...

  3. Randomized trial to examine procedure-to-procedure transfer in laparoscopic simulator training

    DEFF Research Database (Denmark)

    Bjerrum, F; Sorensen, J L; Konge, L

    2016-01-01

    -centre educational superiority trial. Surgical novices practised basic skills on a laparoscopic virtual reality simulator. On reaching proficiency, participants were randomized to proficiency-based training. The intervention group practised two procedures on the simulator (appendicectomy followed by salpingectomy...

  4. Soil Gas Sampling Operating Procedure

    Science.gov (United States)

    EPA Region 4 Science and Ecosystem Support Division (SESD) document that describes general and specific procedures, methods, and considerations when collecting soil gas samples for field screening or laboratory analysis.

  5. Accounting for Sampling Error in Genetic Eigenvalues Using Random Matrix Theory.

    Science.gov (United States)

    Sztepanacz, Jacqueline L; Blows, Mark W

    2017-07-01

The distribution of genetic variance in multivariate phenotypes is characterized by the empirical spectral distribution of the eigenvalues of the genetic covariance matrix. Empirical estimates of genetic eigenvalues from random effects linear models are known to be overdispersed by sampling error, where large eigenvalues are biased upward and small eigenvalues are biased downward. The overdispersion of the leading eigenvalues of sample covariance matrices has been demonstrated to conform to the Tracy-Widom (TW) distribution. Here we show that genetic eigenvalues estimated using restricted maximum likelihood (REML) in a multivariate random effects model with an unconstrained genetic covariance structure will also conform to the TW distribution after empirical scaling and centering. However, where estimation procedures using either REML or MCMC impose boundary constraints, the resulting genetic eigenvalues tend not to be TW distributed. We show how using confidence intervals from sampling distributions of genetic eigenvalues without reference to the TW distribution is insufficient protection against mistaking sampling error for genetic variance, particularly when eigenvalues are small. By scaling such sampling distributions to the appropriate TW distribution, the critical value of the TW statistic can be used to determine whether the magnitude of a genetic eigenvalue exceeds the sampling error for each eigenvalue in the spectral distribution of a given genetic covariance matrix. Copyright © 2017 by the Genetics Society of America.
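The overdispersion being corrected for can be seen in a toy simulation. The hedged sketch below uses plain sample covariance matrices (no REML fit and no Tracy-Widom scaling, so it is only a stand-in for the genetic setting): with a true identity covariance, the largest sample eigenvalue is biased upward and the smallest downward.

```python
import numpy as np

rng = np.random.default_rng(11)

p, n, reps = 10, 50, 500          # traits, individuals, replicate samples
# True covariance is the identity, so every true eigenvalue equals 1.

leading, trailing = [], []
for _ in range(reps):
    X = rng.normal(size=(n, p))
    eigs = np.linalg.eigvalsh(np.cov(X, rowvar=False))
    leading.append(eigs[-1])      # largest sample eigenvalue
    trailing.append(eigs[0])      # smallest sample eigenvalue

print("mean largest sample eigenvalue :", np.mean(leading))   # biased upward (> 1)
print("mean smallest sample eigenvalue:", np.mean(trailing))  # biased downward (< 1)
```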

  6. Comparison of transition-matrix sampling procedures

    DEFF Research Database (Denmark)

    Yevick, D.; Reimer, M.; Tromborg, Bjarne

    2009-01-01

We compare the accuracy of the multicanonical procedure with that of transition-matrix models of static and dynamic communication system properties incorporating different acceptance rules. We find that, for appropriate ranges of the underlying numerical parameters, algorithmically simple yet highly accurate procedures can be employed in place of the standard multicanonical sampling algorithm.

  7. IXM gas sampling procedure

    International Nuclear Information System (INIS)

    Pingel, L.A.

    1995-01-01

    Ion Exchange Modules (IXMs) are used at the 105-KE and -KW Fuel Storage Basins to control radionuclide concentrations in the water. A potential safety concern relates to production of hydrogen gas by radiolysis of the water trapped in the ion exchange media of spent IXMs. This document provides a procedure for sampling the gases in the head space of the IXM

  8. Random analysis of bearing capacity of square footing using the LAS procedure

    Science.gov (United States)

    Kawa, Marek; Puła, Wojciech; Suska, Michał

    2016-09-01

In the present paper, a three-dimensional problem of the bearing capacity of a square footing on a random soil medium is analyzed. The random fields of the strength parameters c and φ are generated using the LAS procedure (Local Average Subdivision, Fenton and Vanmarcke 1990). The procedure has been re-implemented by the authors in the Mathematica environment in order to combine it with a commercial program. Since the procedure is still being tested, the random field has been assumed to be one-dimensional: the strength properties of the soil are random in the vertical direction only. Individual realizations of the bearing-capacity boundary problem, with the strength parameters of the medium defined by the above procedure, are solved using the FLAC3D software. The analysis is performed for two qualitatively different cases, namely for purely cohesive and cohesive-frictional soils. For the latter case the friction angle and cohesion have been assumed to be independent random variables. For these two cases the random square-footing bearing-capacity results have been obtained for a range of fluctuation scales from 0.5 m to 10 m. Each time 1000 Monte Carlo realizations have been performed. The obtained results allow not only the mean and variance but also the probability density function to be estimated. An example of the application of this function to a reliability calculation is presented in the final part of the paper.
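The overall workflow can be sketched in a few lines. The hedged example below generates a one-dimensional, vertically correlated random field of a strength parameter via Cholesky decomposition of an exponential covariance (a simple substitute for LAS, not the paper's procedure) and collects Monte Carlo statistics; the "bearing capacity" response is a placeholder where each field would in practice drive a FLAC3D run.

```python
import numpy as np

rng = np.random.default_rng(5)

z = np.arange(0.0, 10.0, 0.5)            # depths [m]
theta = 2.0                               # fluctuation scale [m] (illustrative)
mu_c, sd_c = 40.0, 8.0                    # mean / std of undrained cohesion [kPa] (illustrative)

# Exponential covariance between depths and its Cholesky factor (stand-in for LAS).
C = sd_c**2 * np.exp(-2.0 * np.abs(z[:, None] - z[None, :]) / theta)
L = np.linalg.cholesky(C + 1e-10 * np.eye(len(z)))

def realization():
    """One correlated random field of cohesion over depth."""
    return mu_c + L @ rng.standard_normal(len(z))

def bearing_capacity(c_field):
    # Placeholder response; a real study would solve the boundary problem numerically.
    return 6.0 * c_field.mean()

samples = np.array([bearing_capacity(realization()) for _ in range(1000)])
print("mean:", samples.mean(), " std:", samples.std())
```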

  9. Sampling procedure, receipt and conservation of water samples to determine environmental radioactivity

    International Nuclear Information System (INIS)

    Herranz, M.; Navarro, E.; Payeras, J.

    2009-01-01

This document describes the essential goals, processes and contents that the subgroups on Sampling and on Sample Preparation and Conservation believe should be part of the procedure for the correct sampling, receipt, conservation and preparation of samples of continental, marine and waste water prior to determining their radioactive content.

  10. Random sampling of quantum states: a survey of methods and some issues regarding the Overparametrized Method

    International Nuclear Information System (INIS)

    Maziero, Jonas

    2015-01-01

    The numerical generation of random quantum states (RQS) is an important procedure for investigations in quantum information science. Here, we review some methods that may be used for performing that task. We start by presenting a simple procedure for generating random state vectors, for which the main tool is the random sampling of unbiased discrete probability distributions (DPD). Afterwards, the creation of random density matrices is addressed. In this context, we first present the standard method, which consists in using the spectral decomposition of a quantum state for getting RQS from random DPDs and random unitary matrices. In the sequence, the Bloch vector parametrization method is described. This approach, despite being useful in several instances, is not in general convenient for RQS generation. In the last part of the article, we regard the overparametrized method (OPM) and the related Ginibre and Bures techniques. The OPM can be used to create random positive semidefinite matrices with unit trace from randomly produced general complex matrices in a simple way that is friendly for numerical implementations. We consider a physically relevant issue related to the possible domains that may be used for the real and imaginary parts of the elements of such general complex matrices. Subsequently, a too fast concentration of measure in the quantum state space that appears in this parametrization is noticed. (author)
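A hedged sketch of the overparametrized/Ginibre construction described: a general complex matrix G with independently drawn real and imaginary parts yields a valid density matrix via rho = G G† / Tr(G G†). The dimension and the standard-normal choice for the matrix entries are illustrative.

```python
import numpy as np

def random_density_matrix(d, rng):
    """Random d x d density matrix from a complex Ginibre matrix (OPM-style construction)."""
    G = rng.standard_normal((d, d)) + 1j * rng.standard_normal((d, d))
    rho = G @ G.conj().T
    return rho / np.trace(rho).real

rng = np.random.default_rng(0)
rho = random_density_matrix(3, rng)

# Sanity checks: Hermitian, unit trace, positive semidefinite.
print(np.allclose(rho, rho.conj().T), np.isclose(np.trace(rho).real, 1.0))
print(np.all(np.linalg.eigvalsh(rho) >= -1e-12))
```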

  11. Sampling large random knots in a confined space

    International Nuclear Information System (INIS)

    Arsuaga, J; Blackstone, T; Diao, Y; Hinson, K; Karadayi, E; Saito, M

    2007-01-01

DNA knots formed under extreme conditions of condensation, as in bacteriophage P4, are difficult to analyze experimentally and theoretically. In this paper, we propose to use the uniform random polygon model as a supplementary method to the existing methods for generating random knots in confinement. The uniform random polygon model allows us to sample knots with large crossing numbers and also to generate large diagrammatically prime knot diagrams. We show numerically that uniform random polygons sample knots with large minimum crossing numbers and certain complicated knot invariants (as those observed experimentally). We do this in terms of the knot determinants or colorings. Our numerical results suggest that the average determinant of a uniform random polygon of n vertices grows faster than O(e^{n^2}). We also investigate the complexity of prime knot diagrams. We show rigorously that the probability that a randomly selected 2D uniform random polygon of n vertices is almost diagrammatically prime goes to 1 as n goes to infinity. Furthermore, the average number of crossings in such a diagram is at the order of O(n^2). Therefore, the two-dimensional uniform random polygons offer an effective way in sampling large (prime) knots, which can be useful in various applications

  12. Sampling large random knots in a confined space

    Science.gov (United States)

    Arsuaga, J.; Blackstone, T.; Diao, Y.; Hinson, K.; Karadayi, E.; Saito, M.

    2007-09-01

    DNA knots formed under extreme conditions of condensation, as in bacteriophage P4, are difficult to analyze experimentally and theoretically. In this paper, we propose to use the uniform random polygon model as a supplementary method to the existing methods for generating random knots in confinement. The uniform random polygon model allows us to sample knots with large crossing numbers and also to generate large diagrammatically prime knot diagrams. We show numerically that uniform random polygons sample knots with large minimum crossing numbers and certain complicated knot invariants (as those observed experimentally). We do this in terms of the knot determinants or colorings. Our numerical results suggest that the average determinant of a uniform random polygon of n vertices grows faster than O(e^{n^2}) . We also investigate the complexity of prime knot diagrams. We show rigorously that the probability that a randomly selected 2D uniform random polygon of n vertices is almost diagrammatically prime goes to 1 as n goes to infinity. Furthermore, the average number of crossings in such a diagram is at the order of O(n2). Therefore, the two-dimensional uniform random polygons offer an effective way in sampling large (prime) knots, which can be useful in various applications.

  13. Sampling large random knots in a confined space

    Energy Technology Data Exchange (ETDEWEB)

    Arsuaga, J [Department of Mathematics, San Francisco State University, 1600 Holloway Ave, San Francisco, CA 94132 (United States); Blackstone, T [Department of Computer Science, San Francisco State University, 1600 Holloway Ave., San Francisco, CA 94132 (United States); Diao, Y [Department of Mathematics and Statistics, University of North Carolina at Charlotte, Charlotte, NC 28223 (United States); Hinson, K [Department of Mathematics and Statistics, University of North Carolina at Charlotte, Charlotte, NC 28223 (United States); Karadayi, E [Department of Mathematics, University of South Florida, 4202 E Fowler Avenue, Tampa, FL 33620 (United States); Saito, M [Department of Mathematics, University of South Florida, 4202 E Fowler Avenue, Tampa, FL 33620 (United States)

    2007-09-28

DNA knots formed under extreme conditions of condensation, as in bacteriophage P4, are difficult to analyze experimentally and theoretically. In this paper, we propose to use the uniform random polygon model as a supplementary method to the existing methods for generating random knots in confinement. The uniform random polygon model allows us to sample knots with large crossing numbers and also to generate large diagrammatically prime knot diagrams. We show numerically that uniform random polygons sample knots with large minimum crossing numbers and certain complicated knot invariants (as those observed experimentally). We do this in terms of the knot determinants or colorings. Our numerical results suggest that the average determinant of a uniform random polygon of n vertices grows faster than O(e^{n^2}). We also investigate the complexity of prime knot diagrams. We show rigorously that the probability that a randomly selected 2D uniform random polygon of n vertices is almost diagrammatically prime goes to 1 as n goes to infinity. Furthermore, the average number of crossings in such a diagram is at the order of O(n^2). Therefore, the two-dimensional uniform random polygons offer an effective way in sampling large (prime) knots, which can be useful in various applications.

  14. Randomization of grab-sampling strategies for estimating the annual exposure of U miners to Rn daughters.

    Science.gov (United States)

    Borak, T B

    1986-04-01

Periodic grab sampling in combination with time-of-occupancy surveys has been the accepted procedure for estimating the annual exposure of underground U miners to Rn daughters. Temporal variations in the concentration of potential alpha energy in the mine generate uncertainties in this process. A system to randomize the selection of locations for measurement is described which can reduce uncertainties and eliminate systematic biases in the data. In general, a sample frequency of 50 measurements per year is sufficient to satisfy the criterion that the annual exposure be determined in working level months to within +/- 50% of the true value with a 95% level of confidence. Suggestions for implementing this randomization scheme are presented.

  15. Determination of Initial Conditions for the Safety Analysis by Random Sampling of Operating Parameters

    International Nuclear Information System (INIS)

    Jeong, Hae-Yong; Park, Moon-Ghu

    2015-01-01

In most existing evaluation methodologies, which follow a conservative approach, the most conservative initial conditions are sought for each transient scenario through extensive assessment over wide operating windows or the limiting conditions for operation (LCO) allowed by the operating guidelines. In this procedure a user effect could be involved, and considerable time and human resources are consumed. In the present study, we investigated a more effective statistical method for the selection of the most conservative initial condition through random sampling of the operating parameters affecting the initial conditions. A method for the determination of initial conditions based on random sampling of plant design parameters is proposed. This method is expected to be applied to the selection of the most conservative initial plant conditions in safety analyses using a conservative evaluation methodology. In the method, it is suggested that the initial conditions of reactor coolant flow rate, pressurizer level, pressurizer pressure, and SG level be adjusted by controlling the pump rated flow and the setpoints of the PLCS, PPCS, and FWCS, respectively. The proposed technique is expected to help eliminate the human factors introduced in the conventional safety analysis procedure and also to reduce the human resources invested in the safety evaluation of nuclear power plants
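A hedged sketch of the sampling idea only: draw candidate initial conditions uniformly within their allowed windows and keep the most conservative draw according to some figure of merit. The parameter names, ranges, and the conservatism measure below are invented placeholders; in practice each draw would initialize a full transient analysis.

```python
import numpy as np

rng = np.random.default_rng(8)

# Hypothetical operating windows (illustrative values only).
windows = {
    "coolant_flow_frac":   (0.96, 1.04),   # fraction of rated flow
    "pressurizer_level_%": (50.0, 60.0),
    "pressurizer_p_MPa":   (15.2, 15.8),
    "sg_level_%":          (48.0, 56.0),
}

def conservatism(sample):
    # Placeholder figure of merit; a real study would use the limiting safety
    # parameter returned by the transient analysis for this initial condition.
    return -sample["coolant_flow_frac"] + 0.01 * sample["pressurizer_p_MPa"]

n = 200
draws = [{k: rng.uniform(*v) for k, v in windows.items()} for _ in range(n)]
worst = max(draws, key=conservatism)
print("most conservative sampled initial condition:", worst)
```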

  16. The influence of a eutectic mixture of lidocaine and prilocaine on minor surgical procedures: a randomized controlled double-blind trial.

    LENUS (Irish Health Repository)

    Shaikh, Faisal M

    2012-01-31

    BACKGROUND: A eutectic mixture of lidocaine and prilocaine (EMLA) has been shown to be effective in reducing pain from needle sticks, including those associated with blood sampling and intravenous insertion. OBJECTIVE: To evaluate the effectiveness of EMLA cream applied before needle puncture for local anesthetic administration before minor surgical procedures in this double-blind, randomized, controlled, parallel-group study. MATERIALS AND METHODS: Patients were randomly assigned to receive EMLA or placebo cream (Aqueous) applied under an occlusive dressing. After the procedure, patients were asked to rate the needle prick and procedure pain on a visual analog scale (0=no pain; 10=maximum pain). RESULTS: A total of 94 minor surgical procedures (49 in EMLA and 45 in control) were performed. The mean needle-stick pain score in the EMLA group was significantly lower than in the control group (2.7 vs. 5.7, p<.001, Mann-Whitney U-test). There was also significantly lower procedure pain in the EMLA group than in the control group (0.83 vs. 1.86, p=.009). There were no complications associated with the use of EMLA. CONCLUSION: EMLA effectively reduces the preprocedural needle-stick pain and procedural pain associated with minor surgical procedures.

  17. Efficient sampling of complex network with modified random walk strategies

    Science.gov (United States)

    Xie, Yunya; Chang, Shuhua; Zhang, Zhipeng; Zhang, Mi; Yang, Lei

    2018-02-01

We present two novel random walk strategies, the choosing-seed-node (CSN) random walk and the no-retracing (NR) random walk. Different from classical random walk sampling, the CSN and NR strategies focus on the influences of the seed node choice and of path overlap, respectively. The three random walk samplings are applied to the Erdős–Rényi (ER), Barabási–Albert (BA), Watts–Strogatz (WS), and weighted USAir networks, respectively. Then, the major properties of the sampled subnets, such as sampling efficiency, degree distributions, average degree and average clustering coefficient, are studied. Similar conclusions can be reached with these three random walk strategies. Firstly, networks with small scales and simple structures are conducive to the sampling. Secondly, the average degree and the average clustering coefficient of the sampled subnet tend to the corresponding values of the original networks within limited steps. And thirdly, all the degree distributions of the subnets are slightly biased towards the high-degree side. However, the NR strategy performs better for the average clustering coefficient of the subnet. In the real weighted USAir network, some salient characteristics, such as the larger clustering coefficient and the fluctuation of the degree distribution, are reproduced well by these random walk strategies.
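A hedged sketch of classical random-walk sampling and a no-retracing variant (my reading of NR: never step straight back to the node just left), with a seed chosen by degree to echo the CSN idea. The graph is a small NumPy-built Erdős–Rényi graph, not the networks used in the paper.

```python
import numpy as np

rng = np.random.default_rng(4)

# Small Erdos-Renyi graph stored as an adjacency-list dict.
n, p = 200, 0.05
A = np.triu(rng.random((n, n)) < p, k=1)
A = A | A.T
adj = {i: np.flatnonzero(A[i]) for i in range(n)}

def random_walk(start, steps, no_retracing=False):
    visited, node, prev = [start], start, None
    for _ in range(steps):
        nbrs = adj[node]
        if no_retracing and prev is not None and len(nbrs) > 1:
            nbrs = nbrs[nbrs != prev]      # avoid stepping straight back
        if len(nbrs) == 0:
            break
        prev, node = node, rng.choice(nbrs)
        visited.append(node)
    return visited

seed = max(adj, key=lambda i: len(adj[i]))  # CSN-like choice: a well-connected seed node
walk = random_walk(seed, steps=500, no_retracing=True)
print("distinct nodes sampled:", len(set(walk)))
```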

  18. RandomSpot: A web-based tool for systematic random sampling of virtual slides.

    Science.gov (United States)

    Wright, Alexander I; Grabsch, Heike I; Treanor, Darren E

    2015-01-01

    This paper describes work presented at the Nordic Symposium on Digital Pathology 2014, Linköping, Sweden. Systematic random sampling (SRS) is a stereological tool, which provides a framework to quickly build an accurate estimation of the distribution of objects or classes within an image, whilst minimizing the number of observations required. RandomSpot is a web-based tool for SRS in stereology, which systematically places equidistant points within a given region of interest on a virtual slide. Each point can then be visually inspected by a pathologist in order to generate an unbiased sample of the distribution of classes within the tissue. Further measurements can then be derived from the distribution, such as the ratio of tumor to stroma. RandomSpot replicates the fundamental principle of traditional light microscope grid-shaped graticules, with the added benefits associated with virtual slides, such as facilitated collaboration and automated navigation between points. Once the sample points have been added to the region(s) of interest, users can download the annotations and view them locally using their virtual slide viewing software. Since its introduction, RandomSpot has been used extensively for international collaborative projects, clinical trials and independent research projects. So far, the system has been used to generate over 21,000 sample sets, and has been used to generate data for use in multiple publications, identifying significant new prognostic markers in colorectal, upper gastro-intestinal and breast cancer. Data generated using RandomSpot also has significant value for training image analysis algorithms using sample point coordinates and pathologist classifications.
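The underlying SRS principle (equidistant points with a single uniformly random offset, so every location has the same inclusion probability) is easy to state in code. This is a hedged sketch of that principle for a rectangular region of interest, not RandomSpot's implementation; dimensions and spacing are illustrative.

```python
import numpy as np

def srs_points(width, height, spacing, rng):
    """Equidistant sample points covering a width x height ROI, with one
    uniformly random offset per axis."""
    ox, oy = rng.uniform(0, spacing, size=2)
    xs = np.arange(ox, width, spacing)
    ys = np.arange(oy, height, spacing)
    return [(x, y) for y in ys for x in xs]

rng = np.random.default_rng(1)
points = srs_points(width=2000, height=1500, spacing=400, rng=rng)
print(len(points), "points, e.g.", points[:3])
```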

  19. Application of a stratified random sampling technique to the estimation and minimization of respirable quartz exposure to underground miners

    International Nuclear Information System (INIS)

    Makepeace, C.E.; Horvath, F.J.; Stocker, H.

    1981-11-01

The aim of a stratified random sampling plan is to provide the best estimate (in the absence of full-shift personal gravimetric sampling) of personal exposure to respirable quartz among underground miners. One also gains information on the exposure distribution of all the miners at the same time. Three variables (or strata) are considered in the present scheme: locations, occupations and times of sampling. Random sampling within each stratum ensures that each location, occupation and time of sampling has an equal opportunity of being selected without bias. Following implementation of the plan and analysis of the collected data, one can determine the individual exposures and the mean. This information can then be used to identify those groups whose exposure contributes significantly to the collective exposure. In turn, this identification, along with other considerations, allows the mine operator to carry out a cost-benefit optimization and eventual implementation of engineering controls for these groups. This optimization and engineering control procedure, together with the random sampling plan, can then be used in an iterative manner to minimize the mean value of the distribution and the collective exposures
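A hedged sketch of the stratified selection step: every location-occupation-shift combination defines a stratum, and sampling occasions within each stratum are chosen uniformly at random. Stratum labels and allocation numbers are invented for illustration.

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(6)

locations   = ["stope_A", "stope_B", "haulage", "shop"]
occupations = ["miner", "driller", "maintenance"]
shifts      = ["day", "night"]

samples_per_stratum = 2
slots_per_stratum = 20     # candidate sampling occasions in each stratum

plan = {}
for stratum in product(locations, occupations, shifts):
    plan[stratum] = sorted(rng.choice(slots_per_stratum,
                                      size=samples_per_stratum, replace=False))

for stratum, slots in list(plan.items())[:4]:
    print(stratum, "-> sample at occasions", slots)
```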

  20. Acceptance sampling using judgmental and randomly selected samples

    Energy Technology Data Exchange (ETDEWEB)

    Sego, Landon H.; Shulman, Stanley A.; Anderson, Kevin K.; Wilson, John E.; Pulsipher, Brent A.; Sieber, W. Karl

    2010-09-01

We present a Bayesian model for acceptance sampling where the population consists of two groups, each with different levels of risk of containing unacceptable items. Expert opinion, or judgment, may be required to distinguish between the high- and low-risk groups. Hence, high-risk items are likely to be identified (and sampled) using expert judgment, while the remaining low-risk items are sampled randomly. We focus on the situation where all observed samples must be acceptable. Consequently, the objective of the statistical inference is to quantify the probability that a large percentage of the unsampled items in the population are also acceptable. We demonstrate that traditional (frequentist) acceptance sampling and simpler Bayesian formulations of the problem are essentially special cases of the proposed model. We explore the properties of the model in detail and discuss the conditions necessary to ensure that required sample sizes are a non-decreasing function of the population size. The method is applicable to a variety of acceptance sampling problems and, in particular, to environmental sampling where the objective is to demonstrate the safety of reoccupying a remediated facility that has been contaminated with a lethal agent.
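For intuition only, here is a hedged sketch of one of the "simpler Bayesian formulations" the abstract mentions as a special case: a Beta prior on the fraction of acceptable items, a random sample in which every item is found acceptable, and the posterior probability that at least a target fraction is acceptable. It ignores the two-group/judgmental structure and treats the population as effectively infinite; the prior and numbers are illustrative.

```python
from scipy.stats import beta

# Beta(a0, b0) prior on the proportion of acceptable items (illustrative choice).
a0, b0 = 1.0, 1.0
n_sampled, n_defective = 30, 0          # all observed samples acceptable
target = 0.95                           # want at least 95% of items acceptable

a_post = a0 + (n_sampled - n_defective)
b_post = b0 + n_defective
prob = beta.sf(target, a_post, b_post)  # posterior P(acceptable fraction >= target)
print(f"posterior P(acceptable fraction >= {target}): {prob:.3f}")
```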

  1. Sample processing procedures and radiocarbon dating

    International Nuclear Information System (INIS)

    Svetlik, Ivo; Tomaskova, Lenka; Dreslerova, Dagmar

    2010-01-01

    The article outlines radiocarbon dating routines and highlights the potential and limitations of this method. The author's institutions have been jointly running a conventional radiocarbon dating laboratory using the international CRL code. A procedure based on the synthesis of benzene is used. Small samples are sent abroad for dating because no AMS instrumentation is available in the Czech Republic so far. Our laboratory plans to introduce routines for the processing of milligram samples and preparation of graphitized targets for AMS

  2. A random spatial sampling method in a rural developing nation

    Science.gov (United States)

    Michelle C. Kondo; Kent D.W. Bream; Frances K. Barg; Charles C. Branas

    2014-01-01

    Nonrandom sampling of populations in developing nations has limitations and can inaccurately estimate health phenomena, especially among hard-to-reach populations such as rural residents. However, random sampling of rural populations in developing nations can be challenged by incomplete enumeration of the base population. We describe a stratified random sampling method...

  3. Procedures for cryogenic X-ray ptychographic imaging of biological samples.

    Science.gov (United States)

    Yusuf, M; Zhang, F; Chen, B; Bhartiya, A; Cunnea, K; Wagner, U; Cacho-Nerin, F; Schwenke, J; Robinson, I K

    2017-03-01

    Biological sample-preparation procedures have been developed for imaging human chromosomes under cryogenic conditions. A new experimental setup, developed for imaging frozen samples using beamline I13 at Diamond Light Source, is described. This manuscript describes the equipment and experimental procedures as well as the authors' first ptychographic reconstructions using X-rays.

  4. Procedures for cryogenic X-ray ptychographic imaging of biological samples

    Directory of Open Access Journals (Sweden)

    M. Yusuf

    2017-03-01

    Full Text Available Biological sample-preparation procedures have been developed for imaging human chromosomes under cryogenic conditions. A new experimental setup, developed for imaging frozen samples using beamline I13 at Diamond Light Source, is described. This manuscript describes the equipment and experimental procedures as well as the authors' first ptychographic reconstructions using X-rays.

  5. Role of microextraction sampling procedures in forensic toxicology.

    Science.gov (United States)

    Barroso, Mário; Moreno, Ivo; da Fonseca, Beatriz; Queiroz, João António; Gallardo, Eugenia

    2012-07-01

The last two decades have provided analysts with more sensitive technology, enabling scientists from all analytical fields to see what they were not able to see just a few years ago. This increased sensitivity has allowed drug detection at very low concentrations and testing in unconventional samples (e.g., hair, oral fluid and sweat), which, despite their low analyte concentrations, has also led to a reduction in sample size. Along with this reduction, and as a result of the use of excessive amounts of potentially toxic organic solvents (with the subsequent environmental pollution and costs associated with their proper disposal), there has been a growing tendency to use miniaturized sampling techniques. These sampling procedures reduce organic solvent consumption to a minimum and at the same time provide a rapid, simple and cost-effective approach. In addition, it is possible to achieve at least some degree of automation when using these techniques, which will enhance sample throughput. These miniaturized sample preparation techniques may be roughly categorized into solid-phase and liquid-phase microextraction, depending on the nature of the analyte. This paper reviews recently published literature on the use of microextraction sampling procedures, with a special focus on the field of forensic toxicology.

  6. Acceptance test procedure for core sample trucks

    International Nuclear Information System (INIS)

    Smalley, J.L.

    1995-01-01

    The purpose of this Acceptance Test Procedure is to provide instruction and documentation for acceptance testing of the rotary mode core sample trucks, HO-68K-4600 and HO-68K-4647. The rotary mode core sample trucks were based upon the design of the second core sample truck (HO-68K-4345) which was constructed to implement rotary mode sampling of the waste tanks at Hanford. Acceptance testing of the rotary mode core sample trucks will verify that the design requirements have been met. All testing will be non-radioactive and stand-in materials shall be used to simulate waste tank conditions. Compressed air will be substituted for nitrogen during the majority of testing, with nitrogen being used only for flow characterization

  7. Aromatherapy for reducing colonoscopy related procedural anxiety and physiological parameters: a randomized controlled study.

    Science.gov (United States)

    Hu, Pei-Hsin; Peng, Yen-Chun; Lin, Yu-Ting; Chang, Chi-Sen; Ou, Ming-Chiu

    2010-01-01

Colonoscopy is generally well tolerated, but some patients regard the procedure as unpleasant and painful, and it is generally performed with the patient sedated and receiving analgesics. The effect of sedation and analgesia for colonoscopy is limited. Aromatherapy has also been applied to gastrointestinal endoscopy to reduce procedural anxiety, but there is a lack of information about aromatherapy specific to colonoscopy. In this study, we performed a randomized controlled study to investigate the effect of aromatherapy on anxiety, stress and the physiological parameters associated with colonoscopy. A randomized controlled trial was carried out, with data collected in 2009 and 2010. The participants were randomized into two groups. Aromatherapy was carried out by inhalation of sunflower oil (control group) or neroli oil (experimental group). The anxiety index was evaluated by the State Trait Anxiety Inventory-state (STAI-S) score before aromatherapy and after colonoscopy, and post-procedural pain was evaluated on a visual analogue scale (VAS). Physiological indicators, such as blood pressure (systolic and diastolic), heart rate and respiratory rate, were evaluated before and after aromatherapy. The participants were 27 subjects, 13 in the control group and 14 in the neroli group, with an average age of 52.26 +/- 17.79 years. There was no significant difference in procedural anxiety by STAI-S score or in procedural pain by VAS. The physiological parameters showed significantly lower pre- and post-procedural systolic blood pressure in the neroli group than in the control group. Aromatic care for colonoscopy, although it had no significant effect on procedural anxiety, is an inexpensive, effective and safe pre-procedural technique that could decrease systolic blood pressure.

  8. Secondary School Students' Reasoning about Conditional Probability, Samples, and Sampling Procedures

    Science.gov (United States)

    Prodromou, Theodosia

    2016-01-01

    In the Australian mathematics curriculum, Year 12 students (aged 16-17) are asked to solve conditional probability problems that involve the representation of the problem situation with two-way tables or three-dimensional diagrams and consider sampling procedures that result in different correct answers. In a small exploratory study, we…

  9. Inverse sampled Bernoulli (ISB) procedure for estimating a population proportion, with nuclear material applications

    International Nuclear Information System (INIS)

    Wright, T.

    1982-01-01

    A new sampling procedure is introduced for estimating a population proportion. The procedure combines the ideas of inverse binomial sampling and Bernoulli sampling. An unbiased estimator is given with its variance. The procedure can be viewed as a generalization of inverse binomial sampling

  10. Procedure for determination of alpha emitters in urine and dregs samples

    International Nuclear Information System (INIS)

    Serdeiro, Nelida H.

    2005-01-01

The purpose of this work is to establish the procedure for the identification and quantification of alpha-emitting radionuclides in urine and dregs samples. This procedure is applied in all laboratories of the countries of Project ARCAL LXXVII that determine alpha-emitting radionuclides in biological samples for biological assessment [es

  11. Partial report and other sampling procedures overestimate the duration of iconic memory.

    Science.gov (United States)

    Appelman, I B

    1980-03-01

    In three experiments, subjects estimated the duration of a brief visual image (iconic memory) either directly by adjusting onset of a click to offset of the visual image, or indirectly with a Sperling partial report (sampling) procedure. The results indicated that partial report and other sampling procedures may reflect other brief phenomena along with iconic memory. First, the partial report procedure yields a greater estimate of the duration of iconic memory than the more direct click method. Second, the partial report estimate of the duration of iconic memory is affected if the subject is required to simultaneously retain a list of distractor items (memory load), while the click method estimate of the duration of iconic memory is not affected by a memory load. Finally, another sampling procedure based on visual cuing yields different estimates of the duration of iconic memory depending on how many items are cued. It was concluded that partial report and other sampling procedures overestimate the duration of iconic memory.

  12. SOME SYSTEMATIC SAMPLING STRATEGIES USING MULTIPLE RANDOM STARTS

    OpenAIRE

    Sampath Sundaram; Ammani Sivaraman

    2010-01-01

In this paper an attempt is made to extend linear systematic sampling using multiple random starts due to Gautschi (1957) for various types of systematic sampling schemes available in literature, namely (i) Balanced Systematic Sampling (BSS) of Sethi (1965) and (ii) Modified Systematic Sampling (MSS) of Singh, Jindal, and Garg (1968). Further, the proposed methods were compared with Yates corrected estimator developed with reference to Gautschi's Linear systematic samplin...
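A hedged sketch of linear systematic sampling with multiple random starts in the Gautschi style: with m starts drawn without replacement from the first m·k positions, each start contributes every (m·k)-th unit, and the total sample size equals N/k as in ordinary systematic sampling. The population size, interval, and number of starts are illustrative.

```python
import numpy as np

def multi_start_systematic(N, k, m, rng):
    """Linear systematic sample of size N/k built from m random starts.

    Assumes N is a multiple of m*k; each start contributes N/(m*k) units."""
    assert N % (m * k) == 0
    starts = rng.choice(m * k, size=m, replace=False)
    return np.sort(np.concatenate([np.arange(s, N, m * k) for s in starts]))

rng = np.random.default_rng(9)
print(multi_start_systematic(N=120, k=4, m=3, rng=rng))
```

Compared with a single random start, the multiple starts provide replicate systematic subsamples, which is what makes an unbiased variance estimate possible.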

  13. Sampling and sample handling procedures for priority pollutants in surface coal mining wastewaters. [Detailed list to be analyzed for

    Energy Technology Data Exchange (ETDEWEB)

    Hayden, R. S.; Johnson, D. O.; Henricks, J. D.

    1979-03-01

    The report describes the procedures used by Argonne National Laboratory to sample surface coal mine effluents in order to obtain field and laboratory data on 110 organic compounds or classes of compounds and 14 metals and minerals that are known as priority pollutants, plus 5-day biochemical oxygen demand (BOD/sub 5/), total organic carbon (TOC), chemical oxygen demand (COD), total dissolved solids (TDS), and total suspended solids (TSS). Included are directions for preparation of sampling containers and equipment, methods of sampling and sample preservation, and field and laboratory protocols, including chain-of-custody procedures. Actual analytical procedures are not described, but their sources are referenced.

  14. Procedures for sampling and sample reduction within quality assurance systems for solid biofuels

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2005-07-01

    The objective of this experimental study on sampling was to determine the size and number of samples of biofuels required (taken at two sampling points in each case) and to compare two methods of sampling. The first objective of the sample-reduction exercise was to compare the reliability of various sampling methods, and the second objective was to measure the variations introduced as a result of reducing the sample size to form suitable test portions. The materials studied were sawdust, wood chips, wood pellets and bales of straw, and these were analysed for moisture, ash, particle size and chloride. The sampling procedures are described. The study was conducted in Scandinavia. The results of the study were presented in Leipzig in October 2004. The work was carried out as part of the UK's DTI Technology Programme: New and Renewable Energy.

  15. Health plan auditing: 100-percent-of-claims vs. random-sample audits.

    Science.gov (United States)

    Sillup, George P; Klimberg, Ronald K

    2011-01-01

    The objective of this study was to examine the relative efficacy of two different methodologies for auditing self-funded medical claim expenses: 100-percent-of-claims auditing versus random-sampling auditing. Multiple data sets of claim errors or 'exceptions' from two Fortune-100 corporations were analysed and compared to 100 simulated audits of 300- and 400-claim random samples. Random-sample simulations failed to identify a significant number and amount of the errors that ranged from $200,000 to $750,000. These results suggest that health plan expenses of corporations could be significantly reduced if they audited 100% of claims and embraced a zero-defect approach.

  16. Delineating sampling procedures: Pedagogical significance of analysing sampling descriptions and their justifications in TESL experimental research reports

    Directory of Open Access Journals (Sweden)

    Jason Miin-Hwa Lim

    2011-04-01

Full Text Available Teaching second language learners how to write research reports constitutes a crucial component in programmes on English for Specific Purposes (ESP) in institutions of higher learning. One of the rhetorical segments in research reports that merits attention has to do with the descriptions and justifications of sampling procedures. This genre-based study looks into sampling delineations in the Method-related sections of research articles on the teaching of English as a second language (TESL) written by expert writers and published in eight reputed international refereed journals. Using Swales's (1990, 2004) framework, I conducted a quantitative analysis of the rhetorical steps and a qualitative investigation into the language resources employed in delineating sampling procedures. This investigation has considerable relevance to ESP students and instructors as it has yielded pertinent findings on how samples can be appropriately described to meet the expectations of dissertation examiners, reviewers, and supervisors. The findings of this study furnish insights into how supervisors and instructors can teach novice writers ways of using specific linguistic mechanisms to lucidly describe and convincingly justify the sampling procedures in the Method sections of experimental research reports.

  17. A Combined Weighting Method Based on Hybrid of Interval Evidence Fusion and Random Sampling

    Directory of Open Access Journals (Sweden)

    Ying Yan

    2017-01-01

    Full Text Available Due to the complexity of the system and a lack of expertise, epistemic uncertainties may be present in the experts’ judgment on the importance of certain indices during group decision-making. A novel combination weighting method is proposed to solve the index weighting problem when various uncertainties are present in expert comments. Based on the ideas of evidence theory, various types of uncertain evaluation information are uniformly expressed through interval evidence structures. A similarity matrix between interval evidences is constructed, and the experts’ information is fused. Comment grades are quantified using interval numbers, and a cumulative probability function for evaluating the importance of indices is constructed from the fused information. Finally, index weights are obtained by Monte Carlo random sampling. The method can process expert information with varying degrees of uncertainty and is therefore highly compatible with different inputs. It avoids both the difficulty of effectively fusing high-conflict group decision-making information and the large information loss that follows fusion. Original expert judgments are retained objectively throughout the processing procedure. Constructing the cumulative probability function and performing the random sampling require no human intervention or judgment, so the method can easily be implemented in computer programs and thus has a clear advantage in the evaluation of very large index systems.
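    The final step of such a scheme, turning interval-valued importance judgments into crisp weights by Monte Carlo sampling, can be sketched as follows. This is a simplified illustration that draws uniformly inside each interval; it is not the paper's full evidence-fusion procedure, and all names and values are assumptions.

```python
import numpy as np

def monte_carlo_weights(intervals, n_draws=100000, seed=0):
    """Turn interval-valued importance judgments into index weights by Monte Carlo
    sampling: draw a value inside each index's interval, normalize each draw to
    weights, and average over draws (uniform draws are a simplifying assumption)."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(intervals).T
    draws = rng.uniform(lo, hi, size=(n_draws, len(intervals)))
    weights = draws / draws.sum(axis=1, keepdims=True)
    return weights.mean(axis=0)

# Example: three indices with interval-valued importance grades
print(monte_carlo_weights([(0.6, 0.9), (0.3, 0.5), (0.1, 0.4)]))
```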

  18. Systematic random sampling of the comet assay.

    Science.gov (United States)

    McArt, Darragh G; Wasson, Gillian R; McKerr, George; Saetzler, Kurt; Reed, Matt; Howard, C Vyvyan

    2009-07-01

    The comet assay is a technique used to quantify DNA damage and repair at a cellular level. In the assay, cells are embedded in agarose and the cellular content is stripped away leaving only the DNA trapped in an agarose cavity which can then be electrophoresed. The damaged DNA can enter the agarose and migrate while the undamaged DNA cannot and is retained. DNA damage is measured as the proportion of the migratory 'tail' DNA compared to the total DNA in the cell. The fundamental basis of these arbitrary values is obtained in the comet acquisition phase using fluorescence microscopy with a stoichiometric stain in tandem with image analysis software. Current methods deployed in such an acquisition are expected to select comets both objectively and at random. In this paper we examine the 'randomness' of the acquisition phase and suggest an alternative method that offers both objective and unbiased comet selection. In order to achieve this, we have adopted a survey sampling approach widely used in stereology, which offers a method of systematic random sampling (SRS). This is desirable as it offers an impartial and reproducible method of comet analysis that can be used either manually or in an automated fashion. By making use of an unbiased sampling frame and using microscope verniers, we are able to increase the precision of estimates of DNA damage. Results obtained from a multiple-user pooled variation experiment showed that the SRS technique attained a lower variability than that of the traditional approach. The analysis of a single user with repetition experiment showed greater individual variances while not being detrimental to overall averages. This would suggest that the SRS method offers a better reflection of DNA damage for a given slide and also offers better user reproducibility.
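    A minimal sketch of the systematic random sampling idea used in stereology, a random start within the first sampling interval followed by a fixed step, is shown below; this is illustrative code, not the authors' implementation.

```python
import random

def systematic_random_sample(items, n_samples):
    """Select n_samples from items using systematic random sampling:
    a random start within the first interval, then a fixed step."""
    if n_samples <= 0 or n_samples > len(items):
        raise ValueError("n_samples must be between 1 and len(items)")
    step = len(items) / n_samples          # sampling interval (may be fractional)
    start = random.uniform(0, step)        # random offset within the first interval
    indices = [int(start + i * step) for i in range(n_samples)]
    return [items[i] for i in indices]

# Example: sample 20 microscope fields out of 200 candidate positions
fields = list(range(200))
print(systematic_random_sample(fields, 20))
```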

  19. Sampling, storage and sample preparation procedures for X ray fluorescence analysis of environmental materials

    International Nuclear Information System (INIS)

    1997-06-01

    The X ray fluorescence (XRF) method is one of the most commonly used nuclear analytical techniques because of its multielement and non-destructive character, speed, economy and ease of operation. From the point of view of quality assurance practices, sampling and sample preparation procedures are the most crucial steps in all analytical techniques (including X ray fluorescence) applied for the analysis of heterogeneous materials. This technical document covers recent modes of the X ray fluorescence method and recent developments in sample preparation techniques for the analysis of environmental materials. Refs, figs, tabs

  20. Correlated random sampling for multivariate normal and log-normal distributions

    International Nuclear Information System (INIS)

    Žerovnik, Gašper; Trkov, Andrej; Kodeli, Ivan A.

    2012-01-01

    A method for correlated random sampling is presented. Representative samples for multivariate normal or log-normal distributions can be produced. Furthermore, any combination of normally and log-normally distributed correlated variables may be sampled to any requested accuracy. Possible applications of the method include sampling of resonance parameters which are used for reactor calculations.
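    A generic way to realize correlated normal or log-normal sampling is to transform independent standard normals with a Cholesky factor of the covariance matrix. The sketch below illustrates this standard construction; it is not the authors' specific code, and the example covariance values are made up.

```python
import numpy as np

def correlated_normal_samples(mean, cov, n, lognormal=False, seed=0):
    """Draw n correlated samples from a multivariate normal (or log-normal,
    by exponentiating) distribution using a Cholesky factor of the covariance."""
    rng = np.random.default_rng(seed)
    L = np.linalg.cholesky(cov)                 # cov = L @ L.T
    z = rng.standard_normal((n, len(mean)))     # independent standard normals
    x = mean + z @ L.T                          # impose the requested covariance
    return np.exp(x) if lognormal else x

# Example: two hypothetical parameters with standard deviations 0.2 and 0.1
# and correlation 0.8
mean = np.array([1.0, 2.0])
cov = np.array([[0.04, 0.8 * 0.2 * 0.1],
                [0.8 * 0.2 * 0.1, 0.01]])
samples = correlated_normal_samples(mean, cov, 100000)
print(np.corrcoef(samples.T))   # off-diagonal entries should be close to 0.8
```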

  1. Primary Definitive Procedure versus Conventional Three-staged Procedure for the Management of Low-type Anorectal Malformation in Females: A Randomized Controlled Trial.

    Science.gov (United States)

    Gupta, Alisha; Agarwala, Sandeep; Sreenivas, Vishnubhatla; Srinivas, Madhur; Bhatnagar, Veereshwar

    2017-01-01

    Females with Krickenbeck low-type anorectal malformations - vestibular fistula (VF) and perineal fistula (PF) - are managed either by a primary definitive or a conventional three-staged approach. The ultimate outcome in these children may be affected by wound dehiscence leading to healing by fibrosis. Most of the literature favors one approach over the other based on retrospective analysis of outcomes. It remained to be determined whether a statistically significant difference in wound dehiscence rates exists between these approaches. A randomized controlled trial for girls <14 years with VF or PF was done. Random tables were used to randomize 33 children to Group I (primary procedure) and 31 to Group II (three-staged procedure). Statistical analysis was done for significance of difference (P < 0.05) in the primary outcome (wound dehiscence) and secondary outcomes (immediate and early postoperative complications). Of the 64 children randomized, 54 (84%) had VF. Both groups were comparable in demography, clinical profile and age at surgery. The incidence of wound dehiscence (39.4% vs. 18.2%; P = 0.04), immediate postoperative complications (51.5% vs. 12.9%; P = 0.001), and early postoperative complications (42.4% vs. 12.9%; P = 0.01) was significantly higher in Group I as compared to Group II. Six of 13 children (46.2%) with dehiscence in Group I required a diverting colostomy. Females with VF or PF undergoing a primary definitive procedure have a significantly higher incidence of wound dehiscence (P = 0.04), immediate (P = 0.001) and early postoperative complications (P = 0.01).

  2. Wide brick tunnel randomization - an unequal allocation procedure that limits the imbalance in treatment totals.

    Science.gov (United States)

    Kuznetsova, Olga M; Tymofyeyev, Yevgen

    2014-04-30

    In open-label studies, partial predictability of permuted block randomization provides potential for selection bias. To lessen the selection bias in two-arm studies with equal allocation, a number of allocation procedures that limit the imbalance in treatment totals at a pre-specified level but do not require the exact balance at the ends of the blocks were developed. In studies with unequal allocation, however, the task of designing a randomization procedure that sets a pre-specified limit on imbalance in group totals is not resolved. Existing allocation procedures either do not preserve the allocation ratio at every allocation or do not include all allocation sequences that comply with the pre-specified imbalance threshold. Kuznetsova and Tymofyeyev described the brick tunnel randomization for studies with unequal allocation that preserves the allocation ratio at every step and, in the two-arm case, includes all sequences that satisfy the smallest possible imbalance threshold. This article introduces wide brick tunnel randomization for studies with unequal allocation that allows all allocation sequences with imbalance not exceeding any pre-specified threshold while preserving the allocation ratio at every step. In open-label studies, allowing a larger imbalance in treatment totals lowers selection bias because of the predictability of treatment assignments. The applications of the technique in two-arm and multi-arm open-label studies with unequal allocation are described. Copyright © 2013 John Wiley & Sons, Ltd.

  3. 9 CFR 147.8 - Procedures for preparing egg yolk samples for diagnostic tests.

    Science.gov (United States)

    2010-01-01

    9 CFR § 147.8 - Procedures for preparing egg yolk samples for diagnostic tests (Animals and Animal Products; ANIMAL AND PLANT HEALTH …; … IMPROVEMENT PLAN; Blood Testing Procedures). Regulation text fragment: "… for diagnostic testing. (b) The authorized laboratory must identify each egg as to the breeding flock …"

  4. Random sampling of evolution time space and Fourier transform processing

    International Nuclear Information System (INIS)

    Kazimierczuk, Krzysztof; Zawadzka, Anna; Kozminski, Wiktor; Zhukov, Igor

    2006-01-01

    Application of the Fourier transform for processing 3D NMR spectra with random sampling of the evolution time space is presented. The 2D FT is calculated for pairs of frequencies, instead of the conventional sequence of one-dimensional transforms. Signal-to-noise ratios and linewidths for different random distributions were investigated by simulations and experiments. The experimental examples include 3D HNCA, HNCACB and ¹⁵N-edited NOESY-HSQC spectra of a ¹³C,¹⁵N-labelled ubiquitin sample. The obtained results revealed the general applicability of the proposed method and a significant improvement of resolution in comparison with conventional spectra recorded in the same time.
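    The core operation, evaluating a two-dimensional Fourier transform directly on randomly sampled evolution-time pairs, can be sketched as follows. This is a simplified illustration on synthetic data, not the processing software used in the paper.

```python
import numpy as np

def ft2d_random_samples(t1, t2, signal, f1, f2):
    """Evaluate a 2D Fourier transform directly for a set of randomly sampled
    evolution-time pairs (t1[k], t2[k]) at one requested frequency pair (f1, f2)."""
    phase = np.exp(-2j * np.pi * (f1 * t1 + f2 * t2))
    return np.sum(signal * phase)

# Example with a synthetic signal oscillating at (50 Hz, 120 Hz)
rng = np.random.default_rng(1)
t1 = rng.uniform(0, 0.1, 512)   # randomly sampled evolution times (s)
t2 = rng.uniform(0, 0.1, 512)
sig = np.exp(2j * np.pi * (50 * t1 + 120 * t2))
print(abs(ft2d_random_samples(t1, t2, sig, 50, 120)))   # large (on resonance)
print(abs(ft2d_random_samples(t1, t2, sig, 50, 200)))   # small (off resonance)
```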

  5. On the assessment of extremely low breakdown probabilities by an inverse sampling procedure [gaseous insulation]

    DEFF Research Database (Denmark)

    Thyregod, Poul; Vibholm, Svend

    1991-01-01

    First breakdown voltages obtained under the inverse sampling procedure assuming a double exponential flashover probability function are discussed. An inverse sampling procedure commences the voltage application at a very low level, followed by applications at stepwise increased levels until a breakdown occurs. The relation between the flashover probability function and the corresponding distribution of first breakdown voltages under the inverse sampling procedure is derived, and it is shown how this relation may be utilized to assess the single-shot flashover probability corresponding to the observed average first breakdown voltage. Since the procedure is based on voltage applications in the neighbourhood of the quantile under investigation, the procedure is found to be insensitive to the underlying distributional assumptions.

  6. Relations among conceptual knowledge, procedural knowledge, and procedural flexibility in two samples differing in prior knowledge.

    Science.gov (United States)

    Schneider, Michael; Rittle-Johnson, Bethany; Star, Jon R

    2011-11-01

    Competence in many domains rests on children developing conceptual and procedural knowledge, as well as procedural flexibility. However, research on the developmental relations between these different types of knowledge has yielded unclear results, in part because little attention has been paid to the validity of the measures or to the effects of prior knowledge on the relations. To overcome these problems, we modeled the three constructs in the domain of equation solving as latent factors and tested (a) whether the predictive relations between conceptual and procedural knowledge were bidirectional, (b) whether these interrelations were moderated by prior knowledge, and (c) how both constructs contributed to procedural flexibility. We analyzed data from 2 measurement points each from two samples (Ns = 228 and 304) of middle school students who differed in prior knowledge. Conceptual and procedural knowledge had stable bidirectional relations that were not moderated by prior knowledge. Both kinds of knowledge contributed independently to procedural flexibility. The results demonstrate how changes in complex knowledge structures contribute to competence development.

  7. Sample preparation procedures utilized in microbial metabolomics: An overview.

    Science.gov (United States)

    Patejko, Małgorzata; Jacyna, Julia; Markuszewski, Michał J

    2017-02-01

    Bacteria are remarkably diverse in terms of their size, structure and biochemical properties. Due to this fact, it is hard to develop a universal method for handling bacterial cultures during metabolomic analysis. The choice of suitable processing methods constitutes a key element in any analysis, because only an appropriate selection of procedures can provide accurate results leading to reliable conclusions. Because of that, every analytical experiment concerning bacteria requires an individually and very carefully planned research methodology. Although every study varies in terms of sample preparation, there are a few general steps to follow when planning an experiment, such as sampling, separation of cells from the growth medium, stopping their metabolism, and extraction. As a result of extraction, all intracellular metabolites should be washed out from the cell environment. Moreover, the extraction method used must not cause any chemical decomposition or degradation of the metabolome. Furthermore, the chosen extraction method should be compatible with the analytical technique, so that it does not disturb or prolong subsequent sample preparation steps. For those reasons, we observe a need to summarize the sample preparation procedures currently utilized in microbial metabolomic studies. In the presented overview, papers concerning the analysis of extra- and intracellular metabolites, published over the last decade, have been discussed. The presented work gives some basic guidelines that might be useful while planning experiments in microbial metabolomics. Copyright © 2016 Elsevier B.V. All rights reserved.

  8. Comparison of two sample preparation procedures for HPLC determination of ochratoxin A

    Directory of Open Access Journals (Sweden)

    Vuković Gorica L.

    2009-01-01

    Full Text Available In the preparation of samples for chromatographic determination of ochratoxin A, two types of columns were used for sample cleanup (SPE and immunoaffinity columns). The first method consisted of liquid-liquid extraction with a mixture of chloroform and phosphoric acid, followed by ion-exchange cleanup on Waters Oasis MAX columns. The second method consisted of extraction with a mixture of water and methanol, followed by LCTech OtaCLEAN immunoaffinity column cleanup. Recoveries of the methods were determined at three levels in three repetitions for maize flour, and they were 84% (%RSD = 19.2) for the first method of sample preparation and 101% (%RSD = 2.2) for the second method. Values of LOQ for OTA were 0.25 and 1.00 μg/kg for the IAC and SPE clean-up procedures, respectively. Both methods comply with present regulations, but the MAX sample clean-up procedure should be used as an alternative, since the immunoaffinity column clean-up procedure is characterized by better reproducibility, accuracy, and efficiency.

  9. Random sampling of elementary flux modes in large-scale metabolic networks.

    Science.gov (United States)

    Machado, Daniel; Soons, Zita; Patil, Kiran Raosaheb; Ferreira, Eugénio C; Rocha, Isabel

    2012-09-15

    The description of a metabolic network in terms of elementary (flux) modes (EMs) provides an important framework for metabolic pathway analysis. However, their application to large networks has been hampered by the combinatorial explosion in the number of modes. In this work, we develop a method for generating random samples of EMs without computing the whole set. Our algorithm is an adaptation of the canonical basis approach, where we add an additional filtering step which, at each iteration, selects a random subset of the new combinations of modes. In order to obtain an unbiased sample, all candidates are assigned the same probability of getting selected. This approach avoids the exponential growth of the number of modes during computation, thus generating a random sample of the complete set of EMs within reasonable time. We generated samples of different sizes for a metabolic network of Escherichia coli, and observed that they preserve several properties of the full EM set. It is also shown that EM sampling can be used for rational strain design. A well distributed sample that is representative of the complete set of EMs should be suitable for most EM-based methods for analysis and optimization of metabolic networks. Source code for a cross-platform implementation in Python is freely available at http://code.google.com/p/emsampler. Contact: dmachado@deb.uminho.pt. Supplementary data are available at Bioinformatics online.

  10. Revisiting random walk based sampling in networks: evasion of burn-in period and frequent regenerations.

    Science.gov (United States)

    Avrachenkov, Konstantin; Borkar, Vivek S; Kadavankandy, Arun; Sreedharan, Jithin K

    2018-01-01

    In the framework of network sampling, random walk (RW) based estimation techniques provide many pragmatic solutions while uncovering the unknown network as little as possible. Despite several theoretical advances in this area, RW based sampling techniques usually make a strong assumption that the samples are in stationary regime, and hence are impelled to leave out the samples collected during the burn-in period. This work proposes two sampling schemes without burn-in time constraint to estimate the average of an arbitrary function defined on the network nodes, for example, the average age of users in a social network. The central idea of the algorithms lies in exploiting regeneration of RWs at revisits to an aggregated super-node or to a set of nodes, and in strategies to enhance the frequency of such regenerations either by contracting the graph or by making the hitting set larger. Our first algorithm, which is based on reinforcement learning (RL), uses stochastic approximation to derive an estimator. This method can be seen as intermediate between purely stochastic Markov chain Monte Carlo iterations and deterministic relative value iterations. The second algorithm, which we call the Ratio with Tours (RT)-estimator, is a modified form of respondent-driven sampling (RDS) that accommodates the idea of regeneration. We study the methods via simulations on real networks. We observe that the trajectories of RL-estimator are much more stable than those of standard random walk based estimation procedures, and its error performance is comparable to that of respondent-driven sampling (RDS) which has a smaller asymptotic variance than many other estimators. Simulation studies also show that the mean squared error of RT-estimator decays much faster than that of RDS with time. The newly developed RW based estimators (RL- and RT-estimators) allow to avoid burn-in period, provide better control of stability along the sample path, and overall reduce the estimation time. Our
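    For orientation, the sketch below shows the classical random-walk average estimator with 1/degree reweighting that such methods build on. It is a baseline illustration only, not the RL- or RT-estimators proposed in the paper, and it ignores the burn-in issue the paper addresses.

```python
import random

def rw_average(adj, f, steps, seed=0):
    """Estimate the average of f over the nodes of an undirected graph by a simple
    random walk, correcting for the degree-proportional stationary distribution
    with 1/degree importance weights."""
    rng = random.Random(seed)
    node = rng.choice(list(adj))
    num = den = 0.0
    for _ in range(steps):
        node = rng.choice(adj[node])      # move to a uniformly chosen neighbour
        deg = len(adj[node])
        num += f(node) / deg
        den += 1.0 / deg
    return num / den

# Toy example: average node label on a small graph
adj = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1], 3: [1]}
print(rw_average(adj, lambda v: v, 100000))   # close to (0+1+2+3)/4 = 1.5
```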

  11. Effect of music in endoscopy procedures: systematic review and meta-analysis of randomized controlled trials.

    Science.gov (United States)

    Wang, Man Cai; Zhang, Ling Yi; Zhang, Yu Long; Zhang, Ya Wu; Xu, Xiao Dong; Zhang, You Cheng

    2014-10-01

    Endoscopies are common clinical examinations that are somewhat painful and even cause fear and anxiety for patients. We performed this systematic review and meta-analysis of randomized controlled trials to determine the effect of music on patients undergoing various endoscopic procedures. We searched the Cochrane Library, Issue 6, 2013, PubMed, and EMBASE databases up to July 2013. Randomized controlled trials comparing endoscopies, with and without the use of music, were included. Two authors independently abstracted data and assessed risk of bias. Subgroup analyses were performed to examine the impact of music on different types of endoscopic procedures. Twenty-one randomized controlled trials involving 2,134 patients were included. The overall effect of music on patients undergoing a variety of endoscopic procedures significantly improved pain score (weighted mean difference [WMD] = -1.53, 95% confidence interval [CI] [-2.53, -0.53]), anxiety (WMD = -6.04, 95% CI [-9.61, -2.48]), heart rate (P = 0.01), and arterial pressure (P …) in the music group, compared with the control group. Furthermore, music had little effect for patients undergoing colposcopy and bronchoscopy in the subanalysis. Our meta-analysis suggested that music may offer benefits for patients undergoing endoscopy, except in colposcopy and bronchoscopy. Wiley Periodicals, Inc.

  12. Fast egg collection method greatly improves randomness of egg sampling in Drosophila melanogaster

    DEFF Research Database (Denmark)

    Schou, Mads Fristrup

    2013-01-01

    When obtaining samples for population genetic studies, it is essential that the sampling is random. For Drosophila, one of the crucial steps in sampling experimental flies is the collection of eggs. Here an egg collection method is presented, which randomizes the eggs in a water column … and diminishes environmental variance. This method was compared with a traditional egg collection method where eggs are collected directly from the medium. Within each method the observed and expected standard deviations of egg-to-adult viability were compared, whereby the difference in the randomness … To obtain a representative collection of genotypes, the method presented here is strongly recommended when collecting eggs from Drosophila.

  13. CONSISTENCY UNDER SAMPLING OF EXPONENTIAL RANDOM GRAPH MODELS.

    Science.gov (United States)

    Shalizi, Cosma Rohilla; Rinaldo, Alessandro

    2013-04-01

    The growing availability of network data and of scientific interest in distributed systems has led to the rapid development of statistical models of network structure. Typically, however, these are models for the entire network, while the data consists only of a sampled sub-network. Parameters for the whole network, which is what is of interest, are estimated by applying the model to the sub-network. This assumes that the model is consistent under sampling, or, in terms of the theory of stochastic processes, that it defines a projective family. Focusing on the popular class of exponential random graph models (ERGMs), we show that this apparently trivial condition is in fact violated by many popular and scientifically appealing models, and that satisfying it drastically limits ERGM's expressive power. These results are actually special cases of more general results about exponential families of dependent random variables, which we also prove. Using such results, we offer easily checked conditions for the consistency of maximum likelihood estimation in ERGMs, and discuss some possible constructive responses.

  14. Reconstructing random media

    International Nuclear Information System (INIS)

    Yeong, C.L.; Torquato, S.

    1998-01-01

    We formulate a procedure to reconstruct the structure of general random heterogeneous media from limited morphological information by extending the methodology of Rintoul and Torquato [J. Colloid Interface Sci. 186, 467 (1997)] developed for dispersions. The procedure has the advantages that it is simple to implement and generally applicable to multidimensional, multiphase, and anisotropic structures. Furthermore, an extremely useful feature is that it can incorporate any type and number of correlation functions in order to provide as much morphological information as is necessary for accurate reconstruction. We consider a variety of one- and two-dimensional reconstructions, including periodic and random arrays of rods, various distributions of disks, Debye random media, and a Fontainebleau sandstone sample. We also use our algorithm to construct heterogeneous media from specified hypothetical correlation functions, including an exponentially damped, oscillating function as well as physically unrealizable ones. copyright 1998 The American Physical Society
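    The reconstruction idea, stochastically swapping pixels between phases to drive the sample's correlation function toward a target, can be sketched in one dimension as follows. This is a simplified, hedged illustration of a Yeong-Torquato-style annealing loop with made-up parameters, not the authors' implementation.

```python
import numpy as np

def two_point(img):
    """Two-point probability S2(r) of the phase-1 pixels of a 1D periodic image."""
    n = img.size
    f = np.fft.fft(img)
    return np.real(np.fft.ifft(f * np.conj(f))) / n   # circular autocorrelation via FFT

def reconstruct(target_s2, n, n_ones, steps=5000, t0=1e-3, seed=0):
    """Start from a random image with the correct volume fraction and accept pixel
    swaps that move the sample correlation function toward the target
    (simulated-annealing acceptance rule)."""
    rng = np.random.default_rng(seed)
    img = np.zeros(n)
    img[rng.choice(n, n_ones, replace=False)] = 1
    energy = np.sum((two_point(img) - target_s2) ** 2)
    for k in range(steps):
        i = rng.choice(np.flatnonzero(img == 1))
        j = rng.choice(np.flatnonzero(img == 0))
        img[i], img[j] = 0, 1                          # trial swap preserves volume fraction
        e_new = np.sum((two_point(img) - target_s2) ** 2)
        temp = t0 * (1 - k / steps) + 1e-12
        if e_new < energy or rng.random() < np.exp((energy - e_new) / temp):
            energy = e_new
        else:
            img[i], img[j] = 1, 0                      # reject: undo the swap
    return img

# Example: reconstruct a 1D two-phase medium from the S2 of a reference sample
rng = np.random.default_rng(1)
reference = (rng.random(128) < 0.3).astype(float)
recon = reconstruct(two_point(reference), 128, int(reference.sum()))
```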

  15. A comparative examination of sample treatment procedures for ICAP-AES analysis of biological tissue

    Science.gov (United States)

    De Boer, J. L. M.; Maessen, F. J. M. J.

    The objective of this study was to contribute to the evaluation of existing sample preparation procedures for ICAP-AES analysis of biological material. Performance characteristics were established for current digestion procedures comprising extraction, solubilization, pressure digestion, and wet and dry ashing methods. Apart from accuracy and precision, a number of criteria of special interest for analytical practice were applied. SRM bovine liver served as the test sample. In this material six elements were simultaneously determined. Results showed that every procedure has its defects and advantages. Hence, unambiguous recommendation of standard digestion procedures can be made only when taking into account the specific analytical problem.

  16. Characterization of electron microscopes with binary pseudo-random multilayer test samples

    Science.gov (United States)

    Yashchuk, Valeriy V.; Conley, Raymond; Anderson, Erik H.; Barber, Samuel K.; Bouet, Nathalie; McKinney, Wayne R.; Takacs, Peter Z.; Voronov, Dmitriy L.

    2011-09-01

    Verification of the reliability of metrology data from high quality X-ray optics requires that adequate methods for test and calibration of the instruments be developed. For such verification for optical surface profilometers in the spatial frequency domain, a modulation transfer function (MTF) calibration method based on binary pseudo-random (BPR) gratings and arrays has been suggested [1,2] and proven to be an effective calibration method for a number of interferometric microscopes, a phase shifting Fizeau interferometer, and a scatterometer [5]. Here we describe the details of development of binary pseudo-random multilayer (BPRML) test samples suitable for characterization of scanning (SEM) and transmission (TEM) electron microscopes. We discuss the results of TEM measurements with the BPRML test samples fabricated from a WiSi₂/Si multilayer coating with pseudo-randomly distributed layers. In particular, we demonstrate that significant information about the metrological reliability of the TEM measurements can be extracted even when the fundamental frequency of the BPRML sample is smaller than the Nyquist frequency of the measurements. The measurements demonstrate a number of problems related to the interpretation of the SEM and TEM data. Note that similar BPRML test samples can be used to characterize X-ray microscopes. Corresponding work with X-ray microscopes is in progress.

  17. Characterization of electron microscopes with binary pseudo-random multilayer test samples

    International Nuclear Information System (INIS)

    Yashchuk, Valeriy V.; Conley, Raymond; Anderson, Erik H.; Barber, Samuel K.; Bouet, Nathalie; McKinney, Wayne R.; Takacs, Peter Z.; Voronov, Dmitriy L.

    2011-01-01

    Verification of the reliability of metrology data from high quality X-ray optics requires that adequate methods for test and calibration of the instruments be developed. For such verification for optical surface profilometers in the spatial frequency domain, a modulation transfer function (MTF) calibration method based on binary pseudo-random (BPR) gratings and arrays has been suggested and proven to be an effective calibration method for a number of interferometric microscopes, a phase shifting Fizeau interferometer, and a scatterometer [5]. Here we describe the details of development of binary pseudo-random multilayer (BPRML) test samples suitable for characterization of scanning (SEM) and transmission (TEM) electron microscopes. We discuss the results of TEM measurements with the BPRML test samples fabricated from a WiSi₂/Si multilayer coating with pseudo-randomly distributed layers. In particular, we demonstrate that significant information about the metrological reliability of the TEM measurements can be extracted even when the fundamental frequency of the BPRML sample is smaller than the Nyquist frequency of the measurements. The measurements demonstrate a number of problems related to the interpretation of the SEM and TEM data. Note that similar BPRML test samples can be used to characterize X-ray microscopes. Corresponding work with X-ray microscopes is in progress.

  18. LOD score exclusion analyses for candidate genes using random population samples.

    Science.gov (United States)

    Deng, H W; Li, J; Recker, R R

    2001-05-01

    While extensive analyses have been conducted to test for, no formal analyses have been conducted to test against, the importance of candidate genes with random population samples. We develop a LOD score approach for exclusion analyses of candidate genes with random population samples. Under this approach, specific genetic effects and inheritance models at candidate genes can be analysed and if a LOD score is ≤ -2.0, the locus can be excluded from having an effect larger than that specified. Computer simulations show that, with sample sizes often employed in association studies, this approach has high power to exclude a gene from having moderate genetic effects. In contrast to regular association analyses, population admixture will not affect the robustness of our analyses; in fact, it renders our analyses more conservative and thus any significant exclusion result is robust. Our exclusion analysis complements association analysis for candidate genes in random population samples and is parallel to the exclusion mapping analyses that may be conducted in linkage analyses with pedigrees or relative pairs. The usefulness of the approach is demonstrated by an application to test the importance of vitamin D receptor and estrogen receptor genes underlying the differential risk to osteoporotic fractures.

  19. Analytical procedures for determining Pb and Sr isotopic compositions in water samples by ID-TIMS

    Directory of Open Access Journals (Sweden)

    Veridiana Martins

    2008-01-01

    Full Text Available Few articles deal with lead and strontium isotopic analysis of water samples. The aim of this study was to define the chemical procedures for Pb and Sr isotopic analyses of groundwater samples from an urban sedimentary aquifer. Thirty lead and fourteen strontium isotopic analyses were performed to test different analytical procedures. Pb and Sr isotopic ratios as well as Sr concentration did not vary using different chemical procedures. However, the Pb concentrations were very dependent on the different procedures. Therefore, the choice of the best analytical procedure was based on the Pb results, which indicated a higher reproducibility from samples that had been filtered and acidified before the evaporation, had their residues totally dissolved, and were purified by ion chromatography using the Biorad® column. Our results showed no changes in Pb ratios with the storage time.

  20. Comparison of sample preparation procedures on metal(loid) fractionation patterns in lichens.

    Science.gov (United States)

    Kroukamp, E M; Godeto, T W; Forbes, P B C

    2017-08-13

    The effects of different sample preparation strategies and storage on metal(loid) fractionation trends in plant material are largely under-researched. In this study, a bulk sample of the lichen Parmotrema austrosinense (Zahlbr.) Hale was analysed for its total extractable metal(loid) content by ICP-MS, and was determined to be adequately homogenous (…). Samples were prepared utilising a range of sample preservation techniques and subjected to a modified sequential extraction procedure or to total metal extraction. Both experiments were repeated after 1-month storage at 4 °C. Cryogenic freezing gave the best reproducibility for total extractable elemental concentrations between months, indicating this to be the most suitable method of sample preparation in such studies. The combined extraction efficiencies were >82% for As, Cu, Mn, Pb, Sr and Zn but poor for other elements, where the sample preparation strategies 'no sample preparation' and 'dried in a desiccator' had the best extraction recoveries. Cryogenic freezing procedures had a significantly (p …) … sample cleaning and preservation when species fractionation patterns are of interest. This study also shows that the assumption that species stability can be ensured through cryopreservation and freeze drying techniques needs to be revisited.

  1. Development of Radioanalytical and Microanalytical Procedures for the Determination of Actinides in Environmental Samples

    International Nuclear Information System (INIS)

    Macsik, Zsuzsanna; Vajda, Nora; Bene, Balazs; Varga, Zsolt

    2008-01-01

    A radio-analytical procedure has been developed for the simultaneous determination of actinides in swipe samples by alpha-spectrometry after the separation of the actinides by extraction chromatography. The procedure is based on the complete decomposition of the sample by destruction with microwave digestion or ashing in furnace. Actinides are separated on an extraction chromatographic column filled with TRU resin (product of Eichrom Industries Inc.). Alpha sources prepared from the separated fractions of americium, plutonium, thorium and uranium are counted by alpha spectrometry. Micro-analytical procedure is being developed for the location and identification of individual particles containing fissile material using solid state nuclear track detectors. The parameters of alpha and fission track detection have been optimized and a procedure has been elaborated to locate the particles on the sample by defining the coordinates of the tracks created by the particles on the track detector. Development of a procedure is planned to separate the located particles using micromanipulator and these particles will be examined individually by different micro- and radio-analytical techniques. (authors)

  2. Development of Radioanalytical and Microanalytical Procedures for the Determination of Actinides in Environmental Samples

    Energy Technology Data Exchange (ETDEWEB)

    Macsik, Zsuzsanna [Institute of Nuclear Techniques, Moegyetem rakpart 9, H-1111 Budapest (Hungary); Vajda, Nora [RadAnal Ltd., Bimbo ut 119/a, H-1026 Budapest (Hungary); Bene, Balazs [National Institute of Standards and Technology, Gaithersburg, MD 20899 (United States); Varga, Zsolt [Institute of Isotopes, Konkoly-Thege M. ut 29-33, H-1121 Budapest (Hungary)

    2008-07-01

    A radio-analytical procedure has been developed for the simultaneous determination of actinides in swipe samples by alpha-spectrometry after the separation of the actinides by extraction chromatography. The procedure is based on the complete decomposition of the sample by destruction with microwave digestion or ashing in furnace. Actinides are separated on an extraction chromatographic column filled with TRU resin (product of Eichrom Industries Inc.). Alpha sources prepared from the separated fractions of americium, plutonium, thorium and uranium are counted by alpha spectrometry. Micro-analytical procedure is being developed for the location and identification of individual particles containing fissile material using solid state nuclear track detectors. The parameters of alpha and fission track detection have been optimized and a procedure has been elaborated to locate the particles on the sample by defining the coordinates of the tracks created by the particles on the track detector. Development of a procedure is planned to separate the located particles using micromanipulator and these particles will be examined individually by different micro- and radio-analytical techniques. (authors)

  3. Random On-Board Pixel Sampling (ROPS) X-Ray Camera

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Zhehui [Los Alamos; Iaroshenko, O. [Los Alamos; Li, S. [Los Alamos; Liu, T. [Fermilab; Parab, N. [Argonne (main); Chen, W. W. [Purdue U.; Chu, P. [Los Alamos; Kenyon, G. [Los Alamos; Lipton, R. [Fermilab; Sun, K.-X. [Nevada U., Las Vegas

    2017-09-25

    Recent advances in compressed sensing theory and algorithms offer new possibilities for high-speed X-ray camera design. In many CMOS cameras, each pixel has an independent on-board circuit that includes an amplifier, noise rejection, signal shaper, an analog-to-digital converter (ADC), and optional in-pixel storage. When X-ray images are sparse, i.e., when one of the following cases is true: (a.) The number of pixels with true X-ray hits is much smaller than the total number of pixels; (b.) The X-ray information is redundant; or (c.) Some prior knowledge about the X-ray images exists, sparse sampling may be allowed. Here we first illustrate the feasibility of random on-board pixel sampling (ROPS) using an existing set of X-ray images, followed by a discussion about signal to noise as a function of pixel size. Next, we describe a possible circuit architecture to achieve random pixel access and in-pixel storage. The combination of a multilayer architecture, sparse on-chip sampling, and computational image techniques, is expected to facilitate the development and applications of high-speed X-ray camera technology.
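    A toy illustration of the sparse, random-readout idea is given below. It is purely conceptual Python, says nothing about the actual pixel circuit or camera architecture, and all names and values are assumptions.

```python
import numpy as np

def random_pixel_readout(image, fraction, seed=0):
    """Read out only a random subset of pixels (random on-board pixel sampling,
    sketched in software): returns sampled coordinates and their values."""
    rng = np.random.default_rng(seed)
    n_pix = image.size
    idx = rng.choice(n_pix, size=int(fraction * n_pix), replace=False)
    rows, cols = np.unravel_index(idx, image.shape)
    return rows, cols, image[rows, cols]

# Toy sparse "X-ray frame": a few bright hits on a dark background
frame = np.zeros((64, 64))
frame[5, 7] = frame[40, 12] = frame[33, 50] = 100.0
rows, cols, vals = random_pixel_readout(frame, 0.25)
hits = [(r, c) for r, c, v in zip(rows, cols, vals) if v > 0]
print(hits)   # each true hit has a ~25% chance of appearing in this single readout
```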

  4. Sampling pig farms at the abattoir in a cross-sectional study - Evaluation of a sampling method.

    Science.gov (United States)

    Birkegård, Anna Camilla; Halasa, Tariq; Toft, Nils

    2017-09-15

    A cross-sectional study design is relatively inexpensive, fast and easy to conduct when compared to other study designs. Careful planning is essential to obtaining a representative sample of the population, and the recommended approach is to use simple random sampling from an exhaustive list of units in the target population. This approach is rarely feasible in practice, and other sampling procedures must often be adopted. For example, when slaughter pigs are the target population, sampling the pigs on the slaughter line may be an alternative to on-site sampling at a list of farms. However, it is difficult to sample a large number of farms from an exact predefined list, due to the logistics and workflow of an abattoir. Therefore, it is necessary to have a systematic sampling procedure and to evaluate the obtained sample with respect to the study objective. We propose a method for 1) planning, 2) conducting, and 3) evaluating the representativeness and reproducibility of a cross-sectional study when simple random sampling is not possible. We used an example of a cross-sectional study with the aim of quantifying the association of antimicrobial resistance and antimicrobial consumption in Danish slaughter pigs. It was not possible to visit farms within the designated timeframe. Therefore, it was decided to use convenience sampling at the abattoir. Our approach was carried out in three steps: 1) planning: using data from meat inspection to plan at which abattoirs and how many farms to sample; 2) conducting: sampling was carried out at five abattoirs; 3) evaluation: representativeness was evaluated by comparing sampled and non-sampled farms, and the reproducibility of the study was assessed through simulated sampling based on meat inspection data from the period where the actual data collection was carried out. In the cross-sectional study samples were taken from 681 Danish pig farms, during five weeks from February to March 2015. The evaluation showed that the sampling

  5. Random Sampling with Interspike-Intervals of the Exponential Integrate and Fire Neuron: A Computational Interpretation of UP-States.

    Directory of Open Access Journals (Sweden)

    Andreas Steimer

    Full Text Available Oscillations between high and low values of the membrane potential (UP and DOWN states, respectively) are an ubiquitous feature of cortical neurons during slow wave sleep and anesthesia. Nevertheless, only a surprisingly small number of quantitative studies have been conducted that deal with this phenomenon's implications for computation. Here we present a novel theory that explains on a detailed mathematical level the computational benefits of UP states. The theory is based on random sampling by means of interspike intervals (ISIs) of the exponential integrate and fire (EIF) model neuron, such that each spike is considered a sample, whose analog value corresponds to the spike's preceding ISI. As we show, the EIF's exponential sodium current, that kicks in when balancing a noisy membrane potential around values close to the firing threshold, leads to a particularly simple, approximative relationship between the neuron's ISI distribution and input current. Approximation quality depends on the frequency spectrum of the current and is improved upon increasing the voltage baseline towards threshold. Thus, the conceptually simpler leaky integrate and fire neuron that is missing such an additional current boost performs consistently worse than the EIF and does not improve when voltage baseline is increased. For the EIF in contrast, the presented mechanism is particularly effective in the high-conductance regime, which is a hallmark feature of UP-states. Our theoretical results are confirmed by accompanying simulations, which were conducted for input currents of varying spectral composition. Moreover, we provide analytical estimations of the range of ISI distributions the EIF neuron can sample from at a given approximation level. Such samples may be considered by any algorithmic procedure that is based on random sampling, such as Markov Chain Monte Carlo or message-passing methods. Finally, we explain how spike-based random sampling relates to existing …

  6. Random Sampling with Interspike-Intervals of the Exponential Integrate and Fire Neuron: A Computational Interpretation of UP-States.

    Science.gov (United States)

    Steimer, Andreas; Schindler, Kaspar

    2015-01-01

    Oscillations between high and low values of the membrane potential (UP and DOWN states respectively) are an ubiquitous feature of cortical neurons during slow wave sleep and anesthesia. Nevertheless, only a surprisingly small number of quantitative studies have been conducted that deal with this phenomenon's implications for computation. Here we present a novel theory that explains on a detailed mathematical level the computational benefits of UP states. The theory is based on random sampling by means of interspike intervals (ISIs) of the exponential integrate and fire (EIF) model neuron, such that each spike is considered a sample, whose analog value corresponds to the spike's preceding ISI. As we show, the EIF's exponential sodium current, that kicks in when balancing a noisy membrane potential around values close to the firing threshold, leads to a particularly simple, approximative relationship between the neuron's ISI distribution and input current. Approximation quality depends on the frequency spectrum of the current and is improved upon increasing the voltage baseline towards threshold. Thus, the conceptually simpler leaky integrate and fire neuron that is missing such an additional current boost performs consistently worse than the EIF and does not improve when voltage baseline is increased. For the EIF in contrast, the presented mechanism is particularly effective in the high-conductance regime, which is a hallmark feature of UP-states. Our theoretical results are confirmed by accompanying simulations, which were conducted for input currents of varying spectral composition. Moreover, we provide analytical estimations of the range of ISI distributions the EIF neuron can sample from at a given approximation level. Such samples may be considered by any algorithmic procedure that is based on random sampling, such as Markov Chain Monte Carlo or message-passing methods. Finally, we explain how spike-based random sampling relates to existing computational …
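    A minimal simulation sketch of an EIF neuron whose interspike intervals are collected as samples is shown below. The dimensionless parameter values are illustrative choices, not values from the paper, and the integration scheme is a simple Euler step.

```python
import numpy as np

def eif_isi_samples(i_drive=0.75, noise_std=0.4, t_max=500.0, dt=0.001, seed=0):
    """Simulate a dimensionless exponential integrate-and-fire (EIF) neuron driven
    by a noisy current and return its interspike intervals (ISIs)."""
    rng = np.random.default_rng(seed)
    v_t, delta_t, v_reset, v_spike = 1.0, 0.2, 0.0, 1.5
    n_steps = int(t_max / dt)
    noise = noise_std * np.sqrt(dt) * rng.standard_normal(n_steps)
    v, t, t_last, isis = v_reset, 0.0, 0.0, []
    for k in range(n_steps):
        # leak + exponential spike-initiation current + mean drive + noise
        v += (-v + delta_t * np.exp((v - v_t) / delta_t) + i_drive) * dt + noise[k]
        t += dt
        if v >= v_spike:              # spike detected: store the ISI and reset
            isis.append(t - t_last)
            t_last, v = t, v_reset
    return np.array(isis)

isis = eif_isi_samples()
print(len(isis), isis.mean() if len(isis) else "no spikes")
```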

  7. Misrepresenting random sampling? A systematic review of research papers in the Journal of Advanced Nursing.

    Science.gov (United States)

    Williamson, Graham R

    2003-11-01

    This paper discusses the theoretical limitations of the use of random sampling and probability theory in the production of a significance level (or P-value) in nursing research. Potential alternatives, in the form of randomization tests, are proposed. Research papers in nursing, medicine and psychology frequently misrepresent their statistical findings, as the P-values reported assume random sampling. In this systematic review of studies published between January 1995 and June 2002 in the Journal of Advanced Nursing, 89 (68%) studies broke this assumption because they used convenience samples or entire populations. As a result, some of the findings may be questionable. The key ideas of random sampling and probability theory for statistical testing (for generating a P-value) are outlined. The result of a systematic review of research papers published in the Journal of Advanced Nursing is then presented, showing how frequently random sampling appears to have been misrepresented. Useful alternative techniques that might overcome these limitations are then discussed. REVIEW LIMITATIONS: This review is limited in scope because it is applied to one journal, and so the findings cannot be generalized to other nursing journals or to nursing research in general. However, it is possible that other nursing journals are also publishing research articles based on the misrepresentation of random sampling. The review is also limited because in several of the articles the sampling method was not completely clearly stated, and in this circumstance a judgment has been made as to the sampling method employed, based on the indications given by author(s). Quantitative researchers in nursing should be very careful that the statistical techniques they use are appropriate for the design and sampling methods of their studies. If the techniques they employ are not appropriate, they run the risk of misinterpreting findings by using inappropriate, unrepresentative and biased samples.
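    A minimal sketch of a two-sample randomization (permutation) test of the kind proposed as an alternative is shown below; the data and function names are illustrative and not taken from the reviewed papers.

```python
import random

def randomization_test(group_a, group_b, n_perm=10000, seed=0):
    """Two-sample randomization (permutation) test on the difference in means.
    The p-value relies on random re-allocation of the observed values,
    not on the assumption of random sampling from a population."""
    rng = random.Random(seed)
    observed = abs(sum(group_a) / len(group_a) - sum(group_b) / len(group_b))
    pooled = list(group_a) + list(group_b)
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        a, b = pooled[:len(group_a)], pooled[len(group_a):]
        if abs(sum(a) / len(a) - sum(b) / len(b)) >= observed:
            count += 1
    return count / n_perm

# Example with a convenience sample split into two groups
print(randomization_test([4.1, 3.8, 5.0, 4.6], [3.2, 3.5, 3.9, 3.1]))
```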

  8. Operability test procedure for PFP wastewater sampling facility

    International Nuclear Information System (INIS)

    Hirzel, D.R.

    1995-01-01

    Document provides instructions for performing the Operability Test of the 225-WC Wastewater Sampling Station which monitors the discharge to the Treated Effluent Disposal Facility from the Plutonium Finishing Plant. This Operability Test Procedure (OTP) has been prepared to verify correct configuration and performance of the PFP Wastewater sampling system installed in Building 225-WC located outside the perimeter fence southeast of the Plutonium Finishing Plant (PFP). The objective of this test is to ensure the equipment in the sampling facility operates in a safe and reliable manner. The sampler consists of two Manning Model S-5000 units which are rate controlled by the Milltronics Ultrasonic flowmeter at manhole No.C4 and from a pH measuring system with the sensor in the stream adjacent to the sample point. The intent of the dual sampling system is to utilize one unit to sample continuously at a rate proportional to the wastewater flow rate so that the aggregate tests are related to the overall flow and thereby eliminate isolated analyses. The second unit will only operate during a high or low pH excursion of the stream (hence the need for a pH control). The major items in this OTP include testing of the Manning Sampler System and associated equipment including the pH measuring and control system, the conductivity monitor, and the flow meter

  9. Generalized procedures for determining inspection sample sizes (related to quantitative measurements). Vol. 1: Detailed explanations

    International Nuclear Information System (INIS)

    Jaech, J.L.; Lemaire, R.J.

    1986-11-01

    Generalized procedures have been developed to determine sample sizes in connection with the planning of inspection activities. These procedures are based on different measurement methods. They are applied mainly to Bulk Handling Facilities and Physical Inventory Verifications. The present report attempts (i) to assign to appropriate statistical testers (viz. testers for gross, partial and small defects) the measurement methods to be used, and (ii) to associate the measurement uncertainties with the sample sizes required for verification. Working papers are also provided to assist in the application of the procedures. This volume contains the detailed explanations concerning the above mentioned procedures
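    As background, a textbook attribute-sampling formula often used for this kind of planning is sketched below. It is a generic illustration of how a detection goal translates into a sample size, not the specific procedures documented in the report.

```python
import math

def attribute_sample_size(population, defects, beta):
    """Sample size such that at least one of `defects` defective items is found
    with probability 1 - beta, using the standard approximation
    n ≈ N * (1 - beta**(1/D)).  A textbook attribute-sampling formula."""
    return math.ceil(population * (1.0 - beta ** (1.0 / defects)))

# Example: 200 items, 5 assumed defective items, 95% detection probability
print(attribute_sample_size(200, 5, 0.05))   # ≈ 91 items to verify
```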

  10. Considerations for sampling nuclear materials for SNM accounting measurements

    International Nuclear Information System (INIS)

    Brouns, R.J.; Roberts, F.P.; Upson, U.L.

    1978-01-01

    This report presents principles and guidelines for sampling nuclear materials to measure chemical and isotopic content of the material. Development of sampling plans and procedures that maintain the random and systematic errors of sampling within acceptable limits for SNM accounting purposes are emphasized

  11. Smoothing the redshift distributions of random samples for the baryon acoustic oscillations: applications to the SDSS-III BOSS DR12 and QPM mock samples

    Science.gov (United States)

    Wang, Shao-Jiang; Guo, Qi; Cai, Rong-Gen

    2017-12-01

    We investigate the impact of different redshift distributions of random samples on the baryon acoustic oscillation (BAO) measurements of D_V(z) r_d^fid/r_d from the two-point correlation functions of galaxies in Data Release 12 of the Baryon Oscillation Spectroscopic Survey (BOSS). Big surveys, such as BOSS, usually assign redshifts to the random samples by randomly drawing values from the measured redshift distributions of the data, which necessarily introduces fiducial signals of fluctuations into the random samples and weakens the BAO signals if the cosmic variance cannot be ignored. We propose a smooth function of the redshift distribution that fits the data well to populate the random galaxy samples. The resulting cosmological parameters match the input parameters of the mock catalogue very well. The significance of the BAO signals has been improved by 0.33σ for a low-redshift sample and by 0.03σ for a constant-stellar-mass sample, though the absolute values do not change significantly. Given the precision of current cosmological parameter measurements, this improvement will be valuable for future measurements of galaxy clustering.
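    The basic idea, fitting a smooth function to the data n(z) and drawing random redshifts from it instead of resampling the data values directly, can be sketched as follows. This is a generic illustration, not the authors' pipeline; all parameter choices are assumptions.

```python
import numpy as np

def smooth_nz_randoms(z_data, n_random, order=5, seed=0):
    """Assign redshifts to a random catalogue by sampling from a smooth polynomial
    fit to the data n(z) rather than resampling the data redshifts directly."""
    rng = np.random.default_rng(seed)
    hist, edges = np.histogram(z_data, bins=50, density=True)
    centres = 0.5 * (edges[:-1] + edges[1:])
    coeff = np.polyfit(centres, hist, order)          # smooth fit to n(z)
    nz = np.clip(np.polyval(coeff, centres), 0, None)
    cdf = np.cumsum(nz)
    cdf /= cdf[-1]                                    # CDF built from the smooth n(z)
    return np.interp(rng.random(n_random), cdf, centres)

# Example: a mock low-redshift sample
z_data = np.random.default_rng(1).normal(0.32, 0.05, 20000)
z_random = smooth_nz_randoms(z_data, 100000)
```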

  12. Toward a Principled Sampling Theory for Quasi-Orders.

    Science.gov (United States)

    Ünlü, Ali; Schrepp, Martin

    2016-01-01

    Quasi-orders, that is, reflexive and transitive binary relations, have numerous applications. In educational theories, the dependencies of mastery among the problems of a test can be modeled by quasi-orders. Methods such as item tree or Boolean analysis that mine for quasi-orders in empirical data are sensitive to the underlying quasi-order structure. These data mining techniques have to be compared based on extensive simulation studies, with unbiased samples of randomly generated quasi-orders at their basis. In this paper, we develop techniques that can provide the required quasi-order samples. We introduce a discrete doubly inductive procedure for incrementally constructing the set of all quasi-orders on a finite item set. A randomization of this deterministic procedure allows us to generate representative samples of random quasi-orders. With an outer level inductive algorithm, we consider the uniform random extensions of the trace quasi-orders to higher dimension. This is combined with an inner level inductive algorithm to correct the extensions that violate the transitivity property. The inner level correction step entails sampling biases. We propose three algorithms for bias correction and investigate them in simulation. It is evident that, on even up to 50 items, the new algorithms create close to representative quasi-order samples within acceptable computing time. Hence, the principled approach is a significant improvement to existing methods that are used to draw quasi-orders uniformly at random but cannot cope with reasonably large item sets.
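    For concreteness, the sketch below generates a random quasi-order in the naive way, by taking the reflexive-transitive closure of a random binary relation. This is not the paper's doubly inductive algorithm and it does not produce uniform or representative samples; it only illustrates the object being sampled.

```python
import random

def random_quasi_order(n_items, edge_prob=0.2, seed=0):
    """Naive generation of a random quasi-order: draw a random binary relation,
    then take its reflexive-transitive closure (Warshall's algorithm).
    Illustrative only; the resulting distribution over quasi-orders is biased."""
    rng = random.Random(seed)
    rel = [[i == j or rng.random() < edge_prob for j in range(n_items)]
           for i in range(n_items)]
    for k in range(n_items):                       # transitive closure
        for i in range(n_items):
            for j in range(n_items):
                rel[i][j] = rel[i][j] or (rel[i][k] and rel[k][j])
    return rel

print(random_quasi_order(5))
```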

  13. Toward a Principled Sampling Theory for Quasi-Orders

    Science.gov (United States)

    Ünlü, Ali; Schrepp, Martin

    2016-01-01

    Quasi-orders, that is, reflexive and transitive binary relations, have numerous applications. In educational theories, the dependencies of mastery among the problems of a test can be modeled by quasi-orders. Methods such as item tree or Boolean analysis that mine for quasi-orders in empirical data are sensitive to the underlying quasi-order structure. These data mining techniques have to be compared based on extensive simulation studies, with unbiased samples of randomly generated quasi-orders at their basis. In this paper, we develop techniques that can provide the required quasi-order samples. We introduce a discrete doubly inductive procedure for incrementally constructing the set of all quasi-orders on a finite item set. A randomization of this deterministic procedure allows us to generate representative samples of random quasi-orders. With an outer level inductive algorithm, we consider the uniform random extensions of the trace quasi-orders to higher dimension. This is combined with an inner level inductive algorithm to correct the extensions that violate the transitivity property. The inner level correction step entails sampling biases. We propose three algorithms for bias correction and investigate them in simulation. It is evident that, on even up to 50 items, the new algorithms create close to representative quasi-order samples within acceptable computing time. Hence, the principled approach is a significant improvement to existing methods that are used to draw quasi-orders uniformly at random but cannot cope with reasonably large item sets. PMID:27965601

  14. Continuous quality control of the blood sampling procedure using a structured observation scheme

    DEFF Research Database (Denmark)

    Seemann, T. L.; Nybo, M.

    2015-01-01

    Background: An important preanalytical factor is the blood sampling procedure and its adherence to the guidelines, i.e. CLSI and ISO 15189, in order to ensure a consistent quality of the blood collection. Therefore, it is critically important to introduce quality control on this part of the process. … As suggested by the EFLM working group on the preanalytical phase, we introduced continuous quality control of the blood sampling procedure using a structured observation scheme to monitor the quality of blood sampling performed on an everyday basis. Materials and methods: Based on our own routines, the EFLM … Conclusion: It is possible to establish continuous quality control of blood sampling. It has been well accepted by the staff, and we have already been able to identify critical areas in the sampling process. We find that continuous auditing increases focus on the quality of blood collection, which ensures …

  15. Hartmann's Procedure or Primary Anastomosis for Generalized Peritonitis due to Perforated Diverticulitis: A Prospective Multicenter Randomized Trial (DIVERTI).

    Science.gov (United States)

    Bridoux, Valerie; Regimbeau, Jean Marc; Ouaissi, Mehdi; Mathonnet, Muriel; Mauvais, Francois; Houivet, Estelle; Schwarz, Lilian; Mege, Diane; Sielezneff, Igor; Sabbagh, Charles; Tuech, Jean-Jacques

    2017-12-01

    About 25% of patients with acute diverticulitis require emergency intervention. Currently, most patients with diverticular peritonitis undergo a Hartmann's procedure. Our objective was to assess whether primary anastomosis (PA) with a diverting stoma results in lower mortality rates than Hartmann's procedure (HP) in patients with diverticular peritonitis. We conducted a multicenter randomized controlled trial conducted between June 2008 and May 2012: the DIVERTI (Primary vs Secondary Anastomosis for Hinchey Stage III-IV Diverticulitis) trial. Follow-up duration was up to 18 months. A random sample of 102 eligible participants with purulent or fecal diverticular peritonitis from tertiary care referral centers and associated centers in France were equally randomized to either a PA arm or to an HP arm. Data were analyzed on an intention-to-treat basis. The primary end point was mortality rate at 18 months. Secondary outcomes were postoperative complications, operative time, length of hospital stay, rate of definitive stoma, and morbidity. All 102 patients enrolled were comparable for age (p = 0.4453), sex (p = 0.2347), Hinchey stage III vs IV (p = 0.2347), and Mannheim Peritonitis Index (p = 0.0606). Overall mortality did not differ significantly between HP (7.7%) and PA (4%) (p = 0.4233). Morbidity for both resection and stoma reversal operations were comparable (39% in the HP arm vs 44% in the PA arm; p = 0.4233). At 18 months, 96% of PA patients and 65% of HP patients had a stoma reversal (p = 0.0001). Although mortality was similar in both arms, the rate of stoma reversal was significantly higher in the PA arm. This trial provides additional evidence in favor of PA with diverting ileostomy over HP in patients with diverticular peritonitis. ClinicalTrials.gov Identifier: NCT 00692393. Copyright © 2017. Published by Elsevier Inc.

  16. TVT-Exact and midurethral sling (SLING-IUFT) operative procedures: a randomized study.

    Science.gov (United States)

    Aniuliene, Rosita; Aniulis, Povilas; Skaudickas, Darijus

    2015-01-01

    The aim of the study was to compare the results, effectiveness and complications of TVT-Exact and midurethral sling (SLING-IUFT) operations in the treatment of female stress urinary incontinence (SUI). A single-center, nonblind, randomized study of women with SUI who were randomized to TVT-Exact or SLING-IUFT was performed by one surgeon from April 2009 to April 2011. SUI was diagnosed on coughing and Valsalva testing, and urodynamics (cystometry and uroflowmetry) were assessed before operation and 1 year after surgery. This was a prospective randomized study with a follow-up period of 12 months. 76 patients were operated on using the TVT-Exact procedure and 78 patients using the SLING-IUFT procedure. There were no statistically significant differences between groups in BMI, parity, menopausal status or prolapse stage (no patient had cystocele greater than stage II). Mean operative time was significantly shorter in the SLING-IUFT group (19 ± 5.6 min) than in the TVT-Exact group (27 ± 7.1 min). There were statistically significant differences in the effectiveness of the two procedures after one year: TVT-Exact 94.5% and SLING-IUFT 61.2%. Hospital stay was statistically significantly shorter in the SLING-IUFT group (1.2 ± 0.5 days) than in the TVT-Exact group (3.5 ± 1.5 days). Statistically significantly fewer complications occurred in the SLING-IUFT group. The TVT-Exact and SLING-IUFT operations are both effective for the surgical treatment of female stress urinary incontinence. The SLING-IUFT involved a shorter operation time and a lower complication rate; the TVT-Exact procedure had statistically significantly more complications than the SLING-IUFT operation, but higher effectiveness.

  17. Standard operating procedure for combustion of 14C samples with the OX-500 biological material oxidizer

    International Nuclear Information System (INIS)

    Nashriyah Mat.

    1995-01-01

    This procedure is intended for the safe operation of the OX-500 biological material oxidizer. For ease of operation, the operation flow chart (including testing the system and sample combustion) and the end-of-day maintenance flow chart were simplified. The front view, diagrams and switches are copied from the operating manual. Steps on sample preparation are also included for biotic and abiotic samples. This operating procedure is subject to future review.

  18. Influence of rice sample preparation and milling procedures on milling quality appraisals

    Science.gov (United States)

    The objective of this research was to investigate the effect of sample preparation and milling procedure on milling quality appraisals of rough rice. Samples of freshly harvested medium-grain rice (M202) with different initial moisture contents (MCs) ranging from 20.2% to 25.1% (w.b.) were used for...

  19. Effect of family presence on pain and anxiety during invasive nursing procedures in an emergency department: A randomized controlled experimental study.

    Science.gov (United States)

    İşlekdemir, Burcu; Kaya, Nurten

    2016-01-01

    Patients generally prefer to have their family present during medical or nursing interventions. Family presence is assumed to reduce anxiety, especially during painful interventions. This study employed a randomized controlled experimental design to determine the effects of family presence on pain and anxiety during invasive nursing procedures. The study population consisted of patients hospitalized in the observation unit of the internal medicine section in the emergency department of a university hospital. The sample comprised 138 patients assigned into the experimental and control groups by drawing lots. The invasive nursing procedure was carried out in the presence of family members, for members of the experimental group, and without family members, for members of the control group. Thus, the effects of family presence on pain and anxiety during the administration of an invasive nursing procedure to patients were analyzed. The results showed that members of the experimental and control groups did not differ with respect to the pain and state anxiety scores during the intervention. Family presence does not influence the participants' pain and anxiety during an invasive nursing procedure. Thus, the decision regarding family presence during such procedures should be based on patient preference. Copyright © 2015 Elsevier Ltd. All rights reserved.

  20. Prospective randomized assessment of single versus double-gloving for general surgical procedures.

    Science.gov (United States)

    Na'aya, H U; Madziga, A G; Eni, U E

    2009-01-01

    There is an increased tendency towards double-gloving by general surgeons in our practice, probably due to awareness of the risk of contamination with blood or other body fluids during surgery. The aim of the study was to compare the relative frequency of glove puncture in single-glove versus double-glove sets in general surgical procedures, and to determine whether the duration of surgery affects the perforation rate. Surgeons wore single or double gloves at their discretion for general surgical procedures. All the gloves used by the surgeons were assessed immediately after surgery for perforation. A total of 1120 gloves were tested, of which 880 were double-glove sets and 240 single-glove sets. There was no significant difference in the overall perforation rate between single- and double-glove sets (18.3% versus 20%). However, only 2.3% had perforations in both the outer and inner gloves in the double-glove group. Therefore, there was a significantly greater risk of blood-skin exposure in the single-glove sets (p < 0.01). The perforation rate was also significantly greater during procedures lasting an hour or more compared to those lasting less than an hour (p < 0.01). Double-gloving reduces the risk of blood-skin contamination in all general surgical procedures, and especially so in procedures lasting an hour or more.

  1. Influence of Freezing and Storage Procedure on Human Urine Samples in NMR-Based Metabolomics

    Directory of Open Access Journals (Sweden)

    Burkhard Luy

    2013-04-01

    Full Text Available It is consensus in the metabolomics community that standardized protocols should be followed for sample handling, storage and analysis, as it is of utmost importance to maintain constant measurement conditions to identify subtle biological differences. The aim of this work, therefore, was to systematically investigate the influence of freezing procedures and storage temperatures and their effect on NMR spectra as a potentially disturbing aspect for NMR-based metabolomics studies. Urine samples were collected from two healthy volunteers, centrifuged and divided into aliquots. Urine aliquots were frozen either at −20 °C, on dry ice, at −80 °C or in liquid nitrogen and then stored at −20 °C, −80 °C or in liquid nitrogen vapor phase for 1–5 weeks before NMR analysis. Results show spectral changes depending on the freezing procedure, with samples frozen on dry ice showing the largest deviations. The effect was found to be based on pH differences, which were caused by variations in CO2 concentrations introduced by the freezing procedure. Thus, we recommend that urine samples should be frozen at −20 °C and transferred to lower storage temperatures within one week and that freezing procedures should be part of the publication protocol.

  2. Influence of Freezing and Storage Procedure on Human Urine Samples in NMR-Based Metabolomics.

    Science.gov (United States)

    Rist, Manuela J; Muhle-Goll, Claudia; Görling, Benjamin; Bub, Achim; Heissler, Stefan; Watzl, Bernhard; Luy, Burkhard

    2013-04-09

    It is consensus in the metabolomics community that standardized protocols should be followed for sample handling, storage and analysis, as it is of utmost importance to maintain constant measurement conditions to identify subtle biological differences. The aim of this work, therefore, was to systematically investigate the influence of freezing procedures and storage temperatures and their effect on NMR spectra as a potentially disturbing aspect for NMR-based metabolomics studies. Urine samples were collected from two healthy volunteers, centrifuged and divided into aliquots. Urine aliquots were frozen either at -20 °C, on dry ice, at -80 °C or in liquid nitrogen and then stored at -20 °C, -80 °C or in liquid nitrogen vapor phase for 1-5 weeks before NMR analysis. Results show spectral changes depending on the freezing procedure, with samples frozen on dry ice showing the largest deviations. The effect was found to be based on pH differences, which were caused by variations in CO2 concentrations introduced by the freezing procedure. Thus, we recommend that urine samples should be frozen at -20 °C and transferred to lower storage temperatures within one week and that freezing procedures should be part of the publication protocol.

  3. Randomized branch sampling to estimate fruit production in Pecan trees cv. ‘Barton’

    Directory of Open Access Journals (Sweden)

    Filemom Manoel Mokochinski

    Full Text Available ABSTRACT: Sampling techniques to quantify fruit production are still very scarce and create a gap in crop development research. This study was conducted on a rural property in the county of Cachoeira do Sul - RS to estimate the efficiency of randomized branch sampling (RBS) in quantifying the production of pecan fruit at three different ages (5, 7 and 10 years). Two selection techniques were tested: the probability proportional to diameter (PPD) and the uniform probability (UP) techniques, which were performed on nine trees, three from each age, randomly chosen. The RBS underestimated fruit production for all ages, and its main drawback was the high sampling error (125.17% - PPD and 111.04% - UP). The UP was regarded as more efficient than the PPD, though both techniques estimated similar production and similar experimental errors. In conclusion, we report that branch sampling was inaccurate for this case study, requiring new studies to produce estimates with smaller sampling error.
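
    As a rough sketch of the selection step only (not the authors' field protocol; the nested-dict tree structure and field names are hypothetical), probability-proportional-to-diameter branch selection can be implemented in Python by walking from the trunk to a terminal branch and choosing each child with probability proportional to its diameter.

        import random

        def select_branch_ppd(tree, rng=random):
            # tree node format (assumed): {"diameter": float, "children": [...]}
            # The product of the stage-wise selection probabilities is the
            # inclusion probability of the selected terminal branch; the fruit
            # count observed on it is divided by this probability to estimate
            # the whole-tree production.
            node, prob = tree, 1.0
            while node.get("children"):
                children = node["children"]
                total = sum(child["diameter"] for child in children)
                r = rng.uniform(0.0, total)
                cumulative = 0.0
                for child in children:
                    cumulative += child["diameter"]
                    if r <= cumulative:
                        node, prob = child, prob * child["diameter"] / total
                        break
            return node, prob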

  4. New procedure of selected biogenic amines determination in wine samples by HPLC

    Energy Technology Data Exchange (ETDEWEB)

    Piasta, Anna M.; Jastrzębska, Aneta, E-mail: aj@chem.uni.torun.pl; Krzemiński, Marek P.; Muzioł, Tadeusz M.; Szłyk, Edward

    2014-06-27

    Highlights: • We proposed a new procedure for derivatization of biogenic amines. • The NMR and XRD analyses confirmed the purity and uniqueness of the derivatives. • Concentrations of biogenic amines in wine samples were analyzed by RP-HPLC. • Sample contamination and derivatization reaction interferences were minimized. - Abstract: A new procedure for the determination of biogenic amines (BA): histamine, phenethylamine, tyramine and tryptamine, based on the derivatization reaction with 2-chloro-1,3-dinitro-5-(trifluoromethyl)-benzene (CNBF), is proposed. The amine derivatives with CNBF were isolated and characterized by X-ray crystallography and ¹H, ¹³C, ¹⁹F NMR spectroscopy in solution. The novelty of the procedure is based on the pure and well-characterized products of the amine derivatization reaction. The method was applied for the simultaneous analysis of the above-mentioned biogenic amines in wine samples by reversed-phase high-performance liquid chromatography. The procedure revealed correlation coefficients (R²) between 0.9997 and 0.9999, and linear ranges of 0.10–9.00 mg L⁻¹ (histamine), 0.10–9.36 mg L⁻¹ (tyramine), 0.09–8.64 mg L⁻¹ (tryptamine) and 0.10–8.64 mg L⁻¹ (phenethylamine), whereas accuracy was 97%–102% (recovery test). The detection limit of the biogenic amines in wine samples was 0.02–0.03 mg L⁻¹, whereas the quantification limit ranged from 0.05 to 0.10 mg L⁻¹. The variation coefficients for the analyzed amines ranged between 0.49% and 3.92%. The obtained BA derivatives enhanced separation of the analytes on chromatograms due to the inhibition of the hydrolysis reaction and the reduction of by-product formation.

  5. Sampling from the normal and exponential distributions

    International Nuclear Information System (INIS)

    Chaplin, K.R.; Wills, C.A.

    1982-01-01

    Methods for generating random numbers from the normal and exponential distributions are described. These involve dividing each function into subregions, and for each of these developing a method of sampling usually based on an acceptance-rejection technique. When sampling from the normal or exponential distribution, each subregion provides the required random value with probability equal to the ratio of its area to the total area. Procedures written in FORTRAN for the CYBER 175/CDC 6600 system are provided to implement the two algorithms.
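
    The report's FORTRAN routines are not reproduced here; the following Python sketch illustrates the same two tasks in miniature: inversion sampling for the exponential distribution and a simple acceptance-rejection scheme (an Exp(1) envelope over the half-normal, followed by a random sign) for the normal distribution.

        import math
        import random

        def sample_exponential(rate, rng=random):
            # Inversion: if U ~ Uniform(0, 1), then -ln(1 - U) / rate ~ Exp(rate).
            return -math.log(1.0 - rng.random()) / rate

        def sample_standard_normal(rng=random):
            # Acceptance-rejection: propose X ~ Exp(1) and accept with
            # probability exp(-(X - 1)^2 / 2); attach a random sign at the end.
            while True:
                x = -math.log(1.0 - rng.random())
                if rng.random() <= math.exp(-0.5 * (x - 1.0) ** 2):
                    return x if rng.random() < 0.5 else -x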

  6. Synthetic Multiple-Imputation Procedure for Multistage Complex Samples

    Directory of Open Access Journals (Sweden)

    Zhou Hanzhi

    2016-03-01

    Full Text Available Multiple imputation (MI) is commonly used when item-level missing data are present. However, MI requires that survey design information be built into the imputation models. For multistage stratified clustered designs, this requires dummy variables to represent strata as well as primary sampling units (PSUs) nested within each stratum in the imputation model. Such a modeling strategy is not only operationally burdensome but also inferentially inefficient when there are many strata in the sample design. Complexity only increases when sampling weights need to be modeled. This article develops a general-purpose analytic strategy for population inference from complex sample designs with item-level missingness. In a simulation study, the proposed procedures demonstrate efficient estimation and good coverage properties. We also consider an application to accommodate missing body mass index (BMI) data in the analysis of BMI percentiles using National Health and Nutrition Examination Survey (NHANES III) data. We argue that the proposed methods offer an easy-to-implement solution to problems that are not well-handled by current MI techniques. Note that, while the proposed method borrows from the MI framework to develop its inferential methods, it is not designed as an alternative strategy to release multiply imputed datasets for complex sample design data, but rather as an analytic strategy in and of itself.

  7. Use of Matrix Sampling Procedures to Assess Achievement in Solving Open Addition and Subtraction Sentences.

    Science.gov (United States)

    Montague, Margariete A.

    This study investigated the feasibility of concurrently and randomly sampling examinees and items in order to estimate group achievement. Seven 32-item tests reflecting a 640-item universe of simple open sentences were used such that item selection (random, systematic) and assignment (random, systematic) of items (four, eight, sixteen) to forms…

  8. 40 CFR 761.308 - Sample selection by random number generation on any two-dimensional square grid.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 30 2010-07-01 2010-07-01 false Sample selection by random number... § 761.79(b)(3) § 761.308 Sample selection by random number generation on any two-dimensional square... area created in accordance with paragraph (a) of this section, select two random numbers: one each for...
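
    A hedged sketch of the general idea only (parameter names are illustrative, not the regulation's wording): two random numbers, one for each axis, select a sampling location on the square grid laid over the area.

        import random

        def grid_sample_point(side_length, grid_spacing, rng=random):
            # Number of grid cells along each axis of the square area.
            n_cells = int(side_length // grid_spacing)
            ix = rng.randrange(n_cells)   # random number for the x axis
            iy = rng.randrange(n_cells)   # random number for the y axis
            return ix * grid_spacing, iy * grid_spacing

        point = grid_sample_point(side_length=100.0, grid_spacing=5.0)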

  9. A single baseline ultrasound assessment of fibroid presence and size is strongly predictive of future uterine procedure: 8-year follow-up of randomly sampled premenopausal women aged 35-49 years.

    Science.gov (United States)

    Baird, D D; Saldana, T M; Shore, D L; Hill, M C; Schectman, J M

    2015-12-01

    How well can a single baseline ultrasound assessment of fibroid burden (presence or absence of fibroids and size of largest, if present) predict future probability of having a major uterine procedure? During an 8-year follow-up period, the risk of having a major uterine procedure was 2% for those without fibroids and increased with fibroid size for those with fibroids, reaching 47% for those with fibroids ≥ 4 cm in diameter at baseline. Uterine fibroids are a leading indication for hysterectomy. However, when fibroids are found, there are few available data to help clinicians advise patients about disease progression. Women who were 35-49 years old were randomly selected from the membership of a large urban health plan; 80% of those determined to be eligible were enrolled and screened with ultrasound for fibroids ≥ 0.5 cm in diameter. African-American and white premenopausal participants who responded to at least one follow-up interview (N = 964, 85% of those eligible) constituted the study cohort. During follow-up (5822 person-years), participants self-reported any major uterine procedure (67% hysterectomies). Life-table analyses and Cox regression (with censoring for menopause) were used to estimate the risk of having a uterine procedure for women with no fibroids, small (women, importance of a clinical diagnosis of fibroids prior to study enrollment, and the impact of submucosal fibroids on risk were investigated. There was a greater loss to follow-up for African-Americans than whites (19 versus 11%). For those with follow-up data, 64% had fibroids at baseline, 33% of whom had had a prior diagnosis. Of those with fibroids, 27% had small fibroids (women during follow-up. The estimated risk of having a procedure in any given year of follow-up for those with fibroids compared with those without fibroids increased markedly with fibroid-size category (from 4-fold, confidence interval (CI) (1.4-11.1) for the small fibroids to 10-fold, CI (4.4-24.8) for the medium

  10. Optimization by GRASP greedy randomized adaptive search procedures

    CERN Document Server

    Resende, Mauricio G C

    2016-01-01

    This is the first book to cover GRASP (Greedy Randomized Adaptive Search Procedures), a metaheuristic that has enjoyed wide success in practice with a broad range of applications to real-world combinatorial optimization problems. The state-of-the-art coverage and carefully crafted pedagogical style make this book highly accessible as an introductory text not only to GRASP, but also to combinatorial optimization, greedy algorithms, local search, and path-relinking, as well as to heuristics and metaheuristics in general. The focus is on algorithmic and computational aspects of applied optimization with GRASP, with emphasis given to the end-user, providing sufficient information on the broad spectrum of advances in applied optimization with GRASP. For the more advanced reader, chapters on hybridization with path-relinking and parallel and continuous GRASP present these topics in a clear and concise fashion. Additionally, the book offers a very complete annotated bibliography of GRASP and combinatorial optimizat...
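
    As a minimal sketch of the metaheuristic the book covers (not code from the book), each GRASP iteration builds a solution by a greedy randomized construction, typically drawing each element from a restricted candidate list, and then improves it by local search; the callables below are placeholders to be supplied for a concrete combinatorial problem.

        import random

        def rcl_choice(candidates, greedy_value, alpha, rng=random):
            # Restricted candidate list: keep the candidates whose greedy value
            # lies within a fraction alpha of the best, then pick one at random.
            scored = [(greedy_value(c), c) for c in candidates]
            best = min(v for v, _ in scored)
            worst = max(v for v, _ in scored)
            threshold = best + alpha * (worst - best)
            restricted = [c for v, c in scored if v <= threshold]
            return rng.choice(restricted)

        def grasp(construct, local_search, cost, iterations=100, rng=random):
            # Repeat greedy randomized construction followed by local search,
            # keeping the best (lowest-cost) solution found over all iterations.
            best, best_cost = None, float("inf")
            for _ in range(iterations):
                solution = local_search(construct(rng))
                c = cost(solution)
                if c < best_cost:
                    best, best_cost = solution, c
            return best, best_cost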

  11. Missing citations due to exact reference matching: Analysis of a random sample from WoS. Are publications from peripheral countries disadvantaged?

    Energy Technology Data Exchange (ETDEWEB)

    Donner, P.

    2016-07-01

    Citation counts of scientific research contributions are fundamental data in scientometrics. Accuracy and completeness of citation links are therefore crucial data quality issues (Moed, 2005, Ch. 13). However, despite the known flaws of reference matching algorithms, usually no attempts are made to incorporate uncertainty about citation counts into indicators. This study is a step towards that goal. Particular attention is paid to the question whether publications from countries not using basic Latin script are differently affected by missed citations. The proprietary reference matching procedure of Web of Science (WoS) is based on (near) exact agreement of cited reference data (normalized during processing) with the target paper's bibliographic data. Consequently, the procedure has near-optimal precision but incomplete recall - it is known to miss some slightly inaccurate reference links (Olensky, 2015). However, there has been no attempt so far to estimate the rate of missed citations by a principled method for a random sample. For this study a simple random sample of WoS source papers was drawn and an attempt was made to find all reference strings of WoS indexed documents that refer to them, in particular inexact matches. The objective is to give a statistical estimate of the proportion of missed citations and to describe the relationship of the number of found citations to the number of missed citations, i.e. the conditional error distribution. The empirical error distribution is statistically analyzed and modelled. (Author)

  12. Monitoring oil persistence on beaches : SCAT versus stratified random sampling designs

    International Nuclear Information System (INIS)

    Short, J.W.; Lindeberg, M.R.; Harris, P.M.; Maselko, J.M.; Pella, J.J.; Rice, S.D.

    2003-01-01

    In the event of a coastal oil spill, shoreline clean-up assessment teams (SCAT) commonly rely on visual inspection of the entire affected area to monitor the persistence of the oil on beaches. Occasionally, pits are excavated to evaluate the persistence of subsurface oil. This approach is practical for directing clean-up efforts directly following a spill. However, sampling of the 1989 Exxon Valdez oil spill in Prince William Sound 12 years later has shown that visual inspection combined with pit excavation does not offer estimates of contaminated beach area or stranded oil volumes. This information is needed to statistically evaluate the significance of change with time. Assumptions regarding the correlation of visually-evident surface oil and cryptic subsurface oil are usually not evaluated as part of the SCAT mandate. Stratified random sampling can avoid such problems and could produce precise estimates of oiled area and volume that allow for statistical assessment of major temporal trends and the extent of the impact. The 2001 sampling of the shoreline of Prince William Sound showed that 15 per cent of surface oil occurrences were associated with subsurface oil. This study demonstrates the usefulness of the stratified random sampling method and shows how sampling design parameters impact statistical outcome. Power analysis based on the study results indicates that optimum power is derived when unnecessary stratification is avoided. It was emphasized that sampling effort should be balanced between choosing sufficient beaches for sampling and the intensity of sampling.

  13. Pretreatment procedures applied to samples to be analysed by neutron activation analysis at CDTN/CNEN

    International Nuclear Information System (INIS)

    Francisco, Dovenir; Menezes, Maria Angela de Barros Correia

    2009-01-01

    The neutron activation technique - using several methods - has been applied to 80% of the analytical demand of the Division for Reactor and Analytical Techniques at CDTN/CNEN, Belo Horizonte, Minas Gerais. This scenario emphasizes the responsibility of the Laboratory to provide and assure the quality of the measurements. The first step in assuring the quality of results is the preparation of the samples. Therefore, this paper describes the experimental procedures adopted at CDTN/CNEN in order to standardize the conditions of analysis and to avoid contamination by ubiquitous elements. Some of the procedures are based on methods described in the literature; others are based on many years of experience preparing samples from many kinds of matrices. The procedures described are related to geological materials (soil, sediment, rock, gems, clay, archaeological ceramics and ore), biological materials (hair, fish, plants, food), water, etc. Analytical results for sediment samples are shown as an example, pointing out the efficiency of the experimental procedure. (author)

  14. Sample preparation procedure for PIXE elemental analysis on soft tissues

    International Nuclear Information System (INIS)

    Kubica, B.; Kwiatek, W.M.; Dutkiewicz, E.M.; Lekka, M.

    1997-01-01

    Trace element analysis is one of the most important fields in analytical chemistry. There are several instrumental techniques which are applied for the determination of microscopic elemental content. The PIXE (Proton Induced X-ray Emission) technique is one of the nuclear techniques commonly applied for this purpose due to its multielemental analysis capability. The aim of this study was to establish the optimal conditions for the target preparation procedure. In this paper two different approaches to the topic are presented and discussed in detail. The first approach was the traditional pellet technique and the second was a mineralization procedure. For the analysis, soft tissue such as liver was used. Some results are also presented for water samples. (author)

  15. Are most samples of animals systematically biased? Consistent individual trait differences bias samples despite random sampling.

    Science.gov (United States)

    Biro, Peter A

    2013-02-01

    Sampling animals from the wild for study is something nearly every biologist has done, but despite our best efforts to obtain random samples of animals, 'hidden' trait biases may still exist. For example, consistent behavioral traits can affect trappability/catchability, independent of obvious factors such as size and gender, and these traits are often correlated with other repeatable physiological and/or life history traits. If so, systematic sampling bias may exist for any of these traits. The extent to which this is a problem, of course, depends on the magnitude of bias, which is presently unknown because the underlying trait distributions in populations are usually unknown, or unknowable. Indeed, our present knowledge about sampling bias comes from samples (not complete population censuses), which can possess bias to begin with. I had the unique opportunity to create naturalized populations of fish by seeding each of four small fishless lakes with equal densities of slow-, intermediate-, and fast-growing fish. Using sampling methods that are not size-selective, I observed that fast-growing fish were up to two-times more likely to be sampled than slower-growing fish. This indicates substantial and systematic bias with respect to an important life history trait (growth rate). If correlations between behavioral, physiological and life-history traits are as widespread as the literature suggests, then many animal samples may be systematically biased with respect to these traits (e.g., when collecting animals for laboratory use), and affect our inferences about population structure and abundance. I conclude with a discussion on ways to minimize sampling bias for particular physiological/behavioral/life-history types within animal populations.

  16. A simple and efficient alternative to implementing systematic random sampling in stereological designs without a motorized microscope stage.

    Science.gov (United States)

    Melvin, Neal R; Poda, Daniel; Sutherland, Robert J

    2007-10-01

    When properly applied, stereology is a very robust and efficient method to quantify a variety of parameters from biological material. A common sampling strategy in stereology is systematic random sampling, which involves choosing a random sampling start point outside the structure of interest, and sampling relevant objects at sites that are placed at pre-determined, equidistant intervals. This has proven to be a very efficient sampling strategy, and is used widely in stereological designs. At the microscopic level, this is most often achieved through the use of a motorized stage that facilitates the systematic random stepping across the structure of interest. Here, we report a simple, precise and cost-effective software-based alternative to accomplishing systematic random sampling under the microscope. We believe that this approach will facilitate the use of stereological designs that employ systematic random sampling in laboratories that lack the resources to acquire costly, fully automated systems.
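
    The sampling rule itself is simple to script even without the software described here: a single random start inside the first interval, then equidistant sites. A one-axis sketch in Python, applied once per axis for a two-dimensional stage pattern:

        import random

        def systematic_random_positions(extent, n_sites, rng=random):
            # One random start in [0, step), then steps of equal length.
            step = extent / n_sites
            start = rng.uniform(0.0, step)
            return [start + i * step for i in range(n_sites)]

        x_positions = systematic_random_positions(extent=5000.0, n_sites=10)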

  17. Examination of the Triarchic Assessment Procedure for Inconsistent Responding in six non-English language samples.

    Science.gov (United States)

    Kelley, Shannon E; van Dongen, Josanne D M; Donnellan, M Brent; Edens, John F; Eisenbarth, Hedwig; Fossati, Andrea; Howner, Katarina; Somma, Antonella; Sörman, Karolina

    2018-05-01

    The Triarchic Assessment Procedure for Inconsistent Responding (TAPIR; Mowle et al., 2016) was recently developed to identify inattentiveness or comprehension difficulties that may compromise the validity of responses on the Triarchic Psychopathy Measure (TriPM; Patrick, 2010). The TAPIR initially was constructed and cross-validated using exclusively English-speaking participants from the United States; however, research using the TriPM has been increasingly conducted internationally, with numerous foreign language translations of the measure emerging. The present study examined the cross-language utility of the TAPIR in German, Dutch, Swedish, and Italian translations of the TriPM using 6 archival samples of community members, university students, forensic psychiatric inpatients, forensic detainees, and adolescents residing outside the United States (combined N = 5,404). Findings suggest that the TAPIR effectively detects careless responding across these 4 translated versions of the TriPM without the need for language-specific modifications. The TAPIR total score meaningfully discriminated genuine participant responses from both fully and partially randomly generated data in every sample, and demonstrated further utility in detecting fixed "all true" or "all false" response patterns. In addition, TAPIR scores were reliably associated with inconsistent responding scores from another psychopathy inventory. Specificity for a range of tentative cut scores for assessing profile validity was modestly reduced among our samples relative to rates previously obtained with the English version of the TriPM; however, overall the TAPIR appears to demonstrate satisfactory cross-language generalizability. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  18. A random sampling approach for robust estimation of tissue-to-plasma ratio from extremely sparse data.

    Science.gov (United States)

    Chu, Hui-May; Ette, Ene I

    2005-09-02

    This study was performed to develop a new nonparametric approach for the estimation of a robust tissue-to-plasma ratio from extremely sparsely sampled paired data (i.e., one sample each from plasma and tissue per subject). The tissue-to-plasma ratio was estimated from paired/unpaired experimental data using the independent time points approach, area under the curve (AUC) values calculated with the naive data averaging approach, and AUC values calculated using sampling-based approaches (e.g., the pseudoprofile-based bootstrap [PpbB] approach and the random sampling approach [our proposed approach]). The random sampling approach involves the use of a 2-phase algorithm. The convergence of the sampling/resampling approaches was investigated, as well as the robustness of the estimates produced by different approaches. To evaluate the latter, new data sets were generated by introducing outlier(s) into the real data set. One to 2 concentration values were inflated by 10% to 40% from their original values to produce the outliers. Tissue-to-plasma ratios computed using the independent time points approach varied between 0 and 50 across time points. The ratio obtained from AUC values acquired using the naive data averaging approach was not associated with any measure of uncertainty or variability. Calculating the ratio without regard to pairing yielded poorer estimates. The random sampling and pseudoprofile-based bootstrap approaches yielded tissue-to-plasma ratios with uncertainty and variability. However, the random sampling approach, because of the 2-phase nature of its algorithm, yielded more robust estimates and required fewer replications. Therefore, a 2-phase random sampling approach is proposed for the robust estimation of tissue-to-plasma ratio from extremely sparsely sampled data.
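
    The two-phase algorithm itself is not reproduced here; as a loosely related illustration of sampling-based estimation from one-sample-per-subject data, the Python sketch below builds pseudoprofiles by drawing one subject per time point and bootstraps the tissue-to-plasma AUC ratio (all names are hypothetical).

        import random
        from collections import defaultdict

        def pseudoprofile_auc_ratios(records, n_rep=1000, rng=random):
            # records: (time, tissue_conc, plasma_conc) tuples, one per subject;
            # several subjects may share the same nominal sampling time.
            by_time = defaultdict(list)
            for t, tissue, plasma in records:
                by_time[t].append((tissue, plasma))
            times = sorted(by_time)

            def trapezoidal_auc(profile):
                return sum((t2 - t1) * (v1 + v2) / 2.0
                           for (t1, v1), (t2, v2) in zip(profile, profile[1:]))

            ratios = []
            for _ in range(n_rep):
                # One pseudoprofile: a randomly drawn subject at every time point.
                picks = [rng.choice(by_time[t]) for t in times]
                tissue_auc = trapezoidal_auc([(t, p[0]) for t, p in zip(times, picks)])
                plasma_auc = trapezoidal_auc([(t, p[1]) for t, p in zip(times, picks)])
                ratios.append(tissue_auc / plasma_auc)
            return ratios   # empirical distribution of the AUC ratio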

  19. The Alaska Commercial Fisheries Water Quality Sampling Methods and Procedures Manual

    Energy Technology Data Exchange (ETDEWEB)

    Folley, G.; Pearson, L.; Crosby, C. [Alaska Dept. of Environmental Conservation, Soldotna, AK (United States); DeCola, E.; Robertson, T. [Nuka Research and Planning Group, Seldovia, AK (United States)

    2006-07-01

    A comprehensive water quality sampling program was conducted in response to the oil spill that occurred when the M/V Selendang Ayu ship ran aground near a major fishing port at Unalaska Island, Alaska in December 2004. In particular, the sampling program focused on the threat of spilled oil to the local commercial fisheries resources. Spill scientists were unable to confidently model the movement of oil away from the wreck because of limited oceanographic data. In order to determine which fish species were at risk of oil contamination, a real-time assessment of how and where the oil was moving was needed, because the wreck became a continual source of oil release for several weeks after the initial grounding. The newly developed methods and procedures used to detect whole oil during the sampling program will be presented in the Alaska Commercial Fisheries Water Quality Sampling Methods and Procedures Manual which is currently under development. The purpose of the manual is to provide instructions to spill managers while they try to determine where spilled oil has or has not been encountered. The manual will include a meaningful data set that can be analyzed in real time to assess oil movement and concentration. Sections on oil properties and processes will be included along with scientific water quality sampling methods for whole and dissolved phase oil to assess potential contamination of commercial fishery resources and gear in Alaska waters during an oil spill. The manual will present a general discussion of factors that should be considered when designing a sampling program after a spill. In order to implement Alaska's improved seafood safety measures, the spatial scope of spilled oil must be known. A water quality sampling program can provide state and federal fishery managers and food safety inspectors with important information as they identify at-risk fisheries. 11 refs., 7 figs.

  20. Analyzing Effectiveness of Routine Pleural Drainage After Nuss Procedure: A Randomized Study.

    Science.gov (United States)

    Pawlak, Krystian; Gąsiorowski, Łukasz; Gabryel, Piotr; Smoliński, Szymon; Dyszkiewicz, Wojciech

    2017-12-01

    The routine use of postoperative pleural cavity drainage after the Nuss procedure is not widely accepted, and its limited use depends on experience. This study analyzed the influence of pleural drainage in the surgical treatment of patients with pectus excavatum on the prevention of pneumothorax and the efficacy of using drainage after a corrective operation. From November 2013 to May 2015, 103 consecutive patients with pectus excavatum, aged 11 to 39 years, underwent surgical treatment by the Nuss procedure. Patients were prospectively randomized into two groups. In 58 patients, a 28F chest tube was routinely introduced into the right pleural cavity during the procedure for 2 consecutive days (group I). In the remaining 45 patients, the drain was not inserted (group II). No statistically significant differences were found between the study groups, including sex, age, body mass index, or clinical subjective and objective factors in the preoperative evaluation. Group II manifested more complications in the early postoperative period; however, this was not statistically significant (group I vs group II; p = 0.0725). The difference in pneumothorax requiring additional chest tube placement was statistically significant (group I vs group II; p = 0.0230). Other complications were also more frequent among patients from group II, although this did not reach statistical significance. Follow-up was 22.9 ± 6.4 months. Routine drainage of the pleural cavity during the Nuss procedure significantly reduces the incidence of postoperative pneumothorax and should be considered as a routine procedure. Copyright © 2017 The Society of Thoracic Surgeons. Published by Elsevier Inc. All rights reserved.

  1. Procedures for the collection and preservation of groundwater and surface water samples and for the installation of monitoring wells

    International Nuclear Information System (INIS)

    Korte, N.; Kearl, P.

    1984-01-01

    Proper sampling procedures are essential for a successful water-quality monitoring program. It must be emphasized, however, that it is impossible to maintain absolutely in-situ conditions when collecting and preserving a water sample, whether from a flowing stream or an aquifer. Consequently, the most that can reasonably be expected is to collect the best possible sample with minimal disturbance. This document describes procedures for installing monitoring wells and for collecting samples of surface water and groundwater. The discussion of monitoring wells includes mention of multilevel sampling and a general overview of vadose-zone monitoring. Guidelines for well installation are presented in detail. The discussion of water-sample collection contains evaluations of sampling pumps, filtration equipment, and sample containers. Sample-preservation techniques, as published by several government and private sources, are reviewed. Finally, step-by-step procedures for collection of water samples are provided; these procedures address such considerations as necessary equipment, field operations, and written documentation. Separate procedures are also included for the collection of samples for the determination of sulfide and of reactive aluminum. The report concludes with a brief discussion of adverse sampling conditions that may significantly affect the quality of the data. Appendix A presents a rationale for the development and use of statistical considerations in water sampling to ensure a more complete water quality monitoring program. 51 references, 9 figures, 4 tables

  2. A recommended procedure for establishing the source level relationships between heroin case samples of unknown origins

    Directory of Open Access Journals (Sweden)

    Kar-Weng Chan

    2014-06-01

    Full Text Available A recent concern of how to reliably establish the source level relationships of heroin case samples is addressed in this paper. Twenty-two trafficking heroin case samples of unknown origins seized from two major regions (Kuala Lumpur and Penang in Malaysia were studied. A procedure containing six major steps was followed to analyze and classify these samples. Subsequently, with the aid of statistical control samples, reliability of the clustering result was assessed. The final outcome reveals that the samples seized from the two regions in 2013 had highly likely originated from two different sources. Hence, the six-step procedure is sufficient for any chemist who attempts to assess the relative source level relationships of heroin samples.

  3. Random Sampling of Correlated Parameters – a Consistent Solution for Unfavourable Conditions

    Energy Technology Data Exchange (ETDEWEB)

    Žerovnik, G., E-mail: gasper.zerovnik@ijs.si [Jožef Stefan Institute, Jamova cesta 39, SI-1000 Ljubljana (Slovenia); Trkov, A. [Jožef Stefan Institute, Jamova cesta 39, SI-1000 Ljubljana (Slovenia); International Atomic Energy Agency, PO Box 100, A-1400 Vienna (Austria); Kodeli, I.A. [Jožef Stefan Institute, Jamova cesta 39, SI-1000 Ljubljana (Slovenia); Capote, R. [International Atomic Energy Agency, PO Box 100, A-1400 Vienna (Austria); Smith, D.L. [Argonne National Laboratory, 1710 Avenida del Mundo, Coronado, CA 92118-3073 (United States)

    2015-01-15

    Two methods for random sampling according to a multivariate lognormal distribution – the correlated sampling method and the method of transformation of correlation coefficients – are briefly presented. The methods are mathematically exact and enable consistent sampling of correlated inherently positive parameters with given information on the first two distribution moments. Furthermore, a weighted sampling method to accelerate the convergence of parameters with extremely large relative uncertainties is described. However, the method is efficient only for a limited number of correlated parameters.
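
    A compact Python/NumPy sketch of the transformation-of-correlation-coefficients idea (not the authors' code): the target lognormal means, relative uncertainties and correlation matrix are mapped to the parameters of the underlying multivariate normal, which is then sampled and exponentiated. The transformed correlation matrix is assumed to remain positive definite.

        import numpy as np

        def sample_correlated_lognormal(means, rel_unc, corr, n_samples, seed=None):
            rng = np.random.default_rng(seed)
            means = np.asarray(means, dtype=float)
            cv = np.asarray(rel_unc, dtype=float)        # relative uncertainties
            sigma = np.sqrt(np.log1p(cv ** 2))           # underlying normal std dev
            mu = np.log(means) - 0.5 * sigma ** 2        # underlying normal mean
            # Convert the lognormal correlations to normal-space correlations.
            spread = np.sqrt(np.expm1(sigma ** 2))
            corr_normal = (np.log1p(np.asarray(corr) * np.outer(spread, spread))
                           / np.outer(sigma, sigma))
            cov_normal = corr_normal * np.outer(sigma, sigma)
            z = rng.multivariate_normal(mu, cov_normal, size=n_samples)
            return np.exp(z)

        draws = sample_correlated_lognormal(
            means=[1.0, 2.0], rel_unc=[0.2, 0.5],
            corr=[[1.0, 0.6], [0.6, 1.0]], n_samples=1000)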

  4. Sampling Strategies and Processing of Biobank Tissue Samples from Porcine Biomedical Models.

    Science.gov (United States)

    Blutke, Andreas; Wanke, Rüdiger

    2018-03-06

    In translational medical research, porcine models have steadily become more popular. Considering the high value of individual animals, particularly of genetically modified pig models, and the often-limited number of available animals of these models, establishment of (biobank) collections of adequately processed tissue samples suited for a broad spectrum of subsequent analyses methods, including analyses not specified at the time point of sampling, represent meaningful approaches to take full advantage of the translational value of the model. With respect to the peculiarities of porcine anatomy, comprehensive guidelines have recently been established for standardized generation of representative, high-quality samples from different porcine organs and tissues. These guidelines are essential prerequisites for the reproducibility of results and their comparability between different studies and investigators. The recording of basic data, such as organ weights and volumes, the determination of the sampling locations and of the numbers of tissue samples to be generated, as well as their orientation, size, processing and trimming directions, are relevant factors determining the generalizability and usability of the specimen for molecular, qualitative, and quantitative morphological analyses. Here, an illustrative, practical, step-by-step demonstration of the most important techniques for generation of representative, multi-purpose biobank specimen from porcine tissues is presented. The methods described here include determination of organ/tissue volumes and densities, the application of a volume-weighted systematic random sampling procedure for parenchymal organs by point-counting, determination of the extent of tissue shrinkage related to histological embedding of samples, and generation of randomly oriented samples for quantitative stereological analyses, such as isotropic uniform random (IUR) sections generated by the "Orientator" and "Isector" methods, and vertical

  5. The Bootstrap, the Jackknife, and the Randomization Test: A Sampling Taxonomy.

    Science.gov (United States)

    Rodgers, J L

    1999-10-01

    A simple sampling taxonomy is defined that shows the differences between and relationships among the bootstrap, the jackknife, and the randomization test. Each method has as its goal the creation of an empirical sampling distribution that can be used to test statistical hypotheses, estimate standard errors, and/or create confidence intervals. Distinctions between the methods can be made based on the sampling approach (with replacement versus without replacement) and the sample size (replacing the whole original sample versus replacing a subset of the original sample). The taxonomy is useful for teaching the goals and purposes of resampling schemes. An extension of the taxonomy implies other possible resampling approaches that have not previously been considered. Univariate and multivariate examples are presented.
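
    The taxonomy maps directly onto three small resampling routines (a sketch in Python): the bootstrap draws with replacement and replaces the whole sample, the jackknife draws without replacement by leaving one observation out, and the randomization test re-deals the pooled observations, without replacement, into groups of the original sizes.

        import random

        def bootstrap_resample(sample, rng=random):
            # With replacement, same size as the original sample.
            return [rng.choice(sample) for _ in sample]

        def jackknife_resample(sample, leave_out):
            # Without replacement: the original sample minus one observation.
            return sample[:leave_out] + sample[leave_out + 1:]

        def randomization_resample(group_a, group_b, rng=random):
            # Pool both groups and re-deal them, without replacement, into
            # pseudo-groups of the original sizes.
            pooled = list(group_a) + list(group_b)
            rng.shuffle(pooled)
            return pooled[:len(group_a)], pooled[len(group_a):]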

  6. Survey research with a random digit dial national mobile phone sample in Ghana: Methods and sample quality

    Science.gov (United States)

    Sefa, Eunice; Adimazoya, Edward Akolgo; Yartey, Emmanuel; Lenzi, Rachel; Tarpo, Cindy; Heward-Mills, Nii Lante; Lew, Katherine; Ampeh, Yvonne

    2018-01-01

    Introduction Generating a nationally representative sample in low and middle income countries typically requires resource-intensive household level sampling with door-to-door data collection. High mobile phone penetration rates in developing countries provide new opportunities for alternative sampling and data collection methods, but there is limited information about response rates and sample biases in coverage and nonresponse using these methods. We utilized data from an interactive voice response, random-digit dial, national mobile phone survey in Ghana to calculate standardized response rates and assess representativeness of the obtained sample. Materials and methods The survey methodology was piloted in two rounds of data collection. The final survey included 18 demographic, media exposure, and health behavior questions. Call outcomes and response rates were calculated according to the American Association of Public Opinion Research guidelines. Sample characteristics, productivity, and costs per interview were calculated. Representativeness was assessed by comparing data to the Ghana Demographic and Health Survey and the National Population and Housing Census. Results The survey was fielded during a 27-day period in February-March 2017. There were 9,469 completed interviews and 3,547 partial interviews. Response, cooperation, refusal, and contact rates were 31%, 81%, 7%, and 39% respectively. Twenty-three calls were dialed to produce an eligible contact: nonresponse was substantial due to the automated calling system and dialing of many unassigned or non-working numbers. Younger, urban, better educated, and male respondents were overrepresented in the sample. Conclusions The innovative mobile phone data collection methodology yielded a large sample in a relatively short period. Response rates were comparable to other surveys, although substantial coverage bias resulted from fewer women, rural, and older residents completing the mobile phone survey in

  7. Survey research with a random digit dial national mobile phone sample in Ghana: Methods and sample quality.

    Science.gov (United States)

    L'Engle, Kelly; Sefa, Eunice; Adimazoya, Edward Akolgo; Yartey, Emmanuel; Lenzi, Rachel; Tarpo, Cindy; Heward-Mills, Nii Lante; Lew, Katherine; Ampeh, Yvonne

    2018-01-01

    Generating a nationally representative sample in low and middle income countries typically requires resource-intensive household level sampling with door-to-door data collection. High mobile phone penetration rates in developing countries provide new opportunities for alternative sampling and data collection methods, but there is limited information about response rates and sample biases in coverage and nonresponse using these methods. We utilized data from an interactive voice response, random-digit dial, national mobile phone survey in Ghana to calculate standardized response rates and assess representativeness of the obtained sample. The survey methodology was piloted in two rounds of data collection. The final survey included 18 demographic, media exposure, and health behavior questions. Call outcomes and response rates were calculated according to the American Association of Public Opinion Research guidelines. Sample characteristics, productivity, and costs per interview were calculated. Representativeness was assessed by comparing data to the Ghana Demographic and Health Survey and the National Population and Housing Census. The survey was fielded during a 27-day period in February-March 2017. There were 9,469 completed interviews and 3,547 partial interviews. Response, cooperation, refusal, and contact rates were 31%, 81%, 7%, and 39% respectively. Twenty-three calls were dialed to produce an eligible contact: nonresponse was substantial due to the automated calling system and dialing of many unassigned or non-working numbers. Younger, urban, better educated, and male respondents were overrepresented in the sample. The innovative mobile phone data collection methodology yielded a large sample in a relatively short period. Response rates were comparable to other surveys, although substantial coverage bias resulted from fewer women, rural, and older residents completing the mobile phone survey in comparison to household surveys. Random digit dialing of mobile

  8. Survey research with a random digit dial national mobile phone sample in Ghana: Methods and sample quality.

    Directory of Open Access Journals (Sweden)

    Kelly L'Engle

    Full Text Available Generating a nationally representative sample in low and middle income countries typically requires resource-intensive household level sampling with door-to-door data collection. High mobile phone penetration rates in developing countries provide new opportunities for alternative sampling and data collection methods, but there is limited information about response rates and sample biases in coverage and nonresponse using these methods. We utilized data from an interactive voice response, random-digit dial, national mobile phone survey in Ghana to calculate standardized response rates and assess representativeness of the obtained sample.The survey methodology was piloted in two rounds of data collection. The final survey included 18 demographic, media exposure, and health behavior questions. Call outcomes and response rates were calculated according to the American Association of Public Opinion Research guidelines. Sample characteristics, productivity, and costs per interview were calculated. Representativeness was assessed by comparing data to the Ghana Demographic and Health Survey and the National Population and Housing Census.The survey was fielded during a 27-day period in February-March 2017. There were 9,469 completed interviews and 3,547 partial interviews. Response, cooperation, refusal, and contact rates were 31%, 81%, 7%, and 39% respectively. Twenty-three calls were dialed to produce an eligible contact: nonresponse was substantial due to the automated calling system and dialing of many unassigned or non-working numbers. Younger, urban, better educated, and male respondents were overrepresented in the sample.The innovative mobile phone data collection methodology yielded a large sample in a relatively short period. Response rates were comparable to other surveys, although substantial coverage bias resulted from fewer women, rural, and older residents completing the mobile phone survey in comparison to household surveys. Random digit

  9. A radiochemical procedure for the determination of Po-210 in environmental samples

    International Nuclear Information System (INIS)

    Godoy, J.M.; Schuettelkopf, H.

    1980-07-01

    A radiochemical procedure for the determination of Po-210 in environmental samples was developed. Soil, sediment, filter material, plant, water and food samples can be analyzed for Po-210. Wet ashing is achieved with HNO₃ + H₂O₂ or HCl + HNO₃. To separate disturbing substances, coprecipitation with Te is used for sample materials containing silica; for other sample materials, Po-210 can be deposited directly from HCl solution onto Ag platelets. Deposited Po-210 is counted by alpha spectrometry. For chemical yield determination Po-208 is added; yields range between 60% and 100%. A lower detection limit of about 0.002 pCi Po-210 per sample is achievable. (orig./HP)

  10. SYSTEMATIC SAMPLING FOR NON - LINEAR TREND IN MILK YIELD DATA

    OpenAIRE

    Tanuj Kumar Pandey; Vinod Kumar

    2014-01-01

    The present paper utilizes systematic sampling procedures for milk yield data exhibiting some non-linear trends. The best fitted mathematical forms of non-linear trend present in the milk yield data are obtained and the expressions of average variances of the estimators of population mean under simple random, usual systematic and modified systematic sampling procedures have been derived for populations showing non-linear trend. A comparative study is made among the three sampli...

  11. Sampling procedures and tables

    International Nuclear Information System (INIS)

    Franzkowski, R.

    1980-01-01

    Characteristics, defects, defectives - Sampling by attributes and by variables - Sample versus population - Frequency distributions for the number of defectives or the number of defects in the sample - Operating characteristic curve, producer's risk, consumer's risk - Acceptable quality level AQL - Average outgoing quality AOQ - Standard ISO 2859 - Fundamentals of sampling by variables for fraction defective. (RW)
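
    The operating characteristic curve of a single sampling plan by attributes is straightforward to compute: a lot is accepted when a sample of n items contains at most c defectives, so the acceptance probability at a true fraction defective p is a binomial tail sum. The plan parameters in the Python sketch below are illustrative, not values taken from ISO 2859.

        from math import comb

        def acceptance_probability(n, c, p):
            # P(accept) = P(at most c defectives among n sampled items).
            return sum(comb(n, k) * p ** k * (1 - p) ** (n - k)
                       for k in range(c + 1))

        # A few points of the OC curve for a hypothetical plan with n = 80, c = 2.
        oc_curve = {p: round(acceptance_probability(80, 2, p), 3)
                    for p in (0.01, 0.02, 0.05, 0.10)}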

  12. A procedure for the determination of Po-210 in water samples by alpha spectrometry

    International Nuclear Information System (INIS)

    2009-01-01

    Reliable, comparable and 'fit for purpose' results are an essential requirement for any decision based on analytical measurements. For the analyst, the availability of tested and validated analytical procedures is an extremely important tool for the production of such analytical measurements. For maximum utility, such procedures should be comprehensive, clearly formulated, and readily available to both the analyst and the customer for reference. Since 2004 the Environment programme of the IAEA has included activities aimed at the development of a set of procedures for the determination of radionuclides in terrestrial environmental samples. In the case of Po-210, this started with the collection and review of about 130 papers from the scientific literature. Based on this review, two candidate methods for the chemical separation of Po-210 from water samples were selected for testing, refinement and validation in accordance with ISO guidelines. A comprehensive methodology for the calculation of results, including quantification of measurement uncertainty, was also developed. This report presents the final procedure which was developed based on that work.

  13. Occupational position and its relation to mental distress in a random sample of Danish residents

    DEFF Research Database (Denmark)

    Rugulies, Reiner Ernst; Madsen, Ida E H; Nielsen, Maj Britt D

    2010-01-01

    PURPOSE: To analyze the distribution of depressive, anxiety, and somatization symptoms across different occupational positions in a random sample of Danish residents. METHODS: The study sample consisted of 591 Danish residents (50% women), aged 20-65, drawn from an age- and gender-stratified random sample of the Danish population. Participants filled out a survey that included the 92-item version of the Hopkins Symptom Checklist (SCL-92). We categorized occupational position into seven groups: high- and low-grade non-manual workers, skilled and unskilled manual workers, high- and low-grade self...

  14. Modified FlowCAM procedure for quantifying size distribution of zooplankton with sample recycling capacity.

    Directory of Open Access Journals (Sweden)

    Esther Wong

    Full Text Available We have developed a modified FlowCAM procedure for efficiently quantifying the size distribution of zooplankton. The modified method offers the following new features: (1) it prevents animals from settling and clogging by constant bubbling in the sample container; (2) it prevents damage to sample animals and facilitates recycling by replacing the built-in peristaltic pump with an external syringe pump that generates negative pressure, creates a steady flow by drawing air from the receiving conical flask (i.e. a vacuum pump), and transfers plankton from the sample container toward the main flowcell of the imaging system and finally into the receiving flask; (3) it aligns samples in advance of imaging and prevents clogging with an additional flowcell placed ahead of the main flowcell. These modifications were designed to overcome the difficulties of applying the standard FlowCAM procedure to studies where the number of individuals per sample is small, given that the FlowCAM can only image a subset of a sample. Our effective recycling procedure allows users to pass the same sample through the FlowCAM many times (i.e. bootstrapping the sample) in order to generate a good size distribution. Although more advanced FlowCAM models are equipped with a syringe pump and Field of View (FOV) flowcells, which can image all particles passing through the flow field, we note that these advanced setups are very expensive, offer limited syringe and flowcell sizes, and do not guarantee recycling. In contrast, our modifications are inexpensive and flexible. Finally, we compared the biovolumes estimated by automated FlowCAM image analysis with conventional manual measurements, and found that the size of an individual zooplankter can be estimated by the FlowCAM imaging system after ground truthing.

  15. Diagnostic imaging procedure volume in the United States

    International Nuclear Information System (INIS)

    Johnson, J.L.; Abernathy, D.L.

    1983-01-01

    Comprehensive data on 1979 and 1980 diagnostic imaging procedure volume were collected from a stratified random sample of U.S. short-term general-care hospitals and private practices of radiologists, cardiologists, obstetricians/gynecologists, orthopedic surgeons, and neurologists/neurosurgeons. Approximately 181 million imaging procedures (within the study scope) were performed in 1980. Despite the rapidly increasing use of newer imaging methods, plain film radiography (140.3 million procedures) and contrast studies (22.9 million procedures) continue to comprise the vast majority of diagnostic imaging volume. Ultrasound, computed tomography, nuclear medicine, and special procedures make up less than 10% of total diagnostic imaging procedures. Comparison of the data from this study with data from an earlier study indicates that imaging procedure volume in hospitals expanded at an annual growth rate of almost 8% from 1973 to 1980

  16. Biclustering of gene expression data using reactive greedy randomized adaptive search procedure.

    Science.gov (United States)

    Dharan, Smitha; Nair, Achuthsankar S

    2009-01-30

    Biclustering algorithms belong to a distinct class of clustering algorithms that perform simultaneous clustering of both rows and columns of the gene expression matrix and can be a very useful analysis tool when some genes have multiple functions and experimental conditions are diverse. Cheng and Church introduced a measure called the mean squared residue score to evaluate the quality of a bicluster, and it has become one of the most popular measures used to search for biclusters. In this paper, we review the basic concepts of the metaheuristic Greedy Randomized Adaptive Search Procedure (GRASP), namely its construction and local search phases, and propose a new method, a variant of GRASP called Reactive Greedy Randomized Adaptive Search Procedure (Reactive GRASP), to detect significant biclusters from large microarray datasets. The method has two major steps. First, high quality bicluster seeds are generated by means of k-means clustering. In the second step, these seeds are grown using the Reactive GRASP, in which the basic parameter that defines the restrictiveness of the candidate list is self-adjusted, depending on the quality of the solutions found previously. We performed statistical and biological validations of the biclusters obtained and evaluated the method against the results of basic GRASP as well as against the classic work of Cheng and Church. The experimental results indicate that the Reactive GRASP approach outperforms the basic GRASP algorithm and the Cheng and Church approach. The Reactive GRASP approach for the detection of significant biclusters is robust and does not require calibration efforts.
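
    As an illustration of the scoring step referenced above, the Cheng-and-Church mean squared residue of a candidate bicluster can be computed in a few lines. This is a minimal sketch (function and variable names are ours, not taken from the paper), not the Reactive GRASP implementation itself.

```python
import numpy as np

def mean_squared_residue(submatrix: np.ndarray) -> float:
    """Cheng-and-Church-style mean squared residue of a bicluster.

    A low score indicates that the selected rows and columns vary
    coherently (additively) across the bicluster."""
    row_means = submatrix.mean(axis=1, keepdims=True)
    col_means = submatrix.mean(axis=0, keepdims=True)
    overall_mean = submatrix.mean()
    residues = submatrix - row_means - col_means + overall_mean
    return float(np.mean(residues ** 2))

# A perfectly additive (coherent) bicluster scores 0.
block = np.array([[1.0, 2.0, 3.0],
                  [2.0, 3.0, 4.0],
                  [4.0, 5.0, 6.0]])
print(mean_squared_residue(block))  # -> 0.0
```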

  17. 21 CFR 203.34 - Policies and procedures; administrative systems.

    Science.gov (United States)

    2010-04-01

    ... distribution security and audit system, including conducting random and for-cause audits of sales... 21 Food and Drugs 4 2010-04-01 2010-04-01 false Policies and procedures; administrative systems...; administrative systems. Each manufacturer or authorized distributor of record that distributes drug samples shall...

  18. Estimation of Sensitive Proportion by Randomized Response Data in Successive Sampling

    Directory of Open Access Journals (Sweden)

    Bo Yu

    2015-01-01

    Full Text Available This paper considers the problem of estimation for binomial proportions of sensitive or stigmatizing attributes in the population of interest. Randomized response techniques are suggested for protecting the privacy of respondents and reducing the response bias while eliciting information on sensitive attributes. In many sensitive question surveys, the same population is often sampled repeatedly on each occasion. In this paper, we apply a successive sampling scheme to improve the estimation of the sensitive proportion on the current occasion.
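
    As a concrete illustration of the randomized response idea (Warner's classic single-occasion design, not the successive-sampling estimator developed in the paper), a minimal Python sketch follows; the design probability p and the simulated data are assumptions for demonstration only.

```python
import numpy as np

def warner_estimate(yes_count: int, n: int, p: float) -> float:
    """Warner randomized-response estimator of a sensitive proportion.

    Each respondent answers the sensitive statement with probability p and
    its complement with probability 1 - p (p != 0.5), so an individual "yes"
    reveals nothing, yet the population proportion remains estimable."""
    lam_hat = yes_count / n                       # observed proportion of "yes"
    return (lam_hat - (1.0 - p)) / (2.0 * p - 1.0)

# Simulated check: true sensitive proportion 0.2, design probability 0.7.
rng = np.random.default_rng(0)
pi_true, p, n = 0.2, 0.7, 5000
sensitive = rng.random(n) < pi_true               # true (hidden) status
asked_sensitive = rng.random(n) < p               # outcome of the randomizer
answers = np.where(asked_sensitive, sensitive, ~sensitive)
print(warner_estimate(int(answers.sum()), n, p))  # close to 0.2
```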

  19. Effects of blood sample handling procedures on measurable inflammatory markers in plasma, serum and dried blood spot samples

    DEFF Research Database (Denmark)

    Skogstrand, K.; Thorsen, P.; Vogel, I.

    2008-01-01

    The interests in monitoring inflammation by immunoassay determination of blood inflammatory markers call for information on the stability of these markers in relation to the handling of blood samples. The increasing use of stored biobank samples for such ventures, which may have been collected and stored for other purposes, justifies the study hereof. Blood samples were stored for 0, 4, 24, and 48 h at 4 degrees C, room temperature (RT), and at 35 degrees C, respectively, before they were separated into serum or plasma and frozen. Dried blood spot samples (DBSS) were stored for 0, 1, 2, 3, 7... ... of whole blood samples at low temperatures and rapid isolation of plasma and serum. Effects of different handling procedures for all markers studied are given. DBSS proved to be a robust and convenient way to handle samples for immunoassay analysis of inflammatory markers in whole blood.

  20. Research And Establishment Of The Analytical Procedure For/Of Sr-90 In Milk Samples

    International Nuclear Information System (INIS)

    Tran Thi Tuyet Mai; Duong Duc Thang; Nguyen Thi Linh; Bui Thi Anh Duong

    2014-01-01

    Sr-90 is an indicator for the transfer of radionuclides from the environment to humans. This work was set up to build a procedure for Sr-90 determination in the main popular foodstuffs, with a focus on fresh milk. The aims of this work were to establish a procedure for Sr-90, to assess the chemical yield, and to test samples of Vietnamese fresh milk; QA and QC for the procedure were carried out using an IAEA standard sample. The procedure for the determination of Sr-90 in milk has been completed. The chemical recovery yields for Y-90 and Sr-90 were 46.76% ± 1.25% and 0.78 ± 0.086, respectively. The QA & QC program was carried out using the reference material IAEA-373, and the result is in good agreement with the certified value. Three reference samples were analysed with 15 measurements. The Sr-90 concentration after statistical processing was 3.69 Bq/kg with an uncertainty of 0.23 Bq/kg. The certified value of IAEA-154 for Sr-90 (half-life 28.8 years) is 6.9 Bq/kg, with a 95% confidence interval of (6.0-8.0) Bq/kg as of 31 August 1987. After correcting for decay, the activity at the present time is 3.67 Bq/kg, which means that the result of this work matches the IAEA certified value very well. Five Vietnamese fresh milk samples were analyzed for Sr-90; the specific activity of Sr-90 in milk ranged from 0.032 to 0.041 Bq/l. (author)

  1. Adaptive importance sampling of random walks on continuous state spaces

    International Nuclear Information System (INIS)

    Baggerly, K.; Cox, D.; Picard, R.

    1998-01-01

    The authors consider adaptive importance sampling for a random walk with scoring in a general state space. Conditions under which exponential convergence occurs to the zero-variance solution are reviewed. These results generalize previous work for finite, discrete state spaces in Kollman (1993) and in Kollman, Baggerly, Cox, and Picard (1996). This paper is intended for nonstatisticians and includes considerable explanatory material

  2. Sampling procedure in a willow plantation for estimation of moisture content

    DEFF Research Database (Denmark)

    Nielsen, Henrik Kofoed; Lærke, Poul Erik; Liu, Na

    2015-01-01

    Heating value and fuel quality of wood is closely connected to moisture content. In this work the variation of moisture content (MC) of short rotation coppice (SRC) willow shoots is described for five clones during one harvesting season. Subsequently an appropriate sampling procedure minimising...... labour costs and sampling uncertainty is proposed, where the MC of a single stem section with the length of 10–50 cm corresponds to the mean shoot moisture content (MSMC) with a bias of maximum 11 g kg−1. This bias can be reduced by selecting the stem section according to the particular clone...

  3. Fast randomized point location without preprocessing in two- and three-dimensional Delaunay triangulations

    Energy Technology Data Exchange (ETDEWEB)

    Muecke, E.P.; Saias, I.; Zhu, B.

    1996-05-01

    This paper studies the point location problem in Delaunay triangulations without preprocessing and additional storage. The proposed procedure finds the query point simply by walking through the triangulation, after selecting a good starting point by random sampling. The analysis generalizes and extends a recent result for d = 2 dimensions by proving this procedure to take expected time close to O(n^(1/(d+1))) for point location in Delaunay triangulations of n random points in d = 3 dimensions. Empirical results in both two and three dimensions show that this procedure is efficient in practice.

  4. Precision of systematic and random sampling in clustered populations: habitat patches and aggregating organisms.

    Science.gov (United States)

    McGarvey, Richard; Burch, Paul; Matthews, Janet M

    2016-01-01

    Natural populations of plants and animals spatially cluster because (1) suitable habitat is patchy, and (2) within suitable habitat, individuals aggregate further into clusters of higher density. We compare the precision of random and systematic field sampling survey designs under these two processes of species clustering. Second, we evaluate the performance of 13 estimators for the variance of the sample mean from a systematic survey. Replicated simulated surveys, as counts from 100 transects, allocated either randomly or systematically within the study region, were used to estimate population density in six spatial point populations including habitat patches and Matérn circular clustered aggregations of organisms, together and in combination. The standard one-start aligned systematic survey design, a uniform 10 x 10 grid of transects, was much more precise. Variances of the 10 000 replicated systematic survey mean densities were one-third to one-fifth of those from randomly allocated transects, implying transect sample sizes giving equivalent precision by random survey would need to be three to five times larger. Organisms being restricted to patches of habitat was alone sufficient to yield this precision advantage for the systematic design. But this improved precision for systematic sampling in clustered populations is underestimated by standard variance estimators used to compute confidence intervals. True variance for the survey sample mean was computed from the variance of 10 000 simulated survey mean estimates. Testing 10 published and three newly proposed variance estimators, the two variance estimators (v) that corrected for inter-transect correlation (ν₈ and ν(W)) were the most accurate and also the most precise in clustered populations. These greatly outperformed the two "post-stratification" variance estimators (ν₂ and ν₃) that are now more commonly applied in systematic surveys. Similar variance estimator performance rankings were found with
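
    The precision comparison described above can be reproduced in miniature with a simulation: a clustered one-dimensional population is surveyed either by simple random selection of cells or by a one-start systematic grid, and the variance of the survey mean is compared across replicates. The population model, cell counts and sample sizes below are illustrative assumptions, not the paper's design.

```python
import numpy as np

rng = np.random.default_rng(1)

# Clustered "population": counts are elevated inside a few habitat patches.
n_cells, n_sample, n_reps = 1000, 100, 2000
population = np.zeros(n_cells)
for start in rng.choice(n_cells - 50, size=8, replace=False):
    population[start:start + 50] += rng.poisson(5.0, size=50)

def srs_mean() -> float:
    """Simple random sample of cells."""
    return population[rng.choice(n_cells, size=n_sample, replace=False)].mean()

def systematic_mean() -> float:
    """One-start aligned systematic sample: every k-th cell after a random start."""
    k = n_cells // n_sample
    start = int(rng.integers(k))
    return population[start::k][:n_sample].mean()

srs = np.array([srs_mean() for _ in range(n_reps)])
sys_ = np.array([systematic_mean() for _ in range(n_reps)])
print("true mean:", population.mean())
print("var(SRS):", srs.var(), "var(systematic):", sys_.var())
```

    In clustered populations such as this one, the systematic design typically shows the smaller replicate-to-replicate variance, consistent with the precision advantage reported above.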

  5. Experimental procedure for the determination of counting efficiency and sampling flow rate of a grab-sampling working level meter

    International Nuclear Information System (INIS)

    Grenier, M.; Bigu, J.

    1982-07-01

    The calibration procedures used for a working level meter (WLM) of the grab-sampling type are presented in detail. The WLM tested is a Pylon WL-1000C working level meter and it was calibrated for radon/thoron daughter counting efficiency (E), for sampling pump flow rate (Q) and other variables of interest. For the instrument calibrated at the Elliot Lake Laboratory, E was 0.22 +- 0.01 while Q was 4.50 +- 0.01 L/min

  6. Rapid Microwave Digestion Procedures for the Elemental Analysis of Alloy and Slag Samples of Smelted Ocean Bed Polymetallic Nodules

    Directory of Open Access Journals (Sweden)

    Kumari Smita

    2013-01-01

    Full Text Available The use of microwave digester for digestion of alloy and slag samples of smelted ocean bed polymetallic nodules has permitted the complete digestion of samples, thereby replacing the tedious classical methods of digestion of samples. The digestion procedure includes two acid-closed digestions of samples in a microwave oven. Owing to the hazardous nature of perchloric acid, it was not used in developed digestion procedure. Digested sample solutions were analyzed for concentrations of various radicals and the effectiveness of the developed digestion methodology was tested using certified reference materials. It was found that the developed method is giving results comparable with that obtained from conventionally digested samples. In this digestion procedure, time required for digestion of samples was reduced to about 1 hour only from 8-9 hours of conventional digestion.

  7. Selection of representative calibration sample sets for near-infrared reflectance spectroscopy to predict nitrogen concentration in grasses

    DEFF Research Database (Denmark)

    Shetty, Nisha; Rinnan, Åsmund; Gislum, René

    2012-01-01

    ) algorithm were used and compared. Both the Puchwein and CADEX methods provide a calibration set equally distributed in space, and both methods require a minimum of prior knowledge. The samples were also selected randomly using complete random, cultivar random (year fixed), year random (cultivar fixed......) and interaction (cultivar × year fixed) random procedures to see the influence of different factors on sample selection. Puchwein's method performed best with the lowest RMSEP, followed by CADEX, interaction random, year random, cultivar random and complete random. Out of 118 samples of the complete calibration set...... effectively enhance the cost-effectiveness of NIR spectral analysis by reducing the number of analyzed samples in the calibration set by more than 80%, which substantially reduces the effort of laboratory analyses with no significant loss in prediction accuracy....
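
    For reference, the CADEX (Kennard-Stone) selection mentioned above can be sketched generically as follows: start from the two most distant samples and repeatedly add the sample whose nearest selected neighbour is farthest away. This is a textbook implementation under an assumed Euclidean metric on the spectra, not the authors' code.

```python
import numpy as np

def kennard_stone(X: np.ndarray, n_select: int) -> list:
    """Kennard-Stone (CADEX) selection of a space-filling calibration set."""
    dist = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    i, j = np.unravel_index(np.argmax(dist), dist.shape)
    selected = [int(i), int(j)]                      # start with the two extremes
    while len(selected) < n_select:
        remaining = [k for k in range(len(X)) if k not in selected]
        # Distance from each remaining sample to its closest selected sample.
        nearest = dist[np.ix_(remaining, selected)].min(axis=1)
        selected.append(remaining[int(np.argmax(nearest))])
    return selected

# Example: choose 20 calibration samples from 118 simulated spectra.
rng = np.random.default_rng(2)
spectra = rng.normal(size=(118, 700))
print(kennard_stone(spectra, 20))
```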

  8. Sampling design and procedures for fixed surface-water sites in the Georgia-Florida coastal plain study unit, 1993

    Science.gov (United States)

    Hatzell, H.H.; Oaksford, E.T.; Asbury, C.E.

    1995-01-01

    The implementation of design guidelines for the National Water-Quality Assessment (NAWQA) Program has resulted in the development of new sampling procedures and the modification of existing procedures commonly used in the Water Resources Division of the U.S. Geological Survey. The Georgia-Florida Coastal Plain (GAFL) study unit began the intensive data collection phase of the program in October 1992. This report documents the implementation of the NAWQA guidelines by describing the sampling design and procedures for collecting surface-water samples in the GAFL study unit in 1993. This documentation is provided for agencies that use water-quality data and for future study units that will be entering the intensive phase of data collection. The sampling design is intended to account for large- and small-scale spatial variations, and temporal variations in water quality for the study area. Nine fixed sites were selected in drainage basins of different sizes and different land-use characteristics located in different land-resource provinces. Each of the nine fixed sites was sampled regularly for a combination of six constituent groups composed of physical and chemical constituents: field measurements, major ions and metals, nutrients, organic carbon, pesticides, and suspended sediments. Some sites were also sampled during high-flow conditions and storm events. Discussion of the sampling procedure is divided into three phases: sample collection, sample splitting, and sample processing. A cone splitter was used to split water samples for the analysis of the sampling constituent groups except organic carbon from approximately nine liters of stream water collected at four fixed sites that were sampled intensively. An example of the sample splitting schemes designed to provide the sample volumes required for each sample constituent group is described in detail. Information about onsite sample processing has been organized into a flowchart that describes a pathway for each of

  9. The mean field theory in EM procedures for blind Markov random field image restoration.

    Science.gov (United States)

    Zhang, J

    1993-01-01

    A Markov random field (MRF) model-based EM (expectation-maximization) procedure for simultaneously estimating the degradation model and restoring the image is described. The MRF is a coupled one which provides continuity (inside regions of smooth gray tones) and discontinuity (at region boundaries) constraints for the restoration problem which is, in general, ill posed. The computational difficulty associated with the EM procedure for MRFs is resolved by using the mean field theory from statistical mechanics. An orthonormal blur decomposition is used to reduce the chances of undesirable locally optimal estimates. Experimental results on synthetic and real-world images show that this approach provides good blur estimates and restored images. The restored images are comparable to those obtained by a Wiener filter in mean-square error, but are most visually pleasing.

  10. A Table-Based Random Sampling Simulation for Bioluminescence Tomography

    Directory of Open Access Journals (Sweden)

    Xiaomeng Zhang

    2006-01-01

    Full Text Available As a popular simulation of photon propagation in turbid media, the main problem of the Monte Carlo (MC) method is its cumbersome computation. In this work a table-based random sampling simulation (TBRS) is proposed. The key idea of TBRS is to simplify the multiple steps of scattering into a single-step process through random table querying, thus greatly reducing the computing complexity of the conventional MC algorithm and expediting the computation. The TBRS simulation is a fast alternative to the conventional MC simulation of photon propagation. It retains the merits of flexibility and accuracy of the conventional MC method and adapts well to complex geometric media and various source shapes. Both MC simulations were conducted in a homogeneous medium in our work. Also, we present a reconstruction approach to estimate the position of the fluorescent source based on trial-and-error as a validation of the TBRS algorithm. Good agreement is found between the conventional MC simulation and the TBRS simulation.
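
    The table-querying idea can be illustrated with a generic pre-tabulated inverse-CDF lookup: the cumulative distribution is inverted once, after which each random draw reduces to a single array index. The phase function, grid and table size below are placeholders, not the actual scattering tables used by TBRS.

```python
import numpy as np

class TableSampler:
    """Draw variates by querying a pre-computed inverse-CDF table, replacing
    repeated on-the-fly sampling with a single array lookup per draw."""

    def __init__(self, pdf_values: np.ndarray, grid: np.ndarray, table_size: int = 4096):
        cdf = np.cumsum(pdf_values)
        cdf /= cdf[-1]
        u = (np.arange(table_size) + 0.5) / table_size
        self.table = grid[np.searchsorted(cdf, u)]   # tabulated inverse CDF

    def sample(self, rng: np.random.Generator, n: int) -> np.ndarray:
        return self.table[rng.integers(len(self.table), size=n)]

# Example: a forward-peaked (Henyey-Greenstein-like) scattering angle distribution.
theta = np.linspace(0.0, np.pi, 2000)
g = 0.9
pdf = (1 - g**2) / (1 + g**2 - 2 * g * np.cos(theta)) ** 1.5 * np.sin(theta)
sampler = TableSampler(pdf, theta)
angles = sampler.sample(np.random.default_rng(3), 100_000)
print(angles.mean())
```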

  11. Oral analgesia vs intravenous conscious sedation during Essure Micro-Insert sterilization procedure: randomized, double-blind, controlled trial.

    Science.gov (United States)

    Thiel, John A; Lukwinski, Angelina; Kamencic, Huse; Lim, Hyung

    2011-01-01

    To compare the pain reported by patients during the Essure Micro-Insert sterilization procedure using either intravenous conscious sedation or oral analgesia. Randomized, double-blind, placebo-controlled trial (Canadian Task Force classification I). Tertiary care ambulatory women's clinic. Eighty women of reproductive age requesting permanent sterilization. Hysteroscopic placement of the Essure Micro-Insert permanent birth control system. Patients undergoing placement of the Essure Micro-Insert system for permanent contraception were randomized to receive either intravenous conscious sedation, oral analgesia, or placebo. During the procedure, pain scores were recorded using a visual analog scale. Patients in the oral analgesia group reported slightly more pain during insertion of the hysteroscope and placement of the second micro-insert; the groups were otherwise equivalent. They were also equivalent when all visual analog scale scores were combined. Oral analgesia is an effective method of pain control during placement of the Essure Micro-Insert permanent birth control system. Copyright © 2011 AAGL. Published by Elsevier Inc. All rights reserved.

  12. Direct methylation procedure for converting fatty amides to fatty acid methyl esters in feed and digesta samples.

    Science.gov (United States)

    Jenkins, T C; Thies, E J; Mosley, E E

    2001-05-01

    Two direct methylation procedures often used for the analysis of total fatty acids in biological samples were evaluated for their application to samples containing fatty amides. Methylation of 5 mg of oleamide (cis-9-octadecenamide) in a one-step (methanolic HCl for 2 h at 70 degrees C) or a two-step (sodium methoxide for 10 min at 50 degrees C followed by methanolic HCl for 10 min at 80 degrees C) procedure gave 59 and 16% conversions of oleamide to oleic acid, respectively. Oleic acid recovery from oleamide was increased to 100% when the incubation in methanolic HCl was lengthened to 16 h and increased to 103% when the incubation in methoxide was modified to 24 h at 100 degrees C. However, conversion of oleamide to oleic acid in an animal feed sample was incomplete for the modified (24 h) two-step procedure but complete for the modified (16 h) one-step procedure. Unsaturated fatty amides in feed and digesta samples can be converted to fatty acid methyl esters by incubation in methanolic HCl if the time of exposure to the acid catalyst is extended from 2 to 16 h.

  13. Long Term Resource Monitoring Program procedures: fish monitoring

    Science.gov (United States)

    Ratcliff, Eric N.; Glittinger, Eric J.; O'Hara, T. Matt; Ickes, Brian S.

    2014-01-01

    This manual constitutes the second revision of the U.S. Army Corps of Engineers’ Upper Mississippi River Restoration-Environmental Management Program (UMRR-EMP) Long Term Resource Monitoring Program (LTRMP) element Fish Procedures Manual. The original (1988) manual merged and expanded on ideas and recommendations related to Upper Mississippi River fish sampling presented in several early documents. The first revision to the manual was made in 1995 reflecting important protocol changes, such as the adoption of a stratified random sampling design. The 1995 procedures manual has been an important document through the years and has been cited in many reports and scientific manuscripts. The resulting data collected by the LTRMP fish component represent the largest dataset on fish within the Upper Mississippi River System (UMRS) with more than 44,000 collections of approximately 5.7 million fish. The goal of this revision of the procedures manual is to document changes in LTRMP fish sampling procedures since 1995. Refinements to sampling methods become necessary as monitoring programs mature. Possible refinements are identified through field experiences (e.g., sampling techniques and safety protocols), data analysis (e.g., planned and studied gear efficiencies and reallocations of effort), and technological advances (e.g., electronic data entry). Other changes may be required because of financial necessity (i.e., unplanned effort reductions). This version of the LTRMP fish monitoring manual describes the most current (2014) procedures of the LTRMP fish component.

  14. 40 CFR 761.306 - Sampling 1 meter square surfaces by random selection of halves.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 30 2010-07-01 2010-07-01 false Sampling 1 meter square surfaces by...(b)(3) § 761.306 Sampling 1 meter square surfaces by random selection of halves. (a) Divide each 1 meter square portion where it is necessary to collect a surface wipe test sample into two equal (or as...

  15. A Nationwide Random Sampling Survey of Potential Complicated Grief in Japan

    Science.gov (United States)

    Mizuno, Yasunao; Kishimoto, Junji; Asukai, Nozomu

    2012-01-01

    To investigate the prevalence of significant loss, potential complicated grief (CG), and its contributing factors, we conducted a nationwide random sampling survey of Japanese adults aged 18 or older (N = 1,343) using a self-rating Japanese-language version of the Complicated Grief Brief Screen. Among them, 37.0% experienced their most significant…

  16. Reinforcing Sampling Distributions through a Randomization-Based Activity for Introducing ANOVA

    Science.gov (United States)

    Taylor, Laura; Doehler, Kirsten

    2015-01-01

    This paper examines the use of a randomization-based activity to introduce the ANOVA F-test to students. The two main goals of this activity are to successfully teach students to comprehend ANOVA F-tests and to increase student comprehension of sampling distributions. Four sections of students in an advanced introductory statistics course…

  17. Extending cluster lot quality assurance sampling designs for surveillance programs.

    Science.gov (United States)

    Hund, Lauren; Pagano, Marcello

    2014-07-20

    Lot quality assurance sampling (LQAS) has a long history of applications in industrial quality control. LQAS is frequently used for rapid surveillance in global health settings, with areas classified as poor or acceptable performance on the basis of the binary classification of an indicator. Historically, LQAS surveys have relied on simple random samples from the population; however, implementing two-stage cluster designs for surveillance sampling is often more cost-effective than simple random sampling. By applying survey sampling results to the binary classification procedure, we develop a simple and flexible nonparametric procedure to incorporate clustering effects into the LQAS sample design to appropriately inflate the sample size, accommodating finite numbers of clusters in the population when relevant. We use this framework to then discuss principled selection of survey design parameters in longitudinal surveillance programs. We apply this framework to design surveys to detect rises in malnutrition prevalence in nutrition surveillance programs in Kenya and South Sudan, accounting for clustering within villages. By combining historical information with data from previous surveys, we design surveys to detect spikes in the childhood malnutrition rate. Copyright © 2014 John Wiley & Sons, Ltd.
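
    A minimal sketch of the sample-size inflation step described above, using the standard design-effect formula for cluster samples; the classification thresholds, cluster size and intra-cluster correlation below are placeholder values, not those of the Kenya or South Sudan surveys.

```python
import math
from statistics import NormalDist

def lqas_cluster_sample_size(p0: float, p1: float, alpha: float, beta: float,
                             cluster_size: int, icc: float) -> int:
    """Approximate LQAS sample size for distinguishing prevalence p1 from p0,
    inflated by the design effect of two-stage cluster sampling."""
    z_a = NormalDist().inv_cdf(1 - alpha)
    z_b = NormalDist().inv_cdf(1 - beta)
    # Normal-approximation size for a one-sample test of proportions.
    n_srs = ((z_a * math.sqrt(p0 * (1 - p0)) + z_b * math.sqrt(p1 * (1 - p1)))
             / (p1 - p0)) ** 2
    deff = 1 + (cluster_size - 1) * icc              # design effect
    return math.ceil(n_srs * deff)

# e.g. acceptable prevalence 10%, critical prevalence 20%,
# 10 children sampled per village, intra-cluster correlation 0.05.
print(lqas_cluster_sample_size(0.10, 0.20, alpha=0.05, beta=0.20,
                               cluster_size=10, icc=0.05))
```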

  18. Randomization tests

    CERN Document Server

    Edgington, Eugene

    2007-01-01

    Contents: Statistical Tests That Do Not Require Random Sampling; Randomization Tests; Numerical Examples; Randomization Tests and Nonrandom Samples; The Prevalence of Nonrandom Samples in Experiments; The Irrelevance of Random Samples for the Typical Experiment; Generalizing from Nonrandom Samples; Intelligibility; Respect for the Validity of Randomization Tests; Versatility; Practicality; Precursors of Randomization Tests; Other Applications of Permutation Tests; Questions and Exercises; Notes; References; Randomized Experiments; Unique Benefits of Experiments; Experimentation without Mani...

  19. Procedure manual for the estimation of average indoor radon-daughter concentrations using the radon grab-sampling method

    International Nuclear Information System (INIS)

    George, J.L.

    1986-04-01

    The US Department of Energy (DOE) Office of Remedial Action and Waste Technology established the Technical Measurements Center to provide standardization, calibration, comparability, verification of data, quality assurance, and cost-effectiveness for the measurement requirements of DOE remedial action programs. One of the remedial-action measurement needs is the estimation of average indoor radon-daughter concentration. One method for accomplishing such estimations in support of DOE remedial action programs is the radon grab-sampling method. This manual describes procedures for radon grab sampling, with the application specifically directed to the estimation of average indoor radon-daughter concentration (RDC) in highly ventilated structures. This particular application of the measurement method is for cases where RDC estimates derived from long-term integrated measurements under occupied conditions are below the standard and where the structure being evaluated is considered to be highly ventilated. The radon grab-sampling method requires that sampling be conducted under standard maximized conditions. Briefly, the procedure for radon grab sampling involves the following steps: selection of sampling and counting equipment; sample acquisition and processing, including data reduction; calibration of equipment, including provisions to correct for pressure effects when sampling at various elevations; and incorporation of quality-control and assurance measures. This manual describes each of the above steps in detail and presents an example of a step-by-step radon grab-sampling procedure using a scintillation cell

  20. LOD score exclusion analyses for candidate QTLs using random population samples.

    Science.gov (United States)

    Deng, Hong-Wen

    2003-11-01

    While extensive analyses have been conducted to test for, no formal analyses have been conducted to test against, the importance of candidate genes as putative QTLs using random population samples. Previously, we developed an LOD score exclusion mapping approach for candidate genes for complex diseases. Here, we extend this LOD score approach for exclusion analyses of candidate genes for quantitative traits. Under this approach, specific genetic effects (as reflected by heritability) and inheritance models at candidate QTLs can be analyzed and if an LOD score is < or = -2.0, the locus can be excluded from having a heritability larger than that specified. Simulations show that this approach has high power to exclude a candidate gene from having moderate genetic effects if it is not a QTL and is robust to population admixture. Our exclusion analysis complements association analysis for candidate genes as putative QTLs in random population samples. The approach is applied to test the importance of Vitamin D receptor (VDR) gene as a potential QTL underlying the variation of bone mass, an important determinant of osteoporosis.

  1. The Dirichlet-Multinomial model for multivariate randomized response data and small samples

    NARCIS (Netherlands)

    Avetisyan, Marianna; Fox, Gerardus J.A.

    2012-01-01

    In survey sampling the randomized response (RR) technique can be used to obtain truthful answers to sensitive questions. Although the individual answers are masked due to the RR technique, individual (sensitive) response rates can be estimated when observing multivariate response data. The

  2. Analysis of plant gums and saccharide materials in paint samples: comparison of GC-MS analytical procedures and databases.

    Science.gov (United States)

    Lluveras-Tenorio, Anna; Mazurek, Joy; Restivo, Annalaura; Colombini, Maria Perla; Bonaduce, Ilaria

    2012-10-10

    Saccharide materials have been used for centuries as binding media, to paint, write and illuminate manuscripts and to apply metallic leaf decorations. Although the technical literature often reports on the use of plant gums as binders, actually several other saccharide materials can be encountered in paint samples, not only as major binders, but also as additives. In the literature, there are a variety of analytical procedures that utilize GC-MS to characterize saccharide materials in paint samples, however the chromatographic profiles are often extremely different and it is impossible to compare them and reliably identify the paint binder. This paper presents a comparison between two different analytical procedures based on GC-MS for the analysis of saccharide materials in works-of-art. The research presented here evaluates the influence of the analytical procedure used, and how it impacts the sugar profiles obtained from the analysis of paint samples that contain saccharide materials. The procedures have been developed, optimised and systematically used to characterise plant gums at the Getty Conservation Institute in Los Angeles, USA (GCI) and the Department of Chemistry and Industrial Chemistry of the University of Pisa, Italy (DCCI). The main steps of the analytical procedures and their optimisation are discussed. The results presented highlight that the two methods give comparable sugar profiles, whether the samples analysed are simple raw materials, pigmented and unpigmented paint replicas, or paint samples collected from hundreds of centuries old polychrome art objects. A common database of sugar profiles of reference materials commonly found in paint samples was thus compiled. The database presents data also from those materials that only contain a minor saccharide fraction. This database highlights how many sources of saccharides can be found in a paint sample, representing an important step forward in the problem of identifying polysaccharide binders in

  3. The Influence of Sample Drying Procedures on Mercury Concentrations Analyzed in Soils

    Czech Academy of Sciences Publication Activity Database

    Hojdová, Maria; Rohovec, Jan; Chrastný, V.; Penížek, V.; Navrátil, Tomáš

    2015-01-01

    Roč. 94, č. 5 (2015), s. 570-576 ISSN 0007-4861 R&D Projects: GA ČR GP526/09/P404; GA ČR(CZ) GAP210/11/1369 Institutional support: RVO:67985831 Keywords : sample preparation * drying procedures * microbial activity * freeze-drying * contamination Subject RIV: DD - Geochemistry Impact factor: 1.191, year: 2015

  4. Development of a simple extraction procedure for chlorpyrifos determination in food samples by immunoassay.

    Science.gov (United States)

    Gabaldón, J A; Maquieira, A; Puchades, R

    2007-02-28

    The suitability of immunoassay methodology for rapid and accurate determination of chlorpyrifos in vegetables was tested. The optimised ELISA detection limit was 0.32ng/ml, with a working range from 0.69 to 6.21ng/ml and an immunoassay test-mid point (IC(50)) of 2.08ng/ml. A rapid sample preparation procedure considering different parameters such as the amount of sample, volume of extractant, extraction time and dilution factor was optimised. The developed direct extraction (DE) and multiresidue (ME) standard procedures were performed in different fortified fresh and processed vegetable samples (tomato, bonnet pepper, bean, pea, asparagus, broccoli, watermelon, melon, lettuce, cucumber, celery and red pepper). Recoveries were in all cases in the whole range 85.2-108.9% for both DE and ME extracts. Also, the comparison of the results obtained by both immunochemical and chromatographic methods for spiked fruits and vegetables were good with a correlation coefficient (r) of 0.97.

  5. Statistical evaluations of current sampling procedures and incomplete core recovery

    International Nuclear Information System (INIS)

    Heasler, P.G.; Jensen, L.

    1994-03-01

    This document develops two formulas that describe the effects of incomplete recovery on core sampling results for the Hanford waste tanks. The formulas evaluate incomplete core recovery from a worst-case (i.e., biased) and best-case (i.e., unbiased) perspective. A core sampler is unbiased if the sample material recovered is a random sample of the material in the tank, while any sampler that preferentially recovers a particular type of waste over others is a biased sampler. There is strong evidence to indicate that the push-mode sampler presently used at the Hanford site is a biased one. The formulas presented here show the effects of incomplete core recovery on the accuracy of composition measurements, as functions of the vertical variability in the waste. These equations are evaluated using vertical variability estimates from previously sampled tanks (B110, U110, C109). Assuming that the values of vertical variability used in this study adequately describe the Hanford tank farm, one can use the formulas to compute the effect of incomplete recovery on the accuracy of an average constituent estimate. To determine acceptable recovery limits, we have assumed that the relative error of such an estimate should be no more than 20%

  6. Random Walks on Directed Networks: Inference and Respondent-Driven Sampling

    Directory of Open Access Journals (Sweden)

    Malmros Jens

    2016-06-01

    Full Text Available Respondent-driven sampling (RDS is often used to estimate population properties (e.g., sexual risk behavior in hard-to-reach populations. In RDS, already sampled individuals recruit population members to the sample from their social contacts in an efficient snowball-like sampling procedure. By assuming a Markov model for the recruitment of individuals, asymptotically unbiased estimates of population characteristics can be obtained. Current RDS estimation methodology assumes that the social network is undirected, that is, all edges are reciprocal. However, empirical social networks in general also include a substantial number of nonreciprocal edges. In this article, we develop an estimation method for RDS in populations connected by social networks that include reciprocal and nonreciprocal edges. We derive estimators of the selection probabilities of individuals as a function of the number of outgoing edges of sampled individuals. The proposed estimators are evaluated on artificial and empirical networks and are shown to generally perform better than existing estimators. This is the case in particular when the fraction of directed edges in the network is large.
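
    The weighting idea behind such estimators can be illustrated with a Volz-Heckathorn-style degree-weighted mean: under a random-walk recruitment model, inclusion probability is roughly proportional to the number of (outgoing) contacts, so each respondent is weighted by its inverse. This is a generic sketch, not the exact estimator derived in the article.

```python
import numpy as np

def rds_estimate(values: np.ndarray, out_degrees: np.ndarray) -> float:
    """Degree-weighted (Volz-Heckathorn-style) RDS estimate of a population mean."""
    weights = 1.0 / out_degrees                  # inverse of reported out-degree
    return float(np.sum(weights * values) / np.sum(weights))

# Example: a binary trait and the reported numbers of outgoing contacts.
trait = np.array([1, 0, 1, 1, 0, 0])
degrees = np.array([10, 2, 8, 12, 3, 4])
print(rds_estimate(trait, degrees))
```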

  7. GENERATION OF MULTI-LOD 3D CITY MODELS IN CITYGML WITH THE PROCEDURAL MODELLING ENGINE RANDOM3DCITY

    Directory of Open Access Journals (Sweden)

    F. Biljecki

    2016-09-01

    Full Text Available The production and dissemination of semantic 3D city models is rapidly increasing benefiting a growing number of use cases. However, their availability in multiple LODs and in the CityGML format is still problematic in practice. This hinders applications and experiments where multi-LOD datasets are required as input, for instance, to determine the performance of different LODs in a spatial analysis. An alternative approach to obtain 3D city models is to generate them with procedural modelling, which is – as we discuss in this paper – well suited as a method to source multi-LOD datasets useful for a number of applications. However, procedural modelling has not yet been employed for this purpose. Therefore, we have developed RANDOM3DCITY, an experimental procedural modelling engine for generating synthetic datasets of buildings and other urban features. The engine is designed to produce models in CityGML and does so in multiple LODs. Besides the generation of multiple geometric LODs, we implement the realisation of multiple levels of spatiosemantic coherence, geometric reference variants, and indoor representations. As a result of their permutations, each building can be generated in 392 different CityGML representations, an unprecedented number of modelling variants of the same feature. The datasets produced by RANDOM3DCITY are suited for several applications, as we show in this paper with documented uses. The developed engine is available under an open-source licence at Github at http://github.com/tudelft3d/Random3Dcity.

  8. Operability test procedure for the Rotary Mode Core Sampling System Exhausters 3 and 4

    International Nuclear Information System (INIS)

    WSaldo, E.J.

    1995-01-01

    This document provides a procedure for performing operability testing of the Rotary Mode Core Sampling System Exhausters 3 and 4. Upon completion of testing activities an operability testing report will be issued.

  9. Computer simulation of RBS spectra from samples with surface roughness

    Energy Technology Data Exchange (ETDEWEB)

    Malinský, P., E-mail: malinsky@ujf.cas.cz [Nuclear Physics Institute of the Academy of Sciences of the Czech Republic, v. v. i., 250 68 Rez (Czech Republic); Department of Physics, Faculty of Science, J. E. Purkinje University, Ceske mladeze 8, 400 96 Usti nad Labem (Czech Republic); Hnatowicz, V., E-mail: hnatowicz@ujf.cas.cz [Nuclear Physics Institute of the Academy of Sciences of the Czech Republic, v. v. i., 250 68 Rez (Czech Republic); Macková, A., E-mail: mackova@ujf.cas.cz [Nuclear Physics Institute of the Academy of Sciences of the Czech Republic, v. v. i., 250 68 Rez (Czech Republic); Department of Physics, Faculty of Science, J. E. Purkinje University, Ceske mladeze 8, 400 96 Usti nad Labem (Czech Republic)

    2016-03-15

    A fast code for the simulation of common RBS spectra including surface roughness effects has been written and tested on virtual samples comprising either a rough layer deposited on a smooth substrate or smooth layer deposited on a rough substrate and simulated at different geometries. The sample surface or interface relief has been described by a polyline and the simulated RBS spectrum has been obtained as the sum of many particular spectra from randomly chosen particle trajectories. The code includes several procedures generating virtual samples with random and regular (periodical) roughness. The shape of the RBS spectra has been found to change strongly with increasing sample roughness and an increasing angle of the incoming ion beam.

  10. Additive non-uniform random sampling in superimposed fiber Bragg grating strain gauge

    Science.gov (United States)

    Ma, Y. C.; Liu, H. Y.; Yan, S. B.; Yang, Y. H.; Yang, M. W.; Li, J. M.; Tang, J.

    2013-05-01

    This paper demonstrates an additive non-uniform random sampling and interrogation method for dynamic and/or static strain gauge using a reflection spectrum from two superimposed fiber Bragg gratings (FBGs). The superimposed FBGs are designed to generate non-equidistant space of a sensing pulse train in the time domain during dynamic strain gauge. By combining centroid finding with smooth filtering methods, both the interrogation speed and accuracy are improved. A 1.9 kHz dynamic strain is measured by generating an additive non-uniform randomly distributed 2 kHz optical sensing pulse train from a mean 500 Hz triangular periodically changing scanning frequency.

  11. Considerations for sampling nuclear materials for SNM accounting measurements. Special nuclear material accountability report

    International Nuclear Information System (INIS)

    Brouns, R.J.; Roberts, F.P.; Upson, U.L.

    1978-05-01

    This report presents principles and guidelines for sampling nuclear materials to measure the chemical and isotopic content of the material. Development of sampling plans and procedures that maintain the random and systematic errors of sampling within acceptable limits for SNM (Special Nuclear Materials) accounting purposes is emphasized

  12. Path integral methods for primordial density perturbations - sampling of constrained Gaussian random fields

    International Nuclear Information System (INIS)

    Bertschinger, E.

    1987-01-01

    Path integrals may be used to describe the statistical properties of a random field such as the primordial density perturbation field. In this framework the probability distribution is given for a Gaussian random field subjected to constraints such as the presence of a protovoid or supercluster at a specific location in the initial conditions. An algorithm has been constructed for generating samples of a constrained Gaussian random field on a lattice using Monte Carlo techniques. The method makes possible a systematic study of the density field around peaks or other constrained regions in the biased galaxy formation scenario, and it is effective for generating initial conditions for N-body simulations with rare objects in the computational volume. 21 references
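
    On a small lattice, a constrained realization can also be produced directly with linear algebra (a Hoffman-Ribak-style correction of an unconstrained sample). The covariance model and the single point constraint below are illustrative assumptions; the paper's Monte Carlo path-integral sampling targets much larger fields where this direct construction is impractical.

```python
import numpy as np

rng = np.random.default_rng(4)

# 1-D Gaussian random field with a squared-exponential covariance.
n = 200
x = np.arange(n)
C = np.exp(-0.5 * ((x[:, None] - x[None, :]) / 10.0) ** 2)
L = np.linalg.cholesky(C + 1e-8 * np.eye(n))

# Linear constraint A f = c: fix the field value at the central site (a "peak").
A = np.zeros((1, n)); A[0, n // 2] = 1.0
c = np.array([3.0])

f_random = L @ rng.normal(size=n)                       # unconstrained realization
correction = C @ A.T @ np.linalg.solve(A @ C @ A.T, c - A @ f_random)
f_constrained = f_random + correction                   # constrained realization

print(f_constrained[n // 2])                            # -> 3.0, constraint satisfied
```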

  13. An efficient method of randomly sampling the coherent angular scatter distribution

    International Nuclear Information System (INIS)

    Williamson, J.F.; Morin, R.L.

    1983-01-01

    Monte Carlo simulations of photon transport phenomena require random selection of an interaction process at each collision site along the photon track. Possible choices are usually limited to photoelectric absorption and incoherent scatter as approximated by the Klein-Nishina distribution. A technique is described for sampling the coherent angular scatter distribution, for the benefit of workers in medical physics. (U.K.)
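
    For context, the brute-force alternative that such techniques improve upon is straightforward rejection sampling of the angular distribution. The sketch below samples the Thomson angular factor (1 + cos^2 theta), i.e. the coherent-scatter shape with the atomic form factor omitted, purely as an illustration and not as the method described in the paper.

```python
import numpy as np

def sample_cos_theta(rng: np.random.Generator, n: int) -> np.ndarray:
    """Rejection-sample cos(theta) from a density proportional to 1 + cos^2(theta)."""
    out = np.empty(0)
    while out.size < n:
        mu = rng.uniform(-1.0, 1.0, size=2 * n)          # candidate cos(theta)
        accept = rng.uniform(0.0, 2.0, size=2 * n) < (1.0 + mu ** 2)
        out = np.concatenate([out, mu[accept]])
    return out[:n]

mu = sample_cos_theta(np.random.default_rng(5), 100_000)
print(mu.mean(), (mu ** 2).mean())   # mean ~ 0, second moment ~ 0.4
```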

  14. Study on different pre-treatment procedures for metal determination in Orujo spirit samples by ICP-AES

    International Nuclear Information System (INIS)

    Barciela, Julia; Vilar, Manuela; Garcia-Martin, Sagrario; Pena, Rosa M.; Herrero, Carlos

    2008-01-01

    In this work several pre-treatment methods were studied for metal (Na, K, Mg, Cu and Ca) determination in Orujo spirit samples using inductively coupled plasma atomic emission spectrometry (ICP-AES). Dilution, digestion, evaporation, and cryogenic desolvatation techniques were comparatively evaluated. Because of their analytical characteristics, digestion and evaporation with nitrogen current were found to be appropriate procedures for the determination of metals in alcoholic spirit samples. Yet, if simplicity and application time are to be considered, the latter-evaporation in a water bath with a nitrogen current-stands out as the optimum procedure for any further determinations in Orujo samples by ICP-AES. Low detection levels and wide linear ranges (sufficient to determine these metals in the samples studied) were achieved for each metal. The recoveries (in the 97.5-100.5% range) and the precision (R.S.D. lower than 5.6%) obtained were also satisfactory. The selected procedure was applied to determine the content of metals in 80 representative Galician Orujo spirit samples with and without a Certified Brand of Origin (CBO) which had been produced using different distillation systems. The metal concentrations ranged between 0.37 and 79.7 mg L -1 for Na, -1 for K, 0.02-4.83 mg L -1 for Mg content, -1 for Cu and 0.03-13.10 mg L -1 for Ca

  15. Random vs. systematic sampling from administrative databases involving human subjects.

    Science.gov (United States)

    Hagino, C; Lo, R J

    1998-09-01

    Two sampling techniques, simple random sampling (SRS) and systematic sampling (SS), were compared to determine whether they yield similar and accurate distributions for the following four factors: age, gender, geographic location and years in practice. Any point estimate within 7 yr or 7 percentage points of its reference standard (SRS or the entire data set, i.e., the target population) was considered "acceptably similar" to the reference standard. The sampling frame was from the entire membership database of the Canadian Chiropractic Association. The two sampling methods were tested using eight different sample sizes of n (50, 100, 150, 200, 250, 300, 500, 800). From the profile/characteristics, summaries of four known factors [gender, average age, number (%) of chiropractors in each province and years in practice], between- and within-methods chi-square tests and unpaired t tests were performed to determine whether any of the differences [descriptively greater than 7% or 7 yr] were also statistically significant. The strengths of the agreements between the provincial distributions were quantified by calculating the percent agreements for each (provincial pairwise-comparison methods). Any percent agreement less than 70% was judged to be unacceptable. Our assessments of the two sampling methods (SRS and SS) for the different sample sizes tested suggest that SRS and SS yielded acceptably similar results. Both methods started to yield "correct" sample profiles at approximately the same sample size (n > 200). SS is not only convenient, it can be recommended for sampling from large databases in which the data are listed without any inherent order biases other than alphabetical listing by surname.
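
    The systematic selection compared above amounts to taking every k-th record after a random start from the ordered membership list; a minimal sketch follows (the list size and record names are placeholders).

```python
import numpy as np

def systematic_sample(records: list, n_sample: int, rng: np.random.Generator) -> list:
    """Systematic sample: every k-th record after a random start."""
    k = len(records) // n_sample                 # sampling interval
    start = int(rng.integers(k))
    return records[start::k][:n_sample]

members = [f"member_{i:05d}" for i in range(3800)]   # hypothetical alphabetized list
sample = systematic_sample(members, 300, np.random.default_rng(6))
print(len(sample), sample[:3])
```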

  16. RECOMMENDED OPERATING PROCEDURE NO. 56: COLLECTION OF GASEOUS GRAB SAMPLES FROM COMBUSTION SOURCES FOR NITROUS OXIDE MEASUREMENT

    Science.gov (United States)

    The document is a recommended operating procedure, prepared for use in research activities conducted by EPA's Air and Energy Engineering Research Laboratory (AEERL). The procedure applies to the collection of gaseous grab samples from fossil fuel combustion sources for subsequent a...

  17. New experimental procedure for measuring volume magnetostriction on powder samples

    International Nuclear Information System (INIS)

    Rivero, G.; Multigner, M.; Valdes, J.; Crespo, P.; Martinez, A.; Hernando, A.

    2005-01-01

    Conventional techniques used for volume magnetostriction measurements, such as the strain gauge or cantilever method, are very useful for ribbons or thin films but cannot be applied when the samples are in powder form. To overcome this problem a new experimental procedure has been developed. In this work, the experimental set-up is described, together with the results obtained in amorphous FeCuZr powders, which exhibit a strong dependence of the magnetization on the strength of the applied magnetic field. The magnetostriction measurements presented in this work point out that this dependence is related to a magnetovolume effect

  18. Influence of Freezing and Storage Procedure on Human Urine Samples in NMR-Based Metabolomics

    OpenAIRE

    Rist, Manuela; Muhle-Goll, Claudia; Görling, Benjamin; Bub, Achim; Heissler, Stefan; Watzl, Bernhard; Luy, Burkhard

    2013-01-01

    It is consensus in the metabolomics community that standardized protocols should be followed for sample handling, storage and analysis, as it is of utmost importance to maintain constant measurement conditions to identify subtle biological differences. The aim of this work, therefore, was to systematically investigate the influence of freezing procedures and storage temperatures and their effect on NMR spectra as a potentially disturbing aspect for NMR-based metabolomics studies. Urine sample...

  19. Additive non-uniform random sampling in superimposed fiber Bragg grating strain gauge

    International Nuclear Information System (INIS)

    Ma, Y C; Liu, H Y; Yan, S B; Li, J M; Tang, J; Yang, Y H; Yang, M W

    2013-01-01

    This paper demonstrates an additive non-uniform random sampling and interrogation method for dynamic and/or static strain gauge using a reflection spectrum from two superimposed fiber Bragg gratings (FBGs). The superimposed FBGs are designed to generate non-equidistant space of a sensing pulse train in the time domain during dynamic strain gauge. By combining centroid finding with smooth filtering methods, both the interrogation speed and accuracy are improved. A 1.9 kHz dynamic strain is measured by generating an additive non-uniform randomly distributed 2 kHz optical sensing pulse train from a mean 500 Hz triangular periodically changing scanning frequency. (paper)

  20. The Dirichlet-Multinomial Model for Multivariate Randomized Response Data and Small Samples

    Science.gov (United States)

    Avetisyan, Marianna; Fox, Jean-Paul

    2012-01-01

    In survey sampling the randomized response (RR) technique can be used to obtain truthful answers to sensitive questions. Although the individual answers are masked due to the RR technique, individual (sensitive) response rates can be estimated when observing multivariate response data. The beta-binomial model for binary RR data will be generalized…

  1. Study on different pre-treatment procedures for metal determination in Orujo spirit samples by ICP-AES

    Energy Technology Data Exchange (ETDEWEB)

    Barciela, Julia; Vilar, Manuela; Garcia-Martin, Sagrario [Departamento de Quimica Analitica, Nutricion y Bromatologia, Facultad de Ciencias, Universidad de Santiago de Compostela, Campus de Lugo, 27002 Lugo (Spain); Pena, Rosa M. [Departamento de Quimica Analitica, Nutricion y Bromatologia, Facultad de Ciencias, Universidad de Santiago de Compostela, Campus de Lugo, 27002 Lugo (Spain)], E-mail: qarosa@lugo.usc.es; Herrero, Carlos [Departamento de Quimica Analitica, Nutricion y Bromatologia, Facultad de Ciencias, Universidad de Santiago de Compostela, Campus de Lugo, 27002 Lugo (Spain)], E-mail: cherrero@lugo.usc.es

    2008-10-17

    In this work several pre-treatment methods were studied for metal (Na, K, Mg, Cu and Ca) determination in Orujo spirit samples using inductively coupled plasma atomic emission spectrometry (ICP-AES). Dilution, digestion, evaporation, and cryogenic desolvatation techniques were comparatively evaluated. Because of their analytical characteristics, digestion and evaporation with nitrogen current were found to be appropriate procedures for the determination of metals in alcoholic spirit samples. Yet, if simplicity and application time are to be considered, the latter-evaporation in a water bath with a nitrogen current-stands out as the optimum procedure for any further determinations in Orujo samples by ICP-AES. Low detection levels and wide linear ranges (sufficient to determine these metals in the samples studied) were achieved for each metal. The recoveries (in the 97.5-100.5% range) and the precision (R.S.D. lower than 5.6%) obtained were also satisfactory. The selected procedure was applied to determine the content of metals in 80 representative Galician Orujo spirit samples with and without a Certified Brand of Origin (CBO) which had been produced using different distillation systems. The metal concentrations ranged between 0.37 and 79.7 mg L-1 for Na,

  2. The MacDonald and savage titrimetric procedure for plutonium scaled-down to the milligram level: Automated procedure for routine analysis of safeguards samples containing 2 to 5 mg plutonium

    International Nuclear Information System (INIS)

    Ronesch, K.; Jammet, G.; Berger, J.; Doubek, N.; Bagliano, G.; Deron, S.

    1992-08-01

    A selective titrimetric procedure directly applicable to both input and product solutions from fast reactor fuel reprocessing was set up by MacDonald and Savage and scaled down to 3 mg of plutonium in sample aliquots at the request of the Safeguards Analytical Laboratory (SAL) of the International Atomic Energy Agency (IAEA), which needed to replace its silver (II) oxide titration procedure by a more selective electrochemical method. Although the procedure is very selective, the following species still interfere: vanadates (almost quantitatively) and neptunium (one electron exchange per mole); nitrites, fluorosilicates and iodates present in mg amounts yield slight biases. This paper describes the fully automated procedure presently applied in SAL for the routine determination of 2 to 5 mg plutonium dissolved in nitric acid solution. The method allows the unattended analysis of 20 aliquots within a five hour period. The equipment, including the reagent distribution system, the sample changer and the control units, is introduced and the principal design of the software is briefly described. Safety requirements have been addressed and are also reviewed in the report. Results obtained on standard reference materials, international intercomparison samples and actual safeguards samples routinely analyzed with the proposed procedure are presented and compared with results achieved with the semiautomatic mode to demonstrate the performance. International requirements to reduce the amount of radioactive materials in waste will certainly lead to a further reduction of the amount of plutonium in one aliquot. Some future perspectives on titrating 1 mg samples are presented in the discussion. 12 refs, 10 figs, 8 tabs

  3. CT-Guided Transgluteal Biopsy for Systematic Random Sampling of the Prostate in Patients Without Rectal Access.

    Science.gov (United States)

    Goenka, Ajit H; Remer, Erick M; Veniero, Joseph C; Thupili, Chakradhar R; Klein, Eric A

    2015-09-01

    The objective of our study was to review our experience with CT-guided transgluteal prostate biopsy in patients without rectal access. Twenty-one CT-guided transgluteal prostate biopsy procedures were performed in 16 men (mean age, 68 years; age range, 60-78 years) who were under conscious sedation. The mean prostate-specific antigen (PSA) value was 11.4 ng/mL (range, 2.3-39.4 ng/mL). Six had seven prior unsuccessful transperineal or transurethral biopsies. Biopsy results, complications, sedation time, and radiation dose were recorded. The mean PSA values and number of core specimens were compared between patients with malignant results and patients with nonmalignant results using the Student t test. The average procedural sedation time was 50.6 minutes (range, 15-90 minutes) (n = 20), and the mean effective radiation dose was 8.2 mSv (median, 6.6 mSv; range 3.6-19.3 mSv) (n = 13). Twenty of the 21 (95%) procedures were technically successful. The only complication was a single episode of gross hematuria and penile pain in one patient, which resolved spontaneously. Of 20 successful biopsies, 8 (40%) yielded adenocarcinoma (Gleason score: mean, 8; range, 7-9). Twelve biopsies yielded nonmalignant results (60%): high-grade prostatic intraepithelial neoplasia (n = 3) or benign prostatic tissue with or without inflammation (n = 9). Three patients had carcinoma diagnosed on subsequent biopsies (second biopsy, n = 2 patients; third biopsy, n = 1 patient). A malignant biopsy result was not significantly associated with the number of core specimens (p = 0.3) or the mean PSA value (p = 0.1). CT-guided transgluteal prostate biopsy is a safe and reliable technique for the systematic random sampling of the prostate in patients without a rectal access. In patients with initial negative biopsy results, repeat biopsy should be considered if there is a persistent rise in the PSA value.

  4. Flow, transport and diffusion in random geometries II: applications

    KAUST Repository

    Asinari, Pietro

    2015-01-07

    Multilevel Monte Carlo (MLMC) is an efficient and flexible solution for the propagation of uncertainties in complex models, where an explicit parametrization of the input randomness is not available or too expensive. We present several applications of our MLMC algorithm for flow, transport and diffusion in random heterogeneous materials. The absolute permeability and effective diffusivity (or formation factor) of micro-scale porous media samples are computed and the uncertainty related to the sampling procedures is studied. The algorithm is then extended to transport problems and multiphase flows for the estimation of dispersion and relative permeability curves. The impact of water drops on random structured surfaces, with microfluidics applications to self-cleaning materials, is also studied and simulated. Finally, the estimation of new drag correlation laws for poly-dispersed dilute and dense suspensions is presented.

  5. Flow, transport and diffusion in random geometries II: applications

    KAUST Repository

    Asinari, Pietro; Ceglia, Diego; Icardi, Matteo; Prudhomme, Serge; Tempone, Raul

    2015-01-01

    Multilevel Monte Carlo (MLMC) is an efficient and flexible solution for the propagation of uncertainties in complex models, where an explicit parametrization of the input randomness is not available or too expensive. We present several applications of our MLMC algorithm for flow, transport and diffusion in random heterogeneous materials. The absolute permeability and effective diffusivity (or formation factor) of micro-scale porous media samples are computed and the uncertainty related to the sampling procedures is studied. The algorithm is then extended to transport problems and multiphase flows for the estimation of dispersion and relative permeability curves. The impact of water drops on random structured surfaces, with microfluidics applications to self-cleaning materials, is also studied and simulated. Finally, the estimation of new drag correlation laws for poly-dispersed dilute and dense suspensions is presented.
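
    The two records above describe the same MLMC study. As a rough illustration of the estimator they rely on, and not of the porous-media solvers themselves, the sketch below applies the standard MLMC telescoping sum to a toy scalar model; the `model` function, its level-dependent bias and the per-level sample counts are all invented for illustration.

        import numpy as np

        def model(omega, level):
            # Toy stand-in for a PDE solve at discretization level `level`:
            # the O(h) term mimics the discretization bias that vanishes on fine levels.
            h = 2.0 ** (-level)
            return np.sin(omega) + h * omega

        def mlmc_estimate(max_level, n_per_level, rng):
            # Telescoping sum E[Q_L] = E[Q_0] + sum_{l >= 1} E[Q_l - Q_{l-1}].
            # Fine and coarse evaluations on each correction level share the same
            # random input, which keeps the level variances (and hence the cost) small.
            total = 0.0
            for level in range(max_level + 1):
                omega = rng.standard_normal(n_per_level[level])
                fine = model(omega, level)
                coarse = model(omega, level - 1) if level > 0 else 0.0
                total += np.mean(fine - coarse)
            return total

        rng = np.random.default_rng(0)
        # Many cheap samples on the coarse levels, few expensive samples on the fine ones.
        print(mlmc_estimate(max_level=4, n_per_level=[4000, 2000, 1000, 500, 250], rng=rng))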

  6. A procedure for the rapid determination of Pu isotopes and Am-241 in soil and sediment samples by alpha spectrometry

    International Nuclear Information System (INIS)

    2009-01-01

    Reliable, comparable and 'fit for purpose' results are an essential requirement for any decision based on analytical measurements. For the analyst, the availability of tested and validated analytical procedures is an extremely important tool for the production of such analytical measurements. For maximum utility, such procedures should be comprehensive, clearly formulated, and readily available to both the analyst and the customer for reference. Since 2004 the Environment Programme of the IAEA has included activities aimed at the development of a set of procedures for the determination of radionuclides in terrestrial environmental samples. In this report, a rapid procedure for the determination of Pu and Am radionuclides in soil and sediment samples is described that can be used in emergency situations. The method provides accurate and reliable results for the activity concentrations of elevated levels of 239,240 Pu, 238 Pu and 241 Am in soil and sediment samples over the course of 24 hours. The procedure has been validated in accordance with ISO guidelines.

  7. Theory of sampling and its application in tissue based diagnosis

    Directory of Open Access Journals (Sweden)

    Kayser Gian

    2009-02-01

    Full Text Available Abstract Background A general theory of sampling and its application in tissue based diagnosis is presented. Sampling is defined as extraction of information from certain limited spaces and its transformation into a statement or measure that is valid for the entire (reference) space. The procedure should be reproducible in time and space, i.e. give the same results when applied under similar circumstances. Sampling includes two different aspects, the procedure of sample selection and the efficiency of its performance. The practical performance of sample selection focuses on the search for the localization of specific compartments within the basic space, and on the search for the presence of specific compartments. Methods When a sampling procedure is applied in diagnostic processes two different procedures can be distinguished: (I) the evaluation of the diagnostic significance of a certain object, which is the probability that the object can be grouped into a certain diagnosis, and (II) the probability to detect these basic units. Sampling can be performed without or with external knowledge, such as size of searched objects, neighbourhood conditions, spatial distribution of objects, etc. If the sample size is much larger than the object size, the application of a translation invariant transformation results in Krige's formula, which is widely used in the search for ores. Usually, sampling is performed in a series of area (space) selections of identical size. The size can be defined in relation to the reference space or according to interspatial relationship. The first method is called random sampling, the second stratified sampling. Results Random sampling does not require knowledge about the reference space, and is used to estimate the number and size of objects. Estimated features include area (volume) fraction, numerical, boundary and surface densities. Stratified sampling requires the knowledge of objects (and their features) and evaluates spatial features in relation to

  8. Landslide Susceptibility Assessment Using Frequency Ratio Technique with Iterative Random Sampling

    Directory of Open Access Journals (Sweden)

    Hyun-Joo Oh

    2017-01-01

    Full Text Available This paper assesses the performance of landslide susceptibility analysis using the frequency ratio (FR) with iterative random sampling. A pair of before-and-after digital aerial photographs with 50 cm spatial resolution was used to detect landslide occurrences in the Yongin area, Korea. Iterative random sampling was run ten times in total, each time splitting the occurrences into training and validation datasets. Thirteen landslide causative factors were derived from topographic, soil, forest, and geological maps. The FR scores were calculated ten times from the causative factors and the training occurrences, and ten landslide susceptibility maps were obtained by integrating the causative factors weighted by their FR scores. Each susceptibility map was validated against the corresponding validation dataset. The FR method achieved susceptibility accuracies from 89.48% to 93.21%, in every run higher than 89%. Moreover, the tenfold iterative FR modeling may contribute to a better understanding of a regularized relationship between the causative factors and landslide susceptibility. This makes it possible to incorporate knowledge-driven considerations of the causative factors into the landslide susceptibility analysis, and the approach can also be extended to other areas.
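
    A minimal sketch of the frequency-ratio bookkeeping combined with an iterative random training/validation split, using synthetic data with a single causative factor; the factor classes, landslide labels and the crude validation score are illustrative stand-ins for the thirteen mapped factors and the accuracy measure used in the paper.

        import numpy as np

        rng = np.random.default_rng(42)

        # Synthetic stand-in for one binned causative factor (e.g. slope classes 0..4)
        # and a binary landslide-occurrence label per grid cell.
        n_cells = 10_000
        factor_class = rng.integers(0, 5, size=n_cells)
        landslide = rng.random(n_cells) < 0.02 * (factor_class + 1)   # steeper -> more slides

        def frequency_ratio(classes, slides):
            # FR(class) = (share of landslide cells in the class) / (share of all cells in the class)
            fr = {}
            for c in np.unique(classes):
                in_class = classes == c
                share_slides = slides[in_class].sum() / max(slides.sum(), 1)
                fr[int(c)] = share_slides / in_class.mean()
            return fr

        accuracies = []
        for run in range(10):                        # iterative random sampling, ten runs
            train = rng.random(n_cells) < 0.5        # random 50/50 training/validation split
            fr = frequency_ratio(factor_class[train], landslide[train])
            score = np.array([fr.get(int(c), 1.0) for c in factor_class])   # susceptibility index
            valid = ~train
            # Crude stand-in for the paper's validation: share of validation landslides
            # falling in the upper half of the susceptibility scores.
            threshold = np.median(score[valid])
            hits = ((score[valid] >= threshold) & landslide[valid]).sum()
            accuracies.append(hits / max(landslide[valid].sum(), 1))

        print([round(a, 3) for a in accuracies])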

  9. A DATABASE FOR TRACKING TOXICOGENOMIC SAMPLES AND PROCEDURES WITH GENOMIC, PROTEOMIC AND METABONOMIC COMPONENTS

    Science.gov (United States)

    A Database for Tracking Toxicogenomic Samples and Procedures with Genomic, Proteomic and Metabonomic Components Wenjun Bao1, Jennifer Fostel2, Michael D. Waters2, B. Alex Merrick2, Drew Ekman3, Mitchell Kostich4, Judith Schmid1, David Dix1Office of Research and Developmen...

  10. Evaluation of six sample preparation procedures for qualitative and quantitative proteomics analysis of milk fat globule membrane.

    Science.gov (United States)

    Yang, Yongxin; Anderson, Elizabeth; Zhang, Sheng

    2018-04-12

    Proteomic analysis of membrane proteins is challenged by the proteins' poor solubility and by the incompatibility of detergents with MS analysis. No single perfect protocol can be used to comprehensively characterize the proteome of a membrane fraction. Here, we used cow milk fat globule membrane (MFGM) proteome analysis to assess six sample preparation procedures, including one in-gel and five in-solution digestion approaches, prior to LC-MS/MS analysis. The largest numbers of MFGM proteins were identified by the suspension trapping (S-Trap) and filter-aided sample preparation (FASP) methods, followed by acetone precipitation without clean-up of tryptic peptides. The highest average protein coverage was achieved by the chloroform/MeOH, in-gel and S-Trap methods. The most distinct proteins were identified by the FASP method, followed by S-Trap. Analyses by Venn diagram, principal-component analysis, hierarchical clustering and the abundance ranking of quantified proteins highlight differences in the MFGM fraction across all sample preparation procedures. These results reveal the biased protein/peptide losses that occurred in each protocol. In this study, we found several novel proteins that were not observed previously by in-depth proteomics characterization of the MFGM fraction in milk. Thus, combining multiple sample preparation procedures with complementary properties was demonstrated to improve the protein sequence coverage and expression-level accuracy for membrane samples. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  11. PERSONAL NETWORK SAMPLING, OUTDEGREE ANALYSIS AND MULTILEVEL ANALYSIS - INTRODUCING THE NETWORK CONCEPT IN STUDIES OF HIDDEN POPULATIONS

    NARCIS (Netherlands)

    SPREEN, M; ZWAAGSTRA, R

    1994-01-01

    Populations, such as heroin and cocaine users, the homeless and the like (hidden populations), are among the most difficult populations to which to apply classic random sampling procedures. A frequently used data collection method for these hidden populations is the snowball procedure. The

  12. Efficient Unbiased Rendering using Enlightened Local Path Sampling

    DEFF Research Database (Denmark)

    Kristensen, Anders Wang

    The downside to using these algorithms is that they can be slow to converge. Due to the nature of Monte Carlo methods, the results are random variables subject to variance. This manifests itself as noise in the images, which can only be reduced by generating more samples. The reason these methods are slow is because of a lack of effective methods of importance sampling. Most global illumination algorithms are based on local path sampling, which is essentially a recipe for constructing random walks. Using this procedure, paths are built based on information given explicitly as part of the scene description [...] measurements, which are the solution to the adjoint light transport problem. The second is a representation of the distribution of radiance and importance in the scene. We also derive a new method of particle sampling, which is advantageous compared to existing methods. Together we call the resulting algorithm [...]

  13. Establishing a routine procedure for extraction of water from vegetation samples

    International Nuclear Information System (INIS)

    Varlam, Carmen; Stefanescu, Ioan; Faurescu, Ionut; Vagner, Irina; Faurescu, Denisa

    2008-01-01

    Full text: The Cryogenic Pilot is an experimental project within the nuclear energy national research program, which has the aim of developing technologies for tritium and deuterium separation by cryogenic distillation. The process used in this installation is based on a combined method of liquid-phase catalytic exchange (LPCE) and cryogenic distillation. There are two ways that the Cryogenic Pilot can interact with the environment: by atmospheric release and by sewage. In order to establish the baseline of tritium concentration in the environment around the nuclear facilities, we investigated the preparation procedure for different types of samples: soil, hay, apple, grass, milk, meat and water. For azeotropic distillation of all types of samples two solvents were used, toluene and cyclohexane, and all measurements for the determination of environmental tritium concentration were carried out using liquid scintillation counting (LSC) with the ultra-low-level liquid scintillation spectrometer Quantulus 1220, specially designed for environmental samples and low radioactivity. The sample to scintillation cocktail ratio was 8:12 ml and the liquid scintillation cocktail was UltimaGold LLT. The background determined for the prepared control samples ranged between 0.926 cpm and 1.002 cpm, and the counting efficiency between 25.3% and 26.1%. The counting time was 1000 minutes (50 minutes/20 cycles) for each sample, and the minimum detectable activity according to ISO 9698 was 8.9 TU and 9.05 TU, respectively, with a confidence coefficient of 3. (authors)

  14. A simple sample size formula for analysis of covariance in cluster randomized trials.

    NARCIS (Netherlands)

    Teerenstra, S.; Eldridge, S.; Graff, M.J.; Hoop, E. de; Borm, G.F.

    2012-01-01

    For cluster randomized trials with a continuous outcome, the sample size is often calculated as if an analysis of the outcomes at the end of the treatment period (follow-up scores) would be performed. However, often a baseline measurement of the outcome is available or feasible to obtain. An
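
    The record above is truncated. For orientation only, the sketch below shows the conventional calculation it starts from: an individually randomized sample size inflated by the usual cluster design effect 1 + (m - 1) * ICC for an analysis of follow-up scores. The baseline-adjusted (ANCOVA) formula derived in the cited paper is not reproduced here, and the numbers in the example are arbitrary.

        from math import ceil
        from scipy.stats import norm

        def cluster_rct_n_per_arm(delta, sd, icc, cluster_size, alpha=0.05, power=0.80):
            # Per-arm sample size for an individually randomized two-group comparison,
            # then inflated by the design effect 1 + (m - 1) * ICC for cluster randomization.
            z_a = norm.ppf(1 - alpha / 2)
            z_b = norm.ppf(power)
            n_individual = 2 * (z_a + z_b) ** 2 * (sd / delta) ** 2
            design_effect = 1 + (cluster_size - 1) * icc
            return ceil(n_individual * design_effect)

        # e.g. detect a 0.3 SD difference with ICC 0.05 and clusters of 20 subjects
        print(cluster_rct_n_per_arm(delta=0.3, sd=1.0, icc=0.05, cluster_size=20))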

  15. Recommended operating procedure number 56: Collection of gaseous grab samples from combustion sources for nitrous oxide measurement. Final report, Jan-Dec 91

    International Nuclear Information System (INIS)

    Ryan, J.V.; Karns, S.A.

    1992-07-01

    The document is a recommended operating procedure (ROP), prepared for use in research activities conducted by EPA's Air and Energy Engineering Research Laboratory (AEERL). The procedure applies to the collection of gaseous grab samples from fossil fuel combustion sources for subsequent analysis of nitrous oxide. The procedure details only the grab sampling methodology and associated equipment.

  16. Adrenal venous sampling: the learning curve of a single interventionalist with 282 consecutive procedures.

    Science.gov (United States)

    Jakobsson, Hugo; Farmaki, Katerina; Sakinis, Augustinas; Ehn, Olof; Johannsson, Gudmundur; Ragnarsson, Oskar

    2018-01-01

    Primary aldosteronism (PA) is a common cause of secondary hypertension. Adrenal venous sampling (AVS) is the gold standard for assessing the laterality of PA, which is of paramount importance for deciding on adequate treatment. AVS is a technically complicated procedure with success rates ranging between 30% and 96%. The aim of this study was to investigate the success rate of AVS over time, performed by a single interventionalist. This was a retrospective study based on consecutive AVS procedures performed by a single operator between September 2005 and June 2016. Data on serum concentrations of aldosterone and cortisol from the right and left adrenal veins, the inferior vena cava, and a peripheral vein were collected, and the selectivity index (SI) was calculated. Successful AVS was defined as SI > 5. In total, 282 AVS procedures were performed on 269 patients, 168 men (62%) and 101 women (38%), with a mean age of 55±11 years (range, 26-78 years). Out of 282 AVS procedures, 259 were successful, giving an overall success rate of 92%. The most common reason for failure was inability to localize the right adrenal vein (n=16; 76%). The success rates were 63%, 82%, and 94% during the first, second, and third years, respectively. During the last 8 years the success rate was 95%, and on average 27 procedures were performed annually. A satisfactory AVS success rate was achieved after approximately 36 procedures and was maintained by performing approximately 27 procedures annually. AVS should be limited to a few operators who perform a sufficiently large number of procedures to achieve, and maintain, a satisfactory success rate.

  17. Remarks on the sampling procedures for polycyclic aromatic hydrocarbons from the atmosphere

    Energy Technology Data Exchange (ETDEWEB)

    Tomingas, R [Duesseldorf Univ. (Germany, F.R.). Inst. fuer Lufthygiene und Silikoseforschung

    1979-01-01

    Despite all efforts to optimize the sampling procedure for polycyclic aromatic hydrocarbons (PAH), a part of the PAH is lost during the sampling period. The amount of loss depends on the vapour pressure, the air flow rate, the oxidants in the suspended matter, and the sampling time. The sampling time in turn is determined by the PAH concentration in the atmosphere and cannot be shortened indefinitely, at least not at the expense of the air flow rate. When the concentration of the PAH is extremely low, a repeated change of the filters is advisable. Losses of PAH occur even when filters are stored in the dark. An evaluation of these losses is difficult for many reasons. There is no simple relationship between the amount of loss and the storage time; the decrease of each PAH is different and seems to depend on the composition of the particulate matter collected on the filter. The disappearance of the PAH from the filter is a continual process; therefore, rapid performance of the analysis is essential.

  18. Procedures for field chemical analyses of water samples

    International Nuclear Information System (INIS)

    Korte, N.; Ealey, D.

    1983-12-01

    A successful water-quality monitoring program requires a clear understanding of appropriate measurement procedures in order to obtain reliable field data. It is imperative that the responsible personnel have a thorough knowledge of the limitations of the techniques being used. Unfortunately, there is a belief that field analyses are simple and straightforward. Yet, significant controversy as well as misuse of common measurement techniques abounds. This document describes procedures for field measurements of pH, carbonate and bicarbonate, specific conductance, dissolved oxygen, nitrate, Eh, and uranium. Each procedure section includes an extensive discussion regarding the limitations of the method as well as brief discussions of calibration procedures and available equipment. A key feature of these procedures is the consideration given to the ultimate use of the data. For example, if the data are to be used for geochemical modeling, more precautions are needed. In contrast, routine monitoring conducted merely to recognize gross changes can be accomplished with less effort. Finally, quality assurance documentation for each measurement is addressed in detail. Particular attention is given to recording sufficient information such that decisions concerning the quality of the data can be easily made. Application of the procedures and recommendations presented in this document should result in a uniform and credible water-quality monitoring program. 22 references, 4 figures, 3 tables

  19. Event-triggered synchronization for reaction-diffusion complex networks via random sampling

    Science.gov (United States)

    Dong, Tao; Wang, Aijuan; Zhu, Huiyun; Liao, Xiaofeng

    2018-04-01

    In this paper, the synchronization problem of reaction-diffusion complex networks (RDCNs) with Dirichlet boundary conditions is considered, where the data are sampled randomly. An event-triggered controller based on the sampled data is proposed, which can reduce the number of controller updates and the communication load. Under this strategy, the synchronization problem of the diffusion complex network is equivalently converted to the stability problem of a reaction-diffusion complex dynamical system with time delay. By using the matrix inequality technique and the Lyapunov method, synchronization conditions for the RDCNs are derived, which are dependent on the diffusion term. Moreover, it is found that the proposed control strategy naturally rules out Zeno behavior. Finally, a numerical example is given to verify the obtained results.

  20. Evaluation of a Class of Simple and Effective Uncertainty Methods for Sparse Samples of Random Variables and Functions

    Energy Technology Data Exchange (ETDEWEB)

    Romero, Vicente [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bonney, Matthew [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Schroeder, Benjamin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Weirs, V. Gregory [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-11-01

    When very few samples of a random quantity are available from a source distribution of unknown shape, it is usually not possible to accurately infer the exact distribution from which the data samples come. Under-estimation of important quantities such as response variance and failure probabilities can result. For many engineering purposes, including design and risk analysis, we attempt to avoid under-estimation with a strategy to conservatively estimate (bound) these types of quantities -- without being overly conservative -- when only a few samples of a random quantity are available from model predictions or replicate experiments. This report examines a class of related sparse-data uncertainty representation and inference approaches that are relatively simple, inexpensive, and effective. Tradeoffs between the methods' conservatism, reliability, and risk versus number of data samples (cost) are quantified with multi-attribute metrics used to assess method performance for conservative estimation of two representative quantities: the central 95% of the response; and the 10⁻⁴ probability of exceeding a response threshold in a tail of the distribution. Each method's performance is characterized with 10,000 random trials on a large number of diverse and challenging distributions. The best method and number of samples to use in a given circumstance depend on the uncertainty quantity to be estimated, the PDF character, and the desired reliability of bounding the true value. On the basis of this large database and study, a strategy is proposed for selecting the method and number of samples for attaining reasonable credibility levels in bounding these types of quantities when sparse samples of random variables or functions are available from experiments or simulations.
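
    None of the report's specific methods is reproduced here. As a generic example of the kind of calculation involved, the sketch below computes a one-sided normal-theory tolerance bound (the standard noncentral-t construction) that conservatively covers the 95th percentile from only five samples; it assumes approximate normality, which the report's distribution-sensitive comparisons do not presume.

        import numpy as np
        from scipy.stats import norm, nct

        def upper_tolerance_bound(samples, coverage=0.95, confidence=0.90):
            # Returns a value that, with the stated confidence, exceeds the `coverage`
            # quantile of the sampled distribution (one-sided normal tolerance limit).
            x = np.asarray(samples, dtype=float)
            n = x.size
            delta = norm.ppf(coverage) * np.sqrt(n)          # noncentrality parameter
            k = nct.ppf(confidence, df=n - 1, nc=delta) / np.sqrt(n)
            return x.mean() + k * x.std(ddof=1)

        rng = np.random.default_rng(1)
        few_samples = rng.normal(10.0, 2.0, size=5)          # only five model runs
        print(upper_tolerance_bound(few_samples))            # conservative 95th-percentile bound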

  1. 47 CFR 1.1602 - Designation for random selection.

    Science.gov (United States)

    2010-10-01

    47 CFR 1.1602 (2010-10-01): Designation for random selection. Telecommunication, FEDERAL COMMUNICATIONS COMMISSION, GENERAL PRACTICE AND PROCEDURE, Random Selection Procedures for Mass Media Services, General Procedures. § 1.1602 Designation for random selection...

  2. Infrared thermometry and the crop water stress index. II. Sampling procedures and interpretation

    International Nuclear Information System (INIS)

    Gardner, B.R.; Nielsen, D.C.; Shock, C.C.

    1992-01-01

    Infrared thermometry can be a valuable research and production tool for detecting and quantifying water stress in plants, as shown by a large volume of published research. Users of infrared thermometers (IRT) should be aware of the many equipment, environmental, and plant factors influencing canopy temperature measured by an IRT. The purpose of this paper is to describe factors influencing measured plant temperature, outline sampling procedures that will produce reliable Crop Water Stress Index (CWSI) values, and offer interpretations of CWSI and plant temperatures relative to crop production and other water stress parameters by reviewing previously conducted research. Factors that are considered are IRT condition, configuration, and position; psychrometer location; wind speed; solar radiation; time of day; leaf area and orientation; and appropriate non-water-stressed baseline equation. Standard sampling and CWSI calculation procedures are proposed. Use of CWSI with crops varying in type of response to water stress is described. Previously conducted research on plant temperatures or CWSI is tabulated by crop and water stress parameters measured. The paper provides valuable information to assist interested users of IRTs in making reliable water stress measurements. (author)

  3. Infrared thermometry and the crop water stress index. II. Sampling procedures and interpretation

    Energy Technology Data Exchange (ETDEWEB)

    Gardner, B. R. [BP Research, Cleveland, OH (United States); Nielsen, D. C.; Shock, C. C.

    1992-10-15

    Infrared thermometry can be a valuable research and production tool for detecting and quantifying water stress in plants, as shown by a large volume of published research. Users of infrared thermometers (IRT) should be aware of the many equipment, environmental, and plant factors influencing canopy temperature measured by an IRT. The purpose of this paper is to describe factors influencing measured plant temperature, outline sampling procedures that will produce reliable Crop Water Stress Index (CWSI) values, and offer interpretations of CWSI and plant temperatures relative to crop production and other water stress parameters by reviewing previously conducted research. Factors that are considered are IRT condition, configuration, and position; psychrometer location; wind speed; solar radiation; time of day; leaf area and orientation; and appropriate non-water-stressed baseline equation. Standard sampling and CWSI calculation procedures are proposed. Use of CWSI with crops varying in type of response to water stress is described. Previously conducted research on plant temperatures or CWSI is tabulated by crop and water stress parameters measured. The paper provides valuable information to assist interested users of IRTs in making reliable water stress measurements. (author)

  4. 47 CFR 1.1603 - Conduct of random selection.

    Science.gov (United States)

    2010-10-01

    47 CFR 1.1603 (2010-10-01): Conduct of random selection. Telecommunication, FEDERAL COMMUNICATIONS COMMISSION, GENERAL PRACTICE AND PROCEDURE, Random Selection Procedures for Mass Media Services, General Procedures. § 1.1603 Conduct of random selection. The...

  5. Comparative evaluation of the US Environmental Protection Agency's and the Oak Ridge Institute for Science and Education's environmental survey and site assessment program field sampling procedures

    International Nuclear Information System (INIS)

    Vitkus, T.J.; Bright, T.L.; Roberts, S.A.

    1997-10-01

    At the request of the U.S. Nuclear Regulatory Commission's (NRC's) Headquarters Office, the Environmental Survey and Site Assessment Program (ESSAP) of the Oak Ridge Institute for Science and Education (ORISE) compared the documented procedures that the U.S. Environmental Protection Agency (EPA) and ESSAP use for collecting environmental samples. The project objectives were to review both organizations' procedures applicable to collecting various sample matrices, compare the procedures for similarities and differences, and then to evaluate the reason for any identified procedural differences and their potential impact on ESSAP's sample data quality. The procedures reviewed included those for sampling surface and subsurface soil, surface and groundwater, vegetation, air, and removable surface contamination. ESSAP obtained copies of relevant EPA documents and reviewed and prepared a tabulated summary of each applicable procedure. The methods for collecting and handling each type of sample were evaluated for differences, and where these were identified, the significance and effect of the differences on analytical quality were determined. The results of the comparison showed that, overall, the procedures and methods that EPA and ESSAP use for sample collection are very similar. The minor differences noted were the result of restrictions or procedures necessary to ensure sample integrity and to prevent the introduction of interfering compounds when samples are to be analyzed for chemical parameters. For most radionuclide analyses, these additional procedures are not necessary. Another item noted was EPA's inclusion of steps that reduce the potential for sample cross-contamination by preparing (dressing) a location prior to collecting a sample or by removing a portion of a sample prior to containerization.

  6. Music Therapy as Procedural Support for Young Children Undergoing Immunizations: A Randomized Controlled Study.

    Science.gov (United States)

    Yinger, Olivia Swedberg

    2016-01-01

    Children undergoing routine immunizations frequently experience severe distress, which may be improved through music therapy as procedural support. The purpose of this study was to examine effects of live, cognitive-behavioral music therapy during immunizations on (a) the behaviors of children, their parents, and their nurses; and (b) parental perceptions. Participants were children between the ages of 4 and 6 years (N = 58) who underwent immunizations, their parents (N = 62), and the nurses who administered the procedure (N = 19). Parent/child dyads were randomly assigned to receive music therapy (n = 29) or standard care (n = 29) during their immunization. Afterward, each parent rated their child's level of pain and the distress their child experienced compared to previous medical experiences. All procedures were videotaped and later viewed by trained observers, who classified child, parent, and nurse behaviors using the categories of the Child-Adult Medical Procedure Interaction Scale-Revised (CAMPIS-R). Significant differences between the music therapy and control groups were found in rates of child coping and distress behaviors and parent distress-promoting behaviors. Parents of children who received music therapy reported that their child's level of distress was less than during previous medical experiences, whereas parents of children in the control group reported that their child's level of distress was greater. No significant differences between groups were found in parents' ratings of children's pain or in rates of nurse behavior. Live, cognitive-behavioral music therapy has potential benefits for young children and their parents during immunizations. © the American Music Therapy Association 2016. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  7. Lightweight link dimensioning using sFlow sampling

    DEFF Research Database (Denmark)

    de Oliviera Schmidt, Ricardo; Sadre, Ramin; Sperotto, Anna

    2013-01-01

    [...] not be trivial in high-speed links. Aiming at scalability, operators often deploy packet sampling for monitoring, but little is known about how it affects link dimensioning. In this paper we assess the feasibility of lightweight link dimensioning using sFlow, which is a widely-deployed traffic monitoring tool. We implement the sFlow sampling algorithm and use a previously proposed and validated dimensioning formula that needs the traffic variance. We validate our approach using packet captures from real networks. Results show that the proposed procedure is successful for a range of sampling rates and that, due to the randomness of the sampling algorithm, the error introduced by scaling the traffic variance yields more conservative results that cope with short-term traffic fluctuations.
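
    A rough sketch of the kind of calculation involved, assuming that the "dimensioning formula that needs the traffic variance" is the commonly used provisioning rule C(T, eps) = rho + (1/T) * sqrt(-2 ln(eps) * v(T)), with rho the mean rate and v(T) the variance of the traffic volume per interval of length T; the sampled trace, the 1-in-100 sampling rate and the naive scale-up of per-interval volumes are all invented, and the scale-up merely stands in for the variance-scaling step whose error the paper analyses.

        import numpy as np

        def required_capacity(bytes_per_interval, interval_s, overflow_prob=0.01):
            # C = rho + (1/T) * sqrt(-2 ln(eps) * v(T))  (assumed provisioning rule)
            a = np.asarray(bytes_per_interval, dtype=float)
            rho = a.mean() / interval_s                      # mean rate, bytes per second
            v = a.var(ddof=1)                                # variance of per-interval volume
            return rho + np.sqrt(-2.0 * np.log(overflow_prob) * v) / interval_s

        # Toy trace: per-interval byte counts observed after 1-in-100 packet sampling.
        rng = np.random.default_rng(7)
        sampled_counts = rng.poisson(5_000, size=600)        # bytes per 100 ms interval, sampled
        sampling_rate = 1 / 100
        estimated_counts = sampled_counts / sampling_rate    # naive scale-up of the volumes
        print(required_capacity(estimated_counts, interval_s=0.1))   # bytes per second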

  8. Randomization of inspections

    International Nuclear Information System (INIS)

    Markin, J.T.

    1989-01-01

    As the numbers and complexity of nuclear facilities increase, limitations on resources for international safeguards may restrict attainment of safeguards goals. One option for improving the efficiency of limited resources is to expand the current inspection regime to include random allocation of the amount and frequency of inspection effort to material strata or to facilities. This paper identifies the changes in safeguards policy, administrative procedures, and operational procedures that would be necessary to accommodate randomized inspections and identifies those situations where randomization can improve inspection efficiency and those situations where the current nonrandom inspections should be maintained. 9 refs., 1 tab

  9. Sampling and Low-Rank Tensor Approximation of the Response Surface

    KAUST Repository

    Litvinenko, Alexander; Matthies, Hermann Georg; El-Moselhy, Tarek A.

    2013-01-01

    Most (quasi)-Monte Carlo procedures can be seen as computing some integral over an often high-dimensional domain. If the integrand is expensive to evaluate - we are thinking of a stochastic PDE (SPDE) where the coefficients are random fields and the integrand is some functional of the PDE solution - there is the desire to keep all the samples for possible later computations of similar integrals. This obviously means a lot of data. To keep the storage demands low, and to allow evaluation of the integrand at points which were not sampled, we construct a low-rank tensor approximation of the integrand over the whole integration domain. This can also be viewed as a representation in some problem-dependent basis which allows a sparse representation. What one obtains is sometimes called a "surrogate" or "proxy" model, or a "response surface". This representation is built step by step or sample by sample, and can already be used for each new sample. In case we are sampling a solution of an SPDE, this allows us to reduce the number of necessary samples, namely in case the solution is already well-represented by the low-rank tensor approximation. This can be easily checked by evaluating the residuum of the PDE with the approximate solution. The procedure will be demonstrated in the computation of a compressible transonic Reynolds-averaged Navier-Stokes flow around an airfoil with random/uncertain data. © Springer-Verlag Berlin Heidelberg 2013.
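
    As a two-way (matrix) analogue of the tensor compression described above, the sketch below collects Monte Carlo solution snapshots of a cheap stand-in model and compresses them with a truncated SVD; the retained rank and reconstruction error illustrate the "surrogate built sample by sample" idea, but the incremental tensor formats and the PDE residual check of the paper are not reproduced, and the model is invented.

        import numpy as np

        rng = np.random.default_rng(3)

        def solve_model(theta, x):
            # Stand-in for an expensive PDE solve: returns a "solution field" on the grid x.
            return np.exp(-theta[0] * x) * np.sin(theta[1] * x)

        x = np.linspace(0.0, 1.0, 200)
        snapshots = []                                    # one column per Monte Carlo sample
        for _ in range(40):
            theta = rng.uniform([0.5, 1.0], [2.0, 6.0])   # random/uncertain input parameters
            snapshots.append(solve_model(theta, x))
        A = np.column_stack(snapshots)

        # Truncated SVD of the snapshot matrix = low-rank surrogate of the sampled responses.
        U, s, Vt = np.linalg.svd(A, full_matrices=False)
        energy = np.cumsum(s**2) / np.sum(s**2)
        rank = int(np.searchsorted(energy, 0.999)) + 1
        A_lowrank = (U[:, :rank] * s[:rank]) @ Vt[:rank, :]

        rel_err = np.linalg.norm(A - A_lowrank) / np.linalg.norm(A)
        print(rank, rel_err)     # a handful of modes usually captures the sampled responses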

  10. Sample size estimation and sampling techniques for selecting a representative sample

    Directory of Open Access Journals (Sweden)

    Aamir Omair

    2014-01-01

    Full Text Available Introduction: The purpose of this article is to provide a general understanding of the concepts of sampling as applied to health-related research. Sample Size Estimation: It is important to select a representative sample in quantitative research in order to be able to generalize the results to the target population. The sample should be of the required sample size and must be selected using an appropriate probability sampling technique. There are many hidden biases which can adversely affect the outcome of the study. Important factors to consider for estimating the sample size include the size of the study population, the confidence level, the expected proportion of the outcome variable (for categorical variables) or the standard deviation of the outcome variable (for numerical variables), and the required precision (margin of accuracy) from the study. The more precision required, the greater the required sample size. Sampling Techniques: The probability sampling techniques applied in health-related research include simple random sampling, systematic random sampling, stratified random sampling, cluster sampling, and multistage sampling. These are recommended over the nonprobability sampling techniques, because the results of the study can then be generalized to the target population.
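
    A minimal sketch of the standard sample-size arithmetic for estimating a proportion, of the kind the article describes: n = z^2 * p(1 - p) / d^2, with an optional finite-population correction; the prevalence, precision and confidence values in the example are illustrative.

        from math import ceil
        from scipy.stats import norm

        def sample_size_proportion(expected_p, margin, confidence=0.95, population=None):
            # n = z^2 * p(1-p) / d^2, optionally corrected for a finite population.
            z = norm.ppf(1 - (1 - confidence) / 2)
            n = z**2 * expected_p * (1 - expected_p) / margin**2
            if population is not None:                       # finite-population correction
                n = n / (1 + (n - 1) / population)
            return ceil(n)

        # e.g. expected prevalence 30%, +/-5% precision, 95% confidence
        print(sample_size_proportion(0.30, 0.05))            # about 323 subjects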

  11. A systematic examination of a random sampling strategy for source apportionment calculations.

    Science.gov (United States)

    Andersson, August

    2011-12-15

    Estimating the relative contributions from multiple potential sources of a specific component in a mixed environmental matrix is a general challenge in diverse fields such as atmospheric, environmental and earth sciences. Perhaps the most common strategy for tackling such problems is by setting up a system of linear equations for the fractional influence of different sources. Even though an algebraic solution of this approach is possible for the common situation with N+1 sources and N source markers, such methodology introduces a bias, since it is implicitly assumed that the calculated fractions and the corresponding uncertainties are independent of the variability of the source distributions. Here, a random sampling (RS) strategy for accounting for such statistical bias is examined by investigating rationally designed synthetic data sets. This random sampling methodology is found to be robust and accurate with respect to reproducibility and predictability. This method is also compared to a numerical integration solution for a two-source situation where source variability also is included. A general observation from this examination is that the variability of the source profiles not only affects the calculated precision but also the mean/median source contributions. Copyright © 2011 Elsevier B.V. All rights reserved.
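
    A small sketch of the strategy described above, for an invented three-source, two-marker mixing problem: each trial draws the source profiles from their assumed distributions, solves the mass-balance linear system (unit sum of fractions plus one equation per marker), and the retained solutions give the spread of the source contributions. All numbers are illustrative.

        import numpy as np

        rng = np.random.default_rng(0)

        # Source profiles: rows are sources, columns are markers (means and assumed spreads).
        mu = np.array([[0.1, 5.0],
                       [0.6, 2.0],
                       [0.9, 8.0]])
        sigma = 0.1 * mu
        observed = np.array([0.5, 4.0])        # marker values measured in the mixed sample

        fractions = []
        for _ in range(10_000):                # random sampling over source variability
            profiles = rng.normal(mu, sigma)
            # sum_i f_i = 1  and  sum_i f_i * profile[i, j] = observed[j]
            A = np.vstack([np.ones(3), profiles.T])
            b = np.concatenate([[1.0], observed])
            f = np.linalg.solve(A, b)
            if np.all(f >= 0):                 # keep only physically meaningful solutions
                fractions.append(f)

        fractions = np.array(fractions)
        print(fractions.mean(axis=0))                          # mean source fractions
        print(np.percentile(fractions, [5, 95], axis=0))       # spread induced by source variability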

  12. Quality-assurance procedures: Method 5G determination of particulate emissions from wood heaters from a dilution tunnel sampling location

    Energy Technology Data Exchange (ETDEWEB)

    Ward, T.E.; Hartman, M.W.; Olin, R.C.; Rives, G.D.

    1989-06-01

    Quality-assurance procedures are contained in this comprehensive document intended to be used as an aid for wood-heater manufacturers and testing laboratories in performing particulate matter sampling of wood heaters according to EPA protocol, Method 5G. These procedures may be used in research and development, and as an aid in auditing and certification testing. A detailed, step-by-step quality assurance guide is provided to aid in the procurement and assembly of testing apparatus, to clearly describe the procedures, and to facilitate data collection and reporting. Suggested data sheets are supplied that can be used as an aid for both recordkeeping and certification applications. Throughout the document, activity matrices are provided to serve as a summary reference. Checklists are also supplied that can be used by testing personnel. Finally, for the purposes of ensuring data quality, procedures are outlined for apparatus operation, maintenance, and traceability. These procedures combined with the detailed description of the sampling and analysis protocol will help ensure the accuracy and reliability of Method 5G emission-testing results.

  13. Laboratory manual on sample preparation procedures for x-ray micro-analysis

    International Nuclear Information System (INIS)

    1997-01-01

    X-ray microfluorescence is a non-destructive and sensitive method for studying the microscopic distribution of different elements in almost all kinds of samples. Since the beginning of this century, x-rays and electrons have been used for the analysis of many different kinds of material. Techniques which rely on electrons are mainly developed for microscopic studies and are used in conventional Electron Microscopy (EM) or Scanning Electron Microscopy (SEM), while x-rays are widely used for chemical analysis at the microscopic level. The first chemical analysis by fluorescence spectroscopy using small x-ray beams was conducted in 1928 by Glockner and Schreiber. Since then much work has been devoted to developing different types of optical systems for focusing an x-ray beam, but the efficiency of these systems is still inferior to that of conventional electron optical systems. However, even with a poor optical efficiency, the x-ray microbeam has many advantages compared with electron- or proton-induced x-ray emission methods: the analyses are non-destructive, losses of mass are negligible, and due to the low thermal loading of x-rays, materials which may be thermally degraded can be analysed; samples can be analysed in air, and no vacuum is required, so specimens with volatile components, such as water in biological samples, can be imaged at normal pressure and temperature; and no charging occurs during analysis, so coating of the sample with a conductive layer is not necessary. With these advantages, simpler sample preparation procedures, including mounting and preservation, can be used.

  14. PGDP [Paducah Gaseous Diffusion Plant]-UF6 handling, sampling, analysis and associated QC/QA and safety related procedures

    International Nuclear Information System (INIS)

    Harris, R.L.

    1987-01-01

    This document is a compilation of Paducah Gaseous Diffusion Plant procedures on UF 6 handling, sampling, and analysis, along with associated QC/QA and safety related procedures. It was assembled for transmission by the US Department of Energy to the Korean Advanced Energy Institute as a part of the US-Korea technical exchange program

  15. A Systematic Review of Surgical Randomized Controlled Trials: Part 2. Funding Source, Conflict of Interest, and Sample Size in Plastic Surgery.

    Science.gov (United States)

    Voineskos, Sophocles H; Coroneos, Christopher J; Ziolkowski, Natalia I; Kaur, Manraj N; Banfield, Laura; Meade, Maureen O; Chung, Kevin C; Thoma, Achilleas; Bhandari, Mohit

    2016-02-01

    The authors examined industry support, conflict of interest, and sample size in plastic surgery randomized controlled trials that compared surgical interventions. They hypothesized that industry-funded trials demonstrate statistically significant outcomes more often, and randomized controlled trials with small sample sizes report statistically significant results more frequently. An electronic search identified randomized controlled trials published between 2000 and 2013. Independent reviewers assessed manuscripts and performed data extraction. Funding source, conflict of interest, primary outcome direction, and sample size were examined. Chi-squared and independent-samples t tests were used in the analysis. The search identified 173 randomized controlled trials, of which 100 (58 percent) did not acknowledge funding status. A relationship between funding source and trial outcome direction was not observed. Both funding status and conflict of interest reporting improved over time. Only 24 percent (six of 25) of industry-funded randomized controlled trials reported authors to have independent control of data and manuscript contents. The mean number of patients randomized was 73 per trial (median, 43, minimum, 3, maximum, 936). Small trials were not found to be positive more often than large trials (p = 0.87). Randomized controlled trials with small sample size were common; however, this provides great opportunity for the field to engage in further collaboration and produce larger, more definitive trials. Reporting of trial funding and conflict of interest is historically poor, but it greatly improved over the study period. Underreporting at author and journal levels remains a limitation when assessing the relationship between funding source and trial outcomes. Improved reporting and manuscript control should be goals that both authors and journals can actively achieve.

  16. Distributed fiber sparse-wideband vibration sensing by sub-Nyquist additive random sampling

    Science.gov (United States)

    Zhang, Jingdong; Zheng, Hua; Zhu, Tao; Yin, Guolu; Liu, Min; Bai, Yongzhong; Qu, Dingrong; Qiu, Feng; Huang, Xianbing

    2018-05-01

    The round trip time of the light pulse limits the maximum detectable vibration frequency response range of phase-sensitive optical time domain reflectometry (φ-OTDR). Unlike the uniform laser pulse interval in conventional φ-OTDR, we randomly modulate the pulse interval, so that an equivalent sub-Nyquist additive random sampling (sNARS) is realized for every sensing point of the long interrogation fiber. For a φ-OTDR system with 10 km sensing length, the sNARS method is optimized by theoretical analysis and Monte Carlo simulation, and the experimental results verify that a wide-band sparse signal can be identified and reconstructed. Such a method can broaden the vibration frequency response range of φ-OTDR, which is of great significance in sparse-wideband-frequency vibration signal detection, such as rail track monitoring and metal defect detection.
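
    A toy illustration of why additive random sampling escapes the usual Nyquist limit, unrelated to the fiber-optic hardware above: a 40 Hz tone is sampled at a 25 Hz mean rate with independent, exponentially distributed gaps, and a Lomb-Scargle periodogram of the irregular samples still locates the tone. Signal, rates and noise level are invented.

        import numpy as np
        from scipy.signal import lombscargle

        rng = np.random.default_rng(5)

        f_sig = 40.0                                  # vibration frequency to detect, Hz
        mean_rate = 25.0                              # mean sampling rate, below 2 * f_sig
        n = 400

        # Additive random sampling: sample times are the cumulative sum of random gaps.
        gaps = rng.exponential(1.0 / mean_rate, size=n)
        t = np.cumsum(gaps)
        y = np.sin(2 * np.pi * f_sig * t) + 0.3 * rng.standard_normal(n)

        freqs = np.linspace(1.0, 100.0, 2000)         # Hz grid, far beyond the mean Nyquist rate
        power = lombscargle(t, y - y.mean(), 2 * np.pi * freqs)
        print(freqs[np.argmax(power)])                # peaks near 40 Hz despite the sub-Nyquist mean rate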

  17. Effect of Sucrose Analgesia, for Repeated Painful Procedures, on Short-term Neurobehavioral Outcome of Preterm Neonates: A Randomized Controlled Trial.

    Science.gov (United States)

    Banga, Shreshtha; Datta, Vikram; Rehan, Harmeet Singh; Bhakhri, Bhanu Kiran

    2016-04-01

    The safety of oral sucrose, a commonly used procedural analgesic in neonates, has been questioned. The objective was to evaluate the effect of sucrose analgesia, for repeated painful procedures, on the short-term neurobehavioral outcome of preterm neonates. Stable preterm neonates were randomized to receive either sucrose or distilled water orally, for every potentially painful procedure during the first 7 days after enrollment. Neurodevelopmental status at 40 weeks postconceptional age (PCA) was measured using the domains of the Neurobehavioral Assessment of Preterm Infants scale. A total of 93 newborns were analyzed. The baseline characteristics of the groups were comparable. No statistically significant difference was observed among the groups in the assessment at 40 weeks PCA. Use of sucrose analgesia, for repeated painful procedures on newborns, does not lead to any significant difference in the short-term neurobehavioral outcome. © The Author [2015]. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  18. Randomized random walk on a random walk

    International Nuclear Information System (INIS)

    Lee, P.A.

    1983-06-01

    This paper discusses generalizations of the model introduced by Kehr and Kunter of the random walk of a particle on a one-dimensional chain which in turn has been constructed by a random walk procedure. The superimposed random walk is randomised in time according to the occurrences of a stochastic point process. The probability of finding the particle in a particular position at a certain instant is obtained explicitly in the transform domain. It is found that the asymptotic behaviour for large time of the mean-square displacement of the particle depends critically on the assumed structure of the basic random walk, giving a diffusion-like term for an asymmetric walk or a square root law if the walk is symmetric. Many results are obtained in closed form for the Poisson process case, and these agree with those given previously by Kehr and Kunter. (author)
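
    A direct simulation sketch of the model: a substrate chain is first built by one random walk, and the particle then performs a symmetric random walk on the chain at the event times of a Poisson process (the randomization in time). The mean-square real-space displacement at a few observation times shows the slow, roughly square-root growth expected for the symmetric case; all parameters are illustrative.

        import numpy as np

        rng = np.random.default_rng(9)

        chain_len = 20_001
        # Substrate: chain site positions generated by a symmetric random walk.
        substrate = np.concatenate([[0], np.cumsum(rng.choice([-1, 1], size=chain_len - 1))])

        def mean_square_displacement(t_obs, rate=1.0, n_walkers=2000):
            # Each walker starts mid-chain and takes one symmetric step along the chain
            # at every event of a Poisson process with the given rate, up to time t_obs.
            mid = chain_len // 2
            msd = 0.0
            for _ in range(n_walkers):
                n_steps = rng.poisson(rate * t_obs)
                idx = mid + np.sum(rng.choice([-1, 1], size=n_steps))
                idx = int(np.clip(idx, 0, chain_len - 1))
                msd += (substrate[idx] - substrate[mid]) ** 2
            return msd / n_walkers

        for t_obs in (10, 100, 1000):
            print(t_obs, mean_square_displacement(t_obs))   # grows roughly like sqrt(t)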

  19. Sampling pig farms at the abattoir in a cross-sectional study − Evaluation of a sampling method

    DEFF Research Database (Denmark)

    Birkegård, Anna Camilla; Hisham Beshara Halasa, Tariq; Toft, Nils

    2017-01-01

    A cross-sectional study design is relatively inexpensive, fast and easy to conduct when compared to other study designs. Careful planning is essential to obtaining a representative sample of the population, and the recommended approach is to use simple random sampling from an exhaustive list of units in the target population. This approach is rarely feasible in practice, and other sampling procedures must often be adopted. For example, when slaughter pigs are the target population, sampling the pigs on the slaughter line may be an alternative to on-site sampling at a list of farms. However [...] slaughter pigs. It was not possible to visit farms within the designated timeframe. Therefore, it was decided to use convenience sampling at the abattoir. Our approach was carried out in three steps: 1) planning: using data from meat inspection to plan at which abattoirs and how many farms to sample; 2) [...]

  20. Randomized, Double-Blind, Placebo-Controlled Study on Decolonization Procedures for Methicillin-Resistant Staphylococcus aureus (MRSA among HIV-Infected Adults.

    Directory of Open Access Journals (Sweden)

    Amy Weintrob

    Full Text Available HIV-infected persons have an increased risk of MRSA colonization and skin and soft-tissue infections (SSTI). However, no large clinical trial has examined the utility of decolonization procedures in reducing MRSA colonization or infection among community-dwelling HIV-infected persons. 550 HIV-infected adults at four geographically diverse US military HIV clinics were prospectively screened for MRSA colonization at five body locations every 6 months during a 2-year period. Those colonized were randomized in a double-blind fashion to nasal mupirocin (Bactroban) twice daily and hexachlorophene (pHisoHex) soaps daily for 7 days, compared to placebos similar in appearance but without specific antibacterial activity. The primary endpoint was MRSA colonization at 6 months post-randomization; secondary endpoints were time to MRSA clearance, subsequent MRSA infections/SSTI, and predictors of MRSA clearance at the 6-month time point. Forty-nine (9%) HIV-infected persons were MRSA colonized and randomized. Among those with 6-month colonization data (80% of those randomized), 67% were negative for MRSA colonization in both groups (p = 1.0). Analyses accounting for missing 6-month data showed no significant differences could have been achieved. In the multivariate adjusted models, randomization group was not associated with 6-month MRSA clearance. The median time to MRSA clearance was similar in the treatment vs. placebo groups (1.4 vs. 1.8 months, p = 0.35). There was no difference in subsequent development of MRSA infections/SSTI (p = 0.89). In a multivariable model, treatment group, demographics, and HIV-specific factors were not predictive of MRSA clearance at the 6-month time point. A one-week decolonization procedure had no effect on MRSA colonization at the 6-month time point or on subsequent infection rates among community-dwelling HIV-infected persons. More aggressive or novel interventions may be needed to reduce the burden of MRSA in this population.

  1. Tobacco smoking surveillance: is quota sampling an efficient tool for monitoring national trends? A comparison with a random cross-sectional survey.

    Directory of Open Access Journals (Sweden)

    Romain Guignard

    Full Text Available OBJECTIVES: It is crucial for policy makers to monitor the evolution of tobacco smoking prevalence. In France, this monitoring is based on a series of cross-sectional general population surveys, the Health Barometers, conducted every five years and based on random samples. A methodological study has been carried out to assess the reliability of a monitoring system based on regular quota sampling surveys for smoking prevalence. DESIGN / OUTCOME MEASURES: In 2010, current and daily tobacco smoking prevalences obtained in a quota survey on 8,018 people were compared with those of the 2010 Health Barometer carried out on 27,653 people. Prevalences were assessed separately according to the telephone equipment of the interviewee (landline phone owner vs "mobile-only"), and logistic regressions were conducted in the pooled database to assess the impact of the telephone equipment and of the survey mode on the prevalences found. Finally, logistic regressions adjusted for sociodemographic characteristics were conducted in the random sample in order to determine the impact of the number of calls needed to interview "hard-to-reach" people on the prevalence found. RESULTS: Current and daily prevalences were higher in the random sample (33.9% and 27.5%, respectively, among 15-75 year-olds) than in the quota sample (30.2% and 25.3%, respectively). In both surveys, current and daily prevalences were lower among landline phone owners (31.8% and 25.5%, respectively, in the random sample; 28.9% and 24.0% in the quota survey). The required number of calls was slightly related to smoking status after adjustment for sociodemographic characteristics. CONCLUSION: Random sampling appears to be more effective than quota sampling, mainly by making it possible to interview hard-to-reach populations.

  2. Adverse Events With Ketamine Versus Ketofol for Procedural Sedation on Adults: A Double-blind, Randomized Controlled Trial.

    Science.gov (United States)

    Lemoel, Fabien; Contenti, Julie; Giolito, Didier; Boiffier, Mathieu; Rapp, Jocelyn; Istria, Jacques; Fournier, Marc; Ageron, François-Xavier; Levraut, Jacques

    2017-12-01

    The goal of our study was to compare the frequency and severity of recovery reactions between ketamine and a ketamine-propofol 1:1 admixture ("ketofol"). We performed a multicentric, randomized, double-blind trial in which adult patients received emergency procedural sedation with ketamine or ketofol. Our primary outcome was the proportion of unpleasant recovery reactions. Other outcomes were the frequency of interventions required by these recovery reactions, rates of respiratory or hemodynamic events, emesis, and satisfaction of patients as well as providers. A total of 152 patients completed the study, 76 in each arm. Compared with ketamine, ketofol determined a 22% reduction in the incidence of recovery reactions (p [...]). We found a significant reduction in recovery reaction and emesis frequencies among adult patients receiving emergency procedural sedation with ketofol, compared with ketamine. © 2017 by the Society for Academic Emergency Medicine.

  3. Quantifiers for randomness of chaotic pseudo-random number generators.

    Science.gov (United States)

    De Micco, L; Larrondo, H A; Plastino, A; Rosso, O A

    2009-08-28

    We deal with randomness quantifiers and concentrate on their ability to discern the hallmark of chaos in time series used in connection with pseudo-random number generators (PRNGs). Workers in the field are motivated to use chaotic maps for generating PRNGs because of the simplicity of their implementation. Although there exist very efficient general-purpose benchmarks for testing PRNGs, we feel that the analysis provided here sheds additional didactic light on the importance of the main statistical characteristics of a chaotic map, namely (i) its invariant measure and (ii) the mixing constant. This is of help in answering two questions that arise in applications: (i) which is the best PRNG among the available ones? and (ii) if a given PRNG turns out not to be good enough and a randomization procedure must still be applied to it, which is the best applicable randomization procedure? Our answer provides a comparative analysis of several quantifiers advanced in the extant literature.
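
    A minimal sketch of a chaotic-map PRNG plus one simple randomization step, not the quantifiers used in the paper: iterates of the fully chaotic logistic map are pushed through the map's known invariant-measure CDF so that the output is (approximately) uniform, intermediate iterates can be skipped to weaken the short-range correlations left by the finite mixing rate, and a plain histogram entropy serves as a crude stand-in quantifier.

        import numpy as np

        def logistic_prng(n, x0=0.63, skip=0):
            # Logistic map x -> 4 x (1 - x); y = (2/pi) arcsin(sqrt(x)) maps the map's
            # invariant measure to the uniform distribution on [0, 1).
            out = np.empty(n)
            x = x0
            for i in range(n):
                for _ in range(skip + 1):        # skipping iterates = simple randomization step
                    x = 4.0 * x * (1.0 - x)
                out[i] = x
            return (2.0 / np.pi) * np.arcsin(np.sqrt(out))

        def histogram_entropy(u, bins=64):
            # Entropy in bits; log2(bins) = 6 for a perfectly uniform output.
            counts, _ = np.histogram(u, bins=bins, range=(0.0, 1.0))
            p = counts[counts > 0] / counts.sum()
            return -np.sum(p * np.log2(p))

        u = logistic_prng(100_000, skip=3)
        print(histogram_entropy(u), np.log2(64))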

  4. A randomized clinical trial comparing cervical dysplasia treatment with cryotherapy vs loop electrosurgical excision procedure in HIV-seropositive women from Johannesburg, South Africa.

    Science.gov (United States)

    Smith, Jennifer S; Sanusi, Busola; Swarts, Avril; Faesen, Mark; Levin, Simon; Goeieman, Bridgette; Ramotshela, Sibongile; Rakhombe, Ntombiyenkosi; Williamson, Anna L; Michelow, Pam; Omar, Tanvier; Hudgens, Michael G; Firnhaber, Cynthia

    2017-08-01

    Mortality associated with cervical cancer is a public health concern for women, particularly in HIV-seropositive women in resource-limited countries. HIV-seropositive women are at a higher risk of high-grade cervical precancer, which can eventually progress to invasive carcinoma as compared to HIV-seronegative women. It is imperative to identify effective treatment methods for high-grade cervical precursors among HIV-seropositive women. Randomized controlled trial data are needed comparing cryotherapy vs loop electrosurgical excision procedure treatment efficacy in HIV-seropositive women. Our primary aim was to compare the difference in the efficacy of loop electrosurgical excision procedure vs cryotherapy for the treatment of high-grade cervical intraepithelial neoplasia (grade ≥2) among HIV-seropositive women by conducting a randomized clinical trial. HIV-seropositive women (n = 166) aged 18-65 years with histology-proven cervical intraepithelial neoplasia grade ≥2 were randomized (1:1) to cryotherapy or loop electrosurgical excision procedure treatment at a government hospital in Johannesburg. Treatment efficacy was compared using 6- and 12-month cumulative incidence posttreatment of: (1) cervical intraepithelial neoplasia grade ≥2; (2) secondary endpoints of histologic cervical intraepithelial neoplasia grade ≥3 and grade ≥1; and (3) high-grade and low-grade cervical cytology. The study was registered (ClinicalTrials.gov NCT01723956). From January 2010 through August 2014, 166 participants were randomized (86 loop electrosurgical excision procedure; 80 cryotherapy). Cumulative cervical intraepithelial neoplasia grade ≥2 incidence was higher for cryotherapy (24.3%; 95% confidence interval, 16.1-35.8) than loop electrosurgical excision procedure at 6 months (10.8%; 95% confidence interval, 5.7-19.8) (P = .02), although by 12 months, the difference was not significant (27.2%; 95% confidence interval, 18.5-38.9 vs 18.5%; 95% confidence interval, 11

  5. What Is the Outcome of an Incision and Drainage Procedure in Endodontic Patients? A Prospective, Randomized, Single-blind Study.

    Science.gov (United States)

    Beus, Hannah; Fowler, Sara; Drum, Melissa; Reader, Al; Nusstein, John; Beck, Mike; Jatana, Courtney

    2018-02-01

    There are no prospective endodontic studies to determine the outcome of an incision and drainage (I&D) procedure for swelling in healthy, endodontic patients. The purpose of this prospective, randomized, single-blind study was to compare the postoperative course of I&D with drain placement versus a mock I&D procedure with mock drain placement after endodontic debridement in swollen emergency patients with symptomatic teeth and a pulpal diagnosis of necrosis. Eighty-one adult emergency patients presenting with clinical swelling received either penicillin or, if allergic, clindamycin and complete endodontic debridement, and then were randomly divided into 2 treatment groups: I&D with drain placement or a mock I&D procedure with mock drain placement. At the end of the appointment, all patients received a combination of ibuprofen/acetaminophen and, if needed, an opioid-containing escape medication. Patients recorded their pain and medication use for 4 days postoperatively. Success was defined as no or mild postoperative pain and no use of an opioid-containing escape medication. Success was evaluated using repeated measure mixed model logistic regression. Both groups had a decrease in postoperative pain and medication use over the 4 days. The mock I&D group had significantly higher success than the I&D group (odds ratio = 2.00; 95% confidence interval, 1.16-3.41). The success rate was 45% with the mock I&D and 33% with the I&D. After endodontic debridement, patients who received a mock I&D procedure with mock drain placement had more success than patients who received I&D with drain placement. Both groups clinically improved over 4 days. Copyright © 2017 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.

  6. Generalized Dynamic Panel Data Models with Random Effects for Cross-Section and Time

    NARCIS (Netherlands)

    Mesters, G.; Koopman, S.J.

    2014-01-01

    An exact maximum likelihood method is developed for the estimation of parameters in a nonlinear non-Gaussian dynamic panel data model with unobserved random individual-specific and time-varying effects. We propose an estimation procedure based on the importance sampling technique. In particular, a

  7. Iterative algorithm of discrete Fourier transform for processing randomly sampled NMR data sets

    International Nuclear Information System (INIS)

    Stanek, Jan; Kozminski, Wiktor

    2010-01-01

    Spectra obtained by application of multidimensional Fourier Transformation (MFT) to sparsely sampled nD NMR signals are usually corrupted due to missing data. In the present paper this phenomenon is investigated in simulations and experiments. An effective iterative algorithm for artifact suppression for sparse on-grid NMR data sets is discussed in detail. It includes automated peak recognition based on statistical methods. The results enable one to study NMR spectra with a high dynamic range of peak intensities while preserving the benefits of random sampling, namely the superior resolution in indirectly measured dimensions. Experimental examples include 3D 15N- and 13C-edited NOESY-HSQC spectra of human ubiquitin.
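    The authors' algorithm is not reproduced here; as a rough one-dimensional stand-in for the general idea (zero-filling randomly sampled on-grid data, transforming, and iteratively identifying and subtracting the strongest peaks to suppress sampling artifacts), the following sketch uses a synthetic three-component signal; all grid sizes, frequencies, amplitudes and thresholds are hypothetical.

      import numpy as np

      rng = np.random.default_rng(0)
      N, M = 512, 128                      # full grid size, number of sampled points
      true_freqs = [40, 150, 300]          # frequency bins of the simulated peaks
      true_amps  = [1.0, 0.6, 0.3]

      t = np.arange(N)
      signal = sum(a * np.exp(2j * np.pi * f * t / N) for f, a in zip(true_freqs, true_amps))

      sampled_idx = np.sort(rng.choice(N, size=M, replace=False))
      data = signal[sampled_idx]           # sparsely (randomly) sampled signal

      residual = data.copy()
      components = {}                      # recovered {frequency bin: amplitude}
      for _ in range(8):                   # a few peak-pick-and-subtract iterations
          zero_filled = np.zeros(N, dtype=complex)
          zero_filled[sampled_idx] = residual
          spectrum = np.fft.fft(zero_filled) / M      # normalize by number of samples
          k = int(np.argmax(np.abs(spectrum)))        # strongest remaining peak
          amp = spectrum[k]
          if abs(amp) < 0.1:               # stop when residual peaks fall below a threshold
              break
          components[k] = components.get(k, 0) + amp
          residual -= amp * np.exp(2j * np.pi * k * sampled_idx / N)   # remove its sampled contribution

      print("recovered components:", {k: round(abs(v), 2) for k, v in sorted(components.items())})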

  8. Recommended procedures for performance testing of radiobioassay laboratories: Volume 2, In vitro samples

    International Nuclear Information System (INIS)

    Fenrick, H.W.; MacLellan, J.A.

    1988-11-01

    Draft American National Standards Institute (ANSI) Standard N13.30 (Performance Criteria for Radiobioassay) was developed for the US Department of Energy and the US Nuclear Regulatory Commission to help ensure that bioassay laboratories provide accurate and consistent results. The draft standard specifies the criteria for defining the procedures necessary to establish a bioassay performance-testing laboratory and program. The bioassay testing laboratory will conduct tests to evaluate the performance of service laboratories. Pacific Northwest Laboratory helped develop testing procedures as part of an effort to evaluate the performance criteria by testing the existing measurement capabilities of various bioassay laboratories. This report recommends guidelines for the preparation, handling, storage, distribution, shipping, and documentation of in vitro test samples (artificial urine and fecal matter) for indirect bioassay. The data base and recommended records system for documenting radiobioassay performance at the service laboratories are also presented. 8 refs., 3 tabs

  9. Comparison of hydrocolloid with conventional gauze dressing in prevention of wound infection after clean surgical procedures

    International Nuclear Information System (INIS)

    Khalique, M.S.; Shukr, I.; Khalique, A.B.

    2014-01-01

    To compare hydrocolloid with conventional gauze dressing in prevention of infections after clean surgical procedures. Study Design: Randomized controlled trial. Place and Duration of Study: Department of Surgery, CMH Rawalpindi, from 22 Jan 2010 to 22 Aug 2010. Patients and Methods: A total of 400 patients undergoing clean surgical procedures were randomly allocated to two equal groups, A and B, by lottery method. In group A, simple gauze dressing was applied after clean surgical procedures, while in group B hydrocolloid dressing was used. On the 7th postoperative day, patients were observed for presence of infection. Results: Mean age of the sample was 42.08 ± 11.112 years. In group A, 14 of 200 patients (7.0%) developed infection postoperatively, while in group B 10 (5%) did (p=0.709). Conclusion: There is no difference in the rate of infection when using a gauze dressing or a hydrocolloid dressing after clean surgical procedures. (author)

  10. Selection of blood sampling times for determination of 51Cr-EDTA clearance in a screening procedure

    International Nuclear Information System (INIS)

    Gullquist, R.; Askergren, A.; Brandt, R.; Silk, B.; Strandell, T.; Huddinge University Hospital

    1983-01-01

    In a group of 44 construction workers, various blood sampling protocols were compared with regard to variability of the 51Cr-EDTA clearance on repeated determinations. A comparison was also made between the different blood sampling protocols and a reference method using Simpson's formula in the area calculation. A double slope method lasting two and a half hours was finally chosen and suggested as a screening procedure in an industrial environment, with blood sampling at 5, 15, 90, 120, 135 and 150 minutes after injection and with the patient resting in a semirecumbent position. (orig.)

  11. Estimation of Finite Population Mean in Multivariate Stratified Sampling under Cost Function Using Goal Programming

    Directory of Open Access Journals (Sweden)

    Atta Ullah

    2014-01-01

    In the practical use of a stratified random sampling scheme, the investigator faces the problem of selecting a sample that maximizes the precision of the estimate of a finite population mean under a cost constraint. The allocation of sample sizes becomes complicated when more than one characteristic is observed on each selected unit. In many real-life situations, a linear cost function of the stratum sample size nh is not a good approximation to the actual cost of a sample survey when the traveling cost between selected units in a stratum is significant. In this paper, the sample allocation problem in multivariate stratified random sampling with the proposed cost function is formulated as an integer nonlinear multiobjective mathematical program. A solution procedure is proposed using an extended lexicographic goal programming approach. A numerical example is presented to illustrate the computational details and to compare the efficiency of the proposed compromise allocation.
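    The paper's goal-programming formulation is not reproduced here; for orientation only, the sketch below applies the classical single-characteristic optimum allocation under a linear cost function, n_h proportional to N_h*S_h/sqrt(c_h), with hypothetical stratum sizes, standard deviations and per-unit costs.

      import math

      # Illustrative stratum data (hypothetical): sizes N_h, standard deviations S_h,
      # and per-unit travel/measurement costs c_h.
      N = [400, 300, 300]
      S = [12.0, 7.0, 4.0]
      c = [4.0, 2.0, 1.0]
      budget = 600.0            # total available cost, in the same units as c_h

      # Classical optimum allocation with a linear cost function C = sum(c_h * n_h):
      #   n_h proportional to N_h * S_h / sqrt(c_h)
      weights = [Nh * Sh / math.sqrt(ch) for Nh, Sh, ch in zip(N, S, c)]
      scale = budget / sum(wh * ch for wh, ch in zip(weights, c))
      n = [max(2, round(scale * wh)) for wh in weights]   # at least 2 units per stratum

      # Rounding can push the realized cost slightly above or below the budget.
      print("allocation:", n, " total cost:", sum(nh * ch for nh, ch in zip(n, c)))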

  12. A Randomization Procedure for "Trickle-Process" Evaluations

    Science.gov (United States)

    Goldman, Jerry

    1977-01-01

    This note suggests a solution to the problem of achieving randomization in experimental settings where units deemed eligible for treatment "trickle in," that is, appear at any time. The solution permits replication of the experiment in order to test for time-dependent effects. (Author/CTM)

  13. Computer code ENDSAM for random sampling and validation of the resonance parameters covariance matrices of some major nuclear data libraries

    International Nuclear Information System (INIS)

    Plevnik, Lucijan; Žerovnik, Gašper

    2016-01-01

    Highlights: • Methods for random sampling of correlated parameters. • Link to open-source code for sampling of resonance parameters in ENDF-6 format. • Validation of the code on realistic and artificial data. • Validation of covariances in three major contemporary nuclear data libraries. - Abstract: Methods for random sampling of correlated parameters are presented. The methods are implemented for sampling of resonance parameters in ENDF-6 format and a link to the open-source code ENDSAM is given. The code has been validated on realistic data. Additionally, consistency of covariances of resonance parameters of three major contemporary nuclear data libraries (JEFF-3.2, ENDF/B-VII.1 and JENDL-4.0u2) has been checked.
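    ENDSAM itself is not shown here; the standard building block that such sampling codes rely on, drawing correlated samples from a mean vector and covariance matrix via a Cholesky factorization, can be sketched as follows (the parameter values are hypothetical and are not resonance parameters from any library).

      import numpy as np

      def sample_correlated(mean, cov, n_samples, rng=None):
          """Draw n_samples vectors from N(mean, cov) using a Cholesky factorization.

          If z ~ N(0, I) and cov = L L^T, then mean + L z has covariance cov.
          """
          rng = np.random.default_rng(rng)
          mean = np.asarray(mean, dtype=float)
          L = np.linalg.cholesky(np.asarray(cov, dtype=float))
          z = rng.standard_normal((n_samples, mean.size))
          return mean + z @ L.T

      # Hypothetical pair of correlated parameters: mean vector and covariance matrix.
      mean = [2.5, 0.10]
      cov = [[0.04, 0.003],
             [0.003, 0.0009]]
      samples = sample_correlated(mean, cov, 10_000, rng=1)
      print("sample covariance:\n", np.cov(samples, rowvar=False))   # should approximate cov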

  14. Intravenous ketamine is as effective as midazolam/fentanyl for procedural sedation and analgesia in the emergency department.

    Science.gov (United States)

    Jamal, S M; Fathil, S M; Nidzwani, M M; Ismail, A K; Yatim, F M

    2011-08-01

    The study compared the effectiveness of ketamine and midazolam/fentanyl as procedural sedation and analgesia agents for reduction of fractures and dislocated joints. Forty-one adult patients were enrolled by convenience sampling. They were randomized to receive ketamine or midazolam/fentanyl. Depth of sedation, pain score, procedural outcome and memory of the procedure were documented. The ketamine group had deeper sedation, but there was no statistical difference in other variables between the two groups. Three patients in the midazolam/fentanyl group had oxygen desaturation. More adverse effects were associated with ketamine. Intravenous ketamine is as effective as midazolam/fentanyl for procedural sedation.

  15. A variation of the housing unit method for estimating the age and gender distribution of small, rural areas: A case study of the local expert procedure

    International Nuclear Information System (INIS)

    Carlson, J.F.; Roe, L.K.; Williams, C.A.; Swanson, D.A.

    1993-01-01

    This paper describes the methodologies used in the development of a demographic database established in support of the Yucca Mountain Site Characterization Project Radiological Monitoring Plan (RadMP). It also examines the suitability of a survey-based procedure for estimating population in small, rural areas. The procedure is a variation of the Housing Unit Method. It employs local experts enlisted to provide information about the demographic characteristics of households randomly selected from residential unit sampling frames developed from utility records. The procedure is nonintrusive and less costly than traditional survey data collection efforts. Because the procedure is based on random sampling, confidence intervals can be constructed around the population estimated by the technique. The results of a case study are provided in which the total population, and the age and gender distribution of the population, are estimated for three unincorporated communities in rural southern Nevada.
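    The RadMP procedure itself is not reproduced here; the sketch below only illustrates how a population total and a normal-approximation confidence interval follow from a simple random sample of housing units drawn from a frame of known size (all numbers are hypothetical).

      import math
      import random

      random.seed(3)

      # Hypothetical frame of residential units drawn from utility records.
      frame_size = 800
      true_occupants = [random.choice([0, 1, 2, 2, 3, 3, 4, 5]) for _ in range(frame_size)]

      n = 120
      sample_idx = random.sample(range(frame_size), n)       # simple random sample of units
      y = [true_occupants[i] for i in sample_idx]            # occupants reported for sampled units

      mean = sum(y) / n
      var = sum((v - mean) ** 2 for v in y) / (n - 1)
      total_hat = frame_size * mean                          # estimated community population
      # Standard error of the estimated total, with finite population correction.
      se = frame_size * math.sqrt((1 - n / frame_size) * var / n)
      print(f"estimated population: {total_hat:.0f} +/- {1.96 * se:.0f} (95% CI)")
      print("true population:", sum(true_occupants))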

  16. A pre-concentration procedure using coprecipitation for determination of lead and iron in several samples using flame atomic absorption spectrometry

    International Nuclear Information System (INIS)

    Saracoglu, S.; Soylak, M.; Peker, D.S. Kacar; Elci, L.; Santos, W.N.L. dos; Lemos, V.A.; Ferreira, S.L.C.

    2006-01-01

    The present paper proposes a pre-concentration procedure for determination of lead and iron in several samples by flame atomic absorption spectrometry. In it, lead(II) and iron(III) ions are coprecipitated using the violuric acid-copper(II) system as collector. Afterwards, the precipitate is dissolved with 1 M HNO3 solution and the metal ions are determined. The optimization step was performed using a factorial design involving the variables pH, violuric acid mass (VA) and copper concentration (Cu). Using the optimized experimental conditions, the proposed procedure allows the determination of these metals with detection limits of 0.18 μg L-1 for iron and 0.16 μg L-1 for lead. The effects of foreign ions on the pre-concentration procedure were also evaluated, and the results demonstrated that this method could be applied for determination of iron and lead in several real samples. The proposed method was successfully applied to the analysis of seawater, urine, mineral water, soil and physiological solution samples. The concentrations of lead and iron achieved in these samples agree well with other data reported in the literature.

  17. Sample handling and chemical procedures for efficacious trace analysis of urine by neutron activation analysis

    International Nuclear Information System (INIS)

    Blotcky, A.J.; Rack, E.P.; Roman, F.R.

    1988-01-01

    Important for the determination of trace elements, ions, or compounds in urine by chemical neutron activation analysis is the optimization of sample handling, preirradiation chemistry, and radioassay procedures necessary for viable analysis. Each element, because of its natural abundance in the earth's crust and, hence, its potential for reagent and environmental contamination, requires specific procedures for storage, handling, and preirradiation chemistry. Radioassay techniques for radionuclides vary depending on their half-lives and decay characteristics. Described in this paper are optimized procedures for aluminum and selenium. While 28Al (T1/2 = 2.24 min) and 77mSe (T1/2 = 17.4 s) have short half-lives, their gamma-ray spectra are quite different. Aluminum-28 decays by a 1779-keV gamma and 77mSe by a 162-keV gamma. Unlike selenium, aluminum is a ubiquitous element in the environment, requiring special handling to minimize contamination in all phases of its analytical determination.

  18. TVT-Secur (Hammock) versus TVT-Obturator: a randomized trial of suburethral sling operative procedures.

    Science.gov (United States)

    Hota, Lekha S; Hanaway, Katherine; Hacker, Michele R; Disciullo, Anthony; Elkadry, Eman; Dramitinos, Patricia; Shapiro, Alexander; Ferzandi, Tanaz; Rosenblatt, Peter L

    2012-01-01

    This study aimed to compare TVT-Secur (TVT-S) and TVT-Obturator (TVT-O) suburethral slings for treatment of stress urinary incontinence (SUI). This was a single-center, nonblinded, randomized trial of women with SUI who were randomized to TVT-S or TVT-O from May 2007 to April 2009. The primary outcome, SUI on cough stress test (CST), and quality-of-life and symptom questionnaires (Pelvic Floor Distress Inventory [PFDI-20] and Pelvic Floor Impact Questionnaire [PFIQ-7]) were assessed at 12 weeks and 1 year. Forty-three women were randomized to TVT-S and 44 to TVT-O. There were no differences in median baseline PFDI-20 and PFIQ-7. Twenty-two (52.4%) of 42 participants randomized to TVT-S had a positive CST result at evaluation after 12 weeks or 1 year, whereas 4 (9.1%) of the 44 in the TVT-O group had a positive CST result. The intent-to-treat analysis showed that the risk of a positive CST result was 6 times higher after TVT-S than TVT-O (risk ratio, 6.0; 95% confidence interval [CI], 2.3-16.0). Among women not lost to follow-up, the risk ratio for a positive CST result after TVT-S compared with TVT-O was 17.9 (95% CI, 2.5-128.0) at 12 weeks and 3.5 (95% CI, 1.1-11.0) at 1 year. Both TVT-S and TVT-O resulted in improved quality of life and symptoms at 12 weeks. There was no difference between the groups for PFDI-20 (P = 0.40) or PFIQ-7 (P = 0.43). A similar pattern was seen at 1 year (P = 0.85 and P = 0.36). The TVT-S seems to have a higher risk of positive postoperative CST result; however, the procedures result in similar improvements in quality of life and symptoms.

  19. Autoshaping, random control, and omission training in the rat.

    Science.gov (United States)

    Locurto, C; Terrace, H S; Gibbon, J

    1976-11-01

    The role of the stimulus-reinforcer contingency in the development and maintenance of lever contact responding was studied in hooded rats. In Experiment I, three groups of experimentally naive rats were trained either on autoshaping, omission training, or a random-control procedure. Subjects trained by the autoshaping procedure responded more consistently than did either random-control or omission-trained subjects. The probability of at least one lever contact per trial was slightly higher in subjects trained by the omission procedure than by the random-control procedure. However, these differences were not maintained during extended training, nor were they evident in total lever-contact frequencies. When omission and random-control subjects were switched to the autoshaping condition, lever contacts increased in all animals, but a pronounced retardation was observed in omission subjects relative to the random-control subjects. In addition, subjects originally exposed to the random-control procedure, and later switched to autoshaping, acquired more rapidly than naive subjects that were exposed only on the autoshaping procedure. In Experiment II, subjects originally trained by an autoshaping procedure were exposed either to an omission, a random-control, or an extinction procedure. No differences were observed among the groups either in the rate at which lever contacts decreased or in the frequency of lever contacts at the end of training. These data implicate prior experience in the interpretation of omission-training effects and suggest limitations in the influence of stimulus-reinforcer relations in autoshaping.

  20. Improving ambulatory saliva-sampling compliance in pregnant women: a randomized controlled study.

    Directory of Open Access Journals (Sweden)

    Julian Moeller

    OBJECTIVE: Noncompliance with scheduled ambulatory saliva sampling is common and has been associated with biased cortisol estimates in nonpregnant subjects. This study is the first to investigate in pregnant women strategies to improve ambulatory saliva-sampling compliance, and the association between sampling noncompliance and saliva cortisol estimates. METHODS: We instructed 64 pregnant women to collect eight scheduled saliva samples on two consecutive days each. Objective compliance with scheduled sampling times was assessed with a Medication Event Monitoring System and self-reported compliance with a paper-and-pencil diary. In a randomized controlled study, we estimated whether a disclosure intervention (informing women about objective compliance monitoring) and a reminder intervention (use of acoustical reminders) improved compliance. A mixed model analysis was used to estimate associations between women's objective compliance and their diurnal cortisol profiles, and between deviation from scheduled sampling and the cortisol concentration measured in the related sample. RESULTS: Self-reported compliance with a saliva-sampling protocol was 91%, and objective compliance was 70%. The disclosure intervention was associated with improved objective compliance (informed: 81%, noninformed: 60%, F(1,60) = 17.64, p<0.001), but not the reminder intervention (reminders: 68%, without reminders: 72%, F(1,60) = 0.78, p = 0.379). Furthermore, a woman's increased objective compliance was associated with a higher diurnal cortisol profile, F(2,64) = 8.22, p<0.001. Altered cortisol levels were observed in less objectively compliant samples, F(1,705) = 7.38, p = 0.007, with delayed sampling associated with lower cortisol levels. CONCLUSIONS: The results suggest that in pregnant women, objective noncompliance with scheduled ambulatory saliva sampling is common and is associated with biased cortisol estimates. To improve sampling compliance, results suggest

  1. Field-based random sampling without a sampling frame: control selection for a case-control study in rural Africa.

    Science.gov (United States)

    Crampin, A C; Mwinuka, V; Malema, S S; Glynn, J R; Fine, P E

    2001-01-01

    Selection bias, particularly of controls, is common in case-control studies and may materially affect the results. Methods of control selection should be tailored both for the risk factors and disease under investigation and for the population being studied. We present here a control selection method devised for a case-control study of tuberculosis in rural Africa (Karonga, northern Malawi) that selects an age/sex frequency-matched random sample of the population, with a geographical distribution in proportion to the population density. We also present an audit of the selection process, and discuss the potential of this method in other settings.

  2. Pavlovian autoshaping procedures increase plasma corticosterone and levels of norepinephrine and serotonin in prefrontal cortex in rats.

    Science.gov (United States)

    Tomie, Arthur; Tirado, Aidaluz D; Yu, Lung; Pohorecky, Larissa A

    2004-08-12

    Pavlovian autoshaping procedures provide for pairings of a small object conditioned stimulus (CS) with a rewarding substance unconditioned stimulus (US), resulting in the acquisition of complex sequences of CS-directed skeletal-motor responses or autoshaping conditioned responses (CRs). Autoshaping procedures induce higher post-session levels of corticosterone than in controls receiving CS and US randomly, and the enhanced post-session corticosterone levels have been attributed to the appetitive or arousal-inducing effects of autoshaping procedures. Enhanced corticosterone release can be induced by aversive stimulation or stressful situations, where it is often accompanied by higher levels of norepinephrine (NE) and serotonin (5-HT) in prefrontal cortex (PFC) but not in striatum (ST). Effects of autoshaping procedures on post-session corticosterone levels, NE contents in PFC, and 5-HT contents in PFC and ST were investigated in male Long-Evans rats. Post-session blood samples revealed higher corticosterone levels in the CS-US Paired group (n = 46) than in the CS-US Random control group (n = 21), and brain samples revealed higher levels of PFC NE and 5-HT in CS-US Paired group. Striatal 5-HT levels were unaltered by the autoshaping procedures. Autoshaping procedures provide for appetitive stimulation and induce an arousal-like state, as well as simultaneous stress-like changes in plasma corticosterone and monoamine levels in PFC. Autoshaping, therefore, may be useful for the study of endocrine and central processes associated with appetitive conditions.

  3. Random access procedures and radio access network (RAN) overload control in standard and advanced long-term evolution (LTE and LTE-A) networks

    DEFF Research Database (Denmark)

    Kiilerich Pratas, Nuno; Thomsen, Henning; Popovski, Petar

    2015-01-01

    In this chapter, we describe and discuss the current LTE random access procedure and the Radio Access Network Load Control solution within LTE/LTE-A. We provide an overview of the several considered load control solutions and give a detailed description of the standardized Extended Access Class B...

  4. Ketamine versus Ketamine / magnesium Sulfate for Procedural Sedation and Analgesia in the Emergency Department: A Randomized Clinical Trial.

    Science.gov (United States)

    Azizkhani, Reza; Bahadori, Azadeh; Shariati, Mohammadreza; Golshani, Keyhan; Ahmadi, Omid; Masoumi, Babak

    2018-01-01

    The present study was designed to evaluate the effectiveness of magnesium sulfate (MgSO4) in procedural sedation and analgesia (PSA) when combined with ketamine in patients with fractures in emergency departments who required short and painful emergency procedures. In this study, 100 patients with fractures and dislocations who presented to the emergency departments and required PSA for short and painful emergency procedures were randomly allocated to groups of ketamine plus MgSO4 or ketamine alone. The train of four (TOF) stimulation pattern was assessed using a nerve stimulator machine and compared between groups. The mean age of the studied patients was 46.9 ± 9.3 years; 48% were male and 52% were female. No significant differences were noted between groups in demographic variables. The status of TOF, 2 min after the injection of ketamine (1.5 mg/kg), was similar in both groups. After the injection of the second dose of ketamine (1 mg/kg), the status of TOF in four patients in the ketamine plus MgSO4 (0.45 mg/kg) group changed to three quarters, but in the ketamine group the status of TOF in all patients was four quarters. The difference between groups was not statistically significant (P = 0.12). The findings revealed that, for muscle relaxation during medical procedures in the emergency department, ketamine in combination with MgSO4 at this dose was not effective.

  5. Pavlovian autoshaping procedures increase plasma corticosterone levels in rats.

    Science.gov (United States)

    Tomie, Arthur; Silberman, Yuval; Williams, Kayon; Pohorecky, Larissa A

    2002-06-01

    Pavlovian autoshaping conditioned responses (CRs) are complex sequences of conditioned stimulus (CS)-directed skeletal-motor responses that are elicited by CS objects predictive of food unconditioned stimulus (US). Autoshaping CRs are observed under conditions known to be conducive to elevations in plasma corticosterone levels, as, for example, in response to the eating of food as well as in response to signals predictive of food. Two experiments investigated the relationships between Pavlovian autoshaping procedures, the performance of Pavlovian autoshaping CRs, and plasma corticosterone levels in male Long-Evans rats. In Experiment 1, rats in the CS-US paired group (n=30) were given 20 daily sessions of Pavlovian autoshaping training wherein the insertion of a retractable lever CS was followed by the response-independent presentation of the food US. Tail blood samples obtained after the 20th autoshaping session revealed higher plasma corticosterone levels in the CS-US paired group than in the CS-US random control group (n=10). In Experiment 2, rats (n=35) were assessed for basal plasma corticosterone levels 2 weeks prior to autoshaping training. Plasma samples obtained immediately following the first autoshaping session, and prior to the acquisition of lever-press autoshaping CR performance, revealed higher plasma corticosterone levels in the CS-US paired group (n=24) relative to basal levels. This effect was not observed in the CS-US random control group (n=11). Data suggest that corticosterone release is a physiological endocrine Pavlovian CR induced by lever CS-food US pairings during Pavlovian autoshaping procedures, rather than a by-product of autoshaping CR performance. Implications of the link between autoshaping procedures and corticosterone release are discussed.

  6. Efficiency and Safety of One-Step Procedure Combining Laparoscopic Cholecystectomy and Endoscopic Retrograde Cholangiopancreatography for Treatment of Cholecysto-Choledocholithiasis: A Randomized Controlled Trial.

    Science.gov (United States)

    Liu, Zhiyi; Zhang, Luyao; Liu, Yanling; Gu, Yang; Sun, Tieliang

    2017-11-01

    We aimed to evaluate the efficiency and safety of a one-step procedure combining endoscopic retrograde cholangiopancreatography (ERCP) and laparoscopic cholecystectomy (LC) for treatment of patients with cholecysto-choledocholithiasis. A prospective randomized study was performed on 63 consecutive cholecysto-choledocholithiasis patients between 2008 and 2011. The efficiency and safety of the one-step procedure were assessed by comparison with the two-step procedure of LC with ERCP + endoscopic sphincterotomy (EST). Outcomes including intraoperative features and postoperative features (length of stay and postoperative complications) were evaluated. The one- or two-step procedure of LC with ERCP + EST was successfully performed in all patients, and common bile duct stones were completely removed. Statistical analyses showed that length of stay and pulmonary infection rate were significantly lower in the test group compared with the control group (P < 0.05). The one-step procedure of LC with ERCP + EST is superior to the two-step procedure for treatment of patients with cholecysto-choledocholithiasis with regard to reduced hospital stay and lower occurrence of pulmonary infections. Compared with the two-step procedure, the one-step procedure of LC with ERCP + EST may be a superior option for the treatment of cholecysto-choledocholithiasis patients with regard to hospital stay and pulmonary infections.

  7. Direct generation of all-optical random numbers from optical pulse amplitude chaos.

    Science.gov (United States)

    Li, Pu; Wang, Yun-Cai; Wang, An-Bang; Yang, Ling-Zhen; Zhang, Ming-Jiang; Zhang, Jian-Zhong

    2012-02-13

    We propose and theoretically demonstrate an all-optical method for directly generating all-optical random numbers from pulse amplitude chaos produced by a mode-locked fiber ring laser. Under an appropriate pump intensity, the mode-locked laser can experience a quasi-periodic route to chaos. Such chaos consists of a stream of pulses with a fixed repetition frequency but random intensities. In this method, we require neither a sampling procedure nor externally triggered clocks, but directly quantize the chaotic pulse stream into a random number sequence via an all-optical flip-flop. Moreover, our simulation results show that the pulse amplitude chaos has no periodicity and possesses a highly symmetric distribution of amplitude. Thus, in theory, the obtained random number sequence has high-quality randomness without post-processing, as verified by industry-standard statistical tests.

  8. Modern survey sampling

    CERN Document Server

    Chaudhuri, Arijit

    2014-01-01

    Exposure to Sampling: Abstract; Introduction; Concepts of Population, Sample, and Sampling. Initial Ramifications: Abstract; Introduction; Sampling Design, Sampling Scheme; Random Numbers and Their Uses in Simple Random Sampling (SRS); Drawing Simple Random Samples with and without Replacement; Estimation of Mean, Total, Ratio of Totals/Means: Variance and Variance Estimation; Determination of Sample Sizes; Appendix to Chapter 2: More on Equal Probability Sampling, Horvitz-Thompson Estimator, Sufficiency, Likelihood, Non-Existence Theorem. More Intricacies: Abstract; Introduction; Unequal Probability Sampling Strategies; PPS Sampling. Exploring Improved Ways: Abstract; Introduction; Stratified Sampling; Cluster Sampling; Multi-Stage Sampling; Multi-Phase Sampling: Ratio and Regression Estimation; Controlled Sampling. Modeling: Introduction; Super-Population Modeling; Prediction Approach; Model-Assisted Approach; Bayesian Methods; Spatial Smoothing; Sampling on Successive Occasions: Panel Rotation; Non-Response and Not-at-Homes; Weighting Adj...
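    As a small companion to two of the topics listed above (simple random sampling without replacement and the Horvitz-Thompson estimator), the sketch below estimates a population total from an SRSWOR sample; under equal-probability SRSWOR the inclusion probability is n/N for every unit, so the Horvitz-Thompson estimator reduces to N times the sample mean. The population values are made up.

      import random

      random.seed(7)
      population = [random.randint(10, 60) for _ in range(500)]   # hypothetical study variable
      N = len(population)
      n = 50

      # Simple random sampling without replacement (SRSWOR).
      sample = random.sample(range(N), n)

      # Horvitz-Thompson estimator of the population total: sum of y_i / pi_i,
      # where the inclusion probability under SRSWOR is pi_i = n / N for every unit.
      pi = n / N
      ht_total = sum(population[i] / pi for i in sample)

      print("HT estimate of total:", round(ht_total), " true total:", sum(population))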

  9. Limited Impact of Music Therapy on Patient Anxiety with the Large Loop Excision of Transformation Zone Procedure - a Randomized Controlled Trial.

    Science.gov (United States)

    Kongsawatvorakul, Chompunoot; Charakorn, Chuenkamon; Paiwattananupant, Krissada; Lekskul, Navamol; Rattanasiri, Sasivimol; Lertkhachonsuk, Arb-Aroon

    2016-01-01

    Many studies have pointed to strategies to cope with patient anxiety in colposcopy. Evidence shows that patients experienced considerable distress with the large loop excision of transformation zone (LLETZ) procedure and suitable interventions should be introduced to reduce anxiety. This study aimed to investigate the effects of music therapy in patients undergoing LLETZ. A randomized controlled trial was conducted with patients undergoing LLETZ performed under local anesthesia in an outpatient setting at Ramathibodi Hospital, Bangkok, Thailand, from February 2015 to January 2016. After informed consent and demographic data were obtained, we assessed the anxiety level using the State Anxiety Inventory pre- and post-procedure. Music group patients listened to classical songs through headphones, while the control group received standard care. Pain score was evaluated with a visual analog scale (VAS). Statistical analysis was conducted using Pearson chi-square, Fisher's exact test and t-test, and p-values less than 0.05 were considered statistically significant. A total of 73 patients were enrolled and randomized, resulting in 36 women in the music group and 37 women in the non-music control group. The preoperative mean anxiety score was higher in the music group (46.8 vs 45.8 points). The postoperative mean anxiety scores in the music and the non-music groups were 38.7 and 41.3 points, respectively. The VAS score was lower in the music group (2.55 vs 3.33). The percent change in anxiety was greater in the music group, although there was no significant difference between the two groups. Music therapy did not significantly reduce anxiety in patients undergoing the LLETZ procedure. However, different interventions should be developed to ease patients' apprehension during this procedure.

  10. Efficacy of a children’s procedural preparation and distraction device on healing in acute burn wound care procedures: study protocol for a randomized controlled trial

    Directory of Open Access Journals (Sweden)

    Brown Nadia J

    2012-12-01

    Abstract Background The intense pain and anxiety triggered by burns and their associated wound care procedures are well established in the literature. Non-pharmacological intervention is a critical component of total pain management protocols and is used as an adjunct to pharmacological analgesia. An example is virtual reality, which has been used effectively to dampen pain intensity and unpleasantness. Possible links or causal relationships between pain/anxiety/stress and burn wound healing have previously not been investigated. The purpose of this study is to investigate these relationships, specifically by determining if a newly developed multi-modal procedural preparation and distraction device (Ditto™) used during acute burn wound care procedures will reduce the pain and anxiety of a child and increase the rate of re-epithelialization. Methods/design Children (4 to 12 years) with acute burn injuries presenting for their first dressing change will be randomly assigned to either the (1) Control group (standard distraction) or (2) Ditto™ intervention group (receiving Ditto™ procedural preparation and Ditto™ distraction). It is intended that a minimum of 29 participants will be recruited for each treatment group. Repeated measures of pain intensity, anxiety, stress and healing will be taken at every dressing change until complete wound re-epithelialization. Further data collection will aid in determining patient satisfaction and cost effectiveness of the Ditto™ intervention, as well as its effect on speed of wound re-epithelialization. Discussion Results of this study will provide data on whether the disease process can be altered by reducing stress, pain and anxiety in the context of acute burn wounds. Trial registration ACTRN12611000913976

  11. NHEXAS PHASE I ARIZONA STUDY--STANDARD OPERATING PROCEDURE FOR COLLECTION, STORAGE AND SHIPMENT OF URINE SAMPLES FOR SELECTED METALS AND PESTICIDES (UA-F-20.1)

    Science.gov (United States)

    The purpose of this SOP is to guide the collection, storage, and shipment of urine samples collected for the NHEXAS Arizona project. This SOP provides a brief description of sample, collection, preservation, storage, shipping, and custody procedures. This procedure was followed ...

  12. Effect of Spike Lavender Lakhlakhe on Pain Intensity Due to Phlebotomy Procedure in Premature Infants Hospitalized in Neonatal Intensive Care Unit: A Randomized Clinical Trial

    Directory of Open Access Journals (Sweden)

    Noushin Beheshtipoor

    2017-06-01

    Background: Premature infants undergo multiple painful procedures during treatment; thus, efforts must be made to limit complications caused by diagnostic and treatment procedures using simple and practical methods. This study was performed to evaluate the effect of spike lavender lakhlakhe on pain intensity due to phlebotomy in hospitalized premature infants. Methods: This single-arm, randomized clinical trial was performed on 30 infants chosen through a convenience sampling method. Each newborn was considered as its own control. For the test group, one drop of pure (100%) spike lavender lakhlakhe was taken by a standard dropper and diluted with 4 ml of warm distilled water by the research assistant. This mixture was held at a distance of 2-3 cm from the newborn's nose from 60 minutes before until 2 minutes after phlebotomy, such that it could be smelled by the newborn. In both groups, heart rate and blood oxygen saturation were measured by a standard portable device, and the corresponding data were recorded in data collection sheets. Moreover, the infants' facial expression changes were recorded by a camera and the intensity of pain was measured by the Premature Infant Pain Profile before and after the procedure. Finally, the data were analyzed by a paired comparison analysis test in SPSS, version 17. Results: Comparison of mean pain intensity caused by phlebotomy in the control and test groups showed a significant difference (7.667±0.311 vs 4.882±0.311; P

  13. Autoshaping, random control, and omission training in the rat

    Science.gov (United States)

    Locurto, Charles; Terrace, H. S.; Gibbon, John

    1976-01-01

    The role of the stimulus-reinforcer contingency in the development and maintenance of lever contact responding was studied in hooded rats. In Experiment I, three groups of experimentally naive rats were trained either on autoshaping, omission training, or a random-control procedure. Subjects trained by the autoshaping procedure responded more consistently than did either random-control or omission-trained subjects. The probability of at least one lever contact per trial was slightly higher in subjects trained by the omission procedure than by the random-control procedure. However, these differences were not maintained during extended training, nor were they evident in total lever-contact frequencies. When omission and random-control subjects were switched to the autoshaping condition, lever contacts increased in all animals, but a pronounced retardation was observed in omission subjects relative to the random-control subjects. In addition, subjects originally exposed to the random-control procedure, and later switched to autoshaping, acquired more rapidly than naive subjects that were exposed only on the autoshaping procedure. In Experiment II, subjects originally trained by an autoshaping procedure were exposed either to an omission, a random-control, or an extinction procedure. No differences were observed among the groups either in the rate at which lever contacts decreased or in the frequency of lever contacts at the end of training. These data implicate prior experience in the interpretation of omission-training effects and suggest limitations in the influence of stimulus-reinforcer relations in autoshaping. PMID:16811960

  14. Fixed-location hydroacoustic monitoring designs for estimating fish passage using stratified random and systematic sampling

    International Nuclear Information System (INIS)

    Skalski, J.R.; Hoffman, A.; Ransom, B.H.; Steig, T.W.

    1993-01-01

    Five alternative sampling designs are compared using 15 d of 24-h continuous hydroacoustic data to identify the most favorable approach to fixed-location hydroacoustic monitoring of salmonid outmigrants. Four alternative approaches to systematic sampling are compared among themselves and with stratified random sampling (STRS). Stratifying systematic sampling (STSYS) on a daily basis is found to reduce sampling error in multiday monitoring studies. Although sampling precision was predictable with varying levels of effort in STRS, neither magnitude nor direction of change in precision was predictable when effort was varied in systematic sampling (SYS). Furthermore, modifying systematic sampling to include replicated (e.g., nested) sampling (RSYS) is shown to provide unbiased point and variance estimates, as does STRS. Numerous short sampling intervals (e.g., 12 samples of 1-min duration per hour) must be monitored hourly using RSYS to provide efficient, unbiased point and interval estimates. For equal levels of effort, STRS outperformed all variations of SYS examined. Parametric approaches to confidence interval estimates are found to be superior to nonparametric interval estimates (i.e., bootstrap and jackknife) in estimating total fish passage. 10 refs., 1 fig., 8 tabs
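    The hydroacoustic data themselves are not available here; the sketch below only mimics the design comparison on synthetic per-minute passage counts, contrasting stratified random sampling (random minutes within each hour) with systematic sampling (fixed interval, random start), both expanded to daily totals. The sample sizes and diurnal pattern are assumptions.

      import math
      import random
      import statistics

      random.seed(11)
      minutes_per_hour, hours = 60, 24
      # Hypothetical per-minute fish counts with a diurnal peak plus noise.
      counts = [max(0, int(20 * (1 + 0.8 * math.sin(2 * math.pi * m / (hours * minutes_per_hour)))
                           + random.gauss(0, 5)))
                for m in range(hours * minutes_per_hour)]
      true_total = sum(counts)

      def estimate(design, m_per_hour=12):
          """Estimate total daily passage from m_per_hour sampled minutes in each hour."""
          total = 0.0
          for h in range(hours):
              hour_counts = counts[h * minutes_per_hour:(h + 1) * minutes_per_hour]
              if design == "STRS":                       # stratified random: random minutes per hour
                  picks = random.sample(range(minutes_per_hour), m_per_hour)
              else:                                      # SYS: fixed interval, random start
                  step = minutes_per_hour // m_per_hour
                  start = random.randrange(step)
                  picks = range(start, minutes_per_hour, step)
              total += sum(hour_counts[i] for i in picks) * minutes_per_hour / m_per_hour
          return total

      for design in ("STRS", "SYS"):
          reps = [estimate(design) for _ in range(500)]
          print(design, "mean error:", round(statistics.mean(reps) - true_total, 1),
                "SD:", round(statistics.stdev(reps), 1))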

  15. Ankle Block vs Single-Shot Popliteal Fossa Block as Primary Anesthesia for Forefoot Operative Procedures: Prospective, Randomized Comparison.

    Science.gov (United States)

    Schipper, Oliver N; Hunt, Kenneth J; Anderson, Robert B; Davis, W Hodges; Jones, Carroll P; Cohen, Bruce E

    2017-11-01

    Postoperative pain is often difficult to control with oral medications, requiring large doses of opioid analgesia. Regional anesthesia may be used for primary anesthesia, reducing the need for general anesthetic and postoperative pain medication requirements in the immediate postoperative period. The purpose of this study was to compare the analgesic effects of an ankle block (AB) to a single-shot popliteal fossa block (PFB) for patients undergoing orthopedic forefoot procedures. All patients having elective outpatient orthopedic forefoot procedures were invited to participate in the study. Patients were prospectively randomized to receive either an ultrasound-guided AB or PFB by a board-certified anesthesiologist prior to their procedure. Intraoperative conversion to general anesthesia and postanesthesia care unit (PACU) opioid requirements were recorded. Postoperative pain was assessed using the visual analog scale (VAS) at regular time intervals until 8 am on postoperative day (POD) 2. Patients rated the effectiveness of the block on a 1 to 5 scale, with 5 being very effective. A total of 167 patients participated in the study with 88 patients (53%) receiving an AB and 79 (47%) receiving a single-shot PFB. There was no significant difference in the rate of conversion to general anesthesia between the 2 groups (13.6% [12/88] AB vs 12.7% [10/79] PFB). PACU morphine requirements and doses were significantly reduced in the PFB group ( P = .004) when compared to the AB group. The VAS was also significantly lower for the PFB patients at 10 pm on POD 0 (4.6 vs 1.6, P block site pain and/or erythema (AB 6.9% [6/88] vs PFB 5.1% [4/79], P = .44). The analgesic effect of the PFB lasted significantly longer when compared to the ankle block (AB 14.5 hours vs PFB 20.9 hours, P block between the 2 groups, with both blocks being highly effective (AB 4.79/5 vs PFB 4.82/5, P = .68). Regional anesthesia was a safe and reliable adjunct to perioperative pain management and highly

  16. When simulated environments make the difference : the effectiveness of different types of training of car service procedures

    NARCIS (Netherlands)

    Borsci, Simone; Lawson, Glyn; Salanitri, Davide; Jha, Bhavna

    2016-01-01

    An empirical analysis was performed to compare the effectiveness of different approaches to training a set of procedural skills to a sample of novice trainees. Sixty-five participants were randomly assigned to one of the following three training groups: (1) learning-by-doing in a 3D desktop virtual

  17. At convenience and systematic random sampling: effects on the prognostic value of nuclear area assessments in breast cancer patients.

    Science.gov (United States)

    Jannink, I; Bennen, J N; Blaauw, J; van Diest, P J; Baak, J P

    1995-01-01

    This study compares the influence of two different nuclear sampling methods on the prognostic value of assessments of mean and standard deviation of nuclear area (MNA, SDNA) in 191 consecutive invasive breast cancer patients with long term follow up. The first sampling method used was 'at convenience' sampling (ACS); the second, systematic random sampling (SRS). Both sampling methods were tested with a sample size of 50 nuclei (ACS-50 and SRS-50). To determine whether, besides the sampling methods, sample size had impact on prognostic value as well, the SRS method was also tested using a sample size of 100 nuclei (SRS-100). SDNA values were systematically lower for ACS, obviously due to (unconsciously) not including small and large nuclei. Testing prognostic value of a series of cut off points, MNA and SDNA values assessed by the SRS method were prognostically significantly stronger than the values obtained by the ACS method. This was confirmed in Cox regression analysis. For the MNA, the Mantel-Cox p-values from SRS-50 and SRS-100 measurements were not significantly different. However, for the SDNA, SRS-100 yielded significantly lower p-values than SRS-50. In conclusion, compared with the 'at convenience' nuclear sampling method, systematic random sampling of nuclei is not only superior with respect to reproducibility of results, but also provides a better prognostic value in patients with invasive breast cancer.
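    As a hypothetical illustration of the bias mechanism described (an 'at convenience' selection that unconsciously skips the smallest and largest nuclei compresses the SDNA estimate, whereas systematic random sampling does not), consider the following sketch; the nuclear-area distribution and the trimming fraction used to mimic convenience selection are invented.

      import random
      import statistics

      random.seed(5)
      # Hypothetical nuclear areas (um^2) for one tumour, log-normally distributed.
      areas = [random.lognormvariate(4.0, 0.35) for _ in range(2000)]

      def srs(values, k):
          """Systematic random sample of k items: random start, fixed interval."""
          step = len(values) // k
          start = random.randrange(step)
          return [values[start + i * step] for i in range(k)]

      def acs(values, k):
          """Crude stand-in for 'at convenience' selection: extremes tend to be skipped."""
          trimmed = sorted(values)[int(0.1 * len(values)):int(0.9 * len(values))]
          return random.sample(trimmed, k)

      for name, picker in (("SRS-50", srs), ("ACS-50", acs)):
          sample = picker(areas, 50)
          print(name, "MNA:", round(statistics.mean(sample), 1),
                "SDNA:", round(statistics.stdev(sample), 1))
      print("population SD:", round(statistics.stdev(areas), 1))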

  18. A flexible and coherent test/estimation procedure based on restricted mean survival times for censored time-to-event data in randomized clinical trials.

    Science.gov (United States)

    Horiguchi, Miki; Cronin, Angel M; Takeuchi, Masahiro; Uno, Hajime

    2018-04-22

    In randomized clinical trials where time-to-event is the primary outcome, almost routinely, the logrank test is prespecified as the primary test and the hazard ratio is used to quantify treatment effect. If the ratio of 2 hazard functions is not constant, the logrank test is not optimal and the interpretation of hazard ratio is not obvious. When such a nonproportional hazards case is expected at the design stage, the conventional practice is to prespecify another member of weighted logrank tests, eg, Peto-Prentice-Wilcoxon test. Alternatively, one may specify a robust test as the primary test, which can capture various patterns of difference between 2 event time distributions. However, most of those tests do not have companion procedures to quantify the treatment difference, and investigators have fallen back on reporting treatment effect estimates not associated with the primary test. Such incoherence in the "test/estimation" procedure may potentially mislead clinicians/patients who have to balance risk-benefit for treatment decision. To address this, we propose a flexible and coherent test/estimation procedure based on restricted mean survival time, where the truncation time τ is selected data dependently. The proposed procedure is composed of a prespecified test and an estimation of corresponding robust and interpretable quantitative treatment effect. The utility of the new procedure is demonstrated by numerical studies based on 2 randomized cancer clinical trials; the test is dramatically more powerful than the logrank, Wilcoxon tests, and the restricted mean survival time-based test with a fixed τ, for the patterns of difference seen in these cancer clinical trials. Copyright © 2018 John Wiley & Sons, Ltd.
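    The proposed test with a data-dependent τ is not reproduced here; the basic quantity it builds on, the restricted mean survival time (the area under the Kaplan-Meier curve up to a fixed truncation time τ), can be computed as in the sketch below with hypothetical follow-up data.

      import numpy as np

      def rmst(time, event, tau):
          """Restricted mean survival time: area under the Kaplan-Meier curve up to tau."""
          time = np.asarray(time, float)
          event = np.asarray(event, int)
          order = np.argsort(time)
          time, event = time[order], event[order]
          n_at_risk = len(time)
          surv, last_t, area = 1.0, 0.0, 0.0
          for t, d in zip(time, event):
              if t > tau:
                  break
              area += surv * (t - last_t)          # accumulate area of the current step
              if d == 1:                            # event: update the survival estimate
                  surv *= (n_at_risk - 1) / n_at_risk
              n_at_risk -= 1
              last_t = t
          area += surv * (tau - last_t)            # final piece up to the truncation time
          return area

      # Hypothetical arms: (follow-up times, event indicators), truncation at tau = 12.
      arm_a = rmst([2, 4, 5, 7, 9, 12, 15], [1, 1, 0, 1, 0, 1, 0], tau=12)
      arm_b = rmst([3, 6, 8, 10, 11, 14, 16], [1, 0, 1, 0, 1, 0, 0], tau=12)
      print("RMST difference (B - A):", round(arm_b - arm_a, 2))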

  19. Long-term follow-up of a randomized clinical trial comparing Beger with pylorus-preserving Whipple procedure for chronic pancreatitis.

    Science.gov (United States)

    Müller, M W; Friess, H; Martin, D J; Hinz, U; Dahmen, R; Büchler, M W

    2008-03-01

    Duodenum-preserving pancreatic head resection according to Beger and the pylorus-preserving Whipple (ppWhipple) procedure were compared in patients with chronic pancreatitis (CP) in a randomized clinical trial. Perioperative data and short-term outcome have been reported previously. The present study evaluated long-term follow-up. Forty patients were enrolled originally, 20 in each group. Long-term follow-up included mortality, morbidity, pain status, occupational rehabilitation, quality of life (QoL), and endocrine and exocrine function at median follow-up of 7 and 14 years. One patient who had a ppWhipple procedure was lost to follow-up. There were five late deaths in each group. No differences were noted in pain status and exocrine pancreatic function. Loss of appetite was significantly worse in the ppWhipple group at 14 years' follow-up, but there were no other differences in QoL parameters examined. After 14 years, diabetes mellitus was present in seven of 15 patients who had the Beger procedure and 11 of 14 patients after ppWhipple resection (P = 0.128). After long-term follow-up of up to 14 years early advantages of the Beger procedure were no longer present. 2007 British Journal of Surgery Society Ltd. Published by John Wiley & Sons, Ltd.

  20. Randomized comparison of vaginal self-sampling by standard vs. dry swabs for Human papillomavirus testing

    International Nuclear Information System (INIS)

    Eperon, Isabelle; Vassilakos, Pierre; Navarria, Isabelle; Menoud, Pierre-Alain; Gauthier, Aude; Pache, Jean-Claude; Boulvain, Michel; Untiet, Sarah; Petignat, Patrick

    2013-01-01

    To evaluate if human papillomavirus (HPV) self-sampling (Self-HPV) using a dry vaginal swab is a valid alternative for HPV testing. Women attending colposcopy clinic were recruited to collect two consecutive Self-HPV samples: a Self-HPV using a dry swab (S-DRY) and a Self-HPV using a standard wet transport medium (S-WET). These samples were analyzed for HPV using real time PCR (Roche Cobas). Participants were randomized to determine the order of the tests. Questionnaires assessing preferences and acceptability for both tests were conducted. Subsequently, women were invited for colposcopic examination; a physician collected a cervical sample (physician-sampling) with a broom-type device and placed it into a liquid-based cytology medium. Specimens were then processed for the production of cytology slides and a Hybrid Capture HPV DNA test (Qiagen) was performed from the residual liquid. Biopsies were performed if indicated. Unweighted kappa statistics (κ) and McNemar tests were used to measure the agreement among the sampling methods. A total of 120 women were randomized. Overall HPV prevalence was 68.7% (95% Confidence Interval (CI) 59.3–77.2) by S-WET, 54.4% (95% CI 44.8–63.9) by S-DRY and 53.8% (95% CI 43.8–63.7) by HC. Among paired samples (S-WET and S-DRY), the overall agreement was good (85.7%; 95% CI 77.8–91.6) and the κ was substantial (0.70; 95% CI 0.57-0.70). The proportion of positive type-specific HPV agreement was also good (77.3%; 95% CI 68.2-84.9). No differences in sensitivity for cervical intraepithelial neoplasia grade one (CIN1) or worse between the two Self-HPV tests were observed. Women reported the two Self-HPV tests as highly acceptable. Self-HPV using dry swab transfer does not appear to compromise specimen integrity. Further study in a large screening population is needed. ClinicalTrials.gov: http://clinicaltrials.gov/show/NCT01316120
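    For readers unfamiliar with the agreement statistics named above, the sketch below computes an unweighted Cohen's kappa and a McNemar test from a hypothetical 2x2 table of paired positive/negative HPV results; the counts are illustrative and are not the study's data.

      from math import erf, sqrt

      # Hypothetical paired results (S-WET vs S-DRY): [[both +, WET+ DRY-], [WET- DRY+, both -]]
      table = [[60, 10],
               [5, 45]]
      n = sum(sum(row) for row in table)

      # Unweighted Cohen's kappa for two tests / two categories.
      po = (table[0][0] + table[1][1]) / n                       # observed agreement
      p_wet = (table[0][0] + table[0][1]) / n                    # marginal "+" rate, WET
      p_dry = (table[0][0] + table[1][0]) / n                    # marginal "+" rate, DRY
      pe = p_wet * p_dry + (1 - p_wet) * (1 - p_dry)             # chance agreement
      kappa = (po - pe) / (1 - pe)

      # McNemar test on the discordant pairs (normal approximation, no continuity correction).
      b, c = table[0][1], table[1][0]
      z = (b - c) / sqrt(b + c)
      p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

      print(f"kappa = {kappa:.2f}, McNemar p = {p_value:.3f}")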

  1. Cytological preparations for molecular analysis: A review of technical procedures, advantages and limitations for referring samples for testing.

    Science.gov (United States)

    da Cunha Santos, G; Saieg, M A; Troncone, G; Zeppa, P

    2018-04-01

    Minimally invasive procedures such as endobronchial ultrasound-guided transbronchial needle aspiration (EBUS-TBNA) must yield not only good quality and quantity of material for morphological assessment, but also an adequate sample for analysis of molecular markers to guide patients to appropriate targeted therapies. In this context, cytopathologists worldwide should be familiar with minimum requirements for referring cytological samples for testing. The present manuscript is a review with a comprehensive description of the content of the workshop entitled Cytological preparations for molecular analysis: pre-analytical issues for EBUS TBNA, presented at the 40th European Congress of Cytopathology in Liverpool, UK. The present review emphasises the advantages and limitations of different types of cytology substrates used for molecular analysis, such as archival smears, liquid-based preparations, archival cytospin preparations and FTA (Flinders Technology Associates) cards, as well as their technical requirements/features. These various types of cytological specimens can be successfully used for an extensive array of molecular studies, but the quality and quantity of extracted nucleic acids rely directly on adequate pre-analytical assessment of those samples. In this setting, cytopathologists must not only be familiar with the different types of specimens and associated technical procedures, but also correctly handle the material provided by minimally invasive procedures, ensuring that there is a sufficient amount of material for a precise diagnosis and correct management of the patient through personalised care. © 2018 John Wiley & Sons Ltd.

  2. A smart Monte Carlo procedure for production costing and uncertainty analysis

    International Nuclear Information System (INIS)

    Parker, C.; Stremel, J.

    1996-01-01

    Electric utilities using chronological production costing models to decide whether to buy or sell power over the next week or next few weeks need to determine potential profits or losses under a number of uncertainties. A large amount of money can be at stake--often $100,000 a day or more--and one party to the sale must always take on the risk. In the case of fixed price ($/MWh) contracts, the seller accepts the risk. In the case of cost plus contracts, the buyer must accept the risk. So, modeling uncertainty and understanding the risk accurately can improve the competitive edge of the user. This paper investigates an efficient procedure for representing risks and costs from capacity outages. Typically, production costing models use an algorithm based on some form of random number generator to select resources as available or on outage. These algorithms allow experiments to be repeated and gains and losses to be observed in a short time. The authors perform several experiments to examine the capability of three unit outage selection methods and measure their results. Specifically, a brute force Monte Carlo procedure, a Monte Carlo procedure with Latin Hypercube sampling, and a Smart Monte Carlo procedure with cost stratification and directed sampling are examined.
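    The production-costing model itself is not shown; the sketch below only illustrates why Latin Hypercube sampling can tighten estimates relative to brute-force Monte Carlo when sampling unit outages, using hypothetical unit capacities and forced outage rates.

      import numpy as np

      rng = np.random.default_rng(42)
      capacity = np.array([200.0, 150.0, 120.0, 80.0, 60.0])     # hypothetical unit sizes (MW)
      forced_outage_rate = np.array([0.08, 0.10, 0.05, 0.12, 0.07])

      def available_capacity(u):
          """Available capacity for one draw u of uniforms: a unit is up when u exceeds its FOR."""
          return ((u >= forced_outage_rate) * capacity).sum()

      def brute_force(n):
          u = rng.random((n, capacity.size))
          return np.mean([available_capacity(row) for row in u])

      def latin_hypercube(n):
          # One stratified draw per equal-probability bin in each dimension, independently permuted.
          u = np.empty((n, capacity.size))
          for j in range(capacity.size):
              u[:, j] = (rng.permutation(n) + rng.random(n)) / n
          return np.mean([available_capacity(row) for row in u])

      reps = 300
      bf = [brute_force(50) for _ in range(reps)]
      lhs = [latin_hypercube(50) for _ in range(reps)]
      print("brute-force MC  SD of estimate:", round(np.std(bf), 2))
      print("Latin hypercube SD of estimate:", round(np.std(lhs), 2))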

  3. Chemical vs. herbal formulations as pre-procedural mouth rinses to combat aerosol production: A randomized controlled study

    Directory of Open Access Journals (Sweden)

    Koduganti Rekha Rani

    2014-01-01

    Background: Disease transmission and barrier techniques are the key concerns during ultrasonic instrumentation, as this procedure has the hazard of aerosol production, which has a multitude of deleterious effects on the body. The aerosol produced can affect both the patient and the clinician. The aim of this study was to assess the importance of pre-procedural rinsing before scaling by ultrasonic instrumentation and to compare the efficacy of a commercially available herbal mouth rinse and a chlorhexidine gluconate mouth rinse with a control group. The study was conducted from 1st February to 15th April 2012 in a tertiary referral care hospital and was approved by the institutional ethical committee. This was a randomized, single-blinded interventional study in which 36 patients, equally divided into three groups, participated. Material and Methods: Thirty-six patients aged 18-35 years were recruited in this study. All patients had plaque index scores between 1.5 and 3.0 and were categorized into three groups. Patients with systemic diseases and on antibiotic therapy were excluded. Group A (control group) underwent scaling with water as the pre-procedural rinse, group B used 20 ml of 0.2% chlorhexidine, and group C was administered 18 ml of a herbal pre-procedural rinse. Aerosol splatter produced during the procedure was collected on blood agar plates and sent for microbiologic analysis for the assessment of bacterial colony forming units (CFUs). The mean CFUs and standard deviation (SD) for each group were measured. A post hoc test was used to compare the differences between the three groups: control (A), chlorhexidine (B) and herbal (C). Results: The mean CFUs for the control group were 114.50, for the chlorhexidine group 56.75 and for the herbal rinse group 47.38. Conclusion: Pre-procedural rinsing was found to be effective in reducing aerosol contamination during ultrasonic scaling, though no statistically significant difference was found

  4. Reduction of the Random Variables of the Turbulent Wind Field

    DEFF Research Database (Denmark)

    Sichani, Mahdi Teimouri; Nielsen, Søren R.K.

    2012-01-01

    Applicability of the Probability Density Evolution Method (PDEM) for realizing evolution of the probability density for wind turbines has rather strict bounds on the basic number of the random variables involved in the model. The efficiency of most of the Advanced Monte Carlo (AMC) methods, i.e. Importance Sampling (IS) or Subset Simulation (SS), will be deteriorated on problems with many random variables. The problem with PDEM is that a multidimensional integral has to be carried out over the space defined by the random variables of the system. The numerical procedure requires discretization of the integral domain; this becomes increasingly difficult as the dimensions of the integral domain increase. On the other hand, the efficiency of the AMC methods is closely dependent on the design points of the problem. Presence of many random variables may increase the number of the design points, hence affects...

  5. Establishing Reliable Cognitive Change in Children with Epilepsy: The Procedures and Results for a Sample with Epilepsy

    Science.gov (United States)

    van Iterson, Loretta; Augustijn, Paul B.; de Jong, Peter F.; van der Leij, Aryan

    2013-01-01

    The goal of this study was to investigate reliable cognitive change in epilepsy by developing computational procedures to determine reliable change index scores (RCIs) for the Dutch Wechsler Intelligence Scales for Children. First, RCIs were calculated based on stability coefficients from a reference sample. Then, these RCIs were applied to a…
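
    The reliable change index (RCI) mentioned above can be computed from a stability (test-retest) coefficient and the baseline standard deviation. The sketch below shows a Jacobson-Truax-style RCI with made-up Wechsler-type numbers; it illustrates the general formula, not the specific computational procedures developed in the study.

        import math

        def reliable_change_index(score_t1, score_t2, sd_baseline, r_stability):
            """Jacobson & Truax-style RCI: observed change divided by the SE of the difference."""
            se_measurement = sd_baseline * math.sqrt(1.0 - r_stability)
            se_difference = math.sqrt(2.0) * se_measurement
            return (score_t2 - score_t1) / se_difference

        # Illustrative (made-up) values: Wechsler-type scale with SD = 15, stability r = 0.85.
        rci = reliable_change_index(score_t1=92, score_t2=101, sd_baseline=15, r_stability=0.85)
        print(f"RCI = {rci:.2f}; |RCI| > 1.96 would count as reliable change at the 5% level")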

  6. Membrane biofouling characterization: effects of sample preparation procedures on biofilm structure and the microbial community

    KAUST Repository

    Xue, Zheng

    2014-07-15

    Ensuring the quality and reproducibility of results from biofilm structure and microbial community analysis is essential to membrane biofouling studies. This study evaluated the impacts of three sample preparation factors (i.e., number of buffer rinses, storage time at 4°C, and DNA extraction method) on the downstream analysis of nitrifying biofilms grown on ultrafiltration membranes. Both rinsing and storage affected biofilm structure, as suggested by their strong correlation with total biovolume, biofilm thickness, roughness and the spatial distribution of EPS. Significant variations in DNA yields and microbial community diversity were also observed among samples treated with different rinses, storage and DNA extraction methods. For the tested biofilms, two rinses, no storage and DNA extraction with both mechanical and chemical cell lysis from the attached biofilm were the optimal sample preparation procedures for obtaining accurate information about biofilm structure, EPS distribution and the microbial community. © 2014 Taylor & Francis.

  7. Self-reference and random sampling approach for label-free identification of DNA composition using plasmonic nanomaterials.

    Science.gov (United States)

    Freeman, Lindsay M; Pang, Lin; Fainman, Yeshaiahu

    2018-05-09

    The analysis of DNA has led to revolutionary advancements in the fields of medical diagnostics, genomics, prenatal screening, and forensic science, with the global DNA testing market expected to reach revenues of USD 10.04 billion per year by 2020. However, the current methods for DNA analysis remain dependent on fluorophores or conjugated proteins, leading to high costs associated with consumable materials and manual labor. Here, we demonstrate a potential label-free DNA composition detection method using surface-enhanced Raman spectroscopy (SERS) in which we identify the composition of cytosine and adenine within single strands of DNA. This approach depends on the fact that there is one phosphate backbone per nucleotide, which we use as a reference to compensate for systematic measurement variations. We utilize plasmonic nanomaterials with random Raman sampling to perform label-free detection of the nucleotide composition within DNA strands, generating a calibration curve from standard samples of DNA and demonstrating the capability of resolving the nucleotide composition. The work represents an innovative way to detect nucleotide composition within DNA strands without the need for attached labels, offering a highly sensitive and reproducible method that factors in random sampling to minimize error.
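
    The calibration-curve step described above can be illustrated with a few lines of least-squares fitting. The standards and peak ratios below are invented numbers, and the fit is a generic linear calibration rather than the actual SERS data or processing pipeline of the paper.

        import numpy as np

        # Hypothetical standards: known cytosine fraction vs. SERS peak ratio
        # (cytosine marker band / phosphate-backbone reference band).
        known_fraction = np.array([0.0, 0.25, 0.50, 0.75, 1.00])
        peak_ratio     = np.array([0.11, 0.34, 0.58, 0.80, 1.02])   # made-up measurements

        slope, intercept = np.polyfit(peak_ratio, known_fraction, deg=1)

        def composition_from_ratio(ratio):
            """Invert the calibration line to estimate the composition of an unknown strand."""
            return slope * ratio + intercept

        print(f"estimated cytosine fraction for ratio 0.47: {composition_from_ratio(0.47):.2f}")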

  8. New Approach Based on Compressive Sampling for Sample Rate Enhancement in DASs for Low-Cost Sensing Nodes

    Directory of Open Access Journals (Sweden)

    Francesco Bonavolontà

    2014-10-01

    Full Text Available The paper deals with the problem of improving the maximum sample rate of analog-to-digital converters (ADCs) included in low-cost wireless sensing nodes. To this aim, the authors propose an efficient acquisition strategy based on the combined use of a high-resolution time base and compressive sampling. In particular, the high-resolution time base is adopted to provide a proper sequence of random sampling instants, and a suitable software procedure, based on the compressive sampling approach, is exploited to reconstruct the signal of interest from the acquired samples. Thanks to the proposed strategy, the effective sample rate of the reconstructed signal can be as high as the frequency of the considered time base, thus significantly improving the inherent ADC sample rate. Several tests are carried out in simulated and real conditions to assess the performance of the proposed acquisition strategy in terms of reconstruction error. In particular, the results obtained in experimental tests with ADCs included in actual 8- and 32-bit microcontrollers highlight the possibility of achieving an effective sample rate up to 50 times higher than the original ADC sample rate.
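
    The core idea, acquiring at random instants and then reconstructing a signal that is sparse in some basis, can be sketched in a few dozen lines. The example below uses a DCT dictionary and a plain orthogonal matching pursuit written for the occasion; it is a generic compressive-sampling illustration, not the authors' firmware or reconstruction software.

        import numpy as np

        rng = np.random.default_rng(1)

        # Orthonormal DCT basis: column k of Psi is the k-th cosine atom (length N).
        N = 256
        t = np.arange(N)
        k = np.arange(N)
        Psi = np.sqrt(2.0 / N) * np.cos(np.pi * (2 * t[:, None] + 1) * k[None, :] / (2 * N))
        Psi[:, 0] /= np.sqrt(2.0)

        # "High-rate" signal that is exactly 5-sparse in the DCT domain.
        alpha_true = np.zeros(N)
        alpha_true[rng.choice(N, size=5, replace=False)] = rng.normal(size=5)
        signal = Psi @ alpha_true

        # Random sampling instants from the high-resolution time base: keep only M of N samples.
        M = 64
        keep = np.sort(rng.choice(N, size=M, replace=False))
        y = signal[keep]
        A = Psi[keep, :]                     # measurement matrix seen by the reconstruction

        def omp(A, y, n_nonzero):
            """Plain orthogonal matching pursuit for y ~ A @ alpha with alpha sparse."""
            residual, support = y.copy(), []
            for _ in range(n_nonzero):
                support.append(int(np.argmax(np.abs(A.T @ residual))))
                coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
                residual = y - A[:, support] @ coef
            alpha = np.zeros(A.shape[1])
            alpha[support] = coef
            return alpha

        reconstructed = Psi @ omp(A, y, n_nonzero=5)
        err = np.linalg.norm(reconstructed - signal) / np.linalg.norm(signal)
        print(f"relative reconstruction error from {M} of {N} samples: {err:.2e}")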

  9. Comparing cluster-level dynamic treatment regimens using sequential, multiple assignment, randomized trials: Regression estimation and sample size considerations.

    Science.gov (United States)

    NeCamp, Timothy; Kilbourne, Amy; Almirall, Daniel

    2017-08-01

    Cluster-level dynamic treatment regimens can be used to guide sequential treatment decision-making at the cluster level in order to improve outcomes at the individual or patient-level. In a cluster-level dynamic treatment regimen, the treatment is potentially adapted and re-adapted over time based on changes in the cluster that could be impacted by prior intervention, including aggregate measures of the individuals or patients that compose it. Cluster-randomized sequential multiple assignment randomized trials can be used to answer multiple open questions preventing scientists from developing high-quality cluster-level dynamic treatment regimens. In a cluster-randomized sequential multiple assignment randomized trial, sequential randomizations occur at the cluster level and outcomes are observed at the individual level. This manuscript makes two contributions to the design and analysis of cluster-randomized sequential multiple assignment randomized trials. First, a weighted least squares regression approach is proposed for comparing the mean of a patient-level outcome between the cluster-level dynamic treatment regimens embedded in a sequential multiple assignment randomized trial. The regression approach facilitates the use of baseline covariates which is often critical in the analysis of cluster-level trials. Second, sample size calculators are derived for two common cluster-randomized sequential multiple assignment randomized trial designs for use when the primary aim is a between-dynamic treatment regimen comparison of the mean of a continuous patient-level outcome. The methods are motivated by the Adaptive Implementation of Effective Programs Trial which is, to our knowledge, the first-ever cluster-randomized sequential multiple assignment randomized trial in psychiatry.
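
    For a rough feel of how clustering inflates the required number of clusters, the sketch below applies the classical design-effect formula for a two-arm cluster-randomized comparison of means. It is not the SMART-specific sample size calculator derived in the paper, and the effect size, intracluster correlation and cluster size are made-up inputs.

        import math
        from statistics import NormalDist

        def clusters_per_arm(delta, sd, icc, cluster_size, alpha=0.05, power=0.80):
            """Classical two-arm cluster-randomized sample size via a design effect:
            n_per_arm = 2 * ((z_{1-alpha/2} + z_power) * sd / delta)**2 * (1 + (m - 1) * icc)."""
            z = NormalDist().inv_cdf(1 - alpha / 2) + NormalDist().inv_cdf(power)
            design_effect = 1 + (cluster_size - 1) * icc
            individuals_per_arm = 2 * (z * sd / delta) ** 2 * design_effect
            return math.ceil(individuals_per_arm / cluster_size)

        # Made-up inputs: standardized effect 0.3, ICC 0.05, 20 patients per clinic (cluster).
        print(clusters_per_arm(delta=0.3, sd=1.0, icc=0.05, cluster_size=20))  # -> 18 clusters per arm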

  10. Is Virtual Reality Ready for Prime Time in the Medical Space? A Randomized Control Trial of Pediatric Virtual Reality for Acute Procedural Pain Management.

    Science.gov (United States)

    Gold, Jeffrey I; Mahrer, Nicole E

    2018-04-01

    To conduct a randomized control trial to evaluate the feasibility and efficacy of virtual reality (VR) compared with standard of care (SOC) for reducing pain, anxiety, and improving satisfaction associated with blood draw in children ages 10-21 years. In total, 143 triads (patients, their caregiver, and the phlebotomist) were recruited in outpatient phlebotomy at a pediatric hospital and randomized to receive either VR or SOC when undergoing routine blood draw. Patients and caregivers completed preprocedural and postprocedural standardized measures of pain, anxiety, and satisfaction, and phlebotomists reported about the patient's experience during the procedure. Findings showed that VR significantly reduced acute procedural pain and anxiety compared with SOC. A significant interaction between patient-reported anxiety sensitivity and treatment condition indicated that patients undergoing routine blood draw benefit more from VR intervention when they are more fearful of physiological sensations related to anxiety. Patients and caregivers in the VR condition reported high levels of satisfaction with the procedure. VR is feasible, tolerated, and well-liked by patients, caregivers, and phlebotomists alike for routine blood draw. Given the immersive and engaging nature of the VR experience, VR has the capacity to act as a preventive intervention transforming the blood draw experience into a less distressing, potentially pain-free routine medical procedure, particularly for pediatric patients with high anxiety sensitivity. VR holds promise to reduce negative health outcomes for children and reduce distress in caregivers, while facilitating increased satisfaction and throughput in hectic outpatient phlebotomy clinics.

  11. Gray bootstrap method for estimating frequency-varying random vibration signals with small samples

    Directory of Open Access Journals (Sweden)

    Wang Yanqing

    2014-04-01

    Full Text Available During environmental testing, the estimation of random vibration signals (RVS) is an important technique for airborne platform safety and reliability. However, the available methods, including the extreme value envelope method (EVEM), the statistical tolerance method (STM) and the improved statistical tolerance method (ISTM), require large samples and a typical probability distribution. Moreover, the frequency-varying characteristic of RVS is usually not taken into account. A gray bootstrap method (GBM) is proposed to solve the problem of estimating frequency-varying RVS with small samples. Firstly, the estimated indexes are obtained, including the estimated interval, the estimated uncertainty, the estimated value, the estimated error and the estimated reliability. In addition, GBM is applied to estimating the data of a single flight test of a certain aircraft. At last, in order to evaluate the estimation performance, GBM is compared with the bootstrap method (BM) and the gray method (GM) in the test analysis. The result shows that GBM has superiority for estimating dynamic signals with small samples, and the estimated reliability is proved to be 100% at the given confidence level.
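
    As a point of reference for the resampling component, the sketch below runs a plain bootstrap on a small, invented sample of RMS vibration levels. It is not the gray bootstrap (GBM) of the paper, which couples grey prediction with resampling, but it shows how an interval estimate can be obtained from only a handful of measurements.

        import numpy as np

        rng = np.random.default_rng(2)

        # Small sample of measured RMS vibration levels (hypothetical values, in g).
        sample = np.array([4.1, 3.8, 4.6, 5.0, 4.3, 3.9, 4.8])

        # Plain bootstrap: resample with replacement and collect the statistic of interest.
        B = 5000
        boot_means = np.array([rng.choice(sample, size=sample.size, replace=True).mean()
                               for _ in range(B)])

        low, high = np.percentile(boot_means, [2.5, 97.5])
        print(f"bootstrap 95% interval for the mean RMS level: [{low:.2f}, {high:.2f}] g")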

  12. Evaluation of Bias-Variance Trade-Off for Commonly Used Post-Summarizing Normalization Procedures in Large-Scale Gene Expression Studies

    Science.gov (United States)

    Qiu, Xing; Hu, Rui; Wu, Zhixin

    2014-01-01

    Normalization procedures are widely used in high-throughput genomic data analyses to remove various technological noise and variations. They are known to have a profound impact on the subsequent differential gene expression analysis. Although there has been some research in evaluating different normalization procedures, few attempts have been made to systematically evaluate the gene detection performances of normalization procedures from the bias-variance trade-off point of view, especially with strong gene differentiation effects and large sample size. In this paper, we conduct a thorough study to evaluate the effects of normalization procedures combined with several commonly used statistical tests and multiple testing procedures (MTPs) under different configurations of effect size and sample size. We conduct theoretical evaluation based on a random effect model, as well as simulation and biological data analyses to verify the results. Based on our findings, we provide some practical guidance for selecting a suitable normalization procedure under different scenarios. PMID:24941114

  13. Oral Chloral Hydrate Compared with Rectal Thiopental in Pediatric Procedural Sedation and Analgesia; a Randomized Clinical Trial

    Directory of Open Access Journals (Sweden)

    Reza Azizkhani

    2014-03-01

    Full Text Available Introduction: The increasing use of diagnostic imaging in pediatric medicine has resulted in a growing need for procedural sedation and analgesia (PSA) to minimize motion artifacts during procedures. The drug of choice in pediatric PSA has not yet been established. The aim of the present study was to compare oral chloral hydrate (OCH) and rectal sodium thiopental (RST) in pediatric PSA. Methods: In the present randomized clinical trial, children aged 2-6 years who were referred for brain computed tomography scan were enrolled and randomly divided into two groups. OCH (50 mg/kg) or RST (25 mg/kg) was prescribed, and a trained nurse recorded the time from drug prescription to reaching conscious sedation (onset of action), the total time period during which the patient had a Ramsay score ≥4 (duration of action), and adverse effects of the agents. The Mann-Whitney U test, the chi-squared test, and non-parametric analysis of covariance (ANCOVA) were used for comparisons. Results: One hundred and forty children were randomly assigned to the OCH and RST groups. The patients of the two groups had similar age, sex, weight, and baseline vital signs except for diastolic blood pressure (p<0.001). The onset of action in the OCH and RST groups was 24.5±6.1 and 28.7±5.2 minutes, respectively (p<0.001). Duration of action in the OCH and RST groups was 12.9±2.8 minutes and 13.7±2.6 minutes, respectively (p=0.085). Non-parametric ANCOVA revealed that only diastolic blood pressure was affected by drug prescription (p=0.001). In 11 (15.7%) patients in the RST group, diarrhea was observed during 24 hours (p=0.001). Oxygen desaturation was observed in only two patients, both in the OCH group. Conclusion: Each of the sedatives has advantages and disadvantages that should be considered when selecting one for inducing short-term sedation. It seems that rectal sodium thiopental and oral chloral hydrate are equally effective in pediatric PSA, and based on the patient's condition we can administer

  14. Random Numbers and Quantum Computers

    Science.gov (United States)

    McCartney, Mark; Glass, David

    2002-01-01

    The topic of random numbers is investigated in such a way as to illustrate links between mathematics, physics and computer science. First, the generation of random numbers by a classical computer using the linear congruential generator and logistic map is considered. It is noted that these procedures yield only pseudo-random numbers since…
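
    The two generators named in the abstract are easy to reproduce. The sketch below implements a linear congruential generator (with the widely used Numerical Recipes constants) and the logistic map in its chaotic regime; both are deterministic and therefore yield only pseudo-random sequences.

        from itertools import islice

        def lcg(seed, a=1664525, c=1013904223, m=2**32):
            """Linear congruential generator: x_{n+1} = (a*x_n + c) mod m, scaled to [0, 1)."""
            x = seed
            while True:
                x = (a * x + c) % m
                yield x / m

        def logistic_map(x0, r=4.0):
            """Logistic map x_{n+1} = r*x_n*(1 - x_n); chaotic for r = 4 and 0 < x0 < 1."""
            x = x0
            while True:
                x = r * x * (1.0 - x)
                yield x

        print([round(u, 4) for u in islice(lcg(seed=12345), 5)])
        print([round(u, 4) for u in islice(logistic_map(x0=0.123), 5)])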

  15. Continuous quality control of the blood sampling procedure using a structured observation scheme.

    Science.gov (United States)

    Seemann, Tine Lindberg; Nybo, Mads

    2016-10-15

    An observational study was conducted using a structured observation scheme to assess compliance with the local phlebotomy guideline, to identify necessary focus items, and to investigate whether adherence to the phlebotomy guideline improved. The questionnaire from the EFLM Working Group for the Preanalytical Phase was adapted to local procedures. A pilot study of three months duration was conducted. Based on this, corrective actions were implemented and a follow-up study was conducted. All phlebotomists at the Department of Clinical Biochemistry and Pharmacology were observed. Three blood collections by each phlebotomist were observed at each session conducted at the phlebotomy ward and the hospital wards, respectively. Error frequencies were calculated for the phlebotomy ward and the hospital wards and for the two study phases. A total of 126 blood drawings by 39 phlebotomists were observed in the pilot study, while 84 blood drawings by 34 phlebotomists were observed in the follow-up study. In the pilot study, the three major error items were hand hygiene (42% error), mixing of samples (22%), and order of draw (21%). Minor significant differences were found between the two settings. After focus on the major aspects, the follow-up study showed significant improvement for all three items at both settings (P < 0.01, P < 0.01, and P = 0.01, respectively). Continuous quality control of the phlebotomy procedure revealed a number of items not conducted in compliance with the local phlebotomy guideline. It supported significant improvements in the adherence to the recommended phlebotomy procedures and facilitated documentation of the phlebotomy quality.

  16. Selective extraction of chromium(VI) using a leaching procedure with sodium carbonate from some plant leaves, soil and sediment samples

    Energy Technology Data Exchange (ETDEWEB)

    Elci, Latif, E-mail: elci@pamukkale.edu.tr [Department of Chemistry, Pamukkale University, 20017 Denizli (Turkey); Divrikli, Umit; Akdogan, Abdullah; Hol, Aysen; Cetin, Ayse [Department of Chemistry, Pamukkale University, 20017 Denizli (Turkey); Soylak, Mustafa [Department of Chemistry, Erciyes University, 38039 Kayseri (Turkey)

    2010-01-15

    Speciation of chromium in some plant leaves, soil and sediment samples was carried out by selective leaching of Cr(VI) using a sodium carbonate leaching procedure. Total chromium from the samples was extracted using aqua regia and oxidative acid digestion, respectively. The concentrations of chromium species in the extracts were determined by graphite furnace atomic absorption spectrometry (GFAAS). Uncoated graphite furnace tubes were used as the atomizer. Due to the presence of relatively high amounts of Na₂CO₃ in the resulting samples, the possible influences of Na₂CO₃ on the absorbance signals were checked. There is no interference of Na₂CO₃ on the chromium absorbance up to 0.1 mol L⁻¹ Na₂CO₃. The limit of detection (LOD) for the determination of Cr(VI) in 0.1 mol L⁻¹ Na₂CO₃ solution by GFAAS was found to be 0.93 µg L⁻¹. The procedure was applied to environmental samples. The relative standard deviation (R.S.D.), as a measure of precision, for 10 replicate measurements of 20 µg L⁻¹ Cr in a processed soil sample was 4.2%.

  17. Selective extraction of chromium(VI) using a leaching procedure with sodium carbonate from some plant leaves, soil and sediment samples.

    Science.gov (United States)

    Elci, Latif; Divrikli, Umit; Akdogan, Abdullah; Hol, Aysen; Cetin, Ayse; Soylak, Mustafa

    2010-01-15

    Speciation of chromium in some plant leaves, soil and sediment samples was carried out by selective leaching of Cr(VI) using a sodium carbonate leaching procedure. Total chromium from the samples was extracted using aqua regia and oxidative acid digestion, respectively. The concentrations of chromium species in the extracts were determined by graphite furnace atomic absorption spectrometry (GFAAS). Uncoated graphite furnace tubes were used as the atomizer. Due to the presence of relatively high amounts of Na₂CO₃ in the resulting samples, the possible influences of Na₂CO₃ on the absorbance signals were checked. There is no interference of Na₂CO₃ on the chromium absorbance up to 0.1 mol L⁻¹ Na₂CO₃. The limit of detection (LOD) for the determination of Cr(VI) in 0.1 mol L⁻¹ Na₂CO₃ solution by GFAAS was found to be 0.93 µg L⁻¹. The procedure was applied to environmental samples. The relative standard deviation (R.S.D.), as a measure of precision, for 10 replicate measurements of 20 µg L⁻¹ Cr in a processed soil sample was 4.2%.

  18. Acute stress symptoms during the second Lebanon war in a random sample of Israeli citizens.

    Science.gov (United States)

    Cohen, Miri; Yahav, Rivka

    2008-02-01

    The aims of this study were to assess prevalence of acute stress disorder (ASD) and acute stress symptoms (ASS) in Israel during the second Lebanon war. A telephone survey was conducted in July 2006 of a random sample of 235 residents of northern Israel, who were subjected to missile attacks, and of central Israel, who were not subjected to missile attacks. Results indicate that ASS scores were higher in the northern respondents; 6.8% of the northern sample and 3.9% of the central sample met ASD criteria. Appearance of each symptom ranged from 15.4% for dissociative to 88.4% for reexperiencing, with significant differences between northern and central respondents only for reexperiencing and arousal. A low ASD rate and a moderate difference between areas subjected and not subjected to attack were found.

  19. Prevalence and correlates of problematic smartphone use in a large random sample of Chinese undergraduates.

    Science.gov (United States)

    Long, Jiang; Liu, Tie-Qiao; Liao, Yan-Hui; Qi, Chang; He, Hao-Yu; Chen, Shu-Bao; Billieux, Joël

    2016-11-17

    Smartphones are becoming a daily necessity for most undergraduates in Mainland China. Because the present scenario of problematic smartphone use (PSU) is largely unexplored, in the current study we aimed to estimate the prevalence of PSU and to screen suitable predictors for PSU among Chinese undergraduates in the framework of the stress-coping theory. A sample of 1062 undergraduate smartphone users was recruited by means of the stratified cluster random sampling strategy between April and May 2015. The Problematic Cellular Phone Use Questionnaire was used to identify PSU. We evaluated five candidate risk factors for PSU by using logistic regression analysis while controlling for demographic characteristics and specific features of smartphone use. The prevalence of PSU among Chinese undergraduates was estimated to be 21.3%. The risk factors for PSU were majoring in the humanities, high monthly income from the family (≥1500 RMB), serious emotional symptoms, high perceived stress, and perfectionism-related factors (high doubts about actions, high parental expectations). PSU among undergraduates appears to be ubiquitous and thus constitutes a public health issue in Mainland China. Although further longitudinal studies are required to test whether PSU is a transient phenomenon or a chronic and progressive condition, our study successfully identified socio-demographic and psychological risk factors for PSU. These results, obtained from a random and thus representative sample of undergraduates, opens up new avenues in terms of prevention and regulation policies.

  20. Randomized Comparison of Two Vaginal Self-Sampling Methods for Human Papillomavirus Detection: Dry Swab versus FTA Cartridge

    OpenAIRE

    Catarino, Rosa; Vassilakos, Pierre; Bilancioni, Aline; Vanden Eynde, Mathieu; Meyer-Hamme, Ulrike; Menoud, Pierre-Alain; Guerry, Frédéric; Petignat, Patrick

    2015-01-01

    Background Human papillomavirus (HPV) self-sampling (self-HPV) is valuable in cervical cancer screening. HPV testing is usually performed on physician-collected cervical smears stored in liquid-based medium. Dry filters and swabs are an alternative. We evaluated the adequacy of self-HPV using two dry storage and transport devices, the FTA cartridge and swab. Methods A total of 130 women performed two consecutive self-HPV samples. Randomization determined which of the two tests was performed f...

  1. A radiochemical procedure for a low-level measurement of 241Am in environmental samples using a supported functional organophosphorus extractant

    International Nuclear Information System (INIS)

    Gasco, C.; Anton, M. P.; Alvarez, A.; Navarro, N.; Salvador, S.

    1994-01-01

    The analysis of transuranides in environmental samples is carried out at CIEMAT using standardized methods based on sequential separation with ion-exchange resins. The americium fraction is purified through a two-layer ion-exchange column and later on an anion-exchange column in nitric acid-methanol medium. The technique is time consuming and the results are not completely satisfactory (low recovery and loss of alpha resolution) for some samples. The chemical compound CMPO (octyl(phenyl)-N,N-diisobutylcarbamoylmethylphosphine oxide) dissolved in TBP (tributyl phosphate) and supported on an inert substrate has been tested directly for 241Am analysis by a large number of laboratories. A new method that combines both procedures has been developed. The details of the improved procedure are described in this paper. The advantages of its application to environmental samples (urine, faeces and sediments) are discussed. The use of standard samples with certified americium concentrations confirms the reliability of our measurements. (Author) 8 refs

  2. Discriminative motif discovery via simulated evolution and random under-sampling.

    Directory of Open Access Journals (Sweden)

    Tao Song

    Full Text Available Conserved motifs in biological sequences are closely related to their structure and functions. Recently, discriminative motif discovery methods have attracted more and more attention. However, little attention has been devoted to the data imbalance problem, which is one of the main reasons affecting the performance of discriminative models. In this article, a simulated evolution method is applied to solve the multi-class imbalance problem at the stage of data preprocessing, and at the stage of Hidden Markov Model (HMM) training, a random under-sampling method is introduced for the imbalance between the positive and negative datasets. It is shown that, in the task of discovering targeting motifs of nine subcellular compartments, the motifs found by our method are more conserved than those found by methods that do not consider the data imbalance problem, and recover the most known targeting motifs from Minimotif Miner and InterPro. Meanwhile, we use the found motifs to predict protein subcellular localization and achieve higher prediction precision and recall for the minority classes.

  3. Discriminative motif discovery via simulated evolution and random under-sampling.

    Science.gov (United States)

    Song, Tao; Gu, Hong

    2014-01-01

    Conserved motifs in biological sequences are closely related to their structure and functions. Recently, discriminative motif discovery methods have attracted more and more attention. However, little attention has been devoted to the data imbalance problem, which is one of the main reasons affecting the performance of discriminative models. In this article, a simulated evolution method is applied to solve the multi-class imbalance problem at the stage of data preprocessing, and at the stage of Hidden Markov Model (HMM) training, a random under-sampling method is introduced for the imbalance between the positive and negative datasets. It is shown that, in the task of discovering targeting motifs of nine subcellular compartments, the motifs found by our method are more conserved than those found by methods that do not consider the data imbalance problem, and recover the most known targeting motifs from Minimotif Miner and InterPro. Meanwhile, we use the found motifs to predict protein subcellular localization and achieve higher prediction precision and recall for the minority classes.
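
    Random under-sampling itself is a small, generic operation: discard randomly chosen majority-class examples until the classes are balanced. The sketch below shows this on toy data; it is not the authors' HMM training pipeline, only the balancing step.

        import numpy as np

        rng = np.random.default_rng(3)

        def random_under_sample(X, y):
            """Balance a dataset by discarding random examples from the larger classes."""
            classes, counts = np.unique(y, return_counts=True)
            n_min = counts.min()
            keep = np.concatenate([rng.choice(np.flatnonzero(y == c), size=n_min, replace=False)
                                   for c in classes])
            rng.shuffle(keep)
            return X[keep], y[keep]

        # Toy imbalanced data: 900 negatives, 100 positives.
        X = rng.normal(size=(1000, 5))
        y = np.array([0] * 900 + [1] * 100)
        Xb, yb = random_under_sample(X, y)
        print(np.bincount(yb))        # -> [100 100]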

  4. HASL procedures manual

    International Nuclear Information System (INIS)

    Harley, J.H.

    1977-08-01

    Additions and corrections to the following sections of the HASL Procedures Manual are provided: General, Sampling, Field Measurements; General Analytical Chemistry, Chemical Procedures, Data Section, and Specifications

  5. Procedure-related risk of miscarriage following amniocentesis and chorionic villus sampling: a systematic review and meta-analysis.

    Science.gov (United States)

    Akolekar, R; Beta, J; Picciarelli, G; Ogilvie, C; D'Antonio, F

    2015-01-01

    To estimate procedure-related risks of miscarriage following amniocentesis and chorionic villus sampling (CVS) based on a systematic review of the literature and a meta-analysis. A search of MEDLINE, EMBASE, CINAHL and The Cochrane Library (2000-2014) was performed to review relevant citations reporting procedure-related complications of amniocentesis and CVS. Only studies reporting data on more than 1000 procedures were included in this review to minimize the effect of bias from smaller studies. Heterogeneity between studies was estimated using Cochran's Q, the I² statistic and the Egger bias test. Meta-analysis of proportions was used to derive weighted pooled estimates for the risk of miscarriage before 24 weeks' gestation. Incidence-rate difference meta-analysis was used to estimate pooled procedure-related risks. The weighted pooled risks of miscarriage following invasive procedures were estimated from analysis of controlled studies including 324 losses in 42 716 women who underwent amniocentesis and 207 losses in 8899 women who underwent CVS. The risk of miscarriage prior to 24 weeks in women who underwent amniocentesis and CVS was 0.81% (95% CI, 0.58-1.08%) and 2.18% (95% CI, 1.61-2.82%), respectively. The background rates of miscarriage in women from the control group that did not undergo any procedures were 0.67% (95% CI, 0.46-0.91%) for amniocentesis and 1.79% (95% CI, 0.61-3.58%) for CVS. The weighted pooled procedure-related risks of miscarriage for amniocentesis and CVS were 0.11% (95% CI, -0.04 to 0.26%) and 0.22% (95% CI, -0.71 to 1.16%), respectively. The procedure-related risks of miscarriage following amniocentesis and CVS are much lower than those currently quoted. Copyright © 2014 ISUOG. Published by John Wiley & Sons Ltd.

  6. Integrated preservation and sample clean up procedures for studying water ingestion by recreational swimmers via urinary biomarker determination.

    Science.gov (United States)

    Cantú, Ricardo; Shoemaker, Jody A; Kelty, Catherine A; Wymer, Larry J; Behymer, Thomas D; Dufour, Alfred P; Magnuson, Matthew L

    2017-08-22

    The use of cyanuric acid as a biomarker for ingestion of swimming pool water may lead to quantitative knowledge of the volume of water ingested during swimming, contributing to a better understanding of disease resulting from ingestion of environmental contaminants. When swimming pool water containing chlorinated cyanurates is inadvertently ingested, cyanuric acid is excreted quantitatively within 24 h as a urinary biomarker of ingestion. Because the volume of water ingested can be quantitatively estimated by calculation from the concentration of cyanuric acid in 24 h urine samples, a procedure for preservation, cleanup, and analysis of cyanuric acid was developed to meet the logistical demands of large-scale studies. From a practical standpoint, urine collected from swimmers cannot be analyzed immediately, given the requirements of sample collection, shipping, handling, etc. Thus, to maintain quality control and allow confidence in the results, it is necessary to preserve the samples in a manner that ensures as quantitative an analysis as possible. The preservation and clean-up of cyanuric acid in urine is complicated because typical approaches are often incompatible with the keto-enol tautomerization of cyanuric acid, interfering with cyanuric acid sample preparation, chromatography, and detection. Therefore, this paper presents a novel integration of sample preservation, clean-up, chromatography, and detection to determine cyanuric acid in 24 h urine samples. Fortification of urine with cyanuric acid (0.3-3.0 mg/L) demonstrated accuracy (86-93% recovery) and high reproducibility (low RSD). Short holding times of unpreserved urine suggested sufficient cyanuric acid stability for sample collection procedures, while longer holding times suggested instability of the unpreserved urine. Preserved urine exhibited a loss of around 0.5% after 22 days at refrigerated storage conditions of 4 °C. Published by Elsevier B.V.

  7. A very sensitive LSC procedure to determine Ni-63 in environmental samples, steel and concrete

    International Nuclear Information System (INIS)

    Scheuerer, C.; Schupfner, R.; Schuettelkopf, H.

    1995-01-01

    This procedure to determine Ni-63 contributes to a safe and economically reasonable decommissioning of nuclear power plants. Co-60, Fe-55 and Ni-63 are the most abundant long-lived radionuclides associated with contaminated piping, hardware and concrete for a period of several decades after shutdown. Samples are carefully ashed, leached, or dissolved by suitable mixtures of acids. The analysis starts with the absorption of Ni²⁺ on the chelating resin CHELEX 100. The next purification steps include an anion-exchange column and a precipitation as Ni-dimethylglyoxime, which is extracted into chloroform. After re-extraction with sulfuric acid, the solution containing Ni²⁺ is mixed with a scintillation cocktail and counted in an anticoincidence-shielded LSC. The decontamination factors, determined for all important artificial and naturally occurring radionuclides, range from above 10⁴ to 10⁹. The chemical yield is (95±5)%. For maximum sample amounts of 0.4 g steel, 5 g concrete and about 100 g of environmental sample, the detection limits are about 5 mBq per sample, or 12 mBq/g steel, 1 mBq/g concrete and 0.05 mBq/g environmental sample, at a counting time of 1000 minutes. (author) 16 refs.; 2 figs.; 2 tabs

  8. A column exchange chromatographic procedure for the automated purification of analytical samples in nuclear spent fuel reprocessing and plutonium fuel fabrication

    International Nuclear Information System (INIS)

    Zahradnik, P.; Swietly, H.; Doubek, N.; Bagliano, G.

    1992-11-01

    A column exchange chromatographic procedure using tri-n-octylphosphine oxide (TOPO) as the stationary phase has been in routine use at SAL since 1984 on nuclear spent fuel reprocessing and Pu product samples, prior to alpha and mass spectrometric analysis. This standard procedure was later modified with a view to its automation in a glove box; the resulting new procedure is described in this paper. Laboratory Robot Compatible (LRC) disposable columns were selected because their dimensions are particularly favorable and reproducible. A less corrosive HNO₃-HI mixture substituted the former HCl-HI plutonium eluant. The inorganic support of the stationary phase used to test the above-mentioned changes was unexpectedly withdrawn from the market, so another support had to be selected and the procedure reoptimized accordingly. The resulting procedure was tested with the robot and validated against the manual procedure taken as reference: the comparison showed that the modified procedure meets the analytical requirements and has the same performance as the original procedure. (author). Refs, figs and tabs

  9. Optimized pre-thinning procedures of ion-beam thinning for TEM sample preparation by magnetorheological polishing.

    Science.gov (United States)

    Luo, Hu; Yin, Shaohui; Zhang, Guanhua; Liu, Chunhui; Tang, Qingchun; Guo, Meijian

    2017-10-01

    Ion-beam thinning is a well-established sample preparation technique for transmission electron microscopy (TEM), but tedious, labor-consuming pre-thinning procedures can seriously reduce its efficiency. In this work, we present a simple pre-thinning technique that uses magnetorheological (MR) polishing to replace manual lapping and dimpling, and demonstrate the successful preparation of electron-transparent single-crystal silicon samples after MR polishing and single-sided ion milling. Dimples pre-thinned to less than 30 microns and with little mechanical surface damage were repeatedly produced under optimized MR polishing conditions. Samples pre-thinned by both MR polishing and the traditional technique were ion-beam thinned from the rear side until perforation, and then observed by optical microscopy and TEM. The results show that the specimen pre-thinned by the MR technique was free from dimpling-related defects, which were still present in the sample pre-thinned by the conventional technique. Nice high-resolution TEM images could be acquired after MR polishing and single-sided ion thinning. MR polishing promises to be an adaptable and efficient method for pre-thinning in the preparation of TEM specimens, especially for brittle ceramics. Copyright © 2017 Elsevier B.V. All rights reserved.

  10. Possible overestimation of surface disinfection efficiency by assessment methods based on liquid sampling procedures as demonstrated by in situ quantification of spore viability.

    Science.gov (United States)

    Grand, I; Bellon-Fontaine, M-N; Herry, J-M; Hilaire, D; Moriconi, F-X; Naïtali, M

    2011-09-01

    The standard test methods used to assess the efficiency of a disinfectant applied to surfaces are often based on counting the microbial survivors sampled in a liquid, but total cell removal from surfaces is seldom achieved. One might therefore wonder whether evaluations of microbial survivors in liquid-sampled cells are representative of the levels of survivors in whole populations. The present study was thus designed to determine the "damaged/undamaged" status induced by a peracetic acid disinfection for Bacillus atrophaeus spores deposited on glass coupons directly on this substrate and to compare it to the status of spores collected in liquid by a sampling procedure. The method utilized to assess the viability of both surface-associated and liquid-sampled spores included fluorescence labeling with a combination of Syto 61 and Chemchrome V6 dyes and quantifications by analyzing the images acquired by confocal laser scanning microscopy. The principal result of the study was that the viability of spores sampled in the liquid was found to be poorer than that of surface-associated spores. For example, after 2 min of peracetic acid disinfection, less than 17% ± 5% of viable cells were detected among liquid-sampled cells compared to 79% ± 5% or 47% ± 4%, respectively, when the viability was evaluated on the surface after or without the sampling procedure. Moreover, assessments of the survivors collected in the liquid phase, evaluated using the microscopic method and standard plate counts, were well correlated. Evaluations based on the determination of survivors among the liquid-sampled cells can thus overestimate the efficiency of surface disinfection procedures.

  11. Accounting for randomness in measurement and sampling in studying cancer cell population dynamics.

    Science.gov (United States)

    Ghavami, Siavash; Wolkenhauer, Olaf; Lahouti, Farshad; Ullah, Mukhtar; Linnebacher, Michael

    2014-10-01

    Knowing the expected temporal evolution of the proportion of different cell types in sample tissues gives an indication about the progression of the disease and its possible response to drugs. Such systems have been modelled using Markov processes. We here consider an experimentally realistic scenario in which transition probabilities are estimated from noisy cell population size measurements. Using aggregated data of FACS measurements, we develop MMSE and ML estimators and formulate two problems to find the minimum number of required samples and measurements to guarantee the accuracy of predicted population sizes. Our numerical results show that the transition probabilities and steady states to which the estimates converge differ widely from the real values if one uses the standard deterministic approach for noisy measurements. This provides support for our argument that for the analysis of FACS data one should consider the observed state as a random variable. The second problem we address concerns the consequences of estimating the probability of a cell being in a particular state from measurements of a small population of cells. We show how the uncertainty arising from small sample sizes can be captured by a distribution for the state probability.
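
    Ignoring the measurement noise for a moment, the count-based maximum-likelihood estimator of a Markov transition matrix is simple: divide the number of observed i-to-j transitions by the number of departures from state i. The sketch below uses invented cell-state trajectories and is the naive estimator, not the MMSE/ML machinery developed in the paper for noisy, aggregated FACS data.

        import numpy as np

        def ml_transition_matrix(sequences, n_states):
            """Maximum-likelihood transition matrix from observed state sequences:
            P[i, j] = count(i -> j) / count(i -> anything)."""
            counts = np.zeros((n_states, n_states))
            for seq in sequences:
                for a, b in zip(seq[:-1], seq[1:]):
                    counts[a, b] += 1
            row_sums = counts.sum(axis=1, keepdims=True)
            return np.divide(counts, row_sums, out=np.zeros_like(counts), where=row_sums > 0)

        # Hypothetical cell-state trajectories (0 = proliferating, 1 = quiescent, 2 = dead).
        sequences = [[0, 0, 1, 0, 2], [0, 1, 1, 2], [0, 0, 0, 1, 1, 2]]
        print(ml_transition_matrix(sequences, n_states=3))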

  12. RANdom SAmple Consensus (RANSAC) algorithm for material-informatics: application to photovoltaic solar cells.

    Science.gov (United States)

    Kaspi, Omer; Yosipof, Abraham; Senderowitz, Hanoch

    2017-06-06

    An important aspect of chemoinformatics and material-informatics is the usage of machine learning algorithms to build Quantitative Structure Activity Relationship (QSAR) models. The RANdom SAmple Consensus (RANSAC) algorithm is a predictive modeling tool widely used in the image processing field for cleaning datasets from noise. RANSAC could be used as a "one stop shop" algorithm for developing and validating QSAR models, performing outlier removal, descriptor selection, model development and predictions for test set samples using an applicability domain. For "future" predictions (i.e., for samples not included in the original test set) RANSAC provides a statistical estimate for the probability of obtaining reliable predictions, i.e., predictions within a pre-defined number of standard deviations from the true values. In this work we describe the first application of RANSAC in material informatics, focusing on the analysis of solar cells. We demonstrate that for three datasets representing different metal oxide (MO) based solar cell libraries, RANSAC-derived models select descriptors previously shown to correlate with key photovoltaic properties and lead to good predictive statistics for these properties. These models were subsequently used to predict the properties of virtual solar cell libraries, highlighting interesting dependencies of PV properties on MO compositions.
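
    RANSAC itself is compact enough to sketch directly: fit a model to a random minimal subset, count the points that agree with it within a tolerance, keep the best-supported model and refit on its inliers. The example below does this for a straight line on toy data standing in for a property-descriptor relationship; it is a generic RANSAC sketch, not the QSAR workflow of the paper.

        import numpy as np

        rng = np.random.default_rng(4)

        def ransac_line(x, y, n_iters=200, inlier_tol=0.5, min_inliers=10):
            """Fit y = a*x + b by RANSAC: repeatedly fit to a random minimal sample,
            keep the model supported by the largest inlier set, then refit on those inliers."""
            best_inliers = None
            for _ in range(n_iters):
                i, j = rng.choice(x.size, size=2, replace=False)
                if x[i] == x[j]:
                    continue
                a = (y[j] - y[i]) / (x[j] - x[i])
                b = y[i] - a * x[i]
                inliers = np.abs(y - (a * x + b)) < inlier_tol
                if inliers.sum() >= min_inliers and (best_inliers is None
                                                     or inliers.sum() > best_inliers.sum()):
                    best_inliers = inliers
            a, b = np.polyfit(x[best_inliers], y[best_inliers], deg=1)
            return a, b, best_inliers

        # Toy data: a linear trend plus gross outliers.
        x = np.linspace(0, 10, 100)
        y = 1.8 * x + 2.0 + rng.normal(0, 0.2, size=x.size)
        y[rng.choice(x.size, size=15, replace=False)] += rng.normal(0, 8, size=15)

        a, b, inliers = ransac_line(x, y)
        print(f"slope {a:.2f}, intercept {b:.2f}, {inliers.sum()} inliers kept of {x.size}")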

  13. Relationship between accuracy and number of samples on statistical quantity and contour map of environmental gamma-ray dose rate. Example of random sampling

    International Nuclear Information System (INIS)

    Matsuda, Hideharu; Minato, Susumu

    2002-01-01

    The accuracy of statistical quantities such as the mean value and the contour map obtained by measurement of the environmental gamma-ray dose rate was evaluated by random sampling of 5 different model distribution maps made with the mean slope, -1.3, of power spectra calculated from actually measured values. The values were derived from 58 natural gamma dose rate data sets reported worldwide, with means in the range of 10-100 nGy/h and areas in the range of 10⁻³-10⁷ km². The accuracy of the mean value was found to be around ±7% even for 60 or 80 samplings (the most frequent number), and the standard deviation had an accuracy of less than 1/4-1/3 of the mean. The correlation coefficient of the frequency distribution was found to be 0.860 or more for 200-400 samplings (the most frequent number), but that of the contour map was 0.502-0.770. (K.H.)

  14. Randomized trial of one-hour sodium bicarbonate vs standard periprocedural saline hydration in chronic kidney disease patients undergoing cardiovascular contrast procedures.

    Directory of Open Access Journals (Sweden)

    Judith Kooiman

    Full Text Available Guidelines advise periprocedural saline hydration for prevention of contrast induced-acute kidney injury (CI-AKI). We analysed whether 1-hour sodium bicarbonate hydration administered solely prior to intra-arterial contrast exposure is non-inferior to standard periprocedural saline hydration in chronic kidney disease (CKD) patients undergoing elective cardiovascular diagnostic or interventional contrast procedures. We performed an open-label multicentre non-inferiority trial between 2011-2014. Patients were randomized to 1 hour pre-procedure sodium bicarbonate hydration (250 ml 1.4%, N = 168) or 4-12 hours saline hydration (1000 ml 0.9%, N = 165) prior to and following contrast administration (2000 ml of saline total). Primary outcome was the relative serum creatinine increase (%) 48-96 hours post contrast exposure. Secondary outcomes were: incidence of CI-AKI (serum creatinine increase >25% or >44 μmol/L), recovery of renal function, the need for dialysis, and hospital costs within two months follow-up. Mean relative creatinine increase was 3.1% (95%CI 0.9 to 5.2%) in the bicarbonate and 1.1% (95%CI -1.2 to 3.5%) in the saline arm, mean difference 1.9% (95%CI -1.2 to 5.1%, p-non-inferiority <0.001). CI-AKI occurred in 11 (6.7%) patients randomized to sodium bicarbonate and 12 (7.5%) to saline (p = 0.79). Renal function did not fully recover in 40.0% and 44.4% of CI-AKI patients, respectively (p = 0.84). No patient required dialysis. Mean costs for preventive hydration and clinical preparation for the contrast procedure were $1158 for sodium bicarbonate vs. $1561 for saline (p < 0.001). Short hydration with sodium bicarbonate prior to elective cardiovascular diagnostic or therapeutic contrast procedures is non-inferior to standard periprocedural saline hydration in CKD patients with respect to renal safety and results in considerable healthcare savings. Netherlands Trial Register (http://www.trialregister.nl/trialreg/index.asp), Nr NTR2699.

  15. Randomized trial of one-hour sodium bicarbonate vs standard periprocedural saline hydration in chronic kidney disease patients undergoing cardiovascular contrast procedures.

    Science.gov (United States)

    Kooiman, Judith; de Vries, Jean-Paul P M; Van der Heyden, Jan; Sijpkens, Yvo W J; van Dijkman, Paul R M; Wever, Jan J; van Overhagen, Hans; Vahl, Antonie C; Aarts, Nico; Verberk-Jonkers, Iris J A M; Brulez, Harald F H; Hamming, Jaap F; van der Molen, Aart J; Cannegieter, Suzanne C; Putter, Hein; van den Hout, Wilbert B; Kilicsoy, Inci; Rabelink, Ton J; Huisman, Menno V

    2018-01-01

    Guidelines advise periprocedural saline hydration for prevention of contrast induced-acute kidney injury (CI-AKI). We analysed whether 1-hour sodium bicarbonate hydration administered solely prior to intra-arterial contrast exposure is non-inferior to standard periprocedural saline hydration in chronic kidney disease (CKD) patients undergoing elective cardiovascular diagnostic or interventional contrast procedures. We performed an open-label multicentre non-inferiority trial between 2011-2014. Patients were randomized to 1 hour pre-procedure sodium bicarbonate hydration (250 ml 1.4%, N = 168) or 4-12 hours saline hydration (1000 ml 0.9%, N = 165) prior to and following contrast administration (2000 ml of saline total). Primary outcome was the relative serum creatinine increase (%) 48-96 hours post contrast exposure. Secondary outcomes were: incidence of CI-AKI (serum creatinine increase>25% or >44μmol/L), recovery of renal function, the need for dialysis, and hospital costs within two months follow-up. Mean relative creatinine increase was 3.1% (95%CI 0.9 to 5.2%) in the bicarbonate and 1.1% (95%CI -1.2 to 3.5%) in the saline arm, mean difference 1.9% (95%CI -1.2 to 5.1%, p-non-inferiority <0.001). CI-AKI occurred in 11 (6.7%) patients randomized to sodium bicarbonate and 12 (7.5%) to saline (p = 0.79). Renal function did not fully recover in 40.0% and 44.4% of CI-AKI patients, respectively (p = 0.84). No patient required dialysis. Mean costs for preventive hydration and clinical preparation for the contrast procedure were $1158 for sodium bicarbonate vs. $1561 for saline (p < 0.001). Short hydration with sodium bicarbonate prior to elective cardiovascular diagnostic or therapeutic contrast procedures is non-inferior to standard periprocedural saline hydration in CKD patients with respect to renal safety and results in considerable healthcare savings. Netherlands Trial Register (http://www.trialregister.nl/trialreg/index.asp), Nr NTR2699.

  16. A Procedure for the Sequential Determination of Radionuclides in Environmental Samples. Liquid Scintillation Counting and Alpha Spectrometry for 90Sr, 241Am and Pu Radioisotopes

    International Nuclear Information System (INIS)

    2014-01-01

    Since 2004, IAEA activities related to the terrestrial environment have aimed at the development of a set of procedures to determine radionuclides in environmental samples. Reliable, comparable and ‘fit for purpose’ results are an essential requirement for any decision based on analytical measurements. For the analyst, tested and validated analytical procedures are extremely important tools for the production of analytical data. For maximum utility, such procedures should be comprehensive, clearly formulated and readily available for reference to both the analyst and the customer. This publication describes a combined procedure for the sequential determination of 90Sr, 241Am and Pu radioisotopes in environmental samples. The method is based on the chemical separation of strontium, americium and plutonium using ion exchange chromatography, extraction chromatography and precipitation followed by alpha spectrometric and liquid scintillation counting detection. The method was tested and validated in terms of repeatability and trueness in accordance with International Organization for Standardization (ISO) guidelines using reference materials and proficiency test samples. Reproducibility tests were performed later at the IAEA Terrestrial Environment Laboratory. The calculations of the massic activity, uncertainty budget, decision threshold and detection limit are also described in this publication. The procedure is introduced for the determination of 90Sr, 241Am and Pu radioisotopes in environmental samples such as soil, sediment, air filter and vegetation samples. It is expected to be of general use to a wide range of laboratories, including the Analytical Laboratories for the Measurement of Environmental Radioactivity (ALMERA) network for routine environmental monitoring purposes

  17. Effective Recruitment of Schools for Randomized Clinical Trials: Role of School Nurses.

    Science.gov (United States)

    Petosa, R L; Smith, L

    2017-01-01

    In school settings, nurses lead efforts to improve student health and well-being in support of academic success. Nurses are guided by evidence-based practice and data to inform care decisions. The randomized controlled trial (RCT) is considered the gold standard of scientific rigor for clinical trials. RCTs are critical to the development of evidence-based health promotion programs in schools. The purpose of this article is to present practical solutions for implementing principles of randomization in RCTs conducted in school settings. Randomization is a powerful sampling method used to build internal and external validity. The school's daily organization and educational mission present several barriers to randomization. Based on the authors' experience in conducting school-based RCTs, they offer a host of practical solutions for working with schools to successfully implement randomization procedures.

  18. Characterization of Friction Joints Subjected to High Levels of Random Vibration

    Science.gov (United States)

    deSantos, Omar; MacNeal, Paul

    2012-01-01

    This paper describes the test program in detail including test sample description, test procedures, and vibration test results of multiple test samples. The material pairs used in the experiment were Aluminum-Aluminum, Aluminum- Dicronite coated Aluminum, and Aluminum-Plasmadize coated Aluminum. Levels of vibration for each set of twelve samples of each material pairing were gradually increased until all samples experienced substantial displacement. Data was collected on 1) acceleration in all three axes, 2) relative static displacement between vibration runs utilizing photogrammetry techniques, and 3) surface galling and contaminant generation. This data was used to estimate the values of static friction during random vibratory motion when "stick-slip" occurs and compare these to static friction coefficients measured before and after vibration testing.

  19. U.S.-MEXICO BORDER PROGRAM ARIZONA BORDER STUDY--STANDARD OPERATING PROCEDURE FOR COLLECTION, STORAGE, AND SHIPMENT OF URINE SAMPLES FOR METALS AND PESTICIDES ANALYSIS (UA-F-20.1)

    Science.gov (United States)

    The purpose of this SOP is to guide the collection, storage, and shipment of urine samples. This SOP provides a brief description of sample collection, preservation, storage, shipping, and custody procedures. This procedure was followed to ensure consistent data retri...

  20. Procedures for sampling and sample-reduction within quality assurance systems for solid biofuels

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2005-04-15

    The bias introduced when sampling solid biofuels from stockpiles or containers instead of from moving streams is assessed, as well as the number and size of samples required to represent the bulk sample accurately, the variations introduced when reducing bulk samples into samples for testing, and the usefulness of sample-reduction methods. Details are given of the experimental work carried out in Sweden and Denmark using sawdust, wood chips, wood pellets, forestry residues and straw. The production of a model European Standard for quality assurance of solid biofuels is examined.

  1. Minimization of the blank values in the neutron activation analysis of biological samples considering the whole procedure

    International Nuclear Information System (INIS)

    Lux, F.; Bereznai, T.; Haeberlin, T.H.

    1986-01-01

    In our determination of the trace element contents of animal tissue by neutron activation analysis, in the course of structure-activity relationship studies on platinum-containing cancer drugs and wound healing, we have tried to minimize the blank values that are caused by different sources of contamination during surgery, sampling and the activation analysis procedure. The following topics have been investigated: the abrasion from scalpels made of stainless steel, titanium or quartz; the type of surgery; the homogenisation of the samples before irradiation by use of a ball mill; and the surface contamination of the quartz ampoules that passes into the digestion solution of the irradiated samples. The appropriate measures taken in order to reduce the blank values are described. The results of analyses performed under these conditions indicate the effectiveness of the given measures, especially shown by the low values obtained for the chromium contents of the analysed muscle samples. (author)

  2. Sequential extraction procedure for determination of uranium, thorium, radium, lead and polonium radionuclides by alpha spectrometry in environmental samples

    Science.gov (United States)

    Oliveira, J. M.; Carvalho, F. P.

    2006-01-01

    A sequential extraction technique was developed and tested for common naturally occurring radionuclides. This technique allows the extraction and purification of uranium, thorium, radium, lead, and polonium radionuclides from the same sample. Environmental materials such as water, soil, and biological samples can be analyzed for those radionuclides without matrix interferences affecting the quality of radioelement purification or the radiochemical yield. The use of isotopic tracers (232U, 229Th, 224Ra, 209Po, and stable lead carrier) added to the sample at the beginning of the chemical procedure enables accurate control of the radiochemical yield for each radioelement. The ion extraction procedure, applied after either complete dissolution of the solid sample with mineral acids or co-precipitation of the dissolved radionuclides with MnO2 for aqueous samples, includes the use of commercially available pre-packed columns from Eichrom® and ion exchange columns packed with Bio-Rad resins, three chromatography columns in total. All radioactive elements but one are purified and electroplated on stainless steel discs. Polonium is spontaneously plated on a silver disc. The discs are measured using high-resolution silicon surface barrier detectors. 210Pb, a beta emitter, can be measured either through the beta emission of 210Bi or, after storage for a few months, by alpha spectrometry through the in-growth of 210Po. This sequential extraction chromatography technique was tested and validated with the analysis of certified reference materials from the IAEA. Reproducibility was tested through repeated analysis of the same homogeneous material (water sample).

  3. Choosing a suitable sample size in descriptive sampling

    International Nuclear Information System (INIS)

    Lee, Yong Kyun; Choi, Dong Hoon; Cha, Kyung Joon

    2010-01-01

    Descriptive sampling (DS) is an alternative to crude Monte Carlo sampling (CMCS) in finding solutions to structural reliability problems. It is known to be an effective sampling method in approximating the distribution of a random variable because it uses the deterministic selection of sample values and their random permutation. However, because this method is difficult to apply to complex simulations, the sample size is occasionally determined without thorough consideration. Input sample variability may cause the sample size to change between runs, leading to poor simulation results. This paper proposes a numerical method for choosing a suitable sample size for use in DS. Using this method, one can estimate a more accurate probability of failure in a reliability problem while running a minimal number of simulations. The method is then applied to several examples and compared with CMCS and conventional DS to validate its usefulness and efficiency.
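
    As a rough illustration of the descriptive-sampling idea summarized above (deterministic quantile values followed by a random permutation), the following Python sketch compares DS with crude Monte Carlo on a toy limit state; the distributions, the limit-state function and all names are hypothetical and chosen only for illustration.

```python
import numpy as np
from scipy.stats import norm

def descriptive_sample(dist, n, rng):
    """Descriptive sampling: deterministic quantile values, randomly permuted."""
    p = (np.arange(n) + 0.5) / n      # midpoints of n equal-probability strata
    values = dist.ppf(p)              # deterministic selection via the inverse CDF
    return rng.permutation(values)    # random permutation of the selected values

rng = np.random.default_rng(42)
n = 1000

# Toy limit state (hypothetical): failure when X1 + X2 > 5.
x1 = descriptive_sample(norm(loc=1.0, scale=1.0), n, rng)
x2 = descriptive_sample(norm(loc=2.0, scale=1.5), n, rng)
pf_ds = np.mean(x1 + x2 > 5.0)

# Crude Monte Carlo with the same sample size, for comparison.
pf_cmc = np.mean(norm(1.0, 1.0).rvs(n, random_state=1)
                 + norm(2.0, 1.5).rvs(n, random_state=2) > 5.0)
print(pf_ds, pf_cmc)
```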

  4. Optimal Order of Successive Office Hysteroscopy and Endometrial Biopsy for the Evaluation of Abnormal Uterine Bleeding: A Randomized Controlled Trial.

    Science.gov (United States)

    Sarkar, Papri; Mikhail, Emad; Schickler, Robyn; Plosker, Shayne; Imudia, Anthony N

    2017-09-01

    To estimate the optimal order of office hysteroscopy and endometrial biopsy when performed successively for evaluation of abnormal uterine bleeding. Patients undergoing successive office hysteroscopy and endometrial biopsy were included in a single-blind, prospective, randomized trial. The primary outcome was to evaluate the effect of order of procedures on patients' pain score. Prespecified secondary outcomes include procedure duration, hysteroscopic visualization of the uterine cavity, endometrial sample adequacy, and number of attempts at biopsy. Pain scores were assessed using a visual analog scale from 0 to 10 and endometrial sample adequacy was determined from the histopathology report. Hysteroscopy images were recorded. A sample size of 34 per group (n=68) was determined to be adequate to detect a difference of 20% in visual analog scale score between hysteroscopy first (group A) and biopsy first (group B) at α of 0.05 and 80% power. Between October 2015 and January 2017, 78 women were randomized to group A (n=40) and group B (n=38). There was no difference in global pain perception [7 (0-10) vs 7 (0-10); P=.57, 95% CI 5.8-7.1]. Procedure duration [3 (1-9) vs 3 (2-10), P=.32, 95% CI 3.3-4.1] and endometrial sample adequacy (78.9% vs 75.7%, P=.74) were similar in both groups. Group A patients had better endometrial visualization. In women evaluated for abnormal uterine bleeding, the global pain perception and time required are independent of the order in which the procedures are performed. Performing hysteroscopy first ensures a better image, whereas biopsy first yields an adequate tissue sample with fewer attempts. ClinicalTrials.gov, NCT02472184.

  5. Sampling for stereology in lungs

    Directory of Open Access Journals (Sweden)

    J. R. Nyengaard

    2006-12-01

    Full Text Available The present article reviews the relevant stereological estimators for obtaining reliable quantitative structural data from the lungs. Stereological sampling achieves reliable, quantitative information either about the whole lung or complete lobes, whilst minimising the workload. Studies have used systematic random sampling, which has fixed and constant sampling probabilities on all blocks, sections and fields of view. For an estimation of total lung or lobe volume, the Cavalieri principle can be used, but it is not useful in estimating individual cell volume due to various effects from over- or underprojection. If the number of certain structures is required, two methods can be used: the disector and the fractionator. The disector method is a three-dimensional stereological probe for sampling objects according to their number. However, it may be affected by tissue deformation and, therefore, the fractionator method is often the preferred sampling principle. In this method, a known and predetermined fraction of an object is sampled in one or more steps, with the final step estimating the number. Both methods can be performed in a physical and optical manner, therefore enabling cells and larger lung structure numbers (e.g. number of alveoli to be estimated. Some estimators also require randomisation of orientation, so that all directions have an equal chance of being chosen. Using such isotropic sections, surface area, length, and diameter can be estimated on a Cavalieri set of sections. Stereology can also illustrate the potential for transport between two compartments by analysing the barrier width. Estimating the individual volume of cells can be achieved by local stereology using a two-step procedure that first samples lung cells using the disector and then introduces individual volume estimation of the sampled cells. The coefficient of error of most unbiased stereological estimators is a combination of variance from blocks, sections, fields
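
    The fractionator principle mentioned above can be illustrated with a minimal sketch: a known fraction is sampled at each step (here by systematic subsampling with a random start), the objects in the final sample are counted, and the total is estimated by dividing by the overall sampling fraction. The object counts, periods and function names below are purely illustrative.

```python
import random

def systematic_subsample(items, period, rng):
    """Systematic uniform random sampling: random start, then every 'period'-th item."""
    start = rng.randrange(period)
    return items[start::period]

def fractionator_estimate(items, periods, seed=0):
    """Fractionator: subsample with known periods, count, and multiply back up."""
    rng = random.Random(seed)
    sampled = list(items)
    for p in periods:
        sampled = systematic_subsample(sampled, p, rng)
    overall_fraction = 1.0
    for p in periods:
        overall_fraction /= p
    return len(sampled) / overall_fraction   # estimate of the total object number

# Hypothetical example: 100,000 "alveoli" sampled at 1/10, then 1/5, then 1/4.
print(fractionator_estimate(range(100_000), [10, 5, 4]))
```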

  6. Multiple-image authentication with a cascaded multilevel architecture based on amplitude field random sampling and phase information multiplexing.

    Science.gov (United States)

    Fan, Desheng; Meng, Xiangfeng; Wang, Yurong; Yang, Xiulun; Pan, Xuemei; Peng, Xiang; He, Wenqi; Dong, Guoyan; Chen, Hongyi

    2015-04-10

    A multiple-image authentication method with a cascaded multilevel architecture in the Fresnel domain is proposed, in which a synthetic encoded complex amplitude is first fabricated, and its real amplitude component is generated by iterative amplitude encoding, random sampling, and space multiplexing for the low-level certification images, while the phase component of the synthetic encoded complex amplitude is constructed by iterative phase information encoding and multiplexing for the high-level certification images. Then the synthetic encoded complex amplitude is iteratively encoded into two phase-type ciphertexts located in two different transform planes. During high-level authentication, when the two phase-type ciphertexts and the high-level decryption key are presented to the system and then the Fresnel transform is carried out, a meaningful image with good quality and a high correlation coefficient with the original certification image can be recovered in the output plane. Similar to the procedure of high-level authentication, in the case of low-level authentication with the aid of a low-level decryption key, no significant or meaningful information is retrieved, but it can result in a remarkable peak output in the nonlinear correlation coefficient of the output image and the corresponding original certification image. Therefore, the method realizes different levels of accessibility to the original certification image for different authority levels with the same cascaded multilevel architecture.

  7. Methodology Series Module 5: Sampling Strategies.

    Science.gov (United States)

    Setia, Maninder Singh

    2016-01-01

    Once the research question and the research design have been finalised, it is important to select the appropriate sample for the study. The method by which the researcher selects the sample is the 'Sampling Method'. There are essentially two types of sampling methods: 1) probability sampling - based on chance events (such as random numbers, flipping a coin, etc.); and 2) non-probability sampling - based on the researcher's choice and on populations that are accessible and available. Some of the non-probability sampling methods are: purposive sampling, convenience sampling, or quota sampling. Random sampling methods (such as a simple random sample or a stratified random sample) are forms of probability sampling. It is important to understand the different sampling methods used in clinical studies and mention the method clearly in the manuscript. The researcher should not misrepresent the sampling method in the manuscript (such as using the term 'random sample' when the researcher has used a convenience sample). The sampling method will depend on the research question. For instance, the researcher may want to understand an issue in greater detail for one particular population rather than worry about the 'generalizability' of these results. In such a scenario, the researcher may want to use 'purposive sampling' for the study.
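
    A minimal sketch of the two probability-sampling designs named in the abstract (a simple random sample and a stratified random sample) is given below; the sampling frame, strata and sample sizes are hypothetical.

```python
import random
from collections import defaultdict

def simple_random_sample(frame, n, seed=0):
    """Probability sampling: every unit has an equal chance of selection."""
    return random.Random(seed).sample(frame, n)

def stratified_random_sample(frame, strata, n_per_stratum, seed=0):
    """Draw a simple random sample independently within each stratum."""
    rng = random.Random(seed)
    groups = defaultdict(list)
    for unit, stratum in zip(frame, strata):
        groups[stratum].append(unit)
    sample = []
    for stratum, units in groups.items():
        sample.extend(rng.sample(units, min(n_per_stratum, len(units))))
    return sample

# Hypothetical sampling frame of 1,000 patients spread over three clinics.
frame = list(range(1000))
strata = [f"clinic_{i % 3}" for i in frame]
print(len(simple_random_sample(frame, 50)))
print(len(stratified_random_sample(frame, strata, 20)))
```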

  8. Methodology series module 5: Sampling strategies

    Directory of Open Access Journals (Sweden)

    Maninder Singh Setia

    2016-01-01

    Full Text Available Once the research question and the research design have been finalised, it is important to select the appropriate sample for the study. The method by which the researcher selects the sample is the 'Sampling Method'. There are essentially two types of sampling methods: 1) probability sampling – based on chance events (such as random numbers, flipping a coin, etc.); and 2) non-probability sampling – based on the researcher's choice and on populations that are accessible and available. Some of the non-probability sampling methods are: purposive sampling, convenience sampling, or quota sampling. Random sampling methods (such as a simple random sample or a stratified random sample) are forms of probability sampling. It is important to understand the different sampling methods used in clinical studies and mention the method clearly in the manuscript. The researcher should not misrepresent the sampling method in the manuscript (such as using the term 'random sample' when the researcher has used a convenience sample). The sampling method will depend on the research question. For instance, the researcher may want to understand an issue in greater detail for one particular population rather than worry about the 'generalizability' of these results. In such a scenario, the researcher may want to use 'purposive sampling' for the study.

  9. Pre-exposure to food temptation reduces subsequent consumption: A test of the procedure with a South-African sample.

    Science.gov (United States)

    Duh, Helen Inseng; Grubliauskiene, Aiste; Dewitte, Siegfried

    2016-01-01

    It has been suggested that the consumption of an unhealthy Westernized diet in a context of poverty and resultant food insecurity may have contributed to South Africa's status as the third fattest country in the world. Considering that a number of South Africans are reported to have experienced, or are still experiencing, food insecurity, procedures that have been shown to reduce the consumption of unhealthy food in higher-income countries may be ineffective in South Africa. We thus tested the robustness of the so-called pre-exposure procedure in South Africa, as well as the moderating role of childhood poverty in the pre-exposure effect. With the pre-exposure procedure, a respondent is exposed to a tempting unhealthy food (e.g. candy) in a context designed such that eating the food interferes with a task goal. The typical result is that this procedure spills over and reduces consumption of similar tempting food later on. An experimental study conducted in a South African laboratory showed that the pre-exposure effect is robust even with a sample where food insecurity prevails. Childhood poverty did not moderate the effect. This study proves that behavioral procedures aimed at reducing the consumption of unhealthy food would be valuable in less rich non-Western countries. Further testing of the robustness of the pre-exposure effect is, however, recommended in other poorer, food-insecure countries. Copyright © 2015 Elsevier Ltd. All rights reserved.

  10. Effects of errorless skill learning in people with mild-to-moderate or severe dementia: a randomized controlled pilot study.

    NARCIS (Netherlands)

    Kessels, R.P.C.; Hensken, L.M.

    2009-01-01

    This pilot study examines whether learning without errors is advantageous compared to trial-and-error learning in people with dementia using a procedural task and a randomized case-control design. A sample of 60 people was recruited, consisting of 20 patients with severe dementia, 20 patients with

  11. Effects of errorless skill learning in people with mild-to-moderate or severe dementia: A randomized controlled pilot study

    NARCIS (Netherlands)

    Kessels, R.P.C.; Olde Hensken, L.M.G.

    2009-01-01

    This pilot study examines whether learning without errors is advantageous compared to trial-and-error learning in people with dementia using a procedural task and a randomized case-control design. A sample of 60 people was recruited, consisting of 20 patients with severe dementia, 20 patients with

  12. 40 CFR 89.413 - Raw sampling procedures.

    Science.gov (United States)

    2010-07-01

    ... of the exhaust pipe—whichever is the larger—upstream of the exit of the exhaust gas system. (b) In..., such as in a “Vee” engine configuration, it is permissible to: (1) Sample after all exhaust pipes have been connected together into a single exhaust pipe. (2) For each mode, sample from each exhaust pipe...

  13. Methodology Series Module 5: Sampling Strategies

    Science.gov (United States)

    Setia, Maninder Singh

    2016-01-01

    Once the research question and the research design have been finalised, it is important to select the appropriate sample for the study. The method by which the researcher selects the sample is the ‘Sampling Method’. There are essentially two types of sampling methods: 1) probability sampling – based on chance events (such as random numbers, flipping a coin, etc.); and 2) non-probability sampling – based on the researcher's choice and on populations that are accessible and available. Some of the non-probability sampling methods are: purposive sampling, convenience sampling, or quota sampling. Random sampling methods (such as a simple random sample or a stratified random sample) are forms of probability sampling. It is important to understand the different sampling methods used in clinical studies and mention the method clearly in the manuscript. The researcher should not misrepresent the sampling method in the manuscript (such as using the term ‘random sample’ when the researcher has used a convenience sample). The sampling method will depend on the research question. For instance, the researcher may want to understand an issue in greater detail for one particular population rather than worry about the ‘generalizability’ of these results. In such a scenario, the researcher may want to use ‘purposive sampling’ for the study. PMID:27688438

  14. Response-only modal identification using random decrement algorithm with time-varying threshold level

    International Nuclear Information System (INIS)

    Lin, Chang Sheng; Tseng, Tse Chuan

    2014-01-01

    Modal identification from response data only is studied for structural systems under nonstationary ambient vibration. The topic of this paper is the estimation of modal parameters from nonstationary ambient vibration data by applying the random decrement algorithm with a time-varying threshold level. In the conventional random decrement algorithm, the threshold level for evaluating random dec signatures is defined as the standard deviation of the response data of the reference channel. In practice, however, random dec signatures may be distorted by noise in the original response data. To improve the accuracy of identification, a modification of the sampling procedure in the random decrement algorithm is proposed for modal-parameter identification from nonstationary ambient response data. A time-varying threshold level is introduced for acquiring the sample time histories used in the averaging analysis; it is defined as the temporal root-mean-square function of the structural response, which can appropriately describe a wide variety of nonstationary behaviors encountered in reality, such as the time-varying amplitude (variance) of a nonstationary process in a seismic record. Numerical simulations confirm the validity and robustness of the proposed modal-identification method for nonstationary ambient response data under noisy conditions.
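
    The following sketch illustrates, under simplifying assumptions, the modified random decrement averaging described above: the time-varying threshold is taken as a moving root-mean-square of the response, and segments starting at up-crossings of that threshold are averaged into a random dec signature. The signal model, window length and segment length are illustrative only, not the authors' settings.

```python
import numpy as np

def moving_rms(x, window):
    """Temporal root-mean-square of the response (time-varying threshold level)."""
    kernel = np.ones(window) / window
    return np.sqrt(np.convolve(x**2, kernel, mode="same"))

def random_decrement(x, threshold, seg_len):
    """Average segments starting where the response up-crosses the threshold."""
    segments = []
    for i in range(1, len(x) - seg_len):
        if x[i - 1] < threshold[i - 1] and x[i] >= threshold[i]:   # up-crossing
            segments.append(x[i:i + seg_len])
    return np.mean(segments, axis=0) if segments else np.zeros(seg_len)

# Illustrative nonstationary response: decaying sinusoid plus noise.
t = np.linspace(0, 20, 4000)
x = np.exp(-0.05 * t) * np.sin(2 * np.pi * 1.5 * t) + 0.1 * np.random.randn(t.size)
sig = random_decrement(x, moving_rms(x, window=200), seg_len=400)
print(sig.shape)   # the random dec signature, ready for modal identification
```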

  15. Location and multi-depot vehicle routing for emergency vehicles using tour coverage and random sampling

    Directory of Open Access Journals (Sweden)

    Alireza Goli

    2015-09-01

    Full Text Available Distribution and optimum allocation of emergency resources are among the most important tasks that need to be accomplished during a crisis. When a natural disaster such as an earthquake or flood takes place, it is necessary to deliver rescue efforts as quickly as possible; it is therefore important to find the optimum location and distribution of emergency relief resources. When a natural disaster occurs, it may not be possible to reach some damaged areas. In this paper, location and multi-depot vehicle routing for emergency vehicles using tour coverage and random sampling is investigated. In this approach, there is no need to visit all the places, and some demand points receive their needs from the nearest possible location. The proposed method is tested on randomly generated instances of different sizes. The preliminary results indicate that it is capable of reaching desirable solutions in a reasonable amount of time.

  16. Sampling Large Graphs for Anticipatory Analytics

    Science.gov (United States)

    2015-05-15

    Random area sampling [8] is a “snowball” sampling method in which a set of random seed vertices is selected and the areas around them are expanded... (from "Sampling Large Graphs for Anticipatory Analytics", Lauren Edwards, Luke Johnson, Maja Milosavljevic, Vijay Gadepally, Benjamin A. Miller, Lincoln Laboratory) ...systems, greater human-in-the-loop involvement, or through complex algorithms. We are investigating the use of sampling to mitigate these challenges.
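
    A minimal sketch of such a snowball-style random area sample on a graph, under the assumption that seeds are drawn uniformly and the areas are grown breadth-first up to a target size (details the record does not specify):

```python
import random
from collections import deque

def random_area_sample(adj, n_seeds, target_size, seed=0):
    """Snowball-style sampling: random seed vertices expanded breadth-first."""
    rng = random.Random(seed)
    seeds = rng.sample(list(adj), n_seeds)
    visited, queue = set(seeds), deque(seeds)
    while queue and len(visited) < target_size:
        v = queue.popleft()
        for u in adj[v]:
            if u not in visited:
                visited.add(u)
                queue.append(u)
                if len(visited) >= target_size:
                    break
    return visited

# Hypothetical graph given as an adjacency dict.
adj = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2, 4], 4: [3, 5], 5: [4]}
print(random_area_sample(adj, n_seeds=1, target_size=4))
```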

  17. A radiochemical procedure for a low-level measurement of 241Am in environmental samples using a supported functional organophosphorus extractant

    International Nuclear Information System (INIS)

    Gasco, C.; Anton, M.A.; Alvarez, A.; Navarro, N.; Salvador, S.

    1994-01-01

    The analysis of transuranics in environmental samples is carried out at CIEMAT using standardized methods based on sequential separation with ion-exchange resins. The americium fraction is purified through a two-layer ion-exchange column and subsequently through an anion-exchange column in a nitric acid-methanol medium. The technique is time-consuming and the results are not completely satisfactory (low recovery and loss of alpha resolution) for some samples. The chemical compound CMPO (octyl(phenyl)-N,N-diisobutylcarbamoylmethylphosphine oxide) dissolved in TBP (tributyl phosphate) and supported on an inert substrate has been tested directly for 241Am analysis by a large number of laboratories. A new method that combines both procedures has been developed. The details of the improved procedure are described in this paper. The advantages of its application to environmental samples (urine, faeces and sediments) are discussed. The use of standards with certified americium concentrations confirms the reliability of our measurements.

  18. Comparison of the radiochemical separation procedures of plutonium applied for its determination in environmental samples using alpha spectrometry

    International Nuclear Information System (INIS)

    Komosa, A.; Michalik, S.

    2006-01-01

    Alpha spectrometry of plutonium isotopes can be performed only after thorough separation of plutonium from the other components of the matrix, and numerous procedures have therefore been elaborated and tested. This communication presents a comparison of plutonium determinations in soil, bones, eggshells and reference materials obtained by alpha spectrometry combined with two different separation procedures. The samples were mineralized in concentrated HCl or HF prior to plutonium electrodeposition or coprecipitation with NdF3. Several other details of the procedures were also tested in various variants. The quality of the spectra is discussed in terms of all these pre-treatment methods.

  19. An Evaluation of the Use of Statistical Procedures in Soil Science

    Directory of Open Access Journals (Sweden)

    Laene de Fátima Tavares

    2016-01-01

    Full Text Available Experimental statistical procedures used in almost all scientific papers are fundamental for a clear interpretation of the results of experiments conducted in the agrarian sciences. However, incorrect use of these procedures can lead the researcher to incorrect or incomplete conclusions. Therefore, the aim of this study was to evaluate the characteristics of the experiments and the quality of the use of statistical procedures in soil science in order to promote better use of statistical procedures. For that purpose, 200 articles, published between 2010 and 2014, involving only experimentation and studies by sampling in the soil areas of fertility, chemistry, physics, biology, and use and management, were randomly selected. A questionnaire containing 28 questions was used to assess the characteristics of the experiments, the statistical procedures used, and the quality of selection and use of these procedures. Most of the articles evaluated presented data from studies conducted under field conditions, and 27 % of all papers involved studies by sampling. Most studies did not mention testing to verify normality and homoscedasticity, and most used the Tukey test for mean comparisons. Among studies with a factorial structure of the treatments, many ignored this structure and compared data as if no factorial structure were present, or decomposed the interaction without showing or mentioning its significance. Almost none of the papers with split-block factorial designs considered the factorial structure, or they treated it as a split-plot design. Among the articles that performed regression analysis, only a few tested non-polynomial fit models, and none reported verification of the lack of fit in the regressions. The articles evaluated thus reflected poor, and in some cases wrong, generalization in experimental design and in the selection of procedures for statistical analysis.

  20. Critical evaluation of distillation procedure for the determination of methylmercury in soil samples.

    Science.gov (United States)

    Perez, Pablo A; Hintelman, Holger; Quiroz, Waldo; Bravo, Manuel A

    2017-11-01

    In the present work, the efficiency of the distillation process for extracting monomethylmercury (MMHg) from soil samples was studied and optimized using an experimental design methodology. The influence of soil composition on MMHg extraction was evaluated by testing four soil samples with different geochemical characteristics. Optimization suggested that the acid concentration and the duration of the distillation were the most significant factors, and the most favorable conditions, established as a compromise for the studied soils, were determined to be a 70 min distillation using a 0.2 M acid. Corresponding limits of detection (LOD) and quantification (LOQ) were 0.21 and 0.7 pg absolute, respectively. The optimized methodology was applied with satisfactory results to soil samples and was compared to a reference methodology based on isotopic dilution analysis followed by gas chromatography-inductively coupled plasma mass spectrometry (IDA-GC-ICP-MS). Using the optimized conditions, recoveries ranged from 82 to 98%, an increase of 9-34% relative to the previously used standard operating procedure. Finally, the validated methodology was applied to quantify MMHg in soils collected from different sites impacted by coal-fired power plants in the north-central zone of Chile, measuring MMHg concentrations ranging from 0.091 to 2.8 ng g -1 . These data are, to the best of our knowledge, the first MMHg measurements reported for Chile. Copyright © 2017 Elsevier Ltd. All rights reserved.

  1. A novel sample preparation procedure for effect-directed analysis of micro-contaminants of emerging concern in surface waters.

    Science.gov (United States)

    Osorio, Victoria; Schriks, Merijn; Vughs, Dennis; de Voogt, Pim; Kolkman, Annemieke

    2018-08-15

    A novel sample preparation procedure relying on Solid Phase Extraction (SPE) combining different sorbent materials on a sequential-based cartridge was optimized and validated for the enrichment of 117 widely diverse contaminants of emerging concern (CECs) from surface waters (SW) and further combined chemical and biological analysis on the subsequent extracts. A liquid chromatography coupled to high resolution tandem mass spectrometry LC-(HR)MS/MS protocol was optimized and validated for the quantitative analysis of organic CECs in SW extracts. A battery of in vitro CALUX bioassays for the assessment of endocrine, metabolic and genotoxic interference and oxidative stress was performed on the same SW extracts. Satisfactory recoveries ([70-130]%) and precision were achieved, with good linearity (correlation coefficients above 0.99) over three orders of magnitude. Instrumental limits of detection and method limits of quantification were [1-96] pg injected and [0.1-58] ng/L, respectively, while the corresponding intra-day and inter-day precision did not exceed 11% and 20%. The developed procedure was successfully applied for the combined chemical and toxicological assessment of SW intended for drinking water supply. Levels of compounds varied from < 10 ng/L to < 500 ng/L. Endocrine (i.e. estrogenic and anti-androgenic) and metabolic interference responses were observed. Given the demonstrated reliability of the validated sample preparation method, the authors propose its integration in an effect-directed analysis procedure for a proper evaluation of SW quality and hazard assessment of CECs. Copyright © 2018 Elsevier B.V. All rights reserved.

  2. Field sampling, preparation procedure and plutonium analyses of large freshwater samples

    International Nuclear Information System (INIS)

    Straelberg, E.; Bjerk, T.O.; Oestmo, K.; Brittain, J.E.

    2002-01-01

    This work is part of an investigation of the mobility of plutonium in freshwater systems containing humic substances. A well-defined bog-stream system located in the catchment area of a subalpine lake, Oevre Heimdalsvatn, Norway, is being studied. During the summer of 1999, six water samples were collected from the tributary stream Lektorbekken and the lake itself. However, the analyses showed that the plutonium concentration was below the detection limit in all the samples. Therefore renewed sampling at the same sites was carried out in August 2000. The results so far are in agreement with previous analyses from the Heimdalen area. However, 100 times higher concentrations are found in the lowlands in the eastern part of Norway. The reason for this is not understood, but may be caused by differences in the concentrations of humic substances and/or the fact that the mountain areas are covered with snow for a longer period of time every year. (LN)

  3. Assessment of fracture risk: value of random population-based samples--the Geelong Osteoporosis Study.

    Science.gov (United States)

    Henry, M J; Pasco, J A; Seeman, E; Nicholson, G C; Sanders, K M; Kotowicz, M A

    2001-01-01

    Fracture risk is determined by bone mineral density (BMD). The T-score, a measure of fracture risk, is the position of an individual's BMD in relation to a reference range. The aim of this study was to determine the magnitude of change in the T-score when different sampling techniques were used to produce the reference range. Reference ranges were derived from three samples, drawn from the same region: (1) an age-stratified population-based random sample, (2) unselected volunteers, and (3) a selected healthy subset of the population-based sample with no diseases or drugs known to affect bone. T-scores were calculated using the three reference ranges for a cohort of women who had sustained a fracture and as a group had a low mean BMD (ages 35-72 yr; n = 484). For most comparisons, the T-scores for the fracture cohort were more negative using the population reference range. The difference in T-scores reached 1.0 SD. The proportion of the fracture cohort classified as having osteoporosis at the spine was 26, 14, and 23% when the population, volunteer, and healthy reference ranges were applied, respectively. The use of inappropriate reference ranges results in substantial changes to T-scores and may lead to inappropriate management.
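
    Since the T-score is simply the individual's BMD standardized against the mean and standard deviation of the chosen reference range, the sensitivity to the reference sample can be sketched in a few lines; the BMD value and reference statistics below are invented for illustration. A shift of half a standard deviation in the reference mean changes the T-score by 0.5, which can move an individual across the conventional osteoporosis threshold of -2.5.

```python
def t_score(bmd, ref_mean, ref_sd):
    """T-score: standardized position of an individual's BMD within a reference range."""
    return (bmd - ref_mean) / ref_sd

# Illustrative numbers only: the same BMD scored against two reference ranges.
bmd = 0.85                       # g/cm^2, hypothetical spine BMD
print(t_score(bmd, 1.00, 0.10))  # population-based reference range
print(t_score(bmd, 0.95, 0.10))  # volunteer reference range -> less negative T-score
```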

  4. POTENTIAL USE OF MELATONIN IN PROCEDURAL ANXIETY AND PAIN IN CHILDREN UNDERGOING BLOOD WITHDRAWAL.

    Science.gov (United States)

    Marseglia, L; Manti, S; D'Angelo, G; Arrigo, T; Cuppari, C; Salpietro, C; Gitto, E

    2015-01-01

    The recognition of the value of pain, especially in the pediatric population, has increased over the last decade. It is known that pain-related anxiety can increase perceived pain intensity. There are several different approaches to the treatment of pre-procedural anxiety and procedural pain in children. Melatonin, a neurohormone with the profile of a novel hypnotic-anaesthetic agent, plays an important role in anxiolysis and analgesia. This study investigated the effects of oral melatonin premedication to reduce anxiety and pain in children having blood samples taken. The investigations were carried out on 60 children, aged 1-14 years, divided into 2 equal groups. Using a computer-generated randomization schedule, patients were given either melatonin orally (0.5 mg/kg BW, max 5 mg) or placebo 30 min before the blood draw. Pre-procedural anxiety was assessed using the Children's Anxiety and Pain Scales, while procedural pain was assessed using the Face, Legs, Activity, Cry and Consolability tool for children under the age of 3 years, the Faces Pain Scale-Revised for children aged 3-8 years, and the Numeric Rating Scale for children over the age of 8 years. Oral administration of melatonin before the blood withdrawal procedure significantly reduced both anxiety and pain, in children under 3 years of age as well as in older children. These data support the use of melatonin for taking blood samples due to its anxiolytic and analgesic properties. Further studies are needed to support the routine use of melatonin to alleviate anxiety and pain in pediatric patients having blood samples taken.

  5. Parameters, test criteria and fault assessment in random sampling of waste barrels from non-qualified processes

    International Nuclear Information System (INIS)

    Martens, B.R.

    1989-01-01

    In the context of random sampling tests, parameters are checked on the waste barrels and criteria are given on which these tests are based. Also, it is shown how faulty data on the properties of the waste or faulty waste barrels should be treated. To decide the extent of testing, the properties of the waste relevant to final storage are determined based on the conditioning process used. (DG) [de]

  6. NHEXAS PHASE I MARYLAND STUDY--STANDARD OPERATING PROCEDURE FOR COLLECTION, STORAGE, AND SHIPMENT OF URINE SAMPLES FOR METAL, PESTICIDE, AND CREATININE ANALYSIS (F10)

    Science.gov (United States)

    The purpose of this SOP is to describe the procedures for collection, storage, and shipment of urine samples for metal, pesticides, and creatinine analysis. Samples were collected on Days 2 and 8 of each Cycle. The Day 2 sample was analyzed for metals and creatinine. The Day 8...

  7. Detection of toxigenic Vibrio cholerae from environmental water samples by an enrichment broth cultivation-pit-stop semi-nested PCR procedure

    CSIR Research Space (South Africa)

    Theron, J

    2000-09-01

    Full Text Available V. cholerae. The PCR procedure coupled with an enrichment culture detected as few as four V. cholerae organisms in pure culture. Treated sewage, surface, ground and drinking water samples were seeded with V. cholerae and, following enrichment, a...

  8. The Effect of Listening to Music During Percutaneous Nephrostomy Tube Placement on Pain, Anxiety, and Success Rate of Procedure: A Randomized Prospective Study.

    Science.gov (United States)

    Hamidi, Nurullah; Ozturk, Erdem

    2017-05-01

    To evaluate the effect of listening to music on pain, anxiety, and success of the procedure during office-based percutaneous nephrostomy tube placement (PNTP). One hundred consecutive patients (age >18 years) with hydronephrosis were prospectively enrolled in this study. All patients were prospectively randomized to undergo office-based PNTP with (Group I, n = 50) or without music (Group II, n = 50). Anxiety levels were evaluated with the State Trait Anxiety Inventory. A visual analog scale was used to evaluate pain levels, patient satisfaction, and willingness to undergo the procedure. We also compared success rates of the procedures. The mean age, duration of procedure, and gender distribution were statistically similar between the two groups. The mean postprocedural heart rates and systolic blood pressures in Group I patients were significantly lower than in Group II patients (p = 0.01 and p = 0.028, respectively), whereas preprocedural pulse rate and systolic blood pressure were similar. The mean anxiety level and mean pain score of Group I were also significantly lower than those of Group II (p = 0.008 for anxiety; the difference in pain scores was likewise significant). Listening to music during office-based PNTP decreases anxiety and pain and increases the success rate of the procedure. As an alternative to sedation or general anesthesia, music is easily accessible without side effects or cost.

  9. Comprehensive simulation-enhanced training curriculum for an advanced minimally invasive procedure: a randomized controlled trial.

    Science.gov (United States)

    Zevin, Boris; Dedy, Nicolas J; Bonrath, Esther M; Grantcharov, Teodor P

    2017-05-01

    There is no comprehensive simulation-enhanced training curriculum to address cognitive, psychomotor, and nontechnical skills for an advanced minimally invasive procedure. The aims were: (1) to develop and provide evidence of validity for a comprehensive simulation-enhanced training (SET) curriculum for an advanced minimally invasive procedure; (2) to demonstrate transfer of acquired psychomotor skills from a simulation laboratory to a live porcine model; and (3) to compare training outcomes of the SET curriculum group and a chief resident group. University. This prospective, single-blinded, randomized, controlled trial allocated 20 intermediate-level surgery residents to receive either conventional training (control) or SET curriculum training (intervention). The SET curriculum consisted of cognitive, psychomotor, and nontechnical training modules. Psychomotor skill in a live anesthetized porcine model in the OR was the primary outcome. Knowledge of advanced minimally invasive and bariatric surgery and nontechnical skills in a simulated OR crisis scenario were the secondary outcomes. Residents in the SET curriculum group went on to perform a laparoscopic jejunojejunostomy in the OR. Cognitive, psychomotor, and nontechnical skills of the SET curriculum group were also compared to a group of 12 chief surgery residents. The SET curriculum group demonstrated superior psychomotor skills in the live porcine model (56 [47-62] versus 44 [38-53]) and comparable psychomotor skills in the live porcine model and in the OR in a human patient (56 [47-62] versus 63 [61-68]; P = .21). The SET curriculum group demonstrated inferior knowledge (13 [11-15] versus 16 [14-16]; P<.05), equivalent psychomotor skill (63 [61-68] versus 68 [62-74]; P = .50), and superior nontechnical skills (41 [38-45] versus 34 [27-35], P<.01) compared with the chief resident group. Completion of the SET curriculum resulted in superior training outcomes compared with conventional surgery training. Implementation of the SET curriculum can standardize training.

  10. Evaluation and optimization of DNA extraction and purification procedures for soil and sediment samples.

    Science.gov (United States)

    Miller, D N; Bryant, J E; Madsen, E L; Ghiorse, W C

    1999-11-01

    We compared and statistically evaluated the effectiveness of nine DNA extraction procedures by using frozen and dried samples of two silt loam soils and a silt loam wetland sediment with different organic matter contents. The effects of different chemical extractants (sodium dodecyl sulfate [SDS], chloroform, phenol, Chelex 100, and guanidinium isothiocyanate), different physical disruption methods (bead mill homogenization and freeze-thaw lysis), and lysozyme digestion were evaluated based on the yield and molecular size of the recovered DNA. Pairwise comparisons of the nine extraction procedures revealed that bead mill homogenization with SDS combined with either chloroform or phenol optimized both the amount of DNA extracted and the molecular size of the DNA (maximum size, 16 to 20 kb). Neither lysozyme digestion before SDS treatment nor guanidine isothiocyanate treatment nor addition of Chelex 100 resin improved the DNA yields. Bead mill homogenization in a lysis mixture containing chloroform, SDS, NaCl, and phosphate-Tris buffer (pH 8) was found to be the best physical lysis technique when DNA yield and cell lysis efficiency were used as criteria. The bead mill homogenization conditions were also optimized for speed and duration with two different homogenizers. Recovery of high-molecular-weight DNA was greatest when we used lower speeds and shorter times (30 to 120 s). We evaluated four different DNA purification methods (silica-based DNA binding, agarose gel electrophoresis, ammonium acetate precipitation, and Sephadex G-200 gel filtration) for DNA recovery and removal of PCR inhibitors from crude extracts. Sephadex G-200 spin column purification was found to be the best method for removing PCR-inhibiting substances while minimizing DNA loss during purification. Our results indicate that for these types of samples, optimum DNA recovery requires brief, low-speed bead mill homogenization in the presence of a phosphate-buffered SDS-chloroform mixture, followed

  11. Correcting Classifiers for Sample Selection Bias in Two-Phase Case-Control Studies

    Science.gov (United States)

    Theis, Fabian J.

    2017-01-01

    Epidemiological studies often utilize stratified data in which rare outcomes or exposures are artificially enriched. This design can increase precision in association tests but distorts predictions when applying classifiers on nonstratified data. Several methods correct for this so-called sample selection bias, but their performance remains unclear, especially for machine learning classifiers. With an emphasis on two-phase case-control studies, we aim to assess which corrections to perform in which setting and to obtain methods suitable for machine learning techniques, especially the random forest. We propose two new resampling-based methods to resemble the original data and covariance structure: stochastic inverse-probability oversampling and parametric inverse-probability bagging. We compare all techniques for the random forest and other classifiers, both theoretically and on simulated and real data. Empirical results show that the random forest profits only from the parametric inverse-probability bagging proposed by us. For other classifiers, correction is mostly advantageous, and methods perform uniformly. We discuss consequences of inappropriate distribution assumptions and reasons for the different behaviors of the random forest and other classifiers. In conclusion, we provide guidance for choosing correction methods when training classifiers on biased samples. For random forests, our method outperforms state-of-the-art procedures if distribution assumptions are roughly fulfilled. We provide our implementation in the R package sambia. PMID:29312464
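
    As a simplified illustration of inverse-probability correction for such stratified two-phase data (plain inverse-probability resampling, not the stochastic oversampling or parametric bagging variants proposed in the record), a sketch with invented inclusion probabilities:

```python
import numpy as np

def inverse_probability_resample(X, y, inclusion_prob, n_out, seed=0):
    """Resample a stratified (biased) sample with weights 1 / inclusion probability,
    so that the resample roughly resembles the target population."""
    rng = np.random.default_rng(seed)
    w = 1.0 / np.asarray(inclusion_prob)
    idx = rng.choice(len(y), size=n_out, replace=True, p=w / w.sum())
    return X[idx], y[idx]

# Hypothetical phase-two data where cases were sampled at 1.0 and controls at 0.1.
X = np.random.randn(200, 3)
y = np.r_[np.ones(100), np.zeros(100)]
pi = np.where(y == 1, 1.0, 0.1)
X_res, y_res = inverse_probability_resample(X, y, pi, n_out=500)
print(y_res.mean())   # proportion of cases drops toward the population rate
```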

  12. Coordination of Conditional Poisson Samples

    Directory of Open Access Journals (Sweden)

    Grafström Anton

    2015-12-01

    Full Text Available Sample coordination seeks to maximize or to minimize the overlap of two or more samples. The former is known as positive coordination, and the latter as negative coordination. Positive coordination is mainly used for estimation purposes and to reduce data collection costs. Negative coordination is mainly performed to diminish the response burden of the sampled units. Poisson sampling design with permanent random numbers provides an optimum coordination degree of two or more samples. The size of a Poisson sample is, however, random. Conditional Poisson (CP sampling is a modification of the classical Poisson sampling that produces a fixed-size πps sample. We introduce two methods to coordinate Conditional Poisson samples over time or simultaneously. The first one uses permanent random numbers and the list-sequential implementation of CP sampling. The second method uses a CP sample in the first selection and provides an approximate one in the second selection because the prescribed inclusion probabilities are not respected exactly. The methods are evaluated using the size of the expected sample overlap, and are compared with their competitors using Monte Carlo simulation. The new methods provide a good coordination degree of two samples, close to the performance of Poisson sampling with permanent random numbers.
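
    A minimal sketch of Poisson sampling with permanent random numbers, the building block referred to above, is given below; the fixed-size Conditional Poisson modification and the list-sequential implementation are not shown. The population size and inclusion probabilities are illustrative.

```python
import numpy as np

def poisson_sample_with_prn(prn, incl_prob):
    """Poisson sampling: unit k is selected if its permanent random number < pi_k."""
    return np.flatnonzero(prn < incl_prob)

# Permanent random numbers are drawn once and reused across occasions.
rng = np.random.default_rng(7)
N = 1000
prn = rng.uniform(size=N)

pi_t1 = np.full(N, 0.10)          # inclusion probabilities at occasion 1
pi_t2 = np.full(N, 0.12)          # slightly different probabilities at occasion 2
s1 = poisson_sample_with_prn(prn, pi_t1)
s2 = poisson_sample_with_prn(prn, pi_t2)

# Positive coordination: reusing the PRNs maximizes the expected sample overlap.
print(len(s1), len(s2), len(np.intersect1d(s1, s2)))
```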

  13. Technical note: Alternatives to reduce adipose tissue sampling bias.

    Science.gov (United States)

    Cruz, G D; Wang, Y; Fadel, J G

    2014-10-01

    Understanding the mechanisms by which nutritional and pharmaceutical factors can manipulate adipose tissue growth and development in production animals has direct and indirect effects on the profitability of an enterprise. Adipocyte cellularity (number and size) is a key biological response that is commonly measured in animal science research. The variability and sampling of adipocyte cellularity within a muscle have been addressed in previous studies, but no attempt to critically investigate these issues has been reported in the literature. The present study evaluated 2 sampling techniques (random and systematic) in an attempt to minimize sampling bias and to determine the minimum number of samples, from 1 to 15, needed to represent the overall adipose tissue in the muscle. Both sampling procedures were applied to adipose tissue samples dissected from 30 longissimus muscles from cattle finished either on grass or grain. Briefly, adipose tissue samples were fixed with osmium tetroxide, and the size and number of adipocytes were determined with a Coulter Counter. These results were then fit with a finite mixture model to obtain distribution parameters for each sample. To evaluate the benefits of increasing the number of samples and the advantage of the new sampling technique, the concept of acceptance ratio was used; simply stated, the higher the acceptance ratio, the better the representation of the overall population. As expected, a great improvement in the estimation of the overall adipocyte cellularity parameters was observed with both sampling techniques when the sample size increased from 1 to 15, with the acceptance ratios of both techniques increasing from approximately 3 to 25%. When comparing sampling techniques, the systematic procedure slightly improved parameter estimation. The results suggest that more detailed research using other sampling techniques may provide better estimates for minimum sampling.

  14. Pollutant Assessments Group procedures manual: Volume 2, Technical procedures

    Energy Technology Data Exchange (ETDEWEB)

    1992-03-01

    This is volume 2 of the set of manuals that describe the technical procedures currently in use by the Pollutant Assessments Group. This manual incorporates new developments in hazardous waste assessment technology and administrative policy. Descriptions of the equipment, procedures and operations for activities such as radiation detection, soil sampling, radionuclide monitoring, and equipment decontamination are included in this manual. (MB)

  15. Determination of calcium, potassium, manganese, iron, copper and zinc levels in representative samples of two onion cultivars using total reflection X-ray fluorescence and ultrasound extraction procedure

    International Nuclear Information System (INIS)

    Alvarez, J.; Marco, L.M.; Arroyo, J.; Greaves, E.D.; Rivas, R.

    2003-01-01

    The chemical characterization of onion cultivar samples is an important tool for enhancing their productivity, because chemical composition is closely related to product quality. A new sample preparation procedure for elemental characterization is proposed, involving acid extraction of the analytes from crude samples in an ultrasonic bath, thereby avoiding the sample digestion normally required in vegetable tissue analysis. The technique of total reflection X-ray fluorescence (TXRF) was successfully applied for the simultaneous determination of the elements Ca, K, Mn, Fe, Cu and Zn. The procedure was compared with the wet ashing and dry ashing procedures for all the elements using multivariate analysis and the Scheffe test. The technique of flame atomic absorption spectrometry (FAAS) was employed for comparison purposes and for evaluating the accuracy of the proposed analysis method. Good agreement between the two techniques was found when using the dry ashing and ultrasound leaching procedures. The levels of each element found for representative samples of two onion cultivars (Yellow Granex PRR 502 and 438 Granex) were also compared by the same method. Levels of K, Mn and Zn were significantly higher in the 438 Granex cultivar, while levels of Ca, Fe and Cu were significantly higher in the Yellow Granex PRR 502 cultivar.

  16. RORASC: Software for the rotated-random-scan calibration procedure

    NARCIS (Netherlands)

    Janssen PHM; CWM

    1995-01-01

    Release version 1.0 of the RORASC program is presented, together with instructions and guidelines for installing and using the software. RORASC plays an essential role in the 'rotated-random-scan' method for model calibration. The software is written in standard

  17. Randomized Comparison of Two Vaginal Self-Sampling Methods for Human Papillomavirus Detection: Dry Swab versus FTA Cartridge.

    Science.gov (United States)

    Catarino, Rosa; Vassilakos, Pierre; Bilancioni, Aline; Vanden Eynde, Mathieu; Meyer-Hamme, Ulrike; Menoud, Pierre-Alain; Guerry, Frédéric; Petignat, Patrick

    2015-01-01

    Human papillomavirus (HPV) self-sampling (self-HPV) is valuable in cervical cancer screening. HPV testing is usually performed on physician-collected cervical smears stored in liquid-based medium. Dry filters and swabs are an alternative. We evaluated the adequacy of self-HPV using two dry storage and transport devices, the FTA cartridge and swab. A total of 130 women performed two consecutive self-HPV samples. Randomization determined which of the two tests was performed first: self-HPV using dry swabs (s-DRY) or vaginal specimen collection using a cytobrush applied to an FTA cartridge (s-FTA). After self-HPV, a physician collected a cervical sample using liquid-based medium (Dr-WET). HPV types were identified by real-time PCR. Agreement between collection methods was measured using the kappa statistic. HPV prevalence for high-risk types was 62.3% (95%CI: 53.7-70.2) detected by s-DRY, 56.2% (95%CI: 47.6-64.4) by Dr-WET, and 54.6% (95%CI: 46.1-62.9) by s-FTA. There was overall agreement of 70.8% between s-FTA and s-DRY samples (kappa = 0.34), and of 82.3% between self-HPV and Dr-WET samples (kappa = 0.56). Detection sensitivities for low-grade squamous intraepithelial lesion or worse (LSIL+) were: 64.0% (95%CI: 44.5-79.8) for s-FTA, 84.6% (95%CI: 66.5-93.9) for s-DRY, and 76.9% (95%CI: 58.0-89.0) for Dr-WET. The preferred self-collection method among patients was s-DRY (40.8% vs. 15.4%). Regarding costs, the FTA card was five times more expensive than the swab (~5 US dollars (USD) per card vs. ~1 USD per swab). Self-HPV using dry swabs is sensitive for detecting LSIL+ and less expensive than s-FTA. International Standard Randomized Controlled Trial Number (ISRCTN): 43310942.
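
    Agreement between two collection methods is summarized above with the kappa statistic; a small sketch of Cohen's kappa for paired binary HPV results follows, with invented counts (not the study's data).

```python
def cohen_kappa(pairs):
    """Cohen's kappa for agreement between two binary HPV test results."""
    n = len(pairs)
    p_observed = sum(a == b for a, b in pairs) / n
    p_a = sum(a for a, _ in pairs) / n          # positivity rate, method A
    p_b = sum(b for _, b in pairs) / n          # positivity rate, method B
    p_expected = p_a * p_b + (1 - p_a) * (1 - p_b)
    return (p_observed - p_expected) / (1 - p_expected)

# Hypothetical paired results (1 = HPV positive, 0 = negative), for illustration only.
pairs = [(1, 1)] * 60 + [(0, 0)] * 40 + [(1, 0)] * 15 + [(0, 1)] * 15
print(round(cohen_kappa(pairs), 2))
```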

  18. Optimisation (sampling strategies and analytical procedures) for site specific environment monitoring at the areas of uranium production legacy sites in Ukraine - 59045

    International Nuclear Information System (INIS)

    Voitsekhovych, Oleg V.; Lavrova, Tatiana V.; Kostezh, Alexander B.

    2012-01-01

    There are many sites in the world where the environment is still affected by contamination from uranium production carried out in the past. The authors' experience shows that a lack of site characterization data and incomplete or unreliable environmental monitoring studies can significantly limit the quality of safety assessment procedures and of the priority-action analyses needed for remediation planning. During recent decades, the analytical laboratories of many of the enterprises currently responsible for establishing site-specific environmental monitoring programs have significantly improved their technical sampling and analytical capacities. However, limited experience in optimal site-specific sampling strategy planning, together with insufficient experience in applying the required analytical techniques, such as modern alpha-beta radiometers, gamma and alpha spectrometry and liquid-scintillation methods for the determination of U-Th series radionuclides in the environment, does not allow these laboratories to develop and conduct the monitoring programs efficiently as a basis for further safety assessment in decision-making procedures. This paper gives some conclusions gained from experience in establishing monitoring programs in Ukraine and proposes some practical steps for optimizing sampling strategy planning and the analytical procedures to be applied in areas requiring safety assessment and justification of potential remediation and safe management. (authors)

  19. Value measurement of nuclear medicine procedures

    International Nuclear Information System (INIS)

    Potchen, E.J.; Harris, G.I.; Schonbein, W.R.; Rashford, N.J.

    1977-01-01

    The difficulty in measuring the benefit component for cost/benefit analysis of diagnostic procedures in medicine is portrayed as a complex issue relating the objective of intent to a classification of types of decisions a physician must make in evaluating a patient's problem. Ultimately, it seems desirable to develop measuring instruments such as attitude measurement tools by which the relative value of alternative diagnostic procedures could be measured in terms of what they contribute to diminishing the patient's personal perception of disease. Even without this idealized objective, it is reasonable to assume that diagnostic tests which do not contain information, defined as a change in the randomness of a state of knowledge, could not be expected to ultimately benefit the patient. Thus diagnostic information should provide a rational direction for the physician to modify the course of the patient's illness. Since information can be measured as a change in randomness of a knowledge state, we can determine the information content of a specific nuclear medicine procedure when faced with an array of diagnostic problems. These measurements remain to be made for clinical nuclear medicine procedures and are currently under study

  20. Investigating the Randomness of Numbers

    Science.gov (United States)

    Pendleton, Kenn L.

    2009-01-01

    The use of random numbers is pervasive in today's world. Random numbers have practical applications in such far-flung arenas as computer simulations, cryptography, gambling, the legal system, statistical sampling, and even the war on terrorism. Evaluating the randomness of extremely large samples is a complex, intricate process. However, the…
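
    One elementary way to evaluate whether a large sample of digits is consistent with randomness is a chi-square test of uniformity, sketched below with simulated digits; this is only one of many possible tests and is not drawn from the article.

```python
import random
from collections import Counter
from scipy.stats import chisquare

# Illustrative check only: test whether decimal digits look uniformly distributed.
rng = random.Random(123)
digits = [rng.randrange(10) for _ in range(100_000)]
tally = Counter(digits)
counts = [tally[d] for d in range(10)]

stat, p_value = chisquare(counts)      # H0: all ten digits are equally likely
print(stat, p_value)                   # a large p-value gives no evidence against randomness
```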

  1. Application of a Monte Carlo procedure for probabilistic fatigue design of floating offshore wind turbines

    Directory of Open Access Journals (Sweden)

    K. Müller

    2018-03-01

    Full Text Available Fatigue load assessment of floating offshore wind turbines poses new challenges on the feasibility of numerical procedures. Due to the increased sensitivity of the considered system with respect to the environmental conditions from wind and ocean, the application of common procedures used for fixed-bottom structures results in either inaccurate simulation results or hard-to-quantify conservatism in the system design. Monte Carlo-based sampling procedures provide a more realistic approach to deal with the large variation in the environmental conditions, although basic randomization has shown slow convergence. Specialized sampling methods allow efficient coverage of the complete design space, resulting in faster convergence and hence a reduced number of required simulations. In this study, a quasi-random sampling approach based on Sobol sequences is applied to select representative events for the determination of the lifetime damage. This is calculated applying Monte Carlo integration, using subsets of a resulting total of 16 200 coupled time–domain simulations performed with the simulation code FAST. The considered system is the Danmarks Tekniske Universitet (DTU) 10 MW reference turbine installed on the LIFES50+ OO-Star Wind Floater Semi 10 MW floating platform. Statistical properties of the considered environmental parameters (i.e., wind speed, wave height and wave period) are determined based on the measurement data from the Gulf of Maine, USA. Convergence analyses show that it is sufficient to perform around 200 simulations in order to reach less than 10 % uncertainty of lifetime fatigue damage-equivalent loading. Complementary in-depth investigation is performed, focusing on the load sensitivity and the impact of outliers (i.e., values far away from the mean). Recommendations for the implementation of the proposed methodology in the design process are also provided.
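
    A minimal sketch of the quasi-random sampling step (a scrambled Sobol sequence scaled to assumed parameter ranges, followed by plain Monte Carlo averaging of a placeholder damage function) is given below; the real workflow uses the measured joint statistics of the wind and wave parameters and coupled FAST simulations rather than the toy model shown.

```python
import numpy as np
from scipy.stats import qmc

def damage(wind_speed, wave_height, wave_period):
    """Placeholder damage model; real values would come from coupled simulations."""
    return 1e-6 * wind_speed**2 * wave_height / wave_period

sampler = qmc.Sobol(d=3, scramble=True, seed=1)
u = sampler.random_base2(m=8)                        # 2**8 = 256 points in [0, 1)^3
lower, upper = [3.0, 0.5, 4.0], [25.0, 8.0, 14.0]    # assumed parameter ranges (illustrative)
x = qmc.scale(u, lower, upper)

# Monte Carlo estimate of the mean damage per event over the sampled conditions.
lifetime_damage = damage(x[:, 0], x[:, 1], x[:, 2]).mean()
print(lifetime_damage)
```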

  2. Possible Overestimation of Surface Disinfection Efficiency by Assessment Methods Based on Liquid Sampling Procedures as Demonstrated by In Situ Quantification of Spore Viability

    Science.gov (United States)

    Grand, I.; Bellon-Fontaine, M.-N.; Herry, J.-M.; Hilaire, D.; Moriconi, F.-X.; Naïtali, M.

    2011-01-01

    The standard test methods used to assess the efficiency of a disinfectant applied to surfaces are often based on counting the microbial survivors sampled in a liquid, but total cell removal from surfaces is seldom achieved. One might therefore wonder whether evaluations of microbial survivors among liquid-sampled cells are representative of the levels of survivors in the whole population. The present study was thus designed to determine, directly on the substrate, the “damaged/undamaged” status induced by peracetic acid disinfection of Bacillus atrophaeus spores deposited on glass coupons, and to compare it to the status of spores collected in liquid by a sampling procedure. The method used to assess the viability of both surface-associated and liquid-sampled spores included fluorescence labeling with a combination of Syto 61 and Chemchrome V6 dyes and quantification by analyzing images acquired by confocal laser scanning microscopy. The principal result of the study was that the viability of spores sampled in the liquid was poorer than that of surface-associated spores. For example, after 2 min of peracetic acid disinfection, less than 17% ± 5% of viable cells were detected among liquid-sampled cells compared to 79% ± 5% or 47% ± 4%, respectively, when viability was evaluated on the surface after or without the sampling procedure. Moreover, assessments of the survivors collected in the liquid phase, evaluated using the microscopic method and standard plate counts, were well correlated. Evaluations based on the determination of survivors among liquid-sampled cells can thus overestimate the efficiency of surface disinfection procedures. PMID:21742922

  3. Robust inference in sample selection models

    KAUST Repository

    Zhelonkin, Mikhail; Genton, Marc G.; Ronchetti, Elvezio

    2015-01-01

    The problem of non-random sample selectivity often occurs in practice in many fields. The classical estimators introduced by Heckman are the backbone of the standard statistical analysis of these models. However, these estimators are very sensitive to small deviations from the distributional assumptions which are often not satisfied in practice. We develop a general framework to study the robustness properties of estimators and tests in sample selection models. We derive the influence function and the change-of-variance function of Heckman's two-stage estimator, and we demonstrate the non-robustness of this estimator and its estimated variance to small deviations from the model assumed. We propose a procedure for robustifying the estimator, prove its asymptotic normality and give its asymptotic variance. Both cases with and without an exclusion restriction are covered. This allows us to construct a simple robust alternative to the sample selection bias test. We illustrate the use of our new methodology in an analysis of ambulatory expenditures and we compare the performance of the classical and robust methods in a Monte Carlo simulation study.
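
    For context, a compact sketch of the classical Heckman two-step estimator that the record takes as its starting point (probit selection equation, inverse Mills ratio, augmented OLS) is shown below on simulated data; this is the non-robust baseline, not the robustified estimator proposed by the authors.

```python
import numpy as np
import statsmodels.api as sm
from scipy.stats import norm

def heckman_two_step(y, X, Z, observed):
    """Classical Heckman two-step estimator (not the robustified version)."""
    # Step 1: probit for the selection equation, using the full sample.
    probit = sm.Probit(observed.astype(float), sm.add_constant(Z)).fit(disp=0)
    xb = sm.add_constant(Z) @ probit.params
    imr = norm.pdf(xb) / norm.cdf(xb)          # inverse Mills ratio

    # Step 2: OLS on the selected sample, augmented with the inverse Mills ratio.
    sel = observed.astype(bool)
    X2 = sm.add_constant(np.column_stack([X[sel], imr[sel]]))
    return sm.OLS(y[sel], X2).fit()

# Simulated toy data with correlated selection and outcome errors.
rng = np.random.default_rng(0)
n = 2000
Z = rng.normal(size=(n, 1))
X = rng.normal(size=(n, 1))
e = rng.multivariate_normal([0, 0], [[1, 0.6], [0.6, 1]], size=n)
observed = (0.5 + Z[:, 0] + e[:, 0] > 0)
y = 1.0 + 2.0 * X[:, 0] + e[:, 1]
print(heckman_two_step(y, X, Z, observed).params)
```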

  4. Robust inference in sample selection models

    KAUST Repository

    Zhelonkin, Mikhail

    2015-11-20

    The problem of non-random sample selectivity often occurs in practice in many fields. The classical estimators introduced by Heckman are the backbone of the standard statistical analysis of these models. However, these estimators are very sensitive to small deviations from the distributional assumptions which are often not satisfied in practice. We develop a general framework to study the robustness properties of estimators and tests in sample selection models. We derive the influence function and the change-of-variance function of Heckman's two-stage estimator, and we demonstrate the non-robustness of this estimator and its estimated variance to small deviations from the model assumed. We propose a procedure for robustifying the estimator, prove its asymptotic normality and give its asymptotic variance. Both cases with and without an exclusion restriction are covered. This allows us to construct a simple robust alternative to the sample selection bias test. We illustrate the use of our new methodology in an analysis of ambulatory expenditures and we compare the performance of the classical and robust methods in a Monte Carlo simulation study.

  5. Development of a procedure for the determination of chromium in samples of urine and serum by neutron activation analysis

    International Nuclear Information System (INIS)

    Buettner, I.; Hamm, V.; Knoechel, A.; Sen Gupta, R.

    1993-01-01

    Since the mid-fifties the possibility of a causal relationship between deficient chromium and insulin metabolism and the manifestation of certain varieties of diabetes mellitus has been presumed. The determination of the chromium status under pathophysiological conditions may be helpful for the study of this problem. For these purposes an analytical procedure serving as a reference system was developed which allows the determination of chromium in biological matrices down to a concentration of 0.33 ng/ml. It is based on NAA and is used in the framework of a commonly used procedure based on GF-AAS. For its application blood and urine samples are freeze-dried and irradiated. After wet digestion with HNO₃ in a microwave combustion system, chromium is separated for measurement from the matrix nuclides with the help of the ion-exchanger Cellex-P. The individual steps of the procedure were evaluated by means of tracer experiments. (orig.)

  6. Development of a procedure for the determination of chromium in samples of urine and serum by neutron activation analysis

    Energy Technology Data Exchange (ETDEWEB)

    Buettner, I [Hamburg Univ. (Germany). Inst. for Inorganic and Applied Chemistry; Hamm, V [Hamburg Univ. (Germany). Inst. for Inorganic and Applied Chemistry; Knoechel, A [Hamburg Univ. (Germany). Inst. for Inorganic and Applied Chemistry; Sen Gupta, R [Hamburg Univ. (Germany). Inst. for Inorganic and Applied Chemistry

    1993-06-01

    Since the mid-fifties the possibility of a causal relationship between deficient chromium and insulin metabolism and the manifestation of certain varieties of diabetes mellitus has been presumed. The determination of the chromium status under pathophysiological conditions may be helpful for the study of this problem. For these purposes an analytical procedure serving as a reference system was developed which allows the determination of chromium in biological matrices down to a concentration of 0.33 ng/ml. It is based on NAA and is used in the framework of a commonly used procedure based on GF-AAS. For its application blood and urine samples are freeze-dried and irradiated. After wet digestion with HNO₃ in a microwave combustion system, chromium is separated for measurement from the matrix nuclides with the help of the ion-exchanger Cellex-P. The individual steps of the procedure were evaluated by means of tracer experiments. (orig.)

  7. Characteristics of men with substance use disorder consequent to illicit drug use: comparison of a random sample and volunteers.

    Science.gov (United States)

    Reynolds, Maureen D; Tarter, Ralph E; Kirisci, Levent

    2004-09-06

    Men qualifying for substance use disorder (SUD) consequent to consumption of an illicit drug were compared according to recruitment method. It was hypothesized that volunteers would be more self-disclosing and exhibit more severe disturbances compared to randomly recruited subjects. Personal, demographic, family, social, substance use, psychiatric, and SUD characteristics of volunteers (N = 146) were compared to randomly recruited (N = 102) subjects. Volunteers had lower socioeconomic status, were more likely to be African American, and had lower IQ than randomly recruited subjects. Volunteers also evidenced greater social and family maladjustment and more frequently had received treatment for substance abuse. In addition, lower social desirability response bias was observed in the volunteers. SUD was not more severe in the volunteers; however, they reported a higher lifetime rate of opiate, diet, depressant, and analgesic drug use. Volunteers and randomly recruited subjects qualifying for SUD consequent to illicit drug use are similar in SUD severity but differ in terms of severity of psychosocial disturbance and history of drug involvement. The factors discriminating volunteers and randomly recruited subjects are well known to impact on outcome, hence they need to be considered in research design, especially when selecting a sampling strategy in treatment research.

  8. Optimization of microwave assisted digestion procedure for the determination of zinc, copper and nickel in tea samples employing flame atomic absorption spectrometry

    International Nuclear Information System (INIS)

    Soylak, Mustafa; Tuzen, Mustafa; Souza, Anderson Santos; Korn, Maria das Gracas Andrade; Ferreira, Sergio Luis Costa

    2007-01-01

    The present paper describes the development of a microwave assisted digestion procedure for the determination of zinc, copper and nickel in tea samples employing flame atomic absorption spectrometry (FAAS). The optimization step was performed using a full factorial design (2³) involving the factors: composition of the acid mixture (CMA), microwave power (MP) and radiation time (RT). The experiments of this factorial design were carried out using a certified reference material of tea, GBW 07605, furnished by the National Research Centre for Certified Reference Materials, China, with the metal recoveries taken as the response. The relative standard deviations of the method were found to be below 8% for the three elements. The proposed procedure was used for the determination of copper, zinc and nickel in several samples of tea from Turkey. For the 10 tea samples analyzed, the concentrations found for copper, zinc and nickel ranged over 6.4-13.1, 7.0-16.5 and 3.1-5.7 μg g⁻¹, respectively.

  9. Optimization of a Pre-MEKC Separation SPE Procedure for Steroid Molecules in Human Urine Samples

    Directory of Open Access Journals (Sweden)

    Ilona Olędzka

    2013-11-01

    Full Text Available Many steroid hormones can be considered as potential biomarkers and their determination in body fluids can create opportunities for the rapid diagnosis of many diseases and disorders of the human body. Most existing methods for the determination of steroids are usually time- and labor-consuming and quite costly. Therefore, the aim of analytical laboratories is to develop a new, relatively low-cost and rapid implementation methodology for their determination in biological samples. Due to the fact that there is little literature data on concentrations of steroid hormones in urine samples, we have made attempts at the electrophoretic determination of these compounds. For this purpose, an extraction procedure for the optimized separation and simultaneous determination of seven steroid hormones in urine samples has been investigated. The isolation of analytes from biological samples was performed by liquid-liquid extraction (LLE) with dichloromethane and compared to solid phase extraction (SPE) with C18 and hydrophilic-lipophilic balance (HLB) columns. To separate all the analytes a micellar electrokinetic capillary chromatography (MEKC) technique was employed. For full separation of all the analytes a running buffer (pH 9.2), composed of 10 mM sodium tetraborate decahydrate (borax), 50 mM sodium dodecyl sulfate (SDS), and 10% methanol, was selected. The methodology developed in this work for the determination of steroid hormones meets all the requirements of analytical methods. The applicability of the method has been confirmed for the analysis of urine samples collected from volunteers, both men and women (students, amateur bodybuilders), using and not applying steroid doping. The data obtained during this work can be successfully used for further research on the determination of steroid hormones in urine samples.

  10. U.S. Geological Survey Noble Gas Laboratory’s standard operating procedures for the measurement of dissolved gas in water samples

    Science.gov (United States)

    Hunt, Andrew G.

    2015-08-12

    This report addresses the standard operating procedures used by the U.S. Geological Survey’s Noble Gas Laboratory in Denver, Colorado, U.S.A., for the measurement of dissolved gases (methane, nitrogen, oxygen, and carbon dioxide) and noble gas isotopes (helium-3, helium-4, neon-20, neon-21, neon-22, argon-36, argon-38, argon-40, krypton-84, krypton-86, xenon-103, and xenon-132) dissolved in water. A synopsis of the instrumentation used, procedures followed, calibration practices, standards used, and a quality assurance and quality control program is presented. The report outlines the day-to-day operation of the Residual Gas Analyzer Model 200, Mass Analyzer Products Model 215–50, and ultralow vacuum extraction line along with the sample handling procedures, noble gas extraction and purification, instrument measurement procedures, instrumental data acquisition, and calculations for the conversion of raw data from the mass spectrometer into noble gas concentrations per unit mass of water analyzed. Techniques for the preparation of artificial dissolved gas standards are detailed and coupled to a quality assurance and quality control program to present the accuracy of the procedures used in the laboratory.

  11. Sample Size Calculation: Inaccurate A Priori Assumptions for Nuisance Parameters Can Greatly Affect the Power of a Randomized Controlled Trial.

    Directory of Open Access Journals (Sweden)

    Elsa Tavernier

    Full Text Available We aimed to examine the extent to which inaccurate assumptions for nuisance parameters used to calculate sample size can affect the power of a randomized controlled trial (RCT). In a simulation study, we separately considered an RCT with continuous, dichotomous or time-to-event outcomes, with associated nuisance parameters of standard deviation, success rate in the control group and survival rate in the control group at some time point, respectively. For each type of outcome, we calculated a required sample size N for a hypothesized treatment effect, an assumed nuisance parameter and a nominal power of 80%. We then assumed a nuisance parameter associated with a relative error at the design stage. For each type of outcome, we randomly drew 10,000 relative errors of the associated nuisance parameter (from empirical distributions derived from a previously published review). Then, retro-fitting the sample size formula, we derived, for the pre-calculated sample size N, the real power of the RCT, taking into account the relative error for the nuisance parameter. In total, 23%, 0% and 18% of RCTs with continuous, binary and time-to-event outcomes, respectively, were underpowered, while others were overpowered (real power above 90%). Even with proper calculation of sample size, a substantial number of trials are underpowered or overpowered because of imprecise knowledge of nuisance parameters. Such findings raise questions about how sample size for RCTs should be determined.
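
    The mechanism described above, an error in the assumed nuisance parameter feeding straight through to the real power, can be reproduced with a short calculation. The sketch below uses a normal approximation to the two-sample t-test and made-up numbers, so it illustrates the idea rather than reproducing the simulation study.

```python
import math
from scipy.stats import norm

def n_per_group(delta, sd, alpha=0.05, power=0.80):
    """Sample size per group for a two-sample comparison (normal approximation)."""
    z_a = norm.ppf(1 - alpha / 2)
    z_b = norm.ppf(power)
    return math.ceil(2 * (sd * (z_a + z_b) / delta) ** 2)

def achieved_power(delta, true_sd, n, alpha=0.05):
    """Real power of the planned trial when the true standard deviation differs."""
    z_a = norm.ppf(1 - alpha / 2)
    ncp = delta / (true_sd * math.sqrt(2 / n))
    return norm.cdf(ncp - z_a)

delta, assumed_sd = 5.0, 10.0              # hypothesized effect and design-stage SD
n = n_per_group(delta, assumed_sd)         # sized for 80% nominal power

for rel_error in (-0.3, 0.0, 0.3):         # true SD 30% lower, exact, 30% higher
    true_sd = assumed_sd * (1 + rel_error)
    print(f"SD error {rel_error:+.0%}: real power = "
          f"{achieved_power(delta, true_sd, n):.2f}")
```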

  12. Solid-phase extraction and separation procedure for trace aluminum in water samples and its determination by high-resolution continuum source flame atomic absorption spectrometry (HR-CS FAAS).

    Science.gov (United States)

    Ciftci, Harun; Er, Cigdem

    2013-03-01

    In the present study, a separation/preconcentration procedure for the determination of aluminum in water samples has been developed by using a new atomic absorption spectrometer concept with a high-intensity xenon short-arc lamp as continuum radiation source, a high-resolution double-echelle monochromator, and a charge-coupled device array detector. Sample solution pH, sample volume, flow rate of sample solution, and volume and concentration of eluent for solid-phase extraction of Al chelates with 4-[(dicyanomethyl)diazenyl]benzoic acid on polymeric resin (Duolite XAD-761) have been investigated. The aluminum adsorbed on the resin was eluted with 5 mL of 2 mol L⁻¹ HNO₃ and its concentration was determined by high-resolution continuum source flame atomic absorption spectrometry (HR-CS FAAS). Under the optimal conditions, the limits of detection obtained with HR-CS FAAS and line source FAAS (LS-FAAS) were 0.49 μg L⁻¹ and 3.91 μg L⁻¹, respectively. The accuracy of the procedure was confirmed by analyzing certified reference materials (NIST SRM 1643e, trace elements in water) and spiked real samples. The developed procedure was successfully applied to water samples.

  13. MUP, CEC-DES, STRADE. Codes for uncertainty propagation, experimental design and stratified random sampling techniques

    International Nuclear Information System (INIS)

    Amendola, A.; Astolfi, M.; Lisanti, B.

    1983-01-01

    The report describes how to use the codes: MUP (Monte Carlo Uncertainty Propagation) for uncertainty analysis by Monte Carlo simulation, including correlation analysis, extreme value identification and study of selected ranges of the variable space; CEC-DES (Central Composite Design) for building experimental matrices according to the requirements of Central Composite and Factorial Experimental Designs; and STRADE (Stratified Random Design) for experimental designs based on the Latin Hypercube Sampling Techniques. Application fields of the codes are probabilistic risk assessment, experimental design, sensitivity analysis and system identification problems.
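
    As a small illustration of the Latin Hypercube technique underlying STRADE, the sketch below builds an experimental design matrix for three input variables: each variable is stratified into equal-probability intervals and one point is drawn per interval. The variable ranges, run count and use of scipy are placeholders chosen for the example, not anything taken from the report.

```python
import numpy as np
from scipy.stats import qmc

n_runs = 20
sampler = qmc.LatinHypercube(d=3, seed=1)
unit_design = sampler.random(n=n_runs)     # stratified points in the unit cube

# Rescale the unit-cube design to illustrative physical ranges.
lower = [300.0, 0.1, 1.0]                  # e.g. temperature, failure rate, flow factor
upper = [600.0, 0.5, 5.0]
design_matrix = qmc.scale(unit_design, lower, upper)

print(design_matrix[:5])                   # first five experimental runs
```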

  14. Sampling and analysis validates acceptable knowledge on LANL transuranic, heterogeneous, debris waste, or "Cutting the Gordian knot that binds WIPP"

    International Nuclear Information System (INIS)

    Kosiewicz, S.T.; Triay, I.R.; Souza, L.A.

    1999-01-01

    Through sampling and toxicity characteristic leaching procedure (TCLP) analyses, LANL and the DOE validated that a LANL transuranic (TRU) waste (TA-55-43, Lot No. 01) was not a Resource Conservation and Recovery Act (RCRA) hazardous waste. This paper describes the sampling and analysis project as well as the statistical assessment of the analytical results. The analyses were conducted according to the requirements and procedures in the sampling and analysis plan approved by the New Mexico Environment Department. The plan used a statistical approach that was consistent with the stratified, random sampling requirements of SW-846. LANL adhered to the plan during sampling and chemical analysis of randomly selected items of the five major types of materials in this heterogeneous, radioactive, debris waste. To generate portions of the plan, LANL analyzed a number of non-radioactive items that were representative of the mix of items present in the waste stream. Data from these cold surrogates were used to generate the means and variances needed to optimize the design. Based on statistical arguments alone, only two samples from the entire waste stream were deemed necessary; however, a decision was made to analyze at least two samples of each of the five major waste types. To obtain these samples, nine TRU waste drums were opened. Sixty-six radioactively contaminated and four non-radioactive grab samples were collected. Portions of the samples were composited for chemical analyses. In addition, a radioactively contaminated sample of rust-colored powder of interest to the New Mexico Environment Department (NMED) was collected and qualitatively identified as rust

  15. Device-independent randomness amplification with a single device

    International Nuclear Information System (INIS)

    Plesch, Martin; Pivoluska, Matej

    2014-01-01

    Expansion and amplification of weak randomness with untrusted quantum devices has recently become a very fruitful topic of research. Here we contribute with a procedure for amplifying a single weak random source using tri-partite GHZ-type entangled states. If the quality of the source reaches a fixed threshold R = log₂(10), perfect random bits can be produced. This technique can be used to extract randomness from sources that cannot be extracted either classically or by existing procedures developed for Santha–Vazirani sources. Our protocol works with a single fault-free device decomposable into three non-communicating parts, that is repeatedly reused throughout the amplification process. - Highlights: • We propose a protocol for device independent randomness amplification. • Our protocol repeatedly re-uses a single device decomposable into three parts. • Weak random sources with min-entropy rate greater than 1/4 log₂(10) can be amplified. • Security against all-quantum adversaries is achieved

  16. To evaluate and compare the efficacy of combined sucrose and non-nutritive sucking for analgesia in newborns undergoing minor painful procedure: a randomized controlled trial.

    Science.gov (United States)

    Thakkar, P; Arora, K; Goyal, K; Das, R R; Javadekar, B; Aiyer, S; Panigrahi, S K

    2016-01-01

    The objective of this study was to evaluate and compare the efficacy of combined sucrose and non-nutritive sucking (NNS) for analgesia in newborn infants undergoing heel-stick procedures. This randomized controlled trial was conducted in the neonatal intensive care unit of a tertiary care hospital over a period of 1 year. One hundred and eighty full-term neonates with birth weight >2200 g and age >24 h were randomized to one of four interventions administered 2 min before the procedure: 2 ml of 30% sucrose (group I, n=45), NNS (group II, n=45), both (group III, n=45) or none (group IV, n=45). The primary outcome was a composite score based on the Premature Infant Pain Profile (PIPP) score. Baseline variables were comparable among the groups. The median (interquartile range) PIPP score was 3 (2 to 4) in group III as compared with 7 (6.5 to 8) in group I, 9 (7 to 11) in group II and 13 (10.5 to 15) in group IV. Group III had a significantly lower median PIPP score compared with the other groups (P=0.000). The median PIPP score also decreased significantly with any intervention as compared with no intervention (P=0.000). Sucrose and/or NNS are effective in providing analgesia in full-term neonates undergoing heel-stick procedures, with the combined intervention being more effective than any single intervention.

  17. A solid phase extraction-ion chromatography with conductivity detection procedure for determining cationic surfactants in surface water samples.

    Science.gov (United States)

    Olkowska, Ewa; Polkowska, Żaneta; Namieśnik, Jacek

    2013-11-15

    A new analytical procedure for the simultaneous determination of individual cationic surfactants (alkyl benzyl dimethyl ammonium chlorides) in surface water samples has been developed. We describe this methodology for the first time: it involves the application of solid phase extraction (SPE, for sample preparation) coupled with ion chromatography-conductivity detection (IC-CD, for the final determination). Mean recoveries of analytes between 79% and 93%, and overall method quantification limits in the range from 0.0018 to 0.038 μg/mL, were achieved for surface water and CRM samples. The methodology was applied to the determination of individual alkyl benzyl quaternary ammonium compounds in environmental samples (reservoir water) and enables their presence in such types of waters to be confirmed. In addition, it is simpler, less time-consuming and less labour-intensive, avoids the use of toxic chloroform, and is significantly less expensive than previously described approaches (liquid-liquid extraction coupled with liquid chromatography-mass spectrometry). Copyright © 2013 Elsevier B.V. All rights reserved.

  18. Effectiveness of a pre-procedural mouthwash in reducing bacteria in dental aerosols: randomized clinical trial

    Directory of Open Access Journals (Sweden)

    Belén RETAMAL-VALDES

    2017-03-01

    Full Text Available Abstract The aim of this randomized, single blinded clinical trial was to evaluate the effect of a pre-procedural mouthwash containing cetylpyridinium chloride (CPC), zinc lactate (Zn) and sodium fluoride (F) in the reduction of viable bacteria in oral aerosol after a dental prophylaxis with ultrasonic scaler. Sixty systemically healthy volunteers receiving dental prophylaxis were randomly assigned to one of the following experimental groups (15 per group): (i) rinsing with 0.075% CPC, 0.28% Zn and 0.05% F (CPC+Zn+F), (ii) water, (iii) 0.12% chlorhexidine digluconate (CHX), or (iv) no rinsing. Viable bacteria were collected from different locations in the dental office on enriched TSA plates and anaerobically incubated for 72 hours. The colonies were counted and species were then identified by Checkerboard DNA–DNA Hybridization. The total number of colony-forming units (CFUs) detected in the aerosols from volunteers who rinsed with CPC+Zn+F or CHX was statistically significantly (p<0.05) lower than that of subjects who did not rinse or who rinsed with water. When all locations were considered together, the aerosols from the CPC+Zn+F and CHX groups showed, respectively, 70% and 77% fewer CFUs than those from the No Rinsing group and 61% and 70% fewer than those from the Water group. The mean proportions of bacterial species from the orange complex were statistically significantly (p<0.05) lower in aerosols from the CPC+Zn+F and CHX groups compared with the other two groups. In conclusion, the mouthwash containing CPC+Zn+F is effective in reducing viable bacteria in oral aerosol after a dental prophylaxis with ultrasonic scaler.

  19. Random assay in radioimmunoassay: Feasibility and application compared with batch assay

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jung Min; Lee, Hwan Hee; Park, Sohyun; Kim, Tae Sung; Kim, Seok Ki [Dept. of Nuclear Medicine, National Cancer Center, Goyang (Korea, Republic of)

    2016-12-15

    The batch assay has been conventionally used for radioimmunoassay (RIA) because of its technical robustness and practical convenience. However, it has limitations in terms of the relative lag of report time due to the necessity of multiple assays in a small number of samples compared with the random assay technique. In this study, we aimed to verify whether the random assay technique can be applied in RIA and is feasible in daily practice. The coefficients of variation (CVs) of eight standard curves within a single kit were calculated in a CA-125 immunoradiometric assay (IRMA) as a reference for the practically ideal CV of the CA-125 kit. Ten standard curves of 10 kits from 2 prospectively collected lots (pLot) and 85 standard curves of 85 kits from 3 retrospectively collected lots (Lot) were obtained. Additionally, the raw measurement data of both 170 control references and 1123 patients' sera were collected retrospectively between December 2015 and January 2016. A standard curve of the first kit of each lot was used as a master standard curve for a random assay. The inter-kit CVs were analyzed for each lot. All raw measurements were normalized by decay and radioactivity. The CA-125 values from control samples and patients' sera were compared using the original batch assay and random assay. In standard curve analysis, the inter-kit CVs in pLots and Lots were comparable to those within a single kit. The CVs from the random assay with normalization were similar to those from the batch assay in the control samples (CVs % of low/high concentration; Lot1 2.71/1.91, Lot2 2.35/1.83, Lot3 2.83/2.08 vs. Lot1 2.05/1.21, Lot2 1.66/1.48, Lot3 2.41/2.14). The ICCs between the batch assay and random assay using patients' sera were satisfactory (Lot1 1.00, Lot2 0.999, Lot3 1.00). The random assay technique could be successfully applied to the conventional CA-125 IRMA kits. The random assay showed strong agreement with the batch assay. The

  20. Random lasing in human tissues

    International Nuclear Information System (INIS)

    Polson, Randal C.; Vardeny, Z. Valy

    2004-01-01

    A random collection of scatterers in a gain medium can produce coherent laser emission lines dubbed 'random lasing'. We show that biological tissues, including human tissues, can support coherent random lasing when infiltrated with a concentrated laser dye solution. To extract a typical random resonator size within the tissue we average the power Fourier transform of random laser spectra collected from many excitation locations in the tissue; we verified this procedure by a computer simulation. Surprisingly, we found that malignant tissues show many more laser lines compared to healthy tissues taken from the same organ. Consequently, the obtained typical random resonator was found to be different for healthy and cancerous tissues, and this may lead to a technique for separating malignant from healthy tissues for diagnostic imaging
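
    The resonator-size extraction mentioned above, averaging the power Fourier transform of spectra collected at many excitation locations, can be mocked up numerically. The synthetic spectra, mode spacing and noise level below are illustrative assumptions, not the experimental analysis of tissue data.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic emission spectra on a uniform wavenumber grid: a smooth background
# plus periodic laser modes whose spacing encodes the resonator size.
n_points, n_spectra = 2048, 50
k = np.linspace(0.0, 1.0, n_points)                # arbitrary wavenumber units
mode_spacing = 0.02
power_ft_sum = np.zeros(n_points // 2 + 1)

for _ in range(n_spectra):
    phase = rng.uniform(0, 2 * np.pi)
    spectrum = 1.0 + 0.5 * np.cos(2 * np.pi * k / mode_spacing + phase)
    spectrum += 0.1 * rng.normal(size=n_points)    # measurement noise
    power_ft_sum += np.abs(np.fft.rfft(spectrum - spectrum.mean())) ** 2

avg_power_ft = power_ft_sum / n_spectra            # averaged power Fourier transform
freqs = np.fft.rfftfreq(n_points, d=k[1] - k[0])
peak = freqs[np.argmax(avg_power_ft[1:]) + 1]      # skip the zero-frequency bin
print(f"dominant component ~ {peak:.1f}, i.e. mode spacing ~ {1 / peak:.3f}")
```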

  1. Conditional Monte Carlo randomization tests for regression models.

    Science.gov (United States)

    Parhat, Parwen; Rosenberger, William F; Diao, Guoqing

    2014-08-15

    We discuss the computation of randomization tests for clinical trials of two treatments when the primary outcome is based on a regression model. We begin by revisiting the seminal paper of Gail, Tan, and Piantadosi (1988), and then describe a method based on Monte Carlo generation of randomization sequences. The tests based on this Monte Carlo procedure are design based, in that they incorporate the particular randomization procedure used. We discuss permuted block designs, complete randomization, and biased coin designs. We also use a new technique by Plamadeala and Rosenberger (2012) for simple computation of conditional randomization tests. Like Gail, Tan, and Piantadosi, we focus on residuals from generalized linear models and martingale residuals from survival models. Such techniques do not apply to longitudinal data analysis, and we introduce a method for computation of randomization tests based on the predicted rate of change from a generalized linear mixed model when outcomes are longitudinal. We show, by simulation, that these randomization tests preserve the size and power well under model misspecification. Copyright © 2014 John Wiley & Sons, Ltd.
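
    A stripped-down version of the Monte Carlo idea, re-generating treatment assignments under the randomization procedure actually used and recomputing a statistic on regression residuals, might look like the sketch below. The synthetic data, the choice of statistic and the use of complete randomization are assumptions for illustration; the record's permuted-block and biased-coin variants are not shown.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(7)
n = 200

# Synthetic trial: covariate-dependent outcome with a small treatment effect.
age = rng.normal(50, 10, size=n)
treat = rng.integers(0, 2, size=n)                 # complete randomization
y = 0.05 * age + 0.4 * treat + rng.normal(size=n)

# Residuals from the covariate-only model; the statistic compares them by arm.
resid = sm.OLS(y, sm.add_constant(age)).fit().resid

def stat(assignment):
    return resid[assignment == 1].mean() - resid[assignment == 0].mean()

observed = stat(treat)

# Monte Carlo randomization distribution: re-draw assignments under the same
# procedure many times and recompute the statistic.
n_mc = 5000
null = np.array([stat(rng.integers(0, 2, size=n)) for _ in range(n_mc)])
p_value = (np.abs(null) >= abs(observed)).mean()
print(f"observed = {observed:.3f}, Monte Carlo p-value = {p_value:.4f}")
```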

  2. Effects of adding Braun jejunojejunostomy to standard Whipple procedure on reduction of afferent loop syndrome - a randomized clinical trial.

    Science.gov (United States)

    Kakaei, Farzad; Beheshtirouy, Samad; Nejatollahi, Seyed Moahammad Reza; Rashidi, Iqbal; Asvadi, Touraj; Habibzadeh, Afshin; Oliaei-Motlagh, Mohammad

    2015-12-01

    Whipple surgery (pancreaticoduodenectomy) has a high complication rate. We aimed to evaluate whether adding Braun jejunojejunostomy (side-to-side anastomosis of afferent and efferent loops distal to the gastrojejunostomy site) to a standard Whipple procedure would reduce postoperative complications. We conducted a randomized clinical trial comparing patients who underwent standard Whipple surgery (standard group) and patients who underwent standard Whipple surgery with Braun jejunojejunostomy (Braun group). Patients were followed for 1 month after the procedure and postoperative complications were recorded. Our study included 30 patients: 15 in the Braun and 15 in the standard group. In the Braun group, 4 (26.7%) patients experienced 6 complications, whereas in the standard group, 7 (46.7%) patients experienced 11 complications (p = 0.14). Complications in the Braun group were gastrointestinal bleeding and wound infection (n = 1 each) and delayed gastric emptying and pulmonary infection (n = 2 each). Complications in the standard group were death, pancreatic anastomosis leak and biliary anastomosis leak (n = 1 each); gastrointestinal bleeding (n = 2); and afferent loop syndrome and delayed gastric emptying (n = 3 each). There was no significant difference between groups in the subtypes of complications. Our results showed that adding Braun jejunojejunostomy to the standard Whipple procedure was associated with lower rates of afferent loop syndrome and delayed gastric emptying. However, more studies are needed to define the role of Braun jejunojejunostomy in this regard. IRCT2014020316473N1 (www.irct.ir).

  3. A systematic review and meta-analysis of randomized, controlled trials of moderate sedation for routine endoscopic procedures.

    Science.gov (United States)

    McQuaid, Kenneth R; Laine, Loren

    2008-05-01

    Numerous agents are available for moderate sedation in endoscopy. Our purpose was to compare the efficacy, safety, and efficiency of agents used for moderate sedation in EGD or colonoscopy. Systematic review of computerized bibliographic databases for randomized trials of moderate sedation that compared 2 active regimens or 1 active regimen with placebo or no sedation. Unselected adults undergoing EGD or colonoscopy with a goal of moderate sedation. Sedation-related complications, patient assessments (satisfaction, pain, memory, willingness to repeat examination), physician assessments (satisfaction, level of sedation, patient cooperation, examination quality), and procedure-related efficiency outcomes (sedation, procedure, or recovery time). Thirty-six studies (N = 3918 patients) were included. Sedation improved patient satisfaction (relative risk [RR] = 2.29, range 1.16-4.53) and willingness to repeat EGD (RR = 1.25, range 1.13-1.38) versus no sedation. Midazolam provided superior patient satisfaction to diazepam (RR = 1.18, range 1.07-1.29) and less frequent memory of EGD (RR = 0.57, range 0.50-0.60) versus diazepam. Adverse events and patient/physician assessments were not significantly different for midazolam (with or without narcotics) versus propofol except for slightly less patient satisfaction (RR = 0.90, range 0.83-0.97) and more frequent memory (RR = 3.00, range 1.25-7.21) with midazolam plus narcotics. Procedure times were similar, but sedation and recovery times were shorter with propofol than with midazolam-based regimens. Limitations included marked variability in design, regimens tested, and outcomes assessed, and relatively poor methodologic quality (low Jadad scores). Moderate sedation provides a high level of physician and patient satisfaction and a low risk of serious adverse events with all currently available agents. Midazolam-based regimens have longer sedation and recovery times than does propofol.

  4. Clinical procedure for colon carcinoma tissue sampling directly affects the cancer marker-capacity of VEGF family members

    International Nuclear Information System (INIS)

    Pringels, Sarah; Van Damme, Nancy; De Craene, Bram; Pattyn, Piet; Ceelen, Wim; Peeters, Marc; Grooten, Johan

    2012-01-01

    mRNA levels of members of the Vascular Endothelial Growth Factor family (VEGF-A, -B, -C, -D, Placental Growth Factor/PlGF) have been investigated as tissue-based markers of colon cancer. These studies, which used specimens obtained by surgical resection or colonoscopic biopsy, yielded contradictory results. We studied the effect of the sampling method on the marker accuracy of VEGF family members. Comparative RT-qPCR analysis was performed on healthy colon and colon carcinoma samples obtained by biopsy (n = 38) or resection (n = 39) to measure mRNA expression levels of individual VEGF family members. mRNA levels of genes encoding the eicosanoid enzymes cyclooxygenase 2 (COX2) and 5-lipoxygenase (5-LOX) and of genes encoding the hypoxia markers glucose transporter 1 (GLUT-1) and carbonic anhydrase IX (CAIX) were included as markers for cellular stress and hypoxia. Expression levels of COX2, 5-LOX, GLUT-1 and CAIX revealed the occurrence in healthy colon resection samples of hypoxic cellular stress and a concurrent increment of basal expression levels of VEGF family members. This increment abolished differential expression of VEGF-B and VEGF-C in matched carcinoma resection samples and created a surgery-induced underexpression of VEGF-D. VEGF-A and PlGF showed strong overexpression in carcinoma samples regardless of the sampling method. Sampling-induced hypoxia in resection samples but not in biopsy samples affects the marker-reliability of VEGF family members. Therefore, biopsy samples provide a more accurate report on VEGF family mRNA levels. Furthermore, this limited expression analysis proposes VEGF-A and PlGF as reliable, sampling procedure insensitive mRNA-markers for molecular diagnosis of colon cancer

  5. The Effect of EMLA Cream on Patient-Controlled Analgesia with Remifentanil in ESWL Procedure: A Placebo-Controlled Randomized Study.

    Science.gov (United States)

    Acar, Arzu; Erhan, Elvan; Nuri Deniz, M; Ugur, Gulden

    2013-01-01

    To alleviate stinging pain in the skin entry area and visceral discomfort in patients who are undergoing ESWL. This study was designed to investigate the effectiveness of EMLA cream in combination with remifentanil patient-controlled analgesia (PCA) in patients undergoing ESWL treatment. Sixty patients were divided into two double-blind randomized groups. Those in the first group were administered 3-5 mm of EMLA 5% cream on a marked area; the second group received, as a placebo, a cream with no analgesic effect in the same amount. All patients were administered a remifentanil bolus with a PCA device. Arterial blood pressure, oxygen saturation, and respiratory rate were recorded throughout the procedure; postoperative side effects, agitation, and respiratory depression were assessed afterwards. Visual Analogue Scale (VAS) scores were taken preoperatively, perioperatively, directly postoperatively, and 60 minutes after finishing the procedure. There were no statistically significant differences in the frequency of PCA demands and delivered boluses or in perioperative VAS scores. No significant side effects were noted. Patient satisfaction was high in both groups. EMLA cream offered no advantage over the placebo cream in patients undergoing ESWL with remifentanil PCA.

  6. Importance sampling of heavy-tailed iterated random functions

    NARCIS (Netherlands)

    B. Chen (Bohan); C.H. Rhee (Chang-Han); A.P. Zwart (Bert)

    2016-01-01

    We consider a stochastic recurrence equation of the form $Z_{n+1} = A_{n+1} Z_n + B_{n+1}$, where $\mathbb{E}[\log A_1]<0$, $\mathbb{E}[\log^+ B_1]<\infty$ and $\{(A_n,B_n)\}_{n\in\mathbb{N}}$ is an i.i.d. sequence of positive random vectors. The stationary distribution of this Markov
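
    To make the recurrence concrete, the snippet below simulates Z_{n+1} = A_{n+1} Z_n + B_{n+1} with multipliers satisfying E[log A] < 0 and a heavy-tailed B, and estimates a tail probability of the stationary law by plain Monte Carlo. The chosen distributions and threshold are illustrative, and no importance sampling (the subject of the record) is attempted here.

```python
import numpy as np

rng = np.random.default_rng(11)

def stationary_samples(n_paths=20_000, burn_in=500):
    """Crude draws from the stationary law of Z_{n+1} = A_{n+1} Z_n + B_{n+1}."""
    z = np.zeros(n_paths)
    for _ in range(burn_in):
        a = np.exp(rng.normal(-0.5, 1.0, size=n_paths))  # E[log A] = -0.5 < 0
        b = rng.pareto(2.5, size=n_paths) + 1.0           # heavy-tailed positive B
        z = a * z + b
    return z

z = stationary_samples()
threshold = 50.0
print(f"naive Monte Carlo estimate of P(Z > {threshold}): {(z > threshold).mean():.4e}")
```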

  7. The effect of local heat on term neonates pain intensity during heel-blood sampling

    Directory of Open Access Journals (Sweden)

    R. GHobadi Mohebi

    2017-04-01

    Full Text Available Aims: Newborns are more sensitive to pain than adults and are more susceptible to the long-term complications of pain. So, it is necessary to use procedures for reducing pain in newborns. The aim of this study was to determine the effect of local heat on the pain intensity of heel-blood sampling in term newborns. Material & Methods: In this randomized controlled clinical trial study, in 2012, 63 healthy 3- to 5-day-old newborns who were referred to Shahid Delkhah Health Center in Ferdows were selected by random sampling and randomly divided into 3 groups (21 in each group): test (heat), placebo (sound) and control. The pain intensity of the newborns before, during and after heel-blood sampling was evaluated. The data collection tools were a demographic questionnaire and the Neonatal Infant Pain Scale (NIPS). Data were analyzed by SPSS 14.5 software using the chi-square test, one-way ANOVA, Tukey's post hoc test, and ANOVA with repeated observations. Findings: The mean pain intensity in the three groups was not significantly different before the intervention (p=0.86), but the mean pain intensity was lower in the test group than in the other two groups (p=0.006). After heel-blood sampling, the mean pain intensity was lowest in the test group and highest in the control group (p<0.001). Conclusion: Local heat during and after heel-blood sampling decreases pain intensity in term newborns.

  8. Randomized Comparison of Two Vaginal Self-Sampling Methods for Human Papillomavirus Detection: Dry Swab versus FTA Cartridge.

    Directory of Open Access Journals (Sweden)

    Rosa Catarino

    Full Text Available Human papillomavirus (HPV) self-sampling (self-HPV) is valuable in cervical cancer screening. HPV testing is usually performed on physician-collected cervical smears stored in liquid-based medium. Dry filters and swabs are an alternative. We evaluated the adequacy of self-HPV using two dry storage and transport devices, the FTA cartridge and swab. A total of 130 women performed two consecutive self-HPV samples. Randomization determined which of the two tests was performed first: self-HPV using dry swabs (s-DRY) or vaginal specimen collection using a cytobrush applied to an FTA cartridge (s-FTA). After self-HPV, a physician collected a cervical sample using liquid-based medium (Dr-WET). HPV types were identified by real-time PCR. Agreement between collection methods was measured using the kappa statistic. HPV prevalence for high-risk types was 62.3% (95%CI: 53.7-70.2) detected by s-DRY, 56.2% (95%CI: 47.6-64.4) by Dr-WET, and 54.6% (95%CI: 46.1-62.9) by s-FTA. There was overall agreement of 70.8% between s-FTA and s-DRY samples (kappa = 0.34), and of 82.3% between self-HPV and Dr-WET samples (kappa = 0.56). Detection sensitivities for low-grade squamous intraepithelial lesion or worse (LSIL+) were: 64.0% (95%CI: 44.5-79.8) for s-FTA, 84.6% (95%CI: 66.5-93.9) for s-DRY, and 76.9% (95%CI: 58.0-89.0) for Dr-WET. The preferred self-collection method among patients was s-DRY (40.8% vs. 15.4%). Regarding costs, the FTA card was five times more expensive than the swab (~5 US dollars (USD) per card vs. ~1 USD per swab). Self-HPV using dry swabs is sensitive for detecting LSIL+ and less expensive than s-FTA. International Standard Randomized Controlled Trial Number (ISRCTN): 43310942.

  9. Assessment of the effects of different sample perfusion procedures on phase-contrast tomographic images of mouse spinal cord

    Science.gov (United States)

    Stefanutti, E.; Sierra, A.; Miocchi, P.; Massimi, L.; Brun, F.; Maugeri, L.; Bukreeva, I.; Nurmi, A.; Begani Provinciali, G.; Tromba, G.; Gröhn, O.; Giove, F.; Cedola, A.; Fratini, M.

    2018-03-01

    Synchrotron X-ray Phase Contrast micro-Tomography (SXrPCμT) is a powerful tool for the investigation of biological tissues, including the central nervous system (CNS), and it allows the vascular and neuronal networks to be detected simultaneously, avoiding contrast agents or destructive sample preparations. However, specific sample preparation procedures aimed at optimizing the achievable contrast- and signal-to-noise ratios (CNR and SNR, respectively) are required. Here we report and discuss the effects of perfusion with two different fixative agents (ethanol and paraformaldehyde) and with a widely used contrast medium (MICROFIL®) on mouse spinal cord. As a main result, we found that ethanol enhances contrast at the grey/white matter interface and increases the contrast of vascular features and fibres, thus providing adequate spatial resolution to visualise the vascular network at the microscale. On the other hand, ethanol is known to induce tissue dehydration, likely reducing cell dimensions below the spatial resolution limit imposed by the experimental technique. Nonetheless, neurons remain well visible using either perfused paraformaldehyde or the MICROFIL® compound, as these latter media do not affect tissues with dehydration effects. Paraformaldehyde appears to be the best compromise: unlike MICROFIL®, it is not a contrast agent, but it is less invasive than ethanol and permits both cells and blood vessels to be visualised well. However, a quantitative estimation of the relative grey matter volume of each sample has led us to conclude that no significant alterations in the grey matter extension compared to the white matter occur as a consequence of the perfusion procedures tested in this study.

  10. Relative efficiency and sample size for cluster randomized trials with variable cluster sizes.

    Science.gov (United States)

    You, Zhiying; Williams, O Dale; Aban, Inmaculada; Kabagambe, Edmond Kato; Tiwari, Hemant K; Cutter, Gary

    2011-02-01

    The statistical power of cluster randomized trials depends on two sample size components, the number of clusters per group and the numbers of individuals within clusters (cluster size). Variable cluster sizes are common and this variation alone may have significant impact on study power. Previous approaches have taken this into account by either adjusting total sample size using a designated design effect or adjusting the number of clusters according to an assessment of the relative efficiency of unequal versus equal cluster sizes. This article defines a relative efficiency of unequal versus equal cluster sizes using noncentrality parameters, investigates properties of this measure, and proposes an approach for adjusting the required sample size accordingly. We focus on comparing two groups with normally distributed outcomes using t-test, and use the noncentrality parameter to define the relative efficiency of unequal versus equal cluster sizes and show that statistical power depends only on this parameter for a given number of clusters. We calculate the sample size required for an unequal cluster sizes trial to have the same power as one with equal cluster sizes. Relative efficiency based on the noncentrality parameter is straightforward to calculate and easy to interpret. It connects the required mean cluster size directly to the required sample size with equal cluster sizes. Consequently, our approach first determines the sample size requirements with equal cluster sizes for a pre-specified study power and then calculates the required mean cluster size while keeping the number of clusters unchanged. Our approach allows adjustment in mean cluster size alone or simultaneous adjustment in mean cluster size and number of clusters, and is a flexible alternative to and a useful complement to existing methods. Comparison indicated that we have defined a relative efficiency that is greater than the relative efficiency in the literature under some conditions. Our measure
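
    The record's noncentrality-based efficiency measure is not reproduced here, but a commonly used approximation conveys the same adjustment: inflate the required individually randomized sample size by a design effect that accounts for the coefficient of variation (CV) of cluster sizes. The formula and numbers below are a hedged sketch under that standard approximation, not the authors' exact method.

```python
import math

def design_effect(mean_cluster_size, icc, cv=0.0):
    """Approximate design effect for (possibly unequal) cluster sizes.

    Uses the common approximation DEFF = 1 + ((cv**2 + 1) * m_bar - 1) * icc,
    which reduces to 1 + (m_bar - 1) * icc when all clusters are the same size.
    """
    return 1.0 + ((cv ** 2 + 1.0) * mean_cluster_size - 1.0) * icc

n_individual = 128            # per-arm sample size if individuals were randomized
m_bar, icc = 20, 0.05         # mean cluster size and intracluster correlation

for cv in (0.0, 0.4, 0.8):    # increasing variability in cluster size
    deff = design_effect(m_bar, icc, cv)
    n_needed = math.ceil(n_individual * deff)
    print(f"CV={cv:.1f}: design effect {deff:.2f}, "
          f"~{math.ceil(n_needed / m_bar)} clusters per arm")
```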

  11. The Study on Mental Health at Work: Design and sampling.

    Science.gov (United States)

    Rose, Uwe; Schiel, Stefan; Schröder, Helmut; Kleudgen, Martin; Tophoven, Silke; Rauch, Angela; Freude, Gabriele; Müller, Grit

    2017-08-01

    The Study on Mental Health at Work (S-MGA) generates the first nationwide representative survey enabling the exploration of the relationship between working conditions, mental health and functioning. This paper describes the study design, sampling procedures and data collection, and presents a summary of the sample characteristics. S-MGA is a representative study of German employees aged 31-60 years subject to social security contributions. The sample was drawn from the employment register based on a two-stage cluster sampling procedure. Firstly, 206 municipalities were randomly selected from a pool of 12,227 municipalities in Germany. Secondly, 13,590 addresses were drawn from the selected municipalities for the purpose of conducting 4500 face-to-face interviews. The questionnaire covers psychosocial working and employment conditions, measures of mental health, work ability and functioning. Data from personal interviews were combined with employment histories from register data. Descriptive statistics of socio-demographic characteristics and logistic regression analyses were used for comparing the population, the gross sample and the respondents. In total, 4511 face-to-face interviews were conducted. A test for sampling bias revealed that individuals in older cohorts participated more often, while individuals with an unknown educational level, residing in major cities or with a non-German ethnic background were slightly underrepresented. There is no indication of major deviations in characteristics between the basic population and the sample of respondents. Hence, S-MGA provides representative data for research on work and health, designed as a cohort study with plans to rerun the survey 5 years after the first assessment.
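
    A toy version of the two-stage cluster sampling described above, first drawing municipalities and then addresses within them, could look like the following. The counts mirror those reported in the abstract, but the equal-probability selection at both stages and the dummy address identifiers are simplifying assumptions, not the survey's actual register-based design.

```python
import numpy as np

rng = np.random.default_rng(2017)

# Stage 1: draw 206 primary sampling units (municipalities) out of 12,227.
municipalities = np.arange(12_227)
psu = rng.choice(municipalities, size=206, replace=False)

# Stage 2: draw addresses within the selected municipalities, spread roughly
# evenly, until close to the target of 13,590 addresses.
addresses_per_psu = 13_590 // len(psu)
sample = {
    m: rng.choice(10_000, size=addresses_per_psu, replace=False)  # dummy address IDs
    for m in psu
}

total = sum(len(v) for v in sample.values())
print(f"{len(psu)} municipalities selected, {total} addresses drawn")
```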

  12. The Study on Mental Health at Work: Design and sampling

    Science.gov (United States)

    Rose, Uwe; Schiel, Stefan; Schröder, Helmut; Kleudgen, Martin; Tophoven, Silke; Rauch, Angela; Freude, Gabriele; Müller, Grit

    2017-01-01

    Aims: The Study on Mental Health at Work (S-MGA) generates the first nationwide representative survey enabling the exploration of the relationship between working conditions, mental health and functioning. This paper describes the study design, sampling procedures and data collection, and presents a summary of the sample characteristics. Methods: S-MGA is a representative study of German employees aged 31–60 years subject to social security contributions. The sample was drawn from the employment register based on a two-stage cluster sampling procedure. Firstly, 206 municipalities were randomly selected from a pool of 12,227 municipalities in Germany. Secondly, 13,590 addresses were drawn from the selected municipalities for the purpose of conducting 4500 face-to-face interviews. The questionnaire covers psychosocial working and employment conditions, measures of mental health, work ability and functioning. Data from personal interviews were combined with employment histories from register data. Descriptive statistics of socio-demographic characteristics and logistic regression analyses were used for comparing the population, the gross sample and the respondents. Results: In total, 4511 face-to-face interviews were conducted. A test for sampling bias revealed that individuals in older cohorts participated more often, while individuals with an unknown educational level, residing in major cities or with a non-German ethnic background were slightly underrepresented. Conclusions: There is no indication of major deviations in characteristics between the basic population and the sample of respondents. Hence, S-MGA provides representative data for research on work and health, designed as a cohort study with plans to rerun the survey 5 years after the first assessment. PMID:28673202

  13. Results of the analysis of the intercomparison samples of the depleted uranium dioxide SR-10

    International Nuclear Information System (INIS)

    Aigner, H.; Deron, S.; Kuhn, E.; Zoigner, A.

    1981-01-01

    Samples of a homogeneous powder of depleted uranium dioxide, SR-10, were distributed to 27 laboratories in February 1979 for intercomparison of the precisions and accuracies of wet chemical assay. Seven laboratories reported their results. Six laboratories applied titration procedures, four of them using methods derived from the Davies and Gray procedure (1), and one laboratory used controlled potential coulometry. An analysis of variance yields for each laboratory the estimates of the measurement errors, the dissolution or treatment errors and the random calibration errors. The measurement errors vary between 0.01% and 0.10% relative. The differences from the reference value vary between -0.48% and +0.87% uranium, but 5 laboratories agree within ±0.25% U with the reference value. The biases of 5 laboratories are greater than expected from their random errors. The mean bias of the 7 laboratories is equal to +0.03% U. The standard deviation of the laboratory biases is equal to 0.43% U. (author)

  14. 40 CFR 90.413 - Exhaust sample procedure-gaseous components.

    Science.gov (United States)

    2010-07-01

    ...-analysis values. (3) Measure and record HC, CO, CO2, and NOX concentrations in the exhaust sample bag(s...-analysis values. (7) Collect background HC, CO, CO2, and NOX in a sample bag (for dilute exhaust sampling...: (1) For dilute grab (“bag”) sample analysis, the analyzer response must be stable at greater than 99...

  15. Power Spectrum Estimation of Randomly Sampled Signals

    DEFF Research Database (Denmark)

    Velte, C. M.; Buchhave, P.; K. George, W.

    algorithms: sample-and-hold and the direct spectral estimator without residence time weighting. The computer generated signal is a Poisson process with a sample rate proportional to velocity magnitude that consists of well-defined frequency content, which makes bias easy to spot. The idea
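
    One of the two estimators named above, sample-and-hold followed by a standard periodogram, can be mocked up as follows. The Poisson-type sample times, the test signal and the resampling grid are assumptions made for the example; the residence-time-weighted direct estimator is not shown.

```python
import numpy as np

rng = np.random.default_rng(5)

# Irregular (Poisson-type) sample times and a signal containing a known tone.
duration, mean_rate, f0 = 10.0, 200.0, 13.0        # seconds, samples/s, Hz
t = np.sort(rng.uniform(0.0, duration, size=int(mean_rate * duration)))
x = np.sin(2 * np.pi * f0 * t) + 0.2 * rng.normal(size=t.size)

# Sample-and-hold: resample onto a uniform grid, holding the last observed value.
fs = 1000.0
grid = np.arange(0.0, duration, 1.0 / fs)
idx = np.clip(np.searchsorted(t, grid, side="right") - 1, 0, t.size - 1)
x_held = x[idx]

# Standard periodogram of the held signal.
spec = np.abs(np.fft.rfft(x_held - x_held.mean())) ** 2 / x_held.size
freqs = np.fft.rfftfreq(x_held.size, d=1.0 / fs)
print(f"spectral peak near {freqs[np.argmax(spec)]:.1f} Hz (true tone {f0} Hz)")
```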

  16. Pancreaticoduodenectomy for pancreatic head cancer: PPPD versus Whipple procedure.

    Science.gov (United States)

    Lin, Pin-Wen; Shan, Yan-Shen; Lin, Yih-Jyh; Hung, Chung-Jye

    2005-01-01

    Resectable carcinoma of the head of the pancreas can be treated with either standard (the Whipple) or pylorus-preserving pancreaticoduodenectomy (PPPD). Only a few reports have compared the differences between these two procedures. From July 1994 to Oct 2002, a prospective randomized comparison between the Whipple procedure and PPPD, done by the same surgeon for patients with carcinoma of the head of the pancreas, was conducted. Thirty-six patients diagnosed with pancreatic head adenocarcinoma were randomized to receive either the Whipple procedure or a PPPD. Three patients initially randomized to have a PPPD were converted to the Whipple procedure due to gross duodenal involvement. Finally, 19 patients received the Whipple procedure, 14 patients underwent PPPD and three patients had conversion. Two perioperative deaths in the Whipple group and one perioperative death in PPPD resulted in an 8 percent mortality rate in the 36 patients. Median duration of the Whipple operation was 265 (range 203-475) min with a median blood loss of 570 (50-8540) mL. In the patients who had PPPD, median operating time was 232 (range 165-270) min, and median blood loss was 375 (range 100-1300) mL. There was one minor leak from the pancreaticojejunostomy in each group, resulting in a 5.5 percent minor leak rate in the 36 patients. These outcomes were not significantly different. Delayed gastric emptying was observed more frequently after PPPD (six of 14 patients) than after the Whipple procedure (none of 19 patients). There was no significant difference between the Whipple procedure and PPPD in terms of median survival and 5-year survival rate. The median survival time was 16.0 months and the 5-year survival rate was 9.4 percent in the 36 patients. Blood loss during operation influenced the prognosis. There was no significant difference between the Whipple procedure and PPPD for the treatment of pancreatic head cancer in terms of operating time, blood loss, operative mortality and long-term survival. But delayed gastric emptying was more frequently

  17. Comparative evaluation of stress levels before, during, and after periodontal surgical procedures with and without nitrous oxide-oxygen inhalation sedation

    Directory of Open Access Journals (Sweden)

    Gurkirat Sandhu

    2017-01-01

    Full Text Available Context: Periodontal surgical procedures produce varying degrees of stress in all patients. Nitrous oxide-oxygen inhalation sedation is very effective for adult patients with mild-to-moderate anxiety due to dental procedures and needle phobia. Aim: The present study was designed to perform periodontal surgical procedures under nitrous oxide-oxygen inhalation sedation and assess whether this technique actually reduces stress physiologically, in comparison to local anesthesia alone (LA) during lengthy periodontal surgical procedures. Settings and Design: This was a randomized, split-mouth, cross-over study. Materials and Methods: A total of 16 patients were selected for this randomized, split-mouth, cross-over study. One surgical session (SS) was performed under local anesthesia aided by nitrous oxide-oxygen inhalation sedation, and the other SS was performed on the contralateral quadrant under LA. For each session, blood samples to measure and evaluate serum cortisol levels were obtained, and vital parameters including blood pressure, heart rate, respiratory rate, and arterial blood oxygen saturation were monitored before, during, and after the periodontal surgical procedures. Statistical Analysis Used: Paired t-test and repeated-measures ANOVA. Results: The findings of the present study revealed a statistically significant decrease in serum cortisol levels, blood pressure and pulse rate and a statistically significant increase in respiratory rate and arterial blood oxygen saturation during periodontal surgical procedures under nitrous oxide inhalation sedation. Conclusion: Nitrous oxide-oxygen inhalation sedation for periodontal surgical procedures is capable of reducing stress physiologically, in comparison to LA during lengthy periodontal surgical procedures.

  18. A rapid sample decomposition procedure for bromo-heavies containing ferruginous material: determination of REEs and thorium by ICP-AES

    International Nuclear Information System (INIS)

    Khorge, C.R.; Murugesan, P.; Chakrapani, G.

    2013-01-01

    A rapid method of sample decomposition and dissolution for bromoform heavies is described for the determination of REEs and thorium by inductively coupled plasma-optical emission spectrometry. For application to geochemical exploration, to achieve a high sample throughput, a simple and rapid analytical procedure is a prerequisite. In order to speed up the existing methodology, phosphate fusion was introduced for decomposition of samples. In the proposed method, bromoform-heavies material is fused with a 1:1 mixture of sodium dihydrogen orthophosphate and tetra-sodium pyrophosphate and dissolved in distilled water. After disintegration of the melt, the solution was subjected to oxalate precipitation followed by R₂O₃ separation for separating the REEs from major matrix interfering elements. The rare earth elements and thorium in the resultant solution were determined by ICP-OES. The results are compared with the results obtained by well-established existing dissolution procedures involving HF-HCl-HClO₄ acid treatment and NaF/KHF₂ fusion followed by H₂SO₄ acid fuming. The accuracy of the method was evaluated by doping the phosphate blank with known amounts of REEs and comparing the recoveries obtained using the present method. The method is simple, rapid and is suitable for the routine determination of REEs and Th in bromoform heavies. The RSD of the method was found to be within 1-3% for Th and REEs by ICP-AES. (author)

  19. Calibration of semi-stochastic procedure for simulating high-frequency ground motions

    Science.gov (United States)

    Seyhan, Emel; Stewart, Jonathan P.; Graves, Robert

    2013-01-01

    Broadband ground motion simulation procedures typically utilize physics-based modeling at low frequencies, coupled with semi-stochastic procedures at high frequencies. The high-frequency procedure considered here combines deterministic Fourier amplitude spectra (dependent on source, path, and site models) with random phase. Previous work showed that high-frequency intensity measures from this simulation methodology attenuate faster with distance and have lower intra-event dispersion than in empirical equations. We address these issues by increasing crustal damping (Q) to reduce distance attenuation bias and by introducing random site-to-site variations to Fourier amplitudes using a lognormal standard deviation ranging from 0.45 for Mw  100 km).
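
    The core of the high-frequency procedure, pairing a deterministic Fourier amplitude spectrum with random phase (plus a lognormal site-to-site amplitude factor), can be illustrated in a few lines. The amplitude shape, time step and standard deviation below are made-up placeholders, not the source, path and site models used in the simulations.

```python
import numpy as np

rng = np.random.default_rng(13)

# Deterministic Fourier amplitude spectrum (illustrative shape only).
n, dt = 4096, 0.01                          # number of samples and time step [s]
freqs = np.fft.rfftfreq(n, d=dt)
amp = freqs / (1.0 + (freqs / 5.0) ** 2)    # rises, then decays above ~5 Hz

# Random phase plus a lognormal site-to-site scaling of the amplitudes.
phase = rng.uniform(0.0, 2.0 * np.pi, size=freqs.size)
site_factor = rng.lognormal(mean=0.0, sigma=0.45)
spectrum = site_factor * amp * np.exp(1j * phase)
spectrum[0] = 0.0                           # enforce zero mean

accel = np.fft.irfft(spectrum, n=n)         # synthetic high-frequency time series
print(f"site factor {site_factor:.2f}, peak |a| = {np.abs(accel).max():.4f}")
```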

  20. Quality control of the analysis of IAEA samples in the radium institute

    International Nuclear Information System (INIS)

    Belyaev, B.N.; Lovtsus, A.V.; Makarova, T.P.; Stepanov, A.V.

    1989-01-01

    Metrological characteristics of the mass and alpha spectrometric methods used in the Radium Institute for the analysis of spent fuel control samples are evaluated. The techniques of analysis and the procedure of quality control for isotopic ratio measurements, based on the use of uranium and plutonium standard reference materials (NBS, NBL, SAL made in USSR), are described. The results of measurements performed during cooperation with the IAEA are discussed, and the sources of systematic and random errors are analyzed. The results obtained agree well with the target values. (author)

  1. A single-centre experience of the implementation of adrenal vein sampling procedure: the impact on the diagnostic work-up in primary aldosteronism

    NARCIS (Netherlands)

    Kadziela, J.; Prejbisz, A.; Michalowska, I.; Kolodziejczyk-Kruk, S.; Schultze Kool, L.J.; Kabat, M.; Janaszek-Sitkowska, H.; Toutounchi, S.; Galazka, Z.; Ambroziak, U.; Bednarczuk, T.; Ptasinska-Wnuk, D.; Hoffmann, M.; Januszewicz, M.; Januszewicz, A.; Witkowski, A.

    2017-01-01

    BACKGROUND: Primary aldosteronism is one of the most common causes of secondary hypertension. Adrenal vein sampling (AVS) remains a "gold standard" procedure in differentiation between unilateral (adenoma) and bilateral (hyperplasia) disease. AIM: The aim of this study was to present our

  2. The effect of sample grinding procedures after processing on gas production profiles and end-product formation in expander processed barley and peas

    NARCIS (Netherlands)

    Azarfar, A.; Poel, van der A.F.B.; Tamminga, S.

    2007-01-01

    Grinding is a technological process widely applied in the feed manufacturing industry and is a prerequisite for obtaining representative samples for laboratory procedures (e.g. gas production analysis). When feeds are subjected to technological processes other than grinding (e.g. expander

  3. Assessing differences in groups randomized by recruitment chain in a respondent-driven sample of Seattle-area injection drug users.

    Science.gov (United States)

    Burt, Richard D; Thiede, Hanne

    2014-11-01

    Respondent-driven sampling (RDS) is a form of peer-based study recruitment and analysis that incorporates features designed to limit and adjust for biases in traditional snowball sampling. It is being widely used in studies of hidden populations. We report an empirical evaluation of RDS's consistency and variability, comparing groups recruited contemporaneously, by identical methods and using identical survey instruments. We randomized recruitment chains from the RDS-based 2012 National HIV Behavioral Surveillance survey of injection drug users in the Seattle area into two groups and compared them in terms of sociodemographic characteristics, drug-associated risk behaviors, sexual risk behaviors, human immunodeficiency virus (HIV) status and HIV testing frequency. The two groups differed in five of the 18 variables examined (P ≤ .001): race (e.g., 60% white vs. 47%), gender (52% male vs. 67%), area of residence (32% downtown Seattle vs. 44%), an HIV test in the previous 12 months (51% vs. 38%). The difference in serologic HIV status was particularly pronounced (4% positive vs. 18%). In four further randomizations, differences in one to five variables attained this level of significance, although the specific variables involved differed. We found some material differences between the randomized groups. Although the variability of the present study was less than has been reported in serial RDS surveys, these findings indicate caution in the interpretation of RDS results. Copyright © 2014 Elsevier Inc. All rights reserved.

  4. Determination of Pesticides Residues in Cucumbers Grown in Greenhouse and the Effect of Some Procedures on Their Residues.

    Science.gov (United States)

    Leili, Mostafa; Pirmoghani, Amin; Samadi, Mohammad Taghi; Shokoohi, Reza; Roshanaei, Ghodratollah; Poormohammadi, Ali

    2016-11-01

    The objective of this study was to determine the residual concentrations of ethion and imidacloprid in cucumbers grown in greenhouses. The effect of some simple processing procedures on both ethion and imidacloprid residues was also studied. Ten active greenhouses that produce cucumber were randomly selected. Ethion and imidacloprid, as the most widely used pesticides, were measured in cucumber samples from the studied greenhouses. Moreover, the effect of storing, washing, and peeling as simple processing procedures on both ethion and imidacloprid residues was investigated. One hour after pesticide application, the residue levels of ethion and imidacloprid were higher than the Codex maximum residue limits (MRLs). One day after pesticide application, the levels of pesticides had decreased by about 35 and 31% for ethion and imidacloprid, respectively, but were still higher than the MRL. The washing procedure led to about 51 and 42.5% loss in ethion and imidacloprid residues, respectively. The peeling procedure led to the highest losses, of 93.4 and 63.7% in ethion and imidacloprid residues, respectively. The recovery for both target analytes was in the range between 88 and 102%. The residue values in samples collected one hour after pesticide application were higher than the standard values. The storing, washing, and peeling procedures led to a decrease of pesticide residues in greenhouse cucumbers. Among them, the peeling procedure had the greatest impact on residue reduction. Therefore, these procedures can be used as simple and effective processing techniques for reducing and removing pesticides from greenhouse products before their consumption.

  5. Soil Gas Sampling

    Science.gov (United States)

    Field Branches Quality System and Technical Procedures: This document describes general and specific procedures, methods and considerations to be used and observed when collecting soil gas samples for field screening or laboratory analysis.

  6. A random cluster survey and a convenience sample give comparable estimates of immunity to vaccine preventable diseases in children of school age in Victoria, Australia.

    Science.gov (United States)

    Kelly, Heath; Riddell, Michaela A; Gidding, Heather F; Nolan, Terry; Gilbert, Gwendolyn L

    2002-08-19

    We compared estimates of the age-specific population immunity to measles, mumps, rubella, hepatitis B and varicella zoster viruses in Victorian school children obtained by a national sero-survey, using a convenience sample of residual sera from diagnostic laboratories throughout Australia, with those from a three-stage random cluster survey. When grouped according to school age (primary or secondary school) there was no significant difference in the estimates of immunity to measles, mumps, hepatitis B or varicella. Compared with the convenience sample, the random cluster survey estimated higher immunity to rubella in samples from both primary (98.7% versus 93.6%, P = 0.002) and secondary school students (98.4% versus 93.2%, P = 0.03). Despite some limitations, this study suggests that the collection of a convenience sample of sera from diagnostic laboratories is an appropriate sampling strategy to provide population immunity data that will inform Australia's current and future immunisation policies. Copyright 2002 Elsevier Science Ltd.

  7. Standard practice for sampling special nuclear materials in multi-container lots

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    1987-01-01

    1.1 This practice provides an aid in designing a sampling and analysis plan for the purpose of minimizing random error in the measurement of the amount of nuclear material in a lot consisting of several containers. The problem addressed is the selection of the number of containers to be sampled, the number of samples to be taken from each sampled container, and the number of aliquot analyses to be performed on each sample. 1.2 This practice provides examples for application as well as the necessary development for understanding the statistics involved. The uniqueness of most situations does not allow presentation of step-by-step procedures for designing sampling plans. It is recommended that a statistician experienced in materials sampling be consulted when developing such plans. 1.3 The values stated in SI units are to be regarded as the standard. 1.4 This standard does not purport to address all of the safety problems, if any, associated with its use. It is the responsibility of the user of this standar...

  8. The contribution of simple random sampling to observed variations in faecal egg counts.

    Science.gov (United States)

    Torgerson, Paul R; Paul, Michaela; Lewis, Fraser I

    2012-09-10

    It has been over 100 years since the classical paper published by Gosset in 1907, under the pseudonym "Student", demonstrated that yeast cells suspended in a fluid and measured by a haemocytometer conformed to a Poisson process. Similarly, parasite eggs in a faecal suspension also conform to a Poisson process. Despite this, there are common misconceptions about how to analyse or interpret observations from the McMaster or similar quantitative parasitic diagnostic techniques, widely used for evaluating parasite eggs in faeces. The McMaster technique can easily be shown from a theoretical perspective to give variable results that inevitably arise from the random distribution of parasite eggs in a well mixed faecal sample. The Poisson processes that lead to this variability are described, and illustrative examples are given of the potentially large confidence intervals that can arise from faecal egg counts calculated from the observations on a McMaster slide. Attempts to modify the McMaster technique, or indeed other quantitative techniques, to ensure uniform egg counts are doomed to failure and belie ignorance of Poisson processes. A simple method to immediately identify excess variation/poor sampling from replicate counts is provided. Copyright © 2012 Elsevier B.V. All rights reserved.
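
    As a concrete illustration of the Poisson variability described above, the sketch below computes an exact (Garwood) confidence interval for a single McMaster count and scales it to eggs per gram. The multiplication factor of 50 is a commonly used convention and is an assumption here, not a value taken from the paper.

```python
from scipy.stats import chi2

def poisson_exact_ci(count, conf=0.95):
    """Exact (Garwood) confidence interval for a Poisson mean,
    given a single observed count."""
    alpha = 1.0 - conf
    lower = 0.0 if count == 0 else 0.5 * chi2.ppf(alpha / 2, 2 * count)
    upper = 0.5 * chi2.ppf(1 - alpha / 2, 2 * (count + 1))
    return lower, upper

# Suppose 12 eggs are counted on a McMaster slide and each counted egg
# corresponds to 50 eggs per gram (a common convention, assumed here).
multiplication_factor = 50
count = 12
lo, hi = poisson_exact_ci(count)
print(f"EPG estimate: {count * multiplication_factor}")
print(f"95% CI: {lo * multiplication_factor:.0f} - {hi * multiplication_factor:.0f} EPG")
```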

  9. Cluster analysis of commercial samples of Bauhinia spp. using HPLC-UV/PDA and MCR-ALS/PCA without peak alignment procedure.

    Science.gov (United States)

    Ardila, Jorge Armando; Funari, Cristiano Soleo; Andrade, André Marques; Cavalheiro, Alberto José; Carneiro, Renato Lajarim

    2015-01-01

    Bauhinia forficata Link. is recognised by the Brazilian Health Ministry as a treatment of hypoglycemia and diabetes. Analytical methods are useful to assess the plant identity due to the similarities found in plants from Bauhinia spp. HPLC-UV/PDA in combination with chemometric tools is an alternative widely used and suitable for authentication of plant material; however, the shifts of retention times for similar compounds in different samples are a problem. To perform comparisons between the authentic medicinal plant (Bauhinia forficata Link.) and samples commercially available in drugstores claiming to be "Bauhinia spp. to treat diabetes" and to evaluate the performance of multivariate curve resolution - alternating least squares (MCR-ALS) associated to principal component analysis (PCA) when compared to pure PCA. HPLC-UV/PDA data obtained from extracts of leaves were evaluated employing a combination of MCR-ALS and PCA, which allowed the use of the full chromatographic and spectrometric information without the need of peak alignment procedures. The use of MCR-ALS/PCA showed better results than the conventional PCA using only one wavelength. Only two of nine commercial samples presented characteristics similar to the authentic Bauhinia forficata spp., considering the full HPLC-UV/PDA data. The combination of MCR-ALS and PCA is very useful when applied to a group of samples where a general alignment procedure could not be applied due to the different chromatographic profiles. This work also demonstrates the need of more strict control from the health authorities regarding herbal products available on the market. Copyright © 2015 John Wiley & Sons, Ltd.

  10. High resolution 4-D spectroscopy with sparse concentric shell sampling and FFT-CLEAN.

    Science.gov (United States)

    Coggins, Brian E; Zhou, Pei

    2008-12-01

    Recent efforts to reduce the measurement time for multidimensional NMR experiments have fostered the development of a variety of new procedures for sampling and data processing. We recently described concentric ring sampling for 3-D NMR experiments, which is superior to radial sampling as input for processing by a multidimensional discrete Fourier transform. Here, we report the extension of this approach to 4-D spectroscopy as Randomized Concentric Shell Sampling (RCSS), where sampling points for the indirect dimensions are positioned on concentric shells, and where random rotations in the angular space are used to avoid coherent artifacts. With simulations, we show that RCSS produces a very low level of artifacts, even with a very limited number of sampling points. The RCSS sampling patterns can be adapted to fine rectangular grids to permit use of the Fast Fourier Transform in data processing, without an apparent increase in the artifact level. These artifacts can be further reduced to the noise level using the iterative CLEAN algorithm developed in radioastronomy. We demonstrate these methods on the high resolution 4-D HCCH-TOCSY spectrum of protein G's B1 domain, using only 1.2% of the sampling that would be needed conventionally for this resolution. The use of a multidimensional FFT instead of the slow DFT for initial data processing and for subsequent CLEAN significantly reduces the calculation time, yielding an artifact level that is on par with the level of the true spectral noise.
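
    A rough sketch of the shell-based sampling idea (not the authors' implementation) is shown below: points are spread over concentric shells in the three indirect dimensions, each shell receives an independent random rotation, and the points are snapped to a rectangular acquisition grid so that an FFT can later be used. The Fibonacci-sphere spread, the grid size and the folding into the positive octant are illustrative choices.

```python
import numpy as np

def random_rotation(rng):
    """Haar-uniform random 3x3 rotation matrix via QR of a Gaussian matrix."""
    q, r = np.linalg.qr(rng.normal(size=(3, 3)))
    q = q * np.sign(np.diag(r))       # fix the sign ambiguity of the QR factors
    if np.linalg.det(q) < 0:          # ensure a proper rotation (det = +1)
        q[:, 0] *= -1
    return q

def concentric_shell_samples(n_shells, pts_per_shell, grid_max, seed=0):
    """Sketch of shell-based sparse sampling for three indirect dimensions:
    points are spread over each spherical shell, every shell gets an independent
    random rotation, and points are snapped to the integer grid [0, grid_max]^3
    (negative coordinates are folded into the positive octant)."""
    rng = np.random.default_rng(seed)
    # Fibonacci-sphere construction: a roughly even spread of unit vectors
    i = np.arange(pts_per_shell)
    z = 1.0 - 2.0 * (i + 0.5) / pts_per_shell
    phi = np.pi * (1.0 + 5 ** 0.5) * i
    unit = np.column_stack((np.sqrt(1 - z ** 2) * np.cos(phi),
                            np.sqrt(1 - z ** 2) * np.sin(phi),
                            z))
    samples = set()
    for k in range(1, n_shells + 1):
        radius = k * grid_max / n_shells
        pts = unit @ random_rotation(rng).T * radius
        pts = np.abs(np.rint(pts)).astype(int)          # snap to the grid
        pts = pts[(pts <= grid_max).all(axis=1)]
        samples.update(map(tuple, pts))
    return sorted(samples)

pts = concentric_shell_samples(n_shells=8, pts_per_shell=64, grid_max=32)
print(f"{len(pts)} grid points selected out of {(32 + 1) ** 3} on the full grid")
```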

  11. Use of Monte Carlo Bootstrap Method in the Analysis of Sample Sufficiency for Radioecological Data

    International Nuclear Information System (INIS)

    Silva, A. N. C. da; Amaral, R. S.; Araujo Santos Jr, J.; Wilson Vieira, J.; Lima, F. R. de A.

    2015-01-01

    There are operational difficulties in obtaining samples for radioecological studies. Population data may no longer be available during the study and obtaining new samples may not be possible. These problems sometimes force the researcher to work with a small number of data points, and it is then difficult to know whether the number of samples will be sufficient to estimate the desired parameter. Hence, an analysis of sample sufficiency is critical. Classical statistical methods are not well suited to analysing sample sufficiency in radioecology, because naturally occurring radionuclides have a random distribution in soil, and outliers and gaps with missing values usually arise. The present work was developed to apply the Monte Carlo bootstrap method to the analysis of sample sufficiency, with quantitative estimation of a single variable such as the specific activity of a natural radioisotope present in plants. The pseudo-population was a small sample of 14 values of the specific activity of 226Ra in forage palm (Opuntia spp.). A computational procedure to calculate the required number of sample values was implemented in the R software. The resampling process with replacement took the 14 values of the original sample and produced 10,000 bootstrap samples for each round. The estimated average θ was then calculated for samples with 2, 5, 8, 11 and 14 randomly selected values. The results showed that if the researcher works with only 11 sample values, the average parameter will be within the confidence interval with 90% probability. (Author)
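
    The abstract describes resampling with replacement from a small pseudo-population and estimating the mean for increasing subsample sizes; the study itself used R, but the same idea can be sketched in Python. The activity values below are hypothetical placeholders, not the study's data.

```python
import numpy as np

def bootstrap_mean_ci(data, n_sub, n_boot=10_000, conf=0.90, seed=0):
    """Bootstrap (resampling with replacement) estimate of the mean and its
    percentile confidence interval when only `n_sub` values are available."""
    rng = np.random.default_rng(seed)
    data = np.asarray(data, dtype=float)
    means = np.empty(n_boot)
    for b in range(n_boot):
        subsample = rng.choice(data, size=n_sub, replace=True)
        means[b] = subsample.mean()
    lo, hi = np.percentile(means, [(1 - conf) / 2 * 100, (1 + conf) / 2 * 100])
    return means.mean(), (lo, hi)

# Hypothetical specific activities (Bq/kg) standing in for a 14-value sample
activities = [12.1, 9.8, 15.3, 11.0, 8.7, 14.2, 10.5,
              13.9, 9.1, 12.8, 11.7, 10.2, 13.1, 9.5]
for n_sub in (2, 5, 8, 11, 14):
    est, (lo, hi) = bootstrap_mean_ci(activities, n_sub)
    print(f"n = {n_sub:2d}: mean ~ {est:5.2f}, 90% interval [{lo:5.2f}, {hi:5.2f}]")
```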

  12. Gram-negative and -positive bacteria differentiation in blood culture samples by headspace volatile compound analysis.

    Science.gov (United States)

    Dolch, Michael E; Janitza, Silke; Boulesteix, Anne-Laure; Graßmann-Lichtenauer, Carola; Praun, Siegfried; Denzer, Wolfgang; Schelling, Gustav; Schubert, Sören

    2016-12-01

    Identification of microorganisms in positive blood cultures still relies on standard techniques such as Gram staining followed by culturing with definite microorganism identification. Alternatively, matrix-assisted laser desorption/ionization time-of-flight mass spectrometry or the analysis of headspace volatile compound (VC) composition produced by cultures can help to differentiate between microorganisms under experimental conditions. This study assessed the efficacy of volatile compound based microorganism differentiation into Gram-negatives and -positives in unselected positive blood culture samples from patients. Headspace gas samples of positive blood culture samples were transferred to sterilized, sealed, and evacuated 20 ml glass vials and stored at -30 °C until batch analysis. Headspace gas VC content analysis was carried out via an auto sampler connected to an ion-molecule reaction mass spectrometer (IMR-MS). Measurements covered a mass range from 16 to 135 u including CO2, H2, N2, and O2. Prediction rules for microorganism identification based on VC composition were derived using a training data set and evaluated using a validation data set within a random split validation procedure. One-hundred-fifty-two aerobic samples growing 27 Gram-negatives, 106 Gram-positives, and 19 fungi and 130 anaerobic samples growing 37 Gram-negatives, 91 Gram-positives, and two fungi were analysed. In anaerobic samples, ten discriminators were identified by the random forest method allowing for bacteria differentiation into Gram-negative and -positive (error rate: 16.7 % in validation data set). For aerobic samples the error rate was not better than random. In anaerobic blood culture samples of patients IMR-MS based headspace VC composition analysis facilitates bacteria differentiation into Gram-negative and -positive.
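
    A minimal sketch of the derivation/validation scheme described above (a random split into training and validation sets, a random forest classifier, and ranking of the most discriminating masses) is given below using scikit-learn; the data are synthetic stand-ins, not IMR-MS measurements.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-in data: one intensity feature per nominal mass (16-135 u)
# and a binary label (0 = Gram-negative, 1 = Gram-positive).
n_samples, masses = 130, np.arange(16, 136)
X = rng.lognormal(mean=0.0, sigma=1.0, size=(n_samples, masses.size))
y = rng.integers(0, 2, size=n_samples)

# Random split validation: derive the classifier on a training set and
# estimate the error rate on a held-out validation set.
X_train, X_val, y_train, y_val = train_test_split(X, y, test_size=0.3, random_state=0)
forest = RandomForestClassifier(n_estimators=500, random_state=0)
forest.fit(X_train, y_train)
error_rate = 1.0 - forest.score(X_val, y_val)

# Rank masses by importance to pick candidate discriminators.
top = masses[np.argsort(forest.feature_importances_)[::-1][:10]]
print(f"validation error rate: {error_rate:.2f}; top-10 candidate masses: {top}")
```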

  13. Implementing self sustained quality control procedures in a clinical laboratory.

    Science.gov (United States)

    Khatri, Roshan; K C, Sanjay; Shrestha, Prabodh; Sinha, J N

    2013-01-01

    Quality control is an essential component of every clinical laboratory: it maintains the excellence of laboratory standards, supports proper disease diagnosis and patient care, and thereby strengthens the health care system as a whole. Numerous quality control schemes are available, with combinations of procedures, most of which are tedious, time consuming and can be "too technical", whereas commercially available quality control materials can be expensive, especially for laboratories in developing nations like Nepal. Here, we present a procedure performed at our centre with self-prepared control serum and the use of simple statistical tools for quality assurance. The pooled serum was prepared as per the guidelines for preparation of stabilized liquid quality control serum from human sera. Internal quality assessment was performed on this sample on a daily basis and included measurement of 12 routine biochemical parameters. The results were plotted on Levey-Jennings charts and analysed with quality control rules over a period of one month. The mean levels of biochemical analytes in the self-prepared control serum were within the normal physiological range. This serum was evaluated every day along with patients' samples. The results obtained were plotted on control charts and analysed using common quality control rules to identify possible systematic and random errors. Immediate mitigation measures were taken and the dispatch of erroneous reports was avoided. In this study we highlight a simple internal quality control procedure which can be performed by laboratories with minimum technology, expenditure and expertise, improving the reliability and validity of test reports.
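
    The daily evaluation against Levey-Jennings limits can be sketched as follows; the 1-2s and 1-3s rules used here are common Westgard-style rules, and the control values and limits are hypothetical rather than the study's data.

```python
import numpy as np

def qc_flags(values, target_mean, target_sd):
    """Flag daily control results against two common Levey-Jennings rules:
    1-2s (warning: one result beyond +/-2 SD) and 1-3s (rejection: beyond +/-3 SD)."""
    z = (np.asarray(values, dtype=float) - target_mean) / target_sd
    flags = []
    for day, zi in enumerate(z, start=1):
        if abs(zi) > 3:
            flags.append((day, "1-3s violated: reject run"))
        elif abs(zi) > 2:
            flags.append((day, "1-2s violated: warning"))
    return flags

# Hypothetical glucose results (mmol/L) on a pooled control serum, with a
# target mean and SD established during an initial evaluation period.
control_results = [5.1, 5.0, 5.3, 4.9, 5.2, 5.6, 5.0, 4.8, 5.9, 5.1]
for day, message in qc_flags(control_results, target_mean=5.1, target_sd=0.2):
    print(f"day {day}: {message}")
```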

  14. Rapid, easy, and cheap randomization: prospective evaluation in a study cohort

    Directory of Open Access Journals (Sweden)

    Parker Melissa J

    2012-06-01

    Full Text Available Abstract Background When planning a randomized controlled trial (RCT), investigators must select randomization and allocation procedures based upon a variety of factors. While third party randomization is cited as being among the most desirable randomization processes, many third party randomization procedures are neither feasible nor cost-effective for small RCTs, including pilot RCTs. In this study we present our experience with a third party randomization and allocation procedure that utilizes current technology to achieve randomization in a rapid, reliable, and cost-effective manner. Methods This method was developed by the investigators for use in a small 48-participant parallel group RCT with four study arms. As a nested study, the reliability of this randomization procedure was prospectively evaluated in this cohort. The primary outcome of this nested study was the proportion of subjects for whom allocation information was obtained by the Research Assistant within 15 min of the initial participant randomization request. A secondary outcome was the average time for communicating participant group assignment back to the Research Assistant. Descriptive information regarding any failed attempts at participant randomization as well as costs attributable to use of this method were also recorded. Statistical analyses included the calculation of simple proportions and descriptive statistics. Results Forty-eight participants were successfully randomized and group allocation instruction was received for 46 (96%) within 15 min of the Research Assistant placing the initial randomization request. Time elapsed in minutes until receipt of participant allocation instruction was Mean (SD) 3.1 +/− 3.6; Median (IQR) 2 (2, 3); Range (1–20) for the entire cohort of 48. For the two participants for whom group allocation information was not received by the Research Assistant within the 15-min pass threshold, this information was obtained following a second

  15. Summative Mass Analysis of Algal Biomass - Integration of Analytical Procedures: Laboratory Analytical Procedure (LAP)

    Energy Technology Data Exchange (ETDEWEB)

    Laurens, Lieve M. L.

    2016-01-13

    This procedure guides the integration of laboratory analytical procedures to measure algal biomass constituents in an unambiguous manner and ultimately achieve mass balance closure for algal biomass samples. Many of these methods build on years of research in algal biomass analysis.

  16. Standard practices for sampling uranium-Ore concentrate

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2010-01-01

    1.1 These practices are intended to provide the nuclear industry with procedures for obtaining representative bulk samples from uranium-ore concentrates (UOC) (see Specification C967). 1.2 These practices also provide for obtaining a series of representative secondary samples from the original bulk sample for the determination of moisture and other test purposes, and for the preparation of pulverized analytical samples (see Test Methods C1022). 1.3 These practices consist of a number of alternative procedures for sampling and sample preparation which have been shown to be satisfactory through long experience in the nuclear industry. These procedures are described in the following order, with the relevant section numbers in parentheses. Primary sampling: one-stage falling stream (4), two-stage falling stream (5), auger (6). Secondary sampling: straight-path, reciprocating (7); rotating, Vezin (8, 9). Sample preparation (10): concurrent-drying (11-13), natural moisture (14-16), calcination (17, 18). Sample packaging (19): wax s...

  17. Early detection of structural changes in random signal

    International Nuclear Information System (INIS)

    Kuroda, Yoshiteru; Yokota, Katsuhiro

    1981-01-01

    Early detection of structural changes in an observed random signal is very important from the point of view of system diagnosis. In this paper, the following procedures are applied to this problem and the results are compared: (1) fitting an auto-regressive model to the random signal and calculating the prediction error, i.e., the difference between observed and predicted values; (2) an auto-regressive method calculating the sum of the prediction errors; (3) a method based on AIC (Akaike Information Criterion). Simulations of these procedures are made, indicating their merits and demerits as diagnostic tools. (author)
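
    A simple version of procedure (1), fitting an auto-regressive model on a reference segment and monitoring the one-step prediction error on new data, might look like the following sketch; the AR order, the signal model and the injected change are illustrative assumptions, not the paper's setup.

```python
import numpy as np

def fit_ar(x, order):
    """Least-squares fit of an AR(p) model: x[t] ~ sum_k a[k] * x[t-1-k]."""
    x = np.asarray(x, dtype=float)
    X = np.column_stack([x[order - 1 - k:len(x) - 1 - k] for k in range(order)])
    coeffs, *_ = np.linalg.lstsq(X, x[order:], rcond=None)
    return coeffs

def one_step_errors(x, coeffs):
    """One-step-ahead prediction errors of a previously fitted AR model."""
    x = np.asarray(x, dtype=float)
    p = len(coeffs)
    errs = []
    for t in range(p, len(x)):
        lags = np.array([x[t - 1 - k] for k in range(p)])
        errs.append(x[t] - coeffs @ lags)
    return np.array(errs)

rng = np.random.default_rng(3)
# Reference segment: an AR(2)-like signal; test segment: same model plus a
# disturbance injected halfway through (a simulated structural change).
ref = rng.normal(size=500)
for t in range(2, len(ref)):
    ref[t] += 0.6 * ref[t - 1] - 0.3 * ref[t - 2]
test = ref.copy()
test[250:] += rng.normal(scale=3.0, size=250)

coeffs = fit_ar(ref, order=2)
errs = one_step_errors(test, coeffs)
# A sharp rise in the cumulative squared prediction error flags the change.
cumulative = np.cumsum(errs ** 2)
print("cumulative error at t=250 vs t=500:", cumulative[248], cumulative[-1])
```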

  18. A randomized controlled trial of electrocoagulation-enabled biopsy versus conventional biopsy in the diagnosis of endobronchial lesions.

    Science.gov (United States)

    Khan, Ajmal; Aggarwal, Ashutosh N; Agarwal, Ritesh; Bal, Amanjit; Gupta, Dheeraj

    2011-01-01

    Although electrocoagulation at time of endobronchial biopsy can potentially reduce procedure-related bleeding during fiberoptic bronchoscopy (FOB), it can also impair quality of tissue specimen; credible data for either are lacking. To evaluate the impact of hot biopsy on the quality of tissue samples and to quantify the amount of procedure-related bleeding during endobronchial biopsy. In this single-center, prospective, single-blind, randomized controlled study we included adult patients referred for FOB and having endobronchial lesions. Patients were randomized to bronchial biopsy using an electrocoagulation-enabled biopsy forceps, with (EC+ group) or without (EC- group) application of electrocoagulation current (40 W for 10 s in a monopolar mode). Procedure-related bleeding was semi-quantified by observer description, as well as through a visual analogue scale. Overall quality of biopsy specimen and tissue damage were assessed and graded by a pulmonary pathologist blinded to FOB details. 160 patients were randomized to endobronchial biopsy with (n = 81) or without (n = 79) the application of electrocoagulation. There were no severe bleeding episodes in either group, and severity of bleeding in the EC+ and EC- groups was similar (median visual analogue scale scores of 14 and 16, respectively). Histopathological diagnosis was similar in the EC+ and EC- groups (77.8% and 82.3%, respectively). There was no significant difference in tissue quality between the two groups. Use of electrocoagulation-enabled endobronchial biopsy does not alter specimen quality and does not result in any significant reduction in procedure-related bleeding. Copyright © 2010 S. Karger AG, Basel.

  19. Does self-selection affect samples' representativeness in online surveys? An investigation in online video game research.

    Science.gov (United States)

    Khazaal, Yasser; van Singer, Mathias; Chatton, Anne; Achab, Sophia; Zullino, Daniele; Rothen, Stephane; Khan, Riaz; Billieux, Joel; Thorens, Gabriel

    2014-07-07

    The number of medical studies performed through online surveys has increased dramatically in recent years. Despite their numerous advantages (eg, sample size, facilitated access to individuals presenting stigmatizing issues), selection bias may exist in online surveys. However, evidence on the representativeness of self-selected samples in online studies is patchy. Our objective was to explore the representativeness of a self-selected sample of online gamers using online players' virtual characters (avatars). All avatars belonged to individuals playing World of Warcraft (WoW), currently the most widely used online game. Avatars' characteristics were defined using various games' scores, reported on the WoW's official website, and two self-selected samples from previous studies were compared with a randomly selected sample of avatars. We used scores linked to 1240 avatars (762 from the self-selected samples and 478 from the random sample). The two self-selected samples of avatars had higher scores on most of the assessed variables (except for guild membership and exploration). Furthermore, some guilds were overrepresented in the self-selected samples. Our results suggest that more proficient players or players more involved in the game may be more likely to participate in online surveys. Caution is needed in the interpretation of studies based on online surveys that used a self-selection recruitment procedure. Epidemiological evidence on the reduced representativeness of sample of online surveys is warranted.

  20. Different Analytical Procedures for the Study of Organic Residues in Archeological Ceramic Samples with the Use of Gas Chromatography-mass Spectrometry.

    Science.gov (United States)

    Kałużna-Czaplińska, Joanna; Rosiak, Angelina; Kwapińska, Marzena; Kwapiński, Witold

    2016-01-01

    The analysis of the composition of organic residues present in pottery is an important source of information for historians and archeologists. Chemical characterization of the materials provides information on diets, habits, technologies, and original use of the vessels. This review presents the problem of analytical studies of archeological materials with a special emphasis on organic residues. Current methods used in the determination of different organic compounds in archeological ceramics are presented. Particular attention is paid to the procedures of analysis of archeological ceramic samples used before gas chromatography-mass spectrometry. Advantages and disadvantages of different extraction methods and application of proper quality assurance/quality control procedures are discussed.

  1. Stratified random sampling plans designed to assist in the determination of radon and radon daughter concentrations in underground uranium mine atmosphere

    International Nuclear Information System (INIS)

    Makepeace, C.E.

    1981-01-01

    Sampling strategies for the monitoring of deleterious agents present in uranium mine air in underground and surface mining areas are described. These methods are designed to prevent overexposure of the lining of the respiratory system of uranium miners to ionizing radiation from radon and radon daughters, and whole body overexposure to external gamma radiation. A detailed description is provided of stratified random sampling monitoring methodology for obtaining baseline data to be used as a reference for subsequent compliance assessment
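
    The allocation step of such a stratified random sampling plan can be sketched as proportional allocation followed by simple random selection without replacement within each stratum; the strata and location labels below are hypothetical, not taken from the report.

```python
import numpy as np

def stratified_sample(strata, total_n, seed=0):
    """Proportional-allocation stratified random sampling. `strata` maps a
    stratum label to the list of its sampling units (e.g. work locations);
    rounding means the realized total can differ slightly from total_n."""
    rng = np.random.default_rng(seed)
    pop = sum(len(units) for units in strata.values())
    chosen = {}
    for label, units in strata.items():
        n_h = max(1, round(total_n * len(units) / pop))   # at least one per stratum
        chosen[label] = list(rng.choice(units, size=min(n_h, len(units)), replace=False))
    return chosen

# Hypothetical strata of underground work locations
strata = {
    "stopes":       [f"S{i:02d}" for i in range(40)],
    "haulage ways": [f"H{i:02d}" for i in range(25)],
    "shops":        [f"W{i:02d}" for i in range(10)],
}
print(stratified_sample(strata, total_n=15))
```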

  2. The quality of the reported sample size calculations in randomized controlled trials indexed in PubMed.

    Science.gov (United States)

    Lee, Paul H; Tse, Andy C Y

    2017-05-01

    There are limited data on the quality of reporting of information essential for replication of the calculation as well as the accuracy of the sample size calculation. We examine the current quality of reporting of the sample size calculation in randomized controlled trials (RCTs) published in PubMed and examine the variation in reporting across study design, study characteristics, and journal impact factor. We also reviewed the targeted sample size reported in trial registries. We reviewed and analyzed all RCTs published in December 2014 in journals indexed in PubMed. The 2014 impact factors of the journals were used as proxies for their quality. Of the 451 analyzed papers, 58.1% reported an a priori sample size calculation. Nearly all papers provided the level of significance (97.7%) and desired power (96.6%), and most of the papers reported the minimum clinically important effect size (73.3%). The median of the percentage difference between the reported and recalculated sample sizes was 0.0% (inter-quartile range -4.6% to 3.0%). The accuracy of the reported sample size was better for studies published in journals that endorsed the CONSORT statement and journals with an impact factor. A total of 98 papers provided a targeted sample size in trial registries and about two-thirds of these papers (n=62) reported a sample size calculation, but only 25 (40.3%) had no discrepancy with the number reported in the trial registries. The reporting of the sample size calculation in RCTs published in PubMed-indexed journals and trial registries was poor. The CONSORT statement should be more widely endorsed. Copyright © 2016 European Federation of Internal Medicine. Published by Elsevier B.V. All rights reserved.
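
    For reference, an a priori sample size calculation of the kind the abstract discusses needs only the significance level, the desired power and the minimum clinically important effect size. A minimal sketch for comparing two means with a standardized effect size is shown below; this is the standard normal-approximation formula, not the review's own procedure.

```python
from math import ceil
from scipy.stats import norm

def n_per_group(effect_size, alpha=0.05, power=0.80):
    """Sample size per arm for a two-sided comparison of two means,
    given a standardized effect size (difference / SD)."""
    z_alpha = norm.ppf(1 - alpha / 2)
    z_beta = norm.ppf(power)
    return ceil(2 * (z_alpha + z_beta) ** 2 / effect_size ** 2)

# e.g. to detect a standardized difference of 0.5 with 80% power at alpha = 0.05
print(n_per_group(0.5))   # about 63 participants per group
```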

  3. Sampling method of environmental radioactivity monitoring

    International Nuclear Information System (INIS)

    1984-01-01

    This manual provides sampling methods for environmental samples of airborne dust, precipitated dust, precipitated water (rain or snow), fresh water, soil, river or lake sediment, water discharged from a nuclear facility, grains, tea, milk, pasture grass, limnetic organisms, daily diet, index organisms, sea water, marine sediment and marine organisms, as well as sampling for tritium and radioiodine determination, for radiation monitoring of radioactive fallout or radioactivity released by nuclear facilities. The manual aims to present standard sampling procedures for environmental radioactivity monitoring regardless of the monitoring objectives, and describes preservation methods for environmental samples (other than human body samples) acquired at the sampling point for radiation counting. The sampling techniques adopted in this manual were chosen on the criteria that they are suitable for routine monitoring and require no special skill. On this principle, the manual presents the outline and aims of sampling, the sampling position or object, sampling quantity, apparatus, equipment or vessel for sampling, sampling location, sampling procedures, pretreatment and preparation of a sample for radiation counting, necessary recording items for sampling, and sample transportation procedures. Special attention is given in the chapter on tritium and radioiodine, because these radionuclides might be lost under the sample preservation methods intended for radiation counting of less volatile radionuclides. (Takagi, S.)

  4. Randomized controlled trial of attention bias modification in a racially diverse, socially anxious, alcohol dependent sample.

    Science.gov (United States)

    Clerkin, Elise M; Magee, Joshua C; Wells, Tony T; Beard, Courtney; Barnett, Nancy P

    2016-12-01

    Attention biases may be an important treatment target for both alcohol dependence and social anxiety. This is the first ABM trial to investigate two (vs. one) targets of attention bias within a sample with co-occurring symptoms of social anxiety and alcohol dependence. Additionally, we used trial-level bias scores (TL-BS) to capture the phenomena of attention bias in a more ecologically valid, dynamic way compared to traditional attention bias scores. Adult participants (N = 86; 41% Female; 52% African American; 40% White) with elevated social anxiety symptoms and alcohol dependence were randomly assigned to an 8-session training condition in this 2 (Social Anxiety ABM vs. Social Anxiety Control) by 2 (Alcohol ABM vs. Alcohol Control) design. Symptoms of social anxiety, alcohol dependence, and attention bias were assessed across time. Multilevel models estimated the trajectories for each measure within individuals, and tested whether these trajectories differed according to the randomized training conditions. Across time, there were significant or trending decreases in all attention TL-BS parameters (but not traditional attention bias scores) and most symptom measures. However, there were not significant differences in the trajectories of change between any ABM and control conditions for any symptom measures. These findings add to previous evidence questioning the robustness of ABM and point to the need to extend the effects of ABM to samples that are racially diverse and/or have co-occurring psychopathology. The results also illustrate the potential importance of calculating trial-level attention bias scores rather than only including traditional bias scores. Copyright © 2016 Elsevier Ltd. All rights reserved.

  5. Randomized Controlled Trial of Attention Bias Modification in a Racially Diverse, Socially Anxious, Alcohol Dependent Sample

    Science.gov (United States)

    Clerkin, Elise M.; Magee, Joshua C.; Wells, Tony T.; Beard, Courtney; Barnett, Nancy P.

    2016-01-01

    Objective Attention biases may be an important treatment target for both alcohol dependence and social anxiety. This is the first ABM trial to investigate two (vs. one) targets of attention bias within a sample with co-occurring symptoms of social anxiety and alcohol dependence. Additionally, we used trial-level bias scores (TL-BS) to capture the phenomena of attention bias in a more ecologically valid, dynamic way compared to traditional attention bias scores. Method Adult participants (N=86; 41% Female; 52% African American; 40% White) with elevated social anxiety symptoms and alcohol dependence were randomly assigned to an 8-session training condition in this 2 (Social Anxiety ABM vs. Social Anxiety Control) by 2 (Alcohol ABM vs. Alcohol Control) design. Symptoms of social anxiety, alcohol dependence, and attention bias were assessed across time. Results Multilevel models estimated the trajectories for each measure within individuals, and tested whether these trajectories differed according to the randomized training conditions. Across time, there were significant or trending decreases in all attention TL-BS parameters (but not traditional attention bias scores) and most symptom measures. However, there were not significant differences in the trajectories of change between any ABM and control conditions for any symptom measures. Conclusions These findings add to previous evidence questioning the robustness of ABM and point to the need to extend the effects of ABM to samples that are racially diverse and/or have co-occurring psychopathology. The results also illustrate the potential importance of calculating trial-level attention bias scores rather than only including traditional bias scores. PMID:27591918

  6. States and state-preparing procedures in quantum mechanics

    International Nuclear Information System (INIS)

    Benioff, P.A.; Ekstein, Hans

    D'Espagnat and others have shown that different preparation procedures that mix systems prepared in inequivalent states, and which are therefore objectively different, are nevertheless assigned the same state. This unpalatable result follows from the usual interpretative rules of quantum mechanics. It is shown here that this result is incompatible with the strengthened interpretative rules (requiring randomness of the measurement outcome sequence) recently proposed. Thus, the randomness requirement restores reasonableness.

  7. Minimization of the blank values in the neutron activation analysis of biological samples considering the whole procedure

    International Nuclear Information System (INIS)

    Lux, F.; Bereznai, T.; Trebert Haeberlin, S.

    1987-01-01

    During the determination of trace element contents of animal tissue by neutron activation analysis, in the course of structure-activity relationship studies on platinum-containing cancer drugs and wound healing, the authors tried to minimize the blank values that are caused by different sources of contamination during surgery, sampling and the activation analysis procedure. The following topics were investigated: abrasions from scalpels made of stainless steel, titanium or quartz, the type of surgery, the surface contaminations of the quartz ampoules, etc. The measures to be taken to reduce the blank values are described. (author) 19 refs.; 4 tables

  8. A Computerized Approach to Trickle-Process, Random Assignment.

    Science.gov (United States)

    Braucht, G. Nicholas; Reichardt, Charles S.

    1993-01-01

    Procedures for implementing random assignment with trickle processing and ways they can be corrupted are described. A computerized method for implementing random assignment with trickle processing is presented as a desirable alternative in many situations and a way of protecting against threats to assignment validity. (SLD)
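
    One common way to implement random assignment under trickle processing, sketched below, is to pre-generate a permuted-block allocation list and reveal one entry per enrolling participant. This is a generic illustration under assumed arm labels and block sizes, not the specific computerized method of the paper.

```python
import random

def permuted_block_list(arms, n_blocks, block_size=None, seed=2012):
    """Pre-generated allocation sequence using permuted blocks; with trickle
    processing, entries are consumed one at a time as participants enrol."""
    block_size = block_size or len(arms)
    assert block_size % len(arms) == 0, "block size must be a multiple of the number of arms"
    rng = random.Random(seed)            # in practice the seed/list is held by a third party
    sequence = []
    for _ in range(n_blocks):
        block = arms * (block_size // len(arms))
        rng.shuffle(block)
        sequence.extend(block)
    return sequence

allocation = iter(permuted_block_list(["A", "B", "C", "D"], n_blocks=12, block_size=4))
# As each participant enrols, only the next unused entry is revealed to the
# recruiting researcher, which protects the upcoming assignments.
first_five = [next(allocation) for _ in range(5)]
print(first_five)
```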

  9. Equating Multidimensional Tests under a Random Groups Design: A Comparison of Various Equating Procedures

    Science.gov (United States)

    Lee, Eunjung

    2013-01-01

    The purpose of this research was to compare the equating performance of various equating procedures for the multidimensional tests. To examine the various equating procedures, simulated data sets were used that were generated based on a multidimensional item response theory (MIRT) framework. Various equating procedures were examined, including…

  10. EML procedures manual

    International Nuclear Information System (INIS)

    Volchok, H.L.; de Planque, G.

    1982-01-01

    This manual contains the procedures that are used currently by the Environmental Measurements Laboratory of the US Department of Energy. In addition a number of analytical methods from other laboratories have been included. These were tested for reliability at the Battelle, Pacific Northwest Laboratory under contract with the Division of Biomedical and Environmental Research of the AEC. These methods are clearly distinguished. The manual is prepared in loose leaf form to facilitate revision of the procedures and inclusion of additional procedures or data sheets. Anyone receiving the manual through EML should receive this additional material automatically. The contents are as follows: (1) general; (2) sampling; (3) field measurements; (4) general analytical chemistry; (5) chemical procedures; (6) data section; (7) specifications

  11. A novel solid phase extraction procedure on Amberlite XAD-1180 for speciation of Cr(III), Cr(VI) and total chromium in environmental and pharmaceutical samples

    International Nuclear Information System (INIS)

    Narin, Ibrahim; Kars, Ayse; Soylak, Mustafa

    2008-01-01

    Because the toxicity of chromium species depends on their chemical properties and bioavailabilities, the speciation of chromium in environmental samples is very important. A speciation procedure for chromium(III), chromium(VI) and total chromium in environmental samples is presented in this work, prior to flame atomic absorption spectrometric determination of chromium. The procedure is based on the adsorption of the Cr(III)-diphenylcarbazone complex on Amberlite XAD-1180 resin. After oxidation of Cr(III), the developed solid phase extraction system was applied to determine the total chromium. Cr(III) was calculated as the difference between the total Cr content and the Cr(VI) content. The analytical conditions for the quantitative recoveries of Cr(VI) on Amberlite XAD-1180 resin were investigated. The effects of some alkaline, earth alkaline and metal ions and also some anions were examined. The preconcentration factor was found to be 75. The detection limits (LOD), based on three times the sigma of the blank (N: 21), for Cr(VI) and total chromium were 7.7 and 8.6 μg/L, respectively. Satisfactory results were obtained for the analysis of total chromium in the stream sediment (GBW7310) certified reference material used for the validation of the presented method. The procedure was applied successfully to food, water and pharmaceutical samples.

  12. Computer generation of random deviates

    International Nuclear Information System (INIS)

    Cormack, John

    1991-01-01

    The need for random deviates arises in many scientific applications. In medical physics, Monte Carlo simulations have been used in radiology, radiation therapy and nuclear medicine. Specific instances include the modelling of x-ray scattering processes and the addition of random noise to images or curves in order to assess the effects of various processing procedures. Reliable sources of random deviates with statistical properties indistinguishable from true random deviates are a fundamental necessity for such tasks. This paper provides a review of computer algorithms which can be used to generate uniform random deviates and other distributions of interest to medical physicists, along with a few caveats relating to various problems and pitfalls which can occur. Source code listings for the generators discussed (in FORTRAN, Turbo-PASCAL and Data General ASSEMBLER) are available on request from the authors. 27 refs., 3 tabs., 5 figs
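
    Two of the classical transformations such a review typically covers, the inverse-transform method for exponential deviates and the Box-Muller transform for normal deviates, can be written in a few lines; the sketch below uses Python rather than the FORTRAN, Turbo-PASCAL or assembler of the original listings.

```python
import math
import random

def exponential_deviate(rate):
    """Inverse-transform method: if U ~ Uniform(0,1), then -ln(U)/rate is
    exponentially distributed with the given rate."""
    u = 1.0 - random.random()          # shift to (0, 1] so log(0) cannot occur
    return -math.log(u) / rate

def normal_deviates():
    """Box-Muller transform: two independent uniforms yield two independent
    standard normal deviates."""
    u1 = 1.0 - random.random()
    u2 = random.random()
    r = math.sqrt(-2.0 * math.log(u1))
    return r * math.cos(2 * math.pi * u2), r * math.sin(2 * math.pi * u2)

random.seed(1)
print([round(exponential_deviate(rate=2.0), 3) for _ in range(3)])
print([round(x, 3) for x in normal_deviates()])
```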

  13. Wishart and anti-Wishart random matrices

    International Nuclear Information System (INIS)

    Janik, Romuald A; Nowak, Maciej A

    2003-01-01

    We provide a compact exact representation for the distribution of the matrix elements of the Wishart-type random matrices A†A, for any finite number of rows and columns of A, without any large N approximations. In particular, we treat the case when the Wishart-type random matrix contains redundant, non-random information, which is a new result. This representation is of interest for a procedure for reconstructing the redundant information hidden in Wishart matrices, with potential applications to numerous models based on biological, social and artificial intelligence networks.
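
    Numerically, a Wishart-type matrix of the kind discussed can be drawn by forming A†A from a complex Gaussian matrix A. The sketch below is a generic construction, not the paper's exact representation; choosing fewer rows than columns in A would give the rank-deficient (anti-Wishart) case with zero eigenvalues.

```python
import numpy as np

def wishart_sample(n_rows, n_cols, seed=0):
    """Draw a Wishart-type matrix W = A^dagger A, where A is a complex Gaussian
    matrix with independent N(0, 1/2) real and imaginary parts."""
    rng = np.random.default_rng(seed)
    A = (rng.normal(size=(n_rows, n_cols))
         + 1j * rng.normal(size=(n_rows, n_cols))) / np.sqrt(2)
    return A.conj().T @ A

W = wishart_sample(n_rows=8, n_cols=5)
eigvals = np.linalg.eigvalsh(W)          # real and non-negative for a Wishart matrix
print(np.allclose(W, W.conj().T), eigvals.round(3))
```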

  14. [Identification and sampling of people with migration background for epidemiological studies in Germany].

    Science.gov (United States)

    Reiss, K; Makarova, N; Spallek, J; Zeeb, H; Razum, O

    2013-06-01

    In 2009, 19.6% of the population of Germany either had migrated themselves or were the offspring of people with migration experience. Migrants differ from the autochthonous German population in terms of health status, health awareness and health behaviour. To further investigate the health situation of migrants in Germany, epidemiological studies are needed. Such studies can employ existing databases which provide detailed information on migration status. Otherwise, onomastic or toponomastic procedures can be applied to identify people with migration background. If migrants have to be recruited into an epidemiological study, this can be done register-based (e. g., data from registration offices or telephone lists), based on residential location (random-route or random-walk procedure), via snowball sampling (e. g., through key persons) or via settings (e. g., school entry examination). An oversampling of people with migration background is not sufficient to avoid systematic bias in the sample due to non-participation. Additional measures have to be taken to increase access and raise participation rates. Personal contacting, multilingual instruments, multilingual interviewers and extensive public relations increase access and willingness to participate. Empirical evidence on 'successful' recruitment strategies for studies with migrants is still lacking in epidemiology and health sciences in Germany. The choice of the recruitment strategy as well as the measures to raise accessibility and willingness to participate depend on the available resources, the research question and the specific migrant target group. © Georg Thieme Verlag KG Stuttgart · New York.

  15. A sensitive analytical procedure for monitoring acrylamide in environmental water samples by offline SPE-UPLC/MS/MS.

    Science.gov (United States)

    Togola, Anne; Coureau, Charlotte; Guezennec, Anne-Gwenaëlle; Touzé, Solène

    2015-05-01

    The presence of acrylamide in natural systems is of concern from both environmental and health points of view. We developed an accurate and robust analytical procedure (offline solid phase extraction combined with UPLC/MS/MS) with a limit of quantification (20 ng/L) compatible with toxicity threshold values. The optimized (considering the nature of extraction phases, sampling volumes, and solvent of elution) solid phase extraction (SPE) was validated according to ISO Standard ISO/IEC 17025 on groundwater, surface water, and industrial process water samples. Acrylamide is highly polar, which induces a high variability during the SPE step, therefore requiring the use of 13C-labeled acrylamide as an internal standard to guarantee the accuracy and robustness of the method (uncertainty about 25 % (k = 2) at the limit of quantification level). The specificity of the method and the stability of acrylamide were studied for these environmental media, and it was shown that the method is suitable for measuring acrylamide in environmental studies.

  16. Analysis procedure for americium in environmental samples

    International Nuclear Information System (INIS)

    Holloway, R.W.; Hayes, D.W.

    1982-01-01

    Several methods for the analysis of 241Am in environmental samples were evaluated and a preferred method was selected. This method was modified and used to determine the 241Am content in sediments, biota, and water. The advantages and limitations of the method are discussed. The method is also suitable for 244Cm analysis

  17. The principles, procedures and pitfalls in identifying archaeological and historical wood samples.

    Science.gov (United States)

    Cartwright, Caroline R

    2015-07-01

    The science of wood anatomy has evolved in recent decades to add archaeological and historical wood to its repertoire of documenting and characterizing modern and fossil woods. The increasing use of online wood anatomy databases and atlases has fostered the adoption of an international consensus regarding terminology, largely through the work of the International Association of Wood Anatomists (IAWA). This review presents an overview for the general reader of the current state of principles and procedures involved in the study of the wood anatomy of archaeological and historical specimens, some of which may be preserved through charring, waterlogging, desiccation or mineral replacement. By means of selected case studies, the review evaluates to what extent varying preservation of wood anatomical characteristics limits the level of identification to taxon. It assesses the role played by increasingly accessible scanning electron microscopes and complex optical microscopes, and whether these, on the one hand, provide exceptional opportunities for high-quality imaging and analysis of difficult samples, but, on the other hand, might be misleading the novice into thinking that advanced technology can be a substitute for specialized botanical training in wood anatomy. © The Author 2015. Published by Oxford University Press on behalf of the Annals of Botany Company. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  18. Fatigue Damage Spectrum calculation in a Mission Synthesis procedure for Sine-on-Random excitations

    International Nuclear Information System (INIS)

    Angeli, Andrea; Troncossi, Marco; Cornelis, Bram

    2016-01-01

    In many real-life environments, certain mechanical and electronic components may be subjected to Sine-on-Random vibrations, i.e. excitations composed of random vibrations superimposed on deterministic (sinusoidal) contributions, in particular sine tones due to some rotating parts of the system (e.g. helicopters, engine-mounted components,...). These components must be designed to withstand the fatigue damage induced by the “composed” vibration environment, and qualification tests are advisable for the most critical ones. In the case of an accelerated qualification test, a proper test tailoring which starts from the real environment (measured vibration signals) and which preserves not only the accumulated fatigue damage but also the “nature” of the excitation (i.e. sinusoidal components plus random process) is important to obtain reliable results. In this paper, the classic time domain approach is taken as a reference for the comparison of different methods for the Fatigue Damage Spectrum (FDS) calculation in case of Sine-on-Random vibration environments. Then, a methodology to compute a Sine-on-Random specification based on a mission FDS is proposed. (paper)
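
    The "composed" excitation itself is straightforward to construct: broadband Gaussian noise plus deterministic sine tones. The sketch below only generates such a Sine-on-Random history (the tone frequencies, amplitudes and noise RMS are illustrative); computing an FDS from it would additionally require single-degree-of-freedom response filtering and a rainflow count, which is what the paper compares across methods.

```python
import numpy as np

def sine_on_random(duration, fs, tones, noise_rms, seed=0):
    """Generate a Sine-on-Random acceleration history: broadband Gaussian noise
    (scaled to a target RMS) plus deterministic sine tones.
    `tones` is a list of (frequency_Hz, amplitude) pairs."""
    rng = np.random.default_rng(seed)
    t = np.arange(0.0, duration, 1.0 / fs)
    noise = rng.normal(size=t.size)
    noise *= noise_rms / np.std(noise)
    signal = noise.copy()
    for freq, amp in tones:
        signal += amp * np.sin(2 * np.pi * freq * t)
    return t, signal

# e.g. two rotor-harmonic tones superimposed on broadband noise
t, acc = sine_on_random(duration=10.0, fs=2048,
                        tones=[(25.0, 3.0), (50.0, 1.5)], noise_rms=2.0)
print(f"overall RMS: {np.std(acc):.2f}")
```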

  19. Revisiting sample size: are big trials the answer?

    Science.gov (United States)

    Lurati Buse, Giovanna A L; Botto, Fernando; Devereaux, P J

    2012-07-18

    The superiority of the evidence generated in randomized controlled trials over observational data is not conditional on randomization alone. Randomized controlled trials require proper design and implementation to provide a reliable effect estimate. Adequate random sequence generation, allocation implementation, analyses based on the intention-to-treat principle, and sufficient power are crucial to the quality of a randomized controlled trial. Power, or the probability that the trial will detect a difference when a real difference between treatments exists, strongly depends on sample size. The quality of orthopaedic randomized controlled trials is frequently threatened by a limited sample size. This paper reviews basic concepts and pitfalls in sample-size estimation and focuses on the importance of large trials in the generation of valid evidence.

  20. Procedures can be learned on the Web: a randomized study of ultrasound-guided vascular access training.

    Science.gov (United States)

    Chenkin, Jordan; Lee, Shirley; Huynh, Thien; Bandiera, Glen

    2008-10-01

    Web-based learning has several potential advantages over lectures, such as anytime-anywhere access, rich multimedia, and nonlinear navigation. While known to be an effective method for learning facts, few studies have examined the effectiveness of Web-based formats for learning procedural skills. The authors sought to determine whether a Web-based tutorial is at least as effective as a didactic lecture for learning ultrasound-guided vascular access (UGVA). Participating staff emergency physicians (EPs) and junior emergency medicine (EM) residents with no UGVA experience completed a precourse test and were randomized to either a Web-based or a didactic group. The Web-based group was instructed to use an online tutorial and the didactic group attended a lecture. Participants then practiced on simulators and live models without any further instruction. Following a rest period, participants completed a four-station objective structured clinical examination (OSCE), a written examination, and a postcourse questionnaire. Examination results were compared using a noninferiority data analysis with a 10% margin of difference. Twenty-one residents and EPs participated in the study. There were no significant differences in mean OSCE scores (absolute difference = -2.8%; 95% confidence interval [CI] = -9.3% to 3.8%) or written test scores (absolute difference = -1.4%; 95% CI = -7.8% to 5.0%) between the Web group and the didactic group. Both groups demonstrated similar improvements in written test scores (26.1% vs. 25.8%; p = 0.95). Ninety-one percent (10/11) of the Web group and 80% (8/10) of the didactic group participants found the teaching format to be effective (p = 0.59). Our Web-based tutorial was at least as effective as a traditional didactic lecture for teaching the knowledge and skills essential for UGVA. Participants expressed high satisfaction with this teaching technology. Web-based teaching may be a useful alternative to didactic teaching for learning procedural

  1. Distribution of peak expiratory flow variability by age, gender and smoking habits in a random population sample aged 20-70 yrs

    NARCIS (Netherlands)

    Boezen, H M; Schouten, J. P.; Postma, D S; Rijcken, B

    1994-01-01

    Peak expiratory flow (PEF) variability can be considered as an index of bronchial lability. Population studies on PEF variability are few. The purpose of the current paper is to describe the distribution of PEF variability in a random population sample of adults with a wide age range (20-70 yrs),

  2. Development of an analytical procedure for plutonium in the concentration range of femtogram/gram and its application to environmental samples

    International Nuclear Information System (INIS)

    Schuettelkopf, H.

    1981-09-01

    To study the behaviour of plutonium in the environment and to measure plutonium in the vicinity of nuclear facilities, a quick, sensitive analytical method is required which can be applied to all sample materials found in the environment. For a sediment contaminated with plutonium, a boiling-out method using first HNO3/HF and subsequently HNO3/Al(NO3)3 was found to be successful. The leaching solution was then extracted by TOPO and the plutonium back-extracted by ascorbic acid/HCl. Several purification steps and finally electroplating using ammonium oxalate led to an optimum sample for the α-spectroscopic determination of plutonium. An analytical method was thus worked out for plutonium which can be applied to all materials found in the environment. The sample size is 100 g but it may also be much greater. The average chemical yield is between 70 and 80%. The detection limit for soil samples is 0.1 fCi/g and for plant samples 0.5 fCi/g. One technician can perform eight analyses per working day. The analytical procedure was applied to a large number of environmental samples and the results of these analyses are indicated. (orig./RB)

  3. Microbial ecology laboratory procedures manual NASA/MSFC

    Science.gov (United States)

    Huff, Timothy L.

    1990-01-01

    An essential part of the efficient operation of any microbiology laboratory involved in sample analysis is a standard procedures manual. The purpose of this manual is to provide concise and well defined instructions on routine technical procedures involving sample analysis and methods for monitoring and maintaining quality control within the laboratory. Of equal importance is the safe operation of the laboratory. This manual outlines detailed procedures to be followed in the microbial ecology laboratory to assure safety, analytical control, and validity of results.

  4. Sampling procedure for lake or stream surface water chemistry

    Science.gov (United States)

    Robert Musselman

    2012-01-01

    Surface waters collected in the field for chemical analyses are easily contaminated. This research note presents a step-by-step detailed description of how to avoid sample contamination when field collecting, processing, and transporting surface water samples for laboratory analysis.

  5. Recovery of extracellular vesicles from human breast milk is influenced by sample collection and vesicle isolation procedures

    Directory of Open Access Journals (Sweden)

    Marijke I. Zonneveld

    2014-08-01

    Full Text Available Extracellular vesicles (EV) in breast milk carry immune-relevant proteins and could play an important role in the instruction of the neonatal immune system. To further analyze these EV and to elucidate their function, it is important that native populations of EV can be recovered from (stored) breast milk samples in a reproducible fashion. However, the impact of isolation and storage procedures on recovery of breast milk EV has remained underexposed. Here, we aimed to define parameters important for EV recovery from fresh and stored breast milk. To compare various protocols across different donors, breast milk was spiked with a well-defined murine EV population. We found that centrifugation of EV down into density gradients largely improved density-based separation and isolation of EV, compared to floatation up into gradients after high-force pelleting of EV. Using cryo-electron microscopy, we identified different subpopulations of human breast milk EV and a previously undescribed population of lipid tubules. Additionally, the impact of cold storage on breast milk EV was investigated. We determined that storing unprocessed breast milk at −80°C or 4°C caused death of cells present in breast milk, leading to contamination of the breast milk EV population with storage-induced EV. Here, an alternative method is proposed to store breast milk samples for EV analysis at later time points. The proposed adaptations to the breast milk storage and EV isolation procedures can be applied for EV-based biomarker profiling of breast milk and functional analysis of the role of breast milk EV in the development of the neonatal immune system.

  6. Scanning method as an unbiased simulation technique and its application to the study of self-attracting random walks

    Science.gov (United States)

    Meirovitch, Hagai

    1985-12-01

    The scanning method proposed by us [J. Phys. A 15, L735 (1982); Macromolecules 18, 563 (1985)] for simulation of polymer chains is further developed and applied, for the first time, to a model with finite interactions. In addition to ``importance sampling,'' we remove the bias introduced by the scanning method with a procedure suggested recently by Schmidt [Phys. Rev. Lett. 51, 2175 (1983)]; this procedure has the advantage of enabling one to estimate the statistical error. We find these two procedures to be equally efficient. The model studied is an N-step self-attracting random walk on a lattice, in which a walk i carries a statistical weight determined by its self-interactions. It has been shown, for any dimension d, by Donsker and Varadhan (DV) and by others that ln φ, where φ is the survival probability in the trapping problem, diverges like N^α with α = d/(d+2). Most numerical studies, however, have failed to reach the DV regime in which d/(d+2) becomes a good approximation for α. Our results indicate that the probability of a walk returning to the origin behaves approximately as N^(-1/3) for both d = 2 and 3.

  7. Prioritization to limit sampling and drilling in site investigations

    International Nuclear Information System (INIS)

    Burton, J.C.

    1992-01-01

    One of the major goals of the Environmental Research Division of Argonne National Laboratory is to develop and provide governmental agencies with technically sound, cost-effective frameworks for environmental site characterization and remedial programs. An example of the development of such a framework for preremedial site characterization is presented in this paper. Specifically, this paper presents portions of an expanded site investigation program developed for landfills suspected of containing hazardous waste. The work was sponsored by the New Mexico State Office of the US Department of Interior's Bureau of Land Management (BLM). The emphasis of the BLM program was on identifying initial characterization procedures that would decrease the need for sampling and drilling on a random grid

  8. Sampling strategies in antimicrobial resistance monitoring: evaluating how precision and sensitivity vary with the number of animals sampled per farm.

    Directory of Open Access Journals (Sweden)

    Takehisa Yamamoto

    Full Text Available Because antimicrobial resistance in food-producing animals is a major public health concern, many countries have implemented antimicrobial monitoring systems at a national level. When designing a sampling scheme for antimicrobial resistance monitoring, it is necessary to consider both cost effectiveness and statistical plausibility. In this study, we examined how sampling scheme precision and sensitivity can vary with the number of animals sampled from each farm, while keeping the overall sample size constant to avoid additional sampling costs. Five sampling strategies were investigated. These employed 1, 2, 3, 4 or 6 animal samples per farm, with a total of 12 animals sampled in each strategy. A total of 1,500 Escherichia coli isolates from 300 fattening pigs on 30 farms were tested for resistance against 12 antimicrobials. The performance of each sampling strategy was evaluated by bootstrap resampling from the observational data. In the bootstrapping procedure, farms, animals, and isolates were selected randomly with replacement, and a total of 10,000 replications were conducted. For each antimicrobial, we observed that the standard deviation and 2.5-97.5 percentile interval of resistance prevalence were smallest in the sampling strategy that employed 1 animal per farm. The proportion of bootstrap samples that included at least 1 isolate with resistance was also evaluated as an indicator of the sensitivity of the sampling strategy to previously unidentified antimicrobial resistance. The proportion was greatest with 1 sample per farm and decreased with larger samples per farm. We concluded that when the total number of samples is pre-specified, the most precise and sensitive sampling strategy involves collecting 1 sample per farm.
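
    The bootstrap evaluation described above can be illustrated with a short sketch in Python. This is a hedged, synthetic illustration: the farm-level resistance probabilities, herd sizes, replication counts and the helper name bootstrap_prevalence are invented assumptions, not the study's data or code.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: 30 farms x 10 animals x 5 isolates with binary resistance flags.
# The per-farm clustering and low baseline prevalence are illustrative assumptions.
n_farms, n_animals, n_isolates = 30, 10, 5
farm_effect = rng.normal(0.0, 1.0, n_farms)
p_farm = 1.0 / (1.0 + np.exp(-(-2.2 + farm_effect)))
data = rng.binomial(1, p_farm[:, None, None], (n_farms, n_animals, n_isolates))

def bootstrap_prevalence(data, animals_per_farm, total_animals=12, n_rep=2000):
    """Resample farms, then animals within farms, then isolates within animals,
    all with replacement, and return bootstrap prevalence estimates."""
    farms_per_rep = total_animals // animals_per_farm
    n_farms, n_animals, n_isolates = data.shape
    estimates = np.empty(n_rep)
    for r in range(n_rep):
        vals = []
        for f in rng.integers(0, n_farms, farms_per_rep):
            for a in rng.integers(0, n_animals, animals_per_farm):
                isolates = rng.integers(0, n_isolates, n_isolates)
                vals.append(data[f, a, isolates].mean())
        estimates[r] = np.mean(vals)
    return estimates

for k in (1, 2, 3, 4, 6):                 # animals sampled per farm, 12 animals in total
    est = bootstrap_prevalence(data, k)
    print(f"{k} animal(s)/farm: SD = {est.std():.4f}, "
          f"P(at least one resistant isolate) = {(est > 0).mean():.3f}")
```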

  9. Dynamical implications of sample shape for avalanches in 2-dimensional random-field Ising model with saw-tooth domain wall

    Science.gov (United States)

    Tadić, Bosiljka

    2018-03-01

    We study dynamics of a built-in domain wall (DW) in 2-dimensional disordered ferromagnets with different sample shapes using the random-field Ising model on a square lattice rotated by 45 degrees. The saw-tooth DW of length Lx is created along one side and swept through the sample by slow ramping of the external field until the complete magnetisation reversal and the wall annihilation at the open top boundary at a distance Ly. By fixing the number of spins N = Lx × Ly = 10^6 and the random-field distribution at a value above the critical disorder, we vary the ratio of the DW length to the annihilation distance in the range Lx/Ly ∈ [1/16, 16]. Periodic boundary conditions are applied in the y-direction so that these ratios comprise different samples, i.e., surfaces of cylinders with the changing perimeter Lx and height Ly. We analyse the avalanches of the DW slips between successive field updates, and the multifractal structure of the magnetisation fluctuation time series. Our main findings are that the domain-wall lengths materialised in different sample shapes have an impact on the dynamics at all scales. Moreover, the domain-wall motion at the beginning of the hysteresis loop (HLB) probes the disorder effects, resulting in fluctuations that are significantly different from the large avalanches in the central part of the loop (HLC), where the strong fields dominate. Specifically, the fluctuations in HLB exhibit a wide multi-fractal spectrum, which shifts towards higher values of the exponents when the DW length is reduced. The distributions of the avalanches in these segments of the loop obey power-law decay with exponential cutoffs, with exponents firmly in the mean-field universality class for long DWs. In contrast, the avalanches in the HLC obey a Tsallis density distribution with power-law tails, which indicates new categories of scale-invariant behaviour for different ratios Lx/Ly. The large fluctuations in the HLC, on the other

  10. Effect of Mozart music on heel prick pain in preterm infants: a pilot randomized controlled trial

    Directory of Open Access Journals (Sweden)

    Cristina Cavaiuolo

    2015-02-01

    Full Text Available Objective: The aim of this pilot study was to determine the effect of music by Mozart on heel prick procedural pain in premature infants. Background: Painful procedures are routinely performed in the setting of the neonatal intensive care unit (NICU). Pain may exert short- and long-term deleterious effects on premature babies. Many non-pharmacological interventions have been proven efficacious for blunting neonatal pain. Study design: Randomized, controlled trial. Methods: The study was carried out in the NICU of the “G. Rummo” Hospital in Benevento, Italy. The sample consisted of 42 preterm infants, with no hearing loss or significant cerebral lesions on cranial ultrasound. They were randomized to receive heel lance during a music condition or a no-music control condition. We set strict criteria for selecting and delivering the music. Baseline and postprocedural heart rate and transcutaneous oxygen saturation were manually recorded. The Premature Infant Pain Profile (PIPP) score was used to measure the behavioral response to prick. An unpaired t-test was performed for the intergroup comparisons. Results: There were significant differences between groups on heart rate increase, oxygen saturation reduction and PIPP score following the procedure. Conclusions: Listening to Mozart music during heel prick is a simple and inexpensive tool for pain alleviation in stable preterm neonates.

  11. Simultaneous escaping of explicit and hidden free energy barriers: application of the orthogonal space random walk strategy in generalized ensemble based conformational sampling.

    Science.gov (United States)

    Zheng, Lianqing; Chen, Mengen; Yang, Wei

    2009-06-21

    To overcome the pseudoergodicity problem, conformational sampling can be accelerated via generalized ensemble methods, e.g., through the realization of random walks along prechosen collective variables, such as spatial order parameters, energy scaling parameters, or even system temperatures or pressures, etc. As usually observed, in generalized ensemble simulations, hidden barriers are likely to exist in the space perpendicular to the collective variable direction and these residual free energy barriers could greatly abolish the sampling efficiency. This sampling issue is particularly severe when the collective variable is defined in a low-dimension subset of the target system; then the "Hamiltonian lagging" problem, which reveals the fact that necessary structural relaxation falls behind the move of the collective variable, may be likely to occur. To overcome this problem in equilibrium conformational sampling, we adopted the orthogonal space random walk (OSRW) strategy, which was originally developed in the context of free energy simulation [L. Zheng, M. Chen, and W. Yang, Proc. Natl. Acad. Sci. U.S.A. 105, 20227 (2008)]. Thereby, generalized ensemble simulations can simultaneously escape both the explicit barriers along the collective variable direction and the hidden barriers that are strongly coupled with the collective variable move. As demonstrated in our model studies, the present OSRW based generalized ensemble treatments show improved sampling capability over the corresponding classical generalized ensemble treatments.

  12. The Macdonald and Savage titrimetric procedure scaled down to 4 mg sized plutonium samples. P. 1

    International Nuclear Information System (INIS)

    Kuvik, V.; Lecouteux, C.; Doubek, N.; Ronesch, K.; Jammet, G.; Bagliano, G.; Deron, S.

    1992-01-01

    The original Macdonald and Savage amperometric method scaled down to milligram-sized plutonium samples was further modified. The electro-chemical process of each redox step and the end-point of the final titration were monitored potentiometrically. The method is designed to determine 4 mg of plutonium dissolved in nitric acid solution. It is suitable for the direct determination of plutonium in non-irradiated fuel with a uranium-to-plutonium ratio of up to 30. The precision and accuracy are ca. 0.05-0.1% (relative standard deviation). Although the procedure is very selective, the following species interfere: vanadyl(IV) and vanadate (almost quantitatively), neptunium (one electron exchange per mole), nitrites, fluorosilicates (milligram amounts yield a slight bias) and iodates. (author). 15 refs.; 8 figs.; 7 tabs

  13. Compressive sampling of polynomial chaos expansions: Convergence analysis and sampling strategies

    International Nuclear Information System (INIS)

    Hampton, Jerrad; Doostan, Alireza

    2015-01-01

    Sampling orthogonal polynomial bases via Monte Carlo is of interest for uncertainty quantification of models with random inputs, using Polynomial Chaos (PC) expansions. It is known that bounding a probabilistic parameter, referred to as coherence, yields a bound on the number of samples necessary to identify coefficients in a sparse PC expansion via solution to an ℓ1-minimization problem. Utilizing results for orthogonal polynomials, we bound the coherence parameter for polynomials of Hermite and Legendre type under their respective natural sampling distribution. In both polynomial bases we identify an importance sampling distribution which yields a bound with weaker dependence on the order of the approximation. For more general orthonormal bases, we propose the coherence-optimal sampling: a Markov Chain Monte Carlo sampling, which directly uses the basis functions under consideration to achieve a statistical optimality among all sampling schemes with identical support. We demonstrate these different sampling strategies numerically in both high-order and high-dimensional, manufactured PC expansions. In addition, the quality of each sampling method is compared in the identification of solutions to two differential equations, one with a high-dimensional random input and the other with a high-order PC expansion. In both cases, the coherence-optimal sampling scheme leads to similar or considerably improved accuracy.

  14. Compressive sampling of polynomial chaos expansions: Convergence analysis and sampling strategies

    Science.gov (United States)

    Hampton, Jerrad; Doostan, Alireza

    2015-01-01

    Sampling orthogonal polynomial bases via Monte Carlo is of interest for uncertainty quantification of models with random inputs, using Polynomial Chaos (PC) expansions. It is known that bounding a probabilistic parameter, referred to as coherence, yields a bound on the number of samples necessary to identify coefficients in a sparse PC expansion via solution to an ℓ1-minimization problem. Utilizing results for orthogonal polynomials, we bound the coherence parameter for polynomials of Hermite and Legendre type under their respective natural sampling distribution. In both polynomial bases we identify an importance sampling distribution which yields a bound with weaker dependence on the order of the approximation. For more general orthonormal bases, we propose the coherence-optimal sampling: a Markov Chain Monte Carlo sampling, which directly uses the basis functions under consideration to achieve a statistical optimality among all sampling schemes with identical support. We demonstrate these different sampling strategies numerically in both high-order and high-dimensional, manufactured PC expansions. In addition, the quality of each sampling method is compared in the identification of solutions to two differential equations, one with a high-dimensional random input and the other with a high-order PC expansion. In both cases, the coherence-optimal sampling scheme leads to similar or considerably improved accuracy.
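
    As a rough companion to the two records above, the sketch below recovers a sparse Hermite PC expansion of a one-dimensional model from Monte Carlo samples by ℓ1-minimization. The basis order, sparsity pattern, noise level and the plain ISTA solver are illustrative assumptions; the coherence-optimal MCMC sampler of the paper is not reproduced here.

```python
import math
import numpy as np

rng = np.random.default_rng(1)

def hermite_basis(x, order):
    """Probabilists' Hermite polynomials He_0..He_order, normalized so that
    E[He_i(Z) He_j(Z)] = delta_ij for Z ~ N(0, 1)."""
    H = np.zeros((len(x), order + 1))
    H[:, 0] = 1.0
    if order >= 1:
        H[:, 1] = x
    for n in range(1, order):
        H[:, n + 1] = x * H[:, n] - n * H[:, n - 1]      # He_{n+1} = x He_n - n He_{n-1}
    return H / np.sqrt([math.factorial(n) for n in range(order + 1)])

order, n_samples = 20, 80
c_true = np.zeros(order + 1)
c_true[[0, 2, 5, 11]] = [1.0, 0.8, -0.5, 0.3]            # assumed sparse coefficients

x = rng.standard_normal(n_samples)                       # natural (Gaussian) sampling
Psi = hermite_basis(x, order)
y = Psi @ c_true + 0.01 * rng.standard_normal(n_samples)

# ISTA for min_c 0.5*||y - Psi c||^2 + lam*||c||_1
lam = 0.05
step = 1.0 / np.linalg.norm(Psi, 2) ** 2                 # 1 / (largest singular value)^2
c = np.zeros(order + 1)
for _ in range(5000):
    c = c - step * (Psi.T @ (Psi @ c - y))
    c = np.sign(c) * np.maximum(np.abs(c) - step * lam, 0.0)   # soft threshold

print("recovered support:", np.flatnonzero(np.abs(c) > 0.05))
print("max coefficient error:", np.max(np.abs(c - c_true)))
```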

  15. Soil map disaggregation improved by soil-landscape relationships, area-proportional sampling and random forest implementation

    DEFF Research Database (Denmark)

    Møller, Anders Bjørn; Malone, Brendan P.; Odgers, Nathan

    Detailed soil information is often needed to support agricultural practices, environmental protection and policy decisions. Several digital approaches can be used to map soil properties based on field observations. When soil observations are sparse or missing, an alternative approach is to disaggregate existing conventional soil maps. At present, the DSMART algorithm represents the most sophisticated approach for disaggregating conventional soil maps (Odgers et al., 2014). The algorithm relies on classification trees trained from resampled points, which are assigned classes according ... Implementation of soil-landscape relationships, area-proportional sampling and random forest generally improved the algorithm's ability to predict the correct soil class. The implementation of soil-landscape relationships and area-proportional sampling generally increased the calculation time, while the random forest implementation reduced the calculation time. In the most successful ...

  16. Tukey g-and-h Random Fields

    KAUST Repository

    Xu, Ganggang; Genton, Marc G.

    2016-01-01

    We propose a new class of trans-Gaussian random fields named Tukey g-and-h (TGH) random fields to model non-Gaussian spatial data. The proposed TGH random fields have extremely flexible marginal distributions, possibly skewed and/or heavy-tailed, and, therefore, have a wide range of applications. The special formulation of the TGH random field enables an automatic search for the most suitable transformation for the dataset of interest while estimating model parameters. Asymptotic properties of the maximum likelihood estimator and the probabilistic properties of the TGH random fields are investigated. An efficient estimation procedure, based on maximum approximated likelihood, is proposed and an extreme spatial outlier detection algorithm is formulated. Kriging and probabilistic prediction with TGH random fields are developed along with prediction confidence intervals. The predictive performance of TGH random fields is demonstrated through extensive simulation studies and an application to a dataset of total precipitation in the south east of the United States.

  17. Tukey g-and-h Random Fields

    KAUST Repository

    Xu, Ganggang

    2016-07-15

    We propose a new class of trans-Gaussian random fields named Tukey g-and-h (TGH) random fields to model non-Gaussian spatial data. The proposed TGH random fields have extremely flexible marginal distributions, possibly skewed and/or heavy-tailed, and, therefore, have a wide range of applications. The special formulation of the TGH random field enables an automatic search for the most suitable transformation for the dataset of interest while estimating model parameters. Asymptotic properties of the maximum likelihood estimator and the probabilistic properties of the TGH random fields are investigated. An efficient estimation procedure, based on maximum approximated likelihood, is proposed and an extreme spatial outlier detection algorithm is formulated. Kriging and probabilistic prediction with TGH random fields are developed along with prediction confidence intervals. The predictive performance of TGH random fields is demonstrated through extensive simulation studies and an application to a dataset of total precipitation in the south east of the United States.
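
    For orientation, the marginal transformation behind these records is commonly written as tau_{g,h}(z) = xi + omega * (exp(g*z) - 1)/g * exp(h*z^2/2) applied to a standard normal z. The sketch below only illustrates this marginal transform with arbitrarily chosen g, h, location and scale values; it does not implement the spatial correlation, estimation or kriging machinery of the paper.

```python
import numpy as np

def tukey_gh(z, g=0.5, h=0.1, xi=0.0, omega=1.0):
    """Tukey g-and-h transform of standard normal z; g controls skewness,
    h >= 0 controls tail heaviness. The g -> 0 limit is xi + omega*z*exp(h*z^2/2)."""
    z = np.asarray(z, dtype=float)
    core = z if abs(g) < 1e-12 else (np.exp(g * z) - 1.0) / g
    return xi + omega * core * np.exp(h * z ** 2 / 2.0)

rng = np.random.default_rng(2)
z = rng.standard_normal(100_000)
x = tukey_gh(z, g=0.5, h=0.1)

# The transformed sample is skewed and heavy-tailed relative to the Gaussian input.
print("skewness        ~", np.mean((x - x.mean()) ** 3) / x.std() ** 3)
print("excess kurtosis ~", np.mean((x - x.mean()) ** 4) / x.std() ** 4 - 3.0)
```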

  18. An inter-lab comparison of the determination of radionuclides in soil samples by γ-spectrometry

    International Nuclear Information System (INIS)

    Pan Jingquan; Zhang Shurong; Xu Cuihua

    1986-01-01

    The results of an inter-lab comparison of the quantitative determination of radionuclides in two soil samples and in a simulated one used as a standard reference material by direct γ-spectrometry are presented and discussed. The methods of preparation of the three samples, their homogeneity and the procedures used in this inter-lab comparison are also described. Fifteen laboratories in China participated in this program. The contents of the main radionuclides in the samples were estimated by statistical treatment of the reported data. More than 91% of these laboratories obtained mean values with relative standard deviation below 20%, and in 88% of them the average values were within the range of the standard reference values with deviation less than 10%. Statistical analysis showed that random error might be underestimated or systematic error might exist in a few laboratories.

  19. Comparison of Address-based Sampling and Random-digit Dialing Methods for Recruiting Young Men as Controls in a Case-Control Study of Testicular Cancer Susceptibility

    OpenAIRE

    Clagett, Bartholt; Nathanson, Katherine L.; Ciosek, Stephanie L.; McDermoth, Monique; Vaughn, David J.; Mitra, Nandita; Weiss, Andrew; Martonik, Rachel; Kanetsky, Peter A.

    2013-01-01

    Random-digit dialing (RDD) using landline telephone numbers is the historical gold standard for control recruitment in population-based epidemiologic research. However, increasing cell-phone usage and diminishing response rates suggest that the effectiveness of RDD in recruiting a random sample of the general population, particularly for younger target populations, is decreasing. In this study, we compared landline RDD with alternative methods of control recruitment, including RDD using cell-...

  20. A new procedure for making TEM specimens of superconductor devices

    International Nuclear Information System (INIS)

    Huang, Y.; Merkle, K.L.

    1997-04-01

    A new procedure is developed for making TEM specimens of thin-film devices. In this procedure the sample is polished flat to an overall ion-mill-ready thickness so that any point in the 2-D sample plane can be thinned to an electron-transparent thickness by subsequent ion-milling. Using this procedure, small regions of interest can be easily reached in both cross-section and plan-view samples. This is especially useful in device studies. Applications of this procedure to the study of superconductor devices yield good results. This procedure, using commercially available equipment and relatively cheap materials, is simple and easy to realize.

  1. Systematic sampling with errors in sample locations

    DEFF Research Database (Denmark)

    Ziegel, Johanna; Baddeley, Adrian; Dorph-Petersen, Karl-Anton

    2010-01-01

    Systematic sampling of points in continuous space is widely used in microscopy and spatial surveys. Classical theory provides asymptotic expressions for the variance of estimators based on systematic sampling as the grid spacing decreases. However, the classical theory assumes that the sample grid is exactly periodic; real physical sampling procedures may introduce errors in the placement of the sample points. This paper studies the effect of errors in sample positioning on the variance of estimators in the case of one-dimensional systematic sampling. First we sketch a general approach to variance analysis using point process methods. We then analyze three different models for the error process, calculate exact expressions for the variances, and derive asymptotic variances. Errors in the placement of sample points can lead to substantial inflation of the variance, dampening of zitterbewegung ...

  2. Determination of thiobencarb in water samples by gas chromatography using a homogeneous liquid-liquid microextraction via flotation assistance procedure

    Directory of Open Access Journals (Sweden)

    H.A. Mashayekhi

    2013-09-01

    Full Text Available Homogeneous liquid-liquid microextraction via flotation assistance (HLLME-FA) coupled with gas chromatography-flame ionization detection (GC-FID) was applied for the extraction and determination of thiobencarb in water samples. In this study, a special extraction cell was designed to facilitate collection of the low-density extraction solvent. No centrifugation was required in this procedure. The water sample solution was added into the extraction cell, which contained an appropriate mixture of toluene (as an extraction solvent) and acetone (as a homogeneous solvent). By using air flotation, the organic solvent was collected at the conical part of the designed cell. The effects of different parameters on the extraction efficiency, such as the type and volume of the extraction and homogeneous solvents, ionic strength and extraction time, were studied and optimized. Under the optimal conditions, the linearity of the method was in the range of 1.0-200 µg L-1. The relative standard deviations in the real samples varied from 7.8 to 11.7% (n = 3). The proposed method was successfully applied to the analysis of thiobencarb in water samples and satisfactory results were obtained. DOI: http://dx.doi.org/10.4314/bcse.v27i3.4

  3. Manure sampling procedures and nutrient estimation by the hydrometer method for gestation pigs.

    Science.gov (United States)

    Zhu, Jun; Ndegwa, Pius M; Zhang, Zhijian

    2004-05-01

    Three manure agitation procedures were examined in this study (vertical mixing, horizontal mixing, and no mixing) to determine the efficacy of producing a representative manure sample. The total solids content for manure from gestation pigs was found to be well correlated with the total nitrogen (TN) and total phosphorus (TP) concentrations in the manure, with highly significant correlation coefficients of 0.988 and 0.994, respectively. Linear correlations were observed between the TN and TP contents and the manure specific gravity (correlation coefficients: 0.991 and 0.987, respectively). Therefore, it may be inferred that the nutrients in pig manure can be estimated with reasonable accuracy by measuring the liquid manure specific gravity. A rapid testing method for manure nutrient contents (TN and TP) using a soil hydrometer was also evaluated. The results showed that the estimating error increased from +/-10% to +/-30% with the decrease in TN (from 1000 to 100 ppm) and TP (from 700 to 50 ppm) concentrations in the manure. Data also showed that the hydrometer readings had to be taken within 10 s after mixing to avoid reading drift in specific gravity due to the settling of manure solids.

  4. Statistical inference for the additive hazards model under outcome-dependent sampling.

    Science.gov (United States)

    Yu, Jichang; Liu, Yanyan; Sandler, Dale P; Zhou, Haibo

    2015-09-01

    Cost-effective study design and proper inference procedures for data from such designs are always of particular interest to study investigators. In this article, we propose a biased sampling scheme, an outcome-dependent sampling (ODS) design for survival data with right censoring under the additive hazards model. We develop a weighted pseudo-score estimator for the regression parameters for the proposed design and derive the asymptotic properties of the proposed estimator. We also provide some suggestions for using the proposed method by evaluating the relative efficiency of the proposed method against the simple random sampling design and derive the optimal allocation of the subsamples for the proposed design. Simulation studies show that the proposed ODS design is more powerful than other existing designs and the proposed estimator is more efficient than other estimators. We apply our method to analyze a cancer study conducted at NIEHS, the Cancer Incidence and Mortality of Uranium Miners Study, to study the cancer risk associated with radon exposure.

  5. Uranium tailings sampling manual

    International Nuclear Information System (INIS)

    Feenstra, S.; Reades, D.W.; Cherry, J.A.; Chambers, D.B.; Case, G.G.; Ibbotson, B.G.

    1985-01-01

    The purpose of this manual is to describe the requisite sampling procedures for the application of uniform high-quality standards to detailed geotechnical, hydrogeological, geochemical and air quality measurements at Canadian uranium tailings disposal sites. The selection and implementation of applicable sampling procedures for such measurements at uranium tailings disposal sites are complicated by two primary factors. Firstly, the physical and chemical nature of uranium mine tailings and effluent is considerably different from natural soil materials and natural waters. Consequently, many conventional methods for the collection and analysis of natural soils and waters are not directly applicable to tailings. Secondly, there is a wide range in the physical and chemical nature of uranium tailings. The composition of the ore, the milling process, the nature of tailings deposition, and effluent treatment vary considerably and are highly site-specific. Therefore, the definition and implementation of sampling programs for uranium tailings disposal sites require considerable evaluation, and often innovation, to ensure that appropriate sampling and analysis methods are used which provide the flexibility to take into account site-specific considerations. The following chapters describe the objective and scope of a sampling program, preliminary data collection, and the procedures for sampling of tailings solids, surface water and seepage, tailings pore-water, and wind-blown dust and radon.

  6. Determination of tritium in wine yeast samples

    International Nuclear Information System (INIS)

    Cotarlea, Monica-Ionela; Paunescu Niculina; Galeriu, D; Mocanu, N.; Margineanu, R.; Marin, G.

    1998-01-01

    Analytical procedures were developed to determine tritium in wine and wine yeast samples. The content of organic compounds affecting the LSC measurement is reduced by fractional distillation for wine samples and by azeotropic distillation/fractional distillation for wine yeast samples. Finally, the water samples were distilled normally with KMnO4. The established procedures were successfully applied to wine and wine yeast samples from the Murfatlar harvests of 1995 and 1996. (authors)

  7. Outcomes of the modified Brostrom procedure using suture anchors for chronic lateral ankle instability--a prospective, randomized comparison between single and double suture anchors.

    Science.gov (United States)

    Cho, Byung-Ki; Kim, Yong-Min; Kim, Dong-Soo; Choi, Eui-Sung; Shon, Hyun-Chul; Park, Kyoung-Jin

    2013-01-01

    The present prospective, randomized study was conducted to compare the clinical outcomes of the modified Brostrom procedure using single and double suture anchors for chronic lateral ankle instability. A total of 50 patients were followed up for more than 2 years after undergoing the modified Brostrom procedure. Of the 50 procedures, 25 each were performed using single and double suture anchors by 1 surgeon. The Karlsson scale had improved significantly to 89.8 points and 90.6 points in the single and double anchor groups, respectively. Using the Sefton grading system, 23 cases (92%) in the single anchor group and 22 (88%) in the double anchor group achieved satisfactory results. The talar tilt angle and anterior talar translation on stress radiographs using the Telos device had improved significantly to an average of 5.7° and 4.6 mm in the single anchor group and 4.5° and 4.3 mm in the double anchor group, respectively. The double anchor technique was superior with respect to the postoperative talar tilt. The single and double suture anchor techniques produced similar clinical and functional outcomes, with the exception of talar tilt as a reference of mechanical stability. The modified Brostrom procedure using both single and double suture anchors appears to be an effective treatment method for chronic lateral ankle instability. Copyright © 2013 American College of Foot and Ankle Surgeons. Published by Elsevier Inc. All rights reserved.

  8. Evaluating effectiveness of down-sampling for stratified designs and unbalanced prevalence in Random Forest models of tree species distributions in Nevada

    Science.gov (United States)

    Elizabeth A. Freeman; Gretchen G. Moisen; Tracy S. Frescino

    2012-01-01

    Random Forests is frequently used to model species distributions over large geographic areas. Complications arise when data used to train the models have been collected in stratified designs that involve different sampling intensity per stratum. The modeling process is further complicated if some of the target species are relatively rare on the landscape leading to an...

  9. Surface Environmental Surveillance Procedures Manual

    Energy Technology Data Exchange (ETDEWEB)

    Hanf, RW; Dirkes, RL

    1990-02-01

    This manual establishes the procedures for the collection of environmental samples and the performance of radiation surveys and other field measurements. Responsibilities are defined for those personnel directly involved in the collection of samples and the performance of field measurements.

  10. Environmental Measurements Laboratory (EML) procedures manual

    International Nuclear Information System (INIS)

    Chieco, N.A.; Bogen, D.C.; Knutson, E.O.

    1990-11-01

    Volume 1 of this manual documents the procedures and existing technology that are currently used by the Environmental Measurements Laboratory. A section devoted to quality assurance has been included. These procedures have been updated and revised and new procedures have been added. They include: sampling; radiation measurements; analytical chemistry; radionuclide data; special facilities; and specifications. 228 refs., 62 figs., 37 tabs. (FL)

  11. Implementation guide for turbidity threshold sampling: principles, procedures, and analysis

    Science.gov (United States)

    Jack Lewis; Rand Eads

    2009-01-01

    Turbidity Threshold Sampling uses real-time turbidity and river stage information to automatically collect water quality samples for estimating suspended sediment loads. The system uses a programmable data logger in conjunction with a stage measurement device, a turbidity sensor, and a pumping sampler. Specialized software enables the user to control the sampling...

  12. Blocked Randomization with Randomly Selected Block Sizes

    Directory of Open Access Journals (Sweden)

    Jimmy Efird

    2010-12-01

    Full Text Available When planning a randomized clinical trial, careful consideration must be given to how participants are selected for various arms of a study. Selection and accidental bias may occur when participants are not assigned to study groups with equal probability. A simple random allocation scheme is a process by which each participant has equal likelihood of being assigned to treatment versus referent groups. However, by chance an unequal number of individuals may be assigned to each arm of the study and thus decrease the power to detect statistically significant differences between groups. Block randomization is a commonly used technique in clinical trial design to reduce bias and achieve balance in the allocation of participants to treatment arms, especially when the sample size is small. This method increases the probability that each arm will contain an equal number of individuals by sequencing participant assignments by block. Yet still, the allocation process may be predictable, for example, when the investigator is not blind and the block size is fixed. This paper provides an overview of blocked randomization and illustrates how to avoid selection bias by using random block sizes.
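
    A minimal sketch of the allocation procedure discussed above is given below, in Python. The two arm labels, the candidate block sizes (2, 4, 6) and the seed are illustrative assumptions rather than values prescribed by the article.

```python
import random

def blocked_randomization(n_participants, arms=("treatment", "control"),
                          block_sizes=(2, 4, 6), seed=42):
    """Build an allocation sequence from randomly chosen permuted blocks.
    Each block contains every arm equally often, so allocation is balanced at the
    end of each block, while the randomly varying block length keeps the position
    of the final assignment within a block unpredictable."""
    rng = random.Random(seed)
    sequence = []
    while len(sequence) < n_participants:
        size = rng.choice(block_sizes)            # must be a multiple of len(arms)
        block = list(arms) * (size // len(arms))
        rng.shuffle(block)                        # permute assignments within the block
        sequence.extend(block)
    return sequence[:n_participants]

allocation = blocked_randomization(20)
print(allocation)
print({arm: allocation.count(arm) for arm in ("treatment", "control")})
```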

  13. Randomized controlled clinical trial of long-term chemo-mechanical caries removal using PapacarieTM gel

    Directory of Open Access Journals (Sweden)

    Lara Jansiski MOTTA

    2014-07-01

    Full Text Available Objectives: Compare the effectiveness of PapacarieTM gel for the chemo-mechanical removal of carious lesions on primary teeth to conventional caries removal with a low-speed bur with regard to execution time, clinical aspects and radiographic findings. Material and Methods: A randomized controlled clinical trial with a split-mouth design was carried out. The sample was composed of 20 children aged four to seven years, in whom 40 deciduous teeth were randomly divided into two groups: chemo-mechanical caries removal with PapacarieTM and removal of carious dentin with a low-speed bur. Each child underwent both procedures and served as his/her own control. Restorations were performed with glass ionomer cement. The time required to perform the procedure was also analyzed. The patients underwent longitudinal clinical and radiographic follow-up of the restorations. Results: No statistically significant difference between groups was found regarding the time required to perform the procedures and the radiographic follow up. Statistically significant differences between groups were found in the clinical evaluation at 6 and 18 months after treatment. Conclusion: PapacarieTM is as effective as the traditional method for the removal of carious dentin on deciduous teeth, but offers the advantages of the preservation of sound dental tissue as well as the avoidance of sharp rotary instruments and local anesthesia.

  14. Standard operating procedures for collection of soil and sediment samples for the Sediment-bound Contaminant Resiliency and Response (SCoRR) strategy pilot study

    Science.gov (United States)

    Fisher, Shawn C.; Reilly, Timothy J.; Jones, Daniel K.; Benzel, William M.; Griffin, Dale W.; Loftin, Keith A.; Iwanowicz, Luke R.; Cohl, Jonathan A.

    2015-12-17

    An understanding of the effects on human and ecological health brought by major coastal storms or flooding events is typically limited because of a lack of regionally consistent baseline and trends data in locations proximal to potential contaminant sources and mitigation activities, sensitive ecosystems, and recreational facilities where exposures are probable. In an attempt to close this gap, the U.S. Geological Survey (USGS) has implemented the Sediment-bound Contaminant Resiliency and Response (SCoRR) strategy pilot study to collect regional sediment-quality data prior to and in response to future coastal storms. The standard operating procedure (SOP) detailed in this document serves as the sample-collection protocol for the SCoRR strategy by providing step-by-step instructions for site preparation, sample collection and processing, and shipping of soil and surficial sediment (for example, bed sediment, marsh sediment, or beach material). The objectives of the SCoRR strategy pilot study are (1) to create a baseline of soil-, sand-, marsh sediment-, and bed-sediment-quality data from sites located in the coastal counties from Maine to Virginia based on their potential risk of being contaminated in the event of a major coastal storm or flooding (defined as Resiliency mode); and (2) respond to major coastal storms and flooding by reoccupying select baseline sites and sampling within days of the event (defined as Response mode). For both modes, samples are collected in a consistent manner to minimize bias and maximize quality control by ensuring that all sampling personnel across the region collect, document, and process soil and sediment samples following the procedures outlined in this SOP. Samples are analyzed using four USGS-developed screening methods—inorganic geochemistry, organic geochemistry, pathogens, and biological assays—which are also outlined in this SOP. Because the SCoRR strategy employs a multi-metric approach for sample analyses, this

  15. Robust weak measurements on finite samples

    International Nuclear Information System (INIS)

    Tollaksen, Jeff

    2007-01-01

    A new weak measurement procedure is introduced for finite samples which yields accurate weak values that are outside the range of eigenvalues and which do not require an exponentially rare ensemble. This procedure provides a unique advantage in the amplification of small nonrandom signals by minimizing uncertainties in determining the weak value and by minimizing sample size. This procedure can also extend the strength of the coupling between the system and measuring device to a new regime

  16. Variable screening and ranking using sampling-based sensitivity measures

    International Nuclear Information System (INIS)

    Wu, Y-T.; Mohanty, Sitakanta

    2006-01-01

    This paper presents a methodology for screening insignificant random variables and ranking significant random variables using sensitivity measures, including two cumulative distribution function (CDF)-based and two mean-response-based measures. The methodology features (1) using random samples to compute sensitivities and (2) using acceptance limits, derived from the test-of-hypothesis, to classify significant and insignificant random variables. Because no approximation is needed in either the form of the performance functions or the type of continuous distribution functions representing input variables, the sampling-based approach can handle highly nonlinear functions with non-normal variables. The main characteristics and effectiveness of the sampling-based sensitivity measures are investigated using both simple and complex examples. Because the number of samples needed does not depend on the number of variables, the methodology appears to be particularly suitable for problems with large, complex models that have large numbers of random variables but relatively few significant ones.

  17. Procedures for mastitis diagnosis and control.

    Science.gov (United States)

    Sears, P M; González, R N; Wilson, D J; Han, H R

    1993-11-01

    Procedures for mastitis diagnosis and control include culturing individual cow and bulk tank milk samples, antibiotic susceptibility testing, and evaluation of somatic cell count reports and clinical mastitis treatment records. Integrated use of such procedures is necessary for effective mastitis diagnosis and control.

  18. Results of the analysis of the intercomparison samples of the depleted uranium dioxide SR-20

    International Nuclear Information System (INIS)

    Aigner, H.; Deron, S.; Kuhn, E.; Ronesch, K.; Zoigner, A.

    Samples of a homogeneous powder of depleted uranium dioxide, SR-20, were distributed to 32 laboratories in January 1980 for intercomparison of the precisions and accuracies of wet chemical assay. Eleven laboratories reported their results (ANNEX 1). Five laboratories applied titration procedures, 4 of them applied methods derived from the Davies and Gray procedure (1), 2 laboratories used controlled potential coulometry, 2 laboratories used precipitation procedures, 1 laboratory used fluorimetry and 1 laboratory used activation analysis. An analysis of variance yields for each laboratory the estimates of the measurement errors, the dissolution or treatment errors and the random calibration errors. The measurement errors vary between 0.01% and 1.7% relative. The differences from the reference value vary between -9.1% and +0.92% uranium, but 9 laboratories agree within ±1% U with the reference value. The mean bias of these 9 laboratories is equal to +0.04% U. The standard deviation of the biases of these 9 laboratories is equal to 0.36% U.

  19. Fractionation of metals in street sediment samples by using the BCR sequential extraction procedure and multivariate statistical elucidation of the data

    International Nuclear Information System (INIS)

    Kartal, Senol; Aydin, Zeki; Tokalioglu, Serife

    2006-01-01

    The concentrations of metals (Cd, Co, Cr, Cu, Fe, Mn, Ni, Pb, and Zn) in street sediment samples were determined by flame atomic absorption spectrometry (FAAS) using the modified BCR (the European Community Bureau of Reference) sequential extraction procedure. According to the BCR protocol for extracting the metals from the relevant target phases, 1.0 g of specimen of the sample was treated with 0.11 M acetic acid (exchangeable and bound to carbonates), 0.5 M hydroxylamine hydrochloride (bound to iron- and manganese-oxides), and 8.8 M hydrogen peroxide plus 1 M ammonium acetate (bound to sulphides and organics), sequentially. The residue was treated with aqua regia solution for recovery studies, although this step is not part of the BCR procedure. The mobility sequence based on the sum of the BCR sequential extraction stages was: Cd ∼ Zn (∼90%) > Pb (∼84%) > Cu (∼75%) > Mn (∼70%) > Co (∼57%) > Ni (∼43%) > Cr (∼40%) > Fe (∼17%). Enrichment factors as the criteria for examining the impact of the anthropogenic emission sources of heavy metals were calculated, and it was observed that the highest enriched elements were Cd, Pb, and Zn in the dust samples, average 190, 111, and 20, respectively. Correlation analysis (CA) and principal component analysis (PCA) were applied to the data matrix to evaluate the analytical results and to identify the possible pollution sources of metals. PCA revealed that the sampling area was mainly influenced from three pollution sources, namely; traffic, industrial, and natural sources. The results show that chemical sequential extraction is a precious operational tool. Validation of the analytical results was checked by both recovery studies and analysis of the standard reference material (NIST SRM 2711 Montana Soil)

  20. Visualizing the Sample Standard Deviation

    Science.gov (United States)

    Sarkar, Jyotirmoy; Rashid, Mamunur

    2017-01-01

    The standard deviation (SD) of a random sample is defined as the square-root of the sample variance, which is the "mean" squared deviation of the sample observations from the sample mean. Here, we interpret the sample SD as the square-root of twice the mean square of all pairwise half deviations between any two sample observations. This…
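
    In symbols, the interpretation described above rests on a standard algebraic identity for the sample variance with the usual n - 1 denominator (stated here for completeness):

```latex
s^{2} \;=\; \frac{1}{n-1}\sum_{i=1}^{n}\bigl(x_i-\bar{x}\bigr)^{2}
      \;=\; \frac{2}{\binom{n}{2}}\sum_{1\le i<j\le n} d_{ij}^{2},
\qquad d_{ij} \;=\; \frac{x_i-x_j}{2},
```

    so that s is the square root of twice the mean of the squared pairwise half deviations d_{ij}^2 taken over all pairs i < j.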

  1. Network Sampling with Memory: A proposal for more efficient sampling from social networks

    Science.gov (United States)

    Mouw, Ted; Verdery, Ashton M.

    2013-01-01

    Techniques for sampling from networks have grown into an important area of research across several fields. For sociologists, the possibility of sampling from a network is appealing for two reasons: (1) A network sample can yield substantively interesting data about network structures and social interactions, and (2) it is useful in situations where study populations are difficult or impossible to survey with traditional sampling approaches because of the lack of a sampling frame. Despite its appeal, methodological concerns about the precision and accuracy of network-based sampling methods remain. In particular, recent research has shown that sampling from a network using a random walk based approach such as Respondent Driven Sampling (RDS) can result in high design effects (DE)—the ratio of the sampling variance to the sampling variance of simple random sampling (SRS). A high design effect means that more cases must be collected to achieve the same level of precision as SRS. In this paper we propose an alternative strategy, Network Sampling with Memory (NSM), which collects network data from respondents in order to reduce design effects and, correspondingly, the number of interviews needed to achieve a given level of statistical power. NSM combines a “List” mode, where all individuals on the revealed network list are sampled with the same cumulative probability, with a “Search” mode, which gives priority to bridge nodes connecting the current sample to unexplored parts of the network. We test the relative efficiency of NSM compared to RDS and SRS on 162 school and university networks from Add Health and Facebook that range in size from 110 to 16,278 nodes. The results show that the average design effect for NSM on these 162 networks is 1.16, which is very close to the efficiency of a simple random sample (DE=1), and 98.5% lower than the average DE we observed for RDS. PMID:24159246
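
    As a hedged illustration of the design effect quoted above (DE = variance of an estimator under the network design divided by its variance under SRS), the sketch below runs a plain random-walk sample on a toy ring-lattice network with a position-correlated node attribute. The network, attribute, sample size and walk rule are invented assumptions; this is not the RDS or NSM protocol itself, only a demonstration of how network clustering inflates the variance of a network-based sample.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy network: N nodes on a ring, each linked to its k nearest neighbours.
N, k = 1000, 6
neighbours = [np.array([(i + d) % N for d in range(-k // 2, k // 2 + 1) if d != 0])
              for i in range(N)]
# Node attribute correlated with ring position, i.e. strong network homophily.
y = np.sin(2 * np.pi * np.arange(N) / N) + 0.3 * rng.standard_normal(N)

def random_walk_mean(n_sample):
    node = rng.integers(N)
    values = []
    for _ in range(n_sample):
        values.append(y[node])
        node = rng.choice(neighbours[node])      # step to a uniformly chosen neighbour
    return float(np.mean(values))

def srs_mean(n_sample):
    return float(y[rng.choice(N, n_sample, replace=False)].mean())

n_sample, n_rep = 100, 2000
var_walk = np.var([random_walk_mean(n_sample) for _ in range(n_rep)])
var_srs = np.var([srs_mean(n_sample) for _ in range(n_rep)])
print("design effect of the random-walk sample:", var_walk / var_srs)
```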

  2. Special nuclear material inventory sampling plans

    International Nuclear Information System (INIS)

    Vaccaro, H.; Goldman, A.

    1987-01-01

    Since their introduction in 1942, sampling inspection procedures have been common quality assurance practice. The U.S. Department of Energy (DOE) supports such sampling of special nuclear materials inventories. DOE Order 5630.7 states that "Operations Offices may develop and use statistically valid sampling plans appropriate for their site-specific needs." The benefits for nuclear facilities operations include reduced worker exposure and reduced work load. Improved procedures have been developed for obtaining statistically valid sampling plans that maximize these benefits. The double sampling concept is described and the resulting sample sizes for double sample plans are compared with other plans. An algorithm is given for finding optimal double sampling plans that assist in choosing the appropriate detection and false alarm probabilities for various sampling plans.
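
    The record above leaves the plan parameters to the sites; as a generic illustration, the sketch below computes the acceptance probability (operating characteristic) of an attribute double sampling plan. The specific values n1, c1, r1, n2, c2 and the defect rates are arbitrary assumptions, not values from the DOE procedure.

```python
from math import comb

def binom_pmf(k, n, p):
    return comb(n, k) * p ** k * (1.0 - p) ** (n - k)

def accept_prob(p, n1=20, c1=0, r1=3, n2=40, c2=2):
    """P(accept) for a double sampling plan: inspect n1 items; accept if the number
    of defects is <= c1, reject if it is >= r1, otherwise inspect n2 more items and
    accept if the combined number of defects is <= c2. p is the true defect rate."""
    p_accept = sum(binom_pmf(d, n1, p) for d in range(c1 + 1))     # accept on first sample
    for d1 in range(c1 + 1, r1):                                   # proceed to second sample
        p_accept += binom_pmf(d1, n1, p) * sum(
            binom_pmf(d2, n2, p) for d2 in range(c2 - d1 + 1))
    return p_accept

for p in (0.01, 0.05, 0.10, 0.20):
    print(f"defect rate {p:.2f}: P(accept) = {accept_prob(p):.3f}")
```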

  3. Fast integration using quasi-random numbers

    International Nuclear Information System (INIS)

    Bossert, J.; Feindt, M.; Kerzel, U.

    2006-01-01

    Quasi-random numbers are specially constructed series of numbers optimised to evenly sample a given s-dimensional volume. Using quasi-random numbers in numerical integration converges faster with a higher accuracy compared to the case of pseudo-random numbers. The basic properties of quasi-random numbers are introduced, various generators are discussed and the achieved gain is illustrated by examples

  4. Fast integration using quasi-random numbers

    Science.gov (United States)

    Bossert, J.; Feindt, M.; Kerzel, U.

    2006-04-01

    Quasi-random numbers are specially constructed series of numbers optimised to evenly sample a given s-dimensional volume. Using quasi-random numbers in numerical integration converges faster with a higher accuracy compared to the case of pseudo-random numbers. The basic properties of quasi-random numbers are introduced, various generators are discussed and the achieved gain is illustrated by examples.
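
    To make the convergence claim above concrete, the sketch below compares pseudo-random points with a simple two-dimensional Halton sequence (van der Corput sequences in bases 2 and 3) on a smooth test integral over the unit square. The integrand, sample sizes and the choice of a Halton construction (rather than the specific generators discussed in these records) are illustrative assumptions.

```python
import numpy as np

def van_der_corput(n, base):
    """First n points of the van der Corput sequence in the given base."""
    seq = np.empty(n)
    for i in range(1, n + 1):
        f, x, k = 1.0, 0.0, i
        while k > 0:
            f /= base
            x += f * (k % base)
            k //= base
        seq[i - 1] = x
    return seq

def f(x, y):
    return np.exp(-x * y)        # exact integral over the unit square is about 0.7966

rng = np.random.default_rng(4)
for n in (100, 1_000, 10_000):
    xq, yq = van_der_corput(n, 2), van_der_corput(n, 3)   # 2-D Halton points
    xp, yp = rng.random(n), rng.random(n)                 # pseudo-random points
    print(f"n = {n:6d}   quasi = {f(xq, yq).mean():.6f}   pseudo = {f(xp, yp).mean():.6f}")
```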

  5. Nevada Applied Ecology Group procedures handbook for environmental transuranics

    International Nuclear Information System (INIS)

    White, M.G.; Dunaway, P.B.

    1976-10-01

    The activities of the Nevada Applied Ecology Group (NAEG) integrated research studies of environmental plutonium and other transuranics at the Nevada Test Site have required many standardized field and laboratory procedures. These include sampling techniques, collection and preparation, radiochemical and wet chemistry analysis, data bank storage and reporting, and statistical considerations for environmental samples of soil, vegetation, resuspended particles, animals, and other biological material. This document, printed in two volumes, includes most of the Nevada Applied Ecology Group standard procedures, with explanations as to the specific applications involved in the environmental studies. Where there is more than one document concerning a procedure, it has been included to indicate special studies or applications more complex than the routine standard sampling procedures utilized

  6. Nevada Applied Ecology Group procedures handbook for environmental transuranics

    International Nuclear Information System (INIS)

    White, M.G.; Dunaway, P.B.

    1976-10-01

    The activities of the Nevada Applied Ecology Group (NAEG) integrated research studies of environmental plutonium and other transuranics at the Nevada Test Site have required many standardized field and laboratory procedures. These include sampling techniques, collection and preparation, radiochemical and wet chemistry analysis, data bank storage and reporting, and statistical considerations for environmental samples of soil, vegetation, resuspended particles, animals, and others. This document, printed in two volumes, includes most of the Nevada Applied Ecology Group standard procedures, with explanations as to the specific applications involved in the environmental studies. Where there is more than one document concerning a procedure, it has been included to indicate special studies or applications perhaps more complex than the routine standard sampling procedures utilized

  7. Vapocoolant Spray Effectiveness on Arterial Puncture Pain: A Randomized Controlled Clinical Trial

    Directory of Open Access Journals (Sweden)

    Shervin Farahmand

    2017-02-01

    Full Text Available Arterial blood gas (ABG) sampling is a painful procedure with no perfect technique for quelling the discomfort. An ideal local anesthesia should be rapid, easy to learn, inexpensive, and noninvasive. This study aimed to compare pain levels from ABG sampling performed with vapocoolant spray versus placebo. We hypothesized that pretreatment with the vapocoolant would reduce the pain of arterial puncture by at least 1 point on a 10-point verbal numeric scale. We evaluated the effectiveness of a vapocoolant spray in achieving satisfactory pain control in patients undergoing ABG sampling in this randomized placebo-controlled trial. Eighty patients were randomized to 2 groups: group A, who received vapocoolant spray, and group B, who received water spray as placebo (control group). Puncture and spray-application pain were assessed with a numerical rating scale (0, the absence of pain; 10, greatest imaginable pain) and the number of attempts was recorded. The pain score during ABG sampling was not significantly lower in group A compared with group B (4.78±1.761 vs. 4.90±1.837; P = 0.945). This study also showed that, while the spray exerts more application pain, the number of attempts required for ABG sampling was not significantly lower in group A compared with group B (1.38±0.54 vs. 1.53±0.68; P = 0.372). Vapocoolant spray was not effective in ABG pain reduction, had milder application pain compared to placebo (P < 0.05), but did not reduce sampling attempts. At present, this spray cannot be recommended for arterial puncture anesthesia, and further study on different timing is necessary.

  8. An improved sampling method of complex network

    Science.gov (United States)

    Gao, Qi; Ding, Xintong; Pan, Feng; Li, Weixing

    2014-12-01

    Sampling subnets is an important topic in complex network research. The sampling method influences the structure and characteristics of the resulting subnet. Random multiple snowball with Cohen (RMSC) process sampling, which combines the advantages of random sampling and snowball sampling, is proposed in this paper. It has the ability to explore global information and discover the local structure at the same time. The experiments indicate that this novel sampling method can preserve the similarity between the sampled subnet and the original network with respect to degree distribution, connectivity rate and average shortest path. This method is applicable to situations where prior knowledge about the degree distribution of the original network is not sufficient.

  9. A Survey of Procedural Methods for Terrain Modelling

    NARCIS (Netherlands)

    Smelik, R.M.; Kraker, J.K. de; Groenewegen, S.A.; Tutenel, T.; Bidarra, R.

    2009-01-01

    Procedural methods are a promising but underused alternative to manual content creation. Commonly heard drawbacks are the randomness of and the lack of control over the output and the absence of integrated solutions, although more recent publications increasingly address these issues. This paper

  10. Grid - a fast threshold tracking procedure

    DEFF Research Database (Denmark)

    Fereczkowski, Michal; Dau, Torsten; MacDonald, Ewen

    2016-01-01

    A new procedure, called “grid”, is evaluated that allows rapid acquisition of threshold curves for psychophysics and, in particular, psychoacoustic experiments. In this method, the parameter-response space is sampled in two dimensions within a single run. This allows the procedure to focus more e

  11. Sampling Polya-Gamma random variates: alternate and approximate techniques

    OpenAIRE

    Windle, Jesse; Polson, Nicholas G.; Scott, James G.

    2014-01-01

    Efficiently sampling from the Pólya-Gamma distribution, PG(b, z), is an essential element of Pólya-Gamma data augmentation. Polson et al. (2013) show how to efficiently sample from the PG(1, z) distribution. We build two new samplers that offer improved performance when sampling from the PG(b, z) distribution and b is not unity.
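
    The exact samplers referenced above are not reproduced here; the sketch below only draws approximate PG(b, c) variates by truncating the distribution's sum-of-gammas representation, omega = (1/(2*pi^2)) * sum_k g_k / ((k - 1/2)^2 + c^2/(4*pi^2)) with g_k ~ Gamma(b, 1). The truncation level and the check against the known mean b/(2c)*tanh(c/2) are illustrative choices.

```python
import numpy as np

def rpg_approx(b, c, size=1, n_terms=200, rng=None):
    """Approximate Polya-Gamma PG(b, c) draws via a truncated sum-of-gammas series.
    Accuracy improves as n_terms grows; this is an approximation, not an exact sampler."""
    rng = rng or np.random.default_rng()
    k = np.arange(1, n_terms + 1)
    denom = (k - 0.5) ** 2 + c ** 2 / (4.0 * np.pi ** 2)          # shape (n_terms,)
    g = rng.gamma(shape=b, scale=1.0, size=(size, n_terms))       # g_k ~ Gamma(b, 1)
    return (g / denom).sum(axis=1) / (2.0 * np.pi ** 2)

rng = np.random.default_rng(5)
draws = rpg_approx(b=1.0, c=2.0, size=50_000, rng=rng)
print("sample mean      :", draws.mean())
print("theoretical mean :", (1.0 / (2.0 * 2.0)) * np.tanh(2.0 / 2.0))   # b/(2c) * tanh(c/2)
```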

  12. Random Photon Absorption Model Elucidates How Early Gain Control in Fly Photoreceptors Arises from Quantal Sampling

    Science.gov (United States)

    Song, Zhuoyi; Zhou, Yu; Juusola, Mikko

    2016-01-01

    Many diurnal photoreceptors encode vast real-world light changes effectively, but how this performance originates from photon sampling is unclear. A 4-module biophysically-realistic fly photoreceptor model, in which information capture is limited by the number of its sampling units (microvilli) and their photon-hit recovery time (refractoriness), can accurately simulate real recordings and their information content. However, sublinear summation in quantum bump production (quantum-gain-nonlinearity) may also cause adaptation by reducing the bump/photon gain when multiple photons hit the same microvillus simultaneously. Here, we use a Random Photon Absorption Model (RandPAM), which is the 1st module of the 4-module fly photoreceptor model, to quantify the contribution of quantum-gain-nonlinearity in light adaptation. We show how quantum-gain-nonlinearity already results from photon sampling alone. In the extreme case, when two or more simultaneous photon-hits reduce to a single sublinear value, quantum-gain-nonlinearity is preset before the phototransduction reactions adapt the quantum bump waveform. However, the contribution of quantum-gain-nonlinearity in light adaptation depends upon the likelihood of multi-photon-hits, which is strictly determined by the number of microvilli and light intensity. Specifically, its contribution to light-adaptation is marginal (≤ 1%) in fly photoreceptors with many thousands of microvilli, because the probability of simultaneous multi-photon-hits on any one microvillus is low even during daylight conditions. However, in cells with fewer sampling units, the impact of quantum-gain-nonlinearity increases with brightening light. PMID:27445779
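
    A hedged toy simulation of the sampling effect described here (not the published RandPAM code): distribute photon arrivals uniformly at random over N microvilli and count how often a microvillus receives more than one photon in a sampling window, which is where sublinear (quantum-gain) summation can occur. The window size and counts below are illustrative assumptions.

```python
import numpy as np

def multi_hit_fraction(n_microvilli, photons_per_window, n_windows=2000, rng=None):
    """Fraction of absorbed photons that land on an already-hit microvillus
    within one sampling window.  Toy illustration of the sampling statistics
    only: refractoriness and bump dynamics of the full model are ignored."""
    rng = rng or np.random.default_rng(1)
    extra = 0
    for _ in range(n_windows):
        # each photon is assigned uniformly at random to one microvillus
        hits = np.bincount(rng.integers(0, n_microvilli, size=photons_per_window),
                           minlength=n_microvilli)
        extra += (hits[hits > 1] - 1).sum()
    return extra / (n_windows * photons_per_window)

# The coincidence rate grows with intensity and shrinks with microvillus count
for photons in (100, 1000, 5000):
    print(photons, round(multi_hit_fraction(30_000, photons), 4))
```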

  13. Critical evaluation of sample pretreatment techniques.

    Science.gov (United States)

    Hyötyläinen, Tuulia

    2009-06-01

    Sample preparation before chromatographic separation is the most time-consuming and error-prone part of the analytical procedure. Therefore, selecting and optimizing an appropriate sample preparation scheme is a key factor in the final success of the analysis, and the judicious choice of an appropriate procedure greatly influences the reliability and accuracy of a given analysis. The main objective of this review is to critically evaluate the applicability, disadvantages, and advantages of various sample preparation techniques. Particular emphasis is placed on extraction techniques suitable for both liquid and solid samples.

  14. Health indicators: eliminating bias from convenience sampling estimators.

    Science.gov (United States)

    Hedt, Bethany L; Pagano, Marcello

    2011-02-28

    Public health practitioners are often called upon to make inferences about a health indicator for a population at large when the only available information is data gathered from a convenience sample, such as data on visitors to a clinic. These data may be of the highest quality and quite extensive, but the biases inherent in a convenience sample preclude the legitimate use of the powerful inferential tools usually associated with a random sample. In general, we know nothing about those who do not visit the clinic beyond the fact that they do not visit the clinic. An alternative is to take a random sample of the population. However, we show that this solution would be wasteful if it excluded the use of available information. Hence, we present a simple annealing methodology that combines a relatively small, and presumably far less expensive, random sample with the convenience sample. This allows us not only to take advantage of powerful inferential tools, but also to obtain more accurate information than would be available from the random sample alone. Copyright © 2011 John Wiley & Sons, Ltd.
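
    The authors' annealing estimator is not detailed in this record. As a loose illustration of the general idea of anchoring a large convenience sample with a small probability sample, one simple post-stratified combination might look like the sketch below; the helper names, weights, and numbers are assumptions, not the published method.

```python
import numpy as np

def combined_prevalence(clinic_positive, clinic_n, random_outcome, attends_clinic):
    """Combine a clinic (convenience) sample with a small random community sample.

    Illustrative post-stratification: the random sample estimates the share of
    the population attending the clinic and the prevalence among non-attenders;
    the convenience sample supplies a precise estimate among attenders.
    This is *not* the annealing estimator of Hedt & Pagano (2011).
    """
    random_outcome = np.asarray(random_outcome, dtype=float)   # 0/1 indicator
    attends_clinic = np.asarray(attends_clinic, dtype=bool)

    p_attend = attends_clinic.mean()                           # stratum weight
    prev_attenders = clinic_positive / clinic_n                # from convenience data
    non = random_outcome[~attends_clinic]
    prev_non = non.mean() if non.size else 0.0                 # from random sample
    return p_attend * prev_attenders + (1 - p_attend) * prev_non

# Hypothetical numbers: 120/400 positive at the clinic; random sample of 60 people
rng = np.random.default_rng(3)
attends = rng.random(60) < 0.35
outcome = (rng.random(60) < np.where(attends, 0.30, 0.15)).astype(int)
print(round(combined_prevalence(120, 400, outcome, attends), 3))
```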

  15. Intravenous paracetamol for relief of pain during transrectal-ultrasound-guided biopsy of the prostate: A prospective, randomized, double-blind, placebo-controlled study

    Directory of Open Access Journals (Sweden)

    Ozcan Kilic

    2015-11-01

    Transrectal-ultrasound-guided prostate biopsy (TRUS-PBx) is the standard procedure for diagnosing prostate cancer. The procedure causes some pain and discomfort; therefore, adequate analgesia is necessary to ensure patient comfort, which can also facilitate good-quality results. This prospective, randomized, double-blind, placebo-controlled study aimed to determine whether intravenous (IV) paracetamol can reduce the severity of pain associated with TRUS-PBx. The study included 104 patients, scheduled to undergo TRUS-PBx for suspected prostate cancer, who were prospectively randomized to receive either IV paracetamol (paracetamol group) or placebo (placebo group) 30 minutes prior to TRUS-PBx. All patients had 12 standardized biopsy samples taken. Pain was measured using a 10-point visual analog pain scale during probe insertion, during the biopsy procedure, and 1 hour post-biopsy. All biopsies were performed by the same urologist, whereas a different urologist administered the visual analog pain scale. There were no significant differences in age, prostate-specific antigen level, or prostate volume between the two groups. The pain scores were significantly lower during probe insertion, during the biopsy procedure, and 1 hour post-biopsy in the paracetamol group than in the placebo group. In conclusion, the IV administration of paracetamol significantly reduced the severity of pain associated with TRUS-PBx.

  16. Computed tomography of the brain, hepatotoxic drugs and high alcohol consumption in male alcoholic patients and a random sample from the general male population

    Energy Technology Data Exchange (ETDEWEB)

    Muetzell, S. (Univ. Hospital of Uppsala (Sweden). Dept. of Family Medicine)

    1992-01-01

    Computed tomography (CT) of the brain was performed in a random sample of 195 men and in 211 male alcoholic patients admitted for the first time during a period of two years from the same geographically limited area of Greater Stockholm as the sample. Laboratory tests were performed, including liver and pancreatic tests. Toxicological screening was performed, and the consumption of hepatotoxic drugs was also investigated. The groups were then subdivided with respect to alcohol consumption and use of hepatotoxic drugs: group IA, men from the random sample with low or moderate alcohol consumption and no use of hepatotoxic drugs; IB, men from the random sample with low or moderate alcohol consumption and use of hepatotoxic drugs; IIA, alcoholic inpatients with use of alcohol and no drugs; and IIB, alcoholic inpatients with use of alcohol and drugs. Group IIB was found to have a higher incidence of cortical and subcortical changes than group IA. Group IB had a higher incidence of subcortical changes than group IA, and the two groups differed only in drug use. Groups IIB and IIA also differed only in drug use, and IIB had a higher incidence of brain damage except for anterior horn index and wide cerebellar sulci indicating vermian atrophy. Significantly higher serum levels of bilirubin, GGT, ASAT, ALAT, CK, LD, and amylase were found in IIB. The results indicate that drug use influences the incidence of cortical and subcortical aberrations, except for the anterior horn index. It is concluded that the groups with alcohol abuse who used hepatotoxic drugs showed a picture of cortical changes (wide transport sulci and clear-cut high-grade cortical changes) and also of subcortical aberrations, expressed as an increased widening of the third ventricle.

  17. Computed tomography of the brain, hepatotoxic drugs and high alcohol consumption in male alcoholic patients and a random sample from the general male population

    International Nuclear Information System (INIS)

    Muetzell, S.

    1992-01-01

    Computed tomography (CT) of the brain was performed in a random sample of 195 men and in 211 male alcoholic patients admitted for the first time during a period of two years from the same geographically limited area of Greater Stockholm as the sample. Laboratory tests were performed, including liver and pancreatic tests. Toxicological screening was performed, and the consumption of hepatotoxic drugs was also investigated. The groups were then subdivided with respect to alcohol consumption and use of hepatotoxic drugs: group IA, men from the random sample with low or moderate alcohol consumption and no use of hepatotoxic drugs; IB, men from the random sample with low or moderate alcohol consumption and use of hepatotoxic drugs; IIA, alcoholic inpatients with use of alcohol and no drugs; and IIB, alcoholic inpatients with use of alcohol and drugs. Group IIB was found to have a higher incidence of cortical and subcortical changes than group IA. Group IB had a higher incidence of subcortical changes than group IA, and the two groups differed only in drug use. Groups IIB and IIA also differed only in drug use, and IIB had a higher incidence of brain damage except for anterior horn index and wide cerebellar sulci indicating vermian atrophy. Significantly higher serum levels of bilirubin, GGT, ASAT, ALAT, CK, LD, and amylase were found in IIB. The results indicate that drug use influences the incidence of cortical and subcortical aberrations, except for the anterior horn index. It is concluded that the groups with alcohol abuse who used hepatotoxic drugs showed a picture of cortical changes (wide transport sulci and clear-cut high-grade cortical changes) and also of subcortical aberrations, expressed as an increased widening of the third ventricle.

  18. A new tool for the evaluation of the analytical procedure: Green Analytical Procedure Index.

    Science.gov (United States)

    Płotka-Wasylka, J

    2018-05-01

    A new means for assessing analytical protocols with respect to green analytical chemistry attributes has been developed. The new tool, called GAPI (Green Analytical Procedure Index), evaluates the green character of an entire analytical methodology, from sample collection to final determination, and was created using such tools as the National Environmental Methods Index (NEMI) and the Analytical Eco-Scale to provide not only general but also qualitative information. In GAPI, a specific symbol with five pentagrams is used to evaluate and quantify the environmental impact involved in each step of an analytical methodology, colour-coded from green through yellow to red to depict low, medium, and high impact, respectively. The proposed tool was used to evaluate analytical procedures applied in the determination of biogenic amines in wine samples and in polycyclic aromatic hydrocarbon determination by EPA methods. The GAPI tool not only provides an immediately perceptible overview to the user/reader but also offers exhaustive information on the evaluated procedures. Copyright © 2018 Elsevier B.V. All rights reserved.

  19. Soil sampling

    International Nuclear Information System (INIS)

    Fortunati, G.U.; Banfi, C.; Pasturenzi, M.

    1994-01-01

    This study surveys the problems associated with techniques and strategies of soil sampling. Keeping in mind the well-defined objectives of a sampling campaign, the aim was to highlight the importance of sample representativeness as a function of the available resources. Particular emphasis is given to the techniques, especially to a description of the many types of samplers in use. The procedures and techniques employed during the investigations following the Seveso accident are described. (orig.)

  20. Toward cost-efficient sampling methods

    Science.gov (United States)

    Luo, Peng; Li, Yongli; Wu, Chong; Zhang, Guijie

    2015-09-01

    Sampling methods have received much attention in the field of complex networks in general and in statistical physics in particular. This paper proposes two new sampling methods based on the idea that a small fraction of high-degree vertices can carry most of the structural information of a complex network. The two proposed sampling methods are efficient at sampling high-degree nodes, so they remain useful even when the sampling rate is low, which makes them cost-efficient. The first new sampling method builds on the widely used stratified random sampling (SRS) method and the second improves the well-known snowball sampling (SBS) method. To demonstrate the validity and accuracy of the two new sampling methods, we compare them with existing sampling methods on three commonly used synthetic networks (a scale-free network, a random network, and a small-world network) and on two real networks. The experimental results illustrate that the two proposed sampling methods perform much better than existing sampling methods in recovering the true network structure characteristics reflected by the clustering coefficient, Bonacich centrality, and average path length, especially when the sampling rate is low. A rough sketch of the degree-focused idea appears below.
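
    The two methods are only described at a high level in this record. A hedged sketch of the underlying intuition (spending more of the sampling budget on high-degree vertices) is given below as a degree-stratified draw; it is an illustrative stand-in, not the authors' SRS- or SBS-based algorithms, and all parameter names are assumptions.

```python
import random
import networkx as nx

def degree_stratified_sample(G, rate=0.1, top_frac=0.2, top_share=0.6, rng=None):
    """Draw a node sample that over-represents high-degree vertices.

    Nodes are split into a 'hub' stratum (top_frac by degree) and a 'tail'
    stratum; top_share of the sampling budget goes to the hubs.
    Illustrative only.
    """
    rng = rng or random.Random(0)
    budget = max(1, int(rate * G.number_of_nodes()))
    nodes = sorted(G.nodes(), key=G.degree, reverse=True)
    cut = max(1, int(top_frac * len(nodes)))
    hubs, tail = nodes[:cut], nodes[cut:]
    n_hub = min(len(hubs), int(top_share * budget))
    n_tail = min(len(tail), budget - n_hub)
    sample = rng.sample(hubs, n_hub) + rng.sample(tail, n_tail)
    return G.subgraph(sample).copy()

G = nx.barabasi_albert_graph(2000, 3, seed=2)
sub = degree_stratified_sample(G, rate=0.05)
print(sub.number_of_nodes(), nx.average_clustering(sub))
```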

  1. Statistical sampling method for releasing decontaminated vehicles

    International Nuclear Information System (INIS)

    Lively, J.W.; Ware, J.A.

    1996-01-01

    Earth moving vehicles (e.g., dump trucks, belly dumps) commonly haul radiologically contaminated materials from a site being remediated to a disposal site. Traditionally, each vehicle must be surveyed before being released. The logistical difficulties of implementing the traditional approach on a large scale demand that an alternative be devised. A statistical method (MIL-STD-105E, "Sampling Procedures and Tables for Inspection by Attributes") for assessing product quality from a continuous process was adapted to the vehicle decontamination process. This method produced a sampling scheme that automatically compensates for and accommodates fluctuating batch sizes and changing conditions without the need to modify or rectify the sampling scheme in the field. Vehicles are randomly selected (sampled) upon completion of the decontamination process to be surveyed for residual radioactive surface contamination. The frequency of sampling is based on the expected number of vehicles passing through the decontamination process in a given period and the confidence level desired. This process has been successfully used for 1 year at the former uranium mill site in Monticello, Utah (a CERCLA regulated clean-up site). The method forces improvement in the quality of the decontamination process and results in a lower likelihood that vehicles exceeding the surface contamination standards are offered for survey. Implementation of this statistical sampling method on Monticello Projects has resulted in more efficient processing of vehicles through decontamination and radiological release, saved hundreds of hours of processing time, provided a high level of confidence that release limits are met, and improved the radiological cleanliness of vehicles leaving the controlled site.
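
    The MIL-STD-105E tables themselves are not reproduced in this record. As a rough illustration of how survey frequency can be tied to a desired confidence level, a zero-failure ("accept on zero") plan gives the familiar sample-size rule below; this is generic attribute acceptance-sampling arithmetic, not the Monticello scheme.

```python
import math

def zero_acceptance_sample_size(defect_rate, confidence):
    """Smallest n such that a sample with zero non-conforming items rules out,
    at the given confidence, a true non-conformance rate of defect_rate.

    Derivation: require (1 - p)^n <= 1 - c, so n >= ln(1 - c) / ln(1 - p).
    Generic acceptance-sampling calculation, not the MIL-STD-105E adaptation
    described above.
    """
    return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - defect_rate))

# e.g. detect a 5% rate of vehicles exceeding release limits with 95% confidence
print(zero_acceptance_sample_size(0.05, 0.95))   # -> 59 vehicles
```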

  2. Radiochemical procedures and techniques

    International Nuclear Information System (INIS)

    Flynn, K.

    1975-04-01

    A summary is presented of the radiochemical procedures and techniques currently in use by the Chemistry Division Nuclear Chemistry Group at Argonne National Laboratory for the analysis of radioactive samples. (U.S.)

  3. EPA perspective on radionuclide aerosol sampling

    Energy Technology Data Exchange (ETDEWEB)

    Karhnak, J.M. [Environmental Protection Agency, Washington, DC (United States)

    1995-02-01

    The Environmental Protection Agency (EPA) is concerned with radionuclide aerosol sampling primarily at Department of Energy (DOE) facilities in order to ensure compliance with national air emission standards, known as NESHAPs. Sampling procedures are specified in "National Emission Standards for Emissions of Radionuclides other than Radon from Department of Energy Sites" (Subpart H). Subpart H also allows alternate procedures to be used if they meet certain requirements. This paper discusses some of the mission differences between EPA and DOE and how these differences are reflected in the decisions that are made. It then describes how the EPA develops standards and considers alternate sampling procedures, and it lists suggestions to speed up the review and acceptance process for alternate procedures. The paper concludes with a discussion of the process for delegating Radionuclide NESHAPs responsibilities to the States, and of the responsibilities that could be retained by EPA.

  4. Investigating causal associations between use of nicotine, alcohol, caffeine and cannabis: a two-sample bidirectional Mendelian randomization study.

    Science.gov (United States)

    Verweij, Karin J H; Treur, Jorien L; Vink, Jacqueline M

    2018-07-01

    Epidemiological studies consistently show co-occurrence of use of different addictive substances. Whether these associations are causal or due to overlapping underlying influences remains an important question in addiction research. Methodological advances have made it possible to use published genetic associations to infer causal relationships between phenotypes. In this exploratory study, we used Mendelian randomization (MR) to examine the causality of well-established associations between nicotine, alcohol, caffeine and cannabis use. Two-sample MR was employed to estimate bidirectional causal effects between four addictive substances: nicotine (smoking initiation and cigarettes smoked per day), caffeine (cups of coffee per day), alcohol (units per week) and cannabis (initiation). Based on existing genome-wide association results we selected genetic variants associated with the exposure measure as an instrument to estimate causal effects. Where possible we applied sensitivity analyses (MR-Egger and weighted median) more robust to horizontal pleiotropy. Most MR tests did not reveal causal associations. There was some weak evidence for a causal positive effect of genetically instrumented alcohol use on smoking initiation and of cigarettes per day on caffeine use, but these were not supported by the sensitivity analyses. There was also some suggestive evidence for a positive effect of alcohol use on caffeine use (only with MR-Egger) and smoking initiation on cannabis initiation (only with weighted median). None of the suggestive causal associations survived corrections for multiple testing. Two-sample Mendelian randomization analyses found little evidence for causal relationships between nicotine, alcohol, caffeine and cannabis use. © 2018 Society for the Study of Addiction.
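
    The MR estimators named here are standard. For readers unfamiliar with them, a minimal sketch of the fixed-effect inverse-variance-weighted (IVW) two-sample MR estimate is given below, using made-up per-SNP summary statistics rather than the GWAS data analysed in the paper.

```python
import numpy as np

def ivw_mr(beta_exposure, beta_outcome, se_outcome):
    """Fixed-effect inverse-variance-weighted two-sample MR estimate.

    Each SNP's Wald ratio beta_outcome / beta_exposure is combined with
    weights proportional to beta_exposure^2 / se_outcome^2 (equivalent to a
    weighted regression of outcome effects on exposure effects through the
    origin)."""
    bx = np.asarray(beta_exposure, float)
    by = np.asarray(beta_outcome, float)
    w = bx ** 2 / np.asarray(se_outcome, float) ** 2
    estimate = np.sum(w * (by / bx)) / np.sum(w)
    se = np.sqrt(1.0 / np.sum(w))
    return estimate, se

# Hypothetical per-SNP summary statistics (illustrative, not from the cited GWAS)
bx = [0.08, 0.05, 0.11, 0.07]
by = [0.020, 0.012, 0.031, 0.015]
se = [0.010, 0.011, 0.012, 0.009]
est, se_est = ivw_mr(bx, by, se)
print(f"IVW estimate = {est:.3f} (SE {se_est:.3f})")
```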

  5. Green approaches in sample preparation of bioanalytical samples prior to chromatographic analysis.

    Science.gov (United States)

    Filippou, Olga; Bitas, Dimitrios; Samanidou, Victoria

    2017-02-01

    Sample preparation is considered to be the most challenging step of the analytical procedure, since it affects the whole analytical methodology; it therefore contributes significantly to the greenness, or lack thereof, of the entire process. Eliminating sample treatment steps, reducing the amount of sample required, strongly reducing the consumption of hazardous reagents and energy, maximizing safety for operators and the environment, and avoiding large volumes of organic solvents form the basis for greening sample preparation and analytical methods. In the last decade, the development and use of greener, sustainable microextraction techniques has provided an alternative to classical sample preparation procedures. In this review, the main green microextraction techniques (solid-phase microextraction, stir bar sorptive extraction, hollow-fiber liquid-phase microextraction, dispersive liquid-liquid microextraction, etc.) are presented, with special attention to bioanalytical applications of these environment-friendly sample preparation techniques that comply with the green analytical chemistry principles. Copyright © 2016 Elsevier B.V. All rights reserved.

  6. Bayesian analysis for exponential random graph models using the adaptive exchange sampler

    KAUST Repository

    Jin, Ick Hoon

    2013-01-01

    Exponential random graph models have been widely used in social network analysis. However, these models are extremely difficult to handle from a statistical viewpoint, because of the existence of intractable normalizing constants. In this paper, we consider a fully Bayesian analysis for exponential random graph models using the adaptive exchange sampler, which solves the issue of intractable normalizing constants encountered in Markov chain Monte Carlo (MCMC) simulations. The adaptive exchange sampler can be viewed as a MCMC extension of the exchange algorithm, and it generates auxiliary networks via an importance sampling procedure from an auxiliary Markov chain running in parallel. The convergence of this algorithm is established under mild conditions. The adaptive exchange sampler is illustrated using a few social networks, including the Florentine business network, molecule synthetic network, and dolphins network. The results indicate that the adaptive exchange algorithm can produce more accurate estimates than approximate exchange algorithms, while maintaining the same computational efficiency.
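
    For orientation, a bare-bones version of the basic (non-adaptive) exchange acceptance step for a model with unnormalized density q(x | θ) = exp(θᵀ s(x)) is sketched below; the adaptive sampler in the paper additionally reuses auxiliary networks from a parallel chain, which is not shown, and the toy Bernoulli-graph example stands in for a proper ERGM sampler.

```python
import numpy as np

def exchange_step(theta, x_obs, suff_stats, sample_from_model, proposal_sd=0.1, rng=None):
    """One exchange-algorithm update for a model q(x|theta) = exp(theta . s(x))
    with an intractable normalizing constant.  The auxiliary draw
    x' ~ q(.|theta') makes the unknown constants cancel in the acceptance
    ratio.  Sketch of the basic (non-adaptive) exchange move only."""
    rng = rng or np.random.default_rng()
    theta_prop = theta + proposal_sd * rng.standard_normal(theta.shape)
    x_aux = sample_from_model(theta_prop, rng)              # auxiliary network
    s_obs, s_aux = suff_stats(x_obs), suff_stats(x_aux)
    log_alpha = (theta_prop - theta) @ (s_obs - s_aux)      # flat prior, symmetric proposal
    return theta_prop if np.log(rng.uniform()) < log_alpha else theta

# Toy usage: Bernoulli random graph where s(x) is the edge count, so exact
# sampling from the model is possible (a stand-in for an ERGM Gibbs sampler).
def suff_stats(x):
    return np.array([x.sum() / 2.0])          # undirected edge count

def sample_from_model(theta, rng, n=20):
    p = 1.0 / (1.0 + np.exp(-theta[0]))
    upper = np.triu((rng.random((n, n)) < p).astype(float), 1)
    return upper + upper.T

rng = np.random.default_rng(4)
x_obs = sample_from_model(np.array([-1.0]), rng)
theta = np.array([0.0])
for _ in range(2000):
    theta = exchange_step(theta, x_obs, suff_stats, sample_from_model, rng=rng)
print(theta)   # should wander near the generating value of about -1
```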

  7. Reduced infant response to a routine care procedure after sucrose analgesia.

    Science.gov (United States)

    Taddio, Anna; Shah, Vibhuti; Katz, Joel

    2009-03-01

    Sucrose has analgesic and calming effects in newborns. To date, it is not known whether these beneficial effects extend to caregiving procedures that are performed after painful procedures. Our objective was to determine the effect of sucrose analgesia for procedural pain on infant pain responses during a subsequent caregiving procedure. We conducted a double-blind, randomized, controlled trial. Healthy neonates within 2 strata (normal infants and infants of diabetic mothers) were randomly assigned to a sucrose or placebo water group before all needle procedures after birth. Pain response during a diaper change performed after venipuncture for the newborn screening test was determined by using a validated multidimensional measure, the Premature Infant Pain Profile. The study was conducted between September 15, 2003, and July 27, 2004. Altogether, 412 parents were approached; 263 consented. Twenty-three infants were not assigned, leaving 240 for participation (n = 120 per group), with an equal number in each infant stratum. Of those, 186 (78%) completed the study. There were no significant differences in birth characteristics between groups. During the diaper change, sucrose-treated infants had lower pain scores than placebo-treated infants. The relative risk of having pain, defined as a Premature Infant Pain Profile score of ≥6, was 0.64 with sucrose compared with placebo. This study demonstrates that, when used to manage pain, sucrose reduces the pain response to a subsequent routine caregiving procedure. Therefore, the benefits of sucrose analgesia extend beyond the painful event to other aversive and potentially painful procedures.

  8. Electrochemical detection of a powerful estrogenic endocrine disruptor: ethinylestradiol in water samples through bioseparation procedure.

    Science.gov (United States)

    Martínez, Noelia A; Pereira, Sirley V; Bertolino, Franco A; Schneider, Rudolf J; Messina, Germán A; Raba, Julio

    2012-04-20

    The synthetic estrogen ethinylestradiol (EE2) is an active component of oral contraceptives (OCs) and is considered an endocrine disrupting compound (EDC). It is excreted by humans and released via sewage treatment plant effluents into aquatic environments. EDCs are environmental pollutant chemicals that, once incorporated into an organism, affect the hormonal balance of various species, including humans. Their presence in the environment is of growing importance for water quality. This paper describes the development of an accurate, sensitive, and selective method for the capture, preconcentration, and determination of EE2 present in water samples using magnetic particles (MPs) as a bioaffinity support for the capture and preconcentration of EE2, and a glassy carbon electrode modified with multi-walled carbon nanotubes (MWCNTs/GCE) as the detection system. The capture procedure was based on the principle of immunoaffinity, the EE2 being extracted from the sample using anti-EE2 antibodies (anti-EE2 Ab) previously immobilized on the MPs. Subsequently, the analyte was desorbed with a sulfuric acid solution and the EE2 in the preconcentrated solution was determined by square wave voltammetry (SWV). This method can be used to determine EE2 in the range of 0.035-70 ng L⁻¹ with a detection limit (LOD) of 0.01 ng L⁻¹ and R.S.D. levels... Copyright © 2012 Elsevier B.V. All rights reserved.

  9. Developing Water Sampling Standards

    Science.gov (United States)

    Environmental Science and Technology, 1974

    1974-01-01

    Participants in the D-19 symposium on aquatic sampling and measurement for water pollution assessment were informed that determining the extent of waste water stream pollution is not a cut-and-dried procedure. Topics discussed include field sampling, representative sampling from storm sewers, suggested sampler features, and application of improved…

  10. An Attempt to Analyse Baarda's Iterative Data Snooping Procedure ...

    African Journals Online (AJOL)

    ... by a combination of mathematical and empirical work with random numbers. This is now known as an early application of the Monte Carlo simulation. ... Baarda's iterative data snooping procedure as test statistic for outlier identification in the ...

  11. Sample size methodology

    CERN Document Server

    Desu, M M

    2012-01-01

    One of the most important problems in designing an experiment or a survey is sample size determination and this book presents the currently available methodology. It includes both random sampling from standard probability distributions and from finite populations. Also discussed is sample size determination for estimating parameters in a Bayesian setting by considering the posterior distribution of the parameter and specifying the necessary requirements. The determination of the sample size is considered for ranking and selection problems as well as for the design of clinical trials. Appropria

  12. Sampling Methods in Cardiovascular Nursing Research: An Overview.

    Science.gov (United States)

    Kandola, Damanpreet; Banner, Davina; O'Keefe-McCarthy, Sheila; Jassal, Debbie

    2014-01-01

    Cardiovascular nursing research covers a wide array of topics from health services to psychosocial patient experiences. The selection of specific participant samples is an important part of the research design and process. The sampling strategy employed is of utmost importance to ensure that a representative sample of participants is chosen. There are two main categories of sampling methods: probability and non-probability. Probability sampling is the random selection of elements from the population, where each element of the population has a known, nonzero chance of being included in the sample. There are five main types of probability sampling: simple random sampling, systematic sampling, stratified sampling, cluster sampling, and multi-stage sampling. Non-probability sampling methods are those in which elements are chosen through non-random methods for inclusion in the research study and include convenience sampling, purposive sampling, and snowball sampling. Each approach offers distinct advantages and disadvantages and must be considered critically. In this research column, we provide an introduction to these key sampling techniques and draw on examples from cardiovascular research. Understanding the differences in sampling techniques may aid nurses in the effective appraisal of research literature and provide a reference point for nurses who engage in cardiovascular research.
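
    For readers who want to see the probability designs side by side, a small hedged illustration (generic textbook implementations, not tied to this article) of simple random, systematic, and proportionally stratified selection from a participant list:

```python
import random

def simple_random_sample(frame, n, rng):
    # every subset of size n is equally likely
    return rng.sample(frame, n)

def systematic_sample(frame, n, rng):
    step = len(frame) // n                     # sampling interval k = N / n
    start = rng.randrange(step)                # random start within the first interval
    return [frame[start + i * step] for i in range(n)]

def stratified_sample(frame, strata, n, rng):
    """Proportional allocation: sample each stratum in proportion to its size."""
    groups, out = {}, []
    for unit, s in zip(frame, strata):
        groups.setdefault(s, []).append(unit)
    for s, units in groups.items():
        k = max(1, round(n * len(units) / len(frame)))
        out.extend(rng.sample(units, min(k, len(units))))
    return out

rng = random.Random(5)
frame = [f"patient_{i:03d}" for i in range(200)]          # hypothetical sampling frame
strata = ["cardiac" if i < 60 else "general" for i in range(200)]
print(simple_random_sample(frame, 10, rng))
print(systematic_sample(frame, 10, rng))
print(stratified_sample(frame, strata, 10, rng))
```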

  13. Surface Environmental Surveillance Procedures Manual

    Energy Technology Data Exchange (ETDEWEB)

    RW Hanf; TM Poston

    2000-09-20

    Environmental surveillance data are used in assessing the impact of current and past site operations on human health and the environment, demonstrating compliance with applicable local, state, and federal environmental regulations, and verifying the adequacy of containment and effluent controls. SESP sampling schedules are reviewed, revised, and published each calendar year in the Hanford Site Environmental Surveillance Master Sampling Schedule. Environmental samples are collected by SESP staff in accordance with the approved sample collection procedures documented in this manual.

  14. The status of dental caries and related factors in a sample of Iranian adolescents

    DEFF Research Database (Denmark)

    Pakpour, Amir H.; Hidarnia, Alireza; Hajizadeh, Ebrahim

    2011-01-01

    Objective: To describe the status of dental caries in a sample of Iranian adolescents aged 14 to 18 years in Qazvin, and to identify caries-related factors affecting this group. Study design: Qazvin was divided into three zones according to socio-economic status. The sampling procedure used...... was a stratified cluster sampling technique; incorporating 3 stratified zones, for each of which a cluster of school children were recruited from randomly selected high schools. The adolescents agreed to participate in the study and to complete a questionnaire. Dental caries status was assessed in terms of decayed...... their teeth on a regular basis. Although the incidence of caries was found to be moderate, it was influenced by demographic factors such as age and gender in addition to socio-behavioral variables such as family income, the level of education attained by parents, the frequency of dental brushing and flossing...

  15. Sampling designs and methods for estimating fish-impingement losses at cooling-water intakes

    International Nuclear Information System (INIS)

    Murarka, I.P.; Bodeau, D.J.

    1977-01-01

    Several systems for estimating fish impingement at power plant cooling-water intakes are compared to determine the most statistically efficient sampling designs and methods. Compared to a simple random sampling scheme, the stratified systematic random sampling scheme, the systematic random sampling scheme, and the stratified random sampling scheme yield higher efficiencies and better estimators for the parameters in two models of fish impingement as a time-series process. Mathematical results and illustrative examples of the application of the sampling schemes to simulated and real data are given. Some sampling designs applicable to fish-impingement studies are presented in appendixes.
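
    The designs compared in the report are standard survey designs. A hedged sketch of how a stratified systematic schedule of collection times might be generated for a 24-hour impingement survey is shown below; the strata, sample counts, and function names are illustrative assumptions, not the report's actual design.

```python
import random

def stratified_systematic_times(strata_hours, samples_per_stratum, rng=None):
    """Pick collection times: within each time stratum, take evenly spaced
    times after a random start (systematic sampling with a random start).

    strata_hours: dict mapping stratum name -> (start_hour, end_hour).
    Illustrative only; the cited report compares such designs formally."""
    rng = rng or random.Random(7)
    schedule = {}
    for name, (t0, t1) in strata_hours.items():
        interval = (t1 - t0) / samples_per_stratum
        start = t0 + rng.uniform(0, interval)
        schedule[name] = [round(start + i * interval, 2) for i in range(samples_per_stratum)]
    return schedule

# Hours past midnight; the night stratum (18:00-06:00) is written as 18-30
print(stratified_systematic_times({"day (06-18)": (6, 18), "night (18-30)": (18, 30)}, 4))
```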

  16. THE ALFALFA H α SURVEY. I. PROJECT DESCRIPTION AND THE LOCAL STAR FORMATION RATE DENSITY FROM THE FALL SAMPLE

    International Nuclear Information System (INIS)

    Sistine, Angela Van; Salzer, John J.; Janowiecki, Steven; Sugden, Arthur; Giovanelli, Riccardo; Haynes, Martha P.; Jaskot, Anne E.; Wilcots, Eric M.

    2016-01-01

    The ALFALFA Hα survey utilizes a large sample of H I-selected galaxies from the ALFALFA survey to study star formation (SF) in the local universe. ALFALFA Hα contains 1555 galaxies with distances between ∼20 and ∼100 Mpc. We have obtained continuum-subtracted narrowband Hα images and broadband R images for each galaxy, creating one of the largest homogeneous sets of Hα images ever assembled. Our procedures were designed to minimize the uncertainties related to the calculation of the local SF rate density (SFRD). The galaxy sample we constructed is as close to volume-limited as possible, is a robust statistical sample, and spans a wide range of galaxy environments. In this paper, we discuss the properties of our Fall sample of 565 galaxies, our procedure for deriving individual galaxy SF rates, and our method for calculating the local SFRD. We present a preliminary value of log(SFRD [M⊙ yr⁻¹ Mpc⁻³]) = −1.747 ± 0.018 (random) ± 0.05 (systematic) based on the 565 galaxies in our Fall sub-sample. Compared to the weighted average of SFRD values around z ≈ 2, our local value indicates a drop in the global SFRD of a factor of 10.2 over that lookback time.

  17. A randomized study of a method for optimizing adolescent assent to biomedical research.

    Science.gov (United States)

    Annett, Robert D; Brody, Janet L; Scherer, David G; Turner, Charles W; Dalen, Jeanne; Raissy, Hengameh

    2017-01-01

    Voluntary consent/assent with adolescents invited to participate in research raises challenging problems. No studies to date have attempted to manipulate autonomy in relation to assent/consent processes. This study evaluated the effects of an autonomy-enhanced individualized assent/consent procedure embedded within a randomized pediatric asthma clinical trial. Families were randomly assigned to remain together or separated during a consent/assent process; the latter we characterize as an autonomy-enhanced assent/consent procedure. We hypothesized that separating adolescents from their parents would improve adolescent assent by increasing knowledge and appreciation of the clinical trial and willingness to participate. Sixty-four adolescent-parent dyads completed procedures. The together versus separate randomization made no difference in adolescent or parent willingness to participate. However, significant differences were found in both parent and adolescent knowledge of the asthma clinical trial based on the assent/consent procedure and adolescent age. The separate assent/consent procedure improved knowledge of study risks and benefits for older adolescents and their parents but not for the younger youth or their parents. Regardless of the assent/consent process, younger adolescents had lower comprehension of information associated with the study medication and research risks and benefits, but not study procedures or their research rights and privileges. The use of an autonomy-enhanced assent/consent procedure for adolescents may improve their and their parent's informed assent/consent without impacting research participation decisions. Traditional assent/consent procedures may result in a "diffusion of responsibility" effect between parents and older adolescents, specifically in attending to key information associated with study risks and benefits.

  18. Large-volume injection of sample diluents not miscible with the mobile phase as an alternative approach in sample preparation for bioanalysis: an application for fenspiride bioequivalence.

    Science.gov (United States)

    Medvedovici, Andrei; Udrescu, Stefan; Albu, Florin; Tache, Florentin; David, Victor

    2011-09-01

    Liquid-liquid extraction of target compounds from biological matrices followed by the injection of a large volume from the organic layer into the chromatographic column operated under reversed-phase (RP) conditions would successfully combine the selectivity and the straightforward character of the procedure in order to enhance sensitivity, compared with the usual approach of involving solvent evaporation and residue re-dissolution. Large-volume injection of samples in diluents that are not miscible with the mobile phase was recently introduced in chromatographic practice. The risk of random errors produced during the manipulation of samples is also substantially reduced. A bioanalytical method designed for the bioequivalence of fenspiride containing pharmaceutical formulations was based on a sample preparation procedure involving extraction of the target analyte and the internal standard (trimetazidine) from alkalinized plasma samples in 1-octanol. A volume of 75 µl from the octanol layer was directly injected on a Zorbax SB C18 Rapid Resolution, 50 mm length × 4.6 mm internal diameter × 1.8 µm particle size column, with the RP separation being carried out under gradient elution conditions. Detection was made through positive ESI and MS/MS. Aspects related to method development and validation are discussed. The bioanalytical method was successfully applied to assess bioequivalence of a modified release pharmaceutical formulation containing 80 mg fenspiride hydrochloride during two different studies carried out as single-dose administration under fasting and fed conditions (four arms), and multiple doses administration, respectively. The quality attributes assigned to the bioanalytical method, as resulting from its application to the bioequivalence studies, are highlighted and fully demonstrate that sample preparation based on large-volume injection of immiscible diluents has an increased potential for application in bioanalysis.

  19. Experimental phase diagram for random laser spectra

    International Nuclear Information System (INIS)

    El-Dardiry, Ramy G S; Mooiweer, Ronald; Lagendijk, Ad

    2012-01-01

    We systematically study the presence of narrow spectral features in a wide variety of random laser samples. Less gain or stronger scattering are shown to lead to a crossover from spiky to smooth spectra. A decomposition of random laser spectra into a set of Lorentzians provides unprecedented detail in the analysis of random laser spectra. We suggest an interpretation in terms of mode competition that enables an understanding of the observed experimental trends. In this interpretation, smooth random laser spectra are a consequence of competing modes for which the loss and gain are proportional. Spectral spikes are associated with modes that are uncoupled from the mode competition in the bulk of the sample. (paper)

  20. A headspace solid-phase microextraction procedure coupled with gas chromatography-mass spectrometry for the analysis of volatile polycyclic aromatic hydrocarbons in milk samples

    Energy Technology Data Exchange (ETDEWEB)

    Aguinaga, N.; Campillo, N.; Vinas, P.; Hernandez-Cordoba, M. [University of Murcia, Department of Analytical Chemistry, Faculty of Chemistry, Murcia (Spain)

    2008-06-15

    A sensitive and solvent-free method for the determination of ten polycyclic aromatic hydrocarbons, namely naphthalene, acenaphthylene, acenaphthene, fluorene, phenanthrene, anthracene, fluoranthene, pyrene, benzo[a]anthracene and chrysene, with up to four aromatic rings, in milk samples using headspace solid-phase microextraction and gas chromatography-mass spectrometry detection has been developed. A polydimethylsiloxane-divinylbenzene fiber was chosen and used at 75 °C for 60 min. Detection limits ranging from 0.2 to 5 ng L⁻¹ were attained at a signal-to-noise ratio of 3, depending on the compound and the milk sample under analysis. The proposed method was applied to ten different milk samples, and the presence of six of the studied analytes was confirmed in a skimmed milk with vegetable fiber sample. The reliability of the procedure was verified by analyzing two different certified reference materials and by recovery studies. (orig.)