WorldWideScience

Sample records for random sampling technique

  1. Random sampling technique for ultra-fast computations of molecular opacities for exoplanet atmospheres

    Science.gov (United States)

    Min, M.

    2017-10-01

    Context. Opacities of molecules in exoplanet atmospheres rely on increasingly detailed line lists for these molecules. The line lists available today contain, for many species, up to several billion lines. Computation of the spectral line profile created by pressure and temperature broadening, the Voigt profile, for all of these lines is becoming a computational challenge. Aims: We aim to create a method to compute the Voigt profile in a way that automatically focuses the computation time on the strongest lines, while still maintaining the continuum contribution of the large number of weaker lines. Methods: Here, we outline a statistical line sampling technique that samples the Voigt profile quickly and with high accuracy. The number of samples is adjusted to the strength of the line and the local spectral line density. This automatically provides high-accuracy line shapes for strong lines or lines that are spectrally isolated. The line sampling technique automatically preserves the integrated line opacity for all lines, thereby also providing the continuum opacity created by the large number of weak lines at very low computational cost. Results: The line sampling technique is tested for accuracy when computing line spectra and correlated-k tables. Extremely fast computations (~3.5 × 10^5 lines per second per core on a standard current-day desktop computer) with high accuracy (≤1% almost everywhere) are obtained. A detailed recipe on how to perform the computations is given.
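
The core idea above, concentrating Monte Carlo samples on the strongest lines while preserving every line's integrated opacity, can be sketched as follows. This is an illustrative reconstruction, not the paper's recipe: it substitutes a pure Lorentzian (Cauchy) profile for the full Voigt profile because the Cauchy distribution can be sampled directly, and the function name and sample-allocation rule are assumptions.

```python
import numpy as np

def sample_line_opacity(nu_grid, lines, n_base=1000, rng=None):
    """Monte Carlo line-sampling sketch: deposit each line's integrated
    opacity onto a frequency grid via random draws from its profile.
    Stronger lines receive more samples; the integrated opacity of
    every line is preserved exactly by construction."""
    rng = np.random.default_rng() if rng is None else rng
    dnu = nu_grid[1] - nu_grid[0]          # assumes an equidistant grid
    opacity = np.zeros_like(nu_grid)
    s_max = max(s for _, s, _ in lines)    # lines: (center, strength, width)
    for nu0, strength, gamma in lines:
        # allocate samples in proportion to the line strength
        n = max(1, int(n_base * strength / s_max))
        # draw frequencies from a Lorentzian (Cauchy) line profile
        draws = nu0 + gamma * rng.standard_cauchy(n)
        idx = np.clip(np.round((draws - nu_grid[0]) / dnu).astype(int),
                      0, len(nu_grid) - 1)
        # every sample carries an equal share of the line's opacity
        np.add.at(opacity, idx, strength / (n * dnu))
    return opacity
```

Summing `opacity * dnu` recovers the total line strength regardless of how few samples a weak line received, which is the property that keeps the weak-line continuum contribution cheap.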

  2. Generation and Analysis of Constrained Random Sampling Patterns

    DEFF Research Database (Denmark)

    Pierzchlewski, Jacek; Arildsen, Thomas

    2016-01-01

    Random sampling is a technique for signal acquisition which is gaining popularity in practical signal processing systems. Nowadays, event-driven analog-to-digital converters make random sampling feasible in practical applications. A process of random sampling is defined by a sampling pattern, whi...

  3. Sample size estimation and sampling techniques for selecting a representative sample

    Directory of Open Access Journals (Sweden)

    Aamir Omair

    2014-01-01

    Introduction: The purpose of this article is to provide a general understanding of the concepts of sampling as applied to health-related research. Sample Size Estimation: It is important to select a representative sample in quantitative research in order to be able to generalize the results to the target population. The sample should be of the required sample size and must be selected using an appropriate probability sampling technique. There are many hidden biases which can adversely affect the outcome of the study. Important factors to consider for estimating the sample size include the size of the study population, the confidence level, the expected proportion of the outcome variable (for categorical variables) or the standard deviation of the outcome variable (for numerical variables), and the required precision (margin of error) of the study. The greater the precision required, the larger the required sample size. Sampling Techniques: The probability sampling techniques applied in health-related research include simple random sampling, systematic random sampling, stratified random sampling, cluster sampling, and multistage sampling. These are recommended over nonprobability sampling techniques because the results of the study can be generalized to the target population.
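
The sample size reasoning above can be made concrete with the standard formula for estimating a proportion; the z-value, margin, and finite-population correction shown are textbook conventions rather than details taken from the article.

```python
import math

def sample_size_proportion(p, margin, z=1.96, population=None):
    """Required sample size to estimate a proportion p within a given
    margin of error at the confidence level implied by z (1.96 ~ 95%).
    An optional finite-population correction shrinks n when the study
    population is small."""
    n = z ** 2 * p * (1 - p) / margin ** 2
    if population is not None:
        n /= 1 + (n - 1) / population      # finite-population correction
    return math.ceil(n)
```

For the common defaults (p = 0.5, 5% margin, 95% confidence) this returns 385, dropping to 278 when a finite population of 1000 is specified; halving the margin roughly quadruples the required n, illustrating the text's point that greater precision demands a larger sample.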

  4. Improve natural gas sampling techniques

    Energy Technology Data Exchange (ETDEWEB)

    Jiskoot, R.J.J. (Jiskoot Autocontrol, Kent (United Kingdom))

    1994-02-01

    Accurate and reliable sampling systems are imperative when confirming the commercial value of natural gas. Buyers and sellers need accurate hydrocarbon-composition information to conduct fair sales transactions. Because poor sample extraction, preparation or analysis can invalidate a sale, more attention should be directed toward representative sampling. All sampling components must be considered: gas type, line pressure and temperature, equipment maintenance and service needs, etc. The paper discusses gas sampling, design considerations (location, probe type, extraction devices, controllers, and receivers), operating requirements, and system integration.

  5. Random matrix techniques in quantum information theory

    Science.gov (United States)

    Collins, Benoît; Nechita, Ion

    2016-01-01

    The purpose of this review is to present some of the latest developments using random techniques, and in particular random matrix techniques, in quantum information theory. Our review blends a rather exhaustive survey with more detailed examples coming mainly from research projects in which the authors were involved. We focus on two main topics: random quantum states and random quantum channels. We present results related to entropic quantities, entanglement of typical states, entanglement thresholds, the output set of quantum channels, and violations of the minimum output entropy of random channels.

  6. Random matrix techniques in quantum information theory

    Energy Technology Data Exchange (ETDEWEB)

    Collins, Benoît, E-mail: collins@math.kyoto-u.ac.jp [Department of Mathematics, Kyoto University, Kyoto 606-8502 (Japan); Département de Mathématique et Statistique, Université d’Ottawa, 585 King Edward, Ottawa, Ontario K1N6N5 (Canada); CNRS, Lyon (France); Nechita, Ion, E-mail: nechita@irsamc.ups-tlse.fr [Zentrum Mathematik, M5, Technische Universität München, Boltzmannstrasse 3, 85748 Garching (Germany); Laboratoire de Physique Théorique, CNRS, IRSAMC, Université de Toulouse, UPS, F-31062 Toulouse (France)

    2016-01-15

    The purpose of this review is to present some of the latest developments using random techniques, and in particular random matrix techniques, in quantum information theory. Our review blends a rather exhaustive survey with more detailed examples coming mainly from research projects in which the authors were involved. We focus on two main topics: random quantum states and random quantum channels. We present results related to entropic quantities, entanglement of typical states, entanglement thresholds, the output set of quantum channels, and violations of the minimum output entropy of random channels.

  7. k-Means: Random Sampling Procedure

    Indian Academy of Sciences (India)

    k-Means: Random Sampling Procedure. The optimal 1-mean is approximated by the centroid (Inaba et al.): let S be a random sample of size O(1/ε); the centroid of S is a (1+ε)-approximate centroid of P with constant probability.
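
The slide's claim is easy to check numerically. A minimal sketch, assuming the usual sum-of-squared-distances 1-mean cost; the sample size of 40 and the Gaussian test data are arbitrary illustrative choices.

```python
import numpy as np

def one_mean_cost(P, c):
    """Sum of squared distances from the points P to a candidate center c."""
    return float(((P - c) ** 2).sum())

rng = np.random.default_rng(1)
P = rng.normal(size=(10_000, 2))
opt = one_mean_cost(P, P.mean(axis=0))   # the centroid is the optimal 1-mean
# a small uniform random sample stands in for the whole point set
S = P[rng.choice(len(P), size=40, replace=False)]
ratio = one_mean_cost(P, S.mean(axis=0)) / opt
```

Since the centroid minimizes the 1-mean cost, `ratio` is at least 1; in expectation the excess cost is roughly `opt / |S|`, so a sample of size about 1/ε suffices for a (1+ε)-approximation with constant probability, as the slide states.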

  8. Optimum allocation in multivariate stratified random sampling: Stochastic matrix optimisation

    OpenAIRE

    Diaz-Garcia, Jose A.; Ramos-Quiroga, Rogelio

    2011-01-01

    The allocation problem for multivariate stratified random sampling as a problem of stochastic matrix integer mathematical programming is considered. To this end, the asymptotic normality of the sample covariance matrix for each stratum is established. Some alternative approaches are suggested for its solution. An example is solved by applying the proposed techniques.

  9. Spectral Estimation by the Random DEC Technique

    DEFF Research Database (Denmark)

    Brincker, Rune; Jensen, J. Laigaard; Krenk, S.

    This paper contains an empirical study of the accuracy of the Random Dec (RDD) technique. Realizations of the response from a single-degree-of-freedom system loaded by white noise are simulated using an ARMA model. The autocorrelation function is estimated using the RDD technique and the estimated...

  10. K-Median: Random Sampling Procedure

    Indian Academy of Sciences (India)

    K-Median: Random Sampling Procedure. Sample a set of 1/ε + 1 points from P. Let Q = the first 1/ε points and p = the last point. Let T = the average 1-median cost of P and c = the 1-median. Let B1 = B(c, T/2), B2 = B(p, T). Let P' = the points in B1.

  11. Urine sampling techniques in symptomatic primary-care patients

    DEFF Research Database (Denmark)

    Holm, Anne; Aabenhus, Rune

    2016-01-01

    Background: Choice of urine sampling technique in urinary tract infection may impact diagnostic accuracy and thus lead to possible over- or undertreatment. Currently no evidence-based consensus exists regarding the correct sampling technique of urine from women with symptoms of urinary tract infection in primary care. The aim of this study was to determine the accuracy of urine culture from different sampling techniques in symptomatic non-pregnant women in primary care. Methods: A systematic review was conducted by searching Medline and Embase for clinical studies conducted in primary care using... seven studies investigating urine sampling technique in 1062 symptomatic patients in primary care. Mid-stream clean-catch had a positive predictive value of 0.79 to 0.95 and a negative predictive value close to 1 compared to sterile techniques. Two randomized controlled trials found no difference...

  12. Acceptance sampling using judgmental and randomly selected samples

    Energy Technology Data Exchange (ETDEWEB)

    Sego, Landon H.; Shulman, Stanley A.; Anderson, Kevin K.; Wilson, John E.; Pulsipher, Brent A.; Sieber, W. Karl

    2010-09-01

    We present a Bayesian model for acceptance sampling where the population consists of two groups, each with different levels of risk of containing unacceptable items. Expert opinion, or judgment, may be required to distinguish between the high- and low-risk groups. Hence, high-risk items are likely to be identified (and sampled) using expert judgment, while the remaining low-risk items are sampled randomly. We focus on the situation where all observed samples must be acceptable. Consequently, the objective of the statistical inference is to quantify the probability that a large percentage of the unsampled items in the population are also acceptable. We demonstrate that traditional (frequentist) acceptance sampling and simpler Bayesian formulations of the problem are essentially special cases of the proposed model. We explore the properties of the model in detail, and discuss the conditions necessary to ensure that required sample sizes are a non-decreasing function of the population size. The method is applicable to a variety of acceptance sampling problems and, in particular, to environmental sampling where the objective is to demonstrate the safety of reoccupying a remediated facility that has been contaminated with a lethal agent.
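
A minimal sketch of the simpler Bayesian formulation that the authors describe as a special case of their model: a single group, a uniform prior, and all observed samples acceptable. The closed form below follows from the Beta posterior; the function name and prior choice are illustrative assumptions, not the paper's two-group model.

```python
def prob_high_acceptance(n_sampled, q):
    """With a uniform Beta(1,1) prior and n_sampled randomly drawn items
    all observed acceptable, the posterior of the acceptance rate theta
    is Beta(n_sampled + 1, 1), whose CDF is t ** (n_sampled + 1).
    Returns P(theta > q), the posterior probability that the true
    acceptance rate exceeds the threshold q."""
    return 1.0 - q ** (n_sampled + 1)
```

Under this sketch, 58 all-acceptable samples give just over 95% posterior confidence that at least 95% of items are acceptable, echoing the familiar 95/95 acceptance-sampling rule.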

  13. Sequential time interleaved random equivalent sampling for repetitive signal

    Science.gov (United States)

    Zhao, Yijiu; Liu, Jingjing

    2016-12-01

    Compressed sensing (CS) based sampling techniques exhibit many advantages over other existing approaches for sparse signal spectrum sensing; they are also incorporated into non-uniform sampling signal reconstruction to improve efficiency, as in random equivalent sampling (RES). However, in CS-based RES, only one sample of each acquisition is considered in the signal reconstruction stage, which results in more acquisition runs and longer sampling time. In this paper, a sampling sequence is taken in each RES acquisition run, and the corresponding block measurement matrix is constructed using a Whittaker-Shannon interpolation formula. All the block matrices are combined into an equivalent measurement matrix with respect to all sampling sequences. We implemented the proposed approach with a multi-core analog-to-digital converter (ADC) whose cores are time-interleaved. A prototype realization of the proposed CS-based sequential random equivalent sampling method has been developed. It is able to capture an analog waveform at an equivalent sampling rate of 40 GHz while physically sampling at 1 GHz. Experiments indicate that, for a sparse signal, the proposed CS-based sequential random equivalent sampling exhibits high efficiency.

  14. Enhanced sampling techniques in biomolecular simulations.

    Science.gov (United States)

    Spiwok, Vojtech; Sucur, Zoran; Hosek, Petr

    2015-11-01

    Biomolecular simulations are routinely used in biochemistry and molecular biology research; however, they often fail to match expectations of their impact on the pharmaceutical and biotech industries. This is because a vast amount of computer time is required to simulate short episodes from the life of biomolecules. Several approaches have been developed to overcome this obstacle, including application of massively parallel and special-purpose computers or non-conventional hardware. Methodological approaches are represented by coarse-grained models and enhanced sampling techniques. These techniques can show how the studied system behaves on long time scales on the basis of relatively short simulations. This review presents an overview of new simulation approaches, the theory behind enhanced sampling methods and success stories of their applications with a direct impact on biotechnology or drug design. Copyright © 2014 Elsevier Inc. All rights reserved.

  15. A random spatial sampling method in a rural developing nation.

    Science.gov (United States)

    Kondo, Michelle C; Bream, Kent D W; Barg, Frances K; Branas, Charles C

    2014-04-10

    Nonrandom sampling of populations in developing nations has limitations and can inaccurately estimate health phenomena, especially among hard-to-reach populations such as rural residents. However, random sampling of rural populations in developing nations can be challenged by incomplete enumeration of the base population. We describe a stratified random sampling method using geographical information system (GIS) software and global positioning system (GPS) technology for application in a health survey in a rural region of Guatemala, as well as a qualitative study of the enumeration process. This method offers an alternative sampling technique that could reduce opportunities for bias in household selection compared to cluster methods. However, its use is subject to issues surrounding survey preparation, technological limitations and in-the-field household selection. Application of this method in remote areas will raise challenges surrounding the boundary delineation process, use and translation of satellite imagery between GIS and GPS, and household selection at each survey point in varying field conditions. This method favors household selection in denser urban areas and in new residential developments. Random spatial sampling methodology can be used to survey a random sample of the population in a remote region of a developing nation. Although this method should be further validated and compared with more established methods to determine its utility in social survey applications, it shows promise for use in developing nations with resource-challenged environments where detailed geographic and human census data are less available.

  16. GSAMPLE: Stata module to draw a random sample

    OpenAIRE

    Jann, Ben

    2006-01-01

    gsample draws a random sample from the data in memory. Simple random sampling (SRS) is supported, as well as unequal probability sampling (UPS), of which sampling with probabilities proportional to size (PPS) is a special case. Both methods, SRS and UPS/PPS, provide sampling with replacement and sampling without replacement. Furthermore, stratified sampling and cluster sampling are supported.
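
gsample itself is a Stata module, but the SRS and PPS draws it describes can be mimicked in Python. Note that numpy's weighted draw without replacement is only an approximation of strict PPS, so this sketch uses PPS with replacement; the toy size measures are illustrative.

```python
import numpy as np

rng = np.random.default_rng(42)
sizes = np.array([10.0, 20.0, 30.0, 40.0])   # unit "sizes" for PPS

# simple random sampling (SRS) without replacement: equal probabilities
srs = rng.choice(len(sizes), size=2, replace=False)

# probability-proportional-to-size (PPS) sampling with replacement:
# unit i is drawn with probability sizes[i] / sizes.sum()
pps = rng.choice(len(sizes), size=2, replace=True, p=sizes / sizes.sum())
```

Stratified or clustered draws follow the same pattern, applied per stratum or to whole clusters instead of individual units.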

  17. Sample size estimation and sampling techniques for selecting a representative sample

    OpenAIRE

    Aamir Omair

    2014-01-01

    Introduction: The purpose of this article is to provide a general understanding of the concepts of sampling as applied to health-related research. Sample Size Estimation: It is important to select a representative sample in quantitative research in order to be able to generalize the results to the target population. The sample should be of the required sample size and must be selected using an appropriate probability sampling technique. There are many hidden biases which can adversely affect ...

  18. Spur reduction technique for sampling PLLs

    NARCIS (Netherlands)

    Gao, X.; Bahai, A.; Klumperink, Eric A.M.; Nauta, Bram; Bohsali, M.; Djabbari, A.; Socci, G.

    2010-01-01

    Control circuitry and method of controlling a sampling phase locked loop (PLL). By controlling the duty cycle of a sampling control signal, in accordance with the PLL reference and output signals, spurious output signals from the sampling PLL being controlled can be reduced.

  19. Spur reduction technique for sampling PLLs

    NARCIS (Netherlands)

    Gao, X.; Bahai, A.; Klumperink, Eric A.M.; Nauta, Bram; Bohsali, M.; Djabbari, A.; Socci, G.

    2012-01-01

    Control circuitry and method of controlling a sampling phase locked loop (PLL). By controlling the duty cycle of a sampling control signal, in accordance with the PLL reference and output signals, spurious output signals from the sampling PLL being controlled can be reduced.

  20. Spur reduction technique for sampling PLLs

    NARCIS (Netherlands)

    Gao, X.; Bahai, Ahmad; Bohsali, Mounhir; Djabbari, Ali; Klumperink, Eric A.M.; Nauta, Bram; Socci, Gerard

    2013-01-01

    Control circuitry and method of controlling a sampling phase locked loop (PLL). By controlling the duty cycle of a sampling control signal, in accordance with the PLL reference and output signals, spurious output signals from the sampling PLL being controlled can be reduced.

  1. A Family of Estimators of a Sensitive Variable Using Auxiliary Information in Stratified Random Sampling

    Directory of Open Access Journals (Sweden)

    Nadia Mushtaq

    2017-03-01

    In this article, a combined general family of estimators is proposed for estimating the finite population mean of a sensitive variable in stratified random sampling with a non-sensitive auxiliary variable, based on a randomized response technique. Under a stratified random sampling without replacement scheme, expressions for the bias and mean square error (MSE) up to first-order approximation are derived. Theoretical and empirical results through a simulation study show that the proposed class of estimators is more efficient than existing estimators, i.e., the usual stratified random sample mean estimator and the Sousa et al. (2014) ratio and regression estimators of the sensitive variable in stratified sampling.

  2. Techniques for geothermal liquid sampling and analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kindle, C.H.; Woodruff, E.M.

    1981-07-01

    A methodology has been developed that is particularly suited to liquid-dominated resources and adaptable to a variety of situations. It is intended to be a base methodology upon which variations can be made to meet specific needs or situations. The approach consists of recording flow conditions at the time of sampling, a specific insertable probe sampling system, a sample stabilization procedure, commercially available laboratory instruments, and data quality check procedures.

  3. A random spatial sampling method in a rural developing nation

    Science.gov (United States)

    Michelle C. Kondo; Kent D.W. Bream; Frances K. Barg; Charles C. Branas

    2014-01-01

    Nonrandom sampling of populations in developing nations has limitations and can inaccurately estimate health phenomena, especially among hard-to-reach populations such as rural residents. However, random sampling of rural populations in developing nations can be challenged by incomplete enumeration of the base population. We describe a stratified random sampling method...

  4. Evaluation of diesel particulate matter sampling techniques

    CSIR Research Space (South Africa)

    Pretorius, CJ

    2011-09-01

    The study evaluated diesel particulate matter (DPM) sampling methods used in the South African mining industry. The three-piece cassette respirable, open face and stopper sampling methods were compared with the SKC DPM cassette method to find a...

  5. Analysis of a global random stratified sample of nurse legislation.

    Science.gov (United States)

    Benton, D C; Fernández-Fernández, M P; González-Jurado, M A; Beneit-Montesinos, J V

    2015-06-01

    To identify, compare and contrast the major component parts of a heterogeneous stratified sample of nursing legislation. Nursing legislation varies from one jurisdiction to another. Until now, no research has examined whether the variations in such legislation are random or related to a set of key attributes. This mixed-methods study used a random stratified sample of legislation to map through documentary analysis the content of 14 nursing acts and then explored, using quantitative techniques, whether the material relates to a number of key attributes. These attributes include: legal tradition of the jurisdiction; model of regulation; administrative approach; area of the world; and the economic status of the jurisdiction. Twelve component parts of nursing legislation were identified. These were remarkably similar irrespective of the attributes of interest. However, not all component parts were specified in the same level of detail and the manner by which the elements were addressed did vary. A number of potential relationships between the structure of the legislation and the key attributes of interest were identified. This study generated a comprehensive and integrated map of a global sample of nursing legislation. It provides a set of descriptors to be used to undertake further quantitative work and provides an important policy tool to facilitate dialogue between regulatory bodies. At the individual nurse level it offers insights that can help nurses pursue recognition of credentials across jurisdictions. © 2015 International Council of Nurses.

  6. Power Spectrum Estimation of Randomly Sampled Signals

    DEFF Research Database (Denmark)

    Velte, Clara M.; Buchhave, Preben; K. George, William

    2014-01-01

    ...of alternative methods attempting to produce correct power spectra have been invented and tested. The objective of the current study is to create a simple computer-generated signal for baseline testing of residence time weighting and some of the most commonly proposed algorithms (or algorithms which most modern algorithms ultimately are based on), sample-and-hold and the direct spectral estimator without residence time weighting, and to compare how they perform in relation to power spectra based on the equidistantly sampled reference signal. The computer-generated signal is a Poisson process with a sample rate... Sample-and-hold and the free-running processor perform well only under particular circumstances with high data rate and low inherent bias, respectively, while residence time weighting provides non-biased estimates regardless of setting. The free-running processor was also tested and compared to residence time weighting using actual LDA measurements in a turbulent round jet. Power spectra from...
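
The sample-and-hold estimator tested above can be sketched in a few lines: the randomly timed samples are mapped onto an equidistant grid by holding the most recent observed value. The function name and toy data are illustrative; spectra would then be computed from the resampled equidistant signal.

```python
import numpy as np

def sample_and_hold(t_samples, x_samples, t_grid):
    """Sample-and-hold resampling: for each grid time, hold the value of
    the most recent random-time sample. Assumes t_samples is sorted and
    t_grid starts at or after t_samples[0]."""
    idx = np.searchsorted(t_samples, t_grid, side="right") - 1
    idx = np.clip(idx, 0, len(x_samples) - 1)
    return x_samples[idx]
```

The hold operation acts as a low-pass filter, which is the origin of the spectral distortion (the f^(-2) filtering effect at low data density) that residence time weighting avoids.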

  7. Random constraint sampling and duality for convex optimization

    OpenAIRE

    Haskell, William B.; Pengqian, Yu

    2016-01-01

    We are interested in solving convex optimization problems with large numbers of constraints. Randomized algorithms, such as random constraint sampling, have been very successful in giving nearly optimal solutions to such problems. In this paper, we combine random constraint sampling with the classical primal-dual algorithm for convex optimization problems with large numbers of constraints, and we give a convergence rate analysis. We then report numerical experiments that verify the effectiven...

  8. Random number datasets generated from statistical analysis of randomly sampled GSM recharge cards.

    Science.gov (United States)

    Okagbue, Hilary I; Opanuga, Abiodun A; Oguntunde, Pelumi E; Ugwoke, Paulinus O

    2017-02-01

    In this article, random number datasets were generated from random samples of used GSM (Global System for Mobile Communications) recharge cards. Statistical analyses were performed to refine the raw data into random number datasets arranged in tables. A detailed description of the method and relevant tests of randomness are also discussed.
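
One standard randomness check of the kind mentioned above is Pearson's chi-square test of digit uniformity; a minimal sketch (the specific test battery used in the article is not reproduced here, and the critical value cited is the usual 5% level for 9 degrees of freedom).

```python
import numpy as np

def chi_square_uniformity(digits):
    """Pearson chi-square statistic testing decimal digits 0-9 for
    uniformity. Compare the result against the 5% critical value for
    9 degrees of freedom, roughly 16.92: larger values suggest the
    digits are not uniformly random."""
    counts = np.bincount(np.asarray(digits), minlength=10)
    expected = len(digits) / 10.0
    return float(((counts - expected) ** 2 / expected).sum())
```

A perfectly balanced digit sequence scores 0, while a degenerate all-zeros sequence of length 100 scores 900, far above the critical value.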

  9. A comparative study of sampling techniques for monitoring carcass contamination

    NARCIS (Netherlands)

    Snijders, J.M.A.; Janssen, M.H.W.; Gerats, G.E.; Corstiaensen, G.P.

    1984-01-01

    Four bacteriological sampling techniques, i.e. the excision, double-swab, agar contact and modified agar contact techniques, were compared by sampling pig carcasses before and after chilling. As well as assessing the advantages and disadvantages of the techniques, particular attention was paid to...

  10. Power Spectrum Estimation of Randomly Sampled Signals

    DEFF Research Database (Denmark)

    Velte, C. M.; Buchhave, P.; K. George, W.

    ...sine waves. The primary signal and the corresponding power spectrum are shown in Figure 1. The conventional spectrum shows multiple erroneous mixing frequencies and the peak values are too low. The residence time weighted spectrum is correct. The sample-and-hold spectrum has lower power than the correct spectrum, and the f^(-2)-filtering effect appearing for low data densities is evident (Adrian and Yao 1987). The remaining tests also show that sample-and-hold and the free-running processor perform well only under very particular circumstances with high data rate and low inherent bias, respectively. Residence time weighting provides non-biased estimates regardless of setting. The free-running processor was also tested and compared to residence time weighting using actual LDA measurements in a turbulent round jet. Power spectra from measurements on the jet centerline and the outer part of the jet...

  11. Estimation of Correlation Functions by the Random Decrement Technique

    DEFF Research Database (Denmark)

    Brincker, Rune; Krenk, S.; Jensen, Jakob Laigaard

    1993-01-01

    The Random Decrement (RDD) Technique is a versatile technique for characterization of random signals in the time domain. In this paper a short review of the theoretical basis is given, and the technique is illustrated by estimating auto-correlation functions and cross-correlation functions on modal... in some cases up to 100 times faster than the FFT technique. Another important advantage is that if the RDD technique is implemented correctly, the correlation function estimates are unbiased. Comparison with exact solutions for the correlation functions shows that the RDD auto-correlation estimates suffer...
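
The RDD estimate can be sketched as a triggered average. The sketch below assumes a simple level up-crossing trigger, one of several trigger conditions used in practice; the function name and toy test signal are illustrative.

```python
import numpy as np

def random_decrement(x, trigger, length):
    """Random Decrement sketch: average the signal segments that follow
    each up-crossing of the trigger level. For a stationary zero-mean
    process this triggered average is proportional to the
    auto-correlation function."""
    # indices where the signal crosses the trigger level from below
    idx = np.where((x[:-1] < trigger) & (x[1:] >= trigger))[0] + 1
    idx = idx[idx + length <= len(x)]       # keep full-length segments only
    return np.mean([x[i:i + length] for i in idx], axis=0)
```

Because the estimate is a plain average over segments, it needs no transform of the full record, which is the source of the speed advantage over FFT-based correlation estimation noted above.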

  12. Estimation of Correlation Functions by the Random Decrement Technique

    DEFF Research Database (Denmark)

    Brincker, Rune; Krenk, Steen; Jensen, Jakob Laigaard

    The Random Decrement (RDD) Technique is a versatile technique for characterization of random signals in the time domain. In this paper a short review of the theoretical basis is given, and the technique is illustrated by estimating auto-correlation functions and cross-correlation functions on modal... in some cases up to 100 times faster than the FFT technique. Another important advantage is that if the RDD technique is implemented correctly, the correlation function estimates are unbiased. Comparison with exact solutions for the correlation functions shows that the RDD auto-correlation estimates suffer...

  13. Scaling Techniques for Combustion Device Random Vibration Predictions

    Science.gov (United States)

    Kenny, R. J.; Ferebee, R. C.; Duvall, L. D.

    2016-01-01

    This work compares scaling techniques that can be used to predict combustion device component random vibration levels when the excitation is due to internal combustion dynamics. Acceleration and unsteady dynamic pressure data from multiple component test programs are compared and normalized per the two scaling approaches reviewed. Two scaling techniques are reviewed and compared against the collected component test data: the first is an existing approach developed by Barrett, and the second is an updated approach new to this work. Results from utilizing both techniques are presented and recommendations about future component random vibration prediction approaches are given.

  14. Are most samples of animals systematically biased? Consistent individual trait differences bias samples despite random sampling.

    Science.gov (United States)

    Biro, Peter A

    2013-02-01

    Sampling animals from the wild for study is something nearly every biologist has done, but despite our best efforts to obtain random samples of animals, 'hidden' trait biases may still exist. For example, consistent behavioral traits can affect trappability/catchability, independent of obvious factors such as size and gender, and these traits are often correlated with other repeatable physiological and/or life history traits. If so, systematic sampling bias may exist for any of these traits. The extent to which this is a problem, of course, depends on the magnitude of bias, which is presently unknown because the underlying trait distributions in populations are usually unknown, or unknowable. Indeed, our present knowledge about sampling bias comes from samples (not complete population censuses), which can possess bias to begin with. I had the unique opportunity to create naturalized populations of fish by seeding each of four small fishless lakes with equal densities of slow-, intermediate-, and fast-growing fish. Using sampling methods that are not size-selective, I observed that fast-growing fish were up to two times more likely to be sampled than slower-growing fish. This indicates substantial and systematic bias with respect to an important life history trait (growth rate). If correlations between behavioral, physiological and life-history traits are as widespread as the literature suggests, then many animal samples may be systematically biased with respect to these traits (e.g., when collecting animals for laboratory use), which can affect our inferences about population structure and abundance. I conclude with a discussion on ways to minimize sampling bias for particular physiological/behavioral/life-history types within animal populations.
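
The trappability bias described above is easy to reproduce in a toy simulation; the population size, growth-rate distribution, and the assumed link between growth rate and capture probability are all illustrative assumptions, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(0)
growth = rng.uniform(1.0, 3.0, size=100_000)   # individual growth rates
# hypothetical trait link: capture probability rises with growth rate
p_capture = 0.1 * growth / growth.max()
caught = rng.random(growth.size) < p_capture   # one pass of "trapping"
bias = growth[caught].mean() - growth.mean()   # > 0: sample skews fast
```

Here the captured subsample's mean growth rate exceeds the population mean by roughly 8%, even though every individual was exposed to the same random trapping process, mirroring the systematic bias the paper reports.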

  15. Non-terminal blood sampling techniques in Guinea pigs

    DEFF Research Database (Denmark)

    Birck, Malene Muusfeldt; Tveden-Nyborg, Pernille; Lindblad, Maiken Marie

    2014-01-01

    ...of guinea pigs are slightly different from other rodent models; hence, modification of sampling techniques to accommodate species-specific differences, e.g., compared to mice and rats, is necessary to obtain sufficient, high-quality samples. As both long- and short-term in vivo studies often require repeated blood sampling, the choice of technique should be well considered in order to reduce stress and discomfort in the animals, but also to ensure survival as well as compliance with requirements of sample size and accessibility. Venous blood samples can be obtained at a number of sites in guinea pigs, e.g., the saphenous and jugular veins, each technique having both advantages and disadvantages (4,5). Here, we present four different blood sampling techniques for either conscious or anaesthetized guinea pigs. The procedures are all non-terminal procedures provided that sample volumes and number of samples do...

  16. SOME SYSTEMATIC SAMPLING STRATEGIES USING MULTIPLE RANDOM STARTS

    Directory of Open Access Journals (Sweden)

    Sampath Sundaram

    2010-09-01

    In this paper an attempt is made to extend linear systematic sampling using multiple random starts, due to Gautschi (1957), to various types of systematic sampling schemes available in the literature, namely (i) Balanced Systematic Sampling (BSS) of Sethi (1965) and (ii) Modified Systematic Sampling (MSS) of Singh, Jindal, and Garg (1968). Further, the proposed methods were compared with the Yates corrected estimator developed with reference to Gautschi's Linear Systematic Sampling (LSS) with two random starts, using appropriate superpopulation models with the help of the R package for statistical computing.
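
Linear systematic sampling with multiple random starts can be sketched as follows; the interface and the divisibility assumptions are illustrative simplifications in the spirit of Gautschi's scheme, not the paper's estimators.

```python
import numpy as np

def lss_multiple_starts(N, n, m, rng=None):
    """Linear systematic sampling with m random starts: each start yields
    n/m units taken at a fixed interval k, so the n selected units are
    the union of m interleaved systematic samples. Assumes n is
    divisible by m and N by n/m."""
    rng = np.random.default_rng() if rng is None else rng
    per_start = n // m
    k = N // per_start                        # sampling interval
    starts = rng.choice(k, size=m, replace=False)
    sample = np.concatenate([s + k * np.arange(per_start) for s in starts])
    return np.sort(sample)
```

With m = 1 this reduces to ordinary linear systematic sampling; multiple starts allow an unbiased variance estimate, which a single systematic sample does not.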

  17. Efficient sampling of complex network with modified random walk strategies

    Science.gov (United States)

    Xie, Yunya; Chang, Shuhua; Zhang, Zhipeng; Zhang, Mi; Yang, Lei

    2018-02-01

    We present two novel random walk strategies: choosing seed node (CSN) random walk and no-retracing (NR) random walk. Unlike classical random walk sampling, the CSN and NR strategies focus on the influence of the seed node choice and of path overlap, respectively. The three random walk samplings are applied to the Erdős-Rényi (ER), Barabási-Albert (BA), Watts-Strogatz (WS), and weighted USAir networks, and the major properties of the sampled subnets, such as sampling efficiency, degree distributions, average degree and average clustering coefficient, are studied. Similar conclusions can be reached with all three random walk strategies. Firstly, networks of small scale and simple structure are the most amenable to sampling. Secondly, the average degree and the average clustering coefficient of the sampled subnet tend to the corresponding values of the original networks within a limited number of steps. Thirdly, the degree distributions of the subnets are all slightly biased toward the high-degree side. The NR strategy, however, performs better for the average clustering coefficient of the subnet. In the real weighted USAir network, salient characteristics such as the larger clustering coefficient and the fluctuation of the degree distribution are reproduced well by these random walk strategies.
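
    A minimal sketch of the no-retracing (NR) idea, assuming only an adjacency-list graph; the dead-end fallback and the interface are illustrative choices, not details from the paper:

```python
import random

def no_retracing_walk(adj, start, steps, rng):
    """Random walk that never immediately steps back to the node it just
    left, reducing path overlap. `adj` maps node -> list of neighbours."""
    walk = [start]
    prev, cur = None, start
    for _ in range(steps):
        choices = [v for v in adj[cur] if v != prev]
        if not choices:            # dead end: retracing is the only option
            choices = adj[cur]
        nxt = rng.choice(choices)
        walk.append(nxt)
        prev, cur = cur, nxt
    return walk
```

    On a simple cycle graph, forbidding retracing forces the walk to keep circling in one direction instead of oscillating between two nodes.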

  18. Statistical Theory of the Vector Random Decrement Technique

    DEFF Research Database (Denmark)

    Asmussen, J. C.; Brincker, Rune; Ibrahim, S. R.

    1999-01-01

    The Vector Random Decrement technique has previously been introduced as an efficient method to transform ambient responses of linear structures into Vector Random Decrement functions which are equivalent to free decays of the current structure. The modal parameters can be extracted from the free...

  19. A Manual for Selecting Sampling Techniques in Research

    OpenAIRE

    Alvi, Mohsin

    2016-01-01

    The Manual for Sampling Techniques used in Social Sciences is an effort to describe the various types of sampling methodologies used in social science research in an easy and understandable way. Characteristics, benefits, crucial issues/drawbacks, and examples of each sampling type are provided separately. The manual begins by describing what sampling is and its purposes, then moves on to discuss the two broader types: probability sampling and non-probability sampling. Lat...

  20. Sample Subset Optimization Techniques for Imbalanced and Ensemble Learning Problems in Bioinformatics Applications.

    Science.gov (United States)

    Yang, Pengyi; Yoo, Paul D; Fernando, Juanita; Zhou, Bing B; Zhang, Zili; Zomaya, Albert Y

    2014-03-01

    Data sampling is a widely used technique in a broad range of machine learning problems. Traditional sampling approaches generally rely on random resampling from a given dataset. However, these approaches do not take into consideration additional information, such as sample quality and usefulness. We recently proposed a data sampling technique, called sample subset optimization (SSO). The SSO technique relies on a cross-validation procedure for identifying and selecting the most useful samples as subsets. In this paper, we describe the application of SSO techniques to imbalanced and ensemble learning problems, respectively. For imbalanced learning, the SSO technique is employed as an under-sampling technique for identifying a subset of highly discriminative samples in the majority class. In ensemble learning, the SSO technique is utilized as a generic ensemble technique where multiple optimized subsets of samples from each class are selected for building an ensemble classifier. We demonstrate the utilities and advantages of the proposed techniques on a variety of bioinformatics applications where class imbalance, small sample size, and noisy data are prevalent.

  1. A visual training tool for the Photoload sampling technique

    Science.gov (United States)

    Violet J. Holley; Robert E. Keane

    2010-01-01

    This visual training aid is designed to provide Photoload users a tool to increase the accuracy of fuel loading estimations when using the Photoload technique. The Photoload Sampling Technique (RMRS-GTR-190) provides fire managers a sampling method for obtaining consistent, accurate, inexpensive, and quick estimates of fuel loading. It is designed to require only one...

  2. Sampling techniques for adult Afrotropical malaria vectors and their ...

    African Journals Online (AJOL)

    It was the aim of this paper to critically evaluate the most common mosquito sampling techniques in relation to their reliability in the estimation of EIR. The techniques include man-landing, light trap, light trap/bednet combination and odour-baited traps. Although man-landing technique is the most reliable, it however, expose ...

  3. Spatial Random Sampling: A Structure-Preserving Data Sketching Tool

    Science.gov (United States)

    Rahmani, Mostafa; Atia, George K.

    2017-09-01

    Random column sampling is not guaranteed to yield data sketches that preserve the underlying structures of the data and may not sample sufficiently from less-populated data clusters. Also, adaptive sampling can often provide accurate low rank approximations, yet may fall short of producing descriptive data sketches, especially when the cluster centers are linearly dependent. Motivated by that, this paper introduces a novel randomized column sampling tool dubbed Spatial Random Sampling (SRS), in which data points are sampled based on their proximity to randomly sampled points on the unit sphere. The most compelling feature of SRS is that the corresponding probability of sampling from a given data cluster is proportional to the surface area the cluster occupies on the unit sphere, independently from the size of the cluster population. Although it is fully randomized, SRS is shown to provide descriptive and balanced data representations. The proposed idea addresses a pressing need in data science and holds potential to inspire many novel approaches for analysis of big data.
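
    The sampling rule can be sketched as follows, assuming data points are compared to random directions on the unit sphere by inner product; duplicate handling and the other details of SRS are simplified here:

```python
import math
import random

def spatial_random_sampling(points, k, rng):
    """Sketch of Spatial Random Sampling: draw k random directions on the
    unit sphere and, for each, keep the data point whose normalized version
    has the largest inner product with (is closest to) that direction."""
    def unit(v):
        n = math.sqrt(sum(x * x for x in v)) or 1.0
        return [x / n for x in v]

    dims = len(points[0])
    normalized = [unit(p) for p in points]
    picked = []
    for _ in range(k):
        # isotropic random direction: normalized Gaussian vector
        d = unit([rng.gauss(0.0, 1.0) for _ in range(dims)])
        best = max(range(len(points)),
                   key=lambda i: sum(a * b for a, b in zip(normalized[i], d)))
        picked.append(points[best])
    return picked
```

    Because each direction is uniform on the sphere, a cluster's chance of being hit depends on the solid angle it subtends, not on how many points it contains, which is the balancing property the paper emphasizes.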

  4. Methods for sample size determination in cluster randomized trials.

    Science.gov (United States)

    Rutterford, Clare; Copas, Andrew; Eldridge, Sandra

    2015-06-01

    The use of cluster randomized trials (CRTs) is increasing, along with the variety in their design and analysis. The simplest approach for their sample size calculation is to calculate the sample size assuming individual randomization and inflate this by a design effect to account for randomization by cluster. The assumptions of a simple design effect may not always be met; alternative or more complicated approaches are required. We summarise a wide range of sample size methods available for cluster randomized trials. For those familiar with sample size calculations for individually randomized trials but with less experience in the clustered case, this manuscript provides formulae for a wide range of scenarios with associated explanation and recommendations. For those with more experience, comprehensive summaries are provided that allow quick identification of methods for a given design, outcome and analysis method. We present first those methods applicable to the simplest two-arm, parallel group, completely randomized design followed by methods that incorporate deviations from this design such as: variability in cluster sizes; attrition; non-compliance; or the inclusion of baseline covariates or repeated measures. The paper concludes with methods for alternative designs. There is a large amount of methodology available for sample size calculations in CRTs. This paper gives the most comprehensive description of published methodology for sample size calculation and provides an important resource for those designing these trials. © The Author 2015. Published by Oxford University Press on behalf of the International Epidemiological Association.
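
    The "simplest approach" mentioned above, inflating an individually randomized sample size by the design effect DE = 1 + (m - 1) * ICC for equal clusters of size m, can be sketched as:

```python
import math

def cluster_trial_sample_size(n_individual, cluster_size, icc):
    """Inflate the sample size required under individual randomization by
    the simple design effect for equal clusters of size m. Deviations such
    as variable cluster sizes, attrition or covariates need the alternative
    methods the paper summarises."""
    design_effect = 1.0 + (cluster_size - 1) * icc
    return math.ceil(n_individual * design_effect)
```

    For example, clusters of size 20 with an ICC of 0.05 give a design effect of 1.95, nearly doubling the required sample size.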

  5. A comparison of methods for representing sparsely sampled random quantities.

    Energy Technology Data Exchange (ETDEWEB)

    Romero, Vicente Jose; Swiler, Laura Painton; Urbina, Angel; Mullins, Joshua

    2013-09-01

    This report discusses the treatment of uncertainties stemming from relatively few samples of random quantities. The importance of this topic extends beyond experimental data uncertainty to situations involving uncertainty in model calibration, validation, and prediction. With very sparse data samples it is not practical to have a goal of accurately estimating the underlying probability density function (PDF). Rather, a pragmatic goal is that the uncertainty representation should be conservative so as to bound a specified percentile range of the actual PDF, say the range between the 0.025 and 0.975 percentiles, with reasonable reliability. A second, opposing objective is that the representation not be overly conservative; that it minimally over-estimate the desired percentile range of the actual PDF. The presence of the two opposing objectives makes the sparse-data uncertainty representation problem interesting and difficult. In this report, five uncertainty representation techniques are characterized for their performance on twenty-one test problems (over thousands of trials for each problem) according to these two opposing objectives and other performance measures. Two of the methods, statistical Tolerance Intervals and a kernel density approach specifically developed for handling sparse data, exhibit significantly better overall performance than the others.

  6. SOME SYSTEMATIC SAMPLING STRATEGIES USING MULTIPLE RANDOM STARTS

    OpenAIRE

    Sampath Sundaram; Ammani Sivaraman

    2010-01-01

    In this paper an attempt is made to extend linear systematic sampling using multiple random starts due to Gautschi(1957)for various types of systematic sampling schemes available in literature, namely(i)  Balanced Systematic Sampling (BSS) of  Sethi (1965) and (ii) Modified Systematic Sampling (MSS) of Singh, Jindal, and Garg  (1968). Further, the proposed methods were compared with Yates corrected estimator developed with reference to Gautschi’s Linear systematic samplin...

  7. Sampling large random knots in a confined space

    Energy Technology Data Exchange (ETDEWEB)

    Arsuaga, J [Department of Mathematics, San Francisco State University, 1600 Holloway Ave, San Francisco, CA 94132 (United States); Blackstone, T [Department of Computer Science, San Francisco State University, 1600 Holloway Ave., San Francisco, CA 94132 (United States); Diao, Y [Department of Mathematics and Statistics, University of North Carolina at Charlotte, Charlotte, NC 28223 (United States); Hinson, K [Department of Mathematics and Statistics, University of North Carolina at Charlotte, Charlotte, NC 28223 (United States); Karadayi, E [Department of Mathematics, University of South Florida, 4202 E Fowler Avenue, Tampa, FL 33620 (United States); Saito, M [Department of Mathematics, University of South Florida, 4202 E Fowler Avenue, Tampa, FL 33620 (United States)

    2007-09-28

    DNA knots formed under extreme conditions of condensation, as in bacteriophage P4, are difficult to analyze experimentally and theoretically. In this paper, we propose to use the uniform random polygon model as a supplementary method to the existing methods for generating random knots in confinement. The uniform random polygon model allows us to sample knots with large crossing numbers and also to generate large diagrammatically prime knot diagrams. We show numerically that uniform random polygons sample knots with large minimum crossing numbers and certain complicated knot invariants (as those observed experimentally). We do this in terms of the knot determinants or colorings. Our numerical results suggest that the average determinant of a uniform random polygon of n vertices grows faster than O(e^(n^2)). We also investigate the complexity of prime knot diagrams. We show rigorously that the probability that a randomly selected 2D uniform random polygon of n vertices is almost diagrammatically prime goes to 1 as n goes to infinity. Furthermore, the average number of crossings in such a diagram is of the order of O(n^2). Therefore, two-dimensional uniform random polygons offer an effective way of sampling large (prime) knots, which can be useful in various applications.

  8. Performance of Random Effects Model Estimators under Complex Sampling Designs

    Science.gov (United States)

    Jia, Yue; Stokes, Lynne; Harris, Ian; Wang, Yan

    2011-01-01

    In this article, we consider estimation of parameters of random effects models from samples collected via complex multistage designs. Incorporation of sampling weights is one way to reduce estimation bias due to unequal probabilities of selection. Several weighting methods have been proposed in the literature for estimating the parameters of…

  9. Estimation of Correlation Functions by the Random Decrement Technique

    DEFF Research Database (Denmark)

    Brincker, Rune; Krenk, Steen; Jensen, Jacob Laigaard

    1991-01-01

    The Random Decrement (RDD) Technique is a versatile technique for characterization of random signals in the time domain. In this paper a short review of the theoretical basis is given, and the technique is illustrated by estimating auto-correlation functions and cross-correlation functions on modal...... - in some cases up to 100 times faster than the FFT technique. Another important advantage is that if the RDD technique is implemented correctly, the correlation function estimates are unbiased. Comparison with exact solutions for the correlation functions shows that the RDD auto-correlation estimates...... suffer from smaller RDD auto-correlation estimation errors than the corresponding FFT estimates. However, in the case of estimating cross-correlation functions for stochastic processes with low mutual correlation, the FFT technique might be more accurate....
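
    A minimal sketch of the Random Decrement idea, using a simple level-crossing triggering condition (the formulations in the paper are more general, and scaling is omitted):

```python
def random_decrement(signal, trigger, seg_len):
    """Collect the segment that follows every sample at or above a trigger
    level and average the segments; under the usual assumptions the average
    is proportional to the auto-correlation function of the process."""
    segments = [signal[i:i + seg_len]
                for i in range(len(signal) - seg_len + 1)
                if signal[i] >= trigger]
    if not segments:
        raise ValueError("no trigger points found")
    n = len(segments)
    return [sum(seg[j] for seg in segments) / n for j in range(seg_len)]
```

    Since every collected segment starts at or above the trigger level, the first value of the averaged function does too, which is a quick sanity check on an implementation.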

  10. Misrepresenting random sampling? A systematic review of research papers in the Journal of Advanced Nursing.

    Science.gov (United States)

    Williamson, Graham R

    2003-11-01

    This paper discusses the theoretical limitations of the use of random sampling and probability theory in the production of a significance level (or P-value) in nursing research. Potential alternatives, in the form of randomization tests, are proposed. Research papers in nursing, medicine and psychology frequently misrepresent their statistical findings, as the P-values reported assume random sampling. In this systematic review of studies published between January 1995 and June 2002 in the Journal of Advanced Nursing, 89 (68%) studies broke this assumption because they used convenience samples or entire populations. As a result, some of the findings may be questionable. The key ideas of random sampling and probability theory for statistical testing (for generating a P-value) are outlined. The result of a systematic review of research papers published in the Journal of Advanced Nursing is then presented, showing how frequently random sampling appears to have been misrepresented. Useful alternative techniques that might overcome these limitations are then discussed. REVIEW LIMITATIONS: This review is limited in scope because it is applied to one journal, and so the findings cannot be generalized to other nursing journals or to nursing research in general. However, it is possible that other nursing journals are also publishing research articles based on the misrepresentation of random sampling. The review is also limited because in several of the articles the sampling method was not completely clearly stated, and in this circumstance a judgment has been made as to the sampling method employed, based on the indications given by author(s). Quantitative researchers in nursing should be very careful that the statistical techniques they use are appropriate for the design and sampling methods of their studies. If the techniques they employ are not appropriate, they run the risk of misinterpreting findings by using inappropriate, unrepresentative and biased samples.
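
    A randomization (permutation) test of the kind proposed as an alternative can be sketched as follows; the two-sided difference-in-means statistic is an illustrative choice:

```python
import random

def randomization_test(x, y, n_perm, rng):
    """Two-sample randomization test on the difference in means: reference
    distribution comes from reshuffling group labels, so no random-sampling
    assumption is needed. Returns a Monte Carlo P-value."""
    observed = abs(sum(x) / len(x) - sum(y) / len(y))
    pooled = list(x) + list(y)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        gx, gy = pooled[:len(x)], pooled[len(x):]
        if abs(sum(gx) / len(gx) - sum(gy) / len(gy)) >= observed:
            hits += 1
    return (hits + 1) / (n_perm + 1)   # add-one correction avoids P = 0
```

    The test is exact in the sense that its justification rests only on the random reshuffling performed in the analysis, not on how the sample was drawn.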

  11. Manipulation of biological samples using micro and nano techniques

    DEFF Research Database (Denmark)

    Castillo, Jaime; Dimaki, Maria; Svendsen, Winnie Edith

    2009-01-01

    The constant interest in handling, integrating and understanding biological systems of interest for the biomedical field, the pharmaceutical industry and the biomaterial researchers demand the use of techniques that allow the manipulation of biological samples causing minimal or no damage...... to their natural structure. Thanks to the advances in micro- and nanofabrication during the last decades several manipulation techniques offer us the possibility to image, characterize and manipulate biological material in a controlled way. Using these techniques the integration of biomaterials with remarkable...

  12. Micro and Nano Techniques for the Handling of Biological Samples

    DEFF Research Database (Denmark)

    Micro and Nano Techniques for the Handling of Biological Samples reviews the different techniques available to manipulate and integrate biological materials in a controlled manner, either by sliding them along a surface (2-D manipulation), or by gripping and moving them to a new position (3-D...

  13. Evaluation of sampling techniques for millipedes | Inyang | Moor ...

    African Journals Online (AJOL)

    Techniques (ii), (iii) and (iv) appeared appropriate for wet season sampling as the millipedes prefer the top soil during this period to avoid waterlogged condition or excessive moisture. The four techniques derived from the natural habitat, food needs, suitable conditions of temperature and moisture dictated by time of ...

  14. Sample Selection in Randomized Experiments: A New Method Using Propensity Score Stratified Sampling

    Science.gov (United States)

    Tipton, Elizabeth; Hedges, Larry; Vaden-Kiernan, Michael; Borman, Geoffrey; Sullivan, Kate; Caverly, Sarah

    2014-01-01

    Randomized experiments are often seen as the "gold standard" for causal research. Despite the fact that experiments use random assignment to treatment conditions, units are seldom selected into the experiment using probability sampling. Very little research on experimental design has focused on how to make generalizations to well-defined…

  15. Differences in sampling techniques on total post-mortem tryptase.

    Science.gov (United States)

    Tse, R; Garland, J; Kesha, K; Elstub, H; Cala, A D; Ahn, Y; Stables, S; Palmiere, C

    2017-11-20

    The measurement of mast cell tryptase is commonly used to support the diagnosis of anaphylaxis. In the post-mortem setting, the literature recommends sampling from peripheral blood sources (femoral blood) but does not specify the exact sampling technique. Sampling techniques vary between pathologists, and it is unclear whether different sampling techniques have any impact on post-mortem tryptase levels. The aim of this study is to compare the difference in femoral total post-mortem tryptase levels between two sampling techniques. A 6-month retrospective study comparing femoral total post-mortem tryptase levels between (1) aspirating femoral vessels with a needle and syringe prior to evisceration and (2) femoral vein cut-down during evisceration. Twenty cases were identified, with three cases excluded from analysis. There was a statistically significant difference (paired t test) between the two sampling methods. The clinical significance of this finding and what factors may contribute to it are unclear. When requesting post-mortem tryptase, the pathologist should consider documenting the exact blood collection site and method used for collection. In addition, blood samples acquired by different techniques should not be mixed together and should be analyzed separately if possible.

  16. Development and evaluation of the photoload sampling technique

    Science.gov (United States)

    Robert E. Keane; Laura J. Dickinson

    2007-01-01

    Wildland fire managers need better estimates of fuel loading so they can accurately predict potential fire behavior and effects of alternative fuel and ecosystem restoration treatments. This report presents the development and evaluation of a new fuel sampling method, called the photoload sampling technique, to quickly and accurately estimate loadings for six common...

  17. Surface sampling techniques for 3D object inspection

    Science.gov (United States)

    Shih, Chihhsiong S.; Gerhardt, Lester A.

    1995-03-01

    While the uniform sampling method is quite popular for pointwise measurement of manufactured parts, this paper proposes three novel sampling strategies which emphasize 3D non-uniform inspection capability. They are: (a) the adaptive sampling, (b) the local adjustment sampling, and (c) the finite element centroid sampling techniques. The adaptive sampling strategy is based on a recursive surface subdivision process. Two different approaches are described for this adaptive sampling strategy. One uses triangle patches while the other uses rectangle patches. Several real world objects were tested using these two algorithms. Preliminary results show that sample points are distributed more closely around edges, corners, and vertices as desired for many classes of objects. Adaptive sampling using triangle patches is shown to generally perform better than both uniform and adaptive sampling using rectangle patches. The local adjustment sampling strategy uses a set of predefined starting points and then finds the local optimum position of each nodal point. This method approximates the object by moving the points toward object edges and corners. In a hybrid approach, uniform points sets and non-uniform points sets, first preprocessed by the adaptive sampling algorithm on a real world object were then tested using the local adjustment sampling method. The results show that the initial point sets when preprocessed by adaptive sampling using triangle patches, are moved the least amount of distance by the subsequently applied local adjustment method, again showing the superiority of this method. The finite element sampling technique samples the centroids of the surface triangle meshes produced from the finite element method. The performance of this algorithm was compared to that of the adaptive sampling using triangular patches. The adaptive sampling with triangular patches was once again shown to be better on different classes of objects.

  18. Random sampling and validation of covariance matrices of resonance parameters

    Science.gov (United States)

    Plevnik, Lucijan; Zerovnik, Gašper

    2017-09-01

    Analytically exact methods for random sampling of arbitrarily correlated parameters are presented. Emphasis is given, on the one hand, to possible inconsistencies in the covariance data, concentrating on positive semi-definiteness and on consistent sampling of correlated, inherently positive parameters, and, on the other hand, to optimization of the implementation of the methods themselves. The methods have been applied in the program ENDSAM, written in Fortran, which from a file of a chosen isotope in ENDF-6 format from a nuclear data library produces an arbitrary number of new ENDF-6 files containing random samples of the resonance parameters (in accordance with the corresponding covariance matrices) in place of the original values. The source code for the program ENDSAM is available from the OECD/NEA Data Bank. The program works in the following steps: it reads the resonance parameters and their covariance data from the nuclear data library, checks whether the covariance data are consistent, and produces random samples of the resonance parameters. The code has been validated with both realistic and artificial data to show that the produced samples are statistically consistent. Additionally, the code was used to validate covariance data in existing nuclear data libraries. A list of inconsistencies observed in the covariance data of resonance parameters in ENDF/B-VII.1, JEFF-3.2 and JENDL-4.0 is presented. For now, the work has been limited to resonance parameters; however, the methods presented are general and can in principle be extended to the sampling and validation of any nuclear data.
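
    One standard way to sample correlated Gaussian parameters, and to detect a covariance matrix that is not positive definite, is via a Cholesky factor. This is a generic sketch, not the ENDSAM implementation:

```python
import math
import random

def cholesky(cov):
    """Cholesky factor of a symmetric covariance matrix; failure signals
    that the matrix is not positive definite (one of the consistency
    problems checked for in evaluated covariance data)."""
    n = len(cov)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            if i == j:
                d = cov[i][i] - s
                if d <= 0.0:
                    raise ValueError("covariance matrix is not positive definite")
                L[i][j] = math.sqrt(d)
            else:
                L[i][j] = (cov[i][j] - s) / L[j][j]
    return L

def sample_correlated(mean, cov, rng):
    """Draw one sample of correlated Gaussian parameters as mean + L z.
    (For inherently positive parameters one would sample in log space or
    truncate; that refinement is omitted here.)"""
    L = cholesky(cov)
    z = [rng.gauss(0.0, 1.0) for _ in mean]
    return [m + sum(L[i][k] * z[k] for k in range(len(mean)))
            for i, m in enumerate(mean)]
```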

  19. Generalized and synthetic regression estimators for randomized branch sampling

    Science.gov (United States)

    David L. R. Affleck; Timothy G. Gregoire

    2015-01-01

    In felled-tree studies, ratio and regression estimators are commonly used to convert more readily measured branch characteristics to dry crown mass estimates. In some cases, data from multiple trees are pooled to form these estimates. This research evaluates the utility of both tactics in the estimation of crown biomass following randomized branch sampling (...

  20. Effective sampling of random surfaces by baby universe surgery

    NARCIS (Netherlands)

    Ambjørn, J.; Białas, P.; Jurkiewicz, J.; Burda, Z.; Petersson, B.

    1994-01-01

    We propose a new, very efficient algorithm for sampling of random surfaces in the Monte Carlo simulations, based on so-called baby universe surgery, i.e. cutting and pasting of baby universe. It drastically reduces slowing down as compared to the standard local flip algorithm, thereby allowing

  1. Sampling versus Random Binning for Multiple Descriptions of a Bandlimited Source

    DEFF Research Database (Denmark)

    Mashiach, Adam; Østergaard, Jan; Zamir, Ram

    2013-01-01

    Random binning is an efficient, yet complex, coding technique for the symmetric L-description source coding problem. We propose an alternative approach, that uses the quantized samples of a bandlimited source as "descriptions". By the Nyquist condition, the source can be reconstructed if enough s...

  2. Identification of System Parameters by the Random Decrement Technique

    DEFF Research Database (Denmark)

    Brincker, Rune; Kirkegaard, Poul Henning; Rytter, Anders

    1991-01-01

    The aim of this paper is to investigate and illustrate the possibilities of using correlation functions estimated by the Random Decrement Technique as a basis for parameter identification. A two-stage system identification approach is used: first, the correlation functions are estimated by the Random...... Decrement Technique, and then the system parameters are identified from the correlation function estimates. Three different techniques are used in the parameter identification process: a simple non-parametric method, estimation of an Auto Regressive (AR) model by solving an overdetermined set of Yule......-Walker equations and finally, least-squares fitting of the theoretical correlation function. The results are compared to the results of fitting an Auto Regressive Moving Average (ARMA) model directly to the system output from a single-degree-of-freedom system loaded by white noise.

  3. Identification of System Parameters by the Random Decrement Technique

    DEFF Research Database (Denmark)

    Brincker, Rune; Kirkegaard, Poul Henning; Rytter, Anders

    The aim of this paper is to investigate and illustrate the possibilities of using correlation functions estimated by the Random Decrement Technique as a basis for parameter identification. A two-stage system identification method is used: first the correlation functions are estimated by the Random...... Decrement technique and then the system parameters are identified from the correlation function estimates. Three different techniques are used in the parameter identification process: a simple non-parametric method, estimation of an Auto Regressive (AR) model by solving an overdetermined set of Yule......-Walker equations and finally least-squares fitting of the theoretical correlation function. The results are compared to the results of fitting an Auto Regressive Moving Average (ARMA) model directly to the system output. All investigations are performed on the simulated output from a single-degree-of-freedom system...

  4. Water sampling techniques for continuous monitoring of pesticides in water

    Directory of Open Access Journals (Sweden)

    Šunjka Dragana

    2017-01-01

    Full Text Available Good ecological and chemical status of water is the central aim of the Water Framework Directive 2000/60/EC, which implies meeting water quality standards at the level of the entire river basin (2008/105/EC and 2013/39/EC). This especially refers to the control of pesticide residues in surface waters. Achieving these goals requires a continuous monitoring program that provides a comprehensive and interrelated overview of water status, which in turn demands appropriate analysis techniques. Until now, the procedure for sampling and quantification of residual pesticide quantities in the aquatic environment has been based on traditional sampling techniques, i.e. the periodic collection of individual samples. However, this type of sampling provides only a snapshot of the presence of pollutants in water. As an alternative, the technique of passive sampling of pollutants in water, including pesticides, has been introduced. Different samplers are available for pesticide sampling in surface water, depending on the compounds. The technique itself is based on keeping a device in water over a longer period of time, varying from several days to several weeks depending on the compound. In this manner, average concentrations of pollutants dissolved in water over a time period (time-weighted average concentrations, TWA) are obtained, which enables monitoring of trends in areal and seasonal variations. The use of these techniques also increases the sensitivity of analytical methods, since pre-concentration of analytes takes place within the sorption medium. However, using these techniques to determine pesticide concentrations in real water environments requires calibration studies to estimate sampling rates (Rs). Rs is a volume of water per unit time, calculated as the product of overall mass transfer coefficient and area of
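
    The time-weighted average concentration described above reduces to a one-line formula; the units below are illustrative:

```python
def twa_concentration(sorbed_mass_ng, sampling_rate_l_per_day, days):
    """Time-weighted average concentration from a calibrated passive
    sampler: C_TWA = m / (Rs * t), where m is the analyte mass accumulated
    in the sorbent, Rs the sampling rate and t the deployment time.
    Units here (ng, L/day, days -> ng/L) are an illustrative choice."""
    return sorbed_mass_ng / (sampling_rate_l_per_day * days)
```

    For instance, 140 ng accumulated over a 14-day deployment at Rs = 0.5 L/day corresponds to a TWA concentration of 20 ng/L.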

  5. Estimation of Sensitive Proportion by Randomized Response Data in Successive Sampling

    Directory of Open Access Journals (Sweden)

    Bo Yu

    2015-01-01

    Full Text Available This paper considers the problem of estimation for binomial proportions of sensitive or stigmatizing attributes in the population of interest. Randomized response techniques are suggested for protecting the privacy of respondents and reducing the response bias while eliciting information on sensitive attributes. In many sensitive question surveys, the same population is often sampled repeatedly on each occasion. In this paper, we apply successive sampling scheme to improve the estimation of the sensitive proportion on current occasion.
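
    As a concrete instance of a randomized response technique, Warner's classic model can be sketched as follows (the successive-sampling estimator of the paper is more elaborate):

```python
def warner_estimate(yes_fraction, p_design):
    """Warner's randomized-response estimator: each respondent answers the
    sensitive question with probability p_design and its negation otherwise,
    so P(yes) = (2p - 1) * pi + (1 - p), and the sensitive proportion pi
    can be recovered by inverting that relation."""
    if abs(2.0 * p_design - 1.0) < 1e-12:
        raise ValueError("p_design = 0.5 makes the proportion unidentifiable")
    return (yes_fraction - (1.0 - p_design)) / (2.0 * p_design - 1.0)
```

    Because no interviewer knows which question a respondent actually answered, privacy is protected while the proportion remains estimable at the population level.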

  6. [Investigation and application of gasoline sample identity technique].

    Science.gov (United States)

    Liu, Yingrong; Xu, Yupeng; Yang, Haiying; Wang, Zheng

    2004-09-01

    A chemometrics method was used to solve the problem of automatic model selection for the detailed hydrocarbon analysis (DHA) of gasoline samples by gas chromatography/flame ionization detection (GC/FID). The 29 peaks in the GC/FID DHA chromatogram and their amounts were selected as the discriminating parameters to establish five pattern models for different gasoline samples: fluid catalytic cracking (FCC) gasoline, coking gasoline, straight-run gasoline, reformed gasoline, and alkylation gasoline. Principal component analysis (PCA) and Soft Independent Modeling of Class Analogies (SIMCA) were used to classify the gasoline samples and to identify unknown samples against the above pattern models. One hundred gasoline samples, derived from known sources, were employed to validate the reliability of the sample identity technique. With the help of the pattern identity method described here, automation of the GC/FID DHA method becomes possible.

  7. Use of nuclear technique in samples for agricultural purposes

    Energy Technology Data Exchange (ETDEWEB)

    Oliveira, Kerley A. P. de; Sperling, Eduardo Von, E-mail: kerley@ufmg.br, E-mail: kerleyfisica@yahoo.com.br [Department of Sanitary and Environmental Engineering Federal University of Minas Gerais, Belo Horizonte (Brazil); Menezes, Maria Angela B. C.; Jacomino, Vanusa M.F. [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil)

    2013-01-15

    Environmental concern is growing, and with it the need to determine chemical elements over a wide range of concentrations. Neutron activation analysis (NAA) determines elemental composition by measuring the artificial radioactivity induced in a sample submitted to a neutron flux; it is a sensitive and accurate technique with low detection limits. One application of NAA was the measurement of rare earth element (REE) concentrations in waste samples of phosphogypsum (PG) and in cerrado soil samples (clayey and sandy soils). Additionally, a soil reference material of the International Atomic Energy Agency (IAEA) was analyzed. The REE concentration in the PG samples (a total of 4,000 mg kg{sup -1}) was two times higher than that found in national fertilizers, 154 times greater than that in the sandy soil (26 mg kg{sup -1}) and 14 times greater than that in the clayey soil (280 mg kg{sup -1}). The experimental results for the reference material fell within the uncertainty of the certified values, confirming the accuracy of the method (95%). The determination of La, Ce, Pr, Nd, Pm, Sm, Eu, Tb, Dy, Ho, Er, Tm, Yb and Lu in the samples and reference material confirmed the versatility of the technique for REE determination in soil and phosphogypsum, matrices of agricultural interest. (author)

  8. Experimental Verification of the Performance of the Aperture Sampling Technique

    Science.gov (United States)

    1975-09-15

    (Abstract unavailable; only OCR fragments of this record survive.) The technique was brought to the author's attention by Dr. S. Weisbrod of Teledyne Micronetics. Recoverable figure residue: measured results for 3-horn and 5-horn configurations versus true elevation angle (deg). Recoverable references: DDC AD-781100/3 (20 March 1974); "Application Study of Waveform Sampling Technique," RADC-TR-262, Teledyne Micronetics (November 1971); 3. J

  9. Random sampling and validation of covariance matrices of resonance parameters

    Directory of Open Access Journals (Sweden)

    Plevnik Lucijan

    2017-01-01

    Full Text Available Analytically exact methods for random sampling of arbitrarily correlated parameters are presented. Emphasis is given, on the one hand, to possible inconsistencies in the covariance data, concentrating on positive semi-definiteness and on consistent sampling of correlated, inherently positive parameters, and, on the other hand, to optimizing the implementation of the methods themselves. The methods have been implemented in the program ENDSAM, written in Fortran, which reads a file of a chosen isotope in ENDF-6 format from a nuclear data library and produces an arbitrary number of new ENDF-6 files containing random samples of the resonance parameters (in accordance with the corresponding covariance matrices) in place of the original values. The source code for ENDSAM is available from the OECD/NEA Data Bank. The program works in the following steps: it reads the resonance parameters and their covariance data from the nuclear data library, checks whether the covariance data are consistent, and produces random samples of the resonance parameters. The code has been validated with both realistic and artificial data to show that the produced samples are statistically consistent. Additionally, the code was used to validate covariance data in existing nuclear data libraries; a list of inconsistencies observed in the covariance data of resonance parameters in ENDF/B-VII.1, JEFF-3.2 and JENDL-4.0 is presented. For now, the work has been limited to resonance parameters; however, the methods presented are general and can in principle be extended to the sampling and validation of any nuclear data.
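
ENDSAM's exact procedures are not reproduced here, but the core step of drawing correlated Gaussian parameter samples can be sketched with a Cholesky factorization of the covariance matrix, which also fails loudly when the matrix is not positive definite (one of the consistency checks mentioned above). The numbers are illustrative; for inherently positive parameters one would sample in log-space instead:

```python
import numpy as np

def sample_correlated(mean, cov, n_samples, rng):
    """Draw n_samples correlated Gaussian vectors x = mean + L z,
    where L L^T = cov.  np.linalg.cholesky raises LinAlgError when
    cov is not positive definite, flagging inconsistent covariance data."""
    L = np.linalg.cholesky(cov)
    z = rng.standard_normal((n_samples, len(mean)))
    return mean + z @ L.T

rng = np.random.default_rng(0)
mean = np.array([1.0, 2.0])             # nominal parameter values
cov = np.array([[0.04, 0.018],          # standard deviations 0.2 and 0.3,
                [0.018, 0.09]])         # correlation 0.3
samples = sample_correlated(mean, cov, 200000, rng)
print(np.round(samples.mean(axis=0), 2))
print(np.round(np.cov(samples.T), 2))
```

The sample mean and sample covariance recover the inputs, which is the kind of statistical-consistency check the abstract describes for the generated files.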

  10. Development of core sampling technique for ITER Type B radwaste

    Energy Technology Data Exchange (ETDEWEB)

    Kim, S. G.; Hong, K. P.; Oh, W. H.; Park, M. C.; Jung, S. H.; Ahn, S. B. [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2016-10-15

    Type B radwaste (intermediate-level, long-lived radioactive waste) imported from the ITER vacuum vessel is to be treated and stored in the basement of the hot cell building. The Type B radwaste treatment process comprises buffer storage, cutting, sampling/tritium measurement, tritium removal, characterization, pre-packaging, inspection/decontamination, and storage. The cut slices of Type B radwaste components generated in the cutting process undergo a sampling process before and after tritium removal. The purpose of sampling is to obtain small pieces of samples in order to investigate the tritium content and concentration of the Type B radwaste. Core sampling, one of the candidate sampling techniques for the ITER hot cell, is applicable to metal of limited thickness (less than 50 mm) without the use of coolant. The materials tested were SS316L and CuCrZr, chosen to simulate ITER Type B radwaste. In core sampling, substantial secondary waste in the form of cutting chips is unavoidably produced; thus, the core sampling machine will have to be equipped with a disposal system, such as suction equipment. Compared with conventional drilling, core sampling is also considered unfavorable with respect to tool wear.

  11. Nocardia isolation from clinical samples with the paraffin baiting technique.

    Science.gov (United States)

    Bafghi, Mehdi Fatahi; Heidarieh, Parvin; Soori, Tahereh; Saber, Sasan; Meysamie, Alipasha; Gheitoli, Khavar; Habibnia, Shadi; Rasouli Nasab, Masoumeh; Eshraghi, Seyyed Saeed

    2015-03-01

    The genus Nocardia is a cause of infection in the lungs, skin, brain, cerebrospinal fluid, eyes, joints and kidneys. Nocardia isolation from polymicrobial specimens is difficult due to its slow growth. Several methods have been reported for Nocardia isolation from clinical samples. In the current study, we used three methods, the paraffin baiting technique, paraffin agar, and conventional media, for Nocardia isolation from various clinical specimens from Iranian patients. We examined 517 samples from various clinical specimens, including sputum of patients with suspected tuberculosis, bronchoalveolar lavage, sputum of patients with cystic fibrosis, tracheal aspirate, cutaneous and subcutaneous abscesses, cerebrospinal fluid, dental abscess, mycetoma, wound, bone marrow biopsy, and gastric lavage. All collected specimens were cultured in carbon-free broth tubes (paraffin baiting technique) and on paraffin agar, Sabouraud dextrose agar, and Sabouraud dextrose agar with cycloheximide, and were incubated at 35°C for one month. Seven Nocardia spp. were isolated with the paraffin baiting technique, compared with 5 positive results with the paraffin agar technique and 3 with Sabouraud dextrose agar with and without cycloheximide. The prevalence of nocardial infections in our specimens was 5.28%. In the present study, the paraffin baiting technique appeared to be more effective than the other methods for Nocardia isolation from various clinical specimens.

  12. Samples and techniques highlighting the links between obesity and microbiota.

    Science.gov (United States)

    Angelakis, Emmanouil; Lagier, Jean-Christophe

    2017-05-01

    The composition of gut microbiota and its relationship to human health, particularly its links with obesity, remain an ongoing challenge for scientists. The current gold standard for exploring human gut microbiota consists of using stool samples and applying only next-generation sequencing techniques, which sometimes generate contradictory results. Here, we comprehensively describe nutrient absorption, including fat digestion and carbohydrate and protein absorption, demonstrating that absorption of these diverse nutrients occurs mainly in the stomach and small intestine. Indeed, bariatric surgery, including Roux-en-Y, removes part of the upper intestine, resulting in weight loss, while colonic surgery is associated with a stable weight. However, most studies use only stool samples rather than small intestine samples because of the ease with which the former can be accessed. Metagenomics studies are subject to several biases, such as extraction, primer and depth biases, even on the more modern platforms. High-throughput culture-dependent techniques, such as culturomics, which uses rapid identification methods such as MALDI-TOF, remain time-consuming but have demonstrated their complementarity with molecular techniques. In conclusion, we believe that a comprehensive analysis of the relationships between obesity and gut microbiota requires large-scale studies coupling metagenomics and culture-dependent research, in order to analyse both small intestine and stool samples. Copyright © 2016 Elsevier Ltd. All rights reserved.

  13. Nuts and Bolts - Techniques for Genesis Sample Curation

    Science.gov (United States)

    Burkett, Patti J.; Rodriquez, M. C.; Allton, J. H.

    2011-01-01

    The Genesis curation staff at NASA Johnson Space Center provides samples and data for analysis to the scientific community, following allocation approval by the Genesis Oversight Committee, a sub-committee of CAPTEM (Curation and Analysis Planning Team for Extraterrestrial Materials). We are often asked by investigators within the scientific community how we choose samples to best fit the requirements of a request. Here we demonstrate our techniques for characterizing samples and satisfying allocation requests; even with a systematic approach, every allocation is unique. We also provide an updated status of the cataloging and characterization of solar wind collectors as of January 2011. The collection consists of 3721 inventoried samples, each a single fragment or multiple fragments, containerized or pressed between post-it notes, in jars, or in vials of various sizes.

  14. Sampling Polymorphs of Ionic Solids using Random Superlattices.

    Science.gov (United States)

    Stevanović, Vladan

    2016-02-19

    Polymorphism offers a rich and virtually unexplored space for discovering novel functional materials. To harness this potential, approaches capable of both exploring the space of polymorphs and assessing their realizability are needed. One such approach, devised for partially ionic solids, is presented. The structure prediction part is carried out by performing local density functional theory relaxations on a large set of random superlattices (RSLs) with atoms distributed randomly over different planes in a way that favors cation-anion coordination. Applying RSL sampling to MgO, ZnO, and SnO_{2} reveals that the resulting probability of occurrence of a given structure offers a measure of its realizability, fully explaining the experimentally observed metastable polymorphs in these three systems.

  15. Atmospheric Sampling of Persistent Organic Pollutants: Needs, Applications and Advances in Passive Air Sampling Techniques

    Directory of Open Access Journals (Sweden)

    Wendy A. Ockenden

    2001-01-01

    Full Text Available There are numerous potential applications for validated passive sampling techniques to measure persistent organic pollutants (POPs in the atmosphere, but such techniques are still in their infancy. Potential uses include: monitoring to check for regulatory compliance and identification of potential sources; cheap/efficient reconnaissance surveying of the spatial distribution of POPs; and deployment in studies to investigate environmental processes affecting POP cycling. This article reviews and discusses the principles and needs of passive sampling methodologies. The timescales required for analytical purposes and for the scientific objectives of the study are critical in the choice and design of a passive sampler. Some techniques may operate over the timescales of hours/days, others over weeks/months/years. We distinguish between approaches based on "kinetic uptake" and "equilibrium partitioning". We highlight potentially useful techniques and discuss their potential advantages, disadvantages, and research requirements, drawing attention to the urgent need for detailed studies of sampler performance and calibration.

  16. Comparison of correlation analysis techniques for irregularly sampled time series

    Directory of Open Access Journals (Sweden)

    K. Rehfeld

    2011-06-01

    Full Text Available Geoscientific measurements often provide time series with irregular time sampling, requiring either data reconstruction (interpolation or sophisticated methods to handle irregular sampling. We compare the linear interpolation technique and different approaches for analyzing the correlation functions and persistence of irregularly sampled time series, as Lomb-Scargle Fourier transformation and kernel-based methods. In a thorough benchmark test we investigate the performance of these techniques.

    All methods have comparable root mean square errors (RMSEs) for low skewness of the inter-observation time distribution. For high skewness, i.e. very irregular data, interpolation bias and RMSE increase strongly. In the analysis of highly irregular time series, we find a 40 % lower RMSE for the lag-1 autocorrelation function (ACF) with the Gaussian kernel method than with the linear interpolation scheme; for the cross-correlation function (CCF), the RMSE is lower by 60 %. The Lomb-Scargle technique gave results comparable to the kernel methods in the univariate case, but poorer results in the bivariate case. In particular, the high-frequency components of the signal, where classical methods show a strong bias in ACF and CCF magnitude, are preserved when using the kernel methods.

    We illustrate the performances of interpolation vs. Gaussian kernel method by applying both to paleo-data from four locations, reflecting late Holocene Asian monsoon variability as derived from speleothem δ18O measurements. Cross correlation results are similar for both methods, which we attribute to the long time scales of the common variability. The persistence time (memory is strongly overestimated when using the standard, interpolation-based, approach. Hence, the Gaussian kernel is a reliable and more robust estimator with significant advantages compared to other techniques and suitable for large scale application to paleo-data.
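
A minimal sketch of the kernel idea (an illustration under stated assumptions, not the authors' exact implementation): every pair of observations contributes to the correlation at a given lag, with a Gaussian weight on how far its pairwise time difference is from that lag, so no interpolation onto a regular grid is needed:

```python
import numpy as np

def kernel_ccf(tx, x, ty, y, lags, h):
    """Gaussian-kernel cross-correlation of two irregularly sampled,
    standardized series: pairwise products are averaged with weights
    exp(-0.5*((dt - lag)/h)**2) on the pairwise time differences dt."""
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    dt = ty[None, :] - tx[:, None]       # all pairwise time differences
    prod = x[:, None] * y[None, :]       # all pairwise products
    out = []
    for lag in lags:
        w = np.exp(-0.5 * ((dt - lag) / h) ** 2)
        out.append((w * prod).sum() / w.sum())
    return np.array(out)

rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0.0, 100.0, 300))   # irregular observation times
x = np.sin(0.3 * t)
acf0 = kernel_ccf(t, x, t, x, lags=[0.0], h=0.5)[0]
print(round(float(acf0), 2))   # autocorrelation at zero lag, close to 1
```

Setting `ty, y` to a second series gives the CCF; the bandwidth `h` plays the role of the kernel width whose choice the benchmark above evaluates.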

  17. Classification of epileptic EEG signals based on simple random sampling and sequential feature selection

    OpenAIRE

    Ghayab, Hadi Ratham Al; Li, Yan; Abdulla, Shahab; Diykh, Mohammed; Wan, Xiangkui

    2016-01-01

    Electroencephalogram (EEG) signals are used broadly in the medical fields. The main applications of EEG signals are the diagnosis and treatment of diseases such as epilepsy, Alzheimer's disease and sleep problems. This paper presents a new method which extracts and selects features from multi-channel EEG signals. This research focuses on three main points. Firstly, a simple random sampling (SRS) technique is used to extract features from the time domain of EEG signals. Secondly, the sequential fea...
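
The SRS step can be sketched as follows; this is a hypothetical illustration (the choice of statistics, sample sizes and names below are assumptions, and the paper's sequential feature-selection stage is not reproduced):

```python
import random
import statistics

def srs_features(channel, k, n_groups, seed=0):
    """Draw n_groups simple random samples of size k (without replacement)
    from one signal channel, and summarize each with basic time-domain
    statistics as candidate features (illustrative choice of statistics)."""
    rng = random.Random(seed)
    features = []
    for _ in range(n_groups):
        subset = rng.sample(channel, k)
        features.append((statistics.mean(subset), statistics.stdev(subset),
                         min(subset), max(subset)))
    return features

rng = random.Random(1)
channel = [rng.gauss(0.0, 1.0) for _ in range(4096)]   # stand-in for an EEG trace
feats = srs_features(channel, k=256, n_groups=8)
print(len(feats), len(feats[0]))   # 8 feature vectors of 4 statistics each
```

The point of SRS here is dimensionality reduction: each long channel is summarized by a handful of statistics computed on random subsets rather than on the full record.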

  18. Comparison of sample preparation techniques for large-scale proteomics.

    Science.gov (United States)

    Kuljanin, Miljan; Dieters-Castator, Dylan Z; Hess, David A; Postovit, Lynne-Marie; Lajoie, Gilles A

    2017-01-01

    Numerous workflows exist for large-scale bottom-up proteomics, many of which achieve exceptional proteome depth. Herein, we evaluated the performance of several commonly used sample preparation techniques for proteomic characterization of HeLa lysates [unfractionated in-solution digests, SDS-PAGE coupled with in-gel digestion, gel-eluted liquid fraction entrapment electrophoresis (GELFrEE) technology, SCX StageTips and high-/low-pH reversed phase fractionation (HpH)]. HpH fractionation was found to be superior in terms of proteome depth (>8400 proteins detected) and fractionation efficiency compared to other techniques. SCX StageTip fractionation required minimal sample handling and was also a substantial improvement over SDS-PAGE separation and GELFrEE technology. Sequence coverage of the HeLa proteome increased to 38% when combining all workflows, however, total proteins detected improved only slightly to 8710. In summary, HpH fractionation and SCX StageTips are robust techniques and highly suited for complex proteome analysis. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  19. Multilayer pixel super-resolution lensless in-line holographic microscope with random sample movement.

    Science.gov (United States)

    Wang, Mingjun; Feng, Shaodong; Wu, Jigang

    2017-10-06

    We report a multilayer lensless in-line holographic microscope (LIHM) with improved imaging resolution by using the pixel super-resolution technique and random sample movement. In our imaging system, a laser beam illuminated the sample and a CMOS imaging sensor located behind the sample recorded the in-line hologram for image reconstruction. During the imaging process, the sample was moved by hand randomly and the in-line holograms were acquired sequentially. Then the sample image was reconstructed from an enhanced-resolution hologram obtained from multiple low-resolution in-line holograms by applying the pixel super-resolution (PSR) technique. We studied the resolution enhancement effects by using the U.S. Air Force (USAF) target as the sample in numerical simulation and experiment. We also showed that multilayer pixel super-resolution images can be obtained by imaging a triple-layer sample made with the filamentous algae on the middle layer and microspheres with diameter of 2 μm on the top and bottom layers. Our pixel super-resolution LIHM provides a compact and low-cost solution for microscopic imaging and is promising for many biomedical applications.

  20. Randomized trial of tapas acupressure technique for weight loss maintenance

    Directory of Open Access Journals (Sweden)

    Elder Charles R

    2012-03-01

    Full Text Available Abstract Background Obesity is an urgent public health problem, yet only a few clinical trials have systematically tested the efficacy of long-term weight-loss maintenance interventions. This randomized clinical trial tested the efficacy of a novel mind and body technique for weight-loss maintenance. Methods Participants were obese adults who had completed a six-month behavioral weight-loss program prior to randomization. Those who successfully lost weight were randomized into either an experimental weight-loss maintenance intervention, Tapas Acupressure Technique (TAT®), or a control intervention comprised of social-support group meetings (SS) led by professional facilitators. TAT combines self-applied light pressure to specific acupressure points accompanied by a prescribed sequence of mental steps. Participants in both maintenance conditions attended eight group sessions over six months of active weight loss maintenance intervention, followed by an additional 6 months of no intervention. The main outcome measure was change in weight from the beginning of the weight loss maintenance intervention to 12 months later. Secondary outcomes were change in depression, stress, insomnia, and quality of life. We used analysis of covariance as the primary analysis method. Missing values were replaced using multiple imputation. Results Among 285 randomized participants, 79% were female, mean age was 56 (standard deviation (sd) = 11), mean BMI at randomization was 34 (sd = 5), and mean initial weight loss was 9.8 kg (sd = 5). In the primary outcome model, there was no significant difference in weight regain between the two arms (1.72 kg (se 0.85) weight regain for TAT and 2.96 kg (se 0.96) weight regain for SS, p post hoc tests showing that greater initial weight loss was associated with more weight regain for SS but less weight regain for TAT. Conclusions The primary analysis showed no significant difference in weight regain between TAT and SS, while secondary

  1. Comparing efficiency of American Fisheries Society standard snorkeling techniques to environmental DNA sampling techniques

    Science.gov (United States)

    Ulibarri, Roy M.; Bonar, Scott A.; Rees, Christopher B.; Amberg, Jon J.; Ladell, Bridget; Jackson, Craig

    2017-01-01

    Analysis of environmental DNA (eDNA) is an emerging technique used to detect aquatic species through water sampling and the extraction of biological material for amplification. Our study compared the efficacy of eDNA methodology to American Fisheries Society (AFS) standard snorkeling surveys with regard to detecting the presence of rare fish species. Knowing which method is more efficient at detecting target species will help managers to determine the best way to sample when both traditional sampling methods and eDNA sampling are available. Our study site included three Navajo Nation streams that contained Navajo Nation Genetic Subunit Bluehead Suckers Catostomus discobolus and Zuni Bluehead Suckers C. discobolus yarrowi. We first divided the entire wetted area of streams into consecutive 100-m reaches and then systematically selected 10 reaches/stream for snorkel and eDNA surveys. Surface water samples were taken in 10-m sections within each 100-m reach, while fish presence was noted via snorkeling in each 10-m section. Quantitative PCR was run on each individual water sample in quadruplicate to test for the presence or absence of the target species. With eDNA sampling techniques, we were able to positively detect both species in two out of the three streams. Snorkeling resulted in positive detection of both species in all three streams. In streams where the target species were detected with eDNA sampling, snorkeling detected fish at 11–29 sites/stream, whereas eDNA detected fish at 3–12 sites/stream. Our results suggest that AFS standard snorkeling is more effective than eDNA for detecting target fish species. To improve our eDNA procedures, the amount of water collected and tested should be increased. Additionally, filtering water on-site may improve eDNA techniques for detecting fish. Future research should focus on standardization of eDNA sampling to provide a widely operational sampling tool.

  2. Appendectomy Skin Closure Technique, Randomized Controlled Trial: Changing Paradigms (ASC).

    Science.gov (United States)

    Andrade, Luis Angel Medina; Muñoz, Franz Yeudiel Pérez; Báez, María Valeria Jiménez; Collazos, Stephanie Serrano; de Los Angeles Martinez Ferretiz, Maria; Ruiz, Brenda; Montes, Oscar; Woolf, Stephanie; Noriega, Jessica Gonzalez; Aparicio, Uriel Maldonado; Gonzalez, Israel Gonzalez

    2016-11-01

    Appendectomy is the most frequent urgent gastrointestinal surgery. Over time, surgical techniques have been refined to reduce complications, improve cosmetic results, and limit the discomfort associated with this procedure, given its high impact on surgery departments. Traditional skin closure is associated with a poor cosmetic result and requires stitch removal, with the pain this entails, and the literature has demonstrated no benefit of separated stitches over an intradermic stitch. This is a randomized controlled trial whose objective is to compare two different skin closure techniques in open appendectomy. A prospective randomized trial method was used, with a total of 208 patients participating in the study after a diagnosis of acute appendicitis in the emergency department. They were randomized into two groups: patients who received skin closure with a single absorbable intradermic stitch (Group A) and patients who received the traditional closure technique, consisting of non-absorbable separated stitches (Group B). General characteristics such as gender, age, Body Mass Index (BMI), comorbidities, and allergies were registered. Days of Evolution (DOE) until surgery, previous use of antibiotics, complicated or uncomplicated appendicitis, surgical time, and wound complications such as skin infection, dehiscence, seroma or abscess were also registered in each case. Eight patients were excluded due to negative appendicitis at surgery or lack of follow-up. Two groups of 100 patients each were formed. General characteristics and parity were compared, with no statistically significant differences observed. Difference in surgical time (Group A: 47.35 min vs Group B: 54.13 min, p  25 kg/m2 and seroma (p = .006), BMI > 25 kg/m2 and abscess (p = .02), surgical time >50 min and seroma (p 2 DOE and abscess (p = .001), and complicated appendicitis with

  3. Extraction techniques in speciation analysis of environmental samples

    Energy Technology Data Exchange (ETDEWEB)

    Morabito, R. [ENEA Casaccia, Rome (Italy). Environmental Dept.

    1995-02-01

    One of the main problems in speciation analysis is that the different species of metals are present in complex matrices at very low concentration levels. Thus it is almost always necessary to separate the analytes of concern from the matrix and to concentrate them up to detectable concentration levels. Special care should be paid during extraction in order to avoid any contamination of samples, losses and changes in speciation of analytes of concern. The most common extraction techniques for speciation analysis of Pb, Sn, Hg, Cr, As, Se and Sb in liquid and solid samples are presented and briefly discussed. Due to the large quantity of material to be covered, speciation of alkyl, aryl, and macromolecular compounds (porphyrines, thioneines, etc.) has not been taken into account. (orig.)

  4. SROT: Sparse representation-based over-sampling technique for classification of imbalanced dataset

    Science.gov (United States)

    Zou, Xionggao; Feng, Yueping; Li, Huiying; Jiang, Shuyu

    2017-08-01

    As one of the most popular research fields in machine learning, research on imbalanced datasets has received more and more attention in recent years. The imbalance problem usually occurs when minority classes have far fewer samples than the others. Traditional classification algorithms do not take the distribution of the dataset into consideration; thus they fail at class-imbalanced learning, and classification performance tends to be dominated by the majority class. SMOTE is one of the most effective over-sampling methods for this problem, changing the distribution of the training set by increasing the size of the minority class. However, SMOTE easily results in over-fitting on account of too many repetitive data samples. To address this issue, this paper proposes an improved method based on sparse representation theory and over-sampling, named SROT (Sparse Representation-based Over-sampling Technique). SROT uses a sparse dictionary to create synthetic samples directly for solving the imbalance problem. The experiments are performed on 10 UCI datasets using C4.5 as the learning algorithm. The experimental results show that, compared with random over-sampling techniques, SMOTE and other methods, SROT achieves better performance in terms of AUC.
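
SROT's sparse-dictionary generation is not reproduced here, but the SMOTE baseline it is compared against can be sketched in a few lines: each synthetic sample is a random interpolation between a minority point and one of its k nearest minority neighbors (names and data below are illustrative):

```python
import numpy as np

def smote_like(X_min, n_new, k=5, rng=None):
    """Generate n_new synthetic minority-class samples, SMOTE-style:
    pick a minority point, pick one of its k nearest minority neighbors,
    and interpolate a random fraction of the way toward it."""
    if rng is None:
        rng = np.random.default_rng(0)
    synth = []
    for _ in range(n_new):
        i = rng.integers(len(X_min))
        dist = np.linalg.norm(X_min - X_min[i], axis=1)
        neighbors = np.argsort(dist)[1:k + 1]   # skip the point itself
        j = rng.choice(neighbors)
        synth.append(X_min[i] + rng.random() * (X_min[j] - X_min[i]))
    return np.array(synth)

rng = np.random.default_rng(0)
X_min = rng.normal(size=(20, 2))        # toy minority class
X_new = smote_like(X_min, n_new=40, rng=rng)
print(X_new.shape)                      # (40, 2)
```

Because each synthetic point lies on a segment between two existing minority points, the method enlarges the minority class without verbatim duplication, which is exactly the over-fitting issue the abstract attributes to naive replication.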

  5. Randomly Sampled-Data Control Systems. Ph.D. Thesis

    Science.gov (United States)

    Han, Kuoruey

    1990-01-01

    The purpose is to solve the Linear Quadratic Regulator (LQR) problem with random time sampling. Such a sampling scheme may arise from imperfect instrumentation, as in the case of sampling jitter; it can also model stochastic information exchange among decentralized controllers, to name just a few possibilities. A practical suboptimal controller is proposed with the desirable property of mean square stability. The proposed controller is suboptimal in the sense that the control structure is limited to be linear; because of the i.i.d. sampling assumption, this restriction does not seem unreasonable. Once the control structure is fixed, the stochastic discrete optimal control problem is transformed into an equivalent deterministic optimal control problem with dynamics described by a matrix difference equation. The N-horizon control problem is solved using the Lagrange multiplier method, and the infinite-horizon control problem is formulated as a classical minimization problem. Assuming existence of a solution to the minimization problem, the total system is shown to be mean square stable under certain observability conditions. Computer simulations are performed to illustrate these conditions.

  6. A Table-Based Random Sampling Simulation for Bioluminescence Tomography

    Directory of Open Access Journals (Sweden)

    Xiaomeng Zhang

    2006-01-01

    Full Text Available Although Monte Carlo (MC) simulation is a popular model of photon propagation in turbid media, its main drawback is cumbersome computation. In this work a table-based random sampling simulation (TBRS) is proposed. The key idea of TBRS is to collapse multiple scattering steps into a single-step process through random table querying, greatly reducing the computational complexity of the conventional MC algorithm and expediting the computation. TBRS is thus a fast version of the conventional MC simulation of photon propagation: it retains the merits of flexibility and accuracy of the conventional MC method and adapts well to complex geometric media and various source shapes. Both MC simulations were conducted in a homogeneous medium in our work. We also present a reconstruction approach, based on trial and error, to estimate the position of a fluorescent source, as a validation of the TBRS algorithm. Good agreement is found between the conventional MC simulation and the TBRS simulation.
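
The table idea can be illustrated with a toy version for the photon free-path length (the paper's tables additionally collapse several scattering steps; the names and numbers here are illustrative assumptions): tabulate the inverse CDF once, then each draw is a single random table query instead of an on-the-fly logarithm:

```python
import math
import random

def build_free_path_table(mu_t, n=1024):
    """Tabulate the inverse CDF of the free-path density
    p(s) = mu_t*exp(-mu_t*s) at midpoint quantiles u = (i+0.5)/n,
    using F^{-1}(u) = -ln(1-u)/mu_t."""
    return [-math.log(1.0 - (i + 0.5) / n) / mu_t for i in range(n)]

def sample_free_path(table, rng):
    """One path-length draw = one uniform random table query."""
    return table[rng.randrange(len(table))]

rng = random.Random(42)
table = build_free_path_table(mu_t=10.0)      # mean free path 1/mu_t = 0.1
draws = [sample_free_path(table, rng) for _ in range(100000)]
print(round(sum(draws) / len(draws), 2))      # sample mean near 0.1
```

The table is built once per optical-property set, so the per-photon cost drops to an array lookup; accuracy is controlled by the table size `n`.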

  7. Determination of metals in air samples using X-Ray fluorescence associated the APDC preconcentration technique

    Energy Technology Data Exchange (ETDEWEB)

    Nardes, Raysa C.; Santos, Ramon S.; Sanches, Francis A.C.R.A.; Gama Filho, Hamilton S.; Oliveira, Davi F.; Anjos, Marcelino J., E-mail: rc.nardes@gmail.com, E-mail: ramonziosp@yahoo.com.br, E-mail: francissanches@gmail.com, E-mail: hamiltongamafilho@hotmail.com, E-mail: davi.oliveira@uerj.br, E-mail: marcelin@uerj.br [Universidade do Estado do Rio de Janeiro (UERJ), Rio de Janeiro, RJ (Brazil). Instituto de Fisica. Departamento de Fisica Aplicada e Termodinamica

    2015-07-01

    Air pollution has become one of the leading factors degrading quality of life for people in large urban centers. Studies indicate that particulate matter suspended in the atmosphere is directly associated with risks to public health; in addition, it can damage fauna, flora and public/cultural patrimony. Inhalable particulate matter can cause the emergence and/or worsening of chronic respiratory diseases and other effects, such as reduced physical capacity. In this study, we propose a new method to measure the concentration of total suspended particulate matter (TSP) in air, using an impinger as the air sampling apparatus, pre-concentration with APDC, and the total reflection X-ray fluorescence (TXRF) technique to analyze the heavy metals present in the air. The samples were collected from five random points in the city of Rio de Janeiro, Brazil. The TXRF analyses were performed at the Brazilian Synchrotron Light Laboratory (LNLS). The technique proved viable, detecting five metallic elements important in environmental studies: Cr, Fe, Ni, Cu and Zn. It determined the elemental concentrations of air pollutants efficiently and at low cost. It can be concluded that the analysis of metals in air samples, using an impinger as the sample collection instrument together with a complexing agent (APDC), is a viable, low-cost technique that detected five metallic elements relevant to environmental studies of industrial emissions and urban traffic. (author)

  8. Innovative techniques for sampling stream-inhabiting salamanders

    Energy Technology Data Exchange (ETDEWEB)

    T.M. Luhring; C.A. Young

    2006-01-01

    Although salamanders are excellent indicators of environmental health, the ability to catch them efficiently without substantially disrupting their habitat is not always practical or even possible with current techniques. Ripping open logs and raking leaf packs onto shore (Bruce 1972) are examples of such practices that are disruptive but widely used by herpetologists who have no other means of efficient collection. Drift fences with pitfall traps are effective in catching animals moving within or between habitats but are time consuming and require an initial financial investment and constant upkeep to maintain functionality and prevent animal fatalities (Gibbons and Semlitsch 1981). One current alternative to drift fences is the use of coverboards (Grant et al. 1992), which require less maintenance and sampling effort than drift fences. However, coverboards do not integrate captures over a long time period and often result in a lower number of captures per trap (Grant et al. 1992).

  9. On analysis-based two-step interpolation methods for randomly sampled seismic data

    Science.gov (United States)

    Yang, Pengliang; Gao, Jinghuai; Chen, Wenchao

    2013-02-01

    Interpolating the missing traces of regularly or irregularly sampled seismic record is an exceedingly important issue in the geophysical community. Many modern acquisition and reconstruction methods are designed to exploit the transform domain sparsity of the few randomly recorded but informative seismic data using thresholding techniques. In this paper, to regularize randomly sampled seismic data, we introduce two accelerated, analysis-based two-step interpolation algorithms, the analysis-based FISTA (fast iterative shrinkage-thresholding algorithm) and the FPOCS (fast projection onto convex sets) algorithm from the IST (iterative shrinkage-thresholding) algorithm and the POCS (projection onto convex sets) algorithm. A MATLAB package is developed for the implementation of these thresholding-related interpolation methods. Based on this package, we compare the reconstruction performance of these algorithms, using synthetic and real seismic data. Combined with several thresholding strategies, the accelerated convergence of the proposed methods is also highlighted.
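    The thresholding iteration at the heart of such POCS-style interpolation can be sketched for a 1-D signal that is sparse in the Fourier domain. The function below is an illustrative sketch only (our own naming and a simple linearly decreasing threshold schedule), not the accelerated FISTA/FPOCS algorithms of the paper:

```python
import numpy as np

def pocs_interpolate(y, mask, n_iter=100):
    """POCS-style reconstruction (sketch): alternately threshold the
    Fourier spectrum (sparsity constraint) and re-insert the observed
    samples (data-consistency constraint)."""
    x = y.copy()
    c0 = np.abs(np.fft.fft(y)).max()
    for k in range(n_iter):
        c = np.fft.fft(x)
        tau = c0 * (1 - k / n_iter)      # linearly decreasing threshold
        c[np.abs(c) < tau] = 0
        x = np.real(np.fft.ifft(c))
        x = mask * y + (1 - mask) * x    # keep the known samples
    return x

rng = np.random.default_rng(0)
n = 256
t = np.arange(n) / n
x_true = np.sin(2 * np.pi * 5 * t) + 0.5 * np.sin(2 * np.pi * 12 * t)
mask = (rng.random(n) < 0.6).astype(float)   # ~60% of samples observed
recovered = pocs_interpolate(x_true * mask, mask)
err = np.max(np.abs(recovered - x_true))     # small for this sparse signal
```

The accelerated methods in the paper replace this plain iteration with momentum-based updates for faster convergence, but the projection/thresholding structure is the same.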

  10. Studies on spectral analysis of randomly sampled signals: Application to laser velocimetry data

    Science.gov (United States)

    Sree, David

    1992-01-01

    Spectral analysis is very useful in determining the frequency characteristics of many turbulent flows, for example, vortex flows, tail buffeting, and other pulsating flows. It is also used for obtaining turbulence spectra from which the time and length scales associated with the turbulence structure can be estimated. These estimates, in turn, can be helpful for validation of theoretical/numerical flow turbulence models. Laser velocimetry (LV) is being extensively used in the experimental investigation of different types of flows, because of its inherent advantages; nonintrusive probing, high frequency response, no calibration requirements, etc. Typically, the output of an individual realization laser velocimeter is a set of randomly sampled velocity data. Spectral analysis of such data requires special techniques to obtain reliable estimates of correlation and power spectral density functions that describe the flow characteristics. FORTRAN codes for obtaining the autocorrelation and power spectral density estimates using the correlation-based slotting technique were developed. Extensive studies have been conducted on simulated first-order spectrum and sine signals to improve the spectral estimates. A first-order spectrum was chosen because it represents the characteristics of a typical one-dimensional turbulence spectrum. Digital prefiltering techniques, to improve the spectral estimates from randomly sampled data were applied. Studies show that the spectral estimates can be increased up to about five times the mean sampling rate.
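    The correlation-based slotting idea (average products of sample pairs binned by their lag) can be sketched as follows; the test signal and slot parameters here are illustrative, not those of the study:

```python
import numpy as np

def slotted_autocorr(t, u, max_lag, n_slots):
    """Slotting technique (sketch): estimate the autocorrelation of a
    randomly sampled signal by averaging products of all sample pairs
    whose lag falls into the same slot."""
    dt = max_lag / n_slots
    num = np.zeros(n_slots)
    cnt = np.zeros(n_slots)
    for i in range(len(t) - 1):
        lags = t[i + 1:] - t[i]           # lags to all later samples
        sel = lags < max_lag
        k = (lags[sel] / dt).astype(int)  # slot index for each pair
        np.add.at(num, k, u[i] * u[i + 1:][sel])
        np.add.at(cnt, k, 1)
    return num / np.maximum(cnt, 1)

rng = np.random.default_rng(1)
t = np.sort(rng.uniform(0.0, 50.0, 4000))  # random sampling instants
u = np.sin(2 * np.pi * t)                  # 1 Hz test "velocity" signal
r = slotted_autocorr(t, u - u.mean(), max_lag=1.0, n_slots=20)
# r approximates 0.5*cos(2*pi*lag): positive near lag 0, negative near 0.5
```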

  11. ANALYSIS OF MONTE CARLO SIMULATION SAMPLING TECHNIQUES ON SMALL SIGNAL STABILITY OF WIND GENERATOR- CONNECTED POWER SYSTEM

    Directory of Open Access Journals (Sweden)

    TEMITOPE RAPHAEL AYODELE

    2016-04-01

    Monte Carlo simulation using the Simple Random Sampling (SRS) technique is popularly known for its ability to handle complex uncertainty problems. However, to produce a reasonable result it requires a huge sample size, which makes it computationally expensive, time consuming and unfit for online power system applications. In this article, the performance of the Latin Hypercube Sampling (LHS) technique is explored and compared with SRS in terms of accuracy, robustness and speed for small signal stability application in a wind generator-connected power system. The analysis is performed using probabilistic techniques via eigenvalue analysis on two standard networks (Single Machine Infinite Bus and the IEEE 16-machine 68-bus test system). The accuracy of the two sampling techniques is determined by comparing their different sample sizes with the IDEAL (conventional) result. The robustness is determined from the variance reduction observed when the experiment is repeated 100 times with different sample sizes using the two sampling techniques in turn. The results show that sample sizes generated from LHS produce the same result as the IDEAL values starting from a sample size of 100, so about 100 samples generated using the LHS method are good enough to produce reasonable results for practical purposes in small signal stability application. LHS also has the least variance when the experiment is repeated 100 times, signifying its robustness over SRS. A 100-sample LHS run produces the same result as the conventional method with a sample size of 50000; the reduced sample size required by LHS gives it a computational speed advantage (about six times) over the conventional method.
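    The variance-reduction argument is easy to reproduce in one dimension; the toy quadratic response below merely stands in for the eigenvalue computation of the paper:

```python
import numpy as np

def lhs(n, rng):
    """1-D Latin Hypercube Sample: exactly one point per stratum [k/n, (k+1)/n)."""
    return (rng.permutation(n) + rng.random(n)) / n

rng = np.random.default_rng(2)
f = lambda x: x ** 2                          # toy "system response"
# repeat each experiment 200 times and compare estimator variance
est_srs = [f(rng.random(100)).mean() for _ in range(200)]
est_lhs = [f(lhs(100, rng)).mean() for _ in range(200)]
# the LHS estimate of E[f(X)] has far smaller variance than plain SRS
```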

  12. Novel technique to separate systematic and random defects during 65nm and 45nm process development

    Science.gov (United States)

    Yeh, J. H.; Park, Allen

    2007-03-01

    Defect inspections performed in R&D may often result in 100k to 1M defect counts on a single wafer. Such defect data combine systematic and random defects that may be yield limiting or just nuisance defects. It is difficult to identify systematic defects from a defect wafer map by traditional defect classification, where a random sample of 50 to 100 defects is reviewed on a review SEM, and missing important systematic defect types through traditional sampling can be very costly in device introduction. Efficiently sampling defects for SEM review is not only challenging but can result in a Pareto that lacks usefulness for R&D and for yield improvement. To mitigate the issue and to shorten the yield improvement cycle in advanced technology, a novel method has been proposed. Instead of using a random sampling method, we applied a pattern search engine to correlate defects of interest (DOI) to their pattern background. Based on this approach we identified an important defect type, the STI cave defect, as the major defect type on the defect Pareto. For this defect type, a stacked die map was generated that indicated a distinctive signature. The result was compared against the design layout to confirm that the defects were occurring at certain locations of the layout. Afterwards the defect types were reviewed using SEM and in-line FIB for further confirmation. We found the cause of this void defect type to be poor gap-fill in the deposition step. Based on the novel technique, we were able to filter out a systematic defect type quickly and efficiently from a wafer map that consists of random and systematic defects.

  13. Classification of Phishing Email Using Random Forest Machine Learning Technique

    Directory of Open Access Journals (Sweden)

    Andronicus A. Akinyelu

    2014-01-01

    Phishing is one of the major challenges faced by the world of e-commerce today. Thanks to phishing attacks, billions of dollars have been lost by many companies and individuals. In 2012, an online report put the loss due to phishing attacks at about $1.5 billion. This global impact of phishing attacks will continue to increase and thus requires more efficient phishing detection techniques to curb the menace. This paper investigates and reports the use of the random forest machine learning algorithm in classification of phishing attacks, with the major objective of developing an improved phishing email classifier with better prediction accuracy and fewer features. From a dataset consisting of 2000 phishing and ham emails, a set of prominent phishing email features (identified from the literature) was extracted and used by the machine learning algorithm, with a resulting classification accuracy of 99.7% and low false negative (FN) and false positive (FP) rates.
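    The classifier pipeline can be sketched with scikit-learn; the synthetic features and labels below are our own stand-in for the phishing-specific features the paper extracts from its email corpus:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
# synthetic stand-in features (e.g. URL count, has-IP-link, keyword flags)
n = 2000
X = rng.random((n, 8))
y = (X[:, 0] + X[:, 1] > 1.0).astype(int)   # toy "phishing" labeling rule
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print(clf.score(X_te, y_te))  # high on this toy task; the 99.7% in the
                              # paper refers to their real email features
```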

  14. Macro elemental analysis of food samples by nuclear analytical technique

    Science.gov (United States)

    Syahfitri, W. Y. N.; Kurniawati, S.; Adventini, N.; Damastuti, E.; Lestiani, D. D.

    2017-06-01

    Energy-dispersive X-ray fluorescence (EDXRF) spectrometry is a non-destructive, rapid, multi-elemental, accurate, and environmentally friendly analysis method compared with other detection methods; thus, EDXRF spectrometry is applicable to food inspection. The macro elements calcium and potassium constitute important nutrients required by the human body for optimal physiological functions. Therefore, the determination of Ca and K content in various foods needs to be done. The aim of this work is to demonstrate the applicability of EDXRF to food analysis. The analytical performance of non-destructive EDXRF was compared with two other analytical techniques, neutron activation analysis (NAA) and atomic absorption spectrometry (AAS); the methods were compared to cross-check the analytical results and to overcome the limitations of each. The results showed that Ca contents found in food using EDXRF and AAS were not significantly different (p-value 0.9687), whereas the p-value for K between EDXRF and NAA was 0.6575. The correlation between the results was also examined: the Pearson correlations for Ca and K were 0.9871 and 0.9558, respectively. Method validation using SRM NIST 1548a Typical Diet was also applied. The results showed good agreement between methods; therefore, the EDXRF method can be used as an alternative method for the determination of Ca and K in food samples.

  15. The pursuit of balance: An overview of covariate-adaptive randomization techniques in clinical trials.

    Science.gov (United States)

    Lin, Yunzhi; Zhu, Ming; Su, Zheng

    2015-11-01

    Randomization is fundamental to the design and conduct of clinical trials. Simple randomization ensures independence among subject treatment assignments and prevents potential selection biases, yet it does not guarantee balance in covariate distributions across treatment groups. Ensuring balance in important prognostic covariates across treatment groups is desirable for many reasons. A broad class of randomization methods for achieving balance are reviewed in this paper; these include block randomization, stratified randomization, minimization, and dynamic hierarchical randomization. Practical considerations arising from experience with using the techniques are described. A review of randomization methods used in practice in recent randomized clinical trials is also provided. Copyright © 2015 Elsevier Inc. All rights reserved.
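    Permuted-block randomization, the first of the reviewed methods, can be sketched in a few lines (the arm labels and block size below are illustrative):

```python
import random

def block_randomize(n_subjects, block_size=4, arms=("A", "B"), seed=0):
    """Permuted-block randomization (sketch): within every block of
    block_size assignments each arm appears equally often, so group
    sizes can never drift far apart."""
    rng = random.Random(seed)
    per_arm = block_size // len(arms)
    schedule = []
    while len(schedule) < n_subjects:
        block = list(arms) * per_arm   # balanced block of assignments
        rng.shuffle(block)             # random order within the block
        schedule.extend(block)
    return schedule[:n_subjects]

schedule = block_randomize(20)
print(schedule.count("A"), schedule.count("B"))  # 10 10
```

Stratified randomization runs one such schedule per stratum; minimization and dynamic hierarchical methods go further by adapting each assignment to the current covariate imbalance.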

  16. Classification of Phishing Email Using Random Forest Machine Learning Technique

    National Research Council Canada - National Science Library

    Akinyelu, Andronicus A; Adewumi, Aderemi O

    2014-01-01

    .... This paper investigates and reports the use of random forest machine learning algorithm in classification of phishing attacks, with the major objective of developing an improved phishing email...

  17. Accuracy and sampling error of two age estimation techniques using rib histomorphometry on a modern sample.

    Science.gov (United States)

    García-Donas, Julieta G; Dyke, Jeffrey; Paine, Robert R; Nathena, Despoina; Kranioti, Elena F

    2016-02-01

    Most age estimation methods prove problematic when applied to highly fragmented skeletal remains. Rib histomorphometry is advantageous in such cases; yet it is vital to test and revise existing techniques, particularly when they are used in legal settings (Crowder and Rosella, 2007). This study tested the Stout and Paine (1992) and Stout et al. (1994) histological age estimation methods on a modern Greek sample using different sampling sites. Six left 4th ribs of known age and sex were selected from a modern skeletal collection. Each rib was cut into three equal segments and two thin sections were acquired from each segment, giving a total of 36 thin sections. Four variables (cortical area, intact and fragmented osteon density, and osteon population density) were calculated for each section, and age was estimated according to Stout and Paine (1992) and Stout et al. (1994). The results showed that both methods produced a systematic underestimation of age (by up to 43 years), although a general improvement in accuracy was observed when applying the Stout et al. (1994) formula. Error rates increase with age, with the oldest individual showing extreme differences between real and estimated age. Comparison of the different sampling sites showed small differences between the estimated ages, suggesting that any fragment of the rib could be used without introducing significant error; yet a larger sample should be used to confirm these results. Copyright © 2015 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.

  18. A system identification technique based on the random decrement signatures. Part 1: Theory and simulation

    Science.gov (United States)

    Bedewi, Nabih E.; Yang, Jackson C. S.

    1987-01-01

    Identification of the system parameters of a randomly excited structure may be treated using a variety of statistical techniques. Of all these techniques, the Random Decrement is unique in that it provides the homogeneous component of the system response. Using this quality, a system identification technique was developed based on a least-squares fit of the signatures to estimate the mass, damping, and stiffness matrices of a linear randomly excited system. The mathematics of the technique is presented in addition to the results of computer simulations conducted to demonstrate the prediction of the response of the system and the random forcing function initially introduced to excite the system.
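    A level-trigger variant of the random decrement signature is easy to sketch; the single-degree-of-freedom simulation below is our own illustration, not the authors' system or their least-squares identification step:

```python
import numpy as np

def random_decrement(x, trigger, seg_len):
    """Random decrement signature (sketch): average all segments that
    start where the response exceeds the trigger level; the zero-mean
    random forcing averages out, leaving (approximately) the free-decay
    component of the response."""
    starts = np.where(x[:len(x) - seg_len] >= trigger)[0]
    segs = np.stack([x[i:i + seg_len] for i in starts])
    return segs.mean(axis=0)

# randomly excited damped oscillator (semi-implicit Euler, f_n = 1 Hz)
rng = np.random.default_rng(4)
dt, wn, zeta = 0.01, 2 * np.pi, 0.02
x = np.zeros(60000)
v = 0.0
for i in range(1, len(x)):
    v += (-2 * zeta * wn * v - wn ** 2 * x[i - 1]
          + 50 * rng.standard_normal()) * dt
    x[i] = x[i - 1] + v * dt

sig = random_decrement(x, trigger=x.std(), seg_len=200)
# sig starts above the trigger level and oscillates at the damped natural
# frequency with a slowly decaying envelope set by the damping ratio
```

Fitting a mass/damping/stiffness model to such signatures, as in the paper, is then an ordinary least-squares problem on the averaged free decay.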

  19. Modified Transseptal Puncture Technique in Challenging Septa: A Randomized Comparison to Conventional Technique

    Directory of Open Access Journals (Sweden)

    Vikas Kataria

    2017-01-01

    Background. Transseptal puncture (TSP) can be challenging. We compared the safety and efficacy of a modified TSP technique ("mosquito" technique, MOSQ-TSP) to conventional TSP (CONV-TSP). Method. Patients undergoing AF ablation in whom the first attempt of TSP did not result in left atrial (LA) pressure (failure to cross, FTC) were randomized to MOSQ-TSP (i.e., puncture of the fossa via a wafer-thin inner stylet) or CONV-TSP (i.e., additional punctures at different positions). The primary endpoint was LA access. Secondary endpoints were safety, time, fluoroscopic dose (dose-area product, DAP), and number of additional punctures from FTC to final LA access. Result. Of 384 patients, 68 had FTC (MOSQ-TSP, n=34 versus CONV-TSP, n=34). No complications were reported. In MOSQ-TSP, the primary endpoint was reached in 100% (versus 73.5%, p<0.002), median time to LA access was 72 s [from 37 to 384 s] (versus 326 s [from 75 s to 1936 s], p<0.002), mean DAP to LA access was 1778±2315 mGy/cm2 (versus 9347±10690 mGy/cm2, p<0.002), and the median number of additional punctures was 2 [1 to 3] (versus 0, p<0.002). Conclusion. In AF patients in whom the first attempt of TSP fails, the "mosquito" technique allows effective, safe, and time-sparing LA access. This approach might facilitate TSP in elastic, aneurysmatic, or fibrosed septa.

  20. Multiple sensitive estimation and optimal sample size allocation in the item sum technique.

    Science.gov (United States)

    Perri, Pier Francesco; Rueda García, María Del Mar; Cobo Rodríguez, Beatriz

    2017-09-27

    For surveys of sensitive issues in life sciences, statistical procedures can be used to reduce nonresponse and social desirability response bias. Both of these phenomena provoke nonsampling errors that are difficult to deal with and can seriously flaw the validity of the analyses. The item sum technique (IST) is a very recent indirect questioning method, derived from the item count technique, that seeks to procure more reliable responses on quantitative items than direct questioning while preserving respondents' anonymity. This article addresses two important questions concerning the IST: (i) its implementation when two or more sensitive variables are investigated and efficient estimates of their unknown population means are required; (ii) the determination of the optimal sample size to achieve minimum variance estimates. These aspects are of great relevance for survey practitioners engaged in sensitive research and, to the best of our knowledge, have not been studied so far. In this article, theoretical results for multiple estimation and optimal allocation are obtained under a generic sampling design and then particularized to simple random sampling and stratified sampling designs. Theoretical considerations are integrated with a number of simulation studies, based on data from two real surveys, conducted to ascertain the efficiency gain derived from optimal allocation in different situations. One of the surveys concerns cannabis consumption among university students. Our findings highlight some methodological advances that can be obtained in life sciences IST surveys when optimal allocation is achieved. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
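    For a single study variable under stratified sampling, the classical minimum-variance (Neyman) allocation that such optimality results extend takes n_h proportional to N_h * S_h; a minimal sketch with made-up stratum figures:

```python
def neyman_allocation(n_total, stratum_sizes, stratum_sds):
    """Neyman (minimum-variance) allocation sketch: sample each stratum
    in proportion to its size N_h times its standard deviation S_h."""
    w = [N * S for N, S in zip(stratum_sizes, stratum_sds)]
    tot = sum(w)
    return [round(n_total * x / tot) for x in w]

# 100 units across three strata: the variable stratum dominates the sample
print(neyman_allocation(100, [500, 300, 200], [10.0, 20.0, 5.0]))  # → [42, 50, 8]
```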

  1. Hybrid computer technique yields random signal probability distributions

    Science.gov (United States)

    Cameron, W. D.

    1965-01-01

    Hybrid computer determines the probability distributions of instantaneous and peak amplitudes of random signals. This combined digital and analog computer system reduces the errors and delays of manual data analysis.

  2. Sample size calculations for 3-level cluster randomized trials

    NARCIS (Netherlands)

    Teerenstra, S.; Moerbeek, M.; Achterberg, T. van; Pelzer, B.J.; Borm, G.F.

    2008-01-01

    BACKGROUND: The first applications of cluster randomized trials with three instead of two levels are beginning to appear in health research, for instance, in trials where different strategies to implement best-practice guidelines are compared. In such trials, the strategy is implemented in health
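    These are not the authors' exact formulas, but the general shape of such calculations can be sketched: inflate the unclustered two-arm sample size by a three-level design effect, where icc2 and icc3 denote the proportions of total variance at the middle and cluster levels:

```python
import math

def three_level_sample_size(delta, sd, icc2, icc3, m, k):
    """Sketch: subjects per arm for a cluster (level-3) randomized trial
    with m subjects per level-2 unit and k level-2 units per cluster,
    for two-sided alpha = 0.05 and power = 0.80."""
    z = 1.959964 + 0.841621                    # z_{0.975} + z_{0.80}
    n_flat = 2 * z ** 2 * (sd / delta) ** 2    # unclustered two-sample size
    design_effect = 1 + (m - 1) * icc2 + (m * k - 1) * icc3
    return math.ceil(n_flat * design_effect)

print(three_level_sample_size(delta=0.5, sd=1.0, icc2=0.05, icc3=0.01, m=5, k=4))
# → 88 subjects per arm (rounded up to whole clusters in practice)
```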

  4. Improved estimator of finite population mean using auxiliary attribute in stratified random sampling

    OpenAIRE

    Verma, Hemant K.; Sharma, Prayas; Singh, Rajesh

    2014-01-01

    The present study discusses the problem of estimating the finite population mean using an auxiliary attribute in stratified random sampling. In this paper, taking advantage of the point-biserial correlation between the study variable and the auxiliary attribute, we improve the estimation of the population mean in stratified random sampling. The expressions for bias and mean square error have been derived under stratified random sampling. In addition, an empirical study has been carried out to examin...
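    The baseline that such improved estimators refine is the ordinary stratified sample mean, y_st = sum_h W_h * ybar_h; a minimal sketch with illustrative numbers:

```python
import numpy as np

def stratified_mean(stratum_sizes, samples):
    """Standard stratified estimator of the population mean (sketch):
    weight each stratum's sample mean by its population share W_h."""
    N = sum(stratum_sizes)
    return sum(Nh / N * np.mean(s) for Nh, s in zip(stratum_sizes, samples))

# two strata of 60 and 40 units, with small samples from each
print(stratified_mean([60, 40], [[2, 4], [10, 14]]))  # → 6.6
```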

  5. Application of work sampling technique to analyze logging operations.

    Science.gov (United States)

    Edwin S. Miyata; Helmuth M. Steinhilb; Sharon A. Winsauer

    1981-01-01

    Discusses the advantages and disadvantages of various time study methods for determining efficiency and productivity in logging. The work sampling method is compared with the continuous time-study method. Gives the feasibility, capability, and limitation of the work sampling method.

  6. Effectiveness of three different oral hygiene techniques on Viridans streptococci: A randomized controlled trial

    Directory of Open Access Journals (Sweden)

    N Naveen

    2016-01-01

    Introduction: Tongue cleaning is an important aspect of oral hygiene maintenance along with other mechanical and chemical aids. These methods have an influence on the microorganism count in saliva. Aim: To assess the effectiveness of three different oral hygiene techniques on Viridans streptococci. Materials and Methods: This was a randomized controlled trial with 45 study subjects aged between 14 and 16 years, randomly allocated into three groups: Group A - plastic tongue scraper, Group B - chlorhexidine mouthwash along with plastic tongue scraper, and Group C - chlorhexidine mouthwash. Unstimulated salivary samples were collected on the 1st, 7th, and 15th day before routine oral hygiene practices. Saliva samples were incubated for 48 h on Mitis Salivarius (MS) agar. Streptococcus mitis, Streptococcus mutans, and Streptococcus salivarius were counted. Data were analyzed using descriptive and inferential statistics. Results: The mean counts of S. mitis, S. mutans, and S. salivarius for Groups A, B, and C differed significantly (P < 0.001) when compared between the 1st, 7th, and 15th day. Between-group comparisons revealed a significant difference between Groups A and C, and B and C (P < 0.001). Conclusion: There was a significant reduction in bacterial count in all the participants, indicating that all three methods are useful in improving oral hygiene. The combination technique was found to be most effective.

  7. Surgical Technique in Distal Pancreatectomy: A Systematic Review of Randomized Trials

    Directory of Open Access Journals (Sweden)

    Filip Čečka

    2014-01-01

    Despite recent improvements in surgical technique, the morbidity of distal pancreatectomy remains high, with pancreatic fistula being the most significant postoperative complication. A systematic review of randomized controlled trials (RCTs) dealing with surgical techniques in distal pancreatectomy was carried out to summarize up-to-date knowledge on this topic. The Cochrane Central Registry of Controlled Trials, Embase, Web of Science, and Pubmed were searched for relevant articles published from 1990 to December 2013. Ten RCTs were identified and included in the systematic review, with a total of 1286 patients being randomized (samples ranging from 41 to 450). The reviewers were in agreement for application of the eligibility criteria for study selection. It was not possible to carry out meta-analysis of these studies because of the heterogeneity of surgical techniques and approaches, such as varying methods of pancreas transection, reinforcement of the stump with seromuscular patch or pancreaticoenteric anastomosis, sealing with fibrin sealants and pancreatic stent placement. Management of the pancreatic remnant after distal pancreatectomy is still a matter of debate. The results of this systematic review are possibly biased by methodological problems in some of the included studies. New well designed and carefully conducted RCTs must be performed to establish the optimal strategy for pancreatic remnant management after distal pancreatectomy.

  8. Surgical Technique in Distal Pancreatectomy: A Systematic Review of Randomized Trials

    Science.gov (United States)

    Čečka, Filip; Jon, Bohumil; Šubrt, Zdeněk; Ferko, Alexander

    2014-01-01

    Despite recent improvements in surgical technique, the morbidity of distal pancreatectomy remains high, with pancreatic fistula being the most significant postoperative complication. A systematic review of randomized controlled trials (RCTs) dealing with surgical techniques in distal pancreatectomy was carried out to summarize up-to-date knowledge on this topic. The Cochrane Central Registry of Controlled Trials, Embase, Web of Science, and Pubmed were searched for relevant articles published from 1990 to December 2013. Ten RCTs were identified and included in the systematic review, with a total of 1286 patients being randomized (samples ranging from 41 to 450). The reviewers were in agreement for application of the eligibility criteria for study selection. It was not possible to carry out meta-analysis of these studies because of the heterogeneity of surgical techniques and approaches, such as varying methods of pancreas transection, reinforcement of the stump with seromuscular patch or pancreaticoenteric anastomosis, sealing with fibrin sealants and pancreatic stent placement. Management of the pancreatic remnant after distal pancreatectomy is still a matter of debate. The results of this systematic review are possibly biased by methodological problems in some of the included studies. New well designed and carefully conducted RCTs must be performed to establish the optimal strategy for pancreatic remnant management after distal pancreatectomy. PMID:24971333

  9. Comparison of sampling techniques for Rift Valley Fever virus ...

    African Journals Online (AJOL)

    A total of 1814 mosquitoes were collected, of which 738 were collected by CDC light traps and 1076 by Mosquito Magnet trapping technique. Of the collected mosquitoes, 12.46% (N= 226) were Aedes aegypti and 87.54% (N= 1588) were Culex pipiens complex. More mosquitoes were collected outdoors using Mosquito ...

  10. Assessment of three major decomposition techniques for sample ...

    African Journals Online (AJOL)

    Three main rock-decomposition techniques; microwave oven, open beaker acid and basic fusion were examined in an attempt to establish the most appropriate method for the decomposition of granite rocks for elemental analysis. Standard reference rock material NIM-SARM-I was dissolved using each of the digestion ...

  12. Enhanced sampling techniques in molecular dynamics simulations of biological systems.

    Science.gov (United States)

    Bernardi, Rafael C; Melo, Marcelo C R; Schulten, Klaus

    2015-05-01

    Molecular dynamics has emerged as an important research methodology covering systems up to the level of millions of atoms. However, insufficient sampling often limits its application. The limitation is due to rough energy landscapes, with many local minima separated by high-energy barriers, which govern biomolecular motion. In the past few decades methods have been developed that address the sampling problem, such as replica-exchange molecular dynamics, metadynamics and simulated annealing. Here we present an overview of these sampling methods in an attempt to shed light on which should be selected depending on the type of system property studied. Enhanced sampling methods have been employed for a broad range of biological systems, and the choice of a suitable method is connected to the biological and physical characteristics of the system, in particular system size. While metadynamics and replica-exchange molecular dynamics are the most widely adopted sampling methods to study biomolecular dynamics, simulated annealing is well suited to characterizing very flexible systems. The use of annealing methods was long restricted to the simulation of small proteins; however, a variant of the method, generalized simulated annealing, can be employed at relatively low computational cost for large macromolecular complexes. Molecular dynamics trajectories frequently do not reach all relevant conformational substates, for example those connected with biological function, a problem that can be addressed by employing enhanced sampling algorithms. This article is part of a Special Issue entitled Recent developments of molecular dynamics. Copyright © 2014 Elsevier B.V. All rights reserved.
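    Of the methods surveyed, simulated annealing is the simplest to sketch; the rugged 1-D toy energy landscape below is our own illustration, not a biomolecular force field:

```python
import math
import random

def simulated_annealing(energy, x0, step, t0=1.0, t_min=1e-3, cool=0.95):
    """Simulated annealing (sketch): accept uphill moves with Boltzmann
    probability exp(-dE/T) while the temperature T is slowly lowered,
    letting the walker escape local minima early and settle late."""
    rng = random.Random(0)
    x, e, t = x0, energy(x0), t0
    best_x, best_e = x, e
    while t > t_min:
        for _ in range(50):                      # moves per temperature
            xn = x + rng.uniform(-step, step)
            en = energy(xn)
            if en < e or rng.random() < math.exp(-(en - e) / t):
                x, e = xn, en
                if e < best_e:
                    best_x, best_e = x, e
        t *= cool                                # geometric cooling
    return best_x, best_e

# rugged toy landscape: quadratic well with sinusoidal ripples
E = lambda x: (x - 2) ** 2 + 2 * math.sin(5 * x)
x, e = simulated_annealing(E, x0=-5.0, step=0.5)
# the walker ends in a deep basin near x ≈ 2 despite many local minima
```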

  13. Importance of accurate sampling techniques in microbiological diagnosis of endophthalmitis

    Directory of Open Access Journals (Sweden)

    Banu A

    2011-05-01

    Background: Endophthalmitis is an ocular emergency, and bacteria are the commonest aetiological agents of infectious endophthalmitis. Any delay in treatment will result in serious complications like complete loss of vision. Therefore, obtaining the most appropriate sample is of paramount importance for a microbiologist to identify the aetiological agents that help the ophthalmologist in planning treatment. Objective: This study was undertaken to determine the intraocular specimen that is most likely to yield a positive culture on microbiological examination. Methods: From 60 cases, intraocular samples were collected in the operation theatre under anaesthesia. The samples obtained were aqueous humour and vitreous humour, the latter by vitreous tap, vitreous biopsy or pars plana vitrectomy. The specimens were processed within half an hour, first by inoculation onto culture media and then by direct smear examination with Gram's stain. Results: Eighty samples were obtained from the 60 cases, most of them vitreous fluid (vitreous biopsy/tap plus vitrectomy fluid, i.e., 75%). Culture was positive in 88% of vitrectomy fluid samples, compared with 74% of vitreous tap/biopsy samples and 20% of aqueous fluid samples. Conclusions: Vitrectomy fluid appears to be the best sample for culture from clinically diagnosed endophthalmitis cases.

  14. Study of gastric cancer samples using terahertz techniques

    Science.gov (United States)

    Wahaia, Faustino; Kasalynas, Irmantas; Seliuta, Dalius; Molis, Gediminas; Urbanowicz, Andrzej; Carvalho Silva, Catia D.; Carneiro, Fatima; Valusis, Gintaras; Granja, Pedro L.

    2014-08-01

    In the present work, samples of healthy and adenocarcinoma-affected human gastric tissue were analyzed using transmission time-domain THz spectroscopy (THz-TDS) and spectroscopic THz imaging at 201 and 590 GHz. The work shows that it is possible to distinguish between normal and cancerous regions in dried and paraffin-embedded samples. Plots of the absorption coefficient α and refractive index n of normal and cancer-affected tissues, as well as 2-D transmission THz images, are presented, and the conditions for discrimination between normal and affected tissues are discussed.
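    For a thick sample measured in transmission, the standard single-pass TDS extraction of n and α (Fresnel-corrected, ignoring echoes) can be sketched and verified on synthetic data; the material numbers below are illustrative, not the paper's measurements:

```python
import math

C = 299792458.0  # speed of light, m/s

def nk_from_tds(freq_hz, dphi, amp_ratio, d):
    """Sketch of standard thick-sample THz-TDS extraction: refractive
    index from the sample-vs-reference phase delay, absorption
    coefficient from the Fresnel-corrected amplitude ratio."""
    w = 2 * math.pi * freq_hz
    n = 1 + C * dphi / (w * d)
    alpha = (2 / d) * math.log(4 * n / ((n + 1) ** 2 * amp_ratio))
    return n, alpha

# forward-simulate a sample (n = 2.4, alpha = 800 1/m, d = 0.5 mm) at
# 590 GHz, then recover the optical constants from the synthetic data
n0, a0, d, f = 2.4, 800.0, 0.5e-3, 590e9
w = 2 * math.pi * f
dphi = (n0 - 1) * w * d / C
amp = 4 * n0 / ((n0 + 1) ** 2) * math.exp(-a0 * d / 2)
print(nk_from_tds(f, dphi, amp, d))  # ≈ (2.4, 800.0)
```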

  15. Computer Corner: A Note on Pascal's Triangle and Simple Random Sampling.

    Science.gov (United States)

    Wright, Tommy

    1989-01-01

    Describes the algorithm used to select a simple random sample of certain size without having to list all possible samples and a justification based on Pascal's triangle. Provides testing results by various computers. (YP)
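    A one-pass selection rule with the property the note describes (every size-n subset equally likely, with no enumeration of the C(N, n) possible samples) is the classic sequential method; whether this matches Wright's exact algorithm is our assumption:

```python
import random

def sequential_srs(population, n, seed=None):
    """One-pass simple random sampling (sketch): item i is kept with
    probability (still needed) / (still remaining), which makes every
    subset of size n equally likely without listing all subsets."""
    rng = random.Random(seed)
    N, need = len(population), n
    sample = []
    for i, item in enumerate(population):
        if rng.random() < need / (N - i):
            sample.append(item)
            need -= 1
            if need == 0:
                break
    return sample

print(sequential_srs(list(range(10)), 3, seed=1))  # a sorted 3-subset of 0..9
```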

  16. Exponential ratio-product type estimators under second order approximation in stratified random sampling

    OpenAIRE

    Singh, Rajesh; Sharma, Prayas; Smarandache, Florentin

    2014-01-01

    Singh et al. (2009) introduced a family of exponential ratio and product type estimators in stratified random sampling. Under a stratified random sampling without replacement scheme, the expressions for the bias and mean square error (MSE) of the Singh et al. (2009) estimators and some other estimators, up to the first- and second-order approximations, are derived. The theoretical findings are also supported by a numerical example.

  17. Study of an RF Direct Sampling Technique for Geodetic VLBI

    Science.gov (United States)

    Takefuji, K.; Kondo, T.; Sekido, M.; Ichikawa, R.; Kurihara, S.; Kokado, K.; Kawabata, R.

    2012-12-01

    Recently, digital samplers with high RF frequency sensitivity have been developed. We installed such samplers (sensitive up to 24 GHz) at the Kashima 11-m station and the Tsukuba 32-m station (about 50 km baseline) in Japan and directly sampled X-band without any frequency conversion such as analog mixers. After the correlation process, we successfully detected first fringes at X-band. For geodetic VLBI observation, we mixed the S-band and X-band signals just after the low noise amplifier. After 1024 MHz, 2-bit sampling, the mixed signal became overlapping, aliased baseband signals. We obtained four fringes (one from S-band and three from X-band) from the overlapped baseband signals and successfully determined the baseline length.
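The folding of an RF tone into the first Nyquist band after direct sampling can be sketched as follows (the 1024 MHz rate is from the abstract; the example tone frequencies are our own illustrative values, not the stations' band plan):

```python
def aliased_frequency(f_rf_mhz, fs_mhz=1024.0):
    """Baseband frequency (MHz) to which an RF tone folds after
    sampling at rate fs: the tone lands at its distance from the
    nearest integer multiple of fs, always within [0, fs/2]."""
    k = round(f_rf_mhz / fs_mhz)
    return abs(f_rf_mhz - k * fs_mhz)
```

For instance, an X-band tone at 8200 MHz folds to 8 MHz (8200 - 8·1024), while an S-band tone at 2240 MHz folds to 192 MHz, so tones from both bands coexist in one sampled baseband.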

  18. Query-Based Sampling: Can we do Better than Random?

    NARCIS (Netherlands)

    Tigelaar, A.S.; Hiemstra, Djoerd

    2010-01-01

    Many servers on the web offer content that is only accessible via a search interface. These are part of the deep web. Using conventional crawling to index the content of these remote servers is impossible without some form of cooperation. Query-based sampling provides an alternative to crawling

  19. Stratified random sampling plan for an irrigation customer telephone survey

    Energy Technology Data Exchange (ETDEWEB)

    Johnston, J.W.; Davis, L.J.

    1986-05-01

    This report describes the procedures used to design and select a sample for a telephone survey of individuals who use electricity in irrigating agricultural cropland in the Pacific Northwest. The survey is intended to gather information on the irrigated agricultural sector that will be useful for conservation assessment, load forecasting, rate design, and other regional power planning activities.

  20. Classification of epileptic EEG signals based on simple random sampling and sequential feature selection.

    Science.gov (United States)

    Ghayab, Hadi Ratham Al; Li, Yan; Abdulla, Shahab; Diykh, Mohammed; Wan, Xiangkui

    2016-06-01

    Electroencephalogram (EEG) signals are used broadly in the medical fields. The main applications of EEG signals are the diagnosis and treatment of diseases such as epilepsy, Alzheimer's disease and sleep problems. This paper presents a new method which extracts and selects features from multi-channel EEG signals. This research focuses on three main points. Firstly, the simple random sampling (SRS) technique is used to extract features from the time domain of EEG signals. Secondly, the sequential feature selection (SFS) algorithm is applied to select the key features and to reduce the dimensionality of the data. Finally, the selected features are forwarded to a least squares support vector machine (LS_SVM) classifier to classify the EEG signals. The experimental results show that the method achieves 99.90, 99.80 and 100% for classification accuracy, sensitivity and specificity, respectively.
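The SRS feature-extraction step can be sketched as drawing random time-domain points from the signal and summarizing each draw with simple statistics. The group counts, draw sizes, and the particular statistics below are our assumptions for illustration, not the paper's exact settings:

```python
import numpy as np

def srs_features(signal, n_groups=4, samples_per_group=32, seed=0):
    """Simple-random-sample the time domain of an EEG channel:
    draw `n_groups` random subsets of points (without replacement
    within a draw) and describe each with four basic statistics,
    yielding a fixed-length feature vector."""
    rng = np.random.default_rng(seed)
    feats = []
    for _ in range(n_groups):
        draw = rng.choice(signal, size=samples_per_group, replace=False)
        feats.extend([draw.mean(), draw.std(), draw.min(), draw.max()])
    return np.array(feats)
```

The resulting low-dimensional vector is what a selector such as SFS would then prune before classification.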

  1. Randomization techniques for assessing the significance of gene periodicity results

    Directory of Open Access Journals (Sweden)

    Vuokko Niko

    2011-08-01

    Full Text Available Abstract Background Modern high-throughput measurement technologies such as DNA microarrays and next-generation sequencers produce extensive datasets. With large datasets, the emphasis has been moving from traditional statistical tests to new data mining methods that are capable of detecting complex patterns, such as clusters, regulatory networks, or time series periodicity. The study of periodic gene expression is an interesting research question that is also a good example of the challenges involved in the analysis of high-throughput data in general. Unlike for classical statistical tests, the distribution of the test statistic for data mining methods cannot be derived analytically. Results We describe the randomization-based approach to significance testing and show how it can be applied to detect periodically expressed genes. We present four randomization methods, three of which have previously been used for gene cycle data. We propose a new method for testing the significance of periodicity in short gene expression time series, such as from gene cycle and circadian clock studies. We argue that the underlying assumptions behind existing significance testing approaches are problematic and some of them unrealistic. We analyze the theoretical properties of the existing and proposed methods, showing how our method can be robustly used to detect genes with exceptionally high periodicity. We also demonstrate the large differences in the number of significant results depending on the chosen randomization methods and parameters of the testing framework. By reanalyzing gene cycle data from various sources, we show how previous estimates of the number of gene-cycle-controlled genes are not supported by the data. Our randomization approach combined with the widely adopted Benjamini-Hochberg multiple testing method yields better predictive power and produces more accurate null distributions than previous methods. Conclusions Existing methods for testing significance
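The randomization approach can be sketched as a permutation test: shuffle the time points, recompute a periodicity statistic on each shuffle, and compare the null distribution against the observed value. The spectral-power statistic below is a generic stand-in for the paper's test statistics:

```python
import numpy as np

def periodicity_stat(x, period):
    """Spectral power at the tested period (mean-removed DFT)."""
    n = len(x)
    freqs = np.fft.rfftfreq(n, d=1.0)
    power = np.abs(np.fft.rfft(x - x.mean())) ** 2
    target = int(np.argmin(np.abs(freqs - 1.0 / period)))
    return power[target]

def randomization_p_value(x, period, n_perm=199, seed=0):
    """Empirical p-value: permuting the time points destroys any
    temporal structure, so shuffled series form the null distribution."""
    rng = np.random.default_rng(seed)
    observed = periodicity_stat(x, period)
    hits = sum(
        periodicity_stat(rng.permutation(x), period) >= observed
        for _ in range(n_perm)
    )
    return (hits + 1) / (n_perm + 1)   # add-one correction keeps p > 0
```

A strongly periodic series yields a small empirical p-value, which can then be fed into a multiple-testing correction such as Benjamini-Hochberg across genes.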

  2. Sampling hydrometeors in clouds in-situ - the replicator technique

    Science.gov (United States)

    Wex, Heike; Löffler, Mareike; Griesche, Hannes; Bühl, Johannes; Stratmann, Frank; Schmitt, Carl; Dirksen, Ruud; Reichardt, Jens; Wolf, Veronika; Kuhn, Thomas; Prager, Lutz; Seifert, Patric

    2017-04-01

    For the examination of ice crystals in clouds, concerning their number concentrations, sizes and shapes, instruments mounted on fast-flying aircraft are often used. One related disadvantage is possible shattering of the ice crystals on inlets, which has been improved with the introduction of the "Korolev-tip" and by accounting for inter-arrival times (Korolev et al., 2013, 2015); additionally, the typically fast-flying aircraft allow only a low spatial resolution. Alternative sampling methods have been introduced, such as a replicator by Miloshevich & Heymsfield (1997) and an in-situ imager by Kuhn & Heymsfield (2016). They both sample ice crystals onto an advancing stripe while ascending on a balloon, conserving the ice crystals either in formvar for later off-line analysis under a microscope (Miloshevich & Heymsfield, 1997) or imaging them upon their impaction on silicone oil (Kuhn & Heymsfield, 2016), both yielding vertical profiles of different ice crystal properties. A measurement campaign was performed at the Lindenberg Meteorological Observatory of the German Meteorological Service (DWD) in Germany in October 2016, during which both types of instruments were used during balloon ascents, while ground-based lidar and cloud-radar measurements were performed simultaneously. The two ice particle sondes were operated by teams from the Lulea University of Technology and from TROPOS, where the latter one was made operational only recently. Here, we show first results of the TROPOS replicator on ice crystals sampled during one ascent, for which the collected ice crystals were analyzed off-line using a microscope. Literature: Korolev, A., E. Emery, and K. Creelman (2013), Modification and tests of particle probe tips to mitigate effects of ice shattering, J. Atmos. Ocean. Tech., 30, 690-708, 2013. Korolev, A., and P. R. Field (2015), Assessment of the performance of the inter-arrival time algorithm to identify ice shattering artifacts in cloud

  3. Fast egg collection method greatly improves randomness of egg sampling in Drosophila melanogaster

    DEFF Research Database (Denmark)

    Schou, Mads Fristrup

    2013-01-01

    When obtaining samples for population genetic studies, it is essential that the sampling is random. For Drosophila, one of the crucial steps in sampling experimental flies is the collection of eggs. Here an egg collection method is presented, which randomizes the eggs in a water column and dimini... and to obtain a representative collection of genotypes, the method presented here is strongly recommended when collecting eggs from Drosophila....

  4. Moving Target Techniques: Cyber Resilience through Randomization, Diversity, and Dynamism

    Science.gov (United States)

    2017-03-03

    cyber resilience that attempts to rebalance the cyber landscape is known as cyber moving target (MT) (or just moving target) techniques. Moving target... articulated threat model, it may be unclear to network defenders what threat needs to be mitigated and thus how best to deploy the defensive... systems and hardware architectures. For example, the application can run on top of a platform consisting of the Fedora operating system and x86

  5. Parameter Estimation in Stratified Cluster Sampling under Randomized Response Models for Sensitive Question Survey.

    Science.gov (United States)

    Pu, Xiangke; Gao, Ge; Fan, Yubo; Wang, Mian

    2016-01-01

    Randomized response is a research method to get accurate answers to sensitive questions in structured sample surveys. Simple random sampling is widely used in surveys of sensitive questions but is hard to apply to large targeted populations. On the other hand, more sophisticated sampling regimes and corresponding formulas are seldom employed in sensitive question surveys. In this work, we developed a series of formulas for parameter estimation in cluster sampling and stratified cluster sampling under two kinds of randomized response models by using classic sampling theories and total probability formulas. The performance of the sampling methods and formulas in the survey of premarital sex and cheating on exams at Soochow University is also provided. The reliability of the survey methods and formulas for sensitive question surveys was found to be high.
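The building block behind such designs is the classical Warner randomized response model, in which each respondent answers the sensitive statement with probability p and its negation otherwise, so no individual answer reveals the trait. A sketch under plain simple random sampling (the paper's cluster and stratified-cluster formulas extend this building block):

```python
def warner_estimate(n_yes, n, p):
    """Warner (1965) randomized-response estimator of the population
    proportion pi of a sensitive trait. P(yes) = p*pi + (1-p)*(1-pi),
    so pi is recovered by inverting that line; p must differ from 0.5
    or the design carries no information."""
    if p == 0.5:
        raise ValueError("p must differ from 0.5")
    lam = n_yes / n                                   # observed 'yes' share
    pi_hat = (lam - (1 - p)) / (2 * p - 1)
    var = lam * (1 - lam) / (n * (2 * p - 1) ** 2)    # estimated variance
    return pi_hat, var
```

With p = 0.7 and a true proportion of 0.2, the expected 'yes' share is 0.7·0.2 + 0.3·0.8 = 0.38, and the estimator maps 0.38 back to exactly 0.2.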

  6. Parameter Estimation in Stratified Cluster Sampling under Randomized Response Models for Sensitive Question Survey.

    Directory of Open Access Journals (Sweden)

    Xiangke Pu

    Full Text Available Randomized response is a research method to get accurate answers to sensitive questions in structured sample surveys. Simple random sampling is widely used in surveys of sensitive questions but is hard to apply to large targeted populations. On the other hand, more sophisticated sampling regimes and corresponding formulas are seldom employed in sensitive question surveys. In this work, we developed a series of formulas for parameter estimation in cluster sampling and stratified cluster sampling under two kinds of randomized response models by using classic sampling theories and total probability formulas. The performance of the sampling methods and formulas in the survey of premarital sex and cheating on exams at Soochow University is also provided. The reliability of the survey methods and formulas for sensitive question surveys was found to be high.

  7. A Unified Approach to Power Calculation and Sample Size Determination for Random Regression Models

    Science.gov (United States)

    Shieh, Gwowen

    2007-01-01

    The underlying statistical models for multiple regression analysis are typically attributed to two types of modeling: fixed and random. The procedures for calculating power and sample size under the fixed regression models are well known. However, the literature on random regression models is limited and has been confined to the case of all…

  8. Sample selection and preservation techniques for the Mars sample return mission

    Science.gov (United States)

    Tsay, Fun-Dow

    1988-01-01

    It is proposed that a miniaturized electron spin resonance (ESR) spectrometer be developed as an effective, nondestructive sample selection and characterization instrument for the Mars Rover Sample Return mission. The ESR instrument can meet rover science payload requirements and yet has the capability and versatility to perform the following in situ Martian sample analyses: (1) detection of active oxygen species, and characterization of Martian surface chemistry and photocatalytic oxidation processes; (2) determination of paramagnetic Fe(3+) in clay silicate minerals, Mn(2+) in carbonates, and ferromagnetic centers of magnetite, maghemite and hematite; (3) search for organic compounds in the form of free radicals in subsoil, and detection of Martian fossil organic matter likely to be associated with carbonate and other sedimentary deposits. The proposed instrument is further detailed.

  9. The gated integration technique for the accurate measurement of the autocorrelation function of speckle intensities scattered from random phase screens

    Science.gov (United States)

    Zhang, Ningyu; Cheng, Chuanfu; Teng, Shuyun; Chen, Xiaoyi; Xu, Zhizhan

    2007-09-01

    A new approach based on the gated integration technique is proposed for the accurate measurement of the autocorrelation function of speckle intensities scattered from a random phase screen. The Boxcar used for this technique in the acquisition of the speckle intensity data integrates the photoelectric signal while its sampling gate is open, and it repeats the sampling a preset number of times, m. The averaged analog output of the m samplings from the Boxcar enhances the signal-to-noise ratio by √m, because the repeated sampling and averaging make the useful speckle signals stable, while the randomly varying photoelectric noise is suppressed by 1/√m. In the experiment, we use an analog-to-digital converter module to synchronize all the actions, such as the stepped movement of the phase screen, the repeated sampling, and the readout of the averaged output of the Boxcar. The experimental results show that speckle signals are better recovered from contaminated signals, and the autocorrelation function with the secondary maximum is obtained, indicating that the accuracy of the measurement of the autocorrelation function is greatly improved by the gated integration technique.
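The √m noise suppression from repeated gated sampling can be checked numerically with a self-contained simulation (Gaussian noise; trial counts and noise level are illustrative):

```python
import numpy as np

def averaged_snr_gain(m, n_trials=2000, noise_sigma=1.0, seed=0):
    """Empirical SNR gain of averaging m repeated gated samples:
    the mean of m independent noise draws has standard deviation
    noise_sigma / sqrt(m), so the gain should approach sqrt(m)."""
    rng = np.random.default_rng(seed)
    noise = rng.normal(0.0, noise_sigma, size=(n_trials, m))
    averaged = noise.mean(axis=1)          # one averaged reading per trial
    return noise_sigma / averaged.std()
```

With m = 16 the measured gain clusters around 4, the √m prediction quoted in the abstract.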

  10. A Family of Estimators of a Sensitive Variable Using Auxiliary Information in Stratified Random Sampling

    National Research Council Canada - National Science Library

    Nadia Mushtaq; Noor Ul Amin; Muhammad Hanif

    2017-01-01

    In this article, a combined general family of estimators is proposed for estimating finite population mean of a sensitive variable in stratified random sampling with non-sensitive auxiliary variable...

  11. A Systematic Review of Efficacy of the Attention Training Technique in Clinical and Nonclinical Samples.

    Science.gov (United States)

    Knowles, Mark M; Foden, Philip; El-Deredy, Wael; Wells, Adrian

    2016-10-01

    The Attention Training Technique (ATT; Wells, 1990) is a brief metacognitive treatment strategy aimed at remediating self-focused processing and increasing attention flexibility in psychological disorder. We systematically reviewed and examined the efficacy of ATT in clinical and nonclinical samples. Scientific databases were searched from 1990 to 2014 and 10 studies (total N = 295) met inclusion criteria. Single-case data were meta-analyzed using the improvement rate difference, and standardized between- and within-group effect sizes (ESs) were examined across 4 analogue randomized controlled trials (RCTs). Single-case outcomes indicated that ATT yields large ES estimates (pooled ES range: 0.74-1.00) for anxiety and depressive disorders. Standardized ESs across the RCTs indicated that ATT yields greater treatment gains than reference groups across the majority of outcomes (adjusted Cohen's d range: 0.40-1.23). These preliminary results suggest ATT may be effective in treating anxiety and depressive disorders and may help remediate some symptoms of schizophrenia. Although the limited number of studies with small sample sizes warrants cautious interpretation, ATT appears promising and future studies will benefit from adequately powered RCTs. © 2016 Wiley Periodicals, Inc.

  12. A New Estimator For Population Mean Using Two Auxiliary Variables in Stratified random Sampling

    OpenAIRE

    Singh, Rajesh; Malik, Sachin

    2014-01-01

    In this paper, we suggest an estimator using two auxiliary variables in stratified random sampling. The proposed estimator improves on the mean per unit estimator as well as some other considered estimators. Expressions for the bias and MSE of the estimator are derived up to the first degree of approximation. Moreover, these theoretical findings are supported by a numerical example with original data. Key words: Study variable, auxiliary variable, stratified random sampling, bias and mean squa...

  13. Techniques for wound closure at caesarean section: a randomized clinical trial

    NARCIS (Netherlands)

    de Graaf, I. M.; Oude Rengerink, K.; Wiersma, I. C.; Donker, M. E.; Mol, B. W.; Pajkrt, E.

    2012-01-01

    Objective: It is unclear which technique for skin closure should be used at caesarean section (CS) in order to get the best cosmetic result. Study design: We conducted a randomized controlled trial to assess the cosmetic result of different techniques for skin closure after CS. A two-center

  14. Large Signal Excitation Measurement Techniques for Random Telegraph Signal Noise in MOSFETs

    NARCIS (Netherlands)

    Hoekstra, E.

    2005-01-01

    This paper introduces large signal excitation measurement techniques to analyze random telegraph signal (RTS) noise originating from oxide-traps in MOSFETs. The paper concentrates on the trap-occupancy, which relates directly to the generated noise. The proposed measurement technique makes

  15. Large Signal Excitation Measurement Techniques for Random Telegraph Signal Noise in MOSFETs

    NARCIS (Netherlands)

    Hoekstra, E.; Kolhatkar, J.S.; van der Wel, A.P.; Salm, Cora; Klumperink, Eric A.M.

    2005-01-01

    This paper introduces large signal excitation measurement techniques to analyze Random Telegraph Signal (RTS) noise originating from oxide-traps in MOSFETs. The paper concentrates on the trap-occupancy, which relates directly to the generated noise. The proposed measurement technique makes

  16. Conflict-cost based random sampling design for parallel MRI with low rank constraints

    Science.gov (United States)

    Kim, Wan; Zhou, Yihang; Lyu, Jingyuan; Ying, Leslie

    2015-05-01

    In compressed sensing MRI, it is very important to design the sampling pattern for random sampling. For example, SAKE (simultaneous auto-calibrating and k-space estimation) is a parallel MRI reconstruction method using random undersampling. It formulates image reconstruction as a structured low-rank matrix completion problem. Variable density (VD) Poisson discs are typically adopted for 2D random sampling. The basic concept of Poisson disc generation is to guarantee that samples are neither too close to nor too far away from each other. However, it is difficult to meet such a condition, especially in the high density region, so the sampling becomes inefficient. In this paper, we present an improved random sampling pattern for SAKE reconstruction. The pattern is generated based on a conflict cost with a probability model. The conflict cost measures how many dense samples already assigned are around a target location, while the probability model adopts the generalized Gaussian distribution, which includes uniform and Gaussian-like distributions as special cases. Our method preferentially assigns a sample to a k-space location with the least conflict cost on the circle of the highest probability. To evaluate the effectiveness of the proposed random pattern, we compare the performance of SAKE reconstructions using both VD Poisson discs and the proposed pattern. Experimental results for brain data show that the proposed pattern yields lower normalized mean square error (NMSE) than VD Poisson discs.
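A much-simplified version of conflict-cost sampling can be sketched as follows: each new k-space sample is placed at whichever candidate location has the fewest already-chosen samples within a radius (its conflict cost). Uniform candidate draws stand in for the paper's generalized-Gaussian probability model, and every parameter here is our assumption:

```python
import numpy as np

def conflict_cost_mask(n_samples=200, grid=64, radius=2.0, seed=0):
    """Build a 2D sampling mask by greedy conflict-cost selection:
    draw a handful of random candidate cells, score each by the number
    of existing samples closer than `radius`, and keep the cheapest."""
    rng = np.random.default_rng(seed)
    chosen = np.zeros((grid, grid), dtype=bool)
    pts = []
    for _ in range(n_samples):
        cands = rng.integers(0, grid, size=(8, 2))
        costs = []
        for y, x in cands:
            if pts:
                arr = np.array(pts)
                d = np.hypot(arr[:, 0] - y, arr[:, 1] - x)
                costs.append(int(np.sum(d < radius)))   # conflict cost
            else:
                costs.append(0)
        y, x = cands[int(np.argmin(costs))]
        if not chosen[y, x]:
            chosen[y, x] = True
            pts.append((y, x))
    return chosen
```

Because a location already holding a sample always has a positive conflict cost, the greedy choice spreads samples apart, mimicking the "not too close" half of the Poisson-disc criterion at low cost.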

  17. Comparison of kriging interpolation precision between grid sampling scheme and simple random sampling scheme for precision agriculture

    Directory of Open Access Journals (Sweden)

    Jiang Houlong

    2016-01-01

    Full Text Available Sampling methods are important factors that can potentially limit the accuracy of predictions of spatial distribution patterns. A 10 ha tobacco-planted field was selected to compare the accuracy in predicting the spatial distribution of soil properties by ordinary kriging and cross validation methods between a grid sampling scheme and a simple random sampling scheme (SRS). To achieve this objective, we collected soil samples from the topsoil (0-20 cm) in March 2012. Sample numbers for grid sampling and SRS were both 115 points. Accuracies of spatial interpolation using the two sampling schemes were then evaluated based on validation samples (36 points) and deviations of the estimates. The results suggested that soil pH and nitrate-N (NO3-N) had low variation, whereas all other soil properties exhibited medium variation. Soil pH, organic matter (OM), total nitrogen (TN), cation exchange capacity (CEC), total phosphorus (TP) and available phosphorus (AP) matched the spherical model, whereas the remaining variables fit an exponential model with both sampling methods. The interpolation error of soil pH, TP, and AP was the lowest for SRS. The errors of interpolation for OM, CEC, TN, available potassium (AK) and total potassium (TK) were the lowest for grid sampling. The interpolation precision of soil NO3-N showed no significant differences between the two sampling schemes. Considering our data on interpolation precision and the importance of minerals for cultivation of flue-cured tobacco, the grid sampling scheme should be used in tobacco-planted fields to determine the spatial distribution of soil properties. The grid sampling method can be applied in a practical and cost-effective manner to facilitate soil sampling in tobacco-planted fields.
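The spherical semivariogram model that most of the soil properties matched has the standard closed form γ(h) = c0 + c(1.5·h/a − 0.5·(h/a)³) for lag h up to the range a, and γ(h) = c0 + c beyond it. A sketch (the nugget, sill, and range values used in the example are illustrative, not the paper's fitted parameters):

```python
def spherical_variogram(h, nugget, partial_sill, a):
    """Spherical semivariogram model used in ordinary kriging:
    rises from the nugget at the origin and flattens at the total
    sill (nugget + partial_sill) once the lag h reaches the range a."""
    if h >= a:
        return nugget + partial_sill
    r = h / a
    return nugget + partial_sill * (1.5 * r - 0.5 * r ** 3)
```

Kriging weights are then obtained by solving the kriging system built from these γ(h) values between sample locations.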

  18. Mental techniques during manual stretching in spasticity--a pilot randomized controlled trial.

    Science.gov (United States)

    Bovend'Eerdt, Thamar J H; Dawes, Helen; Sackley, Cath; Izadi, Hooshang; Wade, Derick T

    2009-02-01

    To evaluate the feasibility and effects of using motor imagery during therapeutic stretching in individuals with spasticity. Randomized single-blind controlled pilot trial. Chronic day care unit, neurological rehabilitation unit and in the community. Eleven individuals with spasticity in the arm requiring stretching as part of their normal routine. In addition to their normal stretching routine, subjects in the experimental group received motor imagery during their stretches (n = 6). The control group received progressive muscle relaxation during their stretches (n = 5). The dose varied between 8 and 56 sessions over eight weeks. Resistance to passive movement, measured with a torque transducer, passive range of movement, measured with an electro-goniometer, Modified Ashworth Scale (MAS) and level of discomfort during the MAS were assessed at baseline and after eight weeks by an independent assessor. These measures were recorded before and after a stretch intervention on both assessments. Participants, therapists and carers tolerated the techniques well. Compliance was variable and adherence was good. Mixed ANOVA showed no difference over time and no difference between the motor imagery and progressive muscle relaxation group on the primary and secondary outcome measures (P>0.05). It is feasible to use motor imagery during therapeutic stretching. Statistical power was low due to the large variability in the population and the small sample size. Post-hoc sample size calculation suggests that future studies of this subject should include at least 54 participants per group. Further research is warranted.

  19. Calculating sample sizes for cluster randomized trials: we can keep it simple and efficient !

    NARCIS (Netherlands)

    van Breukelen, Gerard J.P.; Candel, Math J.J.M.

    2012-01-01

    Objective: Simple guidelines for efficient sample sizes in cluster randomized trials with unknown intraclass correlation and varying cluster sizes. Methods: A simple equation is given for the optimal number of clusters and sample size per cluster. Here, optimal means maximizing power for a given
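A standard ingredient in such guidelines is the design effect 1 + (m − 1)ρ, which inflates the sample size required by clustering for clusters of size m and intraclass correlation ρ. A sketch of the resulting cluster count per arm (our illustration of the general principle, not the paper's full optimal-design equation):

```python
import math

def clusters_needed(n_effective, cluster_size, icc):
    """Number of clusters per trial arm: take the sample size an
    individually randomized trial would need (n_effective), inflate it
    by the design effect 1 + (m - 1)*ICC, and divide by cluster size."""
    deff = 1 + (cluster_size - 1) * icc
    return math.ceil(n_effective * deff / cluster_size)
```

For example, 100 effective subjects with clusters of 20 and an ICC of 0.05 require 10 clusters per arm rather than the 5 a naive division would suggest.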

  20. [Randomized prospective study of three different techniques for ultrasound-guided axillary brachial plexus block].

    Science.gov (United States)

    Ferraro, Leonardo Henirque Cunha; Takeda, Alexandre; Sousa, Paulo César Castello Branco de; Mehlmann, Fernanda Moreira Gomes; Junior, Jorge Kiyoshi Mitsunaga; Falcão, Luiz Fernando Dos Reis

    2017-06-23

    Randomized prospective study comparing two perivascular techniques with the perineural technique for ultrasound-guided axillary brachial plexus block (US-ABPB). The primary objective was to verify whether these perivascular techniques are noninferior to the perineural technique. 240 patients were randomized to receive one of the techniques: below the artery (BA), around the artery (AA) or perineural (PN). The anesthetic volume used was 40mL of 0.375% bupivacaine. All patients received a musculocutaneous nerve blockade with 10mL. In the BA technique, 30mL were injected below the axillary artery. In the AA technique, 7.5mL were injected at 4 points around the artery. In the PN technique, the median, ulnar, and radial nerves were anesthetized with 10mL per nerve. Confidence interval analysis showed that the perivascular techniques studied were not inferior to the perineural technique. The time to perform the blockade was shortest for the BA technique (300.4±78.4sec, 396.5±117.1sec, 487.6±172.6sec, respectively). The PN technique showed a lower latency time (PN - 655.3±348.9sec; BA - 1044±389.5sec; AA - 932.9±314.5sec) and a shorter total procedure time (PN - 1132±395.8sec; BA - 1346.2±413.4sec; AA - 1329.5±344.4sec). The BA technique had a higher incidence of vascular puncture (BA - 22.5%; AA - 16.3%; PN - 5%). The perivascular techniques are viable alternatives to the perineural technique for US-ABPB. There is a higher incidence of vascular puncture associated with the BA technique. Copyright © 2017. Published by Elsevier Editora Ltda.

  1. Magnetic separation techniques in sample preparation for biological analysis: a review.

    Science.gov (United States)

    He, Jincan; Huang, Meiying; Wang, Dongmei; Zhang, Zhuomin; Li, Gongke

    2014-12-01

    Sample preparation is a fundamental and essential step in almost all the analytical procedures, especially for the analysis of complex samples like biological and environmental samples. In past decades, with advantages of superparamagnetic property, good biocompatibility and high binding capacity, functionalized magnetic materials have been widely applied in various processes of sample preparation for biological analysis. In this paper, the recent advancements of magnetic separation techniques based on magnetic materials in the field of sample preparation for biological analysis were reviewed. The strategy of magnetic separation techniques was summarized. The synthesis, stabilization and bio-functionalization of magnetic nanoparticles were reviewed in detail. Characterization of magnetic materials was also summarized. Moreover, the applications of magnetic separation techniques for the enrichment of protein, nucleic acid, cell, bioactive compound and immobilization of enzyme were described. Finally, the existed problems and possible trends of magnetic separation techniques for biological analysis in the future were proposed. Copyright © 2014 Elsevier B.V. All rights reserved.

  2. Identifying the origin of groundwater samples in a multi-layer aquifer system with Random Forest classification

    Science.gov (United States)

    Baudron, Paul; Alonso-Sarría, Francisco; García-Aróstegui, José Luís; Cánovas-García, Fulgencio; Martínez-Vicente, David; Moreno-Brotóns, Jesús

    2013-08-01

    Accurate identification of the origin of groundwater samples is not always possible in complex multilayered aquifers. This poses a major difficulty for a reliable interpretation of geochemical results. The problem is especially severe when information on the tubewell design is hard to obtain. This paper shows a supervised classification method based on the Random Forest (RF) machine learning technique to identify the layer from which groundwater samples were extracted. The classification rules were based on the major ion composition of the samples. We applied this method to the Campo de Cartagena multi-layer aquifer system in southeastern Spain. A large amount of hydrogeochemical data was available, but only a limited fraction of the sampled tubewells included a reliable determination of the borehole design and, consequently, of the aquifer layer being exploited. An added difficulty was the very similar composition of water samples extracted from different aquifer layers. Moreover, not all groundwater samples included the same geochemical variables. Despite such a background, the Random Forest classification reached accuracies over 90%. These results were much better than those of the Linear Discriminant Analysis (LDA) and Decision Trees (CART) supervised classification methods. From a total of 1549 samples, 805 proceeded from one unique identified aquifer, 409 proceeded from a possible blend of waters from several aquifers and 335 were of unknown origin. Only 468 of the 805 unique-aquifer samples included all the chemical variables needed to calibrate and validate the models. Finally, 107 of the groundwater samples of unknown origin could be classified. Most unclassified samples did not feature a complete dataset. The uncertainty in the identification of training samples was taken into account to enhance the model.
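A Random Forest classifier on major-ion compositions can be sketched with scikit-learn. The synthetic two-layer data below are illustrative values standing in for ion concentrations, not the Campo de Cartagena measurements:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Synthetic "major ion" vectors (e.g. Cl-, SO4--, Na+) for two layers;
# means and spreads are made-up but overlapping, as in real aquifers.
layer_a = rng.normal([50.0, 30.0, 10.0], 5.0, size=(100, 3))
layer_b = rng.normal([65.0, 22.0, 20.0], 5.0, size=(100, 3))
X = np.vstack([layer_a, layer_b])
y = np.array([0] * 100 + [1] * 100)          # 0 = layer A, 1 = layer B

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0, stratify=y)
clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_tr, y_tr)
accuracy = clf.score(X_te, y_te)             # held-out accuracy
```

On real data the same `predict` call would then label the samples of unknown origin, provided they carry the full set of chemical variables used in training.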

  3. Survey research with a random digit dial national mobile phone sample in Ghana: Methods and sample quality

    Science.gov (United States)

    Sefa, Eunice; Adimazoya, Edward Akolgo; Yartey, Emmanuel; Lenzi, Rachel; Tarpo, Cindy; Heward-Mills, Nii Lante; Lew, Katherine; Ampeh, Yvonne

    2018-01-01

    Introduction Generating a nationally representative sample in low and middle income countries typically requires resource-intensive household level sampling with door-to-door data collection. High mobile phone penetration rates in developing countries provide new opportunities for alternative sampling and data collection methods, but there is limited information about response rates and sample biases in coverage and nonresponse using these methods. We utilized data from an interactive voice response, random-digit dial, national mobile phone survey in Ghana to calculate standardized response rates and assess representativeness of the obtained sample. Materials and methods The survey methodology was piloted in two rounds of data collection. The final survey included 18 demographic, media exposure, and health behavior questions. Call outcomes and response rates were calculated according to the American Association of Public Opinion Research guidelines. Sample characteristics, productivity, and costs per interview were calculated. Representativeness was assessed by comparing data to the Ghana Demographic and Health Survey and the National Population and Housing Census. Results The survey was fielded during a 27-day period in February-March 2017. There were 9,469 completed interviews and 3,547 partial interviews. Response, cooperation, refusal, and contact rates were 31%, 81%, 7%, and 39% respectively. Twenty-three calls were dialed to produce an eligible contact: nonresponse was substantial due to the automated calling system and dialing of many unassigned or non-working numbers. Younger, urban, better educated, and male respondents were overrepresented in the sample. Conclusions The innovative mobile phone data collection methodology yielded a large sample in a relatively short period. Response rates were comparable to other surveys, although substantial coverage bias resulted from fewer women, rural, and older residents completing the mobile phone survey in
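Outcome rates of this kind are computed from the final call dispositions. Below is a simplified sketch in the spirit of the AAPOR definitions (RR2 counts completes plus partials in the numerator; e is the estimated eligibility share of unknown-eligibility numbers; the disposition counts in the example are made up, not the survey's):

```python
def response_rates(complete, partial, refusal, non_contact, unknown, e=1.0):
    """Simplified AAPOR-style rates from call dispositions:
    RR2-like response rate, cooperation rate, refusal rate, and a
    contact rate (here taken as all reached cases over eligibles)."""
    eligible = complete + partial + refusal + non_contact + e * unknown
    rr2 = (complete + partial) / eligible
    coop = (complete + partial) / (complete + partial + refusal)
    ref = refusal / eligible
    con = (complete + partial + refusal) / eligible
    return rr2, coop, ref, con
```

A large gap between the cooperation rate and the response rate, as in the abstract (81% vs. 31%), indicates that most losses occur before contact, e.g. unassigned or unanswered numbers.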

  4. Random Model Sampling: Making Craig Interpolation Work When It Should Not

    Directory of Open Access Journals (Sweden)

    Marat Akhin

    2014-01-01

    Full Text Available One of the most serious problems when doing program analyses is dealing with function calls. While function inlining is the traditional approach to this problem, it nonetheless suffers from the increase in analysis complexity due to the state space explosion. Craig interpolation has been successfully used in recent years in the context of bounded model checking to do function summarization, which allows one to replace the complete function body with its succinct summary and, therefore, reduce the complexity. Unfortunately, this technique can be applied only to a pair of unsatisfiable formulae. In this work-in-progress paper we present an approach to function summarization based on Craig interpolation that overcomes its limitation by using random model sampling. It captures interesting input/output relations, strengthening satisfiable formulae into unsatisfiable ones and thus allowing the use of Craig interpolation. Preliminary experiments show the applicability of this approach; in our future work we plan to do a full evaluation on real-world examples.

  5. Neurofeedback Against Binge Eating: A Randomized Controlled Trial in a Female Subclinical Threshold Sample.

    Science.gov (United States)

    Schmidt, Jennifer; Martin, Alexandra

    2016-09-01

    Brain-directed treatment techniques, such as neurofeedback, have recently been proposed as adjuncts in the treatment of eating disorders to improve therapeutic outcomes. In line with this recommendation, a cue exposure EEG-neurofeedback protocol was developed. The present study aimed at the evaluation of the specific efficacy of neurofeedback to reduce subjective binge eating in a female subthreshold sample. A total of 75 subjects were randomized to EEG-neurofeedback, mental imagery with a comparable treatment set-up or a waitlist group. At post-treatment, only EEG-neurofeedback led to a reduced frequency of binge eating (p = .015, g = 0.65). The effects remained stable to a 3-month follow-up. EEG-neurofeedback further showed particular beneficial effects on perceived stress and dietary self-efficacy. Differences in outcomes did not arise from divergent treatment expectations. Because EEG-neurofeedback showed a specific efficacy, it may be a promising brain-directed approach that should be tested as a treatment adjunct in clinical groups with binge eating. Copyright © 2016 John Wiley & Sons, Ltd and Eating Disorders Association.

  6. A coupled well-balanced and random sampling scheme for computing bubble oscillations

    Directory of Open Access Journals (Sweden)

    Jung Jonathan

    2012-04-01

    Full Text Available We propose a finite volume scheme to study the oscillations of a spherical bubble of gas in a liquid phase. Spherical symmetry implies a geometric source term in the Euler equations. Our scheme satisfies the well-balanced property. It is based on the VFRoe approach. In order to avoid spurious pressure oscillations, the well-balanced approach is coupled with an ALE (Arbitrary Lagrangian Eulerian) technique at the interface and a random sampling remap.

  7. Stratified random sampling for estimating billing accuracy in health care systems.

    Science.gov (United States)

    Buddhakulsomsiri, Jirachai; Parthanadee, Parthana

    2008-03-01

    This paper presents a stratified random sampling plan for estimating the accuracy of bill processing performance for health care bills submitted to third-party payers in health care systems. Bill processing accuracy is estimated with two measures: percent accuracy and total dollar accuracy. Difficulties in constructing a sampling plan arise when the population strata structure is unknown, and when the two measures require different sampling schemes. To efficiently utilize sample resources, the sampling plan is designed to estimate both measures effectively from the same sample. The sampling plan features a simple but efficient strata construction method, called the rectangular method, and two accuracy estimation methods, one for each measure. The sampling plan is tested on actual populations from an insurance company. Accuracy estimates obtained are then used to compare the rectangular method to other potential clustering methods for strata construction, and to compare the accuracy estimation methods to other eligible methods. Computational study results show the effectiveness of the proposed sampling plan.
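    The weighted estimate behind a stratified percent-accuracy figure can be sketched as follows; the strata and 0/1 accuracy flags are toy data, and the function name is illustrative rather than the paper's rectangular-method implementation:

    ```python
    import random

    def stratified_percent_accuracy(strata, n_per_stratum, rng=None):
        """Estimate percent accuracy: sample bills within each stratum,
        then weight each stratum's sample mean by the stratum's size."""
        rng = rng or random.Random()
        total = sum(len(s) for s in strata)
        estimate = 0.0
        for s in strata:
            if not s:  # empty strata contribute nothing
                continue
            sample = rng.sample(s, min(n_per_stratum, len(s)))
            estimate += (len(s) / total) * (sum(sample) / len(sample))
        return estimate

    # Toy example: one stratum of fully accurate bills, one fully inaccurate.
    strata = [[1] * 10, [0] * 10]
    print(stratified_percent_accuracy(strata, 5, random.Random(0)))  # 0.5
    ```

    Weighting by stratum size is what makes the estimate unbiased when strata are sampled at different rates.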

  8. SNP selection and classification of genome-wide SNP data using stratified sampling random forests.

    Science.gov (United States)

    Wu, Qingyao; Ye, Yunming; Liu, Yang; Ng, Michael K

    2012-09-01

    For high-dimensional genome-wide association (GWA) case-control data of complex disease, there is usually a large portion of single-nucleotide polymorphisms (SNPs) that are irrelevant to the disease. A simple random sampling method in random forests, using the default mtry parameter to choose the feature subspace, will select too many subspaces without informative SNPs. An exhaustive search for an optimal mtry is often required in order to include useful and relevant SNPs and discard the vast number of non-informative SNPs; however, it is too time-consuming for high-dimensional GWA data. The main aim of this paper is to propose a stratified sampling method for feature subspace selection to generate decision trees in a random forest for GWA high-dimensional data. Our idea is to design an equal-width discretization scheme for informativeness to divide SNPs into multiple groups. In feature subspace selection, we randomly select the same number of SNPs from each group and combine them to form a subspace to generate a decision tree. This stratified sampling procedure ensures that each subspace contains enough useful SNPs, while avoiding the very high computational cost of an exhaustive search for an optimal mtry and maintaining the randomness of a random forest. We employ two genome-wide SNP data sets (Parkinson case-control data comprising 408 803 SNPs and Alzheimer case-control data comprising 380 157 SNPs) to demonstrate that the proposed stratified sampling method is effective, and that it can generate a better random forest with higher accuracy and a lower error bound than those by Breiman's random forest generation method. For the Parkinson data, we also show some interesting genes identified by the method, which may be associated with neurological disorders and warrant further biological investigation.
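    The equal-width grouping and per-group draw described above can be sketched roughly as follows; the group count, per-group draw size, and function names are illustrative assumptions, not the paper's implementation:

    ```python
    import random

    def stratified_subspace(scores, n_groups=5, per_group=2, rng=None):
        """Divide SNPs into equal-width informativeness groups, then draw
        the same number of SNPs from each group to form one feature
        subspace for a single decision tree."""
        rng = rng or random.Random()
        lo, hi = min(scores.values()), max(scores.values())
        width = (hi - lo) / n_groups or 1.0
        groups = [[] for _ in range(n_groups)]
        for snp, s in scores.items():
            idx = min(int((s - lo) / width), n_groups - 1)
            groups[idx].append(snp)
        subspace = []
        for g in groups:
            if g:
                subspace.extend(rng.sample(g, min(per_group, len(g))))
        return subspace
    ```

    Repeating this draw once per tree keeps the forest random while guaranteeing that every subspace contains some informative SNPs.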

  9. Random Walks on Directed Networks: Inference and Respondent-driven Sampling

    CERN Document Server

    Malmros, Jens; Britton, Tom

    2013-01-01

    Respondent driven sampling (RDS) is a method often used to estimate population properties (e.g. sexual risk behavior) in hard-to-reach populations. It combines an effective modified snowball sampling methodology with an estimation procedure that yields unbiased population estimates under the assumption that the sampling process behaves like a random walk on the social network of the population. Current RDS estimation methodology assumes that the social network is undirected, i.e. that all edges are reciprocal. However, empirical social networks in general also have non-reciprocated edges. To account for this fact, we develop a new estimation method for RDS in the presence of directed edges on the basis of random walks on directed networks. We distinguish directed and undirected edges and consider the possibility that the random walk returns to its current position in two steps through an undirected edge. We derive estimators of the selection probabilities of individuals as a function of the number of outgoing...

  10. S2I techniques for analog sampled-data signal processing

    DEFF Research Database (Denmark)

    Machado, Gerson A. S.; Toumazou, Chris; Saether, Geir E.

    1996-01-01

    Some recent developments in Analog-Sampled-Data Signal Processing (ASD SP) are reviewed. Following a brief review of the state-of-the-art in switched capacitor (SC) signal processing, the "current mode" switched-current (SI/S2I) technique is presented. New techniques for exploring niches in low...

  11. The application of compressive sampling in rapid ultrasonic computerized tomography (UCT) technique of steel tube slab (STS)

    Science.gov (United States)

    Jiang, Baofeng; Jia, Pengjiao; Zhao, Wen; Wang, Wentao

    2018-01-01

    This paper explores a new method for rapid structural damage inspection of steel tube slab (STS) structures along randomly measured paths based on a combination of compressive sampling (CS) and ultrasonic computerized tomography (UCT). In the measurement stage, using fewer randomly selected paths rather than the whole measurement net is proposed to detect the underlying damage of a concrete-filled steel tube. In the imaging stage, the ℓ1-minimization algorithm is employed to recover the information of the microstructures based on the measurement data related to the internal situation of the STS structure. A numerical concrete tube model, with the various level of damage, was studied to demonstrate the performance of the rapid UCT technique. Real-world concrete-filled steel tubes in the Shenyang Metro stations were detected using the proposed UCT technique in a CS framework. Both the numerical and experimental results show the rapid UCT technique has the capability of damage detection in an STS structure with a high level of accuracy and with fewer required measurements, which is more convenient and efficient than the traditional UCT technique. PMID:29293593

  12. The photoload sampling technique: estimating surface fuel loadings from downward-looking photographs of synthetic fuelbeds

    Science.gov (United States)

    Robert E. Keane; Laura J. Dickinson

    2007-01-01

    Fire managers need better estimates of fuel loading so they can more accurately predict the potential fire behavior and effects of alternative fuel and ecosystem restoration treatments. This report presents a new fuel sampling method, called the photoload sampling technique, to quickly and accurately estimate loadings for six common surface fuel components (1 hr, 10 hr...

  13. Improved Upper Blepharoplasty Outcome Using an Internal Intradermal Suture Technique : A Prospective Randomized Study

    NARCIS (Netherlands)

    Pool, Shariselle M. W.; Krabbe-Timmerman, Irene S.; Cromheecke, Michel; van der Lei, Berend

    OBJECTIVE: To assess whether the suture technique in upper blepharoplasty may be the cause of differences in the occurrence of suture abscess formation and focal inflammation. MATERIALS AND METHODS: A Level I, randomized controlled trial. The upper blepharoplasty wound was closed with a running intradermal

  14. Continuous sample drop flow-based microextraction method as a microextraction technique for determination of organic compounds in water sample.

    Science.gov (United States)

    Moinfar, Soleyman; Khayatian, Gholamreza; Milani-Hosseini, Mohammad-Reza

    2014-11-01

    Continuous sample drop flow-based microextraction (CSDF-ME) is an improved version of continuous-flow microextraction (CFME) and a novel technique developed for extraction and preconcentration of benzene, toluene, ethyl benzene, m-xylene and o-xylene (BTEXs) from aqueous samples prior to gas chromatography-flame ionization detection (GC-FID). In this technique, a small amount (a few microliters) of organic solvent is transferred to the bottom of a conical bottom test tube and a few mL of aqueous solution is moved through the organic solvent at relatively slow flow rate. The aqueous solution transforms into fine droplets while passing through the organic solvent. After extraction, the enriched analyte in the extraction solvent is determined by GC-FID. The type of extraction solvent, its volume, needle diameter, and aqueous sample flow rate were investigated. The enrichment factor was 221-269 under optimum conditions and the recovery was 89-102%. The linear ranges and limits of detection for BTEXs were 2-500 and 1.4-3.1 µg L⁻¹, respectively. The relative standard deviations for 10 µg L⁻¹ of BTEXs in water were 1.8-6.2% (n=5). The advantages of CSDF-ME are its low cost, relatively short sample preparation time, low solvent consumption, high recovery, and high enrichment factor. Copyright © 2014 Elsevier B.V. All rights reserved.

  15. Thermophilic Campylobacter spp. in turkey samples: evaluation of two automated enzyme immunoassays and conventional microbiological techniques

    DEFF Research Database (Denmark)

    Borck, Birgitte; Stryhn, H.; Ersboll, A.K.

    2002-01-01

    Aims: To determine the sensitivity and specificity of two automated enzyme immunoassays (EIA), EiaFoss and Minividas, and a conventional microbiological culture technique for detecting thermophilic Campylobacter spp. in turkey samples. Methods and Results: A total of 286 samples (faecal, meat, neckskin and environmental samples) were collected over a period of 4 months at a turkey slaughterhouse and meat-cutting plant in Denmark. Faecal and environmental samples were tested by the conventional culture method and by the two EIAs, whereas meat and neckskin samples were tested by the two EIAs only.

  16. Elemental analyses of goundwater: demonstrated advantage of low-flow sampling and trace-metal clean techniques over standard techniques

    Science.gov (United States)

    Creasey, C. L.; Flegal, A. R.

    The combined use of both (1) low-flow purging and sampling and (2) trace-metal clean techniques provides more representative measurements of trace-element concentrations in groundwater than results derived with standard techniques. The use of low-flow purging and sampling provides relatively undisturbed groundwater samples that are more representative of in situ conditions, and the use of trace-element clean techniques limits the inadvertent introduction of contaminants during sampling, storage, and analysis. When these techniques are applied, resultant trace-element concentrations are likely to be markedly lower than results based on standard sampling techniques. In a comparison of data derived from contaminated and control groundwater wells at a site in California, USA, trace-element concentrations from this study were 2-1000 times lower than those determined by the conventional techniques used in sampling of the same wells prior to (5 months) and subsequent to (1 month) the collections for this study. Specifically, the cadmium and chromium concentrations derived using standard sampling techniques exceed the California Maximum Contaminant Levels (MCL), whereas in this investigation concentrations of both of those elements are substantially below their MCLs. Consequently, the combined use of low-flow and trace-metal clean techniques may preclude erroneous reports of trace-element contamination in groundwater.

  17. Thermal Analysis of Brazing Seal and Sterilizing Technique to Break Contamination Chain for Mars Sample Return

    Science.gov (United States)

    Bao, Xiaoqi; Badescu, Mircea; Bar-Cohen, Yoseph

    2015-01-01

    The potential to return Martian samples to Earth for extensive analysis is in great interest of the planetary science community. It is important to make sure the mission would securely contain any microbes that may possibly exist on Mars so that they would not be able to cause any adverse effects on Earth's environment. A brazing sealing and sterilizing technique has been proposed to break the Mars-to-Earth contamination chain. Thermal analysis of the brazing process was conducted for several conceptual designs that apply the technique. Control of the increase of the temperature of the Martian samples is a challenge. The temperature profiles of the Martian samples being sealed in the container were predicted by finite element thermal models. The results show that the sealing and sterilization process can be controlled such that the samples' temperature is maintained below the potentially required level, and that the brazing technique is a feasible approach to break the contamination chain.

  19. Random sampling of elementary flux modes in large-scale metabolic networks.

    Science.gov (United States)

    Machado, Daniel; Soons, Zita; Patil, Kiran Raosaheb; Ferreira, Eugénio C; Rocha, Isabel

    2012-09-15

    The description of a metabolic network in terms of elementary (flux) modes (EMs) provides an important framework for metabolic pathway analysis. However, their application to large networks has been hampered by the combinatorial explosion in the number of modes. In this work, we develop a method for generating random samples of EMs without computing the whole set. Our algorithm is an adaptation of the canonical basis approach, where we add an additional filtering step which, at each iteration, selects a random subset of the new combinations of modes. In order to obtain an unbiased sample, all candidates are assigned the same probability of getting selected. This approach avoids the exponential growth of the number of modes during computation, thus generating a random sample of the complete set of EMs within reasonable time. We generated samples of different sizes for a metabolic network of Escherichia coli, and observed that they preserve several properties of the full EM set. It is also shown that EM sampling can be used for rational strain design. A well-distributed sample that is representative of the complete set of EMs should be suitable for most EM-based methods for analysis and optimization of metabolic networks. Availability: Source code for a cross-platform implementation in Python is freely available at http://code.google.com/p/emsampler. Contact: dmachado@deb.uminho.pt. Supplementary data are available at Bioinformatics online.

  20. A Combined Polling and Random Access Technique for Enhanced Anti-Collision Performance in RFID Systems

    Science.gov (United States)

    Kim, Jeong Geun

    In this paper we propose a novel RFID anti-collision technique that intelligently combines polling and random access schemes. These two fundamentally different medium access control protocols are coherently integrated in our design while functionally complementing each other. The polling mode is designed to enable fast collision-free identification for the tags that exist within reader's coverage across the sessions. In contrast, the random access mode attempts to read the tags uncovered by the polling mode. Our proposed technique is particularly suited for a class of RFID applications in which a stationary reader periodically attempts to identify the tags with slow mobility. Numerical results show that our proposed technique yields much faster identification time against the existing approaches under various operating conditions.

  1. Power and sample size calculations for Mendelian randomization studies using one genetic instrument.

    Science.gov (United States)

    Freeman, Guy; Cowling, Benjamin J; Schooling, C Mary

    2013-08-01

    Mendelian randomization, which is instrumental variable analysis using genetic variants as instruments, is an increasingly popular method of making causal inferences from observational studies. In order to design efficient Mendelian randomization studies, it is essential to calculate the sample sizes required. We present formulas for calculating the power of a Mendelian randomization study using one genetic instrument to detect an effect of a given size, and the minimum sample size required to detect effects for given levels of significance and power, using asymptotic statistical theory. We apply the formulas to some example data and compare the results with those from simulation methods. Power and sample size calculations using these formulas should be more straightforward to carry out than simulation approaches. These formulas make explicit that the sample size needed for a Mendelian randomization study is inversely proportional to the square of the correlation between the genetic instrument and the exposure and proportional to the residual variance of the outcome after removing the effect of the exposure, as well as inversely proportional to the square of the effect size.
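    The stated proportionalities (sample size inversely proportional to the squared instrument-exposure correlation and squared effect size, proportional to the residual outcome variance) can be sketched with the usual normal-approximation form; the constants and names below are assumptions for illustration, not necessarily the paper's exact formulas:

    ```python
    from statistics import NormalDist

    def mr_sample_size(beta, rho_gx, resid_var=1.0, alpha=0.05, power=0.8):
        """Approximate n for detecting a causal effect `beta` with one
        genetic instrument whose correlation with the exposure is
        `rho_gx`, given residual outcome variance `resid_var`."""
        z_a = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided significance
        z_b = NormalDist().inv_cdf(power)          # target power
        return (z_a + z_b) ** 2 * resid_var / (beta ** 2 * rho_gx ** 2)
    ```

    Under this sketch, doubling rho_gx cuts the required sample size by a factor of four, matching the inverse-square relation stated in the abstract.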

  2. Recidivism among Child Sexual Abusers: Initial Results of a 13-Year Longitudinal Random Sample

    Science.gov (United States)

    Patrick, Steven; Marsh, Robert

    2009-01-01

    In the initial analysis of data from a random sample of all those charged with child sexual abuse in Idaho over a 13-year period, only one predictive variable was found that related to recidivism of those convicted. Variables such as ethnicity, relationship, gender, and age differences did not show a significant or even large association with…

  3. HABITAT ASSESSMENT USING A RANDOM PROBABILITY BASED SAMPLING DESIGN: ESCAMBIA RIVER DELTA, FLORIDA

    Science.gov (United States)

    Smith, Lisa M., Darrin D. Dantin and Steve Jordan. In press. Habitat Assessment Using a Random Probability Based Sampling Design: Escambia River Delta, Florida (Abstract). To be presented at the SWS/GERS Fall Joint Society Meeting: Communication and Collaboration: Coastal Systems...

  4. Reinforcing Sampling Distributions through a Randomization-Based Activity for Introducing ANOVA

    Science.gov (United States)

    Taylor, Laura; Doehler, Kirsten

    2015-01-01

    This paper examines the use of a randomization-based activity to introduce the ANOVA F-test to students. The two main goals of this activity are to successfully teach students to comprehend ANOVA F-tests and to increase student comprehension of sampling distributions. Four sections of students in an advanced introductory statistics course…

  5. Sampling Nomads: A New Technique for Remote, Hard-to-Reach, and Mobile Populations

    Directory of Open Access Journals (Sweden)

    Himelein Kristen

    2014-06-01

    Full Text Available Livestock are an important component of rural livelihoods in developing countries, but data about this source of income and wealth are difficult to collect due to the nomadic and seminomadic nature of many pastoralist populations. Most household surveys exclude those without permanent dwellings, leading to undercoverage. In this study, we explore the use of a random geographic cluster sample (RGCS as an alternative to the household-based sample. In this design, points are randomly selected and all eligible respondents found inside circles drawn around the selected points are interviewed. This approach should eliminate undercoverage of mobile populations. We present results of an RGCS survey with a total sample size of 784 households to measure livestock ownership in the Afar region of Ethiopia in 2012. We explore the RGCS data quality relative to a recent household survey, and discuss the implementation challenges.
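    The point-and-circle selection rule of an RGCS design can be sketched on toy planar coordinates; treating coordinates as planar and the radius/counts used here are simplifying assumptions, not the survey's actual GIS procedure:

    ```python
    import math
    import random

    def rgcs(households, n_points, radius, bbox, rng=None):
        """Random geographic cluster sample: draw random center points
        inside bbox=(xmin, ymin, xmax, ymax), then select every household
        (x, y) falling within `radius` of any center."""
        rng = rng or random.Random()
        selected = set()
        for _ in range(n_points):
            cx = rng.uniform(bbox[0], bbox[2])
            cy = rng.uniform(bbox[1], bbox[3])
            for i, (x, y) in enumerate(households):
                if math.hypot(x - cx, y - cy) <= radius:
                    selected.add(i)
        return selected
    ```

    Because selection depends only on location at interview time, households without a permanent dwelling are still eligible, which is what removes the undercoverage of mobile populations.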

  6. Flexible sampling large-scale social networks by self-adjustable random walk

    Science.gov (United States)

    Xu, Xiao-Ke; Zhu, Jonathan J. H.

    2016-12-01

    Online social networks (OSNs) have become an increasingly attractive gold mine for academic and commercial researchers. However, research on OSNs faces a number of difficult challenges. One bottleneck lies in the massive quantity and often unavailability of OSN population data. Sampling perhaps becomes the only feasible solution to the problems. How to draw samples that can represent the underlying OSNs has remained a formidable task for a number of conceptual and methodological reasons. In particular, most of the empirically-driven studies on network sampling are confined to simulated data or sub-graph data, which are fundamentally different from real and complete-graph OSNs. In the current study, we propose a flexible sampling method, called Self-Adjustable Random Walk (SARW), and test it against the population data of a real large-scale OSN. We evaluate the strengths of the sampling method in comparison with four prevailing methods, including uniform, breadth-first search (BFS), random walk (RW), and revised RW (i.e., MHRW) sampling. We mix both induced-edge and external-edge information of sampled nodes together in the same sampling process. Our results show that the SARW sampling method is able to generate unbiased samples of OSNs with maximal precision and minimal cost. The study is helpful for the practice of OSN research by providing a highly needed sampling tool, for the methodological development of large-scale network sampling through comparative evaluations of existing sampling methods, and for the theoretical understanding of human networks by highlighting discrepancies and contradictions between existing knowledge/assumptions and large-scale real OSN data.
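    SARW itself is not specified in this abstract, but the plain random-walk (RW) baseline it is compared against can be sketched in a few lines; the adjacency-dict representation and function name are illustrative assumptions:

    ```python
    import random

    def random_walk_sample(adj, start, n_steps, rng=None):
        """Simple random walk on an undirected graph given as an
        adjacency dict {node: [neighbors]}; returns the visited nodes
        (with repetition), which is the raw RW sample."""
        rng = rng or random.Random()
        node, visited = start, [start]
        for _ in range(n_steps):
            node = rng.choice(adj[node])  # uniform over neighbors
            visited.append(node)
        return visited
    ```

    A raw RW sample is biased toward high-degree nodes; methods such as MHRW (and, per the abstract, SARW) correct or adjust that bias.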

  7. Calibrating passive sampling and passive dosing techniques to lipid based concentrations

    DEFF Research Database (Denmark)

    Mayer, Philipp; Schmidt, Stine Nørgaard; Annika, A.

    2011-01-01

    Equilibrium sampling into various formats of the silicone polydimethylsiloxane (PDMS) is increasingly used to measure the exposure of hydrophobic organic chemicals in environmental matrices, and passive dosing from silicone is increasingly used to control and maintain their exposure in laboratory … external partitioning standards in vegetable or fish oil for the complete calibration of equilibrium sampling techniques without additional steps. Equilibrium in-tissue sampling in three different fish yielded lipid-based PCB concentrations in good agreement with those determined using total extraction and lipid normalization. These results support the validity of the in-tissue sampling technique, while at the same time confirming that the fugacity capacity of these lipid-rich fish tissues for PCBs was dominated by the lipid fraction. Equilibrium sampling of PCB-contaminated lake sediments with PDMS …

  8. An improved technique for soil solution sampling in the vadose zone utilizing real-time data

    Science.gov (United States)

    Singer, J. H.; Seaman, J. C.; Aburime, S. A.; Harris, J.; Karapatakis, D.

    2005-12-01

    The vadose zone is an area of ongoing concern because of its role in the fate and transport of chemicals resulting from waste disposal and agricultural practices. The degree of contamination and movement of solutes in soil solution are often difficult to assess due to temporal variability in precipitation or irrigation events and spatial variability in soil physical properties. For this reason, modeling groundwater and contaminant flow in unsaturated soil is crucial in determining the extent of the contamination. Unfortunately, manual methods used to sample soil solutions and validate model results are often difficult due to the variable nature of unsaturated soil systems. Manual techniques are traditionally performed without specific knowledge of the conditions in the soil at the time of sampling. This hit or miss approach can lead to missed samples, unsuccessful sampling, and samples that are not representative of the event of interest. In an effort to target specific soil conditions at the point of sampling that are conducive to successful sample acquisition, an automated lysimeter sampling and fraction collector system was developed. We demonstrate an innovative technique coupling real-time data with soil solution sampling methods which will improve the efficiency and accuracy of contaminant sampling in the field. The infrastructure of this system can also be implemented in a laboratory setting which adds to its practicality in model development.

  9. X-ray spectrometry and X-ray microtomography techniques for soil and geological samples analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kubala-Kukuś, A.; Banaś, D.; Braziewicz, J. [Institute of Physics, Jan Kochanowski University, ul. Świetokrzyska 15, 25-406 Kielce (Poland); Holycross Cancer Center, ul. Artwińskiego 3, 25-734 Kielce (Poland); Dziadowicz, M.; Kopeć, E. [Institute of Physics, Jan Kochanowski University, ul. Świetokrzyska 15, 25-406 Kielce (Poland); Majewska, U. [Institute of Physics, Jan Kochanowski University, ul. Świetokrzyska 15, 25-406 Kielce (Poland); Holycross Cancer Center, ul. Artwińskiego 3, 25-734 Kielce (Poland); Mazurek, M.; Pajek, M.; Sobisz, M.; Stabrawa, I. [Institute of Physics, Jan Kochanowski University, ul. Świetokrzyska 15, 25-406 Kielce (Poland); Wudarczyk-Moćko, J. [Holycross Cancer Center, ul. Artwińskiego 3, 25-734 Kielce (Poland); Góźdź, S. [Holycross Cancer Center, ul. Artwińskiego 3, 25-734 Kielce (Poland); Institute of Public Health, Jan Kochanowski University, IX Wieków Kielc 19, 25-317 Kielce (Poland)

    2015-12-01

    A particular subject of X-ray fluorescence analysis is its application in studies of multielemental sample composition over a wide range of concentrations, for samples with different matrices, including inhomogeneous ones and those characterized by different grain sizes. Typical examples of these kinds of samples are soil or geological samples, for which XRF elemental analysis may be difficult due to XRF disturbing effects. In this paper the WDXRF technique was applied in the elemental analysis of different soil and geological samples (therapeutic mud, floral soil, brown soil, sandy soil, calcium aluminum cement). The sample morphology was analyzed using the X-ray microtomography technique. The paper discusses the differences between the composition of the samples, the influence of sample preparation procedures on their morphology and, finally, a quantitative analysis. The results of the studies were statistically tested (one-way ANOVA and correlation coefficients). For the determination of lead concentration in samples of sandy soil and cement-like matrix, the WDXRF spectrometer calibration was performed. The elemental analysis of the samples was complemented with knowledge of the chemical composition obtained by X-ray powder diffraction.

  10. Sample size calculations for micro-randomized trials in mHealth.

    Science.gov (United States)

    Liao, Peng; Klasnja, Predrag; Tewari, Ambuj; Murphy, Susan A

    2016-05-30

The use and development of mobile interventions are experiencing rapid growth. In 'just-in-time' mobile interventions, treatments are provided via a mobile device, and they are intended to help an individual make healthy decisions 'in the moment,' and thus have a proximal, near-future impact. Currently, the development of mobile interventions is proceeding at a much faster pace than that of associated data science methods. A first step toward developing data-based methods is to provide an experimental design for testing the proximal effects of these just-in-time treatments. In this paper, we propose a 'micro-randomized' trial design for this purpose. In a micro-randomized trial, treatments are sequentially randomized throughout the conduct of the study, with the result that each participant may be randomized at hundreds or thousands of the occasions at which a treatment might be provided. Further, we develop a test statistic for assessing the proximal effect of a treatment as well as an associated sample size calculator. We conduct simulation evaluations of the sample size calculator in various settings. Rules of thumb that might be used in designing a micro-randomized trial are discussed. This work is motivated by our collaboration on the HeartSteps mobile application designed to increase physical activity. Copyright © 2015 John Wiley & Sons, Ltd.
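As a toy sketch of the sequential randomization scheme the abstract describes (function and parameter names are ours, not from the paper; real designs may use time-varying, context-dependent randomization probabilities):

```python
import random

def micro_randomize(n_participants, decision_points, p_treat=0.5, seed=0):
    """Illustrative micro-randomized assignment: each participant is
    independently re-randomized to treatment (True) or no treatment (False)
    at every decision point."""
    rng = random.Random(seed)
    return [[rng.random() < p_treat for _ in range(decision_points)]
            for _ in range(n_participants)]

# Three participants, five treatment occasions each:
schedule = micro_randomize(n_participants=3, decision_points=5)
```

In an actual micro-randomized trial the decision points number in the hundreds or thousands per participant, which is what drives the sample size calculations developed in the paper.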

  11. Occupational position and its relation to mental distress in a random sample of Danish residents

    DEFF Research Database (Denmark)

    Rugulies, Reiner Ernst; Madsen, Ida E H; Nielsen, Maj Britt D

    2010-01-01

PURPOSE: To analyze the distribution of depressive, anxiety, and somatization symptoms across different occupational positions in a random sample of Danish residents. METHODS: The study sample consisted of 591 Danish residents (50% women), aged 20-65, drawn from an age- and gender-stratified random sample of the Danish population. Participants filled out a survey that included the 92 item version of the Hopkins Symptom Checklist (SCL-92). We categorized occupational position into seven groups: high- and low-grade non-manual workers, skilled and unskilled manual workers, high- and low-grade self... somatization symptoms (OR = 6.28, 95% CI = 1.39-28.46). CONCLUSIONS: Unskilled manual workers, the unemployed, and, to a lesser extent, the low-grade self-employed showed an increased level of mental distress. Activities to promote mental health in the Danish population should be directed toward these groups.

  12. Established and emerging atmospheric pressure surface sampling/ionization techniques for mass spectrometry.

    Science.gov (United States)

    Van Berkel, Gary J; Pasilis, Sofie P; Ovchinnikova, Olga

    2008-09-01

The number and type of atmospheric pressure techniques suitable for sampling analytes from surfaces, forming ions from these analytes, and subsequently transporting these ions into vacuum for interrogation by MS have rapidly expanded over the last several years. Moreover, the literature in this area is complicated by an explosion of acronyms for these techniques, many of which provide no information relating to the chemical or physical processes involved. In this tutorial article, we sort this vast array of techniques into relatively few categories on the basis of the approaches used for surface sampling and ionization. For each technique, we explain, as best as they are known, many of the underlying principles of operation, describe representative applications, and in some cases discuss needed research or advancements and attempt to forecast their future analytical utility.

  13. Improvement of Frequency Domain Output Only Modal Identification from the Application of the Random Decrement Technique

    DEFF Research Database (Denmark)

    Rodrigues, J.; Brincker, Rune; Andersen, P.

    2004-01-01

... from the time series, are due to the noise reduction that results from the time averaging procedure of the random decrement technique, and from avoiding leakage in the spectral densities, as long as the random decrement functions are evaluated with sufficient time length to have a complete decay within that length. The idea is applied in the analysis of ambient vibration data collected in a ¼ scale model of a 4-story building. The results show that a considerable improvement is achieved, in terms of noise reduction in the spectral density functions and corresponding quality of the frequency domain modal...

  14. Random sampling for a mental health survey in a deprived multi-ethnic area of Berlin.

    Science.gov (United States)

    Mundt, Adrian P; Aichberger, Marion C; Kliewe, Thomas; Ignatyev, Yuriy; Yayla, Seda; Heimann, Hannah; Schouler-Ocak, Meryam; Busch, Markus; Rapp, Michael; Heinz, Andreas; Ströhle, Andreas

    2012-12-01

The aim of the study was to assess the response to random sampling for a mental health survey in a deprived multi-ethnic area of Berlin, Germany, with a large Turkish-speaking population. A random list of 1,000 persons stratified by age and gender was retrieved from the population registry, and these persons were contacted using a three-stage design including written information, telephone calls, and personal contact at home. A female bilingual interviewer contacted persons with Turkish names. Of the persons on the list, 202 were not living in the area, one was deceased, and 502 did not respond. Of the 295 responders, 152 (51.5%) explicitly refused to participate. We retained a sample of 143 participants (48.5%) representing the rate of multi-ethnicity in the area (52.1% migrants in the sample vs. 53.5% in the population). Turkish migrants were over-represented (28.9% in the sample vs. 18.6% in the population). Polish migrants (2.1% vs. 5.3% in the population) and persons from the former Yugoslavia (1.4% vs. 4.8% in the population) were under-represented. Bilingual contact procedures can improve the response rates of the most common migrant populations to random sampling if migrants of the same origin gate the contact. High non-contact and non-response rates for migrant and non-migrant populations in deprived urban areas remain a challenge for obtaining representative random samples.

  15. Assessment of proteinuria by using protein: creatinine index in random urine sample.

    Science.gov (United States)

    Khan, Dilshad Ahmed; Ahmad, Tariq Mahmood; Qureshil, Ayaz Hussain; Halim, Abdul; Ahmad, Mumtaz; Afzal, Saeed

    2005-10-01

To assess the quantitative measurement of proteinuria by using the random urine protein:creatinine index/ratio in comparison with 24-hour urinary protein excretion in patients with renal disease and normal glomerular filtration rate. One hundred and thirty patients (94 males and 36 females), aged 5 to 60 years and having proteinuria of more than 150 mg/day, were included in this study. Qualitative urinary protein estimation was done on a random urine specimen by dipstick. Quantitative measurement of protein in the random and 24-hour urine specimens was carried out by a method based on the formation of a red complex of protein with pyrogallol red in acid medium on a Microlab 200 (Merck). Estimation of creatinine was done on a Selectra-2 (Merck) by Jaffe's reaction. The urine protein:creatinine index and ratio were calculated by dividing the urine protein concentration (mg/L) by the urine creatinine concentration (mmol/L), multiplied by 10 (index) or expressed as mg/mg (ratio). A protein:creatinine index of more than 140, or a ratio of more than 0.18, in a random urine sample indicated pathological proteinuria. An excellent correlation (r = 0.96) was found between the random urine protein:creatinine index/ratio and standard 24-hour urinary protein excretion in these patients. The protein:creatinine index in random urine is a convenient, quick, and reliable method of estimating proteinuria, as compared to 24-hour urinary protein excretion, for the diagnosis and monitoring of renal diseases in our medical setup.
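The index described above is simple arithmetic; a minimal sketch (function name ours; units and the >140 cut-off taken from the abstract):

```python
def protein_creatinine_index(protein_mg_per_L, creatinine_mmol_per_L):
    """Random-urine protein:creatinine index: protein (mg/L) divided by
    creatinine (mmol/L), multiplied by 10. Values above 140 indicated
    pathological proteinuria in this study."""
    return protein_mg_per_L / creatinine_mmol_per_L * 10

# e.g. 280 mg/L protein with 10 mmol/L creatinine gives an index of 280,
# well above the 140 cut-off.
```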

  16. Systematic comparison of static and dynamic headspace sampling techniques for gas chromatography.

    Science.gov (United States)

    Kremser, Andreas; Jochmann, Maik A; Schmidt, Torsten C

    2016-09-01

Six automated, headspace-based sample preparation techniques were used to extract volatile analytes from water, with the goal of establishing a systematic comparison between commonly available instrumental alternatives. To that end, these six techniques were used in conjunction with the same gas chromatography instrument for analysis of a common set of volatile organic carbon (VOC) analytes. The methods were divided into three classes: static sampling (by syringe or loop), static enrichment (SPME and PAL SPME Arrow), and dynamic enrichment (ITEX and trap sampling). For PAL SPME Arrow, different sorption phase materials were also included in the evaluation. To enable an effective comparison, method detection limits (MDLs), relative standard deviations (RSDs), and extraction yields were determined and are discussed for all techniques. While static sampling techniques exhibited sufficient extraction yields (approx. 10-20 %) to be reliably used down to approx. 100 ng L(-1), enrichment techniques displayed extraction yields of up to 80 %, resulting in MDLs down to the picogram per liter range. RSDs for all techniques were below 27 %. The choice among the three modes of operation (the aforementioned classes) was the most influential parameter in terms of extraction yields and MDLs; individual methods within each class deviated less from one another, and the choice of sorption phase material for the individual enrichment techniques had the smallest influence. The option of selecting specialized sorption phase materials may, however, be more important when analyzing analytes with different properties, such as high polarity or the capability of specific molecular interactions. Graphical Abstract: PAL SPME Arrow during the extraction of volatile analytes from the headspace of an aqueous sample.

  17. Comparison of Residual Stress Characterization Techniques Using an Interference Fit Sample (Preprint)

    Science.gov (United States)

    2017-04-06

AFRL-RX-WP-JA-2017-0329. Comparison of Residual Stress Characterization Techniques Using an Interference Fit Sample (Preprint). Interim report, 19 March 2014 – 3 March 2017. Abstract: Residual stress in an engineering component induced from processing is pervasive and can impact the...

  18. Generalized essential energy space random walks to more effectively accelerate solute sampling in aqueous environment.

    Science.gov (United States)

    Lv, Chao; Zheng, Lianqing; Yang, Wei

    2012-01-28

Molecular dynamics sampling can be enhanced by promoting potential energy fluctuations, for instance based on a Hamiltonian modified with the addition of a potential-energy-dependent biasing term. To overcome the diffusion sampling issue, namely that enlarging event-irrelevant energy fluctuations may abolish sampling efficiency, the essential energy space random walk (EESRW) approach was proposed earlier. To more effectively accelerate the sampling of solute conformations in aqueous environments, in the current work we generalized the EESRW method to a two-dimensional EESRW (2D-EESRW) strategy. Specifically, the essential internal energy component of a focused region and the essential interaction energy component between the focused region and the environmental region are employed to define the two-dimensional essential energy space. This proposal is motivated by the general observation that different conformational events have distinctive interplays between the two essential energy components. Model studies on the alanine dipeptide and the aspartate-arginine peptide demonstrate sampling improvement over the original one-dimensional EESRW strategy; at the same biasing level, the present generalization allows more effective acceleration of the sampling of conformational transitions in aqueous solution. The 2D-EESRW generalization is readily extended to higher-dimension schemes and can be employed in more advanced enhanced-sampling schemes, such as the recent orthogonal space random walk method. © 2012 American Institute of Physics.

  19. Solid Phase Microextraction and Related Techniques for Drugs in Biological Samples

    Science.gov (United States)

    Moein, Mohammad Mahdi; Said, Rana; Bassyouni, Fatma

    2014-01-01

In drug discovery and development, the quantification of drugs in biological samples is an important task for the determination of the physiological performance of the investigated drugs. After sampling, the next step in the analytical process is sample preparation. Because of the low concentration levels of drug in plasma and the variety of the metabolites, the selected extraction technique should be virtually exhaustive. Recent developments of sample handling techniques are directed, on the one hand, toward automation and online coupling of sample preparation units. The primary objective of this review is to present the recent developments in microextraction sample preparation methods for the analysis of drugs in biological fluids. Microextraction techniques allow for less consumption of solvent, reagents, and packing materials, and small sample volumes can be used. In this review the use of solid phase microextraction (SPME), microextraction in packed sorbent (MEPS), and stir-bar sorptive extraction (SBSE) in drug analysis will be discussed. In addition, the use of new sorbents such as monoliths and molecularly imprinted polymers will be presented. PMID:24688797

  20. Solid Phase Microextraction and Related Techniques for Drugs in Biological Samples

    Directory of Open Access Journals (Sweden)

    Mohammad Mahdi Moein

    2014-01-01

Full Text Available In drug discovery and development, the quantification of drugs in biological samples is an important task for the determination of the physiological performance of the investigated drugs. After sampling, the next step in the analytical process is sample preparation. Because of the low concentration levels of drug in plasma and the variety of the metabolites, the selected extraction technique should be virtually exhaustive. Recent developments of sample handling techniques are directed, on the one hand, toward automation and online coupling of sample preparation units. The primary objective of this review is to present the recent developments in microextraction sample preparation methods for the analysis of drugs in biological fluids. Microextraction techniques allow for less consumption of solvent, reagents, and packing materials, and small sample volumes can be used. In this review the use of solid phase microextraction (SPME), microextraction in packed sorbent (MEPS), and stir-bar sorptive extraction (SBSE) in drug analysis will be discussed. In addition, the use of new sorbents such as monoliths and molecularly imprinted polymers will be presented.

  1. Use of superheated liquid dispersion technique for measuring alpha-emitting actinides in environmental samples

    Science.gov (United States)

    Wang, C. K.; Lim, W.; Pan, L. K.

    1994-12-01

This paper presents a novel fast screening technique for measuring concentrations of alpha-emitting actinides in environmental samples. This novel technique, called superheated liquid dispersion (SLD), involves dispersing fine superheated liquid (e.g., Freon-12) droplets into a mixture of glycerin and the actinide-containing chemical extractant. The interactions between alpha particles and the superheated liquid droplets trigger bubbles; therefore, one may relate the number of bubbles to the actinide concentration in the sample. The results obtained from the computer simulation and the experiment support the above claim.

  2. Modular Sampling and Analysis Techniques for the Real-Time Analysis of Human Breath

    Energy Technology Data Exchange (ETDEWEB)

    Frank, M; Farquar, G; Adams, K; Bogan, M; Martin, A; Benner, H; Spadaccini, C; Steele, P; Davis, C; Loyola, B; Morgan, J; Sankaran, S

    2007-07-09

    At LLNL and UC Davis, we are developing several techniques for the real-time sampling and analysis of trace gases, aerosols and exhaled breath that could be useful for a modular, integrated system for breath analysis. Those techniques include single-particle bioaerosol mass spectrometry (BAMS) for the analysis of exhaled aerosol particles or droplets as well as breath samplers integrated with gas chromatography mass spectrometry (GC-MS) or MEMS-based differential mobility spectrometry (DMS). We describe these techniques and present recent data obtained from human breath or breath condensate, in particular, addressing the question of how environmental exposure influences the composition of breath.

  3. Randomized controlled trials 5: Determining the sample size and power for clinical trials and cohort studies.

    Science.gov (United States)

    Greene, Tom

    2015-01-01

    Performing well-powered randomized controlled trials is of fundamental importance in clinical research. The goal of sample size calculations is to assure that statistical power is acceptable while maintaining a small probability of a type I error. This chapter overviews the fundamentals of sample size calculation for standard types of outcomes for two-group studies. It considers (1) the problems of determining the size of the treatment effect that the studies will be designed to detect, (2) the modifications to sample size calculations to account for loss to follow-up and nonadherence, (3) the options when initial calculations indicate that the feasible sample size is insufficient to provide adequate power, and (4) the implication of using multiple primary endpoints. Sample size estimates for longitudinal cohort studies must take account of confounding by baseline factors.
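For the simplest two-group case the chapter covers, a standard normal-approximation calculation can be sketched as follows (illustrative only; the chapter's own procedures span more outcome types and adjustments, and the function names are ours):

```python
from math import ceil
from statistics import NormalDist

def two_group_sample_size(delta, sigma, alpha=0.05, power=0.80):
    """Per-group n for a two-sample z-test of means:
    n = 2 * ((z_{1-alpha/2} + z_{power}) * sigma / delta)^2."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)
    z_b = NormalDist().inv_cdf(power)
    return ceil(2 * ((z_a + z_b) * sigma / delta) ** 2)

def inflate_for_dropout(n, dropout_rate):
    """Crude adjustment for loss to follow-up: enroll n / (1 - rate)."""
    return ceil(n / (1 - dropout_rate))

# Detecting a half-standard-deviation effect: 63 per group,
n = two_group_sample_size(delta=0.5, sigma=1.0)
# inflated to 79 per group if 20% dropout is anticipated.
n_enrolled = inflate_for_dropout(n, 0.20)
```

The dropout inflation corresponds to the chapter's point (2): losses to follow-up reduce the effective sample size, so the enrollment target must be raised accordingly.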

  4. Characterization of Electron Microscopes with Binary Pseudo-random Multilayer Test Samples

    Energy Technology Data Exchange (ETDEWEB)

    V Yashchuk; R Conley; E Anderson; S Barber; N Bouet; W McKinney; P Takacs; D Voronov

    2011-12-31

    Verification of the reliability of metrology data from high quality X-ray optics requires that adequate methods for test and calibration of the instruments be developed. For such verification for optical surface profilometers in the spatial frequency domain, a modulation transfer function (MTF) calibration method based on binary pseudo-random (BPR) gratings and arrays has been suggested [1] and [2] and proven to be an effective calibration method for a number of interferometric microscopes, a phase shifting Fizeau interferometer, and a scatterometer [5]. Here we describe the details of development of binary pseudo-random multilayer (BPRML) test samples suitable for characterization of scanning (SEM) and transmission (TEM) electron microscopes. We discuss the results of TEM measurements with the BPRML test samples fabricated from a WiSi2/Si multilayer coating with pseudo-randomly distributed layers. In particular, we demonstrate that significant information about the metrological reliability of the TEM measurements can be extracted even when the fundamental frequency of the BPRML sample is smaller than the Nyquist frequency of the measurements. The measurements demonstrate a number of problems related to the interpretation of the SEM and TEM data. Note that similar BPRML test samples can be used to characterize X-ray microscopes. Corresponding work with X-ray microscopes is in progress.
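Binary pseudo-random patterns of this kind are commonly built from maximum-length sequences; a minimal LFSR sketch (our illustration of the general idea; the papers' actual BPRML construction may differ):

```python
def mls4():
    """15-bit maximum-length sequence from a 4-bit Fibonacci LFSR with
    feedback polynomial x^4 + x^3 + 1."""
    state, out = 0b0001, []
    for _ in range(15):                       # period of a 4-bit m-sequence: 2^4 - 1
        out.append(state & 1)                 # emit the low bit
        fb = ((state >> 3) ^ (state >> 2)) & 1
        state = ((state << 1) | fb) & 0b1111  # shift left, feed back into bit 0
    return out

seq = mls4()
```

The flat (white) spatial spectrum of such sequences is what makes binary pseudo-random gratings useful as MTF calibration targets: all spatial frequencies up to the pattern's fundamental are excited equally.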

  5. Characterization of electron microscopes with binary pseudo-random multilayer test samples

    Energy Technology Data Exchange (ETDEWEB)

    Yashchuk, Valeriy V., E-mail: VVYashchuk@lbl.gov [Advanced Light Source, Lawrence Berkeley National Laboratory, Berkeley, CA 94720 (United States); Conley, Raymond [NSLS-II, Brookhaven National Laboratory, Upton, NY 11973 (United States); Anderson, Erik H. [Center for X-ray Optics, Lawrence Berkeley National Laboratory, Berkeley, CA 94720 (United States); Barber, Samuel K. [Advanced Light Source, Lawrence Berkeley National Laboratory, Berkeley, CA 94720 (United States); Bouet, Nathalie [NSLS-II, Brookhaven National Laboratory, Upton, NY 11973 (United States); McKinney, Wayne R. [Advanced Light Source, Lawrence Berkeley National Laboratory, Berkeley, CA 94720 (United States); Takacs, Peter Z. [Brookhaven National Laboratory, Upton, NY 11973 (United States); Voronov, Dmitriy L. [Advanced Light Source, Lawrence Berkeley National Laboratory, Berkeley, CA 94720 (United States)

    2011-09-01

Verification of the reliability of metrology data from high quality X-ray optics requires that adequate methods for test and calibration of the instruments be developed. For such verification for optical surface profilometers in the spatial frequency domain, a modulation transfer function (MTF) calibration method based on binary pseudo-random (BPR) gratings and arrays has been suggested and proven to be an effective calibration method for a number of interferometric microscopes, a phase shifting Fizeau interferometer, and a scatterometer [5]. Here we describe the details of development of binary pseudo-random multilayer (BPRML) test samples suitable for characterization of scanning (SEM) and transmission (TEM) electron microscopes. We discuss the results of TEM measurements with the BPRML test samples fabricated from a WiSi2/Si multilayer coating with pseudo-randomly distributed layers. In particular, we demonstrate that significant information about the metrological reliability of the TEM measurements can be extracted even when the fundamental frequency of the BPRML sample is smaller than the Nyquist frequency of the measurements. The measurements demonstrate a number of problems related to the interpretation of the SEM and TEM data. Note that similar BPRML test samples can be used to characterize X-ray microscopes. Corresponding work with X-ray microscopes is in progress.

  6. Accounting for Sampling Error in Genetic Eigenvalues Using Random Matrix Theory.

    Science.gov (United States)

    Sztepanacz, Jacqueline L; Blows, Mark W

    2017-07-01

The distribution of genetic variance in multivariate phenotypes is characterized by the empirical spectral distribution of the eigenvalues of the genetic covariance matrix. Empirical estimates of genetic eigenvalues from random effects linear models are known to be overdispersed by sampling error, where large eigenvalues are biased upward, and small eigenvalues are biased downward. The overdispersion of the leading eigenvalues of sample covariance matrices has been demonstrated to conform to the Tracy-Widom (TW) distribution. Here we show that genetic eigenvalues estimated using restricted maximum likelihood (REML) in a multivariate random effects model with an unconstrained genetic covariance structure will also conform to the TW distribution after empirical scaling and centering. However, where estimation procedures using either REML or MCMC impose boundary constraints, the resulting genetic eigenvalues tend not to be TW distributed. We show how using confidence intervals from sampling distributions of genetic eigenvalues without reference to the TW distribution is insufficient protection against mistaking sampling error for genetic variance, particularly when eigenvalues are small. By scaling such sampling distributions to the appropriate TW distribution, the critical value of the TW statistic can be used to determine if the magnitude of a genetic eigenvalue exceeds the sampling error for each eigenvalue in the spectral distribution of a given genetic covariance matrix. Copyright © 2017 by the Genetics Society of America.
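The overdispersion described above is easy to reproduce in a toy setting: for data simulated with a true 2 × 2 identity covariance (both true eigenvalues equal to 1), sample covariance eigenvalues spread systematically apart (our illustration; the paper concerns genetic covariance matrices estimated by REML/MCMC, not this simple case):

```python
import random
from math import sqrt

rng = random.Random(42)

def sample_cov_eigs(n):
    """Eigenvalues of the 2x2 sample covariance of n iid N(0, I) draws,
    via the closed form for a symmetric 2x2 matrix."""
    xs = [(rng.gauss(0, 1), rng.gauss(0, 1)) for _ in range(n)]
    mx = sum(x for x, _ in xs) / n
    my = sum(y for _, y in xs) / n
    a = sum((x - mx) ** 2 for x, _ in xs) / (n - 1)
    c = sum((y - my) ** 2 for _, y in xs) / (n - 1)
    b = sum((x - mx) * (y - my) for x, y in xs) / (n - 1)
    half_gap = sqrt(((a - c) / 2) ** 2 + b ** 2)
    return (a + c) / 2 + half_gap, (a + c) / 2 - half_gap  # (largest, smallest)

# With only n = 10 observations, the leading eigenvalue is biased upward
# and the trailing one downward, even though both true values are 1.
big, small = zip(*(sample_cov_eigs(10) for _ in range(2000)))
```

This is exactly the effect the abstract warns about: without correcting for it, the spread of estimated eigenvalues can be mistaken for real variance structure.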

  7. Hemodynamic and glucometabolic factors fail to predict renal function in a random population sample

    DEFF Research Database (Denmark)

    Pareek, M.; Nielsen, M.; Olesen, Thomas Bastholm

    2015-01-01

    Objective: To determine whether baseline hemodynamic and/or glucometabolic risk factors could predict renal function at follow-up, independently of baseline serum creatinine, in survivors from a random population sample. Design and method: We examined associations between baseline serum creatinine...... indices of beta-cell function (HOMA-2B), insulin sensitivity (HOMA-2S), and insulin resistance (HOMA-2IR)), traditional cardiovascular risk factors (age, sex, smoking status, body mass index, diabetes mellitus, total serum cholesterol), and later renal function determined as serum cystatin C in 238 men...... and 7 women aged 38 to 49 years at the time of inclusion, using multivariable linear regression analysis (p-entry 0.05, p-removal 0.20). Study subjects came from a random population based sample and were included 1974-1992, whilst the follow-up with cystatin C measurement was performed 2002...

  8. An inversion method based on random sampling for real-time MEG neuroimaging

    CERN Document Server

    Pascarella, Annalisa

    2016-01-01

MagnetoEncephaloGraphy (MEG) has gained great interest in neurorehabilitation training due to its high temporal resolution. The challenge is to localize the active regions of the brain in a fast and accurate way. In this paper we use an inversion method based on random spatial sampling to solve the real-time MEG inverse problem. Several numerical tests on synthetic but realistic data show that the method takes just a few hundredths of a second on a laptop to produce an accurate map of the electric activity inside the brain. Moreover, it requires very little memory storage. For these reasons the random sampling method is particularly attractive in real-time MEG applications.

  9. Review of sample preparation techniques for the analysis of pesticide residues in soil.

    Science.gov (United States)

    Tadeo, José L; Pérez, Rosa Ana; Albero, Beatriz; García-Valcárcel, Ana I; Sánchez-Brunete, Consuelo

    2012-01-01

    This paper reviews the sample preparation techniques used for the analysis of pesticides in soil. The present status and recent advances made during the last 5 years in these methods are discussed. The analysis of pesticide residues in soil requires the extraction of analytes from this matrix, followed by a cleanup procedure, when necessary, prior to their instrumental determination. The optimization of sample preparation is a very important part of the method development that can reduce the analysis time, the amount of solvent, and the size of samples. This review considers all aspects of sample preparation, including extraction and cleanup. Classical extraction techniques, such as shaking, Soxhlet, and ultrasonic-assisted extraction, and modern techniques like pressurized liquid extraction, microwave-assisted extraction, solid-phase microextraction and QuEChERS (Quick, Easy, Cheap, Effective, Rugged, and Safe) are reviewed. The different cleanup strategies applied for the purification of soil extracts are also discussed. In addition, the application of these techniques to environmental studies is considered.

  10. Language Sample Analysis and Elicitation Technique Effects in Bilingual Children with and without Language Impairment

    Science.gov (United States)

    Kapantzoglou, Maria; Fergadiotis, Gerasimos; Restrepo, M. Adelaida

    2017-01-01

    Purpose: This study examined whether the language sample elicitation technique (i.e., storytelling and story-retelling tasks with pictorial support) affects lexical diversity (D), grammaticality (grammatical errors per communication unit [GE/CU]), sentence length (mean length of utterance in words [MLUw]), and sentence complexity (subordination…

  11. Performance Evaluation and Parameter Optimization of Wavelength Division Multiplexing Networks with Importance Sampling Techniques

    NARCIS (Netherlands)

    Remondo Bueno, D.; Srinivasan, R.; Nicola, V.F.; van Etten, Wim; Tattje, H.E.P.

    1998-01-01

In this paper new adaptive importance sampling techniques are applied to the performance evaluation and parameter optimization of a wavelength division multiplexing (WDM) network impaired by crosstalk in an optical cross-connect. Worst-case analysis is carried out, including all the beat noise terms.

  12. BEHAVIOR OF AN IMMERSED CORE SAMPLE IN A FLUID CONTAINER DURING A SATURATION TECHNIQUE

    Directory of Open Access Journals (Sweden)

    S GHEBOULI

    2002-06-01

Full Text Available The objective of this paper is to present the behavior of some core samples immersed in a fluid container during a saturation technique. The experimental work was carried out using two different fluids. The results obtained are similar except for the saturation stages; the differences are caused by the fluids' different viscosities and surface tensions.

  13. Modified Exponential Type Estimator for Population Mean Using Auxiliary Variables in Stratified Random Sampling

    OpenAIRE

    Özel, Gamze

    2015-01-01

    In this paper, a new exponential type estimator is developed in the stratified random sampling for the population mean using auxiliary variable information. In order to evaluate efficiency of the introduced estimator, we first review some estimators and study the optimum property of the suggested strategy. To judge the merits of the suggested class of estimators over others under the optimal condition, simulation study and real data applications are conducted. The results show that the introduc...

  14. Effectiveness of hand hygiene education among a random sample of women from the community

    OpenAIRE

    Ubheeram, J.; Biranjia-Hurdoyal, S.D.

    2017-01-01

    Summary Objective. The effectiveness of hand hygiene education was investigated by studying the hand hygiene awareness and bacterial hand contamination among a random sample of 170 women in the community. Methods. Questionnaire was used to assess the hand hygiene awareness score, followed by swabbing of the dominant hand. Bacterial identification was done by conventional biochemical tests. Results. Better hand hygiene awareness score was significantly associated with age, scarce bacterial gro...

  15. Control Capacity and A Random Sampling Method in Exploring Controllability of Complex Networks

    OpenAIRE

Jia, Tao; Barabási, Albert-László

    2013-01-01

    Controlling complex systems is a fundamental challenge of network science. Recent advances indicate that control over the system can be achieved through a minimum driver node set (MDS). The existence of multiple MDS's suggests that nodes do not participate in control equally, prompting us to quantify their participations. Here we introduce control capacity quantifying the likelihood that a node is a driver node. To efficiently measure this quantity, we develop a random sampling algorithm. Thi...

  16. Precision of systematic and random sampling in clustered populations: habitat patches and aggregating organisms.

    Science.gov (United States)

    McGarvey, Richard; Burch, Paul; Matthews, Janet M

    2016-01-01

Natural populations of plants and animals spatially cluster because (1) suitable habitat is patchy, and (2) within suitable habitat, individuals aggregate further into clusters of higher density. We compare the precision of random and systematic field sampling survey designs under these two processes of species clustering. Second, we evaluate the performance of 13 estimators for the variance of the sample mean from a systematic survey. Replicated simulated surveys, as counts from 100 transects allocated either randomly or systematically within the study region, were used to estimate population density in six spatial point populations, including habitat patches and Matérn circular clustered aggregations of organisms, together and in combination. The standard one-start aligned systematic survey design, a uniform 10 × 10 grid of transects, was much more precise. Variances of the 10 000 replicated systematic survey mean densities were one-third to one-fifth of those from randomly allocated transects, implying transect sample sizes giving equivalent precision by random survey would need to be three to five times larger. Organisms being restricted to patches of habitat was alone sufficient to yield this precision advantage for the systematic design. But this improved precision for systematic sampling in clustered populations is underestimated by the standard variance estimators used to compute confidence intervals. True variance for the survey sample mean was computed from the variance of 10 000 simulated survey mean estimates. Testing 10 published and three newly proposed variance estimators, the two estimators that corrected for inter-transect correlation (ν₈ and ν_W) were the most accurate and also the most precise in clustered populations. These greatly outperformed the two "post-stratification" variance estimators (ν₂ and ν₃) that are now more commonly applied in systematic surveys. Similar variance estimator performance rankings were found with
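The core comparison can be mimicked in a one-dimensional toy simulation (our sketch; the study used two-dimensional point populations, a one-start aligned grid, and 13 variance estimators): organisms confined to habitat patches, surveyed by either randomly placed or evenly spaced transects.

```python
import random

random.seed(1)

# Hypothetical patchy population: counts at 100 possible transect positions,
# with organisms confined to two habitat patches.
counts = [0] * 100
for lo, hi in [(10, 25), (60, 80)]:
    for i in range(lo, hi):
        counts[i] = random.randint(5, 15)

def random_survey_mean(n=10):
    picks = random.sample(range(100), n)
    return sum(counts[i] for i in picks) / n

def systematic_survey_mean(step=10):
    start = random.randrange(step)            # random-start systematic design
    return sum(counts[i] for i in range(start, 100, step)) / (100 // step)

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

reps = 2000
var_random = variance([random_survey_mean() for _ in range(reps)])
var_systematic = variance([systematic_survey_mean() for _ in range(reps)])
# Clustering alone makes the systematic design markedly more precise here:
# every systematic start covers the patches proportionally, whereas a random
# draw of 10 transects can hit the patches many times or hardly at all.
```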

  17. High-order Sampling Techniques of Aliased Signals for Very Long Baseline Interferometry

    Science.gov (United States)

    Takefuji, K.; Kondo, T.; Sekido, M.; Kumazawa, T.; Harada, K.; Nakayama, T.; Kurihara, S.; Kokado, K.; Kawabata, R.; Ichikawa, R.

    2012-10-01

    Radio frequency (RF) direct sampling is a technique used to sample RF signals that are higher in frequency than the sampling rate, without the use of a frequency converter and an anti-aliasing filter. In the case of geodetic VLBI, the RF frequency is at most 9 GHz. Recently, a digital sampler with high sensitivity at RF frequencies greater than 10 GHz was developed. The sampler enables us to evaluate the use of the RF direct sampling technique in geodetic VLBI. RF direct sampling has the potential to make the system simple and stable because, unlike a conventional system, analog frequency converters are not used. We developed two sets of RF direct sampling systems and operated them on the Kashima-Tsukuba baseline (about 50 km long) in Japan. At first, we carried out a VLBI experiment for X band (8 GHz) signals only and successfully obtained first fringes. Aliased signals could be discriminated through correlation processing. Then, we adopted RF direct sampling for mixed signals, i.e., S band (2 GHz) and X band signals combined with each other to make a geodetic VLBI observation. We carried out a 24 hr geodetic VLBI session on 2011 October 19 and succeeded in fringe detection for both S and X bands. After correlation processing, baseline analysis was carried out and the results were consistent with those obtained by conventional VLBI.
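The aliasing relied upon here is predictable: an RF tone above the Nyquist rate folds to a computable apparent frequency in the first Nyquist zone, which is why aliased signals can be discriminated in correlation. A minimal sketch (the 2048 MHz sampling rate and the 8.4 GHz tone are invented example values, not the paper's actual parameters):

```python
def aliased_frequency(f_rf, f_s):
    """Apparent (folded) frequency of an RF tone f_rf sampled at rate f_s,
    mapped into the first Nyquist zone [0, f_s/2]."""
    r = f_rf % f_s
    return min(r, f_s - r)

# Hypothetical X-band tone at 8.4 GHz, sampled at 2048 MHz:
print(aliased_frequency(8.4e9, 2.048e9) / 1e6)  # 208.0 (MHz)
```

Knowing this mapping lets the correlator associate each folded tone with its original RF band.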

  18. Determining optimal sample sizes for multi-stage randomized clinical trials using value of information methods.

    Science.gov (United States)

    Willan, Andrew; Kowgier, Matthew

    2008-01-01

    Traditional sample size calculations for randomized clinical trials depend on somewhat arbitrarily chosen factors, such as Type I and II errors. An effectiveness trial (otherwise known as a pragmatic trial or management trial) is essentially an effort to inform decision-making, i.e., should treatment be adopted over standard? Taking a societal perspective and using Bayesian decision theory, Willan and Pinto (Stat. Med. 2005; 24:1791-1806 and Stat. Med. 2006; 25:720) show how to determine the sample size that maximizes the expected net gain, i.e., the difference between the cost of doing the trial and the value of the information gained from the results. These methods are extended to include multi-stage adaptive designs, with a solution given for a two-stage design. The methods are applied to two examples. As demonstrated by the two examples, substantial increases in the expected net gain (ENG) can be realized by using multi-stage adaptive designs based on expected value of information methods. In addition, the expected sample size and total cost may be reduced. Exact solutions have been provided for the two-stage design. Solutions for higher-order designs may prove to be prohibitively complex and approximate solutions may be required. The use of multi-stage adaptive designs for randomized clinical trials based on expected value of sample information methods leads to substantial gains in the ENG and reductions in the expected sample size and total cost.

  19. Sample size calculations for pilot randomized trials: a confidence interval approach.

    Science.gov (United States)

    Cocks, Kim; Torgerson, David J

    2013-02-01

    To describe a method using confidence intervals (CIs) to estimate the sample size for a pilot randomized trial. Using one-sided CIs and the estimated effect size that would be sought in a large trial, we calculated the sample size needed for pilot trials. Using an 80% one-sided CI, we estimated that a pilot trial should have at least 9% of the sample size of the main planned trial. Using the estimated effect size difference for the main trial and using a one-sided CI, this allows us to calculate a sample size for a pilot trial, which will make its results more useful than at present. Copyright © 2013 Elsevier Inc. All rights reserved.
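As a rough sketch of the arithmetic, assuming the standard two-arm normal-approximation sample-size formula for the main trial (the effect size d = 0.3, alpha, and power below are invented example values, not taken from the paper):

```python
from math import ceil
from statistics import NormalDist

def main_trial_n_per_arm(d, alpha=0.05, power=0.8):
    """Per-arm size for a two-arm trial, normal approximation,
    standardized effect size d."""
    z = NormalDist().inv_cdf
    return ceil(2 * (z(1 - alpha / 2) + z(power)) ** 2 / d ** 2)

n_main_total = 2 * main_trial_n_per_arm(0.3)  # hypothetical effect size d = 0.3
n_pilot = ceil(0.09 * n_main_total)           # the paper's "at least 9%" rule
print(n_main_total, n_pilot)
```

Under these assumed inputs the main trial needs 350 participants in total, giving a pilot of about 32.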

  20. Heating and thermal control of brazing technique to break contamination path for potential Mars sample return

    Science.gov (United States)

    Bao, Xiaoqi; Badescu, Mircea; Sherrit, Stewart; Bar-Cohen, Yoseph; Campos, Sergio

    2017-04-01

    The potential return of Mars sample material is of great interest to the planetary science community, as it would enable extensive analysis of samples with highly sensitive laboratory instruments. It is important to ensure that such a mission concept would not bring any living microbes, which may possibly exist on Mars, back to Earth's environment. In order to ensure the isolation of Mars microbes from Earth's atmosphere, a brazing sealing and sterilizing technique was proposed to break the Mars-to-Earth contamination path. Effectively heating the brazing zone in high-vacuum space and controlling the sample temperature to preserve sample integrity are key challenges to the implementation of this technique. The break-the-chain procedures for the container configurations under consideration were simulated by multi-physics finite element models. Different heating methods, including induction and resistive/radiation heating, were evaluated. The temperature profiles of Martian samples in a proposed container structure were predicted. The results show that the sealing and sterilizing process can be controlled such that the sample temperature is maintained below the level that may cause damage, and that the brazing technique is a feasible approach to breaking the contamination path.

  1. Randomized comparison of coronary bifurcation stenting with the crush versus the culotte technique using sirolimus eluting stents: the Nordic stent technique study

    DEFF Research Database (Denmark)

    Erglis, Andrejs; Kumsars, Indulis; Niemelä, Matti

    2009-01-01

    BACKGROUND: In a number of coronary bifurcation lesions, both the main vessel and the side branch need stent coverage. Using sirolimus eluting stents, we compared 2 dedicated bifurcation stent techniques, the crush and the culotte techniques, in a randomized trial with separate clinical and angiographic follow-up...

  2. Toward greener analytical techniques for the absolute quantification of peptides in pharmaceutical and biological samples.

    Science.gov (United States)

    Van Eeckhaut, Ann; Mangelings, Debby

    2015-09-10

    Peptide-based biopharmaceuticals represent one of the fastest growing classes of new drug molecules. New reaction types included in the synthesis strategies to reduce the rapid metabolism of peptides, along with the availability of new formulation and delivery technologies, have resulted in increased marketing of peptide drug products. In this regard, the development of analytical methods for quantification of peptides in pharmaceutical and biological samples is of utmost importance. From the sample preparation step to the analysis by chromatographic or electrophoretic methods, many difficulties must be tackled. Recent developments in analytical techniques place increasing emphasis on the use of green analytical techniques. This review discusses the progress in, and challenges observed during, green analytical method development for the quantification of peptides in pharmaceutical and biological samples. Copyright © 2015 Elsevier B.V. All rights reserved.

  3. A quantitative technique for sampling motile macroinvertebrates in beds of the seagrass Posidonia oceanica (L.) Delile

    Directory of Open Access Journals (Sweden)

    Joseph A. Borg

    2002-03-01

    Full Text Available Techniques for sampling motile macroinvertebrates associated with Posidonia oceanica seagrass meadows have mainly involved the use of hand-nets and suction samplers or collection by hand. These techniques give unreliable quantitative estimates or present practical difficulties. A large cylindrical saw-rimmed corer was designed and used successfully to obtain quantitative samples of macroinvertebrates from both the foliage and the root-rhizome matrix of a Posidonia oceanica meadow in Malta (central Mediterranean). Choice of the appropriate sample unit size was assessed by comparing the relative accuracy, precision and efficiency of three different core diameters: 25 cm, 35 cm and 45 cm. The results suggest that the 25 cm diameter corer is recommended for comparisons of macrofaunal species richness and abundance between different meadows/sites. For surveys aimed at estimating total diversity within a particular site, the 35 cm diameter corer is more appropriate.

  4. Multivariate Multi-Objective Allocation in Stratified Random Sampling: A Game Theoretic Approach.

    Science.gov (United States)

    Muhammad, Yousaf Shad; Hussain, Ijaz; Shoukry, Alaa Mohamd

    2016-01-01

    We consider the problem of multivariate multi-objective allocation where no or only limited information on the within-stratum variances is available. Results show that a game theoretic approach (based on weighted goal programming) can be applied to sample size allocation problems. We use a simulation technique to determine the payoff matrix and to solve a minimax game.

  5. Phase microscopy of technical and biological samples through random phase modulation with a diffuser

    DEFF Research Database (Denmark)

    Almoro, Percival; Pedrini, Giancarlo; Gundu, Phanindra Narayan

    2010-01-01

    A technique for phase microscopy using a phase diffuser and a reconstruction algorithm is proposed. A magnified specimen wavefront is projected on the diffuser plane that modulates the wavefront into a speckle field. The speckle patterns at axially displaced planes are sampled and used in an iter...

  6. Dental Students' Perceptions of Digital and Conventional Impression Techniques: A Randomized Controlled Trial.

    Science.gov (United States)

    Zitzmann, Nicola U; Kovaltschuk, Irina; Lenherr, Patrik; Dedem, Philipp; Joda, Tim

    2017-10-01

    The aim of this randomized controlled trial was to analyze inexperienced dental students' perceptions of the difficulty and applicability of digital and conventional implant impressions, and their preferences and performance. Fifty undergraduate dental students at a dental school in Switzerland were randomly divided into two groups (2×25). Group A first took digital impressions in a standardized phantom model and then conventional impressions, while the procedures were reversed for Group B. Participants were asked to complete a VAS questionnaire (0-100) on the level of difficulty and applicability (user/patient-friendliness) of both techniques. They were asked which technique they preferred and perceived to be more efficient. A quotient of "effective scan time per software-recorded time" (TRIOS) was calculated as an objective quality indicator for intraoral optical scanning (IOS). The majority of students perceived IOS as easier than the conventional technique. Most (72%) preferred the digital IOS approach for taking the implant impression over the conventional method (12%) or had no preference (12%). Although total work time was similar for males and females, the TRIOS quotient indicated that male students tended to use their time more efficiently. In this study, dental students with no clinical experience were very capable of acquiring digital skills, indicating that digital impression techniques can be included early in the dental curriculum to help students keep pace with ongoing developments in the computer-assisted technologies used in oral rehabilitation.

  7. Clinical evaluation of an improved cementation technique for implant-supported restorations: a randomized controlled trial.

    Science.gov (United States)

    Canullo, Luigi; Cocchetto, Roberto; Marinotti, Fabio; Oltra, David Peñarrocha; Diago, María Peñarrocha; Loi, Ignazio

    2016-12-01

    Cement remnants have frequently been associated with peri-implantitis. Recently, a shoulderless abutment was proposed, raising some concern about removal of cement excess. The aim was to compare different cementation techniques for implant-supported restorations by assessing the amount of cement remnants in the peri-implant sulcus; an additional aim was to compare the effect of these cementation techniques using two different abutment designs. Forty-six patients requiring double implant-supported restorations in the posterior maxilla were randomly divided into two groups according to the cementation modality: intraoral and extraoral. According to the abutment finishing line, implants in each patient were randomly assigned to a shoulderless or chamfer subgroup. In the intraoral group, crowns were directly seated onto the titanium abutment. In the extraoral group, crowns were first seated onto a resin abutment replica and immediately removed, then cleansed of cement excess and finally seated on the titanium abutment. After cement setting, in both groups, an attempt was made to carefully remove all cement excess. Three months later, framework/abutment complexes were disconnected and prepared for microscopic analysis: the surface occupied by exposed cement remnants and marginal gaps were measured. Additionally, crown/abutment complexes were ground, and voids of cement were measured at the abutment/crown interface. Related-samples Friedman's two-way analysis of variance by ranks was used to detect differences between groups and subgroups (P ≤ 0.05). At the end of the study, mean values of 0.45 mm² (±0.80), 0.38 mm² (±0.84), 0.065 mm² (±0.13) and 0.07 mm² (±0.15) described the surface occupied by cement remnants for the shoulderless and chamfer abutments with intraoral cementation and the shoulderless and chamfer abutments with extraoral cementation, respectively. Mean values of 0.40 mm² (±0.377), 0.41 mm² (±0.39), 0.485 mm² (±0.47) and 0.477 mm² (±0.43) described cement voids at the

  8. A systematic review of randomized trials evaluating regional techniques for postthoracotomy analgesia

    DEFF Research Database (Denmark)

    Joshi, G.P.; Bonnet, F.; Shah, R.

    2008-01-01

    of the evidence is needed to assess the comparative benefits of alternative techniques, guide clinical practice and identify areas requiring further research. METHODS: In this systematic review of randomized trials we evaluated thoracic epidural, paravertebral, intrathecal, intercostal, and interpleural analgesic techniques, compared to each other and to systemic opioid analgesia, in adult thoracotomy. Postoperative pain, analgesic use, and complications were analyzed. RESULTS: Continuous paravertebral block was as effective as thoracic epidural analgesia with local anesthetic (LA) but was associated with a reduced incidence of hypotension. Paravertebral block reduced the incidence of pulmonary complications compared with systemic analgesia, whereas thoracic epidural analgesia did not. Thoracic epidural analgesia was superior to intrathecal and intercostal techniques, although these were superior to systemic analgesia...

  9. Comparison of Logistic Regression and Random Forests techniques for shallow landslide susceptibility assessment in Giampilieri (NE Sicily, Italy)

    Science.gov (United States)

    Trigila, Alessandro; Iadanza, Carla; Esposito, Carlo; Scarascia-Mugnozza, Gabriele

    2015-11-01

    The aim of this work is to define reliable susceptibility models for shallow landslides using Logistic Regression and Random Forests multivariate statistical techniques. The study area, located in North-East Sicily, was hit on October 1st 2009 by a severe rainstorm (225 mm of cumulative rainfall in 7 h) which caused flash floods and more than 1000 landslides. Several small villages, such as Giampilieri, were hit, with 31 fatalities, 6 missing persons and damage to buildings and transportation infrastructure. The landslides, mainly earth and debris translational slides evolving into debris flows, were triggered on steep slopes and involved colluvium and regolith materials covering the underlying metamorphic bedrock. The work was carried out in the following steps: i) realization of a detailed event landslide inventory map through field surveys coupled with observation of high-resolution aerial colour orthophotos; ii) identification of landslide source areas; iii) data preparation of landslide controlling factors and descriptive statistics based on a bivariate method (Frequency Ratio) to get an initial overview of existing relationships between causative factors and shallow landslide source areas; iv) choice of criteria for the selection and sizing of the mapping unit; v) implementation of 5 multivariate statistical susceptibility models based on Logistic Regression and Random Forests techniques and focused on landslide source areas; vi) evaluation of the influence of sample size and type of sampling on the results and performance of the models; vii) evaluation of the predictive capabilities of the models using ROC curves, AUC and contingency tables; viii) comparison of model results and the obtained susceptibility maps; and ix) analysis of the temporal variation of landslide susceptibility related to input parameter changes. Models based on Logistic Regression and Random Forests demonstrated excellent predictive capabilities. Land use and wildfire

  10. Review of online coupling of sample preparation techniques with liquid chromatography.

    Science.gov (United States)

    Pan, Jialiang; Zhang, Chengjiang; Zhang, Zhuomin; Li, Gongke

    2014-03-07

    Sample preparation is still considered the bottleneck of the whole analytical procedure, and efforts have been directed towards automation, improvement of sensitivity and accuracy, and low consumption of organic solvents. Development of online sample preparation (SP) techniques coupled with liquid chromatography (LC) is a promising way to achieve these goals, and has attracted great attention. This article reviews recent advances in online SP-LC techniques. Various online SP techniques are described and summarized, including solid-phase-based extraction, liquid-phase-based extraction assisted with membranes, microwave assisted extraction, ultrasonic assisted extraction, accelerated solvent extraction and supercritical fluid extraction. In particular, the coupling approaches of online SP-LC systems and the corresponding interfaces, such as online injectors, autosamplers combined with transport units, desorption chambers and column switching, are discussed and reviewed in detail. Typical applications of online SP-LC techniques are summarized. Finally, the problems and expected trends in this field are discussed in order to encourage further development of online SP-LC techniques. Copyright © 2014 Elsevier B.V. All rights reserved.

  11. Estimating the Size of a Large Network and its Communities from a Random Sample.

    Science.gov (United States)

    Chen, Lin; Karbasi, Amin; Crawford, Forrest W

    2016-01-01

    Most real-world networks are too large to be measured or studied directly and there is substantial interest in estimating global network properties from smaller sub-samples. One of the most important global properties is the number of vertices/nodes in the network. Estimating the number of vertices in a large network is a major challenge in computer science, epidemiology, demography, and intelligence analysis. In this paper we consider a population random graph G = (V, E) from the stochastic block model (SBM) with K communities/blocks. A sample is obtained by randomly choosing a subset W ⊆ V and letting G(W) be the induced subgraph in G of the vertices in W. In addition to G(W), we observe the total degree of each sampled vertex and its block membership. Given this partial information, we propose an efficient PopULation Size Estimation algorithm, called PULSE, that accurately estimates the size of the whole population as well as the size of each community. To support our theoretical analysis, we perform an exhaustive set of experiments to study the effects of sample size, K, and SBM model parameters on the accuracy of the estimates. The experimental results also demonstrate that PULSE significantly outperforms a widely-used method called the network scale-up estimator in a wide variety of scenarios.
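The core idea, that a uniformly sampled vertex reveals what fraction of its neighbors fell inside the sample, already supports a simple method-of-moments estimate of N. The sketch below is a simplification of that idea, not the PULSE algorithm itself, run on an invented Erdős-Rényi ground truth:

```python
import random

random.seed(1)

# Hypothetical ground truth: an Erdos-Renyi graph G(N, p)
N, p = 1500, 0.01
adj = [set() for _ in range(N)]
for i in range(N):
    for j in range(i + 1, N):
        if random.random() < p:
            adj[i].add(j)
            adj[j].add(i)

# Sample n vertices uniformly; for each sampled vertex we observe its
# TOTAL degree and its degree restricted to the induced subgraph.
n = 200
W = set(random.sample(range(N), n))
deg_total = sum(len(adj[v]) for v in W)
deg_in = sum(len(adj[v] & W) for v in W)

# Method-of-moments estimate: E[deg_in] = deg_total * (n-1)/(N-1),
# so invert for N. (A simplification of the idea behind PULSE.)
N_hat = 1 + (n - 1) * deg_total / deg_in
print(round(N_hat))
```

On this toy instance the estimate lands close to the true N = 1500; PULSE additionally exploits block memberships to estimate each community's size.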

  12. Direct sampling technique of bees on Vriesea philippocoburgii (Bromeliaceae, Tillandsioideae) flowers

    Directory of Open Access Journals (Sweden)

    Afonso Inácio Orth

    2004-11-01

    Full Text Available In our study of Vriesea philippocoburgii Wawra pollination, due to the small proportion of flowers in anthesis on a single day and the damage caused to inflorescences when netting directly on flowers, we used the direct sampling technique (DST) of bees on flowers. This technique was applied to 40 flowering plants and resulted in the capture of 160 specimens, belonging to nine genera of Apoidea and separated into 19 morphospecies. As DST maintains the integrity of flowers for later bee visits, it can enhance survey performance, constituting an alternative methodology for the collection of bees visiting flowering plants.

  13. A novel in-situ sampling and VFA sensor technique for anaerobic systems

    DEFF Research Database (Denmark)

    Pind, Peter Frode; Angelidaki, Irini; Ahring, Birgitte Kiær

    2002-01-01

    that has made it possible to monitor VFA on-line in one of the most difficult media: animal slurry or manure. A novel in-situ filtration technique has made it possible to perform microfiltration inside the reactor system. This filter enables sampling from closed reactor systems without large scale pumping and filtering. Using this filtration technique together with commercially available membrane filters we have constructed a VFA sensor system that can perform automatic analysis on animal slurry at a frequency as high as every 15 minutes. The VFA sensor has been tested for a period of more than 60 days with more...

  14. Sampling methods for rumen microbial counts by Real-Time PCR techniques

    Directory of Open Access Journals (Sweden)

    S. Puppo

    2010-02-01

    Full Text Available Fresh rumen samples were withdrawn from 4 cannulated buffalo females fed a fibrous diet in order to quantify bacteria concentration in the rumen by Real-Time PCR techniques. To obtain DNA of a good quality from whole rumen fluid, eight (M1-M8) different pre-filtration methods (cheese cloths, glass-fibre and nylon filters) in combination with various centrifugation speeds (1000, 5000 and 14,000 rpm) were tested. Genomic DNA extraction was performed either on fresh or frozen samples (-20°C). The quantitative bacteria analysis was realized according to the Real-Time PCR procedure for Butyrivibrio fibrisolvens reported in the literature. M5 proved to be the best sampling procedure, allowing a suitable genomic DNA to be obtained. No differences were revealed between fresh and frozen samples.

  15. Application of digital sampling techniques to particle identification in scintillation detectors

    CERN Document Server

    Bardelli, L; Poggi, G; Taccetti, N

    2002-01-01

    In this paper, the use of a fast digitizing system for identification of fast charged particles with scintillation detectors is discussed. The three-layer phoswich detectors developed in the framework of the FIASCO experiment for the detection of light charged particles (LCP) and intermediate mass fragments (IMF) emitted in heavy-ion collisions at Fermi energies are briefly discussed. The standard analog electronics treatment of the signals for particle identification is illustrated. After a description of the digitizer designed to perform a fast digital sampling of the phoswich signals, the feasibility of particle identification on the sampled data is demonstrated. The results obtained with two different pulse shape discrimination analyses based on the digitally sampled data are compared with the standard analog signal treatment. The obtained results suggest, for the present application, the replacement of the analog methods with the digital sampling technique.
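A common pulse shape discrimination analysis on digitally sampled data is the charge-comparison (tail-to-total) method; the abstract does not specify which analyses were used, so the sketch below only illustrates the generic technique on invented exponential pulse shapes:

```python
import math

# Toy sampled pulses (assumed shapes): fast and slow scintillation decay
# constants, in units of the sampling period.
def pulse(tau, n=200, dt=1.0):
    return [math.exp(-i * dt / tau) for i in range(n)]

def tail_to_total(samples, gate=30):
    """Charge-comparison PSD: fraction of the integrated charge
    arriving after the short gate."""
    return sum(samples[gate:]) / sum(samples)

fast, slow = pulse(10.0), pulse(50.0)
# A slower decay component leaves a larger fraction of charge in the tail,
# which is what separates particle species in phoswich detectors.
print(tail_to_total(fast), tail_to_total(slow))
```

Plotting the tail-to-total ratio against total charge for many sampled events yields the familiar PSD branches used for particle identification.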

  16. Nicotine therapy sampling to induce quit attempts among smokers unmotivated to quit: a randomized clinical trial.

    Science.gov (United States)

    Carpenter, Matthew J; Hughes, John R; Gray, Kevin M; Wahlquist, Amy E; Saladin, Michael E; Alberg, Anthony J

    2011-11-28

    Rates of smoking cessation have not changed in a decade, accentuating the need for novel approaches to prompt quit attempts. Within a nationwide randomized clinical trial (N = 849) to induce further quit attempts and cessation, smokers currently unmotivated to quit were randomized to a practice quit attempt (PQA) alone or to nicotine replacement therapy (hereafter referred to as nicotine therapy) sampling within the context of a PQA. Following a 6-week intervention period, participants were followed up for 6 months to assess outcomes. The PQA intervention was designed to increase motivation, confidence, and coping skills. The combination of a PQA plus nicotine therapy sampling added samples of nicotine lozenges to enhance attitudes toward pharmacotherapy and to promote the use of additional cessation resources. Primary outcomes included the incidence of any ever occurring self-defined quit attempt and 24-hour quit attempt. Secondary measures included 7-day point prevalence abstinence at any time during the study (ie, floating abstinence) and at the final follow-up assessment. Compared with the PQA intervention, nicotine therapy sampling was associated with a significantly higher incidence of any quit attempt (49% vs 40%; relative risk [RR], 1.2; 95% CI, 1.1-1.4) and any 24-hour quit attempt (43% vs 34%; 1.3; 1.1-1.5). Nicotine therapy sampling was marginally more likely to promote floating abstinence (19% vs 15%; RR, 1.3; 95% CI, 1.0-1.7); 6-month point prevalence abstinence rates were no different between groups (16% vs 14%; 1.2; 0.9-1.6). Nicotine therapy sampling during a PQA represents a novel strategy to motivate smokers to make a quit attempt. clinicaltrials.gov Identifier: NCT00706979.
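The reported relative risks can be approximately reconstructed from the abstract's proportions. The 2 × 2 counts below are assumptions (arm sizes are taken as roughly equal halves of N = 849; the abstract does not give exact counts), using the standard Katz log method for the confidence interval:

```python
from math import exp, log, sqrt

# Reconstructed counts (ASSUMED, not reported): ~equal arms of N = 849.
a, n1 = 208, 424   # any quit attempt, nicotine-therapy-sampling arm (~49%)
c, n2 = 170, 425   # any quit attempt, PQA-only arm (40%)

rr = (a / n1) / (c / n2)
se = sqrt(1 / a - 1 / n1 + 1 / c - 1 / n2)   # Katz log-RR standard error
lo_ci = exp(log(rr) - 1.96 * se)
hi_ci = exp(log(rr) + 1.96 * se)
print(round(rr, 2), round(lo_ci, 2), round(hi_ci, 2))
```

Under these assumed counts the result reproduces the abstract's RR 1.2 (95% CI roughly 1.1-1.4) to rounding.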

  17. Fast patient-specific Monte Carlo brachytherapy dose calculations via the correlated sampling variance reduction technique

    Energy Technology Data Exchange (ETDEWEB)

    Sampson, Andrew; Le Yi; Williamson, Jeffrey F. [Department of Radiation Oncology, Virginia Commonwealth University, Richmond, Virginia 23298 (United States)

    2012-02-15

    Purpose: To demonstrate the potential of correlated sampling Monte Carlo (CMC) simulation to improve the calculation efficiency for permanent seed brachytherapy (PSB) implants without loss of accuracy. Methods: CMC was implemented within an in-house MC code family (PTRAN) and used to compute 3D dose distributions for two patient cases: a clinical PSB postimplant prostate CT imaging study and a simulated post lumpectomy breast PSB implant planned on a screening dedicated breast cone-beam CT patient exam. CMC tallies the dose difference, ΔD, between highly correlated histories in homogeneous and heterogeneous geometries. The heterogeneous geometry histories were derived from photon collisions sampled in a geometrically identical but purely homogeneous medium geometry, by altering their particle weights to correct for bias. The prostate case consisted of 78 Model-6711 ¹²⁵I seeds. The breast case consisted of 87 Model-200 ¹⁰³Pd seeds embedded around a simulated lumpectomy cavity. Systematic and random errors in CMC were unfolded using low-uncertainty uncorrelated MC (UMC) as the benchmark. CMC efficiency gains, relative to UMC, were computed for all voxels, and the mean was classified in regions that received minimum doses greater than 20%, 50%, and 90% of D₉₀, as well as for various anatomical regions. Results: Systematic errors in CMC relative to UMC were less than 0.6% for 99% of the voxels and 0.04% for 100% of the voxels for the prostate and breast cases, respectively. For a 1 × 1 × 1 mm³ dose grid, efficiency gains were realized in all structures with 38.1- and 59.8-fold average gains within the prostate and breast clinical target volumes (CTVs), respectively. Greater than 99% of the voxels within the prostate and breast CTVs experienced an efficiency gain. Additionally, it was shown that efficiency losses were confined to low dose regions while the largest gains were located where little difference exists between the homogeneous and
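The variance-reduction mechanism is that correlated histories make the two dose estimates' errors cancel in the difference ΔD. A toy illustration with invented attenuation coefficients (one-dimensional slab transmission, nothing like the actual PTRAN implementation):

```python
import math
import random

random.seed(0)

# Toy transport: transmission through a slab of thickness t is P(L > t)
# with path length L = -ln(U)/mu, i.e. exp(-mu * t).
t = 1.0
mu_hom, mu_het = 1.0, 1.1   # hypothetical homogeneous vs perturbed medium

def transmitted(u, mu):
    return 1.0 if -math.log(u) / mu > t else 0.0

M, n = 400, 1000            # M replicate estimates of the dose difference
diffs_corr, diffs_indep = [], []
for _ in range(M):
    us = [random.random() for _ in range(n)]
    # Correlated: reuse the SAME histories for both media
    d_corr = sum(transmitted(u, mu_het) - transmitted(u, mu_hom) for u in us) / n
    # Independent: fresh histories for the second medium
    vs = [random.random() for _ in range(n)]
    d_ind = (sum(transmitted(u, mu_het) for u in us) / n
             - sum(transmitted(v, mu_hom) for v in vs) / n)
    diffs_corr.append(d_corr)
    diffs_indep.append(d_ind)

def var(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

print(var(diffs_indep) / var(diffs_corr))  # efficiency gain of correlation
```

Because the same random numbers drive both geometries, only histories near the decision boundary contribute to the variance of ΔD, giving an order-of-magnitude gain even in this toy.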

  18. A smart rotary technique versus conventional pulpectomy for primary teeth: A randomized controlled clinical study.

    Science.gov (United States)

    Mokhtari, Negar; Shirazi, Alireza-Sarraf; Ebrahimi, Masoumeh

    2017-11-01

    Techniques with adequate accuracy of working length determination, along with shorter duration of treatment, seem essential for pulpectomy procedures in pediatric dentistry. The aim of the present study was to evaluate the accuracy of root canal length measurement with the Root ZX II apex locator and a rotary system in pulpectomy of primary teeth. In this randomized controlled clinical trial, complete pulpectomy was performed on 80 mandibular primary molars in 80 children aged 4-6 years. The study population was randomly divided into case and control groups. In the control group conventional pulpectomy was performed, and in the case group working length was determined by the electronic apex locator Root ZX II and canals were instrumented with Mtwo rotary files. Statistical evaluation was performed using Mann-Whitney and Chi-Square tests; instrumentation time was significantly shorter with rotary files (P=0.000). Considering the comparable accuracy of root canal length determination and the considerably shorter instrumentation time, the Root ZX II apex locator and rotary system may be suggested for pulpectomy in primary molar teeth. Key words: Rotary technique, conventional technique, pulpectomy, primary teeth.

  19. Effect of manual therapy techniques on headache disability in patients with tension-type headache. Randomized controlled trial.

    Science.gov (United States)

    Espí-López, G V; Rodríguez-Blanco, C; Oliva-Pascual-Vaca, A; Benítez-Martínez, J C; Lluch, E; Falla, D

    2014-12-01

    Tension-type headache (TTH) is the most common type of primary headache; however, there is no clear evidence as to which specific treatment is most effective or whether combined treatment is more effective than individual treatments. To assess the effectiveness of manual therapy techniques, applied to the suboccipital region, on aspects of disability in a sample of patients with tension-type headache. Randomized Controlled Trial. Specialized centre for headache treatment. Seventy-six (62 women) patients (age: 39.9 ± 10.9 years) with episodic chronic TTH. Patients were randomly divided into four treatment groups: 1) suboccipital soft tissue inhibition; 2) occiput-atlas-axis manipulation; 3) combined treatment of both techniques; 4) control. Four sessions were applied over 4 weeks and disability was assessed before and after treatment using the Headache Disability Inventory (HDI). Headache frequency, severity and the functional and emotional subscales of the questionnaire were assessed. Photophobia, phonophobia and pericranial tenderness were also monitored. Headache frequency was significantly reduced with the manipulative and combined treatments. Although all manual therapy treatments showed a positive change in headache features, measures of photophobia, phonophobia and pericranial tenderness only improved in the group that received the combined treatment, suggesting that combined treatment is the most appropriate for symptomatic relief of TTH.

  20. Location and multi-depot vehicle routing for emergency vehicles using tour coverage and random sampling

    Directory of Open Access Journals (Sweden)

    Alireza Goli

    2015-09-01

    Distribution and optimum allocation of emergency resources are among the most important tasks that need to be accomplished during a crisis. When a natural disaster such as an earthquake or flood takes place, it is necessary to deliver rescue efforts as quickly as possible; therefore, it is important to find the optimum location and distribution of emergency relief resources. When a natural disaster occurs, it is not possible to reach some damaged areas. In this paper, location and multi-depot vehicle routing for emergency vehicles using tour coverage and random sampling is investigated. In this study, there is no need to visit all the places, and some demand points receive their needs from the nearest possible location. The proposed method is implemented on randomly generated instances of different sizes. The preliminary results indicate that the proposed method is capable of reaching desirable solutions in a reasonable amount of time.

  1. ESTIMATION OF FINITE POPULATION MEAN USING RANDOM NON–RESPONSE IN SURVEY SAMPLING

    Directory of Open Access Journals (Sweden)

    Housila P. Singh

    2010-12-01

    This paper considers the problem of estimating the population mean under three different situations of random non-response envisaged by Singh et al. (2000). Some ratio- and product-type estimators are proposed and their properties studied under the assumption that the number of sampling units on which information cannot be obtained owing to random non-response follows some distribution. The suggested estimators are compared with the usual ratio and product estimators. An empirical study is carried out to show the performance of the suggested estimators over the usual unbiased, ratio and product estimators. A generalized version of the proposed ratio and product estimators is also given.
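    The ratio-type estimators discussed above build on the classical ratio estimator of a population mean, which exploits a correlated auxiliary variable whose population mean is known. A minimal sketch with hypothetical data (not the paper's non-response estimators, which add further terms for the unobserved units):

    ```python
    # Classical ratio estimator: y_bar * (X_bar / x_bar), where X_bar is the
    # known population mean of the auxiliary variable x.
    import random

    random.seed(3)
    N = 10_000
    x = [random.uniform(10, 50) for _ in range(N)]     # auxiliary variable (known)
    y = [2.0 * xi + random.gauss(0, 5) for xi in x]    # study variable

    X_bar = sum(x) / N                                 # known population mean of x
    true_mean = sum(y) / N

    idx = random.sample(range(N), 200)                 # simple random sample
    y_bar = sum(y[i] for i in idx) / 200
    x_bar = sum(x[i] for i in idx) / 200

    plain_estimate = y_bar                             # usual unbiased estimator
    ratio_estimate = y_bar * (X_bar / x_bar)           # classical ratio estimator
    print(round(true_mean, 2), round(plain_estimate, 2), round(ratio_estimate, 2))
    ```

    When y is roughly proportional to x, the ratio estimator's error is driven by the residual scatter rather than the full variance of y, which is why it typically improves on the plain sample mean.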

  2. Chi-Squared Test of Fit and Sample Size-A Comparison between a Random Sample Approach and a Chi-Square Value Adjustment Method.

    Science.gov (United States)

    Bergh, Daniel

    2015-01-01

    Chi-square statistics are commonly used for tests of fit of measurement models. Chi-square is also sensitive to sample size, which is why several approaches for handling large samples in test-of-fit analysis have been developed. One strategy to handle the sample size problem is to adjust the sample size in the analysis of fit; an alternative is to adopt a random sample approach. The purpose of this study was to analyze and compare these two strategies using simulated data. Given an original sample size of 21,000, for reductions of sample size down to the order of 5,000 the adjusted sample size function works as well as the random sample approach. In contrast, when applying adjustments to sample sizes of lower order, the adjustment function is less effective at approximating the chi-square value for an actual random sample of the relevant size. Hence, fit is exaggerated and misfit underestimated by the adjusted sample size function. Although there are large differences in chi-square values between the two approaches at lower sample sizes, the inferences based on the p-values may be the same.
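    The two strategies can be illustrated with a toy goodness-of-fit test; the data, category probabilities, and sizes below are invented for illustration (the chi-square statistic scales roughly linearly with n, which is what the adjustment strategy exploits):

    ```python
    # Compare: rescaling the full-sample chi-square to a target n vs.
    # recomputing the statistic on an actual random subsample of size n.
    import random

    def chi_square(observed, expected):
        return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

    random.seed(1)
    n_full = 21_000
    probs = [0.27, 0.25, 0.24, 0.24]          # slightly non-uniform truth
    data = random.choices(range(4), weights=probs, k=n_full)

    def stat(sample):
        n = len(sample)
        observed = [sample.count(c) for c in range(4)]
        expected = [n / 4] * 4                # test of fit against a uniform model
        return chi_square(observed, expected)

    full = stat(data)
    target = 5_000
    adjusted = full * target / n_full               # sample-size adjustment
    subsample = stat(random.sample(data, target))   # random-sample approach
    print(f"full={full:.1f} adjusted={adjusted:.1f} subsample={subsample:.1f}")
    ```

    The adjusted value is deterministic given the full sample, while the subsample statistic carries its own sampling noise; the study's point is that the two diverge as the target size shrinks.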

  3. Randomized controlled trial on timing and number of sampling for bile aspiration cytology.

    Science.gov (United States)

    Tsuchiya, Tomonori; Yokoyama, Yukihiro; Ebata, Tomoki; Igami, Tsuyoshi; Sugawara, Gen; Kato, Katsuyuki; Shimoyama, Yoshie; Nagino, Masato

    2014-06-01

    The issue of the timing and number of bile samplings for exfoliative bile cytology is still unsettled. A total of 100 patients with cholangiocarcinoma undergoing resection after external biliary drainage were randomized into two groups: a 2-day group, where bile was sampled five times per day for 2 days; and a 10-day group, where bile was sampled once per day for 10 days (registered University Hospital Medical Information Network/ID 000005983). The outcome of 87 patients who underwent laparotomy was analyzed, 44 in the 2-day group and 43 in the 10-day group. There were no significant differences in patient characteristics between the two groups. Positivity after one sampling session was significantly lower in the 2-day group than in the 10-day group (17.0 ± 3.7% vs. 20.7 ± 3.5%, P = 0.034). However, the cumulative positivity curves were similar and overlapped each other between the groups. The final cumulative positivity by the 10th sampling session was 52.3% in the 2-day group and 51.2% in the 10-day group. We observed a small increase in cumulative positivity after the 5th or 6th session in both groups. Bile cytology positivity is unlikely to be affected by sampling time. © 2013 Japanese Society of Hepato-Biliary-Pancreatic Surgery.

  4. Demonstration of Novel Sampling Techniques for Measurement of Turbine Engine Volatile and Non-Volatile Particulate Matter (PM) Emissions

    Science.gov (United States)

    2017-03-06

    WP-201317: Final Performance Report, 1 January 2013 – 1 January 2016. ... regulations. Evidently, it is imperative that accurate and reliable aircraft turbine engine measurement techniques are developed for total (volatile

  5. Variation of surface water spectral response as a function of in situ sampling technique

    Science.gov (United States)

    Davis, Bruce A.; Hodgson, Michael E.

    1988-01-01

    Tests were carried out to determine the spectral variation contributed by a particular sampling technique. A portable radiometer was used to measure the surface water spectral response. Variation due to the reflectance of objects near the radiometer (i.e., the boat side) during data acquisition was studied. Consideration was also given to the variation due to the temporal nature of the phenomena (i.e., wave activity).

  6. Estimating the Size of a Large Network and its Communities from a Random Sample

    CERN Document Server

    Chen, Lin; Crawford, Forrest W

    2016-01-01

    Most real-world networks are too large to be measured or studied directly and there is substantial interest in estimating global network properties from smaller sub-samples. One of the most important global properties is the number of vertices/nodes in the network. Estimating the number of vertices in a large network is a major challenge in computer science, epidemiology, demography, and intelligence analysis. In this paper we consider a population random graph G = (V, E) from the stochastic block model (SBM) with K communities/blocks. A sample is obtained by randomly choosing a subset W and letting G(W) be the induced subgraph in G of the vertices in W. In addition to G(W), we observe the total degree of each sampled vertex and its block membership. Given this partial information, we propose an efficient PopULation Size Estimation algorithm, called PULSE, that correctly estimates the size of the whole population as well as the size of each community. To support our theoretical analysis, we perform an exhausti...

  7. Prevalence and correlates of problematic smartphone use in a large random sample of Chinese undergraduates.

    Science.gov (United States)

    Long, Jiang; Liu, Tie-Qiao; Liao, Yan-Hui; Qi, Chang; He, Hao-Yu; Chen, Shu-Bao; Billieux, Joël

    2016-11-17

    Smartphones are becoming a daily necessity for most undergraduates in Mainland China. Because the present scenario of problematic smartphone use (PSU) is largely unexplored, in the current study we aimed to estimate the prevalence of PSU and to screen suitable predictors for PSU among Chinese undergraduates in the framework of the stress-coping theory. A sample of 1062 undergraduate smartphone users was recruited by means of the stratified cluster random sampling strategy between April and May 2015. The Problematic Cellular Phone Use Questionnaire was used to identify PSU. We evaluated five candidate risk factors for PSU by using logistic regression analysis while controlling for demographic characteristics and specific features of smartphone use. The prevalence of PSU among Chinese undergraduates was estimated to be 21.3%. The risk factors for PSU were majoring in the humanities, high monthly income from the family (≥1500 RMB), serious emotional symptoms, high perceived stress, and perfectionism-related factors (high doubts about actions, high parental expectations). PSU among undergraduates appears to be ubiquitous and thus constitutes a public health issue in Mainland China. Although further longitudinal studies are required to test whether PSU is a transient phenomenon or a chronic and progressive condition, our study successfully identified socio-demographic and psychological risk factors for PSU. These results, obtained from a random and thus representative sample of undergraduates, open up new avenues in terms of prevention and regulation policies.

  8. New technique to take samples from environmental surfaces using flocked nylon swabs.

    Science.gov (United States)

    Hedin, G; Rynbäck, J; Loré, B

    2010-08-01

    Environmental surfaces near infected and/or colonised patients in hospitals are commonly contaminated with potentially pathogenic micro-organisms. At present, however, there is no standardised method for taking samples from surfaces in order to perform quantitative cultures. Usually contact plates or swabs are used, but these methods may give different results. The recovery rate of traditional swabbing, e.g. with cotton or rayon, is poor. With a new type of swab utilising flocked nylon, recovery may be enhanced up to three times compared with a rayon swab. In this study, we inoculated reference strains of Staphylococcus aureus and Enterococcus hirae onto a bedside table and took samples 1 h later, when the inocula were dry. Sequential samples were taken from the same surface. A new sampling technique using two sequential nylon swabs for each sample was validated. The efficiency of the sampling, percentage recovery of the inoculum and the variation of culture results obtained from repeated experiments are described. Enhanced efficiency and higher recovery of inoculum were demonstrated using two sequential flocked nylon swabs for sampling. Copyright 2010 The Hospital Infection Society. Published by Elsevier Ltd. All rights reserved.

  9. Randomized controlled trial of the Alexander technique for idiopathic Parkinson's disease.

    Science.gov (United States)

    Stallibrass, C; Sissons, P; Chalmers, C

    2002-11-01

    To determine whether the Alexander Technique, alongside normal treatment, is of benefit to people disabled by idiopathic Parkinson's disease. A randomized controlled trial with three groups, one receiving lessons in the Alexander Technique, another receiving massage and one with no additional intervention. Measures were taken pre- and post-intervention, and at follow-up, six months later. The Polyclinic at the University of Westminster, Central London. Ninety-three people with clinically confirmed idiopathic Parkinson's disease. The Alexander Technique group received 24 lessons in the Alexander Technique and the massage group received 24 sessions of massage. The main outcome measures were the Self-assessment Parkinson's Disease Disability Scale (SPDDS) at best and at worst times of day. Secondary measures included the Beck Depression Inventory and an Attitudes to Self Scale. The Alexander Technique group improved compared with the no additional intervention group, pre-intervention to post-intervention, both on the SPDDS at best, p = 0.04 (confidence interval (CI) -6.4 to 0.0) and on the SPDDS at worst, p = 0.01 (CI -11.5 to -1.8). The comparative improvement was maintained at six-month follow-up: on the SPDDS at best, p = 0.04 (CI -7.7 to 0.0) and on the SPDDS at worst, p = 0.01 (CI -11.8 to -0.9). The Alexander Technique group was comparatively less depressed post-intervention, p = 0.03 (CI -3.8 to 0.0) on the Beck Depression Inventory, and at six-month follow-up had improved on the Attitudes to Self Scale, p = 0.04 (CI -13.9 to 0.0). There is evidence that lessons in the Alexander Technique are likely to lead to sustained benefit for people with Parkinson's disease.

  10. A randomized comparison of cold snare polypectomy versus a suction pseudopolyp technique.

    Science.gov (United States)

    Din, Said; Ball, Alex J; Riley, Stuart A; Kitsanta, Panagiota; Johal, Shawinder

    2015-11-01

    Cold snare techniques are widely used for removal of diminutive and small colorectal polyps. The influence of resection technique on the effectiveness of polypectomy is unknown. We therefore compared standard cold snare polypectomy with a newly described suction pseudopolyp technique, for completeness of excision and for complications. In this single-center study, 112 patients were randomized to cold snare polypectomy or the suction pseudopolyp technique. Primary outcome was endoscopic completeness of excision. Consensus regarding the endoscopic assessment of completeness of excision was standardized and aided by chromoendoscopy. Secondary outcomes included: completeness of histological excision, polyp "fly away" and retrieval rates, early bleeding (48 hours), delayed bleeding (2 weeks), and perforation. 148 polyps were removed, with size range 3 - 7 mm, 60 % in the left colon, and 90 % being sessile. Regarding completeness of excision (with uncertain findings omitted): endoscopically, this was higher with the suction pseudopolyp technique compared with cold snare polypectomy but not statistically significantly so (73/74 [98.6 %] vs. 63/68 [92.6 %]; P = 0.08). A trend towards a higher complete histological excision rate with the suction pseudopolyp technique was also not statistically significant (45/59 [76.3 %] vs. 37/58 [63.8 %]; P = 0.14). Polyp retrieval rate was not significantly different (suction 68/76 [89.5 %] vs. cold snare 64/72 [88.9 %]; P = 0.91). No perforation or bleeding requiring hemostasis occurred in either group.  In this study both polypectomy techniques were found to be safe and highly effective, but further large multicenter trials are required.Clinical trial registration at www.clinicaltrials.gov: NCT02208401. © Georg Thieme Verlag KG Stuttgart · New York.

  11. Detection of equine herpesvirus in horses with idiopathic keratoconjunctivitis and comparison of three sampling techniques.

    Science.gov (United States)

    Hollingsworth, Steven R; Pusterla, Nicola; Kass, Philip H; Good, Kathryn L; Brault, Stephanie A; Maggs, David J

    2015-09-01

    To determine the role of equine herpesvirus (EHV) in idiopathic keratoconjunctivitis in horses and to determine whether sample collection method affects detection of EHV DNA by quantitative polymerase chain reaction (qPCR). Twelve horses with idiopathic keratoconjunctivitis and six horses without signs of ophthalmic disease. Conjunctival swabs, corneal scrapings, and conjunctival biopsies were collected from 18 horses: 12 clinical cases with idiopathic keratoconjunctivitis and six euthanized controls. In horses with both eyes involved, the samples were taken from the eye judged to be more severely affected. Samples were tested with qPCR for EHV-1, EHV-2, EHV-4, and EHV-5 DNA. Quantity of EHV DNA and viral replicative activity were compared between the two populations and among the different sampling techniques; relative sensitivities of the sampling techniques were determined. Prevalence of EHV DNA as assessed by qPCR did not differ significantly between control horses and those with idiopathic keratoconjunctivitis. Sampling by conjunctival swab was more likely to yield viral DNA as assessed by qPCR than was conjunctival biopsy. EHV-1 and EHV-4 DNA were not detected in either normal or IKC-affected horses; EHV-2 DNA was detected in two of 12 affected horses but not in normal horses. EHV-5 DNA was commonly found in ophthalmically normal horses and horses with idiopathic keratoconjunctivitis. Because EHV-5 DNA was commonly found in control horses and in horses with idiopathic keratoconjunctivitis, qPCR was not useful for the etiological diagnosis of equine keratoconjunctivitis. Conjunctival swabs were significantly better at obtaining viral DNA samples than conjunctival biopsy in horses in which EHV-5 DNA was found. © 2015 American College of Veterinary Ophthalmologists.

  12. Random-Access Technique for Self-Organization of 5G Millimeter-Wave Cellular Communications

    Directory of Open Access Journals (Sweden)

    Jasper Meynard Arana

    2016-01-01

    The random-access (RA) technique is a key procedure in cellular networks and self-organizing networks (SONs), but the overall processing time of this technique in millimeter-wave (mm-wave) cellular systems with directional beams is very long because RA preambles (RAPs) must be transmitted in all directions of the Tx and Rx beams. In this paper, two different types of preambles (RAP-1 and RAP-2) are proposed to reduce the processing time in the RA stage. After analyzing the correlation property, false-alarm probability, and detection probability of the proposed RAPs, we perform simulations to show that RAP-2 is suitable for RA in mm-wave cellular systems with directional beams because of its smaller processing time and high detection probability in multiuser environments.

  13. Comparison of sampling techniques for parallel analysis of transcript and metabolite levels in Saccharomyces cerevisiae.

    Science.gov (United States)

    Martins, Ana Margarida; Sha, Wei; Evans, Clive; Martino-Catt, Susan; Mendes, Pedro; Shulaev, Vladimir

    2007-03-01

    Mathematical modelling of cellular processes is crucial for the understanding of the cell or organism as a whole. Genome-wide observations, at the levels of the transcriptome, proteome and metabolome, provide a high coverage of the molecular constituents of the system in study. Time-course experiments are important for gaining insight into a system's dynamics and are needed for mathematical modelling. In time-course experiments it is crucial to use efficient and fast sampling techniques. We evaluated several techniques to sample and process yeast cultures for parallel analysis of the transcriptome and metabolome. The evaluation was made by measuring the quality of the RNA obtained with UV-spectroscopy, capillary electrophoresis and microarray hybridization. The protocol developed involves rapid collection by spraying the sample into -40 degrees C tricine-buffered methanol (as previously described for yeast metabolome analysis), followed by the separation of cells from the culture medium in low-temperature rapid centrifugation. Removal of the residual methanol is carried out by freeze-drying the pellet at -35 degrees C. RNA and metabolites can then be extracted from the same freeze-dried sample obtained with this procedure.

  14. Experimental improvements in sample preparation for the track registration technique from dry and solution media

    Energy Technology Data Exchange (ETDEWEB)

    Suarez-Navarro, M.J. [Universidad Politecnica de Madrid (UPM), E.T.S.I de Caminos, Canales y Puertos, Profesor Aranguren s/n, 28040 Madrid (Spain)]. E-mail: he04@caminos.upm.es; Pujol, Ll. [Centro de Estudios y Experimentacion de Obras Publicas (CEDEX), Alfonso XII, 3, 28014 Madrid (Spain); Gonzalez-Gonzalez, J.A. [Universidad Politecnica de Madrid (UPM), E.T.S.I de Caminos, Canales y Puertos, Profesor Aranguren s/n, 28040 Madrid (Spain)

    2006-04-15

    This paper describes the sample preparation studies carried out to determine gross alpha activities in waste materials by means of alpha-particle track counting using a CR-39 detector. Sample preparation for the track registration technique using evaporation or electroplating methods (also known as conventional 'dry methods') has a number of drawbacks. The distribution of tracks in different areas of the detector surface is non-uniform, so accurate quantitative determinations depend on tedious and time-consuming counting of tracks under an optical microscope. In this paper, we propose the use of tensioactives in sample preparation to achieve uniform track distribution over the entire detector surface, which enables track density to be evaluated by scanning a small representative area. Under our counting conditions, uniform distribution was achieved with 0.2 ml of Teg from a planchetted source. Furthermore, track registration techniques using solution media (also known as 'wet methods') and conventional 'dry methods' were analysed and compared with the proposed method. The reproducibility of the procedure described in the study was tested by analysing gross alpha activity in two low-level nuclear waste samples at two different laboratories.

  15. ON THE SAMPLING OF SERIAL SECTIONING TECHNIQUE FOR THREE DIMENSIONAL SPACE-FILLING GRAIN STRUCTURES

    Directory of Open Access Journals (Sweden)

    Guoquan Liu

    2011-05-01

    The serial sectioning technique provides a wealth of quantitative geometric information about the microstructure analyzed, including information unavailable from stereology with one- and two-dimensional probes. This may be why it has long served, and continues to serve, as one of the most common and invaluable methods for studying the size and size distribution, the topology and distribution of topological parameters, and even the shape of three-dimensional space-filling grains or cells. On the other hand, requiring tedious lab work, the method is also very time and energy consuming; in almost all reported practice, fewer than one hundred grains per sample were sampled and measured. Thus, a question is often asked: for typical microstructures in engineering materials, is sampling so few grains or cells adequate to obtain reliable results from this technique? To answer this question, experimental data on 1292 contiguous austenite grains in a low-carbon steel specimen obtained from serial sectioning analysis are presented in this paper, demonstrating the effect of sampling on the measurement of various parameters of the grain size distribution and of the grain topology distribution. The result provides a rule of thumb for grain stereology of similar microstructures.

  16. Evaluation of Primary Immunization Coverage of Infants Under Universal Immunization Programme in an Urban Area of Bangalore City Using Cluster Sampling and Lot Quality Assurance Sampling Techniques

    Science.gov (United States)

    K, Punith; K, Lalitha; G, Suman; BS, Pradeep; Kumar K, Jayanth

    2008-01-01

    Research Question: Is the LQAS technique better than the cluster sampling technique in terms of resources required to evaluate immunization coverage in an urban area? Objective: To assess and compare lot quality assurance sampling against cluster sampling in the evaluation of primary immunization coverage. Study Design: Population-based cross-sectional study. Study Setting: Areas under Mathikere Urban Health Center. Study Subjects: Children aged 12 months to 23 months. Sample Size: 220 in cluster sampling, 76 in lot quality assurance sampling. Statistical Analysis: Percentages and proportions, Chi-square test. Results: (1) Using cluster sampling, the percentages of completely immunized, partially immunized and unimmunized children were 84.09%, 14.09% and 1.82%, respectively. With lot quality assurance sampling, they were 92.11%, 6.58% and 1.31%, respectively. (2) Immunization coverage levels as evaluated by the cluster sampling technique were not statistically different from the coverage values obtained by the lot quality assurance sampling technique. Considering the time and resources required, lot quality assurance sampling was found to be the better technique for evaluating primary immunization coverage in an urban area. PMID:19876474
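    A generic LQAS decision rule (the abstract reports the lot size of 76 but not the study's decision value, so the thresholds below are illustrative) accepts or rejects a "lot" based on how many unvaccinated children appear in the sample; its operating characteristics follow directly from the binomial distribution:

    ```python
    # LQAS sketch: lot size n matches the abstract; decision value d and the
    # coverage thresholds are illustrative assumptions.
    from math import comb

    def binom_cdf(k, n, p):
        """P(X <= k) for X ~ Binomial(n, p)."""
        return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

    n, d = 76, 12               # sample per lot, max unvaccinated tolerated
    p_good, p_bad = 0.90, 0.70  # acceptable vs. unacceptable coverage levels

    # alpha: chance of rejecting a lot whose true coverage is acceptable.
    alpha = 1 - binom_cdf(d, n, 1 - p_good)
    # beta: chance of accepting a lot whose true coverage is unacceptable.
    beta = binom_cdf(d, n, 1 - p_bad)
    print(f"alpha={alpha:.3f}  beta={beta:.3f}")
    ```

    The appeal noted in the record is that a small fixed lot size with a simple count-based rule needs far fewer interviews than a cluster survey while still bounding both error probabilities.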

  17. Decomposition and (importance) sampling techniques for multi-stage stochastic linear programs

    Energy Technology Data Exchange (ETDEWEB)

    Infanger, G.

    1993-11-01

    The difficulty of solving large-scale multi-stage stochastic linear programs arises from the sheer number of scenarios associated with numerous stochastic parameters. The number of scenarios grows exponentially with the number of stages and problems get easily out of hand even for very moderate numbers of stochastic parameters per stage. Our method combines dual (Benders) decomposition with Monte Carlo sampling techniques. We employ importance sampling to efficiently obtain accurate estimates of both expected future costs and gradients and right-hand sides of cuts. The method enables us to solve practical large-scale problems with many stages and numerous stochastic parameters per stage. We discuss the theory of sharing and adjusting cuts between different scenarios in a stage. We derive probabilistic lower and upper bounds, where we use importance path sampling for the upper bound estimation. Initial numerical results turned out to be promising.
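    The importance-sampling idea at the core of the method can be shown in miniature: estimate an expected scenario cost by drawing from a proposal distribution that over-samples rare, expensive scenarios, then reweighting each draw by p/q (the scenario probabilities and costs below are invented):

    ```python
    import random

    random.seed(0)

    # Ten discrete scenarios: nine rare and expensive, one common and cheap.
    p = [0.01] * 9 + [0.91]          # true scenario probabilities
    cost = [100.0] * 9 + [1.0]       # "future cost" of each scenario
    exact = sum(pi * ci for pi, ci in zip(p, cost))   # = 9.91

    # Uniform proposal: each rare scenario is drawn 10x more often than under p.
    q = [0.1] * 10

    def importance_estimate(n):
        draws = random.choices(range(10), weights=q, k=n)
        # Reweighting by p/q keeps the estimator unbiased for E_p[cost].
        return sum(cost[i] * p[i] / q[i] for i in draws) / n

    est = importance_estimate(10_000)
    print(f"exact={exact:.2f}  estimate={est:.2f}")
    ```

    The biased proposal slashes the estimator's variance relative to naive sampling from p; the paper applies the same principle to estimating expected future costs and the right-hand sides of Benders cuts.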

  18. Total reflection X-ray fluorescence as a fast multielemental technique for human placenta sample analysis

    Science.gov (United States)

    Marguí, E.; Ricketts, P.; Fletcher, H.; Karydas, A. G.; Migliori, A.; Leani, J. J.; Hidalgo, M.; Queralt, I.; Voutchkov, M.

    2017-04-01

    In the present contribution, benchtop total reflection X-ray fluorescence spectrometry (TXRF) has been evaluated as a cost-effective multielemental analytical technique for human placenta analysis. An easy and rapid sample preparation, consisting of suspending 50 mg of sample in 1 mL of a 1% Triton solution in deionized water, proved to be the most suitable for this kind of sample. For comparison purposes, an acidic microwave digestion procedure was also applied. For both sample treatment methodologies, limits of detection for most elements were at the low mg/kg level. Accurate and precise results were obtained using internal standardization as the quantification approach and applying a correction factor to compensate for absorption effects. The correction factor was based on the proportional ratio between the slurry preparation results and those obtained for a set of human placenta samples analysed by microwave acidic digestion and ICP-AES. As a case study, the developed TXRF methodology was applied to the multielemental analysis (K, Ca, Fe, Cu, Zn, As, Se, Br, Rb and Sr) of several healthy women's placenta samples from two regions of Jamaica.

  19. Dried urine spots - A novel sampling technique for comprehensive LC-MSn drug screening.

    Science.gov (United States)

    Michely, Julian A; Meyer, Markus R; Maurer, Hans H

    2017-08-22

    The dried matrix spot (DMS) technique as an alternative sampling strategy, especially dried urine spots (DUS), might be an alternative for drug screening. So far, only particular drugs or drug classes have been covered in DMS screenings. Therefore, a workup of DUS for a broad, comprehensive library-based LC-MSn screening was developed. It consisted of enzymatic on-spot deconjugation followed by liquid extraction and LC-MSn analysis. This workup was compared to established urine precipitation (UP) and validated according to international guidelines for qualitative approaches, using exemplary compounds of several drug classes (antidepressants, benzodiazepines, cardiovascular drugs, neuroleptics, opioids, stimulants, etc.) with a broad range of (physico-)chemical properties and chromatographic behaviors. On-spot conjugate cleavage and liquid extraction were sufficient for most compounds, and the validation results were comparable to those obtained after simple UP. To demonstrate applicability, 103 authentic urine samples, six rat urine samples after low-dose substance administrations, and two proficiency tests for systematic toxicological urinalysis were worked up with the new DUS approach or by UP without or with conjugate cleavage. In the authentic urine samples, 112 different drugs out of 43 categories, plus metabolites, were identified via the LC-MSn library used. With the new DUS approach, 5% fewer positive hits were found compared to the UP approach, and 15% fewer than with the latter after conjugate cleavage. The differences are caused mainly by the smaller urine volumes used for DUS. In the two proficiency tests, all 15 drugs could be detected. Unfortunately, none of the three approaches was able to detect very low-dosed substances in rat urine samples. However, these could be detected using a more sensitive LC-high-resolution-MS/MS approach, showing that the DUS workup is also suitable for them. In conclusion, DUS might be an alternative sampling technique for comprehensive drug

  20. Protein/creatinine ratio on random urine samples for prediction of proteinuria in preeclampsia.

    Science.gov (United States)

    Roudsari, F Vahid; Ayati, S; Ayatollahi, H; Shakeri, M T

    2012-01-01

    To evaluate the protein/creatinine ratio in random urine samples for prediction of proteinuria in preeclampsia. This study was performed on 150 pregnant women who were hospitalized with preeclampsia in Ghaem Hospital during 2006. First, a random urine sample was collected from each patient to determine the protein/creatinine ratio. Then, a 24-hour urine collection was analyzed for the evaluation of proteinuria. Statistical analysis was performed with SPSS software. A total of 150 patients entered the study. There was a significant correlation between the 24-hour urine protein and the protein/creatinine ratio (r = 0.659, P < 0.001). Since measurement of the protein/creatinine ratio is more accurate, reliable, and cost-effective, it can replace measurement of the 24-hour urine protein.

  1. Robustness of double random phase encoding spread-space spread-spectrum image watermarking technique

    Science.gov (United States)

    Liu, Shi; Hennelly, Bryan M.; Sheridan, John T.

    2013-09-01

    In this paper the robustness of a recently proposed image watermarking scheme is investigated, namely the Double Random Phase Encoding spread-space spread-spectrum watermarking (DRPE SS-SS) technique. In the DRPE SS-SS method, the watermark is in the form of a digital barcode image which is numerically encrypted using a simulation of the optical DRPE process. This produces a random complex image, which is then processed to form a real-valued random image with a low number of quantization levels. This signal is added to the host image. Extraction of the barcode involves applying an inverse DRPE process to the watermarked image followed by a low-pass filter. This algorithm is designed to utilize the capability of the DRPE to reversibly spread the energy of the watermarking information in both the space and spatial frequency domains, so that the energy of the watermark in any spatial or spatial frequency bin is very small. The common geometric transformations and signal processing operations are performed using both informed and blind detection for different barcode widths and different quantization levels. The results presented indicate that the DRPE SS-SS method is robust to scaling, JPEG compression distortion, cropping, and low-pass and high-pass filtering. It is also demonstrated that the wider the barcode is, the lower the false positive rate will be.
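    The DRPE step itself is easy to simulate numerically: multiply by a random phase mask in the input plane, Fourier transform, multiply by a second random phase mask, and transform back. A minimal numpy sketch on a toy array (not the full watermarking pipeline):

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    f = rng.random((32, 32))                 # toy input "image"

    # Unit-magnitude random phase masks for the input and Fourier planes.
    phi1 = np.exp(2j * np.pi * rng.random(f.shape))
    phi2 = np.exp(2j * np.pi * rng.random(f.shape))

    # Encryption: phase-mask, Fourier transform, phase-mask, inverse transform.
    encrypted = np.fft.ifft2(np.fft.fft2(f * phi1) * phi2)

    # Decryption reverses each step with the conjugate masks (the keys).
    decrypted = np.fft.ifft2(np.fft.fft2(encrypted) * np.conj(phi2)) * np.conj(phi1)
    print(np.allclose(decrypted.real, f))    # True: lossless with the right keys
    ```

    Because both masks have unit magnitude, the process is exactly invertible, which is the reversible energy-spreading property the watermarking scheme exploits.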

  2. A prospective randomized trial of different laparoscopic gastric banding techniques for morbid obesity.

    Science.gov (United States)

    Weiner, R; Bockhorn, H; Rosenthal, R; Wagner, D

    2001-01-01

    Slippage of the stomach is the most common postoperative complication after laparoscopic adjustable silicone gastric banding (LASGB) for morbid obesity. Retrogastric placement (RGP) of the band through the lesser sac can cause posterior slippage. Incomplete suturing is often responsible for anterior slippage. A randomized prospective study was conducted to determine whether laparoscopic esophagogastric placement (EGP) is associated with a lower incidence of postoperative slippage and pouch dilation than RGP. Morbidly obese patients presenting for LASGB were randomized to undergo either an EGP (n = 50) or an RGP (n = 51). Patients were blinded to which procedure they underwent, and follow-up data were obtained by a blinded independent investigator. Standardized clinical and radiologic controls were used to assess pouch enlargement and slippage. Operating time was similar for the two procedures (54.5 min for EGP vs 58 min for RGP). There was no significant difference in postoperative weight loss (34 kg after EGP vs 37 kg after RGP within 12 months), esophageal dilation, or postoperative quality of life. There were two postoperative slippages and one pouch dilation in the RGP group and no postoperative complications in the EGP group. The placement of a LAP-BAND adjustable gastric banding system by the EGP technique is safe and results in a lower frequency of postoperative complications than its placement by the RGP technique. Clear anatomic landmarks are a benefit to education and to the learning curve for LASGB.

  3. Examining predictors of chemical toxicity in freshwater fish using the random forest technique.

    Science.gov (United States)

    Tuulaikhuu, Baigal-Amar; Guasch, Helena; García-Berthou, Emili

    2017-04-01

    Chemical pollution is one of the main issues globally threatening the enormous biodiversity of freshwater ecosystems. The toxicity of substances depends on many factors such as the chemical itself, the species affected, environmental conditions, exposure duration, and concentration. We used the random forest technique to examine the factors that mediate toxicity in a set of widespread fishes and analyses of covariance to further assess the importance of differential sensitivity among fish species. Among 13 variables, the 5 most important predictors of toxicity with random forests were, in order of importance, the chemical substance itself (i.e., Chemical Abstracts Service number considered as a categorical factor), octanol-water partition coefficient (log P), pollutant prioritization, ecological structure-activity relationship (ECOSAR) classification, and fish species for 50% lethal concentrations (LC50), and the chemical substance, fish species, log P, ECOSAR classification, and water temperature for no observed effect concentrations (NOECs). Fish species was a very important predictor for both endpoints and with the two contrasting statistical techniques used. Different fish species displayed very different relationships with log P, often with different slopes and with as much importance as the partition coefficient. Therefore, caution should be exercised when extrapolating toxicological results or relationships among species. In addition, further research is needed to determine species-specific sensitivities and unravel the mechanisms behind them.
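    As a rough illustration of the approach (not the authors' data or model), a random forest can rank predictor importance for a synthetic toxicity endpoint in which species-specific slopes on log P dominate a weak temperature effect; all variable names and effect sizes below are invented:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(42)
n = 500
# Hypothetical predictors: log P, water temperature, and a species label
log_p = rng.normal(2.0, 1.5, n)
temp = rng.normal(20.0, 5.0, n)
species = rng.integers(0, 5, n)  # integer-encoded categorical factor

# Species-specific slopes on log P, mimicking the differential
# sensitivity among fish species described in the abstract
species_slope = np.array([0.2, 0.5, 0.8, 1.1, 1.4])
y = species_slope[species] * log_p + 0.01 * temp + rng.normal(0, 0.2, n)

X = np.column_stack([log_p, temp, species])
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
imp = dict(zip(["log_p", "temperature", "species"], model.feature_importances_))
print(imp)  # log_p and species dominate; temperature contributes little
```

    The `feature_importances_` ranking is the random-forest analogue of the variable importance ordering reported in the study.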

  4. Peyton's four-step approach for teaching complex spinal manipulation techniques - a prospective randomized trial.

    Science.gov (United States)

    Gradl-Dietsch, Gertraud; Lübke, Cavan; Horst, Klemens; Simon, Melanie; Modabber, Ali; Sönmez, Tolga T; Münker, Ralf; Nebelung, Sven; Knobe, Matthias

    2016-11-03

    The objectives of this prospective randomized trial were to assess the impact of Peyton's four-step approach on the acquisition of complex psychomotor skills and to examine the influence of gender on learning outcomes. We randomly assigned 95 third to fifth year medical students to an intervention group which received instructions according to Peyton (PG) or a control group, which received conventional teaching (CG). Both groups attended four sessions on the principles of manual therapy and specific manipulative and diagnostic techniques for the spine. We assessed differences in theoretical knowledge (multiple choice (MC) exam) and practical skills (Objective Structured Practical Examination (OSPE)) with respect to type of intervention and gender. Participants took a second OSPE 6 months after completion of the course. There were no differences between groups with respect to the MC exam. Students in the PG group scored significantly higher in the OSPE. Gender had no additional impact. Results of the second OSPE showed a significant decline in competency regardless of gender and type of intervention. Peyton's approach is superior to standard instruction for teaching complex spinal manipulation skills regardless of gender. Skills retention was equally low for both techniques.

  5. LOD score exclusion analyses for candidate QTLs using random population samples.

    Science.gov (United States)

    Deng, Hong-Wen

    2003-11-01

    While extensive analyses have been conducted to test for the importance of candidate genes as putative QTLs using random population samples, no formal analyses have been conducted to test against it. Previously, we developed an LOD score exclusion mapping approach for candidate genes for complex diseases. Here, we extend this LOD score approach to exclusion analyses of candidate genes for quantitative traits. Under this approach, specific genetic effects (as reflected by heritability) and inheritance models at candidate QTLs can be analyzed, and if an LOD score is ≤ -2.0, the locus can be excluded from having a heritability larger than that specified. Simulations show that this approach has high power to exclude a candidate gene from having moderate genetic effects if it is not a QTL and is robust to population admixture. Our exclusion analysis complements association analysis for candidate genes as putative QTLs in random population samples. The approach is applied to test the importance of the vitamin D receptor (VDR) gene as a potential QTL underlying the variation of bone mass, an important determinant of osteoporosis.
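    The exclusion rule itself is simple to state: the LOD score is the base-10 logarithm of the likelihood ratio between the model with the hypothesized QTL effect and the null model, and the locus is excluded when the score falls to -2.0 or below. A minimal sketch of the rule, with made-up log-likelihoods:

```python
import math

def lod_score(loglik_candidate: float, loglik_null: float) -> float:
    """Base-10 log of the likelihood ratio between the model with the
    specified QTL effect (heritability) and the null model."""
    return (loglik_candidate - loglik_null) / math.log(10)

def excluded(lod: float, threshold: float = -2.0) -> bool:
    # The locus is excluded from having a heritability larger than the
    # one specified when the LOD score is at or below the threshold
    return lod <= threshold

# Made-up log-likelihoods: the data are ~150x more likely under the null
print(excluded(lod_score(-1250.0, -1245.0)))  # LOD ≈ -2.17 → True
```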

  6. A Combined Weighting Method Based on Hybrid of Interval Evidence Fusion and Random Sampling

    Directory of Open Access Journals (Sweden)

    Ying Yan

    2017-01-01

    Full Text Available Due to the complexity of the system and lack of expertise, epistemic uncertainties may be present in experts' judgments on the importance of certain indices during group decision-making. A novel combination weighting method is proposed to solve the index weighting problem when various uncertainties are present in expert comments. Based on the ideas of evidence theory, various types of uncertain evaluation information are uniformly expressed through interval evidence structures. A similarity matrix between interval evidences is constructed, and the experts' information is fused. Comment grades are quantified using interval numbers, and a cumulative probability function for evaluating the importance of indices is constructed based on the fused information. Finally, index weights are obtained by Monte Carlo random sampling. The method can process expert information with varying degrees of uncertainty, which gives it good compatibility, and it avoids both the difficulty of effectively fusing high-conflict group decision-making information and the large information loss that such fusion can cause. Original expert judgments are retained objectively throughout the processing procedure. Constructing the cumulative probability function and the random sampling process require no human intervention or judgment, and the method is easily implemented in computer programs, giving it an apparent advantage in evaluation practices for very large index systems.
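    A toy sketch of the final weighting step, assuming the interval evidence has already been fused: the intervals below, and the uniform draw standing in for the paper's constructed cumulative probability function, are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical fused interval evidence per index: (lower, upper) bounds
# on each index's importance score after combining expert comments
intervals = {"index_A": (0.6, 0.9), "index_B": (0.3, 0.7), "index_C": (0.1, 0.4)}

def sample_weights(intervals, n_draws=100_000):
    # Monte Carlo step: draw a score per index from its interval (a uniform
    # draw stands in for the constructed cumulative probability function),
    # then normalize the mean scores into weights that sum to one
    means = {name: rng.uniform(lo, hi, n_draws).mean()
             for name, (lo, hi) in intervals.items()}
    total = sum(means.values())
    return {name: m / total for name, m in means.items()}

w = sample_weights(intervals)
print(w)  # index_A gets the largest weight, index_C the smallest
```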

  7. A descriptive analysis of a representative sample of pediatric randomized controlled trials published in 2007

    Directory of Open Access Journals (Sweden)

    Thomson Denise

    2010-12-01

    Full Text Available Abstract Background Randomized controlled trials (RCTs) are the gold standard for trials assessing the effects of therapeutic interventions; therefore it is important to understand how they are conducted. Our objectives were to provide an overview of a representative sample of pediatric RCTs published in 2007 and assess the validity of their results. Methods We searched the Cochrane Central Register of Controlled Trials using a pediatric filter and randomly selected 300 RCTs published in 2007. We extracted data on trial characteristics; outcomes; methodological quality; reporting; and registration and protocol characteristics. Trial registration and protocol availability were determined for each study based on the publication, an Internet search and an author survey. Results Most studies (83%) were efficacy trials, 40% evaluated drugs, and 30% were placebo-controlled. Primary outcomes were specified in 41%; 43% reported on adverse events. At least one statistically significant outcome was reported in 77% of trials; 63% favored the treatment group. Trial registration was declared in 12% of publications and 23% were found through an Internet search. Risk of bias (ROB) was high in 59% of trials, unclear in 33%, and low in 8%. Registered trials were more likely to have low ROB than non-registered trials (16% vs. 5%; p = 0.008). Effect sizes tended to be larger for trials at high vs. low ROB (0.28, 95% CI 0.21-0.35 vs. 0.16, 95% CI 0.07-0.25). Among survey respondents (50% response rate), the most common reason for trial registration was a publication requirement and, for non-registration, a lack of familiarity with the process. Conclusions More than half of this random sample of pediatric RCTs published in 2007 were at high ROB and three quarters of trials were not registered. There is an urgent need to improve the design, conduct, and reporting of child health research.

  8. Determination of Se in soil samples using the proton induced X-ray emission technique

    Science.gov (United States)

    Cruvinel, Paulo E.; Flocchini, Robert G.

    1993-04-01

    An alternative method for the direct determination of total Se in soil samples is presented. A large number of trace elements are present in soil at concentrations ranging from parts per billion to tenths of parts per million. The most common are Al, Si, K, Ca, Ti, V, Cr, Fe, Cu, Zn, Br, Rb, Mo, Cd and Pb. As in biological samples, many of these elements are of great importance for the nutrition of plants, while others are toxic and others have an unknown role. Selenium is an essential micronutrient for humans and animals, but it is also known that in certain areas Se deficiency or toxicity has caused endemic disease in livestock and humans through the soil-plant-animal linkage. In this work the suitability of the proton induced X-ray emission (PIXE) technique as a fast and nondestructive method for measuring the total Se content in soil samples is demonstrated. To validate the results, the data were compared with measurements obtained by the conventional atomic absorption spectrophotometry (AAS) method.

  9. Applied Focused Ion Beam Techniques for Sample Preparation of Astromaterials for Integrated Nano-Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Graham, G A; Teslich, N E; Kearsley, A T; Stadermann, F J; Stroud, R M; Dai, Z R; Ishii, H A; Hutcheon, I D; Bajt, S; Snead, C J; Weber, P K; Bradley, J P

    2007-02-20

    Sample preparation is always a critical step in the laboratory study of micrometer-sized astromaterials, whether their subsequent analysis is by electron microscopy or secondary ion mass spectrometry. A focused beam of gallium ions has been used to prepare electron-transparent sections from an interplanetary dust particle, as part of an integrated analysis protocol to maximize the mineralogical, elemental, isotopic and spectroscopic information extracted from one individual particle. In addition, focused ion beam techniques have been employed to extract cometary residue preserved on the rims and walls of micro-craters in 1100 series aluminum foils that were wrapped around the sample tray assembly on the Stardust cometary sample collector. Non-ideal surface geometries and inconveniently located regions of interest required creative solutions. These include support pillar construction and relocation of a significant portion of a sample to access a region of interest. Serial sectioning, in a manner similar to ultramicrotomy, is a significant development and further demonstrates the unique capabilities of focused ion beam microscopy for sample preparation of astromaterials.

  10. Can a bleaching toothpaste containing Blue Covarine demonstrate the same bleaching as conventional techniques? An in vitro, randomized and blinded study

    OpenAIRE

    DANTAS,Andréa Abi Rached; BORTOLATTO,Janaina Freitas; Ávery RONCOLATO; MERCHAN,Hugo; FLOROS,Michael Christopher; Kuga,Milton Carlos; Oliveira Junior, Osmir Batista de [UNESP

    2015-01-01

    ABSTRACT Objective The purpose of this in vitro study was to compare the efficacy of a bleaching toothpaste containing Blue Covarine vs. conventional tooth bleaching techniques using peroxides (both in-office and at-home). Material and Methods Samples were randomly distributed into five experimental groups (n=15): C - Control; BC – Bleaching toothpaste containing Blue Covarine; WBC – Bleaching toothpaste without Blue Covarine; HP35 - In-office bleaching using 35% hydrogen pero...

  11. Comparison of four techniques of nasogastric tube insertion in anaesthetised, intubated patients: A randomized controlled trial

    Directory of Open Access Journals (Sweden)

    Mohan Chandra Mandal

    2014-01-01

    Full Text Available Background and Aims: Insertion of nasogastric tubes (NGTs) in anaesthetised, intubated patients with the conventional method is sometimes difficult. Different techniques of NGT insertion have been tried with varying degrees of success. The aim of this prospective, randomised, open-label study was to evaluate three modified techniques of NGT insertion in comparison with the conventional method with respect to success rate, time taken for insertion and adverse events. Methods: In the operation theatre of general surgery, the patients were randomly allocated into four groups: Group C (control group, n = 54), Group W (ureteral guide wire group, n = 54), Group F (neck flexion with lateral pressure, n = 54) and Group R (reverse Sellick's manoeuvre, n = 54). The number of attempts for successful NGT insertion, time taken for insertion and adverse events were noted. Results: All three modified techniques were more successful than the conventional method on the first attempt. The least time taken for insertion was noted with the reverse Sellick's method. However, on intergroup analysis, the neck flexion and reverse Sellick's methods were comparable but significantly faster than the other two methods with respect to time taken for insertion. Conclusion: Reverse Sellick's manoeuvre, neck flexion with lateral neck pressure and guide wire-assisted techniques are all better alternatives to the conventional method for successful, quick and reliable NGT insertion with permissible adverse events in anaesthetised, intubated adult patients. Further studies after eliminating the major limitations of the present study are warranted to establish the superiority of any one of these modified techniques.

  12. Comparative evaluation of gingival depigmentation by tetrafluroethane cryosurgery and surgical scalpel technique. A randomized clinical study

    Directory of Open Access Journals (Sweden)

    Suraj D Narayankar

    2017-01-01

    Full Text Available Introduction: The importance of a good smile cannot be underestimated in enhancing the beauty, self-confidence and personality of a person. The health and appearance of the gingiva are an essential part of an attractive smile. Gingival pigmentation gives rise to an unesthetic smile line. Today, with increasing awareness of esthetics, people have become highly concerned about black gums. Various treatment modalities such as abrasion, scraping, the scalpel technique, cryosurgery, electrosurgery and lasers are available for the treatment of gingival pigmentation. The present study was conducted with the objective of comparing the efficacy of gingival depigmentation by cryosurgery and the scalpel technique. Method: A randomized controlled split-mouth study was conducted in 25 patients with gingival pigmentation. The Gingival Pigmentation Index (GPI) for pigmentation and the Visual Analogue Scale (VAS) for pain were evaluated for both test (cryosurgery) and control (scalpel technique) sites at baseline, 1 month, 3 months and 6 months. Results: The GPI score was 3 for 21/25 and 2 for 4/25 control sites, and 3 for 22/25 and 2 for 3/25 test sites at baseline. Both groups showed a significant reduction in GPI score, i.e., 0 at the 1- and 3-month intervals after treatment. The GPI score increased to 1 for 5/25 sites treated with the scalpel technique and 2/25 sites treated with cryosurgery at the 6-month interval (P=0.0691). This indicates that the recurrence rate for pigmentation is higher after scalpel treatment. The VAS score was 3 for 10/25 sites treated with the scalpel and 2 for 12/25 sites treated with cryosurgery (P<0.001). Conclusion: It can be concluded that cryosurgery can be used effectively and efficiently for depigmentation, keeping patients' acceptance and comfort in mind as well as the long-term results and ease of use when compared to the scalpel technique.

  13. Comparative Evaluation of Gingival Depigmentation by Tetrafluroethane Cryosurgery and Surgical Scalpel Technique. A Randomized Clinical Study.

    Science.gov (United States)

    Narayankar, Suraj D; Deshpande, Neeraj C; Dave, Deepak H; Thakkar, Dhaval J

    2017-01-01

    The importance of a good smile cannot be underestimated in enhancing the beauty, self-confidence and personality of a person. The health and appearance of the gingiva are an essential part of an attractive smile. Gingival pigmentation gives rise to an unesthetic smile line. Today, with increasing awareness of esthetics, people have become highly concerned about black gums. Various treatment modalities such as abrasion, scraping, the scalpel technique, cryosurgery, electrosurgery and lasers are available for the treatment of gingival pigmentation. The present study was conducted with the objective of comparing the efficacy of gingival depigmentation by cryosurgery and the scalpel technique. A randomized controlled split-mouth study was conducted in 25 patients with gingival pigmentation. The Gingival Pigmentation Index (GPI) for pigmentation and the Visual Analogue Scale (VAS) for pain were evaluated for both test (cryosurgery) and control (scalpel technique) sites at baseline, 1 month, 3 months and 6 months. The GPI score was 3 for 21/25 and 2 for 4/25 control sites, and 3 for 22/25 and 2 for 3/25 test sites at baseline. Both groups showed a significant reduction in GPI score, i.e., 0 at the 1- and 3-month intervals after treatment. The GPI score increased to 1 for 5/25 sites treated with the scalpel technique and 2/25 sites treated with cryosurgery at the 6-month interval (P=0.0691). This indicates that the recurrence rate for pigmentation is higher after scalpel treatment. The VAS score was 3 for 10/25 sites treated with the scalpel and 2 for 12/25 sites treated with cryosurgery (P<0.001). It can be concluded that cryosurgery can be used effectively and efficiently for depigmentation, keeping patients' acceptance and comfort in mind as well as the long-term results and ease of use when compared to the scalpel technique.

  14. Effects of anesthesia and blood sampling techniques on plasma metabolites and corticosterone in the rat.

    Science.gov (United States)

    Arnold, Myrtha; Langhans, Wolfgang

    2010-04-19

    Blood is routinely sampled from laboratory animals in biomedical research, and many of the commonly applied sampling techniques require anesthesia. Acute effects of many sampling and anesthesia procedures may confound the results, but those effects are incompletely characterized. We here compare the effects of four common anesthesia procedures (inhalation anesthesia with ether (EA) or isoflurane (IA) and intraperitoneal injection anesthesia with xylazine/ketamine (XKA) or medetomidine/midazolam/fentanyl (MMFA)) on plasma concentrations of glucose, lactate, non-esterified fatty acids (NEFAs), and corticosterone in blood obtained from a previously implanted jugular vein (JV) catheter with the effect of JV blood sampling from non-anesthetized, freely-moving rats (JV-NA). Also, we included in the comparison two other blood sampling procedures usually performed without anesthesia (NA), i.e., puncture of the saphenic vein (SV) and tail incision (TI). Whereas the control procedure (JV-NA) did not significantly affect any of the target parameters, plasma glucose increased from 14 (JV-IA) to 44 (JV-MMFA) % (all Ps ≤ 0.05 when compared with the control procedure) in all blood samples collected under anesthesia and was 12 and 14% lower (both Ps < 0.05) in the SV-NA and TI-NA samples, respectively. Plasma lactate increased from 74 (JV-IA) to 226% (SV-NA) (all Ps < 0.05) with all blood sampling and anesthesia procedures except JV-XKA and JV-MMFA. Plasma NEFAs increased by up to 52% (P < 0.05). Finally, only the JV-EA and JV-MMFA procedures increased plasma corticosterone (+525 and +353%, respectively; both Ps < 0.05). Thus, blood sampling and anesthesia procedures can have profound acute effects on plasma metabolite and hormone concentrations. This must be considered for the design and interpretation of blood sampling experiments in laboratory animals. (c) 2010. Published by Elsevier Inc.

  15. Non-destructive high-resolution thermal imaging techniques to evaluate wildlife and delicate biological samples

    Energy Technology Data Exchange (ETDEWEB)

    Lavers, C; Franklin, P; Franklin, P; Plowman, A; Sayers, G; Bol, J; Shepard, D; Fields, D, E-mail: brnc-radarcomms1@nrta.mod.u [Sensors Team, Plymouth University at Britannia Royal Naval College, Dartmouth, Devon (United Kingdom) and Paignton Zoological Park, Paignton, Devon (United Kingdom); Thermal Wave Imaging, Inc., 845 Livernoise St, Ferndale, MI (United States); Buckfast Butterfly and Otter Sanctuary, Buckfast, Devon (United Kingdom)

    2009-07-01

    Thermal imaging cameras now allow routine monitoring of dangerous yet endangered wildlife in captivity. This study looks at the potential applications of radiometrically calibrated thermal data to wildlife, as well as providing parameters for future materials applications. We present a non-destructive active testing technique suitable for enhancing the imagery contrast of thin or delicate biological specimens, yielding improved thermal contrast at room temperature for analysis of sample thermal properties. A broad spectrum of animals is studied, with differently textured surfaces and different reflective and emissive properties in the infrared part of the electromagnetic spectrum. Some surface features offer biomimetic materials design opportunities.

  16. Nucleic acid sample preparation for in vitro molecular diagnosis: from conventional techniques to biotechnology.

    Science.gov (United States)

    Rahman, Md Mahbubor; Elaissari, Abdelhamid

    2012-11-01

    Nucleic acid (DNA and RNA)-based molecular diagnosis is a promising laboratory technique because of its ability to identify disease accurately. However, one of its disadvantages is the unavoidable need to purify and detect nucleic acids in the presence of contaminating material. Different nano- and microparticles have been developed for use in advanced, efficient high-throughput automated systems for the purification and detection of nucleic acid samples for use in molecular diagnoses. In this review, we discuss recent advances in the development of particle-based nucleic acid purification and detection. Copyright © 2012 Elsevier Ltd. All rights reserved.

  17. Inflammatory Biomarkers and Risk of Schizophrenia: A 2-Sample Mendelian Randomization Study.

    Science.gov (United States)

    Hartwig, Fernando Pires; Borges, Maria Carolina; Horta, Bernardo Lessa; Bowden, Jack; Davey Smith, George

    2017-12-01

    Positive associations between inflammatory biomarkers and risk of psychiatric disorders, including schizophrenia, have been reported in observational studies. However, conventional observational studies are prone to bias, such as reverse causation and residual confounding, thus limiting our understanding of the effect (if any) of inflammatory biomarkers on schizophrenia risk. To evaluate whether inflammatory biomarkers have an effect on the risk of developing schizophrenia. Two-sample mendelian randomization study using genetic variants associated with inflammatory biomarkers as instrumental variables to improve inference. Summary association results from large consortia of candidate gene or genome-wide association studies, including several epidemiologic studies with different designs, were used. Gene-inflammatory biomarker associations were estimated in pooled samples ranging from 1645 to more than 80 000 individuals, while gene-schizophrenia associations were estimated in more than 30 000 cases and more than 45 000 ancestry-matched controls. In most studies included in the consortia, participants were of European ancestry, and the prevalence of men was approximately 50%. All studies were conducted in adults, with a wide age range (18 to 80 years). Genetically elevated circulating levels of C-reactive protein (CRP), interleukin-1 receptor antagonist (IL-1Ra), and soluble interleukin-6 receptor (sIL-6R). Risk of developing schizophrenia. Individuals with schizophrenia or schizoaffective disorders were included as cases. Given that many studies contributed to the analyses, different diagnostic procedures were used. The pooled odds ratio estimate using 18 CRP genetic instruments was 0.90 (random effects 95% CI, 0.84-0.97; P = .005) per 2-fold increment in CRP levels; consistent results were obtained using different mendelian randomization methods and a more conservative set of instruments. The odds ratio for sIL-6R was 1.06 (95% CI, 1.01-1.12; P = .02
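    The core two-sample estimator behind such analyses is simple to state: each genetic variant yields a Wald ratio (beta_outcome / beta_exposure), and the inverse-variance-weighted (IVW) estimate pools the ratios with weights beta_exposure² / se_outcome². A sketch on synthetic summary statistics (not the study's data; the 18-instrument count mirrors the CRP analysis, and the simulated effect size is invented):

```python
import numpy as np

def ivw_estimate(beta_exposure, beta_outcome, se_outcome):
    """Fixed-effect inverse-variance-weighted (IVW) estimate: per-variant
    Wald ratios (beta_outcome / beta_exposure) pooled with weights
    beta_exposure**2 / se_outcome**2."""
    bx, by, se = map(np.asarray, (beta_exposure, beta_outcome, se_outcome))
    w = bx ** 2 / se ** 2
    return float(np.sum(w * (by / bx)) / np.sum(w))

# Synthetic summary statistics: 18 instruments and an assumed true
# causal effect of -0.15 per unit of genetically proxied exposure
rng = np.random.default_rng(1)
bx = rng.uniform(0.1, 0.3, 18)              # SNP-exposure associations
by = -0.15 * bx + rng.normal(0, 0.002, 18)  # SNP-outcome associations
est = ivw_estimate(bx, by, np.full(18, 0.02))
print(est)  # close to the simulated causal effect of -0.15
```

    Exponentiating such an estimate on the log-odds scale gives an odds ratio per unit (or per doubling, with suitable scaling) of the exposure, which is how the results above are reported.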

  18. Enhanced retention of drop vertical jump landing technique: A randomized controlled trial.

    Science.gov (United States)

    Welling, Wouter; Benjaminse, Anne; Gokeler, Alli; Otten, Bert

    2016-02-01

    External focus (EF) instructions have been shown to result in superior motor performance compared to internal focus (IF) instructions. Using an EF may help to optimize current anterior cruciate ligament (ACL) injury prevention programs. The purpose of the current study was to investigate the effects of instructions on landing technique and performance by comparing an EF group, an IF group, a video (VI) group and a control (CTRL) group. Subjects (age 22.50±1.62 years, height 179.70±10.43 cm, mass 73.98±12.68 kg) were randomly assigned to the IF (n=10), EF (n=10), VI (n=10) or CTRL group (n=10). Landing was assessed from a drop vertical jump (DVJ) in five sessions: pretest, two training blocks (TR1 and TR2), directly after the training sessions (post test), and a retention test 1 week later. Group-specific instructions were offered in TR1 and TR2. Landing technique was assessed with the Landing Error Scoring System (LESS) and jump height was taken as the performance measure. The results show that males in the VI group and females in both the VI and EF groups significantly improved jump-landing technique. Retention was achieved and jump height was maintained for males in the VI group and females in both the VI and EF groups. It is therefore concluded that EF and VI instructions have great potential in ACL injury prevention. Copyright © 2015 Elsevier B.V. All rights reserved.

  19. Techniques for avoiding discrimination errors in the dynamic sampling of condensable vapors

    Science.gov (United States)

    Lincoln, K. A.

    1983-01-01

    In the mass spectrometric sampling of dynamic systems, measurements of the relative concentrations of condensable and noncondensable vapors can be significantly distorted if some subtle, but important, instrumental factors are overlooked. Even with in situ measurements, the condensables are readily lost to the container walls, and the noncondensables can persist within the vacuum chamber and yield a disproportionately high output signal. Where single pulses of vapor are sampled, this source of error is avoided by gating either the mass spectrometer "on" or the data acquisition instrumentation "on" only during the very brief time window when the initial vapor cloud emanating directly from the vapor source passes through the ionizer. Instrumentation for these techniques is detailed and its effectiveness is demonstrated by comparing gated and nongated spectra obtained from the pulsed-laser vaporization of several materials.
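    The gating idea can be sketched numerically: integrate the detector output only during the brief window when the initial vapor pulse passes the ionizer, so a persistent noncondensable background contributes little to the measurement. All waveform parameters below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)
t = np.linspace(0.0, 10e-3, 10_000)  # 10 ms record (seconds)
dt = t[1] - t[0]

# Invented detector trace: a brief condensable-vapor burst at 1 ms riding
# on a persistent background from noncondensable gas in the chamber
burst = np.exp(-0.5 * ((t - 1e-3) / 50e-6) ** 2)
background = 0.2 * np.ones_like(t)
signal = burst + background

def gated_integral(t, y, t_open, t_close):
    # Integrate the detector output only while the gate is open,
    # mimicking gating the spectrometer or acquisition electronics "on"
    mask = (t >= t_open) & (t <= t_close)
    return y[mask].sum() * dt

gated = gated_integral(t, signal, 0.8e-3, 1.2e-3)
total = signal.sum() * dt
# The 0.4 ms gate keeps nearly all of the burst but only 4% of the background
print(gated / total)
```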

  20. Validation of the 2008 Landsat Burned Area Ecv Product for North America Using Stratified Random Sampling

    Science.gov (United States)

    Brunner, N. M.; Mladinich, C. S.; Caldwell, M. K.; Beal, Y. J. G.

    2014-12-01

    The U.S. Geological Survey is generating a suite of Essential Climate Variables (ECVs) products, as defined by the Global Climate Observing System, from the Landsat data archive. Validation protocols for these products are being established, incorporating the Committee on Earth Observing Satellites Land Product Validation Subgroup's best practice guidelines and validation hierarchy stages. The sampling design and accuracy measures follow the methodology developed by the European Space Agency's Climate Change Initiative Fire Disturbance (fire_cci) project (Padilla and others, 2014). A rigorous validation was performed on the 2008 Burned Area ECV (BAECV) prototype product, using a stratified random sample of 48 Thiessen scene areas overlaying Landsat path/rows distributed across several terrestrial biomes throughout North America. The validation reference data consisted of fourteen sample sites acquired from the fire_cci project, with the remaining sample sites generated from a densification of the stratified sampling for North America. The reference burned-area polygons were generated using the ABAMS (Automatic Burned Area Mapping) software (Bastarrika and others, 2011; Izagirre, 2014). Accuracy results will be presented indicating strengths and weaknesses of the BAECV algorithm.

    Bastarrika, A., Chuvieco, E., and Martín, M.P., 2011, Mapping burned areas from Landsat TM/ETM+ data with a two-phase algorithm: Balancing omission and commission errors: Remote Sensing of Environment, v. 115, no. 4, p. 1003-1012.
    Izagirre, A.B., 2014, Automatic Burned Area Mapping Software (ABAMS), Preliminary Documentation, Version 10 v4: Vitoria-Gasteiz, Spain, University of Basque Country, p. 27.
    Padilla, M., Chuvieco, E., Hantson, S., Theis, R., and Sandow, C., 2014, D2.1 - Product Validation Plan: UAH - University of Alcalá de Henares (Spain), 37 p.

  1. The use of ESR technique for assessment of heating temperatures of archaeological lentil samples

    Science.gov (United States)

    Aydaş, Canan; Engin, Birol; Dönmez, Emel Oybak; Belli, Oktay

    2010-01-01

    Heat-induced paramagnetic centers in modern and archaeological lentils (Lens culinaris, Medik.) were studied by the X-band (9.3 GHz) electron spin resonance (ESR) technique. The modern red lentil samples were heated in an electrical furnace at increasing temperatures in the range 70-500 °C. The ESR spectral parameters (intensity, g-value and peak-to-peak line width) of the heat-induced organic radicals were investigated for modern red lentil (Lens culinaris, Medik.) samples. The obtained ESR spectra indicate that the relative number of heat-induced paramagnetic species and the peak-to-peak line widths depend on the temperature and heating time of the modern lentil. The g-values also depend on the heating temperature but not on the heating time. Heated modern red lentils produced a range of organic radicals with g-values from g = 2.0062 to 2.0035. ESR signals of carbonised archaeological lentil samples from two archaeological deposits of the Van province in Turkey were studied, and the g-values, peak-to-peak line widths, intensities and elemental compositions were compared with those obtained for modern samples in order to assess the temperature at which these archaeological lentils were heated at prehistoric sites. The maximum temperatures of the previous heating of the carbonised UA5 and Y11 lentil seeds are about 500 °C and above 500 °C, respectively.

  2. A novel non-invasive diagnostic sampling technique for cutaneous leishmaniasis.

    Directory of Open Access Journals (Sweden)

    Yasaman Taslimi

    2017-07-01

    Full Text Available Accurate diagnosis of cutaneous leishmaniasis (CL) is important for chemotherapy and epidemiological studies. Common approaches for Leishmania detection involve the invasive collection of specimens for direct identification of amastigotes by microscopy and the culturing of promastigotes from infected tissues. Although these techniques are highly specific, they require highly skilled health workers and carry the inherent risks of all invasive procedures, such as pain and the risk of bacterial and fungal super-infection. Therefore, it is essential to reduce the discomfort, potential infection and scarring caused by invasive diagnostic approaches, especially for children. In this report, we present a novel non-invasive method, painless, rapid and user-friendly, that uses sequential tape strips for sampling and isolation of DNA from the surface of active and healed skin lesions of CL patients. A total of 119 patients suspected of suffering from cutaneous leishmaniasis with different clinical manifestations were recruited, and samples were collected both from their lesions and from uninfected areas. In addition, 15 fungal-infected lesions and 54 areas of healthy skin were examined. The duration of sampling is short (less than one minute) and species identification by PCR is highly specific and sensitive. The sequential tape stripping sampling method is a sensitive, non-invasive and cost-effective alternative to traditional diagnostic assays and is suitable for field studies as well as for use in health care centers.

  3. Conic sampling: an efficient method for solving linear and quadratic programming by randomly linking constraints within the interior.

    Science.gov (United States)

    Serang, Oliver

    2012-01-01

    Linear programming (LP) problems are commonly used in analysis and resource allocation, frequently surfacing as approximations to more difficult problems. Existing approaches to LP have been dominated by a small group of methods, and randomized algorithms have not enjoyed popularity in practice. This paper introduces a novel randomized method of solving LP problems by moving along the facets and within the interior of the polytope along rays randomly sampled from the polyhedral cones defined by the bounding constraints. This conic sampling method is then applied to randomly sampled LPs, and its runtime performance is shown to compare favorably to the simplex and primal affine-scaling algorithms, especially on polytopes with certain characteristics. The conic sampling method is then adapted and applied to solve a certain quadratic program, which computes a projection onto a polytope; the proposed method is shown to outperform the proprietary software Mathematica on large, sparse QP problems constructed from mass spectrometry-based proteomics.
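
The core move described above — following a ray from an interior point until it meets a bounding facet — can be sketched in a few lines. This is an illustrative fragment on a 2-D box, not the paper's full conic sampling solver; the helper name and example polytope are hypothetical:

```python
import numpy as np

def ray_step(A, b, x, d):
    """From interior point x of {z : A z <= b}, move along direction d until
    the first bounding facet becomes active; return the boundary point and
    the index of the binding constraint."""
    Ad = A @ d
    slack = b - A @ x
    with np.errstate(divide="ignore", invalid="ignore"):
        t = np.where(Ad > 1e-12, slack / Ad, np.inf)  # step length to each facet
    i = int(np.argmin(t))
    return x + t[i] * d, i

rng = np.random.default_rng(0)
# Hypothetical polytope: the unit box 0 <= x, y <= 1 written as A z <= b.
A = np.array([[1.0, 0.0], [0.0, 1.0], [-1.0, 0.0], [0.0, -1.0]])
b = np.array([1.0, 1.0, 0.0, 0.0])
x = np.array([0.5, 0.5])              # interior starting point
d = rng.standard_normal(2)
d /= np.linalg.norm(d)                # random ray direction
y, facet = ray_step(A, b, x, d)       # y lies on a facet of the box
```

A full method in this spirit would repeat such steps with objective-improving ray directions; here the step only demonstrates the feasibility-preserving advance to a facet.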

  4. Improvement of sampling strategies for randomly distributed hotspots in soil applying a computerized simulation considering the concept of uncertainty.

    Science.gov (United States)

    Hildebrandt, Thomas; Pick, Denis; Einax, Jürgen W

    2012-02-01

    The pollution of soil and the environment as a result of human activity is a major problem. Nowadays, the determination of local contaminations is of interest for environmental remediation. These hotspots can have various toxic effects on plants, animals, humans, and the whole ecological system. However, economic and legal consequences are also possible, e.g., high costs for remediation measures. In this study three sampling strategies (simple random sampling, stratified sampling, and systematic sampling) were applied to randomly distributed hotspot contaminations to test their efficiency in terms of finding hotspots. The results were used for the validation of a computerized simulation. This application can simulate the contamination of a field, the sampling pattern, and a virtual sampling. A constant hit rate showed that none of the sampling patterns achieved better results than the others. Furthermore, the uncertainty associated with the results is described by confidence intervals. The uncertainty during sampling is enormous and decreases only slightly, even if the number of samples is increased to an unreasonable amount. It is hardly possible to identify the exact number of randomly distributed hotspot contaminations by statistical sampling, but a range of possible results can be calculated. Depending on various parameters such as the shape and size of the area, the number of hotspots, and the sample quantity, optimal sampling strategies can be derived. Furthermore, an estimation of the bias arising from the sampling methodology is possible. The developed computerized simulation is an innovative tool for optimizing sampling strategies in terrestrial compartments for hotspot distributions.
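
A minimal version of such a simulation — random hotspots on a grid, searched under simple random and systematic sampling — might look as follows (grid size, hotspot count and sample quantity are invented for illustration; the study's simulation is more elaborate):

```python
import numpy as np

def hit_rate(strategy, n_cells=10_000, n_hot=5, n_samples=50, trials=2000, seed=1):
    """Fraction of simulated fields in which at least one hotspot cell is sampled."""
    rng = np.random.default_rng(seed)
    hits = 0
    for _ in range(trials):
        hot = rng.choice(n_cells, size=n_hot, replace=False)   # random hotspot cells
        if strategy == "random":                               # simple random sampling
            picks = rng.choice(n_cells, size=n_samples, replace=False)
        else:                                                  # systematic: fixed spacing, random start
            step = n_cells // n_samples
            picks = (rng.integers(step) + step * np.arange(n_samples)) % n_cells
        hits += np.intersect1d(hot, picks).size > 0
    return hits / trials

# For randomly placed hotspots, the two strategies find them about equally
# often, mirroring the constant hit rate reported above.
r = hit_rate("random")
s = hit_rate("systematic")
```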

  5. RANdom SAmple Consensus (RANSAC) algorithm for material-informatics: application to photovoltaic solar cells.

    Science.gov (United States)

    Kaspi, Omer; Yosipof, Abraham; Senderowitz, Hanoch

    2017-06-06

    An important aspect of chemoinformatics and material-informatics is the usage of machine learning algorithms to build Quantitative Structure Activity Relationship (QSAR) models. The RANdom SAmple Consensus (RANSAC) algorithm is a predictive modeling tool widely used in the image processing field for cleaning datasets of noise. RANSAC can be used as a "one stop shop" algorithm for developing and validating QSAR models, performing outlier removal, descriptor selection, model development and prediction for test set samples within an applicability domain. For "future" predictions (i.e., for samples not included in the original test set) RANSAC provides a statistical estimate for the probability of obtaining reliable predictions, i.e., predictions within a pre-defined number of standard deviations from the true values. In this work we describe the first application of RANSAC in material informatics, focusing on the analysis of solar cells. We demonstrate that for three datasets representing different metal oxide (MO) based solar cell libraries, RANSAC-derived models select descriptors previously shown to correlate with key photovoltaic properties and lead to good predictive statistics for these properties. These models were subsequently used to predict the properties of virtual solar cell libraries, highlighting interesting dependencies of PV properties on MO compositions.
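
The RANSAC idea — fit to a random minimal sample, keep the model with the largest consensus set, then refit on the inliers — can be illustrated with a simple robust line fit (a generic sketch with invented data, not the authors' QSAR pipeline):

```python
import numpy as np

def ransac_line(x, y, n_iter=200, tol=0.5, seed=0):
    """RANSAC line fit: fit y = a*x + b to random minimal samples (2 points),
    keep the model with the largest consensus (inlier) set, refit on inliers."""
    rng = np.random.default_rng(seed)
    best_inliers = np.zeros(len(x), dtype=bool)
    for _ in range(n_iter):
        i, j = rng.choice(len(x), size=2, replace=False)
        if x[i] == x[j]:
            continue                       # degenerate minimal sample
        a = (y[j] - y[i]) / (x[j] - x[i])
        b = y[i] - a * x[i]
        inliers = np.abs(y - (a * x + b)) < tol
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    a, b = np.polyfit(x[best_inliers], y[best_inliers], 1)   # least-squares refit
    return a, b, best_inliers

rng = np.random.default_rng(1)
x = np.linspace(0.0, 10.0, 40)
y = 2.0 * x + 1.0 + rng.normal(0.0, 0.1, 40)   # true line with mild noise
y[::8] += 15.0                                  # a few gross outliers
a, b, inliers = ransac_line(x, y)               # recovers a near 2, b near 1
```

The consensus step is what makes the fit immune to the injected outliers; an ordinary least-squares fit on the same data would be pulled well away from the true line.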

  6. Faster the better: a reliable technique to sample anopluran lice in large hosts.

    Science.gov (United States)

    Leonardi, María Soledad

    2014-06-01

    Among Anoplura, the family Echinophthiriidae includes those species that infest mainly the pinnipeds. Working with large hosts implies methodological considerations such as the time spent on sampling and the way in which the animal is restrained. Previous works on echinophthiriids combined a diverse array of analyses including field counts of lice and in vitro observations. To collect lice, the authors used forceps, and each louse was collected individually. This implied a long manipulation time, i.e., ≈60 min, and the need to physically and/or chemically immobilize the animal. The present work describes and discusses for the first time a sampling technique that minimizes the manipulation time and avoids the use of anesthesia. This methodology consists of combing the host's pelage with a fine-tooth plastic comb, as used in the treatment of human pediculosis, and keeping the comb with the retained lice in a Ziploc® bag with ethanol. This technique has been used successfully in studies of population dynamics, habitat selection, and transmission patterns, proving to be a reliable methodology. Lice are collected whole and in good condition for mounting and study under light or scanning electron microscopy. Moreover, the plastic comb protects taxonomically important structures such as spines from damage, so the technique is also recommended for taxonomic or morphological work.

  7. Chemometric compositional analysis of phenolic compounds in fermenting samples and wines using different infrared spectroscopy techniques.

    Science.gov (United States)

    Aleixandre-Tudo, Jose Luis; Nieuwoudt, Helene; Aleixandre, Jose Luis; du Toit, Wessel

    2018-01-01

    The wine industry requires reliable methods for the quantification of phenolic compounds during the winemaking process. Infrared spectroscopy appears to be a suitable technique for process control and monitoring. The ability of Fourier transform near infrared (FT-NIR), attenuated total reflectance mid infrared (ATR-MIR) and Fourier transform infrared (FT-IR) spectroscopies to predict compositional phenolic levels during red wine fermentation and aging was investigated. Prediction models containing a large number of samples collected over two vintages from several industrial fermenting tanks, as well as wine samples covering a varying number of vintages, were validated. FT-NIR appeared to be the most accurate technique for predicting the phenolic content. Although slightly less accurate models were observed, ATR-MIR and FT-IR can also be used for the prediction of the majority of phenolic measurements. Additionally, the slope and intercept test indicated a systematic error for all three spectroscopic techniques, which seems to be slightly more pronounced for HPLC-generated phenolics data than for the spectrophotometric parameters. However, the results also showed that the predictions made with the three instruments are statistically comparable. The robustness of the prediction models was also investigated and discussed. Copyright © 2017 Elsevier B.V. All rights reserved.
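
The slope and intercept bias test mentioned above can be illustrated as follows: regress predicted against reference values and test how far the fit departs from the ideal slope 1 and intercept 0. The helper and data below are synthetic stand-ins, not the study's implementation:

```python
import numpy as np

def slope_intercept_test(reference, predicted):
    """Regress predicted on reference values and return t-statistics for the
    hypotheses slope = 1 and intercept = 0 (departures signal systematic error)."""
    x = np.asarray(reference, dtype=float)
    y = np.asarray(predicted, dtype=float)
    n = len(x)
    slope, intercept = np.polyfit(x, y, 1)
    resid = y - (slope * x + intercept)
    s2 = resid @ resid / (n - 2)                   # residual variance
    sxx = (x - x.mean()) @ (x - x.mean())
    se_slope = np.sqrt(s2 / sxx)
    se_int = np.sqrt(s2 * (1.0 / n + x.mean() ** 2 / sxx))
    return slope, intercept, (slope - 1.0) / se_slope, intercept / se_int

rng = np.random.default_rng(6)
ref = rng.uniform(0.0, 100.0, 60)                  # reference (e.g., HPLC) values
pred = 0.9 * ref + 5.0 + rng.normal(0.0, 2.0, 60)  # predictions with built-in bias
slope, intercept, t_slope, t_int = slope_intercept_test(ref, pred)
# |t| values well beyond ~2 flag the simulated systematic error
```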

  8. Sample size and power for a stratified doubly randomized preference design.

    Science.gov (United States)

    Cameron, Briana; Esserman, Denise A

    2016-11-21

    The two-stage (or doubly) randomized preference trial design is an important tool for researchers seeking to disentangle the role of patient treatment preference on treatment response through estimation of selection and preference effects. Up until now, these designs have been limited by their assumption of equal preference rates and effect sizes across the entire study population. We propose a stratified two-stage randomized trial design that addresses this limitation. We begin by deriving stratified test statistics for the treatment, preference, and selection effects. Next, we develop a sample size formula for the number of patients required to detect each effect. The properties of the model and the efficiency of the design are established using a series of simulation studies. We demonstrate the applicability of the design using a study of Hepatitis C treatment modality, specialty clinic versus mobile medical clinic. In this example, a stratified preference design (stratified by alcohol/drug use) may more closely capture the true distribution of patient preferences and allow for a more efficient design than a design which ignores these differences (unstratified version). © The Author(s) 2016.

  9. Notes on interval estimation of the generalized odds ratio under stratified random sampling.

    Science.gov (United States)

    Lui, Kung-Jong; Chang, Kuang-Chao

    2013-05-01

    It is not rare to encounter the patient response on the ordinal scale in a randomized clinical trial (RCT). Under the assumption that the generalized odds ratio (GOR) is homogeneous across strata, we consider four asymptotic interval estimators for the GOR under stratified random sampling. These include the interval estimator using the weighted-least-squares (WLS) approach with the logarithmic transformation (WLSL), the interval estimator using the Mantel-Haenszel (MH) type of estimator with the logarithmic transformation (MHL), the interval estimator using Fieller's theorem with the MH weights (FTMH) and the interval estimator using Fieller's theorem with the WLS weights (FTWLS). We employ Monte Carlo simulation to evaluate the performance of these interval estimators by calculating the coverage probability and the average length. To study the bias of these interval estimators, we also calculate and compare the noncoverage probabilities in the two tails of the resulting confidence intervals. We find that WLSL and MHL can generally perform well, while FTMH and FTWLS can lose either precision or accuracy. We further find that MHL is likely the least biased. Finally, we use the data taken from a study of smoking status and breathing test among workers in certain industrial plants in Houston, Texas, during 1974 to 1975 to illustrate the use of these interval estimators.
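
The evaluation criteria used here — coverage probability and average length estimated by Monte Carlo — can be sketched for a simpler single-stratum Wald interval on the log odds ratio (a stand-in for the paper's GOR estimators; the probabilities and sample size below are invented):

```python
import numpy as np

def wald_log_or_coverage(p1, p2, n=200, trials=5000, seed=2):
    """Monte Carlo coverage probability and average length of the 95% Wald
    confidence interval for the log odds ratio in a single 2x2 table."""
    rng = np.random.default_rng(seed)
    true_log_or = np.log(p1 / (1 - p1)) - np.log(p2 / (1 - p2))
    covered, lengths = 0, []
    for _ in range(trials):
        a = rng.binomial(n, p1)
        c = rng.binomial(n, p2)
        b, d = n - a, n - c
        if min(a, b, c, d) == 0:          # skip degenerate tables
            continue
        est = np.log((a * d) / (b * c))
        se = np.sqrt(1/a + 1/b + 1/c + 1/d)
        lo, hi = est - 1.96 * se, est + 1.96 * se
        covered += lo <= true_log_or <= hi
        lengths.append(hi - lo)
    return covered / len(lengths), float(np.mean(lengths))

cov, avg_len = wald_log_or_coverage(0.3, 0.5)   # coverage should sit near 0.95
```

Comparing estimators in this framework amounts to running the same loop for each interval construction and tabulating coverage, average length, and tail noncoverage.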

  10. Advantages of Arthroscopic Rotator Cuff Repair With a Transosseous Suture Technique: A Prospective Randomized Controlled Trial.

    Science.gov (United States)

    Randelli, Pietro; Stoppani, Carlo Alberto; Zaolino, Carlo; Menon, Alessandra; Randelli, Filippo; Cabitza, Paolo

    2017-07-01

    Rotator cuff tear is a common finding in patients with painful, poorly functioning shoulders. The surgical management of this disorder has improved greatly and can now be fully arthroscopic. To evaluate clinical and radiological results of arthroscopic rotator cuff repair using 2 different techniques: single-row anchor fixation versus transosseous hardware-free suture repair. Randomized controlled trial; Level of evidence, 1. Sixty-nine patients with rotator cuff tears were enrolled: 35 patients were operated on with metal anchors and 34 with standardized transosseous repair. The patients were clinically evaluated before surgery, during the 28 days after surgery, and at least 1 year after the operation by the use of validated rating scores (Constant score, QuickDASH, and numerical rating scale [NRS]). Final follow-up was obtained at more than 3 years by a QuickDASH evaluation to detect any difference from the previous follow-up. During the follow-up, rotator cuff integrity was determined through magnetic resonance imaging and was classified according to the 5 Sugaya categories. Patients operated on with the transosseous technique had significantly less pain, especially from the 15th postoperative day: in the third week, the mean NRS value for the anchor group was 3.00 while that for the transosseous group was 2.46 (P = .02); in the fourth week, the values were 2.44 and 1.76, respectively (P < .01). No differences in functional outcome were noted between the 2 groups at the final evaluation. In the evaluation of rotator cuff repair integrity, based on the Sugaya magnetic resonance imaging classification, no significant difference was found between the 2 techniques in terms of retear rate (P = .81). No significant differences were found between the 2 arthroscopic repair techniques in terms of functional and radiological results. However, postoperative pain decreased more quickly after the transosseous procedure, which therefore emerges as a possible improvement in the surgical management of rotator cuff tears.

  11. Control capacity and a random sampling method in exploring controllability of complex networks.

    Science.gov (United States)

    Jia, Tao; Barabási, Albert-László

    2013-01-01

    Controlling complex systems is a fundamental challenge of network science. Recent advances indicate that control over the system can be achieved through a minimum driver node set (MDS). The existence of multiple MDSs suggests that nodes do not participate in control equally, prompting us to quantify their participation. Here we introduce control capacity, which quantifies the likelihood that a node is a driver node. To efficiently measure this quantity, we develop a random sampling algorithm. This algorithm not only provides a statistical estimate of the control capacity, but also bridges the gap between multiple microscopic control configurations and macroscopic properties of the network under control. We demonstrate that the possibility of being a driver node decreases with a node's in-degree and is independent of its out-degree. Given the inherent multiplicity of MDSs, our findings offer tools to explore control in various complex systems.
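
The sampling idea can be sketched in the structural-controllability setting, where driver nodes are exactly the nodes left unmatched by a maximum matching of the directed network: repeatedly compute a maximum matching with randomized tie-breaking and record how often each node ends up unmatched. This is a simplified illustration (the tie-breaking is not exactly uniform over matchings, and all names are hypothetical):

```python
import random

def max_matching(adj, order):
    """Maximum bipartite matching via augmenting paths (Kuhn's algorithm);
    match maps each matched target node to the source whose edge is used."""
    match = {}
    def augment(u, seen):
        for v in adj.get(u, []):
            if v not in seen:
                seen.add(v)
                if v not in match or augment(match[v], seen):
                    match[v] = u
                    return True
        return False
    for u in order:
        augment(u, set())
    return match

def control_capacity(edges, nodes, samples=500, seed=3):
    """Estimate, per node, the fraction of sampled maximum matchings in which
    the node is unmatched, i.e. acts as a driver node."""
    rng = random.Random(seed)
    counts = {v: 0 for v in nodes}
    for _ in range(samples):
        adj = {}
        for u, v in edges:
            adj.setdefault(u, []).append(v)
        for u in adj:                  # randomized tie-breaking between matchings
            rng.shuffle(adj[u])
        order = list(nodes)
        rng.shuffle(order)
        match = max_matching(adj, order)
        for v in nodes:
            if v not in match:
                counts[v] += 1
    return {v: c / samples for v, c in counts.items()}

# Directed graph 1->2, 2->3, 1->4: node 1 is always a driver, node 3 never is,
# and exactly one of nodes 2 and 4 is a driver in any maximum matching.
cap = control_capacity([(1, 2), (2, 3), (1, 4)], [1, 2, 3, 4])
```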

  12. Randomized Controlled Trial of Attention Bias Modification in a Racially Diverse, Socially Anxious, Alcohol Dependent Sample

    Science.gov (United States)

    Clerkin, Elise M.; Magee, Joshua C.; Wells, Tony T.; Beard, Courtney; Barnett, Nancy P.

    2016-01-01

    Objective Attention biases may be an important treatment target for both alcohol dependence and social anxiety. This is the first attention bias modification (ABM) trial to investigate two (vs. one) targets of attention bias within a sample with co-occurring symptoms of social anxiety and alcohol dependence. Additionally, we used trial-level bias scores (TL-BS) to capture the phenomenon of attention bias in a more ecologically valid, dynamic way than traditional attention bias scores. Method Adult participants (N=86; 41% female; 52% African American; 40% White) with elevated social anxiety symptoms and alcohol dependence were randomly assigned to an 8-session training condition in this 2 (Social Anxiety ABM vs. Social Anxiety Control) by 2 (Alcohol ABM vs. Alcohol Control) design. Symptoms of social anxiety, alcohol dependence, and attention bias were assessed across time. Results Multilevel models estimated the trajectories for each measure within individuals and tested whether these trajectories differed according to the randomized training conditions. Across time, there were significant or trending decreases in all attention TL-BS parameters (but not traditional attention bias scores) and most symptom measures. However, there were no significant differences in the trajectories of change between any ABM and control conditions for any symptom measures. Conclusions These findings add to previous evidence questioning the robustness of ABM and point to the need to extend the effects of ABM to samples that are racially diverse and/or have co-occurring psychopathology. The results also illustrate the potential importance of calculating trial-level attention bias scores rather than only including traditional bias scores. PMID:27591918

  13. Randomized controlled trial: hybrid technique using balloon dilation of the frontal sinus drainage pathway.

    Science.gov (United States)

    Hathorn, Iain F; Pace-Asciak, Pia; Habib, Al-Rahim R; Sunkaraneni, Vishnu; Javer, Amin R

    2015-02-01

    The objectives of this study were as follows: (1) to evaluate frontal sinus ostial patency following balloon dilation with the Ventera Sinus Dilation System, compared with frontal sinusotomy (Draf 2a); and (2) to compare mean blood loss and mean surgical time for frontal sinusotomy using balloon dilation compared with traditional surgical methods. A single blinded, randomized, controlled, prospective study was performed at St. Paul's Sinus Center, Vancouver, a tertiary referral rhinology center. Thirty patients undergoing functional endoscopic sinus surgery (FESS) for chronic rhinosinusitis (CRS) were randomized to a hybrid approach with exposure of the frontal recess using standard instrumentation and then balloon dilation of 1 frontal sinus drainage pathway and traditional frontal sinusotomy for the opposite side. Blood loss and surgical time for opening the frontal sinus drainage pathway was recorded for each side. Patients acted as their own controls. Ostial patency and size were assessed 5 weeks and 3 months postoperatively using endoscopy. Ostial patency was also recorded at 1 year following surgery. All frontal sinus ostia in both groups (n = 30) were successfully opened and were patent with both techniques 3 months postoperatively. All frontal sinus ostia assessed at 1 year (73%) remained patent and none required revision frontal surgery. Balloon dilation showed a mean surgical time of 655 seconds compared to 898 seconds for traditional FESS (p = 0.03). Mean blood loss was less with balloon dilation (58 mL vs 91 mL; p = 0.008). A hybrid balloon technique successfully dilates the frontal sinus drainage pathway with reduced blood loss. Also, short-term patency appears to be comparable to traditional frontal sinusotomy. © 2014 ARS-AAOA, LLC.

  14. Random sampling or geostatistical modelling? Choosing between design-based and model-based sampling strategies for soil (with discussion)

    NARCIS (Netherlands)

    Brus, D.J.; Gruijter, de J.J.

    1997-01-01

    Classical sampling theory has been repeatedly identified with classical statistics which assumes that data are identically and independently distributed. This explains the switch of many soil scientists from design-based sampling strategies, based on classical sampling theory, to the model-based

  15. Optimal hand washing technique to minimize bacterial contamination before neuraxial anesthesia: a randomized control trial.

    Science.gov (United States)

    Siddiqui, N; Friedman, Z; McGeer, A; Yousefzadeh, A; Carvalho, J C; Davies, S

    2017-02-01

    Infectious complications related to neuraxial anesthesia may result in adverse outcomes. There are no best practice guidelines regarding hand-sanitizing measures specifically for these procedures. The objective of this study was to compare the growth of microbial organisms on the operator's forearm between five common techniques of hand washing for labor epidurals. In this single-blind randomized controlled trial, all anesthesiologists performing labor epidurals in a tertiary care hospital were randomized into five study groups: hand washing with alcohol gel only up to the elbows (Group A); hand washing with soap up to the elbows, sterile towel to dry, followed by alcohol gel (Group B); hand washing with soap up to the elbows, non-sterile towel to dry, followed by alcohol gel (Group C); hand washing with soap up to the elbows, non-sterile towel to dry (Group D); or hand washing with soap up to the elbows, sterile towel to dry (Group E). The number of colonies for each specimen (rate per 100 specimens on one or both arms per group) was measured. The incidence of colonization was 2.5, 23.0, 18.5, 114.5, and 53.0 in Groups A, B, C, D and E, respectively. Compared to Group A, the odds ratio of bacterial growth for Group B was 1.52 (P=0.519), for Group C 5.44 (P=0.003), and for Group D 13.82 (P < 0.001) … hand-sanitizing practices among epidural practitioners. Copyright © 2016 Elsevier Ltd. All rights reserved.

  16. Endodontic pathogens causing deep neck space infections: clinical impact of different sampling techniques and antibiotic susceptibility.

    Science.gov (United States)

    Poeschl, Paul W; Crepaz, Valentina; Russmueller, Guenter; Seemann, Rudolf; Hirschl, Alexander M; Ewers, Rolf

    2011-09-01

    The aims of the present study were to compare microbial populations in patients suffering from deep neck space abscesses caused by primary endodontic infections, sampling the infections with aspiration or swabbing techniques, and to determine the susceptibility rates of the isolated bacteria to commonly used antibiotics. A total of 89 patients with deep neck space abscesses caused by primary endodontic infections requiring extraoral incision and drainage under general anesthesia were included. Either aspiration or swabbing was used to sample microbial pus specimens. The culture of the microbial specimens and susceptibility testing were performed following standard procedures. A total of 142 strains were recovered from 76 patients. In 13 patients, no bacteria were found. The predominant bacteria observed were streptococci (36%), staphylococci (13%), Prevotella (8%), and Peptostreptococcus (6%). A statistically significantly greater number of obligate anaerobes was found in the aspiration group. The majority of patients presented a mixed aerobic-anaerobic bacterial flora (62%). The antibiotic resistance rates for the predominant bacteria were 10% for penicillin G, 9% for amoxicillin, 0% for amoxicillin-clavulanate, 24% for clindamycin, and 24% for erythromycin. The results of our study indicate that a greater number of anaerobes are found when sampling with the aspiration technique. Penicillin G and aminopenicillins alone are not always sufficient for the treatment of severe deep neck space abscesses; beta-lactamase inhibitor combinations are more effective. Bacteria showed significant resistance rates to clindamycin, so its single use in penicillin-allergic patients has to be carefully considered. Copyright © 2011 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.

  17. Random Photon Absorption Model Elucidates How Early Gain Control in Fly Photoreceptors Arises from Quantal Sampling

    Science.gov (United States)

    Song, Zhuoyi; Zhou, Yu; Juusola, Mikko

    2016-01-01

    Many diurnal photoreceptors encode vast real-world light changes effectively, but how this performance originates from photon sampling is unclear. A 4-module biophysically-realistic fly photoreceptor model, in which information capture is limited by the number of its sampling units (microvilli) and their photon-hit recovery time (refractoriness), can accurately simulate real recordings and their information content. However, sublinear summation in quantum bump production (quantum-gain-nonlinearity) may also cause adaptation by reducing the bump/photon gain when multiple photons hit the same microvillus simultaneously. Here, we use a Random Photon Absorption Model (RandPAM), which is the 1st module of the 4-module fly photoreceptor model, to quantify the contribution of quantum-gain-nonlinearity in light adaptation. We show how quantum-gain-nonlinearity already results from photon sampling alone. In the extreme case, when two or more simultaneous photon-hits reduce to a single sublinear value, quantum-gain-nonlinearity is preset before the phototransduction reactions adapt the quantum bump waveform. However, the contribution of quantum-gain-nonlinearity in light adaptation depends upon the likelihood of multi-photon-hits, which is strictly determined by the number of microvilli and light intensity. Specifically, its contribution to light-adaptation is marginal (≤ 1%) in fly photoreceptors with many thousands of microvilli, because the probability of simultaneous multi-photon-hits on any one microvillus is low even during daylight conditions. However, in cells with fewer sampling units, the impact of quantum-gain-nonlinearity increases with brightening light. PMID:27445779
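
The dependence of multi-photon hits on the number of sampling units can be checked with a toy Monte Carlo: distribute photons uniformly over microvilli and count the photons that land on an already-hit unit. The photon and microvillus counts below are illustrative, not taken from the paper:

```python
import numpy as np

def multi_hit_fraction(n_photons, n_microvilli, trials=400, seed=4):
    """Monte Carlo fraction of photons landing on a microvillus that was
    already hit in the same flash -- the photons subject to sublinear summation."""
    rng = np.random.default_rng(seed)
    fractions = []
    for _ in range(trials):
        hits = rng.integers(n_microvilli, size=n_photons)   # uniform photon placement
        counts = np.bincount(hits, minlength=n_microvilli)
        extra = counts[counts > 1].sum() - np.count_nonzero(counts > 1)
        fractions.append(extra / n_photons)
    return float(np.mean(fractions))

# Same flash of 3000 photons: with many microvilli, multi-hits are rare;
# with few microvilli, they dominate.
few_multi = multi_hit_fraction(3000, 30_000)
many_multi = multi_hit_fraction(3000, 300)
```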

  18. Acute effects of mobile phone radiations on subtle energy levels of teenagers using electrophotonic imaging technique: A randomized controlled study.

    Science.gov (United States)

    Bhargav, Hemant; Srinivasan, T M; Bista, Suman; Mooventhan, A; Suresh, Vandana; Hankey, Alex; Nagendra, H R

    2017-01-01

    Mobile phones induce radio frequency electromagnetic field (RF-EMF) which has been found to affect subtle energy levels of adults through Electrophotonic Imaging (EPI) technique in a previous pilot study. We enrolled 61 healthy right-handed healthy teenagers (22 males and 39 females) in the age range of 17.40 ± 0.24 years from educational institutes in Bengaluru. Subjects were randomly divided into two groups: (1) (mobile phone in ON mode [MPON] at right ear) and (2) mobile phone in OFF mode (MPOF). Subtle energy levels of various organs of the subjects were measured using gas discharge visualization Camera Pro device, in double-blind conditions, at two points of time: (1) baseline and (2) after 15 min of MPON/MPOF exposure. As the data were found normally distributed, paired and independent samples t-test were applied to perform within and between group comparisons, respectively. The subtle energy levels were significantly reduced after RF-EMF exposure in MPON group as compared to MPOF group for following areas: (a) Pancreas (P = 0.001), (b) thyroid gland (P = 0.002), (c) cerebral cortex (P teenagers. Future studies should try to correlate these findings with respective biochemical markers and standard radio-imaging techniques.

  19. Performance evaluation of an importance sampling technique in a Jackson network

    Science.gov (United States)

    Mahdipour, Ebrahim; Rahmani, Amir Masoud; Setayeshi, Saeed

    2014-03-01

    Importance sampling is a technique that is commonly used to speed up Monte Carlo simulation of rare events. However, little is known regarding the design of efficient importance sampling algorithms in the context of queueing networks. The standard approach, which simulates the system using an a priori fixed change of measure suggested by large deviation analysis, has been shown to fail in even the simplest network settings. Estimating probabilities associated with rare events has been a topic of great importance in queueing theory, and in applied probability at large. In this article, we analyse the performance of an importance sampling estimator for a rare event probability in a Jackson network. The article applies strict deadlines to a two-node Jackson network with feedback whose arrival and service rates are modulated by an exogenous finite state Markov process. We estimate the probability of network blocking for various sets of parameters, and also the probability of missing the deadline of customers for different loads and deadlines. We finally show that the probability of total population overflow may be affected by various deadline values, service rates and arrival rates.
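
The flavor of such an estimator can be shown on a much simpler system than the modulated Jackson network studied here: a birth-death walk in which the classic change of measure swaps the up and down probabilities, making the overflow event likely, while a constant likelihood ratio corrects the estimate. Everything below is a textbook sketch under these assumptions, not the article's algorithm:

```python
import random

def overflow_prob_is(p, N, n_paths=20_000, seed=5):
    """Importance-sampling estimate of P(the walk started at level 1 reaches N
    before 0) for a birth-death chain with up-probability p < 1/2, simulated
    under the swapped measure (up-probability 1 - p)."""
    q = 1.0 - p
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_paths):
        level = 1
        while 0 < level < N:
            level += 1 if rng.random() < q else -1   # simulate with swapped rates
        hits += level == N
    # Every successful path has exactly N - 1 net up-steps, so the likelihood
    # ratio of each counted path is the constant (p/q)**(N - 1).
    return (hits / n_paths) * (p / q) ** (N - 1)

p, N = 0.3, 15
est = overflow_prob_is(p, N)
r = (1 - p) / p
exact = (r - 1) / (r ** N - 1)   # gambler's-ruin closed form for comparison
```

Because the target probability here has a closed form, the relative error of the estimator can be checked directly; naive simulation would need on the order of millions of paths to see the event at all.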

  20. Evaluation of surface sampling techniques for collection of Bacillus spores on common drinking water pipe materials.

    Science.gov (United States)

    Packard, Benjamin H; Kupferle, Margaret J

    2010-01-01

    Drinking water utilities may face biological contamination of the distribution system from a natural incident or deliberate contamination. Determining the extent of contamination or the efficacy of decontamination is a challenge, because it may require sampling of the wetted surfaces of distribution infrastructure. This study evaluated two sampling techniques that utilities might use to sample exhumed pipe sections. Polyvinyl chloride (PVC), cement-lined ductile iron, and ductile iron pipe coupons (3 cm x 14 cm) cut from new water main piping were conditioned for three months in dechlorinated Cincinnati, Ohio tap water. Coupons were spiked with Bacillus atrophaeus subsp. globigii, a surrogate for Bacillus anthracis. Brushing and scraping were used to recover the inoculated spores from the coupons. Mean recoveries for all materials ranged from 37 ± 30% to 43 ± 20% for brushing vs. 24 ± 10% to 51 ± 29% for scraping. On cement-lined pipe, brushing yielded a significantly different recovery than scraping. No differences were seen between brushing and scraping the PVC and iron pipe coupons. Mean brushing and scraping recoveries from PVC coupons were more variable than mean recoveries from cement-lined and iron coupons. Spore retention differed between pipe materials and the presence of established biofilms also had an impact. Conditioned PVC coupons (with established biofilms) had significantly lower spore retention (31 ± 11%) than conditioned cement-lined coupons (61 ± 14%) and conditioned iron coupons (71 ± 8%).

  1. Three-Trocar Sleeve Gastrectomy vs Standard Five-Trocar Technique: a Randomized Controlled Trial.

    Science.gov (United States)

    Consalvo, Vincenzo; Salsano, Vincenzo; Sarno, Gerardo; Chaze, Iphigenie

    2017-12-01

    Bariatric surgery is a treatment for morbid obesity. Different surgical procedures have been described in order to obtain excess weight loss (EWL), but currently laparoscopic sleeve gastrectomy is the most commonly performed procedure throughout the world. Reducing abdominal wall trauma and improving the aesthetic result are important goals for all bariatric surgeons. We conducted a randomized, controlled trial in order to assess whether the three-trocar sleeve gastrectomy can be safely carried out or should be abandoned. From September 2016 to February 2017, 90 patients were enrolled in our trial. Each patient was evaluated by a multidisciplinary team before surgery. Two groups were created after application of the inclusion and exclusion criteria. The primary endpoint was to define the features of early post-operative complications of patients in group 1 (the three-trocar technique, the experimental group) compared to group 2 (the five-trocar technique, the control group). The secondary endpoints were to evaluate any differences between the two groups concerning post-operative pain and patients' satisfaction with the aesthetic results. There was no difference between the two groups concerning age, sex distribution, weight, and BMI. The rate of co-morbidities was similar in both groups. Operative time was shorter in the control group, but patient satisfaction was better in the three-trocar sleeve gastrectomy group. The three-trocar sleeve gastrectomy can be safely carried out with a modest increase in operative time, without additional early surgical complications and with greater patient aesthetic satisfaction. researchregistry2386.

  2. Notes on interval estimation of the gamma correlation under stratified random sampling.

    Science.gov (United States)

    Lui, Kung-Jong; Chang, Kuang-Chao

    2012-07-01

    We have developed four asymptotic interval estimators in closed forms for the gamma correlation under stratified random sampling, including the confidence interval based on the most commonly used weighted-least-squares (WLS) approach (CIWLS), the confidence interval calculated from the Mantel-Haenszel (MH) type estimator with the Fisher-type transformation (CIMHT), the confidence interval using the fundamental idea of Fieller's Theorem (CIFT) and the confidence interval derived from a monotonic function of the WLS estimator of Agresti's α with the logarithmic transformation (MWLSLR). To evaluate the finite-sample performance of these four interval estimators and note the possible loss of accuracy in application of both Wald's confidence interval and MWLSLR using pooled data without accounting for stratification, we employ Monte Carlo simulation. We use the data taken from a general social survey studying the association between the income level and job satisfaction with strata formed by genders in black Americans published elsewhere to illustrate the practical use of these interval estimators. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
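
For reference, the quantity being interval-estimated is the Goodman-Kruskal gamma, computed from concordant and discordant pairs in an ordinal contingency table. A direct computation on a hypothetical income-by-job-satisfaction table (the counts are invented, not the survey's data):

```python
import numpy as np

def goodman_kruskal_gamma(table):
    """Gamma = (C - D) / (C + D), where C and D count concordant and
    discordant pairs of observations in an ordinal contingency table."""
    t = np.asarray(table, dtype=float)
    rows, cols = t.shape
    C = D = 0.0
    for i in range(rows):
        for j in range(cols):
            C += t[i, j] * t[i + 1:, j + 1:].sum()   # pairs ordered the same way
            D += t[i, j] * t[i + 1:, :j].sum()       # pairs ordered oppositely
    return (C - D) / (C + D)

# Hypothetical 3x3 income-by-job-satisfaction table: a diagonal-heavy table
# yields a clearly positive ordinal association.
g = goodman_kruskal_gamma([[20, 10, 5],
                           [10, 20, 10],
                           [5, 10, 20]])
```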

  3. Predictive value of testing random urine sample to detect microalbuminuria in diabetic subjects during outpatient visit.

    Science.gov (United States)

    Bouhanick, B; Berrut, G; Chameau, A M; Hallar, M; Bled, F; Chevet, B; Vergely, J; Rohmer, V; Fressinaud, P; Marre, M

    1992-01-01

    The predictive value of a random urine sample taken during an outpatient visit for predicting persistent microalbuminuria was studied in 76 Type 1, insulin-dependent diabetic subjects, 61 Type 2, non-insulin-dependent diabetic subjects, and 72 Type 2, insulin-treated diabetic subjects. Seventy-six patients attended the outpatient clinic in the morning and 133 in the afternoon. Microalbuminuria was suspected if urinary albumin excretion (UAE) exceeded 20 mg/l. All patients were hospitalized within 6 months of the outpatient visit, and persistent microalbuminuria was then confirmed if UAE was between 30 and 300 mg/24 h on 2-3 occasions in 3 urine samples. Of these 209 subjects, 83 were also screened with Microbumintest (Ames-Bayer), a semi-quantitative method. Among the 209 subjects, 71 were positive both for microalbuminuria during the outpatient visit and for persistent microalbuminuria during hospitalization: sensitivity 91.0%, specificity 83.2%, concordance 86.1%, and positive predictive value 76.3% (chi-squared test: 191; p less than 10(-4)). Results did not differ between subjects examined in the morning and in the afternoon. Among the 83 subjects also screened with Microbumintest, 22 displayed both a positive reaction and persistent microalbuminuria: sensitivity 76%, specificity 81%, concordance 80%, and positive predictive value 69% (chi-squared test: 126; p less than 10(-4)). Both types of screening appeared equally effective during the outpatient visit. Hence, persistent microalbuminuria can be predicted during an outpatient visit in a diabetic clinic.

  4. Effectiveness of hand hygiene education among a random sample of women from the community.

    Science.gov (United States)

    Ubheeram, J; Biranjia-Hurdoyal, S D

    2017-03-01

    The effectiveness of hand hygiene education was investigated by studying hand hygiene awareness and bacterial hand contamination among a random sample of 170 women in the community. A questionnaire was used to assess the hand hygiene awareness score, followed by swabbing of the dominant hand. Bacterial identification was done by conventional biochemical tests. A better hand hygiene awareness score was significantly associated with age, scarce bacterial growth, and absence of potential pathogens (p < 0.05). Of the 170 hand samples, bacterial growth was noted in 155 (91.2%), which included 91 (53.5%) heavy growth, 53 (31.2%) moderate growth and 11 (6.47%) scanty growth. The presence of enteric bacteria was associated with long nails (49.4% vs 29.2%; p = 0.007; OR = 2.3; 95% CI: 1.25-4.44), while finger rings were associated with a higher bacterial load (p = 0.003). Coliforms were significantly more common among women who had a lower hand hygiene awareness score, washed their hands less frequently (59.0% vs 32.8%; p = 0.003; OR = 2.9; 95% CI: 1.41-6.13) and used common soap rather than antiseptic soap (69.7% vs 30.3%, p = 0.000; OR = 4.11; 95% CI: 1.67-10.12). The level of hand hygiene awareness among the participants was satisfactory, but compliance with hand washing practice was not, especially among the elderly.

  5. Association between stalking victimisation and psychiatric morbidity in a random community sample.

    Science.gov (United States)

    Purcell, Rosemary; Pathé, Michele; Mullen, Paul E

    2005-11-01

    No studies have assessed psychopathology among victims of stalking who have not sought specialist help. To examine the associations between stalking victimisation and psychiatric morbidity in a representative community sample. A random community sample (n=1844) completed surveys examining the experience of harassment and current mental health. The 28-item General Health Questionnaire (GHQ-28) and the Impact of Event Scale were used to assess symptomatology in those reporting brief harassment (n=196) or protracted stalking (n=236) and a matched control group reporting no harassment (n=432). Rates of caseness on the GHQ-28 were higher among stalking victims (36.4%) than among controls (19.3%) and victims of brief harassment (21.9%). Psychiatric morbidity did not differ according to the recency of victimisation, with 34.1% of victims meeting caseness criteria 1 year after stalking had ended. In a significant minority of victims, stalking victimisation is associated with psychiatric morbidity that may persist long after it has ceased. Recognition of the immediate and long-term impacts of stalking is necessary to assist victims and help alleviate distress and long-term disability.

  6. Random sample community-based health surveys: does the effort to reach participants matter?

    Science.gov (United States)

    Messiah, Antoine; Castro, Grettel; Rodríguez de la Vega, Pura; Acuna, Juan M

    2014-12-15

    Conducting health surveys with community-based random samples is essential to capture an otherwise unreachable population, but these surveys can be biased if the effort to reach participants is insufficient. This study determines the desirable amount of effort to minimise such bias. A household-based health survey with random sampling and face-to-face interviews. Up to 11 visits, organised by canvassing rounds, were made to obtain an interview. Single-family homes in an underserved and understudied population in North Miami-Dade County, Florida, USA. Of a probabilistic sample of 2200 household addresses, 30 corresponded to empty lots, 74 were abandoned houses, 625 households declined to participate and 265 could not be reached and interviewed within 11 attempts. Analyses were performed on the 1206 remaining households. Each household was asked if any of their members had been told by a doctor that they had high blood pressure, heart disease including heart attack, cancer, diabetes, anxiety/depression, obesity or asthma. Responses to these questions were analysed by the number of visit attempts needed to obtain the interview. Return per visit fell below 10% after four attempts, below 5% after six attempts and below 2% after eight attempts. As the effort increased, household size decreased, while household income and the percentage of interviewees active and employed increased; the proportions of the seven health conditions decreased, four of them significantly: heart disease 20.4-9.2%, high blood pressure 63.5-58.1%, anxiety/depression 24.4-9.2% and obesity 21.8-12.6%. Beyond the fifth attempt, however, cumulative percentages varied by less than 1% and precision varied by less than 0.1%. In spite of the early and steep drop, sustaining at least five attempts to reach participants is necessary to reduce selection bias. Published by the BMJ Publishing Group Limited.

  7. Evaluation of injection techniques in the treatment of lateral epicondylitis: a prospective randomized clinical trial.

    Science.gov (United States)

    Okçu, Güvenir; Erkan, Serkan; Sentürk, Mehmet; Ozalp, R Taçkın; Yercan, H Serhat

    2012-01-01

    We aimed to compare the efficacy of two different injection techniques of local corticosteroid and local anesthetic in the management of lateral epicondylitis. This prospective study followed 80 consecutive patients who were diagnosed with lateral epicondylitis at our hospital outpatient clinic between 2005 and 2006. Patients were randomly assigned to two equal groups. Group 1 received a single injection of 1 ml betamethasone and 1 ml prilocaine on the lateral epicondyle at the point of maximum tenderness. Group 2 patients received an injection of the same drug mixture; following the initial injection, the needle tip was redirected and reinserted down to the bone approximately 30 to 40 times without emerging from the skin, creating a hematoma. Patients were evaluated with the Turkish version of the Disabilities of the Arm, Shoulder and Hand (DASH) questionnaire before injection and at the final follow-up. The unpaired t-test and chi-square tests were used to compare results. Sixteen patients in Group 1 and 15 patients in Group 2 were lost to follow-up. The average follow-up period of the remaining 49 patients was 21.6 months. There were no significant differences between the two groups with regard to gender, age, follow-up period, symptom duration, side of involvement and dominant-limb involvement. The Turkish DASH scores of Group 2 were significantly lower than those of Group 1 (p=0.017). Long-term clinical success in the treatment of lateral epicondylitis depends on the injection method: the peppering technique appears to be more effective than the single-injection technique in the long term.

  8. Variances in the projections, resulting from CLIMEX, Boosted Regression Trees and Random Forests techniques

    Science.gov (United States)

    Shabani, Farzin; Kumar, Lalit; Solhjouy-fard, Samaneh

    2017-08-01

    The aim of this study was to compare and evaluate the capabilities of correlative and mechanistic modeling processes applied to the projection of future distributions of date palm in novel environments, and to establish a method of minimizing uncertainty in the projections of differing techniques. The study area covers the Middle Eastern countries. We compared the mechanistic model CLIMEX (CL) with the correlative models MaxEnt (MX), Boosted Regression Trees (BRT), and Random Forests (RF) to project current and future distributions of date palm (Phoenix dactylifera L.). The Global Climate Model (GCM) CSIRO-Mk3.0 (CS), using the A2 emissions scenario, was selected for making projections. Both indigenous and alien distribution data of the species were utilized in the modeling process. The common areas predicted by MX, BRT, RF, and CL from the CS GCM were extracted and compared to ascertain the projection uncertainty levels of each individual technique. The common areas identified by all four modeling techniques were used to produce a map indicating suitable and unsuitable areas for date palm cultivation in Middle Eastern countries, for the present and for the year 2100. The four modeling approaches predict fairly different distributions. Projections from CL were more conservative than those from MX, while BRT and RF were the most conservative methods in terms of projections for the current time. The combination of the final CL and MX projections for the present and 2100 provides higher certainty concerning those areas that will become highly suitable for future date palm cultivation. According to the four models, cold, heat, and wet stress, with differences on a regional basis, appear to be the major restrictions on future date palm distribution. The results demonstrate variances in the projections resulting from the different techniques, and the assessment and interpretation of model projections requires reservations.

  9. Probabilistic techniques using Monte Carlo sampling for multi- component system diagnostics

    Energy Technology Data Exchange (ETDEWEB)

    Aumeier, S.E. [Argonne National Lab., Idaho Falls, ID (United States); Lee, J.C.; Akcasu, A.Z. [Univ. of Michigan, Ann Arbor, MI (United States). Dept. of Nuclear Engineering

    1995-06-01

    We outline the structure of a new approach to multi-component system fault diagnostics which utilizes detailed system simulation models, uncertain system observation data, statistical knowledge of system parameters, expert opinion, and component reliability data in an effort to identify incipient component performance degradations of arbitrary number and magnitude. The technique involves the use of multiple adaptive Kalman filters for fault estimation, the results of which are screened using standard hypothesis-testing procedures to define a set of component events that could have transpired. Latin hypercube sampling is then used to evaluate each of these feasible component events in terms of the uncertain component reliability data and filter estimates. The capabilities of the procedure are demonstrated through the analysis of a simulated small-magnitude binary component fault in a boiling water reactor balance of plant. The results show that the procedure has the potential to be a very effective tool for incipient component fault diagnosis.
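    The Latin hypercube sampling step referred to above can be sketched in a few lines of Python. This is a generic illustration of the stratification idea, not the authors' implementation; the function name and arguments are hypothetical.

```python
import numpy as np

def latin_hypercube(n_samples, n_dims, rng=None):
    """Draw a Latin hypercube sample on the unit hypercube [0, 1)^n_dims.

    Each coordinate axis is divided into n_samples equal strata, one point
    is drawn uniformly inside each stratum, and the strata are randomly
    permuted per dimension so that strata are paired at random."""
    rng = np.random.default_rng(rng)
    # One uniform draw inside each of the n_samples strata, per dimension
    u = rng.random((n_samples, n_dims))
    pts = (np.arange(n_samples)[:, None] + u) / n_samples
    # Independently shuffle the stratum assignment in every dimension
    for d in range(n_dims):
        pts[:, d] = rng.permutation(pts[:, d])
    return pts

samples = latin_hypercube(10, 2, rng=0)
```

    Because every one-dimensional margin receives exactly one sample per stratum, the uncertain-parameter space is covered more evenly than with plain Monte Carlo sampling for the same number of simulation runs.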

  10. IR laser extraction technique applied to oxygen isotope analysis of small biogenic silica samples.

    Science.gov (United States)

    Crespin, Julien; Alexandre, Anne; Sylvestre, Florence; Sonzogni, Corinne; Paillès, Christine; Garreta, Vincent

    2008-04-01

    An IR-laser fluorination technique is reported here for analyzing the oxygen isotope composition (delta18O) of microscopic biogenic silica grains (phytoliths and diatoms). Performed after a controlled isotopic exchange (CIE) procedure, the laser fluorination technique, which allows one to visually check the success of the fluorination reaction, is faster than the conventional fluorination technique and permits the analysis of delta18O in small to minute samples (1.6-0.3 mg), as required for high-resolution paleoenvironmental reconstructions. The long-term reproducibility achieved with the IR laser-heating fluorination/O2 delta18O analysis is lower than or equal to +/-0.26 per thousand (1 SD; n = 99) for phytoliths and +/-0.17 per thousand (1 SD; n = 47) for diatoms. When several CIEs are taken into account in the SD calculation, the resulting reproducibility is lower than or equal to +/-0.51 per thousand for phytoliths (1 SD; n = 99; CIE > 5) and +/-0.54 per thousand (1 SD; n = 47; CIE = 13) for diatoms. A minimum reproducibility of +/-0.5 per thousand leads to an estimated uncertainty on delta18Osilica close to +/-0.5 per thousand. The resulting uncertainties on reconstructed temperature and delta18Oforming water are, respectively, +/-2 degrees C and +/-0.5 per thousand, and fit the precisions required for intertropical paleoenvironmental reconstructions. Several methodological points, such as optimal extraction protocols and whether it is necessary to perform two CIEs prior to oxygen extraction, are assessed.

  11. In situ aqueous derivatization as sample preparation technique for gas chromatographic determinations.

    Science.gov (United States)

    Ferreira, Ana María Casas; Laespada, María Esther Fernández; Pavón, José Luis Pérez; Cordero, Bernardo Moreno

    2013-06-28

    The use of derivatization reactions is a common practice in analytical laboratories. Although in many cases it is tedious and time-consuming, it offers a good alternative for the determination of analytes not compatible with gas chromatography. Many of the reactions reported in the literature occur in organic media. However, in situ aqueous derivatization reactions, which can be performed directly in aqueous media, offer important advantages over those mentioned above, such as no need for a prior extraction step and easy automation. Here we review the most recent developments and applications of in situ aqueous derivatization. The discussion focuses on the derivatization reactions used for the determination of alcohols and phenols, carboxylic acids, aldehydes and ketones, nitrogen-containing compounds, and thiols in different aqueous matrices, such as environmental, biological and food samples. Several reactions are described for each functional group (acylation, alkylation, esterification, among others) and, in some cases, the same reagents can be used for several functional groups, such that there is unavoidable overlap between sections. Finally, attention is also focused on the techniques used for introducing the derivatives formed in the aqueous medium into the chromatographic system. The implementation of in situ aqueous derivatization coupled to preconcentration techniques has permitted the enhancement of recoveries and improvements in the separation, selectivity and sensitivity of the analytical methods. Copyright © 2013 Elsevier B.V. All rights reserved.

  12. A Coordinated Focused Ion Beam/Ultramicrotomy Technique for Serial Sectioning of Hayabusa Particles and Other Returned Samples

    Science.gov (United States)

    Berger, E. L.; Keller, L. P.

    2014-01-01

    Recent sample return missions, such as NASA's Stardust mission to comet 81P/Wild 2 and JAXA's Hayabusa mission to asteroid 25143 Itokawa, have returned particulate samples (typically 5-50 µm) that pose tremendous challenges to coordinated analysis using a variety of nano- and micro-beam techniques. The ability to glean maximal information from individual particles has become increasingly important and depends critically on how the samples are prepared for analysis. This also holds true for other extraterrestrial materials, including interplanetary dust particles, micrometeorites and lunar regolith grains. Traditionally, particulate samples have been prepared using microtomy techniques (e.g., [1]). However, for hard mineral particles ≥20 µm, microtome thin sections are compromised by severe chatter and sample loss. For these difficult samples, we have developed a hybrid technique that combines traditional ultramicrotomy with focused ion beam (FIB) techniques, allowing for the in situ investigation of grain surfaces and interiors. Using this method, we have increased the number of FIB-SEM prepared sections that can be recovered from a particle with dimensions on the order of tens of µm. These sections can be subsequently analyzed using a variety of electron beam techniques. Here, we demonstrate this sample preparation technique on individual lunar regolith grains in order to study their space-weathered surfaces. We plan to extend these efforts to analyses of individual Hayabusa samples.

  13. Measurement of sodium concentration in sweat samples: comparison of 5 analytical techniques.

    Science.gov (United States)

    Goulet, Eric D B; Asselin, Audrey; Gosselin, Jonathan; Baker, Lindsay B

    2017-08-01

    Sweat sodium concentration (SSC) can be determined using different analytical techniques (ATs), which may have implications for athletes and scientists. This study compared the SSC measured with 5 ATs: ion chromatography (IChr), flame photometry (FP), direct (DISE) and indirect (IISE) ion-selective electrode, and ion conductivity (IC). Seventy sweat samples collected from 14 athletes were analyzed with 5 instruments: the 883 Basic IC Plus (IChr, reference instrument), AAnalyst 200 (FP), Cobas 6000 (IISE), Sweat-Chek (IC), and B-722 Laqua Twin (DISE). Instruments showed excellent relative (intraclass correlation coefficient (ICC) ≥ 0.999) and absolute (coefficient of variation (CV) ≤ 2.6%) reliability. Relative validity was also excellent between ATs (ICC ≥ 0.961). With regard to inter-AT absolute validity, compared with IChr, standard errors of the estimate were similar among ATs (2.8-3.8 mmol/L), but the CV was lowest with DISE (3.9%), intermediate with IISE (7.6%) and FP (6.9%), and highest with IC (12.3%). In conclusion, SSC varies depending on the AT used to analyze samples. Therefore, results obtained from different ATs are scarcely comparable and should not be used interchangeably. Nevertheless, taking into account the normal variability in SSC (∼±12%), the imprecision of the recommendations derived from FP, IISE, IC, and DISE should have trivial health and physiological consequences under most exercise circumstances.

  14. High Field In Vivo 13C Magnetic Resonance Spectroscopy of Brain by Random Radiofrequency Heteronuclear Decoupling and Data Sampling

    Science.gov (United States)

    Li, Ningzhi; Li, Shizhe; Shen, Jun

    2017-06-01

    In vivo 13C magnetic resonance spectroscopy (MRS) is a unique and effective tool for studying dynamic human brain metabolism and the cycling of neurotransmitters. One of the major technical challenges for in vivo 13C-MRS is the high radio frequency (RF) power necessary for heteronuclear decoupling. In the common practice of in vivo 13C-MRS, alkanyl carbons are detected in the spectral range of 10-65 ppm. The amplitude of the decoupling pulses has to be significantly greater than the large one-bond 1H-13C scalar coupling (1JCH = 125-145 Hz). Two main proton decoupling methods have been developed: broadband stochastic decoupling and coherent composite or adiabatic pulse decoupling (e.g., WALTZ); the latter is widely used because of its efficiency and superb performance under an inhomogeneous B1 field. Because the RF power required for proton decoupling increases quadratically with field strength, in vivo 13C-MRS using coherent decoupling is often limited to low magnetic fields, so that RF power deposition stays within the safety limits set by the US Food and Drug Administration (FDA). Alternatively, carboxylic/amide carbons are coupled to protons via weak long-range 1H-13C scalar couplings, which can be decoupled using low-power broadband stochastic decoupling. Recently, the carboxylic/amide 13C-MRS technique using low-power random RF heteronuclear decoupling was safely applied to human brain studies at 7T. Here, we review the two major decoupling methods and carboxylic/amide 13C-MRS with the low-power decoupling strategy. Further decreases in RF power deposition by frequency-domain windowing and time-domain random under-sampling are also discussed. Low RF power decoupling opens the possibility of performing in vivo 13C experiments on the human brain at very high magnetic fields (such as 11.7T), where the signal-to-noise ratio as well as spatial and temporal spectral resolution are more favorable than at lower fields.

  15. Sample-to-sample fluctuations of power spectrum of a random motion in a periodic Sinai model

    Science.gov (United States)

    Dean, David S.; Iorio, Antonio; Marinari, Enzo; Oshanin, Gleb

    2016-09-01

    The Sinai model of a tracer diffusing in a quenched Brownian potential is a much-studied problem exhibiting a logarithmically slow anomalous diffusion due to the growth of energy barriers with the system size. However, if the potential is random but periodic, the regime of anomalous diffusion crosses over to one of normal diffusion once a tracer has diffused over a few periods of the system. Here we consider a system in which the potential is given by a Brownian bridge on a finite interval (0, L) and then periodically repeated over the whole real line, and study the power spectrum S(f) of the diffusive process x(t) in such a potential. We show that for most realizations of x(t) in a given realization of the potential, the low-frequency behavior is S(f) ~ A/f², i.e., the same as for standard Brownian motion, and the amplitude A is a disorder-dependent random variable with finite support. Focusing on the statistical properties of this random variable, we determine the moments of A of arbitrary order k, negative or positive, and demonstrate that they exhibit a multifractal dependence on k and a rather unusual dependence on the temperature and on the periodicity L, which are supported by atypical realizations of the periodic disorder. We finally show that the distribution of A has a log-normal left tail and exhibits an essential singularity close to the right edge of the support, which is related to the Lifshitz singularity. Our findings are based both on analytic results and on extensive numerical simulations of the process x(t).
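    The baseline behavior S(f) ~ A/f² for free Brownian motion is easy to verify numerically. The sketch below is illustrative only (it is unrelated to the paper's simulations): it estimates the periodogram of a discrete random walk and fits the low-frequency power-law exponent, which should come out close to 2.

```python
import numpy as np

rng = np.random.default_rng(1)
N, dt = 2**16, 1.0
# Free Brownian motion: cumulative sum of i.i.d. Gaussian increments
x = np.cumsum(rng.standard_normal(N)) * np.sqrt(dt)

# One-sided periodogram estimate of the power spectrum S(f)
X = np.fft.rfft(x)
f = np.fft.rfftfreq(N, d=dt)
S = (np.abs(X) ** 2) * dt / N

# Fit log S = log A - alpha * log f over the low-frequency band;
# for Brownian motion the exponent alpha should be close to 2
mask = (f > 0) & (f < 1e-2)
slope, logA = np.polyfit(np.log(f[mask]), np.log(S[mask]), 1)
alpha = -slope
```

    For a tracer in the periodic Sinai potential, the same low-frequency fit would instead yield the disorder-dependent amplitude A whose statistics the paper studies.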

  16. Evaluation of a Jugular Venipuncture Alpaca Model to Teach the Technique of Blood Sampling in Adult Alpacas.

    Science.gov (United States)

    Rousseau, Marjolaine; Beauchamp, Guy; Nichols, Sylvain

    2017-01-01

    The effectiveness of teaching aids in veterinary medical education is not often assessed rigorously. The objective in the present study was to evaluate the effectiveness of a commercially available jugular venipuncture alpaca model as a complementary tool to teach veterinary students how to perform venipuncture in adult alpacas. We hypothesized that practicing on the model would allow veterinary students to draw blood in alpacas more rapidly with fewer attempts than students without previous practice on the model. Thirty-six third-year veterinary students were enrolled and randomly allocated to the model (group M; n=18) or the control group (group C; n=18). The venipuncture technique was taught to all students on day 0. Students in group M practiced on the model on day 2. On day 5, an evaluator blinded to group allocation evaluated the students' venipuncture skills during a practical examination using live alpacas. Success was defined as the aspiration of a 6-ml sample of blood. Measured outcomes included number of attempts required to achieve success (success score), total procedural time, and overall qualitative score. Success scores, total procedural time, and overall scores did not differ between groups. Use of restless alpacas reduced performance. The jugular venipuncture alpaca model failed to improve jugular venipuncture skills in this student population. Lack of movement represents a significant weakness of this training model.

  17. Closer to the native state. Critical evaluation of cryo-techniques for Transmission Electron Microscopy: preparation of biological samples.

    Science.gov (United States)

    Mielanczyk, Lukasz; Matysiak, Natalia; Michalski, Marek; Buldak, Rafal; Wojnicz, Romuald

    2014-01-01

    Over the years, Transmission Electron Microscopy (TEM) has evolved into a powerful technique for the structural analysis of cells and tissues at various levels of resolution. However, optimal sample preservation is required to achieve results consistent with reality. During the last few decades, conventional preparation methods have provided most of the knowledge about the ultrastructure of organelles, cells and tissues. Nevertheless, artefacts can be introduced at all stages of the standard electron microscopy preparation technique. Rapid freezing techniques, by contrast, preserve biological specimens as close as possible to the native state. Our review focuses on different cryo-preparation approaches, starting from vitrification methods, which depend on sample size. Afterwards, we discuss Cryo-Electron Microscopy Of VItreous Sections (CEMOVIS) and the main difficulties associated with this technique. Cryo-Focused Ion Beam (cryo-FIB) is described as a potential alternative to CEMOVIS. Another post-processing route for vitrified samples is freeze substitution and embedding in resin for structural analysis or immunolocalization analysis. Cryo-sectioning according to Tokuyasu is a technique dedicated to high-efficiency immunogold labelling. Finally, we introduce hybrid techniques, which combine the advantages of primary techniques originally dedicated to different approaches. Hybrid approaches permit the study of difficult-to-fix samples and antigens, and help optimize the sample preparation protocol for the integrated Laser and Electron Microscopy (iLEM) technique.

  18. Discriminative motif discovery via simulated evolution and random under-sampling.

    Directory of Open Access Journals (Sweden)

    Tao Song

    Conserved motifs in biological sequences are closely related to their structure and functions. Recently, discriminative motif discovery methods have attracted more and more attention. However, little attention has been devoted to the data imbalance problem, which is one of the main reasons affecting the performance of discriminative models. In this article, a simulated evolution method is applied to solve the multi-class imbalance problem at the data preprocessing stage, and at the Hidden Markov Model (HMM) training stage, a random under-sampling method is introduced to address the imbalance between the positive and negative datasets. It is shown that, in the task of discovering targeting motifs of nine subcellular compartments, the motifs found by our method are more conserved than those found by methods that do not consider the data imbalance problem, and recover most of the known targeting motifs from Minimotif Miner and InterPro. Meanwhile, we use the found motifs to predict protein subcellular localization and achieve higher prediction precision and recall for the minority classes.
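    Random under-sampling of the majority class, as used here before HMM training, can be sketched as follows. This is a generic Python illustration under the assumption of a simple list-based dataset; the function and variable names are hypothetical, not the authors' code.

```python
import random
from collections import Counter

def random_under_sample(X, y, rng=None):
    """Randomly under-sample the majority classes so that every class keeps
    only as many examples (drawn without replacement) as the smallest class."""
    rng = random.Random(rng)
    by_class = {}
    for xi, yi in zip(X, y):
        by_class.setdefault(yi, []).append(xi)
    n_min = min(len(items) for items in by_class.values())
    X_bal, y_bal = [], []
    for label, items in by_class.items():
        for xi in rng.sample(items, n_min):  # keep a random subset
            X_bal.append(xi)
            y_bal.append(label)
    return X_bal, y_bal

X = list(range(12))
y = ["neg"] * 9 + ["pos"] * 3   # a 9:3 imbalanced toy dataset
Xb, yb = random_under_sample(X, y, rng=0)
```

    The balanced dataset then gives the positive (motif) and negative (background) examples equal weight during model training, at the cost of discarding some majority-class data.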

  19. Accelerating Markov chain Monte Carlo simulation by differential evolution with self-adaptive randomized subspace sampling

    Energy Technology Data Exchange (ETDEWEB)

    Vrugt, Jasper A [Los Alamos National Laboratory; Hyman, James M [Los Alamos National Laboratory; Robinson, Bruce A [Los Alamos National Laboratory; Higdon, Dave [Los Alamos National Laboratory; Ter Braak, Cajo J F [NETHERLANDS; Diks, Cees G H [UNIV OF AMSTERDAM

    2008-01-01

    Markov chain Monte Carlo (MCMC) methods have found widespread use in many fields of study to estimate the average properties of complex systems, and for posterior inference in a Bayesian framework. Existing theory and experiments prove convergence of well-constructed MCMC schemes to the appropriate limiting distribution under a variety of different conditions. In practice, however, this convergence is often observed to be disturbingly slow. This is frequently caused by an inappropriate selection of the proposal distribution used to generate trial moves in the Markov chain. Here we show that significant improvements to the efficiency of MCMC simulation can be made by using a self-adaptive Differential Evolution learning strategy within a population-based evolutionary framework. This scheme, entitled DiffeRential Evolution Adaptive Metropolis or DREAM, runs multiple different chains simultaneously for global exploration, and automatically tunes the scale and orientation of the proposal distribution in randomized subspaces during the search. Ergodicity of the algorithm is proved, and various examples involving nonlinearity, high dimensionality, and multimodality show that DREAM is generally superior to other adaptive MCMC sampling approaches. The DREAM scheme significantly enhances the applicability of MCMC simulation to complex, multi-modal search problems.
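    The core idea behind DREAM, building proposals from the differences of randomly chosen chains in a population, can be illustrated with a minimal Differential Evolution Markov Chain sampler. This is a simplified sketch in the spirit of ter Braak's DE-MC, not the DREAM algorithm itself, which additionally adapts crossover in randomized subspaces and handles outlier chains.

```python
import numpy as np

def de_mc(log_prob, n_chains=10, n_iter=5000, d=1, rng=None):
    """Minimal Differential Evolution Markov Chain sampler: each chain
    proposes a jump along the difference of two other randomly chosen
    chains, so the proposal scale and orientation adapt to the target."""
    rng = np.random.default_rng(rng)
    gamma = 2.38 / np.sqrt(2 * d)           # near-optimal jump scale
    x = rng.standard_normal((n_chains, d))  # initial population
    lp = np.array([log_prob(xi) for xi in x])
    out = []
    for _ in range(n_iter):
        for i in range(n_chains):
            r1, r2 = rng.choice([j for j in range(n_chains) if j != i],
                                size=2, replace=False)
            prop = x[i] + gamma * (x[r1] - x[r2]) \
                   + 1e-4 * rng.standard_normal(d)   # small jitter term
            lp_prop = log_prob(prop)
            if np.log(rng.random()) < lp_prop - lp[i]:  # Metropolis accept
                x[i], lp[i] = prop, lp_prop
        out.append(x.copy())
    return np.concatenate(out[n_iter // 2:])  # discard first half as burn-in

# Sample a 1-D standard normal target
samples = de_mc(lambda x: -0.5 * float(x @ x), rng=0)
```

    Because the jump direction is taken from the current population, chains that have spread out over the posterior automatically generate proposals with an appropriate scale and orientation, which is exactly the self-adaptivity that DREAM exploits.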

  20. Comparing cluster-level dynamic treatment regimens using sequential, multiple assignment, randomized trials: Regression estimation and sample size considerations.

    Science.gov (United States)

    NeCamp, Timothy; Kilbourne, Amy; Almirall, Daniel

    2017-08-01

    Cluster-level dynamic treatment regimens can be used to guide sequential treatment decision-making at the cluster level in order to improve outcomes at the individual or patient-level. In a cluster-level dynamic treatment regimen, the treatment is potentially adapted and re-adapted over time based on changes in the cluster that could be impacted by prior intervention, including aggregate measures of the individuals or patients that compose it. Cluster-randomized sequential multiple assignment randomized trials can be used to answer multiple open questions preventing scientists from developing high-quality cluster-level dynamic treatment regimens. In a cluster-randomized sequential multiple assignment randomized trial, sequential randomizations occur at the cluster level and outcomes are observed at the individual level. This manuscript makes two contributions to the design and analysis of cluster-randomized sequential multiple assignment randomized trials. First, a weighted least squares regression approach is proposed for comparing the mean of a patient-level outcome between the cluster-level dynamic treatment regimens embedded in a sequential multiple assignment randomized trial. The regression approach facilitates the use of baseline covariates which is often critical in the analysis of cluster-level trials. Second, sample size calculators are derived for two common cluster-randomized sequential multiple assignment randomized trial designs for use when the primary aim is a between-dynamic treatment regimen comparison of the mean of a continuous patient-level outcome. The methods are motivated by the Adaptive Implementation of Effective Programs Trial which is, to our knowledge, the first-ever cluster-randomized sequential multiple assignment randomized trial in psychiatry.

  1. Validation of a non-invasive blood-sampling technique for doubly-labelled water experiments.

    Science.gov (United States)

    Voigt, Christian C; Helversen, Otto Von; Michener, Robert H; Kunz, Thomas H

    2003-04-01

Two techniques for bleeding small mammals have been used in doubly-labeled water (DLW) studies: venipuncture and the use of starved nymphal stages of hematophagous reduviid bugs (Reduviidae, Hemiptera). In this study, we tested the validity of using reduviid bugs in doubly-labeled water experiments. We found that the isotope enrichment in initial blood samples collected with bugs was significantly lower than the isotope enrichment in blood samples obtained by venipuncture. We therefore used the desiccation method for estimating total body water (TBW) in DLW experiments, because TBW calculated using the isotope dilution method was overestimated when blood samples were collected using reduviid bugs. In our validation experiment with nectar-feeding bats (Glossophaga soricina), we compared estimates of daily energy expenditure (DEE) using DLW with those derived from the energy balance method. We considered Speakman's equation (controlling for 25% fractionated water loss) the most appropriate for our study animal and calculated DEE accordingly. On average, DEE estimated with DLW was not significantly different from the mean value obtained with the energy balance method (mean deviation 1.2%). We conclude that although bug hemolymph or intestinal liquids most likely contaminate the samples, estimates of DEE are still valid because the DLW method does not depend on absolute isotope enrichments but on the rate of isotope decrease over time. However, dilution of blood with intestinal liquids or hemolymph from a bug may lead to larger variation in DEE estimates. We also tested how the relative error of DLW estimates changed with varying assumptions about fractionation, using three additional equations for calculating DEE in DLW experiments. The basic equation for DLW experiments published by Lifson and McClintock (LM-6), which assumes no fractionation, resulted in an overestimate of DEE by 10%. 
Nagy's equation (N-2) controls for changes in body mass but not for

  2. Boosting the FM-Index on the GPU: Effective Techniques to Mitigate Random Memory Access.

    Science.gov (United States)

    Chacón, Alejandro; Marco-Sola, Santiago; Espinosa, Antonio; Ribeca, Paolo; Moure, Juan Carlos

    2015-01-01

    The recent advent of high-throughput sequencing machines producing big amounts of short reads has boosted the interest in efficient string searching techniques. As of today, many mainstream sequence alignment software tools rely on a special data structure, called the FM-index, which allows for fast exact searches in large genomic references. However, such searches translate into a pseudo-random memory access pattern, thus making memory access the limiting factor of all computation-efficient implementations, both on CPUs and GPUs. Here, we show that several strategies can be put in place to remove the memory bottleneck on the GPU: more compact indexes can be implemented by having more threads work cooperatively on larger memory blocks, and a k-step FM-index can be used to further reduce the number of memory accesses. The combination of those and other optimisations yields an implementation that is able to process about two Gbases of queries per second on our test platform, being about 8 × faster than a comparable multi-core CPU version, and about 3 × to 5 × faster than the FM-index implementation on the GPU provided by the recently announced Nvidia NVBIO bioinformatics library.
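To make concrete why FM-index queries stress memory, here is a minimal backward-search sketch (a pure-Python toy of ours, not the paper's GPU implementation). Each pattern character triggers rank (`occ`) queries into the BWT at data-dependent positions, which is exactly the pseudo-random access pattern the paper sets out to mitigate; production indexes replace the linear scans below with sampled rank tables.

```python
def bwt(text):
    """Burrows-Wheeler transform via sorted rotations (fine for tiny texts)."""
    text += "$"                       # unique, lexicographically smallest sentinel
    rotations = sorted(text[i:] + text[:i] for i in range(len(text)))
    return "".join(r[-1] for r in rotations)

def fm_count(text, pattern):
    """Count occurrences of pattern in text by FM-index backward search."""
    L = bwt(text)
    # C[c]: number of characters in the text strictly smaller than c
    C, total = {}, 0
    for c in sorted(set(L)):
        C[c] = total
        total += L.count(c)
    def occ(c, i):                    # rank query: occurrences of c in L[:i]
        return L[:i].count(c)         # real indexes use sampled rank tables here
    lo, hi = 0, len(L)                # current suffix-array interval of matches
    for c in reversed(pattern):       # extend the match right-to-left
        lo = C.get(c, 0) + occ(c, lo)
        hi = C.get(c, 0) + occ(c, hi)
        if lo >= hi:                  # interval empty: no occurrences
            return 0
    return hi - lo
```

For example, `fm_count("abracadabra", "abra")` finds both occurrences of "abra"; note that every `occ` call lands at a position determined by the previous step, so the lookups cannot be prefetched in a regular pattern.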

  3. Brief Group Intervention Using Emotional Freedom Techniques for Depression in College Students: A Randomized Controlled Trial

    Directory of Open Access Journals (Sweden)

    Dawson Church

    2012-01-01

Two hundred thirty-eight first-year college students were assessed using the Beck Depression Inventory (BDI). Thirty students meeting the BDI criteria for moderate to severe depression were randomly assigned to either a treatment or control group. The treatment group received four 90-minute group sessions of EFT (Emotional Freedom Techniques), a novel treatment that combines exposure, cognitive reprocessing, and somatic stimulation. The control group received no treatment. Posttests were conducted 3 weeks later on those who completed all requirements (N=18). The EFT group (n=9) had significantly more depression at baseline than the control group (n=9) (EFT BDI mean=23.44, SD=2.1 versus control BDI mean=20.33, SD=2.1). After controlling for baseline BDI score, the EFT group had significantly less depression than the control group at posttest, with a mean score in the “nondepressed” range (P=.001; EFT BDI mean=6.08, SE=1.8 versus control BDI mean=18.04, SE=1.8). Cohen's d was 2.28, indicating a very strong effect size. These results are consistent with those noted in other studies of EFT that included an assessment for depression and indicate the clinical usefulness of EFT as a brief, cost-effective, and efficacious treatment.

  4. A two-stage noise source identification technique based on a farfield random parametric array.

    Science.gov (United States)

    Bai, Mingsian R; Chen, You Siang; Lo, Yi-Yang

    2017-05-01

A farfield random array is implemented for noise source identification. Microphone positions are optimized with the aid of the simulated annealing method. A two-stage localization and separation algorithm is devised on the basis of the equivalent source method (ESM). In the localization stage, the active source regions are located by using the delay-and-sum method, followed by a parametric localization procedure based on the stochastic maximum likelihood algorithm. Multidimensional nonlinear optimization is exploited in the bearing estimation process. In the separation stage, source amplitudes are extracted by formulating an inverse problem based on the previously identified source bearings. The number of equivalent sources is selected to be less than that of the microphones to render an overdetermined problem which can be readily solved by using Tikhonov regularization. Alternatively, the separation problem can be augmented into an underdetermined problem which can be solved by using the compressive sensing technique. Traditionally, farfield arrays only give a relative distribution of the source field. However, by using the proposed method, acoustic variables including sound pressure, particle velocity, sound intensity, and sound power can be calculated based on the ESM. Numerical and experimental results of several objective and subjective tests are presented.

  5. A cosmetic evaluation of breast cancer treatment: a randomized study of radiotherapy boost technique.

    Science.gov (United States)

    Vass, Sylvie; Bairati, Isabelle

    2005-08-01

    To compare cosmetic results of two different radiotherapy (RT) boost techniques used in the treatment of breast cancer after whole breast radiotherapy and to identify factors affecting cosmetic outcomes. Between 1996 and 1998, 142 patients with Stage I and II breast cancer were treated with breast conservative surgery and adjuvant RT. Patients were then randomly assigned to receive a boost dose of 15 Gy delivered to the tumor bed either by iridium 192, or a combination of photons and electrons. Cosmetic evaluations were done on a 6-month basis, with a final evaluation at 36 months after RT. The evaluations were done using a panel of global and specific subjective scores, a digitized scoring system using the breast retraction assessment (BRA) measurement, and a patient's self-assessment evaluation. As cosmetic results were graded according to severity, the comparison of boost techniques was done using the ordinal logistic regression model. Adjusted odds ratios (OR) and their 95% confidence intervals (CI) are presented. At 36 months of follow-up, there was no significant difference between the two groups with respect to the global subjective cosmetic outcome (OR = 1.40; 95%CI = 0.69-2.85, p = 0.35). Good to excellent scores were observed in 65% of implant patients and 62% of photon/electron patients. At 24 months and beyond, telangiectasia was more severe in the implant group with an OR of 9.64 (95%CI = 4.05-22.92, p cosmetic outcome was the presence of concomitant chemotherapy (OR = 3.87; 95%CI = 1.74-8.62). The BRA value once adjusted for age, concomitant chemotherapy, and boost volume showed a positive association with the boost technique. The BRA value was significantly greater in the implant group (p = 0.03). There was no difference in the patient's final self-assessment score between the two groups. Three variables were statistically associated with an adverse self-evaluation: an inferior quadrant tumor localization, postoperative hematoma, and concomitant

  6. Testing the applicability of six macroscopic skeletal aging techniques on a modern Southeast Asian sample.

    Science.gov (United States)

    Gocha, Timothy P; Ingvoldstad, Megan E; Kolatorowicz, Adam; Cosgriff-Hernandez, Meghan-Tomasita J; Sciulli, Paul W

    2015-04-01

    Most macroscopic skeletal aging techniques used by forensic anthropologists have been developed and tested only on reference material from western populations. This study examined the performance of six aging techniques on a known age sample of 88 Southeast Asian individuals. Methods examined included the Suchey-Brooks method of aging the symphyseal face of the os pubis (Brooks and Suchey, Hum. Evol. 5 (1990) 227), Buckberry and Chamberlain's, Am. J. Phys. Anthropol. 119 (2002) 231 and Osborne et al.'s, J. Forensic Sci. 49 (2004) 1 revisions of the Lovejoy et al., Am. J. Phys. Anthropol. 68 (1985) 15 method of aging the auricular surface of the ilium, İşcan et al.'s, J. Forensic Sci. 29 (1984) 1094, İşcan et al.'s, J. Forensic Sci. 30 (1985) 853 method of aging the sternal end of the fourth rib, and Meindl and Lovejoy's, Am. J. Phys. Anthropol. 68 (1985) 57 methods for aging both lateral-anterior and vault sutures on the cranium. The results of this study indicate that application of aging techniques commonly used in forensic anthropology to individuals identified as Asian, and more specifically Southeast Asian, should not be undertaken injudiciously. Of the six individual methods tested here, the Suchey-Brooks pubic symphysis aging method performs best, though average age estimates were still off by nearly 10 years or greater. Methods for aging the auricular surface perform next best, though the Osborne et al. method works better for individuals below 50 years and the Buckberry and Chamberlain method works better for those above 50 years. Methods for age estimation from the sternal ends of the fourth rib and vault and lateral-anterior cranial sutures perform poorly and are not recommended for use on remains of Southeast Asian ancestry. Combining age estimates from multiple indicators, specifically the pubic symphysis and one auricular surface method, was superior to individual methods. Data and a worked example are provided for calculating the conditional

  7. Study of Event-based Sampling Techniques and Their Influence on Greenhouse Climate Control with Wireless Sensors Network

    OpenAIRE

    Pawlowski, Andrzej; Guzman, Jose L.; Rodriguez, Francisco; Berenguel, Manuel; Sanchez, Jose; Dormido, Sebastian

    2010-01-01

This paper presents a study of event-based sampling techniques and their application to the greenhouse climate control problem. It was possible to obtain important information about data transmission and control performance for all techniques. In conclusion, it was deduced

  8. Development of novel separation techniques for biological samples in capillary electrophoresis

    Energy Technology Data Exchange (ETDEWEB)

    Chang, Huan -Tsung [Iowa State Univ., Ames, IA (United States)

    1994-07-27

This dissertation includes three different topics: a general introduction to capillary electrophoresis (CE); gradients in CE and CE in biological separations; and capillary gel electrophoresis (CGE) for DNA separation. Factors affecting separation performance, such as temperature, viscosity, pH, and the surface of the capillary walls, are demonstrated. A pH gradient between 3.0 and 5.2 is useful for improving the resolution among eight different organic acids. A flow gradient, produced by changing the concentration of a surfactant that coats the capillary wall and thereby alters the flow rate and its direction, is also shown to be a good way to improve the resolution of organic compounds. A temperature gradient caused by Joule heating, imposed through voltage programming, is shown to enhance the resolution and shorten the separation time for several phenolic compounds. The author also shows that self-regulating dynamic control of electroosmotic flow in CE, achieved simply by running separations in different concentrations of surfactant, reduces matrix effects on separation performance. One of the most important demonstrations in this dissertation is the proposed on-column reaction, which offers several advantages, including the use of a small amount of sample, low risk of contamination, time savings, and access to kinetic information. The author uses this idea with laser-induced fluorescence (LIF) detection to monitor the on-column digestion of sub-ng quantities of protein. This technique is also applied to single-cell analysis in the group.

  9. Techniques for the detection of pathogenic Cryptococcus species in wood decay substrata and the evaluation of viability in stored samples

    Directory of Open Access Journals (Sweden)

    Christian Alvarez

    2013-02-01

In this study, we evaluated several techniques for the detection of the yeast form of Cryptococcus in decaying wood and measured the viability of these fungi in environmental samples stored in the laboratory. Samples were collected from a tree known to be positive for Cryptococcus and were each inoculated on 10 Niger seed agar (NSA) plates. The conventional technique (CT) yielded a greater number of positive samples and indicated a higher fungal density [in colony-forming units per gram of wood (CFU.g-1)] compared to the humid swab technique (ST). However, the difference in positive and false-negative results between the CT and ST was not significant. The threshold of detection for the CT was 0.05 × 10³ CFU.g-1, while the threshold for the ST was greater than 0.1 × 10³ CFU.g-1. No colonies were recovered using the dry swab technique. We also determined the viability of Cryptococcus in wood samples stored for 45 days at 25ºC using the CT and ST and found that the samples not only continued to yield a positive response but also exhibited an increase in CFU.g-1, suggesting that Cryptococcus is able to grow in stored environmental samples. The ST.1, in which samples collected with swabs were immediately plated on NSA medium, was more efficient and less laborious than either the CT or ST and required approximately 10 min to perform; however, additional studies are needed to validate this technique.

  10. Contributions from the data samples in NOC technique on the extracting of the Sq variation

    Science.gov (United States)

    Wu, Yingyan; Xu, Wenyao

    2015-04-01

The solar quiet daily variation, Sq, a rather regular variation, is usually observed at mid-low latitudes on magnetically quiet or less-disturbed days. It results mainly from dynamo currents in the ionospheric E region, which are driven by atmospheric tidal winds and other processes and flow as two current whorls in each of the northern and southern hemispheres[1]. Sq exhibits a conspicuous day-to-day (DTD) variability in daily range (or strength), shape (or phase), and the position of its current focus. This variability is mainly attributed to changes in ionospheric conductivity and tidal winds, which vary with solar radiation and ionospheric conditions; Sq also presents a seasonal variation and a solar-cycle variation[2-4]. Generally, Sq is expressed as the average over the five international magnetically quiet days. Using data from global magnetic stations, the equivalent current system of the daily variation can be constructed to reveal characteristics of the currents[5]. In addition, using the differences of the H component at two stations on the north and south sides of the Sq current focus, Sq can be extracted much better[6]. Recently, the method of Natural Orthogonal Components (NOC) has been used to decompose the magnetic daily variation into a summation of eigenmodes, identifying the first NOC eigenmode with the solar quiet daily variation and the second with the disturbance daily variation[7-9]. The NOC technique can help reveal simpler patterns within a complex set of variables without prescribed basis functions such as those of the FFT, but the physical interpretation of the NOC eigenmodes depends greatly on the number of data samples and their regularity. Using the NOC method, we focus our present study on the analysis of the hourly means of the H component at BMT observatory in China from 2001 to 2008. The contributions of the number and regularity of the data samples to determining which eigenmode corresponds to Sq are analyzed.
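Because an NOC (natural orthogonal components) analysis is essentially an eigen-decomposition of the data matrix, its leading eigenmode can be sketched with a short power iteration over a days-by-hours matrix. The code below is our illustration, not the authors' procedure; the names are ours, and a real Sq analysis would add quiet-day selection and baseline removal before decomposing.

```python
import math

def first_noc_mode(daily, n_iter=100):
    """Leading natural orthogonal component of a days-by-hours matrix X:
    power iteration on X^T X yields the dominant hourly shape f, and the
    projection of each day's row onto f gives its day-to-day amplitude."""
    n_days, n_hours = len(daily), len(daily[0])
    # deterministic start vector; must not be orthogonal to the leading mode
    f = [1.0 + 0.1 * h for h in range(n_hours)]
    for _ in range(n_iter):
        a = [sum(x * fx for x, fx in zip(row, f)) for row in daily]   # a = X f
        g = [sum(a[d] * daily[d][h] for d in range(n_days))           # g = X^T a
             for h in range(n_hours)]
        norm = math.sqrt(sum(v * v for v in g))
        f = [v / norm for v in g]                                     # unit hourly shape
    amplitudes = [sum(x * fx for x, fx in zip(row, f)) for row in daily]
    return amplitudes, f
```

For a synthetic rank-one field amp[d] * shape[h] (one diurnal shape whose strength varies from day to day, as Sq roughly does), the recovered amplitudes reproduce the day-to-day ratios exactly, up to a common sign and scale.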

  11. Neural tension technique is no different from random passive movements in reducing spasticity in patients with traumatic brain injury

    DEFF Research Database (Denmark)

    Lorentzen, Jakob; Nielsen, Dorthe; Holm, Karl

    2012-01-01

    Purpose: Neural tension technique (NTT) is a therapy believed to reduce spasticity and to increase range of motion (ROM). This study compared the ability of NTT and random passive movements (RPMs) to reduce spasticity in the knee flexors in 10 spastic patients with brain injury. Methods: An RCT...

  12. A large-area ultra-precision 2D geometrical measurement technique based on statistical random phase detection

    Science.gov (United States)

    Ekberg, Peter; Stiblert, Lars; Mattsson, Lars

    2012-03-01

The manufacturing of high-quality chrome masks used in the display industry for the manufacturing of liquid crystals, organic light-emitting diodes and other display devices would not be possible without high-precision large-area metrology. In contrast to the semiconductor industry where 6″ masks are most common, the quartz glass masks for the manufacturing of large-area TVs can have sizes of up to 1.6 × 1.8 m². Besides the large area, there are demands of sub-micrometer accuracy in ‘registration’, i.e. absolute dimensional measurements, and nanometer requirements for ‘overlay’, i.e. repeatability. The technique for making such precise measurements on large masks is one of the most challenging tasks in dimensional metrology today. This paper presents a new approach to two-dimensional (2D) ultra-precision measurements based on random sampling. The technique was recently presented for ultra-precise one-dimensional (1D) measurement. The 1D method relies on timing the scanning of a focused laser beam 200 µm in the Y-direction from an interferometrically determined reference position. This microsweep is controlled by an acousto-optical deflector. By letting the microsweep scan from random X-positions, we can build XY-recordings through a time-to-space conversion that gives very precise maps of the feature edges of the masks. The method differs considerably from ordinary image-processing methods using CCD or CMOS sensors for capturing images in the spatial domain. We use events grabbed by a single detector in the time domain in both the X- and Y-directions. After a simple scaling, we get precise and repeatable spatial information. Thanks to the extremely linear microsweep and its precise power control, spatial and intensity distortions, common in ordinary image-processing systems using 2D optics and 2D sensors, can be practically eliminated. Our 2D method has proved to give a standard deviation in repeatability of less than 4 nm (1σ) in both the X- and Y-directions.

  13. Alexander Technique Lessons or Acupuncture Sessions for Persons With Chronic Neck Pain: A Randomized Trial.

    Science.gov (United States)

    MacPherson, Hugh; Tilbrook, Helen; Richmond, Stewart; Woodman, Julia; Ballard, Kathleen; Atkin, Karl; Bland, Martin; Eldred, Janet; Essex, Holly; Hewitt, Catherine; Hopton, Ann; Keding, Ada; Lansdown, Harriet; Parrott, Steve; Torgerson, David; Wenham, Aniela; Watt, Ian

    2015-11-03

    Management of chronic neck pain may benefit from additional active self-care-oriented approaches. To evaluate clinical effectiveness of Alexander Technique lessons or acupuncture versus usual care for persons with chronic, nonspecific neck pain. Three-group randomized, controlled trial. (Current Controlled Trials: ISRCTN15186354). U.K. primary care. Persons with neck pain lasting at least 3 months, a score of at least 28% on the Northwick Park Questionnaire (NPQ) for neck pain and associated disability, and no serious underlying pathology. 12 acupuncture sessions or 20 one-to-one Alexander lessons (both 600 minutes total) plus usual care versus usual care alone. NPQ score (primary outcome) at 0, 3, 6, and 12 months (primary end point) and Chronic Pain Self-Efficacy Scale score, quality of life, and adverse events (secondary outcomes). 517 patients were recruited, and the median duration of neck pain was 6 years. Mean attendance was 10 acupuncture sessions and 14 Alexander lessons. Between-group reductions in NPQ score at 12 months versus usual care were 3.92 percentage points for acupuncture (95% CI, 0.97 to 6.87 percentage points) (P = 0.009) and 3.79 percentage points for Alexander lessons (CI, 0.91 to 6.66 percentage points) (P = 0.010). The 12-month reductions in NPQ score from baseline were 32% for acupuncture and 31% for Alexander lessons. Participant self-efficacy improved for both interventions versus usual care at 6 months (P Alexander lessons, 3.33 percentage points [CI, 2.22 to 4.44 percentage points]). No reported serious adverse events were considered probably or definitely related to either intervention. Practitioners belonged to the 2 main U.K.-based professional associations, which may limit generalizability of the findings. Acupuncture sessions and Alexander Technique lessons both led to significant reductions in neck pain and associated disability compared with usual care at 12 months. Enhanced self-efficacy may partially explain why longer

  14. Effectiveness of a Treatment Involving Soft Tissue Techniques and/or Neural Mobilization Techniques in the Management of Tension-Type Headache: A Randomized Controlled Trial.

    Science.gov (United States)

    Ferragut-Garcías, Alejandro; Plaza-Manzano, Gustavo; Rodríguez-Blanco, Cleofás; Velasco-Roldán, Olga; Pecos-Martín, Daniel; Oliva-Pascual-Vaca, Jesús; Llabrés-Bennasar, Bartomeu; Oliva-Pascual-Vaca, Ángel

    2017-02-01

    To evaluate the effects of a protocol involving soft tissue techniques and/or neural mobilization techniques in the management of patients with frequent episodic tension-type headache (FETTH) and those with chronic tension-type headache (CTTH). Randomized, double-blind, placebo-controlled before and after trial. Rehabilitation area of the local hospital and a private physiotherapy center. Patients (N=97; 78 women, 19 men) diagnosed with FETTH or CTTH were randomly assigned to groups A, B, C, or D. (A) Placebo superficial massage; (B) soft tissue techniques; (C) neural mobilization techniques; (D) a combination of soft tissue and neural mobilization techniques. The pressure pain threshold (PPT) in the temporal muscles (points 1 and 2) and supraorbital region (point 3), the frequency and maximal intensity of pain crisis, and the score in the Headache Impact Test-6 (HIT-6) were evaluated. All variables were assessed before the intervention, at the end of the intervention, and 15 and 30 days after the intervention. Groups B, C, and D had an increase in PPT and a reduction in frequency, maximal intensity, and HIT-6 values in all time points after the intervention as compared with baseline and group A (P<.001 for all cases). Group D had the highest PPT values and the lowest frequency and HIT-6 values after the intervention. The application of soft tissue and neural mobilization techniques to patients with FETTH or CTTH induces significant changes in PPT, the characteristics of pain crisis, and its effect on activities of daily living as compared with the application of these techniques as isolated interventions. Copyright © 2016 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.

  15. Efficient sampling techniques for uncertainty quantification in history matching using nonlinear error models and ensemble level upscaling techniques

    KAUST Repository

    Efendiev, Y.

    2009-11-01

Markov chain Monte Carlo (MCMC) is a rigorous sampling method to quantify uncertainty in subsurface characterization. However, MCMC usually requires many flow and transport simulations to evaluate the posterior distribution and can be computationally expensive for fine-scale geological models. We propose a methodology that combines coarse- and fine-scale information to improve the efficiency of MCMC methods. The proposed method employs off-line computations for modeling the relation between coarse- and fine-scale error responses. This relation is modeled using nonlinear functions with prescribed error precisions, which are used for efficient sampling within the MCMC framework. We propose a two-stage MCMC in which inexpensive coarse-scale simulations are performed to determine whether or not to run the fine-scale (resolved) simulations. The latter is determined on the basis of a statistical model developed off line. The proposed method is an extension of approaches considered earlier in which linear relations are used to model the response between coarse-scale and fine-scale models. The approach considered here does not rely on the proximity of the approximate and resolved models and can employ much coarser and more inexpensive models to guide the fine-scale simulations. Numerical results for three-phase flow and transport demonstrate the advantages, efficiency, and utility of the method for uncertainty assessment in history matching. Copyright 2009 by the American Geophysical Union.
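The two-stage idea can be sketched in a few lines: screen each proposal with the cheap coarse posterior, and only evaluate the expensive fine posterior for proposals that survive, with a second accept/reject step that corrects the approximation so the chain still targets the fine-scale posterior. This is a generic toy sketch (scalar state, Gaussian random walk), not the authors' reservoir-simulation code; all names are ours.

```python
import math
import random

def two_stage_mcmc(log_post_fine, log_post_coarse, x0, n_steps, step=0.5, seed=1):
    """Two-stage Metropolis-Hastings: a cheap coarse posterior screens each
    random-walk proposal; the expensive fine posterior is evaluated only for
    proposals that pass, with a second test that corrects the approximation
    so the chain still targets the fine posterior."""
    rng = random.Random(seed)
    x, lf, lc = x0, log_post_fine(x0), log_post_coarse(x0)
    chain, fine_evals = [x], 1
    for _ in range(n_steps):
        y = x + rng.gauss(0.0, step)
        lcy = log_post_coarse(y)
        # Stage 1: accept/reject against the coarse posterior only.
        if math.log(rng.random()) < lcy - lc:
            lfy = log_post_fine(y)            # the expensive model runs only here
            fine_evals += 1
            # Stage 2: fine/coarse correction preserves the fine target.
            if math.log(rng.random()) < (lfy - lf) - (lcy - lc):
                x, lf, lc = y, lfy, lcy
        chain.append(x)
    return chain, fine_evals
```

With a deliberately biased coarse model (e.g. a slightly shifted Gaussian approximating a standard normal), the chain still samples the fine posterior, while the fine model is evaluated only on the fraction of steps that pass the coarse screen.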

  16. A Novel Randomized Search Technique for Multiple Mobile Robot Paths Planning In Repetitive Dynamic Environment

    Directory of Open Access Journals (Sweden)

    Vahid Behravesh

    2012-08-01

This article studies path planning for multiple robots. The presented approach combines a priority scheme with a robust method for path finding in repetitive dynamic environments. The model is generally applicable: no restriction is assumed on the robots' number of degrees of freedom, and robots of different kinds can operate at the same time. We propose a randomized search combined with a hill-climbing technique over priority orderings, which is used to find a solution to a given trajectory-planning problem and to reduce the total path length. The method plans trajectories for individual robots in the space-time domain, taking into account the positions of static obstacles as well as the other robots and the lengths of the paths traversed. The risk of robots colliding with each other is measured with a probabilistic model of robot motion. The algorithm was applied to real robots with successful results; the proposed method was evaluated both on real robots and in simulation. We performed a series of 100 tests with 8 robots to compare against a coordination method; current performance is effective, although further optimization is still possible. Performance was measured under Windows on a 3-GHz Intel Pentium IV, with code compiled with GCC 3.4. Our PCGA robot was used for all experiments. For a large environment of 19 × 15 m², where we carried out 40 tests, the model is able to plan high-quality paths in a very short time (less than a second). Moreover, lookup tables are used to store the costs incurred by previously planned robots, so increasing the number of robots does not increase computation time.

  17. EFT (Emotional Freedom Techniques) and Resiliency in Veterans at Risk for PTSD: A Randomized Controlled Trial.

    Science.gov (United States)

    Church, Dawson; Sparks, Terry; Clond, Morgan

    2016-01-01

    Prior research indicates elevated but subclinical posttraumatic stress disorder (PTSD) symptoms as a risk factor for a later diagnosis of PTSD. This study examined the progression of symptoms in 21 subclinical veterans. Participants were randomized into a treatment as usual (TAU) wait-list group and an experimental group, which received TAU plus six sessions of clinical emotional freedom techniques (EFT). Symptoms were assessed using the PCL-M (Posttraumatic Checklist-Military) on which a score of 35 or higher indicates increased risk for PTSD. The mean pretreatment score of participants was 39 ± 8.7, with no significant difference between groups. No change was found in the TAU group during the wait period. Afterward, the TAU group received an identical clinical EFT protocol. Posttreatment groups were combined for analysis. Scores declined to a mean of 25 (-64%, P < .0001). Participants maintained their gains, with mean three-month and six-month follow-up PCL-M scores of 27 (P < .0001). Similar reductions were noted in the depth and breadth of psychological conditions such as anxiety. A Cohen's d = 1.99 indicates a large treatment effect. Reductions in traumatic brain injury symptoms (P = .045) and insomnia (P = .004) were also noted. Symptom improvements were similar to those assessed in studies of PTSD-positive veterans. EFT may thus be protective against an increase in symptoms and a later PTSD diagnosis. As a simple and quickly learned self-help method, EFT may be a clinically useful element of a resiliency program for veterans and active-duty warriors. Copyright © 2016 Elsevier Inc. All rights reserved.

  18. Evaluation of alternative macroinvertebrate sampling techniques for use in a new tropical freshwater bioassessment scheme

    OpenAIRE

    Moore, Isabel Eleanor; Murphy, Kevin Joseph

    2015-01-01

    Aim: The study aimed to determine the effectiveness of benthic macroinvertebrate dredge net sampling procedures as an alternative method to kick net sampling in tropical freshwater systems, specifically as an evaluation of sampling methods used in the Zambian Invertebrate Scoring System (ZISS) river bioassessment scheme. Tropical freshwater ecosystems are sometimes dangerous or inaccessible to sampling teams using traditional kick-sampling methods, so identifying an alternative procedure that...

  19. Experiments with central-limit properties of spatial samples from locally covariant random fields

    Science.gov (United States)

    Barringer, T.H.; Smith, T.E.

    1992-01-01

When spatial samples are statistically dependent, the classical estimator of sample-mean standard deviation is well known to be inconsistent. For locally dependent samples, however, consistent estimators of sample-mean standard deviation can be constructed. The present paper investigates the sampling properties of one such estimator, designated as the tau estimator of sample-mean standard deviation. In particular, the asymptotic normality properties of standardized sample means based on tau estimators are studied in terms of computer experiments with simulated sample-mean distributions. The effects of both sample size and dependency levels among samples are examined for various values of tau (denoting the size of the spatial kernel for the estimator). The results suggest that even for small degrees of spatial dependency, the tau estimator exhibits significantly stronger normality properties than does the classical estimator of standardized sample means.
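The flavor of such a kernel-truncated estimator can be illustrated with a small sketch: estimate the variance of the sample mean by summing empirical covariances only over pairs of observations within distance tau of each other, so that local spatial dependence enters the standard-error estimate. This reconstruction is ours and purely illustrative; the paper's tau estimator may weight pairs differently.

```python
import math

def sample_mean_sd(values, coords, tau):
    """Estimate the standard deviation of the sample mean for spatially
    dependent data: sum empirical covariances over all pairs of points
    within distance tau (a kernel-truncated, Newey-West-style estimator).
    With tau = 0 it reduces to the classical independent-sample form."""
    n = len(values)
    m = sum(values) / n
    dev = [v - m for v in values]
    s = 0.0
    for i in range(n):
        for j in range(n):
            if math.dist(coords[i], coords[j]) <= tau:   # spatial kernel of size tau
                s += dev[i] * dev[j]
    return math.sqrt(max(s, 0.0)) / n                    # clip: truncation can go negative
```

Widening tau pulls nearby (and hence correlated) neighbours into the sum, which is what restores consistency when the classical estimator ignores that dependence.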

  20. LONG-TERM VARIABILITY OF BRONCHIAL RESPONSIVENESS TO HISTAMINE IN A RANDOM-POPULATION SAMPLE OF ADULTS

    NARCIS (Netherlands)

    RIJCKEN, B; SCHOUTEN, JP; WEISS, ST; ROSNER, B; DEVRIES, K; VANDERLENDE, R

    1993-01-01

    Long-term variability of bronchial responsiveness has been studied in a random population sample of adults. During a follow-up period of 18 yr, 2,216 subjects contributed 5,012 observations to the analyses. Each subject could have as many as seven observations. Bronchial responsiveness was assessed

  1. Albumin to creatinine ratio in a random urine sample: Correlation with severity of preeclampsia

    Directory of Open Access Journals (Sweden)

    Fady S. Moiety

    2014-06-01

    Conclusions: Random urine ACR may be a reliable method for prediction and assessment of severity of preeclampsia. Using the estimated cut-off may add to the predictive value of such a simple quick test.

  2. Evaluation of a Class of Simple and Effective Uncertainty Methods for Sparse Samples of Random Variables and Functions

    Energy Technology Data Exchange (ETDEWEB)

    Romero, Vicente [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bonney, Matthew [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Schroeder, Benjamin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Weirs, V. Gregory [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-11-01

    When very few samples of a random quantity are available from a source distribution of unknown shape, it is usually not possible to accurately infer the exact distribution from which the data samples come. Under-estimation of important quantities such as response variance and failure probabilities can result. For many engineering purposes, including design and risk analysis, we attempt to avoid under-estimation with a strategy to conservatively estimate (bound) these types of quantities -- without being overly conservative -- when only a few samples of a random quantity are available from model predictions or replicate experiments. This report examines a class of related sparse-data uncertainty representation and inference approaches that are relatively simple, inexpensive, and effective. Tradeoffs between the methods' conservatism, reliability, and risk versus number of data samples (cost) are quantified with multi-attribute metrics used to assess method performance for conservative estimation of two representative quantities: central 95% of response; and 10^-4 probability of exceeding a response threshold in a tail of the distribution. Each method's performance is characterized with 10,000 random trials on a large number of diverse and challenging distributions. The best method and number of samples to use in a given circumstance depends on the uncertainty quantity to be estimated, the PDF character, and the desired reliability of bounding the true value. On the basis of this large database and study, a strategy is proposed for selecting the method and number of samples for attaining reasonable credibility levels in bounding these types of quantities when sparse samples of random variables or functions are available from experiments or simulations.
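
    The report's specific methods are not reproduced in the abstract. As one illustrative conservative device of the kind discussed (an assumption, not the report's method), a one-sided chi-square upper confidence bound on the standard deviation can be computed from a handful of samples; the Wilson-Hilferty quantile approximation below avoids a SciPy dependency.

```python
import math
from statistics import NormalDist

def chi2_quantile(p, k):
    """Wilson-Hilferty approximation to the chi-square quantile with k degrees
    of freedom (accurate to a few percent; avoids SciPy)."""
    z = NormalDist().inv_cdf(p)
    return k * (1.0 - 2.0 / (9.0 * k) + z * math.sqrt(2.0 / (9.0 * k))) ** 3

def conservative_sigma(samples, confidence=0.95):
    """One-sided upper confidence bound on the population standard deviation:
    sigma_UB = s * sqrt((n-1) / chi2_{1-confidence, n-1}).
    With few samples the bound is much larger than the point estimate s,
    guarding against the under-estimation the report warns about."""
    n = len(samples)
    mean = sum(samples) / n
    s2 = sum((x - mean) ** 2 for x in samples) / (n - 1)
    q = chi2_quantile(1.0 - confidence, n - 1)
    return math.sqrt(s2 * (n - 1) / q)

data = [9.8, 10.4, 10.1, 9.6]          # four sparse replicate observations (invented)
mean = sum(data) / len(data)
s = math.sqrt(sum((x - mean) ** 2 for x in data) / (len(data) - 1))
print(s, conservative_sigma(data))      # the bound exceeds the point estimate
```

    With n = 4 the 95% bound is roughly three times the sample standard deviation, illustrating the conservatism-versus-sample-count tradeoff the report quantifies.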

  3. Sampling

    CERN Document Server

    Thompson, Steven K

    2012-01-01

    Praise for the Second Edition "This book has never had a competitor. It is the only book that takes a broad approach to sampling . . . any good personal statistics library should include a copy of this book." —Technometrics "Well-written . . . an excellent book on an important subject. Highly recommended." —Choice "An ideal reference for scientific researchers and other professionals who use sampling." —Zentralblatt Math Features new developments in the field combined with all aspects of obtaining, interpreting, and using sample data Sampling provides an up-to-date treat

  4. Advanced analytical techniques for the measurement of nanomaterials in complex samples: a comparison

    NARCIS (Netherlands)

    Peters, R.J.B.; Herrera-Rivera, Z.; Bouwmeester, H.; Weigel, S.; Marvin, H.J.P.

    2014-01-01

    To solve the various analytical challenges related to the measurement of nanomaterials in complex matrices new advanced analytical techniques must be developed. In this study an interlaboratory exercise was organised to compare the capabilities and limitations of newly developed techniques with

  5. Efficacy and complications associated with a modified inferior alveolar nerve block technique. A randomized, triple-blind clinical trial.

    Science.gov (United States)

    Montserrat-Bosch, Marta; Figueiredo, Rui; Nogueira-Magalhães, Pedro; Arnabat-Dominguez, Josep; Valmaseda-Castellón, Eduard; Gay-Escoda, Cosme

    2014-07-01

    To compare the efficacy and complication rates of two different techniques for inferior alveolar nerve blocks (IANB). A randomized, triple-blind clinical trial comprising 109 patients who required lower third molar removal was performed. In the control group, all patients received an IANB using the conventional Halsted technique, whereas in the experimental group, a modified technique using a more inferior injection point was performed. A total of 100 patients were randomized. The modified technique group showed a significantly higher onset time in the lower lip and chin area, and was frequently associated with a lingual electric discharge sensation. Three failures were recorded, 2 of them in the experimental group. No relevant local or systemic complications were registered. Both IANB techniques used in this trial are suitable for lower third molar removal. However, performing an inferior alveolar nerve block in a more inferior position (modified technique) extends the onset time, does not seem to reduce the risk of intravascular injections and might increase the risk of lingual nerve injuries.

  6. Efficacy and complications associated with a modified inferior alveolar nerve block technique. A randomized, triple-blind clinical trial

    Science.gov (United States)

    Montserrat-Bosch, Marta; Nogueira-Magalhães, Pedro; Arnabat-Dominguez, Josep; Valmaseda-Castellón, Eduard; Gay-Escoda, Cosme

    2014-01-01

    Objectives: To compare the efficacy and complication rates of two different techniques for inferior alveolar nerve blocks (IANB). Study Design: A randomized, triple-blind clinical trial comprising 109 patients who required lower third molar removal was performed. In the control group, all patients received an IANB using the conventional Halsted technique, whereas in the experimental group, a modified technique using a more inferior injection point was performed. Results: A total of 100 patients were randomized. The modified technique group showed a significantly higher onset time in the lower lip and chin area, and was frequently associated with a lingual electric discharge sensation. Three failures were recorded, 2 of them in the experimental group. No relevant local or systemic complications were registered. Conclusions: Both IANB techniques used in this trial are suitable for lower third molar removal. However, performing an inferior alveolar nerve block in a more inferior position (modified technique) extends the onset time, does not seem to reduce the risk of intravascular injections and might increase the risk of lingual nerve injuries. Key words: Dental anesthesia, inferior alveolar nerve block, lidocaine, third molar, intravascular injection. PMID:24608204

  7. Validation of the k-filtering technique for a signal composed of random-phase plane waves and non-random coherent structures

    Directory of Open Access Journals (Sweden)

    O. W. Roberts

    2014-12-01

    Recent observations of astrophysical magnetic fields have shown the presence of fluctuations that are wave-like (propagating in the plasma frame) and those described as structure-like (advected by the plasma bulk velocity). Typically with single-spacecraft missions it is impossible to differentiate between these two fluctuations, due to the inherent spatio-temporal ambiguity associated with a single-point measurement. However, missions such as Cluster which contain multiple spacecraft have allowed temporal and spatial changes to be resolved, using techniques such as k-filtering. While this technique does not assume Taylor's hypothesis, it requires both weak stationarity of the time series and that the fluctuations can be described by a superposition of plane waves with random phases. In this paper we test whether the method can cope with a synthetic signal which is composed of a combination of non-random-phase coherent structures with a mean radius d and a mean separation λ, as well as plane waves with random phase.
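
    A minimal one-dimensional analogue of such a synthetic signal can be built as random-phase plane waves plus Gaussian "coherent structures" of mean radius d and mean separation λ. This is a hedged sketch: the study's actual test signal is multi-point and multi-dimensional, and every parameter below is an illustrative assumption.

```python
import numpy as np

rng = np.random.default_rng(42)
x = np.linspace(0.0, 100.0, 2048)

# Wave-like component: superposition of plane waves with random phases
waves = np.zeros_like(x)
for k in rng.uniform(0.5, 5.0, size=20):
    waves += np.cos(k * x + rng.uniform(0.0, 2.0 * np.pi))

# Structure-like component: Gaussian blobs of mean radius d, mean separation lam,
# placed at jittered (non-random-phase, i.e. fixed) positions
d, lam = 1.5, 20.0
centers = np.arange(10.0, 100.0, lam) + rng.normal(0.0, 2.0, size=5)
structures = sum(np.exp(-((x - c) ** 2) / (2.0 * d ** 2)) for c in centers)

signal = waves + structures
print(signal.shape)
```

    The structure component violates the random-phase assumption of k-filtering, which is exactly the property the paper probes.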

  8. Beyond Random Walk and Metropolis-Hastings Samplers: Why You Should Not Backtrack for Unbiased Graph Sampling

    CERN Document Server

    Lee, Chul-Ho; Eun, Do Young

    2012-01-01

    Graph sampling via crawling has been actively considered as a generic and important tool for collecting uniform node samples so as to consistently estimate and uncover various characteristics of complex networks. The so-called simple random walk with re-weighting (SRW-rw) and the Metropolis-Hastings (MH) algorithm have been popular in the literature for such unbiased graph sampling. However, an unavoidable downside of their core random walks -- slow diffusion over the space -- can cause poor estimation accuracy. In this paper, we propose the non-backtracking random walk with re-weighting (NBRW-rw) and the MH algorithm with delayed acceptance (MHDA), which are theoretically guaranteed to achieve, at almost no additional cost, not only unbiased graph sampling but also higher efficiency (smaller asymptotic variance of the resulting unbiased estimators) than the SRW-rw and the MH algorithm, respectively. In particular, a remarkable feature of the MHDA is its applicability for any non-uniform node sampling like the MH algorithm,...
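
    A simplified sketch of the non-backtracking-walk-with-re-weighting idea (ignoring the paper's MHDA variant; the toy graph and function names are assumptions): on a non-bipartite undirected graph the walk visits nodes in proportion to their degree, so weighting each visit by 1/degree recovers a uniform node average.

```python
import random

def nbrw_estimate(adj, f, steps, seed=0):
    """Non-backtracking random walk with re-weighting (NBRW-rw, simplified).
    The walk never returns to the node it just left (except at dead ends).
    Visit frequencies are proportional to degree, so weighting each visit
    by 1/degree yields an asymptotically unbiased estimate of the uniform
    node average of f."""
    rng = random.Random(seed)
    v = rng.choice(list(adj))
    prev = None
    num = den = 0.0
    for _ in range(steps):
        w = 1.0 / len(adj[v])            # importance weight 1/degree
        num += w * f(v)
        den += w
        nbrs = [u for u in adj[v] if u != prev]
        if not nbrs:                     # dead end: allow backtracking
            nbrs = adj[v]
        prev, v = v, rng.choice(nbrs)
    return num / den

# Toy graph: a triangle with a pendant node (non-uniform degrees)
adj = {0: [1, 2, 3], 1: [0, 2], 2: [0, 1], 3: [0]}
est = nbrw_estimate(adj, lambda v: v, steps=200_000)
print(est)   # uniform average of node ids is (0+1+2+3)/4 = 1.5
```

    The paper's contribution is that removing backtracking lowers the asymptotic variance of this kind of estimator at essentially no extra cost per step.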

  9. Remnant preservation in anterior cruciate ligament reconstruction versus standard techniques: a meta-analysis of randomized controlled trials.

    Science.gov (United States)

    Ma, Tianjun; Zeng, Chun; Pan, Jianying; Zhao, Chang; Fang, Hang; Cai, Daozhang

    2017-01-01

    Preserving the remnant during anterior cruciate ligament (ACL) reconstruction is considered beneficial for graft healing, but it might increase the technical difficulties and complications. This study was to compare outcomes of using the technique of remnant preservation during the ACL reconstruction versus the standard procedure with the debridement of remnant. We searched PubMed and EMBASE and the Cochrane Library for randomized controlled trials comparing the outcomes of ACL reconstruction both with and without remnant preservation. The risk of bias was assessed in accordance with the Cochrane Collaboration's risk of bias tool. Meta-analysis was performed to compare results. Six randomized controlled trials with 346 patients were included. Statistically significant differences in favor of using technique of remnant preservation were observed for Lysholm Score, arthrometer measurements, and tibial tunnel enlargement. There was no significant difference between remnant technique of preservation and the standard procedure with respect to the IKDC (International Knee Documentation Committee) grade, IKDC score, Lachman Test, Pivot-shift Test, range of motion (ROM), and the incidence of the cyclops lesion. This meta-analysis of randomized controlled trials showed that ACL reconstruction with technique of remnant preservation cannot provide superior clinical outcomes compared with the standard procedure.

  10. Soil map disaggregation improved by soil-landscape relationships, area-proportional sampling and random forest implementation

    DEFF Research Database (Denmark)

    Møller, Anders Bjørn; Malone, Brendan P.; Odgers, Nathan

    …of European Communities (CEC, 1985), respectively, both using the FAO 1974 classification. Furthermore, the effects of implementing soil-landscape relationships, using area-proportional sampling instead of per-polygon sampling, and replacing the default C5.0 classification tree algorithm with a random forest algorithm were evaluated. The resulting maps were validated on 777 soil profiles situated in a grid covering Denmark. The experiments showed that the results obtained with Jacobsen's map were more accurate than the results obtained with the CEC map, despite a nominally coarser scale of 1:2,000,000 vs. 1:1,000,000. This finding is probably related to the fact that Jacobsen's map was more detailed, with a larger number of polygons, soil map units and soil types, despite its coarser scale. The results showed that the implementation of soil-landscape relationships, area-proportional sampling and the random forest…
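
    As an illustration of the area-proportional sampling idea mentioned in this record (a hedged sketch; the polygon areas and the largest-remainder rounding scheme are assumptions, not the study's implementation), training samples can be allocated to map polygons in proportion to their area rather than one fixed quota per polygon:

```python
import numpy as np

def allocate_samples(areas, n_total):
    """Area-proportional allocation: each polygon receives a share of the
    n_total training samples proportional to its area, with largest-remainder
    rounding so the counts sum exactly to n_total."""
    areas = np.asarray(areas, float)
    quota = n_total * areas / areas.sum()
    counts = np.floor(quota).astype(int)
    remainder = n_total - counts.sum()
    # hand leftover samples to the polygons with the largest fractional parts
    order = np.argsort(-(quota - counts))
    counts[order[:remainder]] += 1
    return counts

# Hypothetical polygon areas in km^2 (the Danish maps' real polygons differ)
areas = [120.0, 45.0, 300.0, 35.0]
counts = allocate_samples(areas, n_total=1000)
print(counts, counts.sum())   # larger polygons get proportionally more samples
```

    Compared with per-polygon sampling, this keeps large polygons from being under-represented in the training set for the disaggregation classifier.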

  11. Comparison of three techniques in the preparation of samples for the crystallization of cervical flow in lactating dairy cattle

    Directory of Open Access Journals (Sweden)

    Reátegui J

    2017-08-01

    The objective was to compare three techniques of sample preparation for cervical flow crystallization (pressure imprint, touch imprint and smear), analyzing the tree-like forms (crystallization) as a characterization of the cervical flow of dairy cattle, according to the day of collection and the moment of the estrous cycle. Ten clinically healthy, multiparous Holstein Friesian dairy cows, 30 to 50 days postpartum, were sampled. Cervical flow was collected from the vaginal cervix of each cow using a disposable pipette and a 50-cc syringe. The two imprints and the smear were prepared on slides, with a two-step protocol according to the methodology of Prado et al. (2012). The samples were then allowed to dry at ambient conditions and read under the microscope with a higher-magnification objective (40X) to observe the formation of the crystals; these procedures were performed at four different moments of the estrous cycle (days 0, 7, 14 and 21). To quantify the crystallization, a scale from 0 to 4 was used, ranging from formation of typical crystals to little or no formation. On days 0, 7, 14 and 21, the crystallization level differed significantly among the three techniques (P < 0.05). On day 0, 50% of the samples processed by the touch imprint and pressure imprint showed typical formation, compared to 20% of those processed by the smear technique. On day 7, 80% of the samples processed by touch imprint, 90% by the smear technique and 70% by pressure imprint presented atypical crystals. On day 14, 60% of the samples processed by the touch imprint, and 30% and 40% of the samples processed by the smear technique and pressure imprint, respectively, showed atypical crystal formation. On day 21, 40% of the samples processed by the touch imprint and 10% of the samples processed by the smear technique and pressure imprint showed typical crystal formation.
It is concluded that the sample preparation technique influences the crystallization of cervical mucus, being the most

  12. Does sample length influence the shape of xylem embolism vulnerability curves? A test with the Cavitron spinning technique.

    Science.gov (United States)

    Cochard, Hervé; Herbette, Stéphane; Barigah, Têtè; Badel, Eric; Ennajeh, Mustapha; Vilagrosa, Alberto

    2010-09-01

    The Cavitron spinning technique is used to construct xylem embolism vulnerability curves (VCs), but its reliability has been questioned for species with long vessels. This technique generates two types of VC: sigmoid 's'-shaped and exponential, levelling-off 'r'-shaped curves. We tested the hypothesis that 'r'-shaped VCs were anomalous and caused by the presence of vessels cut open during sample preparation. A Cavitron apparatus was used to construct VCs from samples of different lengths in species with contrasting vessel lengths. The results were compared with VCs obtained using other independent techniques. When vessel length exceeded sample length, VCs were 'r'-shaped and anomalous. Filling vessels cut open at both ends with air before measurement produced more typical 's'-shaped VCs. We also found that exposing segments of 11 woody species in a Cavitron at the pressure measured in planta before sampling considerably increased the degree of embolism above the native state level for species with long vessels. We concluded that open vessels were abnormally more vulnerable to cavitation than intact vessels. We recommend restricting this technique to species with short conduits. The relevance of our conclusions for other spinning techniques is discussed.

  13. Comparison of coarse coal dust sampling techniques in a laboratory-simulated longwall section.

    Science.gov (United States)

    Patts, Justin R; Barone, Teresa L

    2017-05-01

    Airborne coal dust generated during mining can deposit and accumulate on mine surfaces, presenting a dust explosion hazard. When assessing dust hazard mitigation strategies for airborne dust reduction, sampling is done in high-velocity ventilation air, which is used to purge the mining face and gallery tunnel. In this environment, the sampler inlet velocity should be matched to the air stream velocity (isokinetic sampling) to prevent oversampling of coarse dust at low sampler-to-air velocity ratios. Low velocity ratios are often encountered when using low flow rate, personal sampling pumps commonly used in underground mines. In this study, with a goal of employing mine-ready equipment, a personal sampler was adapted for area sampling of coarse coal dust in high-velocity ventilation air. This was done by adapting an isokinetic nozzle to the inlet of an Institute of Occupational Medicine (Edinburgh, Scotland) sampling cassette (IOM). Collected dust masses were compared for the modified IOM isokinetic sampler (IOM-MOD), the IOM without the isokinetic nozzle, and a conventional dust sampling cassette without the cyclone on the inlet. All samplers were operated at a flow rate typical of personal sampling pumps: 2 L/min. To ensure differences between collected masses that could be attributed to sampler design and were not influenced by artifacts from dust concentration gradients, relatively uniform and repeatable dust concentrations were demonstrated in the sampling zone of the National Institute for Occupational Safety and Health experimental mine gallery. Consistent with isokinetic theory, greater differences between isokinetic and non-isokinetic sampled masses were found for larger dust volume-size distributions and higher ventilation air velocities. 
Since isokinetic sampling is conventionally used to determine total dust concentration, and isokinetic sampling made a difference in collected masses, the results suggest when sampling for coarse coal dust the IOM-MOD may
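
    The isokinetic condition described above (sampler inlet velocity equal to the air-stream velocity) fixes the nozzle cross-section as A = Q / U. A small sketch computes the required inner diameter; the 2 L/min flow rate matches the study, while the ventilation velocities are assumed illustrative values.

```python
import math

def isokinetic_nozzle_diameter(flow_lpm, air_velocity_ms):
    """Inner diameter (mm) of a sampling nozzle whose inlet velocity matches
    the duct air velocity: A = Q / U, so d = sqrt(4 Q / (pi U))."""
    q = flow_lpm / 1000.0 / 60.0             # L/min -> m^3/s
    d = math.sqrt(4.0 * q / (math.pi * air_velocity_ms))
    return d * 1000.0                        # m -> mm

for u in (1.0, 2.0, 4.0):                    # assumed ventilation velocities, m/s
    print(f"{u} m/s -> {isokinetic_nozzle_diameter(2.0, u):.2f} mm")
```

    Higher ventilation velocities demand a smaller nozzle at a fixed pump flow, which is why a low-flow personal sampler tends toward sub-isokinetic (oversampling) conditions unless the inlet is adapted.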

  14. Dorsal penile nerve block for male pediatric circumcision--randomized comparison of ultrasound-guided vs anatomical landmark technique.

    Science.gov (United States)

    O'Sullivan, Michael J; Mislovic, Branislav; Alexander, Elise

    2011-12-01

    Dorsal penile nerve block (DPNB) is a commonly performed regional anesthetic technique for male circumcision. Traditionally, DPNB is based on an anatomical landmark technique. Recently, an ultrasound-guided technique for DPNB has been described. The aim of our study was to compare the anatomical landmark technique with this ultrasound-guided technique. The hypothesis to be tested was that ultrasound guidance of DPNB would lead to less administration of opioid when compared to the anatomical landmark technique. Boys of ASA status I/II scheduled for day case circumcision were prospectively recruited and randomized. DPNB was performed under general anesthesia using the anatomical landmark technique or ultrasound guidance. Fentanyl was administered intraoperatively and immediately postoperatively if patients demonstrated signs of pain. Similarly, oral codeine was given prior to discharge if required. The primary outcome measure was the number of patients requiring fentanyl. Secondary outcome measures included initial pain score on emergence from general anesthesia, requirement for codeine predischarge, and time to perform block. A total of 32 patients were recruited to the landmark group and 34 to the ultrasound group. There was no significant difference between the two groups in terms of fentanyl administration. The ultrasound technique took longer to perform but was associated with a reduction in codeine requirement prior to discharge. This study does not support the routine use of ultrasound for the performance of DPNB in male pediatric circumcision. Nonetheless, an associated reduction in codeine administration postoperatively suggests some benefit in terms of postoperative pain. © 2011 Blackwell Publishing Ltd.

  15. Large loop conformation sampling using the activation relaxation technique, ART-nouveau method.

    Science.gov (United States)

    St-Pierre, Jean-François; Mousseau, Normand

    2012-07-01

    We present an adaptation of the ART-nouveau energy surface sampling method to the problem of loop structure prediction. This method, previously used to study protein folding pathways and peptide aggregation, is well suited to the problem of sampling the conformation space of large loops by targeting probable folding pathways instead of sampling exhaustively that space. The number of sampled conformations needed by ART nouveau to find the global energy minimum for a loop was found to scale linearly with the sequence length of the loop for loops between 8 and about 20 amino acids. Considering the linear scaling dependence of the computation cost on the loop sequence length for sampling new conformations, we estimate the total computational cost of sampling larger loops to scale quadratically compared to the exponential scaling of exhaustive search methods. Copyright © 2012 Wiley Periodicals, Inc.

  16. Augmentation vs Nonaugmentation Techniques for Open Repairs of Achilles Tendon Ruptures with Early Functional Treatment: A Prospective Randomized Study.

    Science.gov (United States)

    Tezeren, Gündüz; Kuru, Ilhami

    2006-01-01

    A prospective randomized study was conducted in order to compare an augmentation technique versus a nonaugmentation technique, followed by early functional postoperative treatment, for operative repair of Achilles tendon ruptures. Twenty-four consecutive patients were assigned to two groups. Group I included 12 patients treated with the Lindholm augmentation technique, whereas group II included 12 patients treated with modified Kessler end-to-end repair. Thereafter, these patients had postoperative management with a below-knee cast for three weeks. Physiotherapy was initiated immediately after the cast was removed. Full weight bearing was allowed five weeks postoperatively in both groups. Two patients had reruptures in group II, whereas group I had significantly prolonged operative time. The patients with reruptures underwent reoperations and, at the final follow-up, it was observed that they could not return to sporting activities. The other objective and subjective results were similar between the two groups. Because of the quite high rerupture rate in the group of patients treated with the nonaugmentation technique, we favor functional postoperative treatment with early ankle movement in patients treated with the augmentation technique for the management of acute rupture of the Achilles tendon.

  17. Comparison of the efficacy of two anesthetic techniques of mandibular primary first molar: A randomized clinical trial

    Directory of Open Access Journals (Sweden)

    Davood Ghasemi Tudeshchoie

    2013-01-01

    Background: The most common technique to anesthetize mandibular primary teeth is the inferior alveolar (I.A) nerve block injection, which induces relatively sustained anesthesia and in turn may traumatize soft tissues. Therefore, the need for an alternative technique of anesthesia with a shorter duration but the same efficacy is reasonable. The aim of this study was to compare the efficacy of two anesthetic techniques for the mandibular primary first molar. Materials and Methods: In this randomized crossover clinical trial, 40 children aged 5 to 8 years whose mandibular primary first molars were eligible for pulpotomy were selected and divided randomly into two groups. The right and left mandibular first molars of group A were anesthetized with infiltration and I.A nerve block techniques in the first and second sessions, respectively. The left and right mandibular first molars of group B were anesthetized with I.A nerve block and infiltration techniques in the first and second sessions, respectively. The severity of pain was measured and recorded according to the sound-eye-motor scale by the same person. Data were analyzed using Wilcoxon Signed Rank and Mann-Whitney U tests (P < 0.05). Results: The severity of pain was lower with the infiltration technique than with the I.A nerve block. There were no significant differences between the severities of pain on pulpal exposure between the two techniques. Conclusion: It seems that the infiltration technique is more favorable for anesthetizing the mandibular primary first molar compared to the I.A nerve block.

  18. Changes in selected biochemical indices resulting from various pre-sampling handling techniques in broilers

    National Research Council Canada - National Science Library

    Chloupek, Petr; Bedanova, Iveta; Chloupek, Jan; Vecerek, Vladimir

    2011-01-01

    .... This study focused on detection of changes in the corticosterone level and concentrations of other selected biochemical parameters in broilers handled in two different manners during blood sampling...

  19. Application of jade samples for high-dose dosimetry using the EPR technique

    Energy Technology Data Exchange (ETDEWEB)

    Teixeira, Maria Ines [Instituto de Pesquisas Energeticas e Nucleares/Comissao Nacional de Energia Nuclear Av. Prof. Lineu Prestes 2242, 05508-000, Sao Paulo (Brazil)], E-mail: miteixei@ipen.br; Melo, Adeilson P. [Instituto de Pesquisas Energeticas e Nucleares/Comissao Nacional de Energia Nuclear Av. Prof. Lineu Prestes 2242, 05508-000, Sao Paulo (Brazil); Centro Federal de Educacao Tecnologica de Sergipe, Aracaju (Brazil)], E-mail: adeilson_pessoa_melo@yahoo.com.br; Ferraz, Gilberto M. [Depto. de Fisica Nuclear, Instituto de Fisica, Universidade de Sao Paulo, Sao Paulo (Brazil)], E-mail: gmarconf@if.usp.br; Caldas, Linda V.E. [Instituto de Pesquisas Energeticas e Nucleares/Comissao Nacional de Energia Nuclear Av. Prof. Lineu Prestes 2242, 05508-000, Sao Paulo (Brazil)], E-mail: lcaldas@ipen.br

    2010-04-15

    The dosimetric characteristics of jade samples were studied for application in high-dose dosimetry. Jade is the common denomination of two silicates: jadeite and actinolite. The EPR spectra of different jade samples were obtained after irradiation with absorbed doses of 100 Gy up to 20 kGy. The jade samples present signals that increase with the absorbed dose (g-factors around 2.00); they can be attributed to electron centers. The EPR spectra obtained for the USA jade samples and their main dosimetric properties, such as reproducibility, calibration curves and energy dependence, were investigated.

  20. Demonstrating Reliable High Level Waste Slurry Sampling Techniques to Support Hanford Waste Processing

    Energy Technology Data Exchange (ETDEWEB)

    Kelly, Steven E.

    2013-11-11

    The Hanford Tank Operations Contractor (TOC) and the Hanford Waste Treatment and Immobilization Plant (WTP) contractor are both engaged in demonstrating mixing, sampling, and transfer system capability using simulated Hanford High-Level Waste (HLW) formulations. This work represents one of the remaining technical issues with the high-level waste treatment mission at Hanford. The TOC must demonstrate the ability to adequately mix and sample high-level waste feed to meet the WTP Waste Acceptance Criteria and Data Quality Objectives. The sampling method employed must support both TOC and WTP requirements. To facilitate information transfer between the two facilities, the mixing and sampling demonstrations are led by the One System Integrated Project Team. The One System team's Waste Feed Delivery Mixing and Sampling Program has developed a full-scale sampling loop to demonstrate sampler capability. This paper discusses the full-scale sampling loop's ability to meet precision and accuracy requirements, including lessons learned during testing. Results of the testing showed that the Isolok(R) sampler chosen for implementation provides precise, repeatable results. The Isolok(R) sampler accuracy as tested did not meet the test success criteria. Review of the test data and the test platform by a sampling expert following testing identified several issues regarding the sampler used to provide the reference material for judging the Isolok's accuracy. Recommendations were made to obtain new data to evaluate the sampler's accuracy utilizing a reference sampler that follows good sampling protocol.

  1. Effects of myofascial release techniques on pain, physical function, and postural stability in patients with fibromyalgia: a randomized controlled trial.

    Science.gov (United States)

    Castro-Sánchez, Adelaida María; Matarán-Peñarrocha, Guillermo A; Arroyo-Morales, Manuel; Saavedra-Hernández, Manuel; Fernández-Sola, Cayetano; Moreno-Lorenzo, Carmen

    2011-09-01

    To determine the effect of myofascial release techniques on pain symptoms, postural stability and physical function in fibromyalgia syndrome. A randomized, placebo-controlled trial was undertaken. Eighty-six patients with fibromyalgia syndrome were randomly assigned to an experimental group and a placebo group. Patients received treatments for 20 weeks. The experimental group underwent 10 myofascial release modalities and the placebo group received sham short-wave and ultrasound electrotherapy. Outcome variables were number of tender points, pain, postural stability, physical function, clinical severity and global clinical assessment of improvement. Outcome measures were assessed before and immediately after the intervention, and at six months and one year after the last session of the corresponding intervention. After 20 weeks of myofascial therapy, the experimental group showed a significant improvement (P < 0.05). Myofascial release techniques can be a complementary therapy for pain symptoms, physical function and clinical severity but do not improve postural stability in patients with fibromyalgia syndrome.

  2. Randomization tests

    CERN Document Server

    Edgington, Eugene

    2007-01-01

    Contents: Statistical Tests That Do Not Require Random Sampling; Randomization Tests; Numerical Examples; Randomization Tests and Nonrandom Samples; The Prevalence of Nonrandom Samples in Experiments; The Irrelevance of Random Samples for the Typical Experiment; Generalizing from Nonrandom Samples; Intelligibility; Respect for the Validity of Randomization Tests; Versatility; Practicality; Precursors of Randomization Tests; Other Applications of Permutation Tests; Questions and Exercises; Notes; References; Randomized Experiments; Unique Benefits of Experiments; Experimentation without Mani
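
    The book's core idea can be illustrated with a minimal two-sample randomization test (an illustrative sketch with invented data): significance comes from re-randomizing the group labels, not from random sampling of a population.

```python
import random

def randomization_test(a, b, n_perm=10_000, seed=1):
    """Two-sample randomization test: the p-value is the fraction of random
    reassignments of the pooled observations to groups whose absolute mean
    difference is at least as large as the observed one. No random sampling
    from a population is assumed -- only the random assignment itself."""
    rng = random.Random(seed)
    observed = abs(sum(a) / len(a) - sum(b) / len(b))
    pooled = list(a) + list(b)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        pa, pb = pooled[:len(a)], pooled[len(a):]
        if abs(sum(pa) / len(pa) - sum(pb) / len(pb)) >= observed:
            hits += 1
    return (hits + 1) / (n_perm + 1)         # add-one (conservative) correction

treatment = [23, 25, 28, 31, 30]
control = [18, 20, 22, 19, 24]
p = randomization_test(treatment, control)
print(p)    # small p-value: the group difference is unlikely under relabeling
```

    This is exactly a test that "does not require random sampling" in the book's sense: the reference distribution is generated by the experiment's own randomization.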

  3. Bottle Traps and Dipnetting: Evaluation of two Sampling Techniques for Assessing Macroinvertebrate Biodiversity in Depressional Wetlands.

    Science.gov (United States)

    Serieyssol, C. A.; Bouchard, R. W.; Sealock, A. W.; Rufer, M. M.; Chirhart, J.; Genet, J.; Ferrington, L. C.

    2005-05-01

    Dipnet (DN) sampling is routinely employed for macroinvertebrate bioassessments; however, some taxa have been shown to be more effectively sampled with activity traps, commonly called Bottle Traps (BT). In 2001, the Minnesota Pollution Control Agency used both DN and BT sampling in nine depressional wetlands in the North Central Hardwood Forest Ecoregion to evaluate macroinvertebrate biodiversity for the purpose of assessing water quality and developing biological criteria. Samples were collected with both methods, consisting of five bottle trap samples and two dipnet samples per wetland, from each of two sites in each wetland. To determine the performance of each method in documenting biodiversity, we compared taxa and their abundances by wetland for each type of sample. DN sampling was more effective, with 44 of 140 macroinvertebrate taxa identified only from DN, compared to 14 only from BT. By contrast, BT more effectively collected leeches and beetles, especially active swimmers such as Tropisternus and several genera of Dytiscidae. However, taxa richness patterns for BT and DN were not strongly correlated. Consequently, we conclude these two sampling methods complement each other, providing a better overall picture of macroinvertebrate biodiversity, and should be used jointly when investigating macroinvertebrate biodiversity in depressional wetlands.

  4. Behavioural sampling techniques and activity pattern of Indian Pangolin Manis crassicaudata (Mammalia: Manidae in captivity

    Directory of Open Access Journals (Sweden)

    R.K. Mohapatra

    2013-12-01

    Full Text Available The study presents data on six Indian Pangolins Manis crassicaudata observed in captivity at the Pangolin Conservation Breeding Centre, Nandankanan, Odisha, India, over 1377 hours of video recordings for each pangolin between 1500hr and 0800hr on 81 consecutive observational days. Video recordings were made through digital systems assisted by infrared-enabled CCTV cameras. The data highlight patterns relating to 12 different behaviours and enclosure utilization. Different interval periods for sampling of instantaneous behaviour from video recordings have been evaluated to develop optimal study methods for the future. The activity budgets of pangolins displayed natural patterns of nocturnal activity with a peak between 20:00-21:00 hr. When out of their burrow, they spent about 59% of the time walking in the enclosure and 14% of the time feeding. The repeatability of the behaviours has a significant negative correlation with the mean time spent in that behaviour. Focal behavioural samples significantly correlated with instantaneous samples up to a 15-minute interval. The correlation values gradually decreased with increasing sampling interval. The results indicate that estimates obtained from focal sampling and instantaneous sampling with relatively short intervals (≤5 minutes) are about equally reliable. The study suggests use of focal sampling, instead of instantaneous sampling, to record behaviour relating to social interactions.

  5. Comparison of mobile and stationary spore-sampling techniques for estimating virulence frequencies in aerial barley powdery mildew populations

    DEFF Research Database (Denmark)

    Hovmøller, M.S.; Munk, L.; Østergård, Hanne

    1995-01-01

    Gene frequencies in samples of aerial populations of barley powdery mildew (Erysiphe graminis f.sp. hordei), which were collected in adjacent barley areas and in successive periods of time, were compared using mobile and stationary sampling techniques. Stationary samples were collected from trap...... plants in three periods within 1 week at a distance of more than 1000 m from the nearest barley field. At four dates within the same 8-day period, other samples were collected by a mobile spore trap along four sampling routes of a total distance of 130 km around the stationary stand of exposure...... resistance genes, indicating a different distribution of source varieties along routes. There was no difference between allele frequencies at different dates, indicating that the proportions of spores from different source varieties were similar at these dates. In conclusion, samples collected...

  6. Finite-sample corrected generalized estimating equation of population average treatment effects in stepped wedge cluster randomized trials.

    Science.gov (United States)

    Scott, JoAnna M; deCamp, Allan; Juraska, Michal; Fay, Michael P; Gilbert, Peter B

    2017-04-01

    Stepped wedge designs are increasingly commonplace and advantageous for cluster randomized trials when it is both unethical to assign placebo, and it is logistically difficult to allocate an intervention simultaneously to many clusters. We study marginal mean models fit with generalized estimating equations for assessing treatment effectiveness in stepped wedge cluster randomized trials. This approach has advantages over the more commonly used mixed models that (1) the population-average parameters have an important interpretation for public health applications and (2) they avoid untestable assumptions on latent variable distributions and avoid parametric assumptions about error distributions, therefore, providing more robust evidence on treatment effects. However, cluster randomized trials typically have a small number of clusters, rendering the standard generalized estimating equation sandwich variance estimator biased and highly variable and hence yielding incorrect inferences. We study the usual asymptotic generalized estimating equation inferences (i.e., using sandwich variance estimators and asymptotic normality) and four small-sample corrections to generalized estimating equation for stepped wedge cluster randomized trials and for parallel cluster randomized trials as a comparison. We show by simulation that the small-sample corrections provide improvement, with one correction appearing to provide at least nominal coverage even with only 10 clusters per group. These results demonstrate the viability of the marginal mean approach for both stepped wedge and parallel cluster randomized trials. We also study the comparative performance of the corrected methods for stepped wedge and parallel designs, and describe how the methods can accommodate interval censoring of individual failure times and incorporate semiparametric efficient estimators.

  7. Ultrasonography-guided radial artery catheterization is superior compared with the traditional palpation technique: a prospective, randomized, blinded, crossover study.

    Science.gov (United States)

    Hansen, M A; Juhl-Olsen, P; Thorn, S; Frederiksen, C A; Sloth, E

    2014-04-01

    Radial artery catheterization is gaining popularity for diagnostic and interventional procedures. Palpation technique is widely used for the procedure, but ultrasonography has been shown to increase catheterization success. A recently described ultrasonography technique is termed 'dynamic needle tip positioning'. We aimed to compare the traditional palpation technique and dynamic needle tip positioning technique in regard to clinically relevant end points. The study was conducted as a randomized, patient-blinded, crossover study. Patients underwent bilateral radial artery catheterization using both techniques. The primary end point of the study was needle manipulation time. Additional end points were (1) the number of skin perforations, (2) the number of attempts targeting the vessel, (3) the number of catheters placed in first attempt and (4) the number of catheters used. Forty patients were analyzed. There was no significant difference in median needle manipulation time [32 s (range 11-96 s) vs. 39 s (range 9-575 s), P = 0.525], although the variance was lower in the dynamic needle tip positioning group (P palpation technique group, a higher number of skin perforations (57 vs. 40, P = 0.003), catheters (46 vs. 40, P = 0.025) and attempts targeting the vessel (104 vs. 43, P technique for radial artery catheterization significantly improves clinically relevant aspects of the procedure. © 2014 The Acta Anaesthesiologica Scandinavica Foundation. Published by John Wiley & Sons Ltd.

  8. Analysis of hair samples using microscopical and molecular techniques to ascertain claims of rare animal species.

    Science.gov (United States)

    Zafarina, Zainuddin; Panneerchelvam, Sundararajulu

    2009-07-01

    An unidentified animal species named the Jenglot and claimed to be a rare living animal species was recently found in the deep jungle of Irian Jaya, Indonesia; brought to Kuala Lumpur, Malaysia by a businessman; and exhibited in a local museum. The owner of the Jenglot carcasses had made a request to perform DNA analysis on the Jenglot to ascertain its species. Because the muscle appeared very dry and recovery of DNA was extremely difficult, we therefore used the animals' hair for further analysis. Hair samples were collected from three different Jenglots that were different in colour and physical appearance. The samples were labelled as A, B, C and D, respectively. Microscopic characteristics indicated that all four hair samples were of human origin, with a medullary index less than 1/3 and pigment distribution towards the periphery. The scale pattern on the hair samples was of the imbricate type, adding certainty to the hypothesis of human origin. A dried root sheath was found in samples B and C, which was contrary to expectations since the sample collection method left a few cm of hair on the body of the Jenglots. Sample D had black dye granules over the cuticular surface. Sequencing of the mitochondrial DNA (mtDNA) hypervariable segment I (HVS-I) region showed polymorphisms at positions 16140, 16182C, 16183C, 16189, 16217 and 16274 and heteroplasmy at positions 16112, 16232 and 16251, a human-specific mtDNA haplotype that was consistent across all the samples. Based on these findings, it was concluded that it is unlikely that the samples of Jenglot hair originated from an animal species.

  9. Securing image information using double random phase encoding and parallel compressive sensing with updated sampling processes

    Science.gov (United States)

    Hu, Guiqiang; Xiao, Di; Wang, Yong; Xiang, Tao; Zhou, Qing

    2017-11-01

    Recently, a new kind of image encryption approach using compressive sensing (CS) and double random phase encoding has received much attention due to advantages such as compressibility and robustness. However, this approach is found to be vulnerable to chosen plaintext attack (CPA) if the CS measurement matrix is re-used. Therefore, designing an efficient measurement matrix updating mechanism that ensures resistance to CPA is of practical significance. In this paper, we provide a novel solution to update the CS measurement matrix by altering the secret sparse basis with the help of counter mode operation. Particularly, the secret sparse basis is implemented by a reality-preserving fractional cosine transform matrix. Compared with the conventional CS-based cryptosystem that generates all the random entries of the measurement matrix, our scheme has an efficiency advantage while guaranteeing resistance to CPA. Experimental and analysis results show that the proposed scheme has a good security performance and has robustness against noise and occlusion.
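
The requirement the abstract identifies, never re-using a measurement matrix across plaintexts, can be illustrated with a simple counter-mode derivation. This sketch is an illustrative stand-in, not the paper's reality-preserving fractional cosine transform construction; the key, the Gaussian entries, and the matrix dimensions are all assumptions:

```python
import hashlib
import random

def measurement_matrix(key: bytes, counter: int, m: int, n: int):
    """Derive an m x n pseudorandom Gaussian measurement matrix from
    (key, counter).  Incrementing the counter yields a fresh matrix, so
    no matrix is re-used across plaintexts -- the property the abstract
    identifies as necessary to resist chosen-plaintext attack."""
    seed = hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
    rng = random.Random(seed)
    return [[rng.gauss(0.0, 1.0) for _ in range(n)] for _ in range(m)]

def measure(phi, x):
    # y = Phi @ x : compressive measurements of signal x
    return [sum(p_ij * x_j for p_ij, x_j in zip(row, x)) for row in phi]

key = b"shared-secret"
phi0 = measurement_matrix(key, 0, 4, 8)
phi1 = measurement_matrix(key, 1, 4, 8)
y = measure(phi0, [1, 0, 0, 0, 1, 0, 0, 0])
print(phi0 == measurement_matrix(key, 0, 4, 8))  # same counter reproduces the matrix
print(phi0 == phi1)                              # advancing the counter changes it
```

The receiver, holding the same key and counter, regenerates the matrix deterministically, so only the short counter needs to accompany the ciphertext.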

  10. Effect of DNA Extraction Methods and Sampling Techniques on the Apparent Structure of Cow and Sheep Rumen Microbial Communities

    Science.gov (United States)

    Henderson, Gemma; Cox, Faith; Kittelmann, Sandra; Miri, Vahideh Heidarian; Zethof, Michael; Noel, Samantha J.; Waghorn, Garry C.; Janssen, Peter H.

    2013-01-01

    Molecular microbial ecology techniques are widely used to study the composition of the rumen microbiota and to increase understanding of the roles they play. Therefore, sampling and DNA extraction methods that result in adequate yields of microbial DNA that also accurately represents the microbial community are crucial. Fifteen different methods were used to extract DNA from cow and sheep rumen samples. The DNA yield and quality, and its suitability for downstream PCR amplifications varied considerably, depending on the DNA extraction method used. DNA extracts from nine extraction methods that passed these first quality criteria were evaluated further by quantitative PCR enumeration of microbial marker loci. Absolute microbial numbers, determined on the same rumen samples, differed by more than 100-fold, depending on the DNA extraction method used. The apparent compositions of the archaeal, bacterial, ciliate protozoal, and fungal communities in identical rumen samples were assessed using 454 Titanium pyrosequencing. Significant differences in microbial community composition were observed between extraction methods, for example in the relative abundances of members of the phyla Bacteroidetes and Firmicutes. Microbial communities in parallel samples collected from cows by oral stomach-tubing or through a rumen fistula, and in liquid and solid rumen digesta fractions, were compared using one of the DNA extraction methods. Community representations were generally similar, regardless of the rumen sampling technique used, but significant differences in the abundances of some microbial taxa such as the Clostridiales and the Methanobrevibacter ruminantium clade were observed. The apparent microbial community composition differed between rumen sample fractions, and Prevotellaceae were most abundant in the liquid fraction. DNA extraction methods that involved phenol-chloroform extraction and mechanical lysis steps tended to be more comparable. However, comparison of data

  11. Evaluation of alternative macroinvertebrate sampling techniques for use in a new tropical freshwater bioassessment scheme

    Directory of Open Access Journals (Sweden)

    Isabel Eleanor Moore

    2015-06-01

    Full Text Available Aim: The study aimed to determine the effectiveness of benthic macroinvertebrate dredge net sampling procedures as an alternative method to kick net sampling in tropical freshwater systems, specifically as an evaluation of sampling methods used in the Zambian Invertebrate Scoring System (ZISS) river bioassessment scheme. Tropical freshwater ecosystems are sometimes dangerous or inaccessible to sampling teams using traditional kick-sampling methods, so identifying an alternative procedure that produces similar results is necessary in order to collect data from a wide variety of habitats. Methods: Both kick and dredge nets were used to collect macroinvertebrate samples at 16 riverine sites in Zambia, ranging from backwaters and floodplain lagoons to fast flowing streams and rivers. The data were used to calculate ZISS, diversity (S: number of taxa present) and Average Score Per Taxon (ASPT) scores per site, using the two sampling methods to compare their sampling effectiveness. Environmental parameters, namely pH, conductivity, underwater photosynthetically active radiation (PAR), temperature, alkalinity, flow, and altitude, were also recorded and used in statistical analysis. Invertebrate communities present at the sample sites were determined using multivariate procedures. Results: Analysis of the invertebrate community and environmental data suggested that the testing exercise was undertaken in four distinct macroinvertebrate community types, supporting at least two quite different macroinvertebrate assemblages, and showing significant differences in habitat conditions. Significant correlations were found for all three bioassessment score variables between results acquired using the two methods, with dredge-sampling normally producing lower scores than did the kick net procedures. Linear regression models were produced in order to correct each biological variable score collected by a dredge net to a score similar to that of one collected by kick net
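
The ASPT metric compared above is simply the mean of the per-taxon sensitivity scores recorded at a site, with S the count of scoring taxa. A minimal sketch; the scores below are invented for illustration, not actual ZISS values:

```python
def diversity_and_aspt(taxon_scores):
    """S (taxa richness) and ASPT (mean per-taxon sensitivity score)."""
    s = len(taxon_scores)
    return s, sum(taxon_scores) / s

# Invented per-taxon scores for one site, sampled by each method
kick_scores = [10, 8, 7, 6, 5, 5, 4, 3]
dredge_scores = [8, 7, 6, 5, 4, 3]

s_kick, aspt_kick = diversity_and_aspt(kick_scores)
s_dredge, aspt_dredge = diversity_and_aspt(dredge_scores)
print(s_kick, aspt_kick)      # → 8 6.0
print(s_dredge, aspt_dredge)  # → 6 5.5
```

In this invented example the dredge sample yields the lower S and ASPT, mirroring the study's finding that dredge sampling normally produces lower scores, which is what the regression correction is for.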

  12. Alcohol and marijuana use in adolescents' daily lives: a random sample of experiences.

    Science.gov (United States)

    Larson, R; Csikszentmihalyi, M; Freeman, M

    1984-07-01

    High school students filled out reports on their experiences at random times during their daily lives, including 48 occasions when they were using alcohol or marijuana. Alcohol use was reported primarily in the context of Friday and Saturday night social gatherings and was associated with a happy and gregarious subjective state. Marijuana use was reported across a wider range of situations and was associated with an average state that differed much less from ordinary experience.

  13. Detection by the fluorescence in situ hybridization technique of MYC translocations in paraffin-embedded lymphoma biopsy samples

    NARCIS (Netherlands)

    Haralambieva, E; Banham, AH; Bastard, C; Delsol, G; Gaulard, P; Ott, G; Pileri, S; Fletcher, JA; Mason, DY

    The detection of chromosomal translocations by fluorescence in situ hybridization (FISH) is widely performed, but very few studies have attempted to apply this technique to paraffin-embedded routine biopsy samples. We report the analysis of paraffin sections from 36 B-cell lymphoma biopsies for MYC

  14. An evaluation of sampling methods and supporting techniques for tackling lead in drinking water in Alberta Province

    Science.gov (United States)

    A collaborative project commenced in August 2013 with the aim of demonstrating a range of techniques that can be used in tackling the problems of lead in drinking water. The main project was completed in March 2014, with supplementary sampling exercises in mid-2014. It involved t...

  15. Survey Research: Determining Sample Size and Representative Response. and The Effects of Computer Use on Keyboarding Technique and Skill.

    Science.gov (United States)

    Wunsch, Daniel R.; Gades, Robert E.

    1986-01-01

    Two articles are presented. The first reviews and suggests procedures for determining appropriate sample sizes and for determining the response representativeness in survey research. The second presents a study designed to determine the effects of computer use on keyboarding technique and skill. (CT)
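
The first article's topic, choosing an appropriate sample size, is commonly handled with Cochran's formula plus a finite-population correction. The sketch below assumes a proportion is being estimated at 95% confidence; the article's own recommended procedure may differ:

```python
import math

def cochran_n(z, p, e):
    """Cochran's first-approximation sample size for estimating a proportion
    (z: critical value, p: anticipated proportion, e: margin of error)."""
    return math.ceil(z * z * p * (1 - p) / (e * e))

def fpc_adjust(n0, population):
    """Finite-population correction: smaller samples suffice for small N."""
    return math.ceil(n0 / (1 + (n0 - 1) / population))

n0 = cochran_n(z=1.96, p=0.5, e=0.05)  # 95% confidence, ±5% margin, worst-case p
print(n0)                    # → 385
print(fpc_adjust(n0, 1000))  # → 279
```

Using p = 0.5 maximizes p(1 - p) and therefore gives the most conservative sample size when the true proportion is unknown.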

  16. Estimation of the Coefficient of Restitution of Rocking Systems by the Random Decrement Technique

    DEFF Research Database (Denmark)

    Brincker, Rune; Demosthenous, Milton; Manos, George C.

    1994-01-01

    The aim of this paper is to investigate the possibility of estimating an average damping parameter for a rocking system due to impact, the so-called coefficient of restitution, from the random response, i.e. when the loads are random and unknown, and the response is measured. The objective is to ...... of freedom system loaded by white noise, estimating the coefficient of restitution as explained, and comparing the estimates with the value used in the simulations. Several estimates for the coefficient of restitution are considered, and reasonable results are achieved....

  18. Stemflow estimation in a redwood forest using model-based stratified random sampling

    Science.gov (United States)

    Jack Lewis

    2003-01-01

    Model-based stratified sampling is illustrated by a case study of stemflow volume in a redwood forest. The approach is actually a model-assisted sampling design in which auxiliary information (tree diameter) is utilized in the design of stratum boundaries to optimize the efficiency of a regression or ratio estimator. The auxiliary information is utilized in both the...
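
The design described above, strata built from the auxiliary variable with a ratio estimator applied inside each stratum, can be sketched as follows. The population, stratum boundaries, and per-stratum sample sizes are invented for illustration; only the estimator structure reflects the abstract:

```python
import random

rng = random.Random(42)

# Hypothetical stand: stemflow volume grows roughly in proportion to
# tree diameter, the auxiliary variable used to build the strata.
diameters = [rng.uniform(10, 120) for _ in range(500)]
population = [(d, 0.8 * d + rng.gauss(0, 2)) for d in diameters]

strata = [
    [t for t in population if t[0] < 40],
    [t for t in population if 40 <= t[0] < 80],
    [t for t in population if t[0] >= 80],
]

# Per-stratum ratio estimator: (sampled stemflow / sampled diameter)
# scaled by the stratum's known diameter total.
estimate = 0.0
for stratum in strata:
    sample = rng.sample(stratum, 10)
    ratio = sum(s for _, s in sample) / sum(d for d, _ in sample)
    estimate += ratio * sum(d for d, _ in stratum)

true_total = sum(s for _, s in population)
rel_err = abs(estimate - true_total) / true_total
print(rel_err)
```

Because the auxiliary variable is cheap to measure on every tree while stemflow is not, the ratio estimator transfers most of the precision of the full diameter census to the stemflow total.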

  19. Random or systematic sampling to detect a localised microbial contamination within a batch of food

    NARCIS (Netherlands)

    Jongenburger, I.; Reij, M.W.; Boer, E.P.J.; Gorris, L.G.M.; Zwietering, M.H.

    2011-01-01

    Pathogenic microorganisms are known to be distributed heterogeneously in food products that are solid, semi-solid or powdered, like for instance peanut butter, cereals, or powdered milk. This complicates effective detection of the pathogens by sampling. Two-class sampling plans, which are deployed
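
The question in the title can be explored with a small simulation: a batch containing one localised contaminated run, sampled either completely at random or systematically at a fixed interval. The batch size, sample size, and cluster length below are assumptions chosen so the cluster length equals the systematic interval:

```python
import random

def detected(batch, picks):
    """True if any sampled unit is contaminated."""
    return any(batch[i] for i in picks)

rng = random.Random(7)
N, n, cluster = 1000, 20, 50  # batch size, sample size, contaminated run length
step = N // n                 # systematic sampling interval
trials, hits_rand, hits_sys = 500, 0, 0

for _ in range(trials):
    start = rng.randrange(N - cluster)                   # one localised run
    batch = [start <= i < start + cluster for i in range(N)]
    hits_rand += detected(batch, rng.sample(range(N), n))
    offset = rng.randrange(step)
    hits_sys += detected(batch, range(offset, N, step))  # every step-th unit

print(hits_rand / trials, hits_sys / trials)
```

With the cluster length equal to the sampling interval, systematic sampling always places one sampled unit inside the run, while simple random sampling misses it roughly (1 - cluster/N)^n of the time, about 36% here.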

  20. Multistage point relascope and randomized branch sampling for downed coarse woody debris estimation

    Science.gov (United States)

    Jeffrey H. Gove; Mark J. Ducey; Harry T. Valentine

    2002-01-01

    New sampling methods have recently been introduced that allow estimation of downed coarse woody debris using an angle gauge, or relascope. The theory behind these methods is based on sampling straight pieces of downed coarse woody debris. When pieces deviate from this ideal situation, auxillary methods must be employed. We describe a two-stage procedure where the...

  1. An orientation-space super sampling technique for six-dimensional diffraction contrast tomography

    NARCIS (Netherlands)

    N.R. Viganò (Nicola); K.J. Batenburg (Joost); W. Ludwig (Wolfgang)

    2016-01-01

    textabstractDiffraction contrast tomography (DCT) is an X-ray full-field imaging technique that allows for the non-destructive three-dimensional investigation of polycrystalline materials and the determination of the physical and morphological properties of their crystallographic domains, called

  2. Critique of Sikkink and Keane's comparison of surface fuel sampling techniques

    Science.gov (United States)

    Clinton S. Wright; Roger D. Ottmar; Robert E. Vihnanek

    2010-01-01

    The 2008 paper of Sikkink and Keane compared several methods to estimate surface fuel loading in western Montana: two widely used inventory techniques (planar intersect and fixed-area plot) and three methods that employ photographs as visual guides (photo load, photoload macroplot and photo series). We feel, however, that their study design was inadequate to evaluate...

  3. Field Methods and Sample Collection Techniques for the Surveillance of West Nile Virus in Avian Hosts.

    Science.gov (United States)

    Wheeler, Sarah S; Boyce, Walter M; Reisen, William K

    2016-01-01

    Avian hosts play an important role in the spread, maintenance, and amplification of West Nile virus (WNV). Avian susceptibility to WNV varies from species to species; thus, surveillance efforts can focus both on birds that survive infection and those that succumb. Here we describe methods for the collection and sampling of live birds for WNV antibodies or viremia, and methods for the sampling of dead birds. Target species and study design considerations are discussed.

  4. Application of Receiver Operating Characteristic (ROC) Curves for Explosives Detection Using Different Sampling and Detection Techniques

    Directory of Open Access Journals (Sweden)

    Mimy Young

    2013-12-01

    Full Text Available Reported for the first time are receiver operating characteristic (ROC) curves constructed to describe the performance of a sorbent-coated disk, planar solid phase microextraction (PSPME) unit for non-contact sampling of a variety of volatiles. The PSPME is coupled to ion mobility spectrometers (IMSs) for the detection of volatile chemical markers associated with the presence of smokeless powders, model systems of explosives containing diphenylamine (DPA), 2,4-dinitrotoluene (2,4-DNT) and nitroglycerin (NG) as the target analytes. The performance of the PSPME-IMS was compared with the widely accepted solid-phase microextraction (SPME), coupled to a GC-MS. A set of optimized sampling conditions for different volume containers (1–45 L) with various sample amounts of explosives were studied in replicates (n = 30) to determine the true positive rates (TPR) and false positive detection rates (FPR) for the different scenarios. These studies were conducted in order to construct the ROC curves for two IMS instruments (a bench-top and a field-portable system) and a bench-top GC-MS system in low and high clutter environments. Both static and dynamic PSPME sampling were studied, in which 10–500 mg quantities of smokeless powders were detected within 10 min of static sampling and 1 min of dynamic sampling.
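
The ROC construction itself, sweeping a detection threshold and recording an (FPR, TPR) operating point at each setting, can be sketched as follows. The instrument responses are invented, not the paper's data:

```python
def roc_points(scores_pos, scores_neg, thresholds):
    """(FPR, TPR) operating point for each detection threshold."""
    pts = []
    for t in thresholds:
        tpr = sum(s >= t for s in scores_pos) / len(scores_pos)
        fpr = sum(s >= t for s in scores_neg) / len(scores_neg)
        pts.append((fpr, tpr))
    return sorted(pts)

def auc(points):
    """Trapezoidal area under the ROC curve, with (0,0) and (1,1) appended."""
    pts = [(0.0, 0.0)] + points + [(1.0, 1.0)]
    return sum((x2 - x1) * (y1 + y2) / 2
               for (x1, y1), (x2, y2) in zip(pts, pts[1:]))

# Invented instrument responses: positives (explosive present) vs. blanks
pos = [8.2, 7.5, 3.2, 9.1, 5.8, 2.5]
neg = [2.1, 3.4, 1.8, 4.0, 2.7, 3.1]
pts = roc_points(pos, neg, thresholds=[1, 3, 5, 7, 9, 11])
print(round(auc(pts), 4))  # → 0.8333
```

An AUC near 1 indicates near-complete separation of positives from blanks; comparing AUCs is how the different sampling/detection combinations in the study can be ranked on a common scale.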

  5. Comparative Study of Radon Concentration with Two Techniques and Elemental Analysis in Drinking Water Samples of the Jammu District, Jammu and Kashmir, India.

    Science.gov (United States)

    Kumar, Ajay; Kaur, Manpreet; Mehra, Rohit; Sharma, Dinesh Kumar; Mishra, Rosaline

    2017-10-01

    The level of radon concentration has been assessed using the Advanced SMART RnDuo technique in 30 drinking water samples from Jammu district, Jammu and Kashmir, India. The water samples were collected from wells, hand pumps, submersible pumps, and stored waters. Fourteen randomly selected radon concentration values obtained with the SMART RnDuo technique were compared and cross-checked against a RAD7 device. A good positive correlation (R = 0.88) has been observed between the two techniques. The overall value of radon concentration in various water sources ranged from 2.45 to 18.43 Bq L⁻¹, with a mean value of 8.24 ± 4.04 Bq L⁻¹, and it agreed well with the recommended limit suggested by the European Commission and UNSCEAR. However, higher mean radon concentrations were found in groundwater drawn from wells, hand pumps and submersible pumps than in stored water. The total annual effective dose due to radon inhalation and ingestion ranged from 6.69 to 50.31 μSv y⁻¹, with a mean value of 22.48 ± 11.03 μSv y⁻¹. The total annual effective dose was found to lie within the safe limit (100 μSv y⁻¹) suggested by WHO. Heavy metal analysis was also carried out on the various water sources using an atomic absorption spectrophotometer (AAS), and the highest values of heavy metals were found mostly in groundwater samples. The obtained results were compared with limits set by Indian and international organizations such as WHO and the EU Council. Among all the samples, the elemental concentrations did not exceed the permissible limits.

  6. The Random Forests Statistical Technique: An Examination of Its Value for the Study of Reading

    Science.gov (United States)

    Matsuki, Kazunaga; Kuperman, Victor; Van Dyke, Julie A.

    2016-01-01

    Studies investigating individual differences in reading ability often involve data sets containing a large number of collinear predictors and a small number of observations. In this article, we discuss the method of Random Forests and demonstrate its suitability for addressing the statistical concerns raised by such data sets. The method is…
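
The two randomization devices that define a Random Forest, bootstrap resampling of observations and random feature subsets at each split, can be sketched with regression stumps. This toy version is not the authors' implementation (real forests grow full trees), but it shows the ensemble coping with a collinear predictor pair of the kind the article describes:

```python
import random

def fit_stump(X, y, feats):
    """Best single-split regression stump over a random feature subset."""
    best = None
    for j in feats:
        for t in sorted({row[j] for row in X}):
            left = [yi for row, yi in zip(X, y) if row[j] <= t]
            right = [yi for row, yi in zip(X, y) if row[j] > t]
            if not left or not right:
                continue
            ml, mr = sum(left) / len(left), sum(right) / len(right)
            sse = (sum((v - ml) ** 2 for v in left)
                   + sum((v - mr) ** 2 for v in right))
            if best is None or sse < best[0]:
                best = (sse, j, t, ml, mr)
    _, j, t, ml, mr = best
    return lambda row: ml if row[j] <= t else mr

def fit_forest(X, y, n_trees=60, seed=3):
    """Bagging + random feature subsets, averaged over many weak trees."""
    rng = random.Random(seed)
    n, p = len(X), len(X[0])
    trees = []
    for _ in range(n_trees):
        boot = [rng.randrange(n) for _ in range(n)]           # bootstrap sample
        feats = rng.sample(range(p), 2)                       # random feature subset
        trees.append(fit_stump([X[i] for i in boot],
                               [y[i] for i in boot], feats))
    return lambda row: sum(tree(row) for tree in trees) / len(trees)

# Collinear predictors: x1 nearly duplicates x0; x2 is pure noise
rng = random.Random(0)
X = [[i, i + rng.gauss(0, 0.5), rng.uniform(0, 20)] for i in range(20)]
y = [row[0] for row in X]

predict = fit_forest(X, y)
mae_forest = sum(abs(predict(row) - yi) for row, yi in zip(X, y)) / len(y)
mae_mean = sum(abs(sum(y) / len(y) - yi) for yi in y) / len(y)
print(mae_forest < mae_mean)
```

Because each tree sees only a random subset of the collinear features, the duplicated information is spread across the ensemble instead of destabilizing a single fit, which is the property that makes the method attractive for many-predictor, few-observation designs.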

  7. Changes in Selected Biochemical Indices Resulting from Various Pre-sampling Handling Techniques in Broilers

    Directory of Open Access Journals (Sweden)

    Chloupek Petr

    2011-05-01

    Full Text Available Abstract Background Since it is not yet clear whether it is possible to satisfactorily avoid sampling-induced stress interference in poultry, more studies on the pattern of physiological response and detailed quantification of stress connected with the first few minutes of capture and pre-sampling handling in poultry are required. This study focused on detection of changes in the corticosterone level and concentrations of other selected biochemical parameters in broilers handled in two different manners during blood sampling (involving catching, carrying, restraint, and blood collection itself that lasted for various time periods within the interval 30-180 seconds. Methods Stress effects of pre-sampling handling were studied in a group (n = 144 of unsexed ROSS 308 broiler chickens aged 42 d. Handling (catching, carrying, restraint, and blood sampling itself was carried out in a gentle (caught, held and carried carefully in an upright position or rough (caught by the leg, held and carried with lack of care in inverted position manner and lasted for 30 s, 60 s, 90 s, 120 s, 150 s, and 180 s. Plasma corticosterone, albumin, glucose, cholesterol, lactate, triglycerides and total protein were measured in order to assess the stress-induced changes to these biochemical indices following handling in the first few minutes of capture. Results Pre-sampling handling in a rough manner resulted in considerably higher plasma concentrations of all biochemical indices monitored when compared with gentle handling. Concentrations of plasma corticosterone after 150 and 180 s of handling were considerably higher (P Conclusions These results indicate that the pre-sampling procedure may be a considerably stressful procedure for broilers, particularly when carried out with lack of care and exceeding 120 seconds.

  8. The use of random decrement technique for identification of structural modes of vibration. [tested on a generalized payload and the space shuttle model

    Science.gov (United States)

    Ibrahim, S. R.

    1977-01-01

    An algorithm is developed to obtain the free responses of a structure from its random responses due to some unknown or known random input or inputs, using the random-decrement technique without changing time correlation between signals. The algorithm is tested using random responses from a 'generalized payload' model and from the 'Space Shuttle' model. The resulting free responses are then used to identify the modal characteristics of the two systems.
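
The technique's central step, averaging response segments that all start at a common triggering condition so that the random input averages out, can be sketched on a synthetic single-degree-of-freedom system. All parameters below (natural frequency, damping, noise level, trigger) are assumptions for illustration:

```python
import math
import random

def random_decrement(x, level, seg_len):
    """Average all segments beginning where x up-crosses `level`; the
    average converges to the system's free response because the
    zero-mean random input averages out across segments."""
    starts = [i for i in range(1, len(x) - seg_len) if x[i - 1] < level <= x[i]]
    return [sum(x[i + k] for i in starts) / len(starts) for k in range(seg_len)]

# Synthetic SDOF oscillator (assumed parameters), randomly excited,
# integrated with semi-implicit Euler for stability.
rng = random.Random(0)
dt, wn, zeta = 0.01, 2 * math.pi, 0.05
pos, vel, resp = 0.0, 0.0, []
for _ in range(200_000):
    acc = -2 * zeta * wn * vel - wn * wn * pos + rng.gauss(0, 10.0)
    vel += acc * dt
    pos += vel * dt
    resp.append(pos)

rms = (sum(v * v for v in resp) / len(resp)) ** 0.5
sig = random_decrement(resp, level=rms, seg_len=400)

# The signature oscillates at the natural frequency with a decaying envelope
early = max(abs(v) for v in sig[:100])
late = max(abs(v) for v in sig[300:])
print(late < early)
```

The resulting signature can then be fed to a modal identification routine exactly as a measured free-decay record would be, which is the use the abstract describes.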

  9. Determination of depleted uranium in environmental samples by gamma-spectroscopic techniques.

    Science.gov (United States)

    Karangelos, D J; Anagnostakis, M J; Hinis, E P; Simopoulos, S E; Zunic, Z S

    2004-01-01

    The military use of depleted uranium initiated the need for an efficient and reliable method to detect and quantify DU contamination in environmental samples. This paper presents such a method, based on the gamma spectroscopic determination of 238U and 235U. The main advantage of this method is that it allows for a direct determination of the U isotope ratio, while requiring little sample preparation and being significantly less labor intensive than methods requiring radiochemical treatment. Furthermore, the fact that the sample preparation is not destructive greatly simplifies control of the quality of measurements. Low energy photons are utilized, using Ge detectors efficient in the low energy region and applying appropriate corrections for self-absorption. Uranium-235 in particular is determined directly from its 185.72 keV photons, after analyzing the 235U-226Ra multiplet. The method presented is applied to soil samples originating from two different target sites, in Southern Yugoslavia and Montenegro. The analysis results are discussed in relation to the natural radioactivity content of the soil at the sampling sites. A mapping algorithm is applied to examine the spatial variability of the DU contamination.

  10. Seasonal comparison of moss bag technique against vertical snow samples for monitoring atmospheric pollution.

    Science.gov (United States)

    Salo, Hanna; Berisha, Anna-Kaisa; Mäkinen, Joni

    2016-03-01

    This is the first study seasonally applying Sphagnum papillosum moss bags and vertical snow samples for monitoring atmospheric pollution. Moss bags, exposed in January, were collected together with snow samples by early March 2012 near the Harjavalta Industrial Park in southwest Finland. Magnetic, chemical, scanning electron microscopy-energy dispersive X-ray spectroscopy (SEM-EDX), K-means clustering, and Tomlinson pollution load index (PLI) data showed parallel spatial trends of pollution dispersal for both materials. The results strengthen previous findings that concentrate- and slag-handling activities were important (dust) emission sources, while the impact from the Cu-Ni smelter's pipe remained secondary at closer distances. Statistically significant correlations existed between the variables of snow and moss bags. In summary, both methods work well for sampling and are efficient pollutant accumulators. Moss bags can also be used in winter conditions, and they provide a more homogeneous and better-controlled sampling method than snow samples. Copyright © 2015. Published by Elsevier B.V.
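
The Tomlinson pollution load index used above is the geometric mean of the per-metal contamination factors (sample concentration over local background). A minimal sketch with invented concentrations:

```python
def contamination_factor(concentration, baseline):
    """CF: measured concentration relative to the local background."""
    return concentration / baseline

def pollution_load_index(factors):
    """Tomlinson PLI: geometric mean of the contamination factors."""
    prod = 1.0
    for f in factors:
        prod *= f
    return prod ** (1.0 / len(factors))

# Hypothetical metal concentrations (sample, background) in mg/kg
site = {"Cu": (95.0, 20.0), "Ni": (40.0, 16.0), "Zn": (150.0, 60.0)}
cfs = [contamination_factor(c, b) for c, b in site.values()]
pli = pollution_load_index(cfs)
print(round(pli, 2))  # → 3.1
```

A PLI above 1 indicates pollutant build-up relative to background, which is how the index flags the sites nearest the emission sources.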

  11. Extending the Collection Duration of Breath Samples for Enteric Methane Emission Estimation Using the SF6 Tracer Technique

    Science.gov (United States)

    Pinares-Patiño, César; Gere, José; Williams, Karen; Gratton, Roberto; Juliarena, Paula; Molano, German; MacLean, Sarah; Sandoval, Edgar; Taylor, Grant; Koolaard, John

    2012-01-01

    Simple Summary Extended sample collection for the SF6 tracer technique is desirable for extensive grazing systems. Breath samples from eight cows were collected while lucerne silage was fed to achieve fixed intakes among the cows. Samples were collected over a 10-day period, using either apparatuses used in New Zealand (NZL) or Argentina (ARG), and either daily, over two consecutive 5-day periods, or over a 10-day period (in duplicate). The NZL system had a greater sampling success and more consistent CH4 emission estimates than the ARG system, with no differences in mean emissions among sample collection periods. This study showed that extended sample collection is feasible, but definitive evaluation under grazing conditions is required before a decision on recommendation can be made. Abstract The daily sample collection protocol of the sulphur hexafluoride (SF6) tracer technique for the estimation of methane (CH4) emissions from ruminants may not be practical under extensive grazing systems. Here, under controlled conditions, we evaluated extended periods of sampling as an alternative to daily sample collections. Eight rumen-fistulated cows were housed and fed lucerne silage to achieve common daily feed intakes of 6.4 kg dry matter per cow. Following SF6 permeation tube dosing, eight sampling lines were fitted to the breath collection harness, so that a common gas mix was available to each line. Half of the lines collected samples into PVC yokes using a modified capillary system as commonly used in New Zealand (NZL), and half collected samples into stainless steel cylinders using a ball-bearing flow restrictor as used in Argentina (ARG), all within a 10-day time frame, either daily, across two consecutive 5-day periods, or across one 10-day period (in duplicate). The NZL system had greater sampling success (97.3 vs. 79.5%) and yielded more consistent CH4 emission estimates than the ARG system. Emission estimates from NZL daily, NZL 5-day and NZL 10-day samplings
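
    The standard SF6 tracer calculation scales the known SF6 permeation rate by the background-corrected CH4:SF6 molar ratio measured in the breath sample, times the molecular-weight ratio. The sketch below uses invented, order-of-magnitude concentrations and permeation rate, not data from this study.

```python
# Sketch of the SF6 tracer arithmetic for enteric CH4 (hypothetical numbers).

MW_CH4, MW_SF6 = 16.04, 146.06  # g/mol

def ch4_emission_g_per_day(sf6_release_mg_per_day,
                           ch4_ppm, ch4_bg_ppm, sf6_ppt, sf6_bg_ppt):
    """CH4 emission (g/day) from background-corrected breath concentrations."""
    d_ch4 = ch4_ppm - ch4_bg_ppm          # ppm = 1e-6 mol/mol
    d_sf6 = sf6_ppt - sf6_bg_ppt          # ppt = 1e-12 mol/mol
    molar_ratio = (d_ch4 * 1e-6) / (d_sf6 * 1e-12)
    return (sf6_release_mg_per_day / 1000.0) * molar_ratio * (MW_CH4 / MW_SF6)

# Hypothetical: 5 mg/d permeation tube, 250 ppm CH4 (2 ppm background),
# 1000 ppt SF6 (10 ppt background) in the yoke sample.
q = ch4_emission_g_per_day(5.0, 250.0, 2.0, 1000.0, 10.0)
print(f"Estimated CH4 emission: {q:.0f} g/day")
```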

  12. Profound Transcriptomic Differences Found between Sperm Samples from Sperm Donors vs. Patients Undergoing Assisted Reproduction Techniques Tends to Disappear after Swim-up Sperm Preparation Technique

    Directory of Open Access Journals (Sweden)

    Marcos Meseguer

    2010-01-01

    Full Text Available Background: Although spermatozoa deliver their RNA to oocytes at fertilization, its biological role is not well characterized. Our purpose was to identify the genes differentially and exclusively expressed in sperm samples both before and after the swim-up process in control donors and infertile males, in order to identify their functional significance in male fertility. Materials and Methods: This was a nested case-control study. Ten sperm samples were obtained from infertile patients [n=5; two aliquots each from five samples, one before the swim-up process and one after] and donors [n=5; two aliquots from five samples, one before the swim-up process and one after]. Oligonucleotide microarrays were employed to study the genome-wide expression of pooled samples from infertile patients vs. donors. A total of four microarrays were performed: two with sperm sample aliquots before swim-up and two with sperm sample aliquots after swim-up, from both the case and control groups. The results were evaluated to detect which genes were differentially expressed [fold change (FC) > 5 and p < 0.05] and which genes were exclusive to each of the groups, both before and after swim-up. Results: Profound differences were detected between the fresh sperm samples of donors vs. infertile patients with respect to both differentially and exclusively expressed genes. Nevertheless, these differences seemed to decrease after the swim-up selection process. Conclusion: There are important differences between the expression profiles of sperm samples of fertile donors vs. infertile patients who require assisted reproduction techniques (ART). These differences are potential forecasters of fertility success, although their reliability needs to be explored further.

  13. Top-down analysis of protein samples by de novo sequencing techniques

    Energy Technology Data Exchange (ETDEWEB)

    Vyatkina, Kira; Wu, Si; Dekker, Lennard J. M.; VanDuijn, Martijn M.; Liu, Xiaowen; Tolić, Nikola; Luider, Theo M.; Paša-Tolić, Ljiljana; Pevzner, Pavel A.

    2016-05-14

    MOTIVATION: Recent technological advances have made high-resolution mass spectrometers affordable to many laboratories, thus boosting rapid development of top-down mass spectrometry, and implying a need in efficient methods for analyzing this kind of data. RESULTS: We describe a method for analysis of protein samples from top-down tandem mass spectrometry data, which capitalizes on de novo sequencing of fragments of the proteins present in the sample. Our algorithm takes as input a set of de novo amino acid strings derived from the given mass spectra using the recently proposed Twister approach, and combines them into aggregated strings endowed with offsets. The former typically constitute accurate sequence fragments of sufficiently well-represented proteins from the sample being analyzed, while the latter indicate their location in the protein sequence, and also bear information on post-translational modifications and fragmentation patterns.

  14. A novel cytologic sampling technique to diagnose subclinical endometritis and comparison of staining methods for endometrial cytology samples in dairy cows.

    Science.gov (United States)

    Pascottini, O B; Dini, P; Hostens, M; Ducatelle, R; Opsomer, G

    2015-11-01

    The present article describes a study on the diagnosis of subclinical endometritis in dairy cows with two principal aims: first, to validate a novel technique for taking endometrial cytology samples to diagnose subclinical endometritis in dairy cows; and second, to compare the percentage of polymorphonuclear cells (PMNs) in cytology samples stained with Diff-Quik versus a staining method specific for PMNs, naphthol-AS-D-chloroacetate-esterase (CIAE). In the first experiment, two cytology samples were taken at the same time from each of 204 Holstein-Friesian cows using the conventional cytobrush (CB) and the new cytotape (CT). Both devices were assembled within the same catheter, allowing sampling at the same time and at approximately the same location. The cytotape consisted of a 1.5-cm piece of paper tape rolled on the tip of an insemination catheter covered with a double guard sheath. Parameters used to evaluate both methods were: PMN percentage, total cellularity, quality of the smears, and red blood cell contamination. Concordance correlation coefficient analysis was used to assess agreement between continuous variables, and Pearson chi-square tests were used for categorical variables. Agreement between the PMN percentages obtained by the two methods was good (ρc = 0.84 [0.79, 0.87]) with a minor standard error of 2%. Both methods yielded similar total cellularity (P = 0.62). Cytotape yielded better quality smears with more intact cells (P < 0.01), while samples taken by CB were more likely to be bloody (P < 0.01). Hence, the CT and CB methods yielded smears with similar PMN percentages and total cell numbers, but CT provided smears with higher quality and significantly less blood contamination. For the second experiment, 114 duplicate cytology slides were stained using both Diff-Quik and CIAE. Agreement between the PMN percentages obtained with the two staining techniques was good (ρc = 0.84 [0.78, 0.89]) with a standard error of only 2%. Hence, Diff-Quik was confirmed as an easy, fast
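
    The agreement statistic reported above, Lin's concordance correlation coefficient (CCC), can be computed directly. This is a generic implementation on invented paired PMN percentages, not code or data from the study.

```python
import numpy as np

# Lin's concordance correlation coefficient: penalizes both scatter
# around the identity line and location/scale shifts between methods.

def concordance_ccc(x, y):
    x, y = np.asarray(x, float), np.asarray(y, float)
    mx, my = x.mean(), y.mean()
    cov = ((x - mx) * (y - my)).mean()          # population covariance
    return 2.0 * cov / (x.var() + y.var() + (mx - my) ** 2)

pmn_cb = [5, 12, 30, 2, 18, 45, 8]   # hypothetical cytobrush PMN %
pmn_ct = [6, 10, 28, 3, 20, 43, 9]   # hypothetical cytotape PMN %
ccc = concordance_ccc(pmn_cb, pmn_ct)
print(f"CCC = {ccc:.3f}")
```

    Unlike Pearson's r, the CCC only reaches 1 when the two methods agree exactly, which is why it suits method-comparison studies like this one.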

  15. Analysis of selected phthalates in Canadian indoor dust collected using household vacuum and standardized sampling techniques.

    Science.gov (United States)

    Kubwabo, C; Rasmussen, P E; Fan, X; Kosarac, I; Wu, F; Zidek, A; Kuchta, S L

    2013-12-01

    Phthalates have been used extensively as plasticizers to improve the flexibility of polymers, and they also have found many industrial applications. They are ubiquitous in the environment and have been detected in a variety of environmental and biological matrices. The goal of this study was to develop a method for the determination of 17 phthalate esters in house dust. This method involved sonication extraction, sample cleanup using solid phase extraction, and isotope dilution GC/MS/MS analysis. Method detection limits (MDLs) and recoveries ranged from 0.04 to 2.93 μg/g and from 84 to 117%, respectively. The method was applied to the analysis of phthalates in 38 paired household vacuum samples (HD) and fresh dust (FD) samples. HD and FD samples compared well for the majority of phthalates detected in house dust. Data obtained from 126 household dust samples confirmed the historical widespread use of bis(2-ethylhexyl) phthalate (DEHP), with a concentration range of 36 μg/g to 3840 μg/g. Dibutyl phthalate (DBP), benzyl butyl phthalate (BzBP), diisononyl phthalate (DINP), and diisodecyl phthalate (DIDP) were also found in most samples at relatively high concentrations. Another important phthalate, diisobutyl phthalate (DIBP), was detected at a frequency of 98.4% with concentrations ranging from below its MDL of 0.51 μg/g to 69 μg/g. © 2013 Her Majesty the Queen in Right of Canada Indoor Air © 2013 John Wiley & Sons Ltd. Reproduced with the permission of the Minister of Health Canada.

  16. An improved DNA isolation technique for PCR detection of Strongyloides stercoralis in stool samples.

    Science.gov (United States)

    Repetto, S A; Alba Soto, C D; Cazorla, S I; Tayeldin, M L; Cuello, S; Lasala, M B; Tekiel, V S; González Cappa, S M

    2013-05-01

    Strongyloides stercoralis is a nematode that causes severe infections in immunocompromised patients. The low parasitic burden of chronically infected patients makes diagnosis difficult to achieve by conventional methods. Here, an in-house (IH) method for the isolation of parasite DNA from stools and a PCR assay for the molecular diagnosis of S. stercoralis were optimized. DNA yield and purity improved with the IH method, which included a step of incubation of stool samples with a glycine-SDS buffer and mechanical disruption prior to DNA extraction. For the PCR assay, the addition of bovine serum albumin was required to neutralize inhibitors present in stool. The analytical sensitivity of the PCR using DNA isolated with the IH method as template was superior to that obtained with the commercial kit. This study demonstrates that a combined method, which adds the step of glycine-SDS buffer incubation plus mechanical disruption prior to DNA isolation with the commercial kit, increased PCR sensitivity to the level of the IH method. Finally, our assay was tested on 17 clinical samples. With the IH method for DNA isolation, an S. stercoralis-specific band was detected by PCR in the first stool sample of all patients (17/17), whereas with the commercial kit the S. stercoralis-specific band was observed in only 7 samples. The superior efficiency of the IH and combined methods over the commercial kit was demonstrated when applied to clinical samples with low parasitic burden. These results show that the DNA extraction procedure is key to increasing the sensitivity of the S. stercoralis PCR assay in stool samples. The method developed here could help to improve the molecular diagnosis of S. stercoralis. Copyright © 2013 Elsevier B.V. All rights reserved.

  17. Sport Sampling Is Associated With Improved Landing Technique in Youth Athletes.

    Science.gov (United States)

    DiStefano, Lindsay J; Beltz, Eleanor M; Root, Hayley J; Martinez, Jessica C; Houghton, Andrew; Taranto, Nicole; Pearce, Katherine; McConnell, Erin; Muscat, Courtney; Boyle, Steve; Trojian, Thomas H

    2017-11-01

    Sport sampling is recommended to promote fundamental movement skill acquisition and physical activity. In contrast, sport specialization is associated with musculoskeletal injury risk, burnout, and attrition from sport. There is limited evidence to support the influence of sport sampling on neuromuscular control, which is associated with injury risk, in youth athletes. It was hypothesized that athletes who participated in only 1 sport during the previous year would demonstrate higher Landing Error Scoring System (LESS) scores than their counterparts. Cross-sectional study. Level 3. A total of 355 youth athletes (age range, 8-14 years) completed a test session with a jump-landing task, which was evaluated using the LESS. Participants were categorized as single sport (SS) or multisport (MS) based on their self-reported sport participation in the past year. Their duration of sport sampling (low, moderate, high) was determined based on their sport participation history. Participants were dichotomized into good or poor landing technique groups based on their LESS scores and compared across sport-sampling duration groups (low, moderate, high). The MS group was 2.5 times (95% CI, 1.9-3.1) as likely to be categorized as having good control compared with the SS group (χ²(355) = 10.10). Those in the high sport-sampling duration group were 5.8 times (95% CI, 3.1-8.5) and 5.4 times (95% CI, 4.0-6.8) as likely to be categorized as having good control compared with the moderate and low groups (χ²(216) = 11.20). Sport sampling at a young age is associated with improved neuromuscular control, which may reduce injury risk in youth athletes. Youth athletes should be encouraged to participate in multiple sports to enhance their neuromuscular control and promote long-term physical activity.

  18. How Laboratory Sampling Techniques and Extraction Methods Affect Reproducibility of PAH Results

    Science.gov (United States)

    2011-03-30

    (Only slide fragments of this presentation are recoverable.) Sample preparation: large fractions broken up with a pestle, skeet-crushed to a fine powder, and passed through a #10 sieve. Extraction: Soxhlet, serially extracted 5 times; sonication, serially extracted 5 times (US Army Corps of Engineers). Reported benzo(a)pyrene results (mg/kg): Sample A, 116 in the original Soxhlet (3540C) extraction vs. 0.136 in the first serial re-extraction; Sample B, 22.0 vs. 0.007.

  19. Electromembrane extraction as a rapid and selective miniaturized sample preparation technique for biological fluids

    DEFF Research Database (Denmark)

    Gjelstad, Astrid; Pedersen-Bjergaard, Stig; Seip, Knut Fredrik

    2015-01-01

    This special report discusses the sample preparation method electromembrane extraction, which was introduced in 2006 as a rapid and selective miniaturized extraction method. The extraction principle is based on isolation of charged analytes extracted from an aqueous sample, across a thin film of organic solvent, and into an aqueous receiver solution. The extraction is promoted by application of an electrical field, causing electrokinetic migration of the charged analytes. The method has been shown to perform excellent clean-up and selectivity from complicated aqueous matrices like biological fluids.

  20. Improving oral hygiene skills by computer-based training: a randomized controlled comparison of the modified Bass and the Fones techniques.

    Science.gov (United States)

    Harnacke, Daniela; Mitter, Simona; Lehner, Marc; Munzert, Jörn; Deinzer, Renate

    2012-01-01

    Gingivitis and other plaque-associated diseases have a high prevalence in western communities even though the majority of adults report daily oral hygiene. This indicates a lack of oral hygiene skills. Currently, there is no clear evidence as to which brushing technique would bring about the best oral hygiene skills. While the modified Bass technique is often recommended by dentists and in textbooks, the Fones technique is often recommended in patient brochures. Still, standardized comparisons of the effectiveness of teaching these techniques are lacking. In a final sample of n = 56 students, this multidisciplinary, randomized, examiner-blinded, controlled study compared the effects of parallel and standardized interactive computer presentations teaching either the Fones or the modified Bass technique. A control group was taught the basics of tooth brushing alone. Oral hygiene skills (remaining plaque after thorough oral hygiene) and gingivitis were assessed at baseline and 6, 12, and 28 weeks after the intervention. We found a significant group × time interaction for gingivitis (F(4/102) = 3.267; p = 0.016; ε = 0.957; η² = 0.114) and a significant main effect of group for oral hygiene skills (F(2/51) = 7.088; p = 0.002; η² = 0.218). Fones was superior to Bass; Bass did not differ from the control group. Group differences were most prominent after 6 and 12 weeks. The present trial indicates an advantage of teaching the Fones as compared to the modified Bass technique with respect to oral hygiene skills and gingivitis. Future studies are needed to analyze whether the disadvantage of teaching the Bass technique observed here is restricted to the teaching method employed. German Clinical Trials Register DRKS00003488.

  1. Comparative Evaluation of Two Venous Sampling Techniques for the Assessment of Pancreatic Insulin and Zinc Release upon Glucose Challenge

    Directory of Open Access Journals (Sweden)

    Anil Kumar Pillai

    2015-01-01

    Full Text Available Advances in noninvasive imaging modalities have provided opportunities to study β cell function through imaging zinc release from insulin-secreting β cells. Understanding the temporal secretory pattern of insulin and zinc corelease after a glucose challenge is essential for proper timing of administration of zinc-sensing probes. Portal venous sampling is an essential part of pharmacological and nutritional studies in animal models. The purpose of this study was to compare two different percutaneous image-guided techniques, transhepatic ultrasound-guided portal vein access and transsplenic fluoroscopy-guided splenic vein access, for ease of access, safety, and evaluation of temporal kinetics of insulin and zinc release into the venous effluent from the pancreas. Both techniques were safe, reproducible, and easy to perform. The mean time required to obtain the desired catheter position for venous sampling was 15 minutes shorter using the transsplenic technique. A clear biphasic insulin release profile was observed with both techniques. Statistically higher insulin concentrations but similar zinc release after a glucose challenge were observed in splenic vein samples, as compared to those from the portal vein. To our knowledge, this is the first report of percutaneous methods to assess zinc release kinetics from the porcine pancreas.

  2. Comparison of estimation techniques for a forest inventory in which double sampling for stratification is used

    Science.gov (United States)

    Michael S. Williams

    2001-01-01

    A number of different estimators can be used when forest inventory plots cover two or more distinctly different condition classes. In this article the properties of two approximate Horvitz-Thompson (HT) estimators, a ratio-of-means (RM) estimator, and a mean-of-ratios (MR) estimator are explored in the framework of double sampling for stratification. Relevant theoretical...
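
    The two simpler estimators named above differ only in where the averaging happens. A minimal sketch on invented plot data, where y is the quantity of interest observed on each plot and x an auxiliary size measure (e.g., the plot fraction falling in the condition class):

```python
import numpy as np

y = np.array([12.0, 7.5, 20.0, 3.2, 15.1])   # e.g., volume observed per plot
x = np.array([0.9, 0.6, 1.0, 0.3, 0.8])      # e.g., plot fraction in class

ratio_of_means = y.sum() / x.sum()   # RM: one ratio of the totals
mean_of_ratios = np.mean(y / x)      # MR: average of the per-plot ratios

# The two generally disagree: RM implicitly down-weights small plots,
# MR gives every plot's ratio equal weight.
print(f"RM = {ratio_of_means:.3f}, MR = {mean_of_ratios:.3f}")
```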

  3. Soil and Water – What is Detectable through Microbiological Sample Preparation Techniques

    Science.gov (United States)

    Concerns about a potential terrorist's use of biological agents in soil and ground water are articulated by comparisons to major illnesses in this country involving contaminated drinking water sources. Objectives are focused on the importance of sample preparation in the rapid, ...

  4. A novel fluorescent in situ hybridization technique for detection of Rickettsia spp. in archival samples

    DEFF Research Database (Denmark)

    Svendsen, Claus Bo; Boye, Mette; Struve, Carsten

    2009-01-01

    A novel, sensitive and specific method for detecting Rickettsia spp. in archival samples is described. The method involves the use of fluorescently labeled oligonucleotide probes for in situ hybridization. Specific hybridization of Rickettsia was found without problems of cross-reactions...

  5. Measuring the complex permittivity of thin grain samples by the free-space transmission technique

    Science.gov (United States)

    In this paper, a numerical method for solving a higher-order model that relates the measured transmission coefficient to the permittivity of a material is used to determine the permittivity of thin grain samples. A method for resolving the phase ambiguity of the transmission coefficient is presented....

  6. Blinding Techniques in Randomized Controlled Trials of Laser Therapy: An Overview and Possible Solution

    Directory of Open Access Journals (Sweden)

    Ian Relf

    2008-01-01

    Full Text Available Evidence is accumulating for the effectiveness of low-level laser therapy in a variety of medical conditions. We reviewed 51 double-blind randomized controlled trials (RCTs) of laser treatment. Analysis revealed that 58% of trials showed benefit of laser over placebo. However, fewer than 5% of the trials had addressed beam disguise or allocation concealment in the laser machines used. Many of the trials used blinding methods that rely on staff cooperation and are therefore open to interference or bias. This indicates significant deficiencies in laser trial methodology. We report the development and preliminary testing of a novel laser machine that can blind both patient and operator to treatment allocation without staff participation. The new laser machine combines sealed, preset, and non-bypassable randomization codes; decoy lights and sound; and a conical perspex tip to overcome laser diode glow detection.

  7. Hybrid random walk-linear discriminant analysis method for unwrapping quantitative phase microscopy images of biological samples

    Science.gov (United States)

    Kim, Diane N. H.; Teitell, Michael A.; Reed, Jason; Zangle, Thomas A.

    2015-11-01

    Standard algorithms for phase unwrapping often fail for interferometric quantitative phase imaging (QPI) of biological samples due to the variable morphology of these samples and the requirement to image at low light intensities to avoid phototoxicity. We describe a new algorithm combining random walk-based image segmentation with linear discriminant analysis (LDA)-based feature detection, using assumptions about the morphology of biological samples to account for phase ambiguities when standard methods have failed. We present three versions of our method: first, a method for LDA image segmentation based on a manually compiled training dataset; second, a method using a random walker (RW) algorithm informed by the assumed properties of a biological phase image; and third, an algorithm which combines LDA-based edge detection with an efficient RW algorithm. We show that the combination of LDA plus the RW algorithm gives the best overall performance with little speed penalty compared to LDA alone, and that this algorithm can be further optimized using a genetic algorithm to yield superior performance for phase unwrapping of QPI data from biological samples.
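
    A minimal 1-D illustration of the problem the authors address: the measured phase is only known modulo 2π, and the baseline Itoh method (as implemented in numpy.unwrap) recovers the true phase only when neighboring samples differ by less than π. This is the standard approach the paper improves on, not the paper's random-walk/LDA algorithm, and the signal below is synthetic.

```python
import numpy as np

true_phase = np.linspace(0, 6 * np.pi, 200)   # smooth optical-thickness phase
wrapped = np.angle(np.exp(1j * true_phase))   # what the interferometer reports
unwrapped = np.unwrap(wrapped)                # Itoh: add/subtract 2*pi at jumps

print(np.allclose(unwrapped, true_phase, atol=1e-6))  # True for this smooth case
```

    The failure mode motivating the paper appears when neighboring pixels legitimately differ by more than π (steep or noisy biological samples), where this baseline silently picks the wrong 2π branch.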

  8. Use of protein: creatinine ratio in a random spot urine sample for predicting significant proteinuria in diabetes mellitus.

    Science.gov (United States)

    Yadav, B K; Adhikari, S; Gyawali, P; Shrestha, R; Poudel, B; Khanal, M

    2010-06-01

    The present study was undertaken during a period of 6 months (September 2008-February 2009) to assess the correlation of 24-hour urine protein estimation with the random spot protein:creatinine (P:C) ratio among diabetic patients. The study comprised 144 patients aged 30-70 years, recruited from Kantipur hospital, Kathmandu. A 24-hr urine sample was collected, followed by a random spot urine sample. Both samples were analyzed for protein and creatinine excretion. Informed consent was taken from all participants. Sixteen inadequately collected urine samples, as defined by (predicted creatinine - measured creatinine)/predicted creatinine > 0.2, were excluded from analysis. The Spearman's rank correlation between the spot urine P:C ratio and 24-hr total protein was performed with the Statistical Package for the Social Sciences. At the P:C ratio cutoff of 0.15 and reference method (24-hr urine protein) cutoff of 150 mg/day, the correlation coefficient was found to be 0.892, indicating that the spot P:C ratio is a reasonable substitute for 24-hr urine collection, but the cutoff should be carefully selected for different patient groups under different laboratory procedures and settings.
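
    The two quantities in this study can be sketched on invented data: Spearman's rank correlation between spot P:C ratios and 24-hr protein, plus classification agreement at the quoted cutoffs (P:C 0.15, 24-hr protein 150 mg/day). None of the numbers below come from the paper.

```python
import numpy as np

def spearman_rho(x, y):
    """Spearman's rank correlation (no-ties case) via Pearson on ranks."""
    rx = np.argsort(np.argsort(x)).astype(float)
    ry = np.argsort(np.argsort(y)).astype(float)
    rx -= rx.mean()
    ry -= ry.mean()
    return float((rx * ry).sum() / np.sqrt((rx ** 2).sum() * (ry ** 2).sum()))

pc_ratio = np.array([0.05, 0.10, 0.18, 0.40, 0.90, 0.12, 0.25, 1.50])
protein_24h = np.array([40, 95, 180, 420, 1100, 120, 260, 2100])  # mg/day

rho = spearman_rho(pc_ratio, protein_24h)
agreement = np.mean((pc_ratio >= 0.15) == (protein_24h >= 150))
print(f"Spearman rho = {rho:.3f}, cutoff agreement = {agreement:.0%}")
```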

  9. Track-Before-Detect Algorithm for Faint Moving Objects based on Random Sampling and Consensus

    Science.gov (United States)

    2014-09-01

    ...the data set collected with the RH 17-inch telescope, the night of 2014/10/02 UT, we evaluate the performance of RANSAC-MT by testing it using...calibration techniques. Moving object signatures of various intensities and angular velocities are tested. Figure 6 shows the results from one of the
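
    A minimal, generic RANSAC line fit illustrates the "random sampling and consensus" idea behind RANSAC-MT (the report's full tracker is more elaborate): a faint mover traces a near-linear streak through (time, position) detections, and RANSAC recovers it amid clutter. All data below are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)
t = rng.uniform(0, 10, 200)
pos = rng.uniform(0, 100, 200)              # 160 clutter detections
t[:40] = np.linspace(0, 10, 40)             # 40 detections on a real track:
pos[:40] = 3.0 * t[:40] + 5.0 + rng.normal(0, 0.2, 40)

best_inliers, best_model = 0, None
for _ in range(500):
    i, j = rng.choice(200, size=2, replace=False)    # random sampling ...
    if t[i] == t[j]:
        continue
    slope = (pos[j] - pos[i]) / (t[j] - t[i])
    intercept = pos[i] - slope * t[i]
    n_in = int((np.abs(pos - (slope * t + intercept)) < 1.0).sum())
    if n_in > best_inliers:                          # ... and consensus
        best_inliers, best_model = n_in, (slope, intercept)

slope, intercept = best_model
print(f"track: pos ~ {slope:.2f}*t + {intercept:.2f} ({best_inliers} inliers)")
```

    With 40 of 200 detections on the track, a random pair lands entirely on it in roughly 4% of trials, so 500 iterations recover the streak with near certainty despite the 80% clutter fraction.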

  10. The application of statistical and/or non-statistical sampling techniques by internal audit functions in the South African banking industry

    Directory of Open Access Journals (Sweden)

    D.P. van der Nest

    2015-03-01

    Full Text Available This article explores the use by internal audit functions of audit sampling techniques in order to test the effectiveness of controls in the banking sector. The article focuses specifically on the use of statistical and/or non-statistical sampling techniques by internal auditors. The focus of the research for this article was internal audit functions in the banking sector of South Africa. The results discussed in the article indicate that audit sampling is still frequently used as an audit evidence-gathering technique. Non-statistical sampling techniques are used more frequently than statistical sampling techniques for the evaluation of the sample. In addition, both techniques are regarded as important for the determination of the sample size and the selection of the sample items.
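
    One common statistical technique in this setting is the zero-expected-deviation ("discovery") attribute-sampling plan, whose sample size follows directly from the binomial model. This generic sketch is not taken from the article:

```python
import math

def attribute_sample_size(confidence, tolerable_rate):
    """Smallest n such that (1 - tolerable_rate)**n <= 1 - confidence,
    i.e., a control failing at the tolerable deviation rate would almost
    surely produce at least one deviation in the sample."""
    return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - tolerable_rate))

n = attribute_sample_size(0.95, 0.05)
print(f"Inspect {n} control executions")  # 59 items at 95% confidence, 5% rate
```

    If any deviation is then found, the plan's conclusion no longer holds and the auditor must extend the sample or revise the assessed control risk.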

  11. Determination of palladium in biological samples applying nuclear analytical techniques; Determinacao de paladio em amostras biologicas aplicando tecnicas analiticas nucleares

    Energy Technology Data Exchange (ETDEWEB)

    Cavalcante, Cassio Q.; Sato, Ivone M.; Salvador, Vera L. R.; Saiki, Mitiko [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil). Dept. de Analise por Ativacao Neutronica]. E-mail: cavalcante.cassio@gmail.com

    2008-07-01

    This study presents Pd determinations in bovine tissue samples containing palladium prepared in the laboratory, and CCQM-P63 automotive catalyst materials of the Proficiency Test, using instrumental thermal and epithermal neutron activation analysis and energy dispersive X-ray fluorescence techniques. Solvent extraction and solid phase extraction procedures were also applied to separate Pd from interfering elements before the irradiation in the nuclear reactor. The results obtained by different techniques were compared against each other to examine sensitivity, precision and accuracy. (author)

  12. THE RHETORICAL USE OF RANDOM SAMPLING: CRAFTING AND COMMUNICATING THE PUBLIC IMAGE OF POLLS AS A SCIENCE (1935-1948).

    Science.gov (United States)

    Lusinchi, Dominic

    2017-03-01

    The scientific pollsters (Archibald Crossley, George H. Gallup, and Elmo Roper) emerged onto the American news media scene in 1935. Much of what they did in the following years (1935-1948) was to promote both the political and scientific legitimacy of their enterprise. They sought to be recognized as the sole legitimate producers of public opinion. In this essay I examine the mostly overlooked rhetorical work deployed by the pollsters to publicize the scientific credentials of their polling activities, and the central role the concept of sampling has had in that pursuit. First, they distanced themselves from the failed straw poll by claiming that their sampling methodology, based on quotas, was informed by science. Second, although in practice they did not use random sampling, they relied on it rhetorically to derive the symbolic benefits of being associated with the "laws of probability." © 2017 Wiley Periodicals, Inc.

  13. Examination of Microbial Proteome Preservation Techniques Applicable to Autonomous Environmental Sample Collection

    Directory of Open Access Journals (Sweden)

    Mak A. Saito

    2011-11-01

    Full Text Available Improvements in temporal and spatial sampling frequency have the potential to open new windows into the understanding of marine microbial dynamics. In recent years, efforts have been made to allow automated samplers to collect microbial biomass for DNA/RNA analyses from moored observatories and autonomous underwater vehicles. Measurements of microbial proteins are also of significant interest given their biogeochemical importance as enzymes that catalyze reactions and transporters that interface with the environment. We examined the influence of five preservative solutions (SDS-extraction buffer, ethanol, trichloroacetic acid, B-PER, and RNAlater) on the proteome integrity of the marine cyanobacterium Synechococcus WH8102 after four weeks of storage at room temperature. Four approaches were used to assess degradation: total protein recovery, band integrity on an SDS-PAGE gel, and the number of protein identifications and relative protein abundances by 1D LC-MS/MS proteomic analyses. Total protein recoveries from the preserved samples were lower than the frozen control due to processing losses, which could be corrected for with internal standardization. The trichloroacetic acid-preserved sample showed significant loss of protein band integrity on the SDS-PAGE gel. The RNAlater-preserved sample showed the highest number of protein identifications (103% relative to the control; 520 ± 31 identifications in RNAlater versus 504 ± 4 in the control), equivalent to the frozen control. Relative abundances of individual proteins in the RNAlater treatment were quite similar to those of the frozen control (average ratio of 1.01 ± 0.27 for the 50 most abundant proteins), while the SDS-extraction buffer, ethanol, and B-PER all showed significant decreases in both the number of identifications and the relative abundances of individual proteins. Based on these findings, RNAlater was an effective proteome preservative, although further study is warranted on additional marine microbes.

  14. Unscented Sampling Techniques For Evolutionary Computation With Applications To Astrodynamic Optimization

    Science.gov (United States)

    2016-09-01

    ...constrained optimization problems. The second goal is to improve computation times and efficiencies associated with evolutionary algorithms. The last goal is...to both genetic algorithms and evolution strategies to achieve these goals. The results of this research offer a promising new set of modified... Keywords: evolutionary computation, parallel processing, unscented sampling.
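
    "Unscented sampling" refers to the unscented transform's sigma points: 2n+1 deterministic samples that reproduce a distribution's mean and covariance exactly. The sketch below is the standard construction (with the scaling parameter α fixed to 1); how the thesis couples it to genetic algorithms and evolution strategies is not specified here.

```python
import numpy as np

def sigma_points(mean, cov, kappa=0.0):
    """Sigma points and mean-weights of the unscented transform (alpha = 1)."""
    n = len(mean)
    lam = kappa  # with alpha = 1: lambda = alpha^2 * (n + kappa) - n = kappa
    sqrt_mat = np.linalg.cholesky((n + lam) * cov)   # matrix square root
    pts = [mean] + [mean + sqrt_mat[:, i] for i in range(n)] \
                 + [mean - sqrt_mat[:, i] for i in range(n)]
    wm = np.full(2 * n + 1, 1.0 / (2 * (n + lam)))
    wm[0] = lam / (n + lam)
    return np.array(pts), wm

mu = np.array([1.0, 2.0])
P = np.array([[2.0, 0.3], [0.3, 1.0]])
pts, wm = sigma_points(mu, P)
print(np.allclose(wm @ pts, mu))  # True: the weighted mean reproduces mu
```

    Because the points are symmetric about the mean, both the weighted mean and the weighted covariance of the set reproduce (mu, P) exactly, which is what makes the points attractive as a structured alternative to purely random sampling in an evolutionary population.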

  15. Comparison of soil solution sampling techniques to assess metal fluxes from contaminated soil to groundwater.

    Science.gov (United States)

    Coutelot, F; Sappin-Didier, V; Keller, C; Atteia, O

    2014-12-01

    The unsaturated zone plays a major role in elemental fluxes in terrestrial ecosystems. A representative chemical analysis of soil pore water is required for the interpretation of soil chemical phenomena and particularly to assess Trace Element (TE) mobility. This requires an optimal sampling system to avoid modification of the extracted soil water chemistry and allow for an accurate estimation of solute fluxes. In this paper, the chemical composition of soil solutions sampled by Rhizon® samplers connected to a standard syringe was compared to that from two other types of suction probes (Rhizon® + vacuum tube and Rhizon® + diverted flow system). We investigated the effects of different vacuum application procedures on concentrations of spiked elements (Cr, As, Zn), mixed as powder into the first 20 cm of 100-cm columns, and non-spiked element (Ca, Na, Mg) concentrations in two types of columns (SiO2 sand and a mixture of kaolinite + SiO2 sand substrates). Rhizon® samplers were installed at different depths. The metal concentrations showed that (i) in sand, peak concentrations cannot be correctly sampled, thus the flux cannot be estimated, and the errors can easily reach a factor of 2; (ii) in sand + clay columns, peak concentrations were larger, indicating that they could be sampled, but, due to sorption on clay, it was not possible to compare fluxes at different depths. The different samplers tested were not able to reflect the elemental flux to groundwater and, although the Rhizon® + syringe device was more accurate, the best solution remains the use of a lysimeter whose bottom is kept continuously at a suction close to the one existing in the soil.

  16. Efficacy of Manual Therapy Including Neurodynamic Techniques for the Treatment of Carpal Tunnel Syndrome: A Randomized Controlled Trial.

    Science.gov (United States)

    Wolny, Tomasz; Saulicz, Edward; Linek, Paweł; Shacklock, Michael; Myśliwiec, Andrzej

    2017-05-01

    The purpose of this randomized trial was to compare the efficacy of manual therapy, including the use of neurodynamic techniques, with electrophysical modalities in patients with mild and moderate carpal tunnel syndrome (CTS). The study included 140 CTS patients who were randomly assigned to the manual therapy (MT) group, which included the use of neurodynamic techniques, functional massage, and carpal bone mobilization techniques, or to the electrophysical modalities (EM) group, which included laser and ultrasound therapy. Nerve conduction, pain severity, symptom severity, and functional status measured by the Boston Carpal Tunnel Questionnaire were assessed before and after treatment. Therapy was conducted twice weekly and both groups received 20 therapy sessions. A baseline assessment revealed group differences in sensory conduction of the median nerve. After therapy, analysis of variance revealed group differences in pain severity. Both therapies had a positive effect on nerve conduction, pain reduction, functional status, and subjective symptoms in individuals with CTS; however, the results regarding pain reduction, subjective symptoms, and functional status were better in the MT group. Copyright © 2017. Published by Elsevier Inc.

  17. The effect of dead time on randomly sampled power spectral estimates

    DEFF Research Database (Denmark)

    Buchhave, Preben; Velte, Clara Marika; George, William K.

    2014-01-01

    We consider both the effect on the measured spectrum of a finite sampling time, i.e., a finite time during which the signal is acquired, and the effect of a finite dead time, that is, a time in which the signal processor is busy evaluating a data point and is therefore unable to measure a subsequent data point arriving within the dead-time delay.
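
    The dead-time effect described above can be illustrated with a small simulation (a sketch under simplifying assumptions, not the authors' spectral estimator): Poisson-distributed sample arrivals are thinned by a non-paralyzable dead time τ, and the surviving sample rate follows the classical correction m = n / (1 + nτ). All rates and durations below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def thin_by_dead_time(arrival_times, dead_time):
    """Keep only arrivals occurring at least `dead_time` after the
    previously accepted arrival (non-paralyzable dead time)."""
    accepted = []
    last = -np.inf
    for t in arrival_times:
        if t - last >= dead_time:
            accepted.append(t)
            last = t
    return np.array(accepted)

# Poisson random sampling at mean rate `rate`; a busy processor
# rejects any sample arriving within `tau` of the last accepted one.
rate, tau, T = 1000.0, 0.5e-3, 10.0        # Hz, s, s (illustrative)
gaps = rng.exponential(1.0 / rate, size=int(2 * rate * T))
times = np.cumsum(gaps)
times = times[times < T]

kept = thin_by_dead_time(times, tau)
measured_rate = len(kept) / T
predicted = rate / (1.0 + rate * tau)      # classical correction
print(round(measured_rate), round(predicted))
```

    The thinning visibly biases the measured rate downward, which is the kind of distortion that propagates into randomly sampled spectral estimates.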

  18. Spatial distribution of metals in soil samples from Zona da Mata, Pernambuco, Brazil using XRF technique

    Energy Technology Data Exchange (ETDEWEB)

    Fernandez, Zahily Herrero; Santos Junior, Jose Araujo dos; Amaral, Romilton dos Santos; Menezes, Romulo Simoes Cezar; Santos, Josineide Marques do Nascimento; Bezerra, Jairo Dias; Damascena, Kennedy Francys Rodrigues, E-mail: zahily1985@gmail.com, E-mail: jaraujo@ufpe.br, E-mail: romilton@ufpe.br, E-mail: rmenezes@ufpe.br, E-mail: neideden@hotmail.com, E-mail: jairo.dias@ufpe.br, E-mail: kennedy.eng.ambiental@gmail.com [Universidade Federal de Pernambuco (UFPE), Recife, PE (Brazil). Centro de Tecnologia e Geociencias. Departamento de Energia Nuclear; Alvarez, Juan Reinaldo Estevez, E-mail: jestevez@ceaden.cu [Centro de Aplicaciones Tecnologicas y Desarrollo Nuclear (CEADEN), Havana (Cuba); Silva, Edvane Borges da, E-mail: edvane.borges@pq.cnpq.br [Universidade Federal de Pernambuco (UFPE), Vitoria de Santo Antao, PE (Brazil). Nucleo de Biologia; Franca, Elvis Joacir de; Farias, Emerson Emiliano Gualberto de, E-mail: ejfranca@cnen.gov.br, E-mail: emersonemiliano@yahoo.com.br [Centro Regional de Ciencias Nucleares do Nordeste (CRCN-NE/CNEN-PE), Recife, PE (Brazil); Silva, Alberto Antonio da, E-mail: alberto.silva@barreiros.ifpe.edu.br [Instituto Federal de Educacao, Ciencia e Tecnologia de Pernambuco (IFPE), Barreiros, PE (Brazil)

    2015-07-01

    Soil contamination is today one of the most important environmental issues for society. In the past, soil pollution was not considered as important as air and water contamination because it was more difficult to control; it has since become an important topic in studies of environmental protection worldwide. Based on this, this paper provides information on the determination of metals in soil samples collected in Zona da Mata, Pernambuco, Brazil, where pesticides, insecticides, and other agricultural additives are normally applied in a disorderly manner and without control. A total of 24 sampling points were monitored. Analyses of Mn, Fe, Ni, Zn, Br, Rb, Sr, Pb, Ti, La, Al, Si and P were performed using Energy Dispersive X-Ray Fluorescence. To assess the analytical method, inorganic Certified Reference Materials (IAEA-SOIL-7 and SRM 2709) were analyzed. For each sampling site, the geoaccumulation index was calculated to estimate the level of metal contamination in the soil, taking into account Resolution 460 of the National Environmental Council (CONAMA in Portuguese). The elemental distribution patterns obtained for each metal were associated with different pollution sources. This assessment provides an initial description of the pollution levels presented by metals in soils from several areas of Zona da Mata, providing quantitative evidence and demonstrating the need to improve the regulation of agricultural and industrial activities. (author)
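
    For reference, the geoaccumulation index mentioned above is conventionally computed as Igeo = log2(Cn / (1.5 · Bn)); a minimal sketch (the concentrations below are invented for illustration, not the paper's data):

```python
import math

def geoaccumulation_index(c_sample, c_background):
    """Mueller geoaccumulation index Igeo = log2(Cn / (1.5 * Bn)):
    Cn is the measured concentration, Bn the geochemical background,
    and the factor 1.5 absorbs natural background fluctuations."""
    return math.log2(c_sample / (1.5 * c_background))

# Hypothetical concentrations in mg/kg (illustrative only):
igeo = geoaccumulation_index(90.0, 30.0)
print(igeo)  # 1.0, since log2(90 / 45) = 1
```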

  19. Bilayer technique and nano-filled coating increase success of approximal ART restorations: a randomized clinical trial.

    Science.gov (United States)

    Hesse, Daniela; Bonifácio, Clarissa Calil; Guglielmi, Camila de Almeida Brandão; Bönecker, Marcelo; van Amerongen, Willem Evert; Raggio, Daniela Prócida

    2016-05-01

    The high-viscosity consistency of glass ionomer cement (GIC) may lead to its incorrect adaptation into the cavity and therefore to restoration failure. To compare two different insertion techniques for GIC in approximal atraumatic restorative treatment (ART) restorations and two different surface protection materials. Approximal caries lesions in primary molars of 208 schoolchildren were randomly assigned to four groups: G1, conventional GIC insertion protected with petroleum jelly (PJ); G2, bilayer technique protected with PJ; G3, conventional GIC insertion protected with a nano-filled coating for GIC (NPC); G4, bilayer technique protected with NPC. Restorations were evaluated after 1, 6, 12, 18, 24, and 36 months. Kaplan-Meier survival analysis and the log-rank test were performed. Cox regression analysis (α = 5%) was used to verify the influence of clinical factors. Restoration survival was 52.8%. The log-rank test indicated better survival of the bilayer technique restorations compared with conventional restorations (P = 0.005), whereas the coated conventional restorations presented higher survival than the uncoated ones (P = 0.035). Cox regression analysis showed no influence of any clinical variable tested. The survival rate of approximal ART restorations is positively influenced by the bilayer technique, and the application of a nano-filled coating increases the longevity of conventional approximal ART restorations. © 2015 BSPD, IAPD and John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  20. Hypoalgesic effects of three different manual therapy techniques on cervical spine and psychological interaction: A randomized clinical trial.

    Science.gov (United States)

    Alonso-Perez, Jose Luis; Lopez-Lopez, Almudena; La Touche, Roy; Lerma-Lara, Sergio; Suarez, Emilio; Rojas, Javier; Bishop, Mark D; Villafañe, Jorge Hugo; Fernández-Carnero, Josué

    2017-10-01

    The purpose of this study was to evaluate the extent to which psychological factors interact with a particular manual therapy (MT) technique to induce hypoalgesia in healthy subjects. Seventy-five healthy volunteers (36 female, 39 male) were recruited in this double-blind, controlled, parallel study. Subjects were randomly assigned to receive a high-velocity low-amplitude technique (HVLA), joint mobilization, or cervical lateral glide mobilization (CLGM). Pressure pain threshold (PPT) over C7 unilaterally and over the trapezius muscle and lateral epicondyle bilaterally was measured immediately before and after a single MT technique was applied. Pain catastrophizing, depression, anxiety, and kinesiophobia were evaluated before treatment. The results indicate that hypoalgesia was observed in all groups after treatment in the neck and elbow regions. All techniques studied produced local and segmental hypoalgesic effects, supporting the results of previous studies of the individual interventions. The interaction between catastrophizing and the HVLA technique suggests that when the catastrophizing level is low or medium, the chance of success is high, but high levels of catastrophizing may result in a poor outcome after HVLA intervention. ClinicalTrials.gov Registration Number: NCT02782585. Copyright © 2016 Elsevier Ltd. All rights reserved.

  1. Augmentation vs nonaugmentation techniques for open repairs of Achilles tendon ruptures with early functional treatment: a prospective randomized study

    Directory of Open Access Journals (Sweden)

    Gündüz Tezeren

    2006-12-01

    A prospective randomized study was conducted to compare an augmentation technique versus a nonaugmentation technique, followed by early functional postoperative treatment, for operative repair of Achilles tendon ruptures. Twenty-four consecutive patients were assigned to two groups. Group I included 12 patients treated with the Lindholm augmentation technique, whereas group II included 12 patients treated with modified Kessler end-to-end repair. Thereafter, these patients were managed postoperatively with a below-knee cast for three weeks. Physiotherapy was initiated immediately after the cast was removed. Full weight bearing was allowed five weeks postoperatively in both groups. Two patients had reruptures in group II, whereas operative time was significantly longer in group I. The patients with reruptures underwent reoperation, and at the final follow-up it was observed that they could not resume sporting activities. The other objective and subjective results were similar between the two groups. Because of the rather high rerupture rate in the group treated with the nonaugmentation technique, we favor functional postoperative treatment with early ankle movement in patients treated with the augmentation technique for the management of acute rupture of the Achilles tendon.

  2. Effects of pushing techniques during the second stage of labor: A randomized controlled trial

    Directory of Open Access Journals (Sweden)

    Refika Genç Koyucu

    2017-10-01

    Conclusions: Although the duration of the second stage of labor was longer compared with the Valsalva pushing technique, women were able to give birth without requiring any verbal or visual instruction, without exceeding the two-hour limit, and without compromising fetal wellness or neonatal outcomes.

  3. Nonopioid versus opioid based general anesthesia technique for bariatric surgery: A randomized double-blind study

    Directory of Open Access Journals (Sweden)

    Mohamed Ahmed Mansour

    2013-01-01

    Objective: The objective of this study was to evaluate the efficacy and safety of general anesthesia without the use of any opioids, either systemic or intraperitoneal, in bariatric surgery. Methods: Prospective randomized controlled trial. Obese patients (body mass index >50 kg/m²) undergoing laparoscopic sleeve gastrectomy were recruited and provided signed informed consent. Patients were randomized using a computer-generated randomization table to receive either opioid- or non-opioid-based anesthesia. The patient and the investigator scoring patient outcome after surgery were blinded to the anesthetic protocol. Primary outcomes were hemodynamics (heart rate and systolic, diastolic, and mean arterial blood pressure) on induction and every half hour thereafter. Pain monitored by visual analog scale (VAS) 30 min after recovery, hourly for 2 h, and every 4 h for 24 h was also recorded, as were postoperative nausea and vomiting 30 min after recovery, patient satisfaction, and acute pain nurse satisfaction. Results: There was no difference in background characteristics between the groups. There were no statistically significant differences in outcomes such as heart rate, mean blood pressure, or O2 saturation between groups at any of the eight predetermined time points, but pain score and nurse satisfaction showed a trend toward better performance with non-opioid treatment. Conclusion: Non-opioid-based general anesthesia for bariatric surgery is as effective as opioid-based anesthesia. There is no need to use opioids for such surgery, especially as there was a trend toward less pain with non-opioid anesthesia.

  4. Quantification of the overall measurement uncertainty associated with the passive moss biomonitoring technique: Sample collection and processing.

    Science.gov (United States)

    Aboal, J R; Boquete, M T; Carballeira, A; Casanova, A; Debén, S; Fernández, J A

    2017-05-01

    In this study we examined 6080 data points gathered by our research group during more than 20 years of research on the moss biomonitoring technique, in order to quantify the variability generated by different aspects of the protocol and to calculate the overall measurement uncertainty associated with the technique. The median variance of the concentrations of different pollutants measured in moss tissues attributed to the different methodological aspects was high, reaching values of 2851 (ng·g-1)2 for Cd (sample treatment), 35.1 (μg·g-1)2 for Cu (sample treatment), and 861.7 (ng·g-1)2 for Hg (material selection). These variances correspond to standard deviations that constitute 67%, 126%, and 59% of the regional background levels of these elements in the study region. The overall measurement uncertainty associated with the worst experimental protocol (5 subsamples, refrigerated, washed, 5 × 5 m sampling area, and once-a-year sampling) was between 2 and 6 times higher than that associated with the optimal protocol (30 subsamples, dried, unwashed, 20 × 20 m sampling area, and once-a-week sampling), and between 1.5 and 7 times higher than that associated with the standardized protocol (30 subsamples and once-a-year sampling). The overall measurement uncertainty associated with the standardized protocol could generate variations of between 14 and 47% in the regional background levels of Cd, Cu, Hg, Pb and Zn in the study area, and much higher levels of variation at polluted sampling sites. We demonstrated that although the overall measurement uncertainty of the technique is still high, it can be reduced by using already well defined aspects of the protocol. Further standardization of the protocol, together with application of the information on the overall measurement uncertainty, would improve the reliability and comparability of the results of different biomonitoring studies, thus extending use of the technique beyond the context of scientific research.
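
    As a sketch of how such variance components combine into an overall (combined) standard uncertainty, assuming the components are independent and expressed in the same squared units (only the 2851 (ng·g-1)2 Cd sample-treatment variance comes from the study; the other component values and the background level below are invented for illustration):

```python
import math

def combined_standard_uncertainty(variance_components):
    """Combine independent variance components (same squared units)
    into a combined standard uncertainty: u_c = sqrt(sum of variances)."""
    return math.sqrt(sum(variance_components))

# Hypothetical Cd budget in (ng/g)^2: sample treatment (from the study),
# plus invented material-selection and sampling-area components.
components = [2851.0, 400.0, 250.0]
u_c = combined_standard_uncertainty(components)

background = 80.0                  # invented regional background, ng/g
relative_pct = 100.0 * u_c / background
print(round(u_c, 1), round(relative_pct, 1))
```

    Expressing u_c as a percentage of the regional background, as the paper does, makes uncertainty budgets comparable across elements with very different concentration ranges.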

  5. Western pond turtle: Biology, sampling techniques, inventory and monitoring, conservation, and management: Northwest Fauna No. 7

    Science.gov (United States)

    Bury, R.B.; Welsh, Hartwell H.; Germano, David J.; Ashton, Donald T.

    2012-01-01

    One of only two native, freshwater turtle species in the western United States, western pond turtles are declining in portions of their original range. Declines are mostly due to habitat loss, introduction of non-native species, pollution, and lack of connectivity among populations. USGS zoologist R. Bruce Bury and colleagues from the U.S. Forest Service, California State University, and other agencies compiled and edited a new review and field manual of this charismatic species. Objectives were to determine its current distribution and abundance, summarize and evaluate population features, review techniques to detect population and habitat changes, and improve monitoring for long-term trends. Methods described in the manual should improve consistency, efficiency, and accuracy of survey data, resulting in improved management and conservation efforts.

  6. Sampling Technique for Robust Odorant Detection Based on MIT RealNose Data

    Science.gov (United States)

    Duong, Tuan A.

    2012-01-01

    This technique enhances the detection capability of the autonomous RealNose system from MIT to detect odorants and their concentrations in noisy and transient environments. The low-cost, portable system with low power consumption will operate at high speed and is suited for unmanned and remotely operated long-life applications. A deterministic mathematical model was developed to detect odorants and calculate their concentration in noisy environments. Real data from MIT's NanoNose was examined, from which a signal conditioning technique was proposed to enable robust odorant detection for the RealNose system. Its sensitivity can reach sub-part-per-billion (sub-ppb) levels. A Space Invariant Independent Component Analysis (SPICA) algorithm was developed to deal with non-linear mixing in the over-complete case, and it is used as a preprocessing step to recover the original odorant sources for detection. This approach, combined with the Cascade Error Projection (CEP) neural network algorithm, was used to perform odorant identification. Signal conditioning is used to identify potential processing windows to enable robust detection for autonomous systems. So far, the software has been developed and evaluated with current data sets provided by the MIT team. However, continuous data streams are made available where even the occurrence of a new odorant is unannounced and needs to be noticed by the system autonomously before its unambiguous detection. The challenge for the software is to be able to separate the potential valid signal of the odorant from the noisy transition region when the odorant is first introduced.

  7. The capillary adhesion technique: a versatile method for determining the liquid adhesion force and sample stiffness.

    Science.gov (United States)

    Gandyra, Daniel; Walheim, Stefan; Gorb, Stanislav; Barthlott, Wilhelm; Schimmel, Thomas

    2015-01-01

    We report a novel, practical technique for the concerted, simultaneous determination of both the adhesion force of a small structure or structural unit (e.g., an individual filament, hair, micromechanical component or microsensor) to a liquid and its elastic properties. The method involves the creation and development of a liquid meniscus upon touching a liquid surface with the structure, and the subsequent disruption of this liquid meniscus upon removal. The evaluation of the meniscus shape immediately before snap-off of the meniscus allows the quantitative determination of the liquid adhesion force. Concurrently, by measuring and evaluating the deformation of the structure under investigation, its elastic properties can be determined. The sensitivity of the method is remarkably high, practically limited by the resolution of the camera capturing the process. Adhesion forces down to 10 µN and spring constants up to 2 N/m were measured. Three exemplary applications of this method are demonstrated: (1) determination of the water adhesion force and the elasticity of individual hairs (trichomes) of the floating fern Salvinia molesta. (2) The investigation of human head hairs both with and without functional surface coatings (a topic of high relevance in the field of hair cosmetics) was performed. The method also resulted in the measurement of an elastic modulus (Young's modulus) for individual hairs of 3.0 × 10⁵ N/cm², which is within the typical range known for human hair. (3) Finally, the accuracy and validity of the capillary adhesion technique was proven by examining calibrated atomic force microscopy cantilevers, reproducing the spring constants calibrated using other methods.

  9. Beyond simple small-angle X-ray scattering: developments in online complementary techniques and sample environments

    Directory of Open Access Journals (Sweden)

    Wim Bras

    2014-11-01

    Small- and wide-angle X-ray scattering (SAXS, WAXS) are standard tools in materials research. The simultaneous measurement of SAXS and WAXS data in time-resolved studies has gained popularity due to the complementary information obtained. Furthermore, the combination of these data with non-X-ray-based techniques, via either simultaneous or independent measurements, has advanced understanding of the driving forces that lead to the structures and morphologies of materials, which in turn give rise to their properties. The simultaneous measurement of different data regimes and types, using either X-rays or neutrons, and the desire to control parameters that initiate and control structural changes have led to greater demands on sample environments. Examples of developments in technique combinations and sample environment design are discussed, together with a brief speculation about promising future developments.

  10. Optimizing EUS-guided liver biopsy sampling: comprehensive assessment of needle types and tissue acquisition techniques.

    Science.gov (United States)

    Schulman, Allison R; Thompson, Christopher C; Odze, Robert; Chan, Walter W; Ryou, Marvin

    2017-02-01

    EUS-guided liver biopsy sampling using FNA and, more recently, fine-needle biopsy (FNB) needles has been reported with discrepant diagnostic accuracy, in part due to differences in methodology. We aimed to compare the liver histologic yields of 4 EUS-based needles and 2 percutaneous needles to identify the optimal number of needle passes and use of suction. Six needle types were tested on human cadaveric tissue: one 19G FNA needle, one existing 19G FNB needle, one novel 19G FNB needle, one 22G FNB needle, and two 18G percutaneous needles (18G1 and 18G2). Two needle excursion patterns (1 vs 3 fanning passes) were performed with all EUS needles. The primary outcome was the number of portal tracts. Secondary outcomes were degree of fragmentation and specimen adequacy. Pairwise comparisons were performed using t tests with 2-sided P values. Liver biopsy samplings (48 per needle type) were performed. The novel 19G FNB needle had significantly more mean portal tracts compared with all other needle types. The 22G FNB needle had significantly more portal tracts compared with the 18G1 needle (3.8 vs 2.5). The novel 19G FNB liver biopsy needle provides superior histologic yield compared with 18G percutaneous needles and existing 19G FNA and core needles. Moreover, the 22G FNB needle may be adequate for liver biopsy sampling. Investigations are underway to determine whether these results can be replicated in a clinical setting. Copyright © 2017 American Society for Gastrointestinal Endoscopy. Published by Elsevier Inc. All rights reserved.

  11. Random Evolutionary Dynamics Driven by Fitness and House-of-Cards Mutations: Sampling Formulae

    Science.gov (United States)

    Huillet, Thierry E.

    2017-07-01

    We first revisit the multi-allelic mutation-fitness balance problem, especially when mutations obey a house-of-cards condition, where the discrete-time deterministic evolutionary dynamics of the allelic frequencies derives from a Shahshahani potential. We then consider multi-allelic Wright-Fisher stochastic models whose deviation from neutrality derives from the Shahshahani mutation/selection potential. We next focus on the weak-selection, weak-mutation cases and, making use of a Gamma calculus, compute the normalizing partition functions of the invariant probability densities appearing in their Wright-Fisher diffusive approximations. Using these results, generalized Ewens sampling formulae (ESF) are derived from the equilibrium distributions. We first treat the ESF in the mixed mutation/selection potential case and then restrict ourselves to the ESF in the simpler house-of-cards, mutations-only situation. We also address some issues concerning sampling problems from infinitely-many-alleles weak limits.
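
    The classical Ewens sampling formula that these generalized formulae extend can be evaluated directly; a minimal sketch for the neutral infinitely-many-alleles model (the standard formula, not the paper's generalized ESF):

```python
from math import factorial

def ewens_probability(counts, theta):
    """Ewens sampling formula: probability of an allelic partition of a
    sample of size n under the neutral infinitely-many-alleles model.
    counts[j-1] = a_j = number of allele types seen exactly j times."""
    n = sum(j * a for j, a in enumerate(counts, start=1))
    rising = 1.0                       # rising factorial theta^(n)
    for k in range(n):
        rising *= theta + k
    p = factorial(n) / rising
    for j, a in enumerate(counts, start=1):
        p *= theta ** a / (j ** a * factorial(a))
    return p

# All allelic partitions of a sample of size n = 3, with theta = 1:
theta = 1.0
partitions = [[3, 0, 0], [1, 1, 0], [0, 0, 1]]  # singletons / pair / triple
probs = [ewens_probability(c, theta) for c in partitions]
print(probs, sum(probs))  # the probabilities sum to 1
```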

  12. Dual to Ratio-Cum-Product Estimator in Simple and Stratified Random Sampling

    OpenAIRE

    Yunusa Olufadi

    2013-01-01

    New estimators for estimating the finite population mean using two auxiliary variables under simple and stratified random sampling designs are proposed. Their properties (e.g., mean square error) are studied to the first order of approximation. Moreover, some existing estimators are shown to be particular members of the proposed estimator. Furthermore, comparison of the proposed estimator with the usual unbiased estimator and other estimators considered in this paper reveals interesting results. These results are fur...
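
    As background, a sketch of the classical ratio-cum-product estimator on which such dual estimators build (this is the baseline estimator, not the paper's proposed dual; the population below is synthetic and purely illustrative):

```python
import numpy as np

def ratio_cum_product_estimate(y, x, z, X_bar, Z_bar):
    """Classical ratio-cum-product estimator of the population mean of y:
    ybar * (X_bar / xbar) * (zbar / Z_bar). The auxiliary variable x is
    positively correlated with y (ratio part) and z is negatively
    correlated (product part); X_bar, Z_bar are known population means."""
    return y.mean() * (X_bar / x.mean()) * (z.mean() / Z_bar)

rng = np.random.default_rng(1)

# Synthetic population (illustrative): y rises with x and falls with z.
N = 10_000
x = rng.uniform(10, 20, N)
z = rng.uniform(5, 15, N)
y = 2.0 * x - 0.5 * z + rng.normal(0.0, 1.0, N)

idx = rng.choice(N, size=200, replace=False)   # SRS without replacement
est = ratio_cum_product_estimate(y[idx], x[idx], z[idx], x.mean(), z.mean())
print(round(est, 2), round(y.mean(), 2))
```

    The adjustment exploits the known auxiliary means: when the sample happens to over-represent large x, the factor X_bar/xbar pulls the estimate back toward the population mean.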

  13. The psychometric properties of the AUDIT: a survey from a random sample of elderly Swedish adults.

    Science.gov (United States)

    Källmén, Håkan; Wennberg, Peter; Ramstedt, Mats; Hallgren, Mats

    2014-07-01

    Increasing alcohol consumption and related harms have been reported among the elderly population of Europe. Consequently, it is important to monitor patterns of alcohol use, and to use a valid and reliable tool when screening for risky consumption in this age group. The aim was to evaluate the internal consistency reliability and construct validity of the Alcohol Use Disorders Identification Test (AUDIT) in elderly Swedish adults, and to compare the results with the general Swedish population. Another aim was to calculate the level of alcohol consumption (AUDIT-C) to be used for comparison in future studies. The questionnaire was sent to 1459 Swedish adults aged 79-80 years, with a response rate of 73.3%. Internal consistency reliability was assessed using Cronbach's alpha, and confirmatory factor analysis assessed the construct validity of the AUDIT in the elderly population as compared with a Swedish general population sample. The results showed that the AUDIT was more reliable and valid in the Swedish general population sample than among the elderly, and that Items 1 and 4 of the AUDIT were less reliable and valid among the elderly. While the AUDIT showed acceptable psychometric properties in the general population sample, its performance was poorer among the elderly respondents. Further psychometric assessment of the AUDIT in elderly populations is required before it is implemented more widely.

  14. The quality of the reported sample size calculations in randomized controlled trials indexed in PubMed.

    Science.gov (United States)

    Lee, Paul H; Tse, Andy C Y

    2017-05-01

    There are limited data on the quality of reporting of information essential for replication of sample size calculations, as well as on the accuracy of those calculations. We examined the current quality of reporting of the sample size calculation in randomized controlled trials (RCTs) published in PubMed, the variation in reporting across study design, study characteristics, and journal impact factor, and the targeted sample sizes reported in trial registries. We reviewed and analyzed all RCTs published in December 2014 in journals indexed in PubMed. The 2014 Impact Factors for the journals were used as proxies for their quality. Of the 451 analyzed papers, 58.1% reported an a priori sample size calculation. Nearly all papers provided the level of significance (97.7%) and desired power (96.6%), and most reported the minimum clinically important effect size (73.3%). The median percentage difference between the reported and recalculated sample sizes was 0.0% (inter-quartile range: -4.6% to 3.0%). The accuracy of the reported sample size was better for studies published in journals that endorsed the CONSORT statement and journals with an impact factor. A total of 98 papers provided a targeted sample size in trial registries, and about two-thirds of these papers (n=62) reported a sample size calculation, but only 25 (40.3%) had no discrepancy with the number reported in the trial registries. The reporting of sample size calculations in RCTs published in PubMed-indexed journals and trial registries was poor. The CONSORT statement should be more widely endorsed. Copyright © 2016 European Federation of Internal Medicine. Published by Elsevier B.V. All rights reserved.
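
    An a priori sample size calculation of the kind the survey checks for can be sketched with the standard normal-approximation formula for comparing two means (a simplified illustration; real trials often add t-distribution or dropout corrections):

```python
from math import ceil
from statistics import NormalDist

def n_per_group(effect_size, alpha=0.05, power=0.80):
    """Approximate per-group sample size for a two-arm comparison of
    means via the normal approximation:
        n = 2 * ((z_{1-alpha/2} + z_{power}) / d) ** 2,
    where d is the standardized (Cohen's) effect size."""
    z = NormalDist().inv_cdf
    return ceil(2 * ((z(1 - alpha / 2) + z(power)) / effect_size) ** 2)

# A trial powered to detect d = 0.5 at alpha = 0.05 and 80% power:
print(n_per_group(0.5))  # 63 per group under the normal approximation
```

    Reporting the three inputs (significance level, power, and minimum clinically important effect size) is exactly what makes such a calculation replicable by a reader.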

  15. [Repair of primary inguinal hernia: Lichtenstein versus Shouldice techniques. Prospective randomized study of pain and hospital costs].

    Science.gov (United States)

    Porrero, José L; Bonachía, Oscar; López-Buenadicha, Adolfo; Sanjuanbenito, Alfonso; Sánchez-Cabezudo, Carlos

    2005-02-01

    Hernia is one of the most widely studied processes, and the search for excellence has become the final aim. However, many controversies remain to be resolved. The objective of the present study was to analyze postoperative pain and costs using two techniques of primary inguinal hernia repair. We performed a prospective, randomized study of 54 patients who underwent surgical repair of inguinal hernia through either the Lichtenstein or the Shouldice technique between June 2001 and May 2002. The following variables were analyzed: age, location and type of hernia, evaluation of tolerance to local anesthesia, surgical technique, operating time, pain at days 1, 3 and 5 after surgery, analgesic consumption, days until driving could be resumed, days off work, and occupation. The patient groups were similar, with no significant differences in age, location or type of hernia. For Lichtenstein hernioplasty, operating time was lower (p < 0.01); pain evaluation showed no significant differences on days 1 and 3 after surgery but was higher on day 5 (p = 0.064). No significant differences were found in analgesic consumption, time before driving could be resumed, or days off work. Freelance patients returned to work earlier, independently of the surgical technique performed. The cost of the Lichtenstein technique was 235 euros compared with 180 euros for the Shouldice technique and this difference was statistically significant (p < 0.05). In the hands of expert surgeons, the Shouldice technique is the procedure of choice in the repair of primary hernias. The results are just as satisfactory as those obtained with Lichtenstein hernioplasty and hospital costs are lower.

  16. Versatile combustion-amalgamation technique for the photometric determination of mercury in fish and environmental samples

    Science.gov (United States)

    Willford, Wayne A.; Hesselberg, Robert J.; Bergman, Harold L.

    1973-01-01

    Total mercury in a variety of substances is determined rapidly and precisely by direct sample combustion, collection of released mercury by amalgamation, and photometric measurement of mercury volatilized from the heated amalgam. Up to 0.2 g fish tissue is heated in a stream of O2 (1.2 L/min) for 3.5 min in 1 tube of a 2-tube induction furnace. The released mercury vapor and combustion products are carried by the stream of O2 through a series of traps (6% NaOH scrubber, water condenser, and Mg(ClO4)2 drying tube) and the mercury is collected in a 10 mm diameter column of 24 gauge gold wire (8 g) cut into 3 mm lengths. The resulting amalgam is heated in the second tube of the induction furnace and the volatilized mercury is measured with a mercury vapor meter equipped with a recorder-integrator. Total analysis time is approximately 8 min/sample. The detection limit is less than 0.002 μg and the system is easily converted for use with other biological materials, water, and sediments.

  17. Investigation of Pectenotoxin Profiles in the Yellow Sea (China Using a Passive Sampling Technique

    Directory of Open Access Journals (Sweden)

    Zhaoxin Li

    2010-04-01

    Full Text Available Pectenotoxins (PTXs) are a group of lipophilic algal toxins. These toxins have been found in algae and shellfish from Japan, New Zealand, Ireland, Norway and Portugal. PTX profiles vary with the geographic location of the collection site. The aim of the present study was to investigate PTX profiles in the Yellow Sea, China. The sampling location was within an aquatic farm (N36°12.428´, E120°17.826´) near the coast of Qingdao, China, in the Yellow Sea, from 28 July to 29 August 2006. PTXs in seawater were determined using a solid phase adsorption toxin tracking (SPATT) method and were analyzed by HPLC-MS/MS. PTX-2, PTX-2 seco acid (PTX-2 SA) and 7-epi-PTX-2 SA were found in seawater samples. The highest levels of PTXs (107 ng/g of resin for PTX-2; 50 ng/g of resin for PTX-2 SA plus 7-epi-PTX-2 SA) were found on 1 August 2006. From 1 August to 29 August, the levels of PTX-2 and PTX-2 SA decreased. In the same area, the marine alga Dinophysis acuminata was found in the seawater during the summer months of 2006, indicating that D. acuminata might be the original source of the PTXs. PTX-11 and PTX-12a/b were not found in seawater.

  18. A comparison of quantitative reconstruction techniques for PIXE-tomography analysis applied to biological samples

    Energy Technology Data Exchange (ETDEWEB)

    Beasley, D.G., E-mail: dgbeasley@ctn.ist.utl.pt [IST/C2TN, Universidade de Lisboa, Campus Tecnológico e Nuclear, E.N.10, 2686-953 Sacavém (Portugal); Alves, L.C. [IST/C2TN, Universidade de Lisboa, Campus Tecnológico e Nuclear, E.N.10, 2686-953 Sacavém (Portugal); Barberet, Ph.; Bourret, S.; Devès, G.; Gordillo, N.; Michelet, C. [Univ. Bordeaux, CENBG, UMR 5797, F-33170 Gradignan (France); CNRS, IN2P3, CENBG, UMR 5797, F-33170 Gradignan (France); Le Trequesser, Q. [Univ. Bordeaux, CENBG, UMR 5797, F-33170 Gradignan (France); CNRS, IN2P3, CENBG, UMR 5797, F-33170 Gradignan (France); Institut de Chimie de la Matière Condensée de Bordeaux (ICMCB, UPR9048) CNRS, Université de Bordeaux, 87 avenue du Dr. A. Schweitzer, Pessac F-33608 (France); Marques, A.C. [IST/IPFN, Universidade de Lisboa, Campus Tecnológico e Nuclear, E.N.10, 2686-953 Sacavém (Portugal); Seznec, H. [Univ. Bordeaux, CENBG, UMR 5797, F-33170 Gradignan (France); CNRS, IN2P3, CENBG, UMR 5797, F-33170 Gradignan (France); Silva, R.C. da [IST/IPFN, Universidade de Lisboa, Campus Tecnológico e Nuclear, E.N.10, 2686-953 Sacavém (Portugal)

    2014-07-15

    The tomographic reconstruction of biological specimens requires robust algorithms, able to deal with low density contrast and low element concentrations. At the IST/ITN microprobe facility new GPU-accelerated reconstruction software, JPIXET, has been developed, which can significantly increase the speed of quantitative reconstruction of Proton Induced X-ray Emission Tomography (PIXE-T) data. It has a user-friendly graphical user interface for pre-processing, data analysis and reconstruction of PIXE-T and Scanning Transmission Ion Microscopy Tomography (STIM-T). The reconstruction of PIXE-T data is performed using either an algorithm based on a GPU-accelerated version of the Maximum Likelihood Expectation Maximisation (MLEM) method or a GPU-accelerated version of the Discrete Image Space Reconstruction Algorithm (DISRA) (Sakellariou (2001) [2]). The original DISRA, its accelerated version, and the MLEM algorithm, were compared for the reconstruction of a biological sample of Caenorhabditis elegans – a small worm. This sample was analysed at the microbeam line of the AIFIRA facility of CENBG, Bordeaux. A qualitative PIXE-T reconstruction was obtained using the CENBG software package TomoRebuild (Habchi et al. (2013) [6]). The effects of pre-processing and experimental conditions on the elemental concentrations are discussed.
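    The MLEM method named above iterates the multiplicative update x ← x · Aᵀ(y / Ax) / Aᵀ1 on a nonnegative linear system y ≈ Ax, starting from a flat image. A minimal pure-Python sketch of one such iteration loop; the toy system matrix and names are illustrative and unrelated to the JPIXET implementation:

    ```python
    def mlem(A, y, n_iter=50):
        """Maximum Likelihood Expectation Maximisation for y ≈ A x
        with nonnegative entries: x <- x * A^T(y / (A x)) / (A^T 1)."""
        n_rows, n_cols = len(A), len(A[0])
        x = [1.0] * n_cols                                     # flat initial image
        sens = [sum(A[i][j] for i in range(n_rows)) for j in range(n_cols)]  # A^T 1
        for _ in range(n_iter):
            proj = [sum(A[i][j] * x[j] for j in range(n_cols)) for i in range(n_rows)]
            ratio = [y[i] / max(proj[i], 1e-12) for i in range(n_rows)]
            back = [sum(A[i][j] * ratio[i] for i in range(n_rows)) for j in range(n_cols)]
            x = [x[j] * back[j] / max(sens[j], 1e-12) for j in range(n_cols)]
        return x
    ```

    On a tiny consistent system such as A = [[1,0],[0,1],[1,1]] with y = A·[2,3], the iteration converges to [2, 3]; the multiplicative form keeps the image nonnegative, which is why MLEM suits low-count PIXE-T data.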

  19. Experimental Study on the Sensitive Emission Lines Intensities of Metal Samples Using Laser Ablation Technique and Its Comparison to Arc Discharge Technique

    Directory of Open Access Journals (Sweden)

    Eko Susilowati

    2004-05-01

    Full Text Available An experimental study has been carried out to measure the sensitive emission line intensities of several metal samples (copper, zinc, silver, gold, gallium, nickel, silicon and iron) using a laser ablation technique conducted in a low-pressure surrounding gas, by means of Laser Induced Shock Wave Plasma Spectroscopy (LISPS), and at atmospheric pressure, using Laser Induced Breakdown Spectroscopy (LIBS). In both cases the Nd-YAG laser was operated at its fundamental wavelength of 1,064 nm with a pulse duration of 8 ns, and its output was tightly focused on the metal samples in helium or air as the ambient gas. The laser energy was fixed at approximately 100 mJ using a set of neutral density filters placed tilted in front of the laser output window. The intensity measurements showed good agreement with those obtained using the arc discharge technique, as given in the Massachusetts Institute of Technology Wavelength Table. Further evaluation of these results on the basis of standard deviation leads to the conclusion that LISPS is more favorable for quantitative analysis than LIBS. It was further shown that replacing air by helium gas at low pressure improves to some extent the LISPS reproducibility and sensitivity.

  20. Active Learning Not Associated with Student Learning in a Random Sample of College Biology Courses

    Science.gov (United States)

    Andrews, T. M.; Leonard, M. J.; Colgrove, C. A.; Kalinowski, S. T.

    2011-01-01

    Previous research has suggested that adding active learning to traditional college science lectures substantially improves student learning. However, this research predominantly studied courses taught by science education researchers, who are likely to have exceptional teaching expertise. The present study investigated introductory biology courses randomly selected from a list of prominent colleges and universities to include instructors representing a broader population. We examined the relationship between active learning and student learning in the subject area of natural selection. We found no association between student learning gains and the use of active-learning instruction. Although active learning has the potential to substantially improve student learning, this research suggests that active learning, as used by typical college biology instructors, is not associated with greater learning gains. We contend that most instructors lack the rich and nuanced understanding of teaching and learning that science education researchers have developed. Therefore, active learning as designed and implemented by typical college biology instructors may superficially resemble active learning used by education researchers, but lacks the constructivist elements necessary for improving learning. PMID:22135373

  1. Get the most out of blow hormones: validation of sampling materials, field storage and extraction techniques for whale respiratory vapour samples.

    Science.gov (United States)

    Burgess, Elizabeth A; Hunt, Kathleen E; Kraus, Scott D; Rolland, Rosalind M

    2016-01-01

    Studies are progressively showing that vital physiological data may be contained in the respiratory vapour (blow) of cetaceans. Nonetheless, fundamental methodological issues need to be addressed before hormone analysis of blow can become a reliable technique. In this study, we performed controlled experiments in a laboratory setting, using known doses of pure parent hormones, to validate several technical factors that may play a crucial role in hormone analyses. We evaluated the following factors: (i) practical field storage of samples on small boats during daylong trips; (ii) efficiency of hormone extraction methods; and (iii) assay interference of different sampler types (i.e. veil nylon, nitex nylon mesh and polystyrene dish). Sampling materials were dosed with mock blow samples of known mixed hormone concentrations (progesterone, 17β-estradiol, testosterone, cortisol, aldosterone and triiodothyronine), designed to mimic endocrine profiles characteristic of pregnant females, adult males, an adrenal glucocorticoid response or a zero-hormone control (distilled H2O). Results showed that storage of samples in a cooler on ice preserved hormone integrity for at least 6 h (P = 0.18). All sampling materials and extraction methods yielded the correct relative patterns for all six hormones. However, veil and nitex mesh produced detectable assay interference (mean 0.22 ± 0.04 and 0.18 ± 0.03 ng/ml, respectively), possibly caused by some nylon-based component affecting antibody binding. Polystyrene dishes were the most efficacious sampler for accuracy and precision in hormone analysis of whale blow.

  2. Comparative Evaluation of Gingival Depigmentation by Tetrafluroethane Cryosurgery and Surgical Scalpel Technique. A Randomized Clinical Study

    OpenAIRE

    Narayankar, Suraj D.; Neeraj C Deshpande; Dave, Deepak H.; Thakkar, Dhaval J.

    2017-01-01

    Introduction: The importance of a good smile cannot be underestimated in enhancing the beauty, self-confidence and personality of a person. The health and appearance of the gingiva are an essential part of an attractive smile. Gingival pigmentation gives rise to an unesthetic smile line. In the present world, with increasing awareness of esthetics, people have become highly concerned about black gums. Various treatment modalities like abrasion, scraping, the scalpel technique, cryosurgery, electrosurgery and laser are a...

  3. Can the Alexander Technique improve balance and mobility in older adults with visual impairments? A randomized controlled trial.

    Science.gov (United States)

    Gleeson, Michael; Sherrington, Catherine; Lo, Serigne; Keay, Lisa

    2015-03-01

    To investigate the impact of Alexander Technique lessons on balance and mobility in older adults with visual impairments. Randomized, assessor-blinded controlled trial with intervention and usual-care control groups. Participants' homes. A total of 120 community-dwellers aged 50+ with visual impairments. Twelve weeks of Alexander lessons and usual care. Short Physical Performance Battery items were primary outcomes at 3 months and secondary outcomes at 12 months. Additional secondary outcomes were postural sway, maximal balance range and falls over 12 months. Between-group differences in primary outcomes were not significant. The intervention group reduced postural sway on a firm surface with eyes open at 3 months after adjusting for baseline values (-29.59 mm, 95% CI -49.52 to -9.67). Further investigation of the Alexander Technique is warranted. © The Author(s) 2014.

  4. Simultaneous detection of randomly arranged multiple barcodes using time division multiplexing technique

    Science.gov (United States)

    Haider, Saad Md. Jaglul; Islam, Md. Kafiul

    2010-02-01

    A method for detecting multiple barcodes simultaneously using a time division multiplexing technique is proposed in this paper, to minimize the effective time needed for handling multiple barcode tags and to lessen the overall workload. Available barcode detection systems can handle multiple types of barcode, but only a single barcode at a time. This is inefficient and can create long queues and chaos in places like mega shopping malls or large warehouses where huge numbers of barcodes must be scanned daily. Our proposed system is expected to improve the real-time identification of goods or products on production lines, in automated warehouses, and in mega shopping malls in a much more convenient and efficient way. To identify multiple barcodes simultaneously, a particular arrangement and orientation of laser scanner and reflector are used, with a special curve-shaped basement on which the barcodes are placed. An effective and novel algorithm is developed to extract information from multiple barcodes; it introduces starting and ending patterns in the barcodes, with bit stuffing, for the convenience of multiple detection. A CRC technique is also used to ensure the trustworthiness of detection. The overall system enhances the existing single-barcode detection system by a great amount, while remaining easy to implement and use.
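    The bit stuffing the abstract mentions can be illustrated with the classic HDLC-style rule: insert a 0 after every run of five consecutive 1s, so the frame's start/end delimiter pattern can never appear inside the payload. A hedged sketch of the idea, not the authors' actual algorithm:

    ```python
    def bit_stuff(bits):
        """Insert a 0 after every run of five consecutive 1s (HDLC-style),
        so a six-1s delimiter pattern cannot occur inside the payload."""
        out, run = [], 0
        for b in bits:
            out.append(b)
            run = run + 1 if b == 1 else 0
            if run == 5:
                out.append(0)   # stuffed bit
                run = 0
        return out

    def bit_unstuff(bits):
        """Undo bit_stuff: drop the 0 that follows every run of five 1s."""
        out, run, i = [], 0, 0
        while i < len(bits):
            b = bits[i]
            out.append(b)
            run = run + 1 if b == 1 else 0
            if run == 5:
                i += 1          # skip the stuffed 0
                run = 0
            i += 1
        return out
    ```

    Because stuffing is reversible and bounds runs of 1s at five, the receiver can scan for the reserved starting and ending patterns unambiguously, then unstuff the payload before the CRC check.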

  5. Elemental analysis of different varieties of rice samples using XRF technique

    Energy Technology Data Exchange (ETDEWEB)

    Kaur, Jaspreet, E-mail: gillpreet05051812@gmail.com; Kumar, Anil, E-mail: gilljaspreet06@gmail.com [Department of Basic and Applied Physics, Punjabi University, Patiala 147002 (India)

    2016-05-06

    Rice is the most consumed staple food in the world, providing over 21% of the calorie intake of the world's population, and has a high yielding capacity. Elements previously detected in rice by Instrumental Neutron Activation Analysis with k0 standardization include Al, As, Br, Cd, Cl, Co, Cs, Cu, Fe, Hg, K, Mg, Mn, Mo, Rb, Se and Zn (R. Jayasekera et al., 2004). Trace elements such as C, H, O, N, S, Ca, P, K, Na, Cl, Mn, Ti, Mg, Cu, Fe, Ni, Si and Zn are essential for the growth of the human physique. The deficiency or excess of these elements in food is known to cause a variety of malnutrition or health disorders in the world. Every year, new varieties of rice are launched by Punjab Agriculture University, Ludhiana, the main purpose being to increase the yield and attain maximum profit. However, this also changes the elemental concentrations in the rice, which may affect human health through variation in nutritional value. The main objective of this work is to study the elemental concentrations in various varieties of rice using the EDXRF technique.

  6. Comparison of standard exponential and linear techniques to amplify small cDNA samples for microarrays

    Directory of Open Access Journals (Sweden)

    von Arnold Sara

    2005-05-01

    Full Text Available Abstract Background The need to perform microarray experiments with small amounts of tissue has led to the development of several protocols for amplifying the target transcripts. The use of different amplification protocols could affect the comparability of microarray experiments. Results Here we compare expression data from Pinus taeda cDNA microarrays using transcripts amplified either exponentially by PCR or linearly by T7 transcription. The amplified transcripts vary significantly in estimated length, GC content and expression depending on the amplification technique. Amplification by T7 RNA polymerase gives transcripts with a greater range of lengths, greater estimated mean length, and greater variation of expression levels, but lower average GC content, than those from PCR amplification. For genes with significantly higher expression after T7 transcription than after PCR, the transcripts were 27% longer and had about 2 percentage units lower GC content. The correlation of expression intensities between technical repeats was high for both methods (R2 = 0.98), whereas the correlation of expression intensities between the two methods was considerably lower (R2 = 0.52). Correlations of expression intensities between amplified and unamplified transcripts were intermediate (R2 = 0.68–0.77). Conclusion Amplification with T7 transcription better reflects the variation of the unamplified transcriptome than PCR-based methods, owing to the better representation of long transcripts. If transcripts of particular interest are known to have high GC content and limited length, however, PCR-based methods may be preferable.

  7. Passive sampling of anionic pesticides using the Diffusive Gradients in Thin films technique (DGT).

    Science.gov (United States)

    Guibal, Robin; Buzier, Rémy; Charriau, Adeline; Lissalde, Sophie; Guibaud, Gilles

    2017-05-08

    DGT passive samplers using Oasis® HLB or Oasis® MAX sorbents were developed for sampling anionic pesticides. They were tested using four model compounds (bentazon, chlorsulfuron, ioxynil and mecoprop). Polyacrylamide diffusive gel was found to be more suitable than agarose gel for sampling most anionic pesticides. An elution procedure was optimized and diffusion coefficients were determined for quantitative use of the samplers. Depending on the DGT configuration used (HLB or MAX), accuracies better than 30% were demonstrated in the laboratory for pH from 3 to 8 and ionic strengths from 10⁻² to 1 M. Combined with the effective binding capacities of the samplers (≥9 μg for each pesticide) and the limits of quantification of the method (≤13 ng/L using a Q-TOF detector), monitoring of numerous aquatic systems can be expected. Except for ioxynil, accurate quantification was demonstrated in the laboratory using a spiked natural water for HLB-DGT, whereas MAX-DGT did not give satisfactory results. A further in situ validation was performed in two rivers and showed identical detection frequencies between HLB-DGT and POCIS for anionic pesticides (bentazon and mesotrione), although the calculated concentrations, while within the same order of magnitude, could differ. HLB-DGT could therefore constitute an interesting alternative to other passive samplers for the monitoring of several anionic pesticides in aquatic systems, but more work is required for quantification of molecules from the hydroxybenzonitrile chemical group (ioxynil). Copyright © 2017 Elsevier B.V. All rights reserved.
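    The reason the abstract stresses diffusion coefficients is that quantitative DGT rests on the standard DGT equation, C = M·Δg / (D·A·t): accumulated mass, gel thickness, diffusion coefficient, exposure area and deployment time give a time-weighted average concentration. A minimal sketch with illustrative argument names (units noted in the docstring, values not taken from the paper):

    ```python
    def dgt_concentration(mass_ng, gel_thickness_cm, diff_coeff_cm2_s, area_cm2, time_s):
        """Standard DGT equation C = M * dg / (D * A * t):
        M  accumulated analyte mass on the binding layer (ng),
        dg diffusive gel thickness (cm),
        D  diffusion coefficient in the gel (cm^2/s),
        A  exposure area (cm^2), t deployment time (s).
        Returns the time-weighted average concentration in ng/cm^3
        (multiply by 1000 for ng/L)."""
        return (mass_ng * gel_thickness_cm) / (diff_coeff_cm2_s * area_cm2 * time_s)
    ```

    This is why an accurate, matrix-appropriate D for each pesticide and gel type is a prerequisite for turning the eluted mass into a water concentration.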

  8. Long-term outcomes for different vaginal cuff closure techniques in robotic-assisted laparoscopic hysterectomy: A randomized controlled trial.

    Science.gov (United States)

    Tsafrir, Ziv; Palmer, Matthew; Dahlman, Marisa; Nawfal, A Karim; Aoun, Joelle; Taylor, Andrew; Fisher, Jay; Theoharis, Evan; Eisenstein, David

    2017-03-01

    This randomized controlled trial aimed to evaluate the outcomes of different vaginal cuff closure techniques in robotic-assisted total laparoscopic hysterectomy. Ninety women undergoing robotic-assisted total laparoscopic hysterectomy for benign disease were randomized to three vaginal cuff closure techniques: running 2.0 V-Lock™ (Arm 1), 0 Vicryl™ figure-of-eight (Arm 2), and running 0 Vicryl™ with Lapra-Ty® (Arm 3). Patients' records were reviewed for age, body mass index, smoking status and relevant co-morbidities. Operative times for vaginal closure and total length of surgery, estimated blood loss, and peri-operative complications were collected. Patients were evaluated at 2 and 6 weeks post-operatively, and interviewed 1 year following surgery by a telephone survey. Outcomes evaluated were vaginal cuff dehiscence, pain, dyspareunia and bleeding. The study arms did not differ with respect to estimated blood loss (50 mL in each arm; p = 0.34), median vaginal cuff closure time (14.5, 12 and 13 min, respectively; p = 0.09) or readmission (p = 0.55). In the 1-year follow-up (54/90 respondents; 60%), there were no significant differences among study arms for vaginal bleeding, cuff infection or dyspareunia. Only women belonging to Arm 3 reported vaginal pain (0%, 0% and 23%, respectively; p = 0.01). No cases of vaginal cuff dehiscence were observed. The type of closure technique has no significant impact on patient outcomes. In the absence of a clear advantage of one technique over the others, the decision regarding the preferred method to close the vaginal cuff in robotic-assisted total laparoscopic hysterectomy should be based on surgeons' preference and cost effectiveness. Copyright © 2016. Published by Elsevier B.V.

  9. Acupuncture-Related Techniques for Psoriasis: A Systematic Review with Pairwise and Network Meta-Analyses of Randomized Controlled Trials.

    Science.gov (United States)

    Yeh, Mei-Ling; Ko, Shu-Hua; Wang, Mei-Hua; Chi, Ching-Chi; Chung, Yu-Chu

    2017-12-01

    There has been a large body of evidence on pharmacological treatments for psoriasis, but whether nonpharmacological interventions are effective in managing psoriasis remains largely unclear. This systematic review conducted pairwise and network meta-analyses to determine the effects of acupuncture-related techniques of acupoint stimulation for the treatment of psoriasis and to determine the order of effectiveness of these remedies. This study searched the following databases from inception to March 15, 2016: Medline, PubMed, Cochrane Central Register of Controlled Trials, EBSCO (including Academic Search Premier, American Doctoral Dissertations, and CINAHL), Airiti Library, and China National Knowledge Infrastructure. Randomized controlled trials (RCTs) on the effects of acupuncture-related techniques of acupoint stimulation as an intervention for psoriasis were independently reviewed by two researchers. A total of 13 RCTs with 1,060 participants were included. The methodological quality of the included studies was not rigorous. Acupoint stimulation, compared with nonacupoint stimulation, had a significant treatment effect for psoriasis. However, the most common adverse events were thirst and dry mouth. Subgroup analysis further confirmed that the short-term treatment effect was superior to the long-term effect in treating psoriasis. Network meta-analysis identified that acupressure or acupoint catgut embedding, compared with medication, had a significant effect in improving psoriasis, with acupressure being the most effective treatment. Acupuncture-related techniques could be considered as an alternative or adjuvant therapy for psoriasis in the short term, especially acupressure and acupoint catgut embedding. This study recommends further well-designed, methodologically rigorous, and more head-to-head randomized trials to explore the effects of acupuncture-related techniques for treating psoriasis.

  10. Effect of the Mediterranean diet on heart failure biomarkers: a randomized sample from the PREDIMED trial.

    Science.gov (United States)

    Fitó, Montserrat; Estruch, Ramón; Salas-Salvadó, Jordi; Martínez-Gonzalez, Miguel Angel; Arós, Fernando; Vila, Joan; Corella, Dolores; Díaz, Oscar; Sáez, Guillermo; de la Torre, Rafael; Mitjavila, María-Teresa; Muñoz, Miguel Angel; Lamuela-Raventós, Rosa-María; Ruiz-Gutierrez, Valentina; Fiol, Miquel; Gómez-Gracia, Enrique; Lapetra, José; Ros, Emilio; Serra-Majem, Lluis; Covas, María-Isabel

    2014-05-01

    Scarce data are available on the effect of the traditional Mediterranean diet (TMD) on heart failure biomarkers. We assessed the effect of TMD on biomarkers related to heart failure in a population at high cardiovascular disease risk. A total of 930 subjects at high cardiovascular risk (420 men and 510 women) were recruited in the framework of a multicentre, randomized, controlled, parallel-group clinical trial directed at testing the efficacy of the TMD on the primary prevention of cardiovascular disease (the PREDIMED Study). Participants were assigned to a low-fat diet (control, n = 310) or one of two TMDs [TMD + virgin olive oil (VOO) or TMD + nuts]. Depending on group assignment, participants received free provision of extra-virgin olive oil, mixed nuts, or small non-food gifts. After 1 year of intervention, both TMDs decreased plasma N-terminal pro-brain natriuretic peptide, with changes reaching significance versus the control group. Subjects at high risk of cardiovascular disease (CVD) who improved their diet toward a TMD pattern reduced their N-terminal pro-brain natriuretic peptide compared with those assigned to a low-fat diet. The same was found for in vivo oxidized low-density lipoprotein and lipoprotein(a) plasma concentrations after the TMD + VOO diet. From our results, TMD could be a useful tool to mitigate risk factors for heart failure and could modify markers of heart failure towards a more protective mode. © 2014 The Authors. European Journal of Heart Failure © 2014 European Society of Cardiology.

  11. Sequential sampling model for multiattribute choice alternatives with random attention time and processing order.

    Science.gov (United States)

    Diederich, Adele; Oswald, Peter

    2014-01-01

    A sequential sampling model for multiattribute binary choice options, called the multiattribute attention switching (MAAS) model, assumes a separate sampling process for each attribute. During the deliberation process, attention switches from one attribute consideration to the next. The order in which attributes are considered, as well as how long each attribute is considered (the attention time), influences the predicted choice probabilities and choice response times. Several probability distributions for the attention time, with different variances, are investigated. Depending on the time and order schedule, the model predicts a rich choice probability/choice response time pattern, including preference reversals and fast errors. Furthermore, the difference between finite and infinite decision horizons for the attribute considered last is investigated. For the former case the model predicts a probability p0 > 0 of not deciding within the available time. The underlying stochastic process for each attribute is an Ornstein-Uhlenbeck process approximated by a discrete birth-death process. All predictions are also true for the widely applied Wiener process.
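    The Ornstein-Uhlenbeck accumulation described above can be sketched with a simple Euler-Maruyama simulation of a single attribute's process, including the possibility of no decision within the available time; all parameter values and names below are illustrative, not taken from the paper:

    ```python
    import random

    def ou_first_passage(drift, decay, sigma, theta, dt=0.001, max_t=10.0, rng=None):
        """One attribute's Ornstein-Uhlenbeck accumulator,
        dX = (drift - decay * X) dt + sigma dW,
        simulated by Euler-Maruyama until it crosses +theta ("A") or -theta ("B")."""
        rng = rng or random.Random(0)
        x, t = 0.0, 0.0
        step_sd = sigma * dt ** 0.5
        while t < max_t:
            x += (drift - decay * x) * dt + step_sd * rng.gauss(0.0, 1.0)
            t += dt
            if x >= theta:
                return "A", t
            if x <= -theta:
                return "B", t
        return None, t  # no decision within the horizon: this event has probability p0 > 0
    ```

    With a positive drift the process mostly terminates at the upper bound; shrinking `max_t` makes the no-decision outcome (the finite-horizon p0 > 0 case) appear in the simulated data.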

  12. Sequential sampling model for multiattribute choice alternatives with random attention time and processing order

    Directory of Open Access Journals (Sweden)

    Adele eDiederich

    2014-09-01

    Full Text Available A sequential sampling model for multiattribute binary choice options, called the multiattribute attention switching (MAAS) model, assumes a separate sampling process for each attribute. During the deliberation process, attention switches from one attribute consideration to the next. The order in which attributes are considered, as well as how long each attribute is considered (the attention time), influences the predicted choice probabilities and choice response times. Several probability distributions for the attention time, including deterministic, Poisson, binomial, geometric, and uniform with different variances, are investigated. Depending on the time and order schedule, the model predicts a rich choice probability/choice response time pattern, including preference reversals and fast errors. Furthermore, the difference between finite and infinite decision horizons for the attribute considered last is investigated. For the former case the model predicts a probability p0 > 0 of not deciding within the available time. The underlying stochastic process for each attribute is an Ornstein-Uhlenbeck process approximated by a discrete birth-death process. All predictions are also true for the widely applied Wiener process.

  13. Incorporating covariance estimation uncertainty in spatial sampling design for prediction with trans-Gaussian random fields

    Directory of Open Access Journals (Sweden)

    Gunter eSpöck

    2015-05-01

    Full Text Available Recently, Spock and Pilz [38] demonstrated that the spatial sampling design problem for the Bayesian linear kriging predictor can be transformed into an equivalent experimental design problem for a linear regression model with stochastic regression coefficients and uncorrelated errors. The stochastic regression coefficients derive from the polar spectral approximation of the residual process. Thus, standard optimal convex experimental design theory can be used to calculate optimal spatial sampling designs. The design functionals considered in Spock and Pilz [38] did not take into account the fact that kriging is actually a plug-in predictor which uses the estimated covariance function. The resulting optimal designs were close to space-filling configurations, because the design criterion did not consider the uncertainty of the covariance function. In this paper we also assume that the covariance function is estimated, e.g., by restricted maximum likelihood (REML). We then develop a design criterion that fully takes account of the covariance uncertainty. The resulting designs are less regular and space-filling compared to those ignoring covariance uncertainty. The new designs, however, also require some closely spaced samples in order to improve the estimate of the covariance function. We also relax the assumption of Gaussian observations and assume that the data are transformed to Gaussianity by means of the Box-Cox transformation. The resulting prediction method is known as trans-Gaussian kriging. We apply the Smith and Zhu [37] approach to this kriging method and show that the resulting optimal designs also depend on the available data. We illustrate our results with a data set of monthly rainfall measurements from Upper Austria.
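    The Box-Cox transformation underlying trans-Gaussian kriging maps positive observations toward Gaussianity via z = (y^λ − 1)/λ for λ ≠ 0 and z = log y for λ = 0; predictions are back-transformed with the inverse map. A minimal sketch of the pair (function names are illustrative):

    ```python
    import math

    def box_cox(y, lam):
        """Box-Cox transform of a positive observation y with parameter lam:
        (y**lam - 1)/lam for lam != 0, log(y) for lam == 0."""
        if lam == 0.0:
            return math.log(y)
        return (y ** lam - 1.0) / lam

    def box_cox_inverse(z, lam):
        """Back-transform a prediction on the Gaussian scale to the data scale."""
        if lam == 0.0:
            return math.exp(z)
        return (lam * z + 1.0) ** (1.0 / lam)
    ```

    In trans-Gaussian kriging the kriging machinery operates on the transformed values z; a bias correction is usually applied on back-transformation, which this sketch omits.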

  14. Determination of trace elements in bovine semen samples by inductively coupled plasma mass spectrometry and data mining techniques for identification of bovine class.

    Science.gov (United States)

    Aguiar, G F M; Batista, B L; Rodrigues, J L; Silva, L R S; Campiglia, A D; Barbosa, R M; Barbosa, F

    2012-12-01

    The reproductive performance of cattle may be influenced by several factors, but mineral imbalances are crucial in terms of direct effects on reproduction. Several studies have shown that elements such as calcium, copper, iron, magnesium, selenium, and zinc are essential for reproduction and can prevent oxidative stress. However, toxic elements such as lead, nickel, and arsenic can have adverse effects on reproduction. In this paper, we applied a simple and fast method of multi-element analysis to bovine semen samples from Zebu and European classes used in reproduction programs and artificial insemination. Samples were analyzed by inductively coupled plasma mass spectrometry (ICP-MS) using aqueous-medium calibration, with the samples diluted 1:50 in a solution containing 0.01% (vol/vol) Triton X-100 and 0.5% (vol/vol) nitric acid. Rhodium, iridium, and yttrium were used as internal standards for the ICP-MS analysis. To develop a reliable method of tracing the class of bovine semen, we used data mining techniques that make it possible to classify unknown samples after checking the differentiation of known-class samples. Based on the determination of 15 elements in 41 samples of bovine semen, 3 machine-learning tools for classification were applied to determine cattle class. Our results demonstrate the potential of the support vector machine (SVM), multilayer perceptron (MLP), and random forest (RF) chemometric tools to identify cattle class. Moreover, the selection tools made it possible to reduce the number of chemical elements needed from 15 to just 8. Copyright © 2012 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
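    Of the three classifiers compared, random forest is the easiest to sketch: draw bootstrap resamples of the training set, fit a weak tree to each, and take a majority vote at prediction time. The toy below uses one-level trees (decision stumps) and synthetic two-feature data; it is a conceptual illustration of the RF idea only, not the chemometric pipeline used in the paper:

    ```python
    import random
    from collections import Counter

    def train_stump(X, y):
        """Best single-feature threshold split by misclassification count."""
        best = None
        for f in range(len(X[0])):
            for t in sorted({row[f] for row in X}):
                for left, right in ((0, 1), (1, 0)):
                    pred = [left if row[f] <= t else right for row in X]
                    err = sum(p != yi for p, yi in zip(pred, y))
                    if best is None or err < best[0]:
                        best = (err, f, t, left, right)
        _, f, t, left, right = best
        return lambda row: left if row[f] <= t else right

    def random_forest(X, y, n_trees=25, rng=None):
        """Toy 'forest': bootstrap resamples + majority vote over stumps."""
        rng = rng or random.Random(42)
        stumps = []
        for _ in range(n_trees):
            idx = [rng.randrange(len(X)) for _ in range(len(X))]
            stumps.append(train_stump([X[i] for i in idx], [y[i] for i in idx]))
        def predict(row):
            return Counter(s(row) for s in stumps).most_common(1)[0][0]
        return predict
    ```

    A real RF grows full trees with random feature subsets at each split, and the per-feature importance scores it yields are what enable the element-selection step the abstract describes (reducing 15 elements to 8).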

  15. Detection improving techniques for Hough detector in presence of randomly arriving impulse interference

    Science.gov (United States)

    Doukovska, Lyubka A.; Angelova, Donka S.

    2010-09-01

    This paper investigates the effectiveness of a Hough detector combined with different types of Constant False Alarm Rate (CFAR) processors operating in the presence of randomly arriving impulse interference. We studied the detection probability and the average decision threshold of the Hough detector with each of these CFAR processors. The experimental results, obtained by numerical analysis in the MATLAB computational environment, reveal that the Hough detector drastically reduces detectability losses in comparison to conventional CFAR detectors and that it is effective at small signal-to-noise ratios. The analytical results obtained for the Hough detector can be used in both radar and communication receiver networks.
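    The CFAR stage feeding such a detector can be sketched with a generic textbook cell-averaging CFAR for exponentially distributed noise power; this is not one of the specific processors studied in the paper, and the window sizes and false-alarm probability below are illustrative:

```python
import random

def ca_cfar(cells, half_ref=8, half_guard=2, pfa=1e-3):
    """Cell-averaging CFAR: compare each cell against a threshold scaled
    from the summed power of its reference window (guard cells excluded),
    keeping the false-alarm rate constant for exponential noise."""
    n_ref = 2 * half_ref
    # Threshold factor giving the design P_fa: P_fa = (1 + alpha)^(-n_ref)
    alpha = pfa ** (-1.0 / n_ref) - 1.0
    margin = half_ref + half_guard
    detections = []
    for i in range(margin, len(cells) - margin):
        lead = cells[i - half_guard - half_ref : i - half_guard]
        lag = cells[i + half_guard + 1 : i + half_guard + 1 + half_ref]
        if cells[i] > alpha * (sum(lead) + sum(lag)):
            detections.append(i)
    return detections
```

    Because the threshold adapts to the locally estimated noise level, a strong target is detected while the false-alarm rate stays near the design value.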

  16. Evaluation of the Quilting Technique for Reduction of Postmastectomy Seroma: A Randomized Controlled Study.

    Science.gov (United States)

    Khater, Ashraf; Elnahas, Waleed; Roshdy, Sameh; Farouk, Omar; Senbel, Ahmed; Fathi, Adel; Hamed, EmadEldeen; Abdelkhalek, Mohamed; Ghazy, Hosam

    2015-01-01

    Background. Postmastectomy seroma causes patient discomfort, delays the start of adjuvant therapy, and may increase the possibility of surgical site infection. Objective. To evaluate quilting of the mastectomy flaps with obliteration of the axillary space for reducing postmastectomy seroma. Methods. A randomized controlled study was carried out among 120 females who were candidates for mastectomy and axillary clearance: an intervention group (N = 60) with quilting and a control group without quilting. All patients were followed up routinely for immediate and late complications. Results. There were no significant differences between the two groups with regard to demographic characteristics, postoperative pathological findings, or immediate postoperative complications. The incidence of seroma was significantly lower in the intervention group than in the control group (20% versus 78.3%). Conclusion. Quilting of the mastectomy flaps significantly reduced postoperative seroma as well as the duration and volume of wound drainage. We therefore recommend quilting of flaps as a routine step at the end of any mastectomy.

  17. Accelerated Evaluation of Automated Vehicles Safety in Lane-Change Scenarios Based on Importance Sampling Techniques.

    Science.gov (United States)

    Zhao, Ding; Lam, Henry; Peng, Huei; Bao, Shan; LeBlanc, David J; Nobukawa, Kazutoshi; Pan, Christopher S

    2017-03-01

    Automated vehicles (AVs) must be thoroughly evaluated before their release and deployment. A widely used evaluation approach is the Naturalistic Field Operational Test (N-FOT), which tests prototype vehicles directly on public roads. Due to the low exposure to safety-critical scenarios, N-FOTs are time consuming and expensive to conduct. In this paper, we propose an accelerated evaluation approach for AVs. The results can be used to generate motions of the other primary vehicles to accelerate the verification of AVs in simulations and controlled experiments. Frontal collision due to unsafe cut-ins is the target crash type of this paper. Human-controlled vehicles making unsafe lane changes are modeled as the primary disturbance to AVs, based on data collected by the University of Michigan Safety Pilot Model Deployment Program. The cut-in scenarios are generated from skewed statistics of the collected human driver behaviors, which produce risky testing scenarios while preserving the statistical information so that the safety benefits of AVs in nonaccelerated cases can be accurately estimated. The cross-entropy method is used to recursively search for the optimal skewing parameters. The frequencies of the occurrences of conflicts, crashes, and injuries are estimated for a modeled AV, and the achieved acceleration rate is around 2,000 to 20,000; in other words, 1,000 miles of accelerated simulation exposes the AV to as many challenging scenarios as roughly 2 to 20 million miles of real-world driving. This technique thus has the potential to greatly reduce the development and validation time for AVs.
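    The core idea of the accelerated approach, skewing the sampling distribution toward rare events and reweighting each sample by the likelihood ratio so estimates stay unbiased, can be sketched in one dimension. The Gaussian model, threshold, and fixed shift below are illustrative; the cross-entropy method in the paper searches for the optimal skewing automatically:

```python
import math
import random

def rare_event_prob(threshold, n, rng, shift=None):
    """Estimate P(X > threshold) for X ~ N(0, 1). With `shift` set, sample
    from the skewed density N(shift, 1) and reweight each hit by the
    likelihood ratio phi(x) / phi(x - shift), keeping the estimate unbiased."""
    if shift is None:  # crude Monte Carlo: almost no hits for rare events
        return sum(rng.gauss(0.0, 1.0) > threshold for _ in range(n)) / n
    total = 0.0
    for _ in range(n):
        x = rng.gauss(shift, 1.0)
        if x > threshold:
            total += math.exp(-shift * x + shift * shift / 2.0)
    return total / n
```

    With the threshold at 4 standard deviations, crude Monte Carlo would need on the order of tens of millions of samples for a stable estimate, while a few thousand skewed samples suffice; this is the same acceleration mechanism as the skewed cut-in statistics.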

  18. Generalized Sample Size Determination Formulas for Investigating Contextual Effects by a Three-Level Random Intercept Model.

    Science.gov (United States)

    Usami, Satoshi

    2017-03-01

    Behavioral and psychological researchers have shown strong interest in investigating contextual effects (i.e., the influences of combinations of individual- and group-level predictors on individual-level outcomes). The present research provides generalized formulas for determining the sample size needed to investigate contextual effects according to the desired level of statistical power as well as the width of the confidence interval. These formulas are derived within a three-level random intercept model that includes one predictor/contextual variable at each level, so as to simultaneously cover the various kinds of contextual effects in which researchers may be interested. The relative influences of the indices included in the formulas on the standard errors of contextual effect estimates are investigated with the aim of further simplifying sample size determination procedures. In addition, simulation studies are performed to investigate the finite-sample behavior of the calculated statistical power, showing that sample sizes estimated from the derived formulas can be both positively and negatively biased due to the complex effects of unreliability of the contextual variables, multicollinearity, and violation of the assumption of known variances. It is therefore advisable to compare estimated sample sizes under various specifications of the indices and to evaluate their potential bias, as illustrated in the example.
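    For contrast, the classical single-level power-based sample-size calculation that such multilevel formulas generalize can be sketched as follows (the effect size, SD, and error rates are illustrative; the paper's formulas add variance components and design effects for the contextual predictors):

```python
import math
from statistics import NormalDist

def n_per_group(delta, sigma, alpha=0.05, power=0.80):
    """Sample size per group for a two-sided, two-group z-test detecting a
    mean difference `delta` with common SD `sigma` at the given alpha/power:
    n = 2 * ((z_{1-alpha/2} + z_{power}) * sigma / delta)^2, rounded up."""
    z_a = NormalDist().inv_cdf(1.0 - alpha / 2.0)
    z_b = NormalDist().inv_cdf(power)
    return math.ceil(2.0 * ((z_a + z_b) * sigma / delta) ** 2)
```

    For a standardized effect of 0.5 this reproduces the textbook figure of 63 per group; halving the detectable difference quadruples the required n, which is why unreliable contextual variables (which shrink the effective effect) inflate the multilevel sample sizes.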

  19. Model determination in a case of heterogeneity of variance using sampling techniques.

    Science.gov (United States)

    Varona, L; Moreno, C; Garcia-Cortes, L A; Altarriba, J

    1997-01-12

    A sampling-based model determination procedure is described for a case of heterogeneity of variance. The procedure makes use of the predictive distribution of each observation given the rest of the data and the structure of the assumed model. The computation of these predictive distributions is carried out using a Gibbs sampling procedure. The final criterion for comparing models is the mean square error between the expectations of the predictive distributions and the real data. The procedure has been applied to a data set of weights at 210 days in the Spanish Pirenaica beef cattle breed. Three proposed models were compared: (a) Single Trait Animal Model; (b) Heterogeneous Variance Animal Model; and (c) Multiple Trait Animal Model. After applying the procedure, the best-fitting model was the Heterogeneous Variance Animal Model. This result is probably due to a compromise between the complexity of the model and the amount of available information. The estimated heritabilities under the preferred model were 0.489 ± 0.076 for males and 0.331 ± 0.082 for females.
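    The model comparison criterion, matching the expectation of each observation's predictive distribution against the observed value, can be sketched with closed-form leave-one-out means standing in for the paper's Gibbs-sampled predictive distributions (the weight data below are hypothetical):

```python
def loo_predictive_mse(values, groups, per_group=False):
    """Mean squared error between each observation and its leave-one-out
    predictive mean, computed either from the pooled data or only from the
    observation's own group (e.g. its sex)."""
    errors = []
    for i, (y, g) in enumerate(zip(values, groups)):
        rest = [v for j, (v, h) in enumerate(zip(values, groups))
                if j != i and (not per_group or h == g)]
        pred = sum(rest) / len(rest)
        errors.append((y - pred) ** 2)
    return sum(errors) / len(errors)
```

    When the sexes genuinely differ, the group-aware predictive means track the data better, so the more structured model wins on predictive MSE, mirroring how the heterogeneous-variance model was preferred.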

  20. Comparison of Detrusor Muscle Sampling Rate in Monopolar and Bipolar Transurethral Resection of Bladder Tumor: A Randomized Trial.

    Science.gov (United States)

    Teoh, Jeremy Yuen-Chun; Chan, Eddie Shu-Yin; Yip, Siu-Ying; Tam, Ho-Man; Chiu, Peter Ka-Fung; Yee, Chi-Hang; Wong, Hon-Ming; Chan, Chi-Kwok; Hou, Simon See-Ming; Ng, Chi-Fai

    2017-05-01

    Our aim was to investigate the detrusor muscle sampling rate after monopolar versus bipolar transurethral resection of bladder tumor (TURBT). This was a single-center, prospective, randomized, phase III trial on monopolar versus bipolar TURBT. Baseline patient characteristics, disease characteristics and perioperative outcomes were compared, with the primary outcome being the detrusor muscle sampling rate in the TURBT specimen. Multivariate logistic regression analyses on detrusor muscle sampling were performed. From May 2012 to December 2015, a total of 160 patients with similar baseline characteristics were randomized to receive monopolar or bipolar TURBT. Fewer patients in the bipolar TURBT group required postoperative irrigation than patients in the monopolar TURBT group (18.7 vs. 43%; p = 0.001). In the whole cohort, no significant difference in the detrusor muscle sampling rates was observed between the bipolar and monopolar TURBT groups (77.3 vs. 63.3%; p = 0.057). In patients with urothelial carcinoma, bipolar TURBT achieved a higher detrusor muscle sampling rate than monopolar TURBT (84.6 vs. 67.7%; p = 0.025). On multivariate analyses, bipolar TURBT (odds ratio [OR] 2.23, 95% confidence interval [CI] 1.03-4.81; p = 0.042) and larger tumor size (OR 1.04, 95% CI 1.01-1.08; p = 0.022) were significantly associated with detrusor muscle sampling in the whole cohort. In addition, bipolar TURBT (OR 2.88, 95% CI 1.10-7.53; p = 0.031), larger tumor size (OR 1.05, 95% CI 1.01-1.10; p = 0.035), and female sex (OR 3.25, 95% CI 1.10-9.59; p = 0.033) were significantly associated with detrusor muscle sampling in patients with urothelial carcinoma. There was a trend towards a superior detrusor muscle sampling rate after bipolar TURBT. Further studies are needed to determine its implications on disease recurrence and progression.
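    Odds ratios of the unadjusted kind can be reconstructed from a 2x2 table with the standard Wald formula; the counts below are hypothetical, and the paper's multivariate ORs come from logistic regression, which is not shown:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald ~95% CI from a 2x2 table:
    a, b = events / non-events with the factor present,
    c, d = events / non-events with the factor absent."""
    oratio = (a * d) / (b * c)
    se_log = math.sqrt(1.0 / a + 1.0 / b + 1.0 / c + 1.0 / d)
    lower = math.exp(math.log(oratio) - z * se_log)
    upper = math.exp(math.log(oratio) + z * se_log)
    return oratio, lower, upper
```

    An association is conventionally read as significant at the 5% level when the interval excludes 1, as in the bipolar-TURBT ORs quoted above.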

  1. Efficacy of the Alexander Technique in treating chronic non-specific neck pain: a randomized controlled trial.

    Science.gov (United States)

    Lauche, Romy; Schuth, Mareike; Schwickert, Myriam; Lüdtke, Rainer; Musial, Frauke; Michalsen, Andreas; Dobos, Gustav; Choi, Kyung-Eun

    2016-03-01

    To test the efficacy of the Alexander Technique, local heat and guided imagery on pain and quality of life in patients with chronic non-specific neck pain, a randomized controlled trial with 3 parallel groups was conducted at the outpatient clinic of the Department of Internal and Integrative Medicine. A total of 72 patients (65 females, 40.7±7.9 years) with chronic non-specific neck pain participated. Patients received 5 sessions of the Alexander Technique, an educational method which aims to modify dysfunctional posture, movement and thinking patterns associated with musculoskeletal disorders. Control groups were treated with local heat application or guided imagery. All interventions were conducted once a week for 45 minutes each. The primary outcome measure at week 5 was neck pain intensity on a 100-mm visual analogue scale; secondary outcomes included neck disability, quality of life, satisfaction and safety. Analyses of covariance were applied, testing ordered hypotheses. No group difference was found in pain intensity for the Alexander Technique compared to local heat (difference 4.5 mm; 95% CI: -8.1 to 17.1; p=0.48), but exploratory analysis revealed the superiority of the Alexander Technique over guided imagery (difference -12.9 mm; 95% CI: -22.6 to -3.1; p=0.01). Significant group differences in favor of the Alexander Technique were also found for physical quality of life (p<0.05). Adverse events mainly included slightly increased pain and muscle soreness. The Alexander Technique was not superior to local heat application in treating chronic non-specific neck pain and cannot be recommended as a routine intervention at this time. Further trials are warranted for conclusive judgment. © The Author(s) 2015.

  2. Seasonal variation in tracer movement in a forested experimental plot using manual and automated sampling techniques

    Science.gov (United States)

    Singer, J. H.; Seaman, J. C.; Aburime, S.

    2004-12-01

    In recent years, implications associated with groundwater contamination have increased the efforts of researchers studying solute transport through the unsaturated soil zone near the ground surface (the vadose zone). Success in tracking the movement of water and solutes, and in developing vadose zone hydrologic models, requires high-quality field data. However, near-continuous, spatially distributed soil moisture and matric potential data sets are rare because conventional soil parameter instrumentation is point-based and labor intensive. An automated vadose zone monitoring system (AVM) was developed to complement a set of manually monitored instrument arrays in an effort to address the quality and quantity of data collected in the vadose zone. Tracer (tritium) movement was evaluated for winter and summer irrigation applications on a forested field plot on the Atlantic Coastal Plain in South Carolina. Tritiated water was applied in two pulse events through an irrigation system, and breakthrough data were measured from soil cores, suction lysimeters and soil vapor wells in the field. Measured breakthrough data for both winter and summer tracer applications were compared to solute transport solutions and modeled using numerical modeling software. The data from the automated and manual sampling systems were used to evaluate the results of a one-dimensional hydrologic model that predicted water and tracer (tritium) movement associated with the winter and summer irrigation events.
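    A standard analytical benchmark for breakthrough data of this kind is the Ogata-Banks solution of the one-dimensional advection-dispersion equation, sketched here with hypothetical transport parameters (not values fitted to the field plot):

```python
import math

def ogata_banks(x, t, velocity, dispersion, c0=1.0):
    """Relative concentration C(x, t) for a continuous tracer input at x = 0,
    from the Ogata-Banks analytical solution of the 1-D advection-dispersion
    equation with pore-water velocity `velocity` and dispersion coefficient
    `dispersion`."""
    spread = 2.0 * math.sqrt(dispersion * t)
    term1 = math.erfc((x - velocity * t) / spread)
    term2 = math.exp(velocity * x / dispersion) * math.erfc((x + velocity * t) / spread)
    return 0.5 * c0 * (term1 + term2)
```

    Evaluating the solution at a fixed depth for increasing times traces the sigmoidal breakthrough curve that the lysimeter and soil-core measurements are compared against.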

  3. Ecology and sampling techniques of an understudied subterranean habitat: the Milieu Souterrain Superficiel (MSS).

    Science.gov (United States)

    Mammola, Stefano; Giachino, Pier Mauro; Piano, Elena; Jones, Alexandra; Barberis, Marcel; Badino, Giovanni; Isaia, Marco

    2016-12-01

    The term Milieu Souterrain Superficiel (MSS) has been used since the early 1980s in subterranean biology to categorize an array of different hypogean habitats. In general terms, a MSS habitat represents the underground network of empty air-filled voids and cracks developing within multiple layers of rock fragments. Its origins can be diverse, and it is generally covered by topsoil. The MSS habitat is often connected both with the deep hypogean domain (caves and deep rock cracks) and with the superficial soil horizon. A MSS is usually characterized by peculiar microclimatic conditions, and it can harbor specialized hypogean, endogean, and surface-dwelling species. In light of the many interpretations given by different authors, we reviewed 235 papers regarding the MSS in order to provide a state-of-the-art description of these habitats and facilitate their study. We have briefly described the different types of MSS mentioned in the scientific literature (alluvial, bedrock, colluvial, volcanic, and other types) and synthesized the advances in the study of the physical and ecological factors affecting this habitat, i.e., microclimate, energy flows, animal communities, and trophic interactions. We finally described and reviewed the available sampling methods used to investigate MSS fauna.

  4. Application of petroleum geophysical well logging and sampling techniques for evaluating aquifer characteristics

    Energy Technology Data Exchange (ETDEWEB)

    Temples, T.J.; Waddell, M.G. [Univ. of South Carolina, Columbia, SC (United States). Earth Science and Resource Inst.]

    1996-05-01

    The Hilton Head Island Test Well #1 was drilled to a depth of 3,833 feet to evaluate the upper Cretaceous section as a potential ground-water source for Hilton Head Island, South Carolina. The initial plan was to analyze continuous conventional cores. The interval to be analyzed extended from the top of the Eocene to the base of the Cretaceous (approximately 3,500 feet). However, due to the high cost ($400,000), the decision was made to evaluate aquifer potential using advanced geophysical logs with sidewall cores for calibration. The logging suite consisted of a dual induction resistivity, spontaneous potential, compensated neutron, density log, gamma ray, spectral gamma, multipole array acoustic log, caliper, high resolution dipmeter, and a circumferential borehole imaging log. In addition to the wireline logs, 239 sidewall cores and 12 Formation Multi-Test (FMT) samples were obtained. The log, sidewall core, and FMT information were integrated into an interpretive package using computer generated logs and simple spreadsheets to calculate aquifer properties. Porosity, hydraulic conductivity, transmissivity, and lithologic data derived from this integrated analysis were then used to select screen zones. Water quality in relation to drinking water standards exceeded expectations. The information obtained from the integrated program allowed estimates to be made about the well's productivity without the expense of conventional coring, flow testing, and completion of the well.
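    The spreadsheet-style aquifer-property calculations mentioned above include, for example, the standard density-porosity transform; the matrix and fluid densities below are the usual quartz and fresh-water assumptions, not values calibrated to this well:

```python
def density_porosity(rho_bulk, rho_matrix=2.65, rho_fluid=1.0):
    """Porosity from a bulk-density log reading (g/cc), via the standard
    density-porosity transform:
    phi = (rho_matrix - rho_bulk) / (rho_matrix - rho_fluid)."""
    return (rho_matrix - rho_bulk) / (rho_matrix - rho_fluid)
```

    A bulk-density reading of 2.20 g/cc then implies roughly 27% porosity under these assumptions; the sidewall cores serve to calibrate such log-derived values.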

  5. Extraction method enhancement techniques for the analysis of PCDDS and PCDFS in meat sample

    Energy Technology Data Exchange (ETDEWEB)

    Jeong, Tae-Uk [Gyeongsang National Univ., Jinju (Korea). Division of Applied Life Science; National Veterinary Research and Quarantine Service (Korea). Busan Regional Office]; Kwon, Jin-Wook [National Veterinary Research and Quarantine Service (Korea). Anyang City]

    2004-09-15

    Dioxins are among the persistent organic pollutants (POPs) of greatest concern to human beings and, being fat soluble, tend to accumulate in higher animals including humans. There are 75 PCDD congeners and 135 PCDF congeners, but generally only 7 PCDD and 10 PCDF congeners are analyzed, because of their toxicity and stability. Since dioxin contamination is regarded as a global issue, large numbers of samples must be analyzed, and various methods for measuring dioxins have been developed and improved. Dioxin analysis requires a very complicated analytical procedure including extraction, cleanup and instrumental analysis. Because the procedure is complicated and involves many steps, conventional methods are time consuming, use large volumes of solvents, and are very expensive. This study therefore focused on faster, lower-cost methods that do not sacrifice recovery or stability. To establish a simpler and more stable extraction method, the recoveries and stabilities of the 7 PCDD and 10 PCDF congeners obtained using Soxhlet extraction, a forced-convection drying oven, and a microwave oven were compared.

  7. Peyton’s four-step approach for teaching complex spinal manipulation techniques – a prospective randomized trial

    Directory of Open Access Journals (Sweden)

    Gertraud Gradl-Dietsch

    2016-11-01

    Abstract Background The objectives of this prospective randomized trial were to assess the impact of Peyton’s four-step approach on the acquisition of complex psychomotor skills and to examine the influence of gender on learning outcomes. Methods We randomly assigned 95 third- to fifth-year medical students to an intervention group, which received instruction according to Peyton (PG), or a control group, which received conventional teaching (CG). Both groups attended four sessions on the principles of manual therapy and specific manipulative and diagnostic techniques for the spine. We assessed differences in theoretical knowledge (multiple-choice (MC) exam) and practical skills (Objective Structured Practical Examination (OSPE)) with respect to type of intervention and gender. Participants took a second OSPE 6 months after completion of the course. Results There were no differences between groups with respect to the MC exam. Students in the PG group scored significantly higher in the OSPE. Gender had no additional impact. Results of the second OSPE showed a significant decline in competency regardless of gender and type of intervention. Conclusions Peyton’s approach is superior to standard instruction for teaching complex spinal manipulation skills regardless of gender. Skills retention was equally low for both techniques.

  8. Training secondary school teachers in instructional language modification techniques to support adolescents with language impairment: a randomized controlled trial.

    Science.gov (United States)

    Starling, Julia; Munro, Natalie; Togher, Leanne; Arciuli, Joanne

    2012-10-01

    This study evaluated the efficacy of a collaborative intervention in which a speech-language pathologist (SLP) trained mainstream secondary school teachers to make modifications to their oral and written instructional language. The trained teachers' uptake of techniques in their whole-class teaching practices, and the impact this had on the language abilities of students with language impairment (LI), were evaluated. Two secondary schools were randomly assigned to either a trained or a control condition. A cohort of 13 teachers (7 trained and 6 control) and 43 Year 8 students with LI (21 trained and 22 control) were tested at pre-, post-, and follow-up time points: teachers by structured interview, and students by standardized spoken and written language assessments. Significantly increased use of the language modification techniques by the trained teachers was observed when compared to the control group of untrained teachers, and this increased use was maintained over time. The trained group of students showed a significant improvement in written expression and listening comprehension relative to the control group of students. This randomized controlled trial is one of the first investigations to evaluate a collaborative intervention that links changes in mainstream secondary teachers' instructional language practices with improvements in the language abilities of adolescents with LI.

  9. Comparison of atomic absorption, mass and X-ray spectrometry techniques using dissolution-based and solid sampling methods for the determination of silver in polymeric samples

    Energy Technology Data Exchange (ETDEWEB)

    Schrijver, Isabel de [Ghent University, Department of Analytical Chemistry, Krijgslaan 281-S12, B-9000 Ghent (Belgium); University College West-Flanders, Department of Industrial Engineering and Technology, Research group EnBiChem, Graaf Karel de Goedelaan 5, B-8500 Kortrijk (Belgium); Aramendia, Maite; Vincze, Laszlo [Ghent University, Department of Analytical Chemistry, Krijgslaan 281-S12, B-9000 Ghent (Belgium); Resano, Martin [University of Zaragoza, Department of Analytical Chemistry, Pedro Cerbuna 12, E-50009 Zaragoza (Spain); Dumoulin, Ann [University College West-Flanders, Department of Industrial Engineering and Technology, Research group EnBiChem, Graaf Karel de Goedelaan 5, B-8500 Kortrijk (Belgium); Vanhaecke, Frank [Ghent University, Department of Analytical Chemistry, Krijgslaan 281-S12, B-9000 Ghent (Belgium)], E-mail: Frank.Vanhaecke@UGent.be

    2007-11-15

    In this work, the capabilities and limitations of solid sampling techniques - laser ablation inductively coupled plasma mass spectrometry (LA-ICPMS), wavelength dispersive X-ray fluorescence spectrometry (WD-XRFS) and solid sampling electrothermal atomic absorption spectrometry (SS-ETAAS) - for the determination of silver in polymers have been evaluated and compared to those of acid digestion and subsequent Ag determination using pneumatic nebulization ICPMS (PN-ICPMS) or flame AAS (FAAS). In a first stage, two dissolution procedures were examined: conventional acid digestion in a Kjeldahl flask and the combination of dry ashing and microwave-assisted digestion. Accurate results for Ag could be obtained, although problems of analyte losses and/or incomplete dissolution were occasionally observed. LA-ICPMS shows potential for direct analysis of solid materials, but calibration was found to be difficult. A polypropylene sample was used as standard. This approach provided satisfactory results for other polypropylene samples and even for other types of plastics, provided that the 13C+ signal was used as internal reference, correcting for variations in ablation efficiency. However, the results for polyoxymethylene were overestimated. Similar calibration problems appeared with WD-XRFS, due to differences in the absorption efficiency of X-rays. In this case, the accuracy could be improved by using a matrix correction procedure, which however required the matrix composition to be known in sufficient detail. SS-ETAAS proved to be a fast approach that allowed accurate determination of Ag in polymers using aqueous standard solutions for calibration. Due to the high Ag content and the excellent sensitivity, the use of a 3-field mode Zeeman-effect background correction system was essential for extending the working range.
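    The internal-reference correction used for the LA-ICPMS calibration can be sketched as simple signal ratioing; the count rates and standard concentration below are hypothetical, and real quantification would also handle blanks and drift:

```python
def quantify_with_internal_standard(analyte_sample, ref_sample,
                                    analyte_standard, ref_standard,
                                    conc_standard):
    """Ratio the analyte signal to a simultaneously measured internal-reference
    signal (the 13C+ signal in the case above) so that shot-to-shot variations
    in ablation yield cancel, then calibrate against a standard of known
    concentration."""
    sample_ratio = analyte_sample / ref_sample
    standard_ratio = analyte_standard / ref_standard
    return sample_ratio / standard_ratio * conc_standard
```

    Because a change in ablation efficiency scales the analyte and reference signals together, the computed concentration is unchanged when both signals drop by the same factor, which is exactly the correction described in the abstract.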

  10. Randomized controlled trial comparing the effectiveness of the ultrasound-guided galvanic electrolysis technique (USGET) versus conventional electro-physiotherapeutic treatment on patellar tendinopathy

    National Research Council Canada - National Science Library

    Abat, F; Sánchez-Sánchez, J L; Martín-Nogueras, A M; Calvo-Arenillas, J I; Yajeya, J; Méndez-Sánchez, R; Monllau, J C; Gelber, P E

    2016-01-01

    .... The purpose of this study is to compare, in a randomized controlled trial, the clinical efficacy of eccentric exercise combined with either an ultrasound-guided galvanic electrolysis technique (USGET...

  11. Determining optimal sample sizes for multistage adaptive randomized clinical trials from an industry perspective using value of information methods.

    Science.gov (United States)

    Chen, Maggie H; Willan, Andrew R

    2013-02-01

    Most often, sample size determinations for randomized clinical trials are based on frequentist approaches that depend on somewhat arbitrarily chosen factors, such as type I and II error probabilities and the smallest clinically important difference. As an alternative, many authors have proposed decision-theoretic (full Bayesian) approaches, often referred to as value of information methods that attempt to determine the sample size that maximizes the difference between the trial's expected utility and its expected cost, referred to as the expected net gain. Taking an industry perspective, Willan proposes a solution in which the trial's utility is the increase in expected profit. Furthermore, Willan and Kowgier, taking a societal perspective, show that multistage designs can increase expected net gain. The purpose of this article is to determine the optimal sample size using value of information methods for industry-based, multistage adaptive randomized clinical trials, and to demonstrate the increase in expected net gain realized. At the end of each stage, the trial's sponsor must decide between three actions: continue to the next stage, stop the trial and seek regulatory approval, or stop the trial and abandon the drug. A model for expected total profit is proposed that includes consideration of per-patient profit, disease incidence, time horizon, trial duration, market share, and the relationship between trial results and probability of regulatory approval. The proposed method is extended to include multistage designs with a solution provided for a two-stage design. An example is given. Significant increases in the expected net gain are realized by using multistage designs. The complexity of the solutions increases with the number of stages, although far simpler near-optimal solutions exist. The method relies on the central limit theorem, assuming that the sample size is sufficiently large so that the relevant statistics are normally distributed. 
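    The expected-net-gain maximization can be sketched with a toy profit model; every number and the approval-probability curve below are hypothetical and are not taken from the paper:

```python
import math

def expected_net_gain(n, profit, p_approval, fixed_cost, per_patient_cost):
    """Toy expected net gain for a trial of size n: expected profit scaled by
    the probability of regulatory approval, minus trial costs."""
    return profit * p_approval(n) - fixed_cost - per_patient_cost * n

def optimal_sample_size(p_approval, profit=1e8, fixed_cost=1e6,
                        per_patient_cost=1e4, grid=range(10, 2001, 10)):
    """Grid-search the sample size that maximizes expected net gain."""
    return max(grid, key=lambda n: expected_net_gain(
        n, profit, p_approval, fixed_cost, per_patient_cost))
```

    With a saturating approval-probability curve, enlarging the trial eventually buys too little extra approval probability to cover the per-patient cost, which is what pins down a finite optimal n; the multistage designs in the paper re-run this trade-off at each interim decision.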

  12. Short-term effectiveness of spinal manipulative therapy versus functional technique in patients with chronic nonspecific low back pain: a pragmatic randomized controlled trial.

    Science.gov (United States)

    Castro-Sánchez, Adelaida María; Lara-Palomo, Inmaculada C; Matarán-Peñarrocha, Guillermo A; Fernández-de-Las-Peñas, César; Saavedra-Hernández, Manuel; Cleland, Joshua; Aguilar-Ferrándiz, María Encarnación

    2016-03-01

    Chronic low back pain (LBP) is a prevalent condition associated with pain, disability, decreased quality of life, and fear of movement. To date, no studies have compared the effectiveness of spinal manipulation and functional technique for the management of this population. This study aimed to compare the effectiveness of spinal manipulation and functional technique on pain, disability, kinesiophobia, and quality of life in patients with chronic LBP. A single-blind pragmatic randomized controlled trial conducted in a university research clinic was carried out. Sixty-two patients (62% female, age: 45±7) with chronic LBP comprised the patient sample. Data on disability (Roland-Morris Disability Questionnaire [RMQ], Oswestry Low Back Pain Disability Index [ODI]), pain intensity (Numerical Pain Rate Scale [NPRS]), fear of movement (Tampa Scale of Kinesiophobia [TSK]), quality of life (Short Form-36 [SF-36] quality of life questionnaire), isometric resistance of abdominal muscles (McQuade test), and spinal mobility in flexion (finger-to-floor distance) were collected at baseline, immediately after the intervention phase, and at 1 month postintervention by an assessor blinded to group allocation of the patients. Patients were randomly assigned to the spinal manipulative therapy group or the functional technique group and received three once-weekly sessions. In comparison to patients receiving functional technique, those receiving spinal manipulation experienced statistically, although not clinically, significant greater reductions in terms of RMQ (standardized mean difference in score changes between groups at post-treatment: 0.1; at 1 month: 0.1) and ODI (post-treatment: 2.9; at 1 month: 1.4). Linear longitudinal analysis showed a significant improvement in both groups over time for RMQ (manipulative: F=68.51), but treatment-by-time interactions were not detected for pain intensity (p=.488), TSK (p=.552), any domain of the SF-36 quality of life questionnaire (p≤.164), or the McQuade test.

  13. Uncertainty Of Stream Nutrient Transport Estimates Using Random Sampling Of Storm Events From High Resolution Water Quality And Discharge Data

    Science.gov (United States)

    Scholefield, P. A.; Arnscheidt, J.; Jordan, P.; Beven, K.; Heathwaite, L.

    2007-12-01

    The uncertainties associated with stream nutrient transport estimates are frequently overlooked and the sampling strategy is rarely if ever investigated. Indeed, the impact of sampling strategy and estimation method on the bias and precision of stream phosphorus (P) transport calculations is little understood, despite the use of such values in the calibration and testing of models of phosphorus transport. The objectives of this research were to investigate the variability and uncertainty in the estimates of total phosphorus transfers at an intensively monitored agricultural catchment. The Oona Water, which is located in the Irish border region, is part of a long-term monitoring program focusing on water quality. The Oona Water is a rural river catchment with grassland agriculture and scattered dwelling houses and has been monitored for total phosphorus (TP) at 10 min resolution for several years (Jordan et al, 2007). Concurrent sensitive measurements of discharge are also collected. The water quality and discharge data were provided at 1 hour resolution (averaged), which meant that a robust estimate of the annual flow-weighted concentration could be obtained by simple interpolation between points. A two-strata approach (Kronvang and Bruhn, 1996) was used to estimate flow-weighted concentrations using randomly sampled storm events from the 400 identified within the time series, together with base flow concentrations. Using a random stratified sampling approach for the selection of events, a series of sample sizes ranging from 10 up to the full 400 was used, each time generating a flow-weighted mean using a load-discharge relationship identified through log-log regression and Monte Carlo simulation. These values were then compared to the observed total phosphorus concentration for the catchment. Analysis of these results shows the impact of sampling strategy, the inherent bias in any estimate of phosphorus concentrations and the uncertainty associated with such estimates. The
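The two-strata, random-storm-sampling estimator described above can be sketched numerically. The following Python sketch uses entirely synthetic data (baseflow values, storm magnitudes, and the 24-hour storm duration are illustrative assumptions, not the study's measurements): it samples a subset of the 400 storm events, re-weights the sampled storm hours by the inverse sampling fraction, and compares the resulting flow-weighted mean TP concentration against the value from the full high-resolution record.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic hourly record standing in for the Oona Water data: two years of
# baseflow plus 400 non-overlapping 24 h storm events (illustrative only).
N_HOURS = 24 * 730
discharge = np.full(N_HOURS, 0.5)        # m^3/s, baseflow
tp = np.full(N_HOURS, 0.02)              # mg/L, baseflow total phosphorus
day_starts = np.arange(730) * 24
storm_starts = rng.choice(day_starts, size=400, replace=False)
for s in storm_starts:
    discharge[s:s + 24] += rng.uniform(1.0, 5.0)
    tp[s:s + 24] += rng.uniform(0.05, 0.30)

storm_mask = np.zeros(N_HOURS, dtype=bool)
for s in storm_starts:
    storm_mask[s:s + 24] = True

def flow_weighted_mean(q, c, w=None):
    """Flow-weighted mean concentration: sum(w*Q*C) / sum(w*Q)."""
    w = np.ones_like(q) if w is None else w
    return np.sum(w * q * c) / np.sum(w * q)

# "True" value from the full high-resolution record.
fwm_true = flow_weighted_mean(discharge, tp)

def two_strata_estimate(n_events):
    """Baseflow stratum in full, plus n_events randomly sampled storms,
    re-weighted by the inverse sampling fraction (a ratio estimator)."""
    sampled = rng.choice(storm_starts, size=n_events, replace=False)
    idx = np.concatenate([np.arange(s, s + 24) for s in sampled])
    q = np.concatenate([discharge[~storm_mask], discharge[idx]])
    c = np.concatenate([tp[~storm_mask], tp[idx]])
    w = np.concatenate([np.ones((~storm_mask).sum()),
                        np.full(idx.size, 400 / n_events)])
    return flow_weighted_mean(q, c, w)

# Precision improves as more of the 400 storms are sampled.
for n in (10, 50, 400):
    ests = [two_strata_estimate(n) for _ in range(50)]
    print(f"{n:3d} storms sampled: mean {np.mean(ests):.4f} "
          f"(sd {np.std(ests):.4f}); true {fwm_true:.4f}")
```

Sampling all 400 events recovers the full-record value exactly, while smaller samples show the spread (and any ratio-estimator bias) that the abstract refers to.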

  14. Global Stratigraphy of Venus: Analysis of a Random Sample of Thirty-Six Test Areas

    Science.gov (United States)

    Basilevsky, Alexander T.; Head, James W., III

    1995-01-01

    The age relations between 36 impact craters with dark paraboloids and other geologic units and structures at these localities have been studied through photogeologic analysis of Magellan SAR images of the surface of Venus. Geologic settings in all 36 sites, about 1000 x 1000 km each, could be characterized using only 10 different terrain units and six types of structures. These units and structures form a major stratigraphic and geologic sequence (from oldest to youngest): (1) tessera terrain; (2) densely fractured terrains associated with coronae and in the form of remnants among plains; (3) fractured and ridged plains and ridge belts; (4) plains with wrinkle ridges; (5) ridges associated with coronae annulae and ridges of arachnoid annulae which are contemporary with wrinkle ridges of the ridged plains; (6) smooth and lobate plains; (7) fractures of coronae annulae, and fractures not related to coronae annulae, which disrupt ridged and smooth plains; (8) rift-associated fractures; and (9) craters with associated dark paraboloids, which represent the youngest 10% of the Venus impact crater population (Campbell et al.), and are on top of all volcanic and tectonic units except the youngest episodes of rift-associated fracturing and volcanism; surficial streaks and patches are approximately contemporary with dark-paraboloid craters. Mapping of such units and structures in 36 randomly distributed large regions (each approximately 10^6 sq km) shows evidence for a distinctive regional and global stratigraphic and geologic sequence. On the basis of this sequence we have developed a model that illustrates several major themes in the history of Venus. Most of the history of Venus (that of its first 80% or so) is not preserved in the surface geomorphological record. The major deformation associated with tessera formation in the period sometime between 0.5-1.0 b.y. ago (Ivanov and Basilevsky) is the earliest event detected. In the terminal stages of tessera fon

  15. Can inhibitory and facilitatory kinesiotaping techniques affect motor neuron excitability? A randomized cross-over trial.

    Science.gov (United States)

    Yoosefinejad, Amin Kordi; Motealleh, Alireza; Abbasalipur, Shekoofeh; Shahroei, Mahan; Sobhani, Sobhan

    2017-04-01

    The aim of this study was to investigate the immediate effects of facilitatory and inhibitory kinesiotaping on motor neuron excitability. Randomized cross-over trial. Twenty healthy people received inhibitory and facilitatory kinesiotaping on two testing days. The H- and M-waves of the lateral gastrocnemius were recorded before and immediately after applying the two modes of taping. The Hmax/Mmax ratio (a measure of motor neuron excitability) was determined and analyzed. The mean Hmax/Mmax ratios were -0.013 (95% CI: -0.033 to 0.007) for inhibitory taping and 0.007 (95% CI: -0.013 to 0.027) for facilitatory taping. The mean difference between groups was -0.020 (95% CI: -0.048 to 0.008). The statistical model revealed no significant differences between the two interventions (P = 0.160). Furthermore, there were no within-group differences in Hmax/Mmax ratio for either group. Our findings did not disclose signs of immediate change in motor neuron excitability in the lateral gastrocnemius. Copyright © 2016. Published by Elsevier Ltd.

  16. Evaluation of the Quilting Technique for Reduction of Postmastectomy Seroma: A Randomized Controlled Study

    Directory of Open Access Journals (Sweden)

    Ashraf Khater

    2015-01-01

    Full Text Available Background. Postmastectomy seroma causes patient discomfort, delays the start of adjuvant therapy, and may increase the possibility of surgical site infection. Objective. To evaluate quilting of the mastectomy flaps with obliteration of the axillary space for reducing postmastectomy seroma. Methods. A randomized controlled study was carried out among 120 females who were candidates for mastectomy and axillary clearance: an intervention group (N=60) with quilting and a control group without quilting. All patients were followed up routinely for immediate and late complications. Results. There were no significant differences between the two groups as regards demographic characteristics, postoperative pathological findings, and immediate postoperative complications. The incidence of seroma was significantly lower in the intervention group compared with the control group (20% versus 78.3%, P<0.001). Additionally, the intervention group had a shorter duration until seroma resolution (9 days versus 11 days, P<0.001) and a smaller volume of drainage (710 mL versus 1160 mL, P<0.001) compared with the control group. Conclusion. Mastectomy with quilting of the flaps and obliteration of the axillary space is an efficient method to significantly reduce postoperative seroma, in addition to significantly reducing the duration and volume of wound drainage. We therefore recommend quilting of flaps as a routine step at the end of any mastectomy.

  17. Improved Reference Sampling and Subtraction: A Technique for Reducing the Read Noise of Near-infrared Detector Systems

    Science.gov (United States)

    Rauscher, Bernard J.; Arendt, Richard G.; Fixsen, D. J.; Greenhouse, Matthew A.; Lander, Matthew; Lindler, Don; Loose, Markus; Moseley, S. H.; Mott, D. Brent; Wen, Yiting; Wilson, Donna V.; Xenophontos, Christos

    2017-10-01

    Near-infrared array detectors, like the Teledyne H2RGs used by the James Webb Space Telescope (JWST) NIRSpec instrument, often provide reference pixels and a reference output. These are used to remove correlated noise. Improved reference sampling and subtraction (IRS2) is a statistical technique for using this reference information optimally in a least-squares sense. Compared with the traditional H2RG readout, IRS2 uses a different clocking pattern to interleave many more reference pixels into the data than is otherwise possible. Compared with standard reference correction techniques, IRS2 subtracts the reference pixels and reference output using a statistically optimized set of frequency-dependent weights. The benefits include somewhat lower noise variance and much less obvious correlated noise. NIRSpec’s IRS2 images are cosmetically clean, with less 1/f banding than in traditional data from the same system. This article describes the IRS2 clocking pattern and presents the equations needed to use IRS2 in systems other than NIRSpec. For NIRSpec, applying these equations is already an option in the calibration pipeline. As an aid to instrument builders, we provide our prototype IRS2 calibration software and sample JWST NIRSpec data. The same techniques are applicable to other detector systems, including those based on Teledyne’s H4RG arrays. The H4RG’s interleaved reference pixel readout mode is effectively one IRS2 pattern.

  18. Use of pornography in a random sample of Norwegian heterosexual couples.

    Science.gov (United States)

    Daneback, Kristian; Traeen, Bente; Månsson, Sven-Axel

    2009-10-01

    This study examined the use of pornography in couple relationships to enhance the sex-life. The study contained a representative sample of 398 heterosexual couples aged 22-67 years. Data collection was carried out by self-administered postal questionnaires. The majority (77%) of the couples did not report any kind of pornography use to enhance the sex-life. In 15% of the couples, both had used pornography; in 3% of the couples, only the female partner had used pornography; and, in 5% of the couples, only the male partner had used pornography for this purpose. Based on the results of a discriminant function analysis, it is suggested that couples where one or both used pornography had a more permissive erotic climate compared to the couples who did not use pornography. In couples where only one partner used pornography, we found more problems related to arousal (male) and negative (female) self-perception. These findings could be of importance for clinicians who work with couples.

  19. Root coverage with connective tissue graft associated with coronally advanced flap or tunnel technique: a randomized, double-blind, mono-centre clinical trial

    NARCIS (Netherlands)

    Azaripour, Adriano; Kissinger, Maren; Farina, Vittorio Siro Leone; van Noorden, Cornelis J. F.; Gerhold-Ay, Aslihan; Willershausen, Brita; Cortellini, Pierpaolo

    2016-01-01

    Aim: The aim of this randomized clinical trial was to compare the coronally advanced flap (CAF) with the modified microsurgical tunnel technique (MMTT) for treatment of Miller class I and II recessions. Material and Methods: Forty patients with 71 gingival recessions were recruited and randomly

  20. Stochastic sensitivity technique in a persistence analysis of randomly forced population systems with multiple trophic levels.

    Science.gov (United States)

    Bashkirtseva, Irina; Ryashko, Lev; Ryazanova, Tatyana

    2017-11-01

    Motivated by important ecological applications, we study how noise can reduce the number of trophic levels in hierarchically related multidimensional population systems. A nonlinear model with three trophic levels under the influence of external stochastic forcing is considered as a basic conceptual example. We analyze a probabilistic mechanism of noise-induced extinction of separate populations in this "prey-predator-top predator" system. We propose a new general mathematical approach for estimating the proximity of equilibrium regimes of this stochastic model to hazardous borders where abrupt changes in the dynamics of ecological systems can occur. Our method is based on the stochastic sensitivity function technique and the visualization method of confidence domains. The constructive abilities of this mathematical approach are demonstrated in the analysis of different scenarios of noise-induced reduction of the number of trophic levels. Copyright © 2017 Elsevier Inc. All rights reserved.

  1. Conservative management equally effective to new suture anchor technique for acute mallet finger deformity: A prospective randomized clinical trial.

    Science.gov (United States)

    Batıbay, Sefa Giray; Akgül, Turgut; Bayram, Serkan; Ayık, Ömer; Durmaz, Hayati

    2017-09-28

    Prospective randomized controlled trial. This study was designed to compare our new suture anchor technique with conservative management in acute Wehbe-Schneider type I A-B and II A-B mallet fingers. Twenty-nine patients who presented to our clinic between 2013 and 2015 were randomized to surgical or conservative treatment. Wehbe-Schneider subtype C fractures were excluded. Fourteen were treated with surgery, and 15 received conservative treatment. Primary outcomes were visual analog scale score, active distal interphalangeal (DIP) joint flexion, return to work, extension deficit, and DIP joint degeneration. Follow-up time was 12 months. The mean visual analog scale score was 2.0 and the mean time to return to work was 63.2 days in the surgical group, versus 1.47 and 53.7 days, respectively, in the conservative group. Extension deficit was 8.1° in the surgical group and 6.1° in the conservative group. The mean DIP flexion at final follow-up was 54.5° (40-65) in the surgery group and 58.3° (45-70) in the conservative group. DIP joint degeneration was observed on X-rays in 4 patients in the surgical group, whereas none of the patients in the conservative group had DIP degeneration at 1 year after treatment. The therapeutic effectiveness of the suture anchor technique was not statistically different from that of conservative treatment. Subluxation seen after fixation with suture anchors may be due to inadequate anchor fixation. DIP joint degeneration was seen significantly more often in the surgical group. Our study suggests that the new suture anchor technique is not superior to conservative treatment. Level of evidence: Ib. Copyright © 2017 Hanley & Belfus. Published by Elsevier Inc. All rights reserved.

  2. A validated HPLC-UV method and optimization of sample preparation technique for norepinephrine and serotonin in mouse brain.

    Science.gov (United States)

    Thomas, Jaya; Khanam, Razia; Vohora, Divya

    2015-01-01

    Norepinephrine and serotonin are two important neurotransmitters whose variations in brain are reported to be associated with many common neuropsychiatric disorders. Yet, relevant literature on the estimation of monoamines in biological samples using HPLC-UV is limited. The present study involves the development of a simultaneous HPLC-UV method for estimation of norepinephrine and serotonin along with optimization of the sample preparation technique. Chromatographic separation was achieved by injecting 20 µL of the sample after extraction into a quaternary-pump HPLC system equipped with a C18 column, using 0.05% formic acid and acetonitrile (90:10, v/v) as the mobile phase at a 1 mL min(-1) flow rate. The developed method was validated as per the ICH guidelines in terms of linearity, accuracy, repeatability, precision, and robustness. The method showed a wide range of linearity (50-4000 and 31.25-4000 ng mL(-1) for norepinephrine and serotonin, respectively). The recovery was found to be in the range of 86.04-89.01% and 86.43-89.61% for norepinephrine and serotonin, respectively. The results showed low values of %RSD for repeatability, intra- and inter-day precision, and robustness studies. Four different methods were used for the extraction of these neurotransmitters and the best one with maximum recovery was ascertained. Here, we developed and validated a simple, accurate, and reliable method for the estimation of norepinephrine and serotonin in mouse brain samples using HPLC-UV. The method was successfully applied to quantify these neurotransmitters in mouse brain extracted by the optimized sample preparation technique.
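The linearity and recovery figures reported above come from a standard calibration-curve workflow: fit a least-squares line to concentration vs. detector response, back-calculate found concentrations, and express recovery as found/spiked. The Python sketch below illustrates that arithmetic with made-up calibration points spanning the serotonin range quoted in the abstract; the slope, intercept, and responses are illustrative assumptions, not the study's data.

```python
import numpy as np

# Hypothetical calibration points (ng/mL vs. detector response) spanning the
# serotonin linear range reported in the abstract; responses are made up.
conc = np.array([31.25, 125.0, 500.0, 1000.0, 2000.0, 4000.0])
response = 0.012 * conc + 0.40 + np.array([0.01, -0.02, 0.05, -0.04, 0.03, -0.01])

# Least-squares calibration line and correlation coefficient (linearity check).
slope, intercept = np.polyfit(conc, response, 1)
r = np.corrcoef(conc, response)[0, 1]

def back_calculate(measured_response):
    """Concentration read off the calibration line."""
    return (measured_response - intercept) / slope

def recovery_percent(measured_response, spiked_conc):
    """Recovery = found concentration / spiked concentration x 100."""
    return 100.0 * back_calculate(measured_response) / spiked_conc

# A spiked sample at 500 ng/mL recovered at roughly 88% of nominal,
# in the ballpark of the 86-90% range the abstract reports (illustrative).
rec = recovery_percent(0.012 * 500.0 * 0.88 + 0.40, 500.0)
print(f"r = {r:.4f}, recovery at 500 ng/mL: {rec:.1f}%")
```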

  3. A Novel Technique for Raman Analysis of Highly Radioactive Samples Using Any Standard Micro-Raman Spectrometer.

    Science.gov (United States)

    Colle, Jean-Yves; Naji, Mohamed; Sierig, Mark; Manara, Dario

    2017-04-12

    A novel approach for the Raman measurement of nuclear materials is reported in this paper. It consists of the enclosure of the radioactive sample in a tight capsule that isolates the material from the atmosphere. The capsule can optionally be filled with a chosen gas pressurized up to 20 bars. The micro-Raman measurement is performed through an optical-grade quartz window. This technique permits accurate Raman measurements with no need for the spectrometer to be enclosed in an alpha-tight containment. It therefore allows the use of all options of the Raman spectrometer, like multi-wavelength laser excitation, different polarizations, and single or triple spectrometer modes. Some examples of measurements are shown and discussed. First, some spectral features of a highly radioactive americium oxide sample (AmO2) are presented. Then, we report the Raman spectra of neptunium oxide (NpO2) samples, the interpretation of which is greatly improved by employing three different excitation wavelengths, 17O doping, and a triple mode configuration to measure the anti-Stokes Raman lines. This last feature also allows the estimation of the sample surface temperature. Finally, data measured on a sample from Chernobyl lava, in which phases are identified by Raman mapping, are shown.

  4. The use of recently described ionisation techniques for the rapid analysis of some common drugs and samples of biological origin.

    Science.gov (United States)

    Williams, Jonathan P; Patel, Vibhuti J; Holland, Richard; Scrivens, James H

    2006-01-01

    Three ionisation techniques that require no sample preparation or extraction prior to mass analysis have been used for the rapid analysis of pharmaceutical tablets and ointments. These methods were (i) the novel direct analysis in real time (DART), (ii) desorption electrospray ionisation (DESI), and (iii) desorption atmospheric pressure chemical ionisation (DAPCI). The performance of the three techniques was investigated for a number of common drugs. Significant differences between these approaches were observed. For compounds of moderate to low polarity DAPCI produced more effective ionisation. Accurate DESI and DAPCI tandem mass spectra were obtained and these greatly enhance the selectivity and information content of the experiment. The detection from human skin of the active ingredients from ointments is reported together with the detection of ibuprofen metabolites in human urine. Copyright 2006 John Wiley & Sons, Ltd.

  5. Comparative study of manual liquid-based cytology (MLBC) technique and direct smear (conventional) technique on fine-needle cytology/fine-needle aspiration cytology samples

    Directory of Open Access Journals (Sweden)

    Prajkta Suresh Pawar

    2014-01-01

    Conclusion: The MLBC technique gives results comparable to the conventional technique, with better morphology. In a setup where the aspirators are learners, this technique will ensure adequacy because the remnant material in the needle hub is also processed.

  6. Estimating screening-mammography receiver operating characteristic (ROC) curves from stratified random samples of screening mammograms: a simulation study.

    Science.gov (United States)

    Zur, Richard M; Pesce, Lorenzo L; Jiang, Yulei

    2015-05-01

    To evaluate stratified random sampling (SRS) of screening mammograms by (1) Breast Imaging Reporting and Data System (BI-RADS) assessment categories, and (2) the presence of breast cancer in mammograms, for estimation of screening-mammography receiver operating characteristic (ROC) curves in retrospective observer studies. We compared observer study case sets constructed by (1) random sampling (RS); (2) SRS with proportional allocation (SRS-P) with BI-RADS 1 and 2 noncancer cases accounting for 90.6% of all noncancer cases; (3) SRS with disproportional allocation (SRS-D) with BI-RADS 1 and 2 noncancer cases accounting for 10%-80%; and (4) SRS-D and multiple imputation (SRS-D + MI) with missing BI-RADS 1 and 2 noncancer cases imputed to recover the 90.6% proportion. Monte Carlo simulated case sets were drawn from a large case population modeled after published Digital Mammography Imaging Screening Trial data. We compared the bias, root-mean-square error, and coverage of 95% confidence intervals of area under the ROC curve (AUC) estimates from the sampling methods (200-2000 cases, of which 25% were cancer cases) versus from the large case population. AUC estimates were unbiased from RS, SRS-P, and SRS-D + MI, but biased from SRS-D. AUC estimates from SRS-P and SRS-D + MI had 10% smaller root-mean-square error than RS. Both SRS-P and SRS-D + MI can be used to obtain unbiased and 10% more efficient estimate of screening-mammography ROC curves. Copyright © 2015 AUR. Published by Elsevier Inc. All rights reserved.
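The comparison above hinges on estimating the empirical (Mann-Whitney) AUC from a case set drawn by stratified random sampling with proportional allocation. The Python sketch below illustrates the idea on synthetic scores; the score distributions, strata proportions, and sample sizes are illustrative assumptions, not the simulation parameters of the study.

```python
import numpy as np

rng = np.random.default_rng(0)

def empirical_auc(neg, pos):
    """Mann-Whitney estimate of AUC: P(score_pos > score_neg), ties at 0.5."""
    diff = pos[:, None] - neg[None, :]
    return (diff > 0).mean() + 0.5 * (diff == 0).mean()

# Hypothetical screening population: noncancer cases fall into two strata,
# BI-RADS 1-2 (90.6% of noncancers, lower scores) and the rest.
n_neg, n_pos = 4000, 1000
in_birads12 = rng.random(n_neg) < 0.906
neg = np.where(in_birads12, rng.normal(0.0, 1.0, n_neg),
               rng.normal(0.8, 1.0, n_neg))
pos = rng.normal(1.5, 1.0, n_pos)

auc_pop = empirical_auc(neg, pos)   # "population" AUC to be recovered

def srs_proportional(n_sample_neg=450, n_sample_pos=150):
    """Stratified random sample of noncancers with proportional allocation
    (90.6% drawn from BI-RADS 1-2), plus a random sample of cancers."""
    idx12 = np.flatnonzero(in_birads12)
    idx_rest = np.flatnonzero(~in_birads12)
    k = round(n_sample_neg * 0.906)
    take = np.concatenate([rng.choice(idx12, k, replace=False),
                           rng.choice(idx_rest, n_sample_neg - k, replace=False)])
    return empirical_auc(neg[take], rng.choice(pos, n_sample_pos, replace=False))

estimates = [srs_proportional() for _ in range(40)]
print(f"population AUC {auc_pop:.3f}; "
      f"SRS-P mean {np.mean(estimates):.3f} (sd {np.std(estimates):.3f})")
```

Because the allocation preserves the population's stratum proportions, the sample AUC is an (approximately) unbiased estimate of the population AUC, consistent with the SRS-P result reported above.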

  7. Randomized clinical trial of BiClamp forceps versus clamp-crushing technique in open liver resection.

    Science.gov (United States)

    Chen, Jiang Ming; Geng, Wei; Zhang, Song; Liu, Fu Bao; Zhao, Hong Chuan; Zhao, Yi Jun; Wang, Guo Bin; Xie, Sheng Xue; Geng, Xiao Ping

    2017-03-01

    The aim of this trial was to compare the efficacy and safety of BiClamp forceps with the "gold-standard" clamp-crushing technique for open liver resection. From October 2014 to May 2016, 86 consecutive patients scheduled to undergo hepatic resection were randomized to a BiClamp forceps group (n = 43) or to a clamp-crushing technique group (n = 43). Background characteristics of the two groups were closely matched. There were no significant differences between the BiClamp forceps group and clamp-crushing group in total intraoperative blood loss (339.81 ± 257.20 ml vs. 376.73 ± 303.67 ml, respectively; P = 0.545) or blood loss per transection area (5.35 ± 3.27 ml/cm2 vs. 5.44 ± 3.02 ml/cm2, respectively; P = 0.609). Liver transection speed, the need for blood transfusion, morbidity, length of postoperative hospital stay, total hospitalization cost and liver function recovery were similar in the two groups. Multivariate logistic regression analysis identified major hepatectomy, multiple resections and liver transection time ≥30 min as significantly unfavorable factors for decreased intraoperative blood loss. Liver parenchymal transection with BiClamp forceps is as safe and feasible as the gold-standard clamp-crushing technique. © 2017 Japanese Society of Hepato-Biliary-Pancreatic Surgery.

  8. Characterizing stand-level forest canopy cover and height using Landsat time series, samples of airborne LiDAR, and the Random Forest algorithm

    Science.gov (United States)

    Ahmed, Oumer S.; Franklin, Steven E.; Wulder, Michael A.; White, Joanne C.

    2015-03-01

    Many forest management activities, including the development of forest inventories, require spatially detailed forest canopy cover and height data. Among the various remote sensing technologies, LiDAR (Light Detection and Ranging) offers the most accurate and consistent means for obtaining reliable canopy structure measurements. A potential solution to reduce the cost of LiDAR data is to integrate transects (samples) of LiDAR data with frequently acquired and spatially comprehensive optical remotely sensed data. Although multiple regression is commonly used for such modeling, often it does not fully capture the complex relationships between forest structure variables. This study investigates the potential of Random Forest (RF), a machine learning technique, to estimate LiDAR-measured canopy structure using a time series of Landsat imagery. The study is implemented over a 2600 ha area of industrially managed coastal temperate forests on Vancouver Island, British Columbia, Canada. We implemented a trajectory-based approach to time series analysis that generates time since disturbance (TSD) and disturbance intensity information for each pixel, and we used this information to stratify the forest land base into two strata: mature forests and young forests. Canopy cover and height for three forest classes (i.e., mature, young, and combined mature and young) were modeled separately using multiple regression and Random Forest (RF) techniques. For all forest classes, the RF models provided improved estimates relative to the multiple regression models. The lowest validation error was obtained for the mature forest stratum in a RF model (R2 = 0.88, RMSE = 2.39 m and bias = -0.16 for canopy height; R2 = 0.72, RMSE = 0.068% and bias = -0.0049 for canopy cover). This study demonstrates the value of using disturbance and successional history to inform estimates of canopy structure and obtain improved estimates of forest canopy cover and height using the RF algorithm.
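The core modelling comparison in this record (Random Forest regression vs. multiple regression for predicting LiDAR-measured canopy height from optical predictors) can be sketched as follows. The sketch uses synthetic data under a stated assumption: height saturates with time since disturbance and scales with NDVI, a nonlinear interaction that a single regression plane cannot capture but a Random Forest can. The predictors, growth model, and noise level are illustrative, not the study's.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)

# Synthetic stand-level predictors (illustrative): NDVI, time since
# disturbance (TSD) from the trajectory analysis, and one spectral band.
n = 2000
ndvi = rng.uniform(0.2, 0.9, n)
tsd = rng.uniform(0.0, 60.0, n)           # years since disturbance
swir = rng.normal(0.15, 0.03, n)
# Canopy height saturates with stand age and scales with NDVI.
height = 35.0 * (1.0 - np.exp(-tsd / 25.0)) * ndvi + rng.normal(0.0, 1.5, n)

X = np.column_stack([ndvi, tsd, swir])
X_tr, X_te, y_tr, y_te = train_test_split(X, height, random_state=0)

rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
ols = LinearRegression().fit(X_tr, y_tr)

r2_rf = r2_score(y_te, rf.predict(X_te))
r2_ols = r2_score(y_te, ols.predict(X_te))
print(f"Random Forest R2 = {r2_rf:.3f}, multiple regression R2 = {r2_ols:.3f}")
```

On this kind of saturating, interacting response the RF model outscores the linear model on held-out data, mirroring the direction of the study's result.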

  9. Sampling and sample preparation development for analytical and on-line measurement techniques of process liquids; Naeytteenoton ja kaesittelyn kehittaeminen prosessinesteiden analytiikan ja on-line mittaustekniikan tarpeisiin - MPKT 11

    Energy Technology Data Exchange (ETDEWEB)

    Karttunen, K. [Oulu Univ. (Finland)]

    1998-12-31

    The main goal of the research project is to develop sampling and sample-handling methods and techniques for the pulp and paper industry, for use in analysis and in on-line measurement. The research focuses especially on the development of the classification and separation methods and techniques needed for liquid and colloidal substances as well as for ion analysis. (orig.)

  10. Randomized Comparison of Two Vaginal Self-Sampling Methods for Human Papillomavirus Detection: Dry Swab versus FTA Cartridge.

    Directory of Open Access Journals (Sweden)

    Rosa Catarino

    Full Text Available Human papillomavirus (HPV) self-sampling (self-HPV) is valuable in cervical cancer screening. HPV testing is usually performed on physician-collected cervical smears stored in liquid-based medium; dry filters and swabs are an alternative. We evaluated the adequacy of self-HPV using two dry storage and transport devices, the FTA cartridge and the swab. A total of 130 women performed two consecutive self-HPV samples. Randomization determined which of the two tests was performed first: self-HPV using dry swabs (s-DRY) or vaginal specimen collection using a cytobrush applied to an FTA cartridge (s-FTA). After self-HPV, a physician collected a cervical sample using liquid-based medium (Dr-WET). HPV types were identified by real-time PCR. Agreement between collection methods was measured using the kappa statistic. HPV prevalence for high-risk types was 62.3% (95% CI: 53.7-70.2) detected by s-DRY, 56.2% (95% CI: 47.6-64.4) by Dr-WET, and 54.6% (95% CI: 46.1-62.9) by s-FTA. There was overall agreement of 70.8% between s-FTA and s-DRY samples (kappa = 0.34), and of 82.3% between self-HPV and Dr-WET samples (kappa = 0.56). Detection sensitivities for low-grade squamous intraepithelial lesion or worse (LSIL+) were: 64.0% (95% CI: 44.5-79.8) for s-FTA, 84.6% (95% CI: 66.5-93.9) for s-DRY, and 76.9% (95% CI: 58.0-89.0) for Dr-WET. The preferred self-collection method among patients was s-DRY (40.8% vs. 15.4%). Regarding costs, the FTA card was five times more expensive than the swab (~5 US dollars [USD] per card vs. ~1 USD per swab). Self-HPV using dry swabs is sensitive for detecting LSIL+ and less expensive than s-FTA. International Standard Randomized Controlled Trial Number (ISRCTN): 43310942.
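Agreement between paired collection methods in this record is summarized with the kappa statistic, which discounts the agreement expected by chance from the raw agreement. A minimal Python sketch, using hypothetical paired binary results (the prevalence and error rates below are illustrative assumptions, not the study's raw data):

```python
import numpy as np

def cohens_kappa(a, b):
    """Cohen's kappa for paired binary results (1 = HPV positive):
    (observed agreement - chance agreement) / (1 - chance agreement)."""
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    p_obs = np.mean(a == b)                           # observed agreement
    p_exp = (np.mean(a) * np.mean(b)                  # chance: both positive
             + (1 - np.mean(a)) * (1 - np.mean(b)))   # chance: both negative
    return (p_obs - p_exp) / (1 - p_exp)

# Hypothetical paired results for 130 women, with prevalence roughly
# matching the abstract; each assay flips ~15% of the true results.
rng = np.random.default_rng(3)
truth = rng.random(130) < 0.60
s_dry = truth ^ (rng.random(130) < 0.15)
s_fta = truth ^ (rng.random(130) < 0.15)

agreement = np.mean(s_dry == s_fta)
kappa = cohens_kappa(s_dry, s_fta)
print(f"raw agreement {agreement:.2f}, kappa {kappa:.2f}")
```

As in the abstract, kappa is always lower than raw agreement whenever agreement is imperfect, because part of the raw agreement is attributable to chance.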

  11. Two different techniques in the rehabilitation treatment of low back pain: a randomized controlled trial.

    Science.gov (United States)

    Donzelli, S; Di Domenica, E; Cova, A M; Galletti, R; Giunta, N

    2006-09-01

    The Back School is a widely accepted and effective method for treating low back pain, whereas no scientific evidence exists about the effects of the Pilates CovaTech method. With this study we wanted to evaluate the efficacy of this new method in patients with low back pain. Fifty-three patients with at least 3 months of nonspecific low back pain were entered into a Pilates therapy or a Back School treatment group; 43 completed the study. Small exercise groups of 7 patients each followed a daily kinesitherapy protocol for 10 days. Evaluations were performed at the start of the study and then at 1, 3 and 6 months after the beginning of treatment. We used the Oswestry Low Back Pain Disability Questionnaire (OLBPDQ) to assess disability and the visual analog scale (VAS) to evaluate pain. Demographic and baseline clinical characteristics were similar for both groups. A significant reduction in pain intensity and disability was observed across the entire sample. The Pilates method group showed better compliance and subjective response to treatment. The results obtained with the Pilates method were comparable to those achieved with the Back School method, suggesting its use as an alternative approach to the treatment of nonspecific low back pain.

  12. Optimal Subset Selection of Time-Series MODIS Images and Sample Data Transfer with Random Forests for Supervised Classification Modelling

    Directory of Open Access Journals (Sweden)

    Fuqun Zhou

    2016-10-01

    Full Text Available Nowadays, various time-series Earth Observation data with multiple bands are freely available, such as Moderate Resolution Imaging Spectroradiometer (MODIS datasets including 8-day composites from NASA, and 10-day composites from the Canada Centre for Remote Sensing (CCRS. It is challenging to efficiently use these time-series MODIS datasets for long-term environmental monitoring due to their vast volume and information redundancy. This challenge will be greater when Sentinel 2–3 data become available. Another challenge that researchers face is the lack of in-situ data for supervised modelling, especially for time-series data analysis. In this study, we attempt to tackle the two important issues with a case study of land cover mapping using CCRS 10-day MODIS composites with the help of two of Random Forests’ features: variable importance and outlier identification. The variable importance feature is used to analyze and select optimal subsets of time-series MODIS imagery for efficient land cover mapping, and the outlier identification feature is utilized for transferring sample data available from one year to an adjacent year for supervised classification modelling. The results of the case study of agricultural land cover classification at a regional scale show that using only about a half of the variables we can achieve land cover classification accuracy close to that generated using the full dataset. The proposed simple but effective solution of sample transferring could make supervised modelling possible for applications lacking sample data.

  13. Optimal Subset Selection of Time-Series MODIS Images and Sample Data Transfer with Random Forests for Supervised Classification Modelling.

    Science.gov (United States)

    Zhou, Fuqun; Zhang, Aining

    2016-10-25

    Nowadays, various time-series Earth Observation data with multiple bands are freely available, such as Moderate Resolution Imaging Spectroradiometer (MODIS) datasets including 8-day composites from NASA, and 10-day composites from the Canada Centre for Remote Sensing (CCRS). It is challenging to efficiently use these time-series MODIS datasets for long-term environmental monitoring due to their vast volume and information redundancy. This challenge will be greater when Sentinel 2-3 data become available. Another challenge that researchers face is the lack of in-situ data for supervised modelling, especially for time-series data analysis. In this study, we attempt to tackle the two important issues with a case study of land cover mapping using CCRS 10-day MODIS composites with the help of two of Random Forests' features: variable importance and outlier identification. The variable importance feature is used to analyze and select optimal subsets of time-series MODIS imagery for efficient land cover mapping, and the outlier identification feature is utilized for transferring sample data available from one year to an adjacent year for supervised classification modelling. The results of the case study of agricultural land cover classification at a regional scale show that using only about a half of the variables we can achieve land cover classification accuracy close to that generated using the full dataset. The proposed simple but effective solution of sample transferring could make supervised modelling possible for applications lacking sample data.
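The variable-importance-driven subset selection described in this record can be sketched as follows. The sketch uses a synthetic stand-in for the MODIS variable stack (36 variables, of which only three actually drive the land-cover label; all data are illustrative assumptions): a Random Forest is trained on all variables, the variables are ranked by `feature_importances_`, and a second model trained on the top half is compared against the full model on held-out data.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)

# Stand-in for a stack of 10-day composite variables (dates x bands); only
# three of the 36 actually determine the land-cover label (illustrative).
n, p = 2000, 36
X = rng.normal(size=(n, p))
y = (X[:, 3] + X[:, 12] - X[:, 25] > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

full = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_tr, y_tr)

# Rank variables by RF importance and keep the top half.
order = np.argsort(full.feature_importances_)[::-1]
subset = np.sort(order[: p // 2])

half = RandomForestClassifier(n_estimators=300, random_state=0)
half.fit(X_tr[:, subset], y_tr)

acc_full = full.score(X_te, y_te)
acc_half = half.score(X_te[:, subset], y_te)
print(f"all {p} variables: {acc_full:.3f}; top {p // 2}: {acc_half:.3f}")
```

As the abstract reports for the real MODIS stack, roughly half the variables suffice: the informative variables dominate the importance ranking, so discarding the rest costs little (and can even help by removing noise).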

  14. Surface Sampling Techniques

    Science.gov (United States)

    1982-09-01

    their suitability for use for qualitative analysis of explosive residues on the surface types of interest. Tables 11-5 and 11-6 list spot test...below: Analytes Tested NG Nitroglycerin PETN Pentaerythritol tetranitrate RDX Cyclotrimethylenetrinitramine TNT 2,4,6-trinitrotoluene TNB 1,3,5...acetonitrile had evaporated, the paper was examined under 254 nm UV illumination. All of the analytes except NG and PETN were detected at the 100X

  15. Glove perforation rate with orthopedic gloving versus double gloving technique in tibial plateau leveling osteotomy: A randomized trial.

    Science.gov (United States)

    Egeler, Kimberly; Stephenson, Nicole; Stanke, Natasha

    2016-11-01

    In this randomized, prospective study, perforation rates, glove change rates, and cost between orthopedic gloves (n = 227) and double gloving with standard latex surgical gloves (n = 178) worn in tibial plateau leveling osteotomy procedures were compared. Gloves were collected from the surgeon and surgical resident after procedures and were tested for perforations with a standardized water leak test, as described by the American Society for Testing and Materials International. No statistically significant difference was found between the perforation rate using orthopedic gloving and double gloving techniques (P = 0.629) or the rate at which gloves were changed (P = 0.146). Orthopedic gloving was 2.1 times more costly than double gloving, but it may be preferred by surgeons for dexterity and comfort.

  16. Specific music therapy techniques in the treatment of primary headache disorders in adolescents: a randomized attention-placebo-controlled trial.

    Science.gov (United States)

    Koenig, Julian; Oelkers-Ax, Rieke; Kaess, Michael; Parzer, Peter; Lenzen, Christoph; Hillecke, Thomas Karl; Resch, Franz

    2013-10-01

    Migraine and tension-type headache have a high prevalence in children and adolescents. In addition to common pharmacologic and nonpharmacologic interventions, music therapy has been shown to be efficient in the prophylaxis of pediatric migraine. This study aimed to assess the efficacy of specific music therapy techniques in the treatment of adolescents with primary headache (tension-type headache and migraine). A prospective, randomized, attention-placebo-controlled parallel group trial was conducted. Following an 8-week baseline, patients were randomized to either music therapy (n = 40) or a rhythm pedagogic program (n = 38) designed as an "attention placebo" over 6 sessions within 8 weeks. Reduction of both headache frequency and intensity after treatment (8-week postline) as well as 6 months after treatment were taken as the efficacy variables. Treatments were delivered in equal dose and frequency by the same group of therapists. Data analysis of subjects completing the protocol showed that neither treatment was superior to the other at any point of measurement (posttreatment and follow-up). Intention-to-treat analysis revealed no impact of drop-out on these results. Both groups showed a moderate mean reduction of headache frequency posttreatment of about 20%, but only small numbers of responders (at least 50% frequency reduction). Follow-up data showed no significant deteriorations or improvements. This article presents a randomized placebo-controlled trial on music therapy in the treatment of adolescents with frequent primary headache. Music therapy is not superior to an attention placebo within this study. These results draw attention to the need of providing adequate controls within therapeutic trials in the treatment of pain. Copyright © 2013 American Pain Society. Published by Elsevier Inc. All rights reserved.

  17. Media Use and Source Trust among Muslims in Seven Countries: Results of a Large Random Sample Survey

    Directory of Open Access Journals (Sweden)

    Steven R. Corman

    2013-12-01

    Full Text Available Despite the perceived importance of media in the spread of and resistance against Islamist extremism, little is known about how Muslims use different kinds of media to get information about religious issues, and what sources they trust when doing so. This paper reports the results of a large, random sample survey among Muslims in seven countries in Southeast Asia, West Africa and Western Europe, which helps fill this gap. Results show a diverse set of profiles of media use and source trust that differ by country, with overall low trust in mediated sources of information. Based on these findings, we conclude that mass media is still the most common source of religious information for Muslims, but that trust in mediated information is low overall. This suggests that media are probably best used to persuade opinion leaders, who will then carry anti-extremist messages through more personal means.

  18. Enhancing positive parent-child interactions and family functioning in a poverty sample: a randomized control trial.

    Science.gov (United States)

    Negrão, Mariana; Pereira, Mariana; Soares, Isabel; Mesman, Judi

    2014-01-01

    This study tested the attachment-based intervention program Video-feedback Intervention to promote Positive Parenting and Sensitive Discipline (VIPP-SD) in a randomized controlled trial with poor families of toddlers screened for professional's concerns about the child's caregiving environment. The VIPP-SD is an evidence-based intervention, but has not yet been tested in the context of poverty. The sample included 43 families with 1- to 4-year-old children: mean age at the pretest was 29 months and 51% were boys. At the pretest and posttest, mother-child interactions were observed at home, and mothers reported on family functioning. The VIPP-SD proved to be effective in enhancing positive parent-child interactions and positive family relations in a severely deprived context. Results are discussed in terms of implications for support services provided to such poor families in order to reduce intergenerational risk transmission.

  19. Effects of horizontal vs vertical vaginal cuff closure techniques on vagina length after vaginal hysterectomy: a prospective randomized study.

    Science.gov (United States)

    Cavkaytar, Sabri; Kokanali, Mahmut Kuntay; Topcu, Hasan Onur; Aksakal, Orhan Seyfi; Doganay, Melike

    2014-01-01

    To compare the effects of horizontal and vertical vaginal cuff closure techniques on vagina length after vaginal hysterectomy. Prospective randomized study (Canadian Task Force classification I). Teaching and research hospital, a tertiary center. Fifty-two women with POP-Q stage 0 or 1 uterine prolapse were randomized into 2 groups using vertical (n = 26) or horizontal (n = 26) vaginal cuff closure. All patients underwent vaginal hysterectomy. Vagina length in the 2 groups was compared preoperatively, immediately after surgery, and at 6 weeks postoperatively. Mean (SD) preoperative vagina length in the horizontal and vertical groups was similar (7.87 [0.92] cm vs 7.99 [0.78] cm; p = .41). Immediately postoperatively, the vagina was significantly shorter in the horizontal group than in the vertical group (6.61 [0.89] cm vs 7.51 [0.74] cm), and at 6 weeks the vagina was still significantly shorter in the horizontal group (6.55 [0.89] cm vs 7.42 [0.73] cm). The change in vagina length before and after surgery was also significantly greater in the horizontal group than in the vertical group (-1.26 [0.12] cm vs 0.49 [0.11] cm). Vertical cuff closure thus preserves vagina length better than does horizontal cuff closure. Copyright © 2014 AAGL. Published by Elsevier Inc. All rights reserved.

  20. Nonpharmacological techniques to reduce pain in preterm infants who receive heel-lance procedure: a randomized controlled trial.

    Science.gov (United States)

    Bergomi, Piera; Chieppi, Michele; Maini, Antonella; Mugnos, Tiziana; Spotti, Debora; Tzialla, Chrisoulle; Scudeller, Luigia

    2014-01-01

    The heel-lance (HL) method for blood collection from the newborn is controversial for the pain it causes. This is the first randomized controlled trial on the management and reduction of pain using the music of Wolfgang Amadeus Mozart ("Sonata K. 448") in premature infants hospitalized in a neonatal intensive care unit (NICU). This study has compared nonpharmacological techniques with standard procedure for reducing pain during HL procedure. Thirty-five premature infants were enrolled, each for 3 HL procedures, of which each was randomized to 1 of the 3 study arms. Arms were then compared in terms of the Premature Infant Pain Profile (PIPP) changes by analysis of variance (ANOVA). One hundred five HL procedures were available for analysis (35 standard procedure, 35 music, 35 glucose). Median baseline PIPP was 3, and median PIPP after the HL procedure was 5. PIPP scale change was +3 in the control arm, +1 in the glucose arm, +2 in the music arm (p = .008). Both glucose and music were safe and effective in limiting pain increase when compared to standard procedure in HL procedures in preterm infants.

  1. Rationale, Design, Samples, and Baseline Sun Protection in a Randomized Trial on a Skin Cancer Prevention Intervention in Resort Environments

    Science.gov (United States)

    Buller, David B.; Andersen, Peter A.; Walkosz, Barbara J.; Scott, Michael D.; Beck, Larry; Cutter, Gary R.

    2016-01-01

    Introduction Exposure to solar ultraviolet radiation during recreation is a risk factor for skin cancer. This trial evaluated an intervention to promote advanced sun protection (sunscreen pre-application/reapplication; protective hats and clothing; use of shade) during vacations. Materials and Methods Adult visitors to hotels/resorts with outdoor recreation (i.e., vacationers) participated in a group-randomized pretest-posttest controlled quasi-experimental design in 2012–14. Hotels/resorts were pair-matched and randomly assigned to the intervention or untreated control group. Sun protection (e.g., clothing, hats, shade and sunscreen) was measured in cross-sectional samples by observation and a face-to-face intercept survey during two-day visits. Results Initially, 41 hotels/resorts (11%) participated, but 4 dropped out before posttest. Hotels/resorts were diverse (employees = 30 to 900; latitude = 24° 78′ N to 50° 52′ N; elevation = 2 ft. to 9,726 ft. above sea level) and had a variety of outdoor venues (beaches/pools, court/lawn games, golf courses, common areas, and chairlifts). At pretest, 4,347 vacationers were observed and 3,531 surveyed. More females were surveyed (61%) than observed (50%). Vacationers were mostly 35–60 years old, highly educated (college education = 68%) and non-Hispanic white (93%), with high-risk skin types (22%). Vacationers reported covering 60% of their skin with clothing. Also, 40% of vacationers used shade; 60% applied sunscreen; and 42% had been sunburned. Conclusions The trial faced challenges recruiting resorts, but results show that the large, multi-state sample of vacationers was at high risk for solar UV exposure. PMID:26593781

  2. Investigating causal associations between use of nicotine, alcohol, caffeine, and cannabis: A two-sample bidirectional Mendelian randomization study.

    Science.gov (United States)

    Verweij, Karin J H; Treur, Jorien L; Vink, Jacqueline M

    2018-01-15

    Epidemiological studies consistently show co-occurrence of use of different addictive substances. Whether these associations are causal or due to overlapping underlying influences remains an important question in addiction research. Methodological advances have made it possible to use published genetic associations to infer causal relationships between phenotypes. In this exploratory study, we used Mendelian randomization (MR) to examine the causality of well-established associations between nicotine, alcohol, caffeine, and cannabis use. Two-sample MR was employed to estimate bi-directional causal effects between four addictive substances: nicotine (smoking initiation and cigarettes smoked per day), caffeine (cups of coffee per day), alcohol (units per week), and cannabis (initiation). Based on existing genome-wide association results we selected genetic variants associated with the exposure measure as an instrument to estimate causal effects. Where possible we applied sensitivity analyses (MR-Egger and weighted median) more robust to horizontal pleiotropy. Most MR tests did not reveal causal associations. There was some weak evidence for a causal positive effect of genetically instrumented alcohol use on smoking initiation and of cigarettes per day on caffeine use, but these did not hold up with the sensitivity analyses. There was also some suggestive evidence for a positive effect of alcohol use on caffeine use (only with MR-Egger) and smoking initiation on cannabis initiation (only with weighted median). None of the suggestive causal associations survived corrections for multiple testing. Two-sample Mendelian randomization analyses found little evidence for causal relationships between nicotine, alcohol, caffeine, and cannabis use. This article is protected by copyright. All rights reserved.
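    The pooled causal estimate in two-sample MR is commonly the inverse-variance-weighted (IVW) combination of per-SNP Wald ratios, which can be sketched as follows; the SNP-level summary statistics below are illustrative numbers, not values from the study.

    ```python
    # IVW two-sample MR sketch: Wald ratio per SNP (beta_y / beta_x),
    # weighted by the inverse of each ratio's first-order variance.
    import numpy as np

    beta_x = np.array([0.12, 0.08, 0.15, 0.10])   # SNP-exposure effects
    beta_y = np.array([0.024, 0.018, 0.027, 0.019])  # SNP-outcome effects
    se_y   = np.array([0.010, 0.012, 0.011, 0.009])  # outcome standard errors

    ratio = beta_y / beta_x              # per-SNP Wald ratios
    w = (beta_x / se_y) ** 2             # IVW weights = 1 / var(ratio)
    ivw = np.sum(w * ratio) / np.sum(w)  # pooled causal estimate
    se_ivw = np.sqrt(1.0 / np.sum(w))    # standard error of the estimate
    print(round(ivw, 3), round(se_ivw, 3))
    ```

    Sensitivity estimators such as MR-Egger add an intercept to this weighted regression to absorb directional pleiotropy, which is why they were used as robustness checks in the study.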

  3. Multivariate stratified sampling by stochastic multiobjective optimisation

    OpenAIRE

    Diaz-Garcia, Jose A.; Ramos-Quiroga, Rogelio

    2011-01-01

    This work considers the allocation problem for multivariate stratified random sampling as a problem of integer non-linear stochastic multiobjective mathematical programming. With this goal in mind the asymptotic distribution of the vector of sample variances is studied. Two alternative approaches are suggested for solving the allocation problem for multivariate stratified random sampling. An example is presented by applying the different proposed techniques.
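    As context, the classical single-variable Neyman allocation that the multivariate stochastic formulation generalizes can be sketched as follows: sample size is allocated to each stratum in proportion to N_h × S_h. The stratum sizes and standard deviations are illustrative.

    ```python
    # Neyman allocation: n_h proportional to N_h * S_h.
    N = [500, 300, 200]     # stratum population sizes N_h
    S = [10.0, 4.0, 2.0]    # stratum standard deviations S_h
    n = 100                 # total sample size to allocate

    weights = [Nh * Sh for Nh, Sh in zip(N, S)]
    total = sum(weights)
    alloc = [round(n * w / total) for w in weights]
    print(alloc)  # → [76, 18, 6]
    ```

    The multivariate problem arises because each survey variable implies a different optimal allocation, motivating the multiobjective treatment in the paper.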

  4. Investigation of CPD and HMDS sample preparation techniques for cervical cells in developing computer-aided screening system based on FE-SEM/EDX.

    Science.gov (United States)

    Jusman, Yessi; Ng, Siew Cheok; Abu Osman, Noor Azuan

    2014-01-01

    This paper investigated the effects of critical-point drying (CPD) and hexamethyldisilazane (HMDS) sample preparation techniques for cervical cells on field emission scanning electron microscopy and energy dispersive X-ray (FE-SEM/EDX) analysis. We investigated the visualization of cervical cell images and the elemental distribution on the cervical cell for the two sample preparation techniques. Using FE-SEM/EDX, the cervical cell images were captured and the cell element compositions were extracted for both sample preparation techniques. Cervical cell image quality, elemental composition, and processing time were considered for comparison of performance. Qualitatively, the FE-SEM image based on the HMDS preparation technique had better image quality than the CPD technique in terms of the degree of cell spread on the specimen and morphologic signs of cell deterioration (i.e., existence of plate and pellet drying artifacts and membrane blebs). Quantitatively, with mapping and line-scanning EDX analysis, carbon and oxygen element compositions with the HMDS technique were higher than with the CPD technique in terms of weight percentages. The HMDS technique also had a shorter processing time than the CPD technique. The results indicate that FE-SEM imaging, elemental composition, and processing time were better with the HMDS technique than with the CPD technique for preparing cervical cells for a computer-aided screening system.

  5. Assessment of a novel flow cytometry technique of one-step intracellular staining: example of FOXP3 in clinical samples.

    Science.gov (United States)

    Demaret, Julie; Saison, Julien; Venet, Fabienne; Malcus, Christophe; Poitevin-Later, Francoise; Lepape, Alain; Ferry, Tristan; Monneret, Guillaume

    2013-05-01

    By measuring multiple parameters on a single-cell basis, flow cytometry is a potent tool to dissect the phenotypes and functions of cell subsets. However, because this technique may be time-consuming, particularly for intracellular staining, its use in daily routine or in large cohorts can be problematic. Recently, a novel reagent has been developed to perform intracellular staining in one step. The objective of our study was thus to assess this new method in comparison with the reference technique by focusing on FOXP3 staining in clinical samples. Peripheral blood was collected from 15 HIV-1-infected patients, 5 critically ill patients, and 5 healthy volunteers and stained using the two different methods. Different subsets of FOXP3-positive cells were investigated by flow cytometry. When comparing results obtained with the two techniques, no statistical differences between the percentages of CD4+FOXP3+, CD4+CD25+FOXP3+, and CD4+CD25+CD127-FOXP3+ cells were observed. In addition, a strong correlation between percentages of CD4+FOXP3+CD25+CD127- lymphocytes measured with both techniques was found in patients (r: 0.843). Flow cytometry stainings obtained with the one-step method were very robust, with excellent intra-assay precision, better discriminative power, and correct stability and reproducibility of the staining even after blood storage. With a strong correlation between the percentages of FOXP3+ Tregs when compared with the reference method, a better staining quality, a shorter realization time and no need for an isotype control, this one-step procedure may represent an important improvement for daily routine use of intracellular staining. Copyright © 2013 International Clinical Cytometry Society.

  6. Reliability of a new technique for the determination of vitamin B12 absorption in children: single stool sample test--a double isotope technique

    Energy Technology Data Exchange (ETDEWEB)

    Hjelt, K.

    1986-03-01

    The fractional vitamin B12 absorption (FAB12) was determined in 39 patients with various gastrointestinal diseases by a double-isotope technique, employing a single stool sample test (SSST) as well as a complete stool collection. The age of the patients ranged from 2.5 months to 16.2 years (mean 5.0 years). The test dose was administered orally and consisted of 0.5-4.5 micrograms of 57Co-B12 (approximately 0.05 microCi), carmine powder, and 2 mg 51CrCl3 (approximately 1.25 microCi) as the nonabsorbable tracer. The whole-body radiation to a 1-year-old child averaged only 20 mrad. The stool and napkin were collected and homogenized by addition of 300 ml chromium sulfuric acid. A 300-ml sample of the homogenized stool and napkin, as well as 300 ml chromium sulfuric acid (75% v/v) containing the standards, were counted in a broad-based well counter. The FAB12 determined by SSST employing the stool with the highest content of 51Cr (which corresponded to the most carmine-colored stool) correlated closely with the FAB12 based on complete stool collection (r = 0.98, n = 39, p less than 0.001). The reproducibility of FAB12 determined by SSST was assessed from double assays in 19 patients. For a mean value of 12%, the SD was 3%, which corresponded to a coefficient of variation (CV) of 25%. The excretion of 57Co and 51Cr in the urine was examined in six patients with moderate to severe mucosal damage and was found to be low.
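    A minimal sketch of the double-isotope calculation, assuming the standard form in which fractional absorption is one minus the stool 57Co/51Cr ratio normalized to the dose ratio (the count values below are illustrative, not from the study):

    ```python
    # With 51Cr as the nonabsorbable tracer, unabsorbed 57Co-B12 travels with
    # the 51Cr marker, so the stool isotope ratio gives the unabsorbed fraction.
    def fab12(co57_dose, cr51_dose, co57_stool, cr51_stool):
        """Fractional vitamin B12 absorption from a single stool sample."""
        return 1.0 - (co57_stool / cr51_stool) / (co57_dose / cr51_dose)

    # Example: stool retains most of the 51Cr but little 57Co -> high absorption.
    print(round(fab12(1000.0, 1000.0, 150.0, 900.0), 3))  # → 0.833
    ```

    Normalizing to the marker ratio is what makes a single stool sample sufficient: incomplete collection cancels out of the ratio.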

  7. Soil Moisture Mapping in an Arid Area Using a Land Unit Area (LUA) Sampling Approach and Geostatistical Interpolation Techniques

    Directory of Open Access Journals (Sweden)

    Saeid Gharechelou

    2016-03-01

    Full Text Available Soil moisture (SM) plays a key role in many environmental processes and has high spatial and temporal variability. Collecting sample SM data through field surveys (e.g., for validation of remote sensing-derived products) can be very expensive and time consuming if a study area is large, and producing accurate SM maps from the sample point data is a difficult task as well. In this study, geospatial processing techniques are used to combine several geo-environmental layers relevant to SM (soil, geology, rainfall, land cover, etc.) into a land unit area (LUA) map, which delineates regions with relatively homogeneous geological/geomorphological, land use/land cover, and climate characteristics. This LUA map is used to guide the collection of sample SM data in the field, and the field data is finally spatially interpolated to create a wall-to-wall map of SM in the study area (Garmsar, Iran). The main goal of this research is to create a SM map in an arid area, using a land unit area (LUA) approach to obtain the most appropriate sample locations for collecting SM field data. Several environmental GIS layers, which have an impact on SM, were combined to generate a LUA map, and then field surveying was done in each class of the LUA map. A SM map was produced based on the LUA, remote sensing data indexes, and spatial interpolation of the field survey sample data. Several interpolation methods (inverse distance weighting, kriging, and co-kriging) were evaluated for generating SM maps from the sample data. The produced maps were compared to each other and validated using ground truth data. The results show that the LUA approach is a reasonable method to create homogeneous fields to introduce a representative sample for field soil surveying. The geostatistical SM map achieved adequate accuracy; however, trend analysis and distribution of the soil sample point locations within the LUA types should be further investigated to achieve even better results. Co
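    One of the interpolation methods compared in the study, inverse distance weighting, can be sketched as follows; the sample points, moisture values, and power parameter are illustrative, not from the study.

    ```python
    # IDW: each query point is a weighted mean of sampled values, with
    # weights falling off as 1 / distance^power.
    import numpy as np

    def idw(xy_known, values, xy_query, power=2.0, eps=1e-12):
        d = np.linalg.norm(xy_known - xy_query, axis=1)
        w = 1.0 / (d ** power + eps)   # eps avoids division by zero at samples
        return float(np.sum(w * values) / np.sum(w))

    pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
    sm = np.array([0.10, 0.20, 0.30, 0.40])   # sampled soil moisture (v/v)
    print(round(idw(pts, sm, np.array([0.5, 0.5])), 3))  # → 0.25
    ```

    Unlike kriging and co-kriging, IDW ignores the spatial covariance structure of the data, which is one reason the study compares the three methods against ground truth.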

  8. The modified ultra-mini percutaneous nephrolithotomy technique and comparison with standard nephrolithotomy: a randomized prospective study.

    Science.gov (United States)

    Karakan, Tolga; Kilinc, Muhammet Fatih; Doluoglu, Omer Gokhan; Yildiz, Yildiray; Yuceturk, Cem Nedim; Bagcioglu, Murat; Karagöz, Mehmet Ali; Bas, Okan; Resorlu, Berkan

    2017-04-01

    To compare the success and complications of ultra-mini percutaneous nephrolithotomy (UPNL) and standard percutaneous nephrolithotomy (SPNL) techniques. We prospectively analyzed 50 patients who underwent SPNL and 47 patients who underwent UPNL. Patients with a stone size of 25 mm or smaller were included, and coin flipping was used as the randomization technique. The mean stone size was 20.9 ± 3.6 mm in the SPNL group and 20.3 ± 3.0 mm in the ultra-mini PNL group. Stone free rates were 88 % (44/50 patients) and 89.3 % (42/47 patients) in the SPNL and UPNL groups, respectively, without any significant difference in between (p = 0.33). No major complications were seen in the UPNL group. PNL has been modified into micro PNL and UPNL parallel to technological advances to decrease the complications of PNL. When performed as we do, UPNL may be an alternative method to SPNL without any additional smaller-calibre nephroscope and with a similar high success rate.

  9. Treatment of anterior cruciate ligament injuries with special reference to surgical technique and rehabilitation: an assessment of randomized controlled trials.

    Science.gov (United States)

    Andersson, Daniel; Samuelsson, Kristian; Karlsson, Jón

    2009-06-01

    The primary aim was to investigate and assess the current evidence of randomized controlled trials (RCTs) on anterior cruciate ligament (ACL) injuries, with special reference to the choice of surgical techniques and aspects of rehabilitation. A secondary aim was to clarify relative strengths and weaknesses of the selected studies, resolve literature conflicts, and finally, evaluate the need for further studies. A PubMed database search using the key words "anterior cruciate ligament" was performed. The search was limited to only RCTs published in English during the period of January 1995 to March 2009. Articles concerning surgical technique and rehabilitation were obtained. After initial screening and subsequent quality appraisal based on the CONSORT (Consolidated Standards of Reporting Trials) Statement, a total of 70 articles were included in this review. Initial graft tension and the use of a ligament augmentation device do not affect clinical outcome. Bioabsorbable screws and titanium screws produced equal clinical outcome, regardless of graft type. Radiographic signs of osteoarthritis develop in 50% of ACL-injured patients, regardless of treatment. Meniscectomy further increases the risk. Furthermore, the use of a postoperative knee brace does not affect the clinical outcome after ACL reconstruction. Closed kinetic chain exercises produced less pain and laxity while promoting better subjective outcome than open kinetic chain exercises after patellar tendon reconstruction. In terms of quality assessment, several weaknesses pertaining to study design were discovered among the included RCTs, which intelligibly stress the need for further high-quality studies. Level II, systematic review of RCTs.

  10. Effects of a modified technique for TVT-O positioning on postoperative pain: single-blind randomized study.

    Science.gov (United States)

    Tommaselli, Giovanni A; Formisano, Carmen; Di Carlo, Costantino; Fabozzi, Annamaria; Nappi, Carmine

    2012-09-01

    One of the most frequent and distressing complications of the tension-free vaginal tape obturator (TVT-O) procedure for stress urinary incontinence (SUI) is groin pain, which may be related to the surgical technique or to the tape. The aim of this study was to evaluate the impact of a more limited dissection and a more medial trocar trajectory in TVT-O positioning on postoperative pain. Seventy-two SUI patients were randomized to undergo TVT-O either with the traditional technique (group A) or a modified procedure (reduced paraurethral dissection and a more medial trocar trajectory) (group B). Visual analog scale pain scores 12 h, 24 h, and 1 month after the procedure, number of analgesic vials, objective cure rate, and patient functional and quality of life scores 6 months after the procedure were evaluated. Data were analyzed by the Student's t test for parametric variables, the Mann-Whitney U and Wilcoxon tests for nonparametric variables, and Fisher's exact test for categorical variables. Pain scores were significantly lower in group B compared with group A 24 h after surgery (P = 0.01). Pain scores significantly decreased from 12-24 h postoperatively to the 1-month follow-up in both groups. These modifications to TVT-O positioning seem to reduce postoperative groin pain at 24 h after the procedure, but not the analgesic requirement.

  11. Calibrated delivery drape versus indirect gravimetric technique for the measurement of blood loss after delivery: a randomized trial.

    Science.gov (United States)

    Ambardekar, Shubha; Shochet, Tara; Bracken, Hillary; Coyaji, Kurus; Winikoff, Beverly

    2014-08-15

    Trials of interventions for PPH prevention and treatment rely on different measurement methods for the quantification of blood loss and identification of PPH. This study's objective was to compare measures of blood loss obtained from two different measurement protocols frequently used in studies. Nine hundred women presenting for vaginal delivery were randomized to a direct method (a calibrated delivery drape) or an indirect method (a shallow bedpan placed below the buttocks and weighing the collected blood and blood-soaked gauze/pads). Blood loss was measured from immediately after delivery for at least one hour or until active bleeding stopped. Significantly greater mean blood loss was recorded by the direct than by the indirect measurement technique (253.9 mL and 195.3 mL, respectively; difference = 58.6 mL (95% CI: 31-86)). The direct technique also identified significantly more women with blood loss over 500 mL (8.7% vs. 4.7%, p = 0.02). The study suggests a real and significant difference in blood loss measurement between these methods. Research using blood loss measurement as an endpoint needs to be interpreted taking measurement technique into consideration. This study has been registered at clinicaltrials.gov as NCT01885845.
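    A minimal sketch of the indirect gravimetric estimate described above, assuming blood volume is derived from wet-minus-dry weight and a nominal blood density of about 1.06 g/mL; the density value and weights are illustrative assumptions, not study values.

    ```python
    # Gravimetric blood loss: weigh the receptacle and gauze/pads before and
    # after, then convert the mass difference to volume via blood density.
    BLOOD_DENSITY_G_PER_ML = 1.06  # assumed nominal value

    def gravimetric_loss_ml(wet_g, dry_g, density=BLOOD_DENSITY_G_PER_ML):
        return (wet_g - dry_g) / density

    print(round(gravimetric_loss_ml(450.0, 243.0), 1))  # → 195.3
    ```

    The direct method needs no such conversion, since the calibrated drape reads volume directly; the assumed density is one place where the two protocols can diverge.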

  12. [Scraping technique of stuck needle at Anmian point in the treatment of insomnia: a randomized controlled trial].

    Science.gov (United States)

    Zhang, Quan-Ai; Sun, Xiao-Hui; Lin, Jia-Ju; Li, Xing-Ling

    2013-06-01

    To compare the efficacy in the treatment of insomnia between the scraping technique of stuck needle and conventional acupuncture at Anmian (Extra). One hundred and thirty-one cases were randomized into an Anmian group (68 cases) and a conventional acupuncture group (63 cases). In the Anmian group, Anmian (Extra) was selected. After arrival of qi, the stuck needling was adopted by rotating the needle gently in a single direction, 2-3 rounds, until the needle body was stuck tightly. Afterwards, the needle tail was touched gently with the index finger to fix the needle body, and the needle handle was scraped gently with the thumbnail from bottom to top. The needle was retained for 30 min. In the conventional acupuncture group, Sanyinjiao (SP 6), Shenmen (HT 7) and Baihui (GV 20) were selected and stimulated with reducing technique by rotating the needles. The needles were retained for 30 min. The treatment was given once every day, continuously for 2 weeks in both groups. The score of each factor and the total score in the Pittsburgh sleep quality index (PSQI) were assessed before and after treatment in the two groups, and the efficacies of the two groups were evaluated. In the conventional acupuncture group, sleep quality and time to fall asleep were improved after treatment compared with before treatment, while the scores for sleep efficiency, sleep disturbance, hypnotic drug use and daytime dysfunction, and the PSQI total score, did not present statistically significant differences (all P > 0.05). In the Anmian group, each factor score and the total score in PSQI were apparently improved after treatment compared with before treatment, and were superior to those in the conventional acupuncture group. The clinical efficacy in the Anmian group was apparently superior to that in the conventional acupuncture group. The scraping technique of stuck needle at Anmian (Extra) achieves a superior effect on insomnia as compared with conventional acupuncture.

  13. Randomized clinical trial comparing inguinal hernia repair with Lichtenstein technique using non-absorbable or partially absorbable mesh. Preliminary report

    Directory of Open Access Journals (Sweden)

    Konrad Pielaciński

    2011-12-01

    Full Text Available Introduction: The Lichtenstein technique is currently considered the “gold standard” of open, anterior inguinal hernia repair. It is not free, however, of adverse effects, which may be caused by the implanted synthetic material. Aim: To determine the influence of the mesh employed on treatment results, including immediate complications, return to everyday activities, chronic pain occurrence and hernia recurrence. Material and methods: Tension-free hernia repair using the Lichtenstein technique was performed in all 59 patients randomized to the trial groups. Group P, with heavyweight polypropylene mesh, contained 34 patients; group V, with lightweight, partially absorbable mesh (polypropylene/polyglactin 910), consisted of 25 people. Controlled, scheduled follow-up appointments took place after the 7th day and the 3rd and 6th month. Patients were clinically assessed and pain intensity was determined on an analogue-visual scale. Results: No statistically significant influence of the type of mesh on the risk of early complications, severe pain intensity, the length of hospital stay, time of recovery, or patients' satisfaction with treatment was observed. After 6 months, no statistically significant differences were observed between groups with regard to recurrence rate (P 3.4% vs. V 4.0%), chronic pain (P 5.9% vs. V 4.0%) or ailments such as "foreign body presence" (V vs. P, OR = 0.30, 95% CI 0.077-1.219, p = 0.093), although their probability was 70% lower for V mesh. Conclusions: The preliminary results confirm the effectiveness of the Lichtenstein technique for hernia repair with both types of meshes. It appears that use of a partially absorbable mesh is connected with

  14. Comparison of different sampling techniques and of different culture methods for detection of group B streptococcus carriage in pregnant women

    Directory of Open Access Journals (Sweden)

    Verhelst Rita

    2010-09-01

    Full Text Available Abstract Background Streptococcus agalactiae (group B streptococcus; GBS) is a significant cause of perinatal and neonatal infections worldwide. To detect GBS colonization in pregnant women, the CDC recommends isolation of the bacterium from vaginal and anorectal swab samples by growth in a selective enrichment medium, such as Lim broth (Todd-Hewitt broth supplemented with selective antibiotics), followed by subculture on sheep blood agar. However, this procedure may require 48 h to complete. We compared different sampling and culture techniques for the detection of GBS. Methods A total of 300 swabs were taken from 100 pregnant women at 35-37 weeks of gestation. For each subject, one rectovaginal, one vaginal and one rectal ESwab were collected. Plating onto Columbia CNA agar (CNA), group B streptococcus differential agar (GBSDA; Granada Medium) and chromID Strepto B agar (CA), with and without Lim broth enrichment, was compared. The isolates were confirmed as S. agalactiae using the CAMP test on blood agar and by molecular identification with tDNA-PCR or by 16S rRNA gene sequence determination. Results The overall GBS colonization rate was 22%. GBS positivity for rectovaginal sampling (100%) was significantly higher than detection on the basis of vaginal sampling (50%), but not significantly higher than for rectal sampling (82%). Direct plating of the rectovaginal swab on CNA, GBSDA and CA resulted in detection of 59%, 91% and 95% of the carriers, respectively, whereas subculturing of Lim broth yielded 77%, 95% and 100% positivity, respectively. Lim broth enrichment enabled the detection of only one additional GBS-positive subject. There was no significant difference between GBSDA and CA, whereas both were more sensitive than CNA. Direct culture onto GBSDA or CA (91% and 95%) detected more carriers than Lim broth enrichment and subculture onto CNA (77%). One false-negative isolate was observed on GBSDA, and three false positives on CA. Conclusions In
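The per-method detection percentages above follow directly from counts out of the 22 carriers (22% of 100 women). A short sketch of that tabulation; the counts are back-calculated from the reported percentages, so treat them as approximate rather than the study's raw data:

```python
# Detection rates per medium/method among the 22 GBS carriers.
# Counts are back-calculated from the percentages in the abstract
# (e.g. 59% of 22 -> 13 detected), so they are approximations.
CARRIERS = 22

detected = {
    ("CNA", "direct"): 13,
    ("GBSDA", "direct"): 20,
    ("CA", "direct"): 21,
    ("CNA", "Lim broth"): 17,
    ("GBSDA", "Lim broth"): 21,
    ("CA", "Lim broth"): 22,
}

for (medium, method), n in sorted(detected.items()):
    print(f"{medium:5s} {method:9s} {100 * n / CARRIERS:.0f}%")
```

Printing the table reproduces the abstract's figures (CNA direct 59%, CA with Lim broth 100%, and so on), which is why one extra carrier detected corresponds to a roughly 4.5-point jump in percentage.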

  15. Comparative study of manual liquid-based cytology (MLBC) technique and direct smear technique (conventional) on fine-needle cytology/fine-needle aspiration cytology samples

    OpenAIRE

    Prajkta Suresh Pawar; Rasika Uday Gadkari; Sunil Y Swami; Anil R Joshi

    2014-01-01

    Background: The liquid-based cytology technique enables cells to be suspended in a liquid medium and spread in a monolayer, allowing better morphological assessment. Automated techniques have been widely used, but their use is limited by cost and availability. Aim: The aim was to establish a manual liquid-based cytology (MLBC) technique on fine-needle aspiration cytology (FNAC) material and compare its results with the conventional technique. Materials and Methods: In this study, we examined cells trappe...

  16. Integrating silicon nanowire field effect transistor, microfluidics and air sampling techniques for real-time monitoring biological aerosols.

    Science.gov (United State