WorldWideScience

Sample records for random sampling technique

  1. Sample size estimation and sampling techniques for selecting a representative sample

    Directory of Open Access Journals (Sweden)

    Aamir Omair

    2014-01-01

    Introduction: The purpose of this article is to provide a general understanding of the concepts of sampling as applied to health-related research. Sample Size Estimation: It is important to select a representative sample in quantitative research in order to be able to generalize the results to the target population. The sample should be of the required sample size and must be selected using an appropriate probability sampling technique. There are many hidden biases which can adversely affect the outcome of the study. Important factors to consider for estimating the sample size include the size of the study population, the confidence level, the expected proportion of the outcome variable (for categorical variables) or the standard deviation of the outcome variable (for numerical variables), and the required precision (margin of accuracy) of the study. The greater the required precision, the larger the required sample size. Sampling Techniques: The probability sampling techniques applied in health-related research include simple random sampling, systematic random sampling, stratified random sampling, cluster sampling, and multistage sampling. These are recommended over nonprobability sampling techniques, because the results of the study can then be generalized to the target population.
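
    The ingredients the record lists (confidence level, expected proportion, precision, population size) combine in the standard formula n = z^2 * p * (1 - p) / d^2, with a finite population correction. A minimal sketch of this calculation, not taken from the article itself:

    ```python
    import math

    def sample_size_proportion(p, d, confidence_z=1.96, population=None):
        """Required sample size for estimating a proportion.

        p: expected proportion of the outcome variable
        d: required precision (margin of error)
        confidence_z: z-score for the confidence level (1.96 ~ 95%)
        population: study population size, for the finite population correction
        """
        n = (confidence_z ** 2) * p * (1 - p) / (d ** 2)
        if population is not None:
            n = n / (1 + (n - 1) / population)  # finite population correction
        return math.ceil(n)

    # Example: expected proportion 30%, +/-5% precision, 95% confidence,
    # study population of 2000 -> n comes out around 278.
    print(sample_size_proportion(0.30, 0.05, population=2000))
    ```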

  2. An unbiased estimator of the variance of simple random sampling using mixed random-systematic sampling

    OpenAIRE

    Padilla, Alberto

    2009-01-01

    Systematic sampling is a commonly used technique due to its simplicity and ease of implementation. The drawback of this simplicity is that it is not possible to estimate the design variance without bias. There are several ways to circumvent this problem. One method is to suppose that the variable of interest has a random order in the population, so the sample variance of simple random sampling without replacement is used. By means of a mixed random-systematic sample, an unbiased estimator o...

  3. Independent random sampling methods

    CERN Document Server

    Martino, Luca; Míguez, Joaquín

    2018-01-01

    This book systematically addresses the design and analysis of efficient techniques for independent random sampling. Both general-purpose approaches, which can be used to generate samples from arbitrary probability distributions, and tailored techniques, designed to efficiently address common real-world practical problems, are introduced and discussed in detail. In turn, the monograph presents fundamental results and methodologies in the field, elaborating and developing them into the latest techniques. The theory and methods are illustrated with a varied collection of examples, which are discussed in detail in the text and supplemented with ready-to-run computer code. The main problem addressed in the book is how to generate independent random samples from an arbitrary probability distribution with the weakest possible constraints or assumptions in a form suitable for practical implementation. The authors review the fundamental results and methods in the field, address the latest methods, and emphasize the li...
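
    One of the general-purpose approaches alluded to here is rejection sampling, which draws independent samples from an arbitrary target density using only a dominating proposal. A minimal sketch, illustrative rather than the book's own code:

    ```python
    import random

    def rejection_sample(target_pdf, proposal_sample, proposal_pdf, M):
        """Draw one independent sample from target_pdf by rejection sampling.

        Requires target_pdf(x) <= M * proposal_pdf(x) for all x.
        """
        while True:
            x = proposal_sample()              # candidate from the proposal
            u = random.random()
            if u * M * proposal_pdf(x) <= target_pdf(x):
                return x                       # accept; otherwise retry

    # Example: sample from f(x) = 2x on [0, 1] with a uniform proposal (M = 2).
    samples = [rejection_sample(lambda x: 2 * x,
                                random.random,
                                lambda x: 1.0,
                                M=2.0) for _ in range(1000)]
    print(sum(samples) / len(samples))  # should be close to E[X] = 2/3
    ```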

  4. Generation and Analysis of Constrained Random Sampling Patterns

    DEFF Research Database (Denmark)

    Pierzchlewski, Jacek; Arildsen, Thomas

    2016-01-01

    Random sampling is a technique for signal acquisition which is gaining popularity in practical signal processing systems. Nowadays, event-driven analog-to-digital converters make random sampling feasible in practical applications. A process of random sampling is defined by a sampling pattern, which indicates the signal sampling points in time. Practical random sampling patterns are constrained by ADC characteristics and application requirements. In this paper, we introduce statistical methods which evaluate random sampling pattern generators with emphasis on practical applications. Furthermore, we propose an algorithm which generates random sampling patterns dedicated to event-driven ADCs better than existing sampling pattern generators. Finally, implementation issues of random sampling patterns are discussed.
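
    As an illustration of the kind of constraint such a generator must respect, the sketch below draws a random sampling pattern on a time grid subject to a minimum spacing between sampling points (a typical ADC limitation). The paper's actual constraints and generator algorithm are not reproduced here:

    ```python
    import random

    def constrained_pattern(n_grid, n_samples, min_gap):
        """Draw sampling instants uniformly at random from a time grid,
        subject to a minimum spacing constraint (e.g. an ADC's minimum
        time between conversions). Simple accept/retry approach."""
        while True:
            pattern = sorted(random.sample(range(n_grid), n_samples))
            gaps = [b - a for a, b in zip(pattern, pattern[1:])]
            if all(g >= min_gap for g in gaps):
                return pattern

    # Example: 16 sampling points on a 256-point grid, at least 4 grid
    # periods apart.
    print(constrained_pattern(256, 16, 4))
    ```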

  5. Urine sampling techniques in symptomatic primary-care patients

    DEFF Research Database (Denmark)

    Holm, Anne; Aabenhus, Rune

    2016-01-01

    Background: Choice of urine sampling technique in urinary tract infection may impact diagnostic accuracy and thus lead to possible over- or undertreatment. Currently no evidence-based consensus exists regarding the correct sampling technique for urine from women with symptoms of urinary tract infection. ... a randomized or paired design to compare the result of urine culture obtained with two or more collection techniques in adult, female, non-pregnant patients with symptoms of urinary tract infection. We evaluated the quality of the studies and compared accuracy based on dichotomized outcomes. Results: We included ... in infection rate between mid-stream-clean-catch, mid-stream-urine and random samples. Conclusions: At present, no evidence suggests that sampling technique affects the accuracy of the microbiological diagnosis in non-pregnant women with symptoms of urinary tract infection in primary care. However ...

  6. MUP, CEC-DES, STRADE. Codes for uncertainty propagation, experimental design and stratified random sampling techniques

    International Nuclear Information System (INIS)

    Amendola, A.; Astolfi, M.; Lisanti, B.

    1983-01-01

    The report describes how to use the codes: MUP (Monte Carlo Uncertainty Propagation) for uncertainty analysis by Monte Carlo simulation, including correlation analysis, extreme value identification and study of selected ranges of the variable space; CEC-DES (Central Composite Design) for building experimental matrices according to the requirements of Central Composite and Factorial Experimental Designs; and STRADE (Stratified Random Design) for experimental designs based on Latin Hypercube Sampling techniques. Application fields of the codes are probabilistic risk assessment, experimental design, sensitivity analysis and system identification problems.

  7. Comparison of sampling techniques for use in SYVAC

    International Nuclear Information System (INIS)

    Dalrymple, G.J.

    1984-01-01

    The Stephen Howe review (reference TR-STH-1) recommended the use of a deterministic generator (DG) sampling technique for sampling the input values to the SYVAC (SYstems Variability Analysis Code) program. This technique was compared with Monte Carlo simple random sampling (MC) by taking a 1000-run MC case of SYVAC as the reference. The results show that DG appears relatively inaccurate for most values of consequence when used with 11 sample intervals. If 22 sample intervals are used, DG generates cumulative distribution functions that are statistically similar to the reference distribution. 400 runs of DG or MC are adequate to generate a representative cumulative distribution function. The MC technique appears to perform better than DG for the same number of runs. However, DG predicts higher doses, and in view of the importance of generating data in the high dose region, this sampling technique with 22 sample intervals is recommended for use in SYVAC. (author)

  8. BWIP-RANDOM-SAMPLING, Random Sample Generation for Nuclear Waste Disposal

    International Nuclear Information System (INIS)

    Sagar, B.

    1989-01-01

    1 - Description of program or function: Random samples for different distribution types are generated. The distribution types required for performance assessment modeling of geologic nuclear waste disposal are provided: uniform, log-uniform (base 10 or natural), normal, lognormal (base 10 or natural), exponential, Bernoulli, and user-defined continuous distributions. 2 - Method of solution: A linear congruential generator is used for uniform random numbers. A set of functions is used to transform the uniform distribution to the other distributions. Stratified, rather than random, sampling can be chosen. Truncation limits can be specified for many of the distributions whose usual definition has infinite support. 3 - Restrictions on the complexity of the problem: Generation of correlated random variables is not included.
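
    A minimal sketch of the two ingredients the record names, a linear congruential generator for uniform variates plus transformations to other distributions. The LCG constants and transforms below are illustrative assumptions; the BWIP code's actual choices are not given in the record:

    ```python
    import math

    class LCG:
        """Linear congruential generator for uniform variates (constants
        from Numerical Recipes, assumed here for illustration)."""
        def __init__(self, seed=12345):
            self.state = seed

        def uniform(self):
            self.state = (1664525 * self.state + 1013904223) % 2**32
            return self.state / 2**32

    def exponential(u, rate):
        """Transform a uniform variate to an exponential one (inverse CDF)."""
        return -math.log(1.0 - u) / rate

    def log_uniform(u, low, high):
        """Transform a uniform variate to a log-uniform (base 10) one."""
        return 10 ** (math.log10(low) + u * (math.log10(high) - math.log10(low)))

    rng = LCG(seed=42)
    print([round(exponential(rng.uniform(), rate=2.0), 3) for _ in range(5)])
    print([round(log_uniform(rng.uniform(), 1e-3, 1e2), 5) for _ in range(5)])
    ```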

  9. Randomized branch sampling to estimate fruit production in Pecan trees cv. ‘Barton’

    Directory of Open Access Journals (Sweden)

    Filemom Manoel Mokochinski

    ABSTRACT: Sampling techniques to quantify the production of fruits are still very scarce and create a gap in crop development research. This study was conducted in a rural property in the county of Cachoeira do Sul - RS to estimate the efficiency of randomized branch sampling (RBS) in quantifying the production of pecan fruit at three different ages (5, 7 and 10 years). Two selection techniques were tested: the probability proportional to the diameter (PPD) and the uniform probability (UP) techniques, which were performed on nine trees, three from each age and randomly chosen. The RBS underestimated fruit production for all ages, and its main drawback was the high sampling error (125.17% - PPD and 111.04% - UP). The UP was regarded as more efficient than the PPD, though both techniques estimated similar production and similar experimental errors. In conclusion, we report that branch sampling was inaccurate for this case study, requiring new studies to produce estimates with smaller sampling error.

  10. A Bayesian Justification for Random Sampling in Sample Survey

    Directory of Open Access Journals (Sweden)

    Glen Meeden

    2012-07-01

    In the usual Bayesian approach to survey sampling, the sampling design plays a minimal role, at best. Although a close relationship between exchangeable prior distributions and simple random sampling has been noted, how to formally integrate simple random sampling into the Bayesian paradigm is not clear. Recently it has been argued that the sampling design can be thought of as part of a Bayesian's prior distribution. We show here that under this scenario simple random sampling can be given a Bayesian justification in survey sampling.

  11. Landslide Susceptibility Assessment Using Frequency Ratio Technique with Iterative Random Sampling

    Directory of Open Access Journals (Sweden)

    Hyun-Joo Oh

    2017-01-01

    This paper assesses the performance of landslide susceptibility analysis using the frequency ratio (FR) method with iterative random sampling. A pair of before-and-after digital aerial photographs with 50 cm spatial resolution was used to detect landslide occurrences in the Yongin area, Korea. Iterative random sampling was run ten times in total, and each time it was applied to the training and validation datasets. Thirteen landslide causative factors were derived from the topographic, soil, forest, and geological maps. The FR scores were calculated from the causative factors and training occurrences, repeated ten times. Ten landslide susceptibility maps were obtained by integrating the causative factors with their assigned FR scores. The landslide susceptibility maps were validated using each validation dataset. The FR method achieved susceptibility accuracies from 89.48% to 93.21%, i.e., consistently above 89%. Moreover, the ten-fold iterative FR modeling may contribute to a better understanding of a regularized relationship between the causative factors and landslide susceptibility. This makes it possible to incorporate knowledge-driven considerations of the causative factors into the landslide susceptibility analysis, and the approach can be extended to other areas.

  12. Systematic random sampling of the comet assay.

    Science.gov (United States)

    McArt, Darragh G; Wasson, Gillian R; McKerr, George; Saetzler, Kurt; Reed, Matt; Howard, C Vyvyan

    2009-07-01

    The comet assay is a technique used to quantify DNA damage and repair at a cellular level. In the assay, cells are embedded in agarose and the cellular content is stripped away leaving only the DNA trapped in an agarose cavity which can then be electrophoresed. The damaged DNA can enter the agarose and migrate while the undamaged DNA cannot and is retained. DNA damage is measured as the proportion of the migratory 'tail' DNA compared to the total DNA in the cell. The fundamental basis of these arbitrary values is obtained in the comet acquisition phase using fluorescence microscopy with a stoichiometric stain in tandem with image analysis software. Comets acquired with current methods are expected to be selected both objectively and at random. In this paper we examine the 'randomness' of the acquisition phase and suggest an alternative method that offers both objective and unbiased comet selection. To achieve this, we have adopted a survey sampling approach widely used in stereology, which offers a method of systematic random sampling (SRS). This is desirable as it offers an impartial and reproducible method of comet analysis that can be used either manually or in automated form. By making use of an unbiased sampling frame and using microscope verniers, we are able to increase the precision of estimates of DNA damage. Results obtained from a multiple-user pooled variation experiment showed that the SRS technique attained a lower variability than the traditional approach. The single-user repetition experiment showed greater individual variances while not being detrimental to overall averages. This suggests that the SRS method offers a better reflection of DNA damage for a given slide and also offers better user reproducibility.
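
    The core of SRS, a single random start followed by equidistant positions, can be sketched as follows (an illustration of the principle, not the authors' implementation):

    ```python
    import random

    def systematic_random_positions(extent, n_fields):
        """Systematic random sampling of field positions along one axis:
        a fixed sampling interval with a single random start offset."""
        step = extent / n_fields
        start = random.uniform(0, step)   # random offset in the first interval
        return [start + i * step for i in range(n_fields)]

    # Example: 10 microscope fields across a 20 mm slide axis.
    print([round(x, 2) for x in systematic_random_positions(20.0, 10)])
    ```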

  13. The Dirichlet-Multinomial model for multivariate randomized response data and small samples

    NARCIS (Netherlands)

    Avetisyan, Marianna; Fox, Gerardus J.A.

    2012-01-01

    In survey sampling the randomized response (RR) technique can be used to obtain truthful answers to sensitive questions. Although the individual answers are masked due to the RR technique, individual (sensitive) response rates can be estimated when observing multivariate response data. The beta-binomial model for binary RR data will be generalized…

  14. Misrepresenting random sampling? A systematic review of research papers in the Journal of Advanced Nursing.

    Science.gov (United States)

    Williamson, Graham R

    2003-11-01

    This paper discusses the theoretical limitations of the use of random sampling and probability theory in the production of a significance level (or P-value) in nursing research. Potential alternatives, in the form of randomization tests, are proposed. Research papers in nursing, medicine and psychology frequently misrepresent their statistical findings, as the P-values reported assume random sampling. In this systematic review of studies published between January 1995 and June 2002 in the Journal of Advanced Nursing, 89 (68%) studies broke this assumption because they used convenience samples or entire populations. As a result, some of the findings may be questionable. The key ideas of random sampling and probability theory for statistical testing (for generating a P-value) are outlined. The result of a systematic review of research papers published in the Journal of Advanced Nursing is then presented, showing how frequently random sampling appears to have been misrepresented. Useful alternative techniques that might overcome these limitations are then discussed. REVIEW LIMITATIONS: This review is limited in scope because it is applied to one journal, and so the findings cannot be generalized to other nursing journals or to nursing research in general. However, it is possible that other nursing journals are also publishing research articles based on the misrepresentation of random sampling. The review is also limited because in several of the articles the sampling method was not completely clearly stated, and in this circumstance a judgment has been made as to the sampling method employed, based on the indications given by author(s). Quantitative researchers in nursing should be very careful that the statistical techniques they use are appropriate for the design and sampling methods of their studies. If the techniques they employ are not appropriate, they run the risk of misinterpreting findings by using inappropriate, unrepresentative and biased samples.

  15. Systematic versus random sampling in stereological studies.

    Science.gov (United States)

    West, Mark J

    2012-12-01

    The sampling that takes place at all levels of an experimental design must be random if the estimate is to be unbiased in a statistical sense. There are two fundamental ways by which one can make a random sample of the sections and positions to be probed on the sections. Using a card-sampling analogy, one can pick any card at all out of a deck of cards. This is referred to as independent random sampling because the sampling of any one card is made without reference to the position of the other cards. The other approach to obtaining a random sample would be to pick a card within a set number of cards and others at equal intervals within the deck. Systematic sampling along one axis of many biological structures is more efficient than random sampling, because most biological structures are not randomly organized. This article discusses the merits of systematic versus random sampling in stereological studies.

  16. k-Means: Random Sampling Procedure

    Indian Academy of Sciences (India)

    k-Means: Random Sampling Procedure. The optimal 1-mean can be approximated by the centroid of a random sample (Inaba et al.): if S is a random sample of size O(1/ε), then the centroid of S is a (1+ε)-approximate centroid of P with constant probability.
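
    A minimal sketch of the stated result, the centroid of a random sample of size O(1/ε) as an approximate 1-mean. This is one illustrative reading of the slide, not its original code:

    ```python
    import random

    def approx_centroid(points, eps):
        """Centroid of a random sample of size about 1/eps, a (1+eps)-
        approximate 1-mean of the full set with constant probability,
        in the spirit of the Inaba et al. result cited on the slide."""
        m = max(1, int(1.0 / eps))
        sample = random.sample(points, min(m, len(points)))
        dim = len(points[0])
        return tuple(sum(p[d] for p in sample) / len(sample) for d in range(dim))

    def cost(points, c):
        """1-mean cost: sum of squared distances to the candidate centre."""
        return sum(sum((p[d] - c[d]) ** 2 for d in range(len(c))) for p in points)

    pts = [(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(10000)]
    true_c = tuple(sum(p[d] for p in pts) / len(pts) for d in range(2))
    appr_c = approx_centroid(pts, eps=0.05)
    print(cost(pts, appr_c) / cost(pts, true_c))  # typically below 1 + eps
    ```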

  17. Random sampling technique for ultra-fast computations of molecular opacities for exoplanet atmospheres

    Science.gov (United States)

    Min, M.

    2017-10-01

    Context. Opacities of molecules in exoplanet atmospheres rely on increasingly detailed line-lists for these molecules. The line lists available today contain for many species up to several billions of lines. Computation of the spectral line profile created by pressure and temperature broadening, the Voigt profile, of all of these lines is becoming a computational challenge. Aims: We aim to create a method to compute the Voigt profile in a way that automatically focusses the computation time into the strongest lines, while still maintaining the continuum contribution of the high number of weaker lines. Methods: Here, we outline a statistical line sampling technique that samples the Voigt profile quickly and with high accuracy. The number of samples is adjusted to the strength of the line and the local spectral line density. This automatically provides high accuracy line shapes for strong lines or lines that are spectrally isolated. The line sampling technique automatically preserves the integrated line opacity for all lines, thereby also providing the continuum opacity created by the large number of weak lines at very low computational cost. Results: The line sampling technique is tested for accuracy when computing line spectra and correlated-k tables. Extremely fast computations (~3.5 × 10^5 lines per second per core on a standard current day desktop computer) with high accuracy (≤1% almost everywhere) are obtained. A detailed recipe on how to perform the computations is given.

  18. The Dirichlet-Multinomial Model for Multivariate Randomized Response Data and Small Samples

    Science.gov (United States)

    Avetisyan, Marianna; Fox, Jean-Paul

    2012-01-01

    In survey sampling the randomized response (RR) technique can be used to obtain truthful answers to sensitive questions. Although the individual answers are masked due to the RR technique, individual (sensitive) response rates can be estimated when observing multivariate response data. The beta-binomial model for binary RR data will be generalized…

  19. Statistical Theory of the Vector Random Decrement Technique

    DEFF Research Database (Denmark)

    Asmussen, J. C.; Brincker, Rune; Ibrahim, S. R.

    1999-01-01

    ... decays. Due to the speed and/or accuracy of the Vector Random Decrement technique, it was introduced as an attractive alternative to the Random Decrement technique. In this paper, the theory of the Vector Random Decrement technique is extended by applying a statistical description of the stochastic ...

  20. Sampling problems for randomly broken sticks

    Energy Technology Data Exchange (ETDEWEB)

    Huillet, Thierry [Laboratoire de Physique Theorique et Modelisation, CNRS-UMR 8089 et Universite de Cergy-Pontoise, 5 mail Gay-Lussac, 95031, Neuville sur Oise (France)

    2003-04-11

    Consider the random partitioning model of a population (represented by a stick of length 1) into n species (fragments) with identically distributed random weights (sizes). Upon ranking the fragments' weights according to ascending sizes, let S_{m:n} be the size of the mth smallest fragment. Assume that some observer is sampling such populations as follows: drop at random k points (the sample size) onto this stick and record the corresponding numbers of visited fragments. We shall investigate the following sampling problems: (1) what is the sample size if the sampling is carried out until the first visit of the smallest fragment (size S_{1:n})? (2) For a given sample size, have all the fragments of the stick been visited at least once or not? This question is related to Feller's random coupon collector problem. (3) In what order are new fragments being discovered and what is the random number of samples separating the discovery of consecutive new fragments until exhaustion of the list? For this problem, the distribution of the size-biased permutation of the species' weights, as the sequence of their weights in their order of appearance is needed and studied.
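
    Sampling problem (1) can be simulated directly: break the stick, then drop points until the smallest fragment is first visited. A minimal sketch; the uniform-cuts partition below is one convenient choice of identically distributed weights, not necessarily the model analyzed in the paper:

    ```python
    import random

    def random_partition(n):
        """Break the unit stick into n fragments via n - 1 uniform cuts."""
        cuts = sorted(random.random() for _ in range(n - 1))
        edges = [0.0] + cuts + [1.0]
        return [b - a for a, b in zip(edges, edges[1:])]

    def samples_until_smallest(fragments):
        """Number of uniform sample points dropped on the stick until the
        smallest fragment is visited for the first time."""
        cumulative, acc = [], 0.0
        for w in fragments:
            acc += w
            cumulative.append(acc)
        smallest = min(range(len(fragments)), key=lambda i: fragments[i])
        k = 0
        while True:
            k += 1
            u = random.random()
            hit = next((i for i, e in enumerate(cumulative) if u <= e),
                       len(fragments) - 1)
            if hit == smallest:
                return k

    trials = [samples_until_smallest(random_partition(10)) for _ in range(200)]
    print(sum(trials) / len(trials))  # mean waiting time for the smallest fragment
    ```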

  1. Application of bias factor method using random sampling technique for prediction accuracy improvement of critical eigenvalue of BWR

    International Nuclear Information System (INIS)

    Ito, Motohiro; Endo, Tomohiro; Yamamoto, Akio; Kuroda, Yusuke; Yoshii, Takashi

    2017-01-01

    The bias factor method based on the random sampling technique is applied to the benchmark problem of Peach Bottom Unit 2. The validity and applicability of the method, i.e. the correction of calculation results and the reduction of their uncertainty, are confirmed, along with its features and performance. In the present study, core characteristics in cycle 3 are corrected with the proposed method using predicted and 'measured' critical eigenvalues in cycles 1 and 2. As the source of uncertainty, the variance-covariance of the cross sections is considered. The calculation results indicate that the bias between predicted and measured results, and the uncertainty owing to the cross sections, can be reduced. Extension to other uncertainties such as thermal-hydraulic properties will be a future task. (author)

  2. STATISTICAL LANDMARKS AND PRACTICAL ISSUES REGARDING THE USE OF SIMPLE RANDOM SAMPLING IN MARKET RESEARCHES

    Directory of Open Access Journals (Sweden)

    CODRUŢA DURA

    2010-01-01

    The sample represents a particular segment of the statistical population chosen to represent it as a whole. The representativeness of the sample determines the accuracy for estimations made on the basis of calculating the research indicators and the inferential statistics. The method of random sampling is part of probabilistic methods which can be used within marketing research and it is characterized by the fact that it imposes the requirement that each unit belonging to the statistical population should have an equal chance of being selected for the sampling process. When simple random sampling is meant to be rigorously put into practice, it is recommended to use the technique of random number tables in order to configure the sample which will provide the information that the marketer needs. The paper also details the practical procedure implemented in order to create a sample for a marketing research by generating random numbers using the facilities offered by Microsoft Excel.

  3. Improved importance sampling technique for efficient simulation of digital communication systems

    Science.gov (United States)

    Lu, Dingqing; Yao, Kung

    1988-01-01

    A new, improved importance sampling (IIS) approach to simulation is considered. Some basic concepts of IS are introduced, and detailed evaluations of simulation estimation variances for Monte Carlo (MC) and IS simulations are given. The general results obtained from these evaluations are applied to the specific previously known conventional importance sampling (CIS) technique and the new IIS technique. The derivation for a linear system with no signal random memory is considered in some detail. For the CIS technique, the optimum input scaling parameter is found, while for the IIS technique, the optimum translation parameter is found. The results are generalized to a linear system with memory and signals. Specific numerical and simulation results are given which show the advantages of CIS over MC and IIS over CIS for simulations of digital communications systems.
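
    The benefit of IS over plain Monte Carlo is easiest to see on a toy error-probability problem: estimating the tiny tail probability P(X > T) of Gaussian noise. The sketch below uses biasing by translation, the flavour of biasing associated here with IIS; it is an illustration, not the paper's system model:

    ```python
    import math
    import random

    T = 5.0        # decision threshold; errors are the rare event {X > T}
    N = 100000
    true_p = 0.5 * math.erfc(T / math.sqrt(2))     # exact Gaussian tail

    # Plain Monte Carlo almost never observes an error at this threshold.
    mc = sum(random.gauss(0, 1) > T for _ in range(N)) / N

    # Importance sampling by translation: draw from N(T, 1) and reweight
    # by the likelihood ratio phi(x) / phi(x - T) = exp(-T*x + T*T/2).
    acc = 0.0
    for _ in range(N):
        x = random.gauss(T, 1)
        if x > T:
            acc += math.exp(-T * x + T * T / 2)
    is_estimate = acc / N

    print(true_p, mc, is_estimate)  # IS recovers the ~2.9e-7 tail; MC gives 0
    ```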

  4. Estimation of Sensitive Proportion by Randomized Response Data in Successive Sampling

    Directory of Open Access Journals (Sweden)

    Bo Yu

    2015-01-01

    This paper considers the problem of estimating binomial proportions of sensitive or stigmatizing attributes in the population of interest. Randomized response techniques are suggested for protecting the privacy of respondents and reducing the response bias while eliciting information on sensitive attributes. In many sensitive question surveys, the same population is often sampled repeatedly on each occasion. In this paper, we apply a successive sampling scheme to improve the estimation of the sensitive proportion on the current occasion.
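
    The simplest randomized response scheme of this kind is Warner's model: each respondent answers either the sensitive statement (with probability p) or its negation, so the interviewer never learns which, and the sensitive proportion is recovered from the observed 'yes' rate. A minimal single-occasion sketch; the paper's successive sampling refinement is not reproduced:

    ```python
    import random

    def warner_estimate(answers, p):
        """Warner's randomized response estimator of a sensitive proportion.
        Each respondent answered the sensitive statement with probability p
        and its negation with probability 1 - p; only 'yes'/'no' is observed.
        Observed rate lam = p*pi + (1-p)*(1-pi), so pi = (lam + p - 1)/(2p - 1).
        """
        lam = sum(answers) / len(answers)     # observed 'yes' rate
        return (lam + p - 1) / (2 * p - 1)

    def simulate_respondent(pi, p):
        sensitive = random.random() < pi      # true (hidden) status
        direct = random.random() < p          # which statement was drawn
        return sensitive if direct else not sensitive

    p, pi = 0.7, 0.2
    answers = [simulate_respondent(pi, p) for _ in range(20000)]
    print(warner_estimate(answers, p))  # close to the true proportion 0.2
    ```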

  5. Application of a stratified random sampling technique to the estimation and minimization of respirable quartz exposure to underground miners

    International Nuclear Information System (INIS)

    Makepeace, C.E.; Horvath, F.J.; Stocker, H.

    1981-11-01

    The aim of a stratified random sampling plan is to provide the best estimate (in the absence of full-shift personal gravimetric sampling) of personal exposure to respirable quartz among underground miners. One also gains information on the exposure distribution of all the miners at the same time. Three variables (or strata) are considered in the present scheme: locations, occupations and times of sampling. Random sampling within each stratum ensures that each location, occupation and time of sampling has an equal opportunity of being selected without bias. Following implementation of the plan and analysis of the collected data, one can determine the individual exposures and the mean. This information can then be used to identify those groups whose exposure contributes significantly to the collective exposure. In turn, this identification, along with other considerations, allows the mine operator to carry out a cost-benefit optimization and eventual implementation of engineering controls for these groups. This optimization and engineering control procedure, together with the random sampling plan, can then be used in an iterative manner to minimize the mean of the distribution and the collective exposures.

  6. The electron transport problem sampling by Monte Carlo individual collision technique

    International Nuclear Information System (INIS)

    Androsenko, P.A.; Belousov, V.I.

    2005-01-01

    The problem of electron transport is of great interest in many fields of modern science. To solve this problem, Monte Carlo sampling has to be used. Electron transport is characterized by a large number of individual interactions. To simulate electron transport, the 'condensed history' technique may be used, where a large number of collisions are grouped into a single step to be sampled randomly. Another kind of Monte Carlo sampling is the individual collision technique, which offers the researcher incontestable advantages over the condensed history technique. For example, one does not need to supply the parameters required by the condensed history technique, such as the upper limit for the electron energy, the resolution, the number of sub-steps, etc. The condensed history technique may also lose some very important electron tracks, because it is limited by the step parameters of particle movement and by weaknesses of its algorithms, for example the energy indexing algorithm. The individual collision technique has none of these disadvantages. This report presents some sampling algorithms of the new version of the BRAND code, in which the above-mentioned technique is used. All information on electrons was taken from ENDF-6 files, which are an important part of BRAND; these files have not been processed but were taken directly from the electron information source. Four kinds of interaction were considered: elastic interaction, Bremsstrahlung, atomic excitation and atomic electro-ionization. Some sampling results are presented and compared with analogous techniques. For example, the endovascular radiotherapy problem (P2) of QUADOS 2002 is presented in comparison with other techniques that are usually used. (authors)

  7. Random On-Board Pixel Sampling (ROPS) X-Ray Camera

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Zhehui [Los Alamos; Iaroshenko, O. [Los Alamos; Li, S. [Los Alamos; Liu, T. [Fermilab; Parab, N. [Argonne (main); Chen, W. W. [Purdue U.; Chu, P. [Los Alamos; Kenyon, G. [Los Alamos; Lipton, R. [Fermilab; Sun, K.-X. [Nevada U., Las Vegas

    2017-09-25

    Recent advances in compressed sensing theory and algorithms offer new possibilities for high-speed X-ray camera design. In many CMOS cameras, each pixel has an independent on-board circuit that includes an amplifier, noise rejection, signal shaper, an analog-to-digital converter (ADC), and optional in-pixel storage. When X-ray images are sparse, i.e., when one of the following cases is true: (a) the number of pixels with true X-ray hits is much smaller than the total number of pixels; (b) the X-ray information is redundant; or (c) some prior knowledge about the X-ray images exists, sparse sampling may be allowed. Here we first illustrate the feasibility of random on-board pixel sampling (ROPS) using an existing set of X-ray images, followed by a discussion of the signal-to-noise ratio as a function of pixel size. Next, we describe a possible circuit architecture to achieve random pixel access and in-pixel storage. The combination of a multilayer architecture, sparse on-chip sampling, and computational image techniques is expected to facilitate the development and applications of high-speed X-ray camera technology.

  8. An efficient method of randomly sampling the coherent angular scatter distribution

    International Nuclear Information System (INIS)

    Williamson, J.F.; Morin, R.L.

    1983-01-01

    Monte Carlo simulations of photon transport phenomena require random selection of an interaction process at each collision site along the photon track. Possible choices are usually limited to photoelectric absorption and incoherent scatter as approximated by the Klein-Nishina distribution. A technique is described for sampling the coherent angular scatter distribution, for the benefit of workers in medical physics. (U.K.)

  9. The electron transport problem sampling by Monte Carlo individual collision technique

    Energy Technology Data Exchange (ETDEWEB)

    Androsenko, P.A.; Belousov, V.I. [Obninsk State Technical Univ. of Nuclear Power Engineering, Kaluga region (Russian Federation)

    2005-07-01

    The problem of electron transport is of great interest in many fields of modern science. To solve this problem, Monte Carlo sampling has to be used. Electron transport is characterized by a large number of individual interactions. To simulate electron transport, the 'condensed history' technique may be used, where a large number of collisions are grouped into a single step to be sampled randomly. Another kind of Monte Carlo sampling is the individual collision technique, which offers the researcher incontestable advantages over the condensed history technique. For example, one does not need to supply the parameters required by the condensed history technique, such as the upper limit for the electron energy, the resolution, the number of sub-steps, etc. The condensed history technique may also lose some very important electron tracks, because it is limited by the step parameters of particle movement and by weaknesses of its algorithms, for example the energy indexing algorithm. The individual collision technique has none of these disadvantages. This report presents some sampling algorithms of the new version of the BRAND code, in which the above-mentioned technique is used. All information on electrons was taken from ENDF-6 files, which are an important part of BRAND; these files have not been processed but were taken directly from the electron information source. Four kinds of interaction were considered: elastic interaction, Bremsstrahlung, atomic excitation and atomic electro-ionization. Some sampling results are presented and compared with analogous techniques. For example, the endovascular radiotherapy problem (P2) of QUADOS 2002 is presented in comparison with other techniques that are usually used. (authors)

  10. Two sampling techniques for game meat

    OpenAIRE

    van der Merwe, Maretha; Jooste, Piet J.; Hoffman, Louw C.; Calitz, Frikkie J.

    2013-01-01

    A study was conducted to compare the excision sampling technique used by the export market and the sampling technique preferred by European countries, namely the biotrace cattle and swine test. The measuring unit for the excision sampling was grams (g) and square centimetres (cm2) for the swabbing technique. The two techniques were compared after a pilot test was conducted on spiked approved beef carcasses (n = 12) that statistically proved the two measuring units correlated. The two sampling...

  11. ANALYSIS OF MONTE CARLO SIMULATION SAMPLING TECHNIQUES ON SMALL SIGNAL STABILITY OF WIND GENERATOR-CONNECTED POWER SYSTEM

    Directory of Open Access Journals (Sweden)

    TEMITOPE RAPHAEL AYODELE

    2016-04-01

    Monte Carlo simulation using the Simple Random Sampling (SRS) technique is popularly known for its ability to handle complex uncertainty problems. However, to produce a reasonable result, it requires a huge sample size. This makes it computationally expensive, time consuming and unfit for online power system applications. In this article, the performance of the Latin Hypercube Sampling (LHS) technique is explored and compared with SRS in terms of accuracy, robustness and speed for a small signal stability application in a wind generator-connected power system. The analysis is performed using probabilistic techniques via eigenvalue analysis on two standard networks (Single Machine Infinite Bus and the IEEE 16-machine 68-bus test system). The accuracy of the two sampling techniques is determined by comparing their different sample sizes with the IDEAL (conventional) result. The robustness is determined from the variance reduction when the experiment is repeated 100 times with different sample sizes using the two sampling techniques in turn. The results show that sample sizes generated with LHS produce the same result as the IDEAL values starting from a sample size of 100. This shows that about 100 samples of the random variables generated using the LHS method are good enough to produce reasonable results for practical purposes in small signal stability applications. It is also revealed that LHS has the least variance when the experiment is repeated 100 times, which signifies the robustness of LHS over SRS. An LHS sample of size 100 produces the same result as the conventional method with a sample size of 50,000. The reduced sample size required by LHS gives it a computational speed advantage (about six times) over the conventional method.
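
    The contrast between the two sampling techniques can be reproduced in one dimension: LHS draws one variate from each of n equiprobable strata, which sharply reduces the variance of the resulting estimates relative to SRS. A minimal sketch, not the article's power system experiment:

    ```python
    import random

    def srs(n):
        """Simple random sampling of n uniform variates."""
        return [random.random() for _ in range(n)]

    def lhs(n):
        """Latin hypercube sampling in one dimension: one variate drawn
        from each of n equiprobable strata, in random order."""
        cells = list(range(n))
        random.shuffle(cells)
        return [(c + random.random()) / n for c in cells]

    def variance_of_mean(sampler, n, repeats=100):
        means = [sum(sampler(n)) / n for _ in range(repeats)]
        mu = sum(means) / repeats
        return sum((m - mu) ** 2 for m in means) / repeats

    # The LHS estimate of the mean is markedly less variable than SRS at
    # the same sample size, mirroring the robustness result in the record.
    print(variance_of_mean(srs, 100), variance_of_mean(lhs, 100))
    ```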

  12. Estimation of Correlation Functions by the Random DEC Technique

    DEFF Research Database (Denmark)

    Brincker, Rune; Krenk, Steen; Jensen, Jakob Laigaard

    The Random Dec Technique is a versatile technique for characterization of random signals in the time domain. In this paper a short review of the most important properties of the technique is given. The review is mainly based on recently achieved results that are still unpublished, or that have just...

  13. Estimation of Correlation Functions by the Random Decrement Technique

    DEFF Research Database (Denmark)

    Brincker, Rune; Krenk, Steen; Jensen, Jakob Laigaard

    The Random Decrement (RDD) Technique is a versatile technique for characterization of random signals in the time domain. In this paper a short review of the theoretical basis is given, and the technique is illustrated by estimating auto-correlation functions and cross-correlation functions on modal ... responses simulated by two SDOF ARMA models loaded by the same bandlimited white noise. The speed and the accuracy of the RDD technique is compared to the Fast Fourier Transform (FFT) technique. The RDD technique does not involve multiplications, but only additions. Therefore, the technique is very fast ...

  14. Estimation of Correlation Functions by the Random Decrement Technique

    DEFF Research Database (Denmark)

    Brincker, Rune; Krenk, Steen; Jensen, Jacob Laigaard

    1991-01-01

    The Random Decrement (RDD) Technique is a versatile technique for characterization of random signals in the time domain. In this paper a short review of the theoretical basis is given, and the technique is illustrated by estimating auto-correlation functions and cross-correlation functions on modal ... responses simulated by two SDOF ARMA models loaded by the same band-limited white noise. The speed and the accuracy of the RDD technique is compared to the Fast Fourier Transform (FFT) technique. The RDD technique does not involve multiplications, but only additions. Therefore, the technique is very fast ...

  15. Estimation of Correlation Functions by the Random Decrement Technique

    DEFF Research Database (Denmark)

    Brincker, Rune; Krenk, Steen; Jensen, Jakob Laigaard

    1992-01-01

    The Random Decrement (RDD) Technique is a versatile technique for characterization of random signals in the time domain. In this paper a short review of the theoretical basis is given, and the technique is illustrated by estimating auto-correlation functions and cross-correlation functions on modal ... responses simulated by two SDOF ARMA models loaded by the same bandlimited white noise. The speed and the accuracy of the RDD technique is compared to the Fast Fourier Transform (FFT) technique. The RDD technique does not involve multiplications, but only additions. Therefore, the technique is very fast ...
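
    A minimal sketch of the RDD idea as described in these records: segments of the response that follow each triggering event (here, an upward level crossing) are accumulated by addition only, and their average forms the RDD signature, proportional to the correlation function. The AR(2) toy response is an assumption for illustration:

    ```python
    import random

    def random_decrement(x, trigger, seg_len):
        """Random Decrement signature: average the segments of x that start
        at each upward crossing of the trigger level. Only additions are
        used, which is why the RDD technique is fast."""
        acc = [0.0] * seg_len
        count = 0
        for i in range(1, len(x) - seg_len):
            if x[i] >= trigger > x[i - 1]:       # upward level crossing
                for k in range(seg_len):
                    acc[k] += x[i + k]
                count += 1
        return [a / count for a in acc], count

    # Toy response: a lightly damped oscillator driven by white noise (AR(2)).
    x = [0.0, 0.0]
    for _ in range(50000):
        x.append(1.97 * x[-1] - 0.98 * x[-2] + random.gauss(0, 0.1))
    signature, n_triggers = random_decrement(x, trigger=1.0, seg_len=200)
    print(n_triggers, [round(v, 3) for v in signature[:5]])
    ```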

  16. Two sampling techniques for game meat

    Directory of Open Access Journals (Sweden)

    Maretha van der Merwe

    2013-03-01

    A study was conducted to compare the excision sampling technique used by the export market and the sampling technique preferred by European countries, namely the biotrace cattle and swine test. The measuring unit for the excision sampling was grams (g) and square centimetres (cm2) for the swabbing technique. The two techniques were compared after a pilot test was conducted on spiked approved beef carcasses (n = 12) that statistically proved the two measuring units correlated. The two sampling techniques were conducted on the same game carcasses (n = 13) and analyses performed for aerobic plate count (APC), Escherichia coli and Staphylococcus aureus, for both techniques. A more representative result was obtained by swabbing and no damage was caused to the carcass. Conversely, the excision technique yielded fewer organisms and caused minor damage to the carcass. The recovery ratio from the sampling technique improved 5.4 times for APC, 108.0 times for E. coli and 3.4 times for S. aureus over the results obtained from the excision technique. It was concluded that the sampling methods of excision and swabbing can be used to obtain bacterial profiles from both export and local carcasses and could be used to indicate whether game carcasses intended for the local market are possibly on par with game carcasses intended for the export market and therefore safe for human consumption.

  17. Two sampling techniques for game meat.

    Science.gov (United States)

    van der Merwe, Maretha; Jooste, Piet J; Hoffman, Louw C; Calitz, Frikkie J

    2013-03-20

    A study was conducted to compare the excision sampling technique used by the export market and the sampling technique preferred by European countries, namely the biotrace cattle and swine test. The measuring unit for the excision sampling was grams (g) and square centimetres (cm2) for the swabbing technique. The two techniques were compared after a pilot test was conducted on spiked approved beef carcasses (n = 12) that statistically proved the two measuring units correlated. The two sampling techniques were conducted on the same game carcasses (n = 13) and analyses performed for aerobic plate count (APC), Escherichia coli and Staphylococcus aureus, for both techniques. A more representative result was obtained by swabbing and no damage was caused to the carcass. Conversely, the excision technique yielded fewer organisms and caused minor damage to the carcass. The recovery ratio from the sampling technique improved 5.4 times for APC, 108.0 times for E. coli and 3.4 times for S. aureus over the results obtained from the excision technique. It was concluded that the sampling methods of excision and swabbing can be used to obtain bacterial profiles from both export and local carcasses and could be used to indicate whether game carcasses intended for the local market are possibly on par with game carcasses intended for the export market and therefore safe for human consumption.

  18. Sampling large random knots in a confined space

    International Nuclear Information System (INIS)

    Arsuaga, J; Blackstone, T; Diao, Y; Hinson, K; Karadayi, E; Saito, M

    2007-01-01

    DNA knots formed under extreme conditions of condensation, as in bacteriophage P4, are difficult to analyze experimentally and theoretically. In this paper, we propose to use the uniform random polygon model as a supplementary method to the existing methods for generating random knots in confinement. The uniform random polygon model allows us to sample knots with large crossing numbers and also to generate large diagrammatically prime knot diagrams. We show numerically that uniform random polygons sample knots with large minimum crossing numbers and certain complicated knot invariants (as those observed experimentally). We do this in terms of the knot determinants or colorings. Our numerical results suggest that the average determinant of a uniform random polygon of n vertices grows faster than O(e^{n^2}). We also investigate the complexity of prime knot diagrams. We show rigorously that the probability that a randomly selected 2D uniform random polygon of n vertices is almost diagrammatically prime goes to 1 as n goes to infinity. Furthermore, the average number of crossings in such a diagram is at the order of O(n^2). Therefore, the two-dimensional uniform random polygons offer an effective way in sampling large (prime) knots, which can be useful in various applications.

  19. Sampling large random knots in a confined space

    Science.gov (United States)

    Arsuaga, J.; Blackstone, T.; Diao, Y.; Hinson, K.; Karadayi, E.; Saito, M.

    2007-09-01

    DNA knots formed under extreme conditions of condensation, as in bacteriophage P4, are difficult to analyze experimentally and theoretically. In this paper, we propose to use the uniform random polygon model as a supplementary method to the existing methods for generating random knots in confinement. The uniform random polygon model allows us to sample knots with large crossing numbers and also to generate large diagrammatically prime knot diagrams. We show numerically that uniform random polygons sample knots with large minimum crossing numbers and certain complicated knot invariants (as those observed experimentally). We do this in terms of the knot determinants or colorings. Our numerical results suggest that the average determinant of a uniform random polygon of n vertices grows faster than O(e^{n^2}) . We also investigate the complexity of prime knot diagrams. We show rigorously that the probability that a randomly selected 2D uniform random polygon of n vertices is almost diagrammatically prime goes to 1 as n goes to infinity. Furthermore, the average number of crossings in such a diagram is at the order of O(n2). Therefore, the two-dimensional uniform random polygons offer an effective way in sampling large (prime) knots, which can be useful in various applications.

  20. Sampling large random knots in a confined space

    Energy Technology Data Exchange (ETDEWEB)

    Arsuaga, J [Department of Mathematics, San Francisco State University, 1600 Holloway Ave, San Francisco, CA 94132 (United States); Blackstone, T [Department of Computer Science, San Francisco State University, 1600 Holloway Ave., San Francisco, CA 94132 (United States); Diao, Y [Department of Mathematics and Statistics, University of North Carolina at Charlotte, Charlotte, NC 28223 (United States); Hinson, K [Department of Mathematics and Statistics, University of North Carolina at Charlotte, Charlotte, NC 28223 (United States); Karadayi, E [Department of Mathematics, University of South Florida, 4202 E Fowler Avenue, Tampa, FL 33620 (United States); Saito, M [Department of Mathematics, University of South Florida, 4202 E Fowler Avenue, Tampa, FL 33620 (United States)

    2007-09-28

    DNA knots formed under extreme conditions of condensation, as in bacteriophage P4, are difficult to analyze experimentally and theoretically. In this paper, we propose to use the uniform random polygon model as a supplementary method to the existing methods for generating random knots in confinement. The uniform random polygon model allows us to sample knots with large crossing numbers and also to generate large diagrammatically prime knot diagrams. We show numerically that uniform random polygons sample knots with large minimum crossing numbers and certain complicated knot invariants (as those observed experimentally). We do this in terms of the knot determinants or colorings. Our numerical results suggest that the average determinant of a uniform random polygon of n vertices grows faster than O(e^{n^2}). We also investigate the complexity of prime knot diagrams. We show rigorously that the probability that a randomly selected 2D uniform random polygon of n vertices is almost diagrammatically prime goes to 1 as n goes to infinity. Furthermore, the average number of crossings in such a diagram is at the order of O(n^2). Therefore, the two-dimensional uniform random polygons offer an effective way in sampling large (prime) knots, which can be useful in various applications.

  1. Path integral methods for primordial density perturbations - sampling of constrained Gaussian random fields

    International Nuclear Information System (INIS)

    Bertschinger, E.

    1987-01-01

    Path integrals may be used to describe the statistical properties of a random field such as the primordial density perturbation field. In this framework the probability distribution is given for a Gaussian random field subjected to constraints such as the presence of a protovoid or supercluster at a specific location in the initial conditions. An algorithm has been constructed for generating samples of a constrained Gaussian random field on a lattice using Monte Carlo techniques. The method makes possible a systematic study of the density field around peaks or other constrained regions in the biased galaxy formation scenario, and it is effective for generating initial conditions for N-body simulations with rare objects in the computational volume. 21 references

  2. Spectral Estimation by the Random Dec Technique

    DEFF Research Database (Denmark)

    Brincker, Rune; Jensen, Jacob L.; Krenk, Steen

    1990-01-01

    This paper contains an empirical study of the accuracy of the Random Dec (RDD) technique. Realizations of the response from a single-degree-of-freedom system loaded by white noise are simulated using an ARMA model. The Autocorrelation function is estimated using the RDD technique and the estimated...

  3. Spectral Estimation by the Random DEC Technique

    DEFF Research Database (Denmark)

    Brincker, Rune; Jensen, J. Laigaard; Krenk, S.

    This paper contains an empirical study of the accuracy of the Random Dec (RDD) technique. Realizations of the response from a single-degree-of-freedom system loaded by white noise are simulated using an ARMA model. The Autocorrelation function is estimated using the RDD technique and the estimated...

  4. Radioisotope Sample Measurement Techniques in Medicine and Biology. Proceedings of the Symposium on Radioisotope Sample Measurement Techniques

    International Nuclear Information System (INIS)

    1965-01-01

    The medical and biological applications of radioisotopes depend on two basically different types of measurements, those on living subjects in vivo and those on samples in vitro. The International Atomic Energy Agency has in the past held several meetings on in vivo measurement techniques, notably whole-body counting and radioisotope scanning. The present volume contains the Proceedings of the first Symposium the Agency has organized to discuss the various aspects of techniques for sample measurement in vitro. The range of these sample measurement techniques is very wide. The sample may weigh a few milligrams or several hundred grams, and may be in the gaseous, liquid or solid state. Its radioactive content may consist of a single, known radioisotope or several unknown ones. The concentration of radioactivity may be low, medium or high. The measurements may be made manually or automatically and any one of the many radiation detectors now available may be used. The 53 papers presented at the Symposium illustrate the great variety of methods now in use for radioactive- sample measurements. The first topic discussed is gamma-ray spectrometry, which finds an increasing number of applications in sample measurements. Other sections of the Proceedings deal with: the use of computers in gamma-ray spectrometry and multiple tracer techniques; recent developments in activation analysis where both gamma-ray spectrometry and computing techniques are applied; thin-layer and paper radio chromatographic techniques for use with low energy beta-ray emitters; various aspects of liquid scintillation counting techniques in the measurement of alpha- and beta-ray emitters, including chemical and colour quenching; autoradiographic techniques; calibration of equipment; and standardization of radioisotopes. Finally, some applications of solid-state detectors are presented; this section may be regarded as a preview of important future developments. The meeting was attended by 203 participants

  5. Sampling Polya-Gamma random variates: alternate and approximate techniques

    OpenAIRE

    Windle, Jesse; Polson, Nicholas G.; Scott, James G.

    2014-01-01

    Efficiently sampling from the Pólya-Gamma distribution, PG(b,z), is an essential element of Pólya-Gamma data augmentation. Polson et al. (2013) show how to efficiently sample from the PG(1,z) distribution. We build two new samplers that offer improved performance when sampling from the PG(b,z) distribution and b is not unity.

  6. Efficient sampling of complex network with modified random walk strategies

    Science.gov (United States)

    Xie, Yunya; Chang, Shuhua; Zhang, Zhipeng; Zhang, Mi; Yang, Lei

    2018-02-01

    We present two novel random walk strategies: choosing seed node (CSN) random walk and no-retracing (NR) random walk. Different from classical random walk sampling, the CSN and NR strategies focus on the influence of the seed node choice and of path overlap, respectively. The three random walk samplings are applied to the Erdös-Rényi (ER), Barabási-Albert (BA), Watts-Strogatz (WS), and weighted USAir networks. Then, the major properties of the sampled subnets, such as sampling efficiency, degree distributions, average degree and average clustering coefficient, are studied. Similar conclusions can be reached with these three random walk strategies. Firstly, networks with small scales and simple structures are conducive to sampling. Secondly, the average degree and the average clustering coefficient of the sampled subnet tend to the corresponding values of the original networks within limited steps. Thirdly, all the degree distributions of the subnets are slightly biased to the high degree side. However, the NR strategy performs better for the average clustering coefficient of the subnet. In the real weighted USAir network, some obvious characteristics, such as the larger clustering coefficient and the fluctuation of the degree distribution, are reproduced well by these random walk strategies.
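
    A minimal sketch of one plausible reading of the NR strategy, excluding only the immediately preceding node at each step; the paper's exact rules are not reproduced here:

    ```python
    import random

    def nr_random_walk(adj, seed, n_steps):
        """No-retracing (NR) random walk sampling: at each step, choose a
        random neighbour excluding the node we just came from, reducing
        path overlap relative to the classical random walk."""
        current, previous = seed, None
        visited = [seed]
        for _ in range(n_steps):
            options = [v for v in adj[current] if v != previous]
            if not options:            # dead end: allow retracing
                options = adj[current]
            previous, current = current, random.choice(options)
            visited.append(current)
        return visited

    # Toy network: a ring of 10 nodes with one chord.
    adj = {i: [(i - 1) % 10, (i + 1) % 10] for i in range(10)}
    adj[0].append(5)
    adj[5].append(0)
    print(nr_random_walk(adj, seed=0, n_steps=15))
    ```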

  7. RandomSpot: A web-based tool for systematic random sampling of virtual slides.

    Science.gov (United States)

    Wright, Alexander I; Grabsch, Heike I; Treanor, Darren E

    2015-01-01

    This paper describes work presented at the Nordic Symposium on Digital Pathology 2014, Linköping, Sweden. Systematic random sampling (SRS) is a stereological tool, which provides a framework to quickly build an accurate estimation of the distribution of objects or classes within an image, whilst minimizing the number of observations required. RandomSpot is a web-based tool for SRS in stereology, which systematically places equidistant points within a given region of interest on a virtual slide. Each point can then be visually inspected by a pathologist in order to generate an unbiased sample of the distribution of classes within the tissue. Further measurements can then be derived from the distribution, such as the ratio of tumor to stroma. RandomSpot replicates the fundamental principle of traditional light microscope grid-shaped graticules, with the added benefits associated with virtual slides, such as facilitated collaboration and automated navigation between points. Once the sample points have been added to the region(s) of interest, users can download the annotations and view them locally using their virtual slide viewing software. Since its introduction, RandomSpot has been used extensively for international collaborative projects, clinical trials and independent research projects. So far, the system has been used to generate over 21,000 sample sets, and has been used to generate data for use in multiple publications, identifying significant new prognostic markers in colorectal, upper gastro-intestinal and breast cancer. Data generated using RandomSpot also has significant value for training image analysis algorithms using sample point coordinates and pathologist classifications.

  8. Acceptance sampling using judgmental and randomly selected samples

    Energy Technology Data Exchange (ETDEWEB)

    Sego, Landon H.; Shulman, Stanley A.; Anderson, Kevin K.; Wilson, John E.; Pulsipher, Brent A.; Sieber, W. Karl

    2010-09-01

    We present a Bayesian model for acceptance sampling where the population consists of two groups, each with different levels of risk of containing unacceptable items. Expert opinion, or judgment, may be required to distinguish between the high- and low-risk groups. Hence, high-risk items are likely to be identified (and sampled) using expert judgment, while the remaining low-risk items are sampled randomly. We focus on the situation where all observed samples must be acceptable. Consequently, the objective of the statistical inference is to quantify the probability that a large percentage of the unsampled items in the population are also acceptable. We demonstrate that traditional (frequentist) acceptance sampling and simpler Bayesian formulations of the problem are essentially special cases of the proposed model. We explore the properties of the model in detail and discuss the conditions necessary to ensure that required sample sizes are a non-decreasing function of the population size. The method is applicable to a variety of acceptance sampling problems and, in particular, to environmental sampling where the objective is to demonstrate the safety of reoccupying a remediated facility that has been contaminated with a lethal agent.
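
    The simpler Bayesian formulation the record mentions as a special case can be sketched with a beta-binomial model: after observing that all sampled items are acceptable, estimate the posterior probability that a large fraction of the unsampled items are acceptable too. An illustration with an assumed uniform prior, not the paper's two-group model:

    ```python
    import random

    def prob_acceptable_fraction(n_sampled, population, threshold,
                                 a=1.0, b=1.0, draws=20000):
        """Posterior probability that at least `threshold` of the unsampled
        items are acceptable, given that every one of the n_sampled items
        was acceptable. Beta(a, b) prior on the acceptability rate; Monte
        Carlo over the posterior."""
        remaining = population - n_sampled
        hits = 0
        for _ in range(draws):
            theta = random.betavariate(a + n_sampled, b)   # posterior draw
            acceptable = sum(random.random() < theta for _ in range(remaining))
            hits += (acceptable / remaining) >= threshold
        return hits / draws

    # Example: 50 of 200 items sampled, all acceptable; probability that at
    # least 95% of the 150 unsampled items are also acceptable.
    print(prob_acceptable_fraction(50, 200, 0.95))
    ```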

  9. Critical evaluation of sample pretreatment techniques.

    Science.gov (United States)

    Hyötyläinen, Tuulia

    2009-06-01

    Sample preparation before chromatographic separation is the most time-consuming and error-prone part of the analytical procedure. Selecting and optimizing an appropriate sample preparation scheme is therefore a key factor in the final success of the analysis, greatly influencing the reliability and accuracy of the results. The main objective of this review is to critically evaluate the applicability, disadvantages, and advantages of various sample preparation techniques. Particular emphasis is placed on extraction techniques suitable for both liquid and solid samples.

  10. Event-triggered synchronization for reaction-diffusion complex networks via random sampling

    Science.gov (United States)

    Dong, Tao; Wang, Aijuan; Zhu, Huiyun; Liao, Xiaofeng

    2018-04-01

    In this paper, the synchronization problem of reaction-diffusion complex networks (RDCNs) with Dirichlet boundary conditions is considered, where the data is sampled randomly. An event-triggered controller based on the sampled data is proposed, which can reduce the number of controller updates and the communication load. Under this strategy, the synchronization problem of the diffusion complex network is equivalently converted to the stability problem of a reaction-diffusion complex dynamical system with time delay. By using the matrix inequality technique and the Lyapunov method, the synchronization conditions of the RDCNs are derived, which are dependent on the diffusion term. Moreover, it is found that the proposed control strategy naturally excludes Zeno behavior. Finally, a numerical example is given to verify the obtained results.

  11. A random spatial sampling method in a rural developing nation

    Science.gov (United States)

    Michelle C. Kondo; Kent D.W. Bream; Frances K. Barg; Charles C. Branas

    2014-01-01

    Nonrandom sampling of populations in developing nations has limitations and can inaccurately estimate health phenomena, especially among hard-to-reach populations such as rural residents. However, random sampling of rural populations in developing nations can be challenged by incomplete enumeration of the base population. We describe a stratified random sampling method...

  12. Newly introduced sample preparation techniques: towards miniaturization.

    Science.gov (United States)

    Costa, Rosaria

    2014-01-01

    Sampling and sample preparation are of crucial importance in an analytical procedure, quite often representing a source of error. The technique chosen for the isolation of analytes greatly affects the success of a chemical determination. On the other hand, growing concerns about environmental and human safety, along with the introduction of international regulations for quality control, have moved the interest of scientists towards specific needs. Newly introduced sample preparation techniques are challenged to meet new criteria: (i) miniaturization, (ii) higher sensitivity and selectivity, and (iii) automation. In this survey, the most recent techniques introduced in the field of sample preparation are described and discussed, along with many examples of applications.

  13. A new simple technique for improving the random properties of chaos-based cryptosystems

    Science.gov (United States)

    Garcia-Bosque, M.; Pérez-Resa, A.; Sánchez-Azqueta, C.; Celma, S.

    2018-03-01

    A new technique for improving the security of chaos-based stream ciphers has been proposed and tested experimentally. This technique manages to improve the randomness properties of the generated keystream by preventing the system from falling into the short-period cycles caused by digitization. In order to test this technique, a stream cipher based on a skew tent map algorithm has been implemented on a Virtex 7 FPGA. The randomness of the keystream generated by this system has been compared to the randomness of the keystream generated by the same system with the proposed randomness-enhancement technique. By subjecting both keystreams to the National Institute of Standards and Technology (NIST) tests, we have proved that our method can considerably improve the randomness of the generated keystreams. Only 41 extra slices were needed to incorporate the randomness-enhancement technique, proving that, besides being effective, this method is also efficient in terms of area and hardware resources.
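
    For orientation, a skew tent map keystream in its naive digital form looks like the sketch below; this is an illustrative baseline only, and it omits the randomness-enhancement step the paper adds precisely because such plain digital implementations fall into short cycles.

    ```python
    def skew_tent(x, p):
        """One iteration of the skew tent map on (0, 1), parameter 0 < p < 1."""
        return x / p if x < p else (1.0 - x) / (1.0 - p)

    def keystream_bits(x0, p, n):
        """Naive keystream: threshold the chaotic orbit bit by bit."""
        bits, x = [], x0
        for _ in range(n):
            x = skew_tent(x, p)
            bits.append(1 if x >= 0.5 else 0)
        return bits

    print(keystream_bits(0.37, p=0.61, n=16))
    ```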

  14. Proteomic Challenges: Sample Preparation Techniques for Microgram-Quantity Protein Analysis from Biological Samples

    Directory of Open Access Journals (Sweden)

    Peter Feist

    2015-02-01

    Full Text Available Proteins regulate many cellular functions and analyzing the presence and abundance of proteins in biological samples are central focuses in proteomics. The discovery and validation of biomarkers, pathways, and drug targets for various diseases can be accomplished using mass spectrometry-based proteomics. However, with mass-limited samples like tumor biopsies, it can be challenging to obtain sufficient amounts of proteins to generate high-quality mass spectrometric data. Techniques developed for macroscale quantities recover sufficient amounts of protein from milligram quantities of starting material, but sample losses become crippling with these techniques when only microgram amounts of material are available. To combat this challenge, proteomicists have developed micro-scale techniques that are compatible with decreased sample size (100 μg or lower) and still enable excellent proteome coverage. Extraction, contaminant removal, protein quantitation, and sample handling techniques for the microgram protein range are reviewed here, with an emphasis on liquid chromatography and bottom-up mass spectrometry-compatible techniques. Also, a range of biological specimens, including mammalian tissues and model cell culture systems, are discussed.

  15. Proteomic Challenges: Sample Preparation Techniques for Microgram-Quantity Protein Analysis from Biological Samples

    Science.gov (United States)

    Feist, Peter; Hummon, Amanda B.

    2015-01-01

    Proteins regulate many cellular functions and analyzing the presence and abundance of proteins in biological samples are central focuses in proteomics. The discovery and validation of biomarkers, pathways, and drug targets for various diseases can be accomplished using mass spectrometry-based proteomics. However, with mass-limited samples like tumor biopsies, it can be challenging to obtain sufficient amounts of proteins to generate high-quality mass spectrometric data. Techniques developed for macroscale quantities recover sufficient amounts of protein from milligram quantities of starting material, but sample losses become crippling with these techniques when only microgram amounts of material are available. To combat this challenge, proteomicists have developed micro-scale techniques that are compatible with decreased sample size (100 μg or lower) and still enable excellent proteome coverage. Extraction, contaminant removal, protein quantitation, and sample handling techniques for the microgram protein range are reviewed here, with an emphasis on liquid chromatography and bottom-up mass spectrometry-compatible techniques. Also, a range of biological specimens, including mammalian tissues and model cell culture systems, are discussed. PMID:25664860

  16. Proteomic challenges: sample preparation techniques for microgram-quantity protein analysis from biological samples.

    Science.gov (United States)

    Feist, Peter; Hummon, Amanda B

    2015-02-05

    Proteins regulate many cellular functions and analyzing the presence and abundance of proteins in biological samples are central focuses in proteomics. The discovery and validation of biomarkers, pathways, and drug targets for various diseases can be accomplished using mass spectrometry-based proteomics. However, with mass-limited samples like tumor biopsies, it can be challenging to obtain sufficient amounts of proteins to generate high-quality mass spectrometric data. Techniques developed for macroscale quantities recover sufficient amounts of protein from milligram quantities of starting material, but sample losses become crippling with these techniques when only microgram amounts of material are available. To combat this challenge, proteomicists have developed micro-scale techniques that are compatible with decreased sample size (100 μg or lower) and still enable excellent proteome coverage. Extraction, contaminant removal, protein quantitation, and sample handling techniques for the microgram protein range are reviewed here, with an emphasis on liquid chromatography and bottom-up mass spectrometry-compatible techniques. Also, a range of biological specimens, including mammalian tissues and model cell culture systems, are discussed.

  17. The contribution of simple random sampling to observed variations in faecal egg counts.

    Science.gov (United States)

    Torgerson, Paul R; Paul, Michaela; Lewis, Fraser I

    2012-09-10

    It has been over 100 years since the classical paper published by Gosset in 1907, under the pseudonym "Student", demonstrated that yeast cells suspended in a fluid and measured by a haemocytometer conform to a Poisson process. Similarly, parasite eggs in a faecal suspension also conform to a Poisson process. Despite this, there are common misconceptions about how to analyse or interpret observations from the McMaster or similar quantitative parasitic diagnostic techniques, widely used for evaluating parasite eggs in faeces. The McMaster technique can easily be shown, from a theoretical perspective, to give variable results that inevitably arise from the random distribution of parasite eggs in a well-mixed faecal sample. The Poisson processes that lead to this variability are described, together with illustrative examples of the potentially large confidence intervals that can arise from faecal egg counts calculated from the observations on a McMaster slide. Attempts to modify the McMaster technique, or indeed other quantitative techniques, to ensure uniform egg counts are doomed to failure and belie ignorance of Poisson processes. A simple method to immediately identify excess variation/poor sampling from replicate counts is provided. Copyright © 2012 Elsevier B.V. All rights reserved.
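
    Because the count on a slide is Poisson-distributed, an exact confidence interval for the underlying egg concentration follows from the standard chi-square quantile relation; the sketch below shows that textbook calculation (the multiplier converting slide counts to eggs per gram is illustrative, not taken from the paper).

    ```python
    from scipy.stats import chi2

    def poisson_ci(count, multiplier=50, conf=0.95):
        """Exact Poisson confidence interval for a McMaster-style count,
        scaled to eggs per gram by 'multiplier'."""
        a = 1.0 - conf
        lo = 0.0 if count == 0 else chi2.ppf(a / 2, 2 * count) / 2
        hi = chi2.ppf(1 - a / 2, 2 * (count + 1)) / 2
        return lo * multiplier, hi * multiplier

    print(poisson_ci(8))  # 8 eggs counted -> a strikingly wide EPG interval
    ```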

  18. NAIL SAMPLING TECHNIQUE AND ITS INTERPRETATION

    Directory of Open Access Journals (Sweden)

    TZAR MN

    2011-01-01

    Full Text Available The clinical suspicion of onychomycosis, based on the appearance of the nails, requires culture for confirmation. This is because treatment requires prolonged use of systemic agents which may cause side effects. One of the common problems encountered is improper nail sampling technique, which results in the loss of essential information. The unfamiliar terminologies used in reporting culture results may intimidate physicians, resulting in misinterpretation and hampering treatment decisions. This article provides a simple guide on nail sampling technique and the interpretation of culture results.

  19. Identification of System Parameters by the Random Decrement Technique

    DEFF Research Database (Denmark)

    Brincker, Rune; Kirkegaard, Poul Henning; Rytter, Anders

    1991-01-01

    The aim of this paper is to investigate and illustrate the possibilities of using correlation functions estimated by the Random Decrement Technique as a basis for parameter identification. A two-stage system identification system is used: first, the correlation functions are estimated by the Random Decrement Technique, and then the system parameters are identified from the correlation function estimates. Three different techniques are used in the parameter identification process: a simple non-parametric method, estimation of an Auto Regressive (AR) model by solving an overdetermined set of Yule-Walker equations and finally, least-square fitting of the theoretical correlation function. The results are compared to the results of fitting an Auto Regressive Moving Average (ARMA) model directly to the system output from a single-degree-of-freedom system loaded by white noise.

  20. Random vs. systematic sampling from administrative databases involving human subjects.

    Science.gov (United States)

    Hagino, C; Lo, R J

    1998-09-01

    Two sampling techniques, simple random sampling (SRS) and systematic sampling (SS), were compared to determine whether they yield similar and accurate distributions for the following four factors: age, gender, geographic location and years in practice. Any point estimate within 7 yr or 7 percentage points of its reference standard (SRS or the entire data set, i.e., the target population) was considered "acceptably similar" to the reference standard. The sampling frame was the entire membership database of the Canadian Chiropractic Association. The two sampling methods were tested using eight different sample sizes (n = 50, 100, 150, 200, 250, 300, 500, 800). From the profiles/characteristics, summaries of the four known factors [gender, average age, number (%) of chiropractors in each province and years in practice] were produced, and between- and within-method chi-square tests and unpaired t tests were performed to determine whether any of the differences [descriptively greater than 7% or 7 yr] were also statistically significant. The strengths of the agreements between the provincial distributions were quantified by calculating the percent agreement for each (provincial pairwise-comparison method). Any percent agreement less than 70% was judged to be unacceptable. Our assessments of the two sampling methods (SRS and SS) for the different sample sizes tested suggest that SRS and SS yielded acceptably similar results. Both methods started to yield "correct" sample profiles at approximately the same sample size (n > 200). SS is not only convenient, it can be recommended for sampling from large databases in which the data are listed without any inherent order biases other than alphabetical listing by surname.
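
    The two designs compared in this study are both one-liners in practice; a minimal sketch (function names are illustrative) is shown below.

    ```python
    import random

    def simple_random_sample(records, n, rng=random):
        """SRS: every subset of size n is equally likely."""
        return rng.sample(records, n)

    def systematic_sample(records, n, rng=random):
        """SS: every k-th record after a random start, k = N // n
        (assumes n <= N and no periodic order in the list)."""
        k = len(records) // n
        return records[rng.randrange(k)::k][:n]
    ```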

  1. Development of sampling techniques for ITER Type B radwaste

    International Nuclear Information System (INIS)

    Hong, Kwon Pyo; Kim, Sung Geun; Jung, Sang Hee; Oh, Wan Ho; Park, Myung Chul; Kim, Hee Moon; Ahn, Sang Bok

    2016-01-01

    There are several difficulties and limitations in sampling activities. As the Type B radwaste components are mostly metallic (mainly stainless steel) and bulky (∼1 m in size and ∼100 mm in thickness), it is difficult to take samples from the surface of Type B radwaste by remote operation. Moreover, sampling should be performed without the use of any liquid coolant to avoid the spread of contamination, and all sampling procedures are carried out in the hot cell red zone by remote operation. Three kinds of sampling techniques are being developed: core sampling, chip sampling, and wedge sampling, which are the candidate sampling techniques to be applied in the ITER hot cell. The object materials for sampling are stainless steel and Cu alloy blocks that simulate ITER Type B radwaste. The best of the three sampling techniques for ITER Type B radwaste will be suggested in several months, after the related experiments are finished.

  2. Development of sampling techniques for ITER Type B radwaste

    Energy Technology Data Exchange (ETDEWEB)

    Hong, Kwon Pyo; Kim, Sung Geun; Jung, Sang Hee; Oh, Wan Ho; Park, Myung Chul; Kim, Hee Moon; Ahn, Sang Bok [KAERI, Daejeon (Korea, Republic of)

    2016-05-15

    There are several difficulties and limitations in sampling activities. As the Type B radwaste components are mostly metallic (mainly stainless steel) and bulky (∼1 m in size and ∼100 mm in thickness), it is difficult to take samples from the surface of Type B radwaste by remote operation. Moreover, sampling should be performed without the use of any liquid coolant to avoid the spread of contamination, and all sampling procedures are carried out in the hot cell red zone by remote operation. Three kinds of sampling techniques are being developed: core sampling, chip sampling, and wedge sampling, which are the candidate sampling techniques to be applied in the ITER hot cell. The object materials for sampling are stainless steel and Cu alloy blocks that simulate ITER Type B radwaste. The best of the three sampling techniques for ITER Type B radwaste will be suggested in several months, after the related experiments are finished.

  3. Boat sampling technique for assessment of ageing of components

    International Nuclear Information System (INIS)

    Kumar, Kundan; Shyam, T.V.; Kayal, J.N.; Rupani, B.B.

    2006-01-01

    Boat sampling technique (BST) is a surface sampling technique developed for obtaining, in situ, metal samples from the surface of an operating component without affecting its operating service life. The BST is non-destructive in nature, and the sample is obtained without plastic deformation or thermal degradation of the parent material. The shape and size of the sample depend upon the shape of the cutter and the surface geometry of the parent material. Miniature test specimens are generated from the sample and subjected to various tests, viz. metallurgical evaluation, metallographic evaluation, micro-hardness evaluation, sensitisation tests, small punch tests, etc., to confirm the integrity and assess the safe operating life of the component. This paper highlights the design objectives of the boat sampling technique, the description of the sampling module, the sampling cutter and its performance evaluation, the cutting process, the boat samples, the operational sequence of the sampling module, the qualification of the sampling module, the qualification of the sampling technique, the qualification of the scooped region of the parent material, the sample retrieval system, and the inspection, testing and examination to be carried out on the boat samples and the scooped region. (author)

  4. SOME SYSTEMATIC SAMPLING STRATEGIES USING MULTIPLE RANDOM STARTS

    OpenAIRE

    Sampath Sundaram; Ammani Sivaraman

    2010-01-01

    In this paper an attempt is made to extend linear systematic sampling using multiple random starts due to Gautschi (1957) to various types of systematic sampling schemes available in the literature, namely (i) Balanced Systematic Sampling (BSS) of Sethi (1965) and (ii) Modified Systematic Sampling (MSS) of Singh, Jindal, and Garg (1968). Further, the proposed methods were compared with the Yates corrected estimator developed with reference to Gautschi's linear systematic sampling...

  5. Identification of System Parameters by the Random Decrement Technique

    DEFF Research Database (Denmark)

    Brincker, Rune; Kirkegaard, Poul Henning; Rytter, Anders

    The aim of this paper is to investigate and illustrate the possibilities of using correlation functions estimated by the Random Decrement Technique as a basis for parameter identification. A two-stage system identification method is used: first the correlation functions are estimated by the Random Decrement technique and then the system parameters are identified from the correlation function estimates. Three different techniques are used in the parameter identification process: a simple non-parametric method, estimation of an Auto Regressive (AR) model by solving an overdetermined set of Yule-Walker equations and finally least square fitting of the theoretical correlation function. The results are compared to the results of fitting an Auto Regressive Moving Average (ARMA) model directly to the system output. All investigations are performed on the simulated output from a single-degree-of-freedom system...

  6. Non-terminal blood sampling techniques in guinea pigs.

    Science.gov (United States)

    Birck, Malene M; Tveden-Nyborg, Pernille; Lindblad, Maiken M; Lykkesfeldt, Jens

    2014-10-11

    Guinea pigs possess several biological similarities to humans and are validated experimental animal models(1-3). However, the use of guinea pigs currently represents a relatively narrow area of research, and descriptive data on specific methodology are correspondingly scarce. The anatomical features of guinea pigs are slightly different from other rodent models; hence, modulation of sampling techniques to accommodate species-specific differences, e.g., compared to mice and rats, is necessary to obtain sufficient and high-quality samples. As both long- and short-term in vivo studies often require repeated blood sampling, the choice of technique should be well considered in order to reduce stress and discomfort in the animals, but also to ensure survival as well as compliance with requirements of sample size and accessibility. Venous blood samples can be obtained at a number of sites in guinea pigs, e.g., the saphenous and jugular veins, each technique having its own advantages and disadvantages(4,5). Here, we present four different blood sampling techniques for either conscious or anaesthetized guinea pigs. The procedures are all non-terminal procedures, provided that sample volumes and the number of samples do not exceed guidelines for blood collection in laboratory animals(6). All the described methods have been thoroughly tested and applied for repeated in vivo blood sampling in studies within our research facility.

  7. Health plan auditing: 100-percent-of-claims vs. random-sample audits.

    Science.gov (United States)

    Sillup, George P; Klimberg, Ronald K

    2011-01-01

    The objective of this study was to examine the relative efficacy of two different methodologies for auditing self-funded medical claim expenses: 100-percent-of-claims auditing versus random-sample auditing. Multiple data sets of claim errors, or 'exceptions', from two Fortune-100 corporations were analysed and compared to 100 simulated audits of 300- and 400-claim random samples. The random-sample simulations failed to identify a significant number and dollar amount of the errors, which ranged from $200,000 to $750,000. These results suggest that the health plan expenses of corporations could be significantly reduced if they audited 100% of claims and embraced a zero-defect approach.
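
    The mechanism behind this finding is easy to reproduce: a random-sample audit can only surface the error dollars that happen to fall in the sample. A toy simulation (illustrative numbers, not the study's data) is sketched below.

    ```python
    import random

    def fraction_of_errors_found(claim_errors, sample_size, rng=random):
        """Share of total error dollars surfaced when auditing a simple
        random sample of claims; errors on unsampled claims stay hidden."""
        total = sum(claim_errors)
        if total == 0:
            return 1.0
        sample = rng.sample(claim_errors, sample_size)
        return sum(sample) / total

    # toy population: 10,000 claims, 1% carrying a $500 error
    claims = [500.0] * 100 + [0.0] * 9900
    print(fraction_of_errors_found(claims, 300))  # ~0.03 on average
    ```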

  8. Effectiveness of three different oral hygiene techniques on Viridans streptococci: A randomized controlled trial

    Directory of Open Access Journals (Sweden)

    N Naveen

    2016-01-01

    Full Text Available Introduction: Tongue cleaning is an important aspect of oral hygiene maintenance along with other mechanical and chemical aids. These methods have an influence on the microorganism count in saliva. Aim: To assess the effectiveness of three different oral hygiene techniques on Viridans streptococci. Materials and Methods: This was a randomized controlled trial with 45 study subjects aged between 14 and 16 years who were randomly allocated into three groups: Group A - plastic tongue scraper, Group B - chlorhexidine mouthwash along with plastic tongue scraper, and Group C - chlorhexidine mouthwash. Unstimulated salivary samples were collected on the 1st, 7th, and 15th day before routine oral hygiene practices. The saliva samples were incubated for 48 h on Mitis Salivarius (MS) agar. Streptococcus mitis, Streptococcus mutans, and Streptococcus salivarius were counted. Data were analyzed using descriptive and inferential statistics. Results: The change in the mean counts of S. mitis, S. mutans, and S. salivarius for Groups A, B, and C was found to be significant (P < 0.001) when compared between the 1st, 7th, and 15th day. Between-group comparisons revealed a significant difference between Groups A and C, and between Groups B and C (P < 0.001). Conclusion: There was a significant reduction in bacterial count in all the participants, indicating that all three methods are useful in improving oral hygiene. The combination technique was found to be the most effective.

  9. Revisiting random walk based sampling in networks: evasion of burn-in period and frequent regenerations.

    Science.gov (United States)

    Avrachenkov, Konstantin; Borkar, Vivek S; Kadavankandy, Arun; Sreedharan, Jithin K

    2018-01-01

    In the framework of network sampling, random walk (RW) based estimation techniques provide many pragmatic solutions while uncovering the unknown network as little as possible. Despite several theoretical advances in this area, RW based sampling techniques usually make a strong assumption that the samples are in the stationary regime, and hence are impelled to leave out the samples collected during the burn-in period. This work proposes two sampling schemes without a burn-in time constraint to estimate the average of an arbitrary function defined on the network nodes, for example, the average age of users in a social network. The central idea of the algorithms lies in exploiting the regeneration of RWs at revisits to an aggregated super-node or to a set of nodes, and in strategies to enhance the frequency of such regenerations either by contracting the graph or by making the hitting set larger. Our first algorithm, which is based on reinforcement learning (RL), uses stochastic approximation to derive an estimator. This method can be seen as intermediate between purely stochastic Markov chain Monte Carlo iterations and deterministic relative value iterations. The second algorithm, which we call the Ratio with Tours (RT)-estimator, is a modified form of respondent-driven sampling (RDS) that accommodates the idea of regeneration. We study the methods via simulations on real networks. We observe that the trajectories of the RL-estimator are much more stable than those of standard random walk based estimation procedures, and its error performance is comparable to that of respondent-driven sampling (RDS), which has a smaller asymptotic variance than many other estimators. Simulation studies also show that the mean squared error of the RT-estimator decays much faster than that of RDS with time. The newly developed RW based estimators (RL- and RT-estimators) allow the burn-in period to be avoided, provide better control of stability along the sample path, and overall reduce the estimation time. Our

  10. Differences in sampling techniques on total post-mortem tryptase.

    Science.gov (United States)

    Tse, R; Garland, J; Kesha, K; Elstub, H; Cala, A D; Ahn, Y; Stables, S; Palmiere, C

    2017-11-20

    The measurement of mast cell tryptase is commonly used to support the diagnosis of anaphylaxis. In the post-mortem setting, the literature recommends sampling from peripheral blood sources (femoral blood) but does not specify the exact sampling technique. Sampling techniques vary between pathologists, and it is unclear whether different sampling techniques have any impact on post-mortem tryptase levels. The aim of this study is to compare the difference in femoral total post-mortem tryptase levels between two sampling techniques: a 6-month retrospective study comparing femoral total post-mortem tryptase levels between (1) aspirating femoral vessels with a needle and syringe prior to evisceration and (2) femoral vein cut-down during evisceration. Twenty cases were identified, with three cases excluded from analysis. There was a statistically significant difference (paired t test) in femoral total post-mortem tryptase levels between the two sampling methods. The clinical significance of this finding, and what factors may contribute to it, are unclear. When requesting post-mortem tryptase, the pathologist should consider documenting the exact blood collection site and the method used for collection. In addition, blood samples acquired by different techniques should not be mixed together and should be analyzed separately if possible.

  11. Microextraction sample preparation techniques in biomedical analysis.

    Science.gov (United States)

    Szultka, Malgorzata; Pomastowski, Pawel; Railean-Plugaru, Viorica; Buszewski, Boguslaw

    2014-11-01

    Biologically active compounds are found in biological samples at relatively low concentration levels. The sample preparation of target compounds from biological, pharmaceutical, environmental, and food matrices is one of the most time-consuming steps in the analytical procedure; here, microextraction techniques are dominant. Metabolomic studies also require the application of a proper analytical technique for the determination of endogenous metabolites present in the biological matrix at trace concentration levels. Owing to the reproducibility of data, precision, the relatively low cost of the appropriate analysis, the simplicity of the determination, and the possibility of directly combining these techniques with other methods (both on-line and off-line), they have become the most widespread in routine determinations. Additionally, sample pretreatment procedures have to be more selective, cheap, quick, and environmentally friendly. This review summarizes the current achievements and applications of microextraction techniques. The main aim is to deal with the utilization of different types of sorbents for microextraction and to emphasize the use of newly synthesized sorbents, as well as to bring together studies concerning a systematic approach to method development. This review is dedicated to the description of microextraction techniques and their application in biomedical analysis. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  12. A cost-saving statistically based screening technique for focused sampling of a lead-contaminated site

    International Nuclear Information System (INIS)

    Moscati, A.F. Jr.; Hediger, E.M.; Rupp, M.J.

    1986-01-01

    High concentrations of lead in soils along an abandoned railroad line prompted a remedial investigation to characterize the extent of contamination across a 7-acre site. Contamination was thought to be spotty across the site reflecting its past use in battery recycling operations at discrete locations. A screening technique was employed to delineate the more highly contaminated areas by testing a statistically determined minimum number of random samples from each of seven discrete site areas. The approach not only quickly identified those site areas which would require more extensive grid sampling, but also provided a statistically defensible basis for excluding other site areas from further consideration, thus saving the cost of additional sample collection and analysis. The reduction in the number of samples collected in ''clean'' areas of the site ranged from 45 to 60%
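
    Although the abstract does not give the statistical details, a common screening-stage calculation of this kind is the minimum number of random samples needed to hit a contaminated fraction of a site area at least once with stated confidence; the sketch below shows that textbook formula as an illustration, not the study's exact method.

    ```python
    import math

    def min_samples_to_detect(p_hot, confidence=0.95):
        """Smallest n with P(at least one sample falls in a contaminated
        fraction p_hot of the area) >= confidence."""
        return math.ceil(math.log(1 - confidence) / math.log(1 - p_hot))

    print(min_samples_to_detect(0.20))  # 14 samples for a 20% hot fraction
    ```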

  13. New materials for sample preparation techniques in bioanalysis.

    Science.gov (United States)

    Nazario, Carlos Eduardo Domingues; Fumes, Bruno Henrique; da Silva, Meire Ribeiro; Lanças, Fernando Mauro

    2017-02-01

    The analysis of biological samples is a complex and difficult task owing to two basic and complementary issues: the high complexity of most biological matrices and the need to determine minute quantities of active substances and contaminants in such complex samples. To succeed in this endeavor, samples are usually subjected to three steps of a comprehensive analytical methodological approach: sample preparation, analyte isolation (usually utilizing a chromatographic technique) and qualitative/quantitative analysis (usually with the aid of mass spectrometric tools). Owing to the complex nature of bio-samples and the very low concentration of the target analytes to be determined, selective sample preparation techniques are mandatory in order to overcome the difficulties imposed by these two constraints. During the last decade, new chemical synthesis approaches have been developed and optimized, such as sol-gel and molecular imprinting technologies, allowing the preparation of novel materials for sample preparation, including graphene and its derivatives, magnetic materials, ionic liquids, molecularly imprinted polymers, and much more. In this contribution we review these novel techniques and materials, as well as their application to the bioanalysis niche. Copyright © 2016 Elsevier B.V. All rights reserved.

  14. Estimates of Inequality Indices Based on Simple Random, Ranked Set, and Systematic Sampling

    OpenAIRE

    Bansal, Pooja; Arora, Sangeeta; Mahajan, Kalpana K.

    2013-01-01

    Gini index, Bonferroni index, and Absolute Lorenz index are some popular indices of inequality showing different features of inequality measurement. In general, a simple random sampling procedure is commonly used to estimate the inequality indices and their related inference. The key condition that the samples must be drawn via a simple random sampling procedure, though it makes calculations much simpler, is often violated in practice, as the data does not always yield simple random ...

  15. An alternative procedure for estimating the population mean in simple random sampling

    Directory of Open Access Journals (Sweden)

    Housila P. Singh

    2012-03-01

    Full Text Available This paper deals with the problem of estimating the finite population mean using auxiliary information in simple random sampling. Firstly, we suggest a correction to the mean squared error of the estimator proposed by Gupta and Shabbir [On improvement in estimating the population mean in simple random sampling. Jour. Appl. Statist. 35(5) (2008), pp. 559-566]. We then propose a ratio-type estimator and study its properties in simple random sampling. Numerically, we show that the proposed class of estimators is more efficient than different known estimators, including the Gupta and Shabbir (2008) estimator.

  16. Correlated random sampling for multivariate normal and log-normal distributions

    International Nuclear Information System (INIS)

    Žerovnik, Gašper; Trkov, Andrej; Kodeli, Ivan A.

    2012-01-01

    A method for correlated random sampling is presented. Representative samples for multivariate normal or log-normal distributions can be produced. Furthermore, any combination of normally and log-normally distributed correlated variables may be sampled to any requested accuracy. Possible applications of the method include the sampling of resonance parameters, which are used for reactor calculations.
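
    The standard building block for such correlated draws is a Cholesky factorization of the covariance matrix, with log-normal marginals obtained by exponentiating afterwards. The sketch below illustrates this general approach; it is not the paper's specific algorithm.

    ```python
    import numpy as np

    def correlated_lognormal(mean, cov, size, rng=None):
        """Correlated samples via Cholesky: 'mean' and 'cov' describe the
        underlying normal; drop np.exp() for normal marginals."""
        rng = np.random.default_rng(rng)
        mean = np.asarray(mean, dtype=float)
        L = np.linalg.cholesky(cov)
        z = rng.standard_normal((size, mean.size))
        return np.exp(mean + z @ L.T)

    cov = [[0.04, 0.01], [0.01, 0.09]]
    print(correlated_lognormal([0.0, 1.0], cov, size=3, rng=1))
    ```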

  17. Determination of metals in air samples using X-Ray fluorescence associated the APDC preconcentration technique

    Energy Technology Data Exchange (ETDEWEB)

    Nardes, Raysa C.; Santos, Ramon S.; Sanches, Francis A.C.R.A.; Gama Filho, Hamilton S.; Oliveira, Davi F.; Anjos, Marcelino J., E-mail: rc.nardes@gmail.com, E-mail: ramonziosp@yahoo.com.br, E-mail: francissanches@gmail.com, E-mail: hamiltongamafilho@hotmail.com, E-mail: davi.oliveira@uerj.br, E-mail: marcelin@uerj.br [Universidade do Estado do Rio de Janeiro (UERJ), Rio de Janeiro, RJ (Brazil). Instituto de Fisica. Departamento de Fisica Aplicada e Termodinamica

    2015-07-01

    Air pollution has become one of the leading factors degrading the quality of life in large urban centers. Studies indicate that suspended particulate matter in the atmosphere is directly associated with risks to public health; in addition, it can cause damage to fauna, flora and public and cultural heritage. Inhalable particulate materials can cause the emergence and/or worsening of chronic diseases related to the respiratory system, as well as other conditions such as reduced physical strength. In this study, we propose a new method to measure the concentration of total suspended particulate matter (TSP) in the air, using an impinger as an air cleaning apparatus, preconcentration with APDC, and the Total Reflection X-ray Fluorescence (TXRF) technique to analyze the heavy metals present in the air. The samples were collected from five random points in the city of Rio de Janeiro/Brazil. The TXRF analyses were performed at the Brazilian Synchrotron Light Laboratory (LNLS). The technique proved viable because it was able to detect five metallic elements important for environmental studies: Cr, Fe, Ni, Cu and Zn. The technique showed substantial efficiency in determining the elemental concentrations of air pollutants, in addition to low cost. It can be concluded that the analysis of metals in air samples, using an impinger as the sample collection instrument together with a complexing agent (APDC), is viable because it is a low-cost technique; moreover, it allowed the detection of five metal elements important in environmental studies associated with industrial emissions and urban traffic. (author)

  18. Determination of metals in air samples using X-Ray fluorescence associated the APDC preconcentration technique

    International Nuclear Information System (INIS)

    Nardes, Raysa C.; Santos, Ramon S.; Sanches, Francis A.C.R.A.; Gama Filho, Hamilton S.; Oliveira, Davi F.; Anjos, Marcelino J.

    2015-01-01

    Air pollution has become one of the leading factors degrading the quality of life in large urban centers. Studies indicate that suspended particulate matter in the atmosphere is directly associated with risks to public health; in addition, it can cause damage to fauna, flora and public and cultural heritage. Inhalable particulate materials can cause the emergence and/or worsening of chronic diseases related to the respiratory system, as well as other conditions such as reduced physical strength. In this study, we propose a new method to measure the concentration of total suspended particulate matter (TSP) in the air, using an impinger as an air cleaning apparatus, preconcentration with APDC, and the Total Reflection X-ray Fluorescence (TXRF) technique to analyze the heavy metals present in the air. The samples were collected from five random points in the city of Rio de Janeiro/Brazil. The TXRF analyses were performed at the Brazilian Synchrotron Light Laboratory (LNLS). The technique proved viable because it was able to detect five metallic elements important for environmental studies: Cr, Fe, Ni, Cu and Zn. The technique showed substantial efficiency in determining the elemental concentrations of air pollutants, in addition to low cost. It can be concluded that the analysis of metals in air samples, using an impinger as the sample collection instrument together with a complexing agent (APDC), is viable because it is a low-cost technique; moreover, it allowed the detection of five metal elements important in environmental studies associated with industrial emissions and urban traffic. (author)

  19. A comparative study of sampling techniques for monitoring carcass contamination

    NARCIS (Netherlands)

    Snijders, J.M.A.; Janssen, M.H.W.; Gerats, G.E.; Corstiaensen, G.P.

    1984-01-01

    Four bacteriological sampling techniques, i.e., the excision, double-swab, agar contact and modified agar contact techniques, were compared by sampling pig carcasses before and after chilling. As well as assessing the advantages and disadvantages of the techniques, particular attention was paid to

  20. Multiuser Random Coding Techniques for Mismatched Decoding

    OpenAIRE

    Scarlett, Jonathan; Martinez, Alfonso; Guillén i Fàbregas, Albert

    2016-01-01

    This paper studies multiuser random coding techniques for channel coding with a given (possibly suboptimal) decoding rule. For the mismatched discrete memoryless multiple-access channel, an error exponent is obtained that is tight with respect to the ensemble average, and positive within the interior of Lapidoth's achievable rate region. This exponent proves the ensemble tightness of the exponent of Liu and Hughes in the case of maximum-likelihood decoding. An equivalent dual form of Lapidoth...

  1. Sample preparation for special PIE-techniques at ITU

    International Nuclear Information System (INIS)

    Toscano, E.H.; Manzel, R.

    2002-01-01

    Several sample preparation techniques were developed and installed in hot cells. The techniques were conceived to evaluate the performance of highly burnt fuel rods and include: (a) a device for the removal of the fuel, (b) a method for preparing the specimen ends for the welding of new end caps and for the careful cleaning of samples for Transmission Electron Microscopy and Glow Discharge Mass Spectroscopy, (c) a sample pressurisation device for long-term creep tests, and (d) a diameter-measuring device for creep or burst samples. Examples are presented of the determination of mechanical properties, of the behaviour under transient conditions, and of the assessment of the corrosion behaviour of high-burnup cladding materials. (author)

  2. New trends in sample preparation techniques for environmental analysis.

    Science.gov (United States)

    Ribeiro, Cláudia; Ribeiro, Ana Rita; Maia, Alexandra S; Gonçalves, Virgínia M F; Tiritan, Maria Elizabeth

    2014-01-01

    Environmental samples include a wide variety of complex matrices, with low concentrations of analytes and the presence of several interferences. Sample preparation is a critical step and the main source of uncertainty in the analysis of environmental samples, and it is usually laborious, high-cost, time-consuming, and polluting. In this context, there is increasing interest in developing faster, cost-effective, and environmentally friendly sample preparation techniques. Recently, new methods have been developed and optimized in order to miniaturize extraction steps, to reduce solvent consumption or become solventless, and to automate systems. This review attempts to present an overview of the fundamentals, procedures, and applications of the most recently developed sample preparation techniques for the extraction, cleanup, and concentration of organic pollutants from environmental samples. These techniques include: solid phase microextraction, on-line solid phase extraction, microextraction by packed sorbent, dispersive liquid-liquid microextraction, and QuEChERS (Quick, Easy, Cheap, Effective, Rugged and Safe).

  3. Multiple sensitive estimation and optimal sample size allocation in the item sum technique.

    Science.gov (United States)

    Perri, Pier Francesco; Rueda García, María Del Mar; Cobo Rodríguez, Beatriz

    2018-01-01

    For surveys of sensitive issues in the life sciences, statistical procedures can be used to reduce nonresponse and social desirability response bias. Both of these phenomena provoke nonsampling errors that are difficult to deal with and can seriously flaw the validity of the analyses. The item sum technique (IST) is a very recent indirect questioning method, derived from the item count technique, that seeks to procure more reliable responses on quantitative items than direct questioning while preserving respondents' anonymity. This article addresses two important questions concerning the IST: (i) its implementation when two or more sensitive variables are investigated and efficient estimates of their unknown population means are required; (ii) the determination of the optimal sample size to achieve minimum variance estimates. These aspects are of great relevance for survey practitioners engaged in sensitive research and, to the best of our knowledge, had not been studied so far. In this article, theoretical results for multiple estimation and optimal allocation are obtained under a generic sampling design and then particularized to simple random sampling and stratified sampling designs. The theoretical considerations are integrated with a number of simulation studies based on data from two real surveys, conducted to ascertain the efficiency gain derived from optimal allocation in different situations. One of the surveys concerns cannabis consumption among university students. Our findings highlight some methodological advances that can be obtained in life sciences IST surveys when optimal allocation is achieved. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
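
    The stratified-design case reduces, in its classical form, to Neyman-type allocation proportional to stratum size times stratum standard deviation; the sketch below shows that baseline formula (the paper's IST-specific optimum builds on quantities not reproduced in the abstract).

    ```python
    def neyman_allocation(stratum_sizes, stratum_sds, total_n):
        """Allocate total_n sample units across strata in proportion to
        N_h * S_h (classical minimum-variance allocation); rounding may
        shift the total by one or two units."""
        weights = [N * S for N, S in zip(stratum_sizes, stratum_sds)]
        total = sum(weights)
        return [round(total_n * w / total) for w in weights]

    print(neyman_allocation([5000, 2000, 1000], [12.0, 30.0, 8.0], 400))
    ```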

  4. Random sampling of quantum states: a survey of methods and some issues regarding the Overparametrized Method

    International Nuclear Information System (INIS)

    Maziero, Jonas

    2015-01-01

    The numerical generation of random quantum states (RQS) is an important procedure for investigations in quantum information science. Here, we review some methods that may be used for performing that task. We start by presenting a simple procedure for generating random state vectors, for which the main tool is the random sampling of unbiased discrete probability distributions (DPD). Afterwards, the creation of random density matrices is addressed. In this context, we first present the standard method, which consists in using the spectral decomposition of a quantum state for getting RQS from random DPDs and random unitary matrices. Next, the Bloch vector parametrization method is described. This approach, despite being useful in several instances, is not in general convenient for RQS generation. In the last part of the article, we consider the overparametrized method (OPM) and the related Ginibre and Bures techniques. The OPM can be used to create random positive semidefinite matrices with unit trace from randomly produced general complex matrices, in a simple way that is friendly for numerical implementations. We consider a physically relevant issue related to the possible domains that may be used for the real and imaginary parts of the elements of such general complex matrices. Subsequently, we note an overly fast concentration of measure in the quantum state space that appears in this parametrization. (author)
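
    Of the approaches surveyed, the Ginibre construction is the most compact to state: an unconstrained complex Gaussian matrix G yields a valid density matrix as GG†/tr(GG†). A minimal sketch follows.

    ```python
    import numpy as np

    def ginibre_density_matrix(d, rng=None):
        """Random d-dimensional density matrix: rho = G G† / tr(G G†)
        is positive semidefinite with unit trace by construction."""
        rng = np.random.default_rng(rng)
        G = rng.standard_normal((d, d)) + 1j * rng.standard_normal((d, d))
        rho = G @ G.conj().T
        return rho / np.trace(rho)

    rho = ginibre_density_matrix(4, rng=7)
    print(np.trace(rho).real, np.linalg.eigvalsh(rho).min() >= 0)
    ```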

  5. Random sampling of evolution time space and Fourier transform processing

    International Nuclear Information System (INIS)

    Kazimierczuk, Krzysztof; Zawadzka, Anna; Kozminski, Wiktor; Zhukov, Igor

    2006-01-01

    Application of the Fourier transform for processing 3D NMR spectra with random sampling of the evolution time space is presented. The 2D FT is calculated for pairs of frequencies, instead of the conventional sequence of one-dimensional transforms. Signal-to-noise ratios and linewidths for different random distributions were investigated by simulations and experiments. The experimental examples include 3D HNCA, HNCACB and 15N-edited NOESY-HSQC spectra of a 13C/15N-labeled ubiquitin sample. The obtained results revealed the general applicability of the proposed method and a significant improvement of resolution in comparison with conventional spectra recorded in the same time

  6. Non-terminal blood sampling techniques in Guinea pigs

    DEFF Research Database (Denmark)

    Birck, Malene Muusfeldt; Tveden-Nyborg, Pernille; Lindblad, Maiken Marie

    2014-01-01

    Guinea pigs possess several biological similarities to humans and are validated experimental animal models(1-3). However, the use of guinea pigs currently represents a relatively narrow area of research and descriptive data on specific methodology is correspondingly scarce. The anatomical features...... of guinea pigs are slightly different from other rodent models, hence modulation of sampling techniques to accommodate for species-specific differences, e.g., compared to mice and rats, are necessary to obtain sufficient and high quality samples. As both long and short term in vivo studies often require...... repeated blood sampling the choice of technique should be well considered in order to reduce stress and discomfort in the animals but also to ensure survival as well as compliance with requirements of sample size and accessibility. Venous blood samples can be obtained at a number of sites in guinea pigs e...

  7. A random sampling procedure for anisotropic distributions

    International Nuclear Information System (INIS)

    Nagrajan, P.S.; Sethulakshmi, P.; Raghavendran, C.P.; Bhatia, D.P.

    1975-01-01

    A procedure is described for sampling the scattering angle of neutrons as per specified angular distribution data. The cosine of the scattering angle is written as a double Legendre expansion in the incident neutron energy and a random number. The coefficients of the expansion are given for C, N, O, Si, Ca, Fe and Pb and these elements are of interest in dosimetry and shielding. (author)
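
    Given a precomputed coefficient table (an assumption here; the paper tabulates coefficients for C, N, O, Si, Ca, Fe and Pb), sampling the cosine amounts to evaluating the double Legendre expansion at the scaled energy and a fresh random number, as in this sketch.

    ```python
    import numpy as np
    from numpy.polynomial.legendre import legval2d

    def sample_mu(energy_scaled, coeffs, rng=None):
        """Sample the scattering-angle cosine from a double Legendre
        expansion; both arguments are assumed mapped to [-1, 1]."""
        rng = np.random.default_rng(rng)
        xi = 2.0 * rng.random() - 1.0          # random number on [-1, 1]
        mu = legval2d(energy_scaled, xi, coeffs)
        return float(np.clip(mu, -1.0, 1.0))

    coeffs = np.array([[0.0, 0.9], [0.3, 0.0]])   # illustrative table
    print(sample_mu(0.2, coeffs, rng=3))
    ```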

  8. Gray bootstrap method for estimating frequency-varying random vibration signals with small samples

    Directory of Open Access Journals (Sweden)

    Wang Yanqing

    2014-04-01

    Full Text Available During environmental testing, the estimation of random vibration signals (RVS) is an important technique for airborne platform safety and reliability. However, the available methods, including the extreme value envelope method (EVEM), the statistical tolerances method (STM) and the improved statistical tolerance method (ISTM), require large samples and a typical probability distribution. Moreover, the frequency-varying characteristic of RVS is usually not taken into account. The gray bootstrap method (GBM) is proposed to solve the problem of estimating frequency-varying RVS with small samples. Firstly, the estimation indexes are obtained, including the estimated interval, the estimated uncertainty, the estimated value, the estimated error and the estimated reliability. In addition, GBM is applied to estimating the single flight testing of a certain aircraft. Finally, in order to evaluate the estimation performance, GBM is compared with the bootstrap method (BM) and the gray method (GM) in testing analysis. The result shows that GBM has superiority for estimating dynamic signals with small samples, and the estimated reliability is proved to be 100% at the given confidence level.
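
    For reference, the plain (non-gray) bootstrap half of the method can be sketched in a few lines; the gray bootstrap additionally feeds grey-model predictions into this resampling loop, which is beyond the scope of this illustration.

    ```python
    import numpy as np

    def bootstrap_interval(samples, n_boot=2000, conf=0.95, rng=None):
        """Percentile bootstrap interval for the mean of a small sample."""
        rng = np.random.default_rng(rng)
        samples = np.asarray(samples, dtype=float)
        means = [rng.choice(samples, size=samples.size, replace=True).mean()
                 for _ in range(n_boot)]
        a = (1.0 - conf) / 2.0
        return tuple(np.percentile(means, [100 * a, 100 * (1 - a)]))

    print(bootstrap_interval([2.1, 2.4, 1.9, 2.6, 2.2], rng=0))
    ```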

  9. Statistical sampling techniques as applied to OSE inspections

    International Nuclear Information System (INIS)

    Davis, J.J.; Cote, R.W.

    1987-01-01

    The need has been recognized for statistically valid methods for gathering information during OSE inspections and for the interpretation of results, both from performance testing and from records reviews, interviews, etc. Battelle Columbus Division, under contract to DOE OSE, has performed and is continuing to perform work in the area of statistical methodology for OSE inspections. This paper presents some of the sampling methodology currently being developed for use during OSE inspections. Topics include population definition, sample size requirements, level of confidence, and practical logistical constraints associated with the conduct of an inspection based on random sampling. Sequential sampling schemes and sampling from finite populations are also discussed. The methods described are applicable to various data-gathering activities, ranging from the sampling and examination of classified documents to the sampling of Protective Force security inspectors for skill testing.

  10. Random sampling of elementary flux modes in large-scale metabolic networks.

    Science.gov (United States)

    Machado, Daniel; Soons, Zita; Patil, Kiran Raosaheb; Ferreira, Eugénio C; Rocha, Isabel

    2012-09-15

    The description of a metabolic network in terms of elementary (flux) modes (EMs) provides an important framework for metabolic pathway analysis. However, their application to large networks has been hampered by the combinatorial explosion in the number of modes. In this work, we develop a method for generating random samples of EMs without computing the whole set. Our algorithm is an adaptation of the canonical basis approach, where we add an additional filtering step which, at each iteration, selects a random subset of the new combinations of modes. In order to obtain an unbiased sample, all candidates are assigned the same probability of getting selected. This approach avoids the exponential growth of the number of modes during computation, thus generating a random sample of the complete set of EMs within reasonable time. We generated samples of different sizes for a metabolic network of Escherichia coli, and observed that they preserve several properties of the full EM set. It is also shown that EM sampling can be used for rational strain design. A well distributed sample, that is representative of the complete set of EMs, should be suitable to most EM-based methods for analysis and optimization of metabolic networks. Source code for a cross-platform implementation in Python is freely available at http://code.google.com/p/emsampler. dmachado@deb.uminho.pt Supplementary data are available at Bioinformatics online.
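
    The filtering step described here, keeping a fixed-size random subset in which every candidate has equal selection probability, is exactly what single-pass reservoir sampling provides; the sketch below shows that generic building block rather than the authors' full algorithm.

    ```python
    import random

    def reservoir_sample(stream, k, rng=random):
        """One pass over an arbitrarily long stream; each item ends up
        in the final sample with probability k / len(stream)."""
        sample = []
        for i, item in enumerate(stream):
            if i < k:
                sample.append(item)
            else:
                j = rng.randrange(i + 1)
                if j < k:
                    sample[j] = item
        return sample

    print(reservoir_sample(range(10**6), 5))
    ```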

  11. Application of digital sampling techniques to particle identification

    International Nuclear Information System (INIS)

    Bardelli, L.; Poggi, G.; Bini, M.; Carraresi, L.; Pasquali, G.; Taccetti, N.

    2003-01-01

    An application of digital sampling techniques is presented which can greatly simplify experiments involving sub-nanosecond time-mark determinations and energy measurements with nuclear detectors, as used for pulse shape analysis and time-of-flight measurements in heavy-ion experiments. In this work a 100 MSample/s, 12-bit analog-to-digital converter has been used: examples of this technique applied to silicon and CsI(Tl) detectors in heavy-ion experiments involving particle identification via pulse shape analysis and time-of-flight measurements are presented. The system is suited for application to large detector arrays and to different kinds of detectors. Some preliminary results regarding the simulation of current signals in silicon detectors are also discussed. (authors)

  12. A modified random decrement technique for modal identification from nonstationary ambient response data only

    International Nuclear Information System (INIS)

    Lin, Chang Sheng; Chiang, Dar Yun

    2012-01-01

    Modal identification is considered from the response data of a structural system under nonstationary ambient vibration. In a previous paper, we showed that, by assuming the ambient excitation to be nonstationary white noise in the form of a product model, the nonstationary response signals can be converted into free-vibration data via the correlation technique. In the present paper, if the ambient excitation can be modeled as nonstationary white noise in the form of a product model, then the nonstationary cross random decrement signatures of the structural response evaluated at any fixed time instant are shown theoretically to be proportional to the nonstationary cross-correlation functions. The practical problem of insufficient data samples available for evaluating nonstationary random decrement signatures can be approximately resolved by first extracting the amplitude-modulating function from the response and then transforming the nonstationary responses into stationary ones. Modal-parameter identification can then be performed using the Ibrahim time-domain technique, which is effective at identifying closely spaced modes. The theory proposed can be further extended by using the filtering concept to cover the case of nonstationary color excitations. Numerical simulations confirm the validity of the proposed method for the identification of modal parameters from nonstationary ambient response data

  13. Determination of Initial Conditions for the Safety Analysis by Random Sampling of Operating Parameters

    International Nuclear Information System (INIS)

    Jeong, Hae-Yong; Park, Moon-Ghu

    2015-01-01

    In most existing evaluation methodologies, which follow a conservative approach, the most conservative initial conditions are searched for each transient scenario through extensive assessment over wide operating windows or the limiting conditions for operation (LCO) allowed by the operating guidelines. In this procedure, a user effect could be involved, and considerable time and human resources are consumed. In the present study, we investigated a more effective statistical method for selecting the most conservative initial condition through random sampling of the operating parameters affecting the initial conditions. A method for the determination of initial conditions based on random sampling of plant design parameters is proposed. This method is expected to be applied to the selection of the most conservative initial plant conditions in safety analyses using a conservative evaluation methodology. In the method, it is suggested that the initial conditions of reactor coolant flow rate, pressurizer level, pressurizer pressure, and SG level be adjusted by controlling the pump rated flow and the setpoints of the PLCS, PPCS, and FWCS, respectively. The proposed technique is expected to contribute to eliminating the human factors introduced in the conventional safety analysis procedure and to reducing the human resources invested in the safety evaluation of nuclear power plants.
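
    In its simplest form, the random sampling of initial conditions is a uniform draw of each operating parameter within its allowed window; the sketch below illustrates this (the parameter names and bounds are placeholders, not plant data).

    ```python
    import numpy as np

    def sample_initial_conditions(bounds, n, rng=None):
        """Each row is one candidate initial condition, drawn uniformly
        within the per-parameter operating windows in 'bounds'."""
        rng = np.random.default_rng(rng)
        lo = np.array([b[0] for b in bounds], dtype=float)
        hi = np.array([b[1] for b in bounds], dtype=float)
        return lo + (hi - lo) * rng.random((n, len(bounds)))

    bounds = [(95.0, 105.0),   # coolant flow, % of rated (illustrative)
              (50.0, 60.0)]    # pressurizer level, % (illustrative)
    print(sample_initial_conditions(bounds, 3, rng=1))
    ```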

  14. Key-space analysis of double random phase encryption technique

    Science.gov (United States)

    Monaghan, David S.; Gopinathan, Unnikrishnan; Naughton, Thomas J.; Sheridan, John T.

    2007-09-01

    We perform a numerical analysis on the double random phase encryption/decryption technique. The key-space of an encryption technique is the set of possible keys that can be used to encode data using that technique. In the case of a strong encryption scheme, many keys must be tried in any brute-force attack on that technique. Traditionally, designers of optical image encryption systems demonstrate only how a small number of arbitrary keys cannot decrypt a chosen encrypted image in their system. However, this type of demonstration does not discuss the properties of the key-space nor refute the feasibility of an efficient brute-force attack. To clarify these issues we present a key-space analysis of the technique. For a range of problem instances we plot the distribution of decryption errors in the key-space indicating the lack of feasibility of a simple brute-force attack.

  15. Fast egg collection method greatly improves randomness of egg sampling in Drosophila melanogaster

    DEFF Research Database (Denmark)

    Schou, Mads Fristrup

    2013-01-01

    When obtaining samples for population genetic studies, it is essential that the sampling is random. For Drosophila, one of the crucial steps in sampling experimental flies is the collection of eggs. Here an egg collection method is presented, which randomizes the eggs in a water column...... and diminishes environmental variance. This method was compared with a traditional egg collection method where eggs are collected directly from the medium. Within each method the observed and expected standard deviations of egg-to-adult viability were compared, whereby the difference in the randomness...... and to obtain a representative collection of genotypes, the method presented here is strongly recommended when collecting eggs from Drosophila....

  16. The concentration of heavy metals: zinc, cadmium, lead, copper, mercury, iron and calcium in head hair of a randomly selected sample of Kenyan people

    International Nuclear Information System (INIS)

    Wandiga, S.O.; Jumba, I.O.

    1982-01-01

    An intercomparative analysis of the concentrations of the heavy metals zinc, cadmium, lead, copper, mercury, iron and calcium in head hair of a randomly selected sample of Kenyan people, using the techniques of atomic absorption spectrophotometry (AAS) and differential pulse anodic stripping voltammetry (DPAS), has been undertaken. The percent relative standard deviation for each sample analysed using either of the techniques shows good sensitivity and correlation between the techniques. The DPAS was found to be slightly more sensitive than the AAS instrument used. The recalculated body burden ratios of Cd to Zn and Pb to Fe reveal no unusual health impairment symptoms and suggest a relatively clean environment in Kenya. (author)

  17. CONSISTENCY UNDER SAMPLING OF EXPONENTIAL RANDOM GRAPH MODELS.

    Science.gov (United States)

    Shalizi, Cosma Rohilla; Rinaldo, Alessandro

    2013-04-01

    The growing availability of network data and of scientific interest in distributed systems has led to the rapid development of statistical models of network structure. Typically, however, these are models for the entire network, while the data consists only of a sampled sub-network. Parameters for the whole network, which is what is of interest, are estimated by applying the model to the sub-network. This assumes that the model is consistent under sampling, or, in terms of the theory of stochastic processes, that it defines a projective family. Focusing on the popular class of exponential random graph models (ERGMs), we show that this apparently trivial condition is in fact violated by many popular and scientifically appealing models, and that satisfying it drastically limits ERGMs' expressive power. These results are actually special cases of more general results about exponential families of dependent random variables, which we also prove. Using such results, we offer easily checked conditions for the consistency of maximum likelihood estimation in ERGMs, and discuss some possible constructive responses.

  18. Sampling techniques for thrips (Thysanoptera: Thripidae) in preflowering tomato.

    Science.gov (United States)

    Joost, P Houston; Riley, David G

    2004-08-01

    Sampling techniques for thrips (Thysanoptera: Thripidae) were compared in preflowering tomato plants at the Coastal Plain Experiment Station in Tifton, GA, in 2000 and 2003, to identify the most effective method for determining the abundance of thrips on tomato foliage early in the growing season. Three relative sampling techniques, including a standard insect aspirator, a 946-ml beat cup, and an insect vacuum device, were compared for accuracy to an absolute method and to themselves for precision and efficiency of sampling thrips. Thrips counts of all relative sampling methods were highly correlated (R > 0.92) to the absolute method. The aspirator method was the most accurate compared with the absolute sample according to regression analysis in 2000. In 2003, all sampling methods were considered accurate according to Dunnett's test, but thrips numbers were lower and sample variation was greater than in 2000. In 2000, the beat cup method had the lowest relative variation (RV) or best precision, at 1 and 8 d after transplant (DAT). Only the beat cup method had RV values <25 for all sampling dates. In 2003, the beat cup method had the lowest RV value at 15 and 21 DAT. The beat cup method also was the most efficient method for all sample dates in both years. Frankliniella fusca (Pergande) was the most abundant thrips species on the foliage of preflowering tomato in both years of study at this location. Overall, the best thrips sampling technique tested was the beat cup method in terms of precision and sampling efficiency.
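
    The precision criterion used in this study, relative variation, has a simple form. Assuming the conventional definition RV = 100 × (standard error / mean), with the RV < 25 benchmark the abstract mentions, a small sketch with hypothetical counts:

        import numpy as np

        def relative_variation(counts):
            """RV = 100 * SE / mean; values below 25 are commonly taken to
            indicate acceptable sampling precision."""
            counts = np.asarray(counts, dtype=float)
            se = counts.std(ddof=1) / np.sqrt(counts.size)
            return 100.0 * se / counts.mean()

        beat_cup_counts = [4, 6, 3, 5, 7, 4, 5, 6]  # hypothetical thrips counts
        print(relative_variation(beat_cup_counts))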

  19. Characterization of electron microscopes with binary pseudo-random multilayer test samples

    Science.gov (United States)

    Yashchuk, Valeriy V.; Conley, Raymond; Anderson, Erik H.; Barber, Samuel K.; Bouet, Nathalie; McKinney, Wayne R.; Takacs, Peter Z.; Voronov, Dmitriy L.

    2011-09-01

    Verification of the reliability of metrology data from high quality X-ray optics requires that adequate methods for test and calibration of the instruments be developed. For such verification for optical surface profilometers in the spatial frequency domain, a modulation transfer function (MTF) calibration method based on binary pseudo-random (BPR) gratings and arrays has been suggested [1,2] and proven to be an effective calibration method for a number of interferometric microscopes, a phase shifting Fizeau interferometer, and a scatterometer [5]. Here we describe the details of development of binary pseudo-random multilayer (BPRML) test samples suitable for characterization of scanning (SEM) and transmission (TEM) electron microscopes. We discuss the results of TEM measurements with the BPRML test samples fabricated from a WiSi2/Si multilayer coating with pseudo-randomly distributed layers. In particular, we demonstrate that significant information about the metrological reliability of the TEM measurements can be extracted even when the fundamental frequency of the BPRML sample is smaller than the Nyquist frequency of the measurements. The measurements demonstrate a number of problems related to the interpretation of the SEM and TEM data. Note that similar BPRML test samples can be used to characterize X-ray microscopes. Corresponding work with X-ray microscopes is in progress.

  20. Characterization of electron microscopes with binary pseudo-random multilayer test samples

    International Nuclear Information System (INIS)

    Yashchuk, Valeriy V.; Conley, Raymond; Anderson, Erik H.; Barber, Samuel K.; Bouet, Nathalie; McKinney, Wayne R.; Takacs, Peter Z.; Voronov, Dmitriy L.

    2011-01-01

    Verification of the reliability of metrology data from high quality X-ray optics requires that adequate methods for test and calibration of the instruments be developed. For such verification for optical surface profilometers in the spatial frequency domain, a modulation transfer function (MTF) calibration method based on binary pseudo-random (BPR) gratings and arrays has been suggested and proven to be an effective calibration method for a number of interferometric microscopes, a phase shifting Fizeau interferometer, and a scatterometer [5]. Here we describe the details of development of binary pseudo-random multilayer (BPRML) test samples suitable for characterization of scanning (SEM) and transmission (TEM) electron microscopes. We discuss the results of TEM measurements with the BPRML test samples fabricated from a WiSi2/Si multilayer coating with pseudo-randomly distributed layers. In particular, we demonstrate that significant information about the metrological reliability of the TEM measurements can be extracted even when the fundamental frequency of the BPRML sample is smaller than the Nyquist frequency of the measurements. The measurements demonstrate a number of problems related to the interpretation of the SEM and TEM data. Note that similar BPRML test samples can be used to characterize X-ray microscopes. Corresponding work with X-ray microscopes is in progress.

  1. LOD score exclusion analyses for candidate genes using random population samples.

    Science.gov (United States)

    Deng, H W; Li, J; Recker, R R

    2001-05-01

    While extensive analyses have been conducted to test for, no formal analyses have been conducted to test against, the importance of candidate genes with random population samples. We develop a LOD score approach for exclusion analyses of candidate genes with random population samples. Under this approach, specific genetic effects and inheritance models at candidate genes can be analysed and if a LOD score is ≤ -2.0, the locus can be excluded from having an effect larger than that specified. Computer simulations show that, with sample sizes often employed in association studies, this approach has high power to exclude a gene from having moderate genetic effects. In contrast to regular association analyses, population admixture will not affect the robustness of our analyses; in fact, it renders our analyses more conservative and thus any significant exclusion result is robust. Our exclusion analysis complements association analysis for candidate genes in random population samples and is parallel to the exclusion mapping analyses that may be conducted in linkage analyses with pedigrees or relative pairs. The usefulness of the approach is demonstrated by an application to test the importance of vitamin D receptor and estrogen receptor genes underlying the differential risk to osteoporotic fractures.
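
    The exclusion rule in this abstract follows the usual LOD convention, which can be stated compactly (this is the generic form, not a quotation from the paper):

        \mathrm{LOD} = \log_{10}
          \frac{L(\text{specified effect at the candidate locus})}
               {L(\text{no effect})},
        \qquad \mathrm{LOD} \le -2.0 \;\Rightarrow\; \text{exclusion}

    that is, the locus is excluded from having an effect at least as large as the one specified whenever the LOD score falls at or below -2.0.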

  2. Generating Random Samples of a Given Size Using Social Security Numbers.

    Science.gov (United States)

    Erickson, Richard C.; Brauchle, Paul E.

    1984-01-01

    The purposes of this article are (1) to present a method by which social security numbers may be used to draw cluster samples of a predetermined size and (2) to describe procedures used to validate this method of drawing random samples. (JOW)

  3. Effect of joint mobilization techniques for primary total knee arthroplasty: Study protocol for a randomized controlled trial.

    Science.gov (United States)

    Xu, Jiao; Zhang, Juan; Wang, Xue-Qiang; Wang, Xuan-Lin; Wu, Ya; Chen, Chan-Cheng; Zhang, Han-Yu; Zhang, Zhi-Wan; Fan, Kai-Yi; Zhu, Qiang; Deng, Zhi-Wei

    2017-12-01

    Total knee arthroplasty (TKA) has become the procedure most preferred by patients for the relief of pain caused by knee osteoarthritis. TKA patients aim for a speedy recovery after the surgery. Joint mobilization techniques for rehabilitation have been widely used to relieve pain and improve joint mobility. However, relevant randomized controlled trials showing the curative effect of these techniques remain lacking to date. Accordingly, this study aims to investigate whether joint mobilization techniques are valid for primary TKA. We will conduct a single-blind, prospective, randomized, controlled trial of 120 patients with unilateral TKA. Patients will be randomized into an intervention group, a physical modality therapy group, and a usual care group. The intervention group will undergo joint mobilization manipulation treatment once a day and regular training twice a day for a month. The physical modality therapy group will undergo physical therapy once a day and regular training twice a day for a month. The usual care group will perform regular training twice a day for a month. Primary outcome measures will be based on the visual analog scale, the knee joint Hospital for Special Surgery score, range of motion, surrounded degree, and adverse effects. Secondary indicators will include manual muscle testing, the 36-Item Short Form Health Survey, Berg Balance Scale function evaluation, the Pittsburgh Sleep Quality Index, proprioception, and muscle morphology. We will conduct intention-to-treat analysis if a subject withdraws from the trial. The important features of this trial for joint mobilization techniques in primary TKA are its randomization procedures, single blinding, large sample size, and standardized protocol. This study aims to investigate whether joint mobilization techniques are effective for early TKA patients. The result of this study may serve as a guide for TKA patients, medical personnel, and healthcare decision makers. It has been registered at http

  4. Use of nuclear technique in samples for agricultural purposes

    Energy Technology Data Exchange (ETDEWEB)

    Oliveira, Kerley A. P. de; Sperling, Eduardo Von, E-mail: kerley@ufmg.br, E-mail: kerleyfisica@yahoo.com.br [Department of Sanitary and Environmental Engineering Federal University of Minas Gerais, Belo Horizonte (Brazil); Menezes, Maria Angela B. C.; Jacomino, Vanusa M.F. [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil)

    2013-01-15

    Concern about the environment is growing, and with it the need to determine chemical elements over a large range of concentrations. The neutron activation analysis (NAA) technique determines the elemental composition by measuring the artificial radioactivity of a sample that has been submitted to a neutron flux. NAA is a sensitive and accurate technique with low detection limits. An example of the application of NAA was the measurement of concentrations of rare earth elements (REE) in waste samples of phosphogypsum (PG) and cerrado soil samples (clayey and sandy soils). Additionally, a soil reference material of the International Atomic Energy Agency (IAEA) was also analyzed. The REE concentration in PG samples was two times higher than that found in national fertilizers (total of 4,000 mg kg{sup -1}), 154 times greater than the value found in the sandy soil (26 mg kg{sup -1}) and 14 times greater than that in the clayey soil (280 mg kg{sup -1}). The experimental results for the reference material were inside the uncertainty of the certified values, pointing out the accuracy of the method (95%). The determination of La, Ce, Pr, Nd, Pm, Sm, Eu, Tb, Dy, Ho, Er, Tm, Yb and Lu in the samples and reference material confirmed the versatility of the technique for REE determination in soil and phosphogypsum samples, which are matrices of agricultural interest. (author)

  5. Use of nuclear technique in samples for agricultural purposes

    International Nuclear Information System (INIS)

    Oliveira, Kerley A. P. de; Sperling, Eduardo Von; Menezes, Maria Angela B. C.; Jacomino, Vanusa M.F.

    2013-01-01

    Concern about the environment is growing, and with it the need to determine chemical elements over a large range of concentrations. The neutron activation analysis (NAA) technique determines the elemental composition by measuring the artificial radioactivity of a sample that has been submitted to a neutron flux. NAA is a sensitive and accurate technique with low detection limits. An example of the application of NAA was the measurement of concentrations of rare earth elements (REE) in waste samples of phosphogypsum (PG) and cerrado soil samples (clayey and sandy soils). Additionally, a soil reference material of the International Atomic Energy Agency (IAEA) was also analyzed. The REE concentration in PG samples was two times higher than that found in national fertilizers (total of 4,000 mg kg-1), 154 times greater than the value found in the sandy soil (26 mg kg-1) and 14 times greater than that in the clayey soil (280 mg kg-1). The experimental results for the reference material were inside the uncertainty of the certified values, pointing out the accuracy of the method (95%). The determination of La, Ce, Pr, Nd, Pm, Sm, Eu, Tb, Dy, Ho, Er, Tm, Yb and Lu in the samples and reference material confirmed the versatility of the technique for REE determination in soil and phosphogypsum samples, which are matrices of agricultural interest. (author)

  6. Development of analytical techniques for safeguards environmental samples at JAEA

    International Nuclear Information System (INIS)

    Sakurai, Satoshi; Magara, Masaaki; Usuda, Shigekazu; Watanabe, Kazuo; Esaka, Fumitaka; Hirayama, Fumio; Lee, Chi-Gyu; Yasuda, Kenichiro; Inagawa, Jun; Suzuki, Daisuke; Iguchi, Kazunari; Kokubu, Yoko S.; Miyamoto, Yutaka; Ohzu, Akira

    2007-01-01

    JAEA has been developing, under the auspices of the Ministry of Education, Culture, Sports, Science and Technology of Japan, analytical techniques for ultra-trace amounts of nuclear materials in environmental samples in order to contribute to the strengthened safeguards system. Essential techniques for bulk analysis, particle analysis and screening of environmental swipe samples have been established as ultra-trace analytical methods for uranium and plutonium. In January 2003, JAEA, including its quality control system, was qualified as a member of the IAEA network of analytical laboratories for environmental samples. Since 2004, JAEA has conducted the analysis of domestic and IAEA samples, through which JAEA's analytical capability has been verified and improved. In parallel, advanced techniques have been developed in order to expand the applicability to samples of various elemental compositions and impurities and to improve analytical accuracy and efficiency. This paper summarizes the course of the technical development in environmental sample analysis at JAEA and refers to recent trends of research and development in this field. (author)

  7. The optimal injection technique for the osteoarthritic ankle: A randomized, cross-over trial

    NARCIS (Netherlands)

    Witteveen, Angelique G. H.; Kok, Aimee; Sierevelt, Inger N.; Kerkhoffs, Gino M. M. J.; van Dijk, C. Niek

    2013-01-01

    Background: To optimize the injection technique for the osteoarthritic ankle in order to enhance the effect of intra-articular injections and minimize adverse events. Methods: Randomized cross-over trial. Comparing two injection techniques in patients with symptomatic ankle osteoarthritis. Patients

  8. Water sampling techniques for continuous monitoring of pesticides in water

    Directory of Open Access Journals (Sweden)

    Šunjka Dragana

    2017-01-01

    Full Text Available Good ecological and chemical status of water represents the most important aim of the Water Framework Directive 2000/60/EC, which implies respect of water quality standards at the level of entire river basin (2008/105/EC and 2013/39/EC. This especially refers to the control of pesticide residues in surface waters. In order to achieve the set goals, a continuous monitoring program that should provide a comprehensive and interrelated overview of water status should be implemented. However, it demands the use of appropriate analysis techniques. Until now, the procedure for sampling and quantification of residual pesticide quantities in aquatic environment was based on the use of traditional sampling techniques that imply periodical collecting of individual samples. However, this type of sampling provides only a snapshot of the situation in regard to the presence of pollutants in water. As an alternative, the technique of passive sampling of pollutants in water, including pesticides has been introduced. Different samplers are available for pesticide sampling in surface water, depending on compounds. The technique itself is based on keeping a device in water over a longer period of time which varies from several days to several weeks, depending on the kind of compound. In this manner, the average concentrations of pollutants dissolved in water during a time period (time-weighted average concentrations, TWA are obtained, which enables monitoring of trends in areal and seasonal variations. The use of these techniques also leads to an increase in sensitivity of analytical methods, considering that pre-concentration of analytes takes place within the sorption medium. However, the use of these techniques for determination of pesticide concentrations in real water environments requires calibration studies for the estimation of sampling rates (Rs. Rs is a volume of water per time, calculated as the product of overall mass transfer coefficient and area of
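
    The sampling-rate relation on which the abstract breaks off can be written out. In the usual passive-sampler notation (assumed here, not quoted from the article), the time-weighted average concentration over a deployment of duration t is

        C_{\mathrm{TWA}} = \frac{m_s}{R_s\,t}, \qquad R_s = k_o A

    where m_s is the analyte mass accumulated in the sorbent, R_s the sampling rate (volume of water per unit time), k_o the overall mass-transfer coefficient and A the sampler's exchange area.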

  9. Magnetic separation techniques in sample preparation for biological analysis: a review.

    Science.gov (United States)

    He, Jincan; Huang, Meiying; Wang, Dongmei; Zhang, Zhuomin; Li, Gongke

    2014-12-01

    Sample preparation is a fundamental and essential step in almost all the analytical procedures, especially for the analysis of complex samples like biological and environmental samples. In past decades, with advantages of superparamagnetic property, good biocompatibility and high binding capacity, functionalized magnetic materials have been widely applied in various processes of sample preparation for biological analysis. In this paper, the recent advancements of magnetic separation techniques based on magnetic materials in the field of sample preparation for biological analysis were reviewed. The strategy of magnetic separation techniques was summarized. The synthesis, stabilization and bio-functionalization of magnetic nanoparticles were reviewed in detail. Characterization of magnetic materials was also summarized. Moreover, the applications of magnetic separation techniques for the enrichment of protein, nucleic acid, cell, bioactive compound and immobilization of enzyme were described. Finally, the existed problems and possible trends of magnetic separation techniques for biological analysis in the future were proposed. Copyright © 2014 Elsevier B.V. All rights reserved.

  10. Smoothing the redshift distributions of random samples for the baryon acoustic oscillations: applications to the SDSS-III BOSS DR12 and QPM mock samples

    Science.gov (United States)

    Wang, Shao-Jiang; Guo, Qi; Cai, Rong-Gen

    2017-12-01

    We investigate the impact of different redshift distributions of random samples on the baryon acoustic oscillations (BAO) measurements of D_V(z)r_d^fid/r_d from the two-point correlation functions of galaxies in the Data Release 12 of the Baryon Oscillation Spectroscopic Survey (BOSS). Big surveys, such as BOSS, usually assign redshifts to the random samples by randomly drawing values from the measured redshift distributions of the data, which would necessarily introduce fiducial signals of fluctuations into the random samples, weakening the BAO signals, if the cosmic variance cannot be ignored. We propose a smooth function of redshift distribution that fits the data well to populate the random galaxy samples. The resulting cosmological parameters match the input parameters of the mock catalogue very well. The significance of the BAO signals has been improved by 0.33σ for a low-redshift sample and by 0.03σ for a constant-stellar-mass sample, though the absolute values do not change significantly. Given the precision of current measurements of cosmological parameters, such improvements will be valuable for future measurements of galaxy clustering.
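
    A minimal sketch of the proposed procedure: fit a smooth function to the measured redshift distribution and draw random-catalogue redshifts from it by inverse-CDF sampling, rather than resampling the data redshifts directly. The smooth model below (a low-order polynomial) is a placeholder; the paper's actual fitting function may differ.

        import numpy as np

        rng = np.random.default_rng(3)
        z_data = rng.normal(0.5, 0.08, 50_000)      # stand-in galaxy redshifts

        # Fit a smooth model to the binned redshift distribution
        hist, edges = np.histogram(z_data, bins=60, density=True)
        centers = 0.5 * (edges[1:] + edges[:-1])
        coeffs = np.polyfit(centers, hist, deg=6)
        nz_smooth = np.clip(np.polyval(coeffs, centers), 0, None)

        # Inverse-CDF sampling from the smooth n(z) for the random catalogue
        cdf = np.cumsum(nz_smooth)
        cdf /= cdf[-1]
        z_random = np.interp(rng.random(500_000), cdf, centers)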

  11. Solid Phase Microextraction and Related Techniques for Drugs in Biological Samples

    OpenAIRE

    Moein, Mohammad Mahdi; Said, Rana; Bassyouni, Fatma; Abdel-Rehim, Mohamed

    2014-01-01

    In drug discovery and development, the quantification of drugs in biological samples is an important task for the determination of the physiological performance of the investigated drugs. After sampling, the next step in the analytical process is sample preparation. Because of the low concentration levels of drug in plasma and the variety of the metabolites, the selected extraction technique should be virtually exhaustive. Recent developments of sample handling techniques are directed, from o...

  12. Piezoelectric Versus Conventional Rotary Techniques for Impacted Third Molar Extraction: A Meta-analysis of Randomized Controlled Trials.

    Science.gov (United States)

    Jiang, Qian; Qiu, Yating; Yang, Chi; Yang, Jingyun; Chen, Minjie; Zhang, Zhiyuan

    2015-10-01

    Impacted third molars are frequently encountered in clinical work. Surgical removal of impacted third molars is often required to prevent clinical symptoms. Traditional rotary cutting instruments are potentially injurious, and piezosurgery, as a new osteotomy technique, has been introduced in oral and maxillofacial surgery. No consistent conclusion has been reached regarding whether this new technique is associated with fewer or less severe postoperative sequelae after third molar extraction.The aim of this study was to compare piezosurgery with rotary osteotomy techniques, with regard to surgery time and the severity of postoperative sequelae, including pain, swelling, and trismus.We conducted a systematic literature search in the Cochrane Library, PubMed, Embase, and Google Scholar.The eligibility criteria of this study included the following: the patients were clearly diagnosed as having impacted mandibular third molars; the patients underwent piezosurgery osteotomy, and in the control group rotary osteotomy techniques, for removing impacted third molars; the outcomes of interest include surgery time, trismus, swelling or pain; the studies are randomized controlled trials.We used random-effects models to calculate the difference in the outcomes, and the corresponding 95% confidence interval. We calculated the weighted mean difference if the trials used the same measurement, and a standardized mean difference if otherwise.A total of seven studies met the eligibility criteria and were included in our analysis. Compared with rotary osteotomy, patients undergoing piezosurgery experienced longer surgery time (mean difference 4.13 minutes, 95% confidence interval 2.75-5.52, P piezosurgery groups.The number of included randomized controlled trials and the sample size of each trial were relatively small, double blinding was not possible, and cost analysis was unavailable due to a lack of data.Our meta-analysis indicates that although patients undergoing piezosurgery

  13. Manipulation of biological samples using micro and nano techniques.

    Science.gov (United States)

    Castillo, Jaime; Dimaki, Maria; Svendsen, Winnie Edith

    2009-01-01

    The constant interest in handling, integrating and understanding biological systems of interest for the biomedical field, the pharmaceutical industry and biomaterial researchers demands the use of techniques that allow the manipulation of biological samples causing minimal or no damage to their natural structure. Thanks to the advances in micro- and nanofabrication during the last decades, several manipulation techniques offer us the possibility to image, characterize and manipulate biological material in a controlled way. Using these techniques, the integration of biomaterials with remarkable properties with physical transducers has been possible, giving rise to new and highly sensitive biosensing devices. This article reviews the different techniques available to manipulate and integrate biological materials in a controlled manner, either by sliding them along a surface (2-D manipulation), by gripping them and moving them to a new position (3-D manipulation), or by manipulating and relocating them applying external forces. The advantages and drawbacks are mentioned together with examples that reflect the state of the art of manipulation techniques for biological samples (171 references).

  14. NAIL SAMPLING TECHNIQUE AND ITS INTERPRETATION

    OpenAIRE

    TZAR MN; LEELAVATHI M

    2011-01-01

    The clinical suspicion of onychomycosis, based on the appearance of the nails, requires culture for confirmation. This is because treatment requires prolonged use of systemic agents, which may cause side effects. One of the common problems encountered is improper nail sampling technique, which results in loss of essential information. The unfamiliar terminologies used in reporting culture results may intimidate physicians, resulting in misinterpretation, and hamper treatment decisions. This article prov...

  15. Assessment of fracture risk: value of random population-based samples--the Geelong Osteoporosis Study.

    Science.gov (United States)

    Henry, M J; Pasco, J A; Seeman, E; Nicholson, G C; Sanders, K M; Kotowicz, M A

    2001-01-01

    Fracture risk is determined by bone mineral density (BMD). The T-score, a measure of fracture risk, is the position of an individual's BMD in relation to a reference range. The aim of this study was to determine the magnitude of change in the T-score when different sampling techniques were used to produce the reference range. Reference ranges were derived from three samples, drawn from the same region: (1) an age-stratified population-based random sample, (2) unselected volunteers, and (3) a selected healthy subset of the population-based sample with no diseases or drugs known to affect bone. T-scores were calculated using the three reference ranges for a cohort of women who had sustained a fracture and as a group had a low mean BMD (ages 35-72 yr; n = 484). For most comparisons, the T-scores for the fracture cohort were more negative using the population reference range. The difference in T-scores reached 1.0 SD. The proportion of the fracture cohort classified as having osteoporosis at the spine was 26, 14, and 23% when the population, volunteer, and healthy reference ranges were applied, respectively. The use of inappropriate reference ranges results in substantial changes to T-scores and may lead to inappropriate management.
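
    The T-score underlying this comparison has the standard definition (stated here for clarity; the notation is ours, not the paper's):

        T = \frac{\mathrm{BMD}_{\text{individual}} - \overline{\mathrm{BMD}}_{\text{reference}}}{\mathrm{SD}_{\text{reference}}}

    so replacing the reference range (population-based, volunteer, or selected healthy) shifts the mean and SD in this expression and thereby shifts every T-score, which is exactly the effect the study quantifies.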

  16. 40 CFR 761.308 - Sample selection by random number generation on any two-dimensional square grid.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 30 2010-07-01 2010-07-01 false Sample selection by random number... § 761.79(b)(3) § 761.308 Sample selection by random number generation on any two-dimensional square... area created in accordance with paragraph (a) of this section, select two random numbers: one each for...
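
    The selection step this regulation excerpt describes, two random numbers with one per axis of the square grid, is straightforward to implement. The sketch below uses hypothetical grid dimensions, since the excerpt elides the details:

        import random

        grid_w, grid_h = 10, 10  # hypothetical number of grid squares per side

        def select_sample_square(rng=random):
            """Select two random numbers: one for each axis of the 2-D grid."""
            return rng.randrange(grid_w), rng.randrange(grid_h)

        sample_squares = [select_sample_square() for _ in range(3)]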

  17. Sample preparation techniques for (p, X) spectrometry

    International Nuclear Information System (INIS)

    Whitehead, N.E.

    1985-01-01

    Samples are ashed at low temperature using an oxygen plasma; a rotary evaporator and freeze drying speeded up the ashing. The new apparatus manufactured was rated at only 10 W but was as efficient as a 200 W commercial machine; a circuit diagram is included. Samples of hair and biopsy samples of skin were analysed by the technique. A wool standard was prepared for interlaboratory comparison exercises. It was based on New Zealand merino sheep wool and was 2.9 kg in weight. A washing protocol was developed which preserves most of the trace element content. The wool was ground in liquid nitrogen using a plastic pestle and beaker, driven by a rotary drill press. (author)

  18. Validation of the k-filtering technique for a signal composed of random-phase plane waves and non-random coherent structures

    Directory of Open Access Journals (Sweden)

    O. W. Roberts

    2014-12-01

    Full Text Available Recent observations of astrophysical magnetic fields have shown the presence of fluctuations that are wave-like (propagating in the plasma frame) and those described as being structure-like (advected by the plasma bulk velocity). Typically with single-spacecraft missions it is impossible to differentiate between these two fluctuations, due to the inherent spatio-temporal ambiguity associated with a single-point measurement. However, missions such as Cluster, which contain multiple spacecraft, have allowed temporal and spatial changes to be resolved, using techniques such as k-filtering. While this technique does not assume Taylor's hypothesis, it requires both weak stationarity of the time series and that the fluctuations can be described by a superposition of plane waves with random phases. In this paper we test whether the method can cope with a synthetic signal which is composed of a combination of non-random-phase coherent structures with a mean radius d and a mean separation λ, as well as plane waves with random phase.

  19. Monitoring oil persistence on beaches : SCAT versus stratified random sampling designs

    International Nuclear Information System (INIS)

    Short, J.W.; Lindeberg, M.R.; Harris, P.M.; Maselko, J.M.; Pella, J.J.; Rice, S.D.

    2003-01-01

    In the event of a coastal oil spill, shoreline clean-up assessment teams (SCAT) commonly rely on visual inspection of the entire affected area to monitor the persistence of the oil on beaches. Occasionally, pits are excavated to evaluate the persistence of subsurface oil. This approach is practical for directing clean-up efforts directly following a spill. However, sampling of the 1989 Exxon Valdez oil spill in Prince William Sound 12 years later has shown that visual inspection combined with pit excavation does not offer estimates of contaminated beach area or stranded oil volumes. This information is needed to statistically evaluate the significance of change with time. Assumptions regarding the correlation of visually evident surface oil and cryptic subsurface oil are usually not evaluated as part of the SCAT mandate. Stratified random sampling can avoid such problems and could produce precise estimates of oiled area and volume that allow for statistical assessment of major temporal trends and the extent of the impact. The 2001 sampling of the shoreline of Prince William Sound showed that 15 per cent of surface oil occurrences were associated with subsurface oil. This study demonstrates the usefulness of the stratified random sampling method and shows how sampling design parameters impact statistical outcomes. Power analysis based on the study results indicates that optimum power is derived when unnecessary stratification is avoided. It was emphasized that sampling effort should be balanced between choosing sufficient beaches for sampling and the intensity of sampling

  20. [A comparison of convenience sampling and purposive sampling].

    Science.gov (United States)

    Suen, Lee-Jen Wu; Huang, Hui-Man; Lee, Hao-Hsien

    2014-06-01

    Convenience sampling and purposive sampling are two different sampling methods. This article first explains sampling terms such as target population, accessible population, simple random sampling, intended sample, actual sample, and statistical power analysis. These terms are then used to explain the difference between "convenience sampling" and "purposive sampling." Convenience sampling is a non-probabilistic sampling technique applicable to qualitative or quantitative studies, although it is most frequently used in quantitative studies. In convenience samples, subjects more readily accessible to the researcher are more likely to be included. Thus, in quantitative studies, the opportunity to participate is not equal for all qualified individuals in the target population and study results are not necessarily generalizable to this population. As in all quantitative studies, increasing the sample size increases the statistical power of the convenience sample. In contrast, purposive sampling is typically used in qualitative studies. Researchers who use this technique carefully select subjects based on study purpose with the expectation that each participant will provide unique and rich information of value to the study. As a result, members of the accessible population are not interchangeable and sample size is determined by data saturation, not by statistical power analysis.

  1. A user's guide to LHS: Sandia's Latin Hypercube Sampling Software

    Energy Technology Data Exchange (ETDEWEB)

    Wyss, G.D.; Jorgensen, K.H. [Sandia National Labs., Albuquerque, NM (United States). Risk Assessment and Systems Modeling Dept.

    1998-02-01

    This document is a reference guide for LHS, Sandia's Latin Hypercube Sampling Software. This software has been developed to generate either Latin hypercube or random multivariate samples. The Latin hypercube technique employs a constrained sampling scheme, whereas random sampling corresponds to a simple Monte Carlo technique. The present program replaces the previous Latin hypercube sampling program developed at Sandia National Laboratories (SAND83-2365). This manual covers the theory behind stratified sampling as well as use of the LHS code both with the Windows graphical user interface and in the stand-alone mode.
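
    The distinction drawn here, constrained stratified sampling versus simple Monte Carlo, is easy to see in code. Below is a minimal Latin hypercube sketch (our illustration, not the LHS program itself): each dimension is cut into n equal-probability strata, each stratum is hit exactly once, and the columns are shuffled independently.

        import numpy as np

        def latin_hypercube(n_samples, n_dims, rng=None):
            """Latin hypercube sample on [0,1)^d: one point per stratum in
            every dimension."""
            rng = rng or np.random.default_rng()
            u = (np.arange(n_samples)[:, None]
                 + rng.random((n_samples, n_dims))) / n_samples
            for j in range(n_dims):
                u[:, j] = rng.permutation(u[:, j])  # decouple the dimensions
            return u

        lhs_points = latin_hypercube(10, 2, np.random.default_rng(7))
        mc_points = np.random.default_rng(7).random((10, 2))  # plain Monte Carlo

    Unlike the plain Monte Carlo draw, every marginal stratum of the unit square contains exactly one LHS point, which is what gives the technique its variance-reduction benefit.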

  2. Sampling Methods in Cardiovascular Nursing Research: An Overview.

    Science.gov (United States)

    Kandola, Damanpreet; Banner, Davina; O'Keefe-McCarthy, Sheila; Jassal, Debbie

    2014-01-01

    Cardiovascular nursing research covers a wide array of topics from health services to psychosocial patient experiences. The selection of specific participant samples is an important part of the research design and process. The sampling strategy employed is of utmost importance to ensure that a representative sample of participants is chosen. There are two main categories of sampling methods: probability and non-probability. Probability sampling is the random selection of elements from the population, where each element of the population has an equal and independent chance of being included in the sample. There are five main types of probability sampling including simple random sampling, systematic sampling, stratified sampling, cluster sampling, and multi-stage sampling. Non-probability sampling methods are those in which elements are chosen through non-random methods for inclusion into the research study and include convenience sampling, purposive sampling, and snowball sampling. Each approach offers distinct advantages and disadvantages and must be considered critically. In this research column, we provide an introduction to these key sampling techniques and draw on examples from the cardiovascular research. Understanding the differences in sampling techniques may aid nurses in effective appraisal of research literature and provide a reference point for nurses who engage in cardiovascular research.
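
    The first three probability designs named above can be contrasted in a few lines (a generic sketch with an invented sampling frame, not taken from the article):

        import random

        population = list(range(1, 1001))  # hypothetical sampling frame
        rng = random.Random(11)

        # Simple random sampling: equal, independent chance for every element
        simple = rng.sample(population, 50)

        # Systematic sampling: random start, then every k-th element
        k = len(population) // 50
        systematic = population[rng.randrange(k)::k]

        # Stratified sampling: a random draw from within each stratum
        strata = {"A": population[:500], "B": population[500:]}
        stratified = [x for s in strata.values() for x in rng.sample(s, 25)]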

  3. Are most samples of animals systematically biased? Consistent individual trait differences bias samples despite random sampling.

    Science.gov (United States)

    Biro, Peter A

    2013-02-01

    Sampling animals from the wild for study is something nearly every biologist has done, but despite our best efforts to obtain random samples of animals, 'hidden' trait biases may still exist. For example, consistent behavioral traits can affect trappability/catchability, independent of obvious factors such as size and gender, and these traits are often correlated with other repeatable physiological and/or life history traits. If so, systematic sampling bias may exist for any of these traits. The extent to which this is a problem, of course, depends on the magnitude of bias, which is presently unknown because the underlying trait distributions in populations are usually unknown, or unknowable. Indeed, our present knowledge about sampling bias comes from samples (not complete population censuses), which can possess bias to begin with. I had the unique opportunity to create naturalized populations of fish by seeding each of four small fishless lakes with equal densities of slow-, intermediate-, and fast-growing fish. Using sampling methods that are not size-selective, I observed that fast-growing fish were up to two-times more likely to be sampled than slower-growing fish. This indicates substantial and systematic bias with respect to an important life history trait (growth rate). If correlations between behavioral, physiological and life-history traits are as widespread as the literature suggests, then many animal samples may be systematically biased with respect to these traits (e.g., when collecting animals for laboratory use), and affect our inferences about population structure and abundance. I conclude with a discussion on ways to minimize sampling bias for particular physiological/behavioral/life-history types within animal populations.

  4. Application of random amplified polymorphic DNA (RAPD) markers ...

    African Journals Online (AJOL)

    SAM

    2014-06-11

    Jun 11, 2014 ... variety share an identical genome. In this field one of the most successful techniques is random ... To each minced sample, 350 µL of the same extraction buffer was added and the samples were ..... using fingerprints produced by random primers. J. Hort. Sci. 69:123-. 130. Levi A, Rowland LJ, Hartung JS ...

  5. A simple and efficient alternative to implementing systematic random sampling in stereological designs without a motorized microscope stage.

    Science.gov (United States)

    Melvin, Neal R; Poda, Daniel; Sutherland, Robert J

    2007-10-01

    When properly applied, stereology is a very robust and efficient method to quantify a variety of parameters from biological material. A common sampling strategy in stereology is systematic random sampling, which involves choosing a random start point outside the structure of interest, and sampling relevant objects at sites that are placed at pre-determined, equidistant intervals. This has proven to be a very efficient sampling strategy, and is used widely in stereological designs. At the microscopic level, this is most often achieved through the use of a motorized stage that facilitates the systematic random stepping across the structure of interest. Here, we report a simple, precise and cost-effective software-based alternative to accomplishing systematic random sampling under the microscope. We believe that this approach will facilitate the use of stereological designs that employ systematic random sampling in laboratories that lack the resources to acquire costly, fully automated systems.
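
    The software-based alternative is not described in detail in the abstract, but the sampling scheme itself (a random start followed by equidistant sites) is straightforward to generate. A sketch with hypothetical field dimensions and step sizes:

        import random

        x_max, y_max = 5000.0, 3000.0  # hypothetical section extent (micrometres)
        dx, dy = 400.0, 400.0          # pre-determined sampling intervals

        rng = random.Random(5)
        x0, y0 = rng.uniform(0, dx), rng.uniform(0, dy)  # random start point

        # Equidistant grid of sampling sites to visit under the microscope
        sites = [(x0 + i * dx, y0 + j * dy)
                 for i in range(int((x_max - x0) // dx) + 1)
                 for j in range(int((y_max - y0) // dy) + 1)]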

  6. Accounting for Sampling Error in Genetic Eigenvalues Using Random Matrix Theory.

    Science.gov (United States)

    Sztepanacz, Jacqueline L; Blows, Mark W

    2017-07-01

    The distribution of genetic variance in multivariate phenotypes is characterized by the empirical spectral distribution of the eigenvalues of the genetic covariance matrix. Empirical estimates of genetic eigenvalues from random effects linear models are known to be overdispersed by sampling error, where large eigenvalues are biased upward, and small eigenvalues are biased downward. The overdispersion of the leading eigenvalues of sample covariance matrices has been demonstrated to conform to the Tracy-Widom (TW) distribution. Here we show that genetic eigenvalues estimated using restricted maximum likelihood (REML) in a multivariate random effects model with an unconstrained genetic covariance structure will also conform to the TW distribution after empirical scaling and centering. However, where estimation procedures using either REML or MCMC impose boundary constraints, the resulting genetic eigenvalues tend not to be TW distributed. We show how using confidence intervals from sampling distributions of genetic eigenvalues without reference to the TW distribution is insufficient protection against mistaking sampling error for genetic variance, particularly when eigenvalues are small. By scaling such sampling distributions to the appropriate TW distribution, the critical value of the TW statistic can be used to determine whether the magnitude of a genetic eigenvalue exceeds the sampling error for each eigenvalue in the spectral distribution of a given genetic covariance matrix. Copyright © 2017 by the Genetics Society of America.

  7. Application of the Sampling Selection Technique in Approaching Financial Audit

    Directory of Open Access Journals (Sweden)

    Victor Munteanu

    2018-03-01

    Full Text Available In his professional approach, the financial auditor has a wide range of working techniques, including selection techniques. These are applied depending on the nature of the information available to the financial auditor, the manner in which it is presented - paper or electronic format - and, last but not least, the time available. Several techniques are applied, successively or in parallel, to increase the confidence in the expressed opinion and to provide the audit report with a solid basis of information. Sampling is used in the phase of control or of clarifying identified errors. The main purpose is to corroborate or measure the degree of risk detected following a pertinent analysis. Since the auditor does not have the time or means to rebuild the information thoroughly, the sampling technique can provide an effective response to the need for valorization.

  8. A random sampling approach for robust estimation of tissue-to-plasma ratio from extremely sparse data.

    Science.gov (United States)

    Chu, Hui-May; Ette, Ene I

    2005-09-02

    This study was performed to develop a new nonparametric approach for the estimation of a robust tissue-to-plasma ratio from extremely sparsely sampled paired data (ie, one sample each from plasma and tissue per subject). Tissue-to-plasma ratio was estimated from paired/unpaired experimental data using the independent time points approach, area under the curve (AUC) values calculated with the naïve data averaging approach, and AUC values calculated using sampling based approaches (eg, the pseudoprofile-based bootstrap [PpbB] approach and the random sampling approach [our proposed approach]). The random sampling approach involves the use of a 2-phase algorithm. The convergence of the sampling/resampling approaches was investigated, as well as the robustness of the estimates produced by different approaches. To evaluate the latter, new data sets were generated by introducing outlier(s) into the real data set. One to 2 concentration values were inflated by 10% to 40% from their original values to produce the outliers. Tissue-to-plasma ratios computed using the independent time points approach varied between 0 and 50 across time points. The ratio obtained from AUC values acquired using the naive data averaging approach was not associated with any measure of uncertainty or variability. Calculating the ratio without regard to pairing yielded poorer estimates. The random sampling and pseudoprofile-based bootstrap approaches yielded tissue-to-plasma ratios with uncertainty and variability. However, the random sampling approach, because of the 2-phase nature of its algorithm, yielded more robust estimates and required fewer replications. Therefore, a 2-phase random sampling approach is proposed for the robust estimation of tissue-to-plasma ratio from extremely sparsely sampled data.
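
    The abstract does not spell out the two-phase algorithm, so the following is only a loose sketch of the general resampling idea (drawing one plasma and one tissue concentration per time point to form pseudo-profiles and collecting the resulting AUC ratios), with fabricated example data:

        import numpy as np

        rng = np.random.default_rng(9)

        def auc(t, c):
            """Trapezoidal area under the concentration-time curve."""
            t, c = np.asarray(t, float), np.asarray(c, float)
            return float(np.sum((c[1:] + c[:-1]) * np.diff(t)) / 2.0)

        def ratio_distribution(times, plasma, tissue, n_rep=1000):
            ratios = np.empty(n_rep)
            for b in range(n_rep):
                cp = [rng.choice(plasma[t]) for t in times]  # one subject per point
                ct = [rng.choice(tissue[t]) for t in times]
                ratios[b] = auc(times, ct) / auc(times, cp)
            return ratios

        times = [0.5, 1, 2, 4, 8]
        plasma = {t: rng.lognormal(1.0 - 0.2 * t, 0.3, 5) for t in times}  # fake data
        tissue = {t: rng.lognormal(1.5 - 0.2 * t, 0.3, 5) for t in times}
        r = ratio_distribution(times, plasma, tissue)
        print(r.mean(), np.percentile(r, [2.5, 97.5]))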

  9. Nuclear analytical techniques and their application to environmental samples

    International Nuclear Information System (INIS)

    Lieser, K.H.

    1986-01-01

    A survey is given on nuclear analytical techniques and their application to environmental samples. Measurement of the inherent radioactivity of elements or radionuclides allows determination of natural radioelements (e.g. Ra), man-made radioelements (e.g. Pu) and radionuclides in the environment. Activation analysis, in particular instrumental neutron activation analysis, is a very reliable and sensitive method for determination of a great number of trace elements in environmental samples, because the most abundant main constituents are not activated. Tracer techniques are very useful for studies of the behaviour and of chemical reactions of trace elements and compounds in the environment. Radioactive sources are mainly applied for excitation of characteristic X-rays (X-ray fluorescence analysis). (author)

  10. Estimation of the Coefficient of Restitution of Rocking Systems by the Random Decrement Technique

    DEFF Research Database (Denmark)

    Brincker, Rune; Demosthenous, Milton; Manos, George C.

    1994-01-01

    The aim of this paper is to investigate the possibility of estimating an average damping parameter for a rocking system due to impact, the so-called coefficient of restitution, from the random response, i.e. when the loads are random and unknown, and the response is measured. The objective...... is to obtain an estimate of the free rocking response from the measured random response using the Random Decrement (RDD) Technique, and then estimate the coefficient of restitution from this free response estimate. In the paper this approach is investigated by simulating the response of a single degree...

  11. Random Sampling of Correlated Parameters – a Consistent Solution for Unfavourable Conditions

    Energy Technology Data Exchange (ETDEWEB)

    Žerovnik, G., E-mail: gasper.zerovnik@ijs.si [Jožef Stefan Institute, Jamova cesta 39, SI-1000 Ljubljana (Slovenia); Trkov, A. [Jožef Stefan Institute, Jamova cesta 39, SI-1000 Ljubljana (Slovenia); International Atomic Energy Agency, PO Box 100, A-1400 Vienna (Austria); Kodeli, I.A. [Jožef Stefan Institute, Jamova cesta 39, SI-1000 Ljubljana (Slovenia); Capote, R. [International Atomic Energy Agency, PO Box 100, A-1400 Vienna (Austria); Smith, D.L. [Argonne National Laboratory, 1710 Avenida del Mundo, Coronado, CA 92118-3073 (United States)

    2015-01-15

    Two methods for random sampling according to a multivariate lognormal distribution – the correlated sampling method and the method of transformation of correlation coefficients – are briefly presented. The methods are mathematically exact and enable consistent sampling of correlated inherently positive parameters with given information on the first two distribution moments. Furthermore, a weighted sampling method to accelerate the convergence of parameters with extremely large relative uncertainties is described. However, the method is efficient only for a limited number of correlated parameters.
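
    The second method named above, transformation of correlation coefficients, has a well-known closed form: the target lognormal moments are mapped to the parameters of the underlying normal distribution before sampling. A sketch of the standard formulas (our implementation, not the authors' code):

        import numpy as np

        def sample_correlated_lognormal(mean, cv, corr, n, rng=None):
            """Sample inherently positive, correlated parameters with given
            means, coefficients of variation and correlation matrix by
            transforming the correlations into the underlying normal space."""
            rng = rng or np.random.default_rng()
            mean, cv = np.asarray(mean, float), np.asarray(cv, float)
            s2 = np.log(1.0 + cv**2)         # underlying normal variances
            mu = np.log(mean) - 0.5 * s2     # underlying normal means
            s = np.sqrt(s2)
            # lognormal-space correlations -> normal-space correlations
            corr_n = (np.log(1.0 + np.asarray(corr, float) * np.outer(cv, cv))
                      / np.outer(s, s))
            np.fill_diagonal(corr_n, 1.0)
            z = rng.multivariate_normal(mu, corr_n * np.outer(s, s), size=n)
            return np.exp(z)

        x = sample_correlated_lognormal([1.0, 2.0], [0.3, 0.5],
                                        [[1.0, 0.6], [0.6, 1.0]], 10_000,
                                        np.random.default_rng(0))

    Note that for strongly negative correlations combined with large coefficients of variation the transformed matrix may fail to be positive definite, which is one kind of unfavourable condition for this transformation.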

  12. The Bootstrap, the Jackknife, and the Randomization Test: A Sampling Taxonomy.

    Science.gov (United States)

    Rodgers, J L

    1999-10-01

    A simple sampling taxonomy is defined that shows the differences between and relationships among the bootstrap, the jackknife, and the randomization test. Each method has as its goal the creation of an empirical sampling distribution that can be used to test statistical hypotheses, estimate standard errors, and/or create confidence intervals. Distinctions between the methods can be made based on the sampling approach (with replacement versus without replacement) and the sample size (replacing the whole original sample versus replacing a subset of the original sample). The taxonomy is useful for teaching the goals and purposes of resampling schemes. An extension of the taxonomy implies other possible resampling approaches that have not previously been considered. Univariate and multivariate examples are presented.
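
    The taxonomy can be made concrete in a few lines: bootstrap (with replacement, whole sample size), jackknife (without replacement, leave-one-out subsets), and a randomization test (permuting labels without replacement). This is a generic sketch, not the article's own examples:

        import random

        data = [2.3, 4.1, 3.3, 5.0, 2.8, 4.6]
        rng = random.Random(0)
        mean = lambda xs: sum(xs) / len(xs)

        # Bootstrap: resample n items WITH replacement
        boot = [mean(rng.choices(data, k=len(data))) for _ in range(1000)]

        # Jackknife: WITHOUT replacement, replacing a subset (leave one out)
        jack = [mean(data[:i] + data[i + 1:]) for i in range(len(data))]

        # Randomization test: permute group labels WITHOUT replacement
        group_a, group_b = data[:3], data[3:]
        observed = mean(group_a) - mean(group_b)
        null = []
        for _ in range(1000):
            perm = rng.sample(data, len(data))  # a random permutation
            null.append(mean(perm[:3]) - mean(perm[3:]))
        p_value = sum(abs(d) >= abs(observed) for d in null) / len(null)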

  13. Survey research with a random digit dial national mobile phone sample in Ghana: Methods and sample quality

    Science.gov (United States)

    Sefa, Eunice; Adimazoya, Edward Akolgo; Yartey, Emmanuel; Lenzi, Rachel; Tarpo, Cindy; Heward-Mills, Nii Lante; Lew, Katherine; Ampeh, Yvonne

    2018-01-01

    Introduction Generating a nationally representative sample in low and middle income countries typically requires resource-intensive household level sampling with door-to-door data collection. High mobile phone penetration rates in developing countries provide new opportunities for alternative sampling and data collection methods, but there is limited information about response rates and sample biases in coverage and nonresponse using these methods. We utilized data from an interactive voice response, random-digit dial, national mobile phone survey in Ghana to calculate standardized response rates and assess representativeness of the obtained sample. Materials and methods The survey methodology was piloted in two rounds of data collection. The final survey included 18 demographic, media exposure, and health behavior questions. Call outcomes and response rates were calculated according to the American Association of Public Opinion Research guidelines. Sample characteristics, productivity, and costs per interview were calculated. Representativeness was assessed by comparing data to the Ghana Demographic and Health Survey and the National Population and Housing Census. Results The survey was fielded during a 27-day period in February-March 2017. There were 9,469 completed interviews and 3,547 partial interviews. Response, cooperation, refusal, and contact rates were 31%, 81%, 7%, and 39% respectively. Twenty-three calls were dialed to produce an eligible contact: nonresponse was substantial due to the automated calling system and dialing of many unassigned or non-working numbers. Younger, urban, better educated, and male respondents were overrepresented in the sample. Conclusions The innovative mobile phone data collection methodology yielded a large sample in a relatively short period. Response rates were comparable to other surveys, although substantial coverage bias resulted from fewer women, rural, and older residents completing the mobile phone survey in

  14. Survey research with a random digit dial national mobile phone sample in Ghana: Methods and sample quality.

    Science.gov (United States)

    L'Engle, Kelly; Sefa, Eunice; Adimazoya, Edward Akolgo; Yartey, Emmanuel; Lenzi, Rachel; Tarpo, Cindy; Heward-Mills, Nii Lante; Lew, Katherine; Ampeh, Yvonne

    2018-01-01

    Generating a nationally representative sample in low and middle income countries typically requires resource-intensive household level sampling with door-to-door data collection. High mobile phone penetration rates in developing countries provide new opportunities for alternative sampling and data collection methods, but there is limited information about response rates and sample biases in coverage and nonresponse using these methods. We utilized data from an interactive voice response, random-digit dial, national mobile phone survey in Ghana to calculate standardized response rates and assess representativeness of the obtained sample. The survey methodology was piloted in two rounds of data collection. The final survey included 18 demographic, media exposure, and health behavior questions. Call outcomes and response rates were calculated according to the American Association of Public Opinion Research guidelines. Sample characteristics, productivity, and costs per interview were calculated. Representativeness was assessed by comparing data to the Ghana Demographic and Health Survey and the National Population and Housing Census. The survey was fielded during a 27-day period in February-March 2017. There were 9,469 completed interviews and 3,547 partial interviews. Response, cooperation, refusal, and contact rates were 31%, 81%, 7%, and 39% respectively. Twenty-three calls were dialed to produce an eligible contact: nonresponse was substantial due to the automated calling system and dialing of many unassigned or non-working numbers. Younger, urban, better educated, and male respondents were overrepresented in the sample. The innovative mobile phone data collection methodology yielded a large sample in a relatively short period. Response rates were comparable to other surveys, although substantial coverage bias resulted from fewer women, rural, and older residents completing the mobile phone survey in comparison to household surveys. Random digit dialing of mobile

  15. Survey research with a random digit dial national mobile phone sample in Ghana: Methods and sample quality.

    Directory of Open Access Journals (Sweden)

    Kelly L'Engle

    Full Text Available Generating a nationally representative sample in low and middle income countries typically requires resource-intensive household level sampling with door-to-door data collection. High mobile phone penetration rates in developing countries provide new opportunities for alternative sampling and data collection methods, but there is limited information about response rates and sample biases in coverage and nonresponse using these methods. We utilized data from an interactive voice response, random-digit dial, national mobile phone survey in Ghana to calculate standardized response rates and assess representativeness of the obtained sample.The survey methodology was piloted in two rounds of data collection. The final survey included 18 demographic, media exposure, and health behavior questions. Call outcomes and response rates were calculated according to the American Association of Public Opinion Research guidelines. Sample characteristics, productivity, and costs per interview were calculated. Representativeness was assessed by comparing data to the Ghana Demographic and Health Survey and the National Population and Housing Census.The survey was fielded during a 27-day period in February-March 2017. There were 9,469 completed interviews and 3,547 partial interviews. Response, cooperation, refusal, and contact rates were 31%, 81%, 7%, and 39% respectively. Twenty-three calls were dialed to produce an eligible contact: nonresponse was substantial due to the automated calling system and dialing of many unassigned or non-working numbers. Younger, urban, better educated, and male respondents were overrepresented in the sample.The innovative mobile phone data collection methodology yielded a large sample in a relatively short period. Response rates were comparable to other surveys, although substantial coverage bias resulted from fewer women, rural, and older residents completing the mobile phone survey in comparison to household surveys. Random digit

  16. An experimental comparison of triggered and random pulse train uncertainties

    International Nuclear Information System (INIS)

    Henzlova, Daniela; Menlove, Howard O.; Swinhoe, Martyn T.

    2010-01-01

    In this paper we present an experimental comparison of signal-triggered and randomly triggered analysis algorithms for neutron multiplicity data. Traditional shift register type signal-triggered multiplicity analysis of singles, doubles and triples rates is compared with analysis using randomly triggered gates. Two methods of random gate generation are explored - non-overlapping gates (Feynman approach) and periodic overlapping gates (fast accidentals). Using californium sources with low, medium and high rates in combination with AmLi sources (as a surrogate for plutonium), we investigate the relative standard deviation (RSD) of the data in order to determine if there are parameter spaces in which one of the measurement methods should be preferred. Neutron correlation analysis is a commonly used NDA technique to assay plutonium mass. The data can be collected in two distinct ways: using signal-triggered or randomly triggered counting gates. Analysis algorithms were developed for both approaches to determine singles (S), doubles (D) and triples (T) rates from the measured sample. Currently the most commonly implemented technique to collect neutron coincidence data utilizes shift register based electronics. The shift register uses signal-triggered counting gates to generate a foreground multiplicity distribution of correlated+accidental events and a random gate (opened after a predefined long delay following the signal trigger) to generate a background multiplicity distribution of accidental events. Modern shift registers include a fast accidental option to sample data with a fixed clock frequency. This way a set of overlapping gates is used to generate background multiplicity distributions in order to improve the measurement precision. In parallel to the shift register approach, the Feynman variance technique is frequently used, which utilizes a set of consecutive non-overlapping gates. In general, different user communities (e.g. safeguards, nuclear material accountancy, emergency
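
    For intuition, the randomly triggered non-overlapping-gate (Feynman) approach can be sketched in a few lines: divide the pulse train into consecutive gates, count events per gate, and compute the excess variance, which vanishes for a purely random (Poisson) train and grows when correlated events are present. The sketch below is illustrative only; the source rate and gate width are invented, not taken from the paper.

```python
import numpy as np

def feynman_y(timestamps, gate_width):
    """Count events in consecutive non-overlapping gates and return the
    gate counts plus the Feynman-Y excess variance (var/mean - 1)."""
    t0, t1 = timestamps.min(), timestamps.max()
    n_gates = int((t1 - t0) // gate_width)
    edges = t0 + gate_width * np.arange(n_gates + 1)
    counts, _ = np.histogram(timestamps, bins=edges)
    return counts, counts.var(ddof=1) / counts.mean() - 1.0

# A Poisson (uncorrelated) pulse train should give Y close to 0.
rng = np.random.default_rng(1)
train = np.cumsum(rng.exponential(1e-4, size=200_000))  # ~10 kHz source
counts, Y = feynman_y(train, gate_width=512e-6)
print(f"mean counts/gate = {counts.mean():.2f}, Feynman-Y = {Y:.4f}")
```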

  17. Micro and Nano Techniques for the Handling of Biological Samples

    DEFF Research Database (Denmark)

    Micro and Nano Techniques for the Handling of Biological Samples reviews the different techniques available to manipulate and integrate biological materials in a controlled manner, either by sliding them along a surface (2-D manipulation), or by gripping and moving them to a new position (3-D...

  18. Methodological integrative review of the work sampling technique used in nursing workload research.

    Science.gov (United States)

    Blay, Nicole; Duffield, Christine M; Gallagher, Robyn; Roche, Michael

    2014-11-01

    To critically review the work sampling technique used in nursing workload research. Work sampling is a technique frequently used by researchers and managers to explore and measure nursing activities. However, the work sampling methods used are diverse, making comparisons of results between studies difficult. Methodological integrative review. Four electronic databases were systematically searched for peer-reviewed articles published between 2002 and 2012. Manual scanning of reference lists and Rich Site Summary feeds from contemporary nursing journals were other sources of data. Articles published in the English language between 2002 and 2012 reporting on research which used work sampling to examine nursing workload. Eighteen articles were reviewed. The review identified that the work sampling technique lacks a standardized approach, which may have an impact on the sharing or comparison of results. Specific areas needing a shared understanding included the training of observers and subjects who self-report, standardization of the techniques used to assess observer inter-rater reliability, sampling methods and reporting of outcomes. Work sampling is a technique that can be used to explore the many facets of nursing work. Standardized reporting measures would enable greater comparison between studies and contribute to knowledge more effectively. Author suggestions for the reporting of results may act as guidelines for researchers considering work sampling as a research method. © 2014 John Wiley & Sons Ltd.
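
    To make the underlying estimator concrete, the following is a minimal sketch of a random-moment work sampling calculation with a normal-approximation confidence interval; the activity categories, observation count, and proportions are hypothetical and not taken from any reviewed study.

```python
import numpy as np

def work_sampling_summary(observations, z=1.96):
    """Estimate the proportion of time spent on each activity from
    random-moment work sampling observations, with an approximate
    95% confidence interval p +/- z*sqrt(p(1-p)/n)."""
    obs = np.asarray(observations)
    n = len(obs)
    for activity in np.unique(obs):
        p = np.mean(obs == activity)
        half = z * np.sqrt(p * (1 - p) / n)
        print(f"{activity:<16} p = {p:.3f} +/- {half:.3f}")

# Hypothetical categories recorded at randomly chosen instants.
rng = np.random.default_rng(7)
cats = ["direct care", "documentation", "medication", "other"]
sample = rng.choice(cats, size=480, p=[0.45, 0.30, 0.15, 0.10])
work_sampling_summary(sample)
```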

  19. Systematic comparison of static and dynamic headspace sampling techniques for gas chromatography.

    Science.gov (United States)

    Kremser, Andreas; Jochmann, Maik A; Schmidt, Torsten C

    2016-09-01

    Six automated, headspace-based sample preparation techniques were used to extract volatile analytes from water with the goal of establishing a systematic comparison between commonly available instrumental alternatives. To that end, these six techniques were used in conjunction with the same gas chromatography instrument for analysis of a common set of volatile organic carbon (VOC) analytes. The methods were thereby divided into three classes: static sampling (by syringe or loop), static enrichment (SPME and PAL SPME Arrow), and dynamic enrichment (ITEX and trap sampling). For PAL SPME Arrow, different sorption phase materials were also included in the evaluation. To enable an effective comparison, method detection limits (MDLs), relative standard deviations (RSDs), and extraction yields were determined and are discussed for all techniques. While static sampling techniques exhibited sufficient extraction yields (approx. 10-20 %) to be reliably used down to approx. 100 ng L(-1), enrichment techniques displayed extraction yields of up to 80 %, resulting in MDLs down to the picogram per liter range. RSDs for all techniques were below 27 %. The choice of one of the different instrumental modes of operation (the aforementioned classes) was thereby the most influential parameter in terms of extraction yields and MDLs. Individual methods inside each class showed smaller deviations, and the least influence was observed when evaluating different sorption phase materials for the individual enrichment techniques. The option of selecting specialized sorption phase materials may, however, be more important when analyzing analytes with different properties such as high polarity or the capability of specific molecular interactions. Graphical Abstract: PAL SPME Arrow during the extraction of volatile analytes from the headspace of an aqueous sample.

  20. Occupational position and its relation to mental distress in a random sample of Danish residents

    DEFF Research Database (Denmark)

    Rugulies, Reiner Ernst; Madsen, Ida E H; Nielsen, Maj Britt D

    2010-01-01

    PURPOSE: To analyze the distribution of depressive, anxiety, and somatization symptoms across different occupational positions in a random sample of Danish residents. METHODS: The study sample consisted of 591 Danish residents (50% women), aged 20-65, drawn from an age- and gender-stratified random sample of the Danish population. Participants filled out a survey that included the 92 item version of the Hopkins Symptom Checklist (SCL-92). We categorized occupational position into seven groups: high- and low-grade non-manual workers, skilled and unskilled manual workers, high- and low-grade self...

  1. RF Sub-sampling Receiver Architecture based on Milieu Adapting Techniques

    DEFF Research Database (Denmark)

    Behjou, Nastaran; Larsen, Torben; Jensen, Ole Kiel

    2012-01-01

    A novel sub-sampling based architecture is proposed which has the ability of reducing the problem of image distortion and improving the signal to noise ratio significantly. The technique is based on sensing the environment and adapting the sampling rate of the receiver to the best possible...

  2. Adaptive importance sampling for probabilistic validation of advanced driver assistance systems

    NARCIS (Netherlands)

    Gietelink, O.J.; Schutter, B. de; Verhaegen, M.

    2006-01-01

    We present an approach for validation of advanced driver assistance systems, based on randomized algorithms. The new method consists of an iterative randomized simulation using adaptive importance sampling. The randomized algorithm is more efficient than conventional simulation techniques. The

  3. Cone penetrometer tests and HydroPunch sampling: A screening technique for plume definition

    International Nuclear Information System (INIS)

    Smolley, M.; Kappmeyer, J.C.

    1991-01-01

    Cone penetrometer tests and HydroPunch sampling were used to define the extent of volatile organic compounds in ground water. The investigation indicated that the combination of these techniques is effective for obtaining ground water samples for preliminary plume definition. HydroPunch samples can be collected in unconsolidated sediments and the analytical results obtained from these samples are comparable to those obtained from adjacent monitoring wells. This sampling method is a rapid and cost-effective screening technique for characterizing the extent of contaminant plumes in soft sediment environments. Use of this screening technique allowed monitoring wells to be located at the plume boundary, thereby reducing the number of wells installed and the overall cost of the plume definition program.

  4. Chance constrained problems: penalty reformulation and performance of sample approximation technique

    Czech Academy of Sciences Publication Activity Database

    Branda, Martin

    2012-01-01

    Roč. 48, č. 1 (2012), s. 105-122 ISSN 0023-5954 R&D Projects: GA ČR(CZ) GBP402/12/G097 Institutional research plan: CEZ:AV0Z10750506 Keywords : chance constrained problems * penalty functions * asymptotic equivalence * sample approximation technique * investment problem Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.619, year: 2012 http://library.utia.cas.cz/separaty/2012/E/branda-chance constrained problems penalty reformulation and performance of sample approximation technique.pdf

  5. Adaptive importance sampling of random walks on continuous state spaces

    International Nuclear Information System (INIS)

    Baggerly, K.; Cox, D.; Picard, R.

    1998-01-01

    The authors consider adaptive importance sampling for a random walk with scoring in a general state space. Conditions under which exponential convergence occurs to the zero-variance solution are reviewed. These results generalize previous work for finite, discrete state spaces in Kollman (1993) and in Kollman, Baggerly, Cox, and Picard (1996). This paper is intended for nonstatisticians and includes considerable explanatory material
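
    The paper's adaptive scheme is not reproduced here, but the core importance-sampling idea can be illustrated with a fixed, shifted proposal: sample where the rare event lives and reweight by the likelihood ratio. All numbers below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
n, a = 100_000, 4.0

# Plain Monte Carlo: almost no samples land in the tail.
x = rng.standard_normal(n)
mc = np.mean(x > a)

# Importance sampling: draw from N(a, 1) and reweight by the
# likelihood ratio phi(y)/phi(y - a) = exp(a^2/2 - a*y).
y = rng.normal(a, 1.0, n)
w = np.exp(0.5 * a**2 - a * y)
is_est = np.mean((y > a) * w)

print(f"plain MC: {mc:.2e}, importance sampling: {is_est:.2e}")
# True value: 1 - Phi(4) is about 3.17e-5
```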

  6. Toward a Principled Sampling Theory for Quasi-Orders.

    Science.gov (United States)

    Ünlü, Ali; Schrepp, Martin

    2016-01-01

    Quasi-orders, that is, reflexive and transitive binary relations, have numerous applications. In educational theories, the dependencies of mastery among the problems of a test can be modeled by quasi-orders. Methods such as item tree or Boolean analysis that mine for quasi-orders in empirical data are sensitive to the underlying quasi-order structure. These data mining techniques have to be compared based on extensive simulation studies, with unbiased samples of randomly generated quasi-orders at their basis. In this paper, we develop techniques that can provide the required quasi-order samples. We introduce a discrete doubly inductive procedure for incrementally constructing the set of all quasi-orders on a finite item set. A randomization of this deterministic procedure allows us to generate representative samples of random quasi-orders. With an outer level inductive algorithm, we consider the uniform random extensions of the trace quasi-orders to higher dimension. This is combined with an inner level inductive algorithm to correct the extensions that violate the transitivity property. The inner level correction step entails sampling biases. We propose three algorithms for bias correction and investigate them in simulation. It is evident that, even for item sets of up to 50 items, the new algorithms create close to representative quasi-order samples within acceptable computing time. Hence, the principled approach is a significant improvement over existing methods that are used to draw quasi-orders uniformly at random but cannot cope with reasonably large item sets.
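
    As a point of contrast with the paper's principled procedure, a naive generator of random quasi-orders (random relation, forced reflexivity, Warshall transitive closure) is easy to write but samples with exactly the kind of bias the authors aim to avoid. The sketch below, with invented parameters, shows that baseline:

```python
import numpy as np

def naive_random_quasi_order(n_items, p=0.1, rng=None):
    """Draw a random binary relation, force reflexivity, then take the
    transitive closure (Warshall). The result is always a quasi-order,
    but the sampling is biased toward larger relations -- the problem
    the principled procedures in the paper are designed to avoid."""
    rng = rng or np.random.default_rng()
    R = rng.random((n_items, n_items)) < p
    np.fill_diagonal(R, True)               # make it reflexive
    for k in range(n_items):                # Warshall transitive closure
        R |= np.outer(R[:, k], R[k, :])
    return R

Q = naive_random_quasi_order(10, p=0.08, rng=np.random.default_rng(3))
print(Q.sum(), "pairs in the sampled quasi-order")
```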

  8. Sampling methods and non-destructive examination techniques for large radioactive waste packages

    International Nuclear Information System (INIS)

    Green, T.H.; Smith, D.L.; Burgoyne, K.E.; Maxwell, D.J.; Norris, G.H.; Billington, D.M.; Pipe, R.G.; Smith, J.E.; Inman, C.M.

    1992-01-01

    Progress is reported on work undertaken to evaluate quality checking methods for radioactive wastes. A sampling rig was designed, fabricated and used to develop techniques for the destructive sampling of cemented simulant waste using remotely operated equipment. An engineered system for the containment of cooling water was designed and manufactured and successfully demonstrated with the drum and coring equipment mounted in both vertical and horizontal orientations. The preferred in-cell orientation was found to be with the drum and coring machinery mounted in a horizontal position. Small powdered samples can be taken from cemented homogeneous waste cores using a hollow drill/vacuum section technique with the preferred subsampling technique being to discard the outer 10 mm layer to obtain a representative sample of the cement core. Cement blends can be dissolved using fusion techniques and the resulting solutions are stable to gelling for periods in excess of one year. Although hydrochloric acid and nitric acid are promising solvents for dissolution of cement blends, the resultant solutions tend to form silicic acid gels. An estimate of the beta-emitter content of cemented waste packages can be obtained by a combination of non-destructive and destructive techniques. The errors will probably be in excess of +/-60 % at the 95 % confidence level. Real-time X-ray video-imaging techniques have been used to analyse drums of uncompressed, hand-compressed, in-drum compacted and high-force compacted (i.e. supercompacted) simulant waste. The results have confirmed the applicability of this technique for NDT of low-level waste. 8 refs., 12 figs., 3 tabs

  9. Precision of systematic and random sampling in clustered populations: habitat patches and aggregating organisms.

    Science.gov (United States)

    McGarvey, Richard; Burch, Paul; Matthews, Janet M

    2016-01-01

    Natural populations of plants and animals spatially cluster because (1) suitable habitat is patchy, and (2) within suitable habitat, individuals aggregate further into clusters of higher density. We compare the precision of random and systematic field sampling survey designs under these two processes of species clustering. Second, we evaluate the performance of 13 estimators for the variance of the sample mean from a systematic survey. Replicated simulated surveys, as counts from 100 transects, allocated either randomly or systematically within the study region, were used to estimate population density in six spatial point populations including habitat patches and Matérn circular clustered aggregations of organisms, together and in combination. The standard one-start aligned systematic survey design, a uniform 10 x 10 grid of transects, was much more precise. Variances of the 10 000 replicated systematic survey mean densities were one-third to one-fifth of those from randomly allocated transects, implying transect sample sizes giving equivalent precision by random survey would need to be three to five times larger. Organisms being restricted to patches of habitat was alone sufficient to yield this precision advantage for the systematic design. But this improved precision for systematic sampling in clustered populations is underestimated by standard variance estimators used to compute confidence intervals. True variance for the survey sample mean was computed from the variance of 10 000 simulated survey mean estimates. Testing 10 published and three newly proposed variance estimators, the two variance estimators that corrected for inter-transect correlation (v8 and vW) were the most accurate and also the most precise in clustered populations. These greatly outperformed the two "post-stratification" variance estimators (v2 and v3) that are now more commonly applied in systematic surveys. Similar variance estimator performance rankings were found with
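
    A toy re-creation of the comparison (not the authors' simulation; the cluster parameters and grid size are invented) shows the precision advantage of one-start aligned systematic sampling in a clustered population:

```python
import numpy as np

rng = np.random.default_rng(42)

# Clustered population on a 100 x 100 grid of candidate transects:
# random cluster centers with Poisson numbers of nearby individuals.
density = np.zeros((100, 100))
for _ in range(25):                       # habitat patches / clusters
    cx, cy = rng.integers(0, 100, 2)
    k = rng.poisson(200)
    xs = np.clip(rng.normal(cx, 3, k).astype(int), 0, 99)
    ys = np.clip(rng.normal(cy, 3, k).astype(int), 0, 99)
    np.add.at(density, (xs, ys), 1)

def random_mean():
    idx = rng.integers(0, 100, (100, 2))  # 100 randomly placed transects
    return density[idx[:, 0], idx[:, 1]].mean()

def systematic_mean():
    ox, oy = rng.integers(0, 10, 2)       # one random start
    return density[ox::10, oy::10].mean() # aligned 10 x 10 grid

reps = 10_000
v_rand = np.var([random_mean() for _ in range(reps)])
v_syst = np.var([systematic_mean() for _ in range(reps)])
print(f"var(random) / var(systematic) = {v_rand / v_syst:.1f}")
```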

  10. A Comparison of Soil-Water Sampling Techniques

    Science.gov (United States)

    Tindall, J. A.; Figueroa-Johnson, M.; Friedel, M. J.

    2007-12-01

    The representativeness of soil pore water extracted by suction lysimeters in ground-water monitoring studies is a problem that often confounds interpretation of measured data. Current soil water sampling techniques cannot identify the soil volume from which a pore water sample is extracted, whether macroscopic, microscopic, or preferential flowpath. This research was undertaken to compare values of extracted suction lysimeter samples from intact soil cores with samples obtained by direct extraction methods, to determine what portion of soil pore water is sampled by each method. Intact soil cores (30 centimeter (cm) diameter by 40 cm height) were extracted from two different sites - a sandy soil near Altamonte Springs, Florida and a clayey soil near Centralia in Boone County, Missouri. Isotopically labeled water (δ18O, analyzed by mass spectrometry) and bromide concentrations (KBr, measured using ion chromatography) from water samples taken by suction lysimeters were compared with samples obtained by the direct extraction methods of centrifugation and azeotropic distillation. Water samples collected by direct extraction were about 0.25 ‰ more negative (depleted) than suction lysimeter values from a sandy soil and about 2-7 ‰ more negative from a well-structured clayey soil. Results indicate that the majority of soil water in well-structured soil is strongly bound to soil grain surfaces and is not easily sampled by suction lysimeters. In cases where a sufficient volume of water has passed through the soil profile and displaced previous pore water, suction lysimeters will collect a representative sample of soil pore water from the sampled depth interval. It is suggested that for stable isotope studies monitoring precipitation and soil water, suction lysimeters should be installed at shallow depths (10 cm). Samples should also be coordinated with precipitation events. The data also indicate that each extraction method may be used to sample a different

  11. Can groundwater sampling techniques used in monitoring wells influence methane concentrations and isotopes?

    Science.gov (United States)

    Rivard, Christine; Bordeleau, Geneviève; Lavoie, Denis; Lefebvre, René; Malet, Xavier

    2018-03-06

    Methane concentrations and isotopic composition in groundwater are the focus of a growing number of studies. However, concerns are often expressed regarding the integrity of samples, as methane is very volatile and may partially exsolve during sample lifting in the well and transfer to sampling containers. While issues concerning bottle-filling techniques have already been documented, this paper documents a comparison of methane concentration and isotopic composition obtained with three devices commonly used to retrieve water samples from dedicated observation wells. This work lies within the framework of a larger project carried out in the Saint-Édouard area (southern Québec, Canada), whose objective was to assess the risk to shallow groundwater quality related to potential shale gas exploitation. The selected sampling devices, which were tested on ten wells during three sampling campaigns, consist of an impeller pump, a bladder pump, and disposable sampling bags (HydraSleeve). The sampling bags were used both before and after pumping, to verify the appropriateness of a no-purge approach, compared to the low-flow approach involving pumping until stabilization of field physicochemical parameters. Results show that methane concentrations obtained with the selected sampling techniques are usually similar and that there is no systematic bias related to a specific technique. Nonetheless, concentrations can sometimes vary quite significantly (up to 3.5 times) for a given well and sampling event. Methane isotopic composition obtained with all sampling techniques is very similar, except in some cases where sampling bags were used before pumping (no-purge approach), in wells where multiple groundwater sources enter the borehole.

  12. Development of environmental sample analysis techniques for safeguards

    International Nuclear Information System (INIS)

    Magara, Masaaki; Hanzawa, Yukiko; Esaka, Fumitaka

    1999-01-01

    JAERI has been developing environmental sample analysis techniques for safeguards and preparing a clean chemistry laboratory with clean rooms. The methods to be developed are a bulk analysis and a particle analysis. In the bulk analysis, an Inductively-Coupled Plasma Mass Spectrometer or a Thermal Ionization Mass Spectrometer is used to measure nuclear materials after chemical treatment of the sample. In the particle analysis, an Electron Probe Micro Analyzer and a Secondary Ion Mass Spectrometer are used for elemental analysis and isotopic analysis, respectively. The design of the clean chemistry laboratory has been carried out and construction will be completed by the end of March, 2001. (author)

  13. A Table-Based Random Sampling Simulation for Bioluminescence Tomography

    Directory of Open Access Journals (Sweden)

    Xiaomeng Zhang

    2006-01-01

    Full Text Available As a popular simulation of photon propagation in turbid media, the main problem of the Monte Carlo (MC) method is its cumbersome computation. In this work a table-based random sampling simulation (TBRS) is proposed. The key idea of TBRS is to simplify multiple steps of scattering into a single-step process through random table querying, thus greatly reducing the computing complexity of the conventional MC algorithm and expediting the computation. The TBRS simulation is a fast algorithm of the conventional MC simulation of photon propagation. It retains the merits of flexibility and accuracy of the conventional MC method and adapts well to complex geometric media and various source shapes. Both MC simulations were conducted in a homogeneous medium in our work. Also, we present a reconstruction approach to estimate the position of the fluorescent source, based on trial-and-error, as a validation of the TBRS algorithm. Good agreement is found between the conventional MC simulation and the TBRS simulation.
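
    The flavor of random table querying can be shown with a simple example: precompute an inverse-CDF table of photon step lengths once, then sample by indexing the table with random integers instead of re-evaluating the sampling formula. This is only a single-step illustration of table lookup, not the TBRS multi-step tables; the attenuation coefficient and table size are invented.

```python
import numpy as np

rng = np.random.default_rng(0)
mu_t = 10.0                         # total interaction coefficient (1/cm)

# Precompute a lookup table of step lengths via the inverse CDF
# s = -ln(1 - u) / mu_t evaluated on a fine uniform grid of u.
N = 2**16
u_grid = (np.arange(N) + 0.5) / N
step_table = -np.log(1.0 - u_grid) / mu_t

def sample_steps(n):
    """Draw photon step lengths by random table querying instead of
    evaluating the logarithm for every sample."""
    return step_table[rng.integers(0, N, n)]

s = sample_steps(1_000_000)
print(f"table mean = {s.mean():.4f} cm, analytic mean = {1/mu_t:.4f} cm")
```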

  14. Standardization of proton-induced x-ray emission technique for analysis of thick samples

    Science.gov (United States)

    Ali, Shad; Zeb, Johar; Ahad, Abdul; Ahmad, Ishfaq; Haneef, M.; Akbar, Jehan

    2015-09-01

    This paper describes the standardization of the proton-induced x-ray emission (PIXE) technique for finding the elemental composition of thick samples. For the standardization, three different samples of standard reference materials (SRMs) were analyzed using this technique and the data were compared with the already known data for these certified SRMs. The samples were selected so as to cover the maximum range of elements in the periodic table. Each sample was irradiated for three different values of collected beam charge at three different times. A proton beam of 2.57 MeV, obtained using a 5UDH-II Pelletron accelerator, was used for excitation of x-rays from the sample. The acquired experimental data were analyzed using the GUPIXWIN software. The results show that the SRM data and the data obtained using the PIXE technique are in good agreement.

  15. Sampling from the normal and exponential distributions

    International Nuclear Information System (INIS)

    Chaplin, K.R.; Wills, C.A.

    1982-01-01

    Methods for generating random numbers from the normal and exponential distributions are described. These involve dividing each function into subregions and, for each of these, developing a method of sampling usually based on an acceptance-rejection technique. When sampling from the normal or exponential distribution, each subregion provides the required random value with probability equal to the ratio of its area to the total area. Procedures written in FORTRAN for the CYBER 175/CDC 6600 system are provided to implement the two algorithms.
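
    As a minimal sketch of the acceptance-rejection idea (using a single exponential envelope for the half-normal, rather than the subregion decomposition described above):

```python
import numpy as np

def half_normal_ar(n, rng):
    """Sample |Z| for Z ~ N(0,1) by acceptance-rejection with an
    Exp(1) envelope; accept x with probability exp(-(x-1)^2 / 2)."""
    out = []
    while len(out) < n:
        x = rng.exponential(1.0)
        if rng.random() < np.exp(-0.5 * (x - 1.0) ** 2):
            out.append(x)
    return np.array(out)

# Attach a random sign to recover the full standard normal.
rng = np.random.default_rng(5)
z = half_normal_ar(100_000, rng) * rng.choice([-1.0, 1.0], 100_000)
print(f"mean = {z.mean():.3f}, std = {z.std():.3f}")  # ~0, ~1
```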

  16. 40 CFR 761.306 - Sampling 1 meter square surfaces by random selection of halves.

    Science.gov (United States)

    2010-07-01

    § 761.306 Sampling 1 meter square surfaces by random selection of halves. (a) Divide each 1 meter square portion where it is necessary to collect a surface wipe test sample into two equal (or as...

  17. Review of sample preparation techniques for the analysis of pesticide residues in soil.

    Science.gov (United States)

    Tadeo, José L; Pérez, Rosa Ana; Albero, Beatriz; García-Valcárcel, Ana I; Sánchez-Brunete, Consuelo

    2012-01-01

    This paper reviews the sample preparation techniques used for the analysis of pesticides in soil. The present status and recent advances made during the last 5 years in these methods are discussed. The analysis of pesticide residues in soil requires the extraction of analytes from this matrix, followed by a cleanup procedure, when necessary, prior to their instrumental determination. The optimization of sample preparation is a very important part of the method development that can reduce the analysis time, the amount of solvent, and the size of samples. This review considers all aspects of sample preparation, including extraction and cleanup. Classical extraction techniques, such as shaking, Soxhlet, and ultrasonic-assisted extraction, and modern techniques like pressurized liquid extraction, microwave-assisted extraction, solid-phase microextraction and QuEChERS (Quick, Easy, Cheap, Effective, Rugged, and Safe) are reviewed. The different cleanup strategies applied for the purification of soil extracts are also discussed. In addition, the application of these techniques to environmental studies is considered.

  18. A Nationwide Random Sampling Survey of Potential Complicated Grief in Japan

    Science.gov (United States)

    Mizuno, Yasunao; Kishimoto, Junji; Asukai, Nozomu

    2012-01-01

    To investigate the prevalence of significant loss, potential complicated grief (CG), and its contributing factors, we conducted a nationwide random sampling survey of Japanese adults aged 18 or older (N = 1,343) using a self-rating Japanese-language version of the Complicated Grief Brief Screen. Among them, 37.0% experienced their most significant…

  19. Reinforcing Sampling Distributions through a Randomization-Based Activity for Introducing ANOVA

    Science.gov (United States)

    Taylor, Laura; Doehler, Kirsten

    2015-01-01

    This paper examines the use of a randomization-based activity to introduce the ANOVA F-test to students. The two main goals of this activity are to successfully teach students to comprehend ANOVA F-tests and to increase student comprehension of sampling distributions. Four sections of students in an advanced introductory statistics course…
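
    A sketch of such a randomization-based F-test, with simulated data standing in for a classroom dataset (group sizes, effect sizes, and the number of permutations are invented):

```python
import numpy as np
from scipy import stats

def permutation_anova(groups, n_perm=5_000, seed=0):
    """Approximate the ANOVA F-test by randomization: shuffle group
    labels and recompute F to build the null sampling distribution."""
    rng = np.random.default_rng(seed)
    f_obs = stats.f_oneway(*groups).statistic
    pooled = np.concatenate(groups)
    cuts = np.cumsum([len(g) for g in groups])[:-1]
    exceed = 0
    for _ in range(n_perm):
        perm = rng.permutation(pooled)
        exceed += stats.f_oneway(*np.split(perm, cuts)).statistic >= f_obs
    return f_obs, (exceed + 1) / (n_perm + 1)

rng = np.random.default_rng(1)
groups = [rng.normal(m, 1, 20) for m in (0.0, 0.0, 0.8)]
f_obs, p_val = permutation_anova(groups)
print(f"F = {f_obs:.2f}, randomization p-value = {p_val:.4f}")
```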

  20. Laboratory techniques for safe encapsulation of α-emitting powder samples

    International Nuclear Information System (INIS)

    Chamberlain, H.E.; Pottinger, J.S.

    1984-01-01

    Plutonium oxide powder samples can be encapsulated in thin plastic film to prevent spread of contamination in counting and X-ray diffraction equipment. The film has to be thin enough to transmit X-rays and α-particles. Techniques are described for the wrapping process and the precautions necessary to keep the sample processing line free of significant contamination. (author)

  1. Randomization tests

    CERN Document Server

    Edgington, Eugene

    2007-01-01

    Statistical Tests That Do Not Require Random Sampling; Randomization Tests; Numerical Examples; Randomization Tests and Nonrandom Samples; The Prevalence of Nonrandom Samples in Experiments; The Irrelevance of Random Samples for the Typical Experiment; Generalizing from Nonrandom Samples; Intelligibility; Respect for the Validity of Randomization Tests; Versatility; Practicality; Precursors of Randomization Tests; Other Applications of Permutation Tests; Questions and Exercises; Notes; References; Randomized Experiments; Unique Benefits of Experiments; Experimentation without Mani...

  2. Creating ensembles of decision trees through sampling

    Science.gov (United States)

    Kamath, Chandrika; Cantu-Paz, Erick

    2005-08-30

    A system for decision tree ensembles that includes a module to read the data, a module to sort the data, a module to evaluate a potential split of the data according to some criterion using a random sample of the data, a module to split the data, and a module to combine multiple decision trees in ensembles. The decision tree method is based on statistical sampling techniques and includes the steps of reading the data, sorting the data, evaluating a potential split according to some criterion using a random sample of the data, splitting the data, and combining multiple decision trees in ensembles.

  3. Sampling phased array a new technique for signal processing and ultrasonic imaging

    OpenAIRE

    Bulavinov, A.; Joneit, D.; Kröning, M.; Bernus, L.; Dalichow, M.H.; Reddy, K.M.

    2006-01-01

    Different signal processing and image reconstruction techniques are applied in ultrasonic non-destructive material evaluation. In recent years, rapid development in the fields of microelectronics and computer engineering has led to the wide application of phased array systems. A new phased array technique, called "Sampling Phased Array", has been developed at the Fraunhofer Institute for non-destructive testing. It realizes a unique approach to the measurement and processing of ultrasonic signals. The sampling...

  4. The role of graphene-based sorbents in modern sample preparation techniques.

    Science.gov (United States)

    de Toffoli, Ana Lúcia; Maciel, Edvaldo Vasconcelos Soares; Fumes, Bruno Henrique; Lanças, Fernando Mauro

    2018-01-01

    The application of graphene-based sorbents in sample preparation techniques has increased significantly since 2011. These materials have good physicochemical properties for use as sorbents and have shown excellent results in different sample preparation techniques. Graphene and its precursor graphene oxide have been considered to be good candidates to improve the extraction and concentration of different classes of target compounds (e.g., parabens, polycyclic aromatic hydrocarbons, pyrethroids, triazines, and so on) present in complex matrices. They have been employed in the analysis of different matrices (e.g., environmental, biological and food). In this review, we highlight the most important characteristics of graphene-based materials, their properties, synthesis routes, and the most important applications in both off-line and on-line sample preparation techniques. The discussion of the off-line approaches includes methods derived from conventional solid-phase extraction, focusing on the miniaturized magnetic and dispersive modes. The modes of microextraction techniques called stir bar sorptive extraction, solid phase microextraction, and microextraction by packed sorbent are discussed. The on-line approaches focus on the use of graphene-based material mainly in on-line solid phase extraction, its variation called in-tube solid-phase microextraction, and on-line microdialysis systems. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. LOD score exclusion analyses for candidate QTLs using random population samples.

    Science.gov (United States)

    Deng, Hong-Wen

    2003-11-01

    While extensive analyses have been conducted to test for, no formal analyses have been conducted to test against, the importance of candidate genes as putative QTLs using random population samples. Previously, we developed an LOD score exclusion mapping approach for candidate genes for complex diseases. Here, we extend this LOD score approach for exclusion analyses of candidate genes for quantitative traits. Under this approach, specific genetic effects (as reflected by heritability) and inheritance models at candidate QTLs can be analyzed and if an LOD score is ≤ -2.0, the locus can be excluded from having a heritability larger than that specified. Simulations show that this approach has high power to exclude a candidate gene from having moderate genetic effects if it is not a QTL and is robust to population admixture. Our exclusion analysis complements association analysis for candidate genes as putative QTLs in random population samples. The approach is applied to test the importance of the Vitamin D receptor (VDR) gene as a potential QTL underlying the variation of bone mass, an important determinant of osteoporosis.

  6. The study of combining Latin Hypercube Sampling method and LU decomposition method (LULHS method) for constructing spatial random field

    Science.gov (United States)

    WANG, P. T.

    2015-12-01

    Groundwater modeling requires assigning hydrogeological properties to every numerical grid cell. Due to the lack of detailed information and the inherent spatial heterogeneity, geological properties can be treated as random variables. A hydrogeological property is assumed to follow a multivariate distribution with spatial correlations. By sampling random numbers from a given statistical distribution and assigning a value to each grid cell, a random field for modeling can be constructed. Therefore, statistical sampling plays an important role in the efficiency of the modeling procedure. Latin Hypercube Sampling (LHS) is a stratified random sampling procedure that provides an efficient way to sample variables from their multivariate distributions. This study combines the stratified random procedure of LHS with simulation using LU decomposition to form LULHS. Both conditional and unconditional simulations of LULHS were developed. The simulation efficiency and spatial correlation of LULHS are compared to those of three other simulation methods. The results show that for both conditional and unconditional simulation, the LULHS method is more efficient in terms of computational effort. Fewer realizations are required to achieve the required statistical accuracy and spatial correlation.
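
    A rough sketch of the idea, with a Cholesky factor standing in for the paper's LU factorization of the correlation matrix (for a symmetric positive-definite correlation matrix the two are closely related), and with illustrative grid and correlation-length values:

```python
import numpy as np
from scipy.stats import norm, qmc

def lulhs_field(corr, n_realizations, seed=0):
    """Latin Hypercube sample of standard normals, then impose the
    target spatial correlation with a triangular factor of the
    correlation matrix."""
    dim = corr.shape[0]
    sampler = qmc.LatinHypercube(d=dim, seed=seed)
    z = norm.ppf(sampler.random(n_realizations))   # LHS normal scores
    L = np.linalg.cholesky(corr)
    return z @ L.T                                 # correlated fields

# Exponential correlation between 50 grid nodes on a line.
x = np.arange(50.0)
corr = np.exp(-np.abs(x[:, None] - x[None, :]) / 10.0)
fields = lulhs_field(corr, n_realizations=500)
print(np.corrcoef(fields[:, 0], fields[:, 1])[0, 1])  # ~ exp(-0.1)
```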

  7. Classification of Phishing Email Using Random Forest Machine Learning Technique

    OpenAIRE

    Akinyelu, Andronicus A.; Adewumi, Aderemi O.

    2013-01-01

    Phishing is one of the major challenges faced by the world of e-commerce today. Due to phishing attacks, billions of dollars have been lost by many companies and individuals. In 2012, an online report put the loss due to phishing attacks at about $1.5 billion. The global impact of phishing attacks will continue to increase and thus requires more efficient phishing detection techniques to curb the menace. This paper investigates and reports the use of random forest machine learnin...
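
    A minimal sketch of this kind of classifier using scikit-learn; the feature matrix here is synthetic and the feature semantics are stand-ins (the paper's exact phishing feature set is not reproduced):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Hypothetical feature matrix: one row per email, columns standing in
# for features such as "has IP-based URL" or "number of dots in domain".
rng = np.random.default_rng(0)
X = rng.random((2000, 10))
y = (X[:, 0] + 0.5 * X[:, 3] + 0.2 * rng.standard_normal(2000) > 0.9).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_tr, y_tr)
print(f"holdout accuracy: {clf.score(X_te, y_te):.3f}")
```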

  8. Sampling Strategies and Processing of Biobank Tissue Samples from Porcine Biomedical Models.

    Science.gov (United States)

    Blutke, Andreas; Wanke, Rüdiger

    2018-03-06

    In translational medical research, porcine models have steadily become more popular. Considering the high value of individual animals, particularly of genetically modified pig models, and the often-limited number of available animals of these models, the establishment of (biobank) collections of adequately processed tissue samples suited for a broad spectrum of subsequent analysis methods, including analyses not specified at the time point of sampling, represents a meaningful approach to take full advantage of the translational value of the model. With respect to the peculiarities of porcine anatomy, comprehensive guidelines have recently been established for standardized generation of representative, high-quality samples from different porcine organs and tissues. These guidelines are essential prerequisites for the reproducibility of results and their comparability between different studies and investigators. The recording of basic data, such as organ weights and volumes, the determination of the sampling locations and of the numbers of tissue samples to be generated, as well as their orientation, size, processing and trimming directions, are relevant factors determining the generalizability and usability of the specimens for molecular, qualitative, and quantitative morphological analyses. Here, an illustrative, practical, step-by-step demonstration of the most important techniques for generation of representative, multi-purpose biobank specimens from porcine tissues is presented. The methods described here include determination of organ/tissue volumes and densities, the application of a volume-weighted systematic random sampling procedure for parenchymal organs by point-counting, determination of the extent of tissue shrinkage related to histological embedding of samples, and generation of randomly oriented samples for quantitative stereological analyses, such as isotropic uniform random (IUR) sections generated by the "Orientator" and "Isector" methods, and vertical
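
    The systematic uniform random sampling step can be sketched simply: positions a fixed interval apart with one random start, so every location along the organ has the same probability of being sampled. The organ length and sample count below are illustrative only:

```python
import numpy as np

def surs_positions(length, n_samples, rng):
    """Systematic uniform random sampling: slice positions a fixed
    interval apart with a single random start, giving every location
    an equal sampling probability."""
    interval = length / n_samples
    start = rng.uniform(0, interval)
    return start + interval * np.arange(n_samples)

rng = np.random.default_rng(11)
print(surs_positions(length=120.0, n_samples=8, rng=rng))  # mm along organ
```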

  9. Techniques of sample attack used in soil and mineral analysis. Phase I

    International Nuclear Information System (INIS)

    Chiu, N.W.; Dean, J.R.; Sill, C.W.

    1984-07-01

    Several techniques of sample attack for the determination of radioisotopes are reviewed. These techniques include: 1) digestion with nitric or hydrochloric acid in a Parr digestion bomb, 2) digestion with a mixture of nitric and hydrochloric acids, 3) digestion with a mixture of hydrofluoric, nitric and perchloric acids, and 4) fusion with sodium carbonate, potassium fluoride or alkali pyrosulfates. The effectiveness of these techniques in decomposing various soils and minerals containing radioisotopes such as lead-210, uranium, thorium and radium-226 is discussed. The combined procedure of potassium fluoride fusion followed by alkali pyrosulfate fusion is recommended for radium-226, uranium and thorium analysis. This technique guarantees the complete dissolution of samples containing refractory materials such as silica, silicates, carbides, oxides and sulfates. For the lead-210 analysis, the procedure of digestion with a mixture of hydrofluoric, nitric and perchloric acids followed by fusion with alkali pyrosulfate is recommended. These two procedures are detailed. Schemes for the sequential separation of the radioisotopes from a dissolved sample solution are outlined. Procedures for radiochemical analysis are suggested

  10. Assessment of Natural Radioactivity in TENORM Samples Using Different Techniques

    International Nuclear Information System (INIS)

    Salman, Kh.A.; Shahein, A.Y.

    2009-01-01

    In petroleum oil industries, technologically-enhanced naturally occurring radioactive materials (TENORM) are produced. The presence of TENORM constitutes a significant radiological human health hazard. In the present work, the liquid scintillation counting (LSC) technique was used to determine both 222 Rn and 226 Ra concentrations in TENORM samples, by measuring 222 Rn concentrations in the sample at different intervals of time after preparation. The radiation doses from the TENORM samples were estimated using a thermoluminescent detector (TLD-4000). The estimated radiation doses were found to be proportional to both the radiation doses measured on site and the natural activity concentrations in the samples measured with LSC

  11. Work Sampling Study of an Engineering Professor during a Regular Contract Period

    Science.gov (United States)

    Brink, Jan; McDonald, Dale B.

    2015-01-01

    Work sampling is a technique that has been employed in industry and fields such as healthcare for some time. It is a powerful technique, and an alternative to conventional stopwatch time studies, used by industrial engineers and based upon random work sampling observations. This study applies work sampling to the duties performed by an individual…

  12. Attenuation of species abundance distributions by sampling

    Science.gov (United States)

    Shimadzu, Hideyasu; Darnell, Ross

    2015-01-01

    Quantifying biodiversity aspects such as species presence/absence, richness and abundance is an important challenge to answer scientific and resource management questions. In practice, biodiversity can only be assessed from biological material taken by surveys, a difficult task given limited time and resources. A type of random sampling, often called sub-sampling, is a commonly used technique to reduce the amount of time and effort for investigating large quantities of biological samples. However, it is not immediately clear how (sub-)sampling affects the estimate of biodiversity aspects from a quantitative perspective. This paper specifies the effect of (sub-)sampling as attenuation of the species abundance distribution (SAD), and articulates how the sampling bias is induced in the SAD by random sampling. The framework presented also reveals some confusion in previous theoretical studies. PMID:26064626
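
    The attenuation effect is easy to demonstrate with binomial thinning: if each individual is included with probability q, rare species drop out of the observed sample and the SAD shifts. The community parameters below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)

# True community: log-normal species abundance distribution.
abundance = rng.lognormal(mean=3.0, sigma=1.5, size=300).astype(int)

# Random sub-sampling: each individual is taken with probability q.
q = 0.05
observed = rng.binomial(abundance, q)

print("true richness:    ", np.sum(abundance > 0))
print("observed richness:", np.sum(observed > 0))  # rare species vanish
```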

  13. Effect of manual therapy techniques on headache disability in patients with tension-type headache. Randomized controlled trial.

    Science.gov (United States)

    Espí-López, G V; Rodríguez-Blanco, C; Oliva-Pascual-Vaca, A; Benítez-Martínez, J C; Lluch, E; Falla, D

    2014-12-01

    Tension-type headache (TTH) is the most common type of primary headache; however, there is no clear evidence as to which specific treatment is most effective or whether combined treatment is more effective than individual treatments. To assess the effectiveness of manual therapy techniques, applied to the suboccipital region, on aspects of disability in a sample of patients with tension-type headache. Randomized controlled trial. Specialized centre for headache treatment. Seventy-six (62 women) patients (age: 39.9 ± 10.9 years) with episodic or chronic TTH. Patients were randomly divided into four treatment groups: 1) suboccipital soft tissue inhibition; 2) occiput-atlas-axis manipulation; 3) combined treatment of both techniques; 4) control. Four sessions were applied over 4 weeks and disability was assessed before and after treatment using the Headache Disability Inventory (HDI). Headache frequency, severity and the functional and emotional subscales of the questionnaire were assessed. Photophobia, phonophobia and pericranial tenderness were also monitored. Headache frequency was significantly reduced with the manipulative and the combined treatment, and significant improvements were observed across the treatment groups. The combined treatment also reduced the score on the emotional subscale of the HDI. When treatments were combined, effectiveness was noted for all aspects of disability and other symptoms including photophobia, phonophobia and pericranial tenderness. Although the individual manual therapy treatments showed a positive change in headache features, measures of photophobia, phonophobia and pericranial tenderness improved only in the group that received the combined treatment, suggesting that combined treatment is the most appropriate for symptomatic relief of TTH.

  14. Blocked Randomization with Randomly Selected Block Sizes

    Directory of Open Access Journals (Sweden)

    Jimmy Efird

    2010-12-01

    Full Text Available When planning a randomized clinical trial, careful consideration must be given to how participants are selected for various arms of a study. Selection and accidental bias may occur when participants are not assigned to study groups with equal probability. A simple random allocation scheme is a process by which each participant has equal likelihood of being assigned to treatment versus referent groups. However, by chance an unequal number of individuals may be assigned to each arm of the study and thus decrease the power to detect statistically significant differences between groups. Block randomization is a commonly used technique in clinical trial design to reduce bias and achieve balance in the allocation of participants to treatment arms, especially when the sample size is small. This method increases the probability that each arm will contain an equal number of individuals by sequencing participant assignments by block. Yet still, the allocation process may be predictable, for example, when the investigator is not blind and the block size is fixed. This paper provides an overview of blocked randomization and illustrates how to avoid selection bias by using random block sizes.
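
    A minimal sketch of blocked randomization with randomly selected block sizes for a two-arm trial (the block sizes and seed are illustrative; each block size must be divisible by the number of arms):

```python
import numpy as np

def blocked_randomization(n, block_sizes=(2, 4, 6), arms=("A", "B"), seed=0):
    """Allocation sequence for two arms using permuted blocks whose
    sizes are drawn at random, so the next assignment stays hard to
    predict while arm totals remain nearly balanced."""
    rng = np.random.default_rng(seed)
    seq = []
    while len(seq) < n:
        b = rng.choice(block_sizes)                 # random block size
        block = np.repeat(arms, b // len(arms))     # balanced within block
        seq.extend(rng.permutation(block))
    return seq[:n]

alloc = blocked_randomization(17)
print("".join(alloc), "| A:", alloc.count("A"), "B:", alloc.count("B"))
```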

  15. Review of online coupling of sample preparation techniques with liquid chromatography.

    Science.gov (United States)

    Pan, Jialiang; Zhang, Chengjiang; Zhang, Zhuomin; Li, Gongke

    2014-03-07

    Sample preparation is still considered the bottleneck of the whole analytical procedure, and efforts have been directed towards automation, improvement of sensitivity and accuracy, and low consumption of organic solvents. Development of online sample preparation techniques (SP) coupled with liquid chromatography (LC) is a promising way to achieve these goals, and it has attracted great attention. This article reviews the recent advances in online SP-LC techniques. Various online SP techniques have been described and summarized, including solid-phase-based extraction, liquid-phase-based extraction assisted with membranes, microwave-assisted extraction, ultrasonic-assisted extraction, accelerated solvent extraction and supercritical fluid extraction. Specially, the coupling approaches of online SP-LC systems and the corresponding interfaces have been discussed and reviewed in detail, such as online injectors, autosamplers combined with transport units, desorption chambers and column switching. Typical applications of the online SP-LC techniques have been summarized. Finally, the problems and expected trends in this field are discussed in order to encourage the further development of online SP-LC techniques. Copyright © 2014 Elsevier B.V. All rights reserved.

  16. Cleaning and Cleanliness Verification Techniques for Mars Returned Sample Handling

    Science.gov (United States)

    Mickelson, E. T.; Lindstrom, D. J.; Allton, J. H.; Hittle, J. D.

    2002-01-01

    Precision cleaning and cleanliness verification techniques are examined as a subset of a comprehensive contamination control strategy for a Mars sample return mission. Additional information is contained in the original extended abstract.

  17. Additive non-uniform random sampling in superimposed fiber Bragg grating strain gauge

    Science.gov (United States)

    Ma, Y. C.; Liu, H. Y.; Yan, S. B.; Yang, Y. H.; Yang, M. W.; Li, J. M.; Tang, J.

    2013-05-01

    This paper demonstrates an additive non-uniform random sampling and interrogation method for dynamic and/or static strain gauging using a reflection spectrum from two superimposed fiber Bragg gratings (FBGs). The superimposed FBGs are designed to generate non-equidistant spacing of a sensing pulse train in the time domain during dynamic strain gauging. By combining centroid finding with smooth filtering methods, both the interrogation speed and accuracy are improved. A 1.9 kHz dynamic strain is measured by generating an additive non-uniform randomly distributed 2 kHz optical sensing pulse train from a mean 500 Hz triangular periodically changing scanning frequency.

  18. Elemental analyses of groundwater: demonstrated advantage of low-flow sampling and trace-metal clean techniques over standard techniques

    Science.gov (United States)

    Creasey, C. L.; Flegal, A. R.

    The combined use of both (1) low-flow purging and sampling and (2) trace-metal clean techniques provides more representative measurements of trace-element concentrations in groundwater than results derived with standard techniques. The use of low-flow purging and sampling provides relatively undisturbed groundwater samples that are more representative of in situ conditions, and the use of trace-element clean techniques limits the inadvertent introduction of contaminants during sampling, storage, and analysis. When these techniques are applied, resultant trace-element concentrations are likely to be markedly lower than results based on standard sampling techniques. In a comparison of data derived from contaminated and control groundwater wells at a site in California, USA, trace-element concentrations from this study were 2-1000 times lower than those determined by the conventional techniques used in sampling of the same wells prior to (5 months) and subsequent to (1 month) the collections for this study. Specifically, the cadmium and chromium concentrations derived using standard sampling techniques exceed the California Maximum Contaminant Levels (MCL), whereas in this investigation concentrations of both of those elements are substantially below their MCLs. Consequently, the combined use of low-flow and trace-metal clean techniques may preclude erroneous reports of trace-element contamination in groundwater.

  19. Dental Students' Perceptions of Digital and Conventional Impression Techniques: A Randomized Controlled Trial.

    Science.gov (United States)

    Zitzmann, Nicola U; Kovaltschuk, Irina; Lenherr, Patrik; Dedem, Philipp; Joda, Tim

    2017-10-01

    The aim of this randomized controlled trial was to analyze inexperienced dental students' perceptions of the difficulty and applicability of digital and conventional implant impressions and their preferences including performance. Fifty undergraduate dental students at a dental school in Switzerland were randomly divided into two groups (2×25). Group A first took digital impressions in a standardized phantom model and then conventional impressions, while the procedures were reversed for Group B. Participants were asked to complete a VAS questionnaire (0-100) on the level of difficulty and applicability (user/patient-friendliness) of both techniques. They were asked which technique they preferred and perceived to be more efficient. A quotient of "effective scan time per software-recorded time" (TRIOS) was calculated as an objective quality indicator for intraoral optical scanning (IOS). The majority of students perceived IOS as easier than the conventional technique. Most (72%) preferred the digital approach using IOS to take the implant impression to the conventional method (12%) or had no preference (12%). Although total work was similar for males and females, the TRIOS quotient indicated that male students tended to use their time more efficiently. In this study, dental students with no clinical experience were very capable of acquiring digital tools, indicating that digital impression techniques can be included early in the dental curriculum to help them catch up with ongoing development in computer-assisted technologies used in oral rehabilitation.

  20. A secure cyclic steganographic technique for color images using randomization

    International Nuclear Information System (INIS)

    Muhammad, K.; Ahmad, J.; Rehman, N.U.

    2014-01-01

    Information security is a major concern in today's modern era. Almost all communicating bodies want the security, confidentiality and integrity of their personal data. But this security goal cannot be achieved easily when we are using an open network like the internet. Steganography provides one of the best solutions to this problem. This paper presents a new Cyclic Steganographic Technique (CST) based on Least Significant Bit (LSB) embedding for true color (RGB) images. The proposed method hides the secret data in the LSBs of cover image pixels in a randomized cyclic manner. The proposed technique is evaluated using both subjective and objective analysis using histogram changeability, Peak Signal-to-Noise Ratio (PSNR) and Mean Square Error (MSE). Experimentally it is found that the proposed method gives promising results in terms of security, imperceptibility and robustness as compared to some existing methods, which vindicates the new algorithm. (author)
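
    A simplified variant of randomized LSB embedding (not the authors' exact CST algorithm): pixels are visited in a pseudo-random order derived from a shared seed/key, and the color channel is chosen cyclically (R, G, B, R, ...):

```python
import numpy as np

def embed_cyclic_lsb(img, bits, seed=0):
    """Hide a bit string in the LSBs of an RGB image, walking the
    channels cyclically over pixels visited in a pseudo-random order
    derived from a shared seed (the stand-in for a stego key)."""
    stego = img.copy()
    h, w, _ = stego.shape
    order = np.random.default_rng(seed).permutation(h * w)[: len(bits)]
    for i, (pos, bit) in enumerate(zip(order, bits)):
        r, c, ch = pos // w, pos % w, i % 3     # cyclic channel choice
        stego[r, c, ch] = (stego[r, c, ch] & 0xFE) | bit
    return stego

rng = np.random.default_rng(1)
cover = rng.integers(0, 256, (64, 64, 3), dtype=np.uint8)
msg = rng.integers(0, 2, 128)
stego = embed_cyclic_lsb(cover, msg)
print("pixels changed:", int(np.sum(cover != stego)))
```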

  1. THE STUDY OF HEAVY METAL FROM ENVIRONMENTAL SAMPLES BY ATOMIC TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Ion V. POPESCU

    2011-05-01

    Full Text Available Using the Atomic Absorption Spectrometry (AAS) and Energy Dispersive X-ray Fluorescence spectrometry (EDXRF) techniques, we analyzed the contents of heavy metals (Cd, Cr, Ni, Pb, Ti, Sr, Co, Bi) in eight wild mushrooms and soil substrate samples (48 samples of eight fungal species and 32 underlying soil samples), collected from ten forest sites of Dambovița County, Romania. It was determined that the elements, especially heavy metals, in soil are characteristic of the acidic soils of Romanian forest lands and are influenced by industrial pollution. The analytical possibilities of the AAS and EDXRF techniques have been compared and the heavy metal transfer from substrate to mushrooms has been studied. The coefficient of accumulation of essential and heavy metals has been calculated as well. The heavy metal contents of all analyzed mushrooms were generally higher than previously reported in the literature.

  2. Comparison between correlated sampling and the perturbation technique of MCNP5 for fixed-source problems

    International Nuclear Information System (INIS)

    He Tao; Su Bingjing

    2011-01-01

    Highlights: → The performance of the MCNP differential operator perturbation technique is compared with that of the MCNP correlated sampling method for three types of fixed-source problems. → In terms of precision, the MCNP perturbation technique outperforms correlated sampling for one type of problem but performs comparably with or even under-performs correlated sampling for the other two types of problems. → In terms of accuracy, the MCNP perturbation calculations may predict inaccurate results for some of the test problems. However, the accuracy can be improved if the midpoint correction technique is used. - Abstract: Correlated sampling and the differential operator perturbation technique are two methods that enable MCNP (Monte Carlo N-Particle) to simulate small response change between an original system and a perturbed system. In this work the performance of the MCNP differential operator perturbation technique is compared with that of the MCNP correlated sampling method for three types of fixed-source problems. In terms of precision of predicted response changes, the MCNP perturbation technique outperforms correlated sampling for the problem involving variation of nuclide concentrations in the same direction but performs comparably with or even underperforms correlated sampling for the other two types of problems that involve void or variation of nuclide concentrations in opposite directions. In terms of accuracy, the MCNP differential operator perturbation calculations may predict inaccurate results that deviate from the benchmarks well beyond their uncertainty ranges for some of the test problems. However, the accuracy of the MCNP differential operator perturbation can be improved if the midpoint correction technique is used.
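
    The benefit of correlated sampling can be shown outside MCNP with a generic common-random-numbers sketch: reusing one random stream for the original and the perturbed system makes the small-difference estimate far less noisy than differencing two independent runs. The slab problem below is invented for illustration:

```python
import numpy as np

def transmission(mu, u):
    """Fraction of a photon batch that crosses a 1 cm slab, computed
    from pre-drawn uniforms u so two cross sections can share the
    same random number stream (correlated sampling)."""
    path = -np.log(u) / mu          # sampled free paths
    return np.mean(path > 1.0)

rng = np.random.default_rng(0)
u = rng.random(100_000)
mu0, mu1 = 2.00, 2.02               # original vs slightly perturbed system

corr_diff = transmission(mu1, u) - transmission(mu0, u)      # same stream
indep_diff = (transmission(mu1, rng.random(100_000))
              - transmission(mu0, rng.random(100_000)))      # fresh streams
exact = np.exp(-mu1) - np.exp(-mu0)
print(f"correlated: {corr_diff:.5f}, independent: {indep_diff:.5f}, "
      f"exact: {exact:.5f}")
```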

  3. A technique for extracting blood samples from mice in fire toxicity tests

    Science.gov (United States)

    Bucci, T. J.; Hilado, C. J.; Lopez, M. T.

    1976-01-01

    The extraction of adequate blood samples from moribund and dead mice has been a problem because of the small quantity of blood in each animal and the short time available between the animals' death and coagulation of the blood. These difficulties are particularly critical in fire toxicity tests because removal of the test animals while observing proper safety precautions for personnel is time-consuming. Techniques for extracting blood samples from mice were evaluated, and a technique was developed to obtain up to 0.8 ml of blood from a single mouse after death. The technique involves rapid exposure and cutting of the posterior vena cava and accumulation of blood in the peritoneal space. Blood samples of 0.5 ml or more from individual mice have been consistently obtained as much as 16 minutes after apparent death. Results of carboxyhemoglobin analyses of blood appeared reproducible and consistent with carbon monoxide concentrations in the exposure chamber.

  4. A scatter-corrected list-mode reconstruction and a practical scatter/random approximation technique for dynamic PET imaging

    International Nuclear Information System (INIS)

    Cheng, J-C; Rahmim, Arman; Blinder, Stephan; Camborde, Marie-Laure; Raywood, Kelvin; Sossi, Vesna

    2007-01-01

    We describe an ordinary Poisson list-mode expectation maximization (OP-LMEM) algorithm with a sinogram-based scatter correction method based on the single scatter simulation (SSS) technique and a random correction method based on the variance-reduced delayed-coincidence technique. We also describe a practical approximate scatter and random-estimation approach for dynamic PET studies based on a time-averaged scatter and random estimate followed by scaling according to the global numbers of true coincidences and randoms for each temporal frame. The quantitative accuracy achieved using OP-LMEM was compared to that obtained using the histogram-mode 3D ordinary Poisson ordered subset expectation maximization (3D-OP) algorithm with similar scatter and random correction methods, and they showed excellent agreement. The accuracy of the approximated scatter and random estimates was tested by comparing time activity curves (TACs) as well as the spatial scatter distribution from dynamic non-human primate studies obtained from the conventional (frame-based) approach and those obtained from the approximate approach. An excellent agreement was found, and the time required for the calculation of scatter and random estimates in the dynamic studies became much less dependent on the number of frames (we achieved a nearly four times faster performance on the scatter and random estimates by applying the proposed method). The precision of the scatter fraction was also demonstrated for the conventional and the approximate approaches using phantom studies.
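
    The per-frame scaling step described above can be sketched in a few lines. The Python fragment below assumes a single time-averaged scatter sinogram and per-frame global true-coincidence counts, both synthetic; the actual OP-LMEM scaling may differ in detail, so this is only one plausible reading of the approach.

    import numpy as np

    rng = np.random.default_rng(1)
    n_frames, n_bins = 6, 128
    scatter_avg = rng.random(n_bins)                   # time-averaged sinogram
    trues = np.array([2.0e6, 1.6e6, 1.2e6, 0.9e6, 0.7e6, 0.5e6])

    scale = trues / trues.mean()                       # each frame's share of trues
    scatter_frames = scale[:, None] * scatter_avg[None, :]
    print(scatter_frames.shape)                        # (6, 128): one row per frame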

  5. Comparison of sampling techniques for Rift Valley Fever virus ...

    African Journals Online (AJOL)

    We investigated mosquito sampling techniques with two types of traps and attractants at different time for trapping potential vectors for Rift Valley Fever virus. The study was conducted in six villages in Ngorongoro district in Tanzania from September to October 2012. A total of 1814 mosquitoes were collected, of which 738 ...

  6. A line-based vegetation sampling technique and its application in ...

    African Journals Online (AJOL)

    percentage cover, density and intercept frequency) and also provides plant size distributions, yet requires no more sampling effort than the line-intercept method.. A field test of the three techniques in succulent karoo, showed that the discriminating ...

  7. Comparison of correlation analysis techniques for irregularly sampled time series

    Directory of Open Access Journals (Sweden)

    K. Rehfeld

    2011-06-01

    Full Text Available Geoscientific measurements often provide time series with irregular time sampling, requiring either data reconstruction (interpolation) or sophisticated methods to handle irregular sampling. We compare the linear interpolation technique and different approaches for analyzing the correlation functions and persistence of irregularly sampled time series, such as Lomb-Scargle Fourier transformation and kernel-based methods. In a thorough benchmark test we investigate the performance of these techniques.

    All methods have comparable root mean square errors (RMSEs) for low skewness of the inter-observation time distribution. For high skewness, i.e. very irregular data, interpolation bias and RMSE increase strongly. We find a 40 % lower RMSE for the lag-1 autocorrelation function (ACF) for the Gaussian kernel method vs. the linear interpolation scheme, in the analysis of highly irregular time series. For the cross correlation function (CCF) the RMSE is then lower by 60 %. The application of the Lomb-Scargle technique gave results comparable to the kernel methods for the univariate case, but poorer results in the bivariate case. Especially the high-frequency components of the signal, where classical methods show a strong bias in ACF and CCF magnitude, are preserved when using the kernel methods.

    We illustrate the performance of the interpolation vs. the Gaussian kernel method by applying both to paleo-data from four locations, reflecting late Holocene Asian monsoon variability as derived from speleothem δ18O measurements. Cross correlation results are similar for both methods, which we attribute to the long time scales of the common variability. The persistence time (memory) is strongly overestimated when using the standard, interpolation-based approach. Hence, the Gaussian kernel is a reliable and more robust estimator with significant advantages compared to other techniques, and suitable for large-scale application to paleo-data.
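
    The kernel idea is compact enough to sketch. The Python fragment below estimates the autocorrelation of an irregularly sampled series by Gaussian-weighting all sample pairs whose time separation is close to the target lag; it is a minimal illustration of the approach, not Rehfeld et al.'s exact estimator, and the bandwidth choice is an assumption.

    import numpy as np

    def kernel_acf(t, x, lag, h):
        """Gaussian-kernel autocorrelation estimate at a given lag for an
        irregularly sampled series (t, x); h is the kernel bandwidth."""
        xc = (x - x.mean()) / x.std()
        dt = t[None, :] - t[:, None]                 # all pairwise time differences
        w = np.exp(-0.5 * ((dt - lag) / h) ** 2)     # weight pairs near the lag
        np.fill_diagonal(w, 0.0)                     # exclude zero-lag self pairs
        return float(np.sum(w * xc[None, :] * xc[:, None]) / np.sum(w))

    rng = np.random.default_rng(0)
    t = np.sort(rng.uniform(0, 100, 200))            # irregular sampling times
    x = np.sin(0.5 * t) + 0.3 * rng.standard_normal(t.size)
    print(kernel_acf(t, x, lag=1.0, h=0.5))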

  8. Application of nuclear and allied techniques for the characterisation of forensic samples

    International Nuclear Information System (INIS)

    Sudersanan, M.; Kayasth, S.R.; Pant, D.R.; Chattopadhyay, N.; Bhattacharyya, C.N.

    2002-01-01

    Full text: Forensic science deals with the application of techniques from physics, chemistry and biology to crime investigation. The legal implications of such analyses put considerable restrictions on the choice of analytical techniques. Moreover, the unknown nature of the materials, the limited availability of samples and the large number of elements to be analysed place considerable demands on the analytical chemist in selecting the appropriate technique. The availability of nuclear techniques has considerably enhanced the scope of forensic analysis. This paper deals with recent results on the use of nuclear and allied analytical techniques for forensic applications. One important type of forensic sample pertains to the identification of gunshot residues. The use of nuclear techniques has considerably simplified the interpretation of results through the use of appropriate elements such as Ba, Cu, Sb, Zn, As and Sn. The combination with non-nuclear techniques for elements like Pb and Ni, which are not easily amenable to analysis by NAA, and the use of appropriate separation procedures have made this method a valid and versatile analytical procedure. In view of the presence of large amounts of extraneous materials like cloth and body tissues in these samples and the limited availability of material, the procedures for sample collection, dissolution and analysis have been standardized. Analysis of unknown materials like powders and metallic pieces for the possible presence of nuclear materials, or of materials involved in illicit trafficking, has become important in recent years. A multi-technique approach is important in this case. Use of non-destructive techniques like XRF and radioactive counting enables the preliminary identification of materials and the detection of radioactivity. Subsequent analysis by NAA or other appropriate analytical methods allows the characterization of the materials.

  9. Additive non-uniform random sampling in superimposed fiber Bragg grating strain gauge

    International Nuclear Information System (INIS)

    Ma, Y C; Liu, H Y; Yan, S B; Li, J M; Tang, J; Yang, Y H; Yang, M W

    2013-01-01

    This paper demonstrates an additive non-uniform random sampling and interrogation method for dynamic and/or static strain gauge using a reflection spectrum from two superimposed fiber Bragg gratings (FBGs). The superimposed FBGs are designed to generate non-equidistant space of a sensing pulse train in the time domain during dynamic strain gauge. By combining centroid finding with smooth filtering methods, both the interrogation speed and accuracy are improved. A 1.9 kHz dynamic strain is measured by generating an additive non-uniform randomly distributed 2 kHz optical sensing pulse train from a mean 500 Hz triangular periodically changing scanning frequency. (paper)

  10. Rare event techniques applied in the Rasmussen study

    International Nuclear Information System (INIS)

    Vesely, W.E.

    1977-01-01

    The Rasmussen Study estimated public risks from commercial nuclear power plant accidents, and therefore had to treat the statistics of rare events. Two types of rare events were specifically handled: those which were probabilistically rare and those which were statistically rare. Four techniques were used to estimate probabilities of rare events: aggregating data samples, discretizing 'continuous' events, extrapolating from minor to catastrophic severities, and decomposing events using event trees and fault trees. In aggregating or combining data the goal was to enlarge the data sample so that the rare event was no longer rare, i.e., so that the enlarged data sample contained one or more occurrences of the event of interest. This aggregation gave rise to random variable treatments of failure rates, occurrence frequencies, and other characteristics estimated from data. This random variable treatment can be interpreted as comparable to an empirical Bayes technique or a Bayesian technique. In the discretizing technique, events of a detailed nature were grouped together into a grosser event for purposes of analysis as well as data collection. The treatment of data characteristics as random variables helped to account for the uncertainties arising from this discretizing. In the severity extrapolation technique a severity variable was associated with each event occurrence for the purpose of predicting probabilities of catastrophic occurrences. Tail behaviors of distributions therefore needed to be considered. Finally, event trees and fault trees were used to express accident occurrences and system failures in terms of more basic events for which data existed. Common mode failures and general dependencies therefore needed to be treated. 2 figures

  11. Determination of palladium in biological samples applying nuclear analytical techniques

    International Nuclear Information System (INIS)

    Cavalcante, Cassio Q.; Sato, Ivone M.; Salvador, Vera L. R.; Saiki, Mitiko

    2008-01-01

    This study presents Pd determinations in bovine tissue samples containing palladium prepared in the laboratory, and CCQM-P63 automotive catalyst materials of the Proficiency Test, using instrumental thermal and epithermal neutron activation analysis and energy dispersive X-ray fluorescence techniques. Solvent extraction and solid phase extraction procedures were also applied to separate Pd from interfering elements before the irradiation in the nuclear reactor. The results obtained by different techniques were compared against each other to examine sensitivity, precision and accuracy. (author)

  12. A new sampling technique for surface exposure dating using a portable electric rock cutter

    Directory of Open Access Journals (Sweden)

    Yusuke Suganuma

    2012-07-01

    Full Text Available Surface exposure dating using in situ cosmogenic nuclides has contributed to our understanding of Earth-surface processes. The precision of the ages estimated by this method is affected by the sample geometry; therefore, high-accuracy measurement of the thickness and shape of the rock sample is crucial. However, it is sometimes difficult to meet these requirements with conventional sampling methods using a hammer and chisel. Here, we propose a new sampling technique using a portable electric rock cutter. This sampling technique is faster, produces more precisely shaped samples, and allows for a more precise age interpretation. A simple theoretical model demonstrates that the age error due to defective sample geometry increases as the total sample thickness increases, indicating the importance of precise sampling for surface exposure dating.

  13. Mantle biopsy: a technique for nondestructive tissue-sampling of freshwater mussels

    Science.gov (United States)

    David J. Berg; Wendell R. Haag; Sheldon I. Guttman; James B. Sickel

    1995-01-01

    Mantle biopsy is a means of obtaining tissue samples for genetic, physiological, and contaminant studies of bivalves; but the effects of this biopsy on survival have not been determined. We describe a simple technique for obtaining such samples from unionacean bivalves and how we compared survival among biopsied and control organisms in field experiments. Survival was...

  14. A Monte Carlo Sampling Technique for Multi-phonon Processes

    Energy Technology Data Exchange (ETDEWEB)

    Hoegberg, Thure

    1961-12-15

    A sampling technique for selecting scattering angle and energy gain in Monte Carlo calculations of neutron thermalization is described. It is supposed that the scattering is separated into processes involving different numbers of phonons. The number of phonons involved is first determined. Scattering angle and energy gain are then chosen by using special properties of the multi-phonon term.

  15. Evaluation of primary immunization coverage of infants under universal immunization programme in an urban area of bangalore city using cluster sampling and lot quality assurance sampling techniques.

    Science.gov (United States)

    K, Punith; K, Lalitha; G, Suman; Bs, Pradeep; Kumar K, Jayanth

    2008-07-01

    Is the LQAS technique better than the cluster sampling technique in terms of resources to evaluate immunization coverage in an urban area? To assess and compare lot quality assurance sampling against cluster sampling in the evaluation of primary immunization coverage. Population-based cross-sectional study. Areas under Mathikere Urban Health Center. Children aged 12 months to 23 months. 220 in cluster sampling, 76 in lot quality assurance sampling. Percentages and proportions, chi-square test. (1) Using cluster sampling, the percentages of completely immunized, partially immunized and unimmunized children were 84.09%, 14.09% and 1.82%, respectively. With lot quality assurance sampling, they were 92.11%, 6.58% and 1.31%, respectively. (2) Immunization coverage levels as evaluated by the cluster sampling technique were not statistically different from the coverage values obtained by the lot quality assurance sampling technique. Considering the time and resources required, it was found that lot quality assurance sampling is the better technique for evaluating primary immunization coverage in an urban area.
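
    The LQAS decision rule itself is compact enough to show in code. The Python sketch below, using SciPy, computes the two misclassification risks for a hypothetical plan of n = 19 children per lot and decision value d = 6, against an 80% coverage target and a 50% "unacceptable" level; these plan parameters are illustrative assumptions, not the study's actual design.

    from scipy.stats import binom

    # Sample n children per lot; classify the lot as below target if more
    # than d are found unvaccinated. For (19, 6) both risks come out ~8%.
    n, d = 19, 6
    p_good, p_bad = 0.80, 0.50

    alpha = 1 - binom.cdf(d, n, 1 - p_good)   # risk of rejecting a lot at 80%
    beta = binom.cdf(d, n, 1 - p_bad)         # risk of accepting a lot at 50%
    print(f"alpha = {alpha:.3f}, beta = {beta:.3f}")   # ~0.068, ~0.084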

  16. TRAN-STAT: statistics for environmental studies, Number 22. Comparison of soil-sampling techniques for plutonium at Rocky Flats

    International Nuclear Information System (INIS)

    Gilbert, R.O.; Bernhardt, D.E.; Hahn, P.B.

    1983-01-01

    A summary of a field soil sampling study conducted around the Rocky Flats, Colorado plant in May 1977 is presented. Several different soil sampling techniques that had been used in the area were applied at four different sites. One objective was to compare the average 239,240Pu concentration values obtained by the various soil sampling techniques used. There was also interest in determining whether there are differences in the reproducibility of the various techniques and how the techniques compared with the proposed EPA technique of sampling to 1 cm depth. Statistically significant differences in average concentrations between the techniques were found. The differences could be largely related to the differences in sampling depth, the primary physical variable distinguishing the techniques. The reproducibility of the techniques was evaluated by comparing coefficients of variation. Differences between coefficients of variation were not statistically significant. Average (median) coefficients ranged from 21 to 42 percent for the five sampling techniques. A laboratory study indicated that various sample treatment and particle sizing techniques could increase the concentration of plutonium in the less than 10 micrometer size fraction by up to a factor of about 4 compared to the 2 mm size fraction.

  17. Stratified source-sampling techniques for Monte Carlo eigenvalue analysis

    International Nuclear Information System (INIS)

    Mohamed, A.

    1998-01-01

    In 1995, at a conference on criticality safety, a special session was devoted to the Monte Carlo ''Eigenvalue of the World'' problem. Argonne presented a paper, at that session, in which the anomalies originally observed in that problem were reproduced in a much simplified model-problem configuration, and removed by a version of stratified source-sampling. In this paper, stratified source-sampling techniques are generalized and applied to three different Eigenvalue of the World configurations which take into account real-world statistical noise sources not included in the model problem, but which differ in the amount of neutronic coupling among the constituents of each configuration. It is concluded that, in Monte Carlo eigenvalue analysis of loosely-coupled arrays, the use of stratified source-sampling reduces the probability of encountering an anomalous result over that if conventional source-sampling methods are used. However, this gain in reliability is substantially less than that observed in the model-problem results
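
    Stratified source-sampling as implemented for eigenvalue problems is elaborate, but the variance mechanism it exploits can be shown with a toy example. The following Python sketch, an assumption-laden illustration rather than the Argonne algorithm, compares conventional and stratified sampling of a uniform source for estimating a smooth response: the stratified draw removes the between-strata component of the variance.

    import numpy as np

    rng = np.random.default_rng(0)
    f = lambda x: np.exp(x)                     # any smooth response
    n, k = 10_000, 100                          # total samples, number of strata

    plain = f(rng.random(n)).mean()             # conventional sampling

    edges = np.arange(k) / k                    # stratum lower bounds
    u = rng.random((k, n // k))                 # n/k samples per stratum
    stratified = f(edges[:, None] + u / k).mean()

    print(plain, stratified)                    # both estimate e - 1 ~ 1.71828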

  18. Evaluation of primary immunization coverage of infants under universal immunization programme in an urban area of Bangalore city using cluster sampling and lot quality assurance sampling techniques

    Directory of Open Access Journals (Sweden)

    Punith K

    2008-01-01

    Full Text Available Research Question: Is the LQAS technique better than the cluster sampling technique in terms of resources to evaluate immunization coverage in an urban area? Objective: To assess and compare the lot quality assurance sampling against cluster sampling in the evaluation of primary immunization coverage. Study Design: Population-based cross-sectional study. Study Setting: Areas under Mathikere Urban Health Center. Study Subjects: Children aged 12 months to 23 months. Sample Size: 220 in cluster sampling, 76 in lot quality assurance sampling. Statistical Analysis: Percentages and proportions, chi-square test. Results: (1) Using cluster sampling, the percentages of completely immunized, partially immunized and unimmunized children were 84.09%, 14.09% and 1.82%, respectively. With lot quality assurance sampling, they were 92.11%, 6.58% and 1.31%, respectively. (2) Immunization coverage levels as evaluated by the cluster sampling technique were not statistically different from the coverage values obtained by the lot quality assurance sampling technique. Considering the time and resources required, it was found that lot quality assurance sampling is the better technique for evaluating primary immunization coverage in an urban area.

  19. Monte Carlo and Quasi-Monte Carlo Sampling

    CERN Document Server

    Lemieux, Christiane

    2009-01-01

    Presents essential tools for using quasi-Monte Carlo sampling in practice. This book focuses on issues related to Monte Carlo methods - uniform and non-uniform random number generation, variance reduction techniques. It covers several aspects of quasi-Monte Carlo methods.

  20. Analytical techniques for measurement of 99Tc in environmental samples

    International Nuclear Information System (INIS)

    Anon.

    1979-01-01

    Three new methods have been developed for measuring 99Tc in environmental samples. The most sensitive method is isotope dilution mass spectrometry, which allows measurement of about 1 × 10^-12 grams of 99Tc. Results on analysis of five samples by this method compare very well with values obtained by a second independent method, which involves counting of beta particles from 99Tc and internal conversion electrons from 97mTc. A third method involving electrothermal atomic absorption has also been developed. Although this method is not as sensitive as the first two techniques, the cost per analysis is expected to be considerably less for certain types of samples.

  1. Determination of some trace elements in biological samples using XRF and TXRF techniques

    International Nuclear Information System (INIS)

    Khuder, A.; Karjou, J.; Sawan, M. K.

    2006-07-01

    XRF and TXRF techniques were successfully used for the multi-element determination of trace elements in whole blood and human head hair samples. This was achieved by direct analysis using the XRF technique with different collimation units and by optimized chemical procedures for TXRF analysis. The light elements S and P were preferably determined by XRF with primary X-ray excitation, while K, Ca, Fe, and Br were determined with very good accuracy and precision using XRF with Cu- and Mo-secondary targets. The chemical procedure based on the preconcentration of trace elements by APDC proved particularly useful for the determination of traces of Ni and Pb, in the ranges of 1.0-1.7 μg/dl and 11-23 μg/dl, respectively, in whole blood samples by the TXRF technique; determination of other elements such as Cu and Zn was also achievable using this approach. Rb in whole blood samples was determined directly, after digestion of the samples in a PTFE bomb, for TXRF analysis. (author)

  2. Recent advances in sample preparation techniques and methods of sulfonamides detection - A review.

    Science.gov (United States)

    Dmitrienko, Stanislava G; Kochuk, Elena V; Apyari, Vladimir V; Tolmacheva, Veronika V; Zolotov, Yury A

    2014-11-19

    Sulfonamides (SAs) have been the most widely used antimicrobial drugs for more than 70 years, and their residues in foodstuffs and environmental samples pose serious health hazards. For this reason, sensitive and specific methods for the quantification of these compounds in numerous matrices have been developed. This review intends to provide an updated overview of the recent trends over the past five years in sample preparation techniques and methods for detecting SAs. Examples of the sample preparation techniques, including liquid-liquid and solid-phase extraction, dispersive liquid-liquid microextraction and QuEChERS, are given. Different methods of detecting the SAs present in food and feed and in environmental, pharmaceutical and biological samples are discussed. Copyright © 2014 Elsevier B.V. All rights reserved.

  3. IMAGE SEGMENTATION BASED ON MARKOV RANDOM FIELD AND WATERSHED TECHNIQUES

    Institute of Scientific and Technical Information of China (English)

    纳瑟; 刘重庆

    2002-01-01

    This paper presents a method that incorporates Markov Random Field (MRF), watershed segmentation and merging techniques for performing image segmentation and edge detection tasks. MRF is used to obtain an initial estimate of the regions in the image under processing, where in the MRF model the gray level at pixel location i in an image X depends on the gray levels of neighboring pixels. The process needs an initial segmentation, which is obtained with the K-means clustering technique and the minimum-distance rule; the region process is then modeled by MRF to obtain an image consisting of different intensity regions. Starting from this, we calculate the gradient values of that image and then employ a watershed technique. The MRF stage yields an image that has different intensity regions and carries all the edge and region information; the watershed algorithm then improves the segmentation result by superimposing a closed and accurate boundary on each region. After all pixels of the segmented regions have been processed, a map of primitive regions with edges is generated. Finally, a merge process based on averaged mean values is employed. The final segmentation and edge detection result is one closed boundary per actual region in the image.
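
    The pipeline above is concrete enough to sketch. The following Python fragment, using scikit-image, reproduces the gradient-plus-watershed stages on a stock test image; the MRF/K-means initial estimate is replaced by simple intensity-threshold seeds, an assumption made purely for brevity.

    import numpy as np
    from skimage import data, filters, segmentation

    img = data.coins()                       # built-in grayscale test image
    gradient = filters.sobel(img)            # gradient magnitude image

    markers = np.zeros_like(img, dtype=int)  # coarse initial region estimate
    markers[img < 30] = 1                    # background seed
    markers[img > 150] = 2                   # coin (foreground) seed

    labels = segmentation.watershed(gradient, markers)  # closed region boundaries
    print(np.unique(labels))                 # [1 2]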

  4. Efficacy and complications associated with a modified inferior alveolar nerve block technique. A randomized, triple-blind clinical trial.

    Science.gov (United States)

    Montserrat-Bosch, Marta; Figueiredo, Rui; Nogueira-Magalhães, Pedro; Arnabat-Dominguez, Josep; Valmaseda-Castellón, Eduard; Gay-Escoda, Cosme

    2014-07-01

    To compare the efficacy and complication rates of two different techniques for inferior alveolar nerve blocks (IANB). A randomized, triple-blind clinical trial comprising 109 patients who required lower third molar removal was performed. In the control group, all patients received an IANB using the conventional Halsted technique, whereas in the experimental group, a modified technique using a more inferior injection point was performed. A total of 100 patients were randomized. The modified technique group showed a significantly higher onset time in the lower lip and chin area, and was frequently associated with a lingual electric-discharge sensation. Three failures were recorded, 2 of them in the experimental group. No relevant local or systemic complications were registered. Both IANB techniques used in this trial are suitable for lower third molar removal. However, performing an inferior alveolar nerve block at a more inferior position (modified technique) extends the onset time, does not seem to reduce the risk of intravascular injection, and might increase the risk of lingual nerve injury.

  5. Symbol synchronization and sampling frequency synchronization techniques in real-time DDO-OFDM systems

    Science.gov (United States)

    Chen, Ming; He, Jing; Cao, Zizheng; Tang, Jin; Chen, Lin; Wu, Xian

    2014-09-01

    In this paper, we propose and experimentally demonstrate symbol synchronization and sampling frequency synchronization techniques in a real-time direct-detection optical orthogonal frequency division multiplexing (DDO-OFDM) system, over 100-km standard single mode fiber (SSMF), using a cost-effective directly modulated distributed feedback (DFB) laser. The experimental results show that the proposed symbol synchronization based on a training sequence (TS) has low complexity and high accuracy even at a sampling frequency offset (SFO) of 5000 ppm. Meanwhile, the proposed pilot-assisted sampling frequency synchronization between the digital-to-analog converter (DAC) and analog-to-digital converter (ADC) is capable of estimating SFOs accurately; the technique can also compensate SFO effects within the small residual SFO caused by deviation of the SFO estimation and a low-precision or unstable clock source. The two synchronization techniques are suitable for high-speed DDO-OFDM transmission systems.

  6. Modified emission-transmission method for determining trace elements in solid samples using the XRF techniques

    International Nuclear Information System (INIS)

    Poblete, V.; Alvarez, M.; Hermosilla, M.

    2000-01-01

    This is a study of the analysis of trace elements in medium-thick solid samples by the modified emission-transmission method, using the energy dispersive X-ray fluorescence (EDXRF) technique. Absorption and enhancement (reinforcement) effects are the main obstacles to the quantitative analysis of major and trace elements in solid samples by the EDXRF technique. The implementation of this method and its application to a variety of samples was carried out using an infinitely thick multi-element white sample, from which the correction factors accounting for absorption of all the analytes in the sample are calculated. The discontinuities in the mass absorption coefficient versus energy curves for each element, for medium-thick and homogeneous samples, are analyzed and corrected. A thorough analysis of the different theoretical and experimental variables is verified using real samples, including certified material of known concentration. The simplicity of the calculation method and the results obtained show the method's high precision, with possibilities for the non-destructive routine analysis of different solid samples using the EDXRF technique (author)

  7. Random assay in radioimmunoassay: Feasibility and application compared with batch assay

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jung Min; Lee, Hwan Hee; Park, Sohyun; Kim, Tae Sung; Kim, Seok Ki [Dept. of Nuclear MedicineNational Cancer Center, Goyang (Korea, Republic of)

    2016-12-15

    The batch assay has been conventionally used for radioimmunoassay (RIA) because of its technical robustness and practical convenience. However, it has limitations in terms of delayed reporting, because a small number of samples must be accumulated into multiple assay runs, compared with the random assay technique. In this study, we aimed to verify whether the random assay technique can be applied to RIA and is feasible in daily practice. The coefficients of variation (CVs) of eight standard curves within a single kit were calculated in a CA-125 immunoradiometric assay (IRMA) as the reference for the practically ideal CV of the CA-125 kit. Ten standard curves of 10 kits from 2 prospectively collected lots (pLot) and 85 standard curves of 85 kits from 3 retrospectively collected lots (Lot) were obtained. Additionally, the raw measurement data of both 170 control references and 1123 patients' sera were collected retrospectively between December 2015 and January 2016. A standard curve of the first kit of each lot was used as a master standard curve for the random assay. The inter-kit CVs were analyzed in each lot, respectively. All raw measurements were normalized by decay and radioactivity. The CA-125 values from control samples and patients' sera were compared using the original batch assay and the random assay. In standard curve analysis, the inter-kit CVs in pLots and Lots were comparable to those within a single kit. The CVs from the random assay with normalization were similar to those from the batch assay in the control samples (CVs % of low/high concentration; Lot1 2.71/1.91, Lot2 2.35/1.83, Lot3 2.83/2.08 vs. Lot1 2.05/1.21, Lot2 1.66/1.48, Lot3 2.41/2.14). The ICCs between the batch assay and random assay using patients' sera were satisfactory (Lot1 1.00, Lot2 0.999, Lot3 1.00). The random assay technique could be successfully applied to the conventional CA-125 IRMA kits. The random assay showed strong agreement with the batch assay.

  8. Recent Trends in Microextraction Techniques Employed in Analytical and Bioanalytical Sample Preparation

    Directory of Open Access Journals (Sweden)

    Abuzar Kabir

    2017-12-01

    Full Text Available Sample preparation has been recognized as a major step in the chemical analysis workflow. As such, substantial efforts have been made in recent years to simplify the overall sample preparation process. Major focusses of these efforts have included miniaturization of the extraction device; minimizing/eliminating toxic and hazardous organic solvent consumption; eliminating sample pre-treatment and post-treatment steps; reducing the sample volume requirement; reducing extraction equilibrium time, maximizing extraction efficiency etc. All these improved attributes are congruent with the Green Analytical Chemistry (GAC principles. Classical sample preparation techniques such as solid phase extraction (SPE and liquid-liquid extraction (LLE are being rapidly replaced with emerging miniaturized and environmentally friendly techniques such as Solid Phase Micro Extraction (SPME, Stir bar Sorptive Extraction (SBSE, Micro Extraction by Packed Sorbent (MEPS, Fabric Phase Sorptive Extraction (FPSE, and Dispersive Liquid-Liquid Micro Extraction (DLLME. In addition to the development of many new generic extraction sorbents in recent years, a large number of molecularly imprinted polymers (MIPs created using different template molecules have also enriched the large cache of microextraction sorbents. Application of nanoparticles as high-performance extraction sorbents has undoubtedly elevated the extraction efficiency and method sensitivity of modern chromatographic analyses to a new level. Combining magnetic nanoparticles with many microextraction sorbents has opened up new possibilities to extract target analytes from sample matrices containing high volumes of matrix interferents. The aim of the current review is to critically audit the progress of microextraction techniques in recent years, which has indisputably transformed the analytical chemistry practices, from biological and therapeutic drug monitoring to the environmental field; from foods to phyto

  9. Development of core sampling technique for ITER Type B radwaste

    Energy Technology Data Exchange (ETDEWEB)

    Kim, S. G.; Hong, K. P.; Oh, W. H.; Park, M. C.; Jung, S. H.; Ahn, S. B. [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2016-10-15

    Type B radwaste (intermediate level and long lived radioactive waste) removed from the ITER vacuum vessel is to be treated and stored in the basement of the hot cell building. The Type B radwaste treatment process is composed of buffer storage, cutting, sampling/tritium measurement, tritium removal, characterization, pre-packaging, inspection/decontamination, and storage etc. The cut slices of Type B radwaste components generated by the cutting process undergo the sampling process before and after the tritium removal process. The purpose of sampling is to obtain small pieces of samples in order to investigate the tritium content and concentration of Type B radwaste. Core sampling, one of the candidate sampling techniques to be applied in the ITER hot cell, is suitable for metal of limited thickness (less than 50 mm) without the use of coolant. The materials tested were SS316L and CuCrZr, chosen to simulate ITER Type B radwaste. In core sampling, substantial secondary waste from cutting chips will unavoidably be produced. Thus, the core sampling machine will have to be equipped with a disposal system such as suction equipment. Core sampling is considered unfavorable in terms of tool wear compared to conventional drilling.

  10. A simple sample size formula for analysis of covariance in cluster randomized trials.

    NARCIS (Netherlands)

    Teerenstra, S.; Eldridge, S.; Graff, M.J.; Hoop, E. de; Borm, G.F.

    2012-01-01

    For cluster randomized trials with a continuous outcome, the sample size is often calculated as if an analysis of the outcomes at the end of the treatment period (follow-up scores) would be performed. However, often a baseline measurement of the outcome is available or feasible to obtain.
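
    The baseline-adjusted formula itself is in the paper; as context, the standard follow-up-only calculation that it refines can be sketched in a few lines. The Python fragment below computes clusters per arm from the usual design effect 1 + (m - 1)·ICC; all parameter values are illustrative assumptions, not the paper's examples.

    import math

    def cluster_sample_size(delta, sd, m, icc, alpha=0.05, power=0.8):
        """Clusters per arm for a two-arm trial with cluster size m."""
        z_a, z_b = 1.959964, 0.841621         # normal quantiles for alpha/2, power
        n_ind = 2 * (z_a + z_b) ** 2 * (sd / delta) ** 2   # per arm, individual RCT
        deff = 1 + (m - 1) * icc              # inflation from cluster randomization
        return math.ceil(n_ind * deff / m)

    print(cluster_sample_size(delta=0.4, sd=1.0, m=20, icc=0.05))   # -> 10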

  11. 3D-Laser-Scanning Technique Applied to Bulk Density Measurements of Apollo Lunar Samples

    Science.gov (United States)

    Macke, R. J.; Kent, J. J.; Kiefer, W. S.; Britt, D. T.

    2015-01-01

    In order to better interpret gravimetric data from orbiters such as GRAIL and LRO to understand the subsurface composition and structure of the lunar crust, it is important to have a reliable database of the density and porosity of lunar materials. To this end, we have been surveying these physical properties in both lunar meteorites and Apollo lunar samples. To measure porosity, both grain density and bulk density are required. For bulk density, our group has historically made extensive use of sub-mm bead immersion techniques, though several factors have made this technique problematic for our work with Apollo samples. Samples allocated for measurement are often smaller than optimal for the technique, leading to large error bars. Also, for some samples we were required to use pure alumina beads instead of our usual glass beads. The alumina beads were subject to undesirable static effects, producing unreliable results. Other investigators have tested the use of 3D laser scanners on meteorites for measuring bulk volumes. Early work, though promising, was plagued with difficulties including poor response on dark or reflective surfaces, difficulty reproducing sharp edges, and long processing times for producing shape models. Due to progress in technology, however, laser scanners have improved considerably in recent years. We tested this technique on 27 lunar samples in the Apollo collection using a scanner at NASA Johnson Space Center. We found it to be reliable and more precise than beads, with the added benefit that it involves no direct contact with the sample, enabling the study of particularly friable samples for which bead immersion is not possible.
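
    The reason both densities are needed is the porosity relation itself: the void fraction is one minus the ratio of bulk to grain density. A minimal Python illustration, with made-up numbers rather than Apollo data:

    def porosity(bulk_density, grain_density):
        """Porosity from the two measured densities: the void fraction is
        whatever volume the grains do not account for."""
        return 1.0 - bulk_density / grain_density

    print(f"{porosity(2.86, 3.01):.1%}")   # ~5.0% porosity for these numbers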

  12. Analysis of pure and malachite green doped polysulfone sample using FT-IR technique

    Science.gov (United States)

    Nayak, Rashmi J.; Khare, P. K.; Nayak, J. G.

    2018-05-01

    The samples of pure and malachite green doped polysulfone in the form of foils were prepared by the isothermal immersion technique. For the preparation of the pure sample, 4 g of polysulfone was dissolved in 50 ml of dimethylformamide (DMF) solvent, while for the preparation of the doped samples 10 mg, 50 mg and 100 mg of malachite green were mixed with 4 g of polysulfone, respectively. For the structural characterization of these pure and doped samples, the Fourier Transform Infrared (FT-IR) spectroscopy technique was used. This study shows that the intensity of transmittance decreases as the doping ratio in pure polysulfone increases. The reduction in transmittance intensity is clearly apparent in the present case; moreover, the bands were broader, which indicates charge-transfer interaction between the donor and acceptor molecules.

  13. Comparison of different anesthesia techniques during esophagogastroduedenoscopy in children: a randomized trial.

    Science.gov (United States)

    Patino, Mario; Glynn, Susan; Soberano, Mark; Putnam, Philip; Hossain, Md Monir; Hoffmann, Clifford; Samuels, Paul; Kibelbek, Michael J; Gunter, Joel

    2015-10-01

    Esophagogastroduedenoscopy (EGD) in children is usually performed under general anesthesia. Anesthetic goals include minimization of airway complications while maximizing operating room (OR) efficiency. Currently, there is no consensus on which anesthetic technique best meets these goals. We performed a prospective randomized study comparing three different anesthetic techniques. To evaluate the incidence of respiratory complications (primary aim) and institutional efficiency (secondary aim) among three different anesthetic techniques in children undergoing EGD. Subjects received a standardized inhalation induction of anesthesia followed by randomization to one of three groups: Group intubated, sevoflurane (IS), Group intubated, propofol (IP), and Group native airway, nonintubated, propofol (NA). Respiratory complications included minor desaturation (SpO2 between 94% and 85%), severe desaturation (SpO2 < 85%), apnea, airway obstruction/laryngospasm, aspiration, and/or inadequate anesthesia during the endoscopy. Evaluation of institutional efficiency was determined by examining the time spent during the different phases of care (anesthesia preparation, procedure, OR stay, recovery, and total perioperative care). One hundred and seventy-nine children aged 1-12 years (median 7 years; 4.0, 10.0) were enrolled (Group IS N = 60, Group IP N = 59, Group NA N = 61). The incidence of respiratory complications was higher in Group NA (0.459) vs Group IS (0.033) or Group IP (0.086) (P < 0.0001). The most commonly observed complications were desaturation, inadequate anesthesia, and apnea. There were no differences in institutional efficiency among the three groups. Respiratory complications were more common in Group NA. The use of a native airway with propofol maintenance during EGD does not offer advantages with respect to respiratory complications or institutional efficiency. © 2015 John Wiley & Sons Ltd.

  14. Evaluation of a Class of Simple and Effective Uncertainty Methods for Sparse Samples of Random Variables and Functions

    Energy Technology Data Exchange (ETDEWEB)

    Romero, Vicente [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bonney, Matthew [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Schroeder, Benjamin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Weirs, V. Gregory [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-11-01

    When very few samples of a random quantity are available from a source distribution of unknown shape, it is usually not possible to accurately infer the exact distribution from which the data samples come. Under-estimation of important quantities such as response variance and failure probabilities can result. For many engineering purposes, including design and risk analysis, we attempt to avoid under-estimation with a strategy to conservatively estimate (bound) these types of quantities -- without being overly conservative -- when only a few samples of a random quantity are available from model predictions or replicate experiments. This report examines a class of related sparse-data uncertainty representation and inference approaches that are relatively simple, inexpensive, and effective. Tradeoffs between the methods' conservatism, reliability, and risk versus number of data samples (cost) are quantified with multi-attribute metrics used to assess method performance for conservative estimation of two representative quantities: the central 95% of the response, and the 10^-4 probability of exceeding a response threshold in a tail of the distribution. Each method's performance is characterized with 10,000 random trials on a large number of diverse and challenging distributions. The best method and number of samples to use in a given circumstance depends on the uncertainty quantity to be estimated, the PDF character, and the desired reliability of bounding the true value. On the basis of this large database and study, a strategy is proposed for selecting the method and number of samples for attaining reasonable credibility levels in bounding these types of quantities when sparse samples of random variables or functions are available from experiments or simulations.
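
    The under-estimation problem has a simple order-statistics core, sketched below in Python: with n samples, the chance that even one observation lands beyond a tail quantile grows only as 1 - 0.95^n, which is the same arithmetic that underlies distribution-free tolerance bounds. This is generic background, not Sandia's specific multi-attribute methodology.

    # P(at least one of n samples exceeds the 95th percentile) = 1 - 0.95**n
    for n in (3, 10, 30, 59, 100):
        print(n, f"{1 - 0.95 ** n:.3f}")
    # n = 59 is the classic "95/95" point: the sample maximum bounds the
    # 95th percentile with at least 95% confidence (1 - 0.95**59 ~ 0.952).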

  15. Convergence analysis for Latin-hypercube lattice-sample selection strategies for 3D correlated random hydraulic-conductivity fields

    OpenAIRE

    Simuta-Champo, R.; Herrera-Zamarrón, G. S.

    2010-01-01

    The Monte Carlo technique provides a natural method for evaluating uncertainties. The uncertainty is represented by a probability distribution or by related quantities such as statistical moments. When the groundwater flow and transport governing equations are solved and the hydraulic conductivity field is treated as a random spatial function, the hydraulic head, velocities and concentrations also become random spatial functions. When that is the case, for the stochastic simulation of groundw...
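
    For readers unfamiliar with the sampling scheme being analyzed, a minimal Latin-hypercube draw can be produced with SciPy's qmc module, as sketched below; the dimension, sample count and conductivity bounds are illustrative assumptions, not the paper's configuration.

    import numpy as np
    from scipy.stats import qmc

    sampler = qmc.LatinHypercube(d=3, seed=0)
    u = sampler.random(n=8)                    # 8 points in [0, 1)^3

    # LHS guarantee: each coordinate hits each of the 8 equal bins exactly once.
    print(np.sort((u * 8).astype(int), axis=0).T)   # each row is 0..7

    # Scale to, e.g., log10 hydraulic-conductivity bounds per dimension:
    sample = qmc.scale(u, l_bounds=[-8, -8, -8], u_bounds=[-4, -4, -4])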

  16. Separation Techniques for Quantification of Radionuclides in Environmental Samples

    Directory of Open Access Journals (Sweden)

    Dusan Galanda

    2009-01-01

    Full Text Available The reliable and quantitative measurement of radionuclides is important in order to determine environmental quality and radiation safety, and to monitor regulatory compliance. We examined soil samples from Podunajske Biskupice, near the city of Bratislava in the Slovak Republic, for the presence of several natural (238U, 232Th, 40K and anthropogenic (137Cs, 90Sr, 239Pu, 240Pu, 241Am radionuclides. The area is adjacent to a refinery and hazardous waste processing center, as well as the municipal incinerator plant, and so might possess an unusually high level of ecotoxic metals. We found that the levels of both naturally occurring and anthropogenic radionuclides fell within the expected ranges, indicating that these facilities pose no radiological threat to the local environment. During the course of our analysis, we modified existing techniques in order to allow us to handle the unusually large and complex samples that were needed to determine the levels of 239Pu, 240Pu, and 241Am activity. We also rated three commercial techniques for the separation of 90Sr from aqueous solutions and found that two of them, AnaLig Sr-01 and Empore Extraction Disks, were suitable for the quantitative and reliable separation of 90Sr, while the third, Sr-Spec Resin, was less so. The main criterion in evaluating these methods was the chemical recovery of 90Sr, which was less than we had expected. We also considered speed of separation and additional steps needed to prepare the sample for separation.

  17. Effectiveness of a Treatment Involving Soft Tissue Techniques and/or Neural Mobilization Techniques in the Management of Tension-Type Headache: A Randomized Controlled Trial.

    Science.gov (United States)

    Ferragut-Garcías, Alejandro; Plaza-Manzano, Gustavo; Rodríguez-Blanco, Cleofás; Velasco-Roldán, Olga; Pecos-Martín, Daniel; Oliva-Pascual-Vaca, Jesús; Llabrés-Bennasar, Bartomeu; Oliva-Pascual-Vaca, Ángel

    2017-02-01

    To evaluate the effects of a protocol involving soft tissue techniques and/or neural mobilization techniques in the management of patients with frequent episodic tension-type headache (FETTH) and those with chronic tension-type headache (CTTH). Randomized, double-blind, placebo-controlled before and after trial. Rehabilitation area of the local hospital and a private physiotherapy center. Patients (N=97; 78 women, 19 men) diagnosed with FETTH or CTTH were randomly assigned to groups A, B, C, or D. (A) Placebo superficial massage; (B) soft tissue techniques; (C) neural mobilization techniques; (D) a combination of soft tissue and neural mobilization techniques. The pressure pain threshold (PPT) in the temporal muscles (points 1 and 2) and supraorbital region (point 3), the frequency and maximal intensity of pain crisis, and the score in the Headache Impact Test-6 (HIT-6) were evaluated. All variables were assessed before the intervention, at the end of the intervention, and 15 and 30 days after the intervention. Groups B, C, and D had an increase in PPT and a reduction in frequency, maximal intensity, and HIT-6 values in all time points after the intervention as compared with baseline and group A (P<.001 for all cases). Group D had the highest PPT values and the lowest frequency and HIT-6 values after the intervention. The application of soft tissue and neural mobilization techniques to patients with FETTH or CTTH induces significant changes in PPT, the characteristics of pain crisis, and its effect on activities of daily living as compared with the application of these techniques as isolated interventions. Copyright © 2016 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.

  18. Accelerated Solvent Extraction: An Innovative Sample Extraction Technique for Natural Products

    International Nuclear Information System (INIS)

    Hazlina Ahmad Hassali; Azfar Hanif Abd Aziz; Rosniza Razali

    2015-01-01

    Accelerated solvent extraction (ASE) is one of the novel techniques that have been developed for the extraction of phytochemicals from plants in order to shorten the extraction time, decrease the solvent consumption, increase the extraction yield and enhance the quality of extracts. This technique combines elevated temperatures and pressure with liquid solvents. This paper gives a brief overview of accelerated solvent extraction technique for sample preparation and its application to the extraction of natural products. Through practical examples, the effects of operational parameters such as temperature, volume of solvent used, extraction time and extraction yields on the performance of ASE are discussed. It is demonstrated that ASE technique allows reduced solvent consumption and shorter extraction time, while the extraction yields are even higher than those obtained with conventional methods. (author)

  19. Practical aspects of the resin bead technique for mass spectrometric sample loading

    International Nuclear Information System (INIS)

    Walker, R.L.; Pritchard, C.A.; Carter, J.A.; Smith, D.H.

    1976-07-01

    Using an anion resin bead as a loading vehicle for uranium and plutonium samples which are to be analyzed isotopically in a mass spectrometer has many advantages over conventional techniques. It is applicable to any laboratory routinely performing such analyses, but should be particularly relevant for Safeguards' purposes. Because the techniques required differ markedly from those of conventional methods, this report has been written to describe them in detail to enable those unfamiliar with the technique to master it with a minimum of trouble

  20. Neurofeedback Against Binge Eating: A Randomized Controlled Trial in a Female Subclinical Threshold Sample.

    Science.gov (United States)

    Schmidt, Jennifer; Martin, Alexandra

    2016-09-01

    Brain-directed treatment techniques, such as neurofeedback, have recently been proposed as adjuncts in the treatment of eating disorders to improve therapeutic outcomes. In line with this recommendation, a cue exposure EEG-neurofeedback protocol was developed. The present study aimed at the evaluation of the specific efficacy of neurofeedback to reduce subjective binge eating in a female subthreshold sample. A total of 75 subjects were randomized to EEG-neurofeedback, mental imagery with a comparable treatment set-up or a waitlist group. At post-treatment, only EEG-neurofeedback led to a reduced frequency of binge eating (p = .015, g = 0.65). The effects remained stable to a 3-month follow-up. EEG-neurofeedback further showed particular beneficial effects on perceived stress and dietary self-efficacy. Differences in outcomes did not arise from divergent treatment expectations. Because EEG-neurofeedback showed a specific efficacy, it may be a promising brain-directed approach that should be tested as a treatment adjunct in clinical groups with binge eating. Copyright © 2016 John Wiley & Sons, Ltd and Eating Disorders Association.

  1. Machine-learning techniques for family demography: an application of random forests to the analysis of divorce determinants in Germany

    OpenAIRE

    Arpino, Bruno; Le Moglie, Marco; Mencarini, Letizia

    2018-01-01

    Demographers often analyze the determinants of life-course events with parametric regression-type approaches. Here, we present a class of nonparametric approaches, broadly defined as machine learning (ML) techniques, and discuss advantages and disadvantages of a popular type known as random forest. We argue that random forests can be useful either as a substitute, or a complement, to more standard parametric regression modeling. Our discussion of random forests is intuitive and...
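
    A minimal random-forest fit of the kind discussed can be sketched with scikit-learn; the synthetic data below merely stand in for the paper's (non-public) German panel data, so every variable here is an assumption.

    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split

    # Synthetic couple-level covariates and a binary divorced/not outcome.
    X, y = make_classification(n_samples=2000, n_features=12, n_informative=5,
                               random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

    rf = RandomForestClassifier(n_estimators=500, random_state=0)
    rf.fit(X_tr, y_tr)
    print("held-out accuracy:", rf.score(X_te, y_te))
    print("feature importances:", rf.feature_importances_.round(3))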

  2. Microbiological analysis after complete or partial removal of carious dentin using two different techniques in primary teeth: A randomized clinical trial

    Science.gov (United States)

    Singhal, Deepak Kumar; Acharya, Shashidhar; Thakur, Arun Singh

    2016-01-01

    Background: The management of deep carious lesions can be done by various techniques, but the residual caries dilemma still persists, and bacterial reduction in cavities treated by either partial or complete caries removal techniques is debatable. So the objective of the present randomized clinical trial was to compare microbial counts in cavities submitted to complete caries removal and partial caries removal using either hand instruments or burs, before and 3 weeks after restoration. Materials and Methods: Primary molars with acute carious lesions in the inner half of dentine and vital pulps were randomly divided into three groups of 14 each: Group A: partial caries removal using hand instruments (atraumatic restorative treatment, ART) only; Group B: partial caries removal using bur; Group C: complete caries removal using bur and caries detector dye. Dentine samples obtained after caries removal and 3 weeks after restoration were subjected to microbial culture and counting (colony-forming units [CFU]/mg of dentine) for total viable bacterial count, Streptococcus spp., mutans streptococci, and Lactobacillus spp. Results: All three techniques of caries removal showed significant (P < 0.05) reduction in all microorganisms studied after 3 weeks of evaluation, but there was no statistically significant difference in percentage reduction of microbial count among the three groups. Conclusion: The results suggest the use of partial caries removal in a single session, as compared to complete caries removal, as part of the treatment of deep lesions in deciduous teeth in order to reduce the risk of pulp exposure. Partial caries removal using ART can be preferred in community settings as a public health procedure for caries management. PMID:26962313

  3. Teaching Techniques, Types of Personality, and English Listening Skill

    Directory of Open Access Journals (Sweden)

    Ni Made Ratminingsih

    2013-01-01

    Full Text Available Abstract: Teaching Techniques, Types of Personality, and English Listening Skill. This study investigated the effect of teaching techniques and types of personality on English listening skill. This experimental study involved 88 students, selected randomly through a multi-stage random sampling technique. The results of the research indicate that there is an interaction effect between teaching technique and type of personality on English listening skill; there is no significant difference in listening skill between the group of students who learn using the game technique and those who learn using the song technique; the listening skill of students with an extrovert personality is better than that of students with an introvert personality; the listening skill of extrovert students who learn using the game technique is lower than that of those who learn using the song technique; and the listening skill of introvert students who learn using the game technique is higher than that of those who learn using the song technique.

  4. A fully blanketed early B star LTE model atmosphere using an opacity sampling technique

    International Nuclear Information System (INIS)

    Phillips, A.P.; Wright, S.L.

    1980-01-01

    A fully blanketed LTE model of a stellar atmosphere with T_e = 21914 K (θ_e = 0.23), log g = 4 is presented. The model includes an explicit representation of the opacity due to the strongest lines, and uses a statistical opacity sampling technique to represent the weaker line opacity. The sampling technique is subjected to several tests and the model is compared with an atmosphere calculated using the line-distribution function method. The limitations of the distribution function method and the particular opacity sampling method used here are discussed in the light of the results obtained. (author)
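
    Opacity sampling itself reduces to a simple idea: evaluate the total line-plus-continuum opacity at a statistical sample of frequency points instead of integrating every line profile. The Python toy below illustrates this with an invented Lorentzian line list; none of the numbers relate to the actual B-star model.

    import numpy as np

    rng = np.random.default_rng(0)
    nu_lines = rng.uniform(0, 1, 2000)           # line positions (arbitrary units)
    strengths = rng.lognormal(0, 1, 2000)        # line strengths
    gamma = 1e-4                                 # common damping width

    def kappa(nu):
        """Total opacity (flat continuum = 1 plus Lorentzian lines) at nu."""
        prof = gamma / ((nu[:, None] - nu_lines[None, :]) ** 2 + gamma ** 2)
        return 1.0 + (strengths * prof).sum(axis=1)

    nu_sample = rng.random(300)                  # the sampled frequency points
    print("sampled mean opacity:", kappa(nu_sample).mean())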

  5. Sample preparation techniques based on combustion reactions in closed vessels - A brief overview and recent applications

    International Nuclear Information System (INIS)

    Flores, Erico M.M.; Barin, Juliano S.; Mesko, Marcia F.; Knapp, Guenter

    2007-01-01

    In this review, a general discussion of sample preparation techniques based on combustion reactions in closed vessels is presented. Applications to several kinds of samples are described, taking into account the literature reported in the last 25 years. The operational conditions, main characteristics, and drawbacks of the bomb combustion, oxygen flask and microwave-induced combustion (MIC) techniques are discussed. Recent applications of MIC techniques are discussed with special attention to samples that are not well digested by conventional microwave-assisted wet digestion, such as coal, and to the subsequent determination of halogens.

  6. Effects of pushing techniques in birth on mother and fetus: a randomized study.

    Science.gov (United States)

    Yildirim, Gulay; Beji, Nezihe Kizilkaya

    2008-03-01

    The Valsalva pushing technique is used routinely in the second stage of labor in many countries, and it is accepted as standard obstetric management in Turkey. The purpose of this study was to determine the effects of pushing techniques on mother and fetus at birth in this setting. This randomized study was conducted between July 2003 and June 2004 in Bakirkoy Maternity and Children's Teaching Hospital in Istanbul, Turkey. One hundred low-risk primiparas between 38 and 42 weeks' gestation, who expected a spontaneous vaginal delivery, were randomized to either a spontaneous pushing group or a Valsalva-type pushing group. Spontaneous pushing women were informed during the first stage of labor about the spontaneous pushing technique (open-glottis pushing while breathing out) and were supported in pushing spontaneously in the second stage of labor. Similarly, Valsalva pushing women were informed during the first stage of labor about the Valsalva pushing technique (closed-glottis pushing while holding their breath) and were supported in using Valsalva pushing in the second stage of labor. Perineal tears, postpartum hemorrhage, and hemoglobin levels were evaluated in mothers; and umbilical artery pH, PO2 (mmHg), and PCO2 (mmHg) levels and Apgar scores at 1 and 5 minutes were evaluated in newborns in both groups. No significant differences were found between the two groups in their demographics, incidence of nonreassuring fetal surveillance patterns, or use of oxytocin. The second stage of labor and the duration of the expulsion phase were significantly longer with Valsalva-type pushing. Differences in the incidence of episiotomy, perineal tears, or postpartum hemorrhage were not significant between the groups. The baby fared better with spontaneous pushing, with higher 1- and 5-minute Apgar scores, and higher umbilical cord pH and PO2 levels. After the birth, women expressed greater satisfaction with spontaneous pushing. Educating women about the spontaneous pushing

  7. The application of statistical and/or non-statistical sampling techniques by internal audit functions in the South African banking industry

    Directory of Open Access Journals (Sweden)

    D.P. van der Nest

    2015-03-01

    Full Text Available This article explores the use by internal audit functions of audit sampling techniques in order to test the effectiveness of controls in the banking sector. The article focuses specifically on the use of statistical and/or non-statistical sampling techniques by internal auditors. The focus of the research for this article was internal audit functions in the banking sector of South Africa. The results discussed in the article indicate that audit sampling is still used frequently as an audit evidence-gathering technique. Non-statistical sampling techniques are used more frequently than statistical sampling techniques for the evaluation of the sample. In addition, both techniques are regarded as important for the determination of the sample size and the selection of the sample items

  8. Approximation of Quantities of Interest in Stochastic PDEs by the Random Discrete L^2 Projection on Polynomial Spaces

    KAUST Repository

    Migliorati, G.

    2013-05-30

    In this work we consider the random discrete L^2 projection on polynomial spaces (hereafter RDP) for the approximation of scalar quantities of interest (QOIs) related to the solution of a partial differential equation model with random input parameters. In the RDP technique the QOI is first computed for independent samples of the random input parameters, as in a standard Monte Carlo approach, and then the QOI is approximated by a multivariate polynomial function of the input parameters using a discrete least squares approach. We consider several examples including the Darcy equations with random permeability, the linear elasticity equations with random elastic coefficient, and the Navier-Stokes equations in random geometries and with random fluid viscosity. We show that the RDP technique is well suited to QOIs that depend smoothly on a moderate number of random parameters. Our numerical tests confirm the theoretical findings in [G. Migliorati, F. Nobile, E. von Schwerin, and R. Tempone, Analysis of the Discrete L^2 Projection on Polynomial Spaces with Random Evaluations, MOX report 46-2011, Politecnico di Milano, Milano, Italy, submitted], which have shown that, in the case of a single uniformly distributed random parameter, the RDP technique is stable and optimally convergent if the number of sampling points is proportional to the square of the dimension of the polynomial space. Here optimality means that the weighted L^2 norm of the RDP error is bounded from above by the best L^∞ error achievable in the given polynomial space, up to logarithmic factors. In the case of several random input parameters, the numerical evidence indicates that the condition of quadratic growth of the number of sampling points could be relaxed to linear growth while still achieving stable and optimal convergence. This makes the RDP technique very promising for moderately high dimensional uncertainty quantification.
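    The sampling-rate condition described above is easy to reproduce in one dimension. Below is a minimal Python sketch of the RDP idea for a single uniform parameter: the QOI function is a hypothetical smooth stand-in (not a PDE solve), and the number of samples is set to the square of the polynomial-space dimension, as in the stability result quoted.

```python
import numpy as np
from numpy.polynomial import legendre

rng = np.random.default_rng(0)

def qoi(y):
    # Stand-in for a PDE-derived quantity of interest; smooth in the parameter.
    return np.exp(0.5 * y) / (1.0 + 0.25 * y**2)

degree = 8            # polynomial space of dimension n = degree + 1
n = degree + 1
m = n**2              # number of samples ~ square of the dimension (stability condition)

y_samples = rng.uniform(-1.0, 1.0, size=m)  # independent samples of the random input
u_samples = qoi(y_samples)                  # Monte Carlo evaluations of the QOI

coef = legendre.legfit(y_samples, u_samples, degree)  # discrete least-squares projection

# Check the approximation error on a fine grid.
y_test = np.linspace(-1.0, 1.0, 2001)
err = np.max(np.abs(qoi(y_test) - legendre.legval(y_test, coef)))
print(f"max error with m = n^2 = {m} samples: {err:.2e}")
```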

  9. A sub-sampled approach to extremely low-dose STEM

    Energy Technology Data Exchange (ETDEWEB)

    Stevens, A. [OptimalSensing, Southlake, Texas 76092, USA; Duke University, ECE, Durham, North Carolina 27708, USA; Luzi, L. [Rice University, ECE, Houston, Texas 77005, USA; Yang, H. [Lawrence Berkeley National Laboratory, Berkeley, California 94720, USA; Kovarik, L. [Pacific NW National Laboratory, Richland, Washington 99354, USA; Mehdi, B. L. [Pacific NW National Laboratory, Richland, Washington 99354, USA; University of Liverpool, Materials Engineering, Liverpool L69 3GH, United Kingdom; Liyu, A. [Pacific NW National Laboratory, Richland, Washington 99354, USA; Gehm, M. E. [Duke University, ECE, Durham, North Carolina 27708, USA; Browning, N. D. [Pacific NW National Laboratory, Richland, Washington 99354, USA; University of Liverpool, Materials Engineering, Liverpool L69 3GH, United Kingdom

    2018-01-22

    The inpainting of randomly sub-sampled images acquired by scanning transmission electron microscopy (STEM) is an attractive method for imaging under low-dose conditions (≤ 1 e⁻/Å²) without changing either the operation of the microscope or the physics of the imaging process. We show that 1) adaptive sub-sampling increases acquisition speed, resolution, and sensitivity; and 2) random (non-adaptive) sub-sampling is equivalent to, but faster than, traditional low-dose techniques. Adaptive sub-sampling opens numerous possibilities for the analysis of beam-sensitive materials and in-situ dynamic processes at the resolution limit of the aberration-corrected microscope and is demonstrated here for the analysis of the node distribution in metal-organic frameworks (MOFs).
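    A minimal sketch of the random (non-adaptive) sub-sampling step in Python; the synthetic frame and the 20% sampling fraction are made up, and plain cubic interpolation stands in for the paper's inpainting algorithm.

```python
import numpy as np
from scipy.interpolate import griddata

rng = np.random.default_rng(1)

# Hypothetical "full-dose" STEM frame (stand-in for a real acquisition).
ny, nx = 128, 128
yy, xx = np.mgrid[0:ny, 0:nx]
frame = np.sin(xx / 6.0) * np.cos(yy / 9.0)

# Random sub-sampling: visit only 20% of probe positions, i.e. roughly a
# 5x dose/time reduction at a fixed dose per visited pixel.
fraction = 0.20
mask = rng.random((ny, nx)) < fraction
points = np.column_stack([yy[mask], xx[mask]])
measured = frame[mask]

# Simple interpolation stands in for the paper's inpainting step.
recon = griddata(points, measured, (yy, xx), method="cubic")

print(f"visited {mask.mean():.0%} of pixels; "
      f"RMS reconstruction error = {np.nanstd(recon - frame):.3f}")
```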

  10. Comparison of the efficacy of two anesthetic techniques of mandibular primary first molar: A randomized clinical trial

    Directory of Open Access Journals (Sweden)

    Davood Ghasemi Tudeshchoie

    2013-01-01

    Full Text Available Background: The most common technique to anesthetize mandibular primary teeth is the inferior alveolar (I.A) nerve block injection, which induces relatively sustained anesthesia and in turn may traumatize soft tissues. There is therefore a need for an alternative technique of shorter duration but equal efficacy. The aim of this study was to compare the efficacy of two anesthetic techniques for the mandibular primary first molar. Materials and Methods: In this randomized crossover clinical trial, 40 children aged 5 to 8 years whose mandibular primary first molars were eligible for pulpotomy were selected and divided randomly into two groups. The right and left mandibular first molars of group A were anesthetized with the infiltration and I.A nerve block techniques in the first and second sessions, respectively. The left and right mandibular first molars of group B were anesthetized with the I.A nerve block and infiltration techniques in the first and second sessions, respectively. The severity of pain was measured and recorded according to the sound-eye-motor scale by a single examiner. Data were analyzed using Wilcoxon signed-rank and Mann-Whitney U tests (P < 0.05). Results: The severity of pain was lower with the infiltration technique than with the I.A nerve block. There were no significant differences in the severity of pain on pulpal exposure between the two techniques. Conclusion: It seems that the infiltration technique is more favorable for anesthetizing the mandibular primary first molar compared to the I.A nerve block.

  11. X-ray spectrometry and X-ray microtomography techniques for soil and geological samples analysis

    International Nuclear Information System (INIS)

    Kubala-Kukuś, A.; Banaś, D.; Braziewicz, J.; Dziadowicz, M.; Kopeć, E.; Majewska, U.; Mazurek, M.; Pajek, M.; Sobisz, M.; Stabrawa, I.; Wudarczyk-Moćko, J.; Góźdź, S.

    2015-01-01

    A particular subject of X-ray fluorescence analysis is its application to multielemental analysis of samples whose composition spans a wide range of concentrations, samples with different matrices, including inhomogeneous ones and those characterized by different grain sizes. Typical examples of such samples are soil and geological samples, for which XRF elemental analysis may be difficult due to XRF disturbing effects. In this paper the WDXRF technique was applied to the elemental analysis of different soil and geological samples (therapeutic mud, floral soil, brown soil, sandy soil, calcium aluminum cement). The sample morphology was analyzed using the X-ray microtomography technique. The paper discusses the differences between the compositions of the samples, the influence of sample preparation procedures on their morphology and, finally, a quantitative analysis. The results of the studies were statistically tested (one-way ANOVA and correlation coefficients). For the determination of lead concentration in samples of sandy soil and cement-like matrix, the WDXRF spectrometer calibration was performed. The elemental analysis of the samples was complemented with knowledge of the chemical composition obtained by X-ray powder diffraction.

  12. X-ray spectrometry and X-ray microtomography techniques for soil and geological samples analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kubala-Kukuś, A.; Banaś, D.; Braziewicz, J. [Institute of Physics, Jan Kochanowski University, ul. Świetokrzyska 15, 25-406 Kielce (Poland); Holycross Cancer Center, ul. Artwińskiego 3, 25-734 Kielce (Poland); Dziadowicz, M.; Kopeć, E. [Institute of Physics, Jan Kochanowski University, ul. Świetokrzyska 15, 25-406 Kielce (Poland); Majewska, U. [Institute of Physics, Jan Kochanowski University, ul. Świetokrzyska 15, 25-406 Kielce (Poland); Holycross Cancer Center, ul. Artwińskiego 3, 25-734 Kielce (Poland); Mazurek, M.; Pajek, M.; Sobisz, M.; Stabrawa, I. [Institute of Physics, Jan Kochanowski University, ul. Świetokrzyska 15, 25-406 Kielce (Poland); Wudarczyk-Moćko, J. [Holycross Cancer Center, ul. Artwińskiego 3, 25-734 Kielce (Poland); Góźdź, S. [Holycross Cancer Center, ul. Artwińskiego 3, 25-734 Kielce (Poland); Institute of Public Health, Jan Kochanowski University, IX Wieków Kielc 19, 25-317 Kielce (Poland)

    2015-12-01

    A particular subject of X-ray fluorescence analysis is its application to multielemental analysis of samples whose composition spans a wide range of concentrations, samples with different matrices, including inhomogeneous ones and those characterized by different grain sizes. Typical examples of such samples are soil and geological samples, for which XRF elemental analysis may be difficult due to XRF disturbing effects. In this paper the WDXRF technique was applied to the elemental analysis of different soil and geological samples (therapeutic mud, floral soil, brown soil, sandy soil, calcium aluminum cement). The sample morphology was analyzed using the X-ray microtomography technique. The paper discusses the differences between the compositions of the samples, the influence of sample preparation procedures on their morphology and, finally, a quantitative analysis. The results of the studies were statistically tested (one-way ANOVA and correlation coefficients). For the determination of lead concentration in samples of sandy soil and cement-like matrix, the WDXRF spectrometer calibration was performed. The elemental analysis of the samples was complemented with knowledge of the chemical composition obtained by X-ray powder diffraction.

  13. Sampling methods for rumen microbial counts by Real-Time PCR techniques

    Directory of Open Access Journals (Sweden)

    S. Puppo

    2010-02-01

    Full Text Available Fresh rumen samples were withdrawn from 4 cannulated buffalo females fed a fibrous diet in order to quantify bacterial concentrations in the rumen by Real-Time PCR techniques. To obtain DNA of good quality from whole rumen fluid, eight different pre-filtration methods (M1-M8; cheese cloths, glass-fibre and nylon filters in combination with various centrifugation speeds: 1000, 5000 and 14,000 rpm) were tested. Genomic DNA extraction was performed either on fresh or frozen (-20°C) samples. The quantitative bacterial analysis followed a Real-Time PCR procedure for Butyrivibrio fibrisolvens reported in the literature. M5 proved to be the best sampling procedure, yielding genomic DNA of suitable quality. No differences were revealed between fresh and frozen samples.

  14. Randomization of grab-sampling strategies for estimating the annual exposure of U miners to Rn daughters.

    Science.gov (United States)

    Borak, T B

    1986-04-01

    Periodic grab sampling in combination with time-of-occupancy surveys has been the accepted procedure for estimating the annual exposure of underground U miners to Rn daughters. Temporal variations in the concentration of potential alpha energy in the mine generate uncertainties in this process. A system to randomize the selection of locations for measurement is described which can reduce uncertainties and eliminate systematic biases in the data. In general, a sample frequency of 50 measurements per year is sufficient to satisfy the criteria that the annual exposure be determined in working level months to within +/- 50% of the true value with a 95% level of confidence. Suggestions for implementing this randomization scheme are presented.
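    A minimal sketch of such a randomized measurement schedule; the mine layout, shift calendar, and sample count are hypothetical placeholders.

```python
import random

random.seed(42)

# Hypothetical mine layout: candidate locations and working shifts in a year.
locations = [f"stope_{i}" for i in range(12)]
shifts = list(range(250))          # working days in the year

# Randomize both the day and the location of each of the ~50 grab samples,
# rather than measuring on a fixed rota that could bias the annual estimate.
n_samples = 50
schedule = [(random.choice(shifts), random.choice(locations)) for _ in range(n_samples)]
schedule.sort()

# With measured working-level values wl[i] and time-of-occupancy weights t[i],
# the annual exposure estimate would be sum(t[i] * wl[i]) over this schedule.
print(schedule[:5])
```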

  15. A systematic examination of a random sampling strategy for source apportionment calculations.

    Science.gov (United States)

    Andersson, August

    2011-12-15

    Estimating the relative contributions from multiple potential sources of a specific component in a mixed environmental matrix is a general challenge in diverse fields such as atmospheric, environmental and earth sciences. Perhaps the most common strategy for tackling such problems is to set up a system of linear equations for the fractional influence of different sources. Even though an algebraic solution of this approach is possible for the common situation with N+1 sources and N source markers, such methodology introduces a bias, since it implicitly assumes that the calculated fractions and the corresponding uncertainties are independent of the variability of the source distributions. Here, a random sampling (RS) strategy for accounting for such statistical bias is examined by investigating rationally designed synthetic data sets. This random sampling methodology is found to be robust and accurate with respect to reproducibility and predictability. The method is also compared to a numerical integration solution for a two-source situation in which source variability is also included. A general observation from this examination is that the variability of the source profiles affects not only the calculated precision but also the mean/median source contributions. Copyright © 2011 Elsevier B.V. All rights reserved.
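    The bias mechanism described here can be reproduced with a few lines of Python. The sketch below treats the simplest two-source, one-marker case with made-up source signatures: resampling the source profiles shifts the mean and median of the estimated fraction, not just its spread.

```python
import numpy as np

rng = np.random.default_rng(3)

# Two sources, one marker: the common N+1 sources / N markers case with N = 1.
mix = 0.40                                  # marker value measured in the mixture
a = rng.normal(0.10, 0.03, 100_000)         # resampled signature of source 1
b = rng.normal(0.80, 0.10, 100_000)         # resampled signature of source 2

# Solve f*a + (1 - f)*b = mix for each resampled pair of source profiles.
f1 = np.clip((mix - b) / (a - b), 0.0, 1.0)

# The source-profile variability shifts the mean/median, not just the spread.
print(f"fraction from source 1: mean {f1.mean():.3f}, median {np.median(f1):.3f}")
print(f"95% interval: ({np.quantile(f1, 0.025):.3f}, {np.quantile(f1, 0.975):.3f})")
```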

  16. Characterization of electron microscopes with binary pseudo-random multilayer test samples

    International Nuclear Information System (INIS)

    Yashchuk, Valeriy V.; Conley, Raymond; Anderson, Erik H.; Barber, Samuel K.; Bouet, Nathalie; McKinney, Wayne R.; Takacs, Peter Z.; Voronov, Dmitriy L.

    2010-01-01

    We discuss the results of SEM and TEM measurements with the BPRML test samples fabricated from a BPRML (WSi2/Si with fundamental layer thickness of 3 nm) with a Dual Beam FIB (focused ion beam)/SEM technique. In particular, we demonstrate that significant information about the metrological reliability of the TEM measurements can be extracted even when the fundamental frequency of the BPRML sample is smaller than the Nyquist frequency of the measurements. The measurements demonstrate a number of problems related to the interpretation of the SEM and TEM data. Note that similar BPRML test samples can be used to characterize x-ray microscopes. Corresponding work with x-ray microscopes is in progress.

  17. A smart rotary technique versus conventional pulpectomy for primary teeth: A randomized controlled clinical study.

    Science.gov (United States)

    Mokhtari, Negar; Shirazi, Alireza-Sarraf; Ebrahimi, Masoumeh

    2017-11-01

    Techniques with adequate accuracy of working length determination along with shorter treatment duration seem essential for pulpectomy procedures in pediatric dentistry. The aim of the present study was to evaluate the accuracy of root canal length measurement with the Root ZX II apex locator and a rotary system in pulpectomy of primary teeth. In this randomized controlled clinical trial, complete pulpectomy was performed on 80 mandibular primary molars in 80 children aged 4-6 years. The study population was randomly divided into case and control groups. In the control group conventional pulpectomy was performed, and in the case group working length was determined by the electronic apex locator Root ZX II and canals were instrumented with Mtwo rotary files. Statistical evaluation was performed using Mann-Whitney and Chi-square tests (P < 0.05). There were no significant differences between the electronic apex locator Root ZX II and the conventional method in the accuracy of root canal length determination. However, significantly less time was needed for instrumenting with rotary files (P = 0.000). Considering the comparable accuracy of root canal length determination and the considerably shorter instrumentation time with the Root ZX II apex locator and rotary system, this approach may be suggested for pulpectomy in primary molar teeth. Key words: Rotary technique, conventional technique, pulpectomy, primary teeth.

  18. Sample preparation techniques in trace element analysis by X-ray emission spectroscopy

    International Nuclear Information System (INIS)

    Valkovic, V.

    1983-11-01

    The report, written under a research contract with the IAEA, contains a detailed presentation of the most difficult problem encountered in trace element analysis by X-ray emission spectroscopy, namely sample preparation techniques. The following items are covered. Sampling - with specific consideration of aerosols, water, soil, biological materials, petroleum and its products, and the storage and handling of samples. Pretreatment of samples - preconcentration, ashing, solvent extraction, ion exchange and electrodeposition. Sample preparation for PIXE analysis - backings, target uniformity and homogeneity, effects of irradiation, internal standards, and specific examples of preparation (aqueous, biological, blood serum and solid samples). Sample preparation for radioactive source or tube excitation - with specific examples (water, liquid and solid samples, soil, geological, plant and tissue samples). Finally, the problem of standards and reference materials, as well as that of interlaboratory comparisons, is discussed.

  19. Network Sampling with Memory: A proposal for more efficient sampling from social networks

    Science.gov (United States)

    Mouw, Ted; Verdery, Ashton M.

    2013-01-01

    Techniques for sampling from networks have grown into an important area of research across several fields. For sociologists, the possibility of sampling from a network is appealing for two reasons: (1) A network sample can yield substantively interesting data about network structures and social interactions, and (2) it is useful in situations where study populations are difficult or impossible to survey with traditional sampling approaches because of the lack of a sampling frame. Despite its appeal, methodological concerns about the precision and accuracy of network-based sampling methods remain. In particular, recent research has shown that sampling from a network using a random walk based approach such as Respondent Driven Sampling (RDS) can result in high design effects (DE)—the ratio of the sampling variance to the sampling variance of simple random sampling (SRS). A high design effect means that more cases must be collected to achieve the same level of precision as SRS. In this paper we propose an alternative strategy, Network Sampling with Memory (NSM), which collects network data from respondents in order to reduce design effects and, correspondingly, the number of interviews needed to achieve a given level of statistical power. NSM combines a “List” mode, where all individuals on the revealed network list are sampled with the same cumulative probability, with a “Search” mode, which gives priority to bridge nodes connecting the current sample to unexplored parts of the network. We test the relative efficiency of NSM compared to RDS and SRS on 162 school and university networks from Add Health and Facebook that range in size from 110 to 16,278 nodes. The results show that the average design effect for NSM on these 162 networks is 1.16, which is very close to the efficiency of a simple random sample (DE=1), and 98.5% lower than the average DE we observed for RDS. PMID:24159246
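    The design-effect comparison at the heart of this paper can be illustrated on a synthetic graph. The following Python sketch (using networkx, with an arbitrary small-world graph and a made-up node trait) contrasts a plain random-walk sample, in the spirit of RDS, with simple random sampling; it does not implement NSM's List and Search modes.

```python
import networkx as nx
import numpy as np

rng = np.random.default_rng(5)
G = nx.connected_watts_strogatz_graph(2000, k=10, p=0.1, seed=5)
trait = {v: float(v % 7 == 0) for v in G}       # made-up binary node trait

def random_walk_sample(graph, length):
    """RDS-like chain: each respondent refers a random network neighbor."""
    v = int(rng.integers(graph.number_of_nodes()))
    visited = []
    for _ in range(length):
        visited.append(v)
        v = int(rng.choice(list(graph.neighbors(v))))
    return visited

def sample_mean(nodes):
    return np.mean([trait[v] for v in nodes])

n, reps = 200, 500
srs = [sample_mean(rng.choice(2000, size=n, replace=False)) for _ in range(reps)]
rds = [sample_mean(random_walk_sample(G, n)) for _ in range(reps)]

# Design effect: variance of the estimator relative to simple random sampling.
print(f"random-walk design effect: {np.var(rds) / np.var(srs):.2f}")
```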

  20. A Systematic Review of Surgical Randomized Controlled Trials: Part 2. Funding Source, Conflict of Interest, and Sample Size in Plastic Surgery.

    Science.gov (United States)

    Voineskos, Sophocles H; Coroneos, Christopher J; Ziolkowski, Natalia I; Kaur, Manraj N; Banfield, Laura; Meade, Maureen O; Chung, Kevin C; Thoma, Achilleas; Bhandari, Mohit

    2016-02-01

    The authors examined industry support, conflict of interest, and sample size in plastic surgery randomized controlled trials that compared surgical interventions. They hypothesized that industry-funded trials demonstrate statistically significant outcomes more often, and that randomized controlled trials with small sample sizes report statistically significant results more frequently. An electronic search identified randomized controlled trials published between 2000 and 2013. Independent reviewers assessed manuscripts and performed data extraction. Funding source, conflict of interest, primary outcome direction, and sample size were examined. Chi-squared and independent-samples t tests were used in the analysis. The search identified 173 randomized controlled trials, of which 100 (58 percent) did not acknowledge funding status. A relationship between funding source and trial outcome direction was not observed. Both funding status and conflict of interest reporting improved over time. Only 24 percent (six of 25) of industry-funded randomized controlled trials reported authors to have independent control of data and manuscript contents. The mean number of patients randomized was 73 per trial (median, 43; minimum, 3; maximum, 936). Small trials were not found to be positive more often than large trials (p = 0.87). Randomized controlled trials with small sample size were common; however, this provides a great opportunity for the field to engage in further collaboration and produce larger, more definitive trials. Reporting of trial funding and conflict of interest was historically poor, but it greatly improved over the study period. Underreporting at author and journal levels remains a limitation when assessing the relationship between funding source and trial outcomes. Improved reporting and manuscript control should be goals that both authors and journals can actively achieve.

  1. Large-volume constant-concentration sampling technique coupling with surface-enhanced Raman spectroscopy for rapid on-site gas analysis

    Science.gov (United States)

    Zhang, Zhuomin; Zhan, Yisen; Huang, Yichun; Li, Gongke

    2017-08-01

    In this work, a portable large-volume constant-concentration (LVCC) sampling technique coupling with surface-enhanced Raman spectroscopy (SERS) was developed for rapid on-site gas analysis based on suitable derivatization methods. The LVCC sampling technique mainly consisted of a specially designed sampling cell, including a rigid sample container and flexible sampling bag, and an absorption-derivatization module with a portable pump and a gas flowmeter. The LVCC sampling technique allowed a large, alterable and well-controlled sampling volume, which kept the concentration of the gas target in the headspace phase constant during the entire sampling process and made the sampling result more representative. Moreover, absorption and derivatization of the gas target during the LVCC sampling process were efficiently merged in one step using the bromine-thiourea and OPA-NH4+ strategies for ethylene and SO2 respectively, which made the LVCC sampling technique conveniently adaptable to subsequent SERS analysis. Finally, a new LVCC sampling-SERS method was developed and successfully applied for rapid analysis of trace ethylene and SO2 from fruits. Trace ethylene and SO2 from real fruit samples could be actually and accurately quantified by this method. The concentration fluctuations of ethylene and SO2 during the entire LVCC sampling process proved to be minor, and recoveries of 95.0-101% and 97.0-104% were achieved for ethylene and SO2, respectively. It is expected that the portable LVCC sampling technique will pave the way for rapid on-site analysis of accurate concentrations of trace gas targets from real samples by SERS.

  2. Distributed fiber sparse-wideband vibration sensing by sub-Nyquist additive random sampling

    Science.gov (United States)

    Zhang, Jingdong; Zheng, Hua; Zhu, Tao; Yin, Guolu; Liu, Min; Bai, Yongzhong; Qu, Dingrong; Qiu, Feng; Huang, Xianbing

    2018-05-01

    The round trip time of the light pulse limits the maximum detectable vibration frequency response range of phase-sensitive optical time domain reflectometry (φ-OTDR). Unlike the uniform laser pulse interval in conventional φ-OTDR, we randomly modulate the pulse interval, so that an equivalent sub-Nyquist additive random sampling (sNARS) is realized for every sensing point of the long interrogation fiber. For a φ-OTDR system with 10 km sensing length, the sNARS method is optimized by theoretical analysis and Monte Carlo simulation, and the experimental results verify that a wide-band sparse signal can be identified and reconstructed. Such a method broadens the vibration frequency response range of φ-OTDR, which is of great significance in sparse-wideband-frequency vibration signal detection, such as rail track monitoring and metal defect detection.
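    A minimal Python sketch of the additive random sampling idea: sampling instants accumulate random increments, and a direct nonuniform DFT then locates a sparse tone well above the mean-rate Nyquist limit. The signal, rates, and grid are illustrative, not the paper's φ-OTDR parameters.

```python
import numpy as np

rng = np.random.default_rng(9)

# Additive random sampling: each instant adds a random positive increment,
# with a mean rate (100 Sa/s) far below twice the tone frequency sought.
f_sig = 370.0                                   # Hz, sparse single-tone signal
mean_dt = 1.0 / 100.0
t = np.cumsum(rng.uniform(0.5 * mean_dt, 1.5 * mean_dt, size=2000))
x = np.sin(2 * np.pi * f_sig * t)

# Direct nonuniform DFT on a grid reaching beyond the mean-rate Nyquist (50 Hz):
freqs = np.arange(1.0, 500.0, 0.5)
spec = np.abs(np.exp(-2j * np.pi * np.outer(freqs, t)) @ x)

print(f"peak at {freqs[np.argmax(spec)]:.1f} Hz (true {f_sig} Hz)")
```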

  3. Comparative Study of Radon Concentration with Two Techniques and Elemental Analysis in Drinking Water Samples of the Jammu District, Jammu and Kashmir, India.

    Science.gov (United States)

    Kumar, Ajay; Kaur, Manpreet; Mehra, Rohit; Sharma, Dinesh Kumar; Mishra, Rosaline

    2017-10-01

    The level of radon concentration has been assessed using the Advanced SMART RnDuo technique in 30 drinking water samples from Jammu district, Jammu and Kashmir, India. The water samples were collected from wells, hand pumps, submersible pumps, and stored waters. Fourteen randomly selected values of radon concentration in water sources obtained with the SMART RnDuo technique were compared and cross-checked with a RAD7 device, and a good positive correlation (R = 0.88) was observed between the two techniques. The overall value of radon concentration in the various water sources ranged from 2.45 to 18.43 Bq L⁻¹, with a mean value of 8.24 ± 4.04 Bq L⁻¹, and it agreed well with the recommended limit suggested by the European Commission and UNSCEAR. However, higher mean radon concentrations were found in groundwater drawn from wells and hand and submersible pumps than in stored water. The total annual effective dose due to radon inhalation and ingestion ranged from 6.69 to 50.31 μSv y⁻¹ with a mean value of 22.48 ± 11.03 μSv y⁻¹, which lies within the safe limit (100 μSv y⁻¹) suggested by WHO. Heavy metal analysis was also carried out in the various water sources using an atomic absorption spectrophotometer (AAS), and the highest values of heavy metals were found mostly in groundwater samples. The obtained results were compared with the limits of Indian and international organizations such as WHO and the EU Council. In all the samples, the elemental concentrations did not exceed the permissible limits.

  4. Experimental technique to measure thoron generation rate of building material samples using RAD7 detector

    International Nuclear Information System (INIS)

    Csige, I.; Szabó, Zs.; Szabó, Cs.

    2013-01-01

    Thoron (220Rn) is the second most abundant radon isotope in our living environment. In some dwellings it is present in significant amounts, which calls for its identification and remediation. Indoor thoron originates mainly from building materials. In this work we have developed and tested an experimental technique to measure the thoron generation rate in building material samples using the RAD7 radon-thoron detector. The mathematical model of the measurement technique provides the thoron concentration response of the RAD7 as a function of the sample thickness. For experimental validation of the technique, an adobe building material sample was selected and the thoron concentration was measured at nineteen different sample thicknesses. By fitting the parameters of the model to the measurement results, both the generation rate and the diffusion length of thoron were estimated. We have also determined the optimal sample thickness for estimating the thoron generation rate from a single measurement. -- Highlights: • RAD7 is used for the determination of thoron generation rate (emanation). • The described model takes into account the thoron decay and attenuation. • The model describes well the experimental results. • A single point measurement method is offered at a determined sample thickness

  5. Peyton's four-step approach for teaching complex spinal manipulation techniques - a prospective randomized trial.

    Science.gov (United States)

    Gradl-Dietsch, Gertraud; Lübke, Cavan; Horst, Klemens; Simon, Melanie; Modabber, Ali; Sönmez, Tolga T; Münker, Ralf; Nebelung, Sven; Knobe, Matthias

    2016-11-03

    The objectives of this prospective randomized trial were to assess the impact of Peyton's four-step approach on the acquisition of complex psychomotor skills and to examine the influence of gender on learning outcomes. We randomly assigned 95 third to fifth year medical students to an intervention group which received instructions according to Peyton (PG) or a control group, which received conventional teaching (CG). Both groups attended four sessions on the principles of manual therapy and specific manipulative and diagnostic techniques for the spine. We assessed differences in theoretical knowledge (multiple choice (MC) exam) and practical skills (Objective Structured Practical Examination (OSPE)) with respect to type of intervention and gender. Participants took a second OSPE 6 months after completion of the course. There were no differences between groups with respect to the MC exam. Students in the PG group scored significantly higher in the OSPE. Gender had no additional impact. Results of the second OSPE showed a significant decline in competency regardless of gender and type of intervention. Peyton's approach is superior to standard instruction for teaching complex spinal manipulation skills regardless of gender. Skills retention was equally low for both techniques.

  6. Sampling phased array - a new technique for ultrasonic signal processing and imaging

    OpenAIRE

    Verkooijen, J.; Boulavinov, A.

    2008-01-01

    Over the past 10 years, improvements in the fields of microelectronics and computer engineering have led to significant advances in ultrasonic signal processing and image construction techniques that are currently being applied to non-destructive material evaluation. A new phased array technique, called 'Sampling Phased Array', has been developed at the Fraunhofer Institute for Non-Destructive Testing [1]. It realises a unique approach to the measurement and processing of ultrasonic signals. Th...

  7. Non-parametric adaptive importance sampling for the probability estimation of a launcher impact position

    International Nuclear Information System (INIS)

    Morio, Jerome

    2011-01-01

    Importance sampling (IS) is a useful simulation technique for estimating critical probabilities with better accuracy than Monte Carlo methods. It consists of generating random weighted samples from an auxiliary distribution rather than from the distribution of interest. The crucial part of this algorithm is the choice of an efficient auxiliary PDF, which must be able to generate the rare random events of interest more frequently. The optimisation of this auxiliary distribution is often very difficult in practice. In this article, we propose to approximate the optimal IS auxiliary density with non-parametric adaptive importance sampling (NAIS). We apply this technique to the probability estimation of a spatial launcher impact position, which has become an increasingly important issue in the field of aeronautics.
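    The mechanics of importance sampling are easy to show in one dimension. The sketch below uses plain parametric IS with a shifted Gaussian auxiliary density rather than the paper's non-parametric adaptive scheme; the threshold and distributions are arbitrary stand-ins for an impact-position exceedance event.

```python
import numpy as np

rng = np.random.default_rng(11)

# Rare event: standard normal "miss distance" exceeding 4.5
# (true probability is about 3.4e-6).
thresh, n = 4.5, 100_000

# Crude Monte Carlo: almost no samples hit the event.
x_mc = rng.standard_normal(n)
p_mc = np.mean(x_mc > thresh)

# Importance sampling: draw from an auxiliary density shifted onto the
# rare region, and reweight by the likelihood ratio f(x)/g(x).
mu = thresh                                        # crude but effective shift
x_is = rng.normal(mu, 1.0, size=n)
w = np.exp(-0.5 * x_is**2 + 0.5 * (x_is - mu)**2)  # N(0,1) pdf / N(mu,1) pdf
p_is = np.mean(w * (x_is > thresh))

print(f"crude MC: {p_mc:.2e}   importance sampling: {p_is:.2e}")
```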

  8. Randomized clinical trial comparing control of maxillary anchorage with 2 retraction techniques.

    Science.gov (United States)

    Xu, Tian-Min; Zhang, Xiaoyun; Oh, Hee Soo; Boyd, Robert L; Korn, Edward L; Baumrind, Sheldon

    2010-11-01

    The objective of this pilot randomized clinical trial was to investigate the relative effectiveness of anchorage conservation of en-masse and 2-step retraction techniques during maximum anchorage treatment in patients with Angle Class I and Class II malocclusions. Sixty-four growing subjects (25 boys, 39 girls; 10.2-15.9 years old) who required maximum anchorage were randomized to 2 treatment techniques: en-masse retraction (n = 32) and 2-step retraction (n = 32); the groups were stratified by sex and starting age. Each patient was treated by a full-time clinic instructor experienced in the use of both retraction techniques at the orthodontic clinic of Peking University School of Stomatology in China. All patients used headgear, and most had transpalatal appliances. Lateral cephalograms taken before treatment and at the end of treatment were used to evaluate treatment-associated changes. Differences in maxillary molar mesial displacement and maxillary incisor retraction were measured with the before and after treatment tracings superimposed on the anatomic best fit of the palatal structures. Differences in mesial displacement of the maxillary first molar were compared between the 2 treatment techniques, between sexes, and between different starting-age groups. Average mesial displacement of the maxillary first molar was slightly less in the en-masse group than in the 2-step group (mean, -0.36 mm; 95% CI, -1.42 to 0.71 mm). The average mesial displacement of the maxillary first molar for both treatment groups pooled (n = 63, because 1 patient was lost to follow-up) was 4.3 ± 2.1 mm (mean ± standard deviation). Boys had significantly more mesial displacement than girls (mean difference, 1.3 mm; P <0.03). Younger adolescents had significantly more mesial displacement than older adolescents (mean difference, 1.3 mm; P <0.02). Average mesial displacement of the maxillary first molar with 2-step retraction was slightly greater than that for en-masse retraction, but the

  9. Two-compartment, two-sample technique for accurate estimation of effective renal plasma flow: Theoretical development and comparison with other methods

    International Nuclear Information System (INIS)

    Lear, J.L.; Feyerabend, A.; Gregory, C.

    1989-01-01

    Discordance between effective renal plasma flow (ERPF) measurements from radionuclide techniques that use single versus multiple plasma samples was investigated. In particular, the authors determined whether effects of variations in distribution volume (Vd) of iodine-131 iodohippurate on measurement of ERPF could be ignored, an assumption implicit in the single-sample technique. The influence of Vd on ERPF was found to be significant, a factor indicating an important and previously unappreciated source of error in the single-sample technique. Therefore, a new two-compartment, two-plasma-sample technique was developed on the basis of the observations that while variations in Vd occur from patient to patient, the relationship between intravascular and extravascular components of Vd and the rate of iodohippurate exchange between the components are stable throughout a wide range of physiologic and pathologic conditions. The new technique was applied in a series of 30 studies in 19 patients. Results were compared with those achieved with the reference, single-sample, and slope-intercept techniques. The new two-compartment, two-sample technique yielded estimates of ERPF that more closely agreed with the reference multiple-sample method than either the single-sample or slope-intercept techniques

  10. A Randomized Controlled Trial of Mastication with Complete Dentures Made by a Conventional or an Abbreviated Technique.

    Science.gov (United States)

    Mengatto, Cristiane Machado; Gameiro, Gustavo Hauber; Brondani, Mario; Owen, C Peter; MacEntee, Michael I

    The aim of this randomized clinical trial was to test the hypothesis that there are no statistically significant differences after 3 and 6 months in masticatory performance or chewing ability of people with new complete dentures made by an abbreviated or a conventional technique. The trial included 20 edentulous participants at a dental school in Brazil assigned randomly to receive dentures made by either a conventional technique involving six clinical sessions or by an abbreviated technique involving three clinical sessions. At baseline with old dentures and at 3 and 6 months with new dentures, masticatory performance was measured by counting the number of chewing strokes and the time before participants had an urge to swallow and by calculating the medium particle size of a silicone material after 20 chewing strokes and at the urge to swallow. On each occasion, the participants recorded on visual analog scales their ability to chew five food textures. Statistical significance (P ≤ .05) of changes in masticatory performance and chewing ability during the trial were analyzed with generalized estimating equations. Both techniques improved masticatory performance between baseline and 6 months and the ability to bite and chew all foods apart from hard apples. There were no significant differences in masticatory performance or chewing ability after 6 months between complete dentures made by a conventional or an abbreviated technique.

  11. Exploring pseudo- and chaotic random Monte Carlo simulations

    Science.gov (United States)

    Blais, J. A. Rod; Zhang, Zhan

    2011-07-01

    Computer simulations are an increasingly important area of geoscience research and development. At the core of stochastic or Monte Carlo simulations are the random number sequences that are assumed to be distributed with specific characteristics. Computer-generated random numbers, uniformly distributed on (0, 1), can be very different depending on the selection of pseudo-random number (PRN) or chaotic random number (CRN) generators. In the evaluation of some definite integrals, the resulting error variances can even be of different orders of magnitude. Furthermore, practical techniques for variance reduction such as importance sampling and stratified sampling can be applied in most Monte Carlo simulations and significantly improve the results. A comparative analysis of these strategies has been carried out for computational applications in planar and spatial contexts. Based on these experiments, and on some practical examples of geodetic direct and inverse problems, conclusions and recommendations concerning their performance and general applicability are included.
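    Of the variance-reduction strategies mentioned above, stratified sampling is the simplest to demonstrate. A minimal Python sketch for a one-dimensional definite integral (the integrand is chosen arbitrarily):

```python
import numpy as np

rng = np.random.default_rng(13)

g = lambda u: np.exp(u)          # integrand on (0, 1); exact integral is e - 1
n, reps = 1000, 2000

plain, strat = [], []
for _ in range(reps):
    # Plain Monte Carlo.
    plain.append(np.mean(g(rng.random(n))))
    # Stratified sampling: one point in each of n equal-width strata.
    u = (np.arange(n) + rng.random(n)) / n
    strat.append(np.mean(g(u)))

print(f"exact: {np.e - 1:.6f}")
print(f"plain MC variance:   {np.var(plain):.2e}")
print(f"stratified variance: {np.var(strat):.2e}")   # orders of magnitude smaller
```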

  12. Power distribution system reliability evaluation using dagger-sampling Monte Carlo simulation

    Energy Technology Data Exchange (ETDEWEB)

    Hu, Y.; Zhao, S.; Ma, Y. [North China Electric Power Univ., Hebei (China). Dept. of Electrical Engineering

    2009-03-11

    A dagger-sampling Monte Carlo simulation method was used to evaluate power distribution system reliability. The dagger-sampling technique was used to record the failure of a component as an incident and to determine its occurrence probability by generating incident samples using random numbers. The dagger sampling technique was combined with the direct sequential Monte Carlo method to calculate average values of load point indices and system indices. Results of the 2 methods with simulation times of up to 100,000 years were then compared. The comparative evaluation showed that less computing time was required using the dagger-sampling technique due to its higher convergence speed. When simulation times were 1000 years, the dagger-sampling method required 0.05 seconds to accomplish an evaluation, while the direct method required 0.27 seconds. 12 refs., 3 tabs., 4 figs.
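    A minimal sketch of the dagger-sampling idea for a single component, assuming the failure probability divides evenly into the dagger length; the probabilities and trial counts are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(17)

p = 0.05                        # component failure probability per trial
k = round(1 / p)                # dagger length: 20 trials per random number
n_blocks = 5000                 # 100,000 trials in total

# One U(0,1) number per block of k trials: partition [0, 1) into k bins of
# width p, and the bin the number falls into marks that block's failing trial.
u = rng.random(n_blocks)
fails = np.zeros(n_blocks * k, dtype=bool)
hit = u < k * p                                   # block contains a failure
idx = np.flatnonzero(hit) * k + np.minimum((u[hit] / p).astype(int), k - 1)
fails[idx] = True

# The expected failure rate matches direct sampling, but with k times fewer
# random numbers and negatively correlated trials within each block.
print(f"simulated failure rate: {fails.mean():.4f}  (target {p})")
```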

  13. Neutron activation analysis technique and X-ray fluorescence in bovine liver sample

    International Nuclear Information System (INIS)

    Maihara, V.A.; Favaro, D.I.T.; Vasconcellos, M.B.A.; Sato, I.M.; Salvador, V.L.

    2002-01-01

    Many analytical techniques have been used in food and diet analysis in order to determine a great number of nutritional elements, ranging from percentage levels to ng g⁻¹, with high sensitivity and accuracy. Instrumental Neutron Activation Analysis (INAA) has been employed to certify many trace elements in biological reference materials. More recently, X-Ray Fluorescence (WD-XRF) has also been used to determine some essential elements in food samples. INAA has been applied in nutrition studies in our laboratory at IPEN since the 1980s. For the development of analytical methodologies, the use of reference materials with the same characteristics as the analyzed sample is essential, yet several Brazilian laboratories cannot afford these materials due to their high cost. In this paper preliminary results of commercial bovine liver sample analyses obtained by the INAA and WD-XRF methods are presented. This sample was prepared to be a Brazilian candidate reference material for a group of laboratories participating in a research project sponsored by FAPESP. The concentrations of some elements like Cl, K, Na, P and S and the trace elements Br, Ca, Co, Cu, Fe, Mg, Mn, Mo, Rb, Se and Zn were determined by INAA and WD-XRF. For validation of both techniques, the NIST SRM 1577b Bovine Liver reference material was analyzed and the detection limits were calculated. The concentrations determined by the two analytical techniques were compared using Student's t-test, and for Cl, Cu, Fe, K, Mg, Na, Rb and Zn the results show no statistically significant difference at the 95% level. (author)

  14. Tobacco smoking surveillance: is quota sampling an efficient tool for monitoring national trends? A comparison with a random cross-sectional survey.

    Directory of Open Access Journals (Sweden)

    Romain Guignard

    Full Text Available OBJECTIVES: It is crucial for policy makers to monitor the evolution of tobacco smoking prevalence. In France, this monitoring is based on a series of cross-sectional general population surveys, the Health Barometers, conducted every five years and based on random samples. A methodological study has been carried out to assess the reliability of a monitoring system for smoking prevalence based on regular quota sampling surveys. DESIGN / OUTCOME MEASURES: In 2010, current and daily tobacco smoking prevalences obtained in a quota survey of 8,018 people were compared with those of the 2010 Health Barometer carried out on 27,653 people. Prevalences were assessed separately according to the telephone equipment of the interviewee (landline phone owner vs. "mobile-only"), and logistic regressions were conducted in the pooled database to assess the impact of the telephone equipment and of the survey mode on the prevalences found. Finally, logistic regressions adjusted for sociodemographic characteristics were conducted in the random sample in order to determine the impact of the number of calls needed to interview "hard-to-reach" people on the prevalence found. RESULTS: Current and daily prevalences were higher in the random sample (33.9% and 27.5%, respectively, among 15-75 year-olds) than in the quota sample (30.2% and 25.3%, respectively). In both surveys, current and daily prevalences were lower among landline phone owners (31.8% and 25.5% in the random sample; 28.9% and 24.0% in the quota survey). The required number of calls was slightly related to smoking status after adjustment for sociodemographic characteristics. CONCLUSION: Random sampling appears to be more effective than quota sampling, mainly by making it possible to interview hard-to-reach populations.

  15. Effect of novel inhaler technique reminder labels on the retention of inhaler technique skills in asthma: a single-blind randomized controlled trial.

    Science.gov (United States)

    Basheti, Iman A; Obeidat, Nathir M; Reddel, Helen K

    2017-02-09

    Inhaler technique can be corrected with training, but skills drop off quickly without repeated training. The aim of our study was to explore the effect of novel inhaler technique labels on the retention of correct inhaler technique. In this single-blind randomized parallel-group active-controlled study, clinical pharmacists enrolled asthma patients using controller medication by Accuhaler [Diskus] or Turbuhaler. Inhaler technique was assessed using published checklists (score 0-9). Symptom control was assessed by the asthma control test (ACT). Patients were randomized into active (ACCa; THa) and control (ACCc; THc) groups. All patients received a "Show-and-Tell" inhaler technique counseling service. Active patients also received inhaler labels highlighting their initial errors. Baseline data were available for 95 patients, 68% female, mean age 44.9 (SD 15.2) years. Mean inhaler scores were ACCa: 5.3 ± 1.0; THa: 4.7 ± 0.9; ACCc: 5.5 ± 1.1; THc: 4.2 ± 1.0. Asthma was poorly controlled (mean ACT scores ACCa: 13.9 ± 4.3; THa: 12.1 ± 3.9; ACCc: 12.7 ± 3.3; THc: 14.3 ± 3.7). After training, all patients had correct technique (score 9/9). After 3 months, there was significantly less decline in inhaler technique scores in the active groups than in the control groups (mean difference: Accuhaler -1.04 (95% confidence interval -1.92, -0.16; P = 0.022); Turbuhaler -1.61 (-2.63, -0.59; P = 0.003)). Symptom control improved significantly, with no significant difference between active and control patients, but active patients used less reliever medication (active 2.19 (SD 1.78) vs. control 3.42 (1.83) puffs/day, P = 0.002). After inhaler training, novel inhaler technique labels improve retention of correct inhaler technique skills with dry powder inhalers. Inhaler technique labels represent a simple, scalable intervention that has the potential to extend the benefit of inhaler training on asthma outcomes.

  16. Improved mesh based photon sampling techniques for neutron activation analysis

    International Nuclear Information System (INIS)

    Relson, E.; Wilson, P. P. H.; Biondo, E. D.

    2013-01-01

    The design of fusion power systems requires analysis of neutron activation of large, complex volumes, and of the resulting particles emitted from these volumes. Structured mesh-based discretization of these problems allows for improved modeling in these activation analysis problems. Finer discretization of these problems results in large computational costs, which drives the investigation of more efficient methods. Within an ad hoc subroutine of the Monte Carlo transport code MCNP, we implement sampling of voxels and photon energies for volumetric sources using the alias method. The alias method enables efficient sampling of a discrete probability distribution, and operates in O(1) time, whereas the simpler direct discrete method requires O(log n) time. By using the alias method, voxel sampling becomes a viable alternative to sampling space with the O(1) approach of uniformly sampling the problem volume. Additionally, with voxel sampling it is straightforward to introduce biasing of volumetric sources, and we implement this biasing of voxels as an additional variance reduction technique. We verify our implementation and compare the alias method, with and without biasing, to direct discrete sampling of voxels, and to uniform sampling. We study the behavior of source biasing in a second set of tests and find trends between improvements and source shape, material, and material density. Overall, however, the magnitude of improvements from source biasing appears to be limited. Future work will benefit from the implementation of efficient voxel sampling - particularly with conformal unstructured meshes where the uniform sampling approach cannot be applied. (authors)
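    The alias method itself is compact enough to show in full. Below is a standard Vose-style construction in Python (not the MCNP subroutine described in the paper), with a toy stand-in for a voxel-weight table:

```python
import random

def build_alias(probs):
    """Vose's alias method: O(n) setup enables O(1) sampling per draw."""
    n = len(probs)
    scaled = [p * n for p in probs]
    small = [i for i, s in enumerate(scaled) if s < 1.0]
    large = [i for i, s in enumerate(scaled) if s >= 1.0]
    prob, alias = [0.0] * n, [0] * n
    while small and large:
        s, l = small.pop(), large.pop()
        prob[s], alias[s] = scaled[s], l
        scaled[l] -= 1.0 - scaled[s]
        (small if scaled[l] < 1.0 else large).append(l)
    for i in small + large:        # leftovers have weight ~1 up to rounding
        prob[i] = 1.0
    return prob, alias

def draw(prob, alias):
    i = random.randrange(len(prob))                        # roll a fair n-sided die
    return i if random.random() < prob[i] else alias[i]    # flip a biased coin

# Toy stand-in for voxel emission probabilities:
weights = [0.5, 0.3, 0.15, 0.05]
prob, alias = build_alias(weights)
counts = [0] * len(weights)
for _ in range(100_000):
    counts[draw(prob, alias)] += 1
print([round(c / 100_000, 3) for c in counts])   # ~[0.5, 0.3, 0.15, 0.05]
```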

  17. Large-volume constant-concentration sampling technique coupling with surface-enhanced Raman spectroscopy for rapid on-site gas analysis.

    Science.gov (United States)

    Zhang, Zhuomin; Zhan, Yisen; Huang, Yichun; Li, Gongke

    2017-08-05

    In this work, a portable large-volume constant-concentration (LVCC) sampling technique coupling with surface-enhanced Raman spectroscopy (SERS) was developed for rapid on-site gas analysis based on suitable derivatization methods. The LVCC sampling technique mainly consisted of a specially designed sampling cell, including a rigid sample container and flexible sampling bag, and an absorption-derivatization module with a portable pump and a gas flowmeter. The LVCC sampling technique allowed a large, alterable and well-controlled sampling volume, which kept the concentration of the gas target in the headspace phase constant during the entire sampling process and made the sampling result more representative. Moreover, absorption and derivatization of the gas target during the LVCC sampling process were efficiently merged in one step using the bromine-thiourea and OPA-NH4+ strategies for ethylene and SO2, respectively, which made the LVCC sampling technique conveniently adaptable to subsequent SERS analysis. Finally, a new LVCC sampling-SERS method was developed and successfully applied to the rapid analysis of trace ethylene and SO2 from fruits. Trace ethylene and SO2 from real fruit samples could be actually and accurately quantified by this method, and the concentration fluctuations of ethylene and SO2 during the entire LVCC sampling process proved to be minor. It is expected that the portable LVCC sampling technique will pave the way for rapid on-site analysis of accurate concentrations of trace gas targets from real samples by SERS. Copyright © 2017 Elsevier B.V. All rights reserved.

  18. Iterative algorithm of discrete Fourier transform for processing randomly sampled NMR data sets

    International Nuclear Information System (INIS)

    Stanek, Jan; Kozminski, Wiktor

    2010-01-01

    Spectra obtained by application of multidimensional Fourier Transformation (MFT) to sparsely sampled nD NMR signals are usually corrupted due to missing data. In the present paper this phenomenon is investigated in simulations and experiments. An effective iterative algorithm for artifact suppression in sparse on-grid NMR data sets is discussed in detail. It includes automated peak recognition based on statistical methods. The results enable one to study NMR spectra with a high dynamic range of peak intensities while preserving the benefits of random sampling, namely the superior resolution in indirectly measured dimensions. Experimental examples include 3D 15N- and 13C-edited NOESY-HSQC spectra of human ubiquitin.
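    The artifact-suppression loop described above resembles CLEAN-type iterations. The following one-dimensional Python sketch (synthetic two-peak signal, on-grid random sampling schedule, fixed iteration count) illustrates the pick-peak / subtract-at-sampled-points cycle; it omits the paper's statistical peak recognition.

```python
import numpy as np

rng = np.random.default_rng(19)

n, keep = 512, 96
t = np.arange(n)
k1, k2 = 56, 159                                  # on-grid "peak" positions
signal = np.exp(2j * np.pi * k1 * t / n) + 0.2 * np.exp(2j * np.pi * k2 * t / n)

idx = np.sort(rng.choice(n, size=keep, replace=False))  # random on-grid schedule
data = np.zeros(n, complex)
data[idx] = signal[idx]                                  # zero-filled sparse data

# CLEAN-style loop: take the strongest spectral peak, add it to the model,
# subtract its contribution at the sampled points only, and repeat.
model = np.zeros(n, complex)
residual = data.copy()
for _ in range(25):
    spec = np.fft.fft(residual)
    kmax = int(np.argmax(np.abs(spec)))
    amp = spec[kmax] / keep                 # rescale for the sampling density
    model[kmax] += amp
    residual[idx] -= amp * np.exp(2j * np.pi * kmax * t[idx] / n)

top = sorted(int(k) for k in np.argsort(np.abs(model))[-2:])
print(f"recovered peaks at bins {top} (true: [{k1}, {k2}])")
```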

  19. Application of digital sampling techniques to particle identification in scintillation detectors

    International Nuclear Information System (INIS)

    Bardelli, L.; Bini, M.; Poggi, G.; Taccetti, N.

    2002-01-01

    In this paper, the use of a fast digitizing system for identification of fast charged particles with scintillation detectors is discussed. The three-layer phoswich detectors developed in the framework of the FIASCO experiment for the detection of light charged particles (LCP) and intermediate mass fragments (IMF) emitted in heavy-ion collisions at Fermi energies are briefly discussed. The standard analog electronics treatment of the signals for particle identification is illustrated. After a description of the digitizer designed to perform a fast digital sampling of the phoswich signals, the feasibility of particle identification on the sampled data is demonstrated. The results obtained with two different pulse shape discrimination analyses based on the digitally sampled data are compared with the standard analog signal treatment. The obtained results suggest, for the present application, the replacement of the analog methods with the digital sampling technique

  20. Uranium content measurement in drinking water samples using track etch technique

    International Nuclear Information System (INIS)

    Kumar, Mukesh; Kumar, Ajay; Singh, Surinder; Mahajan, R.K.; Walia, T.P.S.

    2003-01-01

    The concentration of uranium has been assessed in drinking water samples collected from different locations in Bathinda district, Punjab, India. The water samples were taken from hand pumps and tube wells. Uranium was determined using the fission track technique. Uranium concentration in the water samples varies from 1.65±0.06 to 74.98±0.38 μg/l. These values are compared with the safe limit recommended for drinking water, and most of the samples are found to have uranium concentrations above that limit. Analysis of some heavy metals (Zn, Cd, Pb and Cu) in the water was also carried out to see whether a correlation exists between the concentration of uranium and these heavy metals. A weak positive correlation has been observed between the concentration of uranium and the heavy metals Pb, Cd and Cu

  1. Attempts to develop a new nuclear measurement technique of β-glucuronidase levels in biological samples

    International Nuclear Information System (INIS)

    Unak, T.; Avcibasi, U.; Yildirim, Y.; Cetinkaya, B.

    2003-01-01

    β-Glucuronidase is one of the most important hydrolytic enzymes in living systems and plays an essential role in the detoxification pathway of toxic materials incorporated into the metabolism. Some organs, especially the liver, and some tumour tissues have high levels of β-glucuronidase activity. Because of the enzymatic activity of some kinds of tumour cells, radiolabelled glucuronide conjugates of cytotoxic, as well as radiotoxic, compounds have potentially very valuable diagnostic and therapeutic applications in cancer research. For this reason, a sensitive measurement of β-glucuronidase levels in normal and tumour tissues is a very important step for these kinds of applications. In the classical measurement method of β-glucuronidase activity, the quantity of phenolphthalein liberated from its glucuronide conjugate, i.e. phenolphthalein-glucuronide, by β-glucuronidase is measured by the spectrophotometric technique. The lower detection limit of phenolphthalein by the spectrophotometric technique is about 1-3 mg. This means that β-glucuronidase levels cannot be detected in biological samples with lower levels of β-glucuronidase activity, and the applications of the spectrophotometric technique in cancer research are therefore seriously limited. Starting from this consideration, we recently attempted to develop a new nuclear technique to measure much lower concentrations of β-glucuronidase in biological samples. To improve the detection limit, phenolphthalein-glucuronide and also phenyl-N-glucuronide were radioiodinated with 131 I and their radioactivity was measured by use of the counting technique. The quantity of 131 I-labelled phenolphthalein or aniline liberated by the deglucuronidation activity of β-glucuronidase was thus used to measure levels lower than those accessible to the spectrophotometric technique. The results obtained clearly verified that 0.01 pg level of

  2. Random-Access Technique for Self-Organization of 5G Millimeter-Wave Cellular Communications

    Directory of Open Access Journals (Sweden)

    Jasper Meynard Arana

    2016-01-01

    Full Text Available The random-access (RA) technique is a key procedure in cellular networks and self-organizing networks (SONs), but the overall processing time of this technique in millimeter-wave (mm-wave) cellular systems with directional beams is very long because RA preambles (RAPs) should be transmitted in all directions of Tx and Rx beams. In this paper, two different types of preambles (RAP-1 and RAP-2) are proposed to reduce the processing time in the RA stage. After analyzing the correlation property, false-alarm probability, and detection probability of the proposed RAPs, we perform simulations to show that RAP-2 is suitable for RA in mm-wave cellular systems with directional beams because of its smaller processing time and high detection probability in multiuser environments.

  3. Comparison between ultrasound guided technique and digital palpation technique for radial artery cannulation in adult patients: An updated meta-analysis of randomized controlled trials.

    Science.gov (United States)

    Bhattacharjee, Sulagna; Maitra, Souvik; Baidya, Dalim K

    2018-03-22

    Possible advantages and risks associated with ultrasound guided radial artery cannulation in comparison to the digital palpation guided method in adult patients are not fully known. We have compared ultrasound guided radial artery cannulation with the digital palpation technique in this meta-analysis. Meta-analysis of randomized controlled trials. Trials conducted in the operating room, emergency department, and cardiac catheterization laboratory. PubMed and the Cochrane Central Register of Controlled Trials (CENTRAL) were searched (from 1946 to 20th November 2017) to identify prospective randomized controlled trials in adult patients. Two-dimensional ultrasound guided radial artery catheterization versus digital palpation guided radial artery cannulation. Overall cannulation success rate, first attempt success rate, time to cannulation and mean number of attempts to successful cannulation. Odds ratio (OR) and standardized mean difference (SMD) or mean difference (MD) with 95% confidence interval (CI) were calculated for categorical and continuous variables respectively. Data of 1895 patients from 10 studies have been included in this meta-analysis. Overall cannulation success rate was similar between the ultrasound guided technique and digital palpation [OR (95% CI) 2.01 (1.00, 4.06); p = 0.05]. Ultrasound guided radial artery cannulation is associated with a higher first attempt success rate in comparison to digital palpation [OR (95% CI) 2.76 (1.86, 4.10); p < 0.001]. Radial artery cannulation by ultrasound guidance may increase the first attempt success rate but not the overall cannulation success when compared to the digital palpation technique. However, results of this meta-analysis should be interpreted with caution due to the presence of heterogeneity. Copyright © 2018. Published by Elsevier Inc.
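
    For readers unfamiliar with the statistics reported above, the following is a minimal Python sketch of an odds ratio with its 95% confidence interval computed from a 2×2 table via the Woolf (log) method; the counts are hypothetical, not data from the meta-analysis.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """OR and 95% CI for a 2x2 table via the Woolf (log) method:
    a/b = events/non-events in one arm, c/d in the other."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical first-attempt success/failure counts, ultrasound vs palpation.
print(odds_ratio_ci(150, 50, 100, 92))
```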

  4. Compressive Sampling based Image Coding for Resource-deficient Visual Communication.

    Science.gov (United States)

    Liu, Xianming; Zhai, Deming; Zhou, Jiantao; Zhang, Xinfeng; Zhao, Debin; Gao, Wen

    2016-04-14

    In this paper, a new compressive sampling based image coding scheme is developed to achieve competitive coding efficiency at lower encoder computational complexity, while supporting error resilience. This technique is particularly suitable for visual communication with resource-deficient devices. At the encoder, a compact image representation is produced, which is a polyphase down-sampled version of the input image; but the conventional low-pass filter prior to down-sampling is replaced by a local random binary convolution kernel. The pixels of the resulting down-sampled pre-filtered image are local random measurements placed in the original spatial configuration. The advantages of local random measurements are twofold: 1) they preserve high-frequency image features that would otherwise be discarded by low-pass filtering; 2) the result remains a conventional image and can therefore be coded by any standardized codec to remove statistical redundancy at larger scales. Moreover, measurements generated by different kernels can be considered as multiple descriptions of the original image, so the proposed scheme has the advantage of multiple description coding. At the decoder, a unified sparsity-based soft-decoding technique is developed to recover the original image from the received measurements in a compressive sensing framework. Experimental results demonstrate that the proposed scheme is competitive with existing methods, with the unique strength of recovering fine details and sharp edges at low bit-rates.
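
    A minimal NumPy sketch of the encoder idea described above, assuming a simple normalized random binary kernel and a down-sampling factor of 2; this illustrates the concept only and is not the authors' implementation.

```python
import numpy as np
from scipy.signal import convolve2d

def random_measurement_image(img, factor=2, ksize=3, seed=0):
    """Replace the usual anti-alias low-pass filter with a local random
    binary convolution kernel, then polyphase down-sample: each kept
    pixel is a local random measurement of its neighborhood."""
    rng = np.random.default_rng(seed)
    kernel = rng.integers(0, 2, (ksize, ksize)).astype(float)
    if kernel.sum() == 0:                       # degenerate all-zero draw
        kernel[ksize // 2, ksize // 2] = 1.0
    kernel /= kernel.sum()                      # keep the dynamic range
    filtered = convolve2d(img, kernel, mode="same", boundary="symm")
    return filtered[::factor, ::factor]         # polyphase down-sampling

img = np.random.rand(64, 64)       # stand-in for an input image
meas = random_measurement_image(img)
print(meas.shape)                  # (32, 32): still a conventional image
```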

  5. Development of Large Sample Neutron Activation Technique for New Applications in Thailand

    International Nuclear Information System (INIS)

    Laoharojanaphand, S.; Tippayakul, C.; Wonglee, S.; Channuie, J.

    2018-01-01

    The development of Large Sample Neutron Activation Analysis (LSNAA) in Thailand is presented in this paper. The technique was first developed with rice as the test sample. The Thai Research Reactor-1/Modification 1 (TRR-1/M1) was used as the neutron source. The first step was to select and characterize an appropriate irradiation facility for the research. An out-core irradiation facility (A4 position) was attempted first. The results obtained with the A4 facility were then used as guides for the subsequent experiments with the thermal column facility. The characterization of the thermal column was performed with Cu wire to determine the spatial flux distribution without and with a rice sample. The flux depression without the rice sample was observed to be less than 30%, while the flux depression with the rice sample increased to about 60%. Flux monitors internal to the rice sample were used to determine the average flux over the sample. The gamma self-shielding effect during gamma measurement was corrected using Monte Carlo simulation. The ratio between the efficiencies of the volume source and the point source for each energy point was calculated with the MCNPX code. The research team adopted the k0-NAA methodology to calculate the element concentrations. The k0-NAA program developed by the IAEA was set up to simulate the conditions of the irradiation and measurement facilities used in this research. The element concentrations in the bulk rice sample were then calculated taking into account the flux depression and gamma efficiency corrections. At the moment, the results still show large discrepancies with the reference values; however, more research on validation will be performed to identify the sources of error. Moreover, this LSNAA technique was introduced for the activation analysis of the IAEA archaeological mock-up. The results are provided in this report. (author)

  6. Efficient sampling algorithms for Monte Carlo based treatment planning

    International Nuclear Information System (INIS)

    DeMarco, J.J.; Solberg, T.D.; Chetty, I.; Smathers, J.B.

    1998-01-01

    Efficient sampling algorithms are necessary for producing a fast Monte Carlo based treatment planning code. This study evaluates several aspects of a photon-based tracking scheme and the effect of optimal sampling algorithms on the efficiency of the code. Four areas were tested: pseudo-random number generation, generalized sampling of a discrete distribution, sampling from the exponential distribution, and delta scattering as applied to photon transport through a heterogeneous simulation geometry. Generalized sampling of a discrete distribution using the cutpoint method can produce speedup gains of one order of magnitude versus conventional sequential sampling. Photon transport modifications based upon the delta scattering method were implemented and compared with a conventional boundary and collision checking algorithm. The delta scattering algorithm is faster by a factor of six versus the conventional algorithm for a boundary size of 5 mm within a heterogeneous geometry. A comparison of portable pseudo-random number algorithms and exponential sampling techniques is also discussed
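
    The cutpoint method mentioned above can be sketched as follows: a table of m cutpoints indexes into the cumulative distribution so that each draw requires only a short sequential scan instead of a full search. This is a generic Chen-Asau-style illustration in Python, not the code evaluated in the study.

```python
import random

def build_cutpoints(p, m):
    """Cumulative distribution q plus m cutpoints: cut[j] is the first
    index i with q[i] > j/m (Chen-Asau cutpoint method)."""
    q, c = [], 0.0
    for pi in p:
        c += pi
        q.append(c)
    q[-1] = 1.0                       # guard against round-off
    cut, i = [], 0
    for j in range(m):
        while q[i] <= j / m:
            i += 1
        cut.append(i)
    return q, cut

def draw(q, cut, m):
    """One draw: jump to the cutpoint, then a short sequential scan."""
    u = random.random()
    i = cut[int(u * m)]
    while q[i] < u:
        i += 1
    return i

q, cut = build_cutpoints([0.1, 0.5, 0.05, 0.35], m=8)
print([draw(q, cut, 8) for _ in range(10)])
```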

  7. Integrated sampling vs ion chromatography: Mathematical considerations

    International Nuclear Information System (INIS)

    Sundberg, L.L.

    1992-01-01

    This paper presents some general purpose considerations that can be utilized when comparisons are made between the results of integrated sampling over several hours or days and ion chromatography, where sample collection times are measured in minutes. The discussion is geared toward the measurement of soluble transition metal ions in BWR feedwater. Under steady-state conditions, the concentrations reported by both techniques should be in reasonable agreement. Transient operations affect both types of measurements. A simplistic model, applicable to both sampling techniques, is presented that demonstrates the effect of transients which occur during the acquisition of a steady-state sample. For a common set of conditions, the integrated concentration is proportional to the concentration and duration of the transient, and inversely proportional to the sample collection time. Adjusting the collection period during a known transient allows an estimation of the peak transient concentration. Though the probability of sampling a random transient with the integrated sampling technique is very high, its magnitude is severely diluted with long integration times. Transient concentrations are magnified with ion chromatography, but the probability of sampling a transient is significantly lower under normal ion chromatography operations. Various data averaging techniques are discussed for integrated sampling and IC determinations; a worked example of the time-weighted average is sketched below. The use of time-weighted averages appears to offer advantages over arithmetic and geometric means for integrated sampling when the collection period is variable. For replicate steady-state ion chromatography determinations which bracket a transient sample, it may be advantageous to forgo the calculation of averages and report the data as trending information only.
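
    A time-weighted average of the kind recommended above is straightforward to compute; a minimal Python sketch with hypothetical concentrations and collection periods:

```python
def time_weighted_average(concentrations, durations):
    """Time-weighted mean concentration for samples with unequal
    collection periods; reduces the bias of a plain arithmetic mean
    when integration times vary."""
    total_time = sum(durations)
    return sum(c * t for c, t in zip(concentrations, durations)) / total_time

# Hypothetical feedwater concentrations (ppb) and collection periods (h):
# a 2-h transient sample bracketed by two 24-h integrated samples.
print(time_weighted_average([0.5, 4.0, 0.6], [24, 2, 24]))
```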

  8. Contributions to sampling statistics

    CERN Document Server

    Conti, Pier; Ranalli, Maria

    2014-01-01

    This book contains a selection of the papers presented at the ITACOSM 2013 Conference, held in Milan in June 2013. ITACOSM is the bi-annual meeting of the Survey Sampling Group S2G of the Italian Statistical Society, intended as an international forum of scientific discussion on the developments of theory and application of survey sampling methodologies and applications in human and natural sciences. The book gathers research papers carefully selected from both invited and contributed sessions of the conference. The whole book appears to be a relevant contribution to various key aspects of sampling methodology and techniques; it deals with some hot topics in sampling theory, such as calibration, quantile-regression and multiple frame surveys, and with innovative methodologies in important topics of both sampling theory and applications. Contributions cut across current sampling methodologies such as interval estimation for complex samples, randomized responses, bootstrap, weighting, modeling, imputati...

  9. Computer code ENDSAM for random sampling and validation of the resonance parameters covariance matrices of some major nuclear data libraries

    International Nuclear Information System (INIS)

    Plevnik, Lucijan; Žerovnik, Gašper

    2016-01-01

    Highlights: • Methods for random sampling of correlated parameters. • Link to open-source code for sampling of resonance parameters in ENDF-6 format. • Validation of the code on realistic and artificial data. • Validation of covariances in three major contemporary nuclear data libraries. - Abstract: Methods for random sampling of correlated parameters are presented. The methods are implemented for sampling of resonance parameters in ENDF-6 format and a link to the open-source code ENDSAM is given. The code has been validated on realistic data. Additionally, consistency of covariances of resonance parameters of three major contemporary nuclear data libraries (JEFF-3.2, ENDF/B-VII.1 and JENDL-4.0u2) has been checked.
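
    A common way to implement random sampling of correlated parameters, as described above, is to transform independent standard normal draws with the Cholesky factor of the covariance matrix; a minimal Python sketch with an illustrative covariance (not the ENDSAM code):

```python
import numpy as np

def sample_correlated(mean, cov, n, seed=0):
    """Draw n vectors of correlated parameters: transform i.i.d. standard
    normals with the Cholesky factor of the covariance matrix."""
    rng = np.random.default_rng(seed)
    L = np.linalg.cholesky(cov)
    z = rng.standard_normal((n, len(mean)))
    return mean + z @ L.T

mean = np.array([1.0, 2.0])
cov = np.array([[0.04, 0.03],
                [0.03, 0.09]])        # illustrative covariance matrix
samples = sample_correlated(mean, cov, 100_000)
print(np.cov(samples.T))              # ~ cov, validating the sampling
```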

  10. Direct sampling technique of bees on Vriesea philippocoburgii (Bromeliaceae, Tillandsioideae flowers

    Directory of Open Access Journals (Sweden)

    Afonso Inácio Orth

    2004-11-01

    Full Text Available In our study on Vriesea philippocoburgii Wawra pollination, due to the small proportion of flowers in anthesis on a single day and the damage caused to inflorescences when netting directly on flowers, we used the direct sampling technique (DST) of bees on flowers. This technique was applied to 40 flowering plants and resulted in the capture of 160 specimens, belonging to nine genera of Apoidea and separated into 19 morphospecies. As DST maintains the integrity of flowers for later bee visits, it can enhance survey performance, constituting an alternative methodology for the collection of bees visiting flowering plants.

  11. Combined-stent covered technique and single covered stent technique for transjugular intrahepatic porto-systemic shunt: a prospective randomized controlled study

    International Nuclear Information System (INIS)

    Wang Changming; Li Xuan; Fu Jun; Lu Xianjun; Luan Jingyuan; Li Tianrun; Zhao Jun; Dong Guoxiang

    2014-01-01

    Objective: To compare the technique of combined stents with that of a single stent-graft for the construction of transjugular intrahepatic porto-systemic shunt (TIPS) and to discuss their clinical outcomes. Methods: During the period from April 2011 to Dec. 2012, a total of 30 patients with upper gastrointestinal bleeding due to portal hypertension were admitted to the hospital. The TIPS procedure was carried out in all 30 patients. The patients were randomly allocated to either the combined-stent group (n=17) or the stent-graft group (control group, n=13) on the basis of a computer-generated randomization sequence. The PSG quality control, the patency rate of portal vein branches and the shunts, the incidence of encephalopathy and the survival rate of the two groups were documented and analyzed. Results: The technical success rate was 100%. After the treatment, in both groups the PSG became significantly lower than the PSG determined before the treatment (P<0.001). According to the quality control chart, the technique of combined stents was superior to the stent-graft technique in effectively controlling PSG. Five days after TIPS, the ammonia level in the combined-stent group was significantly decreased (P=0.029), while in the control group the ammonia level showed no significant change (P=0.065). One patient died shortly after TIPS. The median follow-up time was 181 days. During the follow-up period death occurred in 3 cases in each group. The difference in survival rate between the two groups was not significant (P=0.906). Of the three patients who developed encephalopathy, two were in the combined-stent group and one was in the control group. After medication the symptoms of encephalopathy were relieved. The patency rate of the shunts was 100% and the re-bleeding rate was 0% for both groups. The patency rate of portal vein branches was significantly higher in the combined-stent group than in the control group (P=0.039). Conclusion: For the construction of TIPS, the technique of

  12. ON THE ESTIMATION OF RANDOM UNCERTAINTIES OF STAR FORMATION HISTORIES

    Energy Technology Data Exchange (ETDEWEB)

    Dolphin, Andrew E., E-mail: adolphin@raytheon.com [Raytheon Company, Tucson, AZ, 85734 (United States)

    2013-09-20

    The standard technique for measurement of random uncertainties of star formation histories (SFHs) is the bootstrap Monte Carlo, in which the color-magnitude diagram (CMD) is repeatedly resampled. The variation in SFHs measured from the resampled CMDs is assumed to represent the random uncertainty in the SFH measured from the original data. However, this technique systematically and significantly underestimates the uncertainties for times in which the measured star formation rate is low or zero, leading to overly (and incorrectly) high confidence in that measurement. This study proposes an alternative technique, the Markov Chain Monte Carlo (MCMC), which samples the probability distribution of the parameters used in the original solution to directly estimate confidence intervals. While the most commonly used MCMC algorithms are incapable of adequately sampling a probability distribution that can involve thousands of highly correlated dimensions, the Hybrid Monte Carlo algorithm is shown to be extremely effective and efficient for this particular task. Several implementation details, such as the handling of implicit priors created by parameterization of the SFH, are discussed in detail.
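
    A minimal sketch of the MCMC idea in Python: a plain random-walk Metropolis sampler (simpler than the Hybrid Monte Carlo algorithm the paper advocates) whose chain percentiles give confidence intervals directly. The toy one-dimensional posterior is illustrative only.

```python
import numpy as np

def metropolis(log_post, x0, n_steps=20_000, step=0.5, seed=0):
    """Random-walk Metropolis: percentiles of the chain give the
    confidence intervals directly, as in the MCMC approach above."""
    rng = np.random.default_rng(seed)
    x, lp = x0, log_post(x0)
    chain = []
    for _ in range(n_steps):
        prop = x + step * rng.standard_normal()
        lp_prop = log_post(prop)
        if np.log(rng.random()) < lp_prop - lp:   # accept/reject
            x, lp = prop, lp_prop
        chain.append(x)
    return np.array(chain)

# Toy 1-D "star formation rate" posterior, truncated at zero.
log_post = lambda r: -0.5 * ((r - 0.2) / 0.3) ** 2 if r >= 0 else -np.inf
chain = metropolis(log_post, x0=0.5)
print(np.percentile(chain[5_000:], [16, 84]))     # 68% interval
```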

  13. ON THE ESTIMATION OF RANDOM UNCERTAINTIES OF STAR FORMATION HISTORIES

    International Nuclear Information System (INIS)

    Dolphin, Andrew E.

    2013-01-01

    The standard technique for measurement of random uncertainties of star formation histories (SFHs) is the bootstrap Monte Carlo, in which the color-magnitude diagram (CMD) is repeatedly resampled. The variation in SFHs measured from the resampled CMDs is assumed to represent the random uncertainty in the SFH measured from the original data. However, this technique systematically and significantly underestimates the uncertainties for times in which the measured star formation rate is low or zero, leading to overly (and incorrectly) high confidence in that measurement. This study proposes an alternative technique, the Markov Chain Monte Carlo (MCMC), which samples the probability distribution of the parameters used in the original solution to directly estimate confidence intervals. While the most commonly used MCMC algorithms are incapable of adequately sampling a probability distribution that can involve thousands of highly correlated dimensions, the Hybrid Monte Carlo algorithm is shown to be extremely effective and efficient for this particular task. Several implementation details, such as the handling of implicit priors created by parameterization of the SFH, are discussed in detail

  14. Identification of unknown sample using NAA, EDXRF, XRD techniques

    International Nuclear Information System (INIS)

    Dalvi, Aditi A.; Swain, K.K.; Chavan, Trupti; Remya Devi, P.S.; Wagh, D.N.; Verma, R.

    2015-01-01

    Analytical Chemistry Division (ACD), Bhabha Atomic Research Centre (BARC) receives samples from law enforcement agencies such as the Directorate of Revenue Intelligence and Customs for analysis. Five unknown grey powdered samples were received for identification and were suspected to be iridium (Ir). Identification of an unknown sample is always a challenging task, and suitable analytical techniques have to be judiciously utilized to arrive at a conclusion. Qualitative analysis was carried out using a Jordan Valley EX-3600 M energy dispersive X-ray fluorescence (EDXRF) spectrometer at ACD, BARC. An SLP series LEO Si(Li) detector (active area: 30 mm²; thickness: 3.5 mm; resolution: 140 eV at 5.9 keV of Mn K X-ray) was used during the measurement, and only characteristic X-rays of Ir (Lα: 9.17 keV and Lβ: 10.70 keV) were seen in the X-ray spectrum. X-ray diffraction (XRD) measurement results indicated that the Ir was in the form of metal. To confirm the XRD data, neutron activation analysis (NAA) was carried out by irradiating samples and elemental standards (as comparators) in the graphite reflector position of the Advanced Heavy Water Reactor Critical Facility (AHWR CF) reactor, BARC, Mumbai. After a suitable decay period, gamma activity measurements were carried out using a 45% HPGe detector coupled to an 8k multichannel analyzer. The characteristic gamma line at 328.4 keV of the activation product 194Ir was used for quantification of iridium, and the relative method of NAA was used for concentration calculations. The NAA results confirmed that all the samples were iridium metal. (author)

  15. Improving ambulatory saliva-sampling compliance in pregnant women: a randomized controlled study.

    Directory of Open Access Journals (Sweden)

    Julian Moeller

    Full Text Available OBJECTIVE: Noncompliance with scheduled ambulatory saliva sampling is common and has been associated with biased cortisol estimates in nonpregnant subjects. This study is the first to investigate in pregnant women strategies to improve ambulatory saliva-sampling compliance, and the association between sampling noncompliance and saliva cortisol estimates. METHODS: We instructed 64 pregnant women to collect eight scheduled saliva samples on two consecutive days each. Objective compliance with scheduled sampling times was assessed with a Medication Event Monitoring System and self-reported compliance with a paper-and-pencil diary. In a randomized controlled study, we estimated whether a disclosure intervention (informing women about objective compliance monitoring) and a reminder intervention (use of acoustical reminders) improved compliance. A mixed model analysis was used to estimate associations between women's objective compliance and their diurnal cortisol profiles, and between deviation from scheduled sampling and the cortisol concentration measured in the related sample. RESULTS: Self-reported compliance with the saliva-sampling protocol was 91%, and objective compliance was 70%. The disclosure intervention was associated with improved objective compliance (informed: 81%, noninformed: 60%; F(1,60) = 17.64, p<0.001), but not the reminder intervention (reminders: 68%, without reminders: 72%; F(1,60) = 0.78, p = 0.379). Furthermore, a woman's increased objective compliance was associated with a higher diurnal cortisol profile, F(2,64) = 8.22, p<0.001. Altered cortisol levels were observed in less objectively compliant samples, F(1,705) = 7.38, p = 0.007, with delayed sampling associated with lower cortisol levels. CONCLUSIONS: The results suggest that in pregnant women, objective noncompliance with scheduled ambulatory saliva sampling is common and is associated with biased cortisol estimates. To improve sampling compliance, results suggest

  16. Development of SYVAC sampling techniques

    International Nuclear Information System (INIS)

    Prust, J.O.; Dalrymple, G.J.

    1985-04-01

    This report describes the requirements of a sampling scheme for use with the SYVAC radiological assessment model. The constraints on the number of samples that may be taken are considered. The conclusions from earlier studies using the deterministic generator sampling scheme are summarised. The method of importance sampling and a high dose algorithm, which are designed to preferentially sample the high dose region of the parameter space, are reviewed in the light of experience gained from earlier studies and the requirements of site assessment and sensitivity analyses. In addition, the use of an alternative numerical integration method for estimating risk is discussed. It is recommended that the method of importance sampling be developed and tested for use with SYVAC. An alternative numerical integration method is not recommended for investigation at this stage but should be the subject of future work. (author)
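
    Importance sampling of the kind recommended above reweights draws from a proposal that is biased toward the region of interest; a minimal Python sketch on a toy tail-probability problem (not the SYVAC parameter space):

```python
import numpy as np
from scipy.stats import norm

def importance_estimate(f, target_pdf, proposal_pdf, sampler, n, seed=0):
    """Estimate E_target[f(X)] from proposal draws reweighted by the
    density ratio target/proposal."""
    rng = np.random.default_rng(seed)
    x = sampler(rng, n)
    w = target_pdf(x) / proposal_pdf(x)
    return np.mean(w * f(x))

# Toy "high dose" problem: P(X > 4) for X ~ N(0,1), with the proposal
# shifted into the tail so that rare draws become common.
f = lambda x: (x > 4).astype(float)
est = importance_estimate(f, norm(0, 1).pdf, norm(4, 1).pdf,
                          lambda rng, n: rng.normal(4, 1, n), 100_000)
print(est, 1 - norm.cdf(4))   # estimate vs exact tail probability
```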

  17. Field-based random sampling without a sampling frame: control selection for a case-control study in rural Africa.

    Science.gov (United States)

    Crampin, A C; Mwinuka, V; Malema, S S; Glynn, J R; Fine, P E

    2001-01-01

    Selection bias, particularly of controls, is common in case-control studies and may materially affect the results. Methods of control selection should be tailored both for the risk factors and disease under investigation and for the population being studied. We present here a control selection method devised for a case-control study of tuberculosis in rural Africa (Karonga, northern Malawi) that selects an age/sex frequency-matched random sample of the population, with a geographical distribution in proportion to the population density. We also present an audit of the selection process, and discuss the potential of this method in other settings.

  18. Technical note: Alternatives to reduce adipose tissue sampling bias.

    Science.gov (United States)

    Cruz, G D; Wang, Y; Fadel, J G

    2014-10-01

    Understanding the mechanisms by which nutritional and pharmaceutical factors can manipulate adipose tissue growth and development in production animals has direct and indirect effects on the profitability of an enterprise. Adipocyte cellularity (number and size) is a key biological response that is commonly measured in animal science research. The variability and sampling of adipocyte cellularity within a muscle have been addressed in previous studies, but no attempt to critically investigate these issues has been made in the literature. The present study evaluated 2 sampling techniques (random and systematic) in an attempt to minimize sampling bias and to determine the minimum number of samples, from 1 to 15, needed to represent the overall adipose tissue in the muscle. Both sampling procedures were applied to adipose tissue samples dissected from 30 longissimus muscles from cattle finished either on grass or grain. Briefly, adipose tissue samples were fixed with osmium tetroxide, and the size and number of adipocytes were determined with a Coulter Counter. These results were then fit to a finite mixture model to obtain distribution parameters for each sample. To evaluate the benefits of increasing the number of samples and the advantage of the new sampling technique, the concept of acceptance ratio was used; simply stated, the higher the acceptance ratio, the better the representation of the overall population. As expected, a great improvement in the estimation of the overall adipocyte cellularity parameters was observed with both sampling techniques when the sample size increased from 1 to 15, with both techniques' acceptance ratios increasing from approximately 3 to 25%. When comparing sampling techniques, the systematic procedure slightly improved parameter estimation. The results suggest that more detailed research using other sampling techniques may provide better estimates for minimum sampling.

  19. An Intrinsic Algorithm for Parallel Poisson Disk Sampling on Arbitrary Surfaces.

    Science.gov (United States)

    Ying, Xiang; Xin, Shi-Qing; Sun, Qian; He, Ying

    2013-03-08

    Poisson disk sampling plays an important role in a variety of visual computing applications, due to its useful statistical distribution properties and the absence of aliasing artifacts. While many effective techniques have been proposed to generate Poisson disk distributions in Euclidean space, relatively little work has been reported for the surface counterpart. This paper presents an intrinsic algorithm for parallel Poisson disk sampling on arbitrary surfaces. We propose a new technique for parallelizing the dart throwing. Rather than the conventional approaches that explicitly partition the spatial domain to generate the samples in parallel, our approach assigns each sample candidate a random and unique priority that is unbiased with regard to the distribution. Hence, multiple threads can process the candidates simultaneously and resolve conflicts by checking the given priority values. It is worth noting that our algorithm is accurate, as the generated Poisson disks are uniformly and randomly distributed without bias. Our method is intrinsic in that all the computations are based on the intrinsic metric and are independent of the embedding space. This intrinsic feature allows us to generate Poisson disk distributions on arbitrary surfaces. Furthermore, by manipulating the spatially varying density function, we can obtain adaptive sampling easily.
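
    For context, the sequential baseline that the paper parallelizes is classical dart throwing; a minimal Euclidean 2-D Python sketch (the paper's contribution, priority-based parallel conflict resolution on surfaces, is not reproduced here):

```python
import numpy as np

def dart_throwing(radius, n_trials=10_000, seed=0):
    """Accept a candidate only if it lies at least `radius` from every
    previously accepted sample in the unit square."""
    rng = np.random.default_rng(seed)
    pts = []
    for _ in range(n_trials):
        c = rng.random(2)
        if all(np.hypot(*(c - p)) >= radius for p in pts):
            pts.append(c)
    return np.array(pts)

print(len(dart_throwing(0.05)))   # number of accepted Poisson disk samples
```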

  20. Nonactivation interaction techniques in the analysis of environmental samples

    International Nuclear Information System (INIS)

    Tolgyessy, J.

    1986-01-01

    Nonactivation interaction analytical methods are based on the interaction of nuclear and X-ray radiation with a sample, leading to absorption and backscattering, to the ionization of gases, or to the excitation of fluorescent X-rays, but not to the activation of the determined elements. From the point of view of environmental analysis, the most useful nonactivation interaction techniques are X-ray fluorescence with photon or charged-particle excitation, ionization of gases by nuclear radiation, elastic scattering of charged particles, and backscattering of beta radiation. A significant advantage of these methods is that they are nondestructive. (author)

  1. Comparison of Techniques for Sampling Adult Necrophilous Insects From Pig Carcasses.

    Science.gov (United States)

    Cruise, Angela; Hatano, Eduardo; Watson, David W; Schal, Coby

    2018-02-06

    Studies of the pre-colonization interval and mechanisms driving necrophilous insect ecological succession depend on effective sampling of adult insects and knowledge of their diel and successional activity patterns. The number of insects trapped, their diversity, and diel periodicity were compared with four sampling methods on neonate pigs. Sampling method, time of day and decomposition age of the pigs significantly affected the number of insects sampled from pigs. We also found significant interactions of sampling method and decomposition day, time of sampling and decomposition day. No single method was superior to the other methods during all three decomposition days. Sampling times after noon yielded the largest samples during the first 2 d of decomposition. On day 3 of decomposition however, all sampling times were equally effective. Therefore, to maximize insect collections from neonate pigs, the method used to sample must vary by decomposition day. The suction trap collected the most species-rich samples, but sticky trap samples were the most diverse, when both species richness and evenness were factored into a Shannon diversity index. Repeated sampling during the noon to 18:00 hours period was most effective to obtain the maximum diversity of trapped insects. The integration of multiple sampling techniques would most effectively sample the necrophilous insect community. However, because all four tested methods were deficient at sampling beetle species, future work should focus on optimizing the most promising methods, alone or in combinations, and incorporate hand-collections of beetles. © The Author(s) 2018. Published by Oxford University Press on behalf of Entomological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
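
    The Shannon diversity index used above combines species richness and evenness; a minimal Python sketch with hypothetical trap counts (the taxon names are placeholders):

```python
import math
from collections import Counter

def shannon_diversity(counts):
    """Shannon index H' = -sum(p_i ln p_i) and Pielou evenness
    J' = H' / ln(richness)."""
    n = sum(counts.values())
    h = -sum((c / n) * math.log(c / n) for c in counts.values() if c > 0)
    richness = sum(1 for c in counts.values() if c > 0)
    evenness = h / math.log(richness) if richness > 1 else 0.0
    return h, evenness

# Hypothetical trap catch: taxon -> number of individuals.
print(shannon_diversity(Counter({"Lucilia": 40, "Phormia": 25, "Sarcophaga": 5})))
```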

  2. Efficient pseudo-random number generation for monte-carlo simulations using graphic processors

    Science.gov (United States)

    Mohanty, Siddhant; Mohanty, A. K.; Carminati, F.

    2012-06-01

    A hybrid approach based on the combination of three Tausworthe generators and one linear congruential generator for pseudo-random number generation for GPU programming, as suggested in the NVIDIA CUDA library, has been used for Monte Carlo sampling. On each GPU thread, a random seed is generated on the fly in a simple way using the quick-and-dirty algorithm, where the mod operation is not performed explicitly because of unsigned integer overflow. Using this hybrid generator, multivariate correlated sampling based on the alias technique has been carried out using both CUDA and OpenCL languages.
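
    A scalar CPU transcription in Python of the hybrid generator described above (three Tausworthe steps XOR-combined with one LCG, following the combination popularized in the CUDA literature); the seed values are illustrative and must respect the minimum-seed constraints of the Tausworthe parameters.

```python
M32 = 0xFFFFFFFF   # emulate unsigned 32-bit overflow

def taus_step(z, s1, s2, s3, m):
    """One Tausworthe step on a 32-bit state word."""
    b = (((z << s1) & M32) ^ z) >> s2
    return (((z & m) << s3) & M32) ^ b

def lcg_step(z):
    """The 'quick and dirty' LCG; overflow replaces an explicit mod."""
    return (1664525 * z + 1013904223) & M32

class HybridTaus:
    """Three Tausworthe generators XOR-combined with one LCG."""
    def __init__(self, z1=12345, z2=12345, z3=12345, z4=12345):
        # z1..z3 must be at least 2, 8 and 16 respectively.
        self.z = [z1, z2, z3, z4]

    def random(self):
        z = self.z
        z[0] = taus_step(z[0], 13, 19, 12, 0xFFFFFFFE)
        z[1] = taus_step(z[1], 2, 25, 4, 0xFFFFFFF8)
        z[2] = taus_step(z[2], 3, 11, 17, 0xFFFFFFF0)
        z[3] = lcg_step(z[3])
        return (z[0] ^ z[1] ^ z[2] ^ z[3]) * 2.3283064365386963e-10  # / 2^32

rng = HybridTaus()
print([round(rng.random(), 6) for _ in range(3)])
```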

  3. Efficient pseudo-random number generation for Monte-Carlo simulations using graphic processors

    International Nuclear Information System (INIS)

    Mohanty, Siddhant; Mohanty, A K; Carminati, F

    2012-01-01

    A hybrid approach based on the combination of three Tausworthe generators and one linear congruential generator for pseudo-random number generation for GPU programming, as suggested in the NVIDIA CUDA library, has been used for Monte Carlo sampling. On each GPU thread, a random seed is generated on the fly in a simple way using the quick-and-dirty algorithm, where the mod operation is not performed explicitly because of unsigned integer overflow. Using this hybrid generator, multivariate correlated sampling based on the alias technique has been carried out using both CUDA and OpenCL languages.

  4. Improving oral hygiene skills by computer-based training: a randomized controlled comparison of the modified Bass and the Fones techniques.

    Directory of Open Access Journals (Sweden)

    Daniela Harnacke

    Full Text Available BACKGROUND: Gingivitis and other plaque-associated diseases have a high prevalence in western communities even though the majority of adults report daily oral hygiene. This indicates a lack of oral hygiene skills. Currently, there is no clear evidence as to which brushing technique would bring about the best oral hygiene skills. While the modified Bass technique is often recommended by dentists and in textbooks, the Fones technique is often recommended in patient brochures. Still, standardized comparisons of the effectiveness of teaching these techniques are lacking. METHODOLOGY/PRINCIPAL FINDINGS: In a final sample of n = 56 students, this multidisciplinary, randomized, examiner-blinded, controlled study compared the effects of parallel and standardized interactive computer presentations teaching either the Fones or the modified Bass technique. A control group was taught the basics of tooth brushing alone. Oral hygiene skills (remaining plaque after thorough oral hygiene) and gingivitis were assessed at baseline and 6, 12, and 28 weeks after the intervention. We found a significant group×time interaction for gingivitis (F(4/102) = 3.267; p = 0.016; ε = 0.957; η² = 0.114) and a significant main effect of group for oral hygiene skills (F(2/51) = 7.088; p = 0.002; η² = 0.218). Fones was superior to Bass; Bass did not differ from the control group. Group differences were most prominent after 6 and 12 weeks. CONCLUSIONS/SIGNIFICANCE: The present trial indicates an advantage of teaching the Fones as compared to the modified Bass technique with respect to oral hygiene skills and gingivitis. Future studies are needed to analyze whether the disadvantage of teaching the Bass technique observed here is restricted to the teaching method employed. TRIAL REGISTRATION: German Clinical Trials Register DRKS00003488.

  5. Multi-element analysis of lubricant oil by WDXRF technique using thin-film sample preparation

    International Nuclear Information System (INIS)

    Scapin, M. A.; Salvador, V. L. R.; Lopes, C. D.; Sato, I. M.

    2006-01-01

    The quantitative analysis of chemical elements in matrices like oils or gels represents a challenge for analytical chemists. Classical methods and instrumental techniques such as atomic absorption spectrometry (AAS) and plasma optical emission spectrometry (ICP-OES) need chemical treatments, mainly sample dissolution and degradation processes. The X-ray fluorescence technique allows a direct, multi-element analysis without previous sample treatment. In this work, a sensitive method for the determination of the elements Mg, Al, Si, P, Ca, Ti, V, Cr, Mn, Fe, Ni, Cu, Zn, Mo, Ag, Sn, Ba and Pb in lubricating oil is presented. The wavelength dispersive X-ray fluorescence (WDXRF) technique was used with a linear regression method and thin-film sample preparation. Validation of the methodology (repeatability and accuracy) was obtained by analysis of the standard reference material SRM Alpha AESAR lot 703527D, applying the Chauvenet, Cochrane, ANOVA and Z-score statistical tests. The method presents a relative standard deviation lower than 10% for all the elements, except for the Pb determination (RSD for Pb: 15%). The Z-score values for all the elements were in the range -2 < Z < 2, indicating very good accuracy. (Full text)
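
    A minimal sketch of the Z-score accuracy test mentioned above, in one common formulation (measured minus reference, scaled by a target standard deviation); the values are hypothetical, not the paper's data.

```python
def z_score(measured, reference, sigma):
    """Accuracy test against a certified value: Z = (x - x_ref) / sigma;
    |Z| <= 2 is conventionally taken as satisfactory."""
    return (measured - reference) / sigma

# Hypothetical result vs certified value with a target standard deviation.
print(round(z_score(105.0, 100.0, 4.0), 2))   # 1.25 -> within -2 < Z < 2
```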

  6. Human mixed lymphocyte cultures. Evaluation of microculture technique utilizing the multiple automated sample harvester (MASH)

    Science.gov (United States)

    Thurman, G. B.; Strong, D. M.; Ahmed, A.; Green, S. S.; Sell, K. W.; Hartzman, R. J.; Bach, F. H.

    1973-01-01

    Use of lymphocyte cultures for in vitro studies such as pretransplant histocompatibility testing has established the need for standardization of this technique. A microculture technique has been developed that facilitates the culturing of lymphocytes and increases the number of feasible cultures, while lowering the variation between replicate samples. Cultures were prepared for determination of tritiated thymidine incorporation using a Multiple Automated Sample Harvester (MASH). Using this system, the parameters that influence the in vitro responsiveness of human lymphocytes to allogeneic lymphocytes have been investigated. PMID:4271568

  7. [Influence of Natural Dissolved Organic Matter on the Passive Sampling Technique and its Application].

    Science.gov (United States)

    Yu, Shang-yun; Zhou, Yan-mei

    2015-08-01

    This paper studied the effects of different concentrations of natural dissolved organic matter (DOM) on the passive sampling technique. The results showed that the presence of DOM affected the organic pollutant adsorption ability of the membrane. For lgK(OW) values of 3-5, DOM had little impact on the adsorption of organic matter by the membrane; for lgK(OW) > 5.5, DOM significantly increased the adsorption capacity of the membrane. Meanwhile, the LDPE passive sampling technique was applied to monitor PAHs and PAEs in the pore water of three surface sediments in the Taizi River. All of the target pollutants were detected to varying degrees at each sampling point. Finally, the quotient method was used to assess the ecological risks of PAHs and PAEs. The results showed that fluoranthene exceeded the reference value for the aquatic ecosystem, meaning there was a substantial ecological risk.

  8. Modern survey sampling

    CERN Document Server

    Chaudhuri, Arijit

    2014-01-01

    Exposure to Sampling: Abstract; Introduction; Concepts of Population, Sample, and Sampling. Initial Ramifications: Abstract; Introduction; Sampling Design, Sampling Scheme; Random Numbers and Their Uses in Simple Random Sampling (SRS); Drawing Simple Random Samples with and without Replacement; Estimation of Mean, Total, Ratio of Totals/Means: Variance and Variance Estimation; Determination of Sample Sizes. Appendix to Chapter 2: More on Equal Probability Sampling; Horvitz-Thompson Estimator; Sufficiency; Likelihood; Non-Existence Theorem. More Intricacies: Abstract; Introduction; Unequal Probability Sampling Strategies; PPS Sampling. Exploring Improved Ways: Abstract; Introduction; Stratified Sampling; Cluster Sampling; Multi-Stage Sampling; Multi-Phase Sampling: Ratio and Regression Estimation; Controlled Sampling. Modeling: Introduction; Super-Population Modeling; Prediction Approach; Model-Assisted Approach; Bayesian Methods; Spatial Smoothing; Sampling on Successive Occasions: Panel Rotation; Non-Response and Not-at-Homes; Weighting Adj...

  9. Root coverage with connective tissue graft associated with coronally advanced flap or tunnel technique: a randomized, double-blind, mono-centre clinical trial

    NARCIS (Netherlands)

    Azaripour, Adriano; Kissinger, Maren; Farina, Vittorio Siro Leone; van Noorden, Cornelis J. F.; Gerhold-Ay, Aslihan; Willershausen, Brita; Cortellini, Pierpaolo

    2016-01-01

    Aim: The aim of this randomized clinical trial was to compare the coronally advanced flap (CAF) with the modified microsurgical tunnel technique (MMTT) for treatment of Miller class I and II recessions. Material and Methods: Forty patients with 71 gingival recessions were recruited and randomly

  10. Evaluation of the Gow-Gates and Vazirani-Akinosi techniques in patients with symptomatic irreversible pulpitis: a prospective randomized study.

    Science.gov (United States)

    Click, Vivian; Drum, Melissa; Reader, Al; Nusstein, John; Beck, Mike

    2015-01-01

    Few studies have evaluated the effectiveness of the Gow-Gates and Vazirani-Akinosi techniques in patients presenting with symptomatic irreversible pulpitis. Therefore, the purpose of this prospective, randomized study was to evaluate the anesthetic efficacy of the Gow-Gates and Vazirani-Akinosi techniques using 3.6 mL 2% lidocaine with 1:100,000 epinephrine in mandibular posterior teeth in patients presenting with symptomatic irreversible pulpitis. One hundred twenty-five emergency patients diagnosed with symptomatic irreversible pulpitis randomly received either a Gow-Gates or Vazirani-Akinosi injection using 3.6 mL 2% lidocaine with 1:100,000 epinephrine to block the inferior alveolar nerve before endodontic access. Subjective lip numbness was recorded. Pulpal anesthetic success of the injection was defined as no pain or mild pain upon endodontic access and instrumentation as measured on a visual analog scale. Subjective lip numbness was obtained 92% of the time with the Gow-Gates technique and 63% of the time with the Vazirani-Akinosi technique. The difference was statistically significant (P = .0001). For the patients achieving lip numbness, successful pulpal anesthesia was obtained 35% of the time with the Gow-Gates technique and 16% of the time with the Vazirani-Akinosi technique. The difference was statistically significant (P = .0381). We concluded that for patients who achieved lip numbness neither the Gow-Gates technique nor the Vazirani-Akinosi technique provided adequate pulpal anesthesia for mandibular posterior teeth in patients presenting with symptomatic irreversible pulpitis. Both injections would require supplemental anesthesia. Copyright © 2015 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.

  11. Heating and thermal control of brazing technique to break contamination path for potential Mars sample return

    Science.gov (United States)

    Bao, Xiaoqi; Badescu, Mircea; Sherrit, Stewart; Bar-Cohen, Yoseph; Campos, Sergio

    2017-04-01

    The potential return of Mars sample material is of great interest to the planetary science community, as it would enable extensive analysis of samples with highly sensitive laboratory instruments. It is important to ensure that such a mission concept would not bring any living microbes, which may possibly exist on Mars, back to Earth's environment. In order to ensure the isolation of Mars microbes from Earth's atmosphere, a brazing sealing and sterilizing technique was proposed to break the Mars-to-Earth contamination path. Effectively heating the brazing zone in high vacuum and controlling the sample temperature for integrity are key challenges to the implementation of this technique. The break-the-chain procedures for the container configurations under consideration were simulated by multi-physics finite element models. Different heating methods, including induction and resistive/radiation heating, were evaluated. The temperature profiles of Martian samples in a proposed container structure were predicted. The results show that the sealing and sterilizing process can be controlled such that the sample temperature is maintained below the level that may cause damage, and that the brazing technique is a feasible approach to breaking the contamination path.

  12. Texture investigation in aluminium and iron - silicon samples by neutron diffraction technique

    International Nuclear Information System (INIS)

    Pugliese, R.; Yamasaki, J.M.

    1988-09-01

    By means of the neutron diffraction technique, the texture of 5% and 98% rolled aluminium and of the iron-silicon steel used in the cores of electric transformers has been determined. The measurements were performed using a neutron diffractometer installed at Beam-Hole No. 6 of the IEA-R1 Nuclear Research Reactor. To avoid corrections such as those for neutron absorption and sample luminosity, the geometric forms of the samples were approximated to spheres or octagonal prisms, with dimensions not exceeding that of the neutron beam. The texture of the samples was analysed with the help of a computer programme that analyses the intensity of the diffracted neutron beam and plots the pole figures. (author) [pt

  13. Effects of pushing techniques during the second stage of labor: A randomized controlled trial.

    Science.gov (United States)

    Koyucu, Refika Genç; Demirci, Nurdan

    2017-10-01

    Spontaneous pushing is a method used in the management of the second stage of labor and suggested to be more physiological for the mother and infant. The present study aims to evaluate the effects of pushing techniques on the mother and newborn. This randomized prospective study was performed between June 2013 and March 2014 in a tertiary maternity clinic in Istanbul. Eighty low-risk, nulliparous cases were randomized to the pushing groups. The Valsalva pushing group was told to hold their breath while pushing. No visual or verbal instructions were given to the spontaneous pushing group, and they were encouraged to push without preventing respiration. Demographic data, second stage duration, perineal laceration rates, fetal heart rate patterns, presence of meconium-stained amniotic fluid, newborn APGAR scores, POP-Q examination and Q-tip test results were evaluated in these cases. The second stage of labor was significantly longer with spontaneous pushing. The decrease in Hb levels in the Valsalva pushing group was determined to be greater than in the spontaneous pushing group. Increased urethral mobility was observed in the Valsalva pushing group. Although the duration of the second stage of labor was longer compared to the Valsalva pushing technique, women were able to give birth without requiring any verbal or visual instruction, without exceeding the limit value of two hours and without affecting fetal wellness and neonatal results. Copyright © 2017. Published by Elsevier B.V.

  14. The particle analysis based on FT-TIMS technique for swipe sample under the frame of nuclear safeguard

    International Nuclear Information System (INIS)

    Yang Tianli; Liu Xuemei; Liu Zhao; Tang Lei; Long Kaiming

    2008-06-01

    Under the framework of nuclear safeguards, particle analysis of swipe samples is an advanced means of detecting undeclared uranium enrichment facilities and undeclared uranium enrichment activity. A particle analysis technique based on fission track-thermal ionization mass spectrometry (FT-TIMS) for swipe samples has been established. The reliability of, and the experimental background for, selecting uranium-bearing particles from swipe samples by the FT method have been verified. In addition, the utilization coefficient of particles on the surface of the swipe sample has also been tested. This work provides technical support for application in the area of nuclear verification. (authors)

  15. Refinement of NMR structures using implicit solvent and advanced sampling techniques.

    Science.gov (United States)

    Chen, Jianhan; Im, Wonpil; Brooks, Charles L

    2004-12-15

    NMR biomolecular structure calculations exploit simulated annealing methods for conformational sampling and require a relatively high level of redundancy in the experimental restraints to determine quality three-dimensional structures. Recent advances in generalized Born (GB) implicit solvent models should make it possible to combine information from both experimental measurements and accurate empirical force fields to improve the quality of NMR-derived structures. In this paper, we study the influence of implicit solvent on the refinement of protein NMR structures and identify an optimal protocol of utilizing these improved force fields. To do so, we carry out structure refinement experiments for model proteins with published NMR structures using full NMR restraints and subsets of them. We also investigate the application of advanced sampling techniques to NMR structure refinement. Similar to the observations of Xia et al. (J.Biomol. NMR 2002, 22, 317-331), we find that the impact of implicit solvent is rather small when there is a sufficient number of experimental restraints (such as in the final stage of NMR structure determination), whether implicit solvent is used throughout the calculation or only in the final refinement step. The application of advanced sampling techniques also seems to have minimal impact in this case. However, when the experimental data are limited, we demonstrate that refinement with implicit solvent can substantially improve the quality of the structures. In particular, when combined with an advanced sampling technique, the replica exchange (REX) method, near-native structures can be rapidly moved toward the native basin. The REX method provides both enhanced sampling and automatic selection of the most native-like (lowest energy) structures. An optimal protocol based on our studies first generates an ensemble of initial structures that maximally satisfy the available experimental data with conventional NMR software using a simplified
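
    For context, the replica exchange (REX) acceptance step at the heart of the method can be sketched as follows; the temperatures and energies are illustrative, and this is the generic Metropolis swap criterion, not the authors' refinement protocol.

```python
import math
import random

def rex_swap_accepted(e_i, e_j, t_i, t_j, k_b=1.0):
    """Metropolis criterion for swapping replicas at temperatures
    t_i < t_j with potential energies e_i, e_j:
    accept with min(1, exp[(beta_i - beta_j) * (e_i - e_j)])."""
    delta = (1.0 / (k_b * t_i) - 1.0 / (k_b * t_j)) * (e_i - e_j)
    return delta >= 0 or random.random() < math.exp(delta)

print(rex_swap_accepted(-120.0, -100.0, 300.0, 350.0))
```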

  16. Random Decrement Based FRF Estimation

    DEFF Research Database (Denmark)

    Brincker, Rune; Asmussen, J. C.

    to speed and quality. The basis of the new method is the Fourier transformation of the Random Decrement functions which can be used to estimate the frequency response functions. The investigations are based on load and response measurements of a laboratory model of a 3 span bridge. By applying both methods...... that the Random Decrement technique is based on a simple controlled averaging of time segments of the load and response processes. Furthermore, the Random Decrement technique is expected to produce reliable results. The Random Decrement technique will reduce leakage, since the Fourier transformation...

  17. Random Decrement Based FRF Estimation

    DEFF Research Database (Denmark)

    Brincker, Rune; Asmussen, J. C.

    1997-01-01

    to speed and quality. The basis of the new method is the Fourier transformation of the Random Decrement functions which can be used to estimate the frequency response functions. The investigations are based on load and response measurements of a laboratory model of a 3 span bridge. By applying both methods...... that the Random Decrement technique is based on a simple controlled averaging of time segments of the load and response processes. Furthermore, the Random Decrement technique is expected to produce reliable results. The Random Decrement technique will reduce leakage, since the Fourier transformation...

  18. Fixed-location hydroacoustic monitoring designs for estimating fish passage using stratified random and systematic sampling

    International Nuclear Information System (INIS)

    Skalski, J.R.; Hoffman, A.; Ransom, B.H.; Steig, T.W.

    1993-01-01

    Five alternative sampling designs are compared using 15 d of 24-h continuous hydroacoustic data to identify the most favorable approach to fixed-location hydroacoustic monitoring of salmonid outmigrants. Four alternative approaches to systematic sampling are compared among themselves and with stratified random sampling (STRS). Stratifying systematic sampling (STSYS) on a daily basis is found to reduce sampling error in multiday monitoring studies. Although sampling precision was predictable with varying levels of effort in STRS, neither the magnitude nor the direction of change in precision was predictable when effort was varied in systematic sampling (SYS). Furthermore, modifying systematic sampling to include replicated (e.g., nested) sampling (RSYS) is further shown to provide unbiased point and variance estimates, as does STRS. Numerous short sampling intervals (e.g., 12 samples of 1-min duration per hour) must be monitored hourly using RSYS to provide efficient, unbiased point and interval estimates. For equal levels of effort, STRS outperformed all variations of SYS examined. Parametric approaches to confidence interval estimation are found to be superior to nonparametric interval estimates (i.e., bootstrap and jackknife) in estimating total fish passage. 10 refs., 1 fig., 8 tabs
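
    A minimal Python sketch of the stratified random sampling (STRS) estimator referred to above: an unbiased total with its design-based standard error, computed per stratum; the strata and counts are hypothetical.

```python
import math

def stratified_total(strata):
    """Unbiased total and its standard error under stratified random
    sampling: strata is a list of (N_h, sampled values), n_h >= 2."""
    total, var = 0.0, 0.0
    for n_big, y in strata:
        n = len(y)
        ybar = sum(y) / n
        s2 = sum((v - ybar) ** 2 for v in y) / (n - 1)
        total += n_big * ybar
        var += n_big ** 2 * (1 - n / n_big) * s2 / n   # with fpc
    return total, math.sqrt(var)

# Hypothetical hourly strata: 60 one-minute intervals each, 3 sampled.
print(stratified_total([(60, [5, 8, 6]), (60, [12, 9, 15])]))
```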

  19. Geospatial techniques for developing a sampling frame of watersheds across a region

    Science.gov (United States)

    Gresswell, Robert E.; Bateman, Douglas S.; Lienkaemper, George; Guy, T.J.

    2004-01-01

    Current land-management decisions that affect the persistence of native salmonids are often influenced by studies of individual sites that are selected based on judgment and convenience. Although this approach is useful for some purposes, extrapolating results to areas that were not sampled is statistically inappropriate because the sampling design is usually biased. Therefore, in recent investigations of coastal cutthroat trout (Oncorhynchus clarki clarki) located above natural barriers to anadromous salmonids, we used a methodology for extending the statistical scope of inference. The purpose of this paper is to apply geospatial tools to identify a population of watersheds and develop a probability-based sampling design for coastal cutthroat trout in western Oregon, USA. The population of mid-size watersheds (500-5800 ha) west of the Cascade Range divide was derived from watershed delineations based on digital elevation models. Because a database with locations of isolated populations of coastal cutthroat trout did not exist, a sampling frame of isolated watersheds containing cutthroat trout had to be developed. After the sampling frame of watersheds was established, isolated watersheds with coastal cutthroat trout were stratified by ecoregion and erosion potential based on dominant bedrock lithology (i.e., sedimentary and igneous). A stratified random sample of 60 watersheds was selected with proportional allocation in each stratum. By comparing the watershed drainage areas of streams in the general population to those in the sampling frame and the resulting sample (n = 60), we were able to evaluate how representative the subset of watersheds was in relation to the population of watersheds. Geospatial tools provided a relatively inexpensive means of generating the information necessary to develop a statistically robust, probability-based sampling design.

  20. Dosimetric characterization of BeO samples in alpha, beta and X radiation beams using luminescent techniques

    International Nuclear Information System (INIS)

    Groppo, Daniela Piai

    2013-01-01

    In the medical field, ionizing radiation is used for both therapeutic and diagnostic purposes, over a wide range of radiation doses. In order to ensure that the objective is achieved in practice, detailed studies of detectors and devices in different types of radiation beams are necessary. In this work, a dosimetric characterization of BeO samples was performed using the techniques of thermoluminescence (TL) and optically stimulated luminescence (OSL), by comparing their responses to alpha, beta and X radiation and establishing an appropriate system for use in monitoring these radiation beams. The main results are: high sensitivity to beta radiation for both techniques; good reproducibility of the TL and OSL responses (coefficients of variation lower than 5%); a maximum X-ray energy dependence of 28% for the TL technique, and only 7% for the OSL technique, within the studied energy range. The dosimetric characteristics obtained in this work show the possibility of applying BeO samples to the dosimetry of alpha, beta and X radiation, considering the studied dose ranges, using the TL and OSL techniques. From the results obtained, the BeO samples showed their potential for beam dosimetry in diagnostic radiology and radiotherapy. (author)

  1. At convenience and systematic random sampling: effects on the prognostic value of nuclear area assessments in breast cancer patients.

    Science.gov (United States)

    Jannink, I; Bennen, J N; Blaauw, J; van Diest, P J; Baak, J P

    1995-01-01

    This study compares the influence of two different nuclear sampling methods on the prognostic value of assessments of the mean and standard deviation of nuclear area (MNA, SDNA) in 191 consecutive invasive breast cancer patients with long-term follow-up. The first sampling method used was 'at convenience' sampling (ACS); the second, systematic random sampling (SRS). Both sampling methods were tested with a sample size of 50 nuclei (ACS-50 and SRS-50). To determine whether, besides the sampling method, sample size had an impact on prognostic value as well, the SRS method was also tested using a sample size of 100 nuclei (SRS-100). SDNA values were systematically lower for ACS, obviously due to (unconsciously) not including small and large nuclei. Testing the prognostic value of a series of cut-off points, the MNA and SDNA values assessed by the SRS method were prognostically significantly stronger than the values obtained by the ACS method. This was confirmed in Cox regression analysis. For the MNA, the Mantel-Cox p-values from SRS-50 and SRS-100 measurements were not significantly different. However, for the SDNA, SRS-100 yielded significantly lower p-values than SRS-50. In conclusion, compared with the 'at convenience' nuclear sampling method, systematic random sampling of nuclei is not only superior with respect to reproducibility of results, but also provides better prognostic value in patients with invasive breast cancer.
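
    Systematic random sampling of the kind favored above can be sketched in a few lines of Python: a fixed sampling interval with a random start removes the selector's freedom to favor conspicuous nuclei; the data are placeholders.

```python
import random

def systematic_sample(items, n, seed=0):
    """Systematic random sampling: fixed interval k = len(items)/n with a
    random start, removing the selector's freedom to favor (even
    unconsciously) conspicuous items."""
    random.seed(seed)
    k = len(items) / n
    start = random.uniform(0, k)
    return [items[int(start + i * k)] for i in range(n)]

nuclei_areas = list(range(1000))   # stand-in for nuclei in scan order
print(systematic_sample(nuclei_areas, 5))
```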

  2. Long-term monitoring of the Danube river-Sampling techniques, radionuclide metrology and radioecological assessment

    International Nuclear Information System (INIS)

    Maringer, F.J.; Gruber, V.; Hrachowitz, M.; Baumgartner, A.; Weilner, S.; Seidel, C.

    2009-01-01

    Sampling techniques and radiometric methods developed and applied in a comprehensive radioecological study of the Danube River are presented. Results and radiometric data are shown and discussed for sediment samples collected by sediment traps in Austria and, additionally, by grab sampling in the Danube during research cruises between Germany and the delta (Black Sea). The goal of the investigation is the protection of the public and the environment, especially the sustainable use and conservation of human freshwater resources, against harmful radioactive exposure.

  3. Separation of arsenic species by capillary electrophoresis with sample-stacking techniques

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Zu Liang; Naidu, Ravendra [Adelaide Laboratory, CSIRO Land and Water, PMB2, 5064, Glen Osmond, SA (Australia); Lin, Jin-Ming [Research Center for Eco-Environmental Sciences, Chinese Academy of Sciences, P.O. Box 2871, 100085, Beijing (China)

    2003-03-01

    A simple capillary zone electrophoresis (CZE) procedure was developed for the separation of arsenic species (AsO₂²⁻, AsO₄²⁻, and dimethylarsinic acid, DMA). Both counter-electroosmotic and co-electroosmotic (EOF) modes were investigated for the separation of arsenic species with direct UV detection at 185 nm, using 20 mmol L⁻¹ sodium phosphate as the electrolyte. The separation selectivity depends mainly on the separation mode and the electrolyte pH. Inorganic anions (Cl⁻, NO₂⁻, NO₃⁻ and SO₄²⁻) present in real samples did not interfere with arsenic speciation in either separation mode. To improve the detection limits, sample-stacking techniques, including large-volume sample stacking (LVSS) and field-amplified sample injection (FASI), were investigated for the preconcentration of As species in the co-electroosmotic CZE mode. Detection limits below 1 μmol L⁻¹ were achieved for the As species using FASI. The proposed method was demonstrated for the separation and detection of As species in water. (orig.)

  4. Decomposition and (importance) sampling techniques for multi-stage stochastic linear programs

    Energy Technology Data Exchange (ETDEWEB)

    Infanger, G.

    1993-11-01

    The difficulty of solving large-scale multi-stage stochastic linear programs arises from the sheer number of scenarios associated with numerous stochastic parameters. The number of scenarios grows exponentially with the number of stages, and problems easily get out of hand even for very moderate numbers of stochastic parameters per stage. Our method combines dual (Benders) decomposition with Monte Carlo sampling techniques. We employ importance sampling to efficiently obtain accurate estimates of the expected future costs as well as of the gradients and right-hand sides of cuts. The method enables us to solve practical large-scale problems with many stages and numerous stochastic parameters per stage. We discuss the theory of sharing and adjusting cuts between different scenarios in a stage. We derive probabilistic lower and upper bounds, using importance path sampling for the upper bound estimation. Initial numerical results are promising.
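
    The estimation idea at the core of the method, stripped of the decomposition machinery, is plain importance sampling: draw scenarios from a proposal distribution that emphasizes the costly ones and reweight each draw. A minimal Python sketch with a toy scenario set and cost function (not the paper's stochastic program) follows.

        import random

        def is_estimate(cost, p, q, draw_q, n, seed=0):
            """Importance-sampling estimate of E_p[cost(X)]: draw scenarios
            from the proposal pmf q and reweight each draw by p(x)/q(x)."""
            rng = random.Random(seed)
            total = 0.0
            for _ in range(n):
                x = draw_q(rng)
                total += cost(x) * p(x) / q(x)
            return total / n

        # Toy scenario set 0..9, uniform true distribution, cost concentrated
        # on high scenarios; q oversamples exactly those, cutting the variance.
        cost = lambda x: x ** 2
        p = lambda x: 0.1
        q = lambda x: (x + 1) / 55.0                      # sum of (x+1) over 0..9 is 55
        draw_q = lambda rng: rng.choices(range(10), weights=range(1, 11))[0]
        print(is_estimate(cost, p, q, draw_q, n=10000))   # approx E_p[X^2] = 28.5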

  5. Random selection of items. Selection of n1 samples among N items composing a stratum

    International Nuclear Information System (INIS)

    Jaech, J.L.; Lemaire, R.J.

    1987-02-01

    STR-224 provides generalized procedures to determine required sample sizes, for instance in the course of a Physical Inventory Verification at Bulk Handling Facilities. The present report describes procedures to generate random numbers and select groups of items to be verified in a given stratum through each of the measurement methods involved in the verification. (author). 3 refs
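
    A minimal sketch of the selection step described in the report, drawing n1 distinct items at random from the N items composing a stratum (Python; the seed is only there to make a draw reproducible):

        import random

        def select_items(N, n1, seed=None):
            """Return n1 distinct item indices from 1..N, every subset of
            size n1 being equally likely (simple random sampling)."""
            rng = random.Random(seed)
            return sorted(rng.sample(range(1, N + 1), n1))

        # e.g. pick 25 of the 400 items in a stratum for verification
        print(select_items(N=400, n1=25, seed=42))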

  6. Monitoring of persistent organic pollutants in seawater of the Pearl River Estuary with rapid on-site active SPME sampling technique

    International Nuclear Information System (INIS)

    Huang, Siming; He, Shuming; Xu, Hao; Wu, Peiyan; Jiang, Ruifen; Zhu, Fang; Luan, Tiangang; Ouyang, Gangfeng

    2015-01-01

    An on-site active solid-phase microextraction (SPME) sampling technique coupled with gas chromatography-mass spectrometry (GC–MS) was developed for sampling and monitoring 16 polycyclic aromatic hydrocarbons (PAHs) and 8 organochlorine pesticides (OCPs) in seawater. Laboratory experiments demonstrated that the sampling-rate calibration method was practical and could be used for the quantification of on-site sampling. The proposed method was employed for field tests covering a large number of water samples in the Pearl River Estuary in the rainy and dry seasons. The on-site SPME sampling method avoids sample contamination and losses of analytes during sample transportation, as well as solvent usage and a time-consuming sample preparation process. Results indicated that the technique, with the designed device, can address the requirements of modern environmental water analysis. In addition, the sources, bioaccumulation and potential risk to humans of the PAHs and OCPs in seawater of the Pearl River Estuary were discussed. - Highlights: • An SPME on-site active sampling technique was developed and validated. • The technique was employed for field tests in the Pearl River Estuary. • 16 PAHs and 8 OCPs in the seawater of the Pearl River Estuary were monitored. • The potential risk of the PAHs and OCPs in the Pearl River Estuary was discussed. - An on-site active SPME sampling technique was developed and successfully applied for sampling and monitoring 16 PAHs and 8 OCPs in the Pearl River Estuary

  7. Prompt Gamma Activation Analysis (PGAA): Technique of choice for nondestructive bulk analysis of returned comet samples

    International Nuclear Information System (INIS)

    Lindstrom, D.J.; Lindstrom, R.M.

    1989-01-01

    Prompt gamma activation analysis (PGAA) is a well-developed analytical technique. The technique involves irradiation of samples in an external neutron beam from a nuclear reactor, with simultaneous counting of gamma rays produced in the sample by neutron capture. Capture of neutrons leads to excited nuclei which decay immediately with the emission of energetic gamma rays to the ground state. PGAA has several advantages over other techniques for the analysis of cometary materials: (1) It is nondestructive; (2) It can be used to determine abundances of a wide variety of elements, including most major and minor elements (Na, Mg, Al, Si, P, K, Ca, Ti, Cr, Mn, Fe, Co, Ni), volatiles (H, C, N, F, Cl, S), and some trace elements (those with high neutron capture cross sections, including B, Cd, Nd, Sm, and Gd); and (3) It is a true bulk analysis technique. Recent developments should improve the technique's sensitivity and accuracy considerably

  8. Classification of Phishing Email Using Random Forest Machine Learning Technique

    Directory of Open Access Journals (Sweden)

    Andronicus A. Akinyelu

    2014-01-01

    Full Text Available Phishing is one of the major challenges faced by the world of e-commerce today. Because of phishing attacks, billions of dollars have been lost by many companies and individuals. In 2012, an online report put the loss due to phishing attacks at about $1.5 billion. The global impact of phishing attacks will continue to increase and thus requires more efficient phishing detection techniques to curb the menace. This paper investigates and reports the use of the random forest machine learning algorithm in the classification of phishing attacks, with the major objective of developing an improved phishing email classifier with better prediction accuracy and fewer features. From a dataset consisting of 2000 phishing and ham emails, a set of prominent phishing email features (identified from the literature) were extracted and used by the machine learning algorithm, with a resulting classification accuracy of 99.7% and low false negative (FN) and false positive (FP) rates.
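
    A hedged sketch of this kind of classification setup, using scikit-learn's random forest on a stand-in feature matrix (the paper's actual email features are not reproduced here; random data is used purely to show the workflow):

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.metrics import accuracy_score, confusion_matrix
        from sklearn.model_selection import train_test_split

        # Stand-in data: one row per email, one column per extracted feature,
        # label 1 = phishing, 0 = ham (random numbers, workflow only)
        rng = np.random.default_rng(0)
        X = rng.random((2000, 15))
        y = rng.integers(0, 2, 2000)

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
        clf = RandomForestClassifier(n_estimators=100, random_state=0)
        clf.fit(X_tr, y_tr)

        pred = clf.predict(X_te)
        print(accuracy_score(y_te, pred))
        print(confusion_matrix(y_te, pred))   # off-diagonal cells are the FP/FN counts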

  9. Experimental study of laser ablation as sample introduction technique for inductively coupled plasma-mass spectrometry

    International Nuclear Information System (INIS)

    Van Winckel, S.

    2001-01-01

    The contribution consists of an abstract of a PhD thesis. In the PhD study, several complementary applications of laser ablation were investigated in order to experimentally characterise laser ablation (LA) as a sample introduction technique for ICP-MS. Three applications of LA as a sample introduction technique are discussed: (1) the microchemical analysis of the patina of weathered marble; (2) the possibility of measuring isotope ratios (in particular Pb isotope ratios in archaeological bronze artefacts); and (3) the determination of Si in Al as part of a dosimetric study of the BR2 reactor vessel.

  10. Visualization techniques for spatial probability density function data

    Directory of Open Access Journals (Sweden)

    Udeepta D Bordoloi

    2006-01-01

    Full Text Available Novel visualization methods are presented for spatial probability density function data. These are spatial datasets where each pixel is a random variable that has multiple samples, which are the results of experiments on that random variable. We use clustering as a means to reduce the information contained in these datasets, and present two different ways of interpreting and clustering the data. The clustering methods are used on two datasets, and the results are discussed with the help of visualization techniques designed for spatial probability data.

  11. Randomized comparison of vaginal self-sampling by standard vs. dry swabs for Human papillomavirus testing

    International Nuclear Information System (INIS)

    Eperon, Isabelle; Vassilakos, Pierre; Navarria, Isabelle; Menoud, Pierre-Alain; Gauthier, Aude; Pache, Jean-Claude; Boulvain, Michel; Untiet, Sarah; Petignat, Patrick

    2013-01-01

    To evaluate whether human papillomavirus (HPV) self-sampling (Self-HPV) using a dry vaginal swab is a valid alternative for HPV testing, women attending a colposcopy clinic were recruited to collect two consecutive Self-HPV samples: a Self-HPV using a dry swab (S-DRY) and a Self-HPV using a standard wet transport medium (S-WET). These samples were analyzed for HPV using real-time PCR (Roche Cobas). Participants were randomized to determine the order of the tests. Questionnaires assessing preferences and acceptability for both tests were conducted. Subsequently, women were invited for colposcopic examination; a physician collected a cervical sample (physician sampling) with a broom-type device and placed it into a liquid-based cytology medium. Specimens were then processed for the production of cytology slides, and a Hybrid Capture (HC) HPV DNA test (Qiagen) was performed on the residual liquid. Biopsies were performed if indicated. Unweighted kappa statistics (κ) and McNemar tests were used to measure agreement among the sampling methods. A total of 120 women were randomized. Overall HPV prevalence was 68.7% (95% Confidence Interval (CI) 59.3–77.2) by S-WET, 54.4% (95% CI 44.8–63.9) by S-DRY and 53.8% (95% CI 43.8–63.7) by HC. Among paired samples (S-WET and S-DRY), the overall agreement was good (85.7%; 95% CI 77.8–91.6) and the κ was substantial (0.70; 95% CI 0.57-0.70). The proportion of positive type-specific HPV agreement was also good (77.3%; 95% CI 68.2-84.9). No differences in sensitivity for cervical intraepithelial neoplasia grade one (CIN1) or worse were observed between the two Self-HPV tests. Women reported the two Self-HPV tests as highly acceptable. Self-HPV using dry swab transfer does not appear to compromise specimen integrity. Further study in a large screening population is needed. ClinicalTrials.gov: http://clinicaltrials.gov/show/NCT01316120
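
    The agreement statistic used above is easy to reproduce; the following sketch computes unweighted Cohen's kappa for paired binary test results on toy data (not the study's data):

        import numpy as np

        def unweighted_kappa(a, b):
            """Unweighted Cohen's kappa for two paired binary test results
            (e.g., S-WET vs S-DRY HPV positivity, 1 = HPV positive)."""
            a, b = np.asarray(a), np.asarray(b)
            po = np.mean(a == b)                      # observed agreement
            pe = (a.mean() * b.mean()                 # chance agreement, both positive
                  + (1 - a).mean() * (1 - b).mean())  # ...plus both negative
            return (po - pe) / (1 - pe)

        # Toy paired results for 10 women (illustrative only)
        s_wet = [1, 1, 1, 0, 1, 0, 1, 1, 0, 1]
        s_dry = [1, 1, 0, 0, 1, 0, 1, 1, 0, 0]
        print(unweighted_kappa(s_wet, s_dry))   # 0.6 on this toy data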

  12. Microdamage of the cortical bone during mini-implant insertion with self-drilling and self-tapping techniques: a randomized controlled trial.

    Science.gov (United States)

    Yadav, Sumit; Upadhyay, Madhur; Liu, Sean; Roberts, Eugene; Neace, William P; Nanda, Ravindra

    2012-05-01

    The purpose of this research was to evaluate microdamage accumulation after mini-implant placement by the self-drilling (without a pilot hole) and self-tapping (screwed into a pilot hole) insertion techniques. The null hypothesis was that the mini-implant insertion technique would have no influence on microcrack accumulation and propagation in the cortical bones of the maxillae and mandibles of adult hounds. Mini-implants (n = 162; diameter, 1.6 mm; length, 6 mm) were placed in the maxillae and mandibles of 9 hounds (12-14 months old) with the self-drilling and self-tapping insertion techniques. The techniques were randomly assigned to the left or the right side of each jaw. Each hound received 18 mini-implants (10 in the mandible, 8 in the maxilla). Histomorphometric parameters including total crack length and crack surface density were measured. The null hypothesis was rejected in favor of an alternate hypothesis: that the self-drilling technique results in more microdamage (microcrack) accumulation in the adjacent cortical bone in both the maxilla and the mandible immediately after mini-implant placement. A cluster-level analysis was used to analyze the data on the outcome measured. Since the measurements were clustered within dogs, a paired-samples t test was used to analyze the average differences between insertion methods at both jaw locations. A significance level of 0.05 was used for both analyses. The self-drilling technique resulted in greater total crack lengths in both the maxilla and the mandible (maxilla: mean difference, 18.70 ± 7.04 μm/mm²; CI, 13.29-24.11; mandible: mean difference, 22.98 ± 6.43 μm/mm²; CI, 18.04-27.93; P < .05); that is, significantly more microdamage was created in the cortical bone of the hounds in both the maxilla and the mandible by the self-drilling insertion technique compared with the self-tapping technique. Copyright © 2012 American Association of Orthodontists. Published by Mosby, Inc. All rights reserved.

  13. A Pragmatic Randomized Controlled Trial of 6-Step vs 3-Step Hand Hygiene Technique in Acute Hospital Care in the United Kingdom.

    Science.gov (United States)

    Reilly, Jacqui S; Price, Lesley; Lang, Sue; Robertson, Chris; Cheater, Francine; Skinner, Kirsty; Chow, Angela

    2016-06-01

    OBJECTIVE To evaluate the microbiologic effectiveness of the World Health Organization's 6-step and the Centers for Disease Control and Prevention's 3-step hand hygiene techniques using alcohol-based handrub. DESIGN A parallel group randomized controlled trial. SETTING An acute care inner-city teaching hospital (Glasgow). PARTICIPANTS Doctors (n=42) and nurses (n=78) undertaking direct patient care. INTERVENTION Random 1:1 allocation of the 6-step (n=60) or the 3-step (n=60) technique. RESULTS The 6-step technique was microbiologically more effective at reducing the median log10 bacterial count. The 6-step technique reduced the count from 3.28 CFU/mL (95% CI, 3.11-3.38 CFU/mL) to 2.58 CFU/mL (2.08-2.93 CFU/mL), whereas the 3-step technique reduced it from 3.08 CFU/mL (2.97-3.27 CFU/mL) to 2.88 CFU/mL (2.58-3.15 CFU/mL) (P=.02). However, the 6-step technique did not increase the total hand coverage area (98.8% vs 99.0%, P=.15) and required 15% (95% CI, 6%-24%) more time (42.50 seconds vs 35.0 seconds, P=.002). Total hand coverage was not related to the reduction in bacterial count. CONCLUSIONS Two techniques for hand hygiene using alcohol-based handrub are promoted in international guidance, the 6-step by the World Health Organization and the 3-step by the Centers for Disease Control and Prevention. This study provides the first randomized controlled trial evidence that the 6-step technique is superior; these international guidance documents should therefore consider this evidence, as should healthcare organizations using the 3-step technique in practice. Infect Control Hosp Epidemiol 2016;37:661-666.

  14. A new method of maintaining airway during nasotracheal intubation--the hand mask technique.

    Science.gov (United States)

    Wu, R S; Wong, D S; Chung, P C; Tan, P P

    1993-09-01

    The efficacy of a new method (the hand mask technique) for airway maintenance during nasotracheal intubation was evaluated in our randomized crossover study. Sixty ASA physical status class I-II patients under 50 years of age, undergoing surgery of the extremities, were randomly chosen for the study after giving informed consent. A pulse oximeter, capnometer, EKG, blood pressure monitor and a peripheral nerve stimulator were attached to the patients before induction for continuous monitoring. An arterial cannula was inserted for intermittent blood gas sampling. After baseline room-air blood gas data had been obtained from the spontaneously breathing patients, a flow rate of 6 L/min of pure oxygen was applied through a loosely fitted face mask and a semi-closed anesthesia breathing circuit for a period of 5 minutes. An arterial blood sample was drawn, and the patients were thereafter put under general anesthesia with full muscle relaxation. Patients were then randomly assigned to two groups according to the ventilation technique used. Group A patients (n = 30) were manually ventilated first through a face mask for ten minutes and then with the hand mask technique for another ten minutes. Blood gas data were sampled, and heart rate, blood pressure, peak inspiratory airway pressure and end-tidal CO2 were recorded immediately after each ventilation technique. For patients in Group B (n = 30), the sequence of the two ventilation techniques was reversed. The results showed significant increases in PaO2 after artificial ventilation in both groups (with no significant difference in results between the two groups) and a lower incidence of nasal bleeding in Group A. (ABSTRACT TRUNCATED AT 250 WORDS)

  15. A Combined Weighting Method Based on Hybrid of Interval Evidence Fusion and Random Sampling

    Directory of Open Access Journals (Sweden)

    Ying Yan

    2017-01-01

    Full Text Available Due to the complexity of systems and lack of expertise, epistemic uncertainties may be present in experts' judgments on the importance of certain indices during group decision-making. A novel combination weighting method is proposed to solve the index weighting problem when various uncertainties are present in expert comments. Based on the ideas of evidence theory, various types of uncertain evaluation information are uniformly expressed through interval evidence structures. A similarity matrix between interval evidences is constructed, and the experts' information is fused. Comment grades are quantified using interval numbers, and a cumulative probability function for evaluating the importance of indices is constructed based on the fused information. Finally, index weights are obtained by Monte Carlo random sampling. The method can process expert information with varying degrees of uncertainty, which gives it good compatibility, and it avoids both the difficulty of effectively fusing high-conflict group decision-making information and the large information loss that follows such fusion. Original expert judgments are retained rather objectively throughout the processing procedure. The construction of the cumulative probability function and the random sampling process do not require any human intervention or judgment, and can be implemented easily by computer programs, giving the method an apparent advantage in evaluation practices for fairly large index systems.
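
    As a rough illustration of the final step, the sketch below draws Monte Carlo samples of each index's importance and normalizes the means into weights; uniform draws over fused interval bounds stand in for the paper's constructed cumulative probability functions, and the index names and bounds are hypothetical:

        import random

        def monte_carlo_weights(intervals, n=10000, seed=0):
            """Turn fused interval importance scores into normalized index
            weights by random sampling: draw each index's score repeatedly,
            average the draws, and normalize the averages to sum to one."""
            rng = random.Random(seed)
            means = {}
            for name, (lo, hi) in intervals.items():
                means[name] = sum(rng.uniform(lo, hi) for _ in range(n)) / n
            total = sum(means.values())
            return {name: m / total for name, m in means.items()}

        print(monte_carlo_weights({"cost": (0.6, 0.9),
                                   "safety": (0.7, 1.0),
                                   "schedule": (0.2, 0.5)}))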

  16. Correcting Classifiers for Sample Selection Bias in Two-Phase Case-Control Studies

    Science.gov (United States)

    Theis, Fabian J.

    2017-01-01

    Epidemiological studies often utilize stratified data in which rare outcomes or exposures are artificially enriched. This design can increase precision in association tests but distorts predictions when classifiers are applied to nonstratified data. Several methods correct for this so-called sample selection bias, but their performance remains unclear, especially for machine learning classifiers. With an emphasis on two-phase case-control studies, we aim to assess which corrections to perform in which setting and to obtain methods suitable for machine learning techniques, especially the random forest. We propose two new resampling-based methods to resemble the original data and covariance structure: stochastic inverse-probability oversampling and parametric inverse-probability bagging. We compare all techniques for the random forest and other classifiers, both theoretically and on simulated and real data. Empirical results show that the random forest profits only from the parametric inverse-probability bagging proposed by us. For other classifiers, correction is mostly advantageous, and methods perform uniformly. We discuss the consequences of inappropriate distribution assumptions and the reasons for the differing behaviors of the random forest and other classifiers. In conclusion, we provide guidance for choosing correction methods when training classifiers on biased samples. For random forests, our method outperforms state-of-the-art procedures if distribution assumptions are roughly fulfilled. We provide our implementation in the R package sambia. PMID:29312464
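
    The shared core of such corrections is a weighted bootstrap in which each unit is redrawn with probability proportional to the inverse of its inclusion probability; the sketch below shows only that core, not the authors' stochastic oversampling or parametric bagging variants, which additionally perturb the resampled points:

        import numpy as np

        def ip_oversample(X, y, incl_prob, n_out, seed=0):
            """Weighted bootstrap that resembles the source population: each
            unit is redrawn with probability proportional to 1 / inclusion
            probability, undoing the case-control enrichment."""
            rng = np.random.default_rng(seed)
            w = 1.0 / np.asarray(incl_prob, dtype=float)
            idx = rng.choice(len(y), size=n_out, replace=True, p=w / w.sum())
            return X[idx], y[idx]

        # Cases were enriched (inclusion prob. 0.9) relative to controls (0.1),
        # so the resample's case fraction drops back to roughly 0.10.
        X = np.arange(20).reshape(10, 2)
        y = np.array([1, 1, 1, 1, 1, 0, 0, 0, 0, 0])
        Xb, yb = ip_oversample(X, y, np.where(y == 1, 0.9, 0.1), n_out=5000)
        print(yb.mean())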

  17. A radioanalytical technique using (n,2n) reaction for the elemental analysis of samples

    International Nuclear Information System (INIS)

    Labor, M.

    1985-11-01

    A technique to determine the elemental composition of samples is reported. The principle of the technique is the internal standard method, involving the resolution of complex annihilation spectra. The technique has been applied to the determination of the mass of nitrogen, m_N, and that of potassium, m_K, in known masses of potassium nitrate. The percentage difference between the calculated and actual masses in 2 g and 3 g of potassium nitrate is 1.0 and 0.7, respectively, for potassium, and 1.0 for nitrogen. The use of more simultaneous equations than strictly necessary in solving for m_N and m_K is one of the advantages of the technique. (author)

  18. On the classification techniques in data mining for microarray data classification

    Science.gov (United States)

    Aydadenta, Husna; Adiwijaya

    2018-03-01

    Cancer is one of the deadliest diseases; according to WHO data, cancer caused 8.8 million deaths in 2015, and this toll will increase every year if the disease is not detected earlier. Microarray data have become one of the most popular resources for cancer-identification studies in the health field, since microarray data can be used to examine levels of gene expression in particular cell samples, allowing thousands of genes to be analyzed simultaneously. By using data mining techniques, samples of microarray data can be classified to identify whether or not they indicate cancer. In this paper we discuss research applying several data mining techniques to microarray data, such as Support Vector Machine (SVM), Artificial Neural Network (ANN), Naive Bayes, k-Nearest Neighbor (kNN), and C4.5, together with a simulation of the Random Forest algorithm combined with dimension reduction using Relief. The performance measure (accuracy) of each classification algorithm is reported, and the results show that the accuracy of the Random Forest algorithm is higher than that of the other classification algorithms (SVM, ANN, Naive Bayes, kNN, and C4.5). It is hoped that this paper can provide information about the speed, accuracy, performance and computational cost of each data mining classification technique on microarray data.
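
    A hedged sketch of such a comparison with scikit-learn, using synthetic data shaped like microarray data (few samples, many features); note that scikit-learn's tree is CART rather than C4.5, and the Relief dimension-reduction step is omitted because it is not part of scikit-learn:

        from sklearn.datasets import make_classification
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import cross_val_score
        from sklearn.naive_bayes import GaussianNB
        from sklearn.neighbors import KNeighborsClassifier
        from sklearn.neural_network import MLPClassifier
        from sklearn.svm import SVC
        from sklearn.tree import DecisionTreeClassifier

        # Stand-in for microarray data: 100 samples, 2000 gene-expression features
        X, y = make_classification(n_samples=100, n_features=2000,
                                   n_informative=20, random_state=0)

        models = {
            "SVM": SVC(),
            "ANN": MLPClassifier(max_iter=1000),
            "Naive Bayes": GaussianNB(),
            "kNN": KNeighborsClassifier(),
            "C4.5-like tree": DecisionTreeClassifier(random_state=0),
            "Random Forest": RandomForestClassifier(n_estimators=200, random_state=0),
        }
        for name, model in models.items():
            acc = cross_val_score(model, X, y, cv=5).mean()   # 5-fold CV accuracy
            print(f"{name}: {acc:.3f}")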

  19. Preoperative warm-up the key to improved resident technique: a randomized study.

    Science.gov (United States)

    Moran-Atkin, Erin; Abdalla, Gamal; Chen, Grace; Magnuson, Thomas H; Lidor, Anne O; Schweitzer, Michael A; Steele, Kimberley E

    2015-05-01

    The ACGME has required that a skills lab be incorporated into the surgical residency curriculum. While the value of warm-up is generally accepted in other areas requiring complex motor skills, there is little evidence to support the benefits of warm-up prior to performing surgery. We are conducting this study in an attempt to identify whether a warm-up period prior to operating impacts operative technique. All general surgery residents and MIS fellows were included in this IRB-approved randomized study. Participants were randomized to either warm-up or no warm-up groups. Participants randomized to the warm-up group completed a 10 min practice session in the simulation lab within 1 h of starting the case, using an FLS training box. At the conclusion of the operation, the participant was evaluated by the attending surgeon using the validated global rating scales of Reznick and Vassiliou. The attending surgeons were blinded to the use of pre-procedure warm-up. The results of the questionnaire were analyzed using Student's t test, with p < 0.05 considered significant. Forty participants were randomized to either warm-up (n = 19) or no warm-up (n = 21). There was a statistically significant improvement in depth perception (p = 0.02), bimanual dexterity (p = 0.01), and efficiency of movements (p = 0.03) for those randomized to warm-up. There was statistical improvement when we performed a composite scoring of the attending evaluations for each of the Reznick (p = 0.008) and the Vassiliou (p = 0.01) global rating scales. Preoperative warm-up significantly improves depth perception, bimanual dexterity, and efficiency of movements, as well as improvement in composite scores as judged by the attending surgeon. The lack of self-perceived improvement by the residents may be a reflection of the high standards and intense self-critique that is common among surgical trainees. We believe that our findings, while preliminary, reflect that surgical performance can be enhanced through structured warm-up activities.

  20. A novel non-invasive diagnostic sampling technique for cutaneous leishmaniasis.

    Directory of Open Access Journals (Sweden)

    Yasaman Taslimi

    2017-07-01

    Full Text Available Accurate diagnosis of cutaneous leishmaniasis (CL) is important for chemotherapy and epidemiological studies. Common approaches for Leishmania detection involve the invasive collection of specimens for direct identification of amastigotes by microscopy and the culturing of promastigotes from infected tissues. Although these techniques are highly specific, they require highly skilled health workers and carry the inherent risks of all invasive procedures, such as pain and the risk of bacterial and fungal super-infection. Therefore, it is essential to reduce the discomfort, potential infection and scarring caused by invasive diagnostic approaches, especially for children. In this report, we present a novel non-invasive method that is painless, rapid and user-friendly, using sequential tape strips for sampling and isolation of DNA from the surface of active and healed skin lesions of CL patients. A total of 119 patients suspected of suffering from cutaneous leishmaniasis with different clinical manifestations were recruited, and samples were collected both from their lesions and from uninfected areas. In addition, 15 fungal-infected lesions and 54 areas of healthy skin were examined. The duration of sampling is short (less than one minute), and species identification by PCR is highly specific and sensitive. The sequential tape stripping sampling method is a sensitive, non-invasive and cost-effective alternative to traditional diagnostic assays, and it is suitable for field studies as well as for use in health care centers.

  1. Use of X-ray diffraction technique and chemometrics to aid soil sampling strategies in traceability studies.

    Science.gov (United States)

    Bertacchini, Lucia; Durante, Caterina; Marchetti, Andrea; Sighinolfi, Simona; Silvestri, Michele; Cocchi, Marina

    2012-08-30

    The aim of this work is to assess the potential of the X-ray powder diffraction technique as a fingerprinting technique, i.e. as a preliminary tool to assess soil sample variability in terms of geochemical features, in the context of food geographical traceability. A correct approach to the sampling procedure is always a critical issue in scientific investigation. In particular, in food geographical traceability studies, where cause-effect relations between the soil of origin and the final foodstuff are sought, representative sampling of the territory under investigation is certainly imperative. This research concerns a pilot study investigating field homogeneity with respect to both field extension and sampling depth, also taking seasonal variability into account. Four Lambrusco production sites of the Modena district were considered. The X-ray diffraction spectra, collected on the powder of each soil sample, were treated as fingerprint profiles to be deciphered by multivariate and multi-way data analysis, namely PCA and PARAFAC. The differentiation pattern observed in soil samples, as obtained by this fast and non-destructive analytical approach, matches well with the results obtained by characterization with other, costlier analytical techniques, such as ICP/MS, GFAAS, FAAS, etc. Thus, the proposed approach furnishes a rational basis for reducing the number of soil samples to be collected for further analytical characterization (metal content, isotopic ratios of radiogenic elements, etc.), while maintaining an exhaustive description of the investigated production areas. Copyright © 2012 Elsevier B.V. All rights reserved.

  2. Sample preparation for large-scale bioanalytical studies based on liquid chromatographic techniques.

    Science.gov (United States)

    Medvedovici, Andrei; Bacalum, Elena; David, Victor

    2018-01-01

    The quality of the analytical data obtained in large-scale, long-term bioanalytical studies based on liquid chromatography depends on a number of experimental factors, including the choice of sample preparation method. This review discusses this tedious part of bioanalytical studies as applied to large-scale sample sets, using liquid chromatography coupled with different detector types as the core analytical technique. The main sample preparation methods covered in this paper are protein precipitation, liquid-liquid extraction, solid-phase extraction, derivatization, and their variants. They are discussed in terms of analytical performance, fields of application, advantages and disadvantages. The cited literature covers mainly the analytical achievements of the last decade, although several earlier papers that have become more valuable over time are also included in this review. Copyright © 2017 John Wiley & Sons, Ltd.

  3. Use of an oscillation technique to measure effective cross-sections of fissionable samples in critical assemblies

    International Nuclear Information System (INIS)

    Tretiakoff, O.; Vidal, R.; Carre, J.C.; Robin, M.

    1964-01-01

    The authors describe the technique used to measure the effective absorption and neutron-yield cross-sections of a fissionable sample. These two values are determined by analysing the signals due to the variation in reactivity (over-all signal) and the local perturbation of the flux (local signal) produced by the oscillating sample. These signals are standardized by means of a set of samples containing well-known quantities of fissionable material (²³⁵U) and of an absorber, boron. The measurements are made for different neutron spectra, characterized by the lattice parameters of the central zone within which the sample moves. This technique is used to study the effective cross-sections of uranium-plutonium alloys for different heavy-water and graphite lattices in the MINERVE and MARIUS critical assemblies. The same experiments are carried out on fuel samples with different irradiation histories in order to determine the evolution of the effective cross-sections as a function of the spectrum and the irradiation. (authors) [fr

  4. Toward greener analytical techniques for the absolute quantification of peptides in pharmaceutical and biological samples.

    Science.gov (United States)

    Van Eeckhaut, Ann; Mangelings, Debby

    2015-09-10

    Peptide-based biopharmaceuticals represent one of the fastest growing classes of new drug molecules. New reaction types included in synthesis strategies to reduce the rapid metabolism of peptides, along with the availability of new formulation and delivery technologies, have resulted in increased marketing of peptide drug products. In this regard, the development of analytical methods for the quantification of peptides in pharmaceutical and biological samples is of utmost importance. From the sample preparation step to analysis by chromatographic or electrophoretic methods, many difficulties must be tackled. Recent developments in analytical techniques place increasing emphasis on green analytical techniques. This review discusses the progress in, and challenges observed during, green analytical method development for the quantification of peptides in pharmaceutical and biological samples. Copyright © 2015 Elsevier B.V. All rights reserved.

  5. Efficacy of Self-Instructional Talk and Multicomponent Techniques in ...

    African Journals Online (AJOL)

    The multi-stage random sampling technique was used to select 120 teachers (60 males and 60 females) who participated in the study. The treatment groups were exposed to eight weeks of training sessions while the participants in the control group were not exposed to any treatment. The Teacher Stress Scale was used to collect ...

  6. Self-reference and random sampling approach for label-free identification of DNA composition using plasmonic nanomaterials.

    Science.gov (United States)

    Freeman, Lindsay M; Pang, Lin; Fainman, Yeshaiahu

    2018-05-09

    The analysis of DNA has led to revolutionary advancements in the fields of medical diagnostics, genomics, prenatal screening, and forensic science, with the global DNA testing market expected to reach revenues of USD 10.04 billion per year by 2020. However, current methods for DNA analysis remain dependent on fluorophores or conjugated proteins, leading to high costs for consumable materials and manual labor. Here, we demonstrate a potential label-free DNA composition detection method using surface-enhanced Raman spectroscopy (SERS), in which we identify the composition of cytosine and adenine within single strands of DNA. This approach exploits the fact that there is one phosphate backbone per nucleotide, which we use as a reference to compensate for systematic measurement variations. We utilize plasmonic nanomaterials with random Raman sampling to perform label-free detection of the nucleotide composition within DNA strands, generating a calibration curve from standard samples of DNA and demonstrating the capability of resolving the nucleotide composition. The work represents an innovative way to detect the DNA composition of DNA strands without the necessity of attached labels, offering a highly sensitive and reproducible method that factors in random sampling to minimize error.
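
    Numerically, the self-referencing idea amounts to dividing each nucleotide band intensity by the phosphate-backbone band from the same spectrum and reading the composition off a calibration curve; all intensities and calibration values below are hypothetical stand-ins:

        import numpy as np

        # Hypothetical band intensities from repeated random-sampling SERS spectra
        I_cytosine  = np.array([1.80, 2.10, 1.90, 2.00])
        I_phosphate = np.array([4.00, 4.40, 3.90, 4.20])  # one backbone band per base

        r = I_cytosine / I_phosphate      # self-referenced ratio, one per spectrum

        # Calibration from standards of known cytosine fraction (hypothetical)
        known_fraction = np.array([0.00, 0.25, 0.50, 0.75, 1.00])
        known_ratio    = np.array([0.02, 0.24, 0.47, 0.71, 0.93])
        slope, intercept = np.polyfit(known_ratio, known_fraction, 1)

        print("estimated cytosine fraction:", slope * r.mean() + intercept)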

  7. A cost-effective technique for integrating personal radiation dose assessment with personal gravimetric sampling

    International Nuclear Information System (INIS)

    Strydom, R.; Rolle, R.; Van der Linde, A.

    1992-01-01

    During recent years there has been an increasing awareness internationally of radiation levels in the mining and milling of radioactive ores, including those from non-uranium mines. A major aspect of radiation control is concerned with the measurement of radiation levels and the assessment of radiation doses incurred by individual workers. Current techniques available internationally for personnel monitoring of radiation exposures are expensive and there is a particular need to reduce the cost of personal radiation monitoring in South African gold mines because of the large labour force employed. In this regard the obvious benefits of integrating personal radiation monitoring with existing personal monitoring systems already in place in South African gold mines should be exploited. A system which can be utilized for this purpose is personal gravimetric sampling. A new cost-effective technique for personal radiation monitoring, which can be fully integrated with the personal gravimetric sampling strategy being implemented on mines, has been developed in South Africa. The basic principles of this technique and its potential in South African mines are described. 9 refs., 7 figs

  8. Comparing cluster-level dynamic treatment regimens using sequential, multiple assignment, randomized trials: Regression estimation and sample size considerations.

    Science.gov (United States)

    NeCamp, Timothy; Kilbourne, Amy; Almirall, Daniel

    2017-08-01

    Cluster-level dynamic treatment regimens can be used to guide sequential treatment decision-making at the cluster level in order to improve outcomes at the individual or patient level. In a cluster-level dynamic treatment regimen, the treatment is potentially adapted and re-adapted over time based on changes in the cluster that could be impacted by prior intervention, including aggregate measures of the individuals or patients that compose it. Cluster-randomized sequential multiple assignment randomized trials can be used to answer multiple open questions preventing scientists from developing high-quality cluster-level dynamic treatment regimens. In a cluster-randomized sequential multiple assignment randomized trial, sequential randomizations occur at the cluster level and outcomes are observed at the individual level. This manuscript makes two contributions to the design and analysis of cluster-randomized sequential multiple assignment randomized trials. First, a weighted least squares regression approach is proposed for comparing the mean of a patient-level outcome between the cluster-level dynamic treatment regimens embedded in a sequential multiple assignment randomized trial. The regression approach facilitates the use of baseline covariates, which is often critical in the analysis of cluster-level trials. Second, sample size calculators are derived for two common cluster-randomized sequential multiple assignment randomized trial designs for use when the primary aim is a between-dynamic treatment regimen comparison of the mean of a continuous patient-level outcome. The methods are motivated by the Adaptive Implementation of Effective Programs Trial which is, to our knowledge, the first-ever cluster-randomized sequential multiple assignment randomized trial in psychiatry.
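
    For orientation, the familiar ingredient of such calculators is the inflation of an individually randomized sample size by the design effect 1 + (m - 1) * ICC; the sketch below implements only that standard two-arm cluster-trial calculation, not the paper's SMART-specific derivations:

        from math import ceil
        from statistics import NormalDist

        def clusters_per_arm(delta, sd, m, icc, alpha=0.05, power=0.8):
            """Clusters per arm for a two-arm comparison of a continuous
            patient-level outcome: the individually randomized sample size
            is inflated by the design effect 1 + (m - 1) * icc, then
            divided by the cluster size m."""
            z = NormalDist().inv_cdf(1 - alpha / 2) + NormalDist().inv_cdf(power)
            n_individual = 2 * (z * sd / delta) ** 2   # patients/arm, no clustering
            deff = 1 + (m - 1) * icc                   # variance inflation
            return ceil(n_individual * deff / m)

        # e.g. detect a 0.3-SD difference with 20 patients per clinic, ICC 0.05
        print(clusters_per_arm(delta=0.3, sd=1.0, m=20, icc=0.05))   # -> 18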

  9. 238U and 232Th Concentration in Rock Samples Using Alpha Autoradiography and Gamma Spectroscopy Techniques

    International Nuclear Information System (INIS)

    Hafez, A.F.; El-Farrash, A.H.; Yousef, H.A.

    2009-01-01

    The activity concentrations of uranium and thorium were measured for rock samples selected from the Dahab region at the southern tip of Sinai, in order to detect any harmful radiation that could affect tourists, since Dahab is becoming an economic resource with open fields of tourism in Egypt. The activity concentrations of uranium and thorium in the rock samples were measured using two techniques. The first is the α-autoradiography technique with LR-115 and CR-39 detectors, and the second is the gamma spectroscopic technique with a NaI(Tl) detector. It was found that the average activity concentrations of uranium and thorium ranged from 6.41-49.31 Bq kg⁻¹ and 4.86-40.87 Bq kg⁻¹, respectively, using the α-autoradiography technique, and from 6.70-49.50 Bq kg⁻¹ and 4.47-42.33 Bq kg⁻¹, respectively, using the gamma detector. From the obtained data we can conclude that there is no radioactive health hazard for humans and other living beings in the area under investigation. No large differences were found between the thorium-to-uranium ratios calculated with the two techniques.

  10. Atmospheric pressure surface sampling/ionization techniques for direct coupling of planar separations with mass spectrometry.

    Science.gov (United States)

    Pasilis, Sofie P; Van Berkel, Gary J

    2010-06-18

    Planar separations, which include thin layer chromatography and gel electrophoresis, are in widespread use as important and powerful tools for conducting separations of complex mixtures. To increase the utility of planar separations, new methods are needed that allow in situ characterization of the individual components of the separated mixtures. A large number of atmospheric pressure surface sampling and ionization techniques for use with mass spectrometry have emerged in the past several years, and several have been investigated as a means for mass spectrometric read-out of planar separations. In this article, we review the atmospheric pressure surface sampling and ionization techniques that have been used for the read-out of planar separation media. For each technique, we briefly explain the operational basics and discuss the analyte type for which it is appropriate and some specific applications from the literature. Copyright (c) 2010 Elsevier B.V. All rights reserved.

  11. Waste minimization in analytical chemistry through innovative sample preparation techniques

    International Nuclear Information System (INIS)

    Smith, L. L.

    1998-01-01

    water samples. In this SPME technique, a fused-silica fiber coated with a polymeric film is exposed to the sample, extraction is allowed to take place, and then the analytes are thermally desorbed for GC analysis. Unlike liquid-liquid extraction or solid-phase extraction, SPME consumes all of the extracted sample in the analysis, significantly reducing the required sample volume

  12. High-throughput droplet analysis and multiplex DNA detection in the microfluidic platform equipped with a robust sample-introduction technique

    International Nuclear Information System (INIS)

    Chen, Jinyang; Ji, Xinghu; He, Zhike

    2015-01-01

    In this work, a simple, flexible and low-cost sample-introduction technique was developed and integrated with a droplet platform. The sample-introduction strategy was realized by connecting the components of a positive-pressure input device, sample container and microfluidic chip through Tygon tubing with a homemade polydimethylsiloxane (PDMS) adaptor, so the sample was delivered into the microchip from the sample container under the driving of positive pressure. This sample-introduction technique is so robust and compatible that it could be integrated with T-junction, flow-focusing or valve-assisted droplet microchips. By choosing a PDMS adaptor with the proper dimensions, the microchip could be flexibly equipped with various types of familiar sample containers, making the sampling more straightforward, without trivial sample transfer or loading. Convenient sample changing was easily achieved by repositioning the adaptor from one sample container to another. Benefiting from the proposed technique, a time-dependent concentration gradient was generated and applied for quantum dot (QD)-based fluorescence barcoding within the droplet chip. High-throughput droplet screening was preliminarily demonstrated through an investigation of the quenching efficiency of a ruthenium complex on the fluorescence of the QDs. More importantly, a multiplex DNA assay was successfully carried out in the integrated system, which shows its practicability and potential in high-throughput biosensing. - Highlights: • A simple, robust and low-cost sample-introduction technique was developed. • Convenient and flexible sample changing was achieved in the microfluidic system. • A novel strategy of concentration gradient generation was presented for barcoding. • High-throughput droplet screening could be realized in the integrated platform. • A multiplex DNA assay was successfully carried out in the droplet platform

  14. The novel programmable riometer for in-depth ionospheric and magnetospheric observations (PRIAMOS) using direct sampling DSP techniques

    OpenAIRE

    Dekoulis, G.; Honary, F.

    2005-01-01

    This paper describes the feasibility study and simulation results for the unique multi-frequency, multi-bandwidth, Programmable Riometer for in-depth Ionospheric And Magnetospheric ObservationS (PRIAMOS) based on direct sampling digital signal processing (DSP) techniques. This novel architecture is based on sampling the cosmic noise wavefront at the antenna. It eliminates the usage of any intermediate frequency (IF) mixer stages (-6 dB) and the noise balancing technique (-3 dB), providing a m...

  15. Dorsal onlay (Barbagli technique) versus dorsal inlay (Asopa technique) buccal mucosal graft urethroplasty for anterior urethral stricture: a prospective randomized study.

    Science.gov (United States)

    Aldaqadossi, Hussein; El Gamal, Samir; El-Nadey, Mohamed; El Gamal, Osama; Radwan, Mohamed; Gaber, Mohamed

    2014-02-01

    To compare both the dorsal onlay technique of Barbagli and the dorsal inlay technique of Asopa for the management of long anterior urethral stricture. From January 2010 to May 2012, a total of 47 patients with long anterior urethral strictures were randomized into two groups. The first group included 25 patients who were managed by dorsal onlay buccal mucosal graft urethroplasty. The second group included 22 patients who were managed by dorsal inlay buccal mucosal graft urethroplasty. Different clinical parameters, postoperative complications and success rates were compared between both groups. The overall success rate in the dorsal onlay group was 88%, whereas in the dorsal inlay group the success rate was 86.4% during the follow-up period. The mean operative time was significantly longer in the dorsal onlay urethroplasty group (205 ± 19.63 min) than in the dorsal inlay urethroplasty group (128 ± 4.9 min, P-value <0.0001). The average blood loss was significantly higher in the dorsal onlay urethroplasty group (228 ± 5.32 mL) than in the dorsal inlay urethroplasty group (105 ± 12.05 mL, P-value <0.0001). The dorsal onlay technique of Barbagli and the dorsal inlay technique of Asopa buccal mucosal graft urethroplasty provide similar success rates. The Asopa technique is easy to carry out, provides shorter operative time and less blood loss, and it is associated with fewer complications for anterior urethral stricture repair. © 2013 The Japanese Urological Association.

  16. Laser-Assisted Sampling Techniques in Combination with ICP-MS: A Novel Approach for Particle Analysis at the IAEA Environmental Samples Laboratory

    International Nuclear Information System (INIS)

    Dzigal, N.; Chinea-Cano, E.

    2015-01-01

    Researchers have found many applications for lasers. About two decades ago, scientists started using lasers as sample introduction instruments for mass spectrometry measurements. Similarly, lasers have been increasingly in demand as micro-dissection tools in the fields of life sciences, materials science, forensics, etc. This presentation deals with the interception of these laser-assisted techniques in the field of particle analysis. Historically, nanosecond lasers have been used in materials science to ablate material. Recently, it has been proven that in the analysis of particulate materials the disadvantages associated with the utilization of nanosecond lasers, such as overheating and melting of the sample, are suppressed when using femtosecond lasers. Further, due to the length of a single laser shot, fs-LA allows a more controlled ablation to occur, and therefore the sample plasma is more homogeneous and fewer mass-fractionation events are detected. The use of laser micro-dissection devices enables the physical segmentation of micro-sized artefacts, previously performed by a laborious manual procedure. By combining the precision of the laser cutting inherent to the LMD technique with a particle identification methodology, one can increase the efficiency of single-particle isolation. Further, besides increasing the throughput of analyses, this combination enhances the signal-to-noise ratio by removing matrix particles effectively. Specifically, this contribution describes the use of an Olympus+MMI laser microdissection device to improve the sample preparation of environmental swipe samples, and explores the installation of an Applied Spectra J200 fs-LA/LIBS (laser ablation/laser-induced breakdown spectroscopy) system as a sample introduction device for a quadrupole mass spectrometer, the iCap Q from Thermo Fisher Scientific, at the IAEA Environmental Samples Laboratory. Preliminary results of the ongoing efforts for the

  17. Thermophilic Campylobacter spp. in turkey samples: evaluation of two automated enzyme immunoassays and conventional microbiological techniques

    DEFF Research Database (Denmark)

    Borck, Birgitte; Stryhn, H.; Ersboll, A.K.

    2002-01-01

    Aims: To determine the sensitivity and specificity of two automated enzyme immunoassays (EIA), EiaFoss and Minividas, and a conventional microbiological culture technique for detecting thermophilic Campylobacter spp. in turkey samples. Methods and Results: A total of 286 samples (faecal, meat...

  18. Application of Vector Triggering Random Decrement

    DEFF Research Database (Denmark)

    Asmussen, J. C.; Ibrahim, S. R.; Brincker, Rune

    1997-01-01

    This paper deals with applications of the vector triggering Random Decrement technique. This technique is new and developed with the aim of minimizing estimation time and identification errors. The theory behind the technique is discussed in an accompanying paper. The results presented in this paper should be regarded as a further documentation of the technique. The key point in Random Decrement estimation is the formulation of a triggering condition. If the triggering condition is fulfilled, a time segment from each measurement is picked out and averaged with previous time segments. The final result is a Random Decrement function from each measurement. In traditional Random Decrement estimation the triggering condition is a scalar condition, which need only be fulfilled in a single measurement. In vector triggering Random Decrement the triggering condition is a vector condition.
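
    A minimal sketch of scalar-triggered Random Decrement estimation (a level up-crossing condition), to make the averaging step concrete; in the vector triggering variant the condition involves several measurement channels at once, and a segment is then extracted from every channel:

        import numpy as np

        def random_decrement(x, level, seg_len):
            """Scalar-triggered Random Decrement: each time x up-crosses
            `level`, take the following seg_len samples as a segment and
            average all segments into one RD signature."""
            starts = np.where((x[:-1] < level) & (x[1:] >= level))[0] + 1
            starts = starts[starts + seg_len <= x.size]
            segments = np.stack([x[i:i + seg_len] for i in starts])
            return segments.mean(axis=0)

        # Toy measurement: noisy 1.5 Hz decaying mode sampled at 100 Hz
        rng = np.random.default_rng(0)
        t = np.arange(20000) / 100.0
        x = np.sin(2 * np.pi * 1.5 * t) + 0.5 * rng.standard_normal(t.size)
        rd = random_decrement(x, level=x.std(), seg_len=200)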

  20. Solving mercury (Hg) speciation in soil samples by synchrotron X-ray microspectroscopic techniques.

    Science.gov (United States)

    Terzano, Roberto; Santoro, Anna; Spagnuolo, Matteo; Vekemans, Bart; Medici, Luca; Janssens, Koen; Göttlicher, Jörg; Denecke, Melissa A; Mangold, Stefan; Ruggiero, Pacifico

    2010-08-01

    Direct mercury (Hg) speciation was assessed for soil samples with Hg concentrations ranging from 7 up to 240 mg kg⁻¹. Hg chemical forms were identified and quantified by sequential extractions and by bulk- and micro-analytical techniques exploiting synchrotron-generated X-rays. In particular, microspectroscopic techniques such as μ-XRF, μ-XRD and μ-XANES were necessary to solve bulk Hg speciation in both soil fractions examined. The Hg species identified in the soil samples were metacinnabar (β-HgS), cinnabar (α-HgS), corderoite (Hg₃S₂Cl₂), and an amorphous phase containing Hg bound to chlorine and sulfur. The amount of the metacinnabar and amorphous phases increased in the finer fraction, where an association with soil components was observed. All the observed Hg species originated from the slow weathering of an inert Hg-containing waste material (K106, U.S. EPA) dumped in the area several years ago, which is changing into a relatively more dangerous source of pollution. Copyright 2010 Elsevier Ltd. All rights reserved.

  1. Acute stress symptoms during the second Lebanon war in a random sample of Israeli citizens.

    Science.gov (United States)

    Cohen, Miri; Yahav, Rivka

    2008-02-01

    The aims of this study were to assess prevalence of acute stress disorder (ASD) and acute stress symptoms (ASS) in Israel during the second Lebanon war. A telephone survey was conducted in July 2006 of a random sample of 235 residents of northern Israel, who were subjected to missile attacks, and of central Israel, who were not subjected to missile attacks. Results indicate that ASS scores were higher in the northern respondents; 6.8% of the northern sample and 3.9% of the central sample met ASD criteria. Appearance of each symptom ranged from 15.4% for dissociative to 88.4% for reexperiencing, with significant differences between northern and central respondents only for reexperiencing and arousal. A low ASD rate and a moderate difference between areas subjected and not subjected to attack were found.

  2. Prevalence and correlates of problematic smartphone use in a large random sample of Chinese undergraduates.

    Science.gov (United States)

    Long, Jiang; Liu, Tie-Qiao; Liao, Yan-Hui; Qi, Chang; He, Hao-Yu; Chen, Shu-Bao; Billieux, Joël

    2016-11-17

    Smartphones are becoming a daily necessity for most undergraduates in Mainland China. Because the present scenario of problematic smartphone use (PSU) is largely unexplored, in the current study we aimed to estimate the prevalence of PSU and to screen suitable predictors for PSU among Chinese undergraduates in the framework of the stress-coping theory. A sample of 1062 undergraduate smartphone users was recruited by means of a stratified cluster random sampling strategy between April and May 2015. The Problematic Cellular Phone Use Questionnaire was used to identify PSU. We evaluated five candidate risk factors for PSU by using logistic regression analysis while controlling for demographic characteristics and specific features of smartphone use. The prevalence of PSU among Chinese undergraduates was estimated to be 21.3%. The risk factors for PSU were majoring in the humanities, high monthly income from the family (≥1500 RMB), serious emotional symptoms, high perceived stress, and perfectionism-related factors (high doubts about actions, high parental expectations). PSU among undergraduates appears to be ubiquitous and thus constitutes a public health issue in Mainland China. Although further longitudinal studies are required to test whether PSU is a transient phenomenon or a chronic and progressive condition, our study successfully identified socio-demographic and psychological risk factors for PSU. These results, obtained from a random and thus representative sample of undergraduates, open up new avenues in terms of prevention and regulation policies.

  3. Coherent optical adaptive technique improves the spatial resolution of STED microscopy in thick samples

    Science.gov (United States)

    Yan, Wei; Yang, Yanlong; Tan, Yu; Chen, Xun; Li, Yang; Qu, Junle; Ye, Tong

    2018-01-01

    Stimulated emission depletion microscopy (STED) is one of the far-field optical microscopy techniques that can provide sub-diffraction spatial resolution. The spatial resolution of STED microscopy is determined by the specially engineered beam profile of the depletion beam and its power. However, the beam profile of the depletion beam may be distorted due to aberrations of the optical system and inhomogeneity of the specimen's optical properties, resulting in compromised spatial resolution. The situation deteriorates when thick samples are imaged. In the worst case, severe distortion of the depletion beam profile may cause complete loss of the super-resolution effect, no matter how much depletion power is applied to the specimen. Previously, several adaptive optics approaches have been explored to compensate for aberrations of systems and specimens. However, it is hard to correct the complicated high-order optical aberrations of specimens. In this report, we demonstrate that the complicated distorted wavefront from a thick phantom sample can be measured by using the coherent optical adaptive technique (COAT). The full correction can effectively maintain and improve the spatial resolution in imaging thick samples. PMID:29400356

  4. Randomized Comparison of Two Vaginal Self-Sampling Methods for Human Papillomavirus Detection: Dry Swab versus FTA Cartridge

    OpenAIRE

    Catarino, Rosa; Vassilakos, Pierre; Bilancioni, Aline; Vanden Eynde, Mathieu; Meyer-Hamme, Ulrike; Menoud, Pierre-Alain; Guerry, Frédéric; Petignat, Patrick

    2015-01-01

    Background Human papillomavirus (HPV) self-sampling (self-HPV) is valuable in cervical cancer screening. HPV testing is usually performed on physician-collected cervical smears stored in liquid-based medium. Dry filters and swabs are an alternative. We evaluated the adequacy of self-HPV using two dry storage and transport devices, the FTA cartridge and swab. Methods A total of 130 women performed two consecutive self-HPV samples. Randomization determined which of the two tests was performed f...

  5. Random Sampling with Interspike-Intervals of the Exponential Integrate and Fire Neuron: A Computational Interpretation of UP-States.

    Directory of Open Access Journals (Sweden)

    Andreas Steimer

    Full Text Available Oscillations between high and low values of the membrane potential (UP and DOWN states, respectively) are a ubiquitous feature of cortical neurons during slow wave sleep and anesthesia. Nevertheless, surprisingly few quantitative studies have been conducted that deal with this phenomenon's implications for computation. Here we present a novel theory that explains on a detailed mathematical level the computational benefits of UP states. The theory is based on random sampling by means of interspike intervals (ISIs) of the exponential integrate and fire (EIF) model neuron, such that each spike is considered a sample whose analog value corresponds to the spike's preceding ISI. As we show, the EIF's exponential sodium current, which kicks in when balancing a noisy membrane potential around values close to the firing threshold, leads to a particularly simple, approximate relationship between the neuron's ISI distribution and input current. Approximation quality depends on the frequency spectrum of the current and is improved upon increasing the voltage baseline towards threshold. Thus, the conceptually simpler leaky integrate and fire neuron, which is missing such an additional current boost, performs consistently worse than the EIF and does not improve when the voltage baseline is increased. For the EIF, in contrast, the presented mechanism is particularly effective in the high-conductance regime, which is a hallmark feature of UP-states. Our theoretical results are confirmed by accompanying simulations, which were conducted for input currents of varying spectral composition. Moreover, we provide analytical estimations of the range of ISI distributions the EIF neuron can sample from at a given approximation level. Such samples may be considered by any algorithmic procedure that is based on random sampling, such as Markov Chain Monte Carlo or message-passing methods. Finally, we explain how spike-based random sampling relates to existing

  6. Random Sampling with Interspike-Intervals of the Exponential Integrate and Fire Neuron: A Computational Interpretation of UP-States.

    Science.gov (United States)

    Steimer, Andreas; Schindler, Kaspar

    2015-01-01

    Oscillations between high and low values of the membrane potential (UP and DOWN states, respectively) are a ubiquitous feature of cortical neurons during slow wave sleep and anesthesia. Nevertheless, surprisingly few quantitative studies have been conducted that deal with this phenomenon's implications for computation. Here we present a novel theory that explains on a detailed mathematical level the computational benefits of UP states. The theory is based on random sampling by means of interspike intervals (ISIs) of the exponential integrate and fire (EIF) model neuron, such that each spike is considered a sample whose analog value corresponds to the spike's preceding ISI. As we show, the EIF's exponential sodium current, which kicks in when balancing a noisy membrane potential around values close to the firing threshold, leads to a particularly simple, approximate relationship between the neuron's ISI distribution and input current. Approximation quality depends on the frequency spectrum of the current and is improved upon increasing the voltage baseline towards threshold. Thus, the conceptually simpler leaky integrate and fire neuron, which is missing such an additional current boost, performs consistently worse than the EIF and does not improve when the voltage baseline is increased. For the EIF, in contrast, the presented mechanism is particularly effective in the high-conductance regime, which is a hallmark feature of UP-states. Our theoretical results are confirmed by accompanying simulations, which were conducted for input currents of varying spectral composition. Moreover, we provide analytical estimations of the range of ISI distributions the EIF neuron can sample from at a given approximation level. Such samples may be considered by any algorithmic procedure that is based on random sampling, such as Markov Chain Monte Carlo or message-passing methods. Finally, we explain how spike-based random sampling relates to existing computational
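
    As a rough illustration of the sampling mechanism described above, the following minimal Python sketch simulates an EIF neuron driven by noisy input and collects its ISIs; all parameter values are illustrative assumptions, not taken from the paper.

        import numpy as np

        # Euler simulation of an exponential integrate-and-fire (EIF) neuron
        # with noisy drive; every parameter value is an invented example.
        rng = np.random.default_rng(0)
        dt, T = 1e-4, 20.0                  # time step and duration [s]
        tau, E_L = 0.02, -65e-3             # membrane time constant [s], rest [V]
        V_T, Delta_T = -50e-3, 2e-3         # soft threshold and slope factor [V]
        V_peak, V_reset = -30e-3, -60e-3    # spike cut-off and reset [V]
        mu, sigma = 16e-3, 4e-3             # mean and noise of the input drive [V]

        V, t_last, isis = E_L, None, []
        for step in range(int(T / dt)):
            dV = (-(V - E_L) + Delta_T * np.exp((V - V_T) / Delta_T) + mu) * dt / tau
            V += dV + sigma * np.sqrt(dt / tau) * rng.standard_normal()
            if V >= V_peak:                 # spike: record the ISI and reset
                t = step * dt
                if t_last is not None:
                    isis.append(t - t_last)
                t_last, V = t, V_reset

        # Each spike is one "sample"; its analog value is the preceding ISI.
        print(f"{len(isis)} ISIs, mean {np.mean(isis):.4f} s")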

  7. Different goodness of fit tests for Rayleigh distribution in ranked set sampling

    Directory of Open Access Journals (Sweden)

    Amer Al-Omari

    2016-03-01

    Full Text Available In this paper, different goodness-of-fit tests for the Rayleigh distribution are considered based on simple random sampling (SRS) and ranked set sampling (RSS) techniques. The performance of the suggested estimators is evaluated in terms of the power of the tests by using Monte Carlo simulation. It is found that the suggested RSS tests perform better than their SRS counterparts.
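
    A minimal sketch of the two designs being compared, assuming idealised perfect ranking within each set: ranked set sampling typically yields a lower-variance estimate of the Rayleigh mean than simple random sampling of the same total size. All numbers below are invented for illustration.

        import numpy as np

        rng = np.random.default_rng(1)
        sigma, m, cycles, reps = 1.0, 3, 10, 2000   # set size m, n = m * cycles

        def srs(n):
            # Simple random sample from the Rayleigh distribution
            return rng.rayleigh(sigma, n)

        def rss(m, cycles):
            # For each rank i, draw a set of m units, sort them (perfect
            # ranking) and keep only the i-th order statistic.
            out = []
            for _ in range(cycles):
                for i in range(m):
                    out.append(np.sort(rng.rayleigh(sigma, m))[i])
            return np.array(out)

        n = m * cycles
        means_srs = [srs(n).mean() for _ in range(reps)]
        means_rss = [rss(m, cycles).mean() for _ in range(reps)]
        print("var(SRS mean):", np.var(means_srs))
        print("var(RSS mean):", np.var(means_rss))   # typically smaller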

  8. Estimating HIES Data through Ratio and Regression Methods for Different Sampling Designs

    Directory of Open Access Journals (Sweden)

    Faqir Muhammad

    2007-01-01

    Full Text Available In this study, a comparison has been made of different sampling designs, using the HIES data of North West Frontier Province (NWFP) for 2001-02 and 1998-99, collected from the Federal Bureau of Statistics, Statistical Division, Government of Pakistan, Islamabad. The performance of the estimators has also been considered using bootstrap and jackknife. A two-stage stratified random sample design is adopted by HIES. In the first stage, enumeration blocks and villages are treated as the first-stage Primary Sampling Units (PSU). The sample PSUs are selected with probability proportional to size. Secondary Sampling Units (SSU), i.e., households, are selected by systematic sampling with a random start. They have used a single study variable. We have compared the HIES technique with some other designs, namely: Stratified Simple Random Sampling, Stratified Systematic Sampling, Stratified Ranked Set Sampling, and Stratified Two-Phase Sampling. Ratio and regression methods were applied with two study variables: income (y) and household size (x). Jackknife and bootstrap are used for variance replication. Simple Random Sampling with sample sizes of 462 to 561 gave moderate variances both by jackknife and bootstrap. By applying Systematic Sampling, we obtained a moderate variance with a sample size of 467. With jackknife under Systematic Sampling, the variance of the regression estimator was greater than that of the ratio estimator for sample sizes of 467 to 631. At a sample size of 952, the variance of the ratio estimator becomes greater than that of the regression estimator. The most efficient design turns out to be Ranked Set Sampling compared with the other designs. Ranked Set Sampling with jackknife and bootstrap gives minimum variance even with the smallest sample size (467). Two-Phase Sampling gave poor performance. Multi-stage sampling as applied by HIES gave large variances, especially if used with a single study variable.
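
    The ratio and regression estimators named above can be sketched as follows for a hypothetical income (y) and household size (x) sample, with a delete-one jackknife for the variance of the ratio estimator; the population and all numbers are invented for illustration.

        import numpy as np

        rng = np.random.default_rng(2)
        # Hypothetical population: household size x and income y
        N = 5000
        x_pop = rng.poisson(5, N) + 1.0
        y_pop = 200 * x_pop + rng.normal(0, 300, N)
        X_bar = x_pop.mean()                        # known auxiliary mean

        idx = rng.choice(N, size=500, replace=False)  # simple random sample
        x, y = x_pop[idx], y_pop[idx]

        ratio_est = (y.mean() / x.mean()) * X_bar     # ratio estimator of Y-bar
        b = np.cov(x, y, ddof=1)[0, 1] / np.var(x, ddof=1)
        regr_est = y.mean() + b * (X_bar - x.mean())  # regression estimator

        # Delete-one jackknife variance of the ratio estimator
        n = len(y)
        theta_i = np.array([
            (np.delete(y, i).mean() / np.delete(x, i).mean()) * X_bar
            for i in range(n)
        ])
        jk_var = (n - 1) / n * np.sum((theta_i - theta_i.mean()) ** 2)
        print(ratio_est, regr_est, jk_var)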

  9. Sampling and sample preparation development for analytical and on-line measurement techniques of process liquids; Naeytteenoton ja kaesittelyn kehittaeminen prosessinesteiden analytiikan ja on-line mittaustekniikan tarpeisiin - MPKT 11

    Energy Technology Data Exchange (ETDEWEB)

    Karttunen, K. [Oulu Univ. (Finland)]

    1998-12-31

    The main goal of the research project is to develop sampling and sample handling methods and techniques for the pulp and paper industry, to be used for analysis and on-line purposes. The research focuses especially on the research and development of the classification and separation methods and techniques needed for liquid and colloidal substances as well as in ion analysis. (orig.)

  10. Sampling and sample preparation development for analytical and on-line measurement techniques of process liquids; Naeytteenoton ja kaesittelyn kehittaeminen prosessinesteiden analytiikan ja on-line mittaustekniikan tarpeisiin - MPKT 11

    Energy Technology Data Exchange (ETDEWEB)

    Karttunen, K [Oulu Univ. (Finland)]

    1999-12-31

    The main goal of the research project is to develop sampling and sample handling methods and techniques for the pulp and paper industry, to be used for analysis and on-line purposes. The research focuses especially on the research and development of the classification and separation methods and techniques needed for liquid and colloidal substances as well as in ion analysis. (orig.)

  11. Water stable isotope measurements of Antarctic samples by means of IRMS and WS-CRDS techniques

    Science.gov (United States)

    Michelini, Marzia; Bonazza, Mattia; Braida, Martina; Flora, Onelio; Dreossi, Giuliano; Stenni, Barbara

    2010-05-01

    In recent years there has been increasing interest in the scientific community in applying stable isotope techniques to several environmental problems, such as drinking water safeguarding, groundwater management, climate change, soil and paleoclimate studies, etc. For example, the water stable isotopes, being natural tracers of the hydrological cycle, have been extensively used as tools to characterize regional aquifers and to reconstruct past temperature changes from polar ice cores. Hence the need for improvements in analytical techniques: the high demand for information calls for technologies that can offer a large number of analyses in short times and at low cost. Furthermore, it is sometimes difficult to obtain large amounts of sample (as is the case for Antarctic ice cores or interstitial water), preventing the possibility of replicating the analyses. Here, we present oxygen and hydrogen measurements performed on water samples covering a wide range of isotopic values (from very negative Antarctic precipitation to mid-latitude precipitation values), carried out with both the conventional Isotope Ratio Mass Spectrometry (IRMS) technique and a new method based on laser absorption techniques, Wavelength-Scanned Cavity Ring-Down Spectroscopy (WS-CRDS). This study focuses on improving the precision of the measurements carried out with WS-CRDS in order to extensively apply this method to Antarctic ice core paleoclimate studies. The WS-CRDS is a variation of the CRDS developed in 1988 by O'Keefe and Deacon. In CRDS, a pulse of light goes through a cavity with highly reflective inner surfaces; when there is no sample in the cavity the light beam finds no obstacle in its path, but since the reflectivity of the walls is not perfect the beam is eventually absorbed; when the sample is injected into the cavity there is additional absorption, and the difference between the absorption time without and with the sample is proportional to the quantity

  12. Convergence and Efficiency of Adaptive Importance Sampling Techniques with Partial Biasing

    Science.gov (United States)

    Fort, G.; Jourdain, B.; Lelièvre, T.; Stoltz, G.

    2018-04-01

    We propose a new Monte Carlo method to efficiently sample a multimodal distribution (known up to a normalization constant). We consider a generalization of the discrete-time Self Healing Umbrella Sampling method, which can also be seen as a generalization of well-tempered metadynamics. The dynamics is based on an adaptive importance sampling technique. The importance function relies on the weights (namely the relative probabilities) of disjoint sets which form a partition of the space. These weights are unknown but are learnt on the fly, yielding an adaptive algorithm. In the context of computational statistical physics, the logarithm of these weights is, up to an additive constant, the free-energy, and the discrete valued function defining the partition is called the collective variable. The algorithm falls into the general class of Wang-Landau type methods, and is a generalization of the original Self Healing Umbrella Sampling method in two ways: (i) the updating strategy leads to a larger penalization strength of already visited sets in order to escape more quickly from metastable states, and (ii) the target distribution is biased using only a fraction of the free-energy, in order to increase the effective sample size and reduce the variance of importance sampling estimators. We prove the convergence of the algorithm and analyze numerically its efficiency on a toy example.
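
    A toy sketch of the Wang-Landau-type adaptive biasing idea (not the authors' exact algorithm): partition a 1-D double well into bins, penalise the bin currently occupied so the walker escapes metastable states faster, and read the accumulated log-weights as a free-energy estimate. All parameters are assumptions.

        import numpy as np

        rng = np.random.default_rng(3)
        U = lambda x: (x**2 - 1.0)**2 / 0.05        # toy double-well potential
        edges = np.linspace(-2.0, 2.0, 21)          # partition into 20 bins
        log_w = np.zeros(len(edges) - 1)            # adaptive bias (free-energy est.)

        def bin_of(x):
            return int(np.clip(np.searchsorted(edges, x) - 1, 0, len(log_w) - 1))

        x = -1.0
        for it in range(200000):
            y = x + 0.2 * rng.standard_normal()     # Metropolis random-walk proposal
            if -2.0 < y < 2.0:
                # Biased target: pi_bias(x) proportional to exp(-U(x) - log_w[bin])
                log_a = -U(y) - log_w[bin_of(y)] + U(x) + log_w[bin_of(x)]
                if np.log(rng.random()) < log_a:
                    x = y
            # Penalise the visited bin; the strength decays over time,
            # loosely in the spirit of Wang-Landau-type updating schedules.
            log_w[bin_of(x)] += 0.5 / (1.0 + it / 1000.0)

        # Up to an additive constant, log_w approximates the bin free energies
        print(np.round(log_w - log_w.min(), 2))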

  13. Scan Order in Gibbs Sampling: Models in Which it Matters and Bounds on How Much.

    Science.gov (United States)

    He, Bryan; De Sa, Christopher; Mitliagkas, Ioannis; Ré, Christopher

    2016-01-01

    Gibbs sampling is a Markov Chain Monte Carlo sampling technique that iteratively samples variables from their conditional distributions. There are two common scan orders for the variables: random scan and systematic scan. Due to the benefits of locality in hardware, systematic scan is commonly used, even though most statistical guarantees are only for random scan. While it has been conjectured that the mixing times of random scan and systematic scan do not differ by more than a logarithmic factor, we show by counterexample that this is not the case, and we prove that the mixing times do not differ by more than a polynomial factor under mild conditions. To prove these relative bounds, we introduce a method of augmenting the state space to study systematic scan using conductance.
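
    The two scan orders can be contrasted on a toy bivariate normal with correlation rho; the sketch below is a generic Gibbs sampler and has no connection to the paper's counterexample construction.

        import numpy as np

        rng = np.random.default_rng(4)
        rho, n_iter = 0.9, 50000
        sd = np.sqrt(1 - rho**2)     # conditional std dev of each coordinate

        def gibbs(scan, n_iter):
            x = np.zeros(2)
            out = np.empty((n_iter, 2))
            for t in range(n_iter):
                if scan == "systematic":
                    order = (0, 1)                  # fixed sweep over coordinates
                else:
                    order = (rng.integers(2),)      # one random coordinate per step
                for i in order:
                    # Conditional of x[i] given x[1-i] for a standard
                    # bivariate normal with correlation rho
                    x[i] = rho * x[1 - i] + sd * rng.standard_normal()
                out[t] = x
            return out

        for scan in ("systematic", "random"):
            s = gibbs(scan, n_iter)
            print(scan, "sample corr:", np.corrcoef(s[:, 0], s[:, 1])[0, 1])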

  14. Discriminative motif discovery via simulated evolution and random under-sampling.

    Directory of Open Access Journals (Sweden)

    Tao Song

    Full Text Available Conserved motifs in biological sequences are closely related to their structure and functions. Recently, discriminative motif discovery methods have attracted more and more attention. However, little attention has been devoted to the data imbalance problem, which is one of the main reasons affecting the performance of discriminative models. In this article, a simulated evolution method is applied to solve the multi-class imbalance problem at the stage of data preprocessing, and at the stage of Hidden Markov Model (HMM) training, a random under-sampling method is introduced for the imbalance between the positive and negative datasets. It is shown that, in the task of discovering targeting motifs of nine subcellular compartments, the motifs found by our method are more conserved than those found by methods that do not consider the data imbalance problem, and recover most of the known targeting motifs from Minimotif Miner and InterPro. Meanwhile, we use the found motifs to predict protein subcellular localization and achieve higher prediction precision and recall for the minority classes.

  15. Discriminative motif discovery via simulated evolution and random under-sampling.

    Science.gov (United States)

    Song, Tao; Gu, Hong

    2014-01-01

    Conserved motifs in biological sequences are closely related to their structure and functions. Recently, discriminative motif discovery methods have attracted more and more attention. However, little attention has been devoted to the data imbalance problem, which is one of the main reasons affecting the performance of discriminative models. In this article, a simulated evolution method is applied to solve the multi-class imbalance problem at the stage of data preprocessing, and at the stage of Hidden Markov Model (HMM) training, a random under-sampling method is introduced for the imbalance between the positive and negative datasets. It is shown that, in the task of discovering targeting motifs of nine subcellular compartments, the motifs found by our method are more conserved than those found by methods that do not consider the data imbalance problem, and recover most of the known targeting motifs from Minimotif Miner and InterPro. Meanwhile, we use the found motifs to predict protein subcellular localization and achieve higher prediction precision and recall for the minority classes.
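
    The generic random under-sampling step, sketched on invented data (the paper applies the idea between positive and negative datasets at the HMM-training stage):

        import numpy as np

        def undersample(X, y, rng=np.random.default_rng(5)):
            # Randomly drop majority-class rows until all classes are balanced
            classes, counts = np.unique(y, return_counts=True)
            n_min = counts.min()
            keep = np.concatenate([
                rng.choice(np.where(y == c)[0], size=n_min, replace=False)
                for c in classes
            ])
            rng.shuffle(keep)
            return X[keep], y[keep]

        # Toy imbalanced data: 900 negatives, 100 positives
        rng = np.random.default_rng(5)
        X = rng.normal(size=(1000, 8))
        y = np.r_[np.zeros(900, int), np.ones(100, int)]
        Xb, yb = undersample(X, y)
        print(np.bincount(yb))   # -> [100 100]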

  16. Peyton’s four-step approach for teaching complex spinal manipulation techniques – a prospective randomized trial

    Directory of Open Access Journals (Sweden)

    Gertraud Gradl-Dietsch

    2016-11-01

    Full Text Available Abstract Background The objectives of this prospective randomized trial were to assess the impact of Peyton's four-step approach on the acquisition of complex psychomotor skills and to examine the influence of gender on learning outcomes. Methods We randomly assigned 95 third- to fifth-year medical students to an intervention group, which received instructions according to Peyton (PG), or a control group, which received conventional teaching (CG). Both groups attended four sessions on the principles of manual therapy and specific manipulative and diagnostic techniques for the spine. We assessed differences in theoretical knowledge (multiple-choice (MC) exam) and practical skills (Objective Structured Practical Examination (OSPE)) with respect to type of intervention and gender. Participants took a second OSPE 6 months after completion of the course. Results There were no differences between groups with respect to the MC exam. Students in the PG group scored significantly higher in the OSPE. Gender had no additional impact. Results of the second OSPE showed a significant decline in competency regardless of gender and type of intervention. Conclusions Peyton's approach is superior to standard instruction for teaching complex spinal manipulation skills regardless of gender. Skills retention was equally low for both techniques.

  17. Estimation of trace levels of plutonium in urine samples by fission track technique

    International Nuclear Information System (INIS)

    Sawant, P.D.; Prabhu, S.; Pendharkar, K.A.; Kalsi, P.C.

    2009-01-01

    Individual monitoring of radiation workers handling Pu in various nuclear installations requires the detection of trace levels of plutonium in bioassay samples. It is necessary to develop methods that can detect urinary excretion of Pu at fractions of a mBq. Therefore, a sensitive method such as fission track analysis has been developed for the measurement of trace levels of Pu in bioassay samples. In this technique, chemically separated plutonium from the sample and a Pu standard were electrodeposited on planchettes, covered with Lexan solid state nuclear track detector (SSNTD) films, and irradiated with thermal neutrons in the APSARA reactor of Bhabha Atomic Research Centre, India. The fission track densities in the Lexan films of the sample and the standard were used to calculate the amount of Pu in the sample. The minimum amount of Pu that can be analyzed by this method using doubly distilled electronic grade (E.G.) reagents is about 12 μBq/L. (author)
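
    The quantitation reduces to comparing the fission-track density of the sample with that of the co-irradiated standard; a worked example with invented numbers:

        # Fission-track quantitation against a co-irradiated standard: the Pu
        # activity scales with the induced-fission track density, so
        #   A_sample = A_standard * (rho_sample / rho_standard)
        # All numbers below are invented for illustration.
        A_standard = 50.0e-3     # Bq of Pu electrodeposited as the standard
        rho_standard = 1200.0    # tracks per cm^2 in the standard's Lexan film
        rho_sample = 3.1         # tracks per cm^2 in the sample's Lexan film

        A_sample = A_standard * rho_sample / rho_standard
        print(f"Pu in sample: {A_sample * 1e6:.1f} microBq")   # ~129 microBq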

  18. Fabrication Techniques of Stretchable and Cloth Electroadhesion Samples for Implementation on Devices with Space Application

    Data.gov (United States)

    National Aeronautics and Space Administration — The purpose of this study is to determine materials and fabrication techniques for efficient space-rated electroadhesion (EA) samples. Liquid metals, including...

  19. Sparse feature learning for instrument identification: Effects of sampling and pooling methods.

    Science.gov (United States)

    Han, Yoonchang; Lee, Subin; Nam, Juhan; Lee, Kyogu

    2016-05-01

    Feature learning for music applications has recently received considerable attention from many researchers. This paper reports on a sparse feature learning algorithm for musical instrument identification and, in particular, focuses on the effects of the frame sampling techniques for dictionary learning and the pooling methods for feature aggregation. To this end, two frame sampling techniques are examined: fixed and proportional random sampling. Furthermore, the effect of using onset frames was analyzed for both of the proposed sampling methods. Regarding summarization of the feature activations, a standard-deviation pooling method is used and compared with the commonly used max- and average-pooling techniques. Using more than 47 000 recordings of 24 instruments from various performers, playing styles, and dynamics, a number of tuning parameters are experimented with, including the analysis frame size, the dictionary size, and the type of frequency scaling, as well as the different sampling and pooling methods. The results show that the combination of proportional sampling and standard-deviation pooling achieves the best overall performance of 95.62%, while the optimal parameter set varies among the instrument classes.
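
    The pooling and frame-sampling choices compared in the paper can be sketched as follows on random stand-in activations (real inputs would be frame-level sparse codes):

        import numpy as np

        rng = np.random.default_rng(6)
        activations = rng.random((500, 64))   # (frames, dictionary atoms), toy data

        # Proportional random sampling: keep a fixed fraction of frames at random
        frac = 0.3
        idx = rng.choice(len(activations), size=int(frac * len(activations)),
                         replace=False)
        sampled = activations[idx]

        # Aggregate frame-level activations into one clip-level feature vector
        max_pool = sampled.max(axis=0)
        avg_pool = sampled.mean(axis=0)
        std_pool = sampled.std(axis=0)        # standard-deviation pooling
        feature = np.concatenate([max_pool, avg_pool, std_pool])
        print(feature.shape)                  # (192,)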

  20. Random walk of passive tracers among randomly moving obstacles

    OpenAIRE

    Gori, Matteo; Donato, Irene; Floriani, Elena; Nardecchia, Ilaria; Pettini, Marco

    2016-01-01

    Background: This study is mainly motivated by the need to understand how the diffusion behaviour of a biomolecule (or even of a larger object) is affected by other moving macromolecules, organelles, and so on inside a living cell, whence the possibility of understanding whether or not a randomly walking biomolecule is also subject to a long-range force field driving it to its target. Method: By means of the Continuous Time Random Walk (CTRW) technique, the topic of random walk in random en...

  1. A review of analytical techniques for the determination of carbon-14 in environmental samples

    International Nuclear Information System (INIS)

    Milton, G.M.; Brown, R.M.

    1993-11-01

    This report contains a brief summary of analytical techniques commonly used for the determination of radiocarbon in a variety of environmental samples. Details of the applicable procedures developed and tested in the Environmental Research Branch at Chalk River Laboratories are appended

  2. Arsenic, Antimony, Chromium, and Thallium Speciation in Water and Sediment Samples with the LC-ICP-MS Technique

    Directory of Open Access Journals (Sweden)

    Magdalena Jabłońska-Czapla

    2015-01-01

    Full Text Available Chemical speciation is a very important subject in environmental protection, toxicology, and chemical analysis, due to the fact that the toxicity, availability, and reactivity of trace elements depend on the chemical forms in which these elements occur. Research on low analyte levels, particularly in complex matrix samples, requires more and more advanced and sophisticated analytical methods and techniques. The latest trends in this field concern the so-called hyphenated techniques. Arsenic, antimony, chromium, and the (underestimated) thallium attract the closest attention of toxicologists and analysts. The properties of those elements depend on the oxidation state in which they occur. The aim of the following paper is to answer the question of why speciation analytics is so important. The paper also provides numerous examples of hyphenated technique usage (e.g., the LC-ICP-MS application in the speciation analysis of chromium, antimony, arsenic, or thallium in water and bottom sediment samples). An important issue addressed is the preparation of environmental samples for speciation analysis.

  3. Sexual dysfunction after inguinal hernia repair with the Onstep versus Lichtenstein technique

    DEFF Research Database (Denmark)

    Andresen, Kristoffer; Burcharth, Jakob; Fonnes, Siv

    2017-01-01

    BACKGROUND: Sequelae after inguinal hernia repair include pain-related impairment of sexual function. Pain during intercourse can originate from the scar, scrotum, penis, or during ejaculation. The aim of this study was to investigate if the Onstep technique resulted in better results than...... the Lichtenstein technique regarding pain-related impairment of sexual function. METHODS: This study was part of the randomized ONLI trial (NCT01753219, Onstep versus Lichtenstein for inguinal hernia repair). Separate reporting of pain-related impairment of sexual function was planned before the study start......, with a separate sample size calculation. Participants were randomized to the Onstep or Lichtenstein technique for repair of their primary inguinal hernia and followed up at 6 months postoperative with the use of a questionnaire specific for pain-related impairment of sexual function. RESULTS: A total of 259...

  4. Techniques for the detection of pathogenic Cryptococcus species in wood decay substrata and the evaluation of viability in stored samples

    Directory of Open Access Journals (Sweden)

    Christian Alvarez

    2013-02-01

    Full Text Available In this study, we evaluated several techniques for the detection of the yeast form of Cryptococcus in decaying wood and measured the viability of these fungi in environmental samples stored in the laboratory. Samples were collected from a tree known to be positive for Cryptococcus and were each inoculated on 10 Niger seed agar (NSA) plates. The conventional technique (CT) yielded a greater number of positive samples and indicated a higher fungal density [in colony-forming units per gram of wood (CFU·g⁻¹)] compared to the humid swab technique (ST). However, the difference in positive and false-negative results between the CT and ST was not significant. The threshold of detection for the CT was 0.05×10³ CFU·g⁻¹, while the threshold for the ST was greater than 0.1×10³ CFU·g⁻¹. No colonies were recovered using the dry swab technique. We also determined the viability of Cryptococcus in wood samples stored for 45 days at 25°C using the CT and ST and found that samples not only continued to yield a positive response, but also exhibited an increase in CFU·g⁻¹, suggesting that Cryptococcus is able to grow in stored environmental samples. The ST.1, in which samples collected with swabs were immediately plated on NSA medium, was more efficient and less laborious than either the CT or ST and required approximately 10 min to perform; however, additional studies are needed to validate this technique.

  5. Investigation of CPD and HMDS Sample Preparation Techniques for Cervical Cells in Developing Computer-Aided Screening System Based on FE-SEM/EDX

    Science.gov (United States)

    Ng, Siew Cheok; Abu Osman, Noor Azuan

    2014-01-01

    This paper investigated the effects of critical-point drying (CPD) and hexamethyldisilazane (HMDS) sample preparation techniques for cervical cells on field emission scanning electron microscopy and energy dispersive X-ray (FE-SEM/EDX). We investigated the visualization of cervical cell image and elemental distribution on the cervical cell for two techniques of sample preparation. Using FE-SEM/EDX, the cervical cell images are captured and the cell element compositions are extracted for both sample preparation techniques. Cervical cell image quality, elemental composition, and processing time are considered for comparison of performances. Qualitatively, FE-SEM image based on HMDS preparation technique has better image quality than CPD technique in terms of degree of spread cell on the specimen and morphologic signs of cell deteriorations (i.e., existence of plate and pellet drying artifacts and membrane blebs). Quantitatively, with mapping and line scanning EDX analysis, carbon and oxygen element compositions in HMDS technique were higher than the CPD technique in terms of weight percentages. The HMDS technique has shorter processing time than the CPD technique. The results indicate that FE-SEM imaging, elemental composition, and processing time for sample preparation with the HMDS technique were better than CPD technique for cervical cell preparation technique for developing computer-aided screening system. PMID:25610902

  6. Accounting for randomness in measurement and sampling in studying cancer cell population dynamics.

    Science.gov (United States)

    Ghavami, Siavash; Wolkenhauer, Olaf; Lahouti, Farshad; Ullah, Mukhtar; Linnebacher, Michael

    2014-10-01

    Knowing the expected temporal evolution of the proportion of different cell types in sample tissues gives an indication of the progression of the disease and its possible response to drugs. Such systems have been modelled using Markov processes. We here consider an experimentally realistic scenario in which transition probabilities are estimated from noisy cell population size measurements. Using aggregated data of FACS measurements, we develop MMSE and ML estimators and formulate two problems to find the minimum number of required samples and measurements to guarantee the accuracy of predicted population sizes. Our numerical results show that the transition probabilities and steady states to which the estimates converge differ widely from the real values if one uses the standard deterministic approach for noisy measurements. This provides support for our argument that for the analysis of FACS data one should consider the observed state as a random variable. The second problem we address concerns the consequences of estimating the probability of a cell being in a particular state from measurements of a small population of cells. We show how the uncertainty arising from small sample sizes can be captured by a distribution for the state probability.
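
    The core estimation step, a maximum-likelihood transition matrix from an observed state sequence, can be sketched as below; the noise model and FACS specifics of the paper are omitted, and the chain is invented.

        import numpy as np

        rng = np.random.default_rng(7)
        P_true = np.array([[0.90, 0.10],     # toy 2-state cell-type Markov chain
                           [0.05, 0.95]])

        # Simulate a state trajectory from the true chain
        states = [0]
        for _ in range(5000):
            states.append(rng.choice(2, p=P_true[states[-1]]))
        states = np.array(states)

        # ML estimate: row-normalised transition counts
        C = np.zeros((2, 2))
        for a, b in zip(states[:-1], states[1:]):
            C[a, b] += 1
        P_hat = C / C.sum(axis=1, keepdims=True)
        print(np.round(P_hat, 3))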

  7. RANdom SAmple Consensus (RANSAC) algorithm for material-informatics: application to photovoltaic solar cells.

    Science.gov (United States)

    Kaspi, Omer; Yosipof, Abraham; Senderowitz, Hanoch

    2017-06-06

    An important aspect of chemoinformatics and material-informatics is the usage of machine learning algorithms to build Quantitative Structure Activity Relationship (QSAR) models. The RANdom SAmple Consensus (RANSAC) algorithm is a predictive modeling tool widely used in the image processing field for cleaning datasets from noise. RANSAC could be used as a "one stop shop" algorithm for developing and validating QSAR models, performing outlier removal, descriptor selection, model development and predictions for test set samples using an applicability domain. For "future" predictions (i.e., for samples not included in the original test set) RANSAC provides a statistical estimate for the probability of obtaining reliable predictions, i.e., predictions within a pre-defined number of standard deviations from the true values. In this work we describe the first application of RANSAC in material informatics, focusing on the analysis of solar cells. We demonstrate that for three datasets representing different metal oxide (MO) based solar cell libraries, RANSAC-derived models select descriptors previously shown to correlate with key photovoltaic properties and lead to good predictive statistics for these properties. These models were subsequently used to predict the properties of virtual solar cell libraries, highlighting interesting dependencies of PV properties on MO compositions.
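
    A minimal vanilla RANSAC sketch for robust line fitting, the generic form of the algorithm the paper adapts to QSAR modelling; data and tolerances are invented.

        import numpy as np

        def ransac_line(x, y, n_iters=500, tol=0.5, rng=np.random.default_rng(8)):
            # Fit y = a*x + b robustly: repeatedly fit a minimal 2-point model
            # and keep the model with the largest consensus (inlier) set.
            best_inliers = np.zeros(len(x), bool)
            for _ in range(n_iters):
                i, j = rng.choice(len(x), size=2, replace=False)
                if x[i] == x[j]:
                    continue
                a = (y[j] - y[i]) / (x[j] - x[i])
                b = y[i] - a * x[i]
                inliers = np.abs(y - (a * x + b)) < tol
                if inliers.sum() > best_inliers.sum():
                    best_inliers = inliers
            # Refit by least squares on the best consensus set
            a, b = np.polyfit(x[best_inliers], y[best_inliers], 1)
            return a, b, best_inliers

        rng = np.random.default_rng(8)
        x = np.linspace(0, 10, 200)
        y = 2.0 * x + 1.0 + rng.normal(0, 0.2, x.size)
        y[::10] += rng.uniform(5, 20, x[::10].size)   # inject gross outliers
        print(ransac_line(x, y)[:2])                  # ~ (2.0, 1.0)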

  8. Relationship between accuracy and number of samples on statistical quantity and contour map of environmental gamma-ray dose rate. Example of random sampling

    International Nuclear Information System (INIS)

    Matsuda, Hideharu; Minato, Susumu

    2002-01-01

    The accuracy of statistical quantities such as the mean value, and of contour maps, obtained by measurement of the environmental gamma-ray dose rate was evaluated by random sampling of 5 different model distribution maps made using the mean slope, -1.3, of the power spectra calculated from actually measured values. The values were derived from 58 natural gamma dose rate data sets reported worldwide, with means in the range of 10-100 nGy/h and areas of 10⁻³-10⁷ km². The accuracy of the mean value was found to be around ±7% even for 60 or 80 samplings (the most frequent number), and the standard deviation had an accuracy of less than 1/4-1/3 of the mean. The correlation coefficient of the frequency distribution was found to be 0.860 or more for 200-400 samplings (the most frequent number), but for the contour map only 0.502-0.770. (K.H.)
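
    The logic of the experiment, repeatedly drawing n random sample points from a dose-rate map and recording the error of the sample mean, can be sketched as follows on a synthetic field (the real study used model distribution maps with a -1.3 power-spectrum slope, which this toy field does not reproduce):

        import numpy as np

        rng = np.random.default_rng(9)
        field = rng.lognormal(mean=4.0, sigma=0.3, size=(200, 200))  # toy dose map
        true_mean = field.mean()

        for n in (20, 60, 80, 200, 400):
            errs = []
            for _ in range(500):                   # repeat the random sampling
                r = rng.integers(0, field.shape[0], n)
                c = rng.integers(0, field.shape[1], n)
                errs.append(field[r, c].mean() / true_mean - 1.0)
            print(f"n={n:4d}  rel. error of the mean: +/-{np.std(errs):.1%}")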

  9. Clinical application of microsampling versus conventional sampling techniques in the quantitative bioanalysis of antibiotics: a systematic review.

    Science.gov (United States)

    Guerra Valero, Yarmarly C; Wallis, Steven C; Lipman, Jeffrey; Stove, Christophe; Roberts, Jason A; Parker, Suzanne L

    2018-03-01

    Conventional sampling techniques for clinical pharmacokinetic studies often require the removal of large blood volumes from patients. This can result in a physiological or emotional burden, particularly for neonates or pediatric patients. Antibiotic pharmacokinetic studies are typically performed on healthy adults or general ward patients. These may not account for alterations to a patient's pathophysiology and can lead to suboptimal treatment. Microsampling offers an important opportunity for clinical pharmacokinetic studies in vulnerable patient populations, where smaller sample volumes can be collected. This systematic review provides a description of currently available microsampling techniques and an overview of studies reporting the quantitation and validation of antibiotics using microsampling. A comparison of microsampling to conventional sampling in clinical studies is included.

  10. Elemental analysis of brazing alloy samples by neutron activation technique

    International Nuclear Information System (INIS)

    Eissa, E.A.; Rofail, N.B.; Hassan, A.M.; El-Shershaby, A.; Walley El-Dine, N.

    1996-01-01

    Two brazing alloy samples (CP2 and CP3) have been investigated by the neutron activation analysis (NAA) technique in order to identify and estimate their constituent elements. The pneumatic irradiation rabbit system (PIRS), installed at the first Egyptian research reactor (ETRR-1), was used for short-time irradiation (30 s) with a thermal neutron flux of 1.6 × 10¹¹ n/cm²/s in the reactor reflector, where the thermal-to-epithermal neutron flux ratio is 106. Long-time irradiation (48 hours) was performed at the reactor core periphery with a thermal neutron flux of 3.34 × 10¹² n/cm²/s and a thermal-to-epithermal neutron flux ratio of 79. Activation by epithermal neutrons was taken into account for the (1/v) and resonance neutron absorption in both methods. A hyperpure germanium detection system was used for gamma-ray acquisition. The concentration values of Al, Cr, Fe, Co, Cu, Zn, Se, Ag and Sb were estimated as percentages of the sample weight and compared with reported values. 1 tab

  11. Elemental analysis of brazing alloy samples by neutron activation technique

    Energy Technology Data Exchange (ETDEWEB)

    Eissa, E A; Rofail, N B; Hassan, A M [Reactor and Neutron Physics Department, Nuclear Research Centre, Atomic Energy Authority, Cairo (Egypt)]; El-Shershaby, A; Walley El-Dine, N [Physics Department, Faculty of Girls, Ain Shams University, Cairo (Egypt)]

    1997-12-31

    Two brazing alloy samples (CP2 and CP3) have been investigated by the neutron activation analysis (NAA) technique in order to identify and estimate their constituent elements. The pneumatic irradiation rabbit system (PIRS), installed at the first Egyptian research reactor (ETRR-1), was used for short-time irradiation (30 s) with a thermal neutron flux of 1.6 × 10¹¹ n/cm²/s in the reactor reflector, where the thermal-to-epithermal neutron flux ratio is 106. Long-time irradiation (48 hours) was performed at the reactor core periphery with a thermal neutron flux of 3.34 × 10¹² n/cm²/s and a thermal-to-epithermal neutron flux ratio of 79. Activation by epithermal neutrons was taken into account for the (1/v) and resonance neutron absorption in both methods. A hyperpure germanium detection system was used for gamma-ray acquisition. The concentration values of Al, Cr, Fe, Co, Cu, Zn, Se, Ag and Sb were estimated as percentages of the sample weight and compared with reported values. 1 tab.

  12. Generalized Dynamic Panel Data Models with Random Effects for Cross-Section and Time

    NARCIS (Netherlands)

    Mesters, G.; Koopman, S.J.

    2014-01-01

    An exact maximum likelihood method is developed for the estimation of parameters in a nonlinear non-Gaussian dynamic panel data model with unobserved random individual-specific and time-varying effects. We propose an estimation procedure based on the importance sampling technique. In particular, a

  13. Measuring gas-residence times in large municipal incinerators, by means of a pseudo-random binary signal tracer technique

    International Nuclear Information System (INIS)

    Nasserzadeh, V.; Swithenbank, J.; Jones, B.

    1995-01-01

    The problem of measuring gas-residence time in large incinerators was studied by the pseudo-random binary sequence (PRBS) stimulus tracer response technique at the Sheffield municipal solid-waste incinerator (35 MW plant). The steady-state system was disturbed by the superimposition of small fluctuations in the form of a pseudo-random binary sequence of methane pulses, and the response of the incinerator was determined from the CO₂ concentration in the flue gases at the boiler exit, measured with a specially developed optical gas analyser with a high-frequency response. For data acquisition, an on-line PC was used together with the LAB Windows software system; the output response was then cross-correlated with the perturbation signal to give the impulse response of the incinerator. There was very good agreement between the gas-residence time for the Sheffield MSW incinerator as calculated by computational fluid dynamics (FLUENT model) and the gas-residence time at the plant as measured by the PRBS tracer technique. The results obtained from this research programme clearly demonstrate that the PRBS stimulus tracer response technique can be successfully and economically used to measure gas-residence times in large incinerator plants. It also suggests that the common commercial practice of characterising the incinerator operation by a single residence-time parameter may lead to a misrepresentation of the complexities involved in describing the operation of the incineration system. (author)
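
    A toy sketch of the PRBS stimulus-response idea: excite a linear system with a maximal-length ±1 sequence and recover its impulse response by cross-correlating the output with the input. The first-order response standing in for the incinerator, and the LFSR tap choice, are assumptions for illustration only.

        import numpy as np

        def prbs(n_bits=10, taps=(10, 7)):
            # Maximal-length PRBS from a Fibonacci LFSR; returns a +/-1 sequence
            state = np.ones(n_bits, int)
            out = []
            for _ in range(2**n_bits - 1):
                out.append(state[-1])
                fb = state[taps[0] - 1] ^ state[taps[1] - 1]
                state = np.r_[fb, state[:-1]]
            return 2 * np.array(out) - 1

        x = prbs()                                        # input perturbation
        h = np.exp(-np.arange(50) / 10.0); h /= h.sum()   # toy system response
        # Steady-state periodic output: convolve a repeated input, keep one period
        y = np.convolve(np.tile(x, 3), h)[len(x):2 * len(x)]

        # Circular cross-correlation of output with the PRBS ~ impulse response,
        # because the PRBS autocorrelation is close to a delta function
        N = len(x)
        R = np.array([np.dot(y, np.roll(x, k)) for k in range(60)]) / N
        print(np.round(R[:10] / R[:50].sum(), 3))  # normalised estimate of h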

  14. Neural tension technique is no different from random passive movements in reducing spasticity in patients with traumatic brain injury

    DEFF Research Database (Denmark)

    Lorentzen, Jakob; Nielsen, Dorthe; Holm, Karl

    2012-01-01

    Purpose: Neural tension technique (NTT) is a therapy believed to reduce spasticity and to increase range of motion (ROM). This study compared the ability of NTT and random passive movements (RPMs) to reduce spasticity in the knee flexors in 10 spastic patients with brain injury. Methods: An RCT...

  15. Trace uranium analysis in geological sample by isotope dilution-alpha spectrometry and comparison with other techniques

    International Nuclear Information System (INIS)

    Shihomatsu, H.M.; Iyer, S.S.

    1988-12-01

    The determination of uranium in geological samples by the alpha spectrometric isotope dilution technique using a ²³³U tracer is described in the present work. The various steps involved in the method, namely preparation of the sample, electrodeposition, alpha spectrometry, isotope dilution, calculation of the concentration, and error statistics, are discussed in detail. The experimental parameters for the electrodeposition of uranium, such as current density, pH, concentration of the electrolyte solution, deposition time, and electrode distance, were all optimised based on the efficiency of the deposition. The total accuracy and precision of the IDAS using the ²³³U tracer in the determination of uranium in mineral and granite samples were of the order of 1 to 2% for the concentration range of 50-1500 ppm of U. Our results are compared with those obtained by other workers using similar and different techniques. (author) [pt

  16. Comparison of mobile and stationary spore-sampling techniques for estimating virulence frequencies in aerial barley powdery mildew populations

    DEFF Research Database (Denmark)

    Hovmøller, M.S.; Munk, L.; Østergård, Hanne

    1995-01-01

    Gene frequencies in samples of aerial populations of barley powdery mildew (Erysiphe graminis f.sp. hordei), which were collected in adjacent barley areas and in successive periods of time, were compared using mobile and stationary sampling techniques. Stationary samples were collected from trap ...

  17. Integrating SQ4R Technique with Graphic Postorganizers in the Science Learning of Earth and Space

    OpenAIRE

    Djudin, Tomo; Amir, R

    2018-01-01

    This study examined the effect of integrating the SQ4R reading technique with graphic post-organizers on students' Earth and Space Science learning achievement and their development of metacognitive knowledge. A pretest-posttest non-equivalent control group design was employed in this quasi-experimental method. The sample, consisting of 103 seventh-grade students of the secondary school SMPN 1 Pontianak, was drawn using the intact-group random sampling technique. An achievement test and a questio...

  18. Choosing a suitable sample size in descriptive sampling

    International Nuclear Information System (INIS)

    Lee, Yong Kyun; Choi, Dong Hoon; Cha, Kyung Joon

    2010-01-01

    Descriptive sampling (DS) is an alternative to crude Monte Carlo sampling (CMCS) in finding solutions to structural reliability problems. It is known to be an effective sampling method in approximating the distribution of a random variable because it uses the deterministic selection of sample values and their random permutation. However, because this method is difficult to apply to complex simulations, the sample size is occasionally determined without thorough consideration. Input sample variability may cause the sample size to change between runs, leading to poor simulation results. This paper proposes a numerical method for choosing a suitable sample size for use in DS. Using this method, one can estimate a more accurate probability of failure in a reliability problem while running a minimal number of simulations. The method is then applied to several examples and compared with CMCS and conventional DS to validate its usefulness and efficiency
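
    The contrast between crude Monte Carlo and descriptive sampling is easy to sketch for a standard normal variable: DS picks deterministic stratum-midpoint quantiles and then randomly permutes them, so the sample moments are fixed by construction.

        import numpy as np
        from scipy.stats import norm

        rng = np.random.default_rng(10)
        n = 100

        # Crude Monte Carlo: n i.i.d. draws
        cmcs = rng.standard_normal(n)

        # Descriptive sampling: deterministic quantile values, random order
        u = (np.arange(n) + 0.5) / n          # midpoints of n equal strata
        ds = norm.ppf(u)
        rng.shuffle(ds)                       # random permutation of the values

        print("CMCS mean/std:", cmcs.mean(), cmcs.std(ddof=1))
        print("DS   mean/std:", ds.mean(), ds.std(ddof=1))  # near 0 and 1 by design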

  19. Preparation of quality control samples for thyroid hormones T3 and T4 in radioimmunoassay techniques

    International Nuclear Information System (INIS)

    Ahmed, F.O.A.

    2006-03-01

    Today, radioimmunoassay (RIA) has become one of the best techniques for quantitative analysis of very low concentrations of different substances. RIA is widely used in medical and research laboratories. To maintain high specificity and accuracy in RIA and other related techniques, quality controls must be introduced. In this dissertation, quality control samples were prepared for the thyroid hormones triiodothyronine (T3) and thyroxine (T4) for use in RIA techniques. Ready-made Chinese T3 and T4 RIA kits were used, and the IAEA statistical package was selected. (Author)

  20. Improved sample preparation and counting techniques for enhanced tritium measurement sensitivity

    Science.gov (United States)

    Moran, J.; Aalseth, C.; Bailey, V. L.; Mace, E. K.; Overman, C.; Seifert, A.; Wilcox Freeburg, E. D.

    2015-12-01

    Tritium (T) measurements offer insight into a wealth of environmental applications including hydrologic tracking, discerning ocean circulation patterns, and aging ice formations. However, the relatively short half-life of T (12.3 years) limits its effective age-dating range. Compounding this limitation is the decrease in atmospheric T content by over two orders of magnitude from the 1000-2000 TU measured in 1962, at the height of atmospheric weapons testing in the 1960's. We are developing sample preparation methods coupled to direct counting of T via ultra-low background proportional counters which, when combined, offer improved T measurement sensitivity (~4.5 mmoles of H2 equivalent) and will help expand the application of T age dating to the smaller sample sizes linked to persistent environmental questions, despite the limitations above. For instance, this approach can be used to T-date ~2.2 mmoles of CH4 collected from sample-limited systems including microbial communities, soils, or subsurface aquifers, and can be combined with radiocarbon dating to distinguish the methane's formation age from the C age in a system. This approach can also expand investigations into soil organic C, where the improved sensitivity will permit resolution of soil C into more descriptive fractions and provide direct assessments of the stability of specific classes of organic matter in soil environments. We are employing a multiple-step sample preparation system whereby organic samples are first combusted, with the resulting CO2 and H2O being used as a feedstock to synthesize CH4. This CH4 is mixed with Ar and loaded directly into an ultra-low background proportional counter for measurement of T β decay in a shallow underground laboratory. Analysis of water samples requires only the addition of geologic CO2 feedstock with the sample for methane synthesis. The chemical nature of the preparation techniques enables high sample throughput, with only the final measurement requiring T decay counting and total sample analysis times ranging from 2-5 weeks

  1. Performance evaluation of an importance sampling technique in a Jackson network

    Science.gov (United States)

    Mahdipour, Ebrahim; Rahmani, Amir Masoud; Setayeshi, Saeed

    2014-03-01

    Importance sampling is a technique that is commonly used to speed up Monte Carlo simulation of rare events. However, little is known regarding the design of efficient importance sampling algorithms in the context of queueing networks. The standard approach, which simulates the system using an a priori fixed change of measure suggested by large deviation analysis, has been shown to fail in even the simplest network settings. Estimating probabilities associated with rare events has been a topic of great importance in queueing theory and in applied probability at large. In this article, we analyse the performance of an importance sampling estimator for a rare event probability in a Jackson network. The article applies strict deadlines to a two-node Jackson network with feedback whose arrival and service rates are modulated by an exogenous finite-state Markov process. We have estimated the probability of network blocking for various sets of parameters, and also the probability of customers missing their deadlines for different loads and deadlines. We have finally shown that the probability of total population overflow may be affected by various deadline values, service rates and arrival rates.

  2. Photon event distribution sampling: an image formation technique for scanning microscopes that permits tracking of sub-diffraction particles with high spatial and temporal resolutions.

    Science.gov (United States)

    Larkin, J D; Publicover, N G; Sutko, J L

    2011-01-01

    In photon event distribution sampling, an image formation technique for scanning microscopes, the maximum likelihood position of origin of each detected photon is acquired as a data set rather than binning photons in pixels. Subsequently, an intensity-related probability density function describing the uncertainty associated with the photon position measurement is applied to each position and individual photon intensity distributions are summed to form an image. Compared to pixel-based images, photon event distribution sampling images exhibit increased signal-to-noise and comparable spatial resolution. Photon event distribution sampling is superior to pixel-based image formation in recognizing the presence of structured (non-random) photon distributions at low photon counts and permits use of non-raster scanning patterns. A photon event distribution sampling based method for localizing single particles derived from a multi-variate normal distribution is more precise than statistical (Gaussian) fitting to pixel-based images. Using the multi-variate normal distribution method, non-raster scanning and a typical confocal microscope, localizations with 8 nm precision were achieved at 10 ms sampling rates with acquisition of ~200 photons per frame. Single nanometre precision was obtained with a greater number of photons per frame. In summary, photon event distribution sampling provides an efficient way to form images when low numbers of photons are involved and permits particle tracking with confocal point-scanning microscopes with nanometre precision deep within specimens. © 2010 The Authors Journal of Microscopy © 2010 The Royal Microscopical Society.

  3. Using Load Balancing to Scalably Parallelize Sampling-Based Motion Planning Algorithms

    KAUST Repository

    Fidel, Adam; Jacobs, Sam Ade; Sharma, Shishir; Amato, Nancy M.; Rauchwerger, Lawrence

    2014-01-01

    Motion planning, which is the problem of computing feasible paths in an environment for a movable object, has applications in many domains ranging from robotics, to intelligent CAD, to protein folding. The best methods for solving this PSPACE-hard problem are so-called sampling-based planners. Recent work introduced uniform spatial subdivision techniques for parallelizing sampling-based motion planning algorithms that scaled well. However, such methods are prone to load imbalance, as planning time depends on region characteristics and, for most problems, the heterogeneity of the sub problems increases as the number of processors increases. In this work, we introduce two techniques to address load imbalance in the parallelization of sampling-based motion planning algorithms: an adaptive work stealing approach and bulk-synchronous redistribution. We show that applying these techniques to representatives of the two major classes of parallel sampling-based motion planning algorithms, probabilistic roadmaps and rapidly-exploring random trees, results in a more scalable and load-balanced computation on more than 3,000 cores. © 2014 IEEE.

  4. Using Load Balancing to Scalably Parallelize Sampling-Based Motion Planning Algorithms

    KAUST Repository

    Fidel, Adam

    2014-05-01

    Motion planning, which is the problem of computing feasible paths in an environment for a movable object, has applications in many domains ranging from robotics, to intelligent CAD, to protein folding. The best methods for solving this PSPACE-hard problem are so-called sampling-based planners. Recent work introduced uniform spatial subdivision techniques for parallelizing sampling-based motion planning algorithms that scaled well. However, such methods are prone to load imbalance, as planning time depends on region characteristics and, for most problems, the heterogeneity of the sub problems increases as the number of processors increases. In this work, we introduce two techniques to address load imbalance in the parallelization of sampling-based motion planning algorithms: an adaptive work stealing approach and bulk-synchronous redistribution. We show that applying these techniques to representatives of the two major classes of parallel sampling-based motion planning algorithms, probabilistic roadmaps and rapidly-exploring random trees, results in a more scalable and load-balanced computation on more than 3,000 cores. © 2014 IEEE.

  5. Random Decrement

    DEFF Research Database (Denmark)

    Asmussen, J.C.; Ibrahim, S.R.; Brincker, Rune

    Abstract: This paper demonstrates how to use the Random Decrement (RD) technique for identification of linear structures subjected to ambient excitation. The theory behind the technique will be presented and guidelines on how to choose the different variables will be given. This is done by introducing...
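
    A minimal sketch of the RD idea with a level-crossing triggering condition, applied to a simulated ambient response; the oscillator, trigger level, and segment length are all invented for illustration.

        import numpy as np

        def random_decrement(x, level, seg_len):
            # Average segments of x starting wherever x crosses `level` upward
            trig = np.where((x[:-1] < level) & (x[1:] >= level))[0] + 1
            trig = trig[trig + seg_len <= len(x)]
            segs = np.array([x[t:t + seg_len] for t in trig])
            return segs.mean(axis=0), len(trig)

        # Toy ambient response: lightly damped oscillator driven by white noise
        rng = np.random.default_rng(15)
        fs, f0, zeta = 100.0, 2.0, 0.02
        w0 = 2 * np.pi * f0
        x = np.zeros(200000); v = 0.0
        for t in range(1, len(x)):
            a = -2 * zeta * w0 * v - w0**2 * x[t - 1] + 50 * rng.standard_normal()
            v += a / fs
            x[t] = x[t - 1] + v / fs

        rd, n_trig = random_decrement(x, level=x.std(), seg_len=500)
        print(n_trig, "triggering points; RD signature length", len(rd))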

  6. Predicting disease risks from highly imbalanced data using random forest

    Directory of Open Access Journals (Sweden)

    Chakraborty Sounak

    2011-07-01

    Full Text Available Abstract Background We present a method utilizing the Healthcare Cost and Utilization Project (HCUP) dataset for predicting disease risk of individuals based on their medical diagnosis history. The presented methodology may be incorporated in a variety of applications such as risk management, tailored health communication and decision support systems in healthcare. Methods We employed the National Inpatient Sample (NIS) data, which is publicly available through the Healthcare Cost and Utilization Project (HCUP), to train random forest classifiers for disease prediction. Since the HCUP data is highly imbalanced, we employed an ensemble learning approach based on repeated random sub-sampling. This technique divides the training data into multiple sub-samples, while ensuring that each sub-sample is fully balanced. We compared the performance of support vector machine (SVM), bagging, boosting and RF to predict the risk of eight chronic diseases. Results We predicted eight disease categories. Overall, the RF ensemble learning method outperformed SVM, bagging and boosting in terms of the area under the receiver operating characteristic (ROC) curve (AUC). In addition, RF has the advantage of computing the importance of each variable in the classification process. Conclusions In combining repeated random sub-sampling with RF, we were able to overcome the class imbalance problem and achieve promising results. Using the national HCUP data set, we predicted eight disease categories with an average AUC of 88.79%.
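
    The repeated random sub-sampling ensemble can be sketched as below; scikit-learn's RandomForestClassifier is used here as a stand-in (the paper does not specify this implementation), and the data are synthetic.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        rng = np.random.default_rng(12)
        X = rng.normal(size=(3000, 10))                           # synthetic features
        y = (X[:, 0] + rng.normal(0, 2, 3000) > 4.0).astype(int)  # rare positives

        pos, neg = np.where(y == 1)[0], np.where(y == 0)[0]
        models = []
        for _ in range(10):                   # repeated random sub-sampling
            sub_neg = rng.choice(neg, size=len(pos), replace=False)
            idx = np.r_[pos, sub_neg]         # each sub-sample is fully balanced
            m = RandomForestClassifier(n_estimators=100, random_state=0)
            m.fit(X[idx], y[idx])
            models.append(m)

        # Ensemble prediction: average the per-model class-1 probabilities
        proba = np.mean([m.predict_proba(X)[:, 1] for m in models], axis=0)
        print("mean predicted risk:", proba.mean())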

  7. Beyond simple small-angle X-ray scattering: developments in online complementary techniques and sample environments.

    Science.gov (United States)

    Bras, Wim; Koizumi, Satoshi; Terrill, Nicholas J

    2014-11-01

    Small- and wide-angle X-ray scattering (SAXS, WAXS) are standard tools in materials research. The simultaneous measurement of SAXS and WAXS data in time-resolved studies has gained popularity due to the complementary information obtained. Furthermore, the combination of these data with non X-ray based techniques, via either simultaneous or independent measurements, has advanced understanding of the driving forces that lead to the structures and morphologies of materials, which in turn give rise to their properties. The simultaneous measurement of different data regimes and types, using either X-rays or neutrons, and the desire to control parameters that initiate and control structural changes have led to greater demands on sample environments. Examples of developments in technique combinations and sample environment design are discussed, together with a brief speculation about promising future developments.

  8. Statistical sampling applied to the radiological characterization of historical waste

    Directory of Open Access Journals (Sweden)

    Zaffora Biagio

    2016-01-01

    Full Text Available The evaluation of the activity of radionuclides in radioactive waste is required for its disposal in final repositories. Easy-to-measure nuclides, like γ-emitters and high-energy X-rays, can be measured via non-destructive nuclear techniques from outside a waste package. Some radionuclides are difficult-to-measure (DTM from outside a package because they are α- or β-emitters. The present article discusses the application of linear regression, scaling factors (SF and the so-called “mean activity method” to estimate the activity of DTM nuclides on metallic waste produced at the European Organization for Nuclear Research (CERN. Various statistical sampling techniques including simple random sampling, systematic sampling, stratified and authoritative sampling are described and applied to 2 waste populations of activated copper cables. The bootstrap is introduced as a tool to estimate average activities and standard errors in waste characterization. The analysis of the DTM Ni-63 is used as an example. Experimental and theoretical values of SFs are calculated and compared. Guidelines for sampling historical waste using probabilistic and non-probabilistic sampling are finally given.
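
    As a hedged illustration of the bootstrap step mentioned above, the sketch below estimates an average activity and its standard error by resampling; the activity values are made-up placeholders, not measurement data:

    import numpy as np

    rng = np.random.default_rng(42)
    activities = np.array([0.8, 1.2, 0.5, 2.1, 0.9, 1.7, 0.6, 1.1])  # Bq/g, illustrative

    # Resample with replacement and collect the mean of each bootstrap replicate.
    boot_means = np.array([
        rng.choice(activities, size=activities.size, replace=True).mean()
        for _ in range(10_000)
    ])
    print(f"mean activity ~ {activities.mean():.2f}, "
          f"bootstrap SE ~ {boot_means.std(ddof=1):.2f}")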

  9. Methodology Series Module 5: Sampling Strategies.

    Science.gov (United States)

    Setia, Maninder Singh

    2016-01-01

    Once the research question and the research design have been finalised, it is important to select the appropriate sample for the study. The method by which the researcher selects the sample is the 'Sampling Method'. There are essentially two types of sampling methods: 1) probability sampling - based on chance events (such as random numbers, flipping a coin etc.); and 2) non-probability sampling - based on the researcher's choice or a population that is accessible and available. Some of the non-probability sampling methods are: purposive sampling, convenience sampling, or quota sampling. Random sampling method (such as simple random sample or stratified random sample) is a form of probability sampling. It is important to understand the different sampling methods used in clinical studies and mention this method clearly in the manuscript. The researcher should not misrepresent the sampling method in the manuscript (such as using the term 'random sample' when the researcher has used a convenience sample). The sampling method will depend on the research question. For instance, the researcher may want to understand an issue in greater detail for one particular population rather than worry about the 'generalizability' of these results. In such a scenario, the researcher may want to use 'purposive sampling' for the study.
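
    For illustration, a small sketch of two of the probability sampling methods named above, simple random sampling and stratified random sampling; the population, strata and sample sizes are invented:

    import numpy as np

    rng = np.random.default_rng(1)
    population = np.arange(1000)          # unit IDs
    strata = population % 4               # hypothetical 4-level stratum label

    # Simple random sample of 100 units without replacement.
    srs = rng.choice(population, size=100, replace=False)

    # Stratified random sample: proportional allocation within each stratum.
    stratified = np.concatenate([
        rng.choice(population[strata == s], size=25, replace=False)
        for s in np.unique(strata)
    ])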

  10. Methodology series module 5: Sampling strategies

    Directory of Open Access Journals (Sweden)

    Maninder Singh Setia

    2016-01-01

    Full Text Available Once the research question and the research design have been finalised, it is important to select the appropriate sample for the study. The method by which the researcher selects the sample is the 'Sampling Method'. There are essentially two types of sampling methods: 1) probability sampling – based on chance events (such as random numbers, flipping a coin etc.); and 2) non-probability sampling – based on the researcher's choice or a population that is accessible and available. Some of the non-probability sampling methods are: purposive sampling, convenience sampling, or quota sampling. Random sampling method (such as simple random sample or stratified random sample) is a form of probability sampling. It is important to understand the different sampling methods used in clinical studies and mention this method clearly in the manuscript. The researcher should not misrepresent the sampling method in the manuscript (such as using the term 'random sample' when the researcher has used a convenience sample). The sampling method will depend on the research question. For instance, the researcher may want to understand an issue in greater detail for one particular population rather than worry about the 'generalizability' of these results. In such a scenario, the researcher may want to use 'purposive sampling' for the study.

  11. Random Decrement

    DEFF Research Database (Denmark)

    Asmussen, J. C.; Ibrahim, S. R.; Brincker, Rune

    This paper demonstrates how to use the Random Decrement (RD) technique for identification of linear structures subjected to ambient excitation. The theory behind the technique will be presented and guidelines on how to choose the different variables will be given. This is done by introducing a new...

  12. Random Decrement

    DEFF Research Database (Denmark)

    Asmussen, J. C.; Ibrahim, R.; Brincker, Rune

    1998-01-01

    This paper demonstrates how to use the Random Decrement (RD) technique for identification of linear structures subjected to ambient excitation. The theory behind the technique will be presented and guidelines on how to choose the different variables will be given. This is done by introducing a new...

  13. Methodology Series Module 5: Sampling Strategies

    Science.gov (United States)

    Setia, Maninder Singh

    2016-01-01

    Once the research question and the research design have been finalised, it is important to select the appropriate sample for the study. The method by which the researcher selects the sample is the ‘Sampling Method’. There are essentially two types of sampling methods: 1) probability sampling – based on chance events (such as random numbers, flipping a coin etc.); and 2) non-probability sampling – based on the researcher's choice or a population that is accessible and available. Some of the non-probability sampling methods are: purposive sampling, convenience sampling, or quota sampling. Random sampling method (such as simple random sample or stratified random sample) is a form of probability sampling. It is important to understand the different sampling methods used in clinical studies and mention this method clearly in the manuscript. The researcher should not misrepresent the sampling method in the manuscript (such as using the term ‘random sample’ when the researcher has used a convenience sample). The sampling method will depend on the research question. For instance, the researcher may want to understand an issue in greater detail for one particular population rather than worry about the ‘generalizability’ of these results. In such a scenario, the researcher may want to use ‘purposive sampling’ for the study. PMID:27688438

  14. Diagnostic-test evaluation of immunoassays for anti-Toxoplasma gondii IgG antibodies in a random sample of Mexican population.

    Science.gov (United States)

    Caballero-Ortega, Heriberto; Castillo-Cruz, Rocío; Murieta, Sandra; Ortíz-Alegría, Luz Belinda; Calderón-Segura, Esther; Conde-Glez, Carlos J; Cañedo-Solares, Irma; Correa, Dolores

    2014-05-14

    There are few articles on evaluation of Toxoplasma gondii serological tests. Besides, commercially available tests are not always useful and are expensive for studies in the open population. The aim of this study was to evaluate in-house ELISA and western blot for IgG antibodies in a representative sample of people living in Mexico. Three hundred and five serum samples were randomly selected from two national seroepidemiological survey banks; they were taken from men and women of all ages and from all areas of the country. ELISA cut-off was established using the mean plus three standard deviations of negative samples. Western blots were analysed by two experienced technicians and positivity was established according to the presence of at least three diagnostic bands. A commercial ELISA kit was used as a third test. Two reference standards were built up: one using concordant results of two assays leaving the evaluated test out (OUT) and the other in which the evaluated test was included (IN) with at least two concordant results to define diagnosis. The lowest values of diagnostic parameters were obtained with the OUT reference standards: in-house ELISA had 96.9% sensitivity, 62.1% specificity, 49.6% PPV, 98.1% NPV and 71.8% accuracy, while western blot presented 81.8%, 89.7%, 84.0%, 88.2% and 86.6% values and the best kappa coefficient (0.72-0.82). The in-house ELISA is useful for screening people of Mexico, due to its high sensitivity, while western blot may be used to confirm diagnosis. These techniques might prove useful in other Latin American countries.
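
    As an aside, the diagnostic parameters reported above follow directly from a 2x2 confusion table; the sketch below computes them, with placeholder counts rather than the study's data:

    def diagnostic_metrics(tp, fp, fn, tn):
        # Standard definitions from true/false positives and negatives.
        return {
            "sensitivity": tp / (tp + fn),
            "specificity": tn / (tn + fp),
            "ppv": tp / (tp + fp),
            "npv": tn / (tn + fn),
            "accuracy": (tp + tn) / (tp + fp + fn + tn),
        }

    print(diagnostic_metrics(tp=90, fp=10, fn=5, tn=80))  # illustrative counts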

  15. Location and multi-depot vehicle routing for emergency vehicles using tour coverage and random sampling

    Directory of Open Access Journals (Sweden)

    Alireza Goli

    2015-09-01

    Full Text Available Distribution and optimum allocation of emergency resources are among the most important tasks which need to be accomplished during a crisis. When a natural disaster such as an earthquake, flood, etc. takes place, it is necessary to deliver rescue efforts as quickly as possible. Therefore, it is important to find the optimum location and distribution of emergency relief resources. When a natural disaster occurs, it is not possible to reach some damaged areas. In this paper, location and multi-depot vehicle routing for emergency vehicles using tour coverage and random sampling is investigated. In this study, there is no need to visit all the places, and some demand points receive their needs from the nearest possible location. The proposed method is implemented on randomly generated instances of different sizes. The preliminary results indicate that the proposed method was capable of reaching desirable solutions in a reasonable amount of time.

  16. Sampling Large Graphs for Anticipatory Analytics

    Science.gov (United States)

    2015-05-15

    Edwards, Lauren; Johnson, Luke; Milosavljevic, Maja; Gadepally, Vijay; Miller, Benjamin A.

    Random area sampling [8] is a "snowball" sampling method in which a set of random seed vertices are selected and the areas around them expanded… […] systems, greater human-in-the-loop involvement, or through complex algorithms. We are investigating the use of sampling to mitigate these challenges.
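
    A rough sketch of the snowball-style random area sampling described above; the adjacency-list graph representation, seed count and expansion depth are assumptions made for illustration:

    import random
    from collections import deque

    def random_area_sample(adj, n_seeds=3, depth=2, seed=0):
        """adj: dict mapping vertex -> list of neighbours."""
        rng = random.Random(seed)
        seeds = rng.sample(list(adj), n_seeds)   # random seed vertices
        sampled = set(seeds)
        frontier = deque((v, 0) for v in seeds)
        while frontier:                          # expand the area around each seed
            v, d = frontier.popleft()
            if d == depth:
                continue
            for u in adj[v]:
                if u not in sampled:
                    sampled.add(u)
                    frontier.append((u, d + 1))
        return sampled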

  17. Determination of trace elements in plant samples using XRF, PIXE and ICP-OES techniques

    International Nuclear Information System (INIS)

    Ahmed, Hassan Elzain Hassan

    2014-07-01

    The purpose of this study is to determine trace element concentrations (Ca, Cu, Cr, K, Fe, Mn, Sr, and Zn) in some Sudanese wild plants, namely Ziziphus Abyssinica and Grewia Tenax. X-ray fluorescence (XRF), particle-induced X-ray emission (PIXE) and inductively coupled plasma-optical emission spectroscopy (ICP-OES) techniques were used for element determination. A series of plant standard reference materials were used to check the reliability of the different employed techniques as well as to estimate possible factors for correcting the concentration of some elements that deviated significantly from their actual concentration. The results showed that XRF, PIXE and ICP-OES are equally competitive methods for measuring the Ca, K, Fe, Sr and Zn elements. ICP-OES, however, tends to be the more appropriate method for Cu determination in plant samples, while for Mn, PIXE and ICP-OES are advisable techniques rather than the XRF method. On the other hand, ICP-OES seems to be the superior technique over the PIXE and XRF methods for Cr and Ni determination in plant samples. The effect of geographical location on trace element concentrations in plants was examined by determining the elements in Grewia Tenax specimens collected from different locations. Most of the measured elements showed similarity, indicating no significant impact of location on the element contents. In addition, two plants from different genetic families, namely Ziziphus Spina Christi and Ziziphus Abyssinica, were collected from the same location and screened for their trace element content. It was found that there was no difference between the two plants for the Ca, K, Cu, Fe, and Sr elements. However, significant variations were observed for Mn and Zn concentrations, implying the possibility of using those two elements for plant taxonomy purposes. (Author)

  18. Acupuncture-Related Techniques for Psoriasis: A Systematic Review with Pairwise and Network Meta-Analyses of Randomized Controlled Trials.

    Science.gov (United States)

    Yeh, Mei-Ling; Ko, Shu-Hua; Wang, Mei-Hua; Chi, Ching-Chi; Chung, Yu-Chu

    2017-12-01

    There has been a large body of evidence on the pharmacological treatments for psoriasis, but whether nonpharmacological interventions are effective in managing psoriasis remains largely unclear. This systematic review conducted pairwise and network meta-analyses to determine the effects of acupuncture-related techniques on acupoint stimulation for the treatment of psoriasis and to determine the order of effectiveness of these remedies. This study searched the following databases from inception to March 15, 2016: Medline, PubMed, Cochrane Central Register of Controlled Trials, EBSCO (including Academic Search Premier, American Doctoral Dissertations, and CINAHL), Airiti Library, and China National Knowledge Infrastructure. Randomized controlled trials (RCTs) on the effects of acupuncture-related techniques on acupoint stimulation as an intervention for psoriasis were independently reviewed by two researchers. A total of 13 RCTs with 1,060 participants were included. The methodological quality of the included studies was not rigorous. Acupoint stimulation, compared with nonacupoint stimulation, had a significant treatment effect on psoriasis. However, the most common adverse events were thirst and dry mouth. Subgroup analysis further confirmed that the short-term treatment effect was superior to the long-term effect in treating psoriasis. Network meta-analysis identified that acupressure or acupoint catgut embedding, compared with medication, had a significant effect on improving psoriasis; acupressure was the most effective treatment. Acupuncture-related techniques could be considered as an alternative or adjuvant therapy for psoriasis in the short term, especially acupressure and acupoint catgut embedding. This study recommends further well-designed, methodologically rigorous, and more head-to-head randomized trials to explore the effects of acupuncture-related techniques for treating psoriasis.

  19. Honest Importance Sampling with Multiple Markov Chains.

    Science.gov (United States)

    Tan, Aixin; Doss, Hani; Hobert, James P

    2015-01-01

    Importance sampling is a classical Monte Carlo technique in which a random sample from one probability density, π1, is used to estimate an expectation with respect to another, π. The importance sampling estimator is strongly consistent and, as long as two simple moment conditions are satisfied, it obeys a central limit theorem (CLT). Moreover, there is a simple consistent estimator for the asymptotic variance in the CLT, which makes for routine computation of standard errors. Importance sampling can also be used in the Markov chain Monte Carlo (MCMC) context. Indeed, if the random sample from π1 is replaced by a Harris ergodic Markov chain with invariant density π1, then the resulting estimator remains strongly consistent. There is a price to be paid however, as the computation of standard errors becomes more complicated. First, the two simple moment conditions that guarantee a CLT in the iid case are not enough in the MCMC context. Second, even when a CLT does hold, the asymptotic variance has a complex form and is difficult to estimate consistently. In this paper, we explain how to use regenerative simulation to overcome these problems. Actually, we consider a more general set up, where we assume that Markov chain samples from several probability densities, π1, …, πk, are available. We construct multiple-chain importance sampling estimators for which we obtain a CLT based on regeneration. We show that if the Markov chains converge to their respective target distributions at a geometric rate, then under moment conditions similar to those required in the iid case, the MCMC-based importance sampling estimator obeys a CLT. Furthermore, because the CLT is based on a regenerative process, there is a simple consistent estimator of the asymptotic variance. We illustrate the method with two applications in Bayesian sensitivity analysis. The first concerns one-way random effects models under different priors. The second involves Bayesian variable
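
    A minimal sketch of the classical (iid) importance sampling estimator described in the opening sentences; the proposal π1, target π, and integrand are illustrative choices, not those of the paper:

    import numpy as np

    rng = np.random.default_rng(0)

    def norm_pdf(x, mu, sigma):
        # Gaussian density, used here for both proposal and target.
        return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

    x = rng.normal(0.0, 2.0, size=100_000)                # draws from proposal pi_1 = N(0, 2^2)
    w = norm_pdf(x, 1.0, 1.0) / norm_pdf(x, 0.0, 2.0)     # importance weights pi / pi_1
    estimate = np.mean(w * x**2)                          # strongly consistent for E_pi[X^2]
    print(estimate)                                       # true value: 1 + 1^2 = 2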

  20. Improved sampling and analysis of images in corneal confocal microscopy.

    Science.gov (United States)

    Schaldemose, E L; Fontain, F I; Karlsson, P; Nyengaard, J R

    2017-10-01

    Corneal confocal microscopy (CCM) is a noninvasive clinical method to analyse and quantify corneal nerve fibres in vivo. Although the CCM technique is in constant progress, there are methodological limitations in terms of sampling of images and objectivity of the nerve quantification. The aim of this study was to present a randomized sampling method of the CCM images and to develop an adjusted area-dependent image analysis. Furthermore, a manual nerve fibre analysis method was compared to a fully automated method. 23 idiopathic small-fibre neuropathy patients were investigated using CCM. Corneal nerve fibre length density (CNFL) and corneal nerve fibre branch density (CNBD) were determined in both a manual and automatic manner. Differences in CNFL and CNBD between (1) the randomized and the most common sampling method, (2) the adjusted and the unadjusted area and (3) the manual and automated quantification method were investigated. The CNFL values were significantly lower when using the randomized sampling method compared to the most common method (p = 0.01). There was not a statistically significant difference in the CNBD values between the randomized and the most common sampling method (p = 0.85). CNFL and CNBD values were increased when using the adjusted area compared to the standard area. Additionally, the study found a significant increase in the CNFL and CNBD values when using the manual method compared to the automatic method (p ≤ 0.001). The study demonstrated a significant difference in the CNFL values between the randomized and common sampling method indicating the importance of clear guidelines for the image sampling. The increase in CNFL and CNBD values when using the adjusted cornea area is not surprising. The observed increases in both CNFL and CNBD values when using the manual method of nerve quantification compared to the automatic method are consistent with earlier findings. This study underlines the importance of improving the analysis of the

  1. Sample application of sensitivity/uncertainty analysis techniques to a groundwater transport problem. National Low-Level Waste Management Program

    International Nuclear Information System (INIS)

    Seitz, R.R.; Rood, A.S.; Harris, G.A.; Maheras, S.J.; Kotecki, M.

    1991-06-01

    The primary objective of this document is to provide sample applications of selected sensitivity and uncertainty analysis techniques within the context of the radiological performance assessment process. These applications were drawn from the companion document Guidelines for Sensitivity and Uncertainty Analyses of Low-Level Radioactive Waste Performance Assessment Computer Codes (S. Maheras and M. Kotecki, DOE/LLW-100, 1990). Three techniques are illustrated in this document: one-factor-at-a-time (OFAT) analysis, fractional factorial design, and Latin hypercube sampling. The report also illustrates the differences in sensitivity and uncertainty analysis at the early and latter stages of the performance assessment process, and potential pitfalls that can be encountered when applying the techniques. The emphasis is on application of the techniques as opposed to the actual results, since the results are hypothetical and are not based on site-specific conditions
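
    Of the three techniques named above, Latin hypercube sampling is the easiest to show compactly; the sketch below is a generic implementation with invented dimensions, not the report's code:

    import numpy as np

    def latin_hypercube(n, d, rng):
        # One stratum per sample in each dimension, randomly permuted,
        # so every column has exactly one point in each of the n strata.
        u = (rng.random((n, d)) + np.arange(n)[:, None]) / n
        for j in range(d):
            u[:, j] = rng.permutation(u[:, j])
        return u  # points in [0, 1)^d

    samples = latin_hypercube(n=50, d=3, rng=np.random.default_rng(7))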

  2. Low-mass molecular dynamics simulation: A simple and generic technique to enhance configurational sampling

    Energy Technology Data Exchange (ETDEWEB)

    Pang, Yuan-Ping, E-mail: pang@mayo.edu

    2014-09-26

    Highlights: • Reducing atomic masses by 10-fold vastly improves sampling in MD simulations. • CLN025 folded in 4 of 10 × 0.5-μs MD simulations when masses were reduced by 10-fold. • CLN025 folded as early as 96.2 ns in 1 of the 4 simulations that captured folding. • CLN025 did not fold in 10 × 0.5-μs MD simulations when standard masses were used. • Low-mass MD simulation is a simple and generic sampling enhancement technique. - Abstract: CLN025 is one of the smallest fast-folding proteins. Until now it has not been reported that CLN025 can autonomously fold to its native conformation in a classical, all-atom, and isothermal–isobaric molecular dynamics (MD) simulation. This article reports the autonomous and repeated folding of CLN025 from a fully extended backbone conformation to its native conformation in explicit solvent in multiple 500-ns MD simulations at 277 K and 1 atm with the first folding event occurring as early as 66.1 ns. These simulations were accomplished by using AMBER forcefield derivatives with atomic masses reduced by 10-fold on Apple Mac Pros. By contrast, no folding event was observed when the simulations were repeated using the original AMBER forcefields of FF12SB and FF14SB. The results demonstrate that low-mass MD simulation is a simple and generic technique to enhance configurational sampling. This technique may propel autonomous folding of a wide range of miniature proteins in classical, all-atom, and isothermal–isobaric MD simulations performed on commodity computers—an important step forward in quantitative biology.

  3. Review of cleaning techniques and their effects on the chemical composition of foliar samples

    Energy Technology Data Exchange (ETDEWEB)

    Rossini Oliva, S.; Raitio, H.

    2003-07-01

    Chemical foliar analysis is a tool widely used to study tree nutrition and to monitor the impact and extent of air pollutants. This paper reviews a number of cleaning methods and the effects of cleaning on foliar chemistry. Cleaning may include mechanical techniques such as the use of dry or moistened tissues, shaking, blowing, and brushing, or various washing techniques with water or other solvents. Owing to the diversity of plant species, tissue differences, etc., there is no standard procedure for all kinds of samples. Analysis of uncleaned leaves is considered a good method for assessing the degree of air contamination, because it provides an estimate of the element content of the deposits on leaf surfaces, or when the analysis is aimed at investigating the transfer of elements along the food chain. Sample cleaning is recommended in order (1) to investigate the transfer rate of chemical elements from soil to plants, (2) to quantify the washoff of dry deposition from foliage and (3) to separate superficially absorbed and biomass-incorporated elements. Since there is no standard cleaning procedure for all kinds of samples and aims, it is advisable to conduct a pilot study in order to establish a cleaning procedure that provides reliable foliar data. (orig.)

  4. Detection of equine herpesvirus in horses with idiopathic keratoconjunctivitis and comparison of three sampling techniques.

    Science.gov (United States)

    Hollingsworth, Steven R; Pusterla, Nicola; Kass, Philip H; Good, Kathryn L; Brault, Stephanie A; Maggs, David J

    2015-09-01

    To determine the role of equine herpesvirus (EHV) in idiopathic keratoconjunctivitis in horses and to determine whether sample collection method affects detection of EHV DNA by quantitative polymerase chain reaction (qPCR). Twelve horses with idiopathic keratoconjunctivitis and six horses without signs of ophthalmic disease. Conjunctival swabs, corneal scrapings, and conjunctival biopsies were collected from 18 horses: 12 clinical cases with idiopathic keratoconjunctivitis and six euthanized controls. In horses with both eyes involved, the samples were taken from the eye judged to be more severely affected. Samples were tested with qPCR for EHV-1, EHV-2, EHV-4, and EHV-5 DNA. Quantity of EHV DNA and viral replicative activity were compared between the two populations and among the different sampling techniques; relative sensitivities of the sampling techniques were determined. Prevalence of EHV DNA as assessed by qPCR did not differ significantly between control horses and those with idiopathic keratoconjunctivitis. Sampling by conjunctival swab was more likely to yield viral DNA as assessed by qPCR than was conjunctival biopsy. EHV-1 and EHV-4 DNA were not detected in either normal or IKC-affected horses; EHV-2 DNA was detected in two of 12 affected horses but not in normal horses. EHV-5 DNA was commonly found in ophthalmically normal horses and horses with idiopathic keratoconjunctivitis. Because EHV-5 DNA was commonly found in control horses and in horses with idiopathic keratoconjunctivitis, qPCR was not useful for the etiological diagnosis of equine keratoconjunctivitis. Conjunctival swabs were significantly better at obtaining viral DNA samples than conjunctival biopsy in horses in which EHV-5 DNA was found. © 2015 American College of Veterinary Ophthalmologists.

  5. Parameters, test criteria and fault assessment in random sampling of waste barrels from non-qualified processes

    International Nuclear Information System (INIS)

    Martens, B.R.

    1989-01-01

    In the context of random sampling tests, the parameters checked on the waste barrels are described and the criteria on which these tests are based are given. Also, it is shown how faulty data on the properties of the waste or faulty waste barrels should be treated. To decide the extent of testing, the properties of the waste relevant to final storage are determined based on the conditioning process used. (DG)

  6. Chorionic villus sampling and amniocentesis.

    Science.gov (United States)

    Brambati, Bruno; Tului, Lucia

    2005-04-01

    The advantages and disadvantages of common invasive methods for prenatal diagnosis are presented in light of new investigations. Several aspects of first-trimester chorionic villus sampling and mid-trimester amniocentesis remain controversial, especially fetal loss rate, feto-maternal complications, and the extension of both sampling methods to less traditional gestational ages (early amniocentesis, late chorionic villus sampling), all of which complicate genetic counseling. A recent randomized trial involving early amniocentesis and late chorionic villus sampling has confirmed previous studies, leading to the unquestionable conclusion that transabdominal chorionic villus sampling is safer. The old dispute over whether limb reduction defects are caused by chorionic villus sampling gains new vigor, with a paper suggesting that this technique has distinctive teratogenic effects. The large experience involving maternal and fetal complications following mid-trimester amniocentesis allows a better estimate of risk for comparison with chorionic villus sampling. Transabdominal chorionic villus sampling, which appears to be the gold standard sampling method for genetic investigations between 10 and 15 completed weeks, permits rapid diagnosis in high-risk cases detected by first-trimester screening of aneuploidies. Sampling efficiency and karyotyping reliability are as high as in mid-trimester amniocentesis with fewer complications, provided the operator has the required training, skill and experience.

  7. Beyond simple small-angle X-ray scattering: developments in online complementary techniques and sample environments

    Directory of Open Access Journals (Sweden)

    Wim Bras

    2014-11-01

    Full Text Available Small- and wide-angle X-ray scattering (SAXS, WAXS) are standard tools in materials research. The simultaneous measurement of SAXS and WAXS data in time-resolved studies has gained popularity due to the complementary information obtained. Furthermore, the combination of these data with non X-ray based techniques, via either simultaneous or independent measurements, has advanced understanding of the driving forces that lead to the structures and morphologies of materials, which in turn give rise to their properties. The simultaneous measurement of different data regimes and types, using either X-rays or neutrons, and the desire to control parameters that initiate and control structural changes have led to greater demands on sample environments. Examples of developments in technique combinations and sample environment design are discussed, together with a brief speculation about promising future developments.

  8. Determination of trace element contents in grass samples for cattle feeding using NAA techniques

    Energy Technology Data Exchange (ETDEWEB)

    Yusof, Alias Mohamad; Jagir Singh, Jasbir Kaur

    1987-09-01

    An investigation of the trace element contents in six types of grass samples used for cattle feeding has been carried out using NAA techniques. The grass samples, Mardi Digit, African Star, Signal, Guinea, Setaria and Setaria Splendida, were found to contain at least 19 trace elements in varying concentrations. The results were compared with figures obtained from available sources to ascertain whether the grass samples studied would satisfy the minimum requirements for trace elements present in grass for cattle feeding. Preference regarding the suitability of the grass samples for cattle feeding was based on the availability and abundance of the trace elements, taking into account factors such as the degree of toxicity, inadequate amounts, and contamination due to the presence of other trace elements not essential for cattle feeding.

  9. Determination of trace element contents in grass samples for cattle feeding using NAA techniques

    International Nuclear Information System (INIS)

    Alias Mohamad Yusof; Jasbir Kaur Jagir Singh

    1987-01-01

    An investigation of the trace element contents in six types of grass samples used for cattle feeding has been carried out using NAA techniques. The grass samples, Mardi Digit, African Star, Signal, Guinea, Setaria and Setaria Splendida, were found to contain at least 19 trace elements in varying concentrations. The results were compared with figures obtained from available sources to ascertain whether the grass samples studied would satisfy the minimum requirements for trace elements present in grass for cattle feeding. Preference regarding the suitability of the grass samples for cattle feeding was based on the availability and abundance of the trace elements, taking into account factors such as the degree of toxicity, inadequate amounts, and contamination due to the presence of other trace elements not essential for cattle feeding. (author)

  10. Efficiency of analytical and sampling-based uncertainty propagation in intensity-modulated proton therapy

    Science.gov (United States)

    Wahl, N.; Hennig, P.; Wieser, H. P.; Bangert, M.

    2017-07-01

    The sensitivity of intensity-modulated proton therapy (IMPT) treatment plans to uncertainties can be quantified and mitigated with robust/min-max and stochastic/probabilistic treatment analysis and optimization techniques. Those methods usually rely on sparse random, importance, or worst-case sampling. Inevitably, this imposes a trade-off between computational speed and accuracy of the uncertainty propagation. Here, we investigate analytical probabilistic modeling (APM) as an alternative for uncertainty propagation and minimization in IMPT that does not rely on scenario sampling. APM propagates probability distributions over range and setup uncertainties via a Gaussian pencil-beam approximation into moments of the probability distributions over the resulting dose in closed form. It supports arbitrary correlation models and allows for efficient incorporation of fractionation effects regarding random and systematic errors. We evaluate the trade-off between run-time and accuracy of APM uncertainty computations on three patient datasets. Results are compared against reference computations employing importance and random sampling. Two approximation techniques to accelerate uncertainty propagation and minimization based on probabilistic treatment plan optimization are presented. Runtimes are measured on CPU and GPU platforms, and dosimetric accuracy is quantified in comparison to a sampling-based benchmark (5000 random samples). APM accurately propagates range and setup uncertainties into dose uncertainties at competitive run-times (GPU ≤ 5 min). The resulting standard deviation (expectation value) of the dose shows average global γ(3%/3 mm) pass rates between 94.2% and 99.9% (98.4% and 100.0%). All investigated importance sampling strategies provided less accuracy at higher run-times considering only a single fraction. Considering fractionation, APM uncertainty propagation and treatment plan optimization was proven to be possible at constant time complexity
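
    A toy sketch of the sampling-based benchmark idea above (random scenario sampling propagated through a dose model); the Gaussian "dose model" here is a stand-in assumption, not IMPT physics:

    import numpy as np

    rng = np.random.default_rng(0)

    def dose(setup_shift_mm):
        # Placeholder dose response to a setup shift; illustrative only.
        return 60.0 * np.exp(-0.5 * (setup_shift_mm / 5.0) ** 2)

    shifts = rng.normal(loc=0.0, scale=2.0, size=5000)   # sampled setup uncertainty
    doses = dose(shifts)
    print(f"E[dose] = {doses.mean():.2f} Gy, SD = {doses.std(ddof=1):.2f} Gy")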

  11. Effects of modulation techniques on the input current interharmonics of Adjustable Speed Drives

    DEFF Research Database (Denmark)

    Soltani, Hamid; Davari, Pooya; Zare, Firuz

    2018-01-01

    operation of the grid. This paper presents the effect of the symmetrical regularly sampled Space Vector Modulation (SVM) and Discontinuous Pulse Width Modulation-30° lag (DPWM2) techniques, as the most popular modulation methods in ASD applications, on the drive's input current interharmonic magnitudes. Further investigations are also devoted to the cases where the Random Modulation (RM) technique is applied on top of the selected modulation strategies. The comparative results show how different modulation techniques can influence the ASD's input current interharmonics and consequently may...

  12. Performance of medical residents in sterile techniques during central vein catheterization: randomized trial of efficacy of simulation-based training.

    Science.gov (United States)

    Khouli, Hassan; Jahnes, Katherine; Shapiro, Janet; Rose, Keith; Mathew, Joseph; Gohil, Amit; Han, Qifa; Sotelo, Andre; Jones, James; Aqeel, Adnan; Eden, Edward; Fried, Ethan

    2011-01-01

    Catheter-related bloodstream infection (CRBSI) is a preventable cause of a potentially lethal ICU infection. The optimal method to teach health-care providers correct sterile techniques during central vein catheterization (CVC) remains unclear. We randomly assigned second- and third-year internal medicine residents trained by a traditional apprenticeship model to simulation-based plus video training or video training alone from December 2007 to January 2008, with a follow-up period to examine CRBSI ending in July 2009. During the follow-up period, a simulation-based training program in sterile techniques during CVC was implemented in the medical ICU (MICU). A surgical ICU (SICU) where no residents received study interventions was used for comparison. The primary outcome measures were median residents' scores in sterile techniques and rates of CRBSI per 1,000 catheter-days. Of the 47 enrolled residents, 24 were randomly assigned to the simulation-based plus video training group and 23 to the video training group. Median baseline scores in both groups were equally poor: 12.5 to 13 (52%-54%) out of a maximum score of 24 (P = .95; median difference, 0; 95% CI, 0.2-2.0). After training, the median score was significantly higher for the simulation-based plus video training group: 22 (92%) vs 18 (75%) for the video training group (P …). Simulation-based training in sterile techniques during CVC is superior to traditional training or video training alone and is associated with a decreased rate of CRBSI. Simulation-based training in CVC should be routinely used to reduce iatrogenic risk. ClinicalTrials.gov; No.: NCT00612131; URL: clinicaltrials.gov.

  13. Coordination of Conditional Poisson Samples

    Directory of Open Access Journals (Sweden)

    Grafström Anton

    2015-12-01

    Full Text Available Sample coordination seeks to maximize or to minimize the overlap of two or more samples. The former is known as positive coordination, and the latter as negative coordination. Positive coordination is mainly used for estimation purposes and to reduce data collection costs. Negative coordination is mainly performed to diminish the response burden of the sampled units. Poisson sampling design with permanent random numbers provides an optimum coordination degree of two or more samples. The size of a Poisson sample is, however, random. Conditional Poisson (CP) sampling is a modification of the classical Poisson sampling that produces a fixed-size πps sample. We introduce two methods to coordinate Conditional Poisson samples over time or simultaneously. The first one uses permanent random numbers and the list-sequential implementation of CP sampling. The second method uses a CP sample in the first selection and provides an approximate one in the second selection because the prescribed inclusion probabilities are not respected exactly. The methods are evaluated using the size of the expected sample overlap, and are compared with their competitors using Monte Carlo simulation. The new methods provide a good coordination degree of two samples, close to the performance of Poisson sampling with permanent random numbers.
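
    A short sketch of Poisson sampling coordinated by permanent random numbers (PRNs), the baseline the paper builds on: a unit is sampled when its PRN falls below its inclusion probability, so reusing the PRNs positively coordinates two samples. Population size and inclusion probabilities are invented; the fixed-size CP modification is only noted in a comment:

    import numpy as np

    rng = np.random.default_rng(3)
    N = 1000
    prn = rng.random(N)                    # permanent random numbers, one per unit
    pi_t1 = np.full(N, 0.10)               # inclusion probabilities, wave 1
    pi_t2 = np.full(N, 0.12)               # inclusion probabilities, wave 2

    sample_t1 = np.flatnonzero(prn < pi_t1)
    sample_t2 = np.flatnonzero(prn < pi_t2)   # here contains all of sample_t1
    # Note: the realised sample size is random; Conditional Poisson sampling
    # (as in the paper) modifies this design to produce a fixed-size sample.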

  14. Trace uranium analysis in Indian coal samples using the fission track technique

    International Nuclear Information System (INIS)

    Jojo, P.J.; Rawat, A.; Kumar, Ashavani; Prasad, Rajendra

    1993-01-01

    The ever-growing demand for energy has resulted in the extensive use of fossil fuels, especially coal, for power generation. Coal and its by-products often contain significant amounts of radionuclides, including uranium, which is the ultimate source of the radioactive gas Radon-222. The present study gives the concentration of uranium in coal samples of different collieries in India, collected from various thermal power plants in the state of Uttar Pradesh. The estimates were made using the fission track technique. Latent damage tracks were not found to be uniformly distributed but showed sun bursts and clusters. Non-uniform distributions of trace elements are a very common phenomenon in rocks. The levels of uranium in the coal samples were found to vary from 2.0 to 4.9 ppm in uniform distributions and from 21.3 to 41.0 ppm in non-uniform distributions. Measurements were also made on fly ash samples where the average uranium concentration was found to be 8.4 and 49.3 ppm in uniform and non-uniform distributions, respectively. (author)

  15. Immediate effects of the suboccipital muscle inhibition technique in craniocervical posture and greater occipital nerve mechanosensitivity in subjects with a history of orthodontia use: a randomized trial.

    Science.gov (United States)

    Heredia Rizo, Alberto M; Pascual-Vaca, Ángel Oliva; Cabello, Manuel Albornoz; Blanco, Cleofás Rodríguez; Pozo, Fernando Piña; Carrasco, Antonio Luque

    2012-07-01

    The purpose of this study was to measure the immediate differences in craniocervical posture and pressure pain threshold of the greater occipital (GO) nerve in asymptomatic subjects with a history of having used orthodontics, after intervention by a suboccipital muscle inhibition (SMI) technique. This was a randomized, single-blind, clinical study with a sample of 24 subjects (21±1.78 years) that were divided into an experimental group (n=12) who underwent the SMI technique and a sham group (n=12) who underwent a sham (placebo) intervention. The sitting and standing craniovertebral angle and the pressure pain threshold of the GO nerve in both hemispheres were measured. The between-group comparison of the sample indicated that individuals subjected to the SMI technique showed a statistically significant increase in the craniovertebral angle in both the sitting (P<.001, F1,22=102.09, R2=0.82) and the standing (P<.001, F1,22=21.42, R2=0.56) positions and in the GO nerve pressure pain threshold in the nondominant hemisphere (P=.014, F1,22=7.06, R2=0.24). There were no statistically significant differences observed for the GO nerve mechanosensitivity in the dominant side (P=.202). Suboccipital muscle inhibition technique immediately improved the position of the head with the subject seated and standing, the clinical effect size being large in the former case. It also immediately decreased the mechanosensitivity of the GO nerve in the nondominant hemisphere, although the effect size was small. Copyright © 2012 National University of Health Sciences. Published by Mosby, Inc. All rights reserved.

  16. Randomized Comparison of Two Vaginal Self-Sampling Methods for Human Papillomavirus Detection: Dry Swab versus FTA Cartridge.

    Science.gov (United States)

    Catarino, Rosa; Vassilakos, Pierre; Bilancioni, Aline; Vanden Eynde, Mathieu; Meyer-Hamme, Ulrike; Menoud, Pierre-Alain; Guerry, Frédéric; Petignat, Patrick

    2015-01-01

    Human papillomavirus (HPV) self-sampling (self-HPV) is valuable in cervical cancer screening. HPV testing is usually performed on physician-collected cervical smears stored in liquid-based medium. Dry filters and swabs are an alternative. We evaluated the adequacy of self-HPV using two dry storage and transport devices, the FTA cartridge and swab. A total of 130 women performed two consecutive self-HPV samples. Randomization determined which of the two tests was performed first: self-HPV using dry swabs (s-DRY) or vaginal specimen collection using a cytobrush applied to an FTA cartridge (s-FTA). After self-HPV, a physician collected a cervical sample using liquid-based medium (Dr-WET). HPV types were identified by real-time PCR. Agreement between collection methods was measured using the kappa statistic. HPV prevalence for high-risk types was 62.3% (95%CI: 53.7-70.2) detected by s-DRY, 56.2% (95%CI: 47.6-64.4) by Dr-WET, and 54.6% (95%CI: 46.1-62.9) by s-FTA. There was overall agreement of 70.8% between s-FTA and s-DRY samples (kappa = 0.34), and of 82.3% between self-HPV and Dr-WET samples (kappa = 0.56). Detection sensitivities for low-grade squamous intraepithelial lesion or worse (LSIL+) were: 64.0% (95%CI: 44.5-79.8) for s-FTA, 84.6% (95%CI: 66.5-93.9) for s-DRY, and 76.9% (95%CI: 58.0-89.0) for Dr-WET. The preferred self-collection method among patients was s-DRY (40.8% vs. 15.4%). Regarding costs, the FTA card was five times more expensive than the swab (~5 US dollars (USD) per card vs. ~1 USD per swab). Self-HPV using dry swabs is sensitive for detecting LSIL+ and less expensive than s-FTA. International Standard Randomized Controlled Trial Number (ISRCTN): 43310942.

  17. Single-particle characterization of ice-nucleating particles and ice particles residuals sampled by three different techniques

    Science.gov (United States)

    Kandler, Konrad; Worringen, Annette; Benker, Nathalie; Dirsch, Thomas; Mertes, Stephan; Schenk, Ludwig; Kästner, Udo; Frank, Fabian; Nillius, Björn; Bundke, Ulrich; Rose, Diana; Curtius, Joachim; Kupiszewski, Piotr; Weingartner, Ernest; Vochezer, Paul; Schneider, Johannes; Schmidt, Susan; Weinbruch, Stephan; Ebert, Martin

    2015-04-01

    During January/February 2013, at the High Alpine Research Station Jungfraujoch a measurement campaign was carried out, which was centered on atmospheric ice-nucleating particles (INP) and ice particle residuals (IPR). Three different techniques for separation of INP and IPR from the non-ice-active particles are compared. The Ice Selective Inlet (ISI) and the Ice Counterflow Virtual Impactor (Ice-CVI) sample ice particles from mixed-phase clouds and allow for the analysis of the residuals. The combination of the Fast Ice Nucleus Chamber (FINCH) and the Ice Nuclei Pumped Counterflow Virtual Impactor (IN-PCVI) provides ice-activating conditions to aerosol particles and extracts the activated INP for analysis. Collected particles were analyzed by scanning electron microscopy and energy-dispersive X-ray microanalysis to determine size, chemical composition and mixing state. All INP/IPR-separating techniques had considerable abundances (median 20 - 70 %) of instrumental contamination artifacts (ISI: Si-O spheres, probably calibration aerosol; Ice-CVI: Al-O particles; FINCH+IN-PCVI: steel particles). Also, potential sampling artifacts (e.g., pure soluble material) occurred with a median abundance of […] separated by all three techniques. Soot was a minor contributor. Lead was detected in less than 10 % of the particles, of which the majority were internal mixtures with other particle types. Sea-salt and sulfates were identified by all three methods as INP/IPR. Most samples showed a maximum of the INP/IPR size distribution at 400 nm geometric diameter. In a few cases, a second super-micron maximum was identified. Soot/carbonaceous material and metal oxides were present mainly in the submicron range. ISI and FINCH yielded silicates and Ca-rich particles mainly with diameters above 1 µm, while the Ice-CVI also separated many submicron IPR. As strictly parallel sampling could not be performed, a part of the discrepancies between the different techniques may result from

  18. Estimation of Correlation Functions by Random Decrement

    DEFF Research Database (Denmark)

    Asmussen, J. C.; Brincker, Rune

    This paper illustrates how correlation functions can be estimated by the random decrement technique. Several different formulations of the random decrement technique, estimating the correlation functions are considered. The speed and accuracy of the different formulations of the random decrement...... and the length of the correlation functions. The accuracy of the estimates with respect to the theoretical correlation functions and the modal parameters are both investigated. The modal parameters are extracted from the correlation functions using the polyreference time domain technique....
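
    A rough sketch of one common random decrement formulation (level-crossing triggering followed by segment averaging); the signal, trigger level and segment length are illustrative assumptions:

    import numpy as np

    def random_decrement(x, trigger, length):
        # Average all segments that start where the signal up-crosses the trigger.
        starts = np.flatnonzero((x[:-1] < trigger) & (x[1:] >= trigger))
        starts = starts[starts + length <= x.size]
        segments = np.stack([x[s:s + length] for s in starts])
        return segments.mean(axis=0)   # proportional to the correlation function

    rng = np.random.default_rng(0)
    x = rng.standard_normal(100_000)          # stand-in for an ambient response record
    rd = random_decrement(x, trigger=x.std(), length=256)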

  19. Investigating the Randomness of Numbers

    Science.gov (United States)

    Pendleton, Kenn L.

    2009-01-01

    The use of random numbers is pervasive in today's world. Random numbers have practical applications in such far-flung arenas as computer simulations, cryptography, gambling, the legal system, statistical sampling, and even the war on terrorism. Evaluating the randomness of extremely large samples is a complex, intricate process. However, the…

  20. LVQ-SMOTE - Learning Vector Quantization based Synthetic Minority Over-sampling Technique for biomedical data.

    Science.gov (United States)

    Nakamura, Munehiro; Kajiwara, Yusuke; Otsuka, Atsushi; Kimura, Haruhiko

    2013-10-02

    Over-sampling methods based on the Synthetic Minority Over-sampling Technique (SMOTE) have been proposed for classification problems of imbalanced biomedical data. However, the existing over-sampling methods achieve slightly better or sometimes worse results than the simplest SMOTE. In order to improve the effectiveness of SMOTE, this paper presents a novel over-sampling method using codebooks obtained by learning vector quantization. In general, even when an existing SMOTE variant is applied to a biomedical dataset, its empty feature space is still so huge that most classification algorithms would not perform well on estimating borderlines between classes. To tackle this problem, our over-sampling method generates synthetic samples which occupy more feature space than the other SMOTE algorithms. Briefly, our over-sampling method generates useful synthetic samples by referring to actual samples taken from real-world datasets. Experiments on eight real-world imbalanced datasets demonstrate that our proposed over-sampling method performs better than the simplest SMOTE on four of five standard classification algorithms. Moreover, the performance of our method increases if the latest SMOTE variant, called MWMOTE, is used in our algorithm. Experiments on datasets for β-turn type prediction show some important patterns that have not been seen in previous analyses. The proposed over-sampling method generates useful synthetic samples for the classification of imbalanced biomedical data. Besides, the proposed over-sampling method is basically compatible with basic classification algorithms and the existing over-sampling methods.
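
    A bare-bones sketch of the SMOTE-style interpolation that the paper builds on (not the proposed LVQ codebook method): each synthetic minority sample lies on the segment between a minority point and one of its k nearest minority neighbours; parameters are illustrative:

    import numpy as np
    from sklearn.neighbors import NearestNeighbors

    def smote(X_min, n_new, k=5, seed=0):
        rng = np.random.default_rng(seed)
        nn = NearestNeighbors(n_neighbors=k + 1).fit(X_min)
        _, idx = nn.kneighbors(X_min)        # idx[:, 0] is the point itself
        out = []
        for _ in range(n_new):
            i = rng.integers(len(X_min))     # pick a minority point at random
            j = idx[i, rng.integers(1, k + 1)]  # pick one of its k neighbours
            lam = rng.random()               # interpolate between the two points
            out.append(X_min[i] + lam * (X_min[j] - X_min[i]))
        return np.array(out)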

  1. Method and apparstus for determining random coincidence count rate in a scintillation counter utilizing the coincidence technique

    International Nuclear Information System (INIS)

    Horrocks, D.L.

    1980-01-01

    A method and apparatus for the reliable determination of a random coincidence count attributable to chance coincidences of single-photon events which are each detected in only a single detector of a scintillation counter utilizing two detectors in a coincidence counting technique are described. A first delay device is employed to delay output pulses from one detector, and then the delayed signal is compared with the undelayed signal from the other detector in a coincidence circuit, to obtain an approximate random coincidence count. The output of the coincidence circuit is applied to an anti-coincidence circuit, where it is corrected by elimination of pulses coincident with, and attributable to, conventionally detected real coincidences, and by elimination of pulses coincident with, and attributable to, real coincidences that have been delayed by a second delay device having the same time parameter as the first. 8 claims
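
    To illustrate the delay method described above, the sketch below simulates two uncorrelated pulse trains and counts delayed coincidences, which approximate the chance-coincidence rate 2·τ·N1·N2; all rates and the resolving time are invented:

    import numpy as np

    rng = np.random.default_rng(0)
    T, tau = 100.0, 1e-6                      # counting time (s), resolving time (s)
    t1 = np.sort(rng.uniform(0, T, 50_000))   # singles in detector 1
    t2 = np.sort(rng.uniform(0, T, 40_000))   # singles in detector 2

    def coincidences(a, b, tau):
        # Count a-pulses with a b-pulse within +/- tau, via nearest neighbours.
        j = np.searchsorted(b, a)
        near = np.minimum(np.abs(a - b[np.clip(j, 0, b.size - 1)]),
                          np.abs(a - b[np.clip(j - 1, 0, b.size - 1)]))
        return np.count_nonzero(near <= tau)

    delayed = coincidences(t1 + 5e-5, t2, tau)    # delay >> tau: chance events only
    print(delayed / T, "~ 2*tau*N1*N2 =", 2 * tau * (50_000 / T) * (40_000 / T))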

  2. Extraction of Plutonium From Spiked INEEL Soil Samples Using the Ligand-Assisted Supercritical Fluid Extraction (LA-SFE) Technique

    International Nuclear Information System (INIS)

    Fox, R.V.; Mincher, B.J.; Holmes, R.G.G.

    1999-01-01

    In order to investigate the effectiveness of ligand-assisted supercritical fluid extraction for the removal of transuranic contamination from soils, an Idaho National Engineering and Environmental Laboratory (INEEL) silty-clay soil sample was obtained from near the Radioactive Waste Management Complex area and subjected to three different chemical preparations before being spiked with plutonium. The spiked INEEL soil samples were subjected to a sequential aqueous extraction procedure to determine radionuclide partitioning in each sample. Results from those extractions demonstrate that plutonium consistently partitioned into the residual fraction across all three INEEL soil preparations, whereas americium partitioned 73% into the iron/manganese fraction for soil preparation A, with the balance partitioning into the residual fraction. Plutonium and americium were extracted from the INEEL soil samples using the ligand-assisted supercritical fluid extraction technique. Initial supercritical fluid extraction runs produced plutonium extraction efficiencies ranging from 14% to 19%. After a second round wherein the initial extraction parameters were changed, the plutonium extraction efficiencies increased to 60% and as high as 80%, with the americium level in the post-extracted soil samples dropping near to the detection limits. The third round of experiments is currently underway. These results demonstrate that the ligand-assisted supercritical fluid extraction technique can effectively extract plutonium from the spiked INEEL soil preparations

  3. THE USE OF SKIT TECHNIQUE TO INCREASE STUDENTS’ SELF-CONFIDENCE IN SPEAKING

    Directory of Open Access Journals (Sweden)

    Dini Deswarni

    2017-12-01

    The study aimed to investigate the use of the Skit Technique to increase students' self-confidence in speaking English. Speaking is one of the important skills in English, and self-confidence is needed to speak it: a person needs self-confidence to express their ideas and feelings in a foreign language, especially English. Not all people have good self-confidence in using English to express or share their ideas and opinions. The aim of this study is to determine whether the Skit Technique can increase self-confidence in speaking English; in other words, whether the Skit Technique can help to increase self-confidence in speaking English. The design of this research is experimental. The population of the research is 46 students in the eighth grade of MTs Darul Ihsan Duri. The sampling technique is random sampling. From the data, it can be seen that Ttable < Tvalue (2.101 < 12.466). It means Ha is accepted and the Skit Technique was effective in the experimental class.

  4. Effectiveness of passive stretching versus hold relax technique in flexibility of hamstring muscle

    Directory of Open Access Journals (Sweden)

    Gauri Shankar

    2010-10-01

    Full Text Available Aim: To compare the effectiveness of passive stretching and the hold-relax technique on the flexibility of the hamstring muscle. Methods: A total of 80 normal healthy female subjects between the age group of 20-30 years were referred to the department of physiotherapy, Sumandeep Vidyapeeth University, the sampling method being convenience sampling. The subjects were randomly divided into two groups, i.e. a passive stretching group (n=40) and a PNF group (n=40), and given passive stretching and the proprioceptive neuromuscular facilitation technique, respectively. Active knee extension range was measured before and after the intervention by goniometer. Results: The t test showed a highly significant (p=0.000) increase in range of motion in the PNF group. Conclusion: The proprioceptive neuromuscular facilitation technique is more effective in increasing hamstring flexibility than passive stretching.

  5. A novel in-situ sampling and VFA sensor technique for anaerobic systems

    DEFF Research Database (Denmark)

    Pind, Peter Frode; Angelidaki, Irini; Ahring, Birgitte Kiær

    2002-01-01

    A key piece of information for understanding and controlling the anaerobic biogas process is the concentration of Volatile Fatty Acids (VFA). However, access to this information has so far been limited to off-line measurements by manual, time- and labour-consuming methods. We have developed a new technique ... than 1000 samples on both a full-scale biogas plant and lab-scale reactors. The measuring range covers specific measurements of acetate, propionate, iso-/n-butyrate and iso-/n-valerate from 0.1 to 50 mM (6–3,000 mg).

  6. An Electromagnetic Gauge Technique for Measuring Shocked Particle Velocity in Electrically Conductive Samples

    Science.gov (United States)

    Cheng, David; Yoshinaka, Akio

    2014-11-01

    Electromagnetic velocity (EMV) gauges are a class of film gauges which permit the direct in-situ measurement of shocked material flow velocity. The active sensing element, typically a metallic foil, requires exposure to a known external magnetic field in order to produce motional electromotive force (emf). Due to signal distortion caused by mutual inductance between sample and EMV gauge, this technique is typically limited to shock waves in non-conductive materials. In conductive samples, motional emf generated in the EMV gauge has to be extracted from the measured signal which results from the combined effects of both motional emf and voltage changes from induced currents. An electromagnetic technique is presented which analytically models the dynamics of induced current between a copper disk moving as a rigid body with constant 1D translational velocity toward an EMV gauge, where both disk and gauge are exposed to a uniform external static magnetic field. The disk is modelled as a magnetic dipole loop where its Foucault current is evaluated from the characteristics of the fields, whereas the EMV gauge is modelled as a circuit loop immersed in the field of the magnetic dipole loop, the intensity of which is calculated as a function of space and, implicitly, time. Equations of mutual induction are derived and the current induced in the EMV gauge loop is solved, allowing discrimination of the motional emf. Numerical analysis is provided for the step response of the induced EMV gauge current with respect to the Foucault current in the moving copper sample.

  7. Active sampling technique to enhance chemical signature of buried explosives

    Science.gov (United States)

    Lovell, John S.; French, Patrick D.

    2004-09-01

    Deminers and dismounted countermine engineers commonly use metal detectors, ground penetrating radar and probes to locate mines. Many modern landmines have a very low metal content, which severely limits the effectiveness of metal detectors. Canines have also been used for landmine detection for decades. Experiments have shown that canines smell the explosives which are known to leak from most types of landmines. The fact that dogs can detect landmines indicates that vapor sensing is a viable approach to landmine detection. Several groups are currently developing systems to detect landmines by "sniffing" for the ultra-trace explosive vapors above the soil. The amount of material available to passive vapor sensing systems is limited to no more than the vapor in equilibrium with the explosive-related chemicals (ERCs) distributed in the surface soils over and near the landmine. The low equilibrium vapor pressure of TNT in the soil/atmosphere boundary layer and the limited volume of the boundary-layer air imply that passive chemical vapor sensing systems require sensitivities in the picogram range or lower. ADA is working to overcome many of the limitations of passive sampling methods by using an active sampling method that employs a high-powered (1,200+ joule) strobe lamp to create a highly amplified plume of vapor and/or ERC-bearing fine particulates. Initial investigations have demonstrated that this approach can amplify the detectability of TNT by two to three orders of magnitude. This new active sampling technique could be used with any suitable explosive sensor.

  8. Characteristics of men with substance use disorder consequent to illicit drug use: comparison of a random sample and volunteers.

    Science.gov (United States)

    Reynolds, Maureen D; Tarter, Ralph E; Kirisci, Levent

    2004-09-06

    Men qualifying for substance use disorder (SUD) consequent to consumption of an illicit drug were compared according to recruitment method. It was hypothesized that volunteers would be more self-disclosing and exhibit more severe disturbances compared to randomly recruited subjects. Personal, demographic, family, social, substance use, psychiatric, and SUD characteristics of volunteers (N = 146) were compared to randomly recruited (N = 102) subjects. Volunteers had lower socioeconomic status, were more likely to be African American, and had lower IQ than randomly recruited subjects. Volunteers also evidenced greater social and family maladjustment and more frequently had received treatment for substance abuse. In addition, lower social desirability response bias was observed in the volunteers. SUD was not more severe in the volunteers; however, they reported a higher lifetime rate of opiate, diet, depressant, and analgesic drug use. Volunteers and randomly recruited subjects qualifying for SUD consequent to illicit drug use are similar in SUD severity but differ in terms of severity of psychosocial disturbance and history of drug involvement. The factors discriminating volunteers from randomly recruited subjects are well known to affect outcomes; hence, they need to be considered in research design, especially when selecting a sampling strategy in treatment research.

  9. Sampling phased array, a new technique for ultrasonic signal processing and imaging now available to industry

    OpenAIRE

    Verkooijen, J.; Bulavinov, A.

    2008-01-01

    Over the past 10 years, improvements in the fields of microelectronics and computer engineering have led to significant advances in ultrasonic signal processing and image construction techniques that are currently being applied to non-destructive material evaluation. A new phased array technique, called "Sampling Phased Array", has been developed at the Fraunhofer Institute for non-destructive testing [1]. It realizes a unique approach to the measurement and processing of ultrasonic signals. The s...

  10. Determination of multi-element in marine sediment samples collected in Angola by the k0-NAA technique

    International Nuclear Information System (INIS)

    Teixeira, M.C.P.; Ho Manh Dung; Cao Dong Vu; Nguyen Thi Sy; Nguyen Thanh Binh; Vuong Huu Tan

    2006-01-01

    Marine sediment samples were collected in Angola for a marine environmental pollution study. The k0-standardization method of neutron activation analysis (k0-NAA) on the Dalat research reactor has been developed to determine multiple elements in the Angola marine sediment samples. The samples were irradiated in cell 7-1 for short- and medium-lived nuclides and in the rotary specimen rack for long-lived nuclides. The irradiation facilities were characterized for neutron spectrum parameters, and the activated samples were measured on calibrated gamma-ray spectrometers using HPGe detectors. The analytical results for 9 marine sediment samples with 27 elements (Al, As, Br, Ca, Ce, Cl, Co, Cs, Dy, Fe, Hf, I, K, Mg, Mn, Na, Rb, Sb, Sc, Se, Sm, Th, Ti, U, V and Zn), in terms of mean concentration, standard deviation and content range, are shown in the report. Analytical quality assurance was performed by analysis of a Japanese certified reference material, marine sediment NMIJ-CRM-7302a. These preliminary results revealed that the k0-NAA technique on the Dalat research reactor is a good analytical technique for multi-element determination in marine sediment samples. Some heavy metals and trace elements determined in this work are possibly connected to human activities at the sampling region. (author)

  11. Enhanced sampling algorithms.

    Science.gov (United States)

    Mitsutake, Ayori; Mori, Yoshiharu; Okamoto, Yuko

    2013-01-01

    In biomolecular systems (especially all-atom models) with many degrees of freedom, such as proteins and nucleic acids, there exists an astronomically large number of local-minimum-energy states. Conventional simulations in the canonical ensemble are of little use, because they tend to get trapped in these local energy minima. Enhanced conformational sampling techniques are thus in great demand. A simulation in a generalized ensemble performs a random walk in potential energy space and can overcome this difficulty. From only one simulation run, one can obtain canonical-ensemble averages of physical quantities as functions of temperature by the single-histogram and/or multiple-histogram reweighting techniques. In this article we review uses of the generalized-ensemble algorithms in biomolecular systems. Three well-known methods, namely the multicanonical algorithm, simulated tempering, and the replica-exchange method, are described first. Both Monte Carlo and molecular dynamics versions of the algorithms are given. We then present various extensions of these three generalized-ensemble algorithms. The effectiveness of the methods is tested with short peptide and protein systems.
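
    Of the three methods named above, the replica-exchange method is the quickest to sketch. Below is a minimal Monte Carlo version on a one-dimensional double-well potential; the potential, temperature ladder and move size are illustrative choices, not taken from the article.

        # Replica-exchange (parallel tempering) Monte Carlo on a 1D double well.
        import numpy as np

        rng = np.random.default_rng(1)
        U = lambda x: (x**2 - 1.0)**2            # double well with minima at +/-1
        temps = np.array([0.05, 0.1, 0.2, 0.5, 1.0])
        x = np.zeros(len(temps))                 # one walker per temperature
        cold_samples = []

        for sweep in range(20000):
            for k, T in enumerate(temps):        # Metropolis move in each replica
                prop = x[k] + rng.normal(scale=0.3)
                if rng.random() < np.exp(-(U(prop) - U(x[k])) / T):
                    x[k] = prop
            k = rng.integers(len(temps) - 1)     # try swapping neighbouring replicas
            delta = (1/temps[k] - 1/temps[k+1]) * (U(x[k]) - U(x[k+1]))
            if rng.random() < np.exp(min(0.0, delta)):
                x[k], x[k+1] = x[k+1], x[k]
            cold_samples.append(x[0])

        cold = np.array(cold_samples[5000:])     # discard burn-in
        print("fraction of time in right-hand well:", np.mean(cold > 0))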

  12. Characterization of Some Iraqi Archaeological Samples Using IBA, Analytical X-ray and Other Complementary Techniques

    International Nuclear Information System (INIS)

    Al-Sarraj, Ziyad Shihab; Damboos, Hassan I; Roumie, Mohamad

    2012-01-01

    The present work aimed at investigating the compositions and microstructures of some archaeological samples dating back to various periods of the ancient Iraqi civilizations, using PIXE, XRF, XRD, and SEM techniques. The objects selected for the study (ceramics, glaze, etc.) were diverse in size and nature; therefore, a limited number of samples were cut from them with a small diamond wheel. A conventional powder metallurgy method was then used to prepare the samples. Dried samples were coated with a thin layer of carbon and analyzed using the ion beam accelerator of the LAEC. Three other groups of samples were also prepared for analysis by X-ray fluorescence (XRF), X-ray diffraction (XRD), and scanning electron microscopy (SEM). The chemical composition results showed good agreement between the various techniques, as well as for the phases, while the fine-structure analysis obtained by optical and scanning microscopy exhibited features of a structure that underwent intensified densification in the final stage of sintering, accompanied by a quasi-homogeneous distribution of closed pores. This leads to the conclusion that the sintering temperature used by the ancient Iraqis was sufficient, possibly falling in the range 950-1200°C, and that the mixes and forming methods they used were both suitable for obtaining well-sintered bodies with an even distribution of pores. A ring-shaped trace noticed in the SEM micrographs needs further work and study to explain.

  13. Analysis of soil samples from Gebeng area using NAA technique

    Science.gov (United States)

    Elias, Md Suhaimi; Wo, Yii Mei; Hamzah, Mohd Suhaimi; Shukor, Shakirah Abd; Rahman, Shamsiah Ab; Salim, Nazaratul Ashifa Abdullah; Azman, Muhamad Azfar; Hashim, Azian

    2017-01-01

    Rapid development and urbanization increase the number of residential and industrial areas. Without proper management and control of pollution, these will have an adverse effect on the environment and human life. The objective of this study is to identify and quantify key contaminants released into the environment of the Gebeng area as a result of industrial and human activities. The Gebeng area was gazetted as one of the industrial estates in Pahang state. Elemental pollution in the soil of the Gebeng area was assessed based on concentration levels, enrichment factors and the geo-accumulation index. The enrichment factors (EFs) were determined by the elemental ratioing method, whilst the geo-accumulation index (Igeo) was obtained by comparing current concentrations with the continental crustal average concentration of each element. Twenty-seven soil samples were collected from the Gebeng area and analysed using the Neutron Activation Analysis (NAA) technique. The obtained data showed a higher concentration of iron (Fe), due to its abundance in soil, compared to other elements. The enrichment factor results showed that the Gebeng area is enriched in the elements As, Br, Hf, Sb, Th and U. Based on the geo-accumulation index (Igeo) classification, the soil quality of the Gebeng area can be classified as class 0 (uncontaminated) to class 3 (moderately to heavily contaminated).
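
    Both indices used in the study have standard closed forms: the enrichment factor double-normalizes an element to a reference element in the sample and in the crust, and the geo-accumulation index is Igeo = log2(Cn/(1.5*Bn)). A minimal sketch with placeholder concentrations and Fe assumed as the reference element:

        # Enrichment factor (EF) and geo-accumulation index (Igeo) for one element.
        # All concentrations are illustrative placeholders, not the study's data.
        import math

        c_as_sample, c_fe_sample = 12.0, 35000.0  # measured in soil (mg/kg)
        c_as_crust, c_fe_crust = 1.8, 50000.0     # crustal averages (illustrative)

        # EF = (C_el/C_ref)_sample / (C_el/C_ref)_crust, with Fe as reference element
        ef = (c_as_sample / c_fe_sample) / (c_as_crust / c_fe_crust)
        # Igeo = log2(Cn / (1.5 * Bn)), with Bn the geochemical background
        igeo = math.log2(c_as_sample / (1.5 * c_as_crust))

        print(f"EF = {ef:.1f}, Igeo = {igeo:.2f}")
        # Igeo classes run from class 0 (Igeo <= 0, uncontaminated) upwards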

  14. Advances in modern sample preparation techniques using microwaves assisted chemistry for metal species determination (W1)

    International Nuclear Information System (INIS)

    Ponard, O.F.X.

    2002-01-01

    Full text: Sample preparation has long been the bottleneck of environmental analysis, for both total and species-specific analysis. Digestion, extraction and preparation of the analytes rely on a series of chemical reactions. The introduction of microwave-assisted sample preparation was first viewed as a means to accelerate the kinetics of matrix digestion for total element determination and to provide fast sample preparation procedures. However, the extensive development and success of microwave digestion procedures in total elemental analysis has now given a broader insight into the perspectives offered by this technique. Microwave technologies now offer precise control of temperature and, indirectly, of the reaction kinetics taking place during sample preparation. Microwave-assisted chemistry makes it possible to perform simultaneously the fundamental steps required for metal species extraction and derivatization. The number of sample preparation steps used for organotin or organomercury species has been reduced to one, and the total sample preparation time has been brought down from a few hours to some minutes. Further, the development of GC/ICP/MS techniques allows routine use of speciated isotope dilution methods as an internal probe of the chemical reactions. These new approaches allow us to use the addition of labeled species for isotope dilution as a means to evaluate and follow the chemical processes taking place during the extraction procedure. These procedures will help us to understand and check the stability of the analytes during the chemistry of the sample preparation procedure and give some insight into the chemistry taking place during the extraction. Understanding the different mechanisms involved in the sample preparation steps will in return allow us to further improve all these procedures and bring us to the horizon of 'on-line sample preparation and detection'. (author)

  15. Exploring the Connection Between Sampling Problems in Bayesian Inference and Statistical Mechanics

    Science.gov (United States)

    Pohorille, Andrew

    2006-01-01

    The Bayesian and statistical mechanical communities often share the same objective in their work - estimating and integrating probability distribution functions (pdfs) describing stochastic systems, models or processes. Frequently, these pdfs are complex functions of random variables exhibiting multiple, well separated local minima. Conventional strategies for sampling such pdfs are inefficient, sometimes leading to apparently non-ergodic behavior. Several recently developed techniques for handling this problem have been successfully applied in statistical mechanics. In the multicanonical and Wang-Landau Monte Carlo (MC) methods, the correct pdfs are recovered from uniform sampling of the parameter space by iteratively establishing proper weighting factors connecting these distributions. Trivial generalizations allow for sampling from any chosen pdf. The closely related transition matrix method relies on estimating transition probabilities between different states. All these methods have proved to generate estimates of pdfs with high statistical accuracy. In another MC technique, parallel tempering, several random walks, each corresponding to a different value of a parameter (e.g. "temperature"), are generated and occasionally exchanged using the Metropolis criterion. This method can be considered a statistically correct version of simulated annealing. An alternative approach is to represent the set of independent variables as a Hamiltonian system. Considerable progress has been made in understanding how to ensure that the system obeys the equipartition theorem or, equivalently, that coupling between the variables is correctly described. Then a host of techniques developed for dynamical systems can be used. Among them, probably the most powerful is the Adaptive Biasing Force method, in which thermodynamic integration and biased sampling are combined to yield very efficient estimates of pdfs. The third class of methods deals with transitions between states described
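
    As a concrete instance of the flat-histogram idea mentioned above, here is a minimal Wang-Landau sketch on a toy discrete system whose density of states is known beforehand (every energy level below is two-fold degenerate), so the output can be checked by eye. The flatness threshold and modification-factor schedule are conventional choices, not taken from this record.

        # Wang-Landau estimate of the density of states g(E) for a toy system:
        # states x = 0..L-1 with E(x) = min(x, L-1-x); each level is 2-fold degenerate.
        import numpy as np

        rng = np.random.default_rng(2)
        L = 20
        energy = np.minimum(np.arange(L), L - 1 - np.arange(L))
        n_levels = int(energy.max()) + 1

        ln_g = np.zeros(n_levels)   # running estimate of ln g(E)
        ln_f = 1.0                  # modification factor, halved at each flat histogram
        hist = np.zeros(n_levels)
        x = 0
        while ln_f > 1e-5:
            for _ in range(10000):
                x_new = (x + rng.choice([-1, 1])) % L       # propose a neighbour state
                # accept with min(1, g[E_old]/g[E_new]) to flatten the E-histogram
                if rng.random() < np.exp(ln_g[energy[x]] - ln_g[energy[x_new]]):
                    x = x_new
                ln_g[energy[x]] += ln_f
                hist[energy[x]] += 1
            if hist.min() > 0.8 * hist.mean():              # flatness criterion
                ln_f /= 2.0
                hist[:] = 0

        g = np.exp(ln_g - ln_g[0])  # normalized so g(0) = 1; all true ratios are 1 here
        print(np.round(g, 2))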

  16. Sample Size Calculation: Inaccurate A Priori Assumptions for Nuisance Parameters Can Greatly Affect the Power of a Randomized Controlled Trial.

    Directory of Open Access Journals (Sweden)

    Elsa Tavernier

    Full Text Available We aimed to examine the extent to which inaccurate assumptions for nuisance parameters used to calculate sample size can affect the power of a randomized controlled trial (RCT). In a simulation study, we separately considered an RCT with continuous, dichotomous or time-to-event outcomes, with associated nuisance parameters of standard deviation, success rate in the control group and survival rate in the control group at some time point, respectively. For each type of outcome, we calculated a required sample size N for a hypothesized treatment effect, an assumed nuisance parameter and a nominal power of 80%. We then assumed a nuisance parameter associated with a relative error at the design stage. For each type of outcome, we randomly drew 10,000 relative errors of the associated nuisance parameter (from empirical distributions derived from a previously published review). Then, retro-fitting the sample size formula, we derived, for the pre-calculated sample size N, the real power of the RCT, taking into account the relative error for the nuisance parameter. In total, 23%, 0% and 18% of RCTs with continuous, binary and time-to-event outcomes, respectively, were underpowered (i.e., the real power fell below the nominal level); other trials were overpowered (real power above 90%). Even with proper calculation of sample size, a substantial number of trials are underpowered or overpowered because of imprecise knowledge of nuisance parameters. Such findings raise questions about how sample size for RCTs should be determined.
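
    The mechanism the study simulates can be reproduced in a few lines: compute the required sample size under an assumed standard deviation, then recompute the real power when the true value differs (normal approximation; all numbers are illustrative).

        # Real power of a two-arm trial when the SD assumed at the design stage is wrong.
        import math
        from scipy.stats import norm

        alpha, target_power = 0.05, 0.80
        delta = 5.0           # hypothesized treatment effect (illustrative)
        sd_assumed = 10.0     # nuisance parameter guessed at the design stage

        z_a, z_b = norm.ppf(1 - alpha / 2), norm.ppf(target_power)
        n = math.ceil(2 * (z_a + z_b)**2 * sd_assumed**2 / delta**2)  # per group

        for sd_true in (8.0, 10.0, 12.0, 15.0):  # possible true values of the SD
            real_power = norm.cdf(delta * math.sqrt(n / 2) / sd_true - z_a)
            print(f"true SD {sd_true:>4}: real power = {real_power:.2f}")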

  17. Duodenum preserving pancreatectomy in chronic pancreatitis: Design of a randomized controlled trial comparing two surgical techniques [ISRCTN50638764

    Directory of Open Access Journals (Sweden)

    Reidel Margot A

    2006-05-01

    Full Text Available Abstract Background Chronic pancreatitis is an inflammatory disease which is characterized by an irreversible conversion of pancreatic parenchyma to fibrous tissue. Besides obstructive jaundice and pseudocyst formation, about half of the patients need surgical intervention due to untreatable chronic pain during the course of the disease. In most patients with chronic pancreatitis, the head of the pancreas is the trigger of the chronic inflammatory process. Therefore, resection of pancreatic head tissue must be the central part of any surgical intervention. However, it is unclear how radical the surgical procedure must be in order to obtain a favourable outcome for the patients. Design A single-centre randomized controlled superiority trial to compare two techniques of duodenum-preserving pancreatic head resection. Sample size: 65 patients will be included and randomized intraoperatively. Eligibility criteria: all patients with chronic pancreatitis, an indication for surgical resection and signed informed consent. Cumulative primary endpoint (hierarchical model): duration of surgical procedure, quality of life after one year, duration of intensive care unit stay, duration of hospital stay. Reference treatment: resection of the pancreatic head with dissection of the pancreas from the portal vein and transsection of the gland (Beger procedure). Intervention: partial resection of the pancreatic head without transsection of the organ and visualization of the portal vein (Berne procedure). Duration: September 2003-October 2007. Organisation/responsibility The trial is conducted in compliance with the protocol and in accordance with the moral, ethical, regulatory and scientific principles governing clinical research as set out in the Declaration of Helsinki (1989) and the Good Clinical Practice guideline (GCP). The Center for Clinical Studies of the Department of Surgery Heidelberg is responsible for planning, conducting and final

  18. Republic of Georgia estimates for prevalence of drug use: Randomized response techniques suggest under-estimation.

    Science.gov (United States)

    Kirtadze, Irma; Otiashvili, David; Tabatadze, Mzia; Vardanashvili, Irina; Sturua, Lela; Zabransky, Tomas; Anthony, James C

    2018-06-01

    Validity of responses in surveys is an important research concern, especially in emerging market economies where surveys of the general population are a novelty and the level of social control is traditionally higher. The Randomized Response Technique (RRT) can be used as a check on response validity when the study aim is to estimate the population prevalence of drug experiences and other socially sensitive and/or illegal behaviors. The aims were to apply RRT and to study potential under-reporting of drug use in a nation-scale, population-based general population survey of alcohol and other drug use. For this first-ever household survey on addictive substances in the country of Georgia, we used multi-stage probability sampling of 18-to-64-year-old household residents of 111 urban and 49 rural areas. During the interviewer-administered assessments, RRT involved pairing of sensitive and non-sensitive questions about drug experiences. Based upon the standard household self-report survey estimate, an estimated 17.3% [95% confidence interval, CI: 15.5%, 19.1%] of Georgian household residents have tried cannabis. The corresponding RRT estimate was 29.9% [95% CI: 24.9%, 34.9%]. The RRT estimates for other drugs such as heroin were also larger than the standard self-report estimates. We remain unsure about the "true" prevalence of illegal psychotropic drug use in the Republic of Georgia study population. Our RRT results suggest that standard non-RRT approaches might produce under-estimates or, at best, highly conservative, lower-end estimates.
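
    The arithmetic behind an RRT estimate is easy to demonstrate by simulation. The sketch below uses Warner's classic randomized-response design purely as an illustration (the record does not state which RRT variant was fielded), and all numbers are invented.

        # Warner's randomized response: with probability p the respondent gets the
        # sensitive question, otherwise its complement, and always answers truthfully.
        # The interviewer sees only yes/no, yet prevalence is recoverable.
        import numpy as np

        rng = np.random.default_rng(3)
        n, true_prev, p = 5000, 0.30, 0.7          # illustrative values

        has_trait = rng.random(n) < true_prev
        direct_q = rng.random(n) < p               # which question the device selected
        says_yes = np.where(direct_q, has_trait, ~has_trait)

        lam = says_yes.mean()                      # observed proportion of "yes"
        prev_hat = (lam - (1 - p)) / (2 * p - 1)   # Warner's estimator
        se = np.sqrt(lam * (1 - lam) / n) / abs(2 * p - 1)
        print(f"estimated prevalence = {prev_hat:.3f} +/- {1.96 * se:.3f}")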

  19. Large Sample Neutron Activation Analysis of Heterogeneous Samples

    International Nuclear Information System (INIS)

    Stamatelatos, I.E.; Vasilopoulou, T.; Tzika, F.

    2018-01-01

    A Large Sample Neutron Activation Analysis (LSNAA) technique was developed for non-destructive analysis of heterogeneous bulk samples. The technique incorporated collimated scanning and combined experimental measurements with Monte Carlo simulations for the identification of inhomogeneities in large volume samples and the correction of their effect on the interpretation of gamma-spectrometry data. Corrections were applied for the effects of neutron self-shielding, gamma-ray attenuation, the geometrical factor and heterogeneous activity distribution within the sample. A benchmark experiment was performed to investigate the effect of heterogeneity on the accuracy of LSNAA. Moreover, a ceramic vase was analyzed as a whole, demonstrating the feasibility of the technique. The LSNAA results were compared against results obtained by INAA, and a satisfactory agreement between the two methods was observed. This study showed that LSNAA is a technique capable of performing accurate, non-destructive, multi-elemental compositional analysis of heterogeneous objects. It also revealed the great potential of the technique for the analysis of precious objects and artefacts that need to be preserved intact and cannot be damaged for sampling purposes. (author)

  20. Single-particle characterization of ice-nucleating particles and ice particle residuals sampled by three different techniques

    Science.gov (United States)

    Worringen, A.; Kandler, K.; Benker, N.; Dirsch, T.; Mertes, S.; Schenk, L.; Kästner, U.; Frank, F.; Nillius, B.; Bundke, U.; Rose, D.; Curtius, J.; Kupiszewski, P.; Weingartner, E.; Vochezer, P.; Schneider, J.; Schmidt, S.; Weinbruch, S.; Ebert, M.

    2015-04-01

    In the present work, three different techniques to separate ice-nucleating particles (INPs) as well as ice particle residuals (IPRs) from non-ice-active particles are compared. The Ice Selective Inlet (ISI) and the Ice Counterflow Virtual Impactor (Ice-CVI) sample ice particles from mixed-phase clouds and allow, after evaporation in the instrument, analysis of the residuals. The Fast Ice Nucleus Chamber (FINCH) coupled with the Ice Nuclei Pumped Counterflow Virtual Impactor (IN-PCVI) provides ice-activating conditions to aerosol particles and extracts the activated particles for analysis. The instruments were run during a joint field campaign which took place in January and February 2013 at the High Alpine Research Station Jungfraujoch (Switzerland). INPs and IPRs were analyzed offline by scanning electron microscopy and energy-dispersive X-ray microanalysis to determine their size, chemical composition and mixing state. Online analysis of the size and chemical composition of INPs activated in FINCH was performed by laser ablation mass spectrometry. With all three INP/IPR separation techniques, high abundances (median 20-70%) of instrumental contamination artifacts were observed (ISI: Si-O spheres, probably calibration aerosol; Ice-CVI: Al-O particles; FINCH + IN-PCVI: steel particles). After removal of the instrumental contamination particles, silicates, Ca-rich particles, carbonaceous material and metal oxides were the major INP/IPR particle types obtained by all three techniques. In addition, considerable amounts (median abundance mostly a few percent) of soluble material (e.g., sea salt, sulfates) were observed. As these soluble particles are often not expected to act as INP/IPR, we consider them potential measurement artifacts. Minor types of INP/IPR include soot and Pb-bearing particles. The Pb-bearing particles are mainly present as an internal mixture with other particle types. Most samples showed a maximum of the INP/IPR size distribution at 200

  1. Design and building of a homemade sample changer for automation of the irradiation in neutron activation analysis technique

    International Nuclear Information System (INIS)

    Gago, Javier; Hernandez, Yuri; Baltuano, Oscar; Bedregal, Patricia; Lopez, Yon; Urquizo, Rafael

    2014-01-01

    Because the RP-10 research reactor operates during weekends, it was necessary to design and build a sample changer for irradiation as part of the automation of the neutron activation analysis technique. The device consists of an aluminium turntable disk which can accommodate 19 polyethylene capsules containing samples to be sent, using the pneumatic transfer system, from the laboratory to the irradiation position. The system is operated from a control switchboard that sends and returns capsules at variable preset times and by two different routes, allowing the determination of short-, medium- and long-lived radionuclides. Another mechanism, called an 'exchange valve', was designed for changing travel paths (pipelines), allowing the irradiated samples to be stored for a longer time in the reactor hall. The system design has allowed complete automation of this technique, enabling the irradiation of samples without the presence of an analyst. The design, construction and operation of the device are described in this article. (authors)

  2. Microwave Heating of Synthetic Skin Samples for Potential Treatment of Gout Using the Metal-Assisted and Microwave-Accelerated Decrystallization Technique.

    Science.gov (United States)

    Toker, Salih; Boone-Kukoyi, Zainab; Thompson, Nishone; Ajifa, Hillary; Clement, Travis; Ozturk, Birol; Aslan, Kadir

    2016-11-30

    Physical stability of synthetic skin samples during their exposure to microwave heating was investigated to demonstrate the use of the metal-assisted and microwave-accelerated decrystallization (MAMAD) technique for potential biomedical applications. In this regard, optical microscopy and temperature measurements were employed for the qualitative and quantitative assessment of damage to synthetic skin samples during 20 s intermittent microwave heating using a monomode microwave source (at 8 GHz, 2-20 W) up to 120 s. The extent of damage to synthetic skin samples, assessed by the change in the surface area of skin samples, was negligible for microwave power of ≤7 W and more extensive damage (>50%) to skin samples occurred when exposed to >7 W at initial temperature range of 20-39 °C. The initial temperature of synthetic skin samples significantly affected the extent of change in temperature of synthetic skin samples during their exposure to microwave heating. The proof of principle use of the MAMAD technique was demonstrated for the decrystallization of a model biological crystal (l-alanine) placed under synthetic skin samples in the presence of gold nanoparticles. Our results showed that the size (initial size ∼850 μm) of l-alanine crystals can be reduced up to 60% in 120 s without damage to synthetic skin samples using the MAMAD technique. Finite-difference time-domain-based simulations of the electric field distribution of an 8 GHz monomode microwave radiation showed that synthetic skin samples are predicted to absorb ∼92.2% of the microwave radiation.

  4. A comparative study of 232Th and 238U activity estimation in soil samples by gamma spectrometry and neutron activation analysis technique

    International Nuclear Information System (INIS)

    Anilkumar, Rekha; Anilkumar, S.; Narayani, K.; Babu, D.A.R.; Sharma, D.N.

    2012-01-01

    Neutron activation analysis (NAA) is a well-established analytical technique. It has many advantages compared to other commonly used techniques. NAA can be performed in a variety of ways depending on the element, its activity level in the sample, interference from the sample matrix and other elements, etc. The technique offers high analytical sensitivity and low detection limits (ppm to ppb). The high sensitivity is due to irradiation at the high neutron flux available from research reactors, with the activity measurement done using high-resolution HPGe detectors. In this paper, the activity estimates for soil samples obtained by neutron activation and by direct gamma spectrometry are compared. Even though the sample weights and sample preparation methods differ between the two methods, the estimated activity values are comparable. (author)

  5. Detection, characterization and quantification of inorganic engineered nanomaterials: A review of techniques and methodological approaches for the analysis of complex samples

    Energy Technology Data Exchange (ETDEWEB)

    Laborda, Francisco, E-mail: flaborda@unizar.es; Bolea, Eduardo; Cepriá, Gemma; Gómez, María T.; Jiménez, María S.; Pérez-Arantegui, Josefina; Castillo, Juan R.

    2016-01-21

    The increasing demand for analytical information related to inorganic engineered nanomaterials requires the adaptation of existing techniques and methods, or the development of new ones. The challenge for the analytical sciences has been to consider the nanoparticles as a new sort of analytes, involving both chemical (composition, mass and number concentration) and physical information (e.g. size, shape, aggregation). Moreover, information about the species derived from the nanoparticles themselves and their transformations must also be supplied. Whereas techniques commonly used for nanoparticle characterization, such as light scattering techniques, show serious limitations when applied to complex samples, other well-established techniques, like electron microscopy and atomic spectrometry, can provide useful information in most cases. Furthermore, separation techniques, including flow field flow fractionation, capillary electrophoresis and hydrodynamic chromatography, are moving to the nano domain, mostly hyphenated to inductively coupled plasma mass spectrometry as an element-specific detector. Emerging techniques based on the detection of single nanoparticles by ICP-MS, but also coulometry, are on their way to gaining a position. Chemical sensors selective to nanoparticles are in their early stages, but they are very promising considering their portability and simplicity. Although the field is in continuous evolution, at this moment it is moving from proofs-of-concept in simple matrices to methods dealing with matrices of higher complexity and relevant analyte concentrations. To achieve this goal, sample preparation methods are essential to manage such complex situations. Apart from size fractionation methods, matrix digestion, extraction and concentration methods capable of preserving the nature of the nanoparticles are being developed. This review presents and discusses the state-of-the-art analytical techniques and sample preparation methods suitable for

  7. Nuclear techniques for trace element analysis. PIXE and its applications to biomedical samples

    International Nuclear Information System (INIS)

    Cata-Danil, I.; Moro, R.; Gialanella, G.

    1996-01-01

    Problems in understanding the role of trace elements in the functioning of life processes are discussed. A brief review of the state of the PIXE technique is given. Principles and recent advances in beam systems, instrumentation and sample handling are covered. A rather comprehensive list of references regarding various methodological aspects and biomedical applications is given. Some applications are discussed. In particular, preliminary results of an investigation regarding pediatric obesity are presented. (author) 5 tabs., 21 refs

  8. Determination of elements in industrial waste sample and TENORM using XRF Technique in Nuclear Malaysia

    International Nuclear Information System (INIS)

    Paulus, W.; Sarimah Mahat; Meor Yusoff Meor Sulaiman

    2011-01-01

    Industrial waste such as aluminium dross, and TENORM waste such as oil sludge, were used as samples in this research. The main elements were determined using X-ray fluorescence (XRF) at the Material Technology Group, Malaysian Nuclear Agency. The results show that the main elements in these samples are aluminium and silicon, respectively. This research thereby shows that XRF can be considered one of the techniques usable in waste characterization; furthermore, it can help researchers and engineers in research related to waste treatment, especially of radioactive waste. (author)

  9. Effect of sample preparation techniques on the concentrations and distributions of elements in biological tissues using µSRXRF: a comparative study

    International Nuclear Information System (INIS)

    Al-Ebraheem, A; Dao, E; Desouza, E; McNeill, F E; Farquharson, M J; Li, C; Wainman, B C

    2015-01-01

    Routine tissue sample preparation using chemical fixatives is known to preserve the morphology of the tissue being studied. A competing method, cryofixation followed by freeze drying, involves no chemical agents and maintains the biological function of the tissue. The possible effects of both sample preparation techniques on the distribution of bio-metals (specifically calcium (Ca), copper (Cu), zinc (Zn), and iron (Fe)) in human skin tissue samples were investigated. Micro synchrotron radiation x-ray fluorescence (μSRXRF) was used to map bio-metal distribution in the epidermal and dermal layers of human skin samples from various locations of the body that had been prepared using both techniques. For Ca, Cu and Zn, there were statistically significant differences between the epidermis and dermis using the freeze-drying technique (p = 0.02, p < 0.01, and p < 0.01, respectively). Using the formalin-fixed, paraffin-embedded technique, the levels of Ca, Cu and Zn were also significantly different between the epidermis and dermis layers (p = 0.03, p < 0.01, and p < 0.01, respectively). However, the difference in levels of Fe between the epidermis and dermis was unclear and further analysis was required. The epidermis was therefore divided into two sub-layers, one mainly composed of the stratum corneum and the other, deeper layer, of the stratum basale. The distribution of Fe differed significantly between the two epidermal layers using the freeze-drying technique (p = 0.012). This same region also showed a difference in Fe using the formalin-fixed, paraffin-embedded technique (p < 0.01). The formalin-fixed, paraffin-embedded technique also showed a difference between the deeper epidermal layer and the dermis (p < 0.01). It can be concluded that studies involving Ca, Cu and Zn might show similar results using both sample preparation techniques, however studies involving Fe would need more

  10. Application of neutron activation analysis technique in elemental determination of lichen samples

    International Nuclear Information System (INIS)

    Djoko Prakoso Dwi Atmodjo; Syukria Kurniawati; Woro Yatu Niken Syahfitri; Nana Suherman; Dadang Supriatna

    2010-01-01

    Lichen is one of the biological materials used as a pollution monitor; it can give information about the level, direction, and history of various pollutants in the environment. Sample weights are small and the elemental content of lichens is on the order of ppm, so characterization requires advanced analytical techniques with high sensitivity that are capable of analyzing samples weighing ~25 mg, such as neutron activation analysis. In this research, elements were determined in lichen samples obtained from the Kiaracondong and Holis areas in Bandung city, to understand the difference in industrial exposure levels in the surrounding environment. Samples were irradiated in RSG GA Siwabessy, Serpong, at 15 MW, for 1-2 minutes (short irradiation) and 60 minutes (long irradiation). The samples were then counted using an HPGe detector with GENIE 2000 software. The levels of the elements Co, Cr, Cs, Fe, Mg, Mn, Sb, Sc, and V in lichen from the Kiaracondong area were in the ranges 0.55-0.86, 1.47-2.57, 0.87-1.19, 540-1005, 949-1674, 34.91-45.94, 0.08-0.14, 0.16-0.31, and ≤ 2.33 mg/kg, respectively, while for the Holis area they were 1.04-2.37, 4.41-10.36, 0.41-0.89, 3166-709, 1131-1422, 40.97-72.51, 0.33-0.50, 0.98-2.18, and 5.30-13.05 mg/kg, respectively. From these results, it is known that pollution exposure in the semi-industrial Holis area has a greater influence on the surrounding environment than in the semi-industrial Kiaracondong area. (author)

  11. High Field In Vivo 13C Magnetic Resonance Spectroscopy of Brain by Random Radiofrequency Heteronuclear Decoupling and Data Sampling

    Science.gov (United States)

    Li, Ningzhi; Li, Shizhe; Shen, Jun

    2017-06-01

    In vivo 13C magnetic resonance spectroscopy (MRS) is a unique and effective tool for studying dynamic human brain metabolism and the cycling of neurotransmitters. One of the major technical challenges for in vivo 13C-MRS is the high radio frequency (RF) power necessary for heteronuclear decoupling. In the common practice of in vivo 13C-MRS, alkanyl carbons are detected in the spectral range of 10-65 ppm. The amplitude of decoupling pulses has to be significantly greater than the large one-bond 1H-13C scalar coupling (1JCH = 125-145 Hz). Two main proton decoupling methods have been developed: broadband stochastic decoupling and coherent composite or adiabatic pulse decoupling (e.g., WALTZ); the latter is widely used because of its efficiency and superb performance under an inhomogeneous B1 field. Because the RF power required for proton decoupling increases quadratically with field strength, in vivo 13C-MRS using coherent decoupling is often limited to low magnetic fields. Carboxylic/amide carbons, by contrast, are coupled to protons via weak long-range 1H-13C scalar couplings, which can be decoupled using low-RF-power broadband stochastic decoupling. Recently, the carboxylic/amide 13C-MRS technique using low-power random RF heteronuclear decoupling was safely applied to human brain studies at 7T. Here, we review the two major decoupling methods and the carboxylic/amide 13C-MRS with low-power decoupling strategy. Further decreases in RF power deposition by frequency-domain windowing and time-domain random under-sampling are also discussed. Low RF power decoupling opens the possibility of performing in vivo 13C experiments on the human brain at very high magnetic fields (such as 11.7T), where signal-to-noise ratio as well as spatial and temporal spectral resolution are more favorable than at lower fields.

  12. Effect of DNA extraction methods and sampling techniques on the apparent structure of cow and sheep rumen microbial communities.

    Directory of Open Access Journals (Sweden)

    Gemma Henderson

    Full Text Available Molecular microbial ecology techniques are widely used to study the composition of the rumen microbiota and to increase understanding of the roles they play. Therefore, sampling and DNA extraction methods that result in adequate yields of microbial DNA that also accurately represents the microbial community are crucial. Fifteen different methods were used to extract DNA from cow and sheep rumen samples. The DNA yield and quality, and its suitability for downstream PCR amplifications varied considerably, depending on the DNA extraction method used. DNA extracts from nine extraction methods that passed these first quality criteria were evaluated further by quantitative PCR enumeration of microbial marker loci. Absolute microbial numbers, determined on the same rumen samples, differed by more than 100-fold, depending on the DNA extraction method used. The apparent compositions of the archaeal, bacterial, ciliate protozoal, and fungal communities in identical rumen samples were assessed using 454 Titanium pyrosequencing. Significant differences in microbial community composition were observed between extraction methods, for example in the relative abundances of members of the phyla Bacteroidetes and Firmicutes. Microbial communities in parallel samples collected from cows by oral stomach-tubing or through a rumen fistula, and in liquid and solid rumen digesta fractions, were compared using one of the DNA extraction methods. Community representations were generally similar, regardless of the rumen sampling technique used, but significant differences in the abundances of some microbial taxa such as the Clostridiales and the Methanobrevibacter ruminantium clade were observed. The apparent microbial community composition differed between rumen sample fractions, and Prevotellaceae were most abundant in the liquid fraction. DNA extraction methods that involved phenol-chloroform extraction and mechanical lysis steps tended to be more comparable. However

  13. Importance sampling of heavy-tailed iterated random functions

    NARCIS (Netherlands)

    B. Chen (Bohan); C.H. Rhee (Chang-Han); A.P. Zwart (Bert)

    2016-01-01

    We consider a stochastic recurrence equation of the form $Z_{n+1} = A_{n+1} Z_n + B_{n+1}$, where $\mathbb{E}[\log A_1]<0$, $\mathbb{E}[\log^+ B_1]<\infty$ and $\{(A_n,B_n)\}_{n\in\mathbb{N}}$ is an i.i.d. sequence of positive random vectors. The stationary distribution of this Markov
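
    For readers new to this setting, plain importance sampling on the recurrence can be sketched by exponentially tilting log A. The code assumes lognormal A_n and constant B_n for simplicity; it is a generic illustration, not the algorithm developed in the paper.

        # Importance sampling of P(Z_N > b) for Z_{n+1} = A_{n+1} Z_n + B_{n+1},
        # with lognormal A and constant B assumed purely for illustration.
        import numpy as np
        from scipy.stats import norm

        rng = np.random.default_rng(4)
        N, b, n_paths = 30, 50.0, 20000
        mu, sigma = -0.2, 0.5   # log A ~ N(mu, sigma^2), so E[log A] < 0 holds
        B = 1.0
        mu_tilt = 0.15          # tilted drift makes the rare event frequent

        X = rng.normal(mu_tilt, sigma, size=(n_paths, N))   # log A drawn under the tilt
        # log likelihood ratio (nominal over tilted density), summed along each path
        log_w = (norm.logpdf(X, mu, sigma) - norm.logpdf(X, mu_tilt, sigma)).sum(axis=1)

        Z = np.zeros(n_paths)
        for n in range(N):                                  # run the recurrence
            Z = np.exp(X[:, n]) * Z + B

        est = np.exp(log_w) * (Z > b)
        print(f"P(Z_N > b) ~ {est.mean():.3e} +/- {1.96 * est.std() / np.sqrt(n_paths):.1e}")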

  14. Randomized Comparison of Two Vaginal Self-Sampling Methods for Human Papillomavirus Detection: Dry Swab versus FTA Cartridge.

    Directory of Open Access Journals (Sweden)

    Rosa Catarino

    Full Text Available Human papillomavirus (HPV) self-sampling (self-HPV) is valuable in cervical cancer screening. HPV testing is usually performed on physician-collected cervical smears stored in liquid-based medium. Dry filters and swabs are an alternative. We evaluated the adequacy of self-HPV using two dry storage and transport devices, the FTA cartridge and the swab. A total of 130 women performed two consecutive self-HPV samples. Randomization determined which of the two tests was performed first: self-HPV using dry swabs (s-DRY) or vaginal specimen collection using a cytobrush applied to an FTA cartridge (s-FTA). After self-HPV, a physician collected a cervical sample using liquid-based medium (Dr-WET). HPV types were identified by real-time PCR. Agreement between collection methods was measured using the kappa statistic. HPV prevalence for high-risk types was 62.3% (95% CI: 53.7-70.2) detected by s-DRY, 56.2% (95% CI: 47.6-64.4) by Dr-WET, and 54.6% (95% CI: 46.1-62.9) by s-FTA. There was overall agreement of 70.8% between s-FTA and s-DRY samples (kappa = 0.34), and of 82.3% between self-HPV and Dr-WET samples (kappa = 0.56). Detection sensitivities for low-grade squamous intraepithelial lesion or worse (LSIL+) were: 64.0% (95% CI: 44.5-79.8) for s-FTA, 84.6% (95% CI: 66.5-93.9) for s-DRY, and 76.9% (95% CI: 58.0-89.0) for Dr-WET. The preferred self-collection method among patients was s-DRY (40.8% vs. 15.4%). Regarding costs, the FTA card was five times more expensive than the swab (~5 US dollars (USD) per card vs. ~1 USD per swab). Self-HPV using dry swabs is sensitive for detecting LSIL+ and less expensive than s-FTA. International Standard Randomized Controlled Trial Number (ISRCTN): 43310942.
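
    The kappa statistic used above to quantify agreement is straightforward to compute from paired positive/negative calls; the vectors below are invented placeholders, not the trial's data.

        # Cohen's kappa for agreement between two HPV collection methods.
        import numpy as np

        a = np.array([1, 1, 0, 1, 0, 0, 1, 1, 0, 1, 1, 0])  # method 1 (hypothetical)
        b = np.array([1, 0, 0, 1, 0, 1, 1, 1, 0, 1, 0, 0])  # method 2 (hypothetical)

        po = np.mean(a == b)                     # observed agreement
        pa, pb = a.mean(), b.mean()
        pe = pa * pb + (1 - pa) * (1 - pb)       # agreement expected by chance
        kappa = (po - pe) / (1 - pe)
        print(f"observed = {po:.2f}, chance = {pe:.2f}, kappa = {kappa:.2f}")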

  15. Comparison of four techniques of nasogastric tube insertion in anaesthetised, intubated patients: A randomized controlled trial

    Directory of Open Access Journals (Sweden)

    Mohan Chandra Mandal

    2014-01-01

    Full Text Available Background and Aims: Insertion of nasogastric tubes (NGTs) in anaesthetised, intubated patients with the conventional method is sometimes difficult. Different techniques of NGT insertion have been tried with varying degrees of success. The aim of this prospective, randomised, open-label study was to evaluate three modified techniques of NGT insertion, comparing them with the conventional method in respect of success rate, time taken for insertion and adverse events. Methods: In the operation theatre of general surgery, the patients were randomly allocated into four groups: Group C (control group, n = 54), Group W (ureteral guide wire group, n = 54), Group F (neck flexion with lateral pressure, n = 54) and Group R (reverse Sellick's manoeuvre, n = 54). The number of attempts for successful NGT insertion, the time taken for insertion and adverse events were noted. Results: All three modified techniques were more successful than the conventional method on the first attempt. The least time taken for insertion was noted with the reverse Sellick's method. However, on intergroup analysis, the neck flexion and reverse Sellick's methods were comparable but significantly faster than the other two methods with respect to time taken for insertion. Conclusion: Reverse Sellick's manoeuvre, neck flexion with lateral neck pressure and guide wire-assisted techniques are all better alternatives to the conventional method for successful, quick and reliable NGT insertion with permissible adverse events in anaesthetised, intubated adult patients. Further studies eliminating the major limitations of the present study are warranted to establish the superiority of any one of these modified techniques.

  16. Use of Random and Site-Directed Mutagenesis to Probe Protein Structure-Function Relationships: Applied Techniques in the Study of Helicobacter pylori.

    Science.gov (United States)

    Whitmire, Jeannette M; Merrell, D Scott

    2017-01-01

    Mutagenesis is a valuable tool to examine the structure-function relationships of bacterial proteins. As such, a wide variety of mutagenesis techniques and strategies have been developed. This chapter details a selection of random mutagenesis methods and site-directed mutagenesis procedures that can be applied to an array of bacterial species. Additionally, the direct application of the techniques to study the Helicobacter pylori Ferric Uptake Regulator (Fur) protein is described. The varied approaches illustrated herein allow the robust investigation of the structural-functional relationships within a protein of interest.

  17. Investigation of an egyptian phosphate ore sample by neutron activation analysis technique

    International Nuclear Information System (INIS)

    Eissa, E.A.; Aly, R.A.; Rofail, N.B.; Hassan, A.M.

    1995-01-01

    A domestic phosphate ore sample has been analysed by means of prompt and delayed gamma-ray spectrometry following activation by the thermal neutron capture technique. The rabbit pneumatic transfer system (RPTS), a long-irradiation facility and a set-up of two Pu/Be (2.5 Ci each) neutron sources for prompt (n,gamma) measurements were applied. A high-purity germanium (HPGe) gamma-ray spectrometer with a personal computer analyzer (PCA) system was used for spectrum measurements. Programmes on the VAX computer were utilized for estimating the elemental concentrations of 22 out of the 36 elements identified in this work. 2 tabs

  18. Relative efficiency and sample size for cluster randomized trials with variable cluster sizes.

    Science.gov (United States)

    You, Zhiying; Williams, O Dale; Aban, Inmaculada; Kabagambe, Edmond Kato; Tiwari, Hemant K; Cutter, Gary

    2011-02-01

    The statistical power of cluster randomized trials depends on two sample size components, the number of clusters per group and the numbers of individuals within clusters (cluster size). Variable cluster sizes are common and this variation alone may have significant impact on study power. Previous approaches have taken this into account by either adjusting total sample size using a designated design effect or adjusting the number of clusters according to an assessment of the relative efficiency of unequal versus equal cluster sizes. This article defines a relative efficiency of unequal versus equal cluster sizes using noncentrality parameters, investigates properties of this measure, and proposes an approach for adjusting the required sample size accordingly. We focus on comparing two groups with normally distributed outcomes using t-test, and use the noncentrality parameter to define the relative efficiency of unequal versus equal cluster sizes and show that statistical power depends only on this parameter for a given number of clusters. We calculate the sample size required for an unequal cluster sizes trial to have the same power as one with equal cluster sizes. Relative efficiency based on the noncentrality parameter is straightforward to calculate and easy to interpret. It connects the required mean cluster size directly to the required sample size with equal cluster sizes. Consequently, our approach first determines the sample size requirements with equal cluster sizes for a pre-specified study power and then calculates the required mean cluster size while keeping the number of clusters unchanged. Our approach allows adjustment in mean cluster size alone or simultaneous adjustment in mean cluster size and number of clusters, and is a flexible alternative to and a useful complement to existing methods. Comparison indicated that we have defined a relative efficiency that is greater than the relative efficiency in the literature under some conditions. Our measure
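
    A common closed-form version of this adjustment uses the design effect for variable cluster sizes, DE = 1 + ((1 + CV^2) * m - 1) * ICC, with m the mean cluster size and CV its coefficient of variation. The sketch below applies that textbook approximation (not this paper's noncentrality-based measure) with invented inputs.

        # Clusters needed per arm for a cluster randomized trial with unequal
        # cluster sizes, via the usual design-effect approximation.
        import math
        from scipy.stats import norm

        alpha, power = 0.05, 0.80
        effect, sd = 0.3, 1.0               # standardized difference (illustrative)
        m_mean, cv, icc = 20.0, 0.6, 0.05   # mean cluster size, its CV, intracluster corr.

        z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
        n_indiv = 2 * (z * sd / effect)**2              # per arm, ignoring clustering
        de = 1 + ((1 + cv**2) * m_mean - 1) * icc       # design effect, unequal sizes
        k = math.ceil(n_indiv * de / m_mean)            # clusters per arm
        print(f"DE = {de:.2f}; about {k} clusters of mean size {m_mean:.0f} per arm")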

  19. MARRT: Medial Axis biased rapidly-exploring random trees

    KAUST Repository

    Denny, Jory

    2014-05-01

    Motion planning is a difficult and widely studied problem in robotics. Current research aims not only to find feasible paths, but to ensure paths have certain properties, e.g., shortest or safest paths. This is difficult for current state-of-the-art sampling-based techniques as they typically focus on simply finding any path. Despite this difficulty, sampling-based techniques have shown great success in planning for a wide range of applications. Among such planners, Rapidly-Exploring Random Trees (RRTs) search the planning space by biasing exploration toward unexplored regions. This paper introduces a novel RRT variant, Medial Axis RRT (MARRT), which biases tree exploration to the medial axis of free space by pushing all configurations from expansion steps towards the medial axis. We prove that this biasing increases the tree's clearance from obstacles. Improving obstacle clearance is useful where path safety is important, e.g., path planning for robots performing tasks in close proximity to the elderly. Finally, we experimentally analyze MARRT, emphasizing its ability to effectively map difficult passages while increasing obstacle clearance, and compare it to contemporary RRT techniques.
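
    To make the "sample, find the nearest node, extend a small step" loop concrete, here is a bare-bones RRT for a point robot in a 2D square with one circular obstacle; MARRT's push toward the medial axis is deliberately omitted, and all constants are illustrative.

        # Bare-bones RRT (plain variant, without MARRT's medial-axis biasing).
        import numpy as np

        rng = np.random.default_rng(5)
        start, goal = np.array([0.05, 0.05]), np.array([0.95, 0.95])
        obs_c, obs_r = np.array([0.5, 0.5]), 0.2       # one circular obstacle
        step, goal_tol = 0.05, 0.05

        def free(p):                                   # point-wise collision check
            return np.linalg.norm(p - obs_c) > obs_r

        nodes, parent = [start], {0: None}
        for it in range(5000):
            q = goal if rng.random() < 0.05 else rng.random(2)   # 5% goal bias
            i = int(np.argmin([np.linalg.norm(q - n) for n in nodes]))
            d = q - nodes[i]
            new = nodes[i] + step * d / (np.linalg.norm(d) + 1e-12)
            if free(new):
                parent[len(nodes)] = i
                nodes.append(new)
                if np.linalg.norm(new - goal) < goal_tol:
                    path, j = [], len(nodes) - 1       # walk back to the start
                    while j is not None:
                        path.append(nodes[j])
                        j = parent[j]
                    print(f"path with {len(path)} nodes after {it + 1} iterations")
                    break
        else:
            print("no path found within the iteration budget")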

  20. A sero-survey of rinderpest in nomadic pastoral systems in central and southern Somalia from 2002 to 2003, using a spatially integrated random sampling approach.

    Science.gov (United States)

    Tempia, S; Salman, M D; Keefe, T; Morley, P; Freier, J E; DeMartini, J C; Wamwayi, H M; Njeumi, F; Soumaré, B; Abdi, A M

    2010-12-01

    A cross-sectional sero-survey, using a two-stage cluster sampling design, was conducted between 2002 and 2003 in ten administrative regions of central and southern Somalia, to estimate the seroprevalence and geographic distribution of rinderpest (RP) in the study area, as well as to identify potential risk factors for the observed seroprevalence distribution. The study was also used to test the feasibility of the spatially integrated investigation technique in nomadic and semi-nomadic pastoral systems. In the absence of a systematic list of livestock holdings, the primary sampling units were selected by generating random map coordinates. A total of 9,216 serum samples were collected from cattle aged 12 to 36 months at 562 sampling sites. Two apparent clusters of RP seroprevalence were detected. Four potential risk factors associated with the observed seroprevalence were identified: the mobility of cattle herds, the cattle population density, the proximity of cattle herds to cattle trade routes and cattle herd size. Risk maps were then generated to assist in designing more targeted surveillance strategies. The observed seroprevalence in these areas declined over time. In subsequent years, similar seroprevalence studies in neighbouring areas of Kenya and Ethiopia also showed a very low seroprevalence of RP or the absence of antibodies against RP. The progressive decline in RP antibody prevalence is consistent with virus extinction. Verification of freedom from RP infection in the Somali ecosystem is currently in progress.
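
    The first stage described above, selecting primary sampling units by generating random map coordinates, can be sketched as follows; latitude is drawn through its sine so that points fall uniformly per unit area, and the bounding box is an invented placeholder, not the survey's sampling frame.

        # Stage one of a two-stage spatial design: random map coordinates as PSUs.
        import numpy as np

        rng = np.random.default_rng(6)
        lat_min, lat_max = -1.0, 5.0     # rough illustrative bounding box (degrees)
        lon_min, lon_max = 41.0, 48.0
        n_sites = 562

        lon = rng.uniform(lon_min, lon_max, n_sites)
        # uniform in sin(lat) -> uniform density per unit area on the sphere
        s = rng.uniform(np.sin(np.radians(lat_min)), np.sin(np.radians(lat_max)), n_sites)
        lat = np.degrees(np.arcsin(s))

        n_per_site = 9216 // n_sites     # stage two: eligible cattle sampled per site
        print(f"{n_sites} sites, ~{n_per_site} animals each; first site at "
              f"({lat[0]:.3f}, {lon[0]:.3f})")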

  1. Ear Acupuncture for Acute Sore Throat: A Randomized Controlled Trial

    Science.gov (United States)

    2014-09-26

    Final report, September 2014. Recoverable summary points: auricular acupuncture is a low-risk option for acute pain control; battlefield acupuncture (BFA) is a specific auricular acupuncture technique. Strengths: prospective RCT. Weaknesses: small sample size, no sham acupuncture performed, patients not blinded to treatment. This study represents an

  2. A preclustering-based ensemble learning technique for acute appendicitis diagnoses.

    Science.gov (United States)

    Lee, Yen-Hsien; Hu, Paul Jen-Hwa; Cheng, Tsang-Hsiang; Huang, Te-Chia; Chuang, Wei-Yao

    2013-06-01

    Acute appendicitis is a common medical condition whose effective, timely diagnosis can be difficult. A missed diagnosis not only puts the patient in danger but also requires additional resources for corrective treatments. An acute appendicitis diagnosis constitutes a classification problem, for which a further fundamental challenge pertains to the skewed outcome class distribution of instances in the training sample. A preclustering-based ensemble learning (PEL) technique aims to address the associated imbalanced sample learning problems and thereby support the timely, accurate diagnosis of acute appendicitis. The proposed PEL technique employs undersampling to reduce the number of majority-class instances in a training sample, uses preclustering to group similar majority-class instances into multiple groups, and selects from each group representative instances to create more balanced samples. The PEL technique thereby reduces potential information loss from random undersampling. It also takes advantage of ensemble learning to improve performance. We empirically evaluate this proposed technique with 574 clinical cases obtained from a comprehensive tertiary hospital in southern Taiwan, using several prevalent techniques and a salient scoring system as benchmarks. The comparative results show that PEL is more effective and less biased than any of the benchmarks. The proposed PEL technique seems more sensitive in identifying positive acute appendicitis than the commonly used Alvarado scoring system and exhibits higher specificity in identifying negative acute appendicitis. In addition, the sensitivity and specificity values of PEL appear higher than those of the investigated benchmarks that follow the resampling approach. Our analysis suggests PEL benefits from the more representative majority-class instances in the training sample. According to our overall evaluation results, PEL records the best overall performance, and its area under the curve measure reaches 0.619. The
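
    The preclustering step can be sketched as follows; this is a minimal reconstruction of the described idea (cluster the majority class, keep one representative per cluster, train an ensemble on the balanced samples), not the authors' code, and the classifier choice is arbitrary.

        # Preclustering-based undersampling + voting ensemble (sketch).
        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.linear_model import LogisticRegression

        def pel_fit(X, y, n_models=5, seed=0):
            X_maj, X_min = X[y == 0], X[y == 1]    # assume 1 = minority class
            models = []
            for i in range(n_models):
                km = KMeans(n_clusters=len(X_min), n_init=10,
                            random_state=seed + i).fit(X_maj)
                # representative = majority instance nearest each centroid
                reps = np.array([X_maj[np.argmin(np.linalg.norm(X_maj - c, axis=1))]
                                 for c in km.cluster_centers_])
                Xb = np.vstack([reps, X_min])
                yb = np.concatenate([np.zeros(len(reps)), np.ones(len(X_min))])
                models.append(LogisticRegression(max_iter=1000).fit(Xb, yb))
            return models

        def pel_predict(models, X):
            votes = np.mean([m.predict(X) for m in models], axis=0)
            return (votes >= 0.5).astype(int)      # majority vote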

  3. Localisation and identification of radioactive particles in solid samples by means of a nuclear track technique

    International Nuclear Information System (INIS)

    Boehnke, Antje; Treutler, Hanns-Christian; Freyer, Klaus; Schubert, Michael; Holger Weiss

    2005-01-01

    This study aims to develop a generally applicable investigation methodology that can be used for the localisation of single alpha-active particles in solid samples, such as industrial dust or natural soils, sediments and rocks, by autoradiography using solid-state nuclear track detectors. The developed technique allows the detection of local enrichments of alpha-emitters in any solid material. The results of such an investigation are of interest from technical, biological and environmental points of view. The idea behind the methodology is to locate the position of alpha-active spots in a sample by attaching the track detector to the sample in a defined manner, thoroughly described in the paper. The located alpha-active particles are subsequently analysed by an electron microscope and an electron microprobe. An example of the application of this methodology is also given. An ultra-fine-grained ore-processing residue, which causes serious environmental pollution in the respective mining district and thus limits possible land use and affects quality of life in the area, was examined using the described technique. The investigation revealed considerable amounts of alpha-active particles in this material.

  4. Effect of apical clearing technique on the treatment outcome of teeth with asymptomatic apical periodontitis: A randomized clinical trial

    OpenAIRE

    Priya Mittal; Ajay Logani; Naseem Shah; R M Pandey

    2016-01-01

    Aim: This study aims to compare the periapical healing of teeth with asymptomatic apical periodontitis treated either by conventional apical preparation (CAP) or by the apical clearing technique (ACT). Materials and Methods: Twenty subjects with bilateral nonvital similar teeth exhibiting comparable periapical index (PAI) scores were enrolled and randomly allocated. Group I (CAP, n = 20): apical preparation three sizes greater (master apical file [MAF]) than the first binding file at the establis...

  5. Coacervative extraction as a green technique for sample preparation for the analysis of organic compounds.

    Science.gov (United States)

    Melnyk, A; Wolska, L; Namieśnik, J

    2014-04-25

    One of the present trends in analytical chemistry is miniaturization, one of the ways in which green analytical chemistry is put into practice. Particular emphasis is placed on eliminating the use of large amounts of organic solvents, which are toxic and harmful to the environment, while maintaining high efficiency of the extraction process, high recovery values, and low limits of quantification (LOQ) and detection (LOD). These requirements are fulfilled by the coacervative extraction (CAE) technique. In this review, theoretical aspects of the coacervation process are presented along with environmental and bioanalytical applications of this technique, its advantages, limitations and competitiveness with other techniques. Due to its simplicity and rapidity, CAE is an excellent alternative to currently practiced procedures of sample preparation for the analysis of organic compounds. Copyright © 2014 Elsevier B.V. All rights reserved.

  6. Power Spectrum Estimation of Randomly Sampled Signals

    DEFF Research Database (Denmark)

    Velte, C. M.; Buchhave, P.; K. George, W.

    algorithms; sample-and-hold and the direct spectral estimator without residence time weighting. The computer-generated signal is a Poisson process with a sample rate proportional to velocity magnitude that consists of well-defined frequency content, which makes bias easy to spot. The idea
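
    A generic direct estimator for irregularly sampled data evaluates S(f) = |Σ_k x_k exp(-2πi f t_k)|² / N on a grid of candidate frequencies. The sketch below illustrates that unweighted estimator on a synthetic signal; it is not the residence-time-weighted variant discussed by the authors.

        # Direct power spectrum estimate from randomly sampled data.
        import numpy as np

        rng = np.random.default_rng(0)
        t = np.sort(rng.uniform(0.0, 100.0, 2000))   # random sample times
        x = np.sin(2 * np.pi * 0.5 * t) + 0.2 * rng.standard_normal(t.size)

        freqs = np.linspace(0.01, 2.0, 500)
        # S(f) = |sum_k x_k * exp(-2j*pi*f*t_k)|^2 / N
        spectrum = np.abs(np.exp(-2j * np.pi * freqs[:, None] * t[None, :]) @ x)**2 / t.size

        print(freqs[np.argmax(spectrum)])            # peak near the true 0.5 Hz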

  7. Sampling and examination methods used for TMI-2 samples

    International Nuclear Information System (INIS)

    Marley, A.W.; Akers, D.W.; McIsaac, C.V.

    1988-01-01

    The purpose of this paper is to summarize the sampling and examination techniques that were used in the collection and analysis of TMI-2 samples. Samples ranging from auxiliary building air to core debris were collected and analyzed. Handling of the larger samples and many of the smaller samples had to be done remotely, and many standard laboratory analytical techniques were modified to accommodate the extremely high radiation fields associated with these samples. The TMI-2 samples presented unique problems with the sampling and laboratory analysis of previously molten fuel debris. 14 refs., 8 figs

  8. Amorphous and liquid samples structure and density measurements at high pressure - high temperature using diffraction and imaging techniques

    Science.gov (United States)

    Guignot, N.; King, A.; Clark, A. N.; Perrillat, J. P.; Boulard, E.; Morard, G.; Deslandes, J. P.; Itié, J. P.; Ritter, X.; Sanchez-Valle, C.

    2016-12-01

    Determination of the density and structure of liquids such as iron alloys, silicates and carbonates is key to understanding deep Earth structure and dynamics. X-ray diffraction provided by large synchrotron facilities gives excellent results as long as the signal scattered from the sample can be isolated from its environment. Different techniques already exist; we present here the implementation and the first results of the combined angle- and energy-dispersive structural analysis and refinement (CAESAR) technique introduced by Wang et al. in 2004, which has never been used in this context. It has several advantages in the study of liquids: 1/ the standard energy-dispersive technique (EDX), fast and compatible with large multi-anvil press frames, is used for fast analysis free of signal pollution from the sample environment; 2/ some limitations of the EDX technique (homogeneity of the sample, low resolution) are irrelevant in the case of liquid signals, while others (wrong intensities, escape-peak artifacts, background subtraction) are solved by the CAESAR technique; 3/ high-Q data (up to 15 Å⁻¹ and more) can be obtained in a few hours (usually less than 2). We present here the facilities available on the PSICHE beamline (SOLEIL synchrotron, France) and a few results obtained using a Paris-Edinburgh (PE) press and a 1200-ton load capacity multi-anvil press with a (100) DIA compression module. X-ray microtomography, used in conjunction with a PE press featuring rotating anvils (RotoPEc, Philippe et al., 2013), is also very effective: simply measuring the 3D volume of glass or liquid spheres at HPHT provides density. This can be done in conjunction with the CAESAR technique, and we illustrate this point. Finally, absorption profiles can be obtained via imaging techniques, providing another independent way to measure the density of these materials. References Y. Wang et al., A new technique for angle-dispersive powder diffraction using an energy

  9. Reverse sample genome probing, a new technique for identification of bacteria in environmental samples by DNA hybridization, and its application to the identification of sulfate-reducing bacteria in oil field samples

    International Nuclear Information System (INIS)

    Voordouw, G.; Voordouw, J.K.; Karkhoff-Schweizer, R.R.; Fedorak, P.M.; Westlake, D.W.S.

    1991-01-01

    A novel method for identification of bacteria in environmental samples by DNA hybridization is presented. It is based on the fact that, even within a genus, the genomes of different bacteria may have little overall sequence homology. This allows the use of the labeled genomic DNA of a given bacterium (referred to as a standard) to probe for its presence and that of bacteria with highly homologous genomes in total DNA obtained from an environmental sample. Alternatively, total DNA extracted from the sample can be labeled and used to probe filters on which denatured chromosomal DNA from relevant bacterial standards has been spotted. The latter technique is referred to as reverse sample genome probing, since it is the reverse of the usual practice of deriving probes from reference bacteria for analyzing a DNA sample. Reverse sample genome probing allows identification of bacteria in a sample in a single step once a master filter with suitable standards has been developed. Application of reverse sample genome probing to the identification of sulfate-reducing bacteria in 31 samples obtained primarily from oil fields in the province of Alberta has indicated that there are at least 20 genotypically different sulfate-reducing bacteria in these samples

  10. Issues in the analysis of low-content gold mining samples by the fire assay technique

    Science.gov (United States)

    Cetean, Valentina

    2016-04-01

    The classic analysis of samples with a low gold content, below 0.1 g/t (100 ppb, parts per billion), whether ore or gold-bearing sediments, involves preparation of the sample by fire assay extraction, followed by chemical attack with aqua regia (hydrochloric and nitric acid) and measurement of the gold content by atomic absorption spectrometry or inductively coupled plasma mass spectrometry. The issues raised by this analysis are well known to laboratories worldwide, both commercial and research. The author's experience with this method of determining the gold content, accumulated in such a laboratory in Romania (with more than 40 years of experience, though no longer operating since 2014), confirms that obtaining reliable results requires considerable attention, a large amount of work, and the involvement of an experienced fire assayer. The analytical conclusion for a research laboratory is that the most reliable and statistically valid results are still those reached for samples with more than 100 ppb gold content; the degree of confidence below this value is lower than 90%. Usually, for samples below 50 ppb, it does not exceed 50-70%, unless each stage is controlled very strictly, which involves additional hours allocated to successive extraction tests and more precise knowledge of the other compounds that appear in the sample (Cu, Sb, As, sulfur/sulphides, Te, organic matter, etc.) or impurities. The most important operation is the preparation, namely: - grinding and splitting of the sample (which can cause uneven distribution of gold flakes in the duplicate samples to be analyzed); - pyro-metallurgical recovery of gold (the fire assay stage), involving precise temperature control in the furnace during all stages (fusion and cupellation) and adjustment of the fire assay flux components to produce a successful fusion depending on the sample matrix and content; - reducing the sample weight to decrease the amount of impurities that can be concentrated in the lead button

  11. Soil sampling

    International Nuclear Information System (INIS)

    Fortunati, G.U.; Banfi, C.; Pasturenzi, M.

    1994-01-01

    This study attempts to survey the problems associated with techniques and strategies of soil sampling. Keeping in mind the well-defined objectives of a sampling campaign, the aim was to highlight the most important aspect, the representativeness of samples, as a function of the available resources. Particular emphasis was given to the techniques, and particularly to a description of the many types of samplers in use. The procedures and techniques employed during the investigations following the Seveso accident are described. (orig.)

  12. Comparison of two extracorporeal shock wave therapy techniques for the treatment of painful subcalcaneal spur. A randomized controlled study.

    Science.gov (United States)

    Tornese, Davide; Mattei, Enrico; Lucchesi, Giampaolo; Bandi, Marco; Ricci, Gabriele; Melegati, Gianluca

    2008-09-01

    To describe and compare two extracorporeal shock wave therapy techniques for the treatment of painful subcalcaneal spur. Random assignment to two treatment groups with two- and eight-month follow-up. The data were collected in outpatients. Forty-five subjects with a history of at least six months of heel pain were studied. Each subject received three sessions of ultrasound-guided extracorporeal shock wave therapy (performed weekly). The perpendicular technique was used in group A (n=22, mean age 59.3 +/- 12 years) and the tangential technique was used in group B (n=23, mean age 58.8 +/- 12.3 years). The Mayo Clinical Scoring System was used to evaluate each subject before the treatment and at two and eight months of follow-up. Mayo Clinical Scoring System pretreatment scores were homogeneous between the groups (group A 55.2 +/- 18.7; group B 53.5 +/- 20; P>0.05). In both groups there was a significant (P<0.05) improvement after extracorporeal shock wave therapy. The tangential technique was found to be better tolerated as regards treatment-induced pain, allowing higher energy dosages to be used.

  13. Discrete least squares polynomial approximation with random evaluations - application to PDEs with Random parameters

    KAUST Repository

    Nobile, Fabio

    2015-01-01

    the parameter-to-solution map u(y) from random noise-free or noisy observations in random points by discrete least squares on polynomial spaces. The noise-free case is relevant whenever the technique is used to construct metamodels, based on polynomial
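
    The technique is easy to sketch in one dimension: draw random evaluation points, evaluate a stand-in parameter-to-solution map with optional noise, and fit by least squares on a polynomial space. Everything below (the map, the noise level, the degree) is a hypothetical example.

        # Discrete least squares polynomial approximation from random
        # evaluations (1D sketch with a Legendre basis on [-1, 1]).
        import numpy as np
        from numpy.polynomial import legendre

        rng = np.random.default_rng(42)
        u = lambda y: np.exp(y) / (1.0 + y**2)       # stand-in map u(y)

        y_pts = rng.uniform(-1.0, 1.0, 200)          # random evaluation points
        obs = u(y_pts) + 1e-3 * rng.standard_normal(y_pts.size)   # noisy data

        coef = legendre.legfit(y_pts, obs, deg=10)   # least squares on P_10
        y_test = np.linspace(-1.0, 1.0, 1000)
        err = np.max(np.abs(legendre.legval(y_test, coef) - u(y_test)))
        print(f"max metamodel error: {err:.2e}")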

  14. Assessing differences in groups randomized by recruitment chain in a respondent-driven sample of Seattle-area injection drug users.

    Science.gov (United States)

    Burt, Richard D; Thiede, Hanne

    2014-11-01

    Respondent-driven sampling (RDS) is a form of peer-based study recruitment and analysis that incorporates features designed to limit and adjust for biases in traditional snowball sampling. It is being widely used in studies of hidden populations. We report an empirical evaluation of RDS's consistency and variability, comparing groups recruited contemporaneously, by identical methods and using identical survey instruments. We randomized recruitment chains from the RDS-based 2012 National HIV Behavioral Surveillance survey of injection drug users in the Seattle area into two groups and compared them in terms of sociodemographic characteristics, drug-associated risk behaviors, sexual risk behaviors, human immunodeficiency virus (HIV) status and HIV testing frequency. The two groups differed in five of the 18 variables examined (P ≤ .001): race (e.g., 60% white vs. 47%), gender (52% male vs. 67%), area of residence (32% downtown Seattle vs. 44%), an HIV test in the previous 12 months (51% vs. 38%). The difference in serologic HIV status was particularly pronounced (4% positive vs. 18%). In four further randomizations, differences in one to five variables attained this level of significance, although the specific variables involved differed. We found some material differences between the randomized groups. Although the variability of the present study was less than has been reported in serial RDS surveys, these findings indicate caution in the interpretation of RDS results. Copyright © 2014 Elsevier Inc. All rights reserved.

  15. Characterization of Friction Joints Subjected to High Levels of Random Vibration

    Science.gov (United States)

    deSantos, Omar; MacNeal, Paul

    2012-01-01

    This paper describes the test program in detail including test sample description, test procedures, and vibration test results of multiple test samples. The material pairs used in the experiment were Aluminum-Aluminum, Aluminum- Dicronite coated Aluminum, and Aluminum-Plasmadize coated Aluminum. Levels of vibration for each set of twelve samples of each material pairing were gradually increased until all samples experienced substantial displacement. Data was collected on 1) acceleration in all three axes, 2) relative static displacement between vibration runs utilizing photogrammetry techniques, and 3) surface galling and contaminant generation. This data was used to estimate the values of static friction during random vibratory motion when "stick-slip" occurs and compare these to static friction coefficients measured before and after vibration testing.

  16. Technique for preparation of transmission electron microscope specimens from wire samples of Al and Al-Al2O3 alloys

    DEFF Research Database (Denmark)

    Lindbo, Jørgen

    1966-01-01

    A technique for thinning 1 mm wire samples of aluminium and aluminium-alumina alloys for transmission electron microscopy is described. The essential feature of the technique, which involves spark machining and electropolishing in a polytetrafluoroethylene holder followed by chemical polishing...

  17. Non destructive multi elemental analysis using prompt gamma neutron activation analysis techniques: Preliminary results for concrete sample

    Energy Technology Data Exchange (ETDEWEB)

    Dahing, Lahasen Normanshah [School of Applied Physics, Universiti Kebangsaan Malaysia, 43600 Bangi, Selangor, Malaysia and Malaysian Nuclear Agency (Nuklear Malaysia), Bangi 43000, Kajang (Malaysia); Yahya, Redzuan [School of Applied Physics, Universiti Kebangsaan Malaysia, 43600 Bangi, Selangor (Malaysia); Yahya, Roslan; Hassan, Hearie [Malaysian Nuclear Agency (Nuklear Malaysia), Bangi 43000, Kajang (Malaysia)

    2014-09-03

    In this study, the principle of prompt gamma neutron activation analysis was used as a technique to determine the elements in a sample. The system consists of a collimated isotopic neutron source, Cf-252, with an HPGe detector and a multichannel analyzer (MCA). Concrete blocks of 10×10×10 cm³ and 15×15×15 cm³ were analysed as samples. When neutrons enter and interact with elements in the concrete, neutron capture reactions occur and produce characteristic prompt gamma rays of the elements. The preliminary results of this study demonstrate that the major elements in the concrete, such as Si, Mg, Ca, Al, Fe and H, as well as other elements such as Cl, were determined by analysing the respective gamma-ray lines. The results obtained were compared with NAA and XRF techniques for reference and validation. The potential and capability of neutron-induced prompt gamma rays as a tool for qualitative multi-elemental analysis to identify the elements present in a concrete sample are discussed.

  18. A random cluster survey and a convenience sample give comparable estimates of immunity to vaccine preventable diseases in children of school age in Victoria, Australia.

    Science.gov (United States)

    Kelly, Heath; Riddell, Michaela A; Gidding, Heather F; Nolan, Terry; Gilbert, Gwendolyn L

    2002-08-19

    We compared estimates of the age-specific population immunity to measles, mumps, rubella, hepatitis B and varicella zoster viruses in Victorian school children obtained by a national sero-survey, using a convenience sample of residual sera from diagnostic laboratories throughout Australia, with those from a three-stage random cluster survey. When grouped according to school age (primary or secondary school) there was no significant difference in the estimates of immunity to measles, mumps, hepatitis B or varicella. Compared with the convenience sample, the random cluster survey estimated higher immunity to rubella in samples from both primary (98.7% versus 93.6%, P = 0.002) and secondary school students (98.4% versus 93.2%, P = 0.03). Despite some limitations, this study suggests that the collection of a convenience sample of sera from diagnostic laboratories is an appropriate sampling strategy to provide population immunity data that will inform Australia's current and future immunisation policies. Copyright 2002 Elsevier Science Ltd.

  19. A partially reflecting random walk on spheres algorithm for electrical impedance tomography

    Energy Technology Data Exchange (ETDEWEB)

    Maire, Sylvain, E-mail: maire@univ-tln.fr [Laboratoire LSIS Equipe Signal et Image, Université du Sud Toulon-Var, Av. Georges Pompidou, BP 56, 83162 La Valette du Var Cedex (France); Simon, Martin, E-mail: simon@math.uni-mainz.de [Institute of Mathematics, Johannes Gutenberg University, 55099 Mainz (Germany)

    2015-12-15

    In this work, we develop a probabilistic estimator for the voltage-to-current map arising in electrical impedance tomography. This novel so-called partially reflecting random walk on spheres estimator enables Monte Carlo methods to compute the voltage-to-current map in an embarrassingly parallel manner, which is an important issue with regard to the corresponding inverse problem. Our method uses the well-known random walk on spheres algorithm inside subdomains where the diffusion coefficient is constant and employs replacement techniques motivated by finite difference discretization to deal with both mixed boundary conditions and interface transmission conditions. We analyze the global bias and the variance of the new estimator both theoretically and experimentally. Subsequently, the variance of the new estimator is considerably reduced via a novel control variate conditional sampling technique which yields a highly efficient hybrid forward solver coupling probabilistic and deterministic algorithms.
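
    The plain (non-reflecting) walk-on-spheres step is straightforward: jump to a uniformly random point on the largest ball contained in the domain and stop within an epsilon-shell of the boundary. The toy sketch below solves a Laplace problem on the unit disk; the paper's estimator additionally handles reflecting boundaries and interface conditions, which are omitted here.

        # Plain walk-on-spheres estimator for Laplace's equation on the
        # unit disk (toy sketch; u = x*y is harmonic, so u(0.3, 0.4) = 0.12).
        import numpy as np

        rng = np.random.default_rng(7)

        def wos_estimate(x0, g, eps=1e-4, n_walks=20000):
            total = 0.0
            for _ in range(n_walks):
                x = np.array(x0, dtype=float)
                while True:
                    r = 1.0 - np.linalg.norm(x)    # distance to the unit circle
                    if r < eps:
                        break
                    theta = rng.uniform(0.0, 2.0 * np.pi)
                    x = x + r * np.array([np.cos(theta), np.sin(theta)])
                total += g(x / np.linalg.norm(x))  # boundary value at exit point
            return total / n_walks

        g = lambda p: p[0] * p[1]                  # Dirichlet data of u = x*y
        print(wos_estimate((0.3, 0.4), g))         # close to 0.12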

  20. Decomposition techniques

    Science.gov (United States)

    Chao, T.T.; Sanzolone, R.F.

    1992-01-01

    Sample decomposition is a fundamental and integral step in the procedure of geochemical analysis. It is often the limiting factor for sample throughput, especially with the recent application of fast, modern multi-element measurement instrumentation. The complexity of geological materials makes it necessary to choose a sample decomposition technique that is compatible with the specific objective of the analysis. When selecting a decomposition technique, consideration should be given to the chemical and mineralogical characteristics of the sample, the elements to be determined, precision and accuracy requirements, sample throughput, the technical capability of personnel, and time constraints. This paper addresses these concerns and discusses the attributes and limitations of many techniques of sample decomposition, along with examples of their application to geochemical analysis. The chemical properties of reagents in their function as decomposition agents are also reviewed. The section on acid dissolution techniques addresses the various inorganic acids that are used individually or in combination in both open and closed systems. Fluxes used in sample fusion are discussed. The promising microwave-oven technology and the emerging field of automation are also examined. A section on applications highlights the use of decomposition techniques for the determination of Au, platinum group elements (PGEs), Hg, U, hydride-forming elements, rare earth elements (REEs), and multi-elements in geological materials. Partial dissolution techniques used for geochemical exploration, which have been treated in detail elsewhere, are not discussed here; nor are fire assaying for noble metals and decomposition techniques for X-ray fluorescence or nuclear methods. © 1992.

  1. Prediction of soil CO2 flux in sugarcane management systems using the Random Forest approach

    Directory of Open Access Journals (Sweden)

    Rose Luiza Moraes Tavares

    Full Text Available ABSTRACT: The Random Forest algorithm is a data mining technique used for classifying attributes in order of importance in explaining the variation in a target attribute, such as soil CO2 flux. This study aimed to identify predictor variables of soil CO2 flux in sugarcane management systems through the machine-learning algorithm called Random Forest. Two different sugarcane management areas in the state of São Paulo, Brazil, were selected: burned and green. In each area, we assembled a sampling grid with 81 georeferenced points to assess soil CO2 flux with an automated portable soil gas chamber with infrared measuring spectroscopy during the dry season of 2011 and the rainy season of 2012. In addition, we sampled the soil to evaluate physical, chemical, and microbiological attributes. For data interpretation, we used the Random Forest algorithm, based on a combination of decision trees (machine learning algorithms in which every tree depends on the values of a random vector sampled independently and with the same distribution for all trees in the forest). The results indicated that the clay content of the soil was the most important attribute for explaining the CO2 flux in the areas studied during the evaluated period. The use of the Random Forest algorithm produced a model with a good fit (R2 = 0.80 for predicted and observed values).
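
    The variable-ranking step can be reproduced with any random forest implementation. Below is a minimal scikit-learn sketch on synthetic stand-in data; the attribute names are placeholders, not the study's measured soil variables.

        # Rank predictors of a target (e.g., soil CO2 flux) by random
        # forest importance (synthetic placeholder data).
        import numpy as np
        import pandas as pd
        from sklearn.ensemble import RandomForestRegressor

        rng = np.random.default_rng(0)
        n = 162                                    # e.g., 81 points x 2 seasons
        X = pd.DataFrame({
            "clay": rng.uniform(10, 60, n),
            "soil_moisture": rng.uniform(5, 40, n),
            "microbial_c": rng.uniform(100, 600, n),
            "bulk_density": rng.uniform(1.0, 1.8, n),
        })
        co2_flux = 0.05 * X["clay"] + 0.01 * X["soil_moisture"] + rng.normal(0, 0.3, n)

        rf = RandomForestRegressor(n_estimators=500, random_state=0).fit(X, co2_flux)
        print(pd.Series(rf.feature_importances_, index=X.columns)
                .sort_values(ascending=False))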

  2. Evaluation of diesel particulate matter sampling techniques

    CSIR Research Space (South Africa)

    Pretorius, CJ

    2011-09-01

    Full Text Available The study evaluated diesel particulate matter (DPM) sampling methods used in the South African mining industry. The three-piece cassette respirable, open face and stopper sampling methods were compared with the SKC DPM cassette method to find a...

  3. Randomized Item Response Theory Models

    NARCIS (Netherlands)

    Fox, Gerardus J.A.

    2005-01-01

    The randomized response (RR) technique is often used to obtain answers to sensitive questions. A new method is developed to measure latent variables using the RR technique, because direct questioning leads to biased results. Within the RR technique, the probability of the true response is modeled by
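
    As a concrete instance of the RR technique, Warner's classic design (a standard textbook estimator, not the latent-variable model developed in this paper) poses the sensitive question with probability p and its negation otherwise, so P(yes) = (2p - 1)π + (1 - p) and the prevalence π can be recovered from the observed "yes" rate:

        # Warner's randomized response estimator (textbook version).
        import numpy as np

        def warner_estimate(yes_rate, n, p):
            """yes_rate: observed 'yes' proportion; p: prob. of the direct question."""
            pi_hat = (yes_rate - (1.0 - p)) / (2.0 * p - 1.0)
            var = yes_rate * (1.0 - yes_rate) / (n * (2.0 * p - 1.0) ** 2)
            return pi_hat, np.sqrt(var)

        # Simulated survey: true prevalence 0.20, direct question with p = 0.7.
        rng = np.random.default_rng(3)
        n, p, pi_true = 2000, 0.7, 0.20
        truth = rng.random(n) < pi_true
        direct = rng.random(n) < p
        yes = np.where(direct, truth, ~truth)
        print(warner_estimate(yes.mean(), n, p))   # estimate near 0.20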

  4. Hypoalgesic effects of three different manual therapy techniques on cervical spine and psychological interaction: A randomized clinical trial.

    Science.gov (United States)

    Alonso-Perez, Jose Luis; Lopez-Lopez, Almudena; La Touche, Roy; Lerma-Lara, Sergio; Suarez, Emilio; Rojas, Javier; Bishop, Mark D; Villafañe, Jorge Hugo; Fernández-Carnero, Josué

    2017-10-01

    The purpose of this study was to evaluate the extent to which psychological factors interact with a particular manual therapy (MT) technique to induce hypoalgesia in healthy subjects. Seventy-five healthy volunteers (36 female, 39 male) were recruited in this double-blind, controlled, parallel study. Subjects were randomly assigned to receive a high velocity low amplitude technique (HVLA), joint mobilization, or cervical lateral glide mobilization (CLGM). Pressure pain threshold (PPT) over C7 unilaterally and over the trapezius muscle and lateral epicondyle bilaterally was measured before the single MT technique was applied and immediately afterwards. Pain catastrophizing, depression, anxiety and kinesiophobia were evaluated before treatment. The results indicate that hypoalgesia was observed in all groups after treatment in the neck and elbow region (P < 0.05), but mobilization induces more hypoalgesic effects. Catastrophizing interacted with the change over time in PPT at C7 in the manipulation group. All the MT techniques studied produced local and segmental hypoalgesic effects, supporting the results of previous studies of the individual interventions. The interaction between catastrophizing and the HVLA technique suggests that when the catastrophizing level is low or medium the chance of success is high, but high levels of catastrophizing may result in poor outcomes after HVLA intervention. ClinicalTrials.gov Registration Number: NCT02782585. Copyright © 2016 Elsevier Ltd. All rights reserved.

  5. A simple technique for measuring the superconducting critical temperature of small (≥10 μg) samples

    International Nuclear Information System (INIS)

    Pereira, R.F.R.; Meyer, E.; Silveira, M.F. da.

    1983-01-01

    A simple technique for measuring the superconducting critical temperature of small (≥10 μg) samples is described. The apparatus is built in the form of a probe, which can be introduced directly into a liquid He storage dewar and permits the determination of the critical temperature, with an imprecision of ±0.05 K above 4.2 K, in about 10 minutes. (Author) [pt

  6. Using the Superpopulation Model for Imputations and Variance Computation in Survey Sampling

    Directory of Open Access Journals (Sweden)

    Petr Novák

    2012-03-01

    Full Text Available This study is aimed at variance computation techniques for estimates of population characteristics based on survey sampling and imputation. We use the superpopulation regression model, which means that the target variable values for each statistical unit are treated as random realizations of a linear regression model with weighted variance. We focus on regression models with one auxiliary variable and no intercept, which have many applications and a straightforward interpretation in business statistics. Furthermore, we deal with cases where the estimates are not independent and thus the covariance must be computed. We also consider chained regression models with auxiliary variables as random variables instead of constants.
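
    For the no-intercept model with one auxiliary variable and variance proportional to that variable, y_i = βx_i + ε_i with Var(ε_i) ∝ x_i, weighted least squares reduces to the ratio estimator β̂ = Σy/Σx over respondents. A minimal imputation sketch under that assumption, on synthetic data:

        # Ratio-model imputation: beta_hat = sum(y_resp) / sum(x_resp).
        import numpy as np

        rng = np.random.default_rng(5)
        n = 500
        x = rng.uniform(1.0, 10.0, n)              # auxiliary variable, known for all
        y = 2.5 * x + rng.normal(0.0, np.sqrt(x))  # target variable, Var ~ x
        observed = rng.random(n) < 0.8             # 20% nonresponse

        beta_hat = y[observed].sum() / x[observed].sum()   # WLS slope under the model
        y_imp = y.copy()
        y_imp[~observed] = beta_hat * x[~observed]         # model-based imputations

        print(beta_hat, y_imp.sum())               # estimated population total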

  7. Sampling practices and analytical techniques used in the monitoring of steam and water in CEGB nuclear boilers

    International Nuclear Information System (INIS)

    Goodfellow, G.I.

    1978-01-01

    The steam and water in CEGB Magnox and AGR nuclear boilers are continuously monitored, using both laboratory techniques and on-line instrumentation, in order to maintain the chemical quality within pre-determined limits. The sampling systems in use and some of the difficulties associated with sampling requirements are discussed. The relative merits of chemical instruments installed either locally in various parts of the plant or in centralized instrument rooms are reviewed. The quality of water in nuclear boilers, as with all high-pressure steam-raising plant, is extremely high; consequently very sensitive analytical procedures are required, particularly for monitoring the feed-water of once-through boiler systems. Considerable progress has been made in this field, and examples are given of some of the techniques developed for analyses at the μg/kg level, together with some of the current problems. (author)

  8. 42 CFR 431.814 - Sampling plan and procedures.

    Science.gov (United States)

    2010-10-01

    ... reliability of the reduced sample. (4) The sample selection procedure. Systematic random sampling is... sampling, and yield estimates with the same or better precision than achieved in systematic random sampling... 42 Public Health 4 2010-10-01 2010-10-01 false Sampling plan and procedures. 431.814 Section 431...

  9. Techniques for sampling nuclear waste tank contents and in situ measurement of activity

    International Nuclear Information System (INIS)

    Lawrence, R.C.

    1978-04-01

    A study was conducted to develop suitable sampling equipment and techniques for characterizing the mechanical properties of nuclear wastes; identifying effective means of measuring radiation levels, temperatures, and neutron fluxes in situ in wastes; and developing a waste core sampler. A portable, stainless steel probe was developed which is placed in the tank through a riser. This probe is built for the insertion of instrumentation that can measure the contents of the tank at any level and take temperature, radiation, and neutron activation readings with reliable accuracy. A simple and reliable instrument for the in situ extraction of waste materials ranging from liquid to concrete-like substances was also developed. This portable, stainless steel waste core sampler can remove up to one liter of radioactive waste from tanks for transportation to hot cell laboratories for analysis of hardness, chemical form, and isotopic content. A cask for transporting the waste samples from the tanks to the laboratory under radiation-protected conditions was also fabricated. This cask was designed with a ''boot'' or inner-seal liner to contain any radioactive wastes that might remain on the outside of the waste core sampling device

  10. Variances in the projections, resulting from CLIMEX, Boosted Regression Trees and Random Forests techniques

    Science.gov (United States)

    Shabani, Farzin; Kumar, Lalit; Solhjouy-fard, Samaneh

    2017-08-01

    The aim of this study was to carry out a comparative investigation and evaluation of the capabilities of correlative and mechanistic modeling processes, applied to the projection of future distributions of date palm in novel environments, and to establish a method of minimizing uncertainty in the projections of differing techniques. On a global scale, the study area comprises Middle Eastern countries. We compared the mechanistic model CLIMEX (CL) with the correlative models MaxEnt (MX), Boosted Regression Trees (BRT), and Random Forests (RF) to project current and future distributions of date palm (Phoenix dactylifera L.). The Global Climate Model (GCM) CSIRO-Mk3.0 (CS), using the A2 emissions scenario, was selected for making projections. Both indigenous and alien distribution data for the species were utilized in the modeling process. The common areas predicted by MX, BRT, RF, and CL from the CS GCM were extracted and compared to ascertain the projection uncertainty levels of each individual technique. The common areas identified by all four modeling techniques were used to produce a map indicating suitable and unsuitable areas for date palm cultivation in Middle Eastern countries, for the present and the year 2100. The four different modeling approaches predict fairly different distributions. Projections from CL were more conservative than those from MX. BRT and RF were the most conservative methods in terms of projections for the current time. The combination of the final CL and MX projections for the present and 2100 provides higher certainty concerning those areas that will become highly suitable for future date palm cultivation. According to the four models, cold, hot, and wet stress, with differences on a regional basis, appear to be the major restrictions on future date palm distribution. The results demonstrate variances in the projections resulting from different techniques. The assessment and interpretation of model projections requires reservations

  11. Stratified random sampling plans designed to assist in the determination of radon and radon daughter concentrations in underground uranium mine atmosphere

    International Nuclear Information System (INIS)

    Makepeace, C.E.

    1981-01-01

    Sampling strategies for the monitoring of deleterious agents present in uranium mine air in underground and surface mining areas are described. These methods are designed to prevent overexposure of the lining of the respiratory system of uranium miners to ionizing radiation from radon and radon daughters, and whole body overexposure to external gamma radiation. A detailed description is provided of stratified random sampling monitoring methodology for obtaining baseline data to be used as a reference for subsequent compliance assessment

  12. Random walk of passive tracers among randomly moving obstacles.

    Science.gov (United States)

    Gori, Matteo; Donato, Irene; Floriani, Elena; Nardecchia, Ilaria; Pettini, Marco

    2016-04-14

    This study is mainly motivated by the need of understanding how the diffusion behavior of a biomolecule (or even of a larger object) is affected by other moving macromolecules, organelles, and so on, inside a living cell, whence the possibility of understanding whether or not a randomly walking biomolecule is also subject to a long-range force field driving it to its target. By means of the Continuous Time Random Walk (CTRW) technique the topic of random walk in random environment is here considered in the case of a passively diffusing particle among randomly moving and interacting obstacles. The relevant physical quantity which is worked out is the diffusion coefficient of the passive tracer which is computed as a function of the average inter-obstacles distance. The results reported here suggest that if a biomolecule, let us call it a test molecule, moves towards its target in the presence of other independently interacting molecules, its motion can be considerably slowed down.
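
    The qualitative setup can be mimicked with a toy simulation (a sketch only, not the paper's CTRW analysis): a tracer takes random steps that are rejected when they land within an exclusion radius of any obstacle, the obstacles themselves diffuse, and a crude single-trajectory diffusion coefficient is read off the mean squared displacement, MSD ≈ 4Dt in 2D.

        # Toy tracer among randomly moving obstacles in a periodic box.
        import numpy as np

        rng = np.random.default_rng(11)
        L, n_obs, r_excl, n_steps = 20.0, 60, 0.5, 2000
        obstacles = rng.uniform(0.0, L, (n_obs, 2))
        tracer = np.array([L / 2, L / 2])
        start = tracer.copy()

        for _ in range(n_steps):
            obstacles = (obstacles + 0.05 * rng.standard_normal((n_obs, 2))) % L
            trial = tracer + 0.1 * rng.standard_normal(2)
            # reject moves into any obstacle's exclusion zone (minimum image)
            sep = (obstacles - trial + L / 2) % L - L / 2
            if np.min(np.linalg.norm(sep, axis=1)) > r_excl:
                tracer = trial

        msd = np.sum((tracer - start) ** 2)
        print("crude D estimate:", msd / (4.0 * n_steps))   # unit time step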

  13. Assessment of Random Assignment in Training and Test Sets using Generalized Cluster Analysis Technique

    Directory of Open Access Journals (Sweden)

    Sorana D. BOLBOACĂ

    2011-06-01

    Full Text Available Aim: The appropriateness of the random assignment of compounds to training and validation sets was assessed using the generalized cluster technique. Material and Method: A quantitative structure-activity relationship model using the Molecular Descriptors Family on Vertices was evaluated in terms of the assignment of carboquinone derivatives to training and test sets during the leave-many-out analysis. Assignment of compounds was investigated using five variables: observed anticancer activity and four structure descriptors. Generalized cluster analysis with the K-means algorithm was applied in order to investigate whether the assignment of compounds was proper. The Euclidean distance and maximization of the initial distance using a cross-validation with a v-fold of 10 were applied. Results: All five variables included in the analysis proved to have a statistically significant contribution to the identification of clusters. Three clusters were identified, each of them containing carboquinone derivatives belonging to both the training and the test sets. The observed activity of carboquinone derivatives proved to be normally distributed within every cluster. The presence of training and test set compounds in all clusters identified using generalized cluster analysis with the K-means algorithm, and the distribution of observed activity within clusters, support a proper assignment of compounds to training and test sets. Conclusion: Generalized cluster analysis using the K-means algorithm proved to be a valid method for assessing the random assignment of carboquinone derivatives to training and test sets.
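
    The check itself is easy to reproduce: cluster the pooled compounds on activity plus descriptors and verify that every cluster contains members of both sets. A generic scikit-learn sketch with synthetic stand-ins for the carboquinone data:

        # K-means check of a random train/test split (synthetic data).
        import numpy as np
        import pandas as pd
        from sklearn.cluster import KMeans
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(2)
        n = 37
        features = rng.standard_normal((n, 5))     # activity + 4 descriptors
        split = np.where(rng.random(n) < 0.7, "train", "test")

        z = StandardScaler().fit_transform(features)
        labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(z)
        print(pd.crosstab(labels, split))          # both sets in every cluster?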

  14. The development and use of parametric sampling techniques for probabilistic risk assessment

    International Nuclear Information System (INIS)

    Dalrymple, G.J.; Broyd, T.W.

    1987-01-01

    In order to enable evaluation to be made of proposals for the underground disposal of low and intermediate level radioactive wastes in the United Kingdom, the Department of the Environment (DoE) research programme includes development of computer-based methods for use in a multistage assessment process. To test the adequacy of the various methods of data acquisition and radiological assessment, a mock assessment exercise is currently being conducted by the department. This paper outlines the proposed methodology, which provides for the use of probabilistic modelling based upon the Atomic Energy of Canada Ltd SYVAC variability analysis approach, using new models (SYVAC 'A') and data appropriate to UK conditions for a deep horizontal tunnel repository concept. This chapter describes the choice of a suitable technique for sampling the data input to the SYVAC 'A' model and techniques for analysing the predictions of dose and risk made by the model. The sensitivity of the model predictions (risk and dose to man) to the input parameters was compared for four different methods. All four methods identified the same geological parameters as the most important. (author)

  15. The quality of the reported sample size calculations in randomized controlled trials indexed in PubMed.

    Science.gov (United States)

    Lee, Paul H; Tse, Andy C Y

    2017-05-01

    There are limited data on the quality of reporting of the information essential for replicating sample size calculations, as well as on the accuracy of those calculations. We examined the current quality of reporting of the sample size calculation in randomized controlled trials (RCTs) published in PubMed, the variation in reporting across study design, study characteristics, and journal impact factor, and the targeted sample sizes reported in trial registries. We reviewed and analyzed all RCTs published in December 2014 in journals indexed in PubMed. The 2014 impact factors of the journals were used as proxies for their quality. Of the 451 analyzed papers, 58.1% reported an a priori sample size calculation. Nearly all papers provided the level of significance (97.7%) and desired power (96.6%), and most reported the minimum clinically important effect size (73.3%). The median percentage difference between the reported and recalculated sample sizes was 0.0% (IQR -4.6% to 3.0%). The accuracy of the reported sample size was better for studies published in journals that endorsed the CONSORT statement and journals with an impact factor. A total of 98 papers provided a targeted sample size in trial registries, and about two-thirds of these papers (n=62) reported a sample size calculation, but only 25 (40.3%) showed no discrepancy with the number reported in the trial registries. The reporting of sample size calculations in RCTs published in PubMed-indexed journals and trial registries was poor. The CONSORT statement should be more widely endorsed. Copyright © 2016 European Federation of Internal Medicine. Published by Elsevier B.V. All rights reserved.
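
    Replicating a reported calculation needs only the significance level, the desired power, the outcome SD, and the minimum clinically important difference. For a two-arm comparison of means, the usual normal-approximation formula is n per group = 2σ²(z_{1-α/2} + z_{1-β})²/δ²; a short sketch:

        # Per-group sample size for comparing two means (normal approximation).
        import math
        from scipy.stats import norm

        def n_per_group(delta, sigma, alpha=0.05, power=0.80):
            z_alpha = norm.ppf(1.0 - alpha / 2.0)
            z_beta = norm.ppf(power)
            return math.ceil(2.0 * (sigma * (z_alpha + z_beta) / delta) ** 2)

        print(n_per_group(delta=5.0, sigma=10.0))  # 63 per group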

  16. X-Ray Micro-Computed Tomography of Apollo Samples as a Curation Technique Enabling Better Research

    Science.gov (United States)

    Ziegler, R. A.; Almeida, N. V.; Sykes, D.; Smith, C. L.

    2014-01-01

    X-ray micro-computed tomography (micro-CT) is a technique that has been used to research meteorites for some time, and recently it has become a more common tool for the curation of meteorites and Apollo samples. Micro-CT is ideally suited to the characterization of astromaterials in the curation process, as it can provide textural and compositional information at a small spatial resolution rapidly, nondestructively, and without compromising the cleanliness of the samples (e.g., samples can be scanned sealed in Teflon bags). These data can then inform scientists and curators when making and processing future sample requests for meteorites and Apollo samples. Here we present some preliminary results on micro-CT scans of four Apollo regolith breccias. Methods: Portions of four Apollo samples were used in this study: 14321, 15205, 15405, and 60639. All samples were 8-10 cm in their longest dimension and approximately equant. These samples were micro-CT scanned on the Nikon HMXST 225 system at the Natural History Museum in London. Scans were made at 205-220 kV with a 135-160 microamp beam current and an effective voxel size of 21-44 microns. Results: Initial examination of the data identified a variety of mineral clasts (including sub-voxel FeNi metal grains) and lithic clasts within the regolith breccias. Textural information within some of the lithic clasts was also discernible. Of particular interest was a large basalt clast (approx. 1.3 cc) found within sample 60639, which appears to have a sub-ophitic texture. Additionally, internal void space, e.g., fractures and voids, is readily identifiable. Discussion: It is clear from the preliminary data that micro-CT analyses are able to identify important "new" clasts within the Apollo breccias and to better characterize previously described clasts or igneous samples. For example, the 60639 basalt clast was previously believed to be quite small based on its approx. 0.5 sq cm exposure on the surface of the main mass

  17. Monte Carlo Finite Volume Element Methods for the Convection-Diffusion Equation with a Random Diffusion Coefficient

    Directory of Open Access Journals (Sweden)

    Qian Zhang

    2014-01-01

    Full Text Available The paper presents a framework for the construction of a Monte Carlo finite volume element method (MCFVEM) for the convection-diffusion equation with a random diffusion coefficient, which is described as a random field. We first approximate the continuous stochastic field by a finite number of random variables via the Karhunen-Loève expansion and transform the initial stochastic problem into a deterministic one with a parameter in high dimensions. Then we generate independent identically distributed approximations of the solution by sampling the coefficient of the equation and employing a finite volume element variational formulation. Finally, the Monte Carlo (MC) method is used to compute the corresponding sample averages. The statistical error is estimated analytically and experimentally. A quasi-Monte Carlo (QMC) technique with Sobol sequences is also used to accelerate convergence, and experiments indicate that it can improve the efficiency of the Monte Carlo method.
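
    The final averaging step, and the MC-versus-QMC comparison, can be illustrated with a cheap stand-in for the parameter-to-solution map (here a 4-dimensional integrand with a known integral; scipy's Sobol generator plays the role of the QMC sequence):

        # Plain Monte Carlo vs. quasi-Monte Carlo (Sobol) sample means.
        import math
        import numpy as np
        from scipy.stats import qmc

        f = lambda y: np.exp(-np.sum(y**2, axis=1))    # stand-in for u(y)
        exact = (math.sqrt(math.pi) / 2 * math.erf(1.0)) ** 4

        rng = np.random.default_rng(0)
        mc = f(rng.random((2**12, 4))).mean()          # 4096 pseudo-random points
        sobol = qmc.Sobol(d=4, scramble=True, seed=0)
        qmc_est = f(sobol.random_base2(m=12)).mean()   # 4096 Sobol points

        print(abs(mc - exact), abs(qmc_est - exact))   # QMC error typically smaller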

  18. Randomized controlled trial of attention bias modification in a racially diverse, socially anxious, alcohol dependent sample.

    Science.gov (United States)

    Clerkin, Elise M; Magee, Joshua C; Wells, Tony T; Beard, Courtney; Barnett, Nancy P

    2016-12-01

    Attention biases may be an important treatment target for both alcohol dependence and social anxiety. This is the first ABM trial to investigate two (vs. one) targets of attention bias within a sample with co-occurring symptoms of social anxiety and alcohol dependence. Additionally, we used trial-level bias scores (TL-BS) to capture the phenomena of attention bias in a more ecologically valid, dynamic way compared to traditional attention bias scores. Adult participants (N = 86; 41% Female; 52% African American; 40% White) with elevated social anxiety symptoms and alcohol dependence were randomly assigned to an 8-session training condition in this 2 (Social Anxiety ABM vs. Social Anxiety Control) by 2 (Alcohol ABM vs. Alcohol Control) design. Symptoms of social anxiety, alcohol dependence, and attention bias were assessed across time. Multilevel models estimated the trajectories for each measure within individuals, and tested whether these trajectories differed according to the randomized training conditions. Across time, there were significant or trending decreases in all attention TL-BS parameters (but not traditional attention bias scores) and most symptom measures. However, there were not significant differences in the trajectories of change between any ABM and control conditions for any symptom measures. These findings add to previous evidence questioning the robustness of ABM and point to the need to extend the effects of ABM to samples that are racially diverse and/or have co-occurring psychopathology. The results also illustrate the potential importance of calculating trial-level attention bias scores rather than only including traditional bias scores. Copyright © 2016 Elsevier Ltd. All rights reserved.

  20. Pseudo-random number generator for the Sigma 5 computer

    Science.gov (United States)

    Carroll, S. N.

    1983-01-01

    A technique is presented for developing a pseudo-random number generator based on the linear congruential form. The two numbers used for the generator are a prime number and a corresponding primitive root, where the prime is the largest prime number that can be accurately represented on a particular computer. The primitive root is selected by applying Marsaglia's lattice test. The technique presented was applied to write a random number program for the Sigma 5 computer. The new program, named S:RANDOM1, is judged to be superior to the older program named S:RANDOM. For applications requiring several independent random number generators, a table is included showing several acceptable primitive roots. The technique and programs described can be applied to any computer having word length different from that of the Sigma 5.
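
    The form described, x_{k+1} = a·x_k mod m with m prime and a a primitive root of m, is the classic Lehmer generator. The sketch below uses the widely cited "minimal standard" constants m = 2^31 - 1 (a Mersenne prime) and a = 16807; these are standard literature values, not necessarily the constants selected for the Sigma 5 program.

        # Lehmer-style linear congruential generator: x <- (a * x) mod m.
        M = 2**31 - 1      # prime modulus 2147483647
        A = 16807          # a primitive root of M

        def lehmer(seed, n):
            x = seed
            for _ in range(n):
                x = (A * x) % M
                yield x / M            # uniform variate in (0, 1)

        print(list(lehmer(seed=12345, n=5)))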