WorldWideScience

Sample records for random sampling technique

  1. Sampling Polya-Gamma random variates: alternate and approximate techniques

    OpenAIRE

    Windle, Jesse; Polson, Nicholas G.; Scott, James G.

    2014-01-01

    Efficiently sampling from the Pólya-Gamma distribution, ${PG}(b,z)$, is an essential element of Pólya-Gamma data augmentation. Polson et al. (2013) show how to efficiently sample from the ${PG}(1,z)$ distribution. We build two new samplers that offer improved performance when sampling from the ${PG}(b,z)$ distribution with $b$ not equal to unity.
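
The sum-of-gammas series representation given by Polson, Scott and Windle (2013) suggests a simple, if slow, approximate sampler obtained by truncating the infinite series; the sketch below is an illustration of that idea, not one of the paper's two samplers, and the function name and truncation level are made up:

```python
import math
import random

def sample_pg_approx(b, z, trunc=200, rng=random):
    """Approximate draw from PG(b, z) by truncating the series
    PG(b, z) = (1 / 2 pi^2) * sum_k g_k / ((k - 1/2)^2 + z^2 / (4 pi^2)),
    where the g_k are i.i.d. Gamma(b, 1) variates."""
    total = 0.0
    for k in range(1, trunc + 1):
        g_k = rng.gammavariate(b, 1.0)
        total += g_k / ((k - 0.5) ** 2 + z ** 2 / (4.0 * math.pi ** 2))
    return total / (2.0 * math.pi ** 2)
```

A sanity check uses the known mean E[PG(b, z)] = (b / 2z) tanh(z / 2).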

  2. MUP, CEC-DES, STRADE. Codes for uncertainty propagation, experimental design and stratified random sampling techniques

    International Nuclear Information System (INIS)

    Amendola, A.; Astolfi, M.; Lisanti, B.

    1983-01-01

    The report describes how to use the codes: MUP (Monte Carlo Uncertainty Propagation) for uncertainty analysis by Monte Carlo simulation, including correlation analysis, extreme value identification and study of selected ranges of the variable space; CEC-DES (Central Composite Design) for building experimental matrices according to the requirements of Central Composite and Factorial Experimental Designs; and STRADE (Stratified Random Design) for experimental designs based on Latin Hypercube Sampling techniques. Application fields of the codes are probabilistic risk assessment, experimental design, sensitivity analysis and system identification problems.

  3. Independent random sampling methods

    CERN Document Server

    Martino, Luca; Míguez, Joaquín

    2018-01-01

    This book systematically addresses the design and analysis of efficient techniques for independent random sampling. Both general-purpose approaches, which can be used to generate samples from arbitrary probability distributions, and tailored techniques, designed to efficiently address common real-world practical problems, are introduced and discussed in detail. In turn, the monograph presents fundamental results and methodologies in the field, elaborating and developing them into the latest techniques. The theory and methods are illustrated with a varied collection of examples, which are discussed in detail in the text and supplemented with ready-to-run computer code. The main problem addressed in the book is how to generate independent random samples from an arbitrary probability distribution with the weakest possible constraints or assumptions in a form suitable for practical implementation. The authors review the fundamental results and methods in the field, address the latest methods, and emphasize the li...

  4. Random sampling technique for ultra-fast computations of molecular opacities for exoplanet atmospheres

    Science.gov (United States)

    Min, M.

    2017-10-01

    Context. Opacities of molecules in exoplanet atmospheres rely on increasingly detailed line lists for these molecules. The line lists available today contain, for many species, up to several billion lines. Computing the spectral line profile created by pressure and temperature broadening, the Voigt profile, for all of these lines is becoming a computational challenge. Aims: We aim to create a method to compute the Voigt profile in a way that automatically focuses the computation time on the strongest lines, while still maintaining the continuum contribution of the large number of weaker lines. Methods: Here, we outline a statistical line sampling technique that samples the Voigt profile quickly and with high accuracy. The number of samples is adjusted to the strength of the line and the local spectral line density. This automatically provides high accuracy line shapes for strong lines or lines that are spectrally isolated. The line sampling technique automatically preserves the integrated line opacity for all lines, thereby also providing the continuum opacity created by the large number of weak lines at very low computational cost. Results: The line sampling technique is tested for accuracy when computing line spectra and correlated-k tables. Extremely fast computations (3.5 × 10^5 lines per second per core on a standard current-day desktop computer) with high accuracy (≤1% almost everywhere) are obtained. A detailed recipe on how to perform the computations is given.
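
The core idea described above, that each line receives a number of Monte Carlo samples and each sample deposits an equal share of the integrated line strength, can be sketched for a single Lorentzian line. This is an illustrative reconstruction, not the paper's code; the grid layout and function name are assumptions:

```python
import bisect
import math
import random

def deposit_line(grid_edges, center, gamma, strength, n_samples, rng=random):
    """Monte Carlo sample a Lorentzian line profile onto a spectral grid.
    Each sample carries strength / n_samples, so the integrated line
    opacity is preserved regardless of n_samples (up to clipping at the
    grid edges)."""
    opacity = [0.0] * (len(grid_edges) - 1)
    weight = strength / n_samples
    for _ in range(n_samples):
        # Inverse-CDF draw of a frequency from the Lorentzian profile.
        nu = center + gamma * math.tan(math.pi * (rng.random() - 0.5))
        if grid_edges[0] <= nu < grid_edges[-1]:
            opacity[bisect.bisect_right(grid_edges, nu) - 1] += weight
    return opacity
```

Strong lines get many samples and hence accurate shapes; a weak line can get a single sample and still contribute the correct continuum opacity on average.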

  5. Landslide Susceptibility Assessment Using Frequency Ratio Technique with Iterative Random Sampling

    Directory of Open Access Journals (Sweden)

    Hyun-Joo Oh

    2017-01-01

    Full Text Available This paper assesses the performance of landslide susceptibility analysis using the frequency ratio (FR) method with iterative random sampling. A pair of before-and-after digital aerial photographs with 50 cm spatial resolution was used to detect landslide occurrences in the Yongin area, Korea. Iterative random sampling was run ten times in total, and each time it was applied to the training and validation datasets. Thirteen landslide causative factors were derived from the topographic, soil, forest, and geological maps. The FR scores were calculated from the causative factors and training occurrences in each of the ten iterations. The ten landslide susceptibility maps were obtained from the integration of the causative factors with their assigned FR scores, and each map was validated with the corresponding validation dataset. The FR method achieved susceptibility accuracies from 89.48% to 93.21%, i.e. above 89% in every iteration. Moreover, the ten-fold iterative FR modeling may contribute to a better understanding of a regularized relationship between the causative factors and landslide susceptibility. This makes it possible to incorporate knowledge-driven considerations of the causative factors into the landslide susceptibility analysis, and the approach can also be extended to other areas.
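
The frequency ratio score itself is simple: for each class of a causative factor, the share of landslide cells falling in that class divided by the share of all cells in the class. A minimal sketch (the data layout and names are illustrative, not the paper's implementation):

```python
from collections import Counter

def frequency_ratio(class_of, landslide_cells, all_cells):
    """FR score per factor class: (fraction of landslide cells in the
    class) / (fraction of all cells in the class). FR > 1 marks classes
    over-represented among landslide occurrences."""
    ls = Counter(class_of[c] for c in landslide_cells)
    tot = Counter(class_of[c] for c in all_cells)
    n_ls, n_tot = len(landslide_cells), len(all_cells)
    return {k: (ls.get(k, 0) / n_ls) / (tot[k] / n_tot) for k in tot}
```

The iterative scheme in the paper repeats this with ten random train/validation splits and integrates the FR scores over all thirteen factors.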

  6. Application of bias factor method using random sampling technique for prediction accuracy improvement of critical eigenvalue of BWR

    International Nuclear Information System (INIS)

    Ito, Motohiro; Endo, Tomohiro; Yamamoto, Akio; Kuroda, Yusuke; Yoshii, Takashi

    2017-01-01

    The bias factor method based on the random sampling technique is applied to the benchmark problem of Peach Bottom Unit 2. The validity and applicability of the method, i.e. the correction of calculation results and the reduction of uncertainty, are confirmed, along with its features and performance. In the present study, core characteristics in cycle 3 are corrected with the proposed method using predicted and 'measured' critical eigenvalues in cycles 1 and 2. As the source of uncertainty, the variance-covariance of cross sections is considered. The calculation results indicate that the bias between predicted and measured results, and the uncertainty owing to cross sections, can be reduced. Extension to other uncertainties such as thermal-hydraulic properties will be a future task. (author)

  7. Application of a stratified random sampling technique to the estimation and minimization of respirable quartz exposure to underground miners

    International Nuclear Information System (INIS)

    Makepeace, C.E.; Horvath, F.J.; Stocker, H.

    1981-11-01

    The aim of a stratified random sampling plan is to provide the best estimate (in the absence of full-shift personal gravimetric sampling) of personal exposure to respirable quartz among underground miners. One also gains information on the exposure distribution of all the miners at the same time. Three variables (or strata) are considered in the present scheme: locations, occupations and times of sampling. Random sampling within each stratum ensures that each location, occupation and time of sampling has an equal opportunity of being selected without bias. Following implementation of the plan and analysis of the collected data, one can determine the individual exposures and the mean. This information can then be used to identify those groups whose exposure contributes significantly to the collective exposure. In turn, this identification, along with other considerations, allows the mine operator to carry out a cost-benefit optimization and eventual implementation of engineering controls for these groups. This optimization and engineering control procedure, together with the random sampling plan, can then be used in an iterative manner to minimize the mean value of the distribution and the collective exposures.
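
The sampling step described above, random selection within each location/occupation/time stratum so that every member has an equal chance of inclusion, can be sketched as follows (the stratum names and allocation are illustrative):

```python
import random

def stratified_random_sample(strata, k_per_stratum, rng=random):
    """Simple random sample without replacement inside each stratum,
    giving every member of a stratum equal inclusion probability."""
    return {name: rng.sample(members, min(k_per_stratum, len(members)))
            for name, members in strata.items()}
```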

  8. An unbiased estimator of the variance of simple random sampling using mixed random-systematic sampling

    OpenAIRE

    Padilla, Alberto

    2009-01-01

    Systematic sampling is a commonly used technique due to its simplicity and ease of implementation. The drawback of this simplicity is that it is not possible to estimate the design variance without bias. There are several ways to circumvent this problem. One method is to suppose that the variable of interest has a random order in the population, so the sample variance of simple random sampling without replacement is used. By means of a mixed random - systematic sample, an unbiased estimator o...

  9. Generation and Analysis of Constrained Random Sampling Patterns

    DEFF Research Database (Denmark)

    Pierzchlewski, Jacek; Arildsen, Thomas

    2016-01-01

    Random sampling is a technique for signal acquisition which is gaining popularity in practical signal processing systems. Nowadays, event-driven analog-to-digital converters make random sampling feasible in practical applications. A process of random sampling is defined by a sampling pattern, which...... indicates signal sampling points in time. Practical random sampling patterns are constrained by ADC characteristics and application requirements. In this paper, we introduce statistical methods which evaluate random sampling pattern generators with emphasis on practical applications. Furthermore, we propose...... algorithm generates random sampling patterns dedicated to event-driven ADCs better than existing sampling pattern generators. Finally, implementation issues of random sampling patterns are discussed....
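
One common ADC-driven constraint is a minimum interval between consecutive sampling instants. A standard way to generate such a pattern (a generic illustration, not necessarily the authors' algorithm) is to draw unconstrained uniform points in the remaining "slack" interval, sort them, and re-insert the mandatory gaps:

```python
import random

def constrained_pattern(n_points, t_end, t_min, rng=random):
    """Random increasing sampling pattern on [0, t_end] in which every
    pair of consecutive sampling instants is at least t_min apart."""
    slack = t_end - (n_points - 1) * t_min
    if slack < 0:
        raise ValueError("constraints are infeasible")
    u = sorted(rng.uniform(0.0, slack) for _ in range(n_points))
    return [u[i] + i * t_min for i in range(n_points)]
```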

  10. Sample size estimation and sampling techniques for selecting a representative sample

    Directory of Open Access Journals (Sweden)

    Aamir Omair

    2014-01-01

    Full Text Available Introduction: The purpose of this article is to provide a general understanding of the concepts of sampling as applied to health-related research. Sample Size Estimation: It is important to select a representative sample in quantitative research in order to be able to generalize the results to the target population. The sample should be of the required sample size and must be selected using an appropriate probability sampling technique. There are many hidden biases which can adversely affect the outcome of the study. Important factors to consider for estimating the sample size include the size of the study population, the confidence level, the expected proportion of the outcome variable (for categorical variables) or the standard deviation of the outcome variable (for numerical variables), and the required precision (margin of accuracy) of the study. The greater the precision required, the larger the required sample size. Sampling Techniques: The probability sampling techniques applied in health-related research include simple random sampling, systematic random sampling, stratified random sampling, cluster sampling, and multistage sampling. These are recommended over the nonprobability sampling techniques because the results of the study can then be generalized to the target population.
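
For a categorical outcome, the factors listed above combine into the standard formula n = z²·p(1−p)/d², where p is the expected proportion, d the required margin of error and z the normal quantile for the chosen confidence level. A sketch:

```python
import math

def sample_size_proportion(p, margin, z=1.96):
    """Minimum sample size to estimate a proportion p within the given
    margin of error, at ~95% confidence by default (z = 1.96)."""
    return math.ceil(z ** 2 * p * (1.0 - p) / margin ** 2)
```

With p = 0.5 (the most conservative choice) and a 5% margin this gives the familiar n = 385; a finite-population correction can be applied afterwards for small study populations.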

  11. Systematic versus random sampling in stereological studies.

    Science.gov (United States)

    West, Mark J

    2012-12-01

    The sampling that takes place at all levels of an experimental design must be random if the estimate is to be unbiased in a statistical sense. There are two fundamental ways by which one can make a random sample of the sections and positions to be probed on the sections. Using a card-sampling analogy, one can pick any card at all out of a deck of cards. This is referred to as independent random sampling because the sampling of any one card is made without reference to the position of the other cards. The other approach to obtaining a random sample would be to pick a card at random within a set number of cards and the others at equal intervals within the deck. Systematic sampling along one axis of many biological structures is more efficient than independent random sampling, because most biological structures are not randomly organized. This article discusses the merits of systematic versus random sampling in stereological studies.
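
The card analogy translates directly into code; in this sketch (with illustrative names), independent sampling draws each item freely, while systematic sampling fixes one random start and a constant interval:

```python
import random

def independent_sample(population, k, rng=random):
    """Each item is chosen without reference to the others."""
    return rng.sample(population, k)

def systematic_sample(population, k, rng=random):
    """Random start in [0, N/k), then every (N/k)-th item after it."""
    step = len(population) / k
    start = rng.uniform(0.0, step)
    return [population[int(start + i * step)] for i in range(k)]
```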

  12. A Bayesian Justification for Random Sampling in Sample Survey

    Directory of Open Access Journals (Sweden)

    Glen Meeden

    2012-07-01

    Full Text Available In the usual Bayesian approach to survey sampling, the sampling design plays a minimal role, at best. Although a close relationship between exchangeable prior distributions and simple random sampling has been noted, how to formally integrate simple random sampling into the Bayesian paradigm is not clear. Recently it has been argued that the sampling design can be thought of as part of a Bayesian's prior distribution. We show here that, under this scenario, simple random sampling can be given a Bayesian justification in survey sampling.

  13. Systematic random sampling of the comet assay.

    Science.gov (United States)

    McArt, Darragh G; Wasson, Gillian R; McKerr, George; Saetzler, Kurt; Reed, Matt; Howard, C Vyvyan

    2009-07-01

    The comet assay is a technique used to quantify DNA damage and repair at a cellular level. In the assay, cells are embedded in agarose and the cellular content is stripped away, leaving only the DNA trapped in an agarose cavity which can then be electrophoresed. The damaged DNA can enter the agarose and migrate while the undamaged DNA cannot and is retained. DNA damage is measured as the proportion of the migratory 'tail' DNA compared to the total DNA in the cell. The fundamental basis of these arbitrary values is obtained in the comet acquisition phase using fluorescence microscopy with a stoichiometric stain in tandem with image analysis software. Comet selection in such an acquisition is expected to be both objective and random. In this paper we examine the 'randomness' of the acquisition phase and suggest an alternative method that offers both objective and unbiased comet selection. In order to achieve this, we have adopted a survey sampling approach widely used in stereology, which offers a method of systematic random sampling (SRS). This is desirable as it offers an impartial and reproducible method of comet analysis that can be used either manually or automated. By making use of an unbiased sampling frame and using microscope verniers, we are able to increase the precision of estimates of DNA damage. Results obtained from a multiple-user pooled variation experiment showed that the SRS technique attained a lower variability than that of the traditional approach. The analysis of a single user with repetition showed greater individual variances while not being detrimental to overall averages. This suggests that the SRS method offers a better reflection of DNA damage for a given slide and also better user reproducibility.

  14. k-Means: Random Sampling Procedure

    Indian Academy of Sciences (India)

    k-Means: Random Sampling Procedure. The optimal 1-mean can be approximated by the centroid of a random sample (Inaba et al.): if S is a random sample of size O(1/ε), then the centroid of S is a (1+ε)-approximate centroid of P with constant probability.
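
Under the stated result (attributed on the slide to Inaba et al.), the procedure is just "take the centroid of a small random sample"; a sketch with an illustrative function name:

```python
import random

def approx_1mean(points, sample_size, rng=random):
    """Centroid of a small random sample S of the point set P; per the
    Inaba et al. result, |S| = O(1/eps) suffices for a (1 + eps)-
    approximate 1-mean with constant probability."""
    s = rng.sample(points, sample_size)
    dim = len(points[0])
    return tuple(sum(p[j] for p in s) / len(s) for j in range(dim))
```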

  15. Development of SYVAC sampling techniques

    International Nuclear Information System (INIS)

    Prust, J.O.; Dalrymple, G.J.

    1985-04-01

    This report describes the requirements of a sampling scheme for use with the SYVAC radiological assessment model. The constraints on the number of samples that may be taken are considered. The conclusions from earlier studies using the deterministic generator sampling scheme are summarised. The method of Importance Sampling and a High Dose algorithm, which are designed to preferentially sample the high dose region of the parameter space, are reviewed in the light of experience gained from earlier studies and the requirements of a site assessment and sensitivity analyses. In addition, the use of an alternative numerical integration method for estimating risk is discussed. It is recommended that the method of Importance Sampling is developed and tested for use with SYVAC. An alternative numerical integration method is not recommended for investigation at this stage but should be the subject of future work. (author)

  16. Two sampling techniques for game meat

    OpenAIRE

    van der Merwe, Maretha; Jooste, Piet J.; Hoffman, Louw C.; Calitz, Frikkie J.

    2013-01-01

    A study was conducted to compare the excision sampling technique used by the export market and the sampling technique preferred by European countries, namely the biotrace cattle and swine test. The measuring unit for the excision sampling was grams (g) and square centimetres (cm2) for the swabbing technique. The two techniques were compared after a pilot test was conducted on spiked approved beef carcasses (n = 12) that statistically proved the two measuring units correlated. The two sampling...

  17. Sampling problems for randomly broken sticks

    Energy Technology Data Exchange (ETDEWEB)

    Huillet, Thierry [Laboratoire de Physique Theorique et Modelisation, CNRS-UMR 8089 et Universite de Cergy-Pontoise, 5 mail Gay-Lussac, 95031, Neuville sur Oise (France)

    2003-04-11

    Consider the random partitioning model of a population (represented by a stick of length 1) into n species (fragments) with identically distributed random weights (sizes). Upon ranking the fragments' weights according to ascending sizes, let S{sub m:n} be the size of the mth smallest fragment. Assume that some observer is sampling such populations as follows: drop at random k points (the sample size) onto this stick and record the corresponding numbers of visited fragments. We shall investigate the following sampling problems: (1) What is the sample size if the sampling is carried out until the first visit of the smallest fragment (size S{sub 1:n})? (2) For a given sample size, have all the fragments of the stick been visited at least once or not? This question is related to Feller's random coupon collector problem. (3) In what order are new fragments being discovered, and what is the random number of samples separating the discovery of consecutive new fragments until exhaustion of the list? For this problem, the distribution of the size-biased permutation of the species' weights (that is, the sequence of their weights in their order of appearance) is needed and studied.
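
The observer's experiment is straightforward to simulate: break the stick with n − 1 uniform cuts, drop k uniform sample points, and record which fragments were hit (the names below are illustrative):

```python
import bisect
import random

def break_stick(n, rng=random):
    """Partition [0, 1] into n fragments using n - 1 uniform cuts;
    returns the n + 1 fragment edges."""
    return [0.0] + sorted(rng.random() for _ in range(n - 1)) + [1.0]

def visited_fragments(edges, k, rng=random):
    """Drop k uniform sample points and return the set of fragment
    indices visited at least once."""
    return {bisect.bisect_right(edges, rng.random()) - 1 for _ in range(k)}
```

Problem (2) above asks, in coupon-collector style, how large k must be before the visited set contains all n fragments.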

  18. Spectral Estimation by the Random Dec Technique

    DEFF Research Database (Denmark)

    Brincker, Rune; Jensen, Jacob L.; Krenk, Steen

    1990-01-01

    This paper contains an empirical study of the accuracy of the Random Dec (RDD) technique. Realizations of the response from a single-degree-of-freedom system loaded by white noise are simulated using an ARMA model. The Autocorrelation function is estimated using the RDD technique and the estimated...

  19. Spectral Estimation by the Random DEC Technique

    DEFF Research Database (Denmark)

    Brincker, Rune; Jensen, J. Laigaard; Krenk, S.

    This paper contains an empirical study of the accuracy of the Random Dec (RDD) technique. Realizations of the response from a single-degree-of-freedom system loaded by white noise are simulated using an ARMA model. The Autocorrelation function is estimated using the RDD technique and the estimated...

  20. Urine sampling techniques in symptomatic primary-care patients

    DEFF Research Database (Denmark)

    Holm, Anne; Aabenhus, Rune

    2016-01-01

    Background: Choice of urine sampling technique in urinary tract infection may impact diagnostic accuracy and thus lead to possible over- or undertreatment. Currently no evidence-based consensus exists regarding correct sampling technique of urine from women with symptoms of urinary tract infection...... a randomized or paired design to compare the result of urine culture obtained with two or more collection techniques in adult, female, non-pregnant patients with symptoms of urinary tract infection. We evaluated quality of the studies and compared accuracy based on dichotomized outcomes. Results: We included...... in infection rate between mid-stream-clean-catch, mid-stream-urine and random samples. Conclusions: At present, no evidence suggests that sampling technique affects the accuracy of the microbiological diagnosis in non-pregnant women with symptoms of urinary tract infection in primary care. However...

  1. Comparison of sampling techniques for use in SYVAC

    International Nuclear Information System (INIS)

    Dalrymple, G.J.

    1984-01-01

    The Stephen Howe review (reference TR-STH-1) recommended the use of a deterministic generator (DG) sampling technique for sampling the input values to the SYVAC (SYstems Variability Analysis Code) program. This technique was compared with Monte Carlo simple random sampling (MC) by taking a 1000-run case of SYVAC using MC as the reference case. The results show that DG appears relatively inaccurate for most values of consequence when used with 11 sample intervals. If 22 sample intervals are used, then DG generates cumulative distribution functions that are statistically similar to the reference distribution. 400 runs of DG or MC are adequate to generate a representative cumulative distribution function. The MC technique appears to perform better than DG for the same number of runs. However, DG predicts higher doses, and in view of the importance of generating data in the high dose region, this sampling technique with 22 sample intervals is recommended for use in SYVAC. (author)
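
Judging whether two sampling schemes produce "statistically similar" cumulative distribution functions, as done in this comparison, can be sketched with a two-sample Kolmogorov-Smirnov-style statistic (a generic illustration, not the review's actual test):

```python
def ecdf(sample, x):
    """Empirical CDF of a sample evaluated at x."""
    return sum(1 for v in sample if v <= x) / len(sample)

def max_cdf_distance(a, b):
    """Largest vertical gap between the empirical CDFs of samples a
    and b (the two-sample Kolmogorov-Smirnov statistic)."""
    points = sorted(set(a) | set(b))
    return max(abs(ecdf(a, x) - ecdf(b, x)) for x in points)
```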

  2. Newly introduced sample preparation techniques: towards miniaturization.

    Science.gov (United States)

    Costa, Rosaria

    2014-01-01

    Sampling and sample preparation are of crucial importance in an analytical procedure, quite often representing a source of errors. The technique chosen for the isolation of analytes greatly affects the success of a chemical determination. On the other hand, growing concerns about environmental and human safety, along with the introduction of international regulations for quality control, have moved the interest of scientists towards specific needs. Newly introduced sample preparation techniques are challenged to meet new criteria: (i) miniaturization, (ii) higher sensitivity and selectivity, and (iii) automation. In this survey, the most recent techniques introduced in the field of sample preparation will be described and discussed, along with many examples of applications.

  3. Two sampling techniques for game meat

    Directory of Open Access Journals (Sweden)

    Maretha van der Merwe

    2013-03-01

    Full Text Available A study was conducted to compare the excision sampling technique used by the export market and the sampling technique preferred by European countries, namely the biotrace cattle and swine test. The measuring unit for the excision sampling was grams (g) and square centimetres (cm2) for the swabbing technique. The two techniques were compared after a pilot test was conducted on spiked approved beef carcasses (n = 12) that statistically proved the two measuring units correlated. The two sampling techniques were conducted on the same game carcasses (n = 13) and analyses performed for aerobic plate count (APC), Escherichia coli and Staphylococcus aureus, for both techniques. A more representative result was obtained by swabbing and no damage was caused to the carcass. Conversely, the excision technique yielded fewer organisms and caused minor damage to the carcass. The recovery ratio from the sampling technique improved 5.4 times for APC, 108.0 times for E. coli and 3.4 times for S. aureus over the results obtained from the excision technique. It was concluded that the sampling methods of excision and swabbing can be used to obtain bacterial profiles from both export and local carcasses and could be used to indicate whether game carcasses intended for the local market are possibly on par with game carcasses intended for the export market and therefore safe for human consumption.

  4. Two sampling techniques for game meat.

    Science.gov (United States)

    van der Merwe, Maretha; Jooste, Piet J; Hoffman, Louw C; Calitz, Frikkie J

    2013-03-20

    A study was conducted to compare the excision sampling technique used by the export market and the sampling technique preferred by European countries, namely the biotrace cattle and swine test. The measuring unit for the excision sampling was grams (g) and square centimetres (cm2) for the swabbing technique. The two techniques were compared after a pilot test was conducted on spiked approved beef carcasses (n = 12) that statistically proved the two measuring units correlated. The two sampling techniques were conducted on the same game carcasses (n = 13) and analyses performed for aerobic plate count (APC), Escherichia coli and Staphylococcus aureus, for both techniques. A more representative result was obtained by swabbing and no damage was caused to the carcass. Conversely, the excision technique yielded fewer organisms and caused minor damage to the carcass. The recovery ratio from the sampling technique improved 5.4 times for APC, 108.0 times for E. coli and 3.4 times for S. aureus over the results obtained from the excision technique. It was concluded that the sampling methods of excision and swabbing can be used to obtain bacterial profiles from both export and local carcasses and could be used to indicate whether game carcasses intended for the local market are possibly on par with game carcasses intended for the export market and therefore safe for human consumption.

  5. Critical evaluation of sample pretreatment techniques.

    Science.gov (United States)

    Hyötyläinen, Tuulia

    2009-06-01

    Sample preparation before chromatographic separation is the most time-consuming and error-prone part of the analytical procedure. Therefore, selecting and optimizing an appropriate sample preparation scheme is a key factor in the final success of the analysis, and the judicious choice of an appropriate procedure greatly influences the reliability and accuracy of a given analysis. The main objective of this review is to critically evaluate the applicability, disadvantages, and advantages of various sample preparation techniques. Particular emphasis is placed on extraction techniques suitable for both liquid and solid samples.

  6. Acceptance sampling using judgmental and randomly selected samples

    Energy Technology Data Exchange (ETDEWEB)

    Sego, Landon H.; Shulman, Stanley A.; Anderson, Kevin K.; Wilson, John E.; Pulsipher, Brent A.; Sieber, W. Karl

    2010-09-01

    We present a Bayesian model for acceptance sampling where the population consists of two groups, each with different levels of risk of containing unacceptable items. Expert opinion, or judgment, may be required to distinguish between the high- and low-risk groups. Hence, high-risk items are likely to be identified (and sampled) using expert judgment, while the remaining low-risk items are sampled randomly. We focus on the situation where all observed samples must be acceptable. Consequently, the objective of the statistical inference is to quantify the probability that a large percentage of the unsampled items in the population are also acceptable. We demonstrate that traditional (frequentist) acceptance sampling and simpler Bayesian formulations of the problem are essentially special cases of the proposed model. We explore the properties of the model in detail, and discuss the conditions necessary to ensure that required sample sizes are a non-decreasing function of the population size. The method is applicable to a variety of acceptance sampling problems and, in particular, to environmental sampling where the objective is to demonstrate the safety of reoccupying a remediated facility that has been contaminated with a lethal agent.
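
The simplest Bayesian special case mentioned above is instructive: with a uniform prior on the per-item acceptance probability and n randomly sampled items all found acceptable, the posterior is Beta(n + 1, 1), so the probability that the acceptance probability exceeds a threshold p0 has a closed form. This sketch is that one-group special case, not the authors' two-group model:

```python
def prob_acceptable(n, p0):
    """P(acceptance probability > p0 | n random samples, all acceptable),
    under a uniform Beta(1, 1) prior: the Beta(n + 1, 1) posterior gives
    1 - p0 ** (n + 1)."""
    return 1.0 - p0 ** (n + 1)
```

For example, 58 clean samples give better than 95% posterior confidence that at least 95% of the items are acceptable.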

  7. A random sampling procedure for anisotropic distributions

    International Nuclear Information System (INIS)

    Nagrajan, P.S.; Sethulakshmi, P.; Raghavendran, C.P.; Bhatia, D.P.

    1975-01-01

    A procedure is described for sampling the scattering angle of neutrons as per specified angular distribution data. The cosine of the scattering angle is written as a double Legendre expansion in the incident neutron energy and a random number. The coefficients of the expansion are given for C, N, O, Si, Ca, Fe and Pb and these elements are of interest in dosimetry and shielding. (author)

  8. Multiuser Random Coding Techniques for Mismatched Decoding

    OpenAIRE

    Scarlett, Jonathan; Martinez, Alfonso; Guillén i Fàbregas, Albert

    2016-01-01

    This paper studies multiuser random coding techniques for channel coding with a given (possibly suboptimal) decoding rule. For the mismatched discrete memoryless multiple-access channel, an error exponent is obtained that is tight with respect to the ensemble average, and positive within the interior of Lapidoth's achievable rate region. This exponent proves the ensemble tightness of the exponent of Liu and Hughes in the case of maximum-likelihood decoding. An equivalent dual form of Lapidoth...

  9. NAIL SAMPLING TECHNIQUE AND ITS INTERPRETATION

    Directory of Open Access Journals (Sweden)

    TZAR MN

    2011-01-01

    Full Text Available The clinical suspicion of onychomycosis, based on the appearance of the nails, requires culture for confirmation. This is because treatment requires prolonged use of systemic agents which may cause side effects. One of the common problems encountered is improper nail sampling technique, which results in the loss of essential information. The unfamiliar terminologies used in reporting culture results may intimidate physicians, resulting in misinterpretation and hampering treatment decisions. This article provides a simple guide on nail sampling technique and the interpretation of culture results.

  10. BWIP-RANDOM-SAMPLING, Random Sample Generation for Nuclear Waste Disposal

    International Nuclear Information System (INIS)

    Sagar, B.

    1989-01-01

    1 - Description of program or function: Random samples for different distribution types are generated. The distribution types required for performance-assessment modeling of geologic nuclear waste disposal are provided: uniform; log-uniform (base 10 or natural); normal; lognormal (base 10 or natural); exponential; Bernoulli; and user-defined continuous distributions. 2 - Method of solution: A linear congruential generator is used for uniform random numbers. A set of functions is used to transform the uniform distribution to the other distributions. Stratified, rather than random, sampling can be chosen. Truncation limits can be specified on many distributions whose usual definition has an infinite support. 3 - Restrictions on the complexity of the problem: Generation of correlated random variables is not included.
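
The two ingredients named in the method of solution (a linear congruential generator plus inverse-CDF transformation functions) can be sketched as follows; the multiplier and increment are the well-known Numerical Recipes constants, not necessarily those used in BWIP-RANDOM-SAMPLING:

```python
import math

def lcg(seed, a=1664525, c=1013904223, m=2 ** 32):
    """Linear congruential generator yielding uniforms in [0, 1)."""
    state = seed
    while True:
        state = (a * state + c) % m
        yield state / m

def uniform_to_exponential(u, rate):
    """Inverse-CDF transform of a uniform draw to Exponential(rate)."""
    return -math.log(1.0 - u) / rate
```

Each of the listed distributions gets its own transform of the same uniform stream; truncation is typically handled by restricting the uniform draw to the CDF range of the truncated support.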

  11. Microextraction sample preparation techniques in biomedical analysis.

    Science.gov (United States)

    Szultka, Malgorzata; Pomastowski, Pawel; Railean-Plugaru, Viorica; Buszewski, Boguslaw

    2014-11-01

    Biologically active compounds are found in biological samples at relatively low concentration levels. The sample preparation of target compounds from biological, pharmaceutical, environmental, and food matrices is one of the most time-consuming steps in the analytical procedure, and microextraction techniques are dominant here. Metabolomic studies also require application of a proper analytical technique for the determination of endogenous metabolites present in the biological matrix at trace concentration levels. Due to the reproducibility of data, precision, relatively low cost of the appropriate analysis, simplicity of the determination, and the possibility of direct combination of these techniques with other methods (both on-line and off-line), they have become the most widespread in routine determinations. Additionally, sample pretreatment procedures have to be more selective, cheap, quick, and environmentally friendly. This review summarizes the current achievements and applications of microextraction techniques. The main aim is to deal with the utilization of different types of sorbents for microextraction and emphasize the use of newly synthesized sorbents, as well as to bring together studies concerning the systematic approach to method development. This review is dedicated to the description of microextraction techniques and their application in biomedical analysis. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  12. NAIL SAMPLING TECHNIQUE AND ITS INTERPRETATION

    OpenAIRE

    TZAR MN; LEELAVATHI M

    2011-01-01

    The clinical suspicion of onychomycosis, based on the appearance of the nails, requires culture for confirmation, because treatment requires prolonged use of systemic agents which may cause side effects. One of the common problems encountered is improper nail sampling technique, which results in the loss of essential information. The unfamiliar terminology used in reporting culture results may intimidate physicians, resulting in misinterpretation and hampering treatment decisions. This article prov...

  13. Sample preparation techniques for (p, X) spectrometry

    International Nuclear Information System (INIS)

    Whitehead, N.E.

    1985-01-01

    Samples are ashed at low temperature using an oxygen plasma; a rotary evaporator and freeze drying speeded up the ashing. The new apparatus was rated at only 10 W but was as efficient as a 200 W commercial machine; a circuit diagram is included. Hair samples and skin biopsy samples were analysed by the technique. A wool standard was prepared for interlaboratory comparison exercises; it was based on New Zealand merino sheep wool and was 2.9 kg in weight. A washing protocol was developed which preserves most of the trace element content. The wool was ground in liquid nitrogen using a plastic pestle and beaker driven by a rotary drill press. (author)

  14. Statistical Theory of the Vector Random Decrement Technique

    DEFF Research Database (Denmark)

    Asmussen, J. C.; Brincker, Rune; Ibrahim, S. R.

    1999-01-01

    Due to the speed and/or accuracy of the Vector Random Decrement technique, it was introduced as an attractive alternative to the Random Decrement technique. In this paper, the theory of the Vector Random Decrement technique is extended by applying a statistical description of the stochastic...

  15. Estimation of Correlation Functions by the Random DEC Technique

    DEFF Research Database (Denmark)

    Brincker, Rune; Krenk, Steen; Jensen, Jakob Laigaard

    The Random Dec Technique is a versatile technique for characterization of random signals in the time domain. In this paper a short review of the most important properties of the technique is given. The review is mainly based on recently achieved results that are still unpublished, or that have just...

  16. A random spatial sampling method in a rural developing nation

    Science.gov (United States)

    Michelle C. Kondo; Kent D.W. Bream; Frances K. Barg; Charles C. Branas

    2014-01-01

    Nonrandom sampling of populations in developing nations has limitations and can inaccurately estimate health phenomena, especially among hard-to-reach populations such as rural residents. However, random sampling of rural populations in developing nations can be challenged by incomplete enumeration of the base population. We describe a stratified random sampling method...

  17. Random vs. systematic sampling from administrative databases involving human subjects.

    Science.gov (United States)

    Hagino, C; Lo, R J

    1998-09-01

    Two sampling techniques, simple random sampling (SRS) and systematic sampling (SS), were compared to determine whether they yield similar and accurate distributions for the following four factors: age, gender, geographic location and years in practice. Any point estimate within 7 yr or 7 percentage points of its reference standard (SRS or the entire data set, i.e., the target population) was considered "acceptably similar" to the reference standard. The sampling frame was the entire membership database of the Canadian Chiropractic Association. The two sampling methods were tested using eight different sample sizes (n = 50, 100, 150, 200, 250, 300, 500, 800). From the profile summaries of the four known factors [gender, average age, number (%) of chiropractors in each province and years in practice], between- and within-method chi-square tests and unpaired t tests were performed to determine whether any of the differences [descriptively greater than 7% or 7 yr] were also statistically significant. The strengths of the agreements between the provincial distributions were quantified by calculating the percent agreement for each (provincial pairwise comparisons). Any percent agreement less than 70% was judged unacceptable. Our assessments of the two sampling methods (SRS and SS) for the different sample sizes tested suggest that SRS and SS yielded acceptably similar results. Both methods started to yield "correct" sample profiles at approximately the same sample size (n > 200). SS is not only convenient, it can be recommended for sampling from large databases in which the data are listed without any inherent order biases other than alphabetical listing by surname.
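The two techniques compared in this record can be sketched as follows; the membership list is stood in for by a hypothetical `frame`, assumed to carry no order bias other than alphabetical listing:

```python
import random

def simple_random_sample(frame, n, seed=0):
    """SRS without replacement: every subset of size n is equally likely."""
    return random.Random(seed).sample(frame, n)

def systematic_sample(frame, n, seed=0):
    """SS: a random start in the first interval, then every k-th record.
    Assumes len(frame) is (close to) an exact multiple of n."""
    k = len(frame) // n                      # sampling interval
    start = random.Random(seed).randrange(k) # random start in [0, k)
    return frame[start::k][:n]
```

SS needs only one random number and a single pass through the list, which is why the record calls it convenient for large databases.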

  18. Power Spectrum Estimation of Randomly Sampled Signals

    DEFF Research Database (Denmark)

    Velte, C. M.; Buchhave, P.; K. George, W.

    algorithms; sample-and-hold and the direct spectral estimator without residence time weighting. The computer-generated signal is a Poisson process with a sample rate proportional to velocity magnitude and with well-defined frequency content, which makes bias easy to spot. The idea...

  19. Estimation of Correlation Functions by the Random Decrement Technique

    DEFF Research Database (Denmark)

    Brincker, Rune; Krenk, Steen; Jensen, Jakob Laigaard

    The Random Decrement (RDD) Technique is a versatile technique for characterization of random signals in the time domain. In this paper a short review of the theoretical basis is given, and the technique is illustrated by estimating auto-correlation functions and cross-correlation functions on modal responses simulated by two SDOF ARMA models loaded by the same bandlimited white noise. The speed and the accuracy of the RDD technique are compared to the Fast Fourier Transform (FFT) technique. The RDD technique does not involve multiplications, but only additions; therefore, the technique is very fast...

  20. Estimation of Correlation Functions by the Random Decrement Technique

    DEFF Research Database (Denmark)

    Brincker, Rune; Krenk, Steen; Jensen, Jacob Laigaard

    1991-01-01

    The Random Decrement (RDD) Technique is a versatile technique for characterization of random signals in the time domain. In this paper a short review of the theoretical basis is given, and the technique is illustrated by estimating auto-correlation functions and cross-correlation functions on modal responses simulated by two SDOF ARMA models loaded by the same band-limited white noise. The speed and the accuracy of the RDD technique are compared to the Fast Fourier Transform (FFT) technique. The RDD technique does not involve multiplications, but only additions; therefore, the technique is very fast...

  1. Estimation of Correlation Functions by the Random Decrement Technique

    DEFF Research Database (Denmark)

    Brincker, Rune; Krenk, Steen; Jensen, Jakob Laigaard

    1992-01-01

    The Random Decrement (RDD) Technique is a versatile technique for characterization of random signals in the time domain. In this paper a short review of the theoretical basis is given, and the technique is illustrated by estimating auto-correlation functions and cross-correlation functions on modal responses simulated by two SDOF ARMA models loaded by the same bandlimited white noise. The speed and the accuracy of the RDD technique are compared to the Fast Fourier Transform (FFT) technique. The RDD technique does not involve multiplications, but only additions; therefore, the technique is very fast...

  2. Evaluation of diesel particulate matter sampling techniques

    CSIR Research Space (South Africa)

    Pretorius, CJ

    2011-09-01

    Full Text Available The study evaluated diesel particulate matter (DPM) sampling methods used in the South African mining industry. The three-piece cassette respirable, open face and stopper sampling methods were compared with the SKC DPM cassette method to find a...

  3. Are most samples of animals systematically biased? Consistent individual trait differences bias samples despite random sampling.

    Science.gov (United States)

    Biro, Peter A

    2013-02-01

    Sampling animals from the wild for study is something nearly every biologist has done, but despite our best efforts to obtain random samples of animals, 'hidden' trait biases may still exist. For example, consistent behavioral traits can affect trappability/catchability, independent of obvious factors such as size and gender, and these traits are often correlated with other repeatable physiological and/or life history traits. If so, systematic sampling bias may exist for any of these traits. The extent to which this is a problem, of course, depends on the magnitude of bias, which is presently unknown because the underlying trait distributions in populations are usually unknown, or unknowable. Indeed, our present knowledge about sampling bias comes from samples (not complete population censuses), which can possess bias to begin with. I had the unique opportunity to create naturalized populations of fish by seeding each of four small fishless lakes with equal densities of slow-, intermediate-, and fast-growing fish. Using sampling methods that are not size-selective, I observed that fast-growing fish were up to two-times more likely to be sampled than slower-growing fish. This indicates substantial and systematic bias with respect to an important life history trait (growth rate). If correlations between behavioral, physiological and life-history traits are as widespread as the literature suggests, then many animal samples may be systematically biased with respect to these traits (e.g., when collecting animals for laboratory use), and affect our inferences about population structure and abundance. I conclude with a discussion on ways to minimize sampling bias for particular physiological/behavioral/life-history types within animal populations.

  4. STATISTICAL LANDMARKS AND PRACTICAL ISSUES REGARDING THE USE OF SIMPLE RANDOM SAMPLING IN MARKET RESEARCHES

    Directory of Open Access Journals (Sweden)

    CODRUŢA DURA

    2010-01-01

    Full Text Available The sample represents a particular segment of the statistical population chosen to represent it as a whole. The representativeness of the sample determines the accuracy of estimations made on the basis of calculating the research indicators and the inferential statistics. The method of random sampling is part of the probabilistic methods which can be used within marketing research, and it is characterized by the fact that it imposes the requirement that each unit belonging to the statistical population should have an equal chance of being selected for the sampling process. When simple random sampling is meant to be rigorously put into practice, it is recommended to use the technique of random number tables in order to configure the sample which will provide the information that the marketer needs. The paper also details the practical procedure implemented in order to create a sample for a marketing research by generating random numbers using the facilities offered by Microsoft Excel.

  5. Efficient sampling of complex network with modified random walk strategies

    Science.gov (United States)

    Xie, Yunya; Chang, Shuhua; Zhang, Zhipeng; Zhang, Mi; Yang, Lei

    2018-02-01

    We present two novel random walk strategies: choosing seed node (CSN) random walk and no-retracing (NR) random walk. Different from classical random walk sampling, the CSN and NR strategies focus on the influences of the seed node choice and of path overlap, respectively. The three random walk samplings are applied to the Erdös-Rényi (ER), Barabási-Albert (BA), Watts-Strogatz (WS), and weighted USAir networks, and the major properties of the sampled subnets, such as sampling efficiency, degree distributions, average degree and average clustering coefficient, are studied. Similar conclusions can be reached with all three random walk strategies. Firstly, networks with small scales and simple structures are conducive to the sampling. Secondly, the average degree and the average clustering coefficient of the sampled subnet tend to the corresponding values of the original networks within limited steps. Thirdly, all the degree distributions of the subnets are slightly biased to the high-degree side. However, the NR strategy performs better for the average clustering coefficient of the subnet. In the real weighted USAir networks, some obvious characteristics, like the larger clustering coefficient and the fluctuation of the degree distribution, are reproduced well by these random walk strategies.
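A minimal sketch of the no-retracing (NR) strategy on an adjacency-list graph, assuming (as the name suggests) that the walk simply never steps straight back to the previous node unless forced:

```python
import random

def nr_random_walk(adj, seed_node, steps, rng=random):
    """No-retracing (NR) random walk: never return to the node we just came
    from, unless that node is the only neighbour available (a dead end)."""
    visited = {seed_node}
    prev, cur = None, seed_node
    for _ in range(steps):
        # Exclude the previous node; fall back to all neighbours if stuck
        choices = [v for v in adj[cur] if v != prev] or adj[cur]
        nxt = rng.choice(choices)
        visited.add(nxt)
        prev, cur = cur, nxt
    return visited
```

Excluding the immediate predecessor reduces path overlap, which is the mechanism the abstract credits for the NR strategy's better clustering-coefficient estimates.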

  6. Development of sampling techniques for ITER Type B radwaste

    International Nuclear Information System (INIS)

    Hong, Kwon Pyo; Kim, Sung Geun; Jung, Sang Hee; Oh, Wan Ho; Park, Myung Chul; Kim, Hee Moon; Ahn, Sang Bok

    2016-01-01

    There are several difficulties and limitations in sampling activities. As the Type B radwaste components are mostly metallic (chiefly stainless steel) and bulky (∼1 m in size and ∼100 mm in thickness), it is difficult to take samples from the surface of Type B radwaste by remote operation. In addition, sampling should be performed without the use of any liquid coolant, to avoid the spread of contamination, and all sampling procedures are carried out in the hot cell red zone by remote operation. Three kinds of sampling techniques are being developed: core sampling, chip sampling, and wedge sampling, which are the candidate sampling techniques to be applied to the ITER hot cell. The object materials for sampling are stainless steel or Cu alloy blocks that simulate ITER Type B radwaste. The best of the three sampling techniques for ITER Type B radwaste will be suggested in several months, after the related experiment is finished

  7. Development of sampling techniques for ITER Type B radwaste

    Energy Technology Data Exchange (ETDEWEB)

    Hong, Kwon Pyo; Kim, Sung Geun; Jung, Sang Hee; Oh, Wan Ho; Park, Myung Chul; Kim, Hee Moon; Ahn, Sang Bok [KAERI, Daejeon (Korea, Republic of)

    2016-05-15

    There are several difficulties and limitations in sampling activities. As the Type B radwaste components are mostly metallic (chiefly stainless steel) and bulky (∼1 m in size and ∼100 mm in thickness), it is difficult to take samples from the surface of Type B radwaste by remote operation. In addition, sampling should be performed without the use of any liquid coolant, to avoid the spread of contamination, and all sampling procedures are carried out in the hot cell red zone by remote operation. Three kinds of sampling techniques are being developed: core sampling, chip sampling, and wedge sampling, which are the candidate sampling techniques to be applied to the ITER hot cell. The object materials for sampling are stainless steel or Cu alloy blocks that simulate ITER Type B radwaste. The best of the three sampling techniques for ITER Type B radwaste will be suggested in several months, after the related experiment is finished.

  8. The Dirichlet-Multinomial model for multivariate randomized response data and small samples

    NARCIS (Netherlands)

    Avetisyan, Marianna; Fox, Gerardus J.A.

    2012-01-01

    In survey sampling the randomized response (RR) technique can be used to obtain truthful answers to sensitive questions. Although the individual answers are masked due to the RR technique, individual (sensitive) response rates can be estimated when observing multivariate response data. The beta-binomial model for binary RR data will be generalized…

  9. The Dirichlet-Multinomial Model for Multivariate Randomized Response Data and Small Samples

    Science.gov (United States)

    Avetisyan, Marianna; Fox, Jean-Paul

    2012-01-01

    In survey sampling the randomized response (RR) technique can be used to obtain truthful answers to sensitive questions. Although the individual answers are masked due to the RR technique, individual (sensitive) response rates can be estimated when observing multivariate response data. The beta-binomial model for binary RR data will be generalized…

  10. Nuclear techniques for analysis of environmental samples

    International Nuclear Information System (INIS)

    1986-12-01

    The main purposes of this meeting were to establish the state-of-the-art in the field, to identify new research and development that is required to provide an adequate framework for analysis of environmental samples and to assess needs and possibilities for international cooperation in problem areas. This technical report was prepared on the subject based on the contributions made by the participants. A separate abstract was prepared for each of the 9 papers

  11. Sampling large random knots in a confined space

    International Nuclear Information System (INIS)

    Arsuaga, J; Blackstone, T; Diao, Y; Hinson, K; Karadayi, E; Saito, M

    2007-01-01

    DNA knots formed under extreme conditions of condensation, as in bacteriophage P4, are difficult to analyze experimentally and theoretically. In this paper, we propose to use the uniform random polygon model as a supplementary method to the existing methods for generating random knots in confinement. The uniform random polygon model allows us to sample knots with large crossing numbers and also to generate large diagrammatically prime knot diagrams. We show numerically that uniform random polygons sample knots with large minimum crossing numbers and certain complicated knot invariants (as those observed experimentally). We do this in terms of the knot determinants or colorings. Our numerical results suggest that the average determinant of a uniform random polygon of n vertices grows faster than O(e^{n^2}). We also investigate the complexity of prime knot diagrams. We show rigorously that the probability that a randomly selected 2D uniform random polygon of n vertices is almost diagrammatically prime goes to 1 as n goes to infinity. Furthermore, the average number of crossings in such a diagram is at the order of O(n^2). Therefore, the two-dimensional uniform random polygons offer an effective way in sampling large (prime) knots, which can be useful in various applications

  12. Sampling large random knots in a confined space

    Science.gov (United States)

    Arsuaga, J.; Blackstone, T.; Diao, Y.; Hinson, K.; Karadayi, E.; Saito, M.

    2007-09-01

    DNA knots formed under extreme conditions of condensation, as in bacteriophage P4, are difficult to analyze experimentally and theoretically. In this paper, we propose to use the uniform random polygon model as a supplementary method to the existing methods for generating random knots in confinement. The uniform random polygon model allows us to sample knots with large crossing numbers and also to generate large diagrammatically prime knot diagrams. We show numerically that uniform random polygons sample knots with large minimum crossing numbers and certain complicated knot invariants (as those observed experimentally). We do this in terms of the knot determinants or colorings. Our numerical results suggest that the average determinant of a uniform random polygon of n vertices grows faster than O(e^{n^2}). We also investigate the complexity of prime knot diagrams. We show rigorously that the probability that a randomly selected 2D uniform random polygon of n vertices is almost diagrammatically prime goes to 1 as n goes to infinity. Furthermore, the average number of crossings in such a diagram is at the order of O(n^2). Therefore, the two-dimensional uniform random polygons offer an effective way in sampling large (prime) knots, which can be useful in various applications.

  13. Sampling large random knots in a confined space

    Energy Technology Data Exchange (ETDEWEB)

    Arsuaga, J [Department of Mathematics, San Francisco State University, 1600 Holloway Ave, San Francisco, CA 94132 (United States); Blackstone, T [Department of Computer Science, San Francisco State University, 1600 Holloway Ave., San Francisco, CA 94132 (United States); Diao, Y [Department of Mathematics and Statistics, University of North Carolina at Charlotte, Charlotte, NC 28223 (United States); Hinson, K [Department of Mathematics and Statistics, University of North Carolina at Charlotte, Charlotte, NC 28223 (United States); Karadayi, E [Department of Mathematics, University of South Florida, 4202 E Fowler Avenue, Tampa, FL 33620 (United States); Saito, M [Department of Mathematics, University of South Florida, 4202 E Fowler Avenue, Tampa, FL 33620 (United States)

    2007-09-28

    DNA knots formed under extreme conditions of condensation, as in bacteriophage P4, are difficult to analyze experimentally and theoretically. In this paper, we propose to use the uniform random polygon model as a supplementary method to the existing methods for generating random knots in confinement. The uniform random polygon model allows us to sample knots with large crossing numbers and also to generate large diagrammatically prime knot diagrams. We show numerically that uniform random polygons sample knots with large minimum crossing numbers and certain complicated knot invariants (as those observed experimentally). We do this in terms of the knot determinants or colorings. Our numerical results suggest that the average determinant of a uniform random polygon of n vertices grows faster than O(e^{n^2}). We also investigate the complexity of prime knot diagrams. We show rigorously that the probability that a randomly selected 2D uniform random polygon of n vertices is almost diagrammatically prime goes to 1 as n goes to infinity. Furthermore, the average number of crossings in such a diagram is at the order of O(n^2). Therefore, the two-dimensional uniform random polygons offer an effective way in sampling large (prime) knots, which can be useful in various applications.

  14. Martian regolith geochemistry and sampling techniques

    Science.gov (United States)

    Clark, B. C.

    Laboratory study of samples of the intermediate and fine-grained regolith, including duricrust peds, is a fundamental prerequisite for understanding the types of physical and chemical weathering processes on Mars. The extraordinary importance of such samples is their relevance to understanding past changes in climate, availability (and possible physical state) of water, eolian forces, the thermal and chemical influences of volcanic and impact processes, and the inventory and fates of Martian volatiles. Fortunately, this regolith material appears to be ubiquitous over the Martian surface, and should be available at many different landing sites. Viking data has been interpreted to indicate a smectite-rich regolith material, implying extensive weathering involving aqueous activity and geochemical alteration. An all-igneous source of the Martian fines has also been proposed. The X-ray fluorescence measurement data set can now be fully explained in terms of a simple two-component model. The first component is silicate, having strong geochemical similarities with Shergottites, but not other SNC meteorites. The second component is salt. Variations in these components could produce silicate and salt-rich beds, the latter being of high potential importance for microenvironments in which liquid water (brines) could exist. It therefore would be desirable to scan the surface of the regolith for such prospects.

  15. Martian regolith geochemistry and sampling techniques

    Science.gov (United States)

    Clark, B. C.

    1988-01-01

    Laboratory study of samples of the intermediate and fine-grained regolith, including duricrust peds, is a fundamental prerequisite for understanding the types of physical and chemical weathering processes on Mars. The extraordinary importance of such samples is their relevance to understanding past changes in climate, availability (and possible physical state) of water, eolian forces, the thermal and chemical influences of volcanic and impact processes, and the inventory and fates of Martian volatiles. Fortunately, this regolith material appears to be ubiquitous over the Martian surface, and should be available at many different landing sites. Viking data has been interpreted to indicate a smectite-rich regolith material, implying extensive weathering involving aqueous activity and geochemical alteration. An all-igneous source of the Martian fines has also been proposed. The X-ray fluorescence measurement data set can now be fully explained in terms of a simple two-component model. The first component is silicate, having strong geochemical similarities with Shergottites, but not other SNC meteorites. The second component is salt. Variations in these components could produce silicate and salt-rich beds, the latter being of high potential importance for microenvironments in which liquid water (brines) could exist. It therefore would be desirable to scan the surface of the regolith for such prospects.

  16. A comparative study of sampling techniques for monitoring carcass contamination

    NARCIS (Netherlands)

    Snijders, J.M.A.; Janssen, M.H.W.; Gerats, G.E.; Corstiaensen, G.P.

    1984-01-01

    Four bacteriological sampling techniques, i.e. the excision, double-swab, agar contact and modified agar contact techniques, were compared by sampling pig carcasses before and after chilling. As well as assessing the advantages and disadvantages of the techniques, particular attention was paid to

  17. SOME SYSTEMATIC SAMPLING STRATEGIES USING MULTIPLE RANDOM STARTS

    OpenAIRE

    Sampath Sundaram; Ammani Sivaraman

    2010-01-01

    In this paper an attempt is made to extend linear systematic sampling using multiple random starts due to Gautschi (1957) to various types of systematic sampling schemes available in the literature, namely (i) Balanced Systematic Sampling (BSS) of Sethi (1965) and (ii) Modified Systematic Sampling (MSS) of Singh, Jindal, and Garg (1968). Further, the proposed methods were compared with the Yates corrected estimator developed with reference to Gautschi's linear systematic samplin...
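Linear systematic sampling with multiple random starts can be sketched as follows; this shows the classical Gautschi-style interleaving, not the BSS/MSS variants the paper extends it to, and assumes the population size divides evenly:

```python
import random

def multi_start_systematic(frame, n, r, rng):
    """Linear systematic sampling with r independent random starts:
    draw r distinct starts from the first r*k positions, then take every
    (r*k)-th record from each start, giving r interleaved systematic
    subsamples of n // r units each. Assumes n % r == 0 and
    len(frame) % n == 0."""
    k = len(frame) // n                   # basic sampling interval
    starts = rng.sample(range(r * k), r)  # r distinct random starts
    sample = []
    for s in starts:
        sample.extend(frame[s::r * k][: n // r])
    return sample
```

With r > 1 starts, the scheme yields a design-unbiased variance estimator, which a single-start systematic sample cannot provide.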

  18. Petrosal sinus sampling: technique and rationale.

    Science.gov (United States)

    Miller, D L; Doppman, J L

    1991-01-01

    Bilateral simultaneous sampling of the inferior petrosal sinuses is an extremely sensitive, specific, and accurate test for diagnosing Cushing disease and distinguishing between that entity and the ectopic ACTH syndrome. It is also valuable for lateralizing small hormone-producing adenomas within the pituitary gland. The inferior petrosal sinuses connect the cavernous sinuses with the ipsilateral internal jugular veins. The anatomy of the anastomoses between the inferior petrosal sinus, the internal jugular vein, and the venous plexuses at the base of the skull varies, but it is almost always possible to catheterize the inferior petrosal sinus. In addition, variations in size and anatomy are often present between the two inferior petrosal sinuses in a patient. Advance preparation is required for petrosal sinus sampling. Teamwork is a critical element, and each member of the staff should know what he or she will be doing during the procedure. The samples must be properly labeled, processed, and stored. Specific needles, guide wires, and catheters are recommended for this procedure. The procedure is performed with specific attention to the three areas of potential technical difficulty: catheterization of the common femoral veins, crossing the valve at the base of the left internal jugular vein, and selective catheterization of the inferior petrosal sinuses. There are specific methods for dealing with each of these areas. The sine qua non of correct catheter position in the inferior petrosal sinus is demonstration of reflux of contrast material into the ipsilateral cavernous sinus. Images must always be obtained to document correct catheter position. Special attention must be paid to two points to prevent potential complications: The patient must be given an adequate dose of heparin, and injection of contrast material into the inferior petrosal sinuses and surrounding veins must be done gently and carefully. When the procedure is performed as outlined, both inferior...

  19. Direct progeny detection techniques and random epidemiology

    International Nuclear Information System (INIS)

    Mayya, Y.S.; Mishra, Rosaline; Sapra, B.K.

    2015-01-01

    Over the past 40 years, there has been considerable progress in measurement methods and their application to estimating the risks due to radon among general populations. The previous decade saw a major development in this regard: the direct estimation of indoor radon risk from epidemiological studies in Europe and North America. These were important findings that demonstrated the presence of lung cancer risks at residential radon levels, supplementing the generally used risk estimates at high exposures obtained from uranium miners' data. The residential radon epidemiological studies largely used radon concentration as the measure of exposure. The exposure to decay products, which are primarily the dose givers, is assumed to be proportional to the measured gas concentrations; the presence of thoron was also neglected in these studies. Although several corrections to these assessments have appeared, the question of the variability of actual decay product exposures has largely remained unaddressed. In order to circumvent this limitation, passive techniques were developed to estimate the decay product concentrations directly using deposition monitors. These are based on detecting the alpha particles from decay products deposited on absorber-mounted LR-115 detectors. Known as Direct Radon and Thoron Progeny Sensors (DRPS/DTPS), these have been further refined to separate the fine fraction from the coarse fraction by wire-mesh capping techniques. A large number of environmental calibration exercises and field data generation campaigns have been carried out on the progeny concentrations in Indian and some European environments. The development of progeny sensors offers a new tool for future epidemiology. Since in the Indian context there exist no radon-related epidemiological estimates of risk, it is time to conduct large-scale studies to seek possible correlations between DRPS/DTPS data and lung cancer risks. While epidemiological studies in high background radiation areas

  20. Innovative directional and position specific sampling technique

    International Nuclear Information System (INIS)

    Hutzel, W.J.; Hill, J.L. III; Foster, E.L.

    1994-01-01

    UTD, Incorporated has developed a unique real-time, in-situ POsition LOcation (POLO) device which will directly enhance the Department of Energy's Environmental Restoration and Waste Management Program through improvements in finding contamination, identifying the extent of contamination, remediating, and finally monitoring sites. POLO is smaller than existing technology and is unaffected by the presence of steel and other magnetic materials. The size of the device offers, for the first time, the possibility of accurately determining the location of a penetrometer. It will be usable, in its present form, to map the position of a sampling device as that device is inserted into the ground. A three-phase program was proposed to DOE to take the POLO System through the three levels of maturity prior to commercialization. Phase 1 included the design and testing of individual components of the device and met or exceeded the success criteria. Phase 2 has included laboratory-scale tracking experiments of the integrated POLO System that have also met or exceeded the success criterion: to demonstrate path-tracking accuracy with a total error of less than 0.50% of the distance traveled for distances less than 70 meters. In Phase 3 we will develop and test a full-scale POLO System and conclude with a field demonstration. The goal of the Phase 3 effort is to achieve the same 0.50% accuracy or better with respect to the distance traveled for distances less than 70 meters in the field, as was demonstrated in laboratory tests in the earlier phases. The final report details the design of a realistic laboratory-scale penetrometer path and describes the POLO tracking experiments that were conducted. Plans for demonstrations as well as commercialization and technology transfer with DOE field sites are described, highlighting the expected smooth transition to full-scale production

  1. Misrepresenting random sampling? A systematic review of research papers in the Journal of Advanced Nursing.

    Science.gov (United States)

    Williamson, Graham R

    2003-11-01

    This paper discusses the theoretical limitations of the use of random sampling and probability theory in the production of a significance level (or P-value) in nursing research. Potential alternatives, in the form of randomization tests, are proposed. Research papers in nursing, medicine and psychology frequently misrepresent their statistical findings, as the P-values reported assume random sampling. In this systematic review of studies published between January 1995 and June 2002 in the Journal of Advanced Nursing, 89 (68%) studies broke this assumption because they used convenience samples or entire populations. As a result, some of the findings may be questionable. The key ideas of random sampling and probability theory for statistical testing (for generating a P-value) are outlined. The result of a systematic review of research papers published in the Journal of Advanced Nursing is then presented, showing how frequently random sampling appears to have been misrepresented. Useful alternative techniques that might overcome these limitations are then discussed. REVIEW LIMITATIONS: This review is limited in scope because it is applied to one journal, and so the findings cannot be generalized to other nursing journals or to nursing research in general. However, it is possible that other nursing journals are also publishing research articles based on the misrepresentation of random sampling. The review is also limited because in several of the articles the sampling method was not completely clearly stated, and in this circumstance a judgment has been made as to the sampling method employed, based on the indications given by author(s). Quantitative researchers in nursing should be very careful that the statistical techniques they use are appropriate for the design and sampling methods of their studies. If the techniques they employ are not appropriate, they run the risk of misinterpreting findings by using inappropriate, unrepresentative and biased samples.
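The randomization tests proposed as an alternative drop the random-sampling assumption entirely: the P-value comes from random relabellings of the observed units, so inference is conditional on the data actually collected. A minimal, hedged sketch with made-up data (the function name and scores are illustrative, not drawn from the review):

```python
import random

def randomization_test(group_a, group_b, n_perm=10000, seed=0):
    """Two-sample randomization (permutation) test on the difference of
    means: the P-value is the fraction of random relabellings whose
    absolute mean difference is at least as extreme as the observed one."""
    rng = random.Random(seed)
    mean = lambda xs: sum(xs) / len(xs)
    observed = abs(mean(group_a) - mean(group_b))
    pooled = list(group_a) + list(group_b)
    n_a = len(group_a)
    extreme = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        # small tolerance so ties with the observed statistic count as extreme
        if abs(mean(pooled[:n_a]) - mean(pooled[n_a:])) >= observed - 1e-9:
            extreme += 1
    return extreme / n_perm

# Hypothetical scores from two convenience-sampled groups:
p_value = randomization_test([4.1, 3.8, 5.0, 4.6], [3.2, 3.0, 3.5, 2.9])
```

No claim of random sampling from a wider population is needed; only the random relabelling itself justifies the P-value.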

  2. Boat sampling technique for assessment of ageing of components

    International Nuclear Information System (INIS)

    Kumar, Kundan; Shyam, T.V.; Kayal, J.N.; Rupani, B.B.

    2006-01-01

Boat sampling technique (BST) is a surface sampling technique, which has been developed for obtaining, in-situ, metal samples from the surface of an operating component without affecting its operating service life. The BST is non-destructive in nature and the sample is obtained without plastic deformation or thermal degradation of the parent material. The shape and size of the sample depend upon the shape of the cutter and the surface geometry of the parent material. Miniature test specimens are generated from the sample and the specimens are subjected to various tests, viz. metallurgical evaluation, metallographic evaluation, micro-hardness evaluation, sensitisation test, small punch test etc., to confirm the integrity and assess the safe operating life of the component. This paper highlights the design objective of the boat sampling technique, description of the sampling module, the sampling cutter and its performance evaluation, the cutting process, boat samples, the operational sequence of the sampling module, qualification of the sampling module, qualification of the sampling technique, qualification of the scooped region of the parent material, the sample retrieval system, and the inspection, testing and examination to be carried out on the boat samples and the scooped region. (author)

  3. Random sampling of evolution time space and Fourier transform processing

    International Nuclear Information System (INIS)

    Kazimierczuk, Krzysztof; Zawadzka, Anna; Kozminski, Wiktor; Zhukov, Igor

    2006-01-01

Application of the Fourier Transform for processing 3D NMR spectra with random sampling of the evolution time space is presented. The 2D FT is calculated for pairs of frequencies, instead of the conventional sequence of one-dimensional transforms. Signal-to-noise ratios and linewidths for different random distributions were investigated by simulations and experiments. The experimental examples include 3D HNCA, HNCACB and ¹⁵N-edited NOESY-HSQC spectra of a ¹³C,¹⁵N-labeled ubiquitin sample. The results revealed the general applicability of the proposed method and a significant improvement of resolution in comparison with conventional spectra recorded in the same time.
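The core processing step, evaluating the Fourier sum directly at chosen frequencies over an irregular time grid, can be sketched in one dimension (the paper's 2D transform iterates the same sum over frequency pairs; the test signal and names here are illustrative):

```python
import cmath
import math
import random

def nonuniform_ft(times, signal, freqs):
    """Evaluate the Fourier sum of a non-uniformly sampled signal directly
    at each requested frequency (Hz), instead of via an FFT, which would
    require a regular time grid."""
    return [sum(s * cmath.exp(-2j * math.pi * f * t)
                for t, s in zip(times, signal))
            for f in freqs]

# Randomly sample 256 evolution times in [0, 1) s and transform a 10 Hz cosine.
rng = random.Random(1)
times = sorted(rng.random() for _ in range(256))
signal = [math.cos(2 * math.pi * 10.0 * t) for t in times]
spectrum = nonuniform_ft(times, signal, freqs=[5.0, 10.0, 15.0])
```

The 10 Hz component dominates the resulting spectrum even though the sampling grid is irregular, which is what makes random sampling of the evolution time space workable.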

  4. Adaptive importance sampling of random walks on continuous state spaces

    International Nuclear Information System (INIS)

    Baggerly, K.; Cox, D.; Picard, R.

    1998-01-01

    The authors consider adaptive importance sampling for a random walk with scoring in a general state space. Conditions under which exponential convergence occurs to the zero-variance solution are reviewed. These results generalize previous work for finite, discrete state spaces in Kollman (1993) and in Kollman, Baggerly, Cox, and Picard (1996). This paper is intended for nonstatisticians and includes considerable explanatory material
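A hedged sketch of the non-adaptive building block: importance sampling a rare event by drawing from a tilted proposal and reweighting by the density ratio. The adaptive scheme of the paper updates the proposal from scored histories; that refinement is not reproduced here, and the example problem is made up:

```python
import math
import random

def importance_estimate(n=50000, seed=0):
    """Estimate P(X > 4) for X ~ N(0,1) by sampling from a shifted proposal
    N(4,1) and weighting each draw by the density ratio p(x)/q(x).
    Plain Monte Carlo with n = 50000 would almost never see this event."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.gauss(4.0, 1.0)                 # proposal q = N(4, 1)
        if x > 4.0:
            # p(x)/q(x) = exp(-x^2/2) / exp(-(x-4)^2/2)
            total += math.exp(-0.5 * x * x + 0.5 * (x - 4.0) ** 2)
    return total / n

est = importance_estimate()   # true value is about 3.17e-5
```

Shifting the proposal toward the region that matters concentrates the samples where the integrand is nonzero, which is the variance reduction the adaptive scheme then drives toward the zero-variance solution.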

  5. An efficient method of randomly sampling the coherent angular scatter distribution

    International Nuclear Information System (INIS)

    Williamson, J.F.; Morin, R.L.

    1983-01-01

    Monte Carlo simulations of photon transport phenomena require random selection of an interaction process at each collision site along the photon track. Possible choices are usually limited to photoelectric absorption and incoherent scatter as approximated by the Klein-Nishina distribution. A technique is described for sampling the coherent angular scatter distribution, for the benefit of workers in medical physics. (U.K.)
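Rejection sampling is one simple way to draw a scattering angle from an angular distribution; the sketch below uses the Thomson-like shape (1 + cos²θ)·sinθ as a stand-in for the form-factor-weighted coherent distribution of the paper, which is not reproduced here:

```python
import math
import random

def sample_theta(rng):
    """Rejection-sample a scattering angle theta from a density proportional
    to (1 + cos^2(theta)) * sin(theta) on [0, pi]. The constant 2 is a valid
    envelope: the density shape never exceeds about 1.09."""
    while True:
        theta = rng.uniform(0.0, math.pi)
        if rng.uniform(0.0, 2.0) <= (1 + math.cos(theta) ** 2) * math.sin(theta):
            return theta

rng = random.Random(42)
angles = [sample_theta(rng) for _ in range(20000)]
```

The density is symmetric about π/2, so the sample mean of the angles should sit close to π/2; direct inversion of the cumulative distribution, as in the paper, avoids the rejected draws entirely.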

  6. New trends in sample preparation techniques for environmental analysis.

    Science.gov (United States)

    Ribeiro, Cláudia; Ribeiro, Ana Rita; Maia, Alexandra S; Gonçalves, Virgínia M F; Tiritan, Maria Elizabeth

    2014-01-01

Environmental samples include a wide variety of complex matrices, with low concentrations of analytes and the presence of several interferences. Sample preparation is a critical step and the main source of uncertainty in the analysis of environmental samples, and it is usually laborious, costly, time consuming, and polluting. In this context, there is increasing interest in developing faster, cost-effective, and environmentally friendly sample preparation techniques. Recently, new methods have been developed and optimized in order to miniaturize extraction steps, to reduce solvent consumption or become solventless, and to automate systems. This review attempts to present an overview of the fundamentals, procedures, and applications of the most recently developed sample preparation techniques for the extraction, cleanup, and concentration of organic pollutants from environmental samples. These techniques include: solid phase microextraction, on-line solid phase extraction, microextraction by packed sorbent, dispersive liquid-liquid microextraction, and QuEChERS (Quick, Easy, Cheap, Effective, Rugged and Safe).

  7. Identification of System Parameters by the Random Decrement Technique

    DEFF Research Database (Denmark)

    Brincker, Rune; Kirkegaard, Poul Henning; Rytter, Anders

    1991-01-01

The aim of this paper is to investigate and illustrate the possibilities of using correlation functions estimated by the Random Decrement Technique as a basis for parameter identification. A two-stage system identification system is used: first, the correlation functions are estimated by the Random Decrement Technique, and then the system parameters are identified from the correlation function estimates. Three different techniques are used in the parameter identification process: a simple non-parametric method, estimation of an Auto Regressive (AR) model by solving an overdetermined set of Yule-Walker equations and finally, least-square fitting of the theoretical correlation function. The results are compared to the results of fitting an Auto Regressive Moving Average (ARMA) model directly to the system output from a single-degree-of-freedom system loaded by white noise.

  8. Identification of System Parameters by the Random Decrement Technique

    DEFF Research Database (Denmark)

    Brincker, Rune; Kirkegaard, Poul Henning; Rytter, Anders

The aim of this paper is to investigate and illustrate the possibilities of using correlation functions estimated by the Random Decrement technique as a basis for parameter identification. A two-stage system identification method is used: first the correlation functions are estimated by the Random Decrement technique and then the system parameters are identified from the correlation function estimates. Three different techniques are used in the parameter identification process: a simple non-parametric method, estimation of an Auto Regressive (AR) model by solving an overdetermined set of Yule-Walker equations and finally least-squares fitting of the theoretical correlation function. The results are compared to the results of fitting an Auto Regressive Moving Average (ARMA) model directly to the system output. All investigations are performed on the simulated output from a single-degree-of-freedom system.

  9. Cleaning and Cleanliness Verification Techniques for Mars Returned Sample Handling

    Science.gov (United States)

    Mickelson, E. T.; Lindstrom, D. J.; Allton, J. H.; Hittle, J. D.

    2002-01-01

    Precision cleaning and cleanliness verification techniques are examined as a subset of a comprehensive contamination control strategy for a Mars sample return mission. Additional information is contained in the original extended abstract.

  10. Key-space analysis of double random phase encryption technique

    Science.gov (United States)

    Monaghan, David S.; Gopinathan, Unnikrishnan; Naughton, Thomas J.; Sheridan, John T.

    2007-09-01

    We perform a numerical analysis on the double random phase encryption/decryption technique. The key-space of an encryption technique is the set of possible keys that can be used to encode data using that technique. In the case of a strong encryption scheme, many keys must be tried in any brute-force attack on that technique. Traditionally, designers of optical image encryption systems demonstrate only how a small number of arbitrary keys cannot decrypt a chosen encrypted image in their system. However, this type of demonstration does not discuss the properties of the key-space nor refute the feasibility of an efficient brute-force attack. To clarify these issues we present a key-space analysis of the technique. For a range of problem instances we plot the distribution of decryption errors in the key-space indicating the lack of feasibility of a simple brute-force attack.

  11. Estimation of Sensitive Proportion by Randomized Response Data in Successive Sampling

    Directory of Open Access Journals (Sweden)

    Bo Yu

    2015-01-01

Full Text Available This paper considers the problem of estimation of binomial proportions of sensitive or stigmatizing attributes in the population of interest. Randomized response techniques are suggested for protecting the privacy of respondents and reducing response bias while eliciting information on sensitive attributes. In many sensitive question surveys, the same population is sampled repeatedly on successive occasions. In this paper, we apply a successive sampling scheme to improve the estimation of the sensitive proportion on the current occasion.
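Warner's original randomized-response estimator illustrates the mechanism on a single occasion (the successive-sampling refinement is not reproduced here; the spinner probability and the simulated true proportion below are made up):

```python
import random

def warner_estimate(answers, p_truth):
    """Warner (1965) estimator. Each respondent answers the sensitive
    statement truthfully with probability p_truth (p_truth != 0.5) and
    answers its negation otherwise, so no individual answer reveals the
    trait; lam is the observed overall 'yes' rate."""
    lam = sum(answers) / len(answers)
    return (lam - (1 - p_truth)) / (2 * p_truth - 1)

# Simulate: true sensitive proportion 0.30, spinner probability 0.7.
rng = random.Random(7)
pi_true, p = 0.30, 0.7
answers = []
for _ in range(100000):
    sensitive = rng.random() < pi_true
    ask_direct = rng.random() < p
    answers.append(int(sensitive if ask_direct else not sensitive))

est = warner_estimate(answers, p)
```

The expected "yes" rate is p·π + (1−p)(1−π), so inverting that linear relation recovers π without any respondent's answer being incriminating.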

  12. Micro and Nano Techniques for the Handling of Biological Samples

    DEFF Research Database (Denmark)

Micro and Nano Techniques for the Handling of Biological Samples reviews the different techniques available to manipulate and integrate biological materials in a controlled manner, either by sliding them along a surface (2-D manipulation), or by gripping and moving them to a new position (3-D manipulation).

  13. Non-terminal blood sampling techniques in guinea pigs.

    Science.gov (United States)

    Birck, Malene M; Tveden-Nyborg, Pernille; Lindblad, Maiken M; Lykkesfeldt, Jens

    2014-10-11

Guinea pigs possess several biological similarities to humans and are validated experimental animal models(1-3). However, the use of guinea pigs currently represents a relatively narrow area of research, and descriptive data on specific methodology are correspondingly scarce. The anatomical features of guinea pigs are slightly different from those of other rodent models, hence modifications of sampling techniques to accommodate species-specific differences, e.g., compared to mice and rats, are necessary to obtain sufficient and high-quality samples. As both long- and short-term in vivo studies often require repeated blood sampling, the choice of technique should be well considered in order to reduce stress and discomfort in the animals, but also to ensure survival as well as compliance with requirements of sample size and accessibility. Venous blood samples can be obtained at a number of sites in guinea pigs, e.g., the saphenous and jugular veins, each technique having both advantages and disadvantages(4,5). Here, we present four different blood sampling techniques for either conscious or anaesthetized guinea pigs. The procedures are all non-terminal provided that sample volumes and number of samples do not exceed guidelines for blood collection in laboratory animals(6). All the described methods have been thoroughly tested and applied for repeated in vivo blood sampling in studies within our research facility.

  14. Sampling techniques for thrips (Thysanoptera: Thripidae) in preflowering tomato.

    Science.gov (United States)

    Joost, P Houston; Riley, David G

    2004-08-01

    Sampling techniques for thrips (Thysanoptera: Thripidae) were compared in preflowering tomato plants at the Coastal Plain Experiment Station in Tifton, GA, in 2000 and 2003, to determine the most effective method of determining abundance of thrips on tomato foliage early in the growing season. Three relative sampling techniques, including a standard insect aspirator, a 946-ml beat cup, and an insect vacuum device, were compared for accuracy to an absolute method and to themselves for precision and efficiency of sampling thrips. Thrips counts of all relative sampling methods were highly correlated (R > 0.92) to the absolute method. The aspirator method was the most accurate compared with the absolute sample according to regression analysis in 2000. In 2003, all sampling methods were considered accurate according to Dunnett's test, but thrips numbers were lower and sample variation was greater than in 2000. In 2000, the beat cup method had the lowest relative variation (RV) or best precision, at 1 and 8 d after transplant (DAT). Only the beat cup method had RV values <25 for all sampling dates. In 2003, the beat cup method had the lowest RV value at 15 and 21 DAT. The beat cup method also was the most efficient method for all sample dates in both years. Frankliniella fusca (Pergande) was the most abundant thrips species on the foliage of preflowering tomato in both years of study at this location. Overall, the best thrips sampling technique tested was the beat cup method in terms of precision and sampling efficiency.

  15. Statistical sampling techniques as applied to OSE inspections

    International Nuclear Information System (INIS)

    Davis, J.J.; Cote, R.W.

    1987-01-01

The need has been recognized for statistically valid methods for gathering information during OSE inspections, and for interpretation of results, both from performance testing and from records reviews, interviews, etc. Battelle Columbus Division, under contract to DOE OSE, has performed and is continuing to perform work in the area of statistical methodology for OSE inspections. This paper presents some of the sampling methodology currently being developed for use during OSE inspections. Topics include population definition, sample size requirements, level of confidence, and the practical logistical constraints associated with the conduct of an inspection based on random sampling. Sequential sampling schemes and sampling from finite populations are also discussed. The methods described are applicable to various data gathering activities, ranging from the sampling and examination of classified documents to the sampling of Protective Force security inspectors for skill testing.
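One routine calculation of this kind, the smallest random sample that surfaces at least one nonconforming item with a stated confidence, can be sketched as follows (an illustrative textbook formula, not the Battelle methodology itself):

```python
import math

def min_sample_size(defect_rate, confidence):
    """Smallest n with P(at least one nonconformance in the sample) >=
    confidence, sampling with replacement from a population with the given
    nonconformance rate: solve 1 - (1 - rate)^n >= confidence for n."""
    return math.ceil(math.log(1 - confidence) / math.log(1 - defect_rate))

n = min_sample_size(defect_rate=0.05, confidence=0.95)
```

At a 5% nonconformance rate, 59 randomly drawn items give 95% confidence of seeing at least one nonconformance; sampling without replacement from a small finite population (hypergeometric) reduces the required n further.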

  16. Differences in sampling techniques on total post-mortem tryptase.

    Science.gov (United States)

    Tse, R; Garland, J; Kesha, K; Elstub, H; Cala, A D; Ahn, Y; Stables, S; Palmiere, C

    2017-11-20

The measurement of mast cell tryptase is commonly used to support the diagnosis of anaphylaxis. In the post-mortem setting, the literature recommends sampling from peripheral blood sources (femoral blood) but does not specify the exact sampling technique. Sampling techniques vary between pathologists, and it is unclear whether different sampling techniques have any impact on post-mortem tryptase levels. The aim of this study is to compare the difference in femoral total post-mortem tryptase levels between two sampling techniques. A 6-month retrospective study comparing femoral total post-mortem tryptase levels between (1) aspirating femoral vessels with a needle and syringe prior to evisceration and (2) femoral vein cut-down during evisceration. Twenty cases were identified, with three cases excluded from analysis. There was a statistically significant difference (paired t test) in femoral total post-mortem tryptase levels between the two sampling methods. The clinical significance of this finding and what factors may contribute to it are unclear. When requesting post-mortem tryptase, the pathologist should consider documenting the exact blood collection site and method used for collection. In addition, blood samples acquired by different techniques should not be mixed together and should be analyzed separately if possible.

  17. Non-terminal blood sampling techniques in Guinea pigs

    DEFF Research Database (Denmark)

    Birck, Malene Muusfeldt; Tveden-Nyborg, Pernille; Lindblad, Maiken Marie

    2014-01-01

Guinea pigs possess several biological similarities to humans and are validated experimental animal models(1-3). However, the use of guinea pigs currently represents a relatively narrow area of research and descriptive data on specific methodology is correspondingly scarce. The anatomical features of guinea pigs are slightly different from other rodent models, hence modulation of sampling techniques to accommodate for species-specific differences, e.g., compared to mice and rats, are necessary to obtain sufficient and high quality samples. As both long and short term in vivo studies often require repeated blood sampling the choice of technique should be well considered in order to reduce stress and discomfort in the animals but also to ensure survival as well as compliance with requirements of sample size and accessibility. Venous blood samples can be obtained at a number of sites in guinea pigs ...

  18. Water sampling techniques for continuous monitoring of pesticides in water

    Directory of Open Access Journals (Sweden)

    Šunjka Dragana

    2017-01-01

    Full Text Available Good ecological and chemical status of water represents the most important aim of the Water Framework Directive 2000/60/EC, which implies respect of water quality standards at the level of entire river basin (2008/105/EC and 2013/39/EC. This especially refers to the control of pesticide residues in surface waters. In order to achieve the set goals, a continuous monitoring program that should provide a comprehensive and interrelated overview of water status should be implemented. However, it demands the use of appropriate analysis techniques. Until now, the procedure for sampling and quantification of residual pesticide quantities in aquatic environment was based on the use of traditional sampling techniques that imply periodical collecting of individual samples. However, this type of sampling provides only a snapshot of the situation in regard to the presence of pollutants in water. As an alternative, the technique of passive sampling of pollutants in water, including pesticides has been introduced. Different samplers are available for pesticide sampling in surface water, depending on compounds. The technique itself is based on keeping a device in water over a longer period of time which varies from several days to several weeks, depending on the kind of compound. In this manner, the average concentrations of pollutants dissolved in water during a time period (time-weighted average concentrations, TWA are obtained, which enables monitoring of trends in areal and seasonal variations. The use of these techniques also leads to an increase in sensitivity of analytical methods, considering that pre-concentration of analytes takes place within the sorption medium. However, the use of these techniques for determination of pesticide concentrations in real water environments requires calibration studies for the estimation of sampling rates (Rs. Rs is a volume of water per time, calculated as the product of overall mass transfer coefficient and area of
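The time-weighted average concentration the abstract describes follows from the sampler's accumulated mass, sampling rate and deployment time; a hedged sketch of that standard passive-sampler relation (the deployment numbers are hypothetical):

```python
def twa_concentration(mass_sorbed_ng, sampling_rate_l_per_day, days):
    """Time-weighted average water concentration from a passive sampler:
    C_TWA = m / (Rs * t), with m the analyte mass accumulated in the sorbent
    (ng), Rs the calibrated sampling rate (L/day) and t the deployment time
    (days); result in ng/L."""
    return mass_sorbed_ng / (sampling_rate_l_per_day * days)

# Hypothetical deployment: 120 ng accumulated over 14 days at Rs = 0.2 L/day.
c_twa = twa_concentration(120.0, 0.2, 14)
```

This is why the calibration of Rs matters: an error in the sampling rate propagates directly into the reported time-weighted average concentration.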

  19. Assessment of Natural Radioactivity in TENORM Samples Using Different Techniques

    International Nuclear Information System (INIS)

    Salman, Kh.A.; Shahein, A.Y.

    2009-01-01

In petroleum oil industries, technologically enhanced naturally occurring radioactive materials (TENORM) are produced. The presence of TENORM constitutes a significant radiological health hazard. In the present work, the liquid scintillation counting technique was used to determine both ²²²Rn and ²²⁶Ra concentrations in TENORM samples, by measuring ²²²Rn concentrations in the sample at different intervals of time after preparation. The radiation doses from the TENORM samples were estimated using a thermoluminescent detector (TLD-4000). The estimated radiation doses were found to be proportional to both the radiation doses measured on site and the natural activity concentrations in the samples measured with LSC.

  20. Application of the Sampling Selection Technique in Approaching Financial Audit

    Directory of Open Access Journals (Sweden)

    Victor Munteanu

    2018-03-01

Full Text Available In his professional approach, the financial auditor has a wide range of working techniques at his disposal, including selection techniques. They are applied depending on the nature of the information available to the financial auditor, the manner in which it is presented - paper or electronic format - and, last but not least, the time available. Several techniques are applied, successively or in parallel, to increase the reliability of the expressed opinion and to provide the audit report with a solid basis of information. Sampling is used in the phase of controlling or clarifying an identified error. The main purpose is to corroborate or measure the degree of risk detected following a pertinent analysis. Since the auditor has neither the time nor the means to rebuild the information exhaustively, the sampling technique can provide an effective response to this need.

  1. New materials for sample preparation techniques in bioanalysis.

    Science.gov (United States)

    Nazario, Carlos Eduardo Domingues; Fumes, Bruno Henrique; da Silva, Meire Ribeiro; Lanças, Fernando Mauro

    2017-02-01

The analysis of biological samples is a complex and difficult task owing to two basic and complementary issues: the high complexity of most biological matrices and the need to determine minute quantities of active substances and contaminants in such complex samples. To succeed in this endeavor, samples are usually subjected to three steps of a comprehensive analytical methodology: sample preparation, analyte isolation (usually utilizing a chromatographic technique) and qualitative/quantitative analysis (usually with the aid of mass spectrometric tools). Owing to the complex nature of bio-samples and the very low concentration of the target analytes to be determined, selective sample preparation techniques are mandatory in order to overcome the difficulties imposed by these two constraints. During the last decade, new chemical synthesis approaches have been developed and optimized, such as sol-gel and molecular imprinting technologies, allowing the preparation of novel materials for sample preparation, including graphene and its derivatives, magnetic materials, ionic liquids, molecularly imprinted polymers, and more. In this contribution we review these novel techniques and materials, as well as their application to the bioanalysis niche.

  2. RandomSpot: A web-based tool for systematic random sampling of virtual slides.

    Science.gov (United States)

    Wright, Alexander I; Grabsch, Heike I; Treanor, Darren E

    2015-01-01

    This paper describes work presented at the Nordic Symposium on Digital Pathology 2014, Linköping, Sweden. Systematic random sampling (SRS) is a stereological tool, which provides a framework to quickly build an accurate estimation of the distribution of objects or classes within an image, whilst minimizing the number of observations required. RandomSpot is a web-based tool for SRS in stereology, which systematically places equidistant points within a given region of interest on a virtual slide. Each point can then be visually inspected by a pathologist in order to generate an unbiased sample of the distribution of classes within the tissue. Further measurements can then be derived from the distribution, such as the ratio of tumor to stroma. RandomSpot replicates the fundamental principle of traditional light microscope grid-shaped graticules, with the added benefits associated with virtual slides, such as facilitated collaboration and automated navigation between points. Once the sample points have been added to the region(s) of interest, users can download the annotations and view them locally using their virtual slide viewing software. Since its introduction, RandomSpot has been used extensively for international collaborative projects, clinical trials and independent research projects. So far, the system has been used to generate over 21,000 sample sets, and has been used to generate data for use in multiple publications, identifying significant new prognostic markers in colorectal, upper gastro-intestinal and breast cancer. Data generated using RandomSpot also has significant value for training image analysis algorithms using sample point coordinates and pathologist classifications.
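The principle RandomSpot implements, a systematic grid of equidistant points with a single random start, can be sketched as follows (hypothetical spacing logic in slide-pixel coordinates, not the tool's actual code):

```python
import random

def srs_points(width, height, n_points, seed=0):
    """Systematic random sampling: lay a square grid over a rectangular
    region so that roughly n_points fall inside, with one random offset
    shared by all points -- random start, systematic spacing."""
    rng = random.Random(seed)
    step = (width * height / n_points) ** 0.5   # spacing for ~n_points
    x0 = rng.uniform(0, step)
    y0 = rng.uniform(0, step)
    pts = []
    y = y0
    while y < height:
        x = x0
        while x < width:
            pts.append((x, y))
            x += step
        y += step
    return pts

pts = srs_points(1000, 800, 200)
```

Because every location has the same inclusion probability, class counts over the visited points give unbiased estimates of area fractions such as the tumor-to-stroma ratio.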

  3. Randomized branch sampling to estimate fruit production in pecan trees cv. ‘Barton’

    Directory of Open Access Journals (Sweden)

    Filemom Manoel Mokochinski

Full Text Available ABSTRACT: Sampling techniques to quantify fruit production are still very scarce, creating a gap in crop development research. This study was conducted on a rural property in the county of Cachoeira do Sul - RS to estimate the efficiency of randomized branch sampling (RBS) in quantifying the production of pecan fruit at three different ages (5, 7 and 10 years). Two selection techniques were tested: the probability proportional to the diameter (PPD) and the uniform probability (UP) techniques, which were performed on nine trees, three from each age, chosen at random. The RBS underestimated fruit production for all ages, and its main drawback was the high sampling error (125.17% - PPD and 111.04% - UP). The UP was regarded as more efficient than the PPD, though both techniques estimated similar production and similar experimental errors. In conclusion, we report that branch sampling was inaccurate for this case study, requiring new studies to produce estimates with smaller sampling errors.
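The PPD selection step at each branching node can be sketched as below (illustrative diameters; the UP variant simply gives every branch equal weight). Multiplying the selection probabilities along the chosen path gives the path's inclusion probability, whose inverse expands the terminal fruit count into a whole-tree estimate:

```python
import random

def select_branch(diameters, rng):
    """Pick one branch with probability proportional to its diameter (PPD);
    returns (index, selection_probability)."""
    total = sum(diameters)
    r = rng.uniform(0.0, total)
    cum = 0.0
    for i, d in enumerate(diameters):
        cum += d
        if r <= cum:
            return i, d / total
    return len(diameters) - 1, diameters[-1] / total   # float-edge fallback

rng = random.Random(3)
idx, prob = select_branch([12.0, 8.0, 4.0], rng)
```

The high sampling errors reported above reflect the variance of this single-path expansion, not a bias in the selection rule itself.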

  4. Sample preparation for special PIE-techniques at ITU

    International Nuclear Information System (INIS)

    Toscano, E.H.; Manzel, R.

    2002-01-01

    Several sample preparation techniques were developed and installed in hot cells. The techniques were conceived to evaluate the performance of highly burnt fuel rods and include: (a) a device for the removal of the fuel, (b) a method for the preparation of the specimen ends for the welding of new end caps and for the careful cleaning of samples for Transmission Electron Microscopy and Glow Discharge Mass Spectroscopy, (c) a sample pressurisation device for long term creep tests, and (d) a diameter measuring device for creep or burst samples. Examples of the determination of the mechanical properties, the behaviour under transient conditions and for the assessment of the corrosion behaviour of high burnup cladding materials are presented. (author)

  5. Random sampling of quantum states: a survey of methods and some issues regarding the Overparametrized Method

    International Nuclear Information System (INIS)

    Maziero, Jonas

    2015-01-01

The numerical generation of random quantum states (RQS) is an important procedure for investigations in quantum information science. Here, we review some methods that may be used for performing that task. We start by presenting a simple procedure for generating random state vectors, for which the main tool is the random sampling of unbiased discrete probability distributions (DPD). Afterwards, the creation of random density matrices is addressed. In this context, we first present the standard method, which consists in using the spectral decomposition of a quantum state for getting RQS from random DPDs and random unitary matrices. Next, the Bloch vector parametrization method is described. This approach, despite being useful in several instances, is not in general convenient for RQS generation. In the last part of the article, we consider the overparametrized method (OPM) and the related Ginibre and Bures techniques. The OPM can be used to create random positive semidefinite matrices with unit trace from randomly produced general complex matrices in a simple way that is friendly to numerical implementations. We consider a physically relevant issue related to the possible domains that may be used for the real and imaginary parts of the elements of such general complex matrices. Subsequently, an overly fast concentration of measure in the quantum state space that appears in this parametrization is noted. (author)
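The Ginibre construction mentioned above is short enough to sketch directly: draw a matrix G with i.i.d. standard complex Gaussian entries and normalize G·G† to unit trace. A pure-Python 2×2 illustration (a real implementation would use a linear-algebra library):

```python
import random

def ginibre_density_matrix(dim, seed=0):
    """Ginibre technique for random density matrices: G has i.i.d. standard
    complex Gaussian entries, and rho = G G^dagger / Tr(G G^dagger) is
    positive semidefinite with unit trace by construction."""
    rng = random.Random(seed)
    G = [[complex(rng.gauss(0, 1), rng.gauss(0, 1)) for _ in range(dim)]
         for _ in range(dim)]
    # rho_ij = sum_k G_ik * conj(G_jk), i.e. G times its conjugate transpose
    rho = [[sum(G[i][k] * G[j][k].conjugate() for k in range(dim))
            for j in range(dim)] for i in range(dim)]
    trace = sum(rho[i][i].real for i in range(dim))
    return [[rho[i][j] / trace for j in range(dim)] for i in range(dim)]

rho = ginibre_density_matrix(2)
```

Positivity and unit trace hold automatically, which is exactly why the overparametrized route avoids the constraint handling that the Bloch vector parametrization requires.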

  6. Use of nuclear technique in samples for agricultural purposes

    Energy Technology Data Exchange (ETDEWEB)

    Oliveira, Kerley A. P. de; Sperling, Eduardo Von, E-mail: kerley@ufmg.br, E-mail: kerleyfisica@yahoo.com.br [Department of Sanitary and Environmental Engineering Federal University of Minas Gerais, Belo Horizonte (Brazil); Menezes, Maria Angela B. C.; Jacomino, Vanusa M.F. [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil)

    2013-01-15

The concern related to the environment is growing, and with it the need to determine chemical elements over a large range of concentrations. The neutron activation analysis technique (NAA) determines the elemental composition by the measurement of artificial radioactivity in a sample that was submitted to a neutron flux. NAA is a sensitive and accurate technique with low detection limits. An example of the application of NAA was the measurement of concentrations of rare earth elements (REE) in waste samples of phosphogypsum (PG) and cerrado soil samples (clayey and sandy soils). Additionally, a soil reference material of the International Atomic Energy Agency (IAEA) was also analyzed. The REE concentration in the PG samples was two times higher than that found in national fertilizers (a total of 4,000 mg kg⁻¹), 154 times greater than the value found in the sandy soil (26 mg kg⁻¹) and 14 times greater than that in the clayey soil (280 mg kg⁻¹). The experimental results for the reference material were within the uncertainty of the certified values, pointing out the accuracy of the method (95%). The determination of La, Ce, Pr, Nd, Pm, Sm, Eu, Tb, Dy, Ho, Er, Tm, Yb and Lu in the samples and reference material confirmed the versatility of the technique for REE determination in soil and phosphogypsum samples, which are matrices of agricultural interest. (author)

  7. Development of analytical techniques for safeguards environmental samples at JAEA

    International Nuclear Information System (INIS)

    Sakurai, Satoshi; Magara, Masaaki; Usuda, Shigekazu; Watanabe, Kazuo; Esaka, Fumitaka; Hirayama, Fumio; Lee, Chi-Gyu; Yasuda, Kenichiro; Inagawa, Jun; Suzuki, Daisuke; Iguchi, Kazunari; Kokubu, Yoko S.; Miyamoto, Yutaka; Ohzu, Akira

    2007-01-01

    JAEA has been developing, under the auspices of the Ministry of Education, Culture, Sports, Science and Technology of Japan, analytical techniques for ultra-trace amounts of nuclear materials in environmental samples in order to contribute to the strengthened safeguards system. Essential techniques for bulk analysis, particle analysis and screening of environmental swipe samples have been established as ultra-trace analytical methods for uranium and plutonium. In January 2003, JAEA was qualified, including its quality control system, as a member of the IAEA network of analytical laboratories for environmental samples. Since 2004, JAEA has conducted analyses of domestic and IAEA samples, through which its analytical capability has been verified and improved. In parallel, advanced techniques have been developed in order to expand the applicability to samples of various elemental compositions and impurities and to improve analytical accuracy and efficiency. This paper traces the technical development of environmental sample analysis at JAEA and reviews recent trends of research and development in this field. (author)

  8. Use of nuclear technique in samples for agricultural purposes

    International Nuclear Information System (INIS)

    Oliveira, Kerley A. P. de; Sperling, Eduardo Von; Menezes, Maria Angela B. C.; Jacomino, Vanusa M.F.

    2013-01-01

    Environmental concern is growing, and with it the need to determine chemical elements over a wide range of concentrations. Neutron activation analysis (NAA) determines the elemental composition of a sample by measuring the artificial radioactivity induced when the sample is submitted to a neutron flux. NAA is a sensitive and accurate technique with low detection limits. An example of the application of NAA was the measurement of concentrations of rare earth elements (REE) in waste samples of phosphogypsum (PG) and in cerrado soil samples (clayey and sandy soils). Additionally, a soil reference material of the International Atomic Energy Agency (IAEA) was also analyzed. The REE concentration in the PG samples was two times higher than that found in national fertilizers (a total of 4,000 mg kg⁻¹), 154 times greater than the value found in the sandy soil (26 mg kg⁻¹) and 14 times greater than that in the clayey soil (280 mg kg⁻¹). The experimental results for the reference material were within the uncertainty of the certified values, confirming the accuracy of the method (95%). The determination of La, Ce, Pr, Nd, Pm, Sm, Eu, Tb, Dy, Ho, Er, Tm, Yb and Lu in the samples and reference material confirmed the versatility of the technique for REE determination in soil and phosphogypsum samples, which are matrices of agricultural interest. (author)

  9. A Monte Carlo Sampling Technique for Multi-phonon Processes

    Energy Technology Data Exchange (ETDEWEB)

    Hoegberg, Thure

    1961-12-15

    A sampling technique for selecting scattering angle and energy gain in Monte Carlo calculations of neutron thermalization is described. It is supposed that the scattering is separated into processes involving different numbers of phonons. The number of phonons involved is first determined. Scattering angle and energy gain are then chosen by using special properties of the multi-phonon term.
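
    The two-stage scheme described (first draw the phonon number, then the scattering angle and energy gain conditional on it) can be sketched generically as follows; the weights and conditional distributions below are placeholders for illustration, not the actual multi-phonon terms of the report:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical relative probabilities of 1-, 2- and 3-phonon processes
phonon_weights = np.array([0.70, 0.25, 0.05])

def sample_scattering():
    # Stage 1: select how many phonons take part in the scattering
    n = rng.choice([1, 2, 3], p=phonon_weights)
    # Stage 2: draw scattering angle and energy gain from the n-phonon
    # conditional law (placeholder distributions for illustration)
    cos_theta = rng.uniform(-1.0, 1.0)
    energy_gain = rng.gamma(shape=n, scale=1.0)
    return n, cos_theta, energy_gain

draws = [sample_scattering() for _ in range(10_000)]
```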

  10. Comparison of sampling techniques for Rift Valley Fever virus ...

    African Journals Online (AJOL)

    We investigated mosquito sampling techniques with two types of traps and attractants at different times for trapping potential vectors of Rift Valley Fever virus. The study was conducted in six villages in Ngorongoro district in Tanzania from September to October 2012. A total of 1814 mosquitoes were collected, of which 738 ...

  11. Analytical techniques for measurement of 99Tc in environmental samples

    International Nuclear Information System (INIS)

    Anon.

    1979-01-01

    Three new methods have been developed for measuring 99Tc in environmental samples. The most sensitive method is isotope dilution mass spectrometry, which allows measurement of about 1 × 10⁻¹² grams of 99Tc. Results on analysis of five samples by this method compare very well with values obtained by a second independent method, which involves counting of beta particles from 99Tc and internal conversion electrons from 97mTc. A third method involving electrothermal atomic absorption has also been developed. Although this method is not as sensitive as the first two techniques, the cost per analysis is expected to be considerably less for certain types of samples.

  12. Radioisotope Sample Measurement Techniques in Medicine and Biology. Proceedings of the Symposium on Radioisotope Sample Measurement Techniques

    International Nuclear Information System (INIS)

    1965-01-01

    The medical and biological applications of radioisotopes depend on two basically different types of measurements, those on living subjects in vivo and those on samples in vitro. The International Atomic Energy Agency has in the past held several meetings on in vivo measurement techniques, notably whole-body counting and radioisotope scanning. The present volume contains the Proceedings of the first Symposium the Agency has organized to discuss the various aspects of techniques for sample measurement in vitro. The range of these sample measurement techniques is very wide. The sample may weigh a few milligrams or several hundred grams, and may be in the gaseous, liquid or solid state. Its radioactive content may consist of a single, known radioisotope or several unknown ones. The concentration of radioactivity may be low, medium or high. The measurements may be made manually or automatically and any one of the many radiation detectors now available may be used. The 53 papers presented at the Symposium illustrate the great variety of methods now in use for radioactive-sample measurements. The first topic discussed is gamma-ray spectrometry, which finds an increasing number of applications in sample measurements. Other sections of the Proceedings deal with: the use of computers in gamma-ray spectrometry and multiple tracer techniques; recent developments in activation analysis where both gamma-ray spectrometry and computing techniques are applied; thin-layer and paper radio chromatographic techniques for use with low energy beta-ray emitters; various aspects of liquid scintillation counting techniques in the measurement of alpha- and beta-ray emitters, including chemical and colour quenching; autoradiographic techniques; calibration of equipment; and standardization of radioisotopes. Finally, some applications of solid-state detectors are presented; this section may be regarded as a preview of important future developments. The meeting was attended by 203 participants.

  13. CONSISTENCY UNDER SAMPLING OF EXPONENTIAL RANDOM GRAPH MODELS.

    Science.gov (United States)

    Shalizi, Cosma Rohilla; Rinaldo, Alessandro

    2013-04-01

    The growing availability of network data and of scientific interest in distributed systems has led to the rapid development of statistical models of network structure. Typically, however, these are models for the entire network, while the data consist only of a sampled sub-network. Parameters for the whole network, which is what is of interest, are estimated by applying the model to the sub-network. This assumes that the model is consistent under sampling or, in terms of the theory of stochastic processes, that it defines a projective family. Focusing on the popular class of exponential random graph models (ERGMs), we show that this apparently trivial condition is in fact violated by many popular and scientifically appealing models, and that satisfying it drastically limits the expressive power of ERGMs. These results are actually special cases of more general results about exponential families of dependent random variables, which we also prove. Using such results, we offer easily checked conditions for the consistency of maximum likelihood estimation in ERGMs, and discuss some possible constructive responses.

  14. Classification of Phishing Email Using Random Forest Machine Learning Technique

    OpenAIRE

    Akinyelu, Andronicus A.; Adewumi, Aderemi O.

    2013-01-01

    Phishing is one of the major challenges faced by the world of e-commerce today. Thanks to phishing attacks, billions of dollars have been lost by many companies and individuals. In 2012, an online report put the loss due to phishing attack at about $1.5 billion. This global impact of phishing attacks will continue to be on the increase and thus requires more efficient phishing detection techniques to curb the menace. This paper investigates and reports the use of random forest machine learnin...

  15. Application of digital sampling techniques to particle identification

    International Nuclear Information System (INIS)

    Bardelli, L.; Poggi, G.; Bini, M.; Carraresi, L.; Pasquali, G.; Taccetti, N.

    2003-01-01

    An application of digital sampling techniques is presented which can greatly simplify experiments involving sub-nanosecond time-mark determinations and energy measurements with nuclear detectors, used for Pulse Shape Analysis and Time of Flight measurements in heavy-ion experiments. In this work a 100 MSample/s, 12-bit analog-to-digital converter has been used: examples of this technique applied to Silicon and CsI(Tl) detectors in heavy-ion experiments involving particle identification via Pulse Shape Analysis and Time of Flight measurements are presented. The system is suited for applications to large detector arrays and to different kinds of detectors. Some preliminary results regarding the simulation of current signals in Silicon detectors are also discussed. (authors)
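
    As a flavor of how sampled waveforms yield sub-sample time marks, here is a minimal digital constant-fraction discriminator operating on a synthetic pulse. This is a generic illustration, not the authors' algorithm; the pulse shape, sampling period, fraction and delay are assumed values:

```python
import numpy as np

def cfd_time(s, dt, frac=0.3, delay=4):
    """Sub-sample time mark via digital constant-fraction discrimination:
    locate the zero crossing of frac*s(t) - s(t - delay) and refine it
    by linear interpolation between samples."""
    v = frac * s[delay:] - s[:-delay]
    k = np.where((v[:-1] > 0) & (v[1:] <= 0))[0][0]   # first +/- crossing
    return (k + v[k] / (v[k] - v[k + 1]) + delay) * dt

# Two identical pulses shifted by 5 ns should give marks 5 ns apart
t = np.arange(200, dtype=float)                       # assumed 1 ns sampling

def pulse(t0):
    rise = np.clip(t - t0, 0.0, None)
    return rise * np.exp(-rise / 10.0)

dt_meas = cfd_time(pulse(25.3), 1.0) - cfd_time(pulse(20.3), 1.0)
```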

  16. A secure cyclic steganographic technique for color images using randomization

    International Nuclear Information System (INIS)

    Muhammad, K.; Ahmad, J.; Rehman, N.U.

    2014-01-01

    Information security is a major concern in today's modern era. Almost all communicating parties want security, confidentiality and integrity of their personal data, but these goals cannot be achieved easily over an open network such as the Internet. Steganography provides one of the best solutions to this problem. This paper presents a new Cyclic Steganographic Technique (CST) based on Least Significant Bit (LSB) substitution for true color (RGB) images. The proposed method hides the secret data in the LSBs of cover image pixels in a randomized cyclic manner. The proposed technique is evaluated using both subjective and objective analysis, based on histogram changeability, Peak Signal-to-Noise Ratio (PSNR) and Mean Square Error (MSE). Experimentally it is found that the proposed method gives promising results in terms of security, imperceptibility and robustness compared to some existing methods, vindicating the new algorithm. (author)
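
    As a minimal illustration of LSB embedding with a cyclic channel order (a simplified sketch; the paper's CST additionally randomizes the traversal, which is omitted here):

```python
import numpy as np

def embed_cyclic_lsb(cover, bits):
    """Hide a bit sequence in the LSBs of an RGB image, cycling through
    the R, G, B channels (simplified sketch of cyclic LSB embedding)."""
    stego = cover.copy().reshape(-1, 3)
    for i, bit in enumerate(bits):
        pixel, channel = divmod(i, 3)          # cycle R -> G -> B
        stego[pixel, channel] = (stego[pixel, channel] & 0xFE) | bit
    return stego.reshape(cover.shape)

def extract_cyclic_lsb(stego, n_bits):
    flat = stego.reshape(-1, 3)
    return [int(flat[divmod(i, 3)] & 1) for i in range(n_bits)]

rng = np.random.default_rng(1)
cover = rng.integers(0, 256, size=(4, 4, 3), dtype=np.uint8)
bits = [1, 0, 1, 1, 0, 0, 1, 0]
stego = embed_cyclic_lsb(cover, bits)
```

    Each cover pixel value changes by at most one count, which is why LSB schemes score well on PSNR/MSE imperceptibility metrics.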

  17. Development of core sampling technique for ITER Type B radwaste

    Energy Technology Data Exchange (ETDEWEB)

    Kim, S. G.; Hong, K. P.; Oh, W. H.; Park, M. C.; Jung, S. H.; Ahn, S. B. [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2016-10-15

    Type B radwaste (intermediate-level and long-lived radioactive waste) removed from the ITER vacuum vessel is to be treated and stored in the basement of the hot cell building. The Type B radwaste treatment process is composed of buffer storage, cutting, sampling/tritium measurement, tritium removal, characterization, pre-packaging, inspection/decontamination, and storage. The cut slices of Type B radwaste components generated in the cutting process undergo a sampling process before and after tritium removal. The purpose of sampling is to obtain small pieces of material in order to investigate the tritium content and concentration of the Type B radwaste. Core sampling, one of the candidate techniques for application in the ITER hot cell, is suitable for metal of limited thickness (less than 50 mm) and requires no coolant. SS316L and CuCrZr were tested in order to simulate ITER Type B radwaste. Core sampling unavoidably produces substantial secondary waste in the form of cutting chips, so the core sampling machine will have to be equipped with a disposal system such as suction equipment. Core sampling is also considered less favorable than conventional drilling with respect to tool wear.

  18. Manipulation of biological samples using micro and nano techniques.

    Science.gov (United States)

    Castillo, Jaime; Dimaki, Maria; Svendsen, Winnie Edith

    2009-01-01

    The constant interest in handling, integrating and understanding biological systems of interest to the biomedical field, the pharmaceutical industry and biomaterials researchers demands techniques that allow the manipulation of biological samples with minimal or no damage to their natural structure. Thanks to advances in micro- and nanofabrication during the last decades, several manipulation techniques offer the possibility to image, characterize and manipulate biological material in a controlled way. Using these techniques, the integration of biomaterials with remarkable properties with physical transducers has been possible, giving rise to new and highly sensitive biosensing devices. This article reviews the different techniques available to manipulate and integrate biological materials in a controlled manner, either by sliding them along a surface (2-D manipulation), by gripping them and moving them to a new position (3-D manipulation), or by manipulating and relocating them through applied external forces. The advantages and drawbacks are mentioned together with examples that reflect the state of the art of manipulation techniques for biological samples (171 references).

  19. Determination of palladium in biological samples applying nuclear analytical techniques

    International Nuclear Information System (INIS)

    Cavalcante, Cassio Q.; Sato, Ivone M.; Salvador, Vera L. R.; Saiki, Mitiko

    2008-01-01

    This study presents Pd determinations in laboratory-prepared bovine tissue samples containing palladium and in CCQM-P63 automotive catalyst materials of the Proficiency Test, using instrumental thermal and epithermal neutron activation analysis and energy dispersive X-ray fluorescence techniques. Solvent extraction and solid phase extraction procedures were also applied to separate Pd from interfering elements before irradiation in the nuclear reactor. The results obtained by the different techniques were compared with each other to examine sensitivity, precision and accuracy. (author)

  20. Improved importance sampling technique for efficient simulation of digital communication systems

    Science.gov (United States)

    Lu, Dingqing; Yao, Kung

    1988-01-01

    A new, improved importance sampling (IIS) approach to simulation is considered. Some basic concepts of IS are introduced, and detailed evaluations of simulation estimation variances for Monte Carlo (MC) and IS simulations are given. The general results obtained from these evaluations are applied to the specific previously known conventional importance sampling (CIS) technique and the new IIS technique. The derivation for a linear system with no memory and no signals is considered in some detail. For the CIS technique, the optimum input scaling parameter is found, while for the IIS technique, the optimum translation parameter is found. The results are generalized to a linear system with memory and signals. Specific numerical and simulation results are given which show the advantages of CIS over MC and of IIS over CIS for simulations of digital communication systems.
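
    The benefit of a translation (mean-shift) parameter can be seen in a toy example not taken from the paper: estimating the small tail probability P(X > 4) for standard Gaussian noise by translating the sampling density to the threshold and reweighting with the likelihood ratio.

```python
import numpy as np
from math import erfc, sqrt

rng = np.random.default_rng(42)
n, threshold = 100_000, 4.0

# Plain Monte Carlo: almost no samples fall in the tail
x = rng.normal(size=n)
mc_est = np.mean(x > threshold)

# Importance sampling with the density translated to N(threshold, 1);
# the likelihood ratio is f(y)/g(y) = exp(-threshold*y + threshold**2/2)
y = rng.normal(loc=threshold, size=n)
w = np.exp(-threshold * y + threshold**2 / 2)
is_est = np.mean((y > threshold) * w)

exact = 0.5 * erfc(threshold / sqrt(2))   # closed form, for comparison
```

    With the same sample budget, the translated estimator resolves a probability of a few times 10⁻⁵ to within a few percent, while plain MC sees only a handful of tail hits.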

  1. A Table-Based Random Sampling Simulation for Bioluminescence Tomography

    Directory of Open Access Journals (Sweden)

    Xiaomeng Zhang

    2006-01-01

    Full Text Available As a popular simulation of photon propagation in turbid media, the main drawback of the Monte Carlo (MC) method is its cumbersome computation. In this work a table-based random sampling simulation (TBRS) is proposed. The key idea of TBRS is to simplify multiple steps of scattering to a single-step process through random table querying, thus greatly reducing the computing complexity of the conventional MC algorithm and expediting the computation. The TBRS simulation is a fast alternative to the conventional MC simulation of photon propagation. It retains the merits of flexibility and accuracy of the conventional MC method and adapts well to complex geometric media and various source shapes. Both MC simulations were conducted in a homogeneous medium in our work. We also present a reconstruction approach to estimate the position of a fluorescent source, based on trial-and-error, as a validation of the TBRS algorithm. Good agreement is found between the conventional MC simulation and the TBRS simulation.
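
    The table-querying idea can be illustrated on the simplest ingredient of a photon MC code, the free-path draw. This is a sketch under assumed values (scattering coefficient, table size), not the TBRS tables themselves: the inverse CDF is precomputed once, and each subsequent draw is a single table lookup instead of a logarithm evaluation.

```python
import numpy as np

rng = np.random.default_rng(7)

mu_s = 10.0                      # assumed scattering coefficient, mm^-1
TABLE_SIZE = 4096

# Tabulate the inverse CDF of the exponential free-path law once
u = (np.arange(TABLE_SIZE) + 0.5) / TABLE_SIZE
step_table = -np.log(1.0 - u) / mu_s

def sample_step():
    """One free-path draw by a single random table query."""
    return step_table[rng.integers(TABLE_SIZE)]

steps = step_table[rng.integers(TABLE_SIZE, size=100_000)]
```

    The sample mean of the tabulated draws stays close to the exact mean free path 1/mu_s, at the cost of a small discretization of the tail.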

  2. Proteomic Challenges: Sample Preparation Techniques for Microgram-Quantity Protein Analysis from Biological Samples

    Science.gov (United States)

    Feist, Peter; Hummon, Amanda B.

    2015-01-01

    Proteins regulate many cellular functions, and analyzing their presence and abundance in biological samples is a central focus in proteomics. The discovery and validation of biomarkers, pathways, and drug targets for various diseases can be accomplished using mass spectrometry-based proteomics. However, with mass-limited samples like tumor biopsies, it can be challenging to obtain sufficient amounts of proteins to generate high-quality mass spectrometric data. Techniques developed for macroscale quantities recover sufficient amounts of protein from milligram quantities of starting material, but sample losses become crippling with these techniques when only microgram amounts of material are available. To combat this challenge, proteomicists have developed micro-scale techniques that are compatible with decreased sample size (100 μg or lower) and still enable excellent proteome coverage. Extraction, contaminant removal, protein quantitation, and sample handling techniques for the microgram protein range are reviewed here, with an emphasis on liquid chromatography and bottom-up mass spectrometry-compatible techniques. Also, a range of biological specimens, including mammalian tissues and model cell culture systems, is discussed. PMID:25664860

  3. Proteomic challenges: sample preparation techniques for microgram-quantity protein analysis from biological samples.

    Science.gov (United States)

    Feist, Peter; Hummon, Amanda B

    2015-02-05

    Proteins regulate many cellular functions, and analyzing their presence and abundance in biological samples is a central focus in proteomics. The discovery and validation of biomarkers, pathways, and drug targets for various diseases can be accomplished using mass spectrometry-based proteomics. However, with mass-limited samples like tumor biopsies, it can be challenging to obtain sufficient amounts of proteins to generate high-quality mass spectrometric data. Techniques developed for macroscale quantities recover sufficient amounts of protein from milligram quantities of starting material, but sample losses become crippling with these techniques when only microgram amounts of material are available. To combat this challenge, proteomicists have developed micro-scale techniques that are compatible with decreased sample size (100 μg or lower) and still enable excellent proteome coverage. Extraction, contaminant removal, protein quantitation, and sample handling techniques for the microgram protein range are reviewed here, with an emphasis on liquid chromatography and bottom-up mass spectrometry-compatible techniques. Also, a range of biological specimens, including mammalian tissues and model cell culture systems, is discussed.

  4. Proteomic Challenges: Sample Preparation Techniques for Microgram-Quantity Protein Analysis from Biological Samples

    Directory of Open Access Journals (Sweden)

    Peter Feist

    2015-02-01

    Full Text Available Proteins regulate many cellular functions, and analyzing their presence and abundance in biological samples is a central focus in proteomics. The discovery and validation of biomarkers, pathways, and drug targets for various diseases can be accomplished using mass spectrometry-based proteomics. However, with mass-limited samples like tumor biopsies, it can be challenging to obtain sufficient amounts of proteins to generate high-quality mass spectrometric data. Techniques developed for macroscale quantities recover sufficient amounts of protein from milligram quantities of starting material, but sample losses become crippling with these techniques when only microgram amounts of material are available. To combat this challenge, proteomicists have developed micro-scale techniques that are compatible with decreased sample size (100 μg or lower) and still enable excellent proteome coverage. Extraction, contaminant removal, protein quantitation, and sample handling techniques for the microgram protein range are reviewed here, with an emphasis on liquid chromatography and bottom-up mass spectrometry-compatible techniques. Also, a range of biological specimens, including mammalian tissues and model cell culture systems, is discussed.

  5. Development of environmental sample analysis techniques for safeguards

    International Nuclear Information System (INIS)

    Magara, Masaaki; Hanzawa, Yukiko; Esaka, Fumitaka

    1999-01-01

    JAERI has been developing environmental sample analysis techniques for safeguards and is preparing a clean chemistry laboratory with clean rooms. The methods to be developed are bulk analysis and particle analysis. In the bulk analysis, an Inductively-Coupled Plasma Mass Spectrometer or a Thermal Ionization Mass Spectrometer is used to measure nuclear materials after chemical treatment of the sample. In the particle analysis, an Electron Probe Micro Analyzer and a Secondary Ion Mass Spectrometer are used for elemental analysis and isotopic analysis, respectively. The design of the clean chemistry laboratory has been carried out and construction will be completed by the end of March, 2001. (author)

  6. Nuclear analytical techniques and their application to environmental samples

    International Nuclear Information System (INIS)

    Lieser, K.H.

    1986-01-01

    A survey is given on nuclear analytical techniques and their application to environmental samples. Measurement of the inherent radioactivity of elements or radionuclides allows determination of natural radioelements (e.g. Ra), man-made radioelements (e.g. Pu) and radionuclides in the environment. Activation analysis, in particular instrumental neutron activation analysis, is a very reliable and sensitive method for determination of a great number of trace elements in environmental samples, because the most abundant main constituents are not activated. Tracer techniques are very useful for studies of the behaviour and of chemical reactions of trace elements and compounds in the environment. Radioactive sources are mainly applied for excitation of characteristic X-rays (X-ray fluorescence analysis). (author)

  7. Improved mesh based photon sampling techniques for neutron activation analysis

    International Nuclear Information System (INIS)

    Relson, E.; Wilson, P. P. H.; Biondo, E. D.

    2013-01-01

    The design of fusion power systems requires analysis of neutron activation of large, complex volumes, and of the resulting particles emitted from these volumes. Structured mesh-based discretization allows for improved modeling of these activation analysis problems. Finer discretization results in larger computational costs, which drives the investigation of more efficient methods. Within an ad hoc subroutine of the Monte Carlo transport code MCNP, we implement sampling of voxels and photon energies for volumetric sources using the alias method. The alias method enables efficient sampling of a discrete probability distribution in O(1) time, whereas the simpler direct discrete method requires O(log n) time. By using the alias method, voxel sampling becomes a viable alternative to the O(1) approach of uniformly sampling the problem volume. Additionally, with voxel sampling it is straightforward to introduce biasing of volumetric sources, and we implement this biasing of voxels as an additional variance reduction technique. We verify our implementation and compare the alias method, with and without biasing, to direct discrete sampling of voxels and to uniform sampling. We study the behavior of source biasing in a second set of tests and find trends between improvements and source shape, material, and material density. Overall, however, the magnitude of improvements from source biasing appears to be limited. Future work will benefit from the implementation of efficient voxel sampling, particularly with conformal unstructured meshes, where the uniform sampling approach cannot be applied. (authors)
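
    The alias method itself is standard and independent of the MCNP implementation described above. A compact sketch of Vose's variant, with O(n) table construction and O(1) draws:

```python
import numpy as np

def build_alias(p):
    """Vose's alias tables for a discrete distribution p (O(n) setup)."""
    n = len(p)
    prob = np.array(p, dtype=float) * n / np.sum(p)
    alias = np.zeros(n, dtype=int)
    small = [i for i, q in enumerate(prob) if q < 1.0]
    large = [i for i, q in enumerate(prob) if q >= 1.0]
    while small and large:
        s, l = small.pop(), large.pop()
        alias[s] = l                      # overflow from bucket s goes to l
        prob[l] -= 1.0 - prob[s]
        (small if prob[l] < 1.0 else large).append(l)
    return prob, alias

def alias_draw(prob, alias, rng):
    """Draw one index in O(1): one uniform bucket pick, one comparison."""
    i = rng.integers(len(prob))
    return i if rng.random() < prob[i] else alias[i]

rng = np.random.default_rng(0)
prob, alias = build_alias([0.1, 0.2, 0.3, 0.4])
draws = np.array([alias_draw(prob, alias, rng) for _ in range(200_000)])
freq = np.bincount(draws, minlength=4) / draws.size
```

    The empirical frequencies converge to the target distribution while each draw costs a constant amount of work, which is what makes the method attractive for finely discretized voxel sources.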

  8. IMAGE SEGMENTATION BASED ON MARKOV RANDOM FIELD AND WATERSHED TECHNIQUES

    Institute of Scientific and Technical Information of China (English)

    纳瑟; 刘重庆

    2002-01-01

    This paper presents a method that incorporates Markov Random Field (MRF), watershed segmentation and merging techniques to perform image segmentation and edge detection. An MRF is used to obtain an initial estimate of the regions in the image under process, where in the MRF model the gray level x_i at pixel location i in an image X depends on the gray levels of neighboring pixels. The process needs an initial segmented result, which is obtained with the K-means clustering technique and the minimum-distance rule; the region process is then modeled by the MRF to obtain an image containing regions of different intensity. From this image the gradient values are calculated and a watershed technique is employed. The MRF step yields an image with distinct intensity regions carrying all the edge and region information, and the watershed algorithm then improves the segmentation result by superimposing a closed, accurate boundary on each region. After all pixels of the segmented regions have been processed, a map of primitive regions with edges is generated. Finally, a merging process based on averaged mean values is employed. The final segmentation and edge detection result is one closed boundary per actual region in the image.

  9. Path integral methods for primordial density perturbations - sampling of constrained Gaussian random fields

    International Nuclear Information System (INIS)

    Bertschinger, E.

    1987-01-01

    Path integrals may be used to describe the statistical properties of a random field such as the primordial density perturbation field. In this framework the probability distribution is given for a Gaussian random field subjected to constraints such as the presence of a protovoid or supercluster at a specific location in the initial conditions. An algorithm has been constructed for generating samples of a constrained Gaussian random field on a lattice using Monte Carlo techniques. The method makes possible a systematic study of the density field around peaks or other constrained regions in the biased galaxy formation scenario, and it is effective for generating initial conditions for N-body simulations with rare objects in the computational volume. 21 references
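
    For a single linear constraint, sampling a constrained Gaussian field reduces to correcting an unconstrained sample (the Hoffman-Ribak construction, closely related to the Monte Carlo approach described above). The lattice size, covariance kernel and peak height below are assumed for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)

# Covariance of a toy 1-D "density field" on a lattice (assumed Gaussian kernel)
n = 64
x = np.arange(n)
C = np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2 / 5.0**2)

# Unconstrained Gaussian sample with covariance C (Cholesky factorization;
# a small jitter keeps the factorization numerically stable)
L = np.linalg.cholesky(C + 1e-8 * np.eye(n))
f = L @ rng.normal(size=n)

# Impose the constraint f[i0] = delta_pk (e.g. a prescribed peak height)
i0, delta_pk = n // 2, 3.0
f_constrained = f + C[:, i0] / C[i0, i0] * (delta_pk - f[i0])
```

    The corrected field passes exactly through the prescribed value while retaining the correct conditional covariance away from the constraint, which is what makes such samples useful as initial conditions with rare objects placed in the computational volume.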

  10. A Comparison of Soil-Water Sampling Techniques

    Science.gov (United States)

    Tindall, J. A.; Figueroa-Johnson, M.; Friedel, M. J.

    2007-12-01

    The representativeness of soil pore water extracted by suction lysimeters in ground-water monitoring studies is a problem that often confounds interpretation of measured data. Current soil water sampling techniques cannot identify the soil volume from which a pore water sample is extracted, whether macroscopic, microscopic, or preferential flowpath. This research was undertaken to compare values from suction lysimeter samples taken from intact soil cores with samples obtained by direct extraction methods, to determine what portion of soil pore water is sampled by each method. Intact soil cores (30 centimeter (cm) diameter by 40 cm height) were extracted from two different sites - a sandy soil near Altamonte Springs, Florida and a clayey soil near Centralia in Boone County, Missouri. Isotopically labeled water (18O, analyzed by mass spectrometry) and bromide concentrations (KBr⁻, measured using ion chromatography) from water samples taken by suction lysimeters were compared with samples obtained by the direct extraction methods of centrifugation and azeotropic distillation. Water samples collected by direct extraction were about 0.25 ‰ more negative (depleted) than suction lysimeter values from a sandy soil and about 2-7 ‰ more negative from a well structured clayey soil. Results indicate that the majority of soil water in well-structured soil is strongly bound to soil grain surfaces and is not easily sampled by suction lysimeters. In cases where a sufficient volume of water has passed through the soil profile and displaced previous pore water, suction lysimeters will collect a representative sample of soil pore water from the sampled depth interval. It is suggested that for stable isotope studies monitoring precipitation and soil water, suction lysimeters should be installed at shallow depths (10 cm). Samples should also be coordinated with precipitation events. The data also indicate that each extraction method may be used to sample a different

  11. Comparison of correlation analysis techniques for irregularly sampled time series

    Directory of Open Access Journals (Sweden)

    K. Rehfeld

    2011-06-01

    Full Text Available Geoscientific measurements often provide time series with irregular time sampling, requiring either data reconstruction (interpolation) or sophisticated methods to handle irregular sampling. We compare the linear interpolation technique and different approaches for analyzing the correlation functions and persistence of irregularly sampled time series, such as Lomb-Scargle Fourier transformation and kernel-based methods. In a thorough benchmark test we investigate the performance of these techniques.

    All methods have comparable root mean square errors (RMSEs) for low skewness of the inter-observation time distribution. For high skewness, i.e. very irregular data, interpolation bias and RMSE increase strongly. We find a 40 % lower RMSE for the lag-1 autocorrelation function (ACF) for the Gaussian kernel method vs. the linear interpolation scheme in the analysis of highly irregular time series. For the cross correlation function (CCF) the RMSE is then lower by 60 %. The application of the Lomb-Scargle technique gave results comparable to the kernel methods for the univariate, but poorer results in the bivariate case. Especially the high-frequency components of the signal, where classical methods show a strong bias in ACF and CCF magnitude, are preserved when using the kernel methods.

    We illustrate the performance of interpolation vs. the Gaussian kernel method by applying both to paleo-data from four locations, reflecting late Holocene Asian monsoon variability as derived from speleothem δ18O measurements. Cross correlation results are similar for both methods, which we attribute to the long time scales of the common variability. The persistence time (memory) is strongly overestimated when using the standard, interpolation-based approach. Hence, the Gaussian kernel is a reliable and more robust estimator, with significant advantages compared to other techniques, and is suitable for large-scale application to paleo-data.
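
    A Gaussian-kernel correlation estimate of the kind benchmarked here can be written in a few lines; the bandwidth, normalization and test signal below are our illustrative choices, not the paper's exact estimator:

```python
import numpy as np

def kernel_acf(t, y, lag, h):
    """Gaussian-kernel estimate of the autocorrelation of an irregularly
    sampled series (t, y) at a given lag; h is the kernel bandwidth."""
    y = (y - y.mean()) / y.std()
    dt = t[None, :] - t[:, None]          # all pairwise time differences
    w = np.exp(-0.5 * ((dt - lag) / h) ** 2)
    np.fill_diagonal(w, 0.0)              # drop zero-lag self pairs
    return np.sum(w * np.outer(y, y)) / np.sum(w)

# Irregularly sampled sine: the ACF should be near -1 at lag pi
# and near +1 at lag 2*pi
rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0.0, 20 * np.pi, 400))
y = np.sin(t)
r_pi = kernel_acf(t, y, np.pi, h=0.2)
r_2pi = kernel_acf(t, y, 2 * np.pi, h=0.2)
```

    No interpolation onto a regular grid is needed: every observed pair contributes, weighted by how close its time difference lies to the requested lag.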

  12. Separation Techniques for Quantification of Radionuclides in Environmental Samples

    Directory of Open Access Journals (Sweden)

    Dusan Galanda

    2009-01-01

    Full Text Available The reliable and quantitative measurement of radionuclides is important in order to determine environmental quality and radiation safety, and to monitor regulatory compliance. We examined soil samples from Podunajske Biskupice, near the city of Bratislava in the Slovak Republic, for the presence of several natural (238U, 232Th, 40K) and anthropogenic (137Cs, 90Sr, 239Pu, 240Pu, 241Am) radionuclides. The area is adjacent to a refinery and hazardous waste processing center, as well as the municipal incinerator plant, and so might possess an unusually high level of ecotoxic metals. We found that the levels of both naturally occurring and anthropogenic radionuclides fell within the expected ranges, indicating that these facilities pose no radiological threat to the local environment. During the course of our analysis, we modified existing techniques in order to allow us to handle the unusually large and complex samples that were needed to determine the levels of 239Pu, 240Pu, and 241Am activity. We also rated three commercial techniques for the separation of 90Sr from aqueous solutions and found that two of them, AnaLig Sr-01 and Empore Extraction Disks, were suitable for the quantitative and reliable separation of 90Sr, while the third, Sr-Spec Resin, was less so. The main criterion in evaluating these methods was the chemical recovery of 90Sr, which was less than we had expected. We also considered speed of separation and additional steps needed to prepare the sample for separation.

  13. Nonactivation interaction techniques in the analysis of environmental samples

    International Nuclear Information System (INIS)

    Tolgyessy, J.

    1986-01-01

    Nonactivation interaction analytical methods are based on the interaction processes of nuclear and X-ray radiation with a sample, leading to their absorption and backscattering, to the ionization of gases or excitation of fluorescent X-ray by radiation, but not to the activation of determined elements. From the point of view of environmental analysis, the most useful nonactivation interaction techniques are X-ray fluorescence by photon or charged particle excitation, ionization of gases by nuclear radiation, elastic scattering of charged particles and backscattering of beta radiation. The significant advantage of these methods is that they are nondestructive. (author)

  14. The electron transport problem sampling by Monte Carlo individual collision technique

    International Nuclear Information System (INIS)

    Androsenko, P.A.; Belousov, V.I.

    2005-01-01

    The problem of electron transport is of great interest in many fields of modern science, and Monte Carlo sampling is used to solve it. Electron transport is characterized by a large number of individual interactions. To simulate it, the 'condensed history' technique may be used, in which a large number of collisions are grouped into a single step to be sampled randomly. Another kind of Monte Carlo sampling is the individual collision technique, which offers incontestable advantages over the condensed history technique. For example, one does not need to specify the parameters of the condensed history technique, such as the upper limit for electron energy, resolution, number of sub-steps, etc. The condensed history technique may also lose some very important electron tracks, because it is limited by the step parameters of particle movement and by the weakness of algorithms such as the energy indexing algorithm. The individual collision technique has none of these disadvantages. This report presents some sampling algorithms of the new version of the BRAND code, in which the above-mentioned technique is used. All information on electrons was taken from ENDF-6 files, which are an important part of BRAND; these files were not processed but taken directly from the electron information source. Four kinds of interaction were considered: elastic interaction, bremsstrahlung, atomic excitation, and atomic electro-ionization. Some sampling results are presented in comparison with analogs; for example, the endovascular radiotherapy problem (P2) of QUADOS2002 is compared with other techniques that are usually used. (authors)
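The individual collision idea reduces, per flight, to two draws: a free-path length from an exponential with the total cross-section, and an interaction type chosen with probability proportional to its partial cross-section. A minimal sketch with invented cross-section values (this is not the BRAND algorithm itself):

```python
import math
import random

# Hypothetical partial cross-sections (cm^-1) for the four interaction
# kinds considered; the values are invented for illustration.
PARTIAL = {"elastic": 4.0, "bremsstrahlung": 0.5,
           "excitation": 1.0, "ionization": 2.5}
TOTAL = sum(PARTIAL.values())   # 8.0

def sample_collision(rng):
    """One individual collision: exponential free path from the total
    cross-section, then the interaction kind by cumulative probability."""
    path = -math.log(1.0 - rng.random()) / TOTAL
    u, acc = rng.random() * TOTAL, 0.0
    for kind, xs in PARTIAL.items():
        acc += xs
        if u <= acc:
            return path, kind
    return path, kind

rng = random.Random(42)
samples = [sample_collision(rng) for _ in range(20000)]
mean_path = sum(p for p, _ in samples) / len(samples)                  # ~ 1/8
frac_elastic = sum(k == "elastic" for _, k in samples) / len(samples)  # ~ 0.5
```

The empirical mean free path approaches 1/TOTAL and the elastic fraction approaches its cross-section share, which is the defining property of the scheme.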

  15. The electron transport problem sampling by Monte Carlo individual collision technique

    Energy Technology Data Exchange (ETDEWEB)

    Androsenko, P.A.; Belousov, V.I. [Obninsk State Technical Univ. of Nuclear Power Engineering, Kaluga region (Russian Federation)

    2005-07-01

    The problem of electron transport is of great interest in many fields of modern science, and Monte Carlo sampling is used to solve it. Electron transport is characterized by a large number of individual interactions. To simulate it, the 'condensed history' technique may be used, in which a large number of collisions are grouped into a single step to be sampled randomly. Another kind of Monte Carlo sampling is the individual collision technique, which offers incontestable advantages over the condensed history technique. For example, one does not need to specify the parameters of the condensed history technique, such as the upper limit for electron energy, resolution, number of sub-steps, etc. The condensed history technique may also lose some very important electron tracks, because it is limited by the step parameters of particle movement and by the weakness of algorithms such as the energy indexing algorithm. The individual collision technique has none of these disadvantages. This report presents some sampling algorithms of the new version of the BRAND code, in which the above-mentioned technique is used. All information on electrons was taken from ENDF-6 files, which are an important part of BRAND; these files were not processed but taken directly from the electron information source. Four kinds of interaction were considered: elastic interaction, bremsstrahlung, atomic excitation, and atomic electro-ionization. Some sampling results are presented in comparison with analogs; for example, the endovascular radiotherapy problem (P2) of QUADOS2002 is compared with other techniques that are usually used. (authors)

  16. Waste minimization in analytical chemistry through innovative sample preparation techniques

    International Nuclear Information System (INIS)

    Smith, L. L.

    1998-01-01

    water samples. In this SPME technique, a fused-silica fiber coated with a polymeric film is exposed to the sample, extraction is allowed to take place, and then the analytes are thermally desorbed for GC analysis. Unlike liquid-liquid extraction or solid-phase extraction, SPME consumes all of the extracted sample in the analysis, significantly reducing the required sample volume

  17. Revisiting random walk based sampling in networks: evasion of burn-in period and frequent regenerations.

    Science.gov (United States)

    Avrachenkov, Konstantin; Borkar, Vivek S; Kadavankandy, Arun; Sreedharan, Jithin K

    2018-01-01

    In the framework of network sampling, random walk (RW) based estimation techniques provide many pragmatic solutions while uncovering the unknown network as little as possible. Despite several theoretical advances in this area, RW based sampling techniques usually make a strong assumption that the samples are in the stationary regime, and hence are impelled to leave out the samples collected during the burn-in period. This work proposes two sampling schemes without a burn-in time constraint to estimate the average of an arbitrary function defined on the network nodes, for example, the average age of users in a social network. The central idea of the algorithms lies in exploiting regeneration of RWs at revisits to an aggregated super-node or to a set of nodes, and in strategies to enhance the frequency of such regenerations either by contracting the graph or by making the hitting set larger. Our first algorithm, which is based on reinforcement learning (RL), uses stochastic approximation to derive an estimator. This method can be seen as intermediate between purely stochastic Markov chain Monte Carlo iterations and deterministic relative value iterations. The second algorithm, which we call the Ratio with Tours (RT)-estimator, is a modified form of respondent-driven sampling (RDS) that accommodates the idea of regeneration. We study the methods via simulations on real networks. We observe that the trajectories of the RL-estimator are much more stable than those of standard random walk based estimation procedures, and its error performance is comparable to that of respondent-driven sampling (RDS), which has a smaller asymptotic variance than many other estimators. Simulation studies also show that the mean squared error of the RT-estimator decays much faster with time than that of RDS. The newly developed RW based estimators (RL- and RT-estimators) make it possible to avoid the burn-in period, provide better control of stability along the sample path, and overall reduce the estimation time. Our
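The regeneration idea is easy to illustrate on a toy graph: tours of a simple random walk between consecutive visits to a fixed anchor node are independent and identically distributed, so a ratio of sums accumulated over tours estimates a node average without any burn-in. A minimal sketch of the tour idea (graph, node function and anchor are invented; this is not the paper's RT-estimator implementation):

```python
import random

# Toy undirected graph (adjacency lists) and a node function to average.
ADJ = {0: [1, 2, 3], 1: [0, 2], 2: [0, 1], 3: [0]}
F = {0: 1.0, 1: 2.0, 2: 3.0, 3: 4.0}
DEG = {v: len(nb) for v, nb in ADJ.items()}

def tour_estimate(n_tours, rng, anchor=0):
    """Ratio of tour sums; the 1/deg weights undo the degree bias of the
    random walk's stationary distribution.  Each tour starts at the
    anchor and ends on the first return to it -- no burn-in needed."""
    num = den = 0.0
    for _ in range(n_tours):
        v = anchor
        while True:
            num += F[v] / DEG[v]
            den += 1.0 / DEG[v]
            v = rng.choice(ADJ[v])
            if v == anchor:
                break
    return num / den

rng = random.Random(7)
est = tour_estimate(5000, rng)   # true node average of F is 2.5
```

Because tours are i.i.d., confidence intervals can be formed from the tour-level sums directly, which is what removes the need for a burn-in period.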

  18. Stratified source-sampling techniques for Monte Carlo eigenvalue analysis

    International Nuclear Information System (INIS)

    Mohamed, A.

    1998-01-01

    In 1995, at a conference on criticality safety, a special session was devoted to the Monte Carlo ''Eigenvalue of the World'' problem. Argonne presented a paper at that session in which the anomalies originally observed in that problem were reproduced in a much simplified model-problem configuration and removed by a version of stratified source-sampling. In this paper, stratified source-sampling techniques are generalized and applied to three different Eigenvalue of the World configurations which take into account real-world statistical noise sources not included in the model problem, but which differ in the amount of neutronic coupling among the constituents of each configuration. It is concluded that, in Monte Carlo eigenvalue analysis of loosely-coupled arrays, the use of stratified source-sampling reduces the probability of encountering an anomalous result relative to conventional source-sampling methods. However, this gain in reliability is substantially less than that observed in the model-problem results
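The core of stratified source-sampling can be sketched as follows: instead of drawing all N source particles multinomially from the region weights, which lets a loosely coupled region receive no particles by chance, each region is guaranteed its integer expected share and only the fractional remainders are randomized. A simplified illustration, not the implementation described in the paper:

```python
import random

def stratified_allocation(weights, n, rng):
    """Allocate n source particles to regions: each region first gets the
    integer part of its expected share, then the few leftover particles
    are assigned at random with probability ~ fractional remainder."""
    total = sum(weights)
    exact = [n * w / total for w in weights]
    counts = [int(e) for e in exact]
    remainders = [e - c for e, c in zip(exact, counts)]
    for _ in range(n - sum(counts)):
        r = rng.random() * sum(remainders)
        acc = 0.0
        for i, rem in enumerate(remainders):
            acc += rem
            if r <= acc:
                counts[i] += 1
                remainders[i] = 0.0
                break
    return counts

rng = random.Random(0)
counts = stratified_allocation([1, 3, 6], 1000, rng)        # -> [100, 300, 600]
counts_uneven = stratified_allocation([1, 1, 1], 10, rng)   # 3+3+3 plus 1 random
```

Every region with nonzero weight is guaranteed at least the floor of its expected share, which is exactly the property that suppresses anomalous eigenvalue estimates in loosely coupled arrays.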

  19. Identification of unknown sample using NAA, EDXRF, XRD techniques

    International Nuclear Information System (INIS)

    Dalvi, Aditi A.; Swain, K.K.; Chavan, Trupti; Remya Devi, P.S.; Wagh, D.N.; Verma, R.

    2015-01-01

    Analytical Chemistry Division (ACD), Bhabha Atomic Research Centre (BARC) receives samples from law enforcement agencies such as the Directorate of Revenue Intelligence and Customs for analysis. Five unknown grey powdered samples were received for identification and were suspected to be iridium (Ir). Identification of an unknown sample is always a challenging task, and suitable analytical techniques have to be judiciously utilized to arrive at a conclusion. Qualitative analysis was carried out using a Jordan Valley EX-3600 M energy dispersive X-ray fluorescence (EDXRF) spectrometer at ACD, BARC. An SLP series LEO Si(Li) detector (active area: 30 mm^2; thickness: 3.5 mm; resolution: 140 eV at 5.9 keV of Mn K X-ray) was used during the measurement, and only the characteristic X-rays of Ir (Lα: 9.17 keV and Lβ: 10.70 keV) were seen in the X-ray spectrum. X-ray diffraction (XRD) measurement results indicated that the Ir was in the form of metal. To confirm the XRD data, neutron activation analysis (NAA) was carried out by irradiating samples and elemental standards (as comparators) in the graphite reflector position of the Advanced Heavy Water Reactor Critical Facility (AHWR CF) reactor, BARC, Mumbai. After a suitable decay period, gamma activity measurements were carried out using a 45% HPGe detector coupled to an 8k multichannel analyzer. The characteristic gamma line at 328.4 keV of the activation product 194Ir was used for quantification of iridium, and the relative method of NAA was used for concentration calculations. The NAA results confirmed that all the samples were iridium metal. (author)
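The relative (comparator) method of NAA mentioned above reduces, once sample and standard are irradiated and counted under identical conditions, to a ratio of specific activities; decay and geometry corrections are omitted in this sketch and all numbers are invented:

```python
def naa_relative_concentration(a_sample, a_standard, m_sample, m_standard,
                               c_standard):
    """Comparator method: concentration from the ratio of specific
    activities (counts per gram) of sample and standard."""
    return c_standard * (a_sample / m_sample) / (a_standard / m_standard)

# Hypothetical peak areas (counts) and masses (g); comparator at 100 ug/g.
c = naa_relative_concentration(a_sample=52000, a_standard=26000,
                               m_sample=0.10, m_standard=0.05,
                               c_standard=100.0)
```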

  20. Classification of Phishing Email Using Random Forest Machine Learning Technique

    Directory of Open Access Journals (Sweden)

    Andronicus A. Akinyelu

    2014-01-01

    Full Text Available Phishing is one of the major challenges faced by the world of e-commerce today. Through phishing attacks, billions of dollars have been lost by many companies and individuals; in 2012, an online report put the loss due to phishing attacks at about $1.5 billion. The global impact of phishing attacks will continue to increase and thus requires more efficient phishing detection techniques to curb the menace. This paper investigates and reports the use of the random forest machine learning algorithm in the classification of phishing attacks, with the major objective of developing an improved phishing email classifier with better prediction accuracy and fewer features. From a dataset consisting of 2000 phishing and ham emails, a set of prominent phishing email features (identified from the literature) was extracted and used by the machine learning algorithm, with a resulting classification accuracy of 99.7% and low false negative (FN) and false positive (FP) rates.
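The classification step uses a standard random forest; the sketch below illustrates only the kind of feature extraction such a classifier is fed. The feature set here is hypothetical, not the paper's exact list:

```python
import re

def phishing_features(email_text):
    """Extract a small, hypothetical phishing feature vector."""
    urls = re.findall(r'https?://[^\s"<>]+', email_text)
    return {
        # URLs whose host is a raw IP address are a classic phishing cue
        "has_ip_url": any(re.match(r'https?://\d{1,3}(\.\d{1,3}){3}', u)
                          for u in urls),
        "num_urls": len(urls),
        "has_urgent_words": bool(re.search(
            r'verify|suspend|urgent|confirm your account', email_text, re.I)),
        "has_html_form": "<form" in email_text.lower(),
    }

feats = phishing_features(
    'Urgent: verify your account at http://192.168.0.1/login now!')
```

Each email becomes one such feature vector, and the labeled vectors are what a random forest (or any other classifier) is trained on.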

  1. Determination of metals in air samples using X-Ray fluorescence associated with the APDC preconcentration technique

    Energy Technology Data Exchange (ETDEWEB)

    Nardes, Raysa C.; Santos, Ramon S.; Sanches, Francis A.C.R.A.; Gama Filho, Hamilton S.; Oliveira, Davi F.; Anjos, Marcelino J., E-mail: rc.nardes@gmail.com, E-mail: ramonziosp@yahoo.com.br, E-mail: francissanches@gmail.com, E-mail: hamiltongamafilho@hotmail.com, E-mail: davi.oliveira@uerj.br, E-mail: marcelin@uerj.br [Universidade do Estado do Rio de Janeiro (UERJ), Rio de Janeiro, RJ (Brazil). Instituto de Fisica. Departamento de Fisica Aplicada e Termodinamica

    2015-07-01

    Air pollution has become one of the leading factors degrading quality of life for people in large urban centers. Studies indicate that suspended particulate matter in the atmosphere is directly associated with risks to public health; in addition, it can damage fauna, flora and public and cultural patrimony. Inhalable particulate matter can cause the emergence and/or worsening of chronic respiratory diseases and other conditions, such as reduced physical stamina. In this study, we propose a new method to measure the concentration of total suspended particulate matter (TSP) in the air, using an impinger as an air cleaning apparatus, preconcentration with APDC, and the Total Reflection X-ray Fluorescence (TXRF) technique to analyze the heavy metals present in the air. The samples were collected from five random points in the city of Rio de Janeiro, Brazil. The TXRF analyses were performed at the Brazilian Synchrotron Light Laboratory (LNLS). The technique proved viable: it detected five metallic elements important to environmental studies, Cr, Fe, Ni, Cu and Zn, and determined the elemental concentration of air pollutants efficiently and at low cost. It can be concluded that this analysis of metals in air samples, using an impinger as the sample collection instrument together with a complexing agent (APDC), is viable because it is inexpensive and detected five metal elements relevant to environmental studies and associated with industrial emissions and urban traffic. (author)

  2. Determination of metals in air samples using X-Ray fluorescence associated with the APDC preconcentration technique

    International Nuclear Information System (INIS)

    Nardes, Raysa C.; Santos, Ramon S.; Sanches, Francis A.C.R.A.; Gama Filho, Hamilton S.; Oliveira, Davi F.; Anjos, Marcelino J.

    2015-01-01

    Air pollution has become one of the leading factors degrading quality of life for people in large urban centers. Studies indicate that suspended particulate matter in the atmosphere is directly associated with risks to public health; in addition, it can damage fauna, flora and public and cultural patrimony. Inhalable particulate matter can cause the emergence and/or worsening of chronic respiratory diseases and other conditions, such as reduced physical stamina. In this study, we propose a new method to measure the concentration of total suspended particulate matter (TSP) in the air, using an impinger as an air cleaning apparatus, preconcentration with APDC, and the Total Reflection X-ray Fluorescence (TXRF) technique to analyze the heavy metals present in the air. The samples were collected from five random points in the city of Rio de Janeiro, Brazil. The TXRF analyses were performed at the Brazilian Synchrotron Light Laboratory (LNLS). The technique proved viable: it detected five metallic elements important to environmental studies, Cr, Fe, Ni, Cu and Zn, and determined the elemental concentration of air pollutants efficiently and at low cost. It can be concluded that this analysis of metals in air samples, using an impinger as the sample collection instrument together with a complexing agent (APDC), is viable because it is inexpensive and detected five metal elements relevant to environmental studies and associated with industrial emissions and urban traffic. (author)

  3. Elemental analysis of brazing alloy samples by neutron activation technique

    International Nuclear Information System (INIS)

    Eissa, E.A.; Rofail, N.B.; Hassan, A.M.; El-Shershaby, A.; Walley El-Dine, N.

    1996-01-01

    Two brazing alloy samples (CP2 and CP3) have been investigated by the neutron activation analysis (NAA) technique in order to identify and estimate their constituent elements. The pneumatic irradiation rabbit system (PIRS), installed at the first Egyptian research reactor (ETRR-1), was used for short-time irradiation (30 s) with a thermal neutron flux of 1.6 x 10^11 n/cm^2/s in the reactor reflector, where the thermal to epithermal neutron flux ratio is 106. Long-time irradiation (48 hours) was performed at the reactor core periphery with a thermal neutron flux of 3.34 x 10^12 n/cm^2/s and a thermal to epithermal neutron flux ratio of 79. Activation by epithermal neutrons was taken into account for the (1/v) and resonance neutron absorption in both methods. A hyperpure germanium detection system was used for gamma-ray acquisitions. The concentration values of Al, Cr, Fe, Co, Cu, Zn, Se, Ag and Sb were estimated as percentages of the sample weight and compared with reported values. 1 tab

  4. Analysis of soil samples from Gebeng area using NAA technique

    Science.gov (United States)

    Elias, Md Suhaimi; Wo, Yii Mei; Hamzah, Mohd Suhaimi; Shukor, Shakirah Abd; Rahman, Shamsiah Ab; Salim, Nazaratul Ashifa Abdullah; Azman, Muhamad Azfar; Hashim, Azian

    2017-01-01

    Rapid development and urbanization increase the number of residential and industrial areas. Without proper management and control of pollution, these will have an adverse effect on the environment and human life. The objective of this study is to identify and quantify key contaminants in the environment of the Gebeng area resulting from industrial and human activities. The Gebeng area was gazetted as one of the industrial estates in Pahang state. Elemental pollution in the soil of the Gebeng area was assessed based on concentration levels, enrichment factors and the geo-accumulation index. The enrichment factors (EFs) were determined by the elemental rationing method, whilst the geo-accumulation index (Igeo) was obtained by comparing current concentrations to the continental crustal average concentration of each element. Twenty-seven soil samples were collected from the Gebeng area and analysed using the Neutron Activation Analysis (NAA) technique. The data showed a higher concentration of iron (Fe), due to its abundance in soil, compared to other elements. The enrichment factor results showed that the Gebeng area is enriched with As, Br, Hf, Sb, Th and U. Based on the geo-accumulation index (Igeo) classification, the soil quality of the Gebeng area can be classified as Class 0 (uncontaminated) to Class 3 (moderately to heavily contaminated).
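Both indices used above follow from short standard formulas: the enrichment factor compares an element/reference-element ratio in the sample with the same ratio in crustal material, and the geo-accumulation index is Igeo = log2(Cn / (1.5 Bn)) with Bn the crustal background. In this sketch the concentrations and crustal values are placeholders, not the study's numbers:

```python
import math

def enrichment_factor(c_elem, c_ref, crust_elem, crust_ref):
    """EF: (element / reference element) in the sample over the same
    ratio in average crustal material."""
    return (c_elem / c_ref) / (crust_elem / crust_ref)

def igeo(c_elem, crust_elem):
    """Geo-accumulation index with the conventional 1.5 background factor."""
    return math.log2(c_elem / (1.5 * crust_elem))

# Placeholder values (ug/g): an As measurement normalized against Fe.
ef = enrichment_factor(c_elem=12.0, c_ref=4.0e4,
                       crust_elem=1.8, crust_ref=5.6e4)
ig = igeo(c_elem=12.0, crust_elem=1.8)   # Igeo between 2 and 3 -> Class 3
```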

  5. Elemental analysis of brazing alloy samples by neutron activation technique

    Energy Technology Data Exchange (ETDEWEB)

    Eissa, E A; Rofail, N B; Hassan, A M [Reactor and Neutron physics Department, Nuclear Research Centre, Atomic Energy Authority, Cairo (Egypt); El-Shershaby, A; Walley El-Dine, N [Physics Department, Faculty of Girls, Ain Shams Universty, Cairo (Egypt)

    1997-12-31

    Two brazing alloy samples (CP2 and CP3) have been investigated by the neutron activation analysis (NAA) technique in order to identify and estimate their constituent elements. The pneumatic irradiation rabbit system (PIRS), installed at the first Egyptian research reactor (ETRR-1), was used for short-time irradiation (30 s) with a thermal neutron flux of 1.6 x 10^11 n/cm^2/s in the reactor reflector, where the thermal to epithermal neutron flux ratio is 106. Long-time irradiation (48 hours) was performed at the reactor core periphery with a thermal neutron flux of 3.34 x 10^12 n/cm^2/s and a thermal to epithermal neutron flux ratio of 79. Activation by epithermal neutrons was taken into account for the (1/v) and resonance neutron absorption in both methods. A hyperpure germanium detection system was used for gamma-ray acquisitions. The concentration values of Al, Cr, Fe, Co, Cu, Zn, Se, Ag and Sb were estimated as percentages of the sample weight and compared with reported values. 1 tab.

  6. Active sampling technique to enhance chemical signature of buried explosives

    Science.gov (United States)

    Lovell, John S.; French, Patrick D.

    2004-09-01

    Deminers and dismounted countermine engineers commonly use metal detectors, ground penetrating radar and probes to locate mines. Many modern landmines have a very low metal content, which severely limits the effectiveness of metal detectors. Canines have also been used for landmine detection for decades. Experiments have shown that canines smell the explosives which are known to leak from most types of landmines. The fact that dogs can detect landmines indicates that vapor sensing is a viable approach to landmine detection. Several groups are currently developing systems to detect landmines by "sniffing" for the ultra-trace explosive vapors above the soil. The amount of material that is available to passive vapor sensing systems is limited to no more than the vapor in equilibrium with the explosive related chemicals (ERCs) distributed in the surface soils over and near the landmine. The low equilibrium vapor pressure of TNT in the soil/atmosphere boundary layer and the limited volume of the boundary layer air imply that passive chemical vapor sensing systems require sensitivities in the picogram range, or lower. ADA is working to overcome many of the limitations of passive sampling methods, by the use of an active sampling method that employs a high-powered (1,200+ joules) strobe lamp to create a highly amplified plume of vapor and/or ERC-bearing fine particulates. Initial investigations have demonstrated that this approach can amplify the detectability of TNT by two or three orders of magnitude. This new active sampling technique could be used with any suitable explosive sensor.

  7. ANALYSIS OF MONTE CARLO SIMULATION SAMPLING TECHNIQUES ON SMALL SIGNAL STABILITY OF WIND GENERATOR- CONNECTED POWER SYSTEM

    Directory of Open Access Journals (Sweden)

    TEMITOPE RAPHAEL AYODELE

    2016-04-01

    Full Text Available Monte Carlo simulation using the Simple Random Sampling (SRS) technique is popularly known for its ability to handle complex uncertainty problems. However, to produce reasonable results it requires a huge sample size, which makes it computationally expensive, time consuming and unfit for online power system applications. In this article, the performance of the Latin Hypercube Sampling (LHS) technique is explored and compared with SRS in terms of accuracy, robustness and speed for a small signal stability application in a wind generator-connected power system. The analysis is performed using probabilistic techniques via eigenvalue analysis on two standard networks (Single Machine Infinite Bus and the IEEE 16-machine 68-bus test system). The accuracy of the two sampling techniques is determined by comparing their different sample sizes with the IDEAL (conventional) result. The robustness is determined from the variance reduction when the experiment is repeated 100 times with different sample sizes using the two sampling techniques in turn. The results show that sample sizes generated from LHS for the small signal stability application produce the same result as the IDEAL values starting from a sample size of 100. This shows that about 100 samples of a random variable generated using the LHS method are good enough to produce reasonable results for practical purposes in small signal stability applications. It is also revealed that LHS has the least variance when the experiment is repeated 100 times, compared to the SRS technique; this signifies the robustness of LHS over SRS. A sample size of 100 from LHS produces the same result as the conventional method with a sample size of 50000. The reduced sample size required by LHS gives it a computational speed advantage (about six times) over the conventional method.
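The two sampling schemes can be sketched side by side: simple random sampling draws every coordinate independently, whereas Latin Hypercube Sampling partitions each dimension into n equal strata, places exactly one point per stratum, and shuffles the strata to decouple the dimensions. A minimal sketch (sample sizes are illustrative):

```python
import random

def srs(n, dim, rng):
    """Simple random sampling: each coordinate drawn independently."""
    return [[rng.random() for _ in range(dim)] for _ in range(n)]

def lhs(n, dim, rng):
    """Latin Hypercube Sampling: one point per stratum [k/n, (k+1)/n)
    in every dimension, strata shuffled independently per dimension."""
    cols = []
    for _ in range(dim):
        col = [(k + rng.random()) / n for k in range(n)]
        rng.shuffle(col)
        cols.append(col)
    return [list(point) for point in zip(*cols)]

rng = random.Random(3)
pts = lhs(100, 2, rng)
# Stratification check: the sorted first coordinates fall one per stratum.
sorted0 = sorted(p[0] for p in pts)
```

The guaranteed per-stratum coverage is what lets LHS match large SRS runs with far fewer samples, as the abstract reports.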

  8. The contribution of simple random sampling to observed variations in faecal egg counts.

    Science.gov (United States)

    Torgerson, Paul R; Paul, Michaela; Lewis, Fraser I

    2012-09-10

    It has been over 100 years since the classical paper published by Gosset in 1907, under the pseudonym "Student", demonstrated that yeast cells suspended in a fluid and measured by a haemocytometer conform to a Poisson process. Similarly, parasite eggs in a faecal suspension conform to a Poisson process. Despite this, there are common misconceptions about how to analyse or interpret observations from the McMaster or similar quantitative parasitic diagnostic techniques, widely used for evaluating parasite eggs in faeces. The McMaster technique can easily be shown from a theoretical perspective to give variable results that inevitably arise from the random distribution of parasite eggs in a well mixed faecal sample. The Poisson processes that lead to this variability are described, with illustrative examples of the potentially large confidence intervals that can arise from faecal egg counts calculated from the observations on a McMaster slide. Attempts to modify the McMaster technique, or indeed other quantitative techniques, to ensure uniform egg counts are doomed to failure and belie ignorance of Poisson processes. A simple method to immediately identify excess variation/poor sampling from replicate counts is provided. Copyright © 2012 Elsevier B.V. All rights reserved.
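The Poisson argument above can be made concrete: a McMaster count of n eggs with multiplication factor f estimates f·n eggs per gram, and because the raw count is Poisson its variance equals n, giving a wide interval for small counts. A sketch using the normal approximation (the exact Poisson interval would differ for small n; the factor of 50 is a common but not universal choice):

```python
import math

def mcmaster_epg(count, factor=50):
    """Point estimate and approximate 95% CI in eggs per gram, from the
    Poisson variance of the raw slide count (normal approximation)."""
    half = 1.96 * math.sqrt(count)
    return (factor * count,
            (factor * max(count - half, 0.0), factor * (count + half)))

epg, (lo, hi) = mcmaster_epg(16)   # 16 eggs counted -> 800 epg, CI ~ 408-1192
```

Even a count of 16 eggs leaves roughly a ±50% interval on the eggs-per-gram estimate, which is the variability the abstract says cannot be removed by modifying the technique.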

  9. Development of analytical techniques for water and environmental samples (2)

    Energy Technology Data Exchange (ETDEWEB)

    Eum, Chul Hun; Jeon, Chi Wan; Jung, Kang Sup; Song, Kyung Sun; Kim, Sang Yeon [Korea Institute of Geology Mining and Materials, Taejon (Korea)

    1998-12-01

    The purpose of this study is to develop new analytical methods with good detection limits for toxic inorganic and organic compounds. The analyses of CN, organic acids and particulate materials in environmental samples have been carried out using several methods, such as ion chromatography, SPE, SPME, GC/MS, GC/FID and SPLITT (split-flow thin cell fractionation), during the second year of this project. The advantages and disadvantages of several distillation methods (KS, JIS, EPA) for CN analysis in wastewater were investigated. As a result, we proposed a new distillation apparatus for CN analysis, which proved to be simpler and faster and to achieve better recovery than conventional apparatus. An ion chromatograph/pulsed amperometric detector (IC/PAD) system, instead of colorimetry, was set up for CN detection to solve matrix interference. SPE (solid phase extraction) and SPME (solid phase micro extraction), as liquid-solid extraction techniques, were applied to the analysis of phenols in wastewater. Optimum experimental conditions and factors influencing the analytical results were determined. From these results, it could be concluded that the C18 cartridge and polystyrene-divinylbenzene disk in the SPE method, and the polyacrylate fiber in SPME, were proper solid phase adsorbents for phenol. Optimum conditions to analyze phenol derivatives simultaneously were established. Also, Continuous SPLITT (split-flow thin cell) Fractionation (CSF) is a new preparative separation technique that is useful for fractionation of particulate and macromolecular materials. CSF is carried out in a thin ribbon-like channel equipped with two splitters, at both the inlet and outlet of the channel. In this work, we set up a new CSF system and tested it using polystyrene latex standard particles. We then fractionated particles contained in air and underground water based on their sedimentation coefficients using CSF. (author). 27 refs., 13 tabs., 31 figs.

  10. Voltammetric technique, a panacea for analytical examination of environmental samples

    International Nuclear Information System (INIS)

    Zahir, E.; Mohiuddin, S.; Naqvi, I.I.

    2012-01-01

    Voltammetric methods for trace metal analysis in environmental samples of marine origin, such as mangrove, sediments and shrimps, are generally recommended. Three different electro-analytical techniques, i.e. polarography, anodic stripping voltammetry (ASV) and adsorptive stripping voltammetry (ADSV), have been used. Cd2+, Pb2+, Cu2+ and Mn2+ were determined through ASV, Cr6+ was analyzed by ADSV, and Fe2+, Zn2+, Ni2+ and Co2+ were determined through polarography. The pairs Fe2+/Zn2+ and Ni2+/Co2+ were determined in two separate runs, while Cd2+, Pb2+ and Cu2+ were analyzed in a single ASV run. The sensitivity and speciation capabilities of voltammetric methods have been employed. Analysis conditions were optimized, including the selection of supporting electrolyte, pH, working electrodes, sweep rate, etc. Stripping voltammetry was adopted for analysis at ultra-trace levels. Statistical parameters for analytical method development, such as selectivity factor, interference, repeatability (0.0065-0.130 µg/g), reproducibility (0.08125-1.625 µg/g), detection limits (0.032-5.06 µg/g), limits of quantification (0.081-12.652 µg/g) and sensitivities (5.636-2.15 nA mL µg^-1), were also determined. The percentage recoveries were found to be between 95-105% using certified reference materials. Real samples from the complex marine environment of the Karachi coastline were also analyzed. The standard addition method was employed wherever any matrix effect was evidenced. (author)

  11. Estimates of Inequality Indices Based on Simple Random, Ranked Set, and Systematic Sampling

    OpenAIRE

    Bansal, Pooja; Arora, Sangeeta; Mahajan, Kalpana K.

    2013-01-01

    Gini index, Bonferroni index, and Absolute Lorenz index are some popular indices of inequality showing different features of inequality measurement. In general, a simple random sampling procedure is commonly used to estimate the inequality indices and for their related inference. The key condition that the samples be drawn via a simple random sampling procedure, though it makes calculations much simpler, is often violated in practice, as the data does not always yield simple random ...

  12. Event-triggered synchronization for reaction-diffusion complex networks via random sampling

    Science.gov (United States)

    Dong, Tao; Wang, Aijuan; Zhu, Huiyun; Liao, Xiaofeng

    2018-04-01

    In this paper, the synchronization problem of reaction-diffusion complex networks (RDCNs) with Dirichlet boundary conditions is considered, where the data is sampled randomly. An event-triggered controller based on the sampled data is proposed, which can reduce the number of control updates and the communication load. Under this strategy, the synchronization problem of the diffusion complex network is equivalently converted to the stability of a reaction-diffusion complex dynamical system with time delay. By using the matrix inequality technique and the Lyapunov method, synchronization conditions for the RDCNs are derived, which are dependent on the diffusion term. Moreover, it is found that the proposed control strategy naturally rules out Zeno behavior. Finally, a numerical example is given to verify the obtained results.

  13. Macro elemental analysis of food samples by nuclear analytical technique

    Science.gov (United States)

    Syahfitri, W. Y. N.; Kurniawati, S.; Adventini, N.; Damastuti, E.; Lestiani, D. D.

    2017-06-01

    Energy-dispersive X-ray fluorescence (EDXRF) spectrometry is a non-destructive, rapid, multi-elemental, accurate, and environmentally friendly analysis method compared with other detection methods. Thus, EDXRF spectrometry is applicable for food inspection. The macro elements calcium and potassium constitute important nutrients required by the human body for optimal physiological functions. Therefore, the determination of Ca and K content in various foods needs to be done. The aim of this work is to demonstrate the applicability of EDXRF for food analysis. The analytical performance of non-destructive EDXRF was compared with that of other analytical techniques: neutron activation analysis (NAA) and atomic absorption spectrometry (AAS). The methods were compared to cross-check the analysis results and to overcome the limitations of each individual method. The results showed that Ca contents found using EDXRF and AAS were not significantly different (p-value 0.9687), whereas the p-value for K between EDXRF and NAA was 0.6575. The correlation between these results was also examined: the Pearson correlations for Ca and K were 0.9871 and 0.9558, respectively. Method validation using SRM NIST 1548a Typical Diet was also applied. The results showed good agreement between methods; therefore the EDXRF method can be used as an alternative for the determination of Ca and K in food samples.
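
    The cross-method agreement reported here rests on Pearson correlation between paired results. As a minimal illustration (the element concentrations below are hypothetical, not the study's data), the coefficient can be computed directly:

    ```python
    import math

    def pearson_r(xs, ys):
        """Pearson correlation between two methods' results on the same samples."""
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
        sy = math.sqrt(sum((y - my) ** 2 for y in ys))
        return cov / (sx * sy)

    # Hypothetical Ca results (mg/100 g) from two methods on five foods.
    edxrf = [120.0, 85.0, 210.0, 95.0, 150.0]
    aas = [118.0, 88.0, 205.0, 97.0, 148.0]
    print(round(pearson_r(edxrf, aas), 3))
    ```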

  14. A new simple technique for improving the random properties of chaos-based cryptosystems

    Science.gov (United States)

    Garcia-Bosque, M.; Pérez-Resa, A.; Sánchez-Azqueta, C.; Celma, S.

    2018-03-01

    A new technique for improving the security of chaos-based stream ciphers has been proposed and tested experimentally. This technique improves the randomness properties of the generated keystream by preventing the system from falling into short-period cycles caused by digitization. In order to test this technique, a stream cipher based on a Skew Tent Map algorithm has been implemented on a Virtex 7 FPGA. The randomness of the keystream generated by this system has been compared to the randomness of the keystream generated by the same system with the proposed randomness-enhancement technique. By subjecting both keystreams to the National Institute of Standards and Technology (NIST) tests, we have shown that our method can considerably improve the randomness of the generated keystreams. Incorporating our randomness-enhancement technique required only 41 extra slices, proving that, apart from being effective, this method is also efficient in terms of area and hardware resources.
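
    The NIST suite mentioned above includes, among others, the frequency (monobit) test; a minimal sketch of that single test (not of the cipher itself) follows, using the standard SP 800-22 formula p = erfc(|S_n| / sqrt(2n)):

    ```python
    import math

    def monobit_pvalue(bits):
        """NIST SP 800-22 frequency (monobit) test: convert bits to +/-1,
        sum them, and derive a p-value; p < 0.01 flags likely non-randomness."""
        n = len(bits)
        s = abs(sum(1 if b else -1 for b in bits))
        return math.erfc(s / math.sqrt(2 * n))

    # A balanced alternating sequence passes; a constant sequence fails.
    alternating = [i % 2 for i in range(1000)]
    constant = [1] * 1000
    print(monobit_pvalue(alternating) > 0.01)  # True
    print(monobit_pvalue(constant) < 0.01)     # True
    ```

    Passing monobit alone is of course necessary, not sufficient; the full suite applies many further tests.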

  15. Random On-Board Pixel Sampling (ROPS) X-Ray Camera

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Zhehui [Los Alamos; Iaroshenko, O. [Los Alamos; Li, S. [Los Alamos; Liu, T. [Fermilab; Parab, N. [Argonne (main); Chen, W. W. [Purdue U.; Chu, P. [Los Alamos; Kenyon, G. [Los Alamos; Lipton, R. [Fermilab; Sun, K.-X. [Nevada U., Las Vegas

    2017-09-25

    Recent advances in compressed sensing theory and algorithms offer new possibilities for high-speed X-ray camera design. In many CMOS cameras, each pixel has an independent on-board circuit that includes an amplifier, noise rejection, signal shaper, an analog-to-digital converter (ADC), and optional in-pixel storage. When X-ray images are sparse, i.e., when one of the following cases is true: (a.) The number of pixels with true X-ray hits is much smaller than the total number of pixels; (b.) The X-ray information is redundant; or (c.) Some prior knowledge about the X-ray images exists, sparse sampling may be allowed. Here we first illustrate the feasibility of random on-board pixel sampling (ROPS) using an existing set of X-ray images, followed by a discussion about signal to noise as a function of pixel size. Next, we describe a possible circuit architecture to achieve random pixel access and in-pixel storage. The combination of a multilayer architecture, sparse on-chip sampling, and computational image techniques, is expected to facilitate the development and applications of high-speed X-ray camera technology.
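
    The sparse-readout idea can be illustrated in software: pick a random subset of pixel addresses and read only those. This is only a toy sketch of the sampling step, not the in-pixel circuit architecture described in the record:

    ```python
    import random

    def random_pixel_sample(image, fraction, seed=0):
        """Read out a random subset of pixel addresses (sparse-sampling sketch)."""
        rows, cols = len(image), len(image[0])
        coords = [(r, c) for r in range(rows) for c in range(cols)]
        k = max(1, int(fraction * len(coords)))
        picked = random.Random(seed).sample(coords, k)
        return {rc: image[rc[0]][rc[1]] for rc in picked}

    # A 20x20 "detector" with a single X-ray hit; read out only 10% of pixels.
    image = [[0] * 20 for _ in range(20)]
    image[5][7] = 1
    readout = random_pixel_sample(image, 0.10)
    print(len(readout))  # 40 of 400 pixels read out
    ```

    In a compressed-sensing pipeline the full image would then be reconstructed from such sparse measurements under a sparsity prior.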

  16. Moving Target Techniques: Cyber Resilience through Randomization, Diversity, and Dynamism

    Science.gov (United States)

    2017-03-03

    attacks. Simply put, these techniques turn systems into moving targets that will be hard for cyber attackers to compromise. MT techniques leverage...been diversified, they can attack it as if it was not diversified at all. Dynamic Data: Techniques in the dynamic data domain change the format

  17. Importance sampling of heavy-tailed iterated random functions

    NARCIS (Netherlands)

    B. Chen (Bohan); C.H. Rhee (Chang-Han); A.P. Zwart (Bert)

    2016-01-01

    We consider a stochastic recurrence equation of the form $Z_{n+1} = A_{n+1} Z_n + B_{n+1}$, where $\mathbb{E}[\log A_1]<0$, $\mathbb{E}[\log^+ B_1]<\infty$ and $\{(A_n,B_n)\}_{n\in\mathbb{N}}$ is an i.i.d. sequence of positive random vectors. The stationary distribution of this Markov ...

  18. Effectiveness of three different oral hygiene techniques on Viridans streptococci: A randomized controlled trial

    Directory of Open Access Journals (Sweden)

    N Naveen

    2016-01-01

    Introduction: Tongue cleaning is an important aspect of oral hygiene maintenance, along with other mechanical and chemical aids. These methods influence the microorganism count in saliva. Aim: To assess the effectiveness of three different oral hygiene techniques on Viridans streptococci. Materials and Methods: This was a randomized controlled trial with 45 study subjects aged between 14 and 16 years, randomly allocated into three groups: Group A - plastic tongue scraper, Group B - chlorhexidine mouthwash along with plastic tongue scraper, and Group C - chlorhexidine mouthwash. Unstimulated salivary samples were collected on the 1st, 7th, and 15th day before routine oral hygiene practices. Saliva samples were collected and incubated for 48 h on Mitis Salivarius (MS) agar. Streptococcus mitis, Streptococcus mutans, and Streptococcus salivarius were counted. Data were analyzed using descriptive and inferential statistics. Results: The mean counts of S. mitis, S. mutans, and S. salivarius for Groups A, B, and C differed significantly (P < 0.001) when compared between the 1st, 7th, and 15th day. Between-group comparisons revealed a significant difference between Groups A and C, and B and C (P < 0.001). Conclusion: There was a significant reduction in bacterial count in all the participants, indicating that all three methods are useful in improving oral hygiene. The combination technique was found to be most effective.

  19. TECHNIQUES WITH POTENTIAL FOR HANDLING ENVIRONMENTAL SAMPLES IN CAPILLARY ELECTROPHORESIS

    Science.gov (United States)

    An assessment of the methods for handling environmental samples prior to capillary electrophoresis (CE) is presented for both aqueous and solid matrices. Sample handling in environmental analyses is the subject of ongoing research at the Environmental Protection Agency's National...

  20. Multiple sensitive estimation and optimal sample size allocation in the item sum technique.

    Science.gov (United States)

    Perri, Pier Francesco; Rueda García, María Del Mar; Cobo Rodríguez, Beatriz

    2018-01-01

    For surveys of sensitive issues in life sciences, statistical procedures can be used to reduce nonresponse and social desirability response bias. Both of these phenomena provoke nonsampling errors that are difficult to deal with and can seriously flaw the validity of the analyses. The item sum technique (IST) is a very recent indirect questioning method derived from the item count technique that seeks to procure more reliable responses on quantitative items than direct questioning while preserving respondents' anonymity. This article addresses two important questions concerning the IST: (i) its implementation when two or more sensitive variables are investigated and efficient estimates of their unknown population means are required; (ii) the determination of the optimal sample size to achieve minimum variance estimates. These aspects are of great relevance for survey practitioners engaged in sensitive research and, to the best of our knowledge, were not studied so far. In this article, theoretical results for multiple estimation and optimal allocation are obtained under a generic sampling design and then particularized to simple random sampling and stratified sampling designs. Theoretical considerations are integrated with a number of simulation studies based on data from two real surveys and conducted to ascertain the efficiency gain derived from optimal allocation in different situations. One of the surveys concerns cannabis consumption among university students. Our findings highlight some methodological advances that can be obtained in life sciences IST surveys when optimal allocation is achieved. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
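
    Under stratified sampling, minimum-variance allocation of a fixed total sample size is classically given by Neyman allocation, n_h proportional to N_h * S_h. The sketch below shows that textbook rule with invented strata, not the paper's IST-specific derivation:

    ```python
    def neyman_allocation(n, strata):
        """Minimum-variance (Neyman) allocation of total sample size n:
        n_h proportional to N_h * S_h (stratum size times stratum SD)."""
        weights = [N * S for N, S in strata]
        total = sum(weights)
        alloc = [round(n * w / total) for w in weights]
        alloc[-1] += n - sum(alloc)  # absorb rounding drift in the last stratum
        return alloc

    # Three hypothetical strata given as (size N_h, standard deviation S_h):
    # larger and more variable strata receive more of the sample.
    print(neyman_allocation(100, [(500, 2.0), (300, 4.0), (200, 1.0)]))  # [42, 50, 8]
    ```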

  1. Manipulation of biological samples using micro and nano techniques

    DEFF Research Database (Denmark)

    Castillo, Jaime; Dimaki, Maria; Svendsen, Winnie Edith

    2009-01-01

    to their natural structure. Thanks to the advances in micro- and nanofabrication during the last decades several manipulation techniques offer us the possibility to image, characterize and manipulate biological material in a controlled way. Using these techniques the integration of biomaterials with remarkable...

  2. A simple air sampling technique for monitoring nitrous oxide pollution

    Energy Technology Data Exchange (ETDEWEB)

    Austin, J C; Shaw, R; Moyes, D; Cleaton-Jones, P E

    1981-01-01

    A simple, inexpensive device for the continuous low-flow sampling of air was devised to permit monitoring of pollution by gaseous anaesthetics. The device consisted of a water-filled Perspex cylinder in which a double-walled flexible-film gas sample collection bag was suspended. Air samples could be aspirated into the collection bag at flow rates as low as 1 ml min⁻¹ by allowing the water to drain from the cylinder at a controlled rate. The maintenance of sample integrity during aspiration and storage of samples of nitrous oxide in air at concentrations of 1000, 100 and 30 p.p.m. v/v was examined using gas chromatography. The sample bags retained a mean 94% of the nitrous oxide in air samples containing nitrous oxide 25 p.p.m. over a 72-h storage period.

  3. Determination of Initial Conditions for the Safety Analysis by Random Sampling of Operating Parameters

    International Nuclear Information System (INIS)

    Jeong, Hae-Yong; Park, Moon-Ghu

    2015-01-01

    In most existing evaluation methodologies, which follow a conservative approach, the most conservative initial conditions are searched for each transient scenario through extensive assessment of wide operating windows or limiting conditions for operation (LCO) allowed by the operating guidelines. In this procedure, a user effect could be involved and considerable time and human resources are consumed. In the present study, we investigated a more effective statistical method for the selection of the most conservative initial condition through random sampling of the operating parameters affecting the initial conditions. A method for the determination of initial conditions based on random sampling of plant design parameters is proposed. This method is expected to be applied to the selection of the most conservative initial plant conditions in safety analysis using a conservative evaluation methodology. In the method, it is suggested that the initial conditions of reactor coolant flow rate, pressurizer level, pressurizer pressure, and SG level be adjusted by controlling the pump rated flow and the setpoints of the PLCS, PPCS, and FWCS, respectively. The proposed technique is expected to help eliminate the human factors introduced in the conventional safety analysis procedure and also to reduce the human resources invested in the safety evaluation of nuclear power plants
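
    The random-sampling step described here can be sketched as drawing operating parameters uniformly within their allowed windows and keeping the most conservative draw under some figure of merit. The parameter names, windows, and merit function below are hypothetical placeholders, not values from the study:

    ```python
    import random

    def most_conservative_draw(windows, n_draws, merit, seed=1):
        """Sample operating parameters uniformly within their allowed windows
        and keep the draw that maximizes the conservatism figure of merit."""
        rng = random.Random(seed)
        draws = ({name: rng.uniform(lo, hi) for name, (lo, hi) in windows.items()}
                 for _ in range(n_draws))
        return max(draws, key=merit)

    # Hypothetical windows; here "conservative" = highest power-to-flow ratio.
    windows = {"core_flow": (95.0, 105.0), "power": (98.0, 102.0)}
    worst = most_conservative_draw(windows, 1000,
                                   lambda d: d["power"] / d["core_flow"])
    print(all(lo <= worst[k] <= hi for k, (lo, hi) in windows.items()))  # True
    ```

    A real application would replace the merit function with a safety-analysis code run per sampled condition.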

  4. Equipment and techniques for remote sampling of stored radioactive waste

    International Nuclear Information System (INIS)

    Nance, T.A.

    1996-01-01

    Several tools have been developed at the Savannah River Site (SRS) to remotely sample stored radioactive waste. These sampling tools have been developed to determine the chemical characteristics of the waste prior to processing. The processing of waste material varies according to the chemical characteristics of the waste, which change due to additions, settling, mixing, and chemical reactions during storage. Once the waste has been sampled to identify its characteristics, the chemical composition of the waste can then be altered if needed to prepare for processing. Various types of waste material in several types of containment must be sampled at SRS. Stored waste materials consist of liquids, floating organics, sludge, salt and solids. Waste is stored in four basic types of tanks with different means of access and interior obstructions. The waste tanks can only be accessed by small openings: access ports, risers and downcomers. Requirements for sampling depend on the type of tank being accessed, the waste within the tank, and the particular location in the tank desired for taking the sample. Sampling devices have been developed to sample all of the waste material forms found in the SRS tank farms. The fluid type samplers are capable of sampling surface liquid, subsurface liquid at varying depth, surface sludge, subsurface sludge, and floating organics. The solid type samplers are capable of sampling salt, sampling a solid layer on the bottom of the tank, and capturing a small solid mass on the tank bottom. The sampling devices are all designed to access the tanks through small access ports. The samplers are reusable and are designed to allow quick transfer of the samples to shielded packaging for transport, reducing the amount of radiation exposure to sampling personnel. The samplers weigh less than 100 lb. and are designed in sections to allow easy disassembly for storage and transport by personnel. (Abstract Truncated)

  5. Correlated random sampling for multivariate normal and log-normal distributions

    International Nuclear Information System (INIS)

    Žerovnik, Gašper; Trkov, Andrej; Kodeli, Ivan A.

    2012-01-01

    A method for correlated random sampling is presented. Representative samples for multivariate normal or log-normal distribution can be produced. Furthermore, any combination of normally and log-normally distributed correlated variables may be sampled to any requested accuracy. Possible applications of the method include sampling of resonance parameters which are used for reactor calculations.
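
    A standard way to realize correlated normal sampling is to apply a Cholesky factor of the covariance matrix to independent standard normals; log-normal components are then obtained by exponentiating the corresponding coordinates. A minimal sketch of that textbook approach (not necessarily the paper's exact algorithm):

    ```python
    import math
    import random

    def cholesky(cov):
        """Lower-triangular Cholesky factor of a positive-definite matrix."""
        n = len(cov)
        L = [[0.0] * n for _ in range(n)]
        for i in range(n):
            for j in range(i + 1):
                s = sum(L[i][k] * L[j][k] for k in range(j))
                if i == j:
                    L[i][j] = math.sqrt(cov[i][i] - s)
                else:
                    L[i][j] = (cov[i][j] - s) / L[j][j]
        return L

    def correlated_normal(mean, cov, rng):
        """One draw from N(mean, cov); exponentiate components for log-normals."""
        L = cholesky(cov)
        z = [rng.gauss(0.0, 1.0) for _ in mean]
        return [mean[i] + sum(L[i][k] * z[k] for k in range(i + 1))
                for i in range(len(mean))]

    rng = random.Random(42)
    cov = [[1.0, 0.8], [0.8, 1.0]]
    draws = [correlated_normal([0.0, 0.0], cov, rng) for _ in range(20000)]
    # Sample E[xy] estimates the correlation (means are 0, variances 1 here).
    r = sum(x * y for x, y in draws) / len(draws)
    print(abs(r - 0.8) < 0.05)  # True
    ```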

  6. Assessment of fracture risk: value of random population-based samples--the Geelong Osteoporosis Study.

    Science.gov (United States)

    Henry, M J; Pasco, J A; Seeman, E; Nicholson, G C; Sanders, K M; Kotowicz, M A

    2001-01-01

    Fracture risk is determined by bone mineral density (BMD). The T-score, a measure of fracture risk, is the position of an individual's BMD in relation to a reference range. The aim of this study was to determine the magnitude of change in the T-score when different sampling techniques were used to produce the reference range. Reference ranges were derived from three samples, drawn from the same region: (1) an age-stratified population-based random sample, (2) unselected volunteers, and (3) a selected healthy subset of the population-based sample with no diseases or drugs known to affect bone. T-scores were calculated using the three reference ranges for a cohort of women who had sustained a fracture and as a group had a low mean BMD (ages 35-72 yr; n = 484). For most comparisons, the T-scores for the fracture cohort were more negative using the population reference range. The difference in T-scores reached 1.0 SD. The proportion of the fracture cohort classified as having osteoporosis at the spine was 26, 14, and 23% when the population, volunteer, and healthy reference ranges were applied, respectively. The use of inappropriate reference ranges results in substantial changes to T-scores and may lead to inappropriate management.
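
    A T-score is simply the individual's BMD expressed in standard-deviation units of the chosen reference range, which is why the choice of reference sample matters; with hypothetical numbers:

    ```python
    def t_score(bmd, ref_mean, ref_sd):
        """T-score: an individual's BMD in SD units of a reference range."""
        return (bmd - ref_mean) / ref_sd

    # The same measured BMD shifts by half an SD here depending on which
    # reference range supplies the mean (all values invented).
    print(round(t_score(0.85, 1.00, 0.10), 2))  # -1.5 (population-based range)
    print(round(t_score(0.85, 0.95, 0.10), 2))  # -1.0 (volunteer-derived range)
    ```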

  7. A modified random decrement technique for modal identification from nonstationary ambient response data only

    International Nuclear Information System (INIS)

    Lin, Chang Sheng; Chiang, Dar Yun

    2012-01-01

    Modal identification is considered from response data of structural system under nonstationary ambient vibration. In a previous paper, we showed that by assuming the ambient excitation to be nonstationary white noise in the form of a product model, the nonstationary response signals can be converted into free-vibration data via the correlation technique. In the present paper, if the ambient excitation can be modeled as a nonstationary white noise in the form of a product model, then the nonstationary cross random decrement signatures of structural response evaluated at any fixed time instant are shown theoretically to be proportional to the nonstationary cross-correlation functions. The practical problem of insufficient data samples available for evaluating nonstationary random decrement signatures can be approximately resolved by first extracting the amplitude-modulating function from the response and then transforming the nonstationary responses into stationary ones. Modal-parameter identification can then be performed using the Ibrahim time-domain technique, which is effective at identifying closely spaced modes. The theory proposed can be further extended by using the filtering concept to cover the case of nonstationary color excitations. Numerical simulations confirm the validity of the proposed method for identification of modal parameters from nonstationary ambient response data
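
    The classical (stationary) random decrement signature, of which this paper develops a nonstationary extension, averages the response segments that follow each upward crossing of a trigger level. A minimal sketch of that baseline idea on synthetic data:

    ```python
    import math
    import random

    def random_decrement(signal, level, length):
        """Average the response segments following each upward crossing of
        the trigger level -- the (stationary) random decrement signature."""
        segments = [signal[i:i + length]
                    for i in range(1, len(signal) - length)
                    if signal[i - 1] < level <= signal[i]]
        return [sum(col) / len(segments) for col in zip(*segments)]

    # Noisy oscillation: the signature starts at or above the trigger level.
    rng = random.Random(0)
    sig = [math.sin(0.3 * t) + 0.2 * rng.gauss(0.0, 1.0) for t in range(5000)]
    rd = random_decrement(sig, 0.9, 30)
    print(len(rd))       # 30
    print(rd[0] >= 0.9)  # True by the triggering condition
    ```

    The averaged signature approximates free-decay behaviour, from which modal parameters can then be identified.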

  8. Comparison of sampling techniques for Rift Valley Fever virus ...

    African Journals Online (AJOL)

    time for trapping potential vectors for Rift Valley Fever virus. ... Krockel, U., Rose, A., Eiras, A.E. & Geier, M. (2006) New tools for surveillance of adult yellow fever ... baited trapping systems for sampling outdoor mosquito populations in ...

  9. A non-destructive DNA sampling technique for herbarium specimens.

    Science.gov (United States)

    Shepherd, Lara D

    2017-01-01

    Herbarium specimens are an important source of DNA for plant research but current sampling methods require the removal of material for DNA extraction. This is undesirable for irreplaceable specimens such as rare species or type material. Here I present the first non-destructive sampling method for extracting DNA from herbarium specimens. DNA was successfully retrieved from robust leaves and/or stems of herbarium specimens up to 73 years old.

  10. Sample preparation techniques of biological material for isotope analysis

    International Nuclear Information System (INIS)

    Axmann, H.; Sebastianelli, A.; Arrillaga, J.L.

    1990-01-01

    Sample preparation is an essential step in all isotope-aided experiments but is often not given enough attention. The methods of sample preparation are very important for obtaining reliable and precise analytical data and for the further interpretation of results. The size of a sample required for chemical analysis is usually very small (10 mg-1500 mg). On the other hand, the amount of harvested plant material from plots in a field experiment is often bulky (several kilograms) and the entire sample is too large for processing. In addition, while approaching maturity many crops show not only differences in physical consistency but also a non-uniformity in ¹⁵N content among plant parts, requiring plant fractionation or separation into parts (vegetative and reproductive), e.g. shoots and spikes in the case of small-grain cereals, shoots and pods in the case of grain legumes, and tops and roots or beets (including crown) in the case of sugar beet, etc. In any case, the ultimate goal of these procedures is to obtain a representative subsample of the material harvested from greenhouse or field experiments for chemical analysis. Before harvesting an isotope-aided experiment the method of sampling has to be selected. It should be based on the type of information required in relation to the objectives of the research and the availability of resources (staff, sample preparation equipment, analytical facilities, chemicals and supplies, etc.). 10 refs, 3 figs, 3 tabs

  11. Comparison of sampling techniques for Bayesian parameter estimation

    Science.gov (United States)

    Allison, Rupert; Dunkley, Joanna

    2014-02-01

    The posterior probability distribution for a set of model parameters encodes all that the data have to tell us in the context of a given model; it is the fundamental quantity for Bayesian parameter estimation. In order to infer the posterior probability distribution we have to decide how to explore parameter space. Here we compare three prescriptions for how parameter space is navigated, discussing their relative merits. We consider Metropolis-Hastings sampling, nested sampling and affine-invariant ensemble Markov chain Monte Carlo (MCMC) sampling. We focus on their performance on toy-model Gaussian likelihoods and on a real-world cosmological data set. We outline the sampling algorithms themselves and elaborate on performance diagnostics such as convergence time, scope for parallelization, dimensional scaling, requisite tunings and suitability for non-Gaussian distributions. We find that nested sampling delivers high-fidelity estimates for posterior statistics at low computational cost, and should be adopted in favour of Metropolis-Hastings in many cases. Affine-invariant MCMC is competitive when computing clusters can be utilized for massive parallelization. Affine-invariant MCMC and existing extensions to nested sampling naturally probe multimodal and curving distributions.
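
    Of the three samplers compared, Metropolis-Hastings is the simplest to sketch; a minimal 1-D random-walk version on a toy Gaussian posterior (not the cosmological setup used in the paper):

    ```python
    import math
    import random

    def metropolis_hastings(log_post, x0, n_steps, step, seed=0):
        """1-D random-walk Metropolis-Hastings: propose x' = x + N(0, step),
        accept with probability min(1, post(x') / post(x))."""
        rng = random.Random(seed)
        x, lp = x0, log_post(x0)
        chain = []
        for _ in range(n_steps):
            prop = x + rng.gauss(0.0, step)
            lp_prop = log_post(prop)
            if rng.random() < math.exp(min(0.0, lp_prop - lp)):
                x, lp = prop, lp_prop
            chain.append(x)
        return chain

    # Toy Gaussian "posterior" N(3, 1): after burn-in the chain mean nears 3.
    chain = metropolis_hastings(lambda x: -0.5 * (x - 3.0) ** 2, 0.0, 50000, 1.0)
    post_mean = sum(chain[5000:]) / len(chain[5000:])
    print(abs(post_mean - 3.0) < 0.1)  # True
    ```

    The step size is the "requisite tuning" the abstract alludes to: too small and the chain diffuses slowly, too large and proposals are rarely accepted.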

  12. A randomized controlled trial of nasolaryngoscopy training techniques.

    Science.gov (United States)

    Smith, Matthew E; Leung, Billy C; Sharma, Rishi; Nazeer, Sammar; McFerran, Don J

    2014-09-01

    Flexible nasolaryngoscopy is an essential skill for otolaryngology trainees to develop, but there is a lack of standardized training for this procedure. The aim of this study was to assess whether training on a realistic human mannequin together with structured video feedback improved trainees' performance at flexible nasolaryngoscopy. Three-armed, single-blinded, randomized controlled study. Thirty-six junior doctors and final-year medical students were randomly allocated to one of three groups. All received a lecture and video presentation on flexible nasolaryngoscopy. One group received additional tuition using a training mannequin. The last group received mannequin training and feedback on their performance using a video recording. The trainees then undertook flexible nasolaryngoscopy on volunteers, with these endoscopies recorded. Blinded observers scored the trainees on a range of objective and subjective measures. The volunteers, who were also blinded to the candidates' training, scored the comfort of the procedure. Adding mannequin training alone showed a trend toward improved performance but did not reach statistical significance. Mannequin training together with video feedback produced significant improvement in patient comfort (P = .0065), time to reach the vocal folds (P = .017), and global ability (P = .0006). Inter-rater reliability was excellent. Training using an anatomically correct model of the upper airway, together with formalized video-assisted feedback on that training, is a simple and effective way to improve endoscopy skills prior to starting flexible nasolaryngoscopy on patients. © 2014 The American Laryngological, Rhinological and Otological Society, Inc.

  13. An alternative procedure for estimating the population mean in simple random sampling

    Directory of Open Access Journals (Sweden)

    Housila P. Singh

    2012-03-01

    This paper deals with the problem of estimating the finite population mean using auxiliary information in simple random sampling. First, we suggest a correction to the mean squared error of the estimator proposed by Gupta and Shabbir [On improvement in estimating the population mean in simple random sampling. Jour. Appl. Statist. 35(5) (2008), pp. 559-566]. We then propose a ratio-type estimator and study its properties in simple random sampling. Numerically, we show that the proposed class of estimators is more efficient than various known estimators, including that of Gupta and Shabbir (2008).
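
    The ratio-type estimation discussed here builds on the classical ratio estimator of the population mean, y_R = (sum y / sum x) * X_bar, which exploits an auxiliary variable x with known population mean. A minimal sketch with made-up sample values:

    ```python
    def ratio_estimate(y_sample, x_sample, x_pop_mean):
        """Classical ratio estimator of the population mean of y under SRS,
        using an auxiliary variable x with known population mean."""
        r = sum(y_sample) / sum(x_sample)
        return r * x_pop_mean

    # Invented sample in which y is roughly proportional to x.
    y = [12.0, 20.5, 30.2, 41.0]
    x = [3.0, 5.0, 7.5, 10.0]
    print(round(ratio_estimate(y, x, 6.0), 3))  # 24.4
    ```

    When y and x are strongly positively correlated, this estimator typically beats the plain sample mean of y.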

  14. Health plan auditing: 100-percent-of-claims vs. random-sample audits.

    Science.gov (United States)

    Sillup, George P; Klimberg, Ronald K

    2011-01-01

    The objective of this study was to examine the relative efficacy of two different methodologies for auditing self-funded medical claim expenses: 100-percent-of-claims auditing versus random-sampling auditing. Multiple data sets of claim errors or 'exceptions' from two Fortune-100 corporations were analysed and compared to 100 simulated audits of 300- and 400-claim random samples. Random-sample simulations failed to identify a significant number and amount of the errors that ranged from $200,000 to $750,000. These results suggest that health plan expenses of corporations could be significantly reduced if they audited 100% of claims and embraced a zero-defect approach.
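
    The gap between the two audit methodologies can be illustrated with a small simulation; the claim counts and error values below are invented, not the corporations' data:

    ```python
    import random

    def simulate_audit(claims, sample_size, trials=1000, seed=0):
        """Compare a random-sample audit against a 100%-of-claims audit:
        return total error value and the average value a sample audit misses."""
        rng = random.Random(seed)
        total_error = sum(claims)
        missed = []
        for _ in range(trials):
            caught = sum(rng.sample(claims, sample_size))
            missed.append(total_error - caught)
        return total_error, sum(missed) / trials

    # 10,000 claims; 1% carry a $500 error each; audit a 300-claim sample.
    claims = [500.0 if i % 100 == 0 else 0.0 for i in range(10000)]
    total, avg_missed = simulate_audit(claims, 300)
    print(total)                     # 50000.0 of error value in the population
    print(avg_missed > 0.9 * total)  # True: most error value escapes the sample
    ```

    The sample audit recovers, in expectation, only the sampled fraction of the error value, which is the effect the study quantifies.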

  15. Fast egg collection method greatly improves randomness of egg sampling in Drosophila melanogaster

    DEFF Research Database (Denmark)

    Schou, Mads Fristrup

    2013-01-01

    When obtaining samples for population genetic studies, it is essential that the sampling is random. For Drosophila, one of the crucial steps in sampling experimental flies is the collection of eggs. Here an egg collection method is presented, which randomizes the eggs in a water column...... and diminishes environmental variance. This method was compared with a traditional egg collection method where eggs are collected directly from the medium. Within each method the observed and expected standard deviations of egg-to-adult viability were compared, whereby the difference in the randomness...... and to obtain a representative collection of genotypes, the method presented here is strongly recommended when collecting eggs from Drosophila....

  16. Gray bootstrap method for estimating frequency-varying random vibration signals with small samples

    Directory of Open Access Journals (Sweden)

    Wang Yanqing

    2014-04-01

    During environmental testing, the estimation of random vibration signals (RVS) is an important technique for airborne platform safety and reliability. However, the available methods, including the extreme value envelope method (EVEM), the statistical tolerance method (STM) and the improved statistical tolerance method (ISTM), require large samples and a typical probability distribution. Moreover, the frequency-varying characteristic of RVS is usually not taken into account. The gray bootstrap method (GBM) is proposed to solve the problem of estimating frequency-varying RVS with small samples. First, the estimation indexes are obtained, including the estimated interval, the estimated uncertainty, the estimated value, the estimated error and the estimated reliability. In addition, GBM is applied to estimating data from a single flight test of a certain aircraft. Finally, in order to evaluate estimation performance, GBM is compared with the bootstrap method (BM) and the gray method (GM). The results show that GBM is superior for estimating dynamic signals with small samples, and the estimated reliability is proved to be 100% at the given confidence level.
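
    The bootstrap component of GBM resamples the small sample with replacement to quantify estimation uncertainty. A minimal sketch of a plain percentile bootstrap (not the gray-model extension), with invented data:

    ```python
    import random

    def bootstrap_interval(sample, stat, n_boot=2000, alpha=0.05, seed=0):
        """Percentile bootstrap: resample with replacement, recompute the
        statistic, and take empirical quantiles as the interval."""
        rng = random.Random(seed)
        reps = sorted(stat([rng.choice(sample) for _ in sample])
                      for _ in range(n_boot))
        return (reps[int(n_boot * alpha / 2)],
                reps[int(n_boot * (1 - alpha / 2)) - 1])

    # Hypothetical small vibration-amplitude sample; interval for its mean.
    data = [1.2, 0.9, 1.5, 1.1, 1.3, 0.8, 1.4, 1.0]
    mean = lambda xs: sum(xs) / len(xs)
    lo, hi = bootstrap_interval(data, mean)
    print(lo < mean(data) < hi)  # True
    ```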

  17. Study of gastric cancer samples using terahertz techniques

    Science.gov (United States)

    Wahaia, Faustino; Kasalynas, Irmantas; Seliuta, Dalius; Molis, Gediminas; Urbanowicz, Andrzej; Carvalho Silva, Catia D.; Carneiro, Fatima; Valusis, Gintaras; Granja, Pedro L.

    2014-08-01

    In the present work, samples of healthy and adenocarcinoma-affected human gastric tissue were analyzed using transmission time-domain THz spectroscopy (THz-TDS) and spectroscopic THz imaging at 201 and 590 GHz. The work shows that it is possible to distinguish between normal and cancerous regions in dried and paraffin-embedded samples. Plots of absorption coefficient α and refractive index n of normal and cancer affected tissues, as well as 2-D transmission THz images are presented and the conditions for discrimination between normal and affected tissues are discussed.

  18. Electrodeposition as a sample preparation technique for TXRF analysis

    International Nuclear Information System (INIS)

    Griesel, S.; Reus, U.; Prange, A.

    2000-01-01

    TXRF analysis of trace elements at concentrations in the μg/L range and below in high-salt matrices normally requires a number of sample preparation steps that include separation of the salt matrix and preconcentration of the trace elements. A neat approach which allows samples to be prepared straightforwardly in a single step involves the application of electrochemical deposition using the TXRF sample support itself as an electrode. For this work a common three-electrode arrangement (Radiometer Analytical) with a rotating disc electrode as the working electrode, as is frequently employed in voltammetric analysis, has been used. A special electrode tip has been constructed as a holder for the sample carrier, which consists of polished glassy carbon. This material has proven suitable in terms of both its electrical and chemical properties. Measurements of the trace elements were performed using the ATOMIKA 8030C TXRF spectrometer, with the option of variable incident angles. In initial experiments an artificial sea water matrix containing various trace elements in the μg/L range was used. Elements such as Cr, Mn, Fe, Co, Ni, Cu, Zn, Ag, Cd, Hg, and Pb were deposited on glassy carbon carriers. The deposition can be optimized by controlling the potential of the working electrode with respect to the reference electrode. Metal ions with a suitable standard potential are reduced to the metallic state and plated onto the electrode surface. When deposition is finished the sample carrier is demounted, rinsed with ultra-pure water and measured directly. Deposition yields for the elements under investigation are quite similar, and with an appropriate choice of the reference element, quantification can be achieved directly by internal standardization. The influence of parameters such as time, pH value, and trace element concentration on the deposition yield has been examined, and the results will be presented along with reproducibility studies. (author)

  19. 40 CFR 761.306 - Sampling 1 meter square surfaces by random selection of halves.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 30 2010-07-01 2010-07-01 false Sampling 1 meter square surfaces by...(b)(3) § 761.306 Sampling 1 meter square surfaces by random selection of halves. (a) Divide each 1 meter square portion where it is necessary to collect a surface wipe test sample into two equal (or as...

  20. Bioassay techniques for {sup 55}Fe in urine samples

    Energy Technology Data Exchange (ETDEWEB)

    Cregan, S P; Leon, J W; Linauskas, S H

    1993-11-01

    Solvent extraction, ion chromatography and several rapid screening methods were developed and evaluated for {sup 55}Fe bioassay applications. Isopropyl ether and TNOA column extractions had radiochemical recoveries exceeding 90%. These were very reproducible, with a coefficient of variation less than 5%. Screening techniques investigated included direct counting of ashed urine solids and of Fe(OH){sub 3} precipitated from urine. The sensitivities (2-50 Bq/d urine) of the screening methods were usually limited by the effective urine volume that could be counted in a liquid scintillation counter. The reference isopropyl ether and chromatography methods could easily achieve sensitivities well below the 1 Bq/d urine output target. (author). 49 refs., 3 tabs., 5 figs.

  1. Bioassay techniques for 55Fe in urine samples

    International Nuclear Information System (INIS)

    Cregan, S.P.; Leon, J.W.; Linauskas, S.H.

    1993-11-01

    Solvent extraction, ion chromatography and several rapid screening methods were developed and evaluated for 55Fe bioassay applications. Isopropyl ether and TNOA column extractions had radiochemical recoveries exceeding 90%. These were very reproducible, with a coefficient of variation less than 5%. Screening techniques investigated included direct counting of ashed urine solids and of Fe(OH)3 precipitated from urine. The sensitivities (2-50 Bq/d urine) of the screening methods were usually limited by the effective urine volume that could be counted in a liquid scintillation counter. The reference isopropyl ether and chromatography methods could easily achieve sensitivities well below the 1 Bq/d urine output target. (author). 49 refs., 3 tabs., 5 figs.

  2. Image acquisition system using on sensor compressed sampling technique

    Science.gov (United States)

    Gupta, Pravir Singh; Choi, Gwan Seong

    2018-01-01

    Advances in CMOS technology have made high-resolution image sensors possible. These image sensors pose significant challenges in terms of the amount of raw data generated, energy efficiency, and frame rate. This paper presents a design methodology for an imaging system and a simplified image sensor pixel design to be used in the system so that the compressed sensing (CS) technique can be implemented easily at the sensor level. This results in significant energy savings as it not only cuts the raw data rate but also reduces transistor count per pixel; decreases pixel size; increases fill factor; simplifies analog-to-digital converter, JPEG encoder, and JPEG decoder design; decreases wiring; and reduces the decoder size by half. Thus, CS has the potential to increase the resolution of image sensors for a given technology and die size while significantly decreasing the power consumption and design complexity. We show that it has potential to reduce power consumption by about 23% to 65%.
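    The compressed sensing (CS) acquisition idea underlying this abstract can be illustrated with a minimal sketch: the sensor reports a small number of random projections of a pixel block instead of every raw pixel. The block sizes and the random ±1 (Bernoulli) measurement matrix are illustrative assumptions, not the authors' sensor design.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 64   # pixels in a (flattened) 8x8 image block
m = 16   # compressed measurements -- a 4x reduction in raw data

# Random +/-1 (Bernoulli) measurement matrix, a common CS choice because
# it is cheap to realize at the sensor level.
phi = rng.choice([-1.0, 1.0], size=(m, n))

x = rng.random(n)   # stand-in for the pixel block's intensities
y = phi @ x         # m measurements read out instead of n raw pixels

assert y.shape == (m,)
```

A reconstruction stage (e.g. sparse recovery in a transform basis) would recover an approximation of `x` from `y` off-sensor; only the measurement side is sketched here.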

  3. Monitoring well design and sampling techniques at NAPL sites

    International Nuclear Information System (INIS)

    Collins, M.; Rohrman, W.R.; Drake, K.D.

    1992-01-01

    The existence of Non-Aqueous Phase Liquids (NAPLs) at many Superfund and RCRA hazardous waste sites has become a recognized problem in recent years. The large number of sites exhibiting this problem results from the fact that many of the most frequently used industrial solvents and petroleum products can exist as NAPLs. Hazardous waste constituents occurring as NAPLs possess a common characteristic that causes great concern during groundwater contamination evaluation: while solubility in water is generally very low, it is sufficient to cause groundwater to exceed Maximum Contaminant Levels (MCLs). Thus, even a small quantity of NAPL within a groundwater regime can act as a point source with the ability to contaminate vast quantities of groundwater over time. This property makes it imperative that groundwater investigations focus heavily on characterizing the nature, extent, and migration pathways of NAPLs at sites where they exist. Two types of NAPLs may exist in a groundwater system. Water-immiscible liquid constituents having a specific gravity greater than one are termed Dense Non-Aqueous Phase Liquids, while those with a specific gravity less than one are considered Light Non-Aqueous Phase Liquids. For a groundwater investigation to properly characterize the two types of NAPLs, careful consideration must be given to the placement and sampling of groundwater monitoring wells. Unfortunately, technical reviewers at EPA Region VII and the Corps of Engineers find that many groundwater investigations fall short in characterizing NAPLs because several basic considerations were overlooked. Included among these are monitoring well location and screen placement with respect to the water table and significant confining units, and the ability of the well sampling method to obtain samples of NAPL. Depending on the specific gravity of the NAPL that occurs at a site, various considerations can substantially enhance adequate characterization of NAPL contaminants.

  4. Sampling hydrometeors in clouds in-situ - the replicator technique

    Science.gov (United States)

    Wex, Heike; Löffler, Mareike; Griesche, Hannes; Bühl, Johannes; Stratmann, Frank; Schmitt, Carl; Dirksen, Ruud; Reichardt, Jens; Wolf, Veronika; Kuhn, Thomas; Prager, Lutz; Seifert, Patric

    2017-04-01

    For the examination of ice crystals in clouds, concerning their number concentrations, sizes and shapes, instruments mounted on fast-flying aircraft are often used. One related disadvantage is possible shattering of the ice crystals on inlets, which has been mitigated by the introduction of the "Korolev-tip" and by accounting for inter-arrival times (Korolev et al., 2013, 2015); in addition, the typically fast-flying aircraft allow only for a low spatial resolution. Alternative sampling methods have been introduced, e.g. a replicator by Miloshevich & Heymsfield (1997) and an in-situ imager by Kuhn & Heymsfield (2016). Both sample ice crystals onto an advancing stripe while ascending on a balloon, conserving the ice crystals either in formvar for later off-line analysis under a microscope (Miloshevich & Heymsfield, 1997) or imaging them upon their impaction on silicone oil (Kuhn & Heymsfield, 2016), both yielding vertical profiles of different ice crystal properties. A measurement campaign was performed at the Lindenberg Meteorological Observatory of the German Meteorological Service (DWD) in Germany in October 2016, during which both types of instruments were used during balloon ascents, while ground-based Lidar and cloud-radar measurements were performed simultaneously. The two ice particle sondes were operated by teams from the Lulea University of Technology and from TROPOS; the latter was made operational only recently. Here, we will show first results of the TROPOS replicator on ice crystals sampled during one ascent, for which the collected ice crystals were analyzed off-line using a microscope. Literature: Korolev, A., E. Emery, and K. Creelman (2013), Modification and tests of particle probe tips to mitigate effects of ice shattering, J. Atmos. Ocean. Tech., 30, 690-708. Korolev, A., and P. R. Field (2015), Assessment of the performance of the inter-arrival time algorithm to identify ice shattering artifacts in cloud

  5. Random Sampling of Correlated Parameters – a Consistent Solution for Unfavourable Conditions

    Energy Technology Data Exchange (ETDEWEB)

    Žerovnik, G., E-mail: gasper.zerovnik@ijs.si [Jožef Stefan Institute, Jamova cesta 39, SI-1000 Ljubljana (Slovenia); Trkov, A. [Jožef Stefan Institute, Jamova cesta 39, SI-1000 Ljubljana (Slovenia); International Atomic Energy Agency, PO Box 100, A-1400 Vienna (Austria); Kodeli, I.A. [Jožef Stefan Institute, Jamova cesta 39, SI-1000 Ljubljana (Slovenia); Capote, R. [International Atomic Energy Agency, PO Box 100, A-1400 Vienna (Austria); Smith, D.L. [Argonne National Laboratory, 1710 Avenida del Mundo, Coronado, CA 92118-3073 (United States)

    2015-01-15

    Two methods for random sampling according to a multivariate lognormal distribution – the correlated sampling method and the method of transformation of correlation coefficients – are briefly presented. The methods are mathematically exact and enable consistent sampling of correlated inherently positive parameters with given information on the first two distribution moments. Furthermore, a weighted sampling method to accelerate the convergence of parameters with extremely large relative uncertainties is described. However, the method is efficient only for a limited number of correlated parameters.
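    The transformation-of-correlation-coefficients method mentioned above can be sketched in a few lines: the given means, relative uncertainties and correlations of the lognormal parameters are mapped to the moments of the underlying normal distribution, which is then sampled with a Cholesky factor and exponentiated. The moments below are made up for illustration, and the moment-mapping relations used are the standard lognormal ones, not code from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Given first two moments of the inherently positive parameters:
mu  = np.array([1.0, 2.0])      # means
cv  = np.array([0.3, 0.5])      # relative standard deviations
rho = np.array([[1.0, 0.6],
                [0.6, 1.0]])    # target correlation matrix

# Transform to the moments of the underlying normal distribution
# (the "transformation of correlation coefficients"):
cov_n  = np.log(1.0 + rho * np.outer(cv, cv))
mean_n = np.log(mu) - 0.5 * np.diag(cov_n)

# Sample the normal distribution and exponentiate:
L = np.linalg.cholesky(cov_n)
z = rng.standard_normal((100_000, 2))
samples = np.exp(mean_n + z @ L.T)

# All samples are positive, and the requested moments are reproduced:
assert np.all(samples > 0)
assert np.allclose(samples.mean(axis=0), mu, rtol=0.02)
```

The sampled values are consistent by construction: positivity is guaranteed by the exponential, and the first two moments match the given data exactly in the limit of many samples.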

  6. Estimation of the Coefficient of Restitution of Rocking Systems by the Random Decrement Technique

    DEFF Research Database (Denmark)

    Brincker, Rune; Demosthenous, Milton; Manos, George C.

    1994-01-01

    The aim of this paper is to investigate the possibility of estimating an average damping parameter for a rocking system due to impact, the so-called coefficient of restitution, from the random response, i.e. when the loads are random and unknown, and the response is measured. The objective is to obtain an estimate of the free rocking response from the measured random response using the Random Decrement (RDD) Technique, and then estimate the coefficient of restitution from this free response estimate. In the paper this approach is investigated by simulating the response of a single degree...
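    The Random Decrement idea used above can be sketched minimally: average all segments of the measured response that begin at a trigger condition, so the unknown random input averages out and a (scaled) estimate of the free decay remains. The oscillator below is synthetic stand-in data, not the paper's rocking model.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic "measured" response: a lightly damped oscillator driven by
# unknown random noise (a stand-in for the measured random response).
dt, n = 0.01, 20000
omega, zeta = 2 * np.pi * 2.0, 0.02
x = np.zeros(n)
v = 0.0
for i in range(1, n):
    a = -2 * zeta * omega * v - omega**2 * x[i - 1] + rng.standard_normal()
    v += a * dt
    x[i] = x[i - 1] + v * dt

# Random Decrement: average all segments that begin where the response
# up-crosses a trigger level; the random input averages out and an
# estimate of the (scaled) free decay remains.
trigger, seg_len = x.std(), 500
starts = [i for i in range(1, n - seg_len)
          if x[i - 1] < trigger <= x[i]]
rdd = np.mean([x[i:i + seg_len] for i in starts], axis=0)

assert len(starts) > 10      # enough trigger points to average over
assert rdd[0] >= trigger     # every segment starts at the trigger level
```

From the estimated free response `rdd`, damping parameters such as the coefficient of restitution could then be fitted as the paper describes.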

  7. The optimal injection technique for the osteoarthritic ankle: A randomized, cross-over trial

    NARCIS (Netherlands)

    Witteveen, Angelique G. H.; Kok, Aimee; Sierevelt, Inger N.; Kerkhoffs, Gino M. M. J.; van Dijk, C. Niek

    2013-01-01

    Background: To optimize the injection technique for the osteoarthritic ankle in order to enhance the effect of intra-articular injections and minimize adverse events. Methods: Randomized cross-over trial. Comparing two injection techniques in patients with symptomatic ankle osteoarthritis. Patients

  8. Generating Random Samples of a Given Size Using Social Security Numbers.

    Science.gov (United States)

    Erickson, Richard C.; Brauchle, Paul E.

    1984-01-01

    The purposes of this article are (1) to present a method by which social security numbers may be used to draw cluster samples of a predetermined size and (2) to describe procedures used to validate this method of drawing random samples. (JOW)
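    One common way to draw such clusters, selecting everyone whose social security number ends in randomly chosen terminal digits, can be sketched as follows. The roster and numbers are entirely hypothetical, and this is a plausible reading of the method rather than the article's exact procedure.

```python
import random

random.seed(42)

# Hypothetical roster of (name, SSN) pairs; the SSNs are fake.
roster = [(f"person{i:04d}", f"{random.randrange(10**9):09d}")
          for i in range(5000)]

# Draw random two-digit endings; everyone whose SSN ends in a chosen
# pair forms one cluster.  Each ending selects ~1% of the roster, so
# the number of endings controls the expected sample size.
endings = random.sample([f"{d:02d}" for d in range(100)], k=5)
sample = [p for p in roster if p[1][-2:] in endings]

# Expected size is about 5 * 5000 / 100 = 250
assert 150 < len(sample) < 350
```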

  9. Occupational position and its relation to mental distress in a random sample of Danish residents

    DEFF Research Database (Denmark)

    Rugulies, Reiner Ernst; Madsen, Ida E H; Nielsen, Maj Britt D

    2010-01-01

    PURPOSE: To analyze the distribution of depressive, anxiety, and somatization symptoms across different occupational positions in a random sample of Danish residents. METHODS: The study sample consisted of 591 Danish residents (50% women), aged 20-65, drawn from an age- and gender-stratified random sample of the Danish population. Participants filled out a survey that included the 92 item version of the Hopkins Symptom Checklist (SCL-92). We categorized occupational position into seven groups: high- and low-grade non-manual workers, skilled and unskilled manual workers, high- and low-grade self...

  10. Solid Phase Microextraction and Related Techniques for Drugs in Biological Samples

    OpenAIRE

    Moein, Mohammad Mahdi; Said, Rana; Bassyouni, Fatma; Abdel-Rehim, Mohamed

    2014-01-01

    In drug discovery and development, the quantification of drugs in biological samples is an important task for the determination of the physiological performance of the investigated drugs. After sampling, the next step in the analytical process is sample preparation. Because of the low concentration levels of drug in plasma and the variety of the metabolites, the selected extraction technique should be virtually exhaustive. Recent developments of sample handling techniques are directed, from o...

  11. Survey research with a random digit dial national mobile phone sample in Ghana: Methods and sample quality

    Science.gov (United States)

    Sefa, Eunice; Adimazoya, Edward Akolgo; Yartey, Emmanuel; Lenzi, Rachel; Tarpo, Cindy; Heward-Mills, Nii Lante; Lew, Katherine; Ampeh, Yvonne

    2018-01-01

    Introduction Generating a nationally representative sample in low and middle income countries typically requires resource-intensive household level sampling with door-to-door data collection. High mobile phone penetration rates in developing countries provide new opportunities for alternative sampling and data collection methods, but there is limited information about response rates and sample biases in coverage and nonresponse using these methods. We utilized data from an interactive voice response, random-digit dial, national mobile phone survey in Ghana to calculate standardized response rates and assess representativeness of the obtained sample. Materials and methods The survey methodology was piloted in two rounds of data collection. The final survey included 18 demographic, media exposure, and health behavior questions. Call outcomes and response rates were calculated according to the American Association of Public Opinion Research guidelines. Sample characteristics, productivity, and costs per interview were calculated. Representativeness was assessed by comparing data to the Ghana Demographic and Health Survey and the National Population and Housing Census. Results The survey was fielded during a 27-day period in February-March 2017. There were 9,469 completed interviews and 3,547 partial interviews. Response, cooperation, refusal, and contact rates were 31%, 81%, 7%, and 39% respectively. Twenty-three calls were dialed to produce an eligible contact: nonresponse was substantial due to the automated calling system and dialing of many unassigned or non-working numbers. Younger, urban, better educated, and male respondents were overrepresented in the sample. Conclusions The innovative mobile phone data collection methodology yielded a large sample in a relatively short period. Response rates were comparable to other surveys, although substantial coverage bias resulted from fewer women, rural, and older residents completing the mobile phone survey in

  12. Survey research with a random digit dial national mobile phone sample in Ghana: Methods and sample quality.

    Science.gov (United States)

    L'Engle, Kelly; Sefa, Eunice; Adimazoya, Edward Akolgo; Yartey, Emmanuel; Lenzi, Rachel; Tarpo, Cindy; Heward-Mills, Nii Lante; Lew, Katherine; Ampeh, Yvonne

    2018-01-01

    Generating a nationally representative sample in low and middle income countries typically requires resource-intensive household level sampling with door-to-door data collection. High mobile phone penetration rates in developing countries provide new opportunities for alternative sampling and data collection methods, but there is limited information about response rates and sample biases in coverage and nonresponse using these methods. We utilized data from an interactive voice response, random-digit dial, national mobile phone survey in Ghana to calculate standardized response rates and assess representativeness of the obtained sample. The survey methodology was piloted in two rounds of data collection. The final survey included 18 demographic, media exposure, and health behavior questions. Call outcomes and response rates were calculated according to the American Association of Public Opinion Research guidelines. Sample characteristics, productivity, and costs per interview were calculated. Representativeness was assessed by comparing data to the Ghana Demographic and Health Survey and the National Population and Housing Census. The survey was fielded during a 27-day period in February-March 2017. There were 9,469 completed interviews and 3,547 partial interviews. Response, cooperation, refusal, and contact rates were 31%, 81%, 7%, and 39% respectively. Twenty-three calls were dialed to produce an eligible contact: nonresponse was substantial due to the automated calling system and dialing of many unassigned or non-working numbers. Younger, urban, better educated, and male respondents were overrepresented in the sample. The innovative mobile phone data collection methodology yielded a large sample in a relatively short period. Response rates were comparable to other surveys, although substantial coverage bias resulted from fewer women, rural, and older residents completing the mobile phone survey in comparison to household surveys. 
Random digit dialing of mobile
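    The AAPOR-style response-rate arithmetic referenced in these records can be illustrated as follows. The completed and partial interview counts come from the abstract; the remaining call dispositions are hypothetical placeholders, so the result only roughly matches the reported 31%.

```python
# Counts of completed and partial interviews, from the abstract:
completes, partials = 9469, 3547

# Hypothetical placeholder dispositions (NOT the survey's actual counts):
refusals, noncontacts, unknown_eligibility = 2900, 8000, 17000

# AAPOR RR2-style response rate: partial interviews count as respondents,
# and cases of unknown eligibility stay in the denominator.
denominator = (completes + partials + refusals
               + noncontacts + unknown_eligibility)
rr2 = (completes + partials) / denominator

assert 0.30 < rr2 < 0.34   # in the vicinity of the reported 31%
```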

  13. Survey research with a random digit dial national mobile phone sample in Ghana: Methods and sample quality.

    Directory of Open Access Journals (Sweden)

    Kelly L'Engle

    Full Text Available Generating a nationally representative sample in low and middle income countries typically requires resource-intensive household level sampling with door-to-door data collection. High mobile phone penetration rates in developing countries provide new opportunities for alternative sampling and data collection methods, but there is limited information about response rates and sample biases in coverage and nonresponse using these methods. We utilized data from an interactive voice response, random-digit dial, national mobile phone survey in Ghana to calculate standardized response rates and assess representativeness of the obtained sample. The survey methodology was piloted in two rounds of data collection. The final survey included 18 demographic, media exposure, and health behavior questions. Call outcomes and response rates were calculated according to the American Association of Public Opinion Research guidelines. Sample characteristics, productivity, and costs per interview were calculated. Representativeness was assessed by comparing data to the Ghana Demographic and Health Survey and the National Population and Housing Census. The survey was fielded during a 27-day period in February-March 2017. There were 9,469 completed interviews and 3,547 partial interviews. Response, cooperation, refusal, and contact rates were 31%, 81%, 7%, and 39% respectively. Twenty-three calls were dialed to produce an eligible contact: nonresponse was substantial due to the automated calling system and dialing of many unassigned or non-working numbers. Younger, urban, better educated, and male respondents were overrepresented in the sample. The innovative mobile phone data collection methodology yielded a large sample in a relatively short period. Response rates were comparable to other surveys, although substantial coverage bias resulted from fewer women, rural, and older residents completing the mobile phone survey in comparison to household surveys. 
Random digit

  14. Neurofeedback Against Binge Eating: A Randomized Controlled Trial in a Female Subclinical Threshold Sample.

    Science.gov (United States)

    Schmidt, Jennifer; Martin, Alexandra

    2016-09-01

    Brain-directed treatment techniques, such as neurofeedback, have recently been proposed as adjuncts in the treatment of eating disorders to improve therapeutic outcomes. In line with this recommendation, a cue exposure EEG-neurofeedback protocol was developed. The present study aimed at the evaluation of the specific efficacy of neurofeedback to reduce subjective binge eating in a female subthreshold sample. A total of 75 subjects were randomized to EEG-neurofeedback, mental imagery with a comparable treatment set-up or a waitlist group. At post-treatment, only EEG-neurofeedback led to a reduced frequency of binge eating (p = .015, g = 0.65). The effects remained stable to a 3-month follow-up. EEG-neurofeedback further showed particular beneficial effects on perceived stress and dietary self-efficacy. Differences in outcomes did not arise from divergent treatment expectations. Because EEG-neurofeedback showed a specific efficacy, it may be a promising brain-directed approach that should be tested as a treatment adjunct in clinical groups with binge eating. Copyright © 2016 John Wiley & Sons, Ltd and Eating Disorders Association.

  15. A cost-saving statistically based screening technique for focused sampling of a lead-contaminated site

    International Nuclear Information System (INIS)

    Moscati, A.F. Jr.; Hediger, E.M.; Rupp, M.J.

    1986-01-01

    High concentrations of lead in soils along an abandoned railroad line prompted a remedial investigation to characterize the extent of contamination across a 7-acre site. Contamination was thought to be spotty across the site reflecting its past use in battery recycling operations at discrete locations. A screening technique was employed to delineate the more highly contaminated areas by testing a statistically determined minimum number of random samples from each of seven discrete site areas. The approach not only quickly identified those site areas which would require more extensive grid sampling, but also provided a statistically defensible basis for excluding other site areas from further consideration, thus saving the cost of additional sample collection and analysis. The reduction in the number of samples collected in "clean" areas of the site ranged from 45 to 60%.
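    The abstract does not state its sample-size rule, but a standard, statistically defensible way to pick a minimum number of random samples per area is the detection-probability rule sketched below; this is an assumed illustration of the general technique, not the authors' method.

```python
import math

def min_samples(prevalence, confidence=0.95):
    """Smallest n such that P(at least one sample hits contamination)
    >= confidence, when a fraction `prevalence` of an area is
    contaminated and sampling locations are chosen at random."""
    return math.ceil(math.log(1 - confidence) / math.log(1 - prevalence))

# If at least 20% of an area is contaminated, 14 random samples give a
# 95% chance of detecting contamination at least once.
assert min_samples(0.20) == 14
```

Areas where none of the `min_samples` results show elevated lead can then be excluded at the stated confidence level, which is what drives the cost savings described above.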

  16. Field-based random sampling without a sampling frame: control selection for a case-control study in rural Africa.

    Science.gov (United States)

    Crampin, A C; Mwinuka, V; Malema, S S; Glynn, J R; Fine, P E

    2001-01-01

    Selection bias, particularly of controls, is common in case-control studies and may materially affect the results. Methods of control selection should be tailored both for the risk factors and disease under investigation and for the population being studied. We present here a control selection method devised for a case-control study of tuberculosis in rural Africa (Karonga, northern Malawi) that selects an age/sex frequency-matched random sample of the population, with a geographical distribution in proportion to the population density. We also present an audit of the selection process, and discuss the potential of this method in other settings.

  17. A new sampling technique for surface exposure dating using a portable electric rock cutter

    Directory of Open Access Journals (Sweden)

    Yusuke Suganuma

    2012-07-01

    Full Text Available Surface exposure dating using in situ cosmogenic nuclides has contributed to our understanding of Earth-surface processes. The precision of the ages estimated by this method is affected by the sample geometry; therefore, accurate measurement of the thickness and shape of the rock sample is crucial. However, it is sometimes difficult to meet these requirements by conventional sampling methods with a hammer and chisel. Here, we propose a new sampling technique using a portable electric rock cutter. This sampling technique is faster, produces more precisely shaped samples, and allows for a more precise age interpretation. A simple theoretical model demonstrates that the age error due to defective sample geometry increases as the total sample thickness increases, indicating the importance of precise sampling for surface exposure dating.

  18. 40 CFR 761.308 - Sample selection by random number generation on any two-dimensional square grid.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 30 2010-07-01 2010-07-01 false Sample selection by random number... § 761.79(b)(3) § 761.308 Sample selection by random number generation on any two-dimensional square... area created in accordance with paragraph (a) of this section, select two random numbers: one each for...
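    The two-random-number idea in § 761.308, one random number per axis of the square grid, can be sketched as follows. The grid dimensions are hypothetical, and since the regulation text above is truncated, this shows only the general selection scheme.

```python
import random

random.seed(7)

def random_grid_point(side_cm, spacing_cm):
    """Pick one sampling point on a square grid by drawing one random
    number for each of the two axes."""
    cells = side_cm // spacing_cm
    x = random.randrange(cells) * spacing_cm
    y = random.randrange(cells) * spacing_cm
    return x, y

x, y = random_grid_point(side_cm=100, spacing_cm=10)
assert 0 <= x < 100 and 0 <= y < 100
assert x % 10 == 0 and y % 10 == 0
```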

  19. Magnetic separation techniques in sample preparation for biological analysis: a review.

    Science.gov (United States)

    He, Jincan; Huang, Meiying; Wang, Dongmei; Zhang, Zhuomin; Li, Gongke

    2014-12-01

    Sample preparation is a fundamental and essential step in almost all the analytical procedures, especially for the analysis of complex samples like biological and environmental samples. In past decades, with advantages of superparamagnetic property, good biocompatibility and high binding capacity, functionalized magnetic materials have been widely applied in various processes of sample preparation for biological analysis. In this paper, the recent advancements of magnetic separation techniques based on magnetic materials in the field of sample preparation for biological analysis were reviewed. The strategy of magnetic separation techniques was summarized. The synthesis, stabilization and bio-functionalization of magnetic nanoparticles were reviewed in detail. Characterization of magnetic materials was also summarized. Moreover, the applications of magnetic separation techniques for the enrichment of protein, nucleic acid, cell, bioactive compound and immobilization of enzyme were described. Finally, the existed problems and possible trends of magnetic separation techniques for biological analysis in the future were proposed. Copyright © 2014 Elsevier B.V. All rights reserved.

  20. Random sampling of elementary flux modes in large-scale metabolic networks.

    Science.gov (United States)

    Machado, Daniel; Soons, Zita; Patil, Kiran Raosaheb; Ferreira, Eugénio C; Rocha, Isabel

    2012-09-15

    The description of a metabolic network in terms of elementary (flux) modes (EMs) provides an important framework for metabolic pathway analysis. However, their application to large networks has been hampered by the combinatorial explosion in the number of modes. In this work, we develop a method for generating random samples of EMs without computing the whole set. Our algorithm is an adaptation of the canonical basis approach, where we add an additional filtering step which, at each iteration, selects a random subset of the new combinations of modes. In order to obtain an unbiased sample, all candidates are assigned the same probability of getting selected. This approach avoids the exponential growth of the number of modes during computation, thus generating a random sample of the complete set of EMs within reasonable time. We generated samples of different sizes for a metabolic network of Escherichia coli, and observed that they preserve several properties of the full EM set. It is also shown that EM sampling can be used for rational strain design. A well distributed sample, that is representative of the complete set of EMs, should be suitable to most EM-based methods for analysis and optimization of metabolic networks. Source code for a cross-platform implementation in Python is freely available at http://code.google.com/p/emsampler. dmachado@deb.uminho.pt Supplementary data are available at Bioinformatics online.
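    The filtering step described above, keeping a fixed-size uniform random subset of new candidate modes at each iteration so the working set cannot grow exponentially, can be sketched generically. The function name `filter_candidates` and the sizes are illustrative, not taken from the paper's implementation.

```python
import random

random.seed(3)

def filter_candidates(candidates, max_size):
    """Keep at most max_size candidates; every candidate has the same
    probability of being selected, so the retained subset is unbiased."""
    if len(candidates) <= max_size:
        return list(candidates)
    return random.sample(candidates, max_size)

# Stand-in for the new mode combinations produced in one iteration:
new_modes = [f"mode{i}" for i in range(10_000)]
kept = filter_candidates(new_modes, max_size=500)

assert len(kept) == 500
assert set(kept) <= set(new_modes)
```

Applying this cap at every iteration of the canonical basis approach bounds memory and run time while, by the equal-probability selection, yielding an unbiased random sample of the full EM set.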

  1. A Nationwide Random Sampling Survey of Potential Complicated Grief in Japan

    Science.gov (United States)

    Mizuno, Yasunao; Kishimoto, Junji; Asukai, Nozomu

    2012-01-01

    To investigate the prevalence of significant loss, potential complicated grief (CG), and its contributing factors, we conducted a nationwide random sampling survey of Japanese adults aged 18 or older (N = 1,343) using a self-rating Japanese-language version of the Complicated Grief Brief Screen. Among them, 37.0% experienced their most significant…

  2. A simple sample size formula for analysis of covariance in cluster randomized trials.

    NARCIS (Netherlands)

    Teerenstra, S.; Eldridge, S.; Graff, M.J.; Hoop, E. de; Borm, G.F.

    2012-01-01

    For cluster randomized trials with a continuous outcome, the sample size is often calculated as if an analysis of the outcomes at the end of the treatment period (follow-up scores) would be performed. However, often a baseline measurement of the outcome is available or feasible to obtain. An

  3. Random selection of items. Selection of n1 samples among N items composing a stratum

    International Nuclear Information System (INIS)

    Jaech, J.L.; Lemaire, R.J.

    1987-02-01

    STR-224 provides generalized procedures to determine required sample sizes, for instance in the course of a Physical Inventory Verification at Bulk Handling Facilities. The present report describes procedures to generate random numbers and select groups of items to be verified in a given stratum through each of the measurement methods involved in the verification. (author). 3 refs
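    The core selection task, drawing n1 of N items so that every subset of size n1 is equally likely, can be sketched in a few lines (hypothetical numbers; the report's own procedures also cover generating the random numbers themselves).

```python
import random

random.seed(11)

def select_items(N, n1):
    """Draw n1 distinct item numbers (1..N) from a stratum of N items,
    each subset of size n1 being equally likely."""
    return sorted(random.sample(range(1, N + 1), n1))

chosen = select_items(N=200, n1=12)
assert len(chosen) == 12 and len(set(chosen)) == 12
assert all(1 <= i <= 200 for i in chosen)
```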

  4. Reinforcing Sampling Distributions through a Randomization-Based Activity for Introducing ANOVA

    Science.gov (United States)

    Taylor, Laura; Doehler, Kirsten

    2015-01-01

    This paper examines the use of a randomization-based activity to introduce the ANOVA F-test to students. The two main goals of this activity are to successfully teach students to comprehend ANOVA F-tests and to increase student comprehension of sampling distributions. Four sections of students in an advanced introductory statistics course…
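    A randomization-based introduction to the ANOVA F-test of the kind this activity describes can be sketched as follows: compute the observed F, then shuffle the group labels many times to build the sampling distribution of F under the null. The data are made up for illustration and are not the activity's materials.

```python
import numpy as np

rng = np.random.default_rng(5)

def f_stat(groups):
    """One-way ANOVA F statistic for a list of 1-D arrays."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = np.concatenate(groups).mean()
    ssb = sum(len(g) * (g.mean() - grand) ** 2 for g in groups)
    ssw = sum(((g - g.mean()) ** 2).sum() for g in groups)
    return (ssb / (k - 1)) / (ssw / (n - k))

a = np.array([5.1, 4.8, 6.0, 5.5])   # made-up scores, three groups
b = np.array([6.9, 7.2, 6.4, 7.8])
c = np.array([5.0, 5.4, 4.6, 5.2])
observed = f_stat([a, b, c])

# Randomization test: shuffle the group labels many times and count how
# often a shuffled F is at least as large as the observed one.
pooled = np.concatenate([a, b, c])
cuts = np.cumsum([len(a), len(b), len(c)])[:-1]
reps, count = 5000, 0
for _ in range(reps):
    shuffled = np.split(rng.permutation(pooled), cuts)
    if f_stat(shuffled) >= observed:
        count += 1
p_value = count / reps

assert observed > 10     # strong group effect in the made-up data
assert p_value < 0.05
```

The histogram of the shuffled F values is exactly the empirical sampling distribution the activity aims to make tangible.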

  5. Sample preparation techniques based on combustion reactions in closed vessels - A brief overview and recent applications

    International Nuclear Information System (INIS)

    Flores, Erico M.M.; Barin, Juliano S.; Mesko, Marcia F.; Knapp, Guenter

    2007-01-01

    In this review, a general discussion of sample preparation techniques based on combustion reactions in closed vessels is presented. Applications for several kinds of samples are described, taking into account the literature data reported in the last 25 years. The operational conditions as well as the main characteristics and drawbacks are discussed for bomb combustion, oxygen flask and microwave-induced combustion (MIC) techniques. Recent applications of MIC techniques are discussed with special concern for samples not well digested by conventional microwave-assisted wet digestion as, for example, coal and also for subsequent determination of halogens

  6. The concentration of heavy metals: zinc, cadmium, lead, copper, mercury, iron and calcium in head hair of a randomly selected sample of Kenyan people

    International Nuclear Information System (INIS)

    Wandiga, S.O.; Jumba, I.O.

    1982-01-01

    An intercomparative analysis of the concentration of heavy metals (zinc, cadmium, lead, copper, mercury, iron and calcium) in head hair of a randomly selected sample of Kenyan people using the techniques of atomic absorption spectrophotometry (AAS) and differential pulse anodic stripping voltammetry (DPAS) has been undertaken. The percent relative standard deviation for each sample analysed using either of the techniques shows good sensitivity and correlation between the techniques. The DPAS technique was found to be slightly more sensitive than the AAS instrument used. The recalculated body burden ratios of Cd to Zn and Pb to Fe reveal no unusual health impairment symptoms and suggest a relatively clean environment in Kenya. (author)

  7. Monitoring oil persistence on beaches : SCAT versus stratified random sampling designs

    International Nuclear Information System (INIS)

    Short, J.W.; Lindeberg, M.R.; Harris, P.M.; Maselko, J.M.; Pella, J.J.; Rice, S.D.

    2003-01-01

    In the event of a coastal oil spill, shoreline clean-up assessment teams (SCAT) commonly rely on visual inspection of the entire affected area to monitor the persistence of the oil on beaches. Occasionally, pits are excavated to evaluate the persistence of subsurface oil. This approach is practical for directing clean-up efforts directly following a spill. However, sampling of the 1989 Exxon Valdez oil spill in Prince William Sound 12 years later has shown that visual inspection combined with pit excavation does not offer estimates of contaminated beach area or stranded oil volumes. This information is needed to statistically evaluate the significance of change with time. Assumptions regarding the correlation of visually-evident surface oil and cryptic subsurface oil are usually not evaluated as part of the SCAT mandate. Stratified random sampling can avoid such problems and could produce precise estimates of oiled area and volume that allow for statistical assessment of major temporal trends and the extent of the impact. The 2001 sampling of the shoreline of Prince William Sound showed that 15 per cent of surface oil occurrences were associated with subsurface oil. This study demonstrates the usefulness of the stratified random sampling method and shows how sampling design parameters impact the statistical outcome. Power analysis based on the study results indicates that optimum power is achieved when unnecessary stratification is avoided. It was emphasized that sampling effort should be balanced between choosing sufficient beaches for sampling and the intensity of sampling.
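    The stratified estimator at the heart of such a design can be sketched with hypothetical shoreline data: weight each stratum's sample mean by its share of the shoreline. The stratum sizes and oiling fractions below are invented for illustration, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(8)

# Hypothetical shoreline split into strata (e.g. by prior oiling class),
# with oiled fraction measured on randomly chosen quadrats per stratum.
strata = {
    "heavy":  {"segments": 200,  "oiled": rng.beta(8, 2, size=20)},
    "medium": {"segments": 600,  "oiled": rng.beta(2, 8, size=20)},
    "light":  {"segments": 1200, "oiled": rng.beta(1, 20, size=20)},
}

# Stratified estimate of the overall oiled fraction: each stratum's
# sample mean weighted by its share of the shoreline.
N = sum(s["segments"] for s in strata.values())
estimate = sum(s["segments"] / N * s["oiled"].mean()
               for s in strata.values())

assert 0.0 < estimate < 1.0
```

Because the strata are sampled independently, per-stratum variances can be combined the same way, which is what enables the power analysis the abstract describes.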

  8. The Bootstrap, the Jackknife, and the Randomization Test: A Sampling Taxonomy.

    Science.gov (United States)

    Rodgers, J L

    1999-10-01

    A simple sampling taxonomy is defined that shows the differences between and relationships among the bootstrap, the jackknife, and the randomization test. Each method has as its goal the creation of an empirical sampling distribution that can be used to test statistical hypotheses, estimate standard errors, and/or create confidence intervals. Distinctions between the methods can be made based on the sampling approach (with replacement versus without replacement) and the sample size (replacing the whole original sample versus replacing a subset of the original sample). The taxonomy is useful for teaching the goals and purposes of resampling schemes. An extension of the taxonomy implies other possible resampling approaches that have not previously been considered. Univariate and multivariate examples are presented.
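
    The distinctions drawn in this taxonomy (with versus without replacement; whole-sample versus subset replicates) can be made concrete with a short sketch. The stdlib-Python code below, on made-up data, implements a minimal version of each of the three schemes:

    ```python
    import random

    data_a = [4.1, 5.3, 6.0, 5.5, 4.8]
    data_b = [5.9, 6.4, 7.1, 6.8]

    def mean(xs):
        return sum(xs) / len(xs)

    # Bootstrap: resample n items WITH replacement from the original sample.
    def bootstrap_means(xs, reps=1000, seed=0):
        rng = random.Random(seed)
        n = len(xs)
        return [mean([rng.choice(xs) for _ in range(n)]) for _ in range(reps)]

    # Jackknife: resample WITHOUT replacement by leaving one observation out,
    # so each replicate is a subset of size n-1.
    def jackknife_means(xs):
        return [mean(xs[:i] + xs[i + 1:]) for i in range(len(xs))]

    # Randomization (permutation) test: reshuffle group labels WITHOUT
    # replacement across the pooled sample to build a null distribution.
    def permutation_diffs(a, b, reps=1000, seed=0):
        rng = random.Random(seed)
        pooled = a + b
        diffs = []
        for _ in range(reps):
            rng.shuffle(pooled)
            diffs.append(mean(pooled[:len(a)]) - mean(pooled[len(a):]))
        return diffs

    boot = bootstrap_means(data_a)
    jack = jackknife_means(data_a)
    null = permutation_diffs(data_a, data_b)
    ```

    The bootstrap replaces the whole sample with replacement, the jackknife replaces a subset without replacement, and the randomization test resamples labels without replacement — the two axes of the taxonomy above.
    
    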

  9. Methodological integrative review of the work sampling technique used in nursing workload research.

    Science.gov (United States)

    Blay, Nicole; Duffield, Christine M; Gallagher, Robyn; Roche, Michael

    2014-11-01

    To critically review the work sampling technique used in nursing workload research. Work sampling is a technique frequently used by researchers and managers to explore and measure nursing activities. However, the work sampling methods used are diverse, making comparisons of results between studies difficult. Methodological integrative review. Four electronic databases were systematically searched for peer-reviewed articles published between 2002 and 2012. Manual scanning of reference lists and Rich Site Summary feeds from contemporary nursing journals were other sources of data. Articles published in the English language between 2002 and 2012 reporting on research which used work sampling to examine nursing workload. Eighteen articles were reviewed. The review identified that the work sampling technique lacks a standardized approach, which may have an impact on the sharing or comparison of results. Specific areas needing a shared understanding included the training of observers and subjects who self-report, standardization of the techniques used to assess observer inter-rater reliability, sampling methods and reporting of outcomes. Work sampling is a technique that can be used to explore the many facets of nursing work. Standardized reporting measures would enable greater comparison between studies and contribute to knowledge more effectively. The authors' suggestions for the reporting of results may act as guidelines for researchers considering work sampling as a research method. © 2014 John Wiley & Sons Ltd.

  10. A line-based vegetation sampling technique and its application in ...

    African Journals Online (AJOL)

    percentage cover, density and intercept frequency) and also provides plant size distributions, yet requires no more sampling effort than the line-intercept method. A field test of the three techniques in succulent karoo showed that the discriminating ...

  11. Fabrication Techniques of Stretchable and Cloth Electroadhesion Samples for Implementation on Devices with Space Application

    Data.gov (United States)

    National Aeronautics and Space Administration — The purpose of this study is to determine materials and fabrication techniques for efficient space-rated electroadhesion (EA) samples. Liquid metals, including...

  12. Additive non-uniform random sampling in superimposed fiber Bragg grating strain gauge

    International Nuclear Information System (INIS)

    Ma, Y C; Liu, H Y; Yan, S B; Li, J M; Tang, J; Yang, Y H; Yang, M W

    2013-01-01

    This paper demonstrates an additive non-uniform random sampling and interrogation method for dynamic and/or static strain gauging using a reflection spectrum from two superimposed fiber Bragg gratings (FBGs). The superimposed FBGs are designed to generate non-equidistant spacing of the sensing pulse train in the time domain during dynamic strain gauging. By combining centroid finding with smooth filtering methods, both the interrogation speed and accuracy are improved. A 1.9 kHz dynamic strain is measured by generating an additive non-uniform randomly distributed 2 kHz optical sensing pulse train from a mean 500 Hz triangular periodically changing scanning frequency. (paper)
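
    The value of a non-uniform random sampling grid such as this is alias suppression: a uniform 2 kHz clock cannot distinguish a 1.9 kHz tone from its 100 Hz alias, whereas randomized sampling instants can. A simplified numerical sketch (all parameters illustrative, not the authors' FBG interrogation scheme):

    ```python
    import cmath
    import math
    import random

    rng = random.Random(0)
    f_sig = 1900.0            # signal frequency (Hz), near the 2 kHz mean rate
    mean_dt = 1.0 / 2000.0    # mean sampling interval -> 2 kHz average rate

    # Additive random sampling: each interval is drawn afresh, t[n+1] = t[n] + dt,
    # which breaks the periodicity a uniform 2 kHz clock would impose.
    times, t = [], 0.0
    for _ in range(2000):
        t += rng.uniform(0.2 * mean_dt, 1.8 * mean_dt)
        times.append(t)
    samples = [math.cos(2 * math.pi * f_sig * ti) for ti in times]

    def power(f):
        """Periodogram ordinate at frequency f for the non-uniform samples."""
        s = sum(x * cmath.exp(-2j * math.pi * f * ti) for x, ti in zip(samples, times))
        return abs(s) ** 2 / len(samples)

    # A uniform 2 kHz grid could not tell 1900 Hz from its 100 Hz alias;
    # on the random grid the true frequency dominates.
    p_true, p_alias = power(1900.0), power(100.0)
    ```

    On the randomized grid the periodogram peak at 1900 Hz stands far above the residual at the would-be alias frequency.
    
    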

  13. Additive non-uniform random sampling in superimposed fiber Bragg grating strain gauge

    Science.gov (United States)

    Ma, Y. C.; Liu, H. Y.; Yan, S. B.; Yang, Y. H.; Yang, M. W.; Li, J. M.; Tang, J.

    2013-05-01

    This paper demonstrates an additive non-uniform random sampling and interrogation method for dynamic and/or static strain gauging using a reflection spectrum from two superimposed fiber Bragg gratings (FBGs). The superimposed FBGs are designed to generate non-equidistant spacing of the sensing pulse train in the time domain during dynamic strain gauging. By combining centroid finding with smooth filtering methods, both the interrogation speed and accuracy are improved. A 1.9 kHz dynamic strain is measured by generating an additive non-uniform randomly distributed 2 kHz optical sensing pulse train from a mean 500 Hz triangular periodically changing scanning frequency.

  14. Piezoelectric Versus Conventional Rotary Techniques for Impacted Third Molar Extraction: A Meta-analysis of Randomized Controlled Trials.

    Science.gov (United States)

    Jiang, Qian; Qiu, Yating; Yang, Chi; Yang, Jingyun; Chen, Minjie; Zhang, Zhiyuan

    2015-10-01

    Impacted third molars are frequently encountered in clinical work. Surgical removal of impacted third molars is often required to prevent clinical symptoms. Traditional rotary cutting instruments are potentially injurious, and piezosurgery, as a new osteotomy technique, has been introduced in oral and maxillofacial surgery. No consistent conclusion has been reached regarding whether this new technique is associated with fewer or less severe postoperative sequelae after third molar extraction. The aim of this study was to compare piezosurgery with rotary osteotomy techniques, with regard to surgery time and the severity of postoperative sequelae, including pain, swelling, and trismus. We conducted a systematic literature search in the Cochrane Library, PubMed, Embase, and Google Scholar. The eligibility criteria of this study included the following: the patients were clearly diagnosed as having impacted mandibular third molars; the patients underwent piezosurgery osteotomy, and in the control group rotary osteotomy techniques, for removing impacted third molars; the outcomes of interest include surgery time, trismus, swelling or pain; the studies are randomized controlled trials. We used random-effects models to calculate the difference in the outcomes, and the corresponding 95% confidence interval. We calculated the weighted mean difference if the trials used the same measurement, and a standardized mean difference if otherwise. A total of seven studies met the eligibility criteria and were included in our analysis. Compared with rotary osteotomy, patients undergoing piezosurgery experienced longer surgery time (mean difference 4.13 minutes, 95% confidence interval 2.75-5.52, P piezosurgery groups. The number of included randomized controlled trials and the sample size of each trial were relatively small, double blinding was not possible, and cost analysis was unavailable due to a lack of data. Our meta-analysis indicates that although patients undergoing piezosurgery
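
    The random-effects pooling described above can be sketched with the DerSimonian-Laird estimator. The study-level mean differences and variances below are purely illustrative placeholders, not data from this meta-analysis:

    ```python
    import math

    # Hypothetical per-study (mean difference, variance) pairs for an outcome
    # such as surgery time; the numbers are made up for illustration.
    studies = [(4.0, 1.2), (3.5, 0.8), (5.1, 2.0), (4.6, 1.5)]

    def random_effects_pooled(studies):
        """DerSimonian-Laird random-effects pooled estimate and 95% CI."""
        w = [1.0 / v for _, v in studies]            # fixed-effect weights
        y = [d for d, _ in studies]
        sw = sum(w)
        fixed = sum(wi * yi for wi, yi in zip(w, y)) / sw
        # Cochran's Q and the between-study variance tau^2
        q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, y))
        df = len(studies) - 1
        c = sw - sum(wi ** 2 for wi in w) / sw
        tau2 = max(0.0, (q - df) / c)
        # Random-effects weights incorporate tau^2
        w_re = [1.0 / (v + tau2) for _, v in studies]
        pooled = sum(wi * yi for wi, yi in zip(w_re, y)) / sum(w_re)
        se = math.sqrt(1.0 / sum(w_re))
        return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

    pooled, ci = random_effects_pooled(studies)
    ```

    When the trials report on different scales, the same machinery is applied to standardized mean differences instead of raw ones.
    
    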

  15. Sampling phased array a new technique for signal processing and ultrasonic imaging

    OpenAIRE

    Bulavinov, A.; Joneit, D.; Kröning, M.; Bernus, L.; Dalichow, M.H.; Reddy, K.M.

    2006-01-01

    Different signal processing and image reconstruction techniques are applied in ultrasonic non-destructive material evaluation. In recent years, rapid development in the fields of microelectronics and computer engineering has led to the wide application of phased array systems. A new phased array technique, called "Sampling Phased Array", has been developed at the Fraunhofer Institute for non-destructive testing. It realizes a unique approach to the measurement and processing of ultrasonic signals. The sampling...

  16. Fixed-location hydroacoustic monitoring designs for estimating fish passage using stratified random and systematic sampling

    International Nuclear Information System (INIS)

    Skalski, J.R.; Hoffman, A.; Ransom, B.H.; Steig, T.W.

    1993-01-01

    Five alternate sampling designs are compared using 15 d of 24-h continuous hydroacoustic data to identify the most favorable approach to fixed-location hydroacoustic monitoring of salmonid outmigrants. Four alternative approaches to systematic sampling are compared among themselves and with stratified random sampling (STRS). Stratifying systematic sampling (STSYS) on a daily basis is found to reduce sampling error in multiday monitoring studies. Although sampling precision was predictable with varying levels of effort in STRS, neither magnitude nor direction of change in precision was predictable when effort was varied in systematic sampling (SYS). Furthermore, modifying systematic sampling to include replicated (e.g., nested) sampling (RSYS) is further shown to provide unbiased point and variance estimates, as does STRS. Numerous short sampling intervals (e.g., 12 samples of 1-min duration per hour) must be monitored hourly using RSYS to provide efficient, unbiased point and interval estimates. For equal levels of effort, STRS outperformed all variations of SYS examined. Parametric approaches to confidence interval estimates are found to be superior to nonparametric interval estimates (i.e., bootstrap and jackknife) in estimating total fish passage. 10 refs., 1 fig., 8 tabs
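
    Stratified random sampling of this kind treats each day as a stratum, draws random hours within each day, and expands the stratum means to totals. A simulation sketch with entirely synthetic hourly fish counts (illustrative only):

    ```python
    import random
    import statistics

    rng = random.Random(42)

    # Synthetic hourly fish counts for 15 days x 24 h, with a nighttime
    # passage pulse plus day-to-day drift (made-up numbers).
    days, hours = 15, 24
    counts = [[max(0, int(rng.gauss(50 + 30 * (20 <= h < 24) + 2 * d, 10)))
               for h in range(hours)] for d in range(days)]
    true_total = sum(sum(day) for day in counts)

    def strs_estimate(counts, n_per_day, rng):
        """Stratified random sampling: each day is a stratum; sample n hours
        without replacement within each day and expand by 24 * stratum mean."""
        total = 0.0
        for day in counts:
            sample = rng.sample(day, n_per_day)
            total += 24 * statistics.mean(sample)
        return total

    # Replicate the survey to check that the estimator is unbiased.
    estimates = [strs_estimate(counts, 6, random.Random(s)) for s in range(500)]
    bias = statistics.mean(estimates) - true_total
    ```

    Averaged over replicate surveys the stratified estimate matches the true total passage, which is the unbiasedness property the record contrasts with unreplicated systematic sampling.
    
    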

  17. Standardization of proton-induced x-ray emission technique for analysis of thick samples

    Science.gov (United States)

    Ali, Shad; Zeb, Johar; Ahad, Abdul; Ahmad, Ishfaq; Haneef, M.; Akbar, Jehan

    2015-09-01

    This paper describes the standardization of the proton-induced x-ray emission (PIXE) technique for determining the elemental composition of thick samples. For the standardization, three different samples of standard reference materials (SRMs) were analyzed using this technique and the data were compared with the already known data for these certified SRMs. These samples were selected in order to cover the maximum range of elements in the periodic table. Each sample was irradiated with three different values of collected beam charge at three different times. A 2.57 MeV proton beam obtained from a 5UDH-II Pelletron accelerator was used for excitation of x-rays from the sample. The acquired experimental data were analyzed using the GUPIXWIN software. The results show that the SRM data and the data obtained using the PIXE technique are in good agreement.

  18. Cone penetrometer tests and HydroPunch sampling: A screening technique for plume definition

    International Nuclear Information System (INIS)

    Smolley, M.; Kappmeyer, J.C.

    1991-01-01

    Cone penetrometer tests and HydroPunch sampling were used to define the extent of volatile organic compounds in ground water. The investigation indicated that the combination of these techniques is effective for obtaining ground water samples for preliminary plume definition. HydroPunch samples can be collected in unconsolidated sediments, and the analytical results obtained from these samples are comparable to those obtained from adjacent monitoring wells. This sampling method is a rapid and cost-effective screening technique for characterizing the extent of contaminant plumes in soft sediment environments. Use of this screening technique allowed monitoring wells to be located at the plume boundary, thereby reducing the number of wells installed and the overall cost of the plume definition program

  19. Characterization of electron microscopes with binary pseudo-random multilayer test samples

    Science.gov (United States)

    Yashchuk, Valeriy V.; Conley, Raymond; Anderson, Erik H.; Barber, Samuel K.; Bouet, Nathalie; McKinney, Wayne R.; Takacs, Peter Z.; Voronov, Dmitriy L.

    2011-09-01

    Verification of the reliability of metrology data from high quality X-ray optics requires that adequate methods for test and calibration of the instruments be developed. For such verification for optical surface profilometers in the spatial frequency domain, a modulation transfer function (MTF) calibration method based on binary pseudo-random (BPR) gratings and arrays has been suggested [1,2] and proven to be an effective calibration method for a number of interferometric microscopes, a phase shifting Fizeau interferometer, and a scatterometer [5]. Here we describe the details of development of binary pseudo-random multilayer (BPRML) test samples suitable for characterization of scanning (SEM) and transmission (TEM) electron microscopes. We discuss the results of TEM measurements with the BPRML test samples fabricated from a WiSi2/Si multilayer coating with pseudo-randomly distributed layers. In particular, we demonstrate that significant information about the metrological reliability of the TEM measurements can be extracted even when the fundamental frequency of the BPRML sample is smaller than the Nyquist frequency of the measurements. The measurements demonstrate a number of problems related to the interpretation of the SEM and TEM data. Note that similar BPRML test samples can be used to characterize X-ray microscopes. Corresponding work with X-ray microscopes is in progress.

  20. Characterization of electron microscopes with binary pseudo-random multilayer test samples

    International Nuclear Information System (INIS)

    Yashchuk, Valeriy V.; Conley, Raymond; Anderson, Erik H.; Barber, Samuel K.; Bouet, Nathalie; McKinney, Wayne R.; Takacs, Peter Z.; Voronov, Dmitriy L.

    2011-01-01

    Verification of the reliability of metrology data from high quality X-ray optics requires that adequate methods for test and calibration of the instruments be developed. For such verification for optical surface profilometers in the spatial frequency domain, a modulation transfer function (MTF) calibration method based on binary pseudo-random (BPR) gratings and arrays has been suggested and proven to be an effective calibration method for a number of interferometric microscopes, a phase shifting Fizeau interferometer, and a scatterometer [5]. Here we describe the details of development of binary pseudo-random multilayer (BPRML) test samples suitable for characterization of scanning (SEM) and transmission (TEM) electron microscopes. We discuss the results of TEM measurements with the BPRML test samples fabricated from a WiSi2/Si multilayer coating with pseudo-randomly distributed layers. In particular, we demonstrate that significant information about the metrological reliability of the TEM measurements can be extracted even when the fundamental frequency of the BPRML sample is smaller than the Nyquist frequency of the measurements. The measurements demonstrate a number of problems related to the interpretation of the SEM and TEM data. Note that similar BPRML test samples can be used to characterize X-ray microscopes. Corresponding work with X-ray microscopes is in progress.

  1. LOD score exclusion analyses for candidate genes using random population samples.

    Science.gov (United States)

    Deng, H W; Li, J; Recker, R R

    2001-05-01

    While extensive analyses have been conducted to test for, no formal analyses have been conducted to test against, the importance of candidate genes with random population samples. We develop a LOD score approach for exclusion analyses of candidate genes with random population samples. Under this approach, specific genetic effects and inheritance models at candidate genes can be analysed and if a LOD score is ≤ -2.0, the locus can be excluded from having an effect larger than that specified. Computer simulations show that, with sample sizes often employed in association studies, this approach has high power to exclude a gene from having moderate genetic effects. In contrast to regular association analyses, population admixture will not affect the robustness of our analyses; in fact, it renders our analyses more conservative and thus any significant exclusion result is robust. Our exclusion analysis complements association analysis for candidate genes in random population samples and is parallel to the exclusion mapping analyses that may be conducted in linkage analyses with pedigrees or relative pairs. The usefulness of the approach is demonstrated by an application to test the importance of vitamin D receptor and estrogen receptor genes underlying the differential risk to osteoporotic fractures.
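
    The exclusion logic (a LOD score at or below -2 excludes an effect at least as large as the one specified) can be sketched for a biallelic candidate gene and a binary trait. The genotype counts and penetrance models below are hypothetical, chosen only to illustrate the log10 likelihood-ratio calculation:

    ```python
    import math

    # Hypothetical genotype counts (AA, Aa, aa) among affected and
    # unaffected individuals in a random population sample.
    affected   = [30, 80, 90]
    unaffected = [120, 390, 290]

    def log10_lik(counts_aff, counts_unaff, penetrances):
        """log10 likelihood of the table given per-genotype penetrances."""
        ll = 0.0
        for a, u, f in zip(counts_aff, counts_unaff, penetrances):
            ll += a * math.log10(f) + u * math.log10(1 - f)
        return ll

    # Null model: no genetic effect, one common penetrance (the overall rate).
    n_aff = sum(affected)
    n_tot = n_aff + sum(unaffected)
    null = [n_aff / n_tot] * 3

    # Specified effect to test AGAINST: e.g. a dominant model with a
    # substantially raised penetrance for carriers of allele A.
    model = [0.40, 0.40, 0.15]

    # LOD = log10 L(specified effect) - log10 L(no effect)
    lod = log10_lik(affected, unaffected, model) - log10_lik(affected, unaffected, null)
    excluded = lod <= -2.0
    ```

    With these illustrative counts the specified effect fits far worse than the null, so the LOD score falls below -2 and an effect of that size (or larger) is excluded.
    
    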

  2. Accounting for Sampling Error in Genetic Eigenvalues Using Random Matrix Theory.

    Science.gov (United States)

    Sztepanacz, Jacqueline L; Blows, Mark W

    2017-07-01

    The distribution of genetic variance in multivariate phenotypes is characterized by the empirical spectral distribution of the eigenvalues of the genetic covariance matrix. Empirical estimates of genetic eigenvalues from random effects linear models are known to be overdispersed by sampling error, where large eigenvalues are biased upward, and small eigenvalues are biased downward. The overdispersion of the leading eigenvalues of sample covariance matrices has been demonstrated to conform to the Tracy-Widom (TW) distribution. Here we show that genetic eigenvalues estimated using restricted maximum likelihood (REML) in a multivariate random effects model with an unconstrained genetic covariance structure will also conform to the TW distribution after empirical scaling and centering. However, where estimation procedures using either REML or MCMC impose boundary constraints, the resulting genetic eigenvalues tend not to be TW distributed. We show how using confidence intervals from sampling distributions of genetic eigenvalues without reference to the TW distribution is insufficient protection against mistaking sampling error for genetic variance, particularly when eigenvalues are small. By scaling such sampling distributions to the appropriate TW distribution, the critical value of the TW statistic can be used to determine if the magnitude of a genetic eigenvalue exceeds the sampling error for each eigenvalue in the spectral distribution of a given genetic covariance matrix. Copyright © 2017 by the Genetics Society of America.

  3. Sample preparation techniques in trace element analysis by X-ray emission spectroscopy

    International Nuclear Information System (INIS)

    Valkovic, V.

    1983-11-01

    The report, written under a research contract with the IAEA, contains a detailed presentation of the most difficult problem encountered in trace element analysis by methods of X-ray emission spectroscopy, namely the sample preparation techniques. The following items are covered. Sampling - with specific consideration of aerosols, water, soil, biological materials, petroleum and its products, storage of samples and their handling. Pretreatment of samples - preconcentration, ashing, solvent extraction, ion exchange and electrodeposition. Sample preparation for PIXE analysis - backings, target uniformity and homogeneity, effects of irradiation, internal standards and specific examples of preparation (aqueous, biological, blood serum and solid samples). Sample preparation for radioactive-source or tube excitation - with specific examples (water, liquid and solid samples, soil, geological, plant and tissue samples). Finally, the problem of standards and reference materials, as well as that of interlaboratory comparisons, is discussed

  4. Mantle biopsy: a technique for nondestructive tissue-sampling of freshwater mussels

    Science.gov (United States)

    David J. Berg; Wendell R. Haag; Sheldon I. Guttman; James B. Sickel

    1995-01-01

    Mantle biopsy is a means of obtaining tissue samples for genetic, physiological, and contaminant studies of bivalves, but the effects of this biopsy on survival have not been determined. We describe a simple technique for obtaining such samples from unionacean bivalves and compare survival between biopsied and control organisms in field experiments. Survival was...

  5. RF Sub-sampling Receiver Architecture based on Milieu Adapting Techniques

    DEFF Research Database (Denmark)

    Behjou, Nastaran; Larsen, Torben; Jensen, Ole Kiel

    2012-01-01

    A novel sub-sampling-based architecture is proposed which reduces the problem of image distortion and significantly improves the signal-to-noise ratio. The technique is based on sensing the environment and adapting the sampling rate of the receiver to the best possible...

  6. Thermophilic Campylobacter spp. in turkey samples: evaluation of two automated enzyme immunoassays and conventional microbiological techniques

    DEFF Research Database (Denmark)

    Borck, Birgitte; Stryhn, H.; Ersboll, A.K.

    2002-01-01

    Aims: To determine the sensitivity and specificity of two automated enzyme immunoassays (EIA), EiaFoss and Minividas, and a conventional microbiological culture technique for detecting thermophilic Campylobacter spp. in turkey samples. Methods and Results: A total of 286 samples (faecal, meat...

  7. Laboratory techniques for safe encapsulation of α-emitting powder samples

    International Nuclear Information System (INIS)

    Chamberlain, H.E.; Pottinger, J.S.

    1984-01-01

    Plutonium oxide powder samples can be encapsulated in thin plastic film to prevent spread of contamination in counting and X-ray diffraction equipment. The film has to be thin enough to transmit X-rays and α-particles. Techniques are described for the wrapping process and the precautions necessary to keep the sample processing line free of significant contamination. (author)

  8. A simple and efficient alternative to implementing systematic random sampling in stereological designs without a motorized microscope stage.

    Science.gov (United States)

    Melvin, Neal R; Poda, Daniel; Sutherland, Robert J

    2007-10-01

    When properly applied, stereology is a very robust and efficient method to quantify a variety of parameters from biological material. A common sampling strategy in stereology is systematic random sampling, which involves choosing a random start point outside the structure of interest, and sampling relevant objects at sites that are placed at pre-determined, equidistant intervals. This has proven to be a very efficient sampling strategy, and is used widely in stereological designs. At the microscopic level, this is most often achieved through the use of a motorized stage that facilitates the systematic random stepping across the structure of interest. Here, we report a simple, precise and cost-effective software-based alternative to accomplishing systematic random sampling under the microscope. We believe that this approach will facilitate the use of stereological designs that employ systematic random sampling in laboratories that lack the resources to acquire costly, fully automated systems.
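
    The systematic random sampling step pattern itself (one random start inside the first grid cell, then fixed offsets across the region) is straightforward to generate in software. A minimal sketch, with hypothetical field extents and step sizes in arbitrary units:

    ```python
    import random

    def systematic_sites(extent_x, extent_y, step_x, step_y, rng):
        """Generate systematic random sampling sites: one random start point
        inside the first grid cell, then equidistant steps across the region."""
        x0 = rng.uniform(0, step_x)
        y0 = rng.uniform(0, step_y)
        sites = []
        y = y0
        while y < extent_y:
            x = x0
            while x < extent_x:
                sites.append((x, y))
                x += step_x
            y += step_y
        return sites

    rng = random.Random(1)
    # Hypothetical 1000 x 800 region sampled on a 100-unit grid.
    sites = systematic_sites(1000.0, 800.0, 100.0, 100.0, rng)
    ```

    Only the start point is random; every other site is determined by the fixed step, which is what makes the scheme systematic random sampling rather than simple random sampling.
    
    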

  9. A fully blanketed early B star LTE model atmosphere using an opacity sampling technique

    International Nuclear Information System (INIS)

    Phillips, A.P.; Wright, S.L.

    1980-01-01

    A fully blanketed LTE model of a stellar atmosphere with T_e = 21914 K (theta_e = 0.23) and log g = 4 is presented. The model includes an explicit representation of the opacity due to the strongest lines, and uses a statistical opacity sampling technique to represent the weaker line opacity. The sampling technique is subjected to several tests and the model is compared with an atmosphere calculated using the line-distribution function method. The limitations of the distribution function method and the particular opacity sampling method used here are discussed in the light of the results obtained. (author)

  10. Prompt Gamma Activation Analysis (PGAA): Technique of choice for nondestructive bulk analysis of returned comet samples

    International Nuclear Information System (INIS)

    Lindstrom, D.J.; Lindstrom, R.M.

    1989-01-01

    Prompt gamma activation analysis (PGAA) is a well-developed analytical technique. The technique involves irradiation of samples in an external neutron beam from a nuclear reactor, with simultaneous counting of gamma rays produced in the sample by neutron capture. Capture of neutrons leads to excited nuclei which decay immediately to the ground state with the emission of energetic gamma rays. PGAA has several advantages over other techniques for the analysis of cometary materials: (1) It is nondestructive; (2) It can be used to determine abundances of a wide variety of elements, including most major and minor elements (Na, Mg, Al, Si, P, K, Ca, Ti, Cr, Mn, Fe, Co, Ni), volatiles (H, C, N, F, Cl, S), and some trace elements (those with high neutron capture cross sections, including B, Cd, Nd, Sm, and Gd); and (3) It is a true bulk analysis technique. Recent developments should improve the technique's sensitivity and accuracy considerably

  11. Can groundwater sampling techniques used in monitoring wells influence methane concentrations and isotopes?

    Science.gov (United States)

    Rivard, Christine; Bordeleau, Geneviève; Lavoie, Denis; Lefebvre, René; Malet, Xavier

    2018-03-06

    Methane concentrations and isotopic composition in groundwater are the focus of a growing number of studies. However, concerns are often expressed regarding the integrity of samples, as methane is very volatile and may partially exsolve during sample lifting in the well and transfer to sampling containers. While issues concerning bottle-filling techniques have already been documented, this paper documents a comparison of methane concentration and isotopic composition obtained with three devices commonly used to retrieve water samples from dedicated observation wells. This work lies within the framework of a larger project carried out in the Saint-Édouard area (southern Québec, Canada), whose objective was to assess the risk to shallow groundwater quality related to potential shale gas exploitation. The selected sampling devices, which were tested on ten wells during three sampling campaigns, consist of an impeller pump, a bladder pump, and disposable sampling bags (HydraSleeve). The sampling bags were used both before and after pumping, to verify the appropriateness of a no-purge approach, compared to the low-flow approach involving pumping until stabilization of field physicochemical parameters. Results show that methane concentrations obtained with the selected sampling techniques are usually similar and that there is no systematic bias related to a specific technique. Nonetheless, concentrations can sometimes vary quite significantly (up to 3.5 times) for a given well and sampling event. Methane isotopic composition obtained with all sampling techniques is very similar, except in some cases where sampling bags were used before pumping (no-purge approach), in wells where multiple groundwater sources enter the borehole.

  12. Long-term monitoring of the Danube river-Sampling techniques, radionuclide metrology and radioecological assessment

    International Nuclear Information System (INIS)

    Maringer, F.J.; Gruber, V.; Hrachowitz, M.; Baumgartner, A.; Weilner, S.; Seidel, C.

    2009-01-01

    Sampling techniques and radiometric methods developed and applied in a comprehensive radioecological study of the Danube River are presented. Results and radiometric data for sediment samples, collected by sediment traps in Austria and additionally by grab sampling in the Danube during research cruises between Germany and the delta (Black Sea), are shown and discussed. The goal of the investigation is the protection of the public and the environment, especially the sustainable use and conservation of human freshwater resources against harmful radioactive exposure.

  13. Comparison between correlated sampling and the perturbation technique of MCNP5 for fixed-source problems

    International Nuclear Information System (INIS)

    He Tao; Su Bingjing

    2011-01-01

    Highlights: → The performance of the MCNP differential operator perturbation technique is compared with that of the MCNP correlated sampling method for three types of fixed-source problems. → In terms of precision, the MCNP perturbation technique outperforms correlated sampling for one type of problem but performs comparably with or even under-performs correlated sampling for the other two types of problems. → In terms of accuracy, the MCNP perturbation calculations may predict inaccurate results for some of the test problems. However, the accuracy can be improved if the midpoint correction technique is used. - Abstract: Correlated sampling and the differential operator perturbation technique are two methods that enable MCNP (Monte Carlo N-Particle) to simulate small response change between an original system and a perturbed system. In this work the performance of the MCNP differential operator perturbation technique is compared with that of the MCNP correlated sampling method for three types of fixed-source problems. In terms of precision of predicted response changes, the MCNP perturbation technique outperforms correlated sampling for the problem involving variation of nuclide concentrations in the same direction but performs comparably with or even underperforms correlated sampling for the other two types of problems that involve void or variation of nuclide concentrations in opposite directions. In terms of accuracy, the MCNP differential operator perturbation calculations may predict inaccurate results that deviate from the benchmarks well beyond their uncertainty ranges for some of the test problems. However, the accuracy of the MCNP differential operator perturbation can be improved if the midpoint correction technique is used.
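
    Why correlated sampling helps for small response changes can be seen in a toy analog Monte Carlo model (a sketch of the general idea, not MCNP's implementation): reusing the same random number stream for the original and perturbed systems makes most of the statistical noise cancel in the difference.

    ```python
    import math
    import random

    def transmission(mu, n, rng):
        """Crude analog Monte Carlo: fraction of particles whose sampled free
        path exceeds a unit-thickness slab (i.e. they get through)."""
        hits = 0
        for _ in range(n):
            path = -math.log(1.0 - rng.random()) / mu
            if path > 1.0:
                hits += 1
        return hits / n

    n = 100_000
    mu, dmu = 1.0, 0.01   # original and perturbed attenuation coefficients

    # Independent runs: the tiny response change drowns in statistical noise.
    ind = (transmission(mu + dmu, n, random.Random(1))
           - transmission(mu, n, random.Random(2)))

    # Correlated sampling: reuse the SAME random number stream for both
    # systems, so the per-particle noise largely cancels in the difference.
    corr = (transmission(mu + dmu, n, random.Random(3))
            - transmission(mu, n, random.Random(3)))

    exact = math.exp(-(mu + dmu)) - math.exp(-mu)   # analytic difference
    ```

    With the shared stream, each particle's perturbed path is a deterministic rescaling of its original path, so only particles near the slab boundary contribute to the difference and its variance collapses.
    
    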

  14. Precision of systematic and random sampling in clustered populations: habitat patches and aggregating organisms.

    Science.gov (United States)

    McGarvey, Richard; Burch, Paul; Matthews, Janet M

    2016-01-01

    Natural populations of plants and animals spatially cluster because (1) suitable habitat is patchy, and (2) within suitable habitat, individuals aggregate further into clusters of higher density. We compare the precision of random and systematic field sampling survey designs under these two processes of species clustering. Second, we evaluate the performance of 13 estimators for the variance of the sample mean from a systematic survey. Replicated simulated surveys, as counts from 100 transects, allocated either randomly or systematically within the study region, were used to estimate population density in six spatial point populations including habitat patches and Matérn circular clustered aggregations of organisms, together and in combination. The standard one-start aligned systematic survey design, a uniform 10 x 10 grid of transects, was much more precise. Variances of the 10 000 replicated systematic survey mean densities were one-third to one-fifth of those from randomly allocated transects, implying transect sample sizes giving equivalent precision by random survey would need to be three to five times larger. Organisms being restricted to patches of habitat was alone sufficient to yield this precision advantage for the systematic design. But this improved precision for systematic sampling in clustered populations is underestimated by standard variance estimators used to compute confidence intervals. True variance for the survey sample mean was computed from the variance of 10 000 simulated survey mean estimates. Testing 10 published and three newly proposed variance estimators, the two variance estimators (v) that corrected for inter-transect correlation (ν₈ and ν(W)) were the most accurate and also the most precise in clustered populations. These greatly outperformed the two "post-stratification" variance estimators (ν₂ and ν₃) that are now more commonly applied in systematic surveys. Similar variance estimator performance rankings were found with

  15. Iterative algorithm of discrete Fourier transform for processing randomly sampled NMR data sets

    International Nuclear Information System (INIS)

    Stanek, Jan; Kozminski, Wiktor

    2010-01-01

    Spectra obtained by application of multidimensional Fourier Transformation (MFT) to sparsely sampled nD NMR signals are usually corrupted due to missing data. In the present paper this phenomenon is investigated on simulations and experiments. An effective iterative algorithm for artifact suppression for sparse on-grid NMR data sets is discussed in detail. It includes automated peak recognition based on statistical methods. The results enable one to study NMR spectra of high dynamic range of peak intensities preserving benefits of random sampling, namely the superior resolution in indirectly measured dimensions. Experimental examples include 3D 15N- and 13C-edited NOESY-HSQC spectra of human ubiquitin.
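The kind of iteration the abstract describes — locate the strongest spectral peak, subtract its contribution from the sparse measurements, repeat — can be sketched on a synthetic 1D signal. The two-tone signal, 25% on-grid sampling density, two-iteration stopping rule, and the simple amplitude threshold are illustrative assumptions, not the authors' algorithm parameters.

```python
import cmath
import random

random.seed(0)
N = 256

def tone(n):
    # Two complex tones with very different amplitudes (high dynamic range)
    return (10.0 * cmath.exp(2j * cmath.pi * 20 * n / N)
            + 0.5 * cmath.exp(2j * cmath.pi * 75 * n / N))

# 25% random on-grid sampling
kept = sorted(random.sample(range(N), N // 4))
samples = [(n, tone(n)) for n in kept]

def dft_from_samples(pts):
    # DFT treating unsampled points as zeros (source of sampling artifacts)
    return [sum(v * cmath.exp(-2j * cmath.pi * k * n / N) for n, v in pts)
            for k in range(N)]

# CLEAN-style iteration: pick the strongest peak, subtract its contribution
# from the measured points only, accumulate peaks in a "clean" spectrum.
clean = [0j] * N
for _ in range(2):
    spec = dft_from_samples(samples)
    k_max = max(range(N), key=lambda k: abs(spec[k]))
    amp = spec[k_max] / len(kept)          # peak amplitude estimate
    clean[k_max] += amp
    samples = [(n, v - amp * cmath.exp(2j * cmath.pi * k_max * n / N))
               for n, v in samples]

found = sorted(k for k in range(N) if abs(clean[k]) > 0.1)
print(found)
```

After the strong tone is subtracted, its sampling artifacts no longer mask the weak tone, which is the point of the iterative scheme for high-dynamic-range spectra.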

  16. Republic of Georgia estimates for prevalence of drug use: Randomized response techniques suggest under-estimation.

    Science.gov (United States)

    Kirtadze, Irma; Otiashvili, David; Tabatadze, Mzia; Vardanashvili, Irina; Sturua, Lela; Zabransky, Tomas; Anthony, James C

    2018-06-01

    Validity of responses in surveys is an important research concern, especially in emerging market economies where surveys in the general population are a novelty, and the level of social control is traditionally higher. The Randomized Response Technique (RRT) can be used as a check on response validity when the study aim is to estimate population prevalence of drug experiences and other socially sensitive and/or illegal behaviors. To apply RRT and to study potential under-reporting of drug use in a nation-scale, population-based general population survey of alcohol and other drug use. For this first-ever household survey on addictive substances for the Country of Georgia, we used the multi-stage probability sampling of 18-to-64-year-old household residents of 111 urban and 49 rural areas. During the interviewer-administered assessments, RRT involved pairing of sensitive and non-sensitive questions about drug experiences. Based upon the standard household self-report survey estimate, an estimated 17.3% [95% confidence interval, CI: 15.5%, 19.1%] of Georgian household residents have tried cannabis. The corresponding RRT estimate was 29.9% [95% CI: 24.9%, 34.9%]. The RRT estimates for other drugs such as heroin also were larger than the standard self-report estimates. We remain unsure about what is the "true" value for prevalence of using illegal psychotropic drugs in the Republic of Georgia study population. Our RRT results suggest that standard non-RRT approaches might produce 'under-estimates' or at best, highly conservative, lower-end estimates. Copyright © 2018 Elsevier B.V. All rights reserved.
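As a hedged illustration of how an RRT estimate corrects for masked answers, the sketch below simulates Warner's classic randomized response design; the abstract does not state which RRT variant was fielded, so the design and all numbers here are assumptions for illustration only.

```python
import random

random.seed(1)

def simulate_warner(true_pi=0.30, p=0.7, n=10_000):
    """Warner's randomized response: a private randomizer tells each
    respondent to answer either "I have used the drug" (with probability p)
    or "I have never used the drug" (with probability 1 - p). The
    interviewer sees only yes/no, never which question was answered."""
    yes = 0
    for _ in range(n):
        user = random.random() < true_pi
        asked_direct = random.random() < p
        yes += user if asked_direct else not user
    lam = yes / n                       # observed proportion of "yes"
    # The model gives lam = p*pi + (1 - p)*(1 - pi); solve for pi:
    return (lam - (1 - p)) / (2 * p - 1)

est = simulate_warner()
print(f"estimated prevalence: {est:.3f}")   # close to the true 0.30
```

Because no individual answer reveals drug use, respondents have less incentive to lie, which is why RRT estimates (like the 29.9% for cannabis above) can exceed direct self-report estimates.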

  17. Elemental analyses of groundwater: demonstrated advantage of low-flow sampling and trace-metal clean techniques over standard techniques

    Science.gov (United States)

    Creasey, C. L.; Flegal, A. R.

    The combined use of both (1) low-flow purging and sampling and (2) trace-metal clean techniques provides more representative measurements of trace-element concentrations in groundwater than results derived with standard techniques. The use of low-flow purging and sampling provides relatively undisturbed groundwater samples that are more representative of in situ conditions, and the use of trace-element clean techniques limits the inadvertent introduction of contaminants during sampling, storage, and analysis. When these techniques are applied, resultant trace-element concentrations are likely to be markedly lower than results based on standard sampling techniques. In a comparison of data derived from contaminated and control groundwater wells at a site in California, USA, trace-element concentrations from this study were 2-1000 times lower than those determined by the conventional techniques used in sampling of the same wells prior to (5 months) and subsequent to (1 month) the collections for this study. Specifically, the cadmium and chromium concentrations derived using standard sampling techniques exceed the California Maximum Contaminant Levels (MCL), whereas in this investigation concentrations of both of those elements are substantially below their MCLs. Consequently, the combined use of low-flow and trace-metal clean techniques may preclude erroneous reports of trace-element contamination in groundwater.

  18. Improving ambulatory saliva-sampling compliance in pregnant women: a randomized controlled study.

    Directory of Open Access Journals (Sweden)

    Julian Moeller

    Full Text Available OBJECTIVE: Noncompliance with scheduled ambulatory saliva sampling is common and has been associated with biased cortisol estimates in nonpregnant subjects. This study is the first to investigate in pregnant women strategies to improve ambulatory saliva-sampling compliance, and the association between sampling noncompliance and saliva cortisol estimates. METHODS: We instructed 64 pregnant women to collect eight scheduled saliva samples on two consecutive days each. Objective compliance with scheduled sampling times was assessed with a Medication Event Monitoring System and self-reported compliance with a paper-and-pencil diary. In a randomized controlled study, we estimated whether a disclosure intervention (informing women about objective compliance monitoring) and a reminder intervention (use of acoustical reminders) improved compliance. A mixed model analysis was used to estimate associations between women's objective compliance and their diurnal cortisol profiles, and between deviation from scheduled sampling and the cortisol concentration measured in the related sample. RESULTS: Self-reported compliance with a saliva-sampling protocol was 91%, and objective compliance was 70%. The disclosure intervention was associated with improved objective compliance (informed: 81%, noninformed: 60%; F(1,60) = 17.64, p<0.001), but not the reminder intervention (reminders: 68%, without reminders: 72%; F(1,60) = 0.78, p = 0.379). Furthermore, a woman's increased objective compliance was associated with a higher diurnal cortisol profile, F(2,64) = 8.22, p<0.001. Altered cortisol levels were observed in less objectively compliant samples, F(1,705) = 7.38, p = 0.007, with delayed sampling associated with lower cortisol levels. CONCLUSIONS: The results suggest that in pregnant women, objective noncompliance with scheduled ambulatory saliva sampling is common and is associated with biased cortisol estimates. To improve sampling compliance, results suggest

  19. LVQ-SMOTE - Learning Vector Quantization based Synthetic Minority Over-sampling Technique for biomedical data.

    Science.gov (United States)

    Nakamura, Munehiro; Kajiwara, Yusuke; Otsuka, Atsushi; Kimura, Haruhiko

    2013-10-02

    Over-sampling methods based on the Synthetic Minority Over-sampling Technique (SMOTE) have been proposed for classification problems with imbalanced biomedical data. However, the existing over-sampling methods achieve slightly better or sometimes worse results than the simplest SMOTE. In order to improve the effectiveness of SMOTE, this paper presents a novel over-sampling method using codebooks obtained by learning vector quantization. In general, even when an existing SMOTE method is applied to a biomedical dataset, its empty feature space is still so large that most classification algorithms would not perform well at estimating borderlines between classes. To tackle this problem, our over-sampling method generates synthetic samples that occupy more feature space than the other SMOTE algorithms. In short, our over-sampling method generates useful synthetic samples by referring to actual samples taken from real-world datasets. Experiments on eight real-world imbalanced datasets demonstrate that the proposed over-sampling method performs better than the simplest SMOTE on four of five standard classification algorithms. Moreover, the performance of our method increases if the latest SMOTE variant, called MWMOTE, is used in our algorithm. Experiments on datasets for β-turn type prediction show some important patterns that have not been seen in previous analyses. The proposed over-sampling method generates useful synthetic samples for the classification of imbalanced biomedical data. Besides, the proposed over-sampling method is basically compatible with basic classification algorithms and the existing over-sampling methods.
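For context, the baseline SMOTE that the authors compare against generates each synthetic point by interpolating between a minority sample and one of its k nearest minority neighbours. A minimal pure-Python sketch of that baseline (2D toy data and parameters are invented; this is not the paper's LVQ-based variant):

```python
import math
import random

random.seed(7)

def smote(minority, n_new, k=5):
    """Plain SMOTE sketch: each synthetic point lies at a random position
    on the segment between a minority sample and one of its k nearest
    minority-class neighbours."""
    synthetic = []
    for _ in range(n_new):
        x = random.choice(minority)
        neighbours = sorted((m for m in minority if m is not x),
                            key=lambda m: math.dist(x, m))[:k]
        nb = random.choice(neighbours)
        gap = random.random()           # interpolation fraction in [0, 1)
        synthetic.append(tuple(xi + gap * (ni - xi)
                               for xi, ni in zip(x, nb)))
    return synthetic

minority = [(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(20)]
new_points = smote(minority, n_new=40)
print(len(new_points))   # 40 synthetic minority samples
```

Because every synthetic point is a convex combination of two real minority samples, plain SMOTE only fills the segments between existing points; the paper's LVQ codebook idea aims to cover more of the otherwise empty feature space.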

  20. Modified emission-transmission method for determining trace elements in solid samples using the XRF techniques

    International Nuclear Information System (INIS)

    Poblete, V.; Alvarez, M.; Hermosilla, M.

    2000-01-01

    This is a study of the analysis of trace elements in medium-thick solid samples by the modified emission-transmission method, using the energy-dispersive X-ray fluorescence (EDXRF) technique. Absorption and enhancement effects are the main disadvantages of the EDXRF technique for the quantitative analysis of major and trace elements in solid samples. The implementation of this method and its application to a variety of samples were carried out using an infinitely thick multi-element reference sample, from which the absorption correction factors for all the analytes in the sample are calculated. The discontinuities in the mass absorption coefficient versus energy curves for each element, for medium-thick and homogeneous samples, are analyzed and corrected. The different theoretical and experimental variables are thoroughly tested using real samples, including certified material of known concentration. The simplicity of the calculation method and the results obtained demonstrate the method's high precision, with possibilities for the non-destructive routine analysis of different solid samples using the EDXRF technique (author)

  1. X-ray spectrometry and X-ray microtomography techniques for soil and geological samples analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kubala-Kukuś, A.; Banaś, D.; Braziewicz, J. [Institute of Physics, Jan Kochanowski University, ul. Świetokrzyska 15, 25-406 Kielce (Poland); Holycross Cancer Center, ul. Artwińskiego 3, 25-734 Kielce (Poland); Dziadowicz, M.; Kopeć, E. [Institute of Physics, Jan Kochanowski University, ul. Świetokrzyska 15, 25-406 Kielce (Poland); Majewska, U. [Institute of Physics, Jan Kochanowski University, ul. Świetokrzyska 15, 25-406 Kielce (Poland); Holycross Cancer Center, ul. Artwińskiego 3, 25-734 Kielce (Poland); Mazurek, M.; Pajek, M.; Sobisz, M.; Stabrawa, I. [Institute of Physics, Jan Kochanowski University, ul. Świetokrzyska 15, 25-406 Kielce (Poland); Wudarczyk-Moćko, J. [Holycross Cancer Center, ul. Artwińskiego 3, 25-734 Kielce (Poland); Góźdź, S. [Holycross Cancer Center, ul. Artwińskiego 3, 25-734 Kielce (Poland); Institute of Public Health, Jan Kochanowski University, IX Wieków Kielc 19, 25-317 Kielce (Poland)

    2015-12-01

    A particular subject of X-ray fluorescence analysis is its application in studies of the multielemental composition of samples over a wide range of concentrations, samples with different matrices, including inhomogeneous ones and those characterized by different grain sizes. Typical examples of such samples are soil or geological samples, for which XRF elemental analysis may be difficult due to disturbing XRF effects. In this paper the WDXRF technique was applied to the elemental analysis of different soil and geological samples (therapeutic mud, floral soil, brown soil, sandy soil, calcium aluminum cement). The sample morphology was analyzed using the X-ray microtomography technique. The paper discusses the differences between the compositions of the samples, the influence of sample preparation procedures on their morphology and, finally, a quantitative analysis. The results of the studies were statistically tested (one-way ANOVA and correlation coefficients). For the determination of lead concentration in samples of sandy soil and cement-like matrix, the WDXRF spectrometer calibration was performed. The elemental analysis of the samples was complemented with knowledge of the chemical composition obtained by X-ray powder diffraction.

  2. X-ray spectrometry and X-ray microtomography techniques for soil and geological samples analysis

    International Nuclear Information System (INIS)

    Kubala-Kukuś, A.; Banaś, D.; Braziewicz, J.; Dziadowicz, M.; Kopeć, E.; Majewska, U.; Mazurek, M.; Pajek, M.; Sobisz, M.; Stabrawa, I.; Wudarczyk-Moćko, J.; Góźdź, S.

    2015-01-01

    A particular subject of X-ray fluorescence analysis is its application in studies of the multielemental composition of samples over a wide range of concentrations, samples with different matrices, including inhomogeneous ones and those characterized by different grain sizes. Typical examples of such samples are soil or geological samples, for which XRF elemental analysis may be difficult due to disturbing XRF effects. In this paper the WDXRF technique was applied to the elemental analysis of different soil and geological samples (therapeutic mud, floral soil, brown soil, sandy soil, calcium aluminum cement). The sample morphology was analyzed using the X-ray microtomography technique. The paper discusses the differences between the compositions of the samples, the influence of sample preparation procedures on their morphology and, finally, a quantitative analysis. The results of the studies were statistically tested (one-way ANOVA and correlation coefficients). For the determination of lead concentration in samples of sandy soil and cement-like matrix, the WDXRF spectrometer calibration was performed. The elemental analysis of the samples was complemented with knowledge of the chemical composition obtained by X-ray powder diffraction.

  3. Symbol synchronization and sampling frequency synchronization techniques in real-time DDO-OFDM systems

    Science.gov (United States)

    Chen, Ming; He, Jing; Cao, Zizheng; Tang, Jin; Chen, Lin; Wu, Xian

    2014-09-01

    In this paper, we propose and experimentally demonstrate symbol synchronization and sampling frequency synchronization techniques in a real-time direct-detection optical orthogonal frequency division multiplexing (DDO-OFDM) system, over 100-km standard single mode fiber (SSMF), using a cost-effective directly modulated distributed feedback (DFB) laser. The experimental results show that the proposed symbol synchronization based on a training sequence (TS) has low complexity and high accuracy even at a sampling frequency offset (SFO) of 5000 ppm. Meanwhile, the proposed pilot-assisted sampling frequency synchronization between the digital-to-analog converter (DAC) and analog-to-digital converter (ADC) is capable of accurately estimating SFOs; the technique can also compensate SFO effects, leaving only a small residual SFO caused by deviation of the SFO estimate and a low-precision or unstable clock source. The two synchronization techniques are suitable for high-speed DDO-OFDM transmission systems.

  4. A technique for extracting blood samples from mice in fire toxicity tests

    Science.gov (United States)

    Bucci, T. J.; Hilado, C. J.; Lopez, M. T.

    1976-01-01

    The extraction of adequate blood samples from moribund and dead mice has been a problem because of the small quantity of blood in each animal and the short time available between the animals' death and coagulation of the blood. These difficulties are particularly critical in fire toxicity tests because removal of the test animals while observing proper safety precautions for personnel is time-consuming. Techniques for extracting blood samples from mice were evaluated, and a technique was developed to obtain up to 0.8 ml of blood from a single mouse after death. The technique involves rapid exposure and cutting of the posterior vena cava and accumulation of blood in the peritoneal space. Blood samples of 0.5 ml or more from individual mice have been consistently obtained as much as 16 minutes after apparent death. Results of carboxyhemoglobin analyses of blood appeared reproducible and consistent with carbon monoxide concentrations in the exposure chamber.

  5. Dental Students' Perceptions of Digital and Conventional Impression Techniques: A Randomized Controlled Trial.

    Science.gov (United States)

    Zitzmann, Nicola U; Kovaltschuk, Irina; Lenherr, Patrik; Dedem, Philipp; Joda, Tim

    2017-10-01

    The aim of this randomized controlled trial was to analyze inexperienced dental students' perceptions of the difficulty and applicability of digital and conventional implant impressions, as well as their preferences and performance. Fifty undergraduate dental students at a dental school in Switzerland were randomly divided into two groups (2×25). Group A first took digital impressions on a standardized phantom model and then conventional impressions, while the procedures were reversed for Group B. Participants were asked to complete a VAS questionnaire (0-100) on the level of difficulty and applicability (user/patient-friendliness) of both techniques. They were asked which technique they preferred and perceived to be more efficient. A quotient of "effective scan time per software-recorded time" (TRIOS) was calculated as an objective quality indicator for intraoral optical scanning (IOS). The majority of students perceived IOS as easier than the conventional technique. Most (72%) preferred the digital approach using IOS to take the implant impression over the conventional method (12%) or had no preference (12%). Although total work time was similar for males and females, the TRIOS quotient indicated that male students tended to use their time more efficiently. In this study, dental students with no clinical experience were very capable of acquiring digital skills, indicating that digital impression techniques can be included early in the dental curriculum to help students keep up with ongoing developments in the computer-assisted technologies used in oral rehabilitation.

  6. Systematic comparison of static and dynamic headspace sampling techniques for gas chromatography.

    Science.gov (United States)

    Kremser, Andreas; Jochmann, Maik A; Schmidt, Torsten C

    2016-09-01

    Six automated, headspace-based sample preparation techniques were used to extract volatile analytes from water, with the goal of establishing a systematic comparison between commonly available instrumental alternatives. To that end, these six techniques were used in conjunction with the same gas chromatography instrument for analysis of a common set of volatile organic compound (VOC) analytes. The methods were divided into three classes: static sampling (by syringe or loop), static enrichment (SPME and PAL SPME Arrow), and dynamic enrichment (ITEX and trap sampling). For PAL SPME Arrow, different sorption-phase materials were also included in the evaluation. To enable an effective comparison, method detection limits (MDLs), relative standard deviations (RSDs), and extraction yields were determined and are discussed for all techniques. While the static sampling techniques exhibited extraction yields (approx. 10-20%) sufficient for reliable use down to approx. 100 ng L⁻¹, the enrichment techniques displayed extraction yields of up to 80%, resulting in MDLs down to the picogram-per-liter range. RSDs for all techniques were below 27%. The choice among the different instrumental modes of operation (the classes above) was the most influential parameter in terms of extraction yields and MDLs. Individual methods within each class showed smaller deviations, and the smallest influence was observed when evaluating different sorption-phase materials for the individual enrichment techniques. The option of selecting specialized sorption-phase materials may, however, be more important when analyzing analytes with different properties, such as high polarity or the capability of specific molecular interactions. Graphical Abstract: PAL SPME Arrow during the extraction of volatile analytes from the headspace of an aqueous sample.

  7. Effect of joint mobilization techniques for primary total knee arthroplasty: Study protocol for a randomized controlled trial.

    Science.gov (United States)

    Xu, Jiao; Zhang, Juan; Wang, Xue-Qiang; Wang, Xuan-Lin; Wu, Ya; Chen, Chan-Cheng; Zhang, Han-Yu; Zhang, Zhi-Wan; Fan, Kai-Yi; Zhu, Qiang; Deng, Zhi-Wei

    2017-12-01

    Total knee arthroplasty (TKA) has become the procedure most preferred by patients for the relief of pain caused by knee osteoarthritis. TKA patients aim for a speedy recovery after the surgery. Joint mobilization techniques have been widely used in rehabilitation to relieve pain and improve joint mobility. However, relevant randomized controlled trials showing the curative effect of these techniques remain lacking to date. Accordingly, this study aims to investigate whether joint mobilization techniques are effective for primary TKA. We will conduct a single-blind, prospective, randomized, controlled trial of 120 patients with unilateral TKA. Patients will be randomized into an intervention group, a physical modality therapy group, and a usual care group. The intervention group will undergo joint mobilization manipulation treatment once a day and regular training twice a day for a month. The physical modality therapy group will undergo physical therapy once a day and regular training twice a day for a month. The usual care group will perform regular training twice a day for a month. Primary outcome measures will be based on the visual analog scale, the knee joint Hospital for Special Surgery score, range of motion, surrounded degree, and adverse effects. Secondary indicators will include manual muscle testing, the 36-Item Short Form Health Survey, Berg Balance Scale function evaluation, the Pittsburgh Sleep Quality Index, proprioception, and muscle morphology. We will conduct an intention-to-treat analysis if a subject withdraws from the trial. The important features of this trial for joint mobilization techniques in primary TKA are its randomization procedures, single blinding, large sample size, and standardized protocol. This study aims to investigate whether joint mobilization techniques are effective for early TKA patients. The result of this study may serve as a guide for TKA patients, medical personnel, and healthcare decision makers. It has been registered at http

  8. Sampling phased array - a new technique for ultrasonic signal processing and imaging

    OpenAIRE

    Verkooijen, J.; Boulavinov, A.

    2008-01-01

    Over the past 10 years, the improvement in the field of microelectronics and computer engineering has led to significant advances in ultrasonic signal processing and image construction techniques that are currently being applied to non-destructive material evaluation. A new phased array technique, called 'Sampling Phased Array', has been developed in the Fraunhofer Institute for Non-Destructive Testing([1]). It realises a unique approach of measurement and processing of ultrasonic signals. Th...

  9. Sampling phased array, a new technique for ultrasonic signal processing and imaging now available to industry

    OpenAIRE

    Verkooijen, J.; Bulavinov, A.

    2008-01-01

    Over the past 10 years the improvement in the field of microelectronics and computer engineering has led to significant advances in ultrasonic signal processing and image construction techniques that are currently being applied to non-destructive material evaluation. A new phased array technique, called "Sampling Phased Array" has been developed in the Fraunhofer Institute for non-destructive testing [1]. It realizes a unique approach of measurement and processing of ultrasonic signals. The s...

  10. Acute stress symptoms during the second Lebanon war in a random sample of Israeli citizens.

    Science.gov (United States)

    Cohen, Miri; Yahav, Rivka

    2008-02-01

    The aims of this study were to assess the prevalence of acute stress disorder (ASD) and acute stress symptoms (ASS) in Israel during the second Lebanon war. A telephone survey was conducted in July 2006 with a random sample of 235 residents of northern Israel, who were subjected to missile attacks, and of central Israel, who were not. Results indicate that ASS scores were higher among the northern respondents; 6.8% of the northern sample and 3.9% of the central sample met ASD criteria. The appearance of individual symptoms ranged from 15.4% for dissociative symptoms to 88.4% for reexperiencing, with significant differences between northern and central respondents only for reexperiencing and arousal. A low ASD rate and a moderate difference between areas subjected and not subjected to attack were found.

  11. Chance constrained problems: penalty reformulation and performance of sample approximation technique

    Czech Academy of Sciences Publication Activity Database

    Branda, Martin

    2012-01-01

    Roč. 48, č. 1 (2012), s. 105-122 ISSN 0023-5954 R&D Projects: GA ČR(CZ) GBP402/12/G097 Institutional research plan: CEZ:AV0Z10750506 Keywords : chance constrained problems * penalty functions * asymptotic equivalence * sample approximation technique * investment problem Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.619, year: 2012 http://library.utia.cas.cz/separaty/2012/E/branda-chance constrained problems penalty reformulation and performance of sample approximation technique.pdf

  12. Human mixed lymphocyte cultures. Evaluation of microculture technique utilizing the multiple automated sample harvester (MASH)

    Science.gov (United States)

    Thurman, G. B.; Strong, D. M.; Ahmed, A.; Green, S. S.; Sell, K. W.; Hartzman, R. J.; Bach, F. H.

    1973-01-01

    Use of lymphocyte cultures for in vitro studies such as pretransplant histocompatibility testing has established the need for standardization of this technique. A microculture technique has been developed that has facilitated the culturing of lymphocytes and increased the quantity of cultures feasible, while lowering the variation between replicate samples. Cultures were prepared for determination of tritiated thymidine incorporation using a Multiple Automated Sample Harvester (MASH). Using this system, the parameters that influence the in vitro responsiveness of human lymphocytes to allogeneic lymphocytes have been investigated. PMID:4271568

  13. Experimental study of laser ablation as sample introduction technique for inductively coupled plasma-mass spectrometry

    International Nuclear Information System (INIS)

    Van Winckel, S.

    2001-01-01

    The contribution consists of an abstract of a PhD thesis. In the PhD study, several complementary applications of laser-ablation were investigated in order to characterise experimentally laser ablation (LA) as a sample introduction technique for ICP-MS. Three applications of LA as a sample introduction technique are discussed: (1) the microchemical analysis of the patina of weathered marble; (2) the possibility to measure isotope ratios (in particular Pb isotope ratios in archaeological bronze artefacts); and (3) the determination of Si in Al as part of a dosimetric study of the BR2 reactor vessel

  14. Distributed fiber sparse-wideband vibration sensing by sub-Nyquist additive random sampling

    Science.gov (United States)

    Zhang, Jingdong; Zheng, Hua; Zhu, Tao; Yin, Guolu; Liu, Min; Bai, Yongzhong; Qu, Dingrong; Qiu, Feng; Huang, Xianbing

    2018-05-01

    The round-trip time of the light pulse limits the maximum detectable vibration frequency response range of phase-sensitive optical time domain reflectometry (φ-OTDR). Unlike the uniform laser pulse interval in conventional φ-OTDR, we randomly modulate the pulse interval, so that an equivalent sub-Nyquist additive random sampling (sNARS) is realized for every sensing point of the long interrogation fiber. For a φ-OTDR system with 10 km sensing length, the sNARS method is optimized by theoretical analysis and Monte Carlo simulation, and the experimental results verify that a wideband sparse signal can be identified and reconstructed. Such a method can broaden the vibration frequency response range of φ-OTDR, which is of great significance in sparse-wideband-frequency vibration signal detection, such as rail track monitoring and metal defect detection.
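The core idea — randomizing the sampling instants so that a sparse tone above the mean-rate Nyquist limit remains identifiable — can be sketched in a few lines. The mean rate, jitter range, and test frequency below are illustrative assumptions, not the paper's φ-OTDR pulse parameters.

```python
import cmath
import random

random.seed(3)

# Additive random sampling: t[k+1] = t[k] + u, with u uniform in
# [0.5*T, 1.5*T], so the mean rate is 1/T (illustrative parameters).
T = 1.0                  # mean sampling interval -> mean Nyquist 0.5 Hz
f_true = 1.3             # tone well above the mean-rate Nyquist frequency
t, times = 0.0, []
for _ in range(400):
    t += random.uniform(0.5 * T, 1.5 * T)
    times.append(t)
samples = [cmath.exp(2j * cmath.pi * f_true * tk).real for tk in times]

def power(f):
    # Magnitude of the direct nonuniform-time DFT at trial frequency f
    s = sum(v * cmath.exp(-2j * cmath.pi * f * tk)
            for tk, v in zip(times, samples))
    return abs(s) / len(times)

freqs = [k * 0.01 for k in range(1, 300)]   # scan 0.01 .. 2.99 Hz
f_est = max(freqs, key=power)
print(f"estimated frequency: {f_est:.2f} Hz")   # near 1.3 Hz
```

With uniform sampling at rate 1/T the 1.3 Hz tone would alias indistinguishably into the 0-0.5 Hz band; the random intervals smear the aliases into a noise-like floor, leaving a single dominant peak at the true frequency.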

  15. Location and multi-depot vehicle routing for emergency vehicles using tour coverage and random sampling

    Directory of Open Access Journals (Sweden)

    Alireza Goli

    2015-09-01

    Full Text Available The distribution and optimum allocation of emergency resources are among the most important tasks to be accomplished during a crisis. When a natural disaster such as an earthquake or flood takes place, it is necessary to deliver rescue efforts as quickly as possible. Therefore, it is important to find the optimum location and distribution of emergency relief resources. When a natural disaster occurs, it is not possible to reach some damaged areas. In this paper, location and multi-depot vehicle routing for emergency vehicles using tour coverage and random sampling is investigated. In this study, there is no need to visit all the places, and some demand points receive their needs from the nearest possible location. The proposed study is implemented for some randomly generated instances of different sizes. The preliminary results indicate that the proposed method was capable of reaching desirable solutions in a reasonable amount of time.

  16. Preparation of quality control samples for thyroid hormones T3 and T4 in radioimmunoassay techniques

    International Nuclear Information System (INIS)

    Ahmed, F.O.A.

    2006-03-01

    Today, radioimmunoassay (RIA) has become one of the best techniques for the quantitative analysis of very low concentrations of different substances. RIA is widely used in medical and research laboratories. To maintain high specificity and accuracy in RIA and other related techniques, quality controls must be introduced. In this dissertation, quality control samples for the thyroid hormones triiodothyronine (T3) and thyroxine (T4) were prepared using RIA techniques. Ready-made Chinese T4 and T3 RIA kits were used, and the IAEA statistical package was selected. (Author)

  17. Practical aspects of the resin bead technique for mass spectrometric sample loading

    International Nuclear Information System (INIS)

    Walker, R.L.; Pritchard, C.A.; Carter, J.A.; Smith, D.H.

    1976-07-01

    Using an anion resin bead as a loading vehicle for uranium and plutonium samples which are to be analyzed isotopically in a mass spectrometer has many advantages over conventional techniques. It is applicable to any laboratory routinely performing such analyses, but should be particularly relevant for Safeguards' purposes. Because the techniques required differ markedly from those of conventional methods, this report has been written to describe them in detail to enable those unfamiliar with the technique to master it with a minimum of trouble

  18. Recent advances in sample preparation techniques and methods of sulfonamides detection - A review.

    Science.gov (United States)

    Dmitrienko, Stanislava G; Kochuk, Elena V; Apyari, Vladimir V; Tolmacheva, Veronika V; Zolotov, Yury A

    2014-11-19

    Sulfonamides (SAs) have been the most widely used antimicrobial drugs for more than 70 years, and their residues in foodstuffs and environmental samples pose serious health hazards. For this reason, sensitive and specific methods for the quantification of these compounds in numerous matrices have been developed. This review intends to provide an updated overview of the recent trends over the past five years in sample preparation techniques and methods for detecting SAs. Examples of the sample preparation techniques, including liquid-liquid and solid-phase extraction, dispersive liquid-liquid microextraction and QuEChERS, are given. Different methods of detecting the SAs present in food and feed and in environmental, pharmaceutical and biological samples are discussed. Copyright © 2014 Elsevier B.V. All rights reserved.

  19. Analysis of pure and malachite green doped polysulfone sample using FT-IR technique

    Science.gov (United States)

    Nayak, Rashmi J.; Khare, P. K.; Nayak, J. G.

    2018-05-01

    Samples of pure and malachite green doped polysulfone in the form of foil were prepared by the isothermal immersion technique. For the preparation of the pure sample, 4 g of polysulfone was dissolved in 50 ml of dimethylformamide (DMF) solvent, while for the preparation of the doped samples 10 mg, 50 mg and 100 mg of malachite green were mixed with 4 g of polysulfone, respectively. The structural characterization of these pure and doped samples was studied using Fourier transform infrared spectroscopy (FT-IR). This study shows that the intensity of transmittance decreases as the doping ratio in pure polysulfone increases. The reduction in the intensity of transmittance is clearly apparent in the present case; moreover, the bands were broader, which indicates a charge-transfer interaction between the donor and acceptor molecules.

  20. Prevalence and correlates of problematic smartphone use in a large random sample of Chinese undergraduates.

    Science.gov (United States)

    Long, Jiang; Liu, Tie-Qiao; Liao, Yan-Hui; Qi, Chang; He, Hao-Yu; Chen, Shu-Bao; Billieux, Joël

    2016-11-17

    Smartphones are becoming a daily necessity for most undergraduates in Mainland China. Because the present scenario of problematic smartphone use (PSU) is largely unexplored, in the current study we aimed to estimate the prevalence of PSU and to screen suitable predictors for PSU among Chinese undergraduates in the framework of the stress-coping theory. A sample of 1062 undergraduate smartphone users was recruited by means of the stratified cluster random sampling strategy between April and May 2015. The Problematic Cellular Phone Use Questionnaire was used to identify PSU. We evaluated five candidate risk factors for PSU by using logistic regression analysis while controlling for demographic characteristics and specific features of smartphone use. The prevalence of PSU among Chinese undergraduates was estimated to be 21.3%. The risk factors for PSU were majoring in the humanities, high monthly income from the family (≥1500 RMB), serious emotional symptoms, high perceived stress, and perfectionism-related factors (high doubts about actions, high parental expectations). PSU among undergraduates appears to be ubiquitous and thus constitutes a public health issue in Mainland China. Although further longitudinal studies are required to test whether PSU is a transient phenomenon or a chronic and progressive condition, our study successfully identified socio-demographic and psychological risk factors for PSU. These results, obtained from a random and thus representative sample of undergraduates, open up new avenues in terms of prevention and regulation policies.
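As a quick illustration of the sampling uncertainty attached to a prevalence estimate like the 21.3% reported above, a normal-approximation confidence interval can be computed from the sample size. This is a textbook sketch, not an analysis from the study itself:

```python
import math

def prevalence_ci(p_hat, n, z=1.96):
    """Normal-approximation confidence interval for an estimated proportion."""
    se = math.sqrt(p_hat * (1.0 - p_hat) / n)
    return p_hat - z * se, p_hat + z * se

# Reported figures: 21.3% PSU prevalence in a sample of 1062 undergraduates
lo, hi = prevalence_ci(0.213, 1062)
print(f"95% CI: {lo:.3f}-{hi:.3f}")  # 95% CI: 0.188-0.238
```

With n above a thousand, the interval is only a couple of percentage points wide, which is why a single well-drawn random sample can support population-level claims.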

  1. Sampling methods and non-destructive examination techniques for large radioactive waste packages

    International Nuclear Information System (INIS)

    Green, T.H.; Smith, D.L.; Burgoyne, K.E.; Maxwell, D.J.; Norris, G.H.; Billington, D.M.; Pipe, R.G.; Smith, J.E.; Inman, C.M.

    1992-01-01

    Progress is reported on work undertaken to evaluate quality checking methods for radioactive wastes. A sampling rig was designed, fabricated and used to develop techniques for the destructive sampling of cemented simulant waste using remotely operated equipment. An engineered system for the containment of cooling water was designed and manufactured and successfully demonstrated with the drum and coring equipment mounted in both vertical and horizontal orientations. The preferred in-cell orientation was found to be with the drum and coring machinery mounted in a horizontal position. Small powdered samples can be taken from cemented homogeneous waste cores using a hollow drill/vacuum section technique with the preferred subsampling technique being to discard the outer 10 mm layer to obtain a representative sample of the cement core. Cement blends can be dissolved using fusion techniques and the resulting solutions are stable to gelling for periods in excess of one year. Although hydrochloric acid and nitric acid are promising solvents for dissolution of cement blends, the resultant solutions tend to form silicic acid gels. An estimate of the beta-emitter content of cemented waste packages can be obtained by a combination of non-destructive and destructive techniques. The errors will probably be in excess of +/-60 % at the 95 % confidence level. Real-time X-ray video-imaging techniques have been used to analyse drums of uncompressed, hand-compressed, in-drum compacted and high-force compacted (i.e. supercompacted) simulant waste. The results have confirmed the applicability of this technique for NDT of low-level waste. 8 refs., 12 figs., 3 tabs

  2. Comparison of Techniques for Sampling Adult Necrophilous Insects From Pig Carcasses.

    Science.gov (United States)

    Cruise, Angela; Hatano, Eduardo; Watson, David W; Schal, Coby

    2018-02-06

    Studies of the pre-colonization interval and the mechanisms driving necrophilous insect ecological succession depend on effective sampling of adult insects and knowledge of their diel and successional activity patterns. The number of insects trapped, their diversity, and diel periodicity were compared among four sampling methods on neonate pigs. Sampling method, time of day and decomposition age of the pigs significantly affected the number of insects sampled from pigs. We also found significant interactions between sampling method and decomposition day, and between time of sampling and decomposition day. No single method was superior to the others during all three decomposition days. Sampling times after noon yielded the largest samples during the first 2 d of decomposition; on day 3 of decomposition, however, all sampling times were equally effective. Therefore, to maximize insect collections from neonate pigs, the sampling method used must vary by decomposition day. The suction trap collected the most species-rich samples, but sticky trap samples were the most diverse when both species richness and evenness were factored into a Shannon diversity index. Repeated sampling during the noon to 18:00 hours period was most effective for obtaining the maximum diversity of trapped insects. Integrating multiple sampling techniques would most effectively sample the necrophilous insect community. However, because all four tested methods were deficient at sampling beetle species, future work should focus on optimizing the most promising methods, alone or in combination, and incorporate hand collections of beetles.
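The distinction drawn above between a species-rich sample and a diverse one hinges on the Shannon index, which weighs evenness as well as richness. A minimal sketch, with trap counts invented purely for illustration:

```python
import math

def shannon_index(counts):
    """Shannon diversity H' = -sum(p_i * ln p_i) over species proportions."""
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c > 0)

# Hypothetical trap catches: the "suction" sample has more species but is
# heavily dominated by one; the "sticky" sample is less rich but more even.
sticky = [30, 28, 25]
suction = [80, 5, 3, 2]

print(round(shannon_index(sticky), 2))   # higher H' despite fewer species
print(round(shannon_index(suction), 2))  # lower H' despite more species
```

This reproduces the abstract's observation in miniature: a dominated community can score lower on Shannon diversity than a smaller but more even one.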

  3. Experimental technique to measure thoron generation rate of building material samples using RAD7 detector

    International Nuclear Information System (INIS)

    Csige, I.; Szabó, Zs.; Szabó, Cs.

    2013-01-01

    Thoron (220Rn) is the second most abundant radon isotope in our living environment. In some dwellings it is present in significant amounts, which calls for its identification and remediation. Indoor thoron originates mainly from building materials. In this work we developed and tested an experimental technique to measure the thoron generation rate in building material samples using a RAD7 radon-thoron detector. The mathematical model of the measurement technique provides the thoron concentration response of the RAD7 as a function of the sample thickness. For experimental validation of the technique, an adobe building material sample was selected and the thoron concentration was measured at nineteen different sample thicknesses. By fitting the parameters of the model to the measurement results, both the generation rate and the diffusion length of thoron were estimated. We also determined the optimal sample thickness for estimating the thoron generation rate from a single measurement. -- Highlights: • RAD7 is used for the determination of the thoron generation rate (emanation). • The described model takes into account thoron decay and attenuation. • The model describes the experimental results well. • A single-point measurement method is offered at a determined sample thickness.
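The fitting step can be sketched as follows. The saturating tanh response and all numbers below are assumptions for illustration (a common diffusion-limited form), not the authors' published model; a real analysis would use a proper curve fitter rather than a grid search:

```python
import math

def thoron_response(d, g, diff_len):
    """Assumed saturation model: signal ~ G * L * tanh(d / L), where G is the
    generation term, L the diffusion length, and d the sample thickness."""
    return g * diff_len * math.tanh(d / diff_len)

# Synthetic "measurements" generated from known parameters G = 40, L = 2.0 cm
thicknesses = [0.5, 1.0, 2.0, 4.0, 8.0]
data = [(d, thoron_response(d, 40.0, 2.0)) for d in thicknesses]

# Coarse grid-search least squares (a stand-in for scipy-style curve fitting)
candidates = ((g / 2, l / 2) for g in range(2, 200) for l in range(1, 20))
g_fit, l_fit = min(
    candidates,
    key=lambda p: sum((c - thoron_response(d, *p)) ** 2 for d, c in data),
)
print(g_fit, l_fit)  # recovers 40.0 2.0
```

Because the response saturates once the thickness exceeds a few diffusion lengths, measurements at several thicknesses (or one well-chosen thickness, as the abstract notes) are enough to separate G from L.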

  4. A smart rotary technique versus conventional pulpectomy for primary teeth: A randomized controlled clinical study.

    Science.gov (United States)

    Mokhtari, Negar; Shirazi, Alireza-Sarraf; Ebrahimi, Masoumeh

    2017-11-01

    Techniques with adequate accuracy of working length determination, along with shorter treatment duration, seem essential for pulpectomy procedures in pediatric dentistry. The aim of the present study was to evaluate the accuracy of root canal length measurement with the Root ZX II apex locator and a rotary system in pulpectomy of primary teeth. In this randomized controlled clinical trial, complete pulpectomy was performed on 80 mandibular primary molars in 80 children aged 4-6 years. The study population was randomly divided into case and control groups. In the control group conventional pulpectomy was performed, while in the case group the working length was determined by the electronic apex locator Root ZX II and the canals were instrumented with Mtwo rotary files. Statistical evaluation was performed using Mann-Whitney and Chi-square tests (P < 0.05). There were no significant differences between the electronic apex locator Root ZX II and the conventional method in the accuracy of root canal length determination. However, significantly less time was needed for instrumenting with rotary files (P = 0.000). Considering the comparable accuracy of root canal length determination and the considerably shorter instrumentation time with the Root ZX II apex locator and rotary system, it may be suggested for pulpectomy in primary molar teeth. Key words: Rotary technique, conventional technique, pulpectomy, primary teeth.

  5. Effect of manual therapy techniques on headache disability in patients with tension-type headache. Randomized controlled trial.

    Science.gov (United States)

    Espí-López, G V; Rodríguez-Blanco, C; Oliva-Pascual-Vaca, A; Benítez-Martínez, J C; Lluch, E; Falla, D

    2014-12-01

    Tension-type headache (TTH) is the most common type of primary headache; however, there is no clear evidence as to which specific treatment is most effective or whether combined treatment is more effective than individual treatments. The aim was to assess the effectiveness of manual therapy techniques, applied to the suboccipital region, on aspects of disability in a sample of patients with tension-type headache. Randomized controlled trial, conducted in a specialized centre for headache treatment. Seventy-six patients (62 women; age: 39.9 ± 10.9 years) with episodic or chronic TTH participated. Patients were randomly divided into four treatment groups: 1) suboccipital soft tissue inhibition; 2) occiput-atlas-axis manipulation; 3) combined treatment of both techniques; 4) control. Four sessions were applied over 4 weeks and disability was assessed before and after treatment using the Headache Disability Inventory (HDI). Headache frequency, severity and the functional and emotional subscales of the questionnaire were assessed. Photophobia, phonophobia and pericranial tenderness were also monitored. Headache frequency was significantly reduced with the manipulative and combined treatments, as were other headache features across the treatment groups; treatment also significantly reduced the score on the emotional subscale of the HDI. When treatments were combined, effectiveness was noted for all aspects of disability and for other symptoms including photophobia, phonophobia and pericranial tenderness. Although individual manual therapy treatments showed a positive change in headache features, measures of photophobia, phonophobia and pericranial tenderness improved only in the group that received the combined treatment, suggesting that combined treatment is the most appropriate for symptomatic relief of TTH.

  6. 3D-Laser-Scanning Technique Applied to Bulk Density Measurements of Apollo Lunar Samples

    Science.gov (United States)

    Macke, R. J.; Kent, J. J.; Kiefer, W. S.; Britt, D. T.

    2015-01-01

    In order to better interpret gravimetric data from orbiters such as GRAIL and LRO to understand the subsurface composition and structure of the lunar crust, it is important to have a reliable database of the density and porosity of lunar materials. To this end, we have been surveying these physical properties in both lunar meteorites and Apollo lunar samples. To measure porosity, both grain density and bulk density are required. For bulk density, our group has historically made extensive use of sub-mm bead immersion techniques, though several factors have made this technique problematic for our work with Apollo samples. Samples allocated for measurement are often smaller than optimal for the technique, leading to large error bars. Also, for some samples we were required to use pure alumina beads instead of our usual glass beads; the alumina beads were subject to undesirable static effects, producing unreliable results. Other investigators have tested the use of 3D laser scanners on meteorites for measuring bulk volumes. Early work, though promising, was plagued with difficulties, including poor response on dark or reflective surfaces, difficulty reproducing sharp edges, and long processing times for producing shape models. Due to progress in technology, however, laser scanners have improved considerably in recent years. We tested this technique on 27 lunar samples in the Apollo collection using a scanner at NASA Johnson Space Center. We found it to be reliable and more precise than beads, with the added benefit that it involves no direct contact with the sample, enabling the study of particularly friable samples for which bead immersion is not possible.
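Once a laser scan supplies the bulk volume and (say) pycnometry supplies the grain volume, porosity follows from the two densities. A minimal sketch, with hypothetical numbers:

```python
def porosity(mass_g, bulk_volume_cm3, grain_volume_cm3):
    """Porosity from bulk volume (e.g. a laser-scan shape model) and grain
    volume (e.g. helium pycnometry): fraction of bulk volume that is void."""
    bulk_density = mass_g / bulk_volume_cm3     # includes pore space
    grain_density = mass_g / grain_volume_cm3   # solid material only
    return 1.0 - bulk_density / grain_density

# Hypothetical sample: 150 g, 60 cm^3 scanned bulk volume, 50 cm^3 grain volume
print(round(porosity(150.0, 60.0, 50.0), 3))  # 0.167, i.e. ~16.7% porosity
```

The precision of the scanned bulk volume feeds directly into the porosity error bar, which is why the improved scanner precision reported above matters.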

  7. Review of sample preparation techniques for the analysis of pesticide residues in soil.

    Science.gov (United States)

    Tadeo, José L; Pérez, Rosa Ana; Albero, Beatriz; García-Valcárcel, Ana I; Sánchez-Brunete, Consuelo

    2012-01-01

    This paper reviews the sample preparation techniques used for the analysis of pesticides in soil. The present status and recent advances made during the last 5 years in these methods are discussed. The analysis of pesticide residues in soil requires the extraction of analytes from this matrix, followed by a cleanup procedure, when necessary, prior to their instrumental determination. The optimization of sample preparation is a very important part of the method development that can reduce the analysis time, the amount of solvent, and the size of samples. This review considers all aspects of sample preparation, including extraction and cleanup. Classical extraction techniques, such as shaking, Soxhlet, and ultrasonic-assisted extraction, and modern techniques like pressurized liquid extraction, microwave-assisted extraction, solid-phase microextraction and QuEChERS (Quick, Easy, Cheap, Effective, Rugged, and Safe) are reviewed. The different cleanup strategies applied for the purification of soil extracts are also discussed. In addition, the application of these techniques to environmental studies is considered.

  8. A review of analytical techniques for the determination of carbon-14 in environmental samples

    International Nuclear Information System (INIS)

    Milton, G.M.; Brown, R.M.

    1993-11-01

    This report contains a brief summary of the analytical techniques commonly used for the determination of radiocarbon in a variety of environmental samples. Details of the applicable procedures developed and tested in the Environmental Research Branch at Chalk River Laboratories are appended.

  9. Attempts to develop a new nuclear measurement technique of β-glucuronidase levels in biological samples

    International Nuclear Information System (INIS)

    Unak, T.; Avcibasi, U.; Yildirim, Y.; Cetinkaya, B.

    2003-01-01

    β-Glucuronidase is one of the most important hydrolytic enzymes in living systems and plays an essential role in the detoxification pathway of toxic materials incorporated into the metabolism. Some organs, especially the liver, and some tumour tissues have high levels of β-glucuronidase activity. As a result of the enzymatic activity of some kinds of tumour cells, the radiolabelled glucuronide conjugates of cytotoxic, as well as radiotoxic, compounds have potentially very valuable diagnostic and therapeutic applications in cancer research. For this reason, sensitive measurement of β-glucuronidase levels in normal and tumour tissues is a very important step for these kinds of applications. In the classical method of measuring β-glucuronidase activity, the quantity of phenolphthalein liberated from its glucuronide conjugate, i.e. phenolphthalein-glucuronide, by β-glucuronidase is generally measured by the spectrophotometric technique. The lower detection limit of phenolphthalein by the spectrophotometric technique is about 1-3 mg. This means that β-glucuronidase levels could not be detected in biological samples having lower levels of β-glucuronidase activity, and therefore the applications of the spectrophotometric technique in cancer research are very seriously limited. Starting from this consideration, we recently attempted to develop a new nuclear technique to measure much lower concentrations of β-glucuronidase in biological samples. To improve the detection limit, phenolphthalein-glucuronide and also phenyl-N-glucuronide were radioiodinated with 131I and their radioactivity was measured by the counting technique. Thus, the quantity of phenolphthalein or aniline radioiodinated with 131I and liberated by the deglucuronidation activity of β-glucuronidase was used in an attempt to measure levels lower than those accessible to the spectrophotometric technique. The results obtained clearly verified that 0.01 pg level of

  10. Atmospheric pressure surface sampling/ionization techniques for direct coupling of planar separations with mass spectrometry.

    Science.gov (United States)

    Pasilis, Sofie P; Van Berkel, Gary J

    2010-06-18

    Planar separations, which include thin layer chromatography and gel electrophoresis, are in widespread use as important and powerful tools for conducting separations of complex mixtures. To increase the utility of planar separations, new methods are needed that allow in situ characterization of the individual components of the separated mixtures. A large number of atmospheric pressure surface sampling and ionization techniques for use with mass spectrometry have emerged in the past several years, and several have been investigated as a means for mass spectrometric read-out of planar separations. In this article, we review the atmospheric pressure surface sampling and ionization techniques that have been used for the read-out of planar separation media. For each technique, we briefly explain the operational basics and discuss the analyte type for which it is appropriate and some specific applications from the literature.

  11. Heating and thermal control of brazing technique to break contamination path for potential Mars sample return

    Science.gov (United States)

    Bao, Xiaoqi; Badescu, Mircea; Sherrit, Stewart; Bar-Cohen, Yoseph; Campos, Sergio

    2017-04-01

    The potential return of Mars sample material is of great interest to the planetary science community, as it would enable extensive analysis of samples with highly sensitive laboratory instruments. It is important to ensure that such a mission concept would not bring any living microbes, which may possibly exist on Mars, back to Earth's environment. In order to ensure the isolation of Mars microbes from Earth's atmosphere, a brazing sealing-and-sterilizing technique was proposed to break the Mars-to-Earth contamination path. Effectively heating the brazing zone in high-vacuum space and controlling the sample temperature to preserve sample integrity are key challenges to the implementation of this technique. The break-the-chain procedures for the container configurations under consideration were simulated by multi-physics finite element models. Different heating methods, including induction and resistive/radiation heating, were evaluated. The temperature profiles of Martian samples in a proposed container structure were predicted. The results show that the sealing and sterilizing process can be controlled such that the sample temperature is maintained below the level that may cause damage, and that the brazing technique is a feasible approach to breaking the contamination path.

  12. THE STUDY OF HEAVY METAL FROM ENVIRONMENTAL SAMPLES BY ATOMIC TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Ion V. POPESCU

    2011-05-01

    Full Text Available Using the atomic absorption spectrometry (AAS) and energy-dispersive X-ray fluorescence spectrometry (EDXRF) techniques, we analyzed the contents of heavy metals (Cd, Cr, Ni, Pb, Ti, Sr, Co, Bi) in eight wild mushroom species and their soil substrates (48 samples of the eight fungal species and 32 underlying soil samples), collected from ten forest sites of Dâmbovița County, Romania. It was determined that the elements, especially heavy metals, in the soil were characteristic of the acidic soils of Romanian forest lands and are influenced by industrial pollution. The analytical capabilities of the AAS and EDXRF techniques were compared, and the transfer of heavy metals from substrate to mushrooms was studied. The accumulation coefficients of essential and heavy metals were calculated as well. The heavy metal contents of all analyzed mushrooms were generally higher than previously reported in the literature.

  13. Toward greener analytical techniques for the absolute quantification of peptides in pharmaceutical and biological samples.

    Science.gov (United States)

    Van Eeckhaut, Ann; Mangelings, Debby

    2015-09-10

    Peptide-based biopharmaceuticals represent one of the fastest growing classes of new drug molecules. New reaction types included in the synthesis strategies to reduce the rapid metabolism of peptides, along with the availability of new formulation and delivery technologies, have resulted in increased marketing of peptide drug products. In this regard, the development of analytical methods for the quantification of peptides in pharmaceutical and biological samples is of utmost importance. From the sample preparation step to analysis by chromatographic or electrophoretic methods, many difficulties must be tackled. Recent developments in analytical techniques place increasing emphasis on green analytical techniques. This review discusses the progress made in, and the challenges observed during, green analytical method development for the quantification of peptides in pharmaceutical and biological samples.

  14. [Influence of Natural Dissolved Organic Matter on the Passive Sampling Technique and its Application].

    Science.gov (United States)

    Yu, Shang-yun; Zhou, Yan-mei

    2015-08-01

    This paper studied the effects of different concentrations of natural dissolved organic matter (DOM) on a passive sampling technique. The results showed that the presence of DOM affected the membrane's ability to adsorb organic pollutants. For compounds with lg K(OW) of 3-5, DOM had little impact on the adsorption of organic matter by the membrane; for lg K(OW) > 5.5, DOM significantly increased the adsorption capacity of the membrane. Meanwhile, the LDPE passive sampling technique was applied to monitor PAHs and PAEs in the pore water of three surface sediments in the Taizi River. All of the target pollutants were detected to varying degrees at each sampling point. Finally, the quotient method was used to assess the ecological risks of the PAHs and PAEs. The results showed that fluoranthene exceeded the reference value for the aquatic ecosystem, indicating a substantial ecological risk.
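The quotient method mentioned above reduces to dividing a measured concentration by its reference (effect threshold) value and flagging quotients at or above one. A sketch with hypothetical concentrations, not values from the study:

```python
def risk_quotient(measured, reference):
    """Quotient-method screening: RQ >= 1 flags potential ecological risk."""
    return measured / reference

# Hypothetical pore-water concentrations vs. reference values (ng/L)
samples = {"fluoranthene": (120.0, 100.0), "phenanthrene": (30.0, 240.0)}
for name, (conc, ref) in samples.items():
    rq = risk_quotient(conc, ref)
    print(name, round(rq, 2), "risk" if rq >= 1.0 else "ok")
```

In this toy example only fluoranthene exceeds its reference value, mirroring the kind of conclusion the abstract reports.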

  15. Estimation of trace levels of plutonium in urine samples by fission track technique

    International Nuclear Information System (INIS)

    Sawant, P.D.; Prabhu, S.; Pendharkar, K.A.; Kalsi, P.C.

    2009-01-01

    Individual monitoring of radiation workers handling Pu in various nuclear installations requires the detection of trace levels of plutonium in bioassay samples. It is necessary to develop methods that can detect urinary excretion of Pu in the fraction-of-a-mBq range. Therefore, a sensitive method based on fission track analysis has been developed for the measurement of trace levels of Pu in bioassay samples. In this technique, chemically separated plutonium from the sample and a Pu standard were electrodeposited on planchettes, covered with Lexan solid state nuclear track detector (SSNTD), and irradiated with thermal neutrons in the APSARA reactor of Bhabha Atomic Research Centre, India. The fission track densities in the Lexan films of the sample and the standard were used to calculate the amount of Pu in the sample. The minimum amount of Pu that can be analyzed by this method using doubly distilled electronic-grade (E.G.) reagents is about 12 μBq/L. (author)
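The comparative track-counting step can be sketched as follows: since sample and standard are irradiated together, the unknown Pu mass scales with the ratio of track counts. All numbers below are invented for illustration and the equal-area, equal-efficiency assumption is simplified:

```python
def pu_mass(tracks_sample, tracks_std, std_mass_ug):
    """Comparative fission-track estimate: with sample and standard irradiated
    together, Pu mass scales with the ratio of fission-track counts
    (equal detector areas and counting efficiencies assumed)."""
    return std_mass_ug * tracks_sample / tracks_std

# Hypothetical counts over equal Lexan areas, against a 0.010 ug Pu standard
print(pu_mass(420, 1050, 0.010))
```

The comparative form cancels the neutron fluence and fission cross-section, which is what makes the method usable without an absolute flux calibration.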

  16. A radioanalytical technique using (n,2n) reaction for the elemental analysis of samples

    International Nuclear Information System (INIS)

    Labor, M.

    1985-11-01

    A technique to determine the elemental composition of samples is reported. The principle of the technique employs the internal standard method and involves the resolution of complex annihilation spectra. The technique has been applied to the determination of the mass of nitrogen, m(N), and that of potassium, m(K), in known masses of potassium nitrate. The percentage differences between the calculated and actual masses in 2 g and 3 g of potassium nitrate are 1.0 and 0.7 respectively for potassium, and 1.0 for nitrogen. The use of more simultaneous equations than strictly necessary in solving for m(N) and m(K) is one of the advantages of the technique. (author)
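Using more simultaneous equations than unknowns, as the abstract advertises, leads naturally to a least-squares solution. A minimal sketch with made-up sensitivity coefficients (the real coefficients would come from the annihilation-spectrum analysis):

```python
# Each measurement i is modeled as a_i = cN_i * m_N + cK_i * m_K. With four
# equations and two unknowns, solve the 2x2 normal equations by hand.
rows = [  # (cN, cK, measured activity) -- illustrative numbers only
    (2.0, 1.0, 7.0),
    (1.0, 3.0, 11.0),
    (4.0, 1.0, 11.0),
    (1.0, 1.0, 5.0),
]

sxx = sum(cn * cn for cn, ck, a in rows)
sxy = sum(cn * ck for cn, ck, a in rows)
syy = sum(ck * ck for cn, ck, a in rows)
sxa = sum(cn * a for cn, ck, a in rows)
sya = sum(ck * a for cn, ck, a in rows)

det = sxx * syy - sxy * sxy
m_n = (sxa * syy - sya * sxy) / det  # least-squares estimate of m(N)
m_k = (sxx * sya - sxy * sxa) / det  # least-squares estimate of m(K)
print(m_n, m_k)  # 2.0 3.0 for this consistent toy data
```

The redundancy is the advantage: with noisy measurements the extra equations average down the error instead of leaving the solution at the mercy of any single measurement.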

  17. Accelerated Solvent Extraction: An Innovative Sample Extraction Technique for Natural Products

    International Nuclear Information System (INIS)

    Hazlina Ahmad Hassali; Azfar Hanif Abd Aziz; Rosniza Razali

    2015-01-01

    Accelerated solvent extraction (ASE) is one of the novel techniques that have been developed for the extraction of phytochemicals from plants in order to shorten the extraction time, decrease the solvent consumption, increase the extraction yield and enhance the quality of extracts. This technique combines elevated temperatures and pressure with liquid solvents. This paper gives a brief overview of accelerated solvent extraction technique for sample preparation and its application to the extraction of natural products. Through practical examples, the effects of operational parameters such as temperature, volume of solvent used, extraction time and extraction yields on the performance of ASE are discussed. It is demonstrated that ASE technique allows reduced solvent consumption and shorter extraction time, while the extraction yields are even higher than those obtained with conventional methods. (author)

  18. A systematic examination of a random sampling strategy for source apportionment calculations.

    Science.gov (United States)

    Andersson, August

    2011-12-15

    Estimating the relative contributions from multiple potential sources of a specific component in a mixed environmental matrix is a general challenge in diverse fields such as atmospheric, environmental and earth sciences. Perhaps the most common strategy for tackling such problems is to set up a system of linear equations for the fractional influence of the different sources. Even though an algebraic solution of this approach is possible for the common situation of N+1 sources and N source markers, such methodology introduces a bias, since it is implicitly assumed that the calculated fractions and the corresponding uncertainties are independent of the variability of the source distributions. Here, a random sampling (RS) strategy for accounting for such statistical bias is examined by investigating rationally designed synthetic data sets. This random sampling methodology is found to be robust and accurate with respect to reproducibility and predictability. The method is also compared to a numerical-integration solution for a two-source situation in which source variability is also included. A general observation from this examination is that the variability of the source profiles affects not only the calculated precision but also the mean/median source contributions.
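The random sampling idea can be sketched for the simplest case: two sources and one marker (e.g. an isotopic signature). Instead of plugging the source means into the mixing equation, each source signature is drawn from its distribution so that source variability propagates into the estimated fraction. All distributions and numbers below are invented for illustration, not taken from the paper:

```python
import random

random.seed(1)

# Mixing equation for source A's fraction: f = (d_mix - dB) / (dA - dB).
def sample_fraction(delta_mix, mu_a, sd_a, mu_b, sd_b):
    d_a = random.gauss(mu_a, sd_a)  # draw source A's signature
    d_b = random.gauss(mu_b, sd_b)  # draw source B's signature
    return (delta_mix - d_b) / (d_a - d_b)

draws = sorted(sample_fraction(-26.0, -28.0, 0.8, -24.0, 0.8)
               for _ in range(20000))
median = draws[len(draws) // 2]
print(round(median, 2))  # the naive algebraic answer would be exactly 0.5
```

Repeating the algebraic solution over many draws yields a full distribution for f, from which medians and percentile intervals follow directly; this is how source variability can shift the central estimate, not just widen it.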

  19. LOD score exclusion analyses for candidate QTLs using random population samples.

    Science.gov (United States)

    Deng, Hong-Wen

    2003-11-01

    While extensive analyses have been conducted to test for, no formal analyses have been conducted to test against, the importance of candidate genes as putative QTLs using random population samples. Previously, we developed an LOD score exclusion mapping approach for candidate genes for complex diseases. Here, we extend this LOD score approach to exclusion analyses of candidate genes for quantitative traits. Under this approach, specific genetic effects (as reflected by heritability) and inheritance models at candidate QTLs can be analyzed, and if an LOD score is ≤ -2.0, the locus can be excluded from having a heritability larger than that specified. Simulations show that this approach has high power to exclude a candidate gene from having moderate genetic effects if it is not a QTL and is robust to population admixture. Our exclusion analysis complements association analysis for candidate genes as putative QTLs in random population samples. The approach is applied to test the importance of the vitamin D receptor (VDR) gene as a potential QTL underlying the variation of bone mass, an important determinant of osteoporosis.

  20. A Combined Weighting Method Based on Hybrid of Interval Evidence Fusion and Random Sampling

    Directory of Open Access Journals (Sweden)

    Ying Yan

    2017-01-01

    Full Text Available Due to the complexity of systems and limits of expertise, epistemic uncertainties may be present in experts' judgments on the importance of certain indices during group decision-making. A novel combination weighting method is proposed to solve the index weighting problem when various uncertainties are present in expert comments. Based on the ideas of evidence theory, various types of uncertain evaluation information are uniformly expressed through interval evidence structures. A similarity matrix between interval evidences is constructed, and the experts' information is fused. Comment grades are quantified using interval numbers, and a cumulative probability function for evaluating the importance of indices is constructed based on the fused information. Finally, index weights are obtained by Monte Carlo random sampling. The method can process expert information with varying degrees of uncertainty, which gives it good compatibility. It avoids both the difficulty of effectively fusing high-conflict group decision-making information and the large information loss that follows fusion. Original expert judgments are retained objectively throughout the processing procedure. The construction of the cumulative probability function and the random sampling process require no human intervention or judgment and can be implemented easily by computer programs, giving the method an apparent advantage in evaluation practice for fairly large index systems.
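A greatly simplified version of the final sampling step can be sketched as follows. Here each index's importance is given only as an interval (the evidence-fusion machinery is omitted); one value is drawn uniformly from each interval per trial, the draws are normalized, and the normalized weights are averaged over trials. Intervals and index names are hypothetical:

```python
import random

random.seed(0)

# Hypothetical interval judgments of index importance
intervals = {"safety": (0.5, 0.9), "cost": (0.2, 0.5), "schedule": (0.1, 0.3)}

def sample_weights(intervals, trials=50000):
    """Monte Carlo index weights: draw from each interval, normalize,
    and average the normalized weights over all trials."""
    totals = dict.fromkeys(intervals, 0.0)
    for _ in range(trials):
        draw = {k: random.uniform(lo, hi) for k, (lo, hi) in intervals.items()}
        s = sum(draw.values())
        for k, v in draw.items():
            totals[k] += v / s
    return {k: t / trials for k, t in totals.items()}

weights = sample_weights(intervals)
print({k: round(v, 2) for k, v in weights.items()})
```

Because each trial's weights are normalized before averaging, the final weights sum to one by construction, and wider intervals simply contribute more spread rather than breaking the scheme.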

  1. Randomized comparison of vaginal self-sampling by standard vs. dry swabs for Human papillomavirus testing

    International Nuclear Information System (INIS)

    Eperon, Isabelle; Vassilakos, Pierre; Navarria, Isabelle; Menoud, Pierre-Alain; Gauthier, Aude; Pache, Jean-Claude; Boulvain, Michel; Untiet, Sarah; Petignat, Patrick

    2013-01-01

    To evaluate if human papillomavirus (HPV) self-sampling (Self-HPV) using a dry vaginal swab is a valid alternative for HPV testing. Women attending a colposcopy clinic were recruited to collect two consecutive Self-HPV samples: a Self-HPV using a dry swab (S-DRY) and a Self-HPV using a standard wet transport medium (S-WET). These samples were analyzed for HPV using real-time PCR (Roche Cobas). Participants were randomized to determine the order of the tests. Questionnaires assessing preferences and acceptability for both tests were administered. Subsequently, women were invited for colposcopic examination; a physician collected a cervical sample (physician-sampling) with a broom-type device and placed it into a liquid-based cytology medium. Specimens were then processed for the production of cytology slides, and a Hybrid Capture HPV DNA test (Qiagen) was performed on the residual liquid. Biopsies were performed if indicated. Unweighted kappa statistics (κ) and McNemar tests were used to measure the agreement among the sampling methods. A total of 120 women were randomized. Overall HPV prevalence was 68.7% (95% Confidence Interval (CI) 59.3–77.2) by S-WET, 54.4% (95% CI 44.8–63.9) by S-DRY and 53.8% (95% CI 43.8–63.7) by Hybrid Capture. Among paired samples (S-WET and S-DRY), the overall agreement was good (85.7%; 95% CI 77.8–91.6) and the κ was substantial (0.70; 95% CI 0.57-0.70). The proportion of positive type-specific HPV agreement was also good (77.3%; 95% CI 68.2-84.9). No differences in sensitivity for cervical intraepithelial neoplasia grade one (CIN1) or worse between the two Self-HPV tests were observed. Women reported the two Self-HPV tests as highly acceptable. Self-HPV using dry swab transfer does not appear to compromise specimen integrity. Further study in a large screening population is needed. ClinicalTrials.gov: http://clinicaltrials.gov/show/NCT01316120

  2. Random-Access Technique for Self-Organization of 5G Millimeter-Wave Cellular Communications

    Directory of Open Access Journals (Sweden)

    Jasper Meynard Arana

    2016-01-01

    Full Text Available The random-access (RA) technique is a key procedure in cellular networks and self-organizing networks (SONs), but the overall processing time of this technique in millimeter-wave (mm-wave) cellular systems with directional beams is very long because RA preambles (RAPs) must be transmitted in all directions of the Tx and Rx beams. In this paper, two different types of preambles (RAP-1 and RAP-2) are proposed to reduce the processing time in the RA stage. After analyzing the correlation property, false-alarm probability, and detection probability of the proposed RAPs, we perform simulations to show that RAP-2 is suitable for RA in mm-wave cellular systems with directional beams because of its shorter processing time and high detection probability in multiuser environments.

  3. Smoothing the redshift distributions of random samples for the baryon acoustic oscillations: applications to the SDSS-III BOSS DR12 and QPM mock samples

    Science.gov (United States)

    Wang, Shao-Jiang; Guo, Qi; Cai, Rong-Gen

    2017-12-01

    We investigate the impact of different redshift distributions of random samples on the baryon acoustic oscillations (BAO) measurements of D_V(z)r_d^fid/r_d from the two-point correlation functions of galaxies in the Data Release 12 of the Baryon Oscillation Spectroscopic Survey (BOSS). Big surveys, such as BOSS, usually assign redshifts to the random samples by randomly drawing values from the measured redshift distributions of the data, which necessarily introduces the fluctuations of the data into the random samples and weakens the BAO signal when cosmic variance cannot be ignored. We propose a smooth function of redshift distribution that fits the data well to populate the random galaxy samples. The resulting cosmological parameters match the input parameters of the mock catalogue very well. The significance of BAO signals has been improved by 0.33σ for a low-redshift sample and by 0.03σ for a constant-stellar-mass sample, though the absolute values do not change significantly. Given the precision of current measurements of cosmological parameters, such an improvement would be valuable for future measurements of galaxy clustering.
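
    The key move, drawing random-catalogue redshifts from a smooth model of n(z) rather than resampling the data histogram (which copies its fluctuations into the randoms), can be sketched as follows. A truncated Gaussian is a crude stand-in for the paper's smooth fitting function, and the galaxy redshifts are invented:

```python
import random
import statistics

def smooth_randoms(data_z, n_random, z_min, z_max, seed=1):
    """Populate a random catalogue's redshifts from a *smooth* model of
    the data's n(z), instead of drawing from the measured redshifts
    directly, which would imprint part of the cosmic variance onto the
    randoms. A truncated Gaussian fitted to the data's mean and spread
    stands in for the smooth fitting function (assumption)."""
    mu = statistics.fmean(data_z)
    sigma = statistics.stdev(data_z)
    rng = random.Random(seed)
    zs = []
    while len(zs) < n_random:
        z = rng.gauss(mu, sigma)
        if z_min <= z <= z_max:   # keep draws inside the survey range
            zs.append(z)
    return zs

# Hypothetical galaxy redshifts; the randoms follow the smooth model.
galaxies = [0.40, 0.45, 0.50, 0.55, 0.60] * 40
randoms = smooth_randoms(galaxies, 2000, z_min=0.2, z_max=0.8)
```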

  4. Lot quality assurance sampling techniques in health surveys in developing countries: advantages and current constraints.

    Science.gov (United States)

    Lanata, C F; Black, R E

    1991-01-01

    Traditional survey methods, which are generally costly and time-consuming, usually provide information at the regional or national level only. The utilization of lot quality assurance sampling (LQAS) methodology, developed in industry for quality control, makes it possible to use small sample sizes when conducting surveys in small geographical or population-based areas (lots). This article describes the practical use of LQAS for conducting health surveys to monitor health programmes in developing countries. Following a brief description of the method, the article explains how to build a sample frame and conduct the sampling to apply LQAS under field conditions. A detailed description of the procedure for selecting a sampling unit to monitor the health programme and a sample size is given. The sampling schemes utilizing LQAS applicable to health surveys, such as simple- and double-sampling schemes, are discussed. The interpretation of the survey results and the planning of subsequent rounds of LQAS surveys are also discussed. When describing the applicability of LQAS in health surveys in developing countries, the article considers current limitations for its use by health planners in charge of health programmes, and suggests ways to overcome these limitations through future research. It is hoped that with increasing attention being given to industrial sampling plans in general, and LQAS in particular, their utilization to monitor health programmes will provide health planners in developing countries with powerful techniques to help them achieve their health programme targets.
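
    The lot-acceptance logic underlying LQAS is a binomial tail probability: a lot is accepted when the sample contains at most d "defectives". A sketch with an illustrative single-sampling plan follows (the plan parameters n = 19, d = 3 are not taken from the article):

```python
from math import comb

def accept_probability(n: int, d: int, p_defective: float) -> float:
    """Probability that an LQAS sample of size n contains at most d
    'defectives' (e.g. unvaccinated children), i.e. that the lot is
    judged acceptable, when the true defective proportion is
    p_defective. This is the binomial CDF evaluated at d."""
    return sum(comb(n, k) * p_defective**k * (1 - p_defective)**(n - k)
               for k in range(d + 1))

# Illustrative plan: sample 19 children per lot, accept if <= 3 unvaccinated.
good_lot = accept_probability(19, 3, 0.20)  # 80% coverage: often accepted
bad_lot = accept_probability(19, 3, 0.50)   # 50% coverage: almost never
```

Varying n and d trades the small sample size that makes LQAS attractive against the two misclassification risks (rejecting an adequate lot, accepting an inadequate one).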

  5. Characterization of electron microscopes with binary pseudo-random multilayer test samples

    International Nuclear Information System (INIS)

    Yashchuk, Valeriy V.; Conley, Raymond; Anderson, Erik H.; Barber, Samuel K.; Bouet, Nathalie; McKinney, Wayne R.; Takacs, Peter Z.; Voronov, Dmitriy L.

    2010-01-01

    We discuss the results of SEM and TEM measurements with the BPRML test samples fabricated from a BPRML (WSi2/Si with fundamental layer thickness of 3 nm) with a Dual Beam FIB (focused ion beam)/SEM technique. In particular, we demonstrate that significant information about the metrological reliability of the TEM measurements can be extracted even when the fundamental frequency of the BPRML sample is smaller than the Nyquist frequency of the measurements. The measurements demonstrate a number of problems related to the interpretation of the SEM and TEM data. Note that similar BPRML test samples can be used to characterize x-ray microscopes. Corresponding work with x-ray microscopes is in progress.

  6. Techniques of sample attack used in soil and mineral analysis. Phase I

    International Nuclear Information System (INIS)

    Chiu, N.W.; Dean, J.R.; Sill, C.W.

    1984-07-01

    Several techniques of sample attack for the determination of radioisotopes are reviewed. These techniques include: 1) digestion with nitric or hydrochloric acid in a Parr digestion bomb, 2) digestion with a mixture of nitric and hydrochloric acids, 3) digestion with a mixture of hydrofluoric, nitric and perchloric acids, and 4) fusion with sodium carbonate, potassium fluoride or alkali pyrosulfates. The effectiveness of these techniques in decomposing various soils and minerals containing radioisotopes such as lead-210, uranium, thorium and radium-226 is discussed. The combined procedure of potassium fluoride fusion followed by alkali pyrosulfate fusion is recommended for radium-226, uranium and thorium analysis. This technique guarantees the complete dissolution of samples containing refractory materials such as silica, silicates, carbides, oxides and sulfates. For the lead-210 analysis, digestion with a mixture of hydrofluoric, nitric and perchloric acids followed by fusion with alkali pyrosulfate is recommended. These two procedures are detailed. Schemes for the sequential separation of the radioisotopes from a dissolved sample solution are outlined. Procedures for radiochemical analysis are suggested.

  7. The role of graphene-based sorbents in modern sample preparation techniques.

    Science.gov (United States)

    de Toffoli, Ana Lúcia; Maciel, Edvaldo Vasconcelos Soares; Fumes, Bruno Henrique; Lanças, Fernando Mauro

    2018-01-01

    The application of graphene-based sorbents in sample preparation techniques has increased significantly since 2011. These materials have good physicochemical properties for use as sorbents and have shown excellent results in different sample preparation techniques. Graphene and its precursor graphene oxide have been considered good candidates for improving the extraction and concentration of different classes of target compounds (e.g., parabens, polycyclic aromatic hydrocarbons, pyrethroids, triazines, and so on) present in complex matrices. They have been employed in the analysis of different matrices (e.g., environmental, biological and food). In this review, we highlight the most important characteristics of graphene-based materials, their properties, synthesis routes, and the most important applications in both off-line and on-line sample preparation techniques. The discussion of the off-line approaches includes methods derived from conventional solid-phase extraction, focusing on the miniaturized magnetic and dispersive modes. The microextraction techniques called stir bar sorptive extraction, solid phase microextraction, and microextraction by packed sorbent are discussed. The on-line approaches focus on the use of graphene-based materials mainly in on-line solid phase extraction, its variation called in-tube solid-phase microextraction, and on-line microdialysis systems. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  8. 238U And 232Th Concentration In Rock Samples using Alpha Autoradiography and Gamma Spectroscopy Techniques

    International Nuclear Information System (INIS)

    Hafez, A.F.; El-Farrash, A.H.; Yousef, H.A.

    2009-01-01

    The activity concentrations of uranium and thorium were measured for some rock samples selected from the Dahab region at the southern tip of Sinai, in order to detect any harmful radiation that could affect tourists, since Dahab has open fields of tourism in Egypt and is becoming an economic resource. The activity concentrations of uranium and thorium in the rock samples were measured using two techniques. The first is the alpha-autoradiography technique with LR-115 and CR-39 detectors, and the second is the gamma spectroscopic technique with a NaI(Tl) detector. It was found that the average activity concentrations of uranium and thorium obtained by alpha-autoradiography ranged from 6.41-49.31 Bq kg-1 and 4.86-40.87 Bq kg-1, respectively, while those obtained by the gamma detector ranged from 6.70-49.50 Bq kg-1 and 4.47-42.33 Bq kg-1, respectively. From the obtained data we can conclude that there is no radiological health hazard for humans and living beings in the area under investigation. It was also found that there are no large differences between the thorium-to-uranium ratios calculated by the two techniques

  9. Review of online coupling of sample preparation techniques with liquid chromatography.

    Science.gov (United States)

    Pan, Jialiang; Zhang, Chengjiang; Zhang, Zhuomin; Li, Gongke

    2014-03-07

    Sample preparation is still considered the bottleneck of the whole analytical procedure, and efforts have been directed towards automation, improvement of sensitivity and accuracy, and low consumption of organic solvents. Development of online sample preparation techniques (SP) coupled with liquid chromatography (LC) is a promising way to achieve these goals, and it has attracted great attention. This article reviews recent advances in online SP-LC techniques. Various online SP techniques are described and summarized, including solid-phase-based extraction, liquid-phase-based extraction assisted with membranes, microwave assisted extraction, ultrasonic assisted extraction, accelerated solvent extraction and supercritical fluid extraction. In particular, the coupling approaches of online SP-LC systems and the corresponding interfaces, such as online injectors, autosamplers combined with transport units, desorption chambers and column switching, are discussed and reviewed in detail. Typical applications of the online SP-LC techniques are summarized. The problems and expected trends in this field are then discussed in order to encourage the further development of online SP-LC techniques. Copyright © 2014 Elsevier B.V. All rights reserved.

  10. Advances in modern sample preparation techniques using microwaves assisted chemistry for metal species determination (W1)

    International Nuclear Information System (INIS)

    Ponard, O.F.X.

    2002-01-01

    Full text: Sample preparation has long been the bottleneck of environmental analysis, for both total and species-specific analysis. Digestion, extraction and preparation of the analytes rely on a series of chemical reactions. The introduction of microwave-assisted sample preparation was first viewed as a means to accelerate the kinetics of digestion of the matrix for total elements and to provide fast sample preparation procedures. However, the extensive development and success of microwave digestion procedures in total elemental analysis has now given a broader insight into the perspectives offered by this technique. Microwave technologies now offer precise control of the temperature and thereby indirect control of the reaction kinetics taking place during the sample preparation procedures. Microwave-assisted chemistry permits the fundamental steps required for metal species extraction and derivatization to be performed simultaneously. The number of sample preparation steps used for organotin or organomercury species has been reduced to one, and the total sample preparation time brought down from a few hours to some minutes. Further, the development of GC/ICP/MS techniques allows routine use of speciated isotope dilution methods as internal probes of the chemical reactions. These new approaches allow us to use the addition of labeled species for isotope dilution as a means to evaluate and follow the chemical processes taking place during the extraction procedure. These procedures will help us to understand and check the stability of the analytes during the chemistry of the sample preparation procedure and bring some insight into the chemistry taking place during the extraction. Understanding the different mechanisms involved in the sample preparation steps will in return allow us to further improve all these procedures and bring us to the horizon of 'on-line sample preparation and detection'. (author)

  11. Determination of some trace elements in biological samples using XRF and TXRF techniques

    International Nuclear Information System (INIS)

    Khuder, A.; Karjou, J.; Sawan, M. K.

    2006-07-01

    XRF and TXRF techniques were successfully used for the multi-element determination of trace elements in whole blood and human head hair samples. This was achieved by direct analysis using the XRF technique with different collimation units and by optimized chemical procedures for TXRF analysis. The light elements S and P were preferentially determined by XRF with primary x-ray excitation, while the elements K, Ca, Fe, and Br were determined with very good accuracy and precision using XRF with Cu- and Mo-secondary targets. A chemical procedure based on the preconcentration of trace elements by APDC proved superior for the determination of traces of Ni and Pb, in the ranges of 1.0-1.7 μg/dl and 11-23 μg/dl, respectively, in whole blood samples by the TXRF technique; the determination of other elements such as Cu and Zn was also achievable using this approach. Rb in whole blood samples was determined directly for TXRF analysis after digestion of the samples in a PTFE bomb. (author)

  12. Geospatial techniques for developing a sampling frame of watersheds across a region

    Science.gov (United States)

    Gresswell, Robert E.; Bateman, Douglas S.; Lienkaemper, George; Guy, T.J.

    2004-01-01

    Current land-management decisions that affect the persistence of native salmonids are often influenced by studies of individual sites that are selected based on judgment and convenience. Although this approach is useful for some purposes, extrapolating results to areas that were not sampled is statistically inappropriate because the sampling design is usually biased. Therefore, in recent investigations of coastal cutthroat trout (Oncorhynchus clarki clarki) located above natural barriers to anadromous salmonids, we used a methodology for extending the statistical scope of inference. The purpose of this paper is to apply geospatial tools to identify a population of watersheds and develop a probability-based sampling design for coastal cutthroat trout in western Oregon, USA. The population of mid-size watersheds (500-5800 ha) west of the Cascade Range divide was derived from watershed delineations based on digital elevation models. Because a database with locations of isolated populations of coastal cutthroat trout did not exist, a sampling frame of isolated watersheds containing cutthroat trout had to be developed. After the sampling frame of watersheds was established, isolated watersheds with coastal cutthroat trout were stratified by ecoregion and erosion potential based on dominant bedrock lithology (i.e., sedimentary and igneous). A stratified random sample of 60 watersheds was selected with proportional allocation in each stratum. By comparing watershed drainage areas of streams in the general population to those in the sampling frame and the resulting sample (n = 60), we were able to evaluate how representative the subset of watersheds was of the population of watersheds. Geospatial tools provided a relatively inexpensive means to generate the information necessary to develop a statistically robust, probability-based sampling design.
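
    A stratified random sample with proportional allocation, as used above to select the 60 watersheds, can be sketched as follows; the watershed ids and the two strata are hypothetical:

```python
import random
from collections import defaultdict

def stratified_sample(units, stratum_of, n_total, seed=0):
    """Stratified random sample with proportional allocation: each
    stratum contributes in proportion to its share of the frame.
    Because of rounding, stratum allocations may not sum exactly to
    n_total in general."""
    rng = random.Random(seed)
    groups = defaultdict(list)
    for u in units:
        groups[stratum_of[u]].append(u)
    sample = []
    for label in sorted(groups):          # deterministic stratum order
        members = groups[label]
        k = round(n_total * len(members) / len(units))
        sample.extend(rng.sample(members, min(k, len(members))))
    return sample

# Hypothetical frame: 100 watersheds in two ecoregion/lithology strata.
units = [f"ws{i:03d}" for i in range(100)]
strata = {u: ("coast/sedimentary" if i < 60 else "cascades/igneous")
          for i, u in enumerate(units)}
picked = stratified_sample(units, strata, n_total=10)
```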

  13. Direct sampling technique of bees on Vriesea philippocoburgii (Bromeliaceae, Tillandsioideae flowers

    Directory of Open Access Journals (Sweden)

    Afonso Inácio Orth

    2004-11-01

    Full Text Available In our study on Vriesea philippocoburgii Wawra pollination, owing to the small proportion of flowers in anthesis on a single day and the damage caused to inflorescences when netting directly on flowers, we used the direct sampling technique (DST) of bees on flowers. This technique was applied to 40 flowering plants and resulted in the capture of 160 specimens belonging to nine genera of Apoidea, separated into 19 morphospecies. As the DST maintains the integrity of flowers for later bee visits, it can enhance survey performance, constituting an alternative methodology for the collection of bees visiting flowering plants.

  14. Peyton's four-step approach for teaching complex spinal manipulation techniques - a prospective randomized trial.

    Science.gov (United States)

    Gradl-Dietsch, Gertraud; Lübke, Cavan; Horst, Klemens; Simon, Melanie; Modabber, Ali; Sönmez, Tolga T; Münker, Ralf; Nebelung, Sven; Knobe, Matthias

    2016-11-03

    The objectives of this prospective randomized trial were to assess the impact of Peyton's four-step approach on the acquisition of complex psychomotor skills and to examine the influence of gender on learning outcomes. We randomly assigned 95 third to fifth year medical students to an intervention group which received instructions according to Peyton (PG) or a control group, which received conventional teaching (CG). Both groups attended four sessions on the principles of manual therapy and specific manipulative and diagnostic techniques for the spine. We assessed differences in theoretical knowledge (multiple choice (MC) exam) and practical skills (Objective Structured Practical Examination (OSPE)) with respect to type of intervention and gender. Participants took a second OSPE 6 months after completion of the course. There were no differences between groups with respect to the MC exam. Students in the PG group scored significantly higher in the OSPE. Gender had no additional impact. Results of the second OSPE showed a significant decline in competency regardless of gender and type of intervention. Peyton's approach is superior to standard instruction for teaching complex spinal manipulation skills regardless of gender. Skills retention was equally low for both techniques.

  15. Effects of pushing techniques during the second stage of labor: A randomized controlled trial.

    Science.gov (United States)

    Koyucu, Refika Genç; Demirci, Nurdan

    2017-10-01

    Spontaneous pushing is a method used in the management of the second stage of labor and suggested to be more physiological for the mother and infant. The present study aims to evaluate the effects of pushing techniques on the mother and newborn. This randomized prospective study was performed between June 2013 and March 2014 in a tertiary maternity clinic in Istanbul. 80 low-risk, nulliparous cases were randomized to pushing groups. The Valsalva pushing group was told to hold their breath while pushing. No visual-verbal instructions were given to the spontaneous pushing group, and they were encouraged to push without preventing respiration. Demographic data, second stage duration, perineal laceration rates, fetal heart rate patterns, presence of meconium-stained amniotic fluid, newborn APGAR scores, POP-Q examination and Q-tip test results were evaluated in these cases. The second stage of labor was significantly longer with spontaneous pushing. The decrease in Hb levels was greater in the Valsalva pushing group than in the spontaneous pushing group. Increased urethral mobility was observed in the Valsalva pushing group. Although the duration of the second stage of labor was longer than with the Valsalva pushing technique, women were able to give birth without requiring any verbal or visual instruction, without exceeding the limit value of two hours and without affecting fetal wellness and neonatal results. Copyright © 2017. Published by Elsevier B.V.

  16. Fast patient-specific Monte Carlo brachytherapy dose calculations via the correlated sampling variance reduction technique

    Energy Technology Data Exchange (ETDEWEB)

    Sampson, Andrew; Le Yi; Williamson, Jeffrey F. [Department of Radiation Oncology, Virginia Commonwealth University, Richmond, Virginia 23298 (United States)

    2012-02-15

    Purpose: To demonstrate potential of correlated sampling Monte Carlo (CMC) simulation to improve the calculation efficiency for permanent seed brachytherapy (PSB) implants without loss of accuracy. Methods: CMC was implemented within an in-house MC code family (PTRAN) and used to compute 3D dose distributions for two patient cases: a clinical PSB postimplant prostate CT imaging study and a simulated post lumpectomy breast PSB implant planned on a screening dedicated breast cone-beam CT patient exam. CMC tallies the dose difference, {Delta}D, between highly correlated histories in homogeneous and heterogeneous geometries. The heterogeneous geometry histories were derived from photon collisions sampled in a geometrically identical but purely homogeneous medium geometry, by altering their particle weights to correct for bias. The prostate case consisted of 78 Model-6711 {sup 125}I seeds. The breast case consisted of 87 Model-200 {sup 103}Pd seeds embedded around a simulated lumpectomy cavity. Systematic and random errors in CMC were unfolded using low-uncertainty uncorrelated MC (UMC) as the benchmark. CMC efficiency gains, relative to UMC, were computed for all voxels, and the mean was classified in regions that received minimum doses greater than 20%, 50%, and 90% of D{sub 90}, as well as for various anatomical regions. Results: Systematic errors in CMC relative to UMC were less than 0.6% for 99% of the voxels and 0.04% for 100% of the voxels for the prostate and breast cases, respectively. For a 1 x 1 x 1 mm{sup 3} dose grid, efficiency gains were realized in all structures with 38.1- and 59.8-fold average gains within the prostate and breast clinical target volumes (CTVs), respectively. Greater than 99% of the voxels within the prostate and breast CTVs experienced an efficiency gain. Additionally, it was shown that efficiency losses were confined to low dose regions while the largest gains were located where little difference exists between the homogeneous and

  17. Determination of trace element contents in grass samples for cattle feeding using NAA techniques

    Energy Technology Data Exchange (ETDEWEB)

    Yusof, Alias Mohamad; Jagir Singh, Jasbir Kaur

    1987-09-01

    An investigation of the trace element contents of six types of grass samples used for cattle feeding has been carried out using NAA techniques. The grass samples, Mardi Digit, African Star, Signal, Guinea, Setaria and Setaria Splendida, were found to contain at least 19 trace elements in varying concentrations. The results were compared with figures obtained from available sources to ascertain whether the grass samples studied would satisfy the minimum requirements for trace elements present in grass for cattle feeding. Preference regarding the suitability of the grass samples for cattle feeding was based on the availability and abundance of the trace elements, taking into account factors such as the degree of toxicity, inadequate amounts and contamination due to the presence of other trace elements not essential for cattle feeding.

  18. Determination of trace element contents in grass samples for cattle feeding using NAA techniques

    International Nuclear Information System (INIS)

    Alias Mohamad Yusof; Jasbir Kaur Jagir Singh

    1987-01-01

    An investigation of the trace element contents of six types of grass samples used for cattle feeding has been carried out using NAA techniques. The grass samples, Mardi Digit, African Star, Signal, Guinea, Setaria and Setaria Splendida, were found to contain at least 19 trace elements in varying concentrations. The results were compared with figures obtained from available sources to ascertain whether the grass samples studied would satisfy the minimum requirements for trace elements present in grass for cattle feeding. Preference regarding the suitability of the grass samples for cattle feeding was based on the availability and abundance of the trace elements, taking into account factors such as the degree of toxicity, inadequate amounts and contamination due to the presence of other trace elements not essential for cattle feeding. (author)

  19. Sampling methods for rumen microbial counts by Real-Time PCR techniques

    Directory of Open Access Journals (Sweden)

    S. Puppo

    2010-02-01

    Full Text Available Fresh rumen samples were withdrawn from 4 cannulated buffalo females fed a fibrous diet in order to quantify bacteria concentrations in the rumen by Real-Time PCR techniques. To obtain DNA of good quality from whole rumen fluid, eight different pre-filtration methods (M1-M8; cheese cloths, glass-fibre and nylon filters in combination with centrifugation speeds of 1000, 5000 and 14,000 rpm) were tested. Genomic DNA extraction was performed either on fresh or frozen samples (-20°C). The quantitative bacteria analysis was carried out according to the Real-Time PCR procedure for Butyrivibrio fibrisolvens reported in the literature. M5 proved to be the best sampling procedure, allowing suitable genomic DNA to be obtained. No differences were revealed between fresh and frozen samples.

  20. Application of digital sampling techniques to particle identification in scintillation detectors

    International Nuclear Information System (INIS)

    Bardelli, L.; Bini, M.; Poggi, G.; Taccetti, N.

    2002-01-01

    In this paper, the use of a fast digitizing system for identification of fast charged particles with scintillation detectors is discussed. The three-layer phoswich detectors developed in the framework of the FIASCO experiment for the detection of light charged particles (LCP) and intermediate mass fragments (IMF) emitted in heavy-ion collisions at Fermi energies are briefly discussed. The standard analog electronics treatment of the signals for particle identification is illustrated. After a description of the digitizer designed to perform a fast digital sampling of the phoswich signals, the feasibility of particle identification on the sampled data is demonstrated. The results obtained with two different pulse shape discrimination analyses based on the digitally sampled data are compared with the standard analog signal treatment. The obtained results suggest, for the present application, the replacement of the analog methods with the digital sampling technique
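
    A common pulse shape discrimination analysis on digitally sampled waveforms is charge comparison: the fraction of the integrated charge arriving within a short gate separates particle species with different scintillation decay times. The sketch below uses synthetic exponential pulses and illustrative gate lengths, not FIASCO phoswich data:

```python
from math import exp

def psd_ratio(samples, dt, short_gate, long_gate):
    """Charge-comparison pulse-shape discrimination on a digitized
    pulse: the ratio of charge in a short integration gate to the total
    charge in a long gate, both starting at the pulse onset. Gate
    lengths here are illustrative, not detector settings."""
    q_short = sum(samples[:int(short_gate / dt)]) * dt
    q_long = sum(samples[:int(long_gate / dt)]) * dt
    return q_short / q_long

dt = 1.0  # ns per sample (hypothetical digitizer step)
fast = [exp(-t * dt / 5.0) for t in range(200)]   # fast-decay pulse
slow = [exp(-t * dt / 50.0) for t in range(200)]  # slow-decay pulse
r_fast = psd_ratio(fast, dt, 20.0, 200.0)
r_slow = psd_ratio(slow, dt, 20.0, 200.0)
```

Plotting this ratio against total charge for many pulses yields the familiar discrimination bands; the two decay constants above produce clearly separated ratios.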

  1. Uranium content measurement in drinking water samples using track etch technique

    International Nuclear Information System (INIS)

    Kumar, Mukesh; Kumar, Ajay; Singh, Surinder; Mahajan, R.K.; Walia, T.P.S.

    2003-01-01

    The concentration of uranium has been assessed in drinking water samples collected from different locations in Bathinda district, Punjab, India. The water samples were taken from hand pumps and tube wells. Uranium was determined using the fission track technique. Uranium concentration in the water samples varies from 1.65±0.06 to 74.98±0.38 μg/l. These values are compared with the safe limit values recommended for drinking water. Most of the water samples were found to have uranium concentrations above the safe limit. Analysis of some heavy metals (Zn, Cd, Pb and Cu) in the water was also carried out in order to see whether any correlation exists between the concentration of uranium and these heavy metals. A weak positive correlation has been observed between the concentration of uranium and the heavy metals Pb, Cd and Cu.
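
    The fission track determination rests on a linear comparison of induced track densities between a sample and a standard irradiated under identical conditions. A sketch with entirely hypothetical numbers, not values from the study:

```python
def uranium_concentration(rho_sample: float, rho_standard: float,
                          c_standard_ug_per_l: float) -> float:
    """Comparator-style fission track estimate: for a sample and a
    standard irradiated together, uranium concentration scales linearly
    with induced fission-track density (tracks per unit detector
    area). All numbers used below are hypothetical."""
    return c_standard_ug_per_l * rho_sample / rho_standard

# Hypothetical track densities (per cm^2) on sample and standard detectors
c = uranium_concentration(rho_sample=1500.0, rho_standard=1000.0,
                          c_standard_ug_per_l=10.0)
```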

  2. Machine-learning techniques for family demography: an application of random forests to the analysis of divorce determinants in Germany

    OpenAIRE

    Arpino, Bruno; Le Moglie, Marco; Mencarini, Letizia

    2018-01-01

    Demographers often analyze the determinants of life-course events with parametric regression-type approaches. Here, we present a class of nonparametric approaches, broadly defined as machine learning (ML) techniques, and discuss advantages and disadvantages of a popular type known as random forest. We argue that random forests can be useful either as a substitute, or a complement, to more standard parametric regression modeling. Our discussion of random forests is intuitive and...
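
    A random forest combines bootstrap resampling (bagging) with random feature selection at each split. The from-scratch sketch below uses depth-1 trees and a synthetic two-feature dataset; it is an illustration of the idea, not the authors' analysis of the German divorce data or a standard library implementation:

```python
import random
from collections import Counter

def train_stump(data, rng, n_features_try=2):
    """Fit a depth-1 tree (stump) on a random feature subset, the
    'random' ingredient of random forests. data: (features, label)
    pairs; the best threshold is chosen by training accuracy."""
    n_feat = len(data[0][0])
    best = None
    for j in rng.sample(range(n_feat), min(n_features_try, n_feat)):
        for thr in {x[j] for x, _ in data}:
            left = [y for x, y in data if x[j] <= thr]
            right = [y for x, y in data if x[j] > thr]
            if not left or not right:
                continue
            l_lab = Counter(left).most_common(1)[0][0]
            r_lab = Counter(right).most_common(1)[0][0]
            acc = (sum(y == l_lab for y in left)
                   + sum(y == r_lab for y in right)) / len(data)
            if best is None or acc > best[0]:
                best = (acc, j, thr, l_lab, r_lab)
    if best is None:  # degenerate bootstrap: constant majority predictor
        lab = Counter(y for _, y in data).most_common(1)[0][0]
        return (0, float("inf"), lab, lab)
    return best[1:]

def forest_fit(data, n_trees=25, seed=0):
    """Bagging: each stump is trained on a bootstrap resample."""
    rng = random.Random(seed)
    return [train_stump([rng.choice(data) for _ in data], rng)
            for _ in range(n_trees)]

def forest_predict(trees, x):
    """Majority vote over the ensemble."""
    votes = Counter(l_lab if x[j] <= thr else r_lab
                    for j, thr, l_lab, r_lab in trees)
    return votes.most_common(1)[0][0]
```

Real random forests grow deep trees and report variable importances, which is what makes them attractive as a complement to parametric models in demography.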

  3. Effects of pushing techniques in birth on mother and fetus: a randomized study.

    Science.gov (United States)

    Yildirim, Gulay; Beji, Nezihe Kizilkaya

    2008-03-01

    The Valsalva pushing technique is used routinely in the second stage of labor in many countries, and it is accepted as standard obstetric management in Turkey. The purpose of this study was to determine the effects of pushing techniques on mother and fetus in birth in this setting. This randomized study was conducted between July 2003 and June 2004 in Bakirkoy Maternity and Children's Teaching Hospital in Istanbul, Turkey. One hundred low-risk primiparas between 38 and 42 weeks' gestation, who expected a spontaneous vaginal delivery, were randomized to either a spontaneous pushing group or a Valsalva-type pushing group. Spontaneous pushing women were informed during the first stage of labor about spontaneous pushing technique (open glottis pushing while breathing out) and were supported in pushing spontaneously in the second stage of labor. Similarly, Valsalva pushing women were informed during the first stage of labor about the Valsalva pushing technique (closed glottis pushing while holding their breath) and were supported in using Valsalva pushing in the second stage of labor. Perineal tears, postpartum hemorrhage, and hemoglobin levels were evaluated in mothers; and umbilical artery pH, Po(2) (mmHg), and Pco(2) (mmHg) levels and Apgar scores at 1 and 5 minutes were evaluated in newborns in both groups. No significant differences were found between the two groups in their demographics, incidence of nonreassuring fetal surveillance patterns, or use of oxytocin. The second stage of labor and duration of the expulsion phase were significantly longer with Valsalva-type pushing. Differences in the incidence of episiotomy, perineal tears, or postpartum hemorrhage were not significant between the groups. The baby fared better with spontaneous pushing, with higher 1- and 5-minute Apgar scores, and higher umbilical cord pH and Po(2) levels. After the birth, women expressed greater satisfaction with spontaneous pushing. Educating women about the spontaneous pushing

  4. Recent Trends in Microextraction Techniques Employed in Analytical and Bioanalytical Sample Preparation

    Directory of Open Access Journals (Sweden)

    Abuzar Kabir

    2017-12-01

    Sample preparation has been recognized as a major step in the chemical analysis workflow. As such, substantial efforts have been made in recent years to simplify the overall sample preparation process. Major focuses of these efforts have included miniaturization of the extraction device; minimizing or eliminating consumption of toxic and hazardous organic solvents; eliminating sample pre-treatment and post-treatment steps; reducing the sample volume requirement; reducing the extraction equilibrium time; and maximizing extraction efficiency. All these improved attributes are congruent with the Green Analytical Chemistry (GAC) principles. Classical sample preparation techniques such as solid phase extraction (SPE) and liquid-liquid extraction (LLE) are being rapidly replaced with emerging miniaturized and environmentally friendly techniques such as Solid Phase Micro Extraction (SPME), Stir bar Sorptive Extraction (SBSE), Micro Extraction by Packed Sorbent (MEPS), Fabric Phase Sorptive Extraction (FPSE), and Dispersive Liquid-Liquid Micro Extraction (DLLME). In addition to the development of many new generic extraction sorbents in recent years, a large number of molecularly imprinted polymers (MIPs) created using different template molecules have also enriched the large cache of microextraction sorbents. Application of nanoparticles as high-performance extraction sorbents has undoubtedly elevated the extraction efficiency and method sensitivity of modern chromatographic analyses to a new level. Combining magnetic nanoparticles with many microextraction sorbents has opened up new possibilities to extract target analytes from sample matrices containing high volumes of matrix interferents. The aim of the current review is to critically audit the progress of microextraction techniques in recent years, which has indisputably transformed analytical chemistry practices, from biological and therapeutic drug monitoring to the environmental field; from foods to phyto

  5. Development of Large Sample Neutron Activation Technique for New Applications in Thailand

    International Nuclear Information System (INIS)

    Laoharojanaphand, S.; Tippayakul, C.; Wonglee, S.; Channuie, J.

    2018-01-01

    The development of Large Sample Neutron Activation Analysis (LSNAA) in Thailand is presented in this paper. The technique was first developed using rice as the test sample. The Thai Research Reactor-1/Modification 1 (TRR-1/M1) was used as the neutron source. The first step was to select and characterize an appropriate irradiation facility for the research. An out-core irradiation facility (A4 position) was attempted first. The results obtained with the A4 facility were then used as guides for the subsequent experiments with the thermal column facility. The characterization of the thermal column was performed with Cu wire to determine the spatial flux distribution with and without a rice sample. The flux depression without the rice sample was observed to be less than 30%, while the flux depression with the rice sample increased to about 60%. Flux monitors placed inside the rice sample were used to determine the average flux over the sample. The gamma self-shielding effect during gamma measurement was corrected using Monte Carlo simulation: the ratio between the efficiencies of the volume source and the point source at each energy point was calculated with the MCNPX code. The research team adopted the k0-NAA methodology to calculate element concentrations. The k0-NAA program, developed by the IAEA, was set up to simulate the conditions of the irradiation and measurement facilities used in this research. The element concentrations in the bulk rice sample were then calculated taking into account the flux depression and gamma efficiency corrections. At the moment, the results still show large discrepancies with the reference values; however, more research on the validation will be performed to identify sources of error. Moreover, this LSNAA technique was introduced for the activation analysis of the IAEA archaeological mock-up. The results are provided in this report. (author)

  6. Characterization of Some Iraqi Archaeological Samples Using IBA, Analytical X-ray and Other Complementary Techniques

    International Nuclear Information System (INIS)

    Al-Sarraj, Ziyad Shihab; Damboos, Hassan I; Roumie, Mohamad

    2012-01-01

    The present work aimed at investigating the compositions and microstructures of some archaeological samples dating back to various periods of the ancient Iraqi civilizations, using PIXE, XRF, XRD, and SEM techniques. The objects selected for the study (ceramics, glaze, etc.) were diverse in size and nature; therefore, a limited number of samples were cut from them with a small diamond wheel. A conventional powder metallurgy method was then used to prepare the samples. Dried samples were coated with a thin layer of carbon and analyzed using the ion beam accelerator of the LAEC. Three other groups of samples were also prepared for analysis by X-ray fluorescence (XRF), X-ray diffraction (XRD), and scanning electron microscopy (SEM). Analysis of the chemical composition showed good agreement between the various techniques, as well as for the phases, while the fine-structure analysis obtained by optical and scanning microscopy exhibited a structure with intensified densification in the final stage of sintering, accompanied by a quasi-homogeneous distribution of closed pores. This leads to the conclusion that the temperature used for sintering by the ancient Iraqis was sufficient, probably falling in the range 950-1200°C, and that the mixes and forming methods they used were both suitable to obtain well-sintered bodies with an even distribution of pores. A ring-shaped trace noticed in the SEM micrographs requires further work and study to explain its origin.

  7. Determination of elements in industrial waste sample and TENORM using XRF Technique in Nuclear Malaysia

    International Nuclear Information System (INIS)

    Paulus, W.; Sarimah Mahat; Meor Yusoff Meor Sulaiman

    2011-01-01

    Industrial wastes such as aluminium dross and TENORM waste (oil sludge) were used as samples in this research. The main elements were determined by X-Ray Fluorescence (XRF) in the Material Technology Group, Malaysian Nuclear Agency. Results show that the main elements in these samples are aluminium and silicon, respectively. Thereby, this research shows that XRF can be considered one of the techniques that can be used in waste characterization; furthermore, it can help researchers and engineers in research related to waste treatment, especially radioactive waste. (author)

  8. Application of nuclear and allied techniques for the characterisation of forensic samples

    International Nuclear Information System (INIS)

    Sudersanan, M.; Kayasth, S.R.; Pant, D.R.; Chattopadhyay, N.; Bhattacharyya, C.N.

    2002-01-01

    Forensic science deals with the application of techniques from physics, chemistry and biology to crime investigation. The legal implications of such analysis place considerable restrictions on the choice of analytical techniques. Moreover, the unknown nature of the materials, the limited availability of samples and the large number of elements to be analysed put considerable strain on the analytical chemist in selecting the appropriate technique. The availability of nuclear techniques has considerably enhanced the scope of forensic analysis. This paper deals with recent results on the use of nuclear and allied analytical techniques for forensic applications. One of the important types of samples of forensic importance pertains to the identification of gunshot residues. The use of nuclear techniques has considerably simplified the interpretation of results through the use of appropriate elements like Ba, Cu, Sb, Zn, As and Sn. The combination of non-nuclear techniques for elements like Pb and Ni, which are not easily amenable to analysis by NAA, with appropriate separation procedures has established this method as a valid and versatile analytical procedure. In view of the presence of large amounts of extraneous materials like cloth and body tissues in these samples, and the limited availability of material, the procedures for sample collection, dissolution and analysis have been standardized. Analysis of unknown materials like powders and metallic pieces for the possible presence of nuclear materials, or as materials in illicit trafficking, has become important in recent years. The use of a multi-technique approach is important in this case. Use of non-destructive techniques like XRF and radioactive counting enables the preliminary identification of materials and the detection of radioactivity. Subsequent analysis by NAA or other appropriate analytical methods allows the characterization of the materials. Such

  9. Comparison of different anesthesia techniques during esophagogastroduedenoscopy in children: a randomized trial.

    Science.gov (United States)

    Patino, Mario; Glynn, Susan; Soberano, Mark; Putnam, Philip; Hossain, Md Monir; Hoffmann, Clifford; Samuels, Paul; Kibelbek, Michael J; Gunter, Joel

    2015-10-01

    Esophagogastroduedenoscopy (EGD) in children is usually performed under general anesthesia. Anesthetic goals include minimization of airway complications while maximizing operating room (OR) efficiency. Currently, there is no consensus on which anesthetic technique best meets these goals. We performed a prospective randomized study comparing three different anesthetic techniques. To evaluate the incidence of respiratory complications (primary aim) and institutional efficiency (secondary aim) among three different anesthetic techniques in children undergoing EGD. Subjects received a standardized inhalation induction of anesthesia followed by randomization to one of the three groups: Group intubated, sevoflurane (IS), Group intubated, propofol (IP), and Group native airway, nonintubated, propofol (NA). Respiratory complications included minor desaturation (SpO2 between 94% and 85%), severe desaturation (SpO2 < 85%), apnea, airway obstruction/laryngospasm, aspiration, and/or inadequate anesthesia during the endoscopy. Evaluation of institutional efficiency was determined by examining the time spent during the different phases of care (anesthesia preparation, procedure, OR stay, recovery, and total perioperative care). One hundred and seventy-nine children aged 1-12 years (median 7 years; 4.0, 10.0) were enrolled (Group IS N = 60, Group IP N = 59, Group NA N = 61). The incidence of respiratory complications was higher in the Group NA (0.459) vs Group IS (0.033) or Group IP (0.086) (P < 0.0001). The most commonly observed complications were desaturation, inadequate anesthesia, and apnea. There were no differences in institutional efficiency among the three groups. Respiratory complications were more common in Group NA. The use of native airway with propofol maintenance during EGD does not offer advantages with respect to respiratory complications or institutional efficiency. © 2015 John Wiley & Sons Ltd.

  10. Accounting for randomness in measurement and sampling in studying cancer cell population dynamics.

    Science.gov (United States)

    Ghavami, Siavash; Wolkenhauer, Olaf; Lahouti, Farshad; Ullah, Mukhtar; Linnebacher, Michael

    2014-10-01

    Knowing the expected temporal evolution of the proportion of different cell types in sample tissues gives an indication of the progression of the disease and its possible response to drugs. Such systems have been modelled using Markov processes. We here consider an experimentally realistic scenario in which transition probabilities are estimated from noisy cell population size measurements. Using aggregated data from FACS measurements, we develop MMSE and ML estimators and formulate two problems to find the minimum number of required samples and measurements to guarantee the accuracy of predicted population sizes. Our numerical results show that the estimated transition probabilities and steady states diverge widely from the real values if one uses the standard deterministic approach for noisy measurements. This supports our argument that for the analysis of FACS data one should consider the observed state as a random variable. The second problem we address concerns the consequences of estimating the probability of a cell being in a particular state from measurements of small populations of cells. We show how the uncertainty arising from small sample sizes can be captured by a distribution for the state probability.
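The estimation step described above can be sketched with a toy example: a maximum-likelihood estimate of a Markov transition matrix is simply the row-normalised matrix of observed transition counts. The two-state chain and all names below are illustrative assumptions; the paper's MMSE estimator and measurement-noise model are not reproduced here.

```python
import random

def estimate_transition_matrix(sequences, n_states):
    """Maximum-likelihood estimate of Markov transition probabilities:
    count observed transitions, then normalise each row."""
    counts = [[0] * n_states for _ in range(n_states)]
    for seq in sequences:
        for a, b in zip(seq, seq[1:]):
            counts[a][b] += 1
    matrix = []
    for row in counts:
        total = sum(row)
        # fall back to a uniform row if a state was never a source
        matrix.append([c / total if total else 1 / n_states for c in row])
    return matrix

# Simulate sequences from a known 2-state chain, then re-estimate it.
random.seed(0)
true_p = [[0.9, 0.1], [0.2, 0.8]]
sequences = []
for _ in range(200):
    s, seq = 0, [0]
    for _ in range(50):
        s = 0 if random.random() < true_p[s][0] else 1
        seq.append(s)
    sequences.append(seq)

est = estimate_transition_matrix(sequences, 2)
```

With 10,000 simulated transitions, the estimated matrix lands close to the true one; with the short, noisy measurement records the paper considers, this plug-in estimate degrades, which is the motivation for their stochastic treatment.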

  11. RANdom SAmple Consensus (RANSAC) algorithm for material-informatics: application to photovoltaic solar cells.

    Science.gov (United States)

    Kaspi, Omer; Yosipof, Abraham; Senderowitz, Hanoch

    2017-06-06

    An important aspect of chemoinformatics and material-informatics is the usage of machine learning algorithms to build Quantitative Structure Activity Relationship (QSAR) models. The RANdom SAmple Consensus (RANSAC) algorithm is a predictive modeling tool widely used in the image processing field for cleaning datasets of noise. RANSAC could be used as a "one stop shop" algorithm for developing and validating QSAR models, performing outlier removal, descriptor selection, model development and predictions for test set samples using an applicability domain. For "future" predictions (i.e., for samples not included in the original test set) RANSAC provides a statistical estimate of the probability of obtaining reliable predictions, i.e., predictions within a pre-defined number of standard deviations from the true values. In this work we describe the first application of RANSAC in material informatics, focusing on the analysis of solar cells. We demonstrate that for three datasets representing different metal oxide (MO) based solar cell libraries, RANSAC-derived models select descriptors previously shown to correlate with key photovoltaic properties and lead to good predictive statistics for these properties. These models were subsequently used to predict the properties of virtual solar cell libraries, highlighting interesting dependencies of PV properties on MO compositions.
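The core RANSAC idea, fit a model to a minimal random subset and keep the candidate with the largest consensus (inlier) set, can be illustrated with a line-fitting toy example. This is a generic sketch of the algorithm, not the authors' QSAR workflow; the thresholds and data below are illustrative.

```python
import random

def ransac_line(points, n_iters=200, threshold=0.1, seed=1):
    """Minimal RANSAC: repeatedly fit a line through 2 random points
    and keep the model supported by the most inliers."""
    rng = random.Random(seed)
    best_model, best_inliers = None, []
    for _ in range(n_iters):
        (x1, y1), (x2, y2) = rng.sample(points, 2)
        if x1 == x2:
            continue  # vertical pair cannot define y = mx + b
        slope = (y2 - y1) / (x2 - x1)
        intercept = y1 - slope * x1
        inliers = [(x, y) for x, y in points
                   if abs(y - (slope * x + intercept)) < threshold]
        if len(inliers) > len(best_inliers):
            best_model, best_inliers = (slope, intercept), inliers
    return best_model, best_inliers

# Points near y = 2x + 1 with small noise, plus gross outliers.
rng = random.Random(0)
points = [(x / 10, 2 * (x / 10) + 1 + rng.uniform(-0.05, 0.05))
          for x in range(50)]
points += [(rng.uniform(0, 5), rng.uniform(-10, 10)) for _ in range(15)]

model, inliers = ransac_line(points)
```

Because the outliers rarely agree with any single line, the winning model is fit from (and supported by) the clean points, which is the same mechanism the paper exploits to prune noisy QSAR training samples.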

  12. Coacervative extraction as a green technique for sample preparation for the analysis of organic compounds.

    Science.gov (United States)

    Melnyk, A; Wolska, L; Namieśnik, J

    2014-04-25

    One of the present trends in analytical chemistry is miniaturization, which is one route to applying green analytical chemistry. A particular emphasis is placed on eliminating the use of large amounts of organic solvents, which are toxic and harmful to the environment, while maintaining high efficiency of the extraction process, high recovery values, and low limits of quantification (LOQ) and detection (LOD). These requirements are fulfilled by the coacervative extraction (CAE) technique. In this review, theoretical aspects of the coacervation process are presented along with environmental and bioanalytical applications of this technique, its advantages, limitations and competitiveness with other techniques. Due to its simplicity and rapidity, CAE is an excellent alternative to currently practiced procedures of sample preparation for the analysis of organic compounds. Copyright © 2014 Elsevier B.V. All rights reserved.

  13. Nuclear techniques for trace element analysis. PIXE and its applications to biomedical samples

    International Nuclear Information System (INIS)

    Cata-Danil, I.; Moro, R.; Gialanella, G.

    1996-01-01

    Problems in understanding the role of trace elements in the functioning of life processes are discussed. A brief review of the state of the PIXE technique is given. Principles and recent advances in beam systems, instrumentation and sample handling are covered. A rather comprehensive list of references regarding various methodological aspects and biomedical applications is given. Some applications are discussed; in particular, preliminary results of an investigation regarding pediatric obesity are presented. (author) 5 tabs., 21 refs

  14. A simple technique for measuring the superconducting critical temperature of small (>= 10 μg) samples

    International Nuclear Information System (INIS)

    Pereira, R.F.R.; Meyer, E.; Silveira, M.F. da.

    1983-01-01

    A simple technique for measuring the superconducting critical temperature of small (>= 10 μg) samples is described. The apparatus is built in the form of a probe, which can be introduced directly into a liquid He storage dewar and permits the determination of the critical temperature, with an imprecision of ± 0.05 K above 4.2 K, in about 10 minutes. (Author) [pt

  15. Comparative evaluation of gingival depigmentation by tetrafluroethane cryosurgery and surgical scalpel technique. A randomized clinical study

    Directory of Open Access Journals (Sweden)

    Suraj D Narayankar

    2017-01-01

    Introduction: The importance of a good smile cannot be underestimated in enhancing the beauty, self-confidence and personality of a person. The health and appearance of the gingiva are an essential part of an attractive smile, and gingival pigmentation gives rise to an unesthetic smile line. In the present world, with increasing attention to esthetics, people have become highly concerned about black gums. Various treatment modalities such as abrasion, scraping, the scalpel technique, cryosurgery, electrosurgery and laser are available for the treatment of gingival pigmentation. The present study was conducted with the objective of comparing the efficacy of gingival depigmentation by cryosurgery and the scalpel technique. Method: A randomized controlled split-mouth study was conducted in 25 patients with gingival pigmentation. The Gingival Pigmentation Index (GPI) for pigmentation and a Visual Analogue Scale (VAS) for pain were evaluated for both test (cryosurgery) and control (scalpel technique) sites at baseline, 1 month, 3 months and 6 months. Results: At baseline, the GPI score was 3 and 2 for 21/25 and 4/25 control sites, and for 22/25 and 3/25 test sites, respectively. Both groups showed a significant reduction in GPI score (to 0) at the 1- and 3-month intervals after treatment. The GPI score increased to 1 for 5/25 sites treated with the scalpel technique and 2/25 sites treated with cryosurgery at the 6-month interval (P = 0.0691), indicating that the recurrence rate of pigmentation is higher after scalpel treatment. The VAS score was 3 for 10/25 sites treated with the scalpel and 2 for 12/25 sites treated with cryosurgery (P < 0.001). Conclusion: Cryosurgery can be used effectively and efficiently for depigmentation, keeping patient acceptance and comfort in mind, as well as the long-term results and ease of use compared with the scalpel technique.

  16. Comparison of four techniques of nasogastric tube insertion in anaesthetised, intubated patients: A randomized controlled trial

    Directory of Open Access Journals (Sweden)

    Mohan Chandra Mandal

    2014-01-01

    Background and Aims: Insertion of nasogastric tubes (NGTs) in anaesthetised, intubated patients with the conventional method is sometimes difficult. Different techniques of NGT insertion have been tried with varying degrees of success. The aim of this prospective, randomised, open-label study was to evaluate three modified techniques of NGT insertion in comparison with the conventional method in respect of success rate, time taken for insertion and adverse events. Methods: In the general surgery operation theatre, patients were randomly allocated into four groups: Group C (control group, n = 54), Group W (ureteral guide wire group, n = 54), Group F (neck flexion with lateral pressure, n = 54) and Group R (reverse Sellick's manoeuvre, n = 54). The number of attempts for successful NGT insertion, the time taken for insertion and adverse events were noted. Results: All three modified techniques were more successful than the conventional method on the first attempt. The least time taken for insertion was noted with the reverse Sellick's method. On intergroup analysis, however, the neck flexion and reverse Sellick's methods were comparable, but significantly faster than the other two methods with respect to time taken for insertion. Conclusion: The reverse Sellick's manoeuvre, neck flexion with lateral neck pressure and guide wire-assisted techniques are all better alternatives to the conventional method for successful, quick and reliable NGT insertion, with permissible adverse events, in anaesthetised, intubated adult patients. Further studies eliminating the major limitations of the present study are warranted to establish the superiority of any one of these modified techniques.

  17. Multi-element analysis of lubricant oil by WDXRF technique using thin-film sample preparation

    International Nuclear Information System (INIS)

    Scapin, M. A.; Salvador, V. L. R.; Lopes, C. D.; Sato, I. M.

    2006-01-01

    The quantitative analysis of chemical elements in matrices like oils or gels represents a challenge for analytical chemists. Classical methods and instrumental techniques such as atomic absorption spectrometry (AAS) and inductively coupled plasma optical emission spectrometry (ICP-OES) need chemical treatments, mainly sample dissolution and degradation processes. The X-ray fluorescence technique allows a direct, multi-element analysis without previous sample treatment. In this work, a sensitive method for the determination of the elements Mg, Al, Si, P, Ca, Ti, V, Cr, Mn, Fe, Ni, Cu, Zn, Mo, Ag, Sn, Ba and Pb in lubricating oil is presented. The wavelength dispersive X-ray fluorescence (WDXRF) technique was used, with a linear regression method and thin-film sample preparation. The validation of the methodology (repeatability and accuracy) was obtained by analysis of the standard reference material SRM Alpha AESAR lot 703527D, applying the Chauvenet, Cochrane, ANOVA and Z-score statistical tests. The method presents a relative standard deviation lower than 10% for all elements except Pb (RSD for Pb: 15%). The Z-score values for all elements were in the range -2 < Z < 2, indicating very good accuracy.
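The Z-score acceptance criterion used in validations of this kind has a simple form: the deviation from the certified value divided by the combined uncertainty, with |Z| <= 2 conventionally taken as satisfactory. The numbers below are invented for illustration, since the abstract does not report the individual SRM values.

```python
def z_score(measured, certified, u_measured, u_certified):
    """Z-score against a certified reference value:
    (measured - certified) / sqrt(u_measured^2 + u_certified^2)."""
    return (measured - certified) / (u_measured**2 + u_certified**2) ** 0.5

# Hypothetical example: an element at 305 vs a certified 300 (same units).
z = z_score(measured=305.0, certified=300.0, u_measured=8.0, u_certified=6.0)
print(round(z, 2), "acceptable" if abs(z) <= 2 else "questionable")
```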

  18. Detection of equine herpesvirus in horses with idiopathic keratoconjunctivitis and comparison of three sampling techniques.

    Science.gov (United States)

    Hollingsworth, Steven R; Pusterla, Nicola; Kass, Philip H; Good, Kathryn L; Brault, Stephanie A; Maggs, David J

    2015-09-01

    To determine the role of equine herpesvirus (EHV) in idiopathic keratoconjunctivitis in horses and to determine whether sample collection method affects detection of EHV DNA by quantitative polymerase chain reaction (qPCR). Twelve horses with idiopathic keratoconjunctivitis and six horses without signs of ophthalmic disease. Conjunctival swabs, corneal scrapings, and conjunctival biopsies were collected from 18 horses: 12 clinical cases with idiopathic keratoconjunctivitis and six euthanized controls. In horses with both eyes involved, the samples were taken from the eye judged to be more severely affected. Samples were tested with qPCR for EHV-1, EHV-2, EHV-4, and EHV-5 DNA. Quantity of EHV DNA and viral replicative activity were compared between the two populations and among the different sampling techniques; relative sensitivities of the sampling techniques were determined. Prevalence of EHV DNA as assessed by qPCR did not differ significantly between control horses and those with idiopathic keratoconjunctivitis. Sampling by conjunctival swab was more likely to yield viral DNA as assessed by qPCR than was conjunctival biopsy. EHV-1 and EHV-4 DNA were not detected in either normal or IKC-affected horses; EHV-2 DNA was detected in two of 12 affected horses but not in normal horses. EHV-5 DNA was commonly found in ophthalmically normal horses and horses with idiopathic keratoconjunctivitis. Because EHV-5 DNA was commonly found in control horses and in horses with idiopathic keratoconjunctivitis, qPCR was not useful for the etiological diagnosis of equine keratoconjunctivitis. Conjunctival swabs were significantly better at obtaining viral DNA samples than conjunctival biopsy in horses in which EHV-5 DNA was found. © 2015 American College of Veterinary Ophthalmologists.

  19. Colorization-Based RGB-White Color Interpolation using Color Filter Array with Randomly Sampled Pattern.

    Science.gov (United States)

    Oh, Paul; Lee, Sukho; Kang, Moon Gi

    2017-06-28

    Recently, several RGB-White (RGBW) color filter arrays (CFAs) have been proposed, which have extra white (W) pixels in the filter array that are highly sensitive. Due to this high sensitivity, the W pixels have better SNR (Signal to Noise Ratio) characteristics than the other color pixels in the filter array, especially in low-light conditions. However, most RGBW CFAs are designed so that the acquired RGBW pattern image can be converted into the conventional Bayer pattern image, which is then converted into the final color image using conventional demosaicing methods, i.e., color interpolation techniques. In this paper, we propose a new RGBW color filter array based on a totally different color interpolation technique, the colorization algorithm. The colorization algorithm was initially proposed for colorizing a gray image into a color image using a small number of color seeds. Here, we adopt this algorithm as a color interpolation technique, so that the RGBW color filter array can be designed with a very large number of W pixels to make the most of the highly sensitive characteristics of the W channel. The resulting RGBW color filter array has a pattern with a large proportion of W pixels, while the small number of RGB pixels are randomly distributed over the array. The colorization algorithm makes it possible to reconstruct the colors from such a small number of RGB values. Due to the large proportion of W pixels, the reconstructed color image has a high SNR value, notably higher than those of conventional CFAs in low-light conditions. Experimental results show that much important information that is not perceived in color images reconstructed with conventional CFAs is perceived in the images reconstructed with the proposed method.
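A mosaic of the kind described, mostly W pixels with the few RGB samples scattered at random, can be sketched as follows. The 75% W fraction, the uniform R/G/B assignment, and all names are illustrative assumptions, not the paper's actual design.

```python
import random

def random_rgbw_cfa(height, width, w_fraction=0.75, seed=0):
    """Build an RGBW mosaic: mostly W pixels, with the remaining
    sites chosen at random and assigned R, G, or B uniformly."""
    rng = random.Random(seed)
    n = height * width
    n_color = round(n * (1 - w_fraction))
    color_sites = set(rng.sample(range(n), n_color))
    flat = ['W'] * n
    for i in color_sites:
        flat[i] = rng.choice('RGB')
    return [flat[r * width:(r + 1) * width] for r in range(height)]

cfa = random_rgbw_cfa(8, 8)
w_count = sum(row.count('W') for row in cfa)
```

In an 8x8 tile with a 75% W fraction, exactly 48 of the 64 sites are W; the colorization-based interpolation then has to recover full color from the 16 scattered RGB seeds.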

  20. Randomized controlled trial of attention bias modification in a racially diverse, socially anxious, alcohol dependent sample.

    Science.gov (United States)

    Clerkin, Elise M; Magee, Joshua C; Wells, Tony T; Beard, Courtney; Barnett, Nancy P

    2016-12-01

    Attention biases may be an important treatment target for both alcohol dependence and social anxiety. This is the first ABM trial to investigate two (vs. one) targets of attention bias within a sample with co-occurring symptoms of social anxiety and alcohol dependence. Additionally, we used trial-level bias scores (TL-BS) to capture the phenomena of attention bias in a more ecologically valid, dynamic way compared to traditional attention bias scores. Adult participants (N = 86; 41% Female; 52% African American; 40% White) with elevated social anxiety symptoms and alcohol dependence were randomly assigned to an 8-session training condition in this 2 (Social Anxiety ABM vs. Social Anxiety Control) by 2 (Alcohol ABM vs. Alcohol Control) design. Symptoms of social anxiety, alcohol dependence, and attention bias were assessed across time. Multilevel models estimated the trajectories for each measure within individuals, and tested whether these trajectories differed according to the randomized training conditions. Across time, there were significant or trending decreases in all attention TL-BS parameters (but not traditional attention bias scores) and most symptom measures. However, there were not significant differences in the trajectories of change between any ABM and control conditions for any symptom measures. These findings add to previous evidence questioning the robustness of ABM and point to the need to extend the effects of ABM to samples that are racially diverse and/or have co-occurring psychopathology. The results also illustrate the potential importance of calculating trial-level attention bias scores rather than only including traditional bias scores. Copyright © 2016 Elsevier Ltd. All rights reserved.

  2. Relative efficiency and sample size for cluster randomized trials with variable cluster sizes.

    Science.gov (United States)

    You, Zhiying; Williams, O Dale; Aban, Inmaculada; Kabagambe, Edmond Kato; Tiwari, Hemant K; Cutter, Gary

    2011-02-01

    The statistical power of cluster randomized trials depends on two sample size components, the number of clusters per group and the numbers of individuals within clusters (cluster size). Variable cluster sizes are common and this variation alone may have significant impact on study power. Previous approaches have taken this into account by either adjusting total sample size using a designated design effect or adjusting the number of clusters according to an assessment of the relative efficiency of unequal versus equal cluster sizes. This article defines a relative efficiency of unequal versus equal cluster sizes using noncentrality parameters, investigates properties of this measure, and proposes an approach for adjusting the required sample size accordingly. We focus on comparing two groups with normally distributed outcomes using t-test, and use the noncentrality parameter to define the relative efficiency of unequal versus equal cluster sizes and show that statistical power depends only on this parameter for a given number of clusters. We calculate the sample size required for an unequal cluster sizes trial to have the same power as one with equal cluster sizes. Relative efficiency based on the noncentrality parameter is straightforward to calculate and easy to interpret. It connects the required mean cluster size directly to the required sample size with equal cluster sizes. Consequently, our approach first determines the sample size requirements with equal cluster sizes for a pre-specified study power and then calculates the required mean cluster size while keeping the number of clusters unchanged. Our approach allows adjustment in mean cluster size alone or simultaneous adjustment in mean cluster size and number of clusters, and is a flexible alternative to and a useful complement to existing methods. Comparison indicated that we have defined a relative efficiency that is greater than the relative efficiency in the literature under some conditions. 
Our measure
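The efficiency comparison described in this record can be sketched numerically. The following is a minimal illustration (not the authors' code) built on the standard effective-sample-size formula m / (1 + (m - 1)·ICC); the cluster sizes and intracluster correlation below are invented for illustration:

```python
def effective_n(sizes, icc):
    """Effective sample size of one arm: each cluster of size m contributes
    m / (1 + (m - 1) * icc) independent-observation equivalents."""
    return sum(m / (1 + (m - 1) * icc) for m in sizes)

def relative_efficiency(sizes, icc):
    """Efficiency of the given (possibly unequal) cluster sizes relative to
    equal clusters with the same number of clusters and total sample size."""
    k = len(sizes)
    mbar = sum(sizes) / k
    return effective_n(sizes, icc) / (k * mbar / (1 + (mbar - 1) * icc))

unequal = [10, 20, 30, 40, 50, 60]  # hypothetical variable cluster sizes
icc = 0.05                          # hypothetical intracluster correlation
re = relative_efficiency(unequal, icc)
print(f"relative efficiency: {re:.3f}")  # below 1: unequal sizes lose power
```

Because the relative efficiency is below 1, an unequal-cluster-size trial needs a larger mean cluster size (or more clusters) to recover the power of the equal-size design.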

  3. Evaluation of primary immunization coverage of infants under universal immunization programme in an urban area of Bangalore city using cluster sampling and lot quality assurance sampling techniques

    Directory of Open Access Journals (Sweden)

    Punith K

    2008-01-01

    Research Question: Is the LQAS technique better than the cluster sampling technique in terms of resources needed to evaluate immunization coverage in an urban area? Objective: To assess and compare lot quality assurance sampling against cluster sampling in the evaluation of primary immunization coverage. Study Design: Population-based cross-sectional study. Study Setting: Areas under Mathikere Urban Health Center. Study Subjects: Children aged 12 months to 23 months. Sample Size: 220 in cluster sampling, 76 in lot quality assurance sampling. Statistical Analysis: Percentages and proportions, chi-square test. Results: (1) Using cluster sampling, the percentages of completely immunized, partially immunized and unimmunized children were 84.09%, 14.09% and 1.82%, respectively. With lot quality assurance sampling, they were 92.11%, 6.58% and 1.31%, respectively. (2) Immunization coverage levels as evaluated by the cluster sampling technique were not statistically different from the coverage values obtained by lot quality assurance sampling. Considering the time and resources required, lot quality assurance sampling was found to be the better technique for evaluating primary immunization coverage in an urban area.
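The chi-square comparison reported above can be reproduced approximately from the published figures. The counts below are back-calculated from the reported percentages and sample sizes (an assumption, since the record gives only proportions); for a 2×3 table (df = 2) the chi-square p-value has the closed form exp(-x/2):

```python
import math

# Counts back-calculated from the reported percentages (an assumption, since
# the record gives only proportions): complete, partial, unimmunized.
observed = [
    [185, 31, 4],  # cluster sampling, n = 220 (84.09 / 14.09 / 1.82%)
    [70, 5, 1],    # LQAS, n = 76 (92.11 / 6.58 / 1.31%)
]

col_totals = [sum(col) for col in zip(*observed)]
grand = sum(col_totals)
chi2 = 0.0
for row in observed:
    row_total = sum(row)
    for obs, col_total in zip(row, col_totals):
        expected = row_total * col_total / grand
        chi2 += (obs - expected) ** 2 / expected

# For df = 2 the chi-square survival function is exactly exp(-x / 2).
p_value = math.exp(-chi2 / 2)
print(f"chi2 = {chi2:.2f}, p = {p_value:.2f}")  # p > 0.05: the methods agree
```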

  4. A systematic random sampling scheme optimized to detect the proportion of rare synapses in the neuropil.

    Science.gov (United States)

    da Costa, Nuno Maçarico; Hepp, Klaus; Martin, Kevan A C

    2009-05-30

    Synapses can only be morphologically identified by electron microscopy and this is often a very labor-intensive and time-consuming task. When quantitative estimates are required for pathways that contribute a small proportion of synapses to the neuropil, the problems of accurate sampling are particularly severe and the total time required may become prohibitive. Here we present a sampling method devised to count the percentage of rarely occurring synapses in the neuropil using a large sample (approximately 1000 sampling sites), with the strong constraint of doing it in reasonable time. The strategy, which uses the unbiased physical disector technique, resembles that used in particle physics to detect rare events. We validated our method in the primary visual cortex of the cat, where we used biotinylated dextran amine to label thalamic afferents and measured the density of their synapses using the physical disector method. Our results show that we could obtain accurate counts of the labeled synapses, even when they represented only 0.2% of all the synapses in the neuropil.
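The precision argument behind such a large sample can be sketched with the binomial coefficient of error for an estimated proportion. This is a generic illustration, not the authors' exact design calculation:

```python
import math

def coefficient_of_error(p, n):
    """Relative standard error of a proportion p estimated from n
    independently sampled synapses (binomial approximation)."""
    return math.sqrt((1 - p) / (n * p))

def required_sample(p, target_ce):
    """Number of synapses to examine so the estimate of p has a relative
    standard error no larger than target_ce."""
    return math.ceil((1 - p) / (p * target_ce ** 2))

p = 0.002  # labeled synapses make up ~0.2% of the neuropil
for ce in (0.5, 0.25, 0.1):
    print(f"target CE {ce:.0%}: examine >= {required_sample(p, ce)} synapses")
```

The required counts grow as 1/p, which is why rare pathways force either very large samples or the kind of optimized site-selection scheme the paper proposes.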

  5. Random Photon Absorption Model Elucidates How Early Gain Control in Fly Photoreceptors Arises from Quantal Sampling

    Science.gov (United States)

    Song, Zhuoyi; Zhou, Yu; Juusola, Mikko

    2016-01-01

    Many diurnal photoreceptors encode vast real-world light changes effectively, but how this performance originates from photon sampling is unclear. A 4-module biophysically-realistic fly photoreceptor model, in which information capture is limited by the number of its sampling units (microvilli) and their photon-hit recovery time (refractoriness), can accurately simulate real recordings and their information content. However, sublinear summation in quantum bump production (quantum-gain-nonlinearity) may also cause adaptation by reducing the bump/photon gain when multiple photons hit the same microvillus simultaneously. Here, we use a Random Photon Absorption Model (RandPAM), which is the 1st module of the 4-module fly photoreceptor model, to quantify the contribution of quantum-gain-nonlinearity in light adaptation. We show how quantum-gain-nonlinearity already results from photon sampling alone. In the extreme case, when two or more simultaneous photon-hits reduce to a single sublinear value, quantum-gain-nonlinearity is preset before the phototransduction reactions adapt the quantum bump waveform. However, the contribution of quantum-gain-nonlinearity in light adaptation depends upon the likelihood of multi-photon-hits, which is strictly determined by the number of microvilli and light intensity. Specifically, its contribution to light-adaptation is marginal (≤ 1%) in fly photoreceptors with many thousands of microvilli, because the probability of simultaneous multi-photon-hits on any one microvillus is low even during daylight conditions. However, in cells with fewer sampling units, the impact of quantum-gain-nonlinearity increases with brightening light. PMID:27445779
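The role of multi-photon hits can be illustrated with a Poisson sketch: if photon hits land uniformly at random on the microvilli, the fraction summed sublinearly follows from the probability of coincident hits. The photon count per coincidence window below is an assumption for illustration, not a value from the paper:

```python
import math

def coincidence_loss(photons, n_microvilli):
    """Fraction of photon hits landing on an already-hit microvillus when
    `photons` hits fall uniformly at random on the microvilli (Poisson limit);
    these coincident hits sum sublinearly into a single quantum bump."""
    mu = photons / n_microvilli                    # mean hits per microvillus
    absorbed = n_microvilli * (1 - math.exp(-mu))  # microvilli hit at least once
    return 1 - absorbed / photons

# Assumed numbers: ~300 photon hits per coincidence window in bright light.
print(f"30,000 microvilli: {coincidence_loss(300, 30_000):.2%} lost")  # marginal
print(f"   300 microvilli: {coincidence_loss(300, 300):.2%} lost")     # substantial
```

With tens of thousands of microvilli the loss stays well under 1%, matching the abstract's conclusion; with a few hundred sampling units the same light level produces substantial sublinear summation.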

  6. Improved sample preparation and counting techniques for enhanced tritium measurement sensitivity

    Science.gov (United States)

    Moran, J.; Aalseth, C.; Bailey, V. L.; Mace, E. K.; Overman, C.; Seifert, A.; Wilcox Freeburg, E. D.

    2015-12-01

    Tritium (T) measurements offer insight into a wealth of environmental applications, including hydrologic tracking, discerning ocean circulation patterns, and dating ice formations. However, the relatively short half-life of T (12.3 years) limits its effective age-dating range. Compounding this limitation, atmospheric T content has decreased by over two orders of magnitude (from 1000-2000 TU in 1962 to roughly 10 TU today) since the cessation of atmospheric nuclear testing in the 1960s. We are developing sample preparation methods coupled to direct counting of T via ultra-low background proportional counters which, when combined, offer improved T measurement sensitivity (~4.5 mmoles of H2 equivalent) and will help expand the application of T age dating to the smaller sample sizes linked to persistent environmental questions despite the limitations above. For instance, this approach can be used to T-date ~2.2 mmoles of CH4 collected from sample-limited systems, including microbial communities, soils, or subsurface aquifers, and can be combined with radiocarbon dating to distinguish the methane's formation age from the C age in a system. This approach can also expand investigations into soil organic C, where the improved sensitivity will permit resolution of soil C into more descriptive fractions and provide direct assessments of the stability of specific classes of organic matter in soil environments. We are employing a multi-step sample preparation system whereby organic samples are first combusted, with the resulting CO2 and H2O used as feedstock to synthesize CH4. This CH4 is mixed with Ar and loaded directly into an ultra-low background proportional counter for measurement of T β decay in a shallow underground laboratory. Analysis of water samples requires only the addition of geologic CO2 feedstock with the sample for methane synthesis. The chemical nature of the preparation techniques enables high sample throughput, with only the final measurement requiring T decay counting; total sample analysis time ranges from 2-5 weeks.
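The age-dating limitation mentioned above follows directly from the half-life. A minimal sketch of the decay arithmetic (illustrative activities, not measured data):

```python
import math

T_HALF = 12.3  # tritium half-life, years

def tritium_age(a0, a):
    """Apparent age in years of a sample whose tritium activity has decayed
    from an assumed initial value a0 to a measured value a."""
    return T_HALF / math.log(2) * math.log(a0 / a)

def remaining_fraction(years):
    """Fraction of the original tritium left after the given time."""
    return 0.5 ** (years / T_HALF)

print(f"10 TU decayed to 1 TU: {tritium_age(10, 1):.1f} years old")
print(f"fraction left after 50 years: {remaining_fraction(50):.3f}")
```

After roughly four half-lives only a few percent of the tritium remains, which is why measurement sensitivity, not chemistry, sets the practical dating range.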

  7. Sample preparation for large-scale bioanalytical studies based on liquid chromatographic techniques.

    Science.gov (United States)

    Medvedovici, Andrei; Bacalum, Elena; David, Victor

    2018-01-01

    The quality of the analytical data obtained in large-scale, long-term bioanalytical studies based on liquid chromatography depends on a number of experimental factors, including the choice of sample preparation method. This review discusses this tedious part of bioanalytical studies as applied to large-scale samples, with liquid chromatography coupled to different detector types as the core analytical technique. The main sample preparation methods covered are protein precipitation, liquid-liquid extraction, solid-phase extraction, derivatization, and their variants. They are discussed in terms of analytical performance, fields of application, advantages and disadvantages. The cited literature mainly covers analytical achievements of the last decade, although several earlier papers that have become more valuable over time are also included. Copyright © 2017 John Wiley & Sons, Ltd.

  8. Decomposition and (importance) sampling techniques for multi-stage stochastic linear programs

    Energy Technology Data Exchange (ETDEWEB)

    Infanger, G.

    1993-11-01

    The difficulty of solving large-scale multi-stage stochastic linear programs arises from the sheer number of scenarios associated with numerous stochastic parameters. The number of scenarios grows exponentially with the number of stages, and problems easily get out of hand even for a very moderate number of stochastic parameters per stage. Our method combines dual (Benders) decomposition with Monte Carlo sampling techniques. We employ importance sampling to efficiently obtain accurate estimates of both expected future costs and the gradients and right-hand sides of cuts. The method enables us to solve practical large-scale problems with many stages and numerous stochastic parameters per stage. We discuss the theory of sharing and adjusting cuts between different scenarios in a stage. We derive probabilistic lower and upper bounds, where we use importance path sampling for the upper bound estimation. Initial numerical results are promising.
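The variance-reduction idea behind importance sampling can be sketched on a toy rare-event expectation (a generic illustration, not the Benders-decomposition estimator of the report; the target, seed, and sample counts are invented):

```python
import math
import random

random.seed(42)
n = 20_000

# Toy target: p = P(X > 3) for X ~ N(0, 1) -- a rare event, standing in for
# the low-probability scenarios that can dominate cost and cut estimates.
# Naive Monte Carlo rarely sees the region of interest.
naive = sum(random.gauss(0, 1) > 3 for _ in range(n)) / n

# Importance sampling: draw from N(3, 1) so the rare region is well covered,
# then reweight by the likelihood ratio phi(x) / phi(x - 3) = exp(4.5 - 3x).
total = 0.0
for _ in range(n):
    x = random.gauss(3, 1)
    if x > 3:
        total += math.exp(4.5 - 3 * x)
is_est = total / n

print(f"naive MC: {naive:.5f}   importance sampling: {is_est:.5f}")
# True value is 1 - Phi(3) ~= 0.00135; the IS estimate is far more stable.
```

The naive estimator's relative error is large because almost all draws contribute nothing; the reweighted estimator concentrates samples where the integrand lives, which is exactly the leverage importance sampling provides for cut and bound estimation.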

  9. Texture investigation in aluminium and iron - silicon samples by neutron diffraction technique

    International Nuclear Information System (INIS)

    Pugliese, R.; Yamasaki, J.M.

    1988-09-01

    By means of the neutron diffraction technique, the texture of 5% and 98% rolled aluminium and of the iron-silicon steel used in the cores of electric transformers has been determined. The measurements were performed using a neutron diffractometer installed at beam hole no. 6 of the IEA-R1 Nuclear Research Reactor. To avoid corrections for effects such as neutron absorption and sample luminosity, the geometric forms of the samples were approximated to a sphere or an octagonal prism, with dimensions not exceeding those of the neutron beam. The texture of the samples was analysed with the help of a computer programme that processes the intensity of the diffracted neutron beam and plots the pole figures. (author) [pt

  10. Random sampling or geostatistical modelling? Choosing between design-based and model-based sampling strategies for soil (with discussion)

    NARCIS (Netherlands)

    Brus, D.J.; Gruijter, de J.J.

    1997-01-01

    Classical sampling theory has been repeatedly identified with classical statistics which assumes that data are identically and independently distributed. This explains the switch of many soil scientists from design-based sampling strategies, based on classical sampling theory, to the model-based

  11. Water stable isotope measurements of Antarctic samples by means of IRMS and WS-CRDS techniques

    Science.gov (United States)

    Michelini, Marzia; Bonazza, Mattia; Braida, Martina; Flora, Onelio; Dreossi, Giuliano; Stenni, Barbara

    2010-05-01

    In recent years there has been increasing interest in the scientific community in applying stable isotope techniques to several environmental problems, such as drinking-water safeguarding, groundwater management, climate change, soil and paleoclimate studies. For example, water stable isotopes, being natural tracers of the hydrological cycle, have been extensively used as tools to characterize regional aquifers and to reconstruct past temperature changes from polar ice cores. Hence the need for improvements in analytical techniques: the high demand for information calls for technologies that can deliver a large number of analyses in a short time and at low cost. Furthermore, it is sometimes difficult to obtain large amounts of sample (as is the case for Antarctic ice cores or interstitial water), preventing replication of the analyses. Here we present oxygen and hydrogen measurements performed on water samples covering a wide range of isotopic values (from very negative Antarctic precipitation to mid-latitude precipitation values), carried out both with the conventional Isotope Ratio Mass Spectrometry (IRMS) technique and with a new method based on laser absorption, Wavelength-Scanned Cavity Ring-Down Spectroscopy (WS-CRDS). This study focuses on improving the precision of the WS-CRDS measurements in order to apply the method extensively to Antarctic ice core paleoclimate studies. WS-CRDS is a variation of the CRDS technique developed in 1988 by O'Keefe and Deacon.
In CRDS, a pulse of light is injected into a cavity with highly reflective inner surfaces. Even with no sample in the cavity the light beam eventually decays, because the reflectivity of the walls is not perfect; when a sample is injected, absorption shortens this decay, and the difference between the decay times without and with the sample is proportional to the quantity of absorber present.
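The ring-down relation sketched in this record reduces to a one-line formula: the extinction coefficient follows from the empty-cavity and sample ring-down times. The times below are illustrative, not measured values:

```python
C_CM_PER_S = 2.998e10  # speed of light in cm/s

def extinction(tau_empty, tau_sample):
    """Sample extinction coefficient (cm^-1) from cavity ring-down times (s):
    alpha = (1 / c) * (1 / tau_sample - 1 / tau_empty)."""
    return (1.0 / C_CM_PER_S) * (1.0 / tau_sample - 1.0 / tau_empty)

tau0 = 10e-6  # illustrative empty-cavity ring-down time, 10 microseconds
tau = 8e-6    # faster decay once an absorbing sample is injected
alpha = extinction(tau0, tau)
print(f"alpha = {alpha:.3e} cm^-1")
```

Because only decay times are compared, the measurement is insensitive to shot-to-shot laser intensity fluctuations, which is the key advantage of ring-down techniques.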

  12. Determination of trace elements in plant samples using XRF, PIXE and ICP-OES techniques

    International Nuclear Information System (INIS)

    Ahmed, Hassan Elzain Hassan

    2014-07-01

    The purpose of this study is to determine trace element concentrations (Ca, Cu, Cr, K, Fe, Mn, Sr, and Zn) in some Sudanese wild plants, namely Ziziphus Abyssinica and Grewia Tenax. X-ray fluorescence (XRF), particle-induced X-ray emission (PIXE) and inductively coupled plasma-optical emission spectroscopy (ICP-OES) techniques were used for element determination. A series of plant standard reference materials was used to check the reliability of the different techniques employed, as well as to estimate possible factors for correcting the concentrations of some elements that deviated significantly from their actual concentrations. The results showed that XRF, PIXE and ICP-OES are equally competitive methods for measuring Ca, K, Fe, Sr and Zn. Unlike XRF, PIXE and ICP-OES tend to be appropriate methods for Cu determination in plant samples; for Mn, PIXE and ICP-OES are advisable techniques rather than XRF. On the other hand, ICP-OES seems to be superior to PIXE and XRF for Cr and Ni determination in plant samples. The effect of geographical location on trace element concentrations in plants was examined by determining elements in samples of Grewia Tenax collected from different locations. Most of the measured elements showed similar values, indicating no significant impact of location on element content. In addition, two plants from different genetic families, namely Ziziphus Spina Christi and Ziziphus Abyssinica, were collected from the same location and screened for their trace element content. No differences were found between the two plants for Ca, K, Cu, Fe, and Sr. However, significant variations were observed for Mn and Zn concentrations, implying the possibility of using those two elements for plant taxonomy purposes. (Author)

  13. Neutron activation analysis technique and X-ray fluorescence in bovine liver sample

    International Nuclear Information System (INIS)

    Maihara, V.A.; Favaro, D.I.T.; Vasconcellos, M.B.A.; Sato, I.M.; Salvador, V.L.

    2002-01-01

    Many analytical techniques have been used in food and diet analysis to determine a great number of nutritionally relevant elements, at levels ranging from percent to ng g-1, with high sensitivity and accuracy. Instrumental Neutron Activation Analysis (INAA) has been employed to certify many trace elements in biological reference materials. More recently, wavelength-dispersive X-ray fluorescence (WD-XRF) has also been used to determine some essential elements in food samples. INAA has been applied in nutrition studies in our laboratory at IPEN since the 1980s. For the development of analytical methodologies, the use of reference materials with the same characteristics as the analyzed sample is essential; several Brazilian laboratories cannot use these materials due to their high cost. In this paper, preliminary results of commercial bovine liver sample analyses obtained by INAA and WD-XRF are presented. This sample was prepared as a candidate Brazilian reference material for a group of laboratories participating in a research project sponsored by FAPESP. The concentrations of some elements such as Cl, K, Na, P and S and the trace elements Br, Ca, Co, Cu, Fe, Mg, Mn, Mo, Rb, Se and Zn were determined by INAA and WD-XRF. To validate both techniques, the NIST SRM 1577b Bovine Liver reference material was analyzed and the detection limits were calculated. The concentrations determined by the two analytical techniques were compared using Student's t-test; for Cl, Cu, Fe, K, Mg, Na, Rb and Zn the results showed no statistical difference at the 95% significance level. (author)

  14. Randomized clinical trial comparing control of maxillary anchorage with 2 retraction techniques.

    Science.gov (United States)

    Xu, Tian-Min; Zhang, Xiaoyun; Oh, Hee Soo; Boyd, Robert L; Korn, Edward L; Baumrind, Sheldon

    2010-11-01

    The objective of this pilot randomized clinical trial was to investigate the relative effectiveness of anchorage conservation of en-masse and 2-step retraction techniques during maximum anchorage treatment in patients with Angle Class I and Class II malocclusions. Sixty-four growing subjects (25 boys, 39 girls; 10.2-15.9 years old) who required maximum anchorage were randomized to 2 treatment techniques: en-masse retraction (n = 32) and 2-step retraction (n = 32); the groups were stratified by sex and starting age. Each patient was treated by a full-time clinic instructor experienced in the use of both retraction techniques at the orthodontic clinic of Peking University School of Stomatology in China. All patients used headgear, and most had transpalatal appliances. Lateral cephalograms taken before treatment and at the end of treatment were used to evaluate treatment-associated changes. Differences in maxillary molar mesial displacement and maxillary incisor retraction were measured with the before and after treatment tracings superimposed on the anatomic best fit of the palatal structures. Differences in mesial displacement of the maxillary first molar were compared between the 2 treatment techniques, between sexes, and between different starting-age groups. Average mesial displacement of the maxillary first molar was slightly less in the en-masse group than in the 2-step group (mean, -0.36 mm; 95% CI, -1.42 to 0.71 mm). The average mesial displacement of the maxillary first molar for both treatment groups pooled (n = 63, because 1 patient was lost to follow-up) was 4.3 ± 2.1 mm (mean ± standard deviation). Boys had significantly more mesial displacement than girls (mean difference, 1.3 mm; P <0.03). Younger adolescents had significantly more mesial displacement than older adolescents (mean difference, 1.3 mm; P <0.02). 
Average mesial displacement of the maxillary first molar with 2-step retraction was slightly greater than that for en-masse retraction, but the

  15. Advantages of Arthroscopic Rotator Cuff Repair With a Transosseous Suture Technique: A Prospective Randomized Controlled Trial.

    Science.gov (United States)

    Randelli, Pietro; Stoppani, Carlo Alberto; Zaolino, Carlo; Menon, Alessandra; Randelli, Filippo; Cabitza, Paolo

    2017-07-01

    Rotator cuff tear is a common finding in patients with painful, poorly functioning shoulders. The surgical management of this disorder has improved greatly and can now be fully arthroscopic. To evaluate clinical and radiological results of arthroscopic rotator cuff repair using 2 different techniques: single-row anchor fixation versus transosseous hardware-free suture repair. Randomized controlled trial; Level of evidence, 1. Sixty-nine patients with rotator cuff tears were enrolled: 35 patients were operated on with metal anchors and 34 with standardized transosseous repair. The patients were clinically evaluated before surgery, during the 28 days after surgery, and at least 1 year after the operation using validated rating scores (Constant score, QuickDASH, and numerical rating scale [NRS]). Final follow-up was obtained at more than 3 years with a QuickDASH evaluation to detect any difference from the previous follow-up. During follow-up, rotator cuff integrity was determined through magnetic resonance imaging and classified according to the 5 Sugaya categories. Patients operated on with the transosseous technique had significantly less pain, especially from the 15th postoperative day: in the third week, the mean NRS value was 3.00 for the anchor group versus 2.46 for the transosseous group (P = .02); in the fourth week, the values were 2.44 and 1.76, respectively. Regarding rotator cuff repair integrity, based on the Sugaya magnetic resonance imaging classification, no significant difference was found between the 2 techniques in terms of retear rate (P = .81). No significant differences were found between the 2 arthroscopic repair techniques in terms of functional and radiological results. However, postoperative pain decreased more quickly after the transosseous procedure, which therefore emerges as a possible improvement in the surgical repair of the rotator cuff. Registration: NCT01815177 (ClinicalTrials.gov identifier).

  16. Assessment of Random Assignment in Training and Test Sets using Generalized Cluster Analysis Technique

    Directory of Open Access Journals (Sweden)

    Sorana D. BOLBOACĂ

    2011-06-01

    Aim: The appropriateness of the random assignment of compounds to training and validation sets was assessed using the generalized cluster technique. Material and Method: A quantitative structure-activity relationship model using the Molecular Descriptors Family on Vertices was evaluated in terms of the assignment of carboquinone derivatives to training and test sets during leave-many-out analysis. The assignment of compounds was investigated using five variables: observed anticancer activity and four structure descriptors. Generalized cluster analysis with the K-means algorithm was applied in order to investigate whether or not the assignment of compounds was proper. The Euclidean distance and maximization of the initial distance using cross-validation with a v-fold of 10 were applied. Results: All five variables included in the analysis proved to have a statistically significant contribution to the identification of clusters. Three clusters were identified, each of them containing carboquinone derivatives belonging both to the training and to the test sets. The observed activity of the carboquinone derivatives proved to be normally distributed within every cluster. The presence of training and test compounds in all clusters identified by generalized cluster analysis with the K-means algorithm, and the distribution of observed activity within clusters, support a proper assignment of compounds to training and test sets. Conclusion: Generalized cluster analysis using the K-means algorithm proved to be a valid method for assessing the random assignment of carboquinone derivatives to training and test sets.
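A minimal stand-in for the cluster-based check described above: run K-means on hypothetical activity values tagged as training or test compounds, and verify that both roles appear in every cluster. All names, values, and group sizes here are invented for illustration:

```python
import random
import statistics

def kmeans_1d(values, k, iters=25):
    """Minimal 1-D Lloyd's algorithm, seeded with evenly spaced quantiles
    (an illustrative stand-in for generalized cluster analysis)."""
    srt = sorted(values)
    centroids = [srt[(2 * i + 1) * len(srt) // (2 * k)] for i in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for v in values:
            clusters[min(range(k), key=lambda j: abs(v - centroids[j]))].append(v)
        centroids = [statistics.mean(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return [min(range(k), key=lambda j: abs(v - centroids[j])) for v in values]

# Hypothetical "activities" for 30 compounds in three potency groups,
# each group split 7 training / 3 test.
random.seed(7)
compounds = [(random.gauss(mu, 0.3), role)
             for mu in (2.0, 5.0, 8.0)
             for role in ["train"] * 7 + ["test"] * 3]
labels = kmeans_1d([a for a, _ in compounds], k=3)

# A proper random split leaves training and test compounds in every cluster.
for c in range(3):
    roles = {role for (_, role), lab in zip(compounds, labels) if lab == c}
    print(c, sorted(roles))
```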

  17. Evaluation of primary immunization coverage of infants under universal immunization programme in an urban area of bangalore city using cluster sampling and lot quality assurance sampling techniques.

    Science.gov (United States)

    K, Punith; K, Lalitha; G, Suman; Bs, Pradeep; Kumar K, Jayanth

    2008-07-01

    Is the LQAS technique better than the cluster sampling technique in terms of resources needed to evaluate immunization coverage in an urban area? To assess and compare lot quality assurance sampling against cluster sampling in the evaluation of primary immunization coverage. Population-based cross-sectional study. Areas under Mathikere Urban Health Center. Children aged 12 months to 23 months. 220 in cluster sampling, 76 in lot quality assurance sampling. Percentages and proportions, chi-square test. (1) Using cluster sampling, the percentages of completely immunized, partially immunized and unimmunized children were 84.09%, 14.09% and 1.82%, respectively. With lot quality assurance sampling, they were 92.11%, 6.58% and 1.31%, respectively. (2) Immunization coverage levels as evaluated by the cluster sampling technique were not statistically different from the coverage values obtained by lot quality assurance sampling. Considering the time and resources required, lot quality assurance sampling was found to be the better technique for evaluating primary immunization coverage in an urban area.

  18. Convergence analysis for Latin-hypercube lattice-sample selection strategies for 3D correlated random hydraulic-conductivity fields

    OpenAIRE

    Simuta-Champo, R.; Herrera-Zamarrón, G. S.

    2010-01-01

    The Monte Carlo technique provides a natural method for evaluating uncertainties. The uncertainty is represented by a probability distribution or by related quantities such as statistical moments. When the groundwater flow and transport governing equations are solved and the hydraulic conductivity field is treated as a random spatial function, the hydraulic head, velocities and concentrations also become random spatial functions. When that is the case, for the stochastic simulation of groundw...
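Latin hypercube selection as used in such convergence studies can be sketched in a few lines: each dimension is divided into n equal strata and each stratum is sampled exactly once, in independently shuffled order per dimension. A generic illustration, not the authors' sampler:

```python
import random

def latin_hypercube(n, dims, seed=0):
    """n sample points in [0, 1)^dims: each dimension is divided into n
    equal strata and each stratum is sampled exactly once."""
    rng = random.Random(seed)
    columns = []
    for _ in range(dims):
        strata = list(range(n))
        rng.shuffle(strata)
        columns.append([(s + rng.random()) / n for s in strata])
    return list(zip(*columns))  # n points, each a dims-tuple

points = latin_hypercube(8, 3)
for p in points[:3]:
    print(tuple(round(x, 3) for x in p))
```

The stratification guarantees that every marginal distribution is covered evenly even for small n, which is why Latin hypercube schemes typically converge faster than plain random sampling for the statistical moments of interest.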

  19. Use of a multigrid technique to study effects of limited sampling of heterogeneity on transport prediction

    International Nuclear Information System (INIS)

    Cole, C.R.; Foote, H.P.

    1987-02-01

    Reliable ground water transport prediction requires accurate spatial and temporal characterization of a hydrogeologic system. However, cost constraints and the desire to maintain site integrity by minimizing drilling can restrict the amount of spatial sampling that can be obtained to resolve the flow parameter variability associated with heterogeneities. This study quantifies the errors in subsurface transport predictions resulting from incomplete characterization of hydraulic conductivity heterogeneity. A multigrid technique was used to simulate two-dimensional flow velocity fields with high resolution. To obtain these velocity fields, the finite difference code MGRID, which implements a multigrid solution technique, was applied to compute stream functions on a 256-by-256 grid for a variety of hypothetical systems having detailed distributions of hydraulic conductivity. Spatial variability in hydraulic conductivity distributions was characterized by the components in the spectrum of spatial frequencies. A low-pass spatial filtering technique was applied to the base case hydraulic conductivity distribution to produce a data set with lower spatial frequency content. Arrival time curves were then calculated for the filtered hydraulic conductivity distribution and compared to base case results to judge the relative importance of the higher spatial frequency components. Results indicate a progression from multimode to single-mode arrival time curves as the number and extent of distinct flow pathways are reduced by low-pass filtering. This relationship between transport predictions and spatial frequencies was used to judge the consequences of sampling the hydraulic conductivity with reduced spatial resolution. 22 refs., 17 figs
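The low-pass filtering step described above can be illustrated in one dimension with a moving-average filter, a simplified stand-in for the spatial-frequency filtering applied to the 2-D conductivity fields (the synthetic profile below is invented):

```python
import math
import statistics

def moving_average(field, window):
    """Centered moving average with reflected edges: a simple low-pass filter."""
    half = window // 2
    padded = field[half:0:-1] + field + field[-2:-half - 2:-1]
    return [sum(padded[i:i + window]) / window for i in range(len(field))]

# Synthetic 1-D log-conductivity profile: a smooth long-wavelength trend plus
# high-spatial-frequency heterogeneity.
n = 256
field = [math.sin(2 * math.pi * i / 64) + 0.5 * math.sin(2 * math.pi * i / 4)
         for i in range(n)]
smooth = moving_average(field, window=5)

var_before = statistics.pvariance(field)
var_after = statistics.pvariance(smooth)
print(f"variance before: {var_before:.3f}, after: {var_after:.3f}")
```

The filter passes the long-wavelength component almost unchanged while strongly attenuating the short-wavelength heterogeneity, mimicking how reduced sampling resolution removes the distinct flow pathways that produce multimodal arrival time curves.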

  20. A cost-effective technique for integrating personal radiation dose assessment with personal gravimetric sampling

    International Nuclear Information System (INIS)

    Strydom, R.; Rolle, R.; Van der Linde, A.

    1992-01-01

    During recent years there has been an increasing awareness internationally of radiation levels in the mining and milling of radioactive ores, including those from non-uranium mines. A major aspect of radiation control is concerned with the measurement of radiation levels and the assessment of radiation doses incurred by individual workers. Current techniques available internationally for personnel monitoring of radiation exposures are expensive and there is a particular need to reduce the cost of personal radiation monitoring in South African gold mines because of the large labour force employed. In this regard the obvious benefits of integrating personal radiation monitoring with existing personal monitoring systems already in place in South African gold mines should be exploited. A system which can be utilized for this purpose is personal gravimetric sampling. A new cost-effective technique for personal radiation monitoring, which can be fully integrated with the personal gravimetric sampling strategy being implemented on mines, has been developed in South Africa. The basic principles of this technique and its potential in South African mines are described. 9 refs., 7 figs

  1. An Electromagnetic Gauge Technique for Measuring Shocked Particle Velocity in Electrically Conductive Samples

    Science.gov (United States)

    Cheng, David; Yoshinaka, Akio

    2014-11-01

    Electromagnetic velocity (EMV) gauges are a class of film gauges which permit the direct in-situ measurement of shocked material flow velocity. The active sensing element, typically a metallic foil, requires exposure to a known external magnetic field in order to produce motional electromotive force (emf). Due to signal distortion caused by mutual inductance between sample and EMV gauge, this technique is typically limited to shock waves in non-conductive materials. In conductive samples, motional emf generated in the EMV gauge has to be extracted from the measured signal which results from the combined effects of both motional emf and voltage changes from induced currents. An electromagnetic technique is presented which analytically models the dynamics of induced current between a copper disk moving as a rigid body with constant 1D translational velocity toward an EMV gauge, where both disk and gauge are exposed to a uniform external static magnetic field. The disk is modelled as a magnetic dipole loop where its Foucault current is evaluated from the characteristics of the fields, whereas the EMV gauge is modelled as a circuit loop immersed in the field of the magnetic dipole loop, the intensity of which is calculated as a function of space and, implicitly, time. Equations of mutual induction are derived and the current induced in the EMV gauge loop is solved, allowing discrimination of the motional emf. Numerical analysis is provided for the step response of the induced EMV gauge current with respect to the Foucault current in the moving copper sample.

  2. Coherent optical adaptive technique improves the spatial resolution of STED microscopy in thick samples

    Science.gov (United States)

    Yan, Wei; Yang, Yanlong; Tan, Yu; Chen, Xun; Li, Yang; Qu, Junle; Ye, Tong

    2018-01-01

    Stimulated emission depletion (STED) microscopy is one of the far-field optical microscopy techniques that can provide sub-diffraction spatial resolution. The spatial resolution of STED microscopy is determined by the specially engineered beam profile of the depletion beam and its power. However, the beam profile of the depletion beam may be distorted by aberrations of the optical system and inhomogeneity of the specimen's optical properties, resulting in compromised spatial resolution. The situation deteriorates when thick samples are imaged. In the worst case, severe distortion of the depletion beam profile may cause complete loss of the super-resolution effect, no matter how much depletion power is applied to the specimen. Previously, several adaptive optics approaches have been explored to compensate for aberrations of systems and specimens. However, it is hard to correct the complicated high-order optical aberrations of specimens. In this report, we demonstrate that the complicated distorted wavefront from a thick phantom sample can be measured using the coherent optical adaptive technique (COAT). The full correction can effectively maintain and improve the spatial resolution in imaging thick samples. PMID:29400356

  3. Localisation and identification of radioactive particles in solid samples by means of a nuclear track technique

    International Nuclear Information System (INIS)

    Boehnke, Antje; Treutler, Hanns-Christian; Freyer, Klaus; Schubert, Michael; Weiss, Holger

    2005-01-01

    This study aims to develop a generally applicable methodology for the localisation of single alpha-active particles in solid samples, such as industrial dust or natural soils, sediments and rocks, by autoradiography using solid-state nuclear track detectors. The developed technique allows the detection of local enrichments of alpha-emitters in any solid material. The results of such an investigation are of interest from technical, biological and environmental points of view. The idea behind the methodology is to locate the position of alpha-active spots in a sample by attaching the track detector to the sample in a defined manner, thoroughly described in the paper. The located alpha-active particles are subsequently analysed by an electron microscope and an electron microprobe. An example of the application of this methodology is also given: an ultra-fine-grained ore-processing residue, which causes serious environmental pollution in the respective mining district and thus limits possible land use and affects quality of life in the area, was examined using the described technique. The investigation revealed considerable amounts of alpha-active particles in this material

  4. Review of cleaning techniques and their effects on the chemical composition of foliar samples

    Energy Technology Data Exchange (ETDEWEB)

    Rossini Oliva, S.; Raitio, H.

    2003-07-01

    Chemical foliar analysis is a tool widely used to study tree nutrition and to monitor the impact and extent of air pollutants. This paper reviews a number of cleaning methods and the effects of cleaning on foliar chemistry. Cleaning may involve mechanical techniques, such as the use of dry or moistened tissues, shaking, blowing and brushing, or various washing techniques with water or other solvents. Owing to the diversity of plant species, tissue differences, etc., there is no standard procedure for all kinds of samples. Analysis of uncleaned leaves is considered a good method when assessing the degree of air contamination, because it provides an estimate of the element content of the deposits on leaf surfaces, or when the analysis is aimed at investigating the transfer of elements along the food chain. Sample cleaning is recommended in order (1) to investigate the transfer rate of chemical elements from soil to plants, (2) to quantify the wash-off of dry deposition from foliage and (3) to separate superficially absorbed and biomass-incorporated elements. Since there is no standard cleaning procedure for all kinds of samples and aims, it is advisable to conduct a pilot study in order to establish a cleaning procedure that provides reliable foliar data. (orig.)

  5. Convergence and Efficiency of Adaptive Importance Sampling Techniques with Partial Biasing

    Science.gov (United States)

    Fort, G.; Jourdain, B.; Lelièvre, T.; Stoltz, G.

    2018-04-01

    We propose a new Monte Carlo method to efficiently sample a multimodal distribution (known up to a normalization constant). We consider a generalization of the discrete-time Self Healing Umbrella Sampling method, which can also be seen as a generalization of well-tempered metadynamics. The dynamics is based on an adaptive importance technique. The importance function relies on the weights (namely the relative probabilities) of disjoint sets which form a partition of the space. These weights are unknown but are learnt on the fly yielding an adaptive algorithm. In the context of computational statistical physics, the logarithm of these weights is, up to an additive constant, the free-energy, and the discrete valued function defining the partition is called the collective variable. The algorithm falls into the general class of Wang-Landau type methods, and is a generalization of the original Self Healing Umbrella Sampling method in two ways: (i) the updating strategy leads to a larger penalization strength of already visited sets in order to escape more quickly from metastable states, and (ii) the target distribution is biased using only a fraction of the free-energy, in order to increase the effective sample size and reduce the variance of importance sampling estimators. We prove the convergence of the algorithm and analyze numerically its efficiency on a toy example.
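
    To illustrate the adaptive biasing idea (learning the weights of the sets of the partition on the fly, penalizing already visited sets, and biasing by only a fraction of the free energy), here is a minimal Wang-Landau-type sketch on a toy double-well density. The potential, bin layout, penalization strength and partial-biasing fraction are all illustrative assumptions, not the algorithm of the paper.

```python
import math, random

random.seed(0)

# Toy multimodal target: double-well potential V; pi(x) proportional to exp(-V(x)).
def V(x):
    return 4.0 * (x * x - 1.0) ** 2

# Partition (-2, 2) into 16 equal sets; theta[i] is the running
# log-weight (free-energy estimate) of set i, learnt on the fly.
def bin_of(x):
    return min(max(int((x + 2.0) / 0.25), 0), 15)

def adaptive_sampler(n_steps, frac=0.5, gamma=0.1, step=0.4):
    """Wang-Landau-type Metropolis sampler with partial biasing: the
    target is biased by only a fraction 'frac' of the learnt free
    energy (frac=1: full biasing, frac=0: plain Metropolis)."""
    theta = [0.0] * 16
    visits = [0] * 16
    x = 0.9                       # start in one well
    for _ in range(n_steps):
        y = x + random.uniform(-step, step)
        if -2.0 < y < 2.0:
            # log acceptance ratio under the biased target
            log_a = (V(x) - V(y)) + frac * (theta[bin_of(x)] - theta[bin_of(y)])
            if math.log(random.random() + 1e-300) < log_a:
                x = y
        theta[bin_of(x)] += gamma  # penalize the currently visited set
        visits[bin_of(x)] += 1
    return visits

visits = adaptive_sampler(20000)
left, right = sum(visits[:8]), sum(visits[8:])   # occupancy of each half
```

    The accumulating penalty pushes the walker out of the well it starts in, so both modes are visited; a decreasing schedule for gamma (used in the paper's convergence analysis) is omitted here for brevity.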

  6. Preoperative warm-up the key to improved resident technique: a randomized study.

    Science.gov (United States)

    Moran-Atkin, Erin; Abdalla, Gamal; Chen, Grace; Magnuson, Thomas H; Lidor, Anne O; Schweitzer, Michael A; Steele, Kimberley E

    2015-05-01

    The ACGME has required that a skills lab be incorporated into the surgical residency curriculum. While the value of warm-up is generally accepted in other areas requiring complex motor skills, there is little evidence to support the benefits of warm-up prior to performing surgery. We are conducting this study in an attempt to identify whether a warm-up period prior to operating impacts operative technique. All general surgery residents and MIS fellows were included in this IRB-approved randomized study. Participants were randomized to either warm-up or no warm-up groups. Participants randomized to the warm-up group completed a 10 min practice session in the simulation lab within 1 h of starting the case, using an FLS training box. At the conclusion of the operation, the participant was evaluated by the attending surgeon using the validated global rating scales of Reznick and Vassiliou. The attending surgeons were blinded to the use of pre-procedure warm-up. The results of the questionnaire were analyzed using student's t test with p warm-up (19) or no warm-up (21). There was a statistically significant improvement in depth perception (p = 0.02), bimanual dexterity (p = 0.01), and efficiency of movements (p = 0.03) for those randomized to warm-up. There was statistical improvement when we preformed a composite scoring of the attending evaluations for each of the Reznick (p = 0.008) and the Vassiliou (p = 0.01) global rating scales. Preoperative warm-up significantly improves depth perception, bimanual dexterity, and efficiency of movements, as well as improvement in composite scores as judged by the attending surgeon. The lack of self-perceived improvement by the residents may be a reflection of the high standards and intense self-critique that is common among surgical trainees. We believe that our findings, while preliminary, reflect that surgical performance can be enhanced through structured warm-up activities.

  7. An R package for spatial coverage sampling and random sampling from compact geographical strata by k-means

    NARCIS (Netherlands)

    Walvoort, D.J.J.; Brus, D.J.; Gruijter, de J.J.

    2010-01-01

    Both for mapping and for estimating spatial means of an environmental variable, the accuracy of the result will usually be increased by dispersing the sample locations so that they cover the study area as uniformly as possible. We developed a new R package for designing spatial coverage samples for
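
    A minimal sketch of the idea (compact geographical strata via k-means, centroids as a spatial coverage sample, plus one random point per stratum for stratified random sampling), written here in plain Python/NumPy rather than R; the unit-square study area, grid discretization and k = 8 are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)

# Discretize the study area (here: a unit square) into candidate locations.
gx, gy = np.meshgrid(np.linspace(0, 1, 40), np.linspace(0, 1, 40))
pts = np.column_stack([gx.ravel(), gy.ravel()])

def kmeans_strata(pts, k, iters=25):
    """Lloyd's algorithm: partition the area into k compact strata."""
    centers = pts[rng.choice(len(pts), size=k, replace=False)]
    for _ in range(iters):
        d2 = ((pts[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = d2.argmin(axis=1)
        centers = np.array([pts[labels == j].mean(axis=0)
                            if (labels == j).any() else centers[j]
                            for j in range(k)])
    return centers, labels

# Spatial coverage sample: the stratum centroids, spread over the area.
centers, labels = kmeans_strata(pts, k=8)

# Stratified random sample: one random candidate point per stratum.
sample = np.array([pts[rng.choice(np.flatnonzero(labels == j))]
                   for j in range(8)])
```

    The centroids minimize the mean squared distance from any location to its nearest sample point, which is what "covering the area as uniformly as possible" amounts to here.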

  8. Low-mass molecular dynamics simulation: A simple and generic technique to enhance configurational sampling

    Energy Technology Data Exchange (ETDEWEB)

    Pang, Yuan-Ping, E-mail: pang@mayo.edu

    2014-09-26

    Highlights: • Reducing atomic masses by 10-fold vastly improves sampling in MD simulations. • CLN025 folded in 4 of 10 × 0.5-μs MD simulations when masses were reduced by 10-fold. • CLN025 folded as early as 96.2 ns in 1 of the 4 simulations that captured folding. • CLN025 did not fold in 10 × 0.5-μs MD simulations when standard masses were used. • Low-mass MD simulation is a simple and generic sampling enhancement technique. - Abstract: CLN025 is one of the smallest fast-folding proteins. Until now it has not been reported that CLN025 can autonomously fold to its native conformation in a classical, all-atom, and isothermal–isobaric molecular dynamics (MD) simulation. This article reports the autonomous and repeated folding of CLN025 from a fully extended backbone conformation to its native conformation in explicit solvent in multiple 500-ns MD simulations at 277 K and 1 atm with the first folding event occurring as early as 66.1 ns. These simulations were accomplished by using AMBER forcefield derivatives with atomic masses reduced by 10-fold on Apple Mac Pros. By contrast, no folding event was observed when the simulations were repeated using the original AMBER forcefields of FF12SB and FF14SB. The results demonstrate that low-mass MD simulation is a simple and generic technique to enhance configurational sampling. This technique may propel autonomous folding of a wide range of miniature proteins in classical, all-atom, and isothermal–isobaric MD simulations performed on commodity computers—an important step forward in quantitative biology.

  9. Non-destructive high-resolution thermal imaging techniques to evaluate wildlife and delicate biological samples

    International Nuclear Information System (INIS)

    Lavers, C; Franklin, P; Franklin, P; Plowman, A; Sayers, G; Bol, J; Shepard, D; Fields, D

    2009-01-01

    Thermal imaging cameras now allow routine monitoring of dangerous yet endangered wildlife in captivity. This study looks at potential applications of radiometrically calibrated thermal data to wildlife, as well as providing parameters for future materials applications. We present a non-destructive active testing technique suitable for enhancing imagery contrast of thin or delicate biological specimens, yielding improved thermal contrast at room temperature for analysis of sample thermal properties. A broad spectrum of animals is studied, with different textured surfaces and different reflective and emissive properties in the infrared part of the electromagnetic spectrum. Some surface features offer biomimetic materials design opportunities.

  10. A novel in-situ sampling and VFA sensor technique for anaerobic systems

    DEFF Research Database (Denmark)

    Pind, Peter Frode; Angelidaki, Irini; Ahring, Birgitte Kiær

    2002-01-01

    A key piece of information for understanding and controlling the anaerobic biogas process is the concentration of Volatile Fatty Acids (VFA). However, access to this information has so far been limited to off-line measurements by manual, time- and labour-consuming methods. We have developed a new technique ...... than 1000 samples on both a full-scale biogas plant and lab-scale reactors. The measuring range covers specific measurements of acetate, propionate, iso-/n-butyrate and iso-/n-valerate from 0.1 to 50 mM (6–3,000 mg)....

  11. Investigation of an egyptian phosphate ore sample by neutron activation analysis technique

    International Nuclear Information System (INIS)

    Eissa, E.A.; Aly, R.A.; Rofail, N.B.; Hassan, A.M.

    1995-01-01

    A domestic phosphate ore sample has been analysed by means of prompt and delayed gamma-ray spectrometry following activation by the thermal neutron capture technique. The rabbit pneumatic transfer system (RPTS), a long irradiation facility and two Pu/Be (2.5 Ci each) neutron sources set up for prompt (n,gamma) measurements were applied. A high-purity germanium (HPGe) gamma-ray spectrometer with a personal computer analyzer (PCA) system was used for spectrum measurements. Programmes on the VAX computer were utilized for estimating the elemental concentrations of 22 out of 36 elements identified in this work. 2 tabs

  12. Non-destructive high-resolution thermal imaging techniques to evaluate wildlife and delicate biological samples

    Energy Technology Data Exchange (ETDEWEB)

    Lavers, C; Franklin, P; Franklin, P; Plowman, A; Sayers, G; Bol, J; Shepard, D; Fields, D, E-mail: brnc-radarcomms1@nrta.mod.u [Sensors Team, Plymouth University at Britannia Royal Naval College, Dartmouth, Devon (United Kingdom) and Paignton Zoological Park, Paignton, Devon (United Kingdom); Thermal Wave Imaging, Inc., 845 Livernoise St, Ferndale, MI (United States); Buckfast Butterfly and Otter Sanctuary, Buckfast, Devon (United Kingdom)

    2009-07-01

    Thermal imaging cameras now allow routine monitoring of dangerous yet endangered wildlife in captivity. This study looks at potential applications of radiometrically calibrated thermal data to wildlife, as well as providing parameters for future materials applications. We present a non-destructive active testing technique suitable for enhancing imagery contrast of thin or delicate biological specimens, yielding improved thermal contrast at room temperature for analysis of sample thermal properties. A broad spectrum of animals is studied, with different textured surfaces and different reflective and emissive properties in the infrared part of the electromagnetic spectrum. Some surface features offer biomimetic materials design opportunities.

  13. Analysis of boron utilization in sample preparation for microorganisms detection by neutron radiography technique

    International Nuclear Information System (INIS)

    Wacha, Reinaldo; Crispim, Verginia R.

    2000-01-01

    The neutron radiography technique applied to the detection of microorganisms is studied as a new and faster alternative for the diagnosis of infectious agents. This work presents the parameters and effects involved in the use of boron as a conversion agent, which converts neutrons into alpha particles capable of generating latent tracks in a solid-state nuclear track detector, CR-39. The collected samples are doped with boron by the incubation method, promoting a microorganism/boron interaction that guarantees the identification of the images of those microorganisms through their morphology. (author)

  14. A rapid method of radium-226 analysis in water samples using an alpha spectroscopic technique

    International Nuclear Information System (INIS)

    Lim, T.P.

    1981-01-01

    A fast, reliable and accurate method for radium-226 determination in environmental water samples has been devised, using an alpha spectroscopic technique. The correlation between barium-133 and radium-226 in the barium-radium sulphate precipitation mechanism was studied and in the limited experimental recovery range, the coefficient of correlation was r = 0.986. A self-absorption study for various barium carrier concentrations was also undertaken to obtain the least broadening of alpha energy line widths. An optimum value of 0.3 mg barium carrier was obtained for chemical recovery in the range of 85 percent. (auth)
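
    The tracer-based yield correction implied above (barium-133 as a recovery monitor for the barium-radium sulphate co-precipitation) reduces to a short calculation; all of the numbers below are invented for illustration and are not data from the paper.

```python
# Invented counting data for one water sample (illustration only).
ba133_added      = 50.0      # Bq of Ba-133 tracer added before precipitation
ba133_recovered  = 42.5      # Bq of Ba-133 measured after precipitation
net_alpha_counts = 1200.0    # net Ra-226 alpha counts in the region of interest
count_time       = 60000.0   # counting time, s
alpha_efficiency = 0.25      # detected counts per alpha decay

# Chemical recovery estimated from the Ba-133 tracer (~85%, as quoted above).
recovery = ba133_recovered / ba133_added

# Ra-226 activity corrected for counting efficiency and chemical yield (Bq).
ra226_activity = net_alpha_counts / (count_time * alpha_efficiency * recovery)
```

    The point of the Ba-133/Ra-226 correlation study in the abstract is precisely that the measured barium recovery can stand in for the (unmeasurable) radium recovery in this correction.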

  15. TRAN-STAT: statistics for environmental studies, Number 22. Comparison of soil-sampling techniques for plutonium at Rocky Flats

    International Nuclear Information System (INIS)

    Gilbert, R.O.; Bernhardt, D.E.; Hahn, P.B.

    1983-01-01

    A summary of a field soil-sampling study conducted around the Rocky Flats, Colorado plant in May 1977 is presented. Several different soil-sampling techniques that had been used in the area were applied at four different sites. One objective was to compare the average 239-240Pu concentration values obtained by the various soil-sampling techniques used. There was also interest in determining whether there are differences in the reproducibility of the various techniques and how the techniques compared with the proposed EPA technique of sampling to 1 cm depth. Statistically significant differences in average concentrations between the techniques were found. The differences could be largely related to the differences in sampling depth, the primary physical variable between the techniques. The reproducibility of the techniques was evaluated by comparing coefficients of variation. Differences between coefficients of variation were not statistically significant. Average (median) coefficients ranged from 21 to 42 percent for the five sampling techniques. A laboratory study indicated that various sample treatment and particle-sizing techniques could increase the concentration of plutonium in the less than 10 micrometer size fraction by up to a factor of about 4 compared to the 2 mm size fraction

  16. A random sampling approach for robust estimation of tissue-to-plasma ratio from extremely sparse data.

    Science.gov (United States)

    Chu, Hui-May; Ette, Ene I

    2005-09-02

    This study was performed to develop a new nonparametric approach for the estimation of a robust tissue-to-plasma ratio from extremely sparsely sampled paired data (i.e., one sample each from plasma and tissue per subject). The tissue-to-plasma ratio was estimated from paired/unpaired experimental data using the independent time points approach, area under the curve (AUC) values calculated with the naïve data averaging approach, and AUC values calculated using sampling-based approaches (e.g., the pseudoprofile-based bootstrap [PpbB] approach and the random sampling approach [our proposed approach]). The random sampling approach involves the use of a 2-phase algorithm. The convergence of the sampling/resampling approaches was investigated, as well as the robustness of the estimates produced by the different approaches. To evaluate the latter, new data sets were generated by introducing outlier(s) into the real data set: 1 to 2 concentration values were inflated by 10% to 40% from their original values to produce the outliers. Tissue-to-plasma ratios computed using the independent time points approach varied between 0 and 50 across time points. The ratio obtained from AUC values acquired using the naïve data averaging approach was not associated with any measure of uncertainty or variability, and calculating the ratio without regard to pairing yielded poorer estimates. The random sampling and pseudoprofile-based bootstrap approaches yielded tissue-to-plasma ratios with uncertainty and variability. However, the random sampling approach, because of the 2-phase nature of its algorithm, yielded more robust estimates and required fewer replications. Therefore, a 2-phase random sampling approach is proposed for the robust estimation of tissue-to-plasma ratio from extremely sparsely sampled data.
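
    The flavor of the sampling-based approaches can be conveyed with a small sketch: pseudoprofiles are assembled by randomly drawing one paired (plasma, tissue) observation per time point, and the AUC ratio is recomputed over many replicates to obtain a measure of variability. The concentration data below are invented for illustration, and the 2-phase refinement of the paper's algorithm is omitted.

```python
import random

random.seed(1)

# Hypothetical sparse data: at each time point a few subjects each give
# one plasma and one tissue concentration (paired, one sample per subject).
times = [1.0, 2.0, 4.0, 8.0]
plasma = {1.0: [10.2, 9.8, 11.0], 2.0: [7.9, 8.4],
          4.0: [4.1, 3.8, 4.5],   8.0: [1.2, 1.0]}
tissue = {1.0: [20.5, 19.0, 22.1], 2.0: [16.0, 17.2],
          4.0: [8.0, 7.7, 9.1],    8.0: [2.6, 2.2]}

def trapezoid_auc(ts, cs):
    return sum((cs[i] + cs[i + 1]) * (ts[i + 1] - ts[i]) / 2.0
               for i in range(len(ts) - 1))

def ratio_replicate():
    """One pseudoprofile: draw one subject per time point (the same index
    in both matrices, preserving pairing) and form AUC(tissue)/AUC(plasma)."""
    idx = {t: random.randrange(len(plasma[t])) for t in times}
    auc_p = trapezoid_auc(times, [plasma[t][idx[t]] for t in times])
    auc_t = trapezoid_auc(times, [tissue[t][idx[t]] for t in times])
    return auc_t / auc_p

reps = [ratio_replicate() for _ in range(500)]
mean_ratio = sum(reps) / len(reps)
spread = (min(reps), max(reps))   # crude variability measure
```

    Unlike the naïve data averaging approach, the spread of the replicates attaches a measure of variability to the estimated ratio, which is the abstract's main argument for the sampling-based methods.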

  17. Variances in the projections, resulting from CLIMEX, Boosted Regression Trees and Random Forests techniques

    Science.gov (United States)

    Shabani, Farzin; Kumar, Lalit; Solhjouy-fard, Samaneh

    2017-08-01

    The aim of this study was to comparatively investigate and evaluate the capabilities of correlative and mechanistic modeling processes, applied to the projection of future distributions of date palm in novel environments, and to establish a method of minimizing uncertainty in the projections of differing techniques. The study area, on a global scale, comprises the Middle Eastern countries. We compared the mechanistic model CLIMEX (CL) with the correlative models MaxEnt (MX), Boosted Regression Trees (BRT), and Random Forests (RF) to project current and future distributions of date palm (Phoenix dactylifera L.). The Global Climate Model (GCM) CSIRO-Mk3.0 (CS), using the A2 emissions scenario, was selected for making projections. Both indigenous and alien distribution data of the species were utilized in the modeling process. The common areas predicted by MX, BRT, RF, and CL from the CS GCM were extracted and compared to ascertain the projection uncertainty levels of each individual technique. The common areas identified by all four modeling techniques were used to produce a map indicating suitable and unsuitable areas for date palm cultivation in Middle Eastern countries, for the present and for the year 2100. The four modeling approaches predict fairly different distributions. Projections from CL were more conservative than those from MX, while BRT and RF were the most conservative methods in terms of projections for the current time. The combination of the final CL and MX projections for the present and 2100 provides higher certainty concerning those areas that will become highly suitable for future date palm cultivation. According to the four models, cold, hot, and wet stress, with differences on a regional basis, appear to be the major restrictions on future date palm distribution. The results demonstrate variances in the projections resulting from the different techniques. The assessment and interpretation of model projections requires reservations

  18. Three Proposed Compendia for Genesis Solar Wind Samples: Science Results, Collector Materials Characterization and Cleaning Techniques

    Science.gov (United States)

    Allton, J. H.; Calaway, M. J.; Nyquist, L. E.; Jurewicz, A. J. G.; Burnett, D. S.

    2018-01-01

    The final paper, not the abstract, is attached. Introduction: Planetary material and cosmochemistry research using Genesis solar wind samples (including the development and implementation of cleaning and analytical techniques) has matured sufficiently that compilations on several topics, if made publicly accessible, would be beneficial for researchers and reviewers. We propose here three compendia based on content, organization and source of documents (e.g. published peer-reviewed papers, internal memos, archives). For planning purposes, suggestions are solicited from potential users of Genesis solar wind samples as to the type of science content and/or organizational style that would be most useful to them. These compendia are proposed as living documents, to be periodically updated. Like the existing compendia described below, the curation compendia are library- or archival-style finding aids: they are guides to published or archival documents and should not be cited as primary sources.

  19. High Field In Vivo 13C Magnetic Resonance Spectroscopy of Brain by Random Radiofrequency Heteronuclear Decoupling and Data Sampling

    Science.gov (United States)

    Li, Ningzhi; Li, Shizhe; Shen, Jun

    2017-06-01

    In vivo 13C magnetic resonance spectroscopy (MRS) is a unique and effective tool for studying dynamic human brain metabolism and the cycling of neurotransmitters. One of the major technical challenges for in vivo 13C-MRS is the high radio frequency (RF) power necessary for heteronuclear decoupling. In the common practice of in vivo 13C-MRS, alkanyl carbons are detected in the spectral range of 10-65 ppm, and the amplitude of the decoupling pulses has to be significantly greater than the large one-bond 1H-13C scalar coupling (1JCH = 125-145 Hz). Two main proton decoupling methods have been developed: broadband stochastic decoupling and coherent composite or adiabatic pulse decoupling (e.g., WALTZ); the latter is widely used because of its efficiency and superb performance under an inhomogeneous B1 field. Because the RF power required for proton decoupling increases quadratically with field strength, in vivo 13C-MRS using coherent decoupling is often limited to low magnetic fields. Carboxylic and amide carbons, by contrast, are coupled to protons only via weak long-range 1H-13C scalar couplings, which can be decoupled using low RF power broadband stochastic decoupling. Recently, the carboxylic/amide 13C-MRS technique using low-power random RF heteronuclear decoupling was safely applied to human brain studies at 7T. Here, we review the two major decoupling methods and carboxylic/amide 13C-MRS with the low-power decoupling strategy. Further decreases in RF power deposition by frequency-domain windowing and time-domain random under-sampling are also discussed. Low RF power decoupling opens the possibility of performing in vivo 13C experiments on the human brain at very high magnetic fields (such as 11.7T), where signal-to-noise ratio as well as spatial and temporal spectral resolution are more favorable than at lower fields.

  20. A scatter-corrected list-mode reconstruction and a practical scatter/random approximation technique for dynamic PET imaging

    International Nuclear Information System (INIS)

    Cheng, J-C; Rahmim, Arman; Blinder, Stephan; Camborde, Marie-Laure; Raywood, Kelvin; Sossi, Vesna

    2007-01-01

    We describe an ordinary Poisson list-mode expectation maximization (OP-LMEM) algorithm with a sinogram-based scatter correction method based on the single scatter simulation (SSS) technique and a random correction method based on the variance-reduced delayed-coincidence technique. We also describe a practical approximate scatter and random-estimation approach for dynamic PET studies based on a time-averaged scatter and random estimate followed by scaling according to the global numbers of true coincidences and randoms for each temporal frame. The quantitative accuracy achieved using OP-LMEM was compared to that obtained using the histogram-mode 3D ordinary Poisson ordered subset expectation maximization (3D-OP) algorithm with similar scatter and random correction methods, and they showed excellent agreement. The accuracy of the approximated scatter and random estimates was tested by comparing time activity curves (TACs) as well as the spatial scatter distribution from dynamic non-human primate studies obtained from the conventional (frame-based) approach and those obtained from the approximate approach. An excellent agreement was found, and the time required for the calculation of scatter and random estimates in the dynamic studies became much less dependent on the number of frames (we achieved a nearly four times faster performance on the scatter and random estimates by applying the proposed method). The precision of the scatter fraction was also demonstrated for the conventional and the approximate approach using phantom studies
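
    The frame-scaling approximation described above can be sketched as follows. The counts and totals are invented numbers, and a real implementation scales full scatter/random sinograms rather than scalar totals; only the scaling logic is illustrated.

```python
# Hypothetical global counts for a 4-frame dynamic study (invented numbers).
frame_trues   = [4.0e6, 3.2e6, 2.5e6, 1.8e6]   # true coincidences per frame
frame_randoms = [1.0e6, 0.7e6, 0.5e6, 0.3e6]   # randoms per frame

# Time-averaged estimates, computed once for the whole study.
scatter_avg = 0.9e6     # scatter in the averaged estimate
randoms_avg = 0.625e6   # randoms in the averaged estimate

def scale_to_frames(avg_estimate, frame_counts):
    """Scale a single time-averaged estimate to each temporal frame in
    proportion to that frame's share of the global counts."""
    mean_count = sum(frame_counts) / len(frame_counts)
    return [avg_estimate * c / mean_count for c in frame_counts]

scatter_frames = scale_to_frames(scatter_avg, frame_trues)    # scaled by trues
randoms_frames = scale_to_frames(randoms_avg, frame_randoms)  # scaled by randoms
```

    Because the estimate is computed once and then rescaled per frame, the cost becomes nearly independent of the number of frames, which is the source of the speed-up reported above.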

  1. Refinement of NMR structures using implicit solvent and advanced sampling techniques.

    Science.gov (United States)

    Chen, Jianhan; Im, Wonpil; Brooks, Charles L

    2004-12-15

    NMR biomolecular structure calculations exploit simulated annealing methods for conformational sampling and require a relatively high level of redundancy in the experimental restraints to determine quality three-dimensional structures. Recent advances in generalized Born (GB) implicit solvent models should make it possible to combine information from both experimental measurements and accurate empirical force fields to improve the quality of NMR-derived structures. In this paper, we study the influence of implicit solvent on the refinement of protein NMR structures and identify an optimal protocol for utilizing these improved force fields. To do so, we carry out structure refinement experiments for model proteins with published NMR structures using full NMR restraints and subsets of them. We also investigate the application of advanced sampling techniques to NMR structure refinement. Similar to the observations of Xia et al. (J. Biomol. NMR 2002, 22, 317-331), we find that the impact of implicit solvent is rather small when there is a sufficient number of experimental restraints (such as in the final stage of NMR structure determination), whether implicit solvent is used throughout the calculation or only in the final refinement step. The application of advanced sampling techniques also seems to have minimal impact in this case. However, when the experimental data are limited, we demonstrate that refinement with implicit solvent can substantially improve the quality of the structures. In particular, when combined with an advanced sampling technique, the replica exchange (REX) method, near-native structures can be rapidly moved toward the native basin. The REX method provides both enhanced sampling and automatic selection of the most native-like (lowest energy) structures. An optimal protocol based on our studies first generates an ensemble of initial structures that maximally satisfy the available experimental data with conventional NMR software using a simplified

  2. Issues in the analysis of low-content gold mining samples by the fire assay technique

    Science.gov (United States)

    Cetean, Valentina

    2016-04-01

    The classic technique for the analysis of samples with low gold content, below 0.1 g/t (= 100 ppb = parts per billion), whether ore or auriferous sediments, involves preparation of the sample by fire assay extraction, followed by chemical attack with aqua regia (hydrochloric and nitric acid) and measurement of the gold content by atomic absorption spectrometry or inductively coupled plasma mass spectrometry. The issues raised by this analysis are well known to laboratories worldwide, commercial and research ones alike. The author's knowledge of this method of determining gold content, accumulated in such a laboratory in Romania (with more than 40 years of experience, although no longer operating since 2014), confirms that obtaining reliable results requires a great deal of attention, a large amount of work and the involvement of an experienced fire assayer. The analytical conclusion for a research laboratory is that the most reliable and statistically valid results are reached for samples with more than 100 ppb gold content; the degree of confidence below this value is lower than 90%. For samples below 50 ppb it usually does not exceed 50-70%, unless each stage is very strictly controlled, which involves an additional number of hours allocated to successive extraction tests and knowing more precisely the other compounds present in the sample (Cu, Sb, As, sulfur/sulphides, Te, organic matter, etc.) or impurities.
    The most important operation is the preparation, namely:
    - grinding and splitting of the sample (which can cause uneven distribution of gold flakes between the duplicate samples analyzed);
    - pyro-metallurgical recovery of the gold (the fire assay stage), involving precise temperature control in the furnace during all stages (fusion and cupellation) and adjustment of the fire assay flux components to produce a successful fusion, depending on the sample matrix and content;
    - reducing the sample weight to decrease the amount of impurities that can be concentrated in the lead button

  3. The study of combining Latin Hypercube Sampling method and LU decomposition method (LULHS method) for constructing spatial random field

    Science.gov (United States)

    WANG, P. T.

    2015-12-01

    Groundwater modeling requires assigning hydrogeological properties to every numerical grid cell. Due to the lack of detailed information and the inherent spatial heterogeneity, geological properties can be treated as random variables. A hydrogeological property is assumed to follow a multivariate distribution with spatial correlations. By sampling random numbers from a given statistical distribution and assigning a value to each grid cell, a random field for modeling can be completed. Statistical sampling therefore plays an important role in the efficiency of the modeling procedure. Latin Hypercube Sampling (LHS) is a stratified random sampling procedure that provides an efficient way to sample variables from their multivariate distributions. This study combines the stratified random procedure from LHS with simulation using LU decomposition to form LULHS. Both conditional and unconditional simulations with LULHS were developed. The simulation efficiency and spatial correlation of LULHS are compared to those of three other simulation methods. The results show that for both conditional and unconditional simulation, the LULHS method is more efficient in terms of computational effort: fewer realizations are required to achieve the required statistical accuracy and spatial correlation.
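
    A minimal unconditional LULHS-style sketch, under illustrative assumptions (a 1-D transect, an exponential covariance with correlation length 5 grid units, and Cholesky as the symmetric LU factorization of the covariance): standard-normal deviates are drawn by Latin Hypercube Sampling and then spatially correlated with the factored covariance matrix.

```python
import numpy as np
from statistics import NormalDist

rng = np.random.default_rng(7)

# Grid locations along a 1-D transect and an exponential covariance model.
x = np.arange(40.0)
C = np.exp(-np.abs(x[:, None] - x[None, :]) / 5.0)   # correlation length 5

# LU step: Cholesky factor L with L @ L.T == C.
L = np.linalg.cholesky(C)

def lhs_normal(n, rng):
    """Latin Hypercube Sample of n standard-normal deviates: one draw
    from each of n equal-probability strata, in random order."""
    u = (rng.permutation(n) + rng.uniform(1e-6, 1 - 1e-6, size=n)) / n
    nd = NormalDist()
    return np.array([nd.inv_cdf(ui) for ui in u])

# Unconditional realization of the spatial random field.
z = lhs_normal(len(x), rng)
field = L @ z
```

    The stratification guarantees that each probability stratum of the marginal distribution is sampled exactly once, which is why fewer realizations are needed to reproduce the target statistics than with plain Monte Carlo.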

  4. A novel non-invasive diagnostic sampling technique for cutaneous leishmaniasis.

    Directory of Open Access Journals (Sweden)

    Yasaman Taslimi

    2017-07-01

    Accurate diagnosis of cutaneous leishmaniasis (CL) is important for chemotherapy and epidemiological studies. Common approaches for Leishmania detection involve the invasive collection of specimens for direct identification of amastigotes by microscopy and the culturing of promastigotes from infected tissues. Although these techniques are highly specific, they require highly skilled health workers and carry the inherent risks of all invasive procedures, such as pain and the risk of bacterial and fungal super-infection. It is therefore essential to reduce the discomfort, potential infection and scarring caused by invasive diagnostic approaches, especially for children. In this report, we present a novel non-invasive method, which is painless, rapid and user-friendly, using sequential tape strips for sampling and isolation of DNA from the surface of active and healed skin lesions of CL patients. A total of 119 patients suspected of suffering from cutaneous leishmaniasis with different clinical manifestations were recruited and samples were collected both from their lesions and from uninfected areas. In addition, 15 fungal-infected lesions and 54 areas of healthy skin were examined. The duration of sampling is short (less than one minute) and species identification by PCR is highly specific and sensitive. The sequential tape stripping sampling method is a sensitive, non-invasive and cost-effective alternative to traditional diagnostic assays and is suitable for field studies as well as for use in health care centers.

  5. Separation of arsenic species by capillary electrophoresis with sample-stacking techniques

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Zu Liang; Naidu, Ravendra [Adelaide Laboratory, CSIRO Land and Water, PMB2, 5064, Glen Osmond, SA (Australia); Lin, Jin-Ming [Research Center for Eco-Environmental Sciences, Chinese Academy of Sciences, P.O. Box 2871, 100085, Beijing (China)

    2003-03-01

    A simple capillary zone electrophoresis procedure was developed for the separation of arsenic species (AsO{sub 2}{sup 2-}, AsO{sub 4}{sup 2-}, and dimethylarsinic acid, DMA). Both counter-electroosmotic and co-electroosmotic (EOF) modes were investigated for the separation of arsenic species with direct UV detection at 185 nm using 20 mmol L{sup -1} sodium phosphate as the electrolyte. The separation selectivity depends mainly on the separation mode and the electrolyte pH. Inorganic anions (Cl{sup -}, NO{sub 2}{sup -}, NO{sub 3}{sup -} and SO{sub 4}{sup 2-}) present in real samples did not interfere with arsenic speciation in either separation mode. To improve the detection limits, sample-stacking techniques, including large-volume sample stacking (LVSS) and field-amplified sample injection (FASI), were investigated for the preconcentration of As species in co-CZE mode. Detection limits below 1 {mu}mol L{sup -1} were achieved for As species using FASI. The proposed method was demonstrated for the separation and detection of As species in water. (orig.)

  6. Comparing cluster-level dynamic treatment regimens using sequential, multiple assignment, randomized trials: Regression estimation and sample size considerations.

    Science.gov (United States)

    NeCamp, Timothy; Kilbourne, Amy; Almirall, Daniel

    2017-08-01

    Cluster-level dynamic treatment regimens can be used to guide sequential treatment decision-making at the cluster level in order to improve outcomes at the individual or patient-level. In a cluster-level dynamic treatment regimen, the treatment is potentially adapted and re-adapted over time based on changes in the cluster that could be impacted by prior intervention, including aggregate measures of the individuals or patients that compose it. Cluster-randomized sequential multiple assignment randomized trials can be used to answer multiple open questions preventing scientists from developing high-quality cluster-level dynamic treatment regimens. In a cluster-randomized sequential multiple assignment randomized trial, sequential randomizations occur at the cluster level and outcomes are observed at the individual level. This manuscript makes two contributions to the design and analysis of cluster-randomized sequential multiple assignment randomized trials. First, a weighted least squares regression approach is proposed for comparing the mean of a patient-level outcome between the cluster-level dynamic treatment regimens embedded in a sequential multiple assignment randomized trial. The regression approach facilitates the use of baseline covariates which is often critical in the analysis of cluster-level trials. Second, sample size calculators are derived for two common cluster-randomized sequential multiple assignment randomized trial designs for use when the primary aim is a between-dynamic treatment regimen comparison of the mean of a continuous patient-level outcome. The methods are motivated by the Adaptive Implementation of Effective Programs Trial which is, to our knowledge, the first-ever cluster-randomized sequential multiple assignment randomized trial in psychiatry.
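
    The weighted least-squares idea can be illustrated generically. The sketch below uses synthetic data and arbitrary known weights; it is not the paper's estimator, whose weights are derived from the sequential randomization probabilities and whose inference accounts for clustering.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 400
x = rng.normal(size=n)                  # baseline covariate (synthetic)
z = rng.integers(0, 2, size=n)          # regimen indicator (synthetic)
y = 1.0 + 0.5 * x + 2.0 * z + rng.normal(size=n)
w = rng.uniform(1, 4, size=n)           # known design weights (illustrative)

# WLS reduces to OLS after scaling each row by sqrt(weight)
X = np.column_stack([np.ones(n), x, z])
sw = np.sqrt(w)
beta, *_ = np.linalg.lstsq(X * sw[:, None], y * sw, rcond=None)
# beta[2] estimates the between-regimen difference in means (true value 2.0)
```

    Including the baseline covariate x in the design matrix is what the regression approach buys over a simple difference of weighted means: precision improves without changing the target estimand.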

  7. Hardware architecture for projective model calculation and false match refining using random sample consensus algorithm

    Science.gov (United States)

    Azimi, Ehsan; Behrad, Alireza; Ghaznavi-Ghoushchi, Mohammad Bagher; Shanbehzadeh, Jamshid

    2016-11-01

    The projective model is an important mapping function for the calculation of the global transformation between two images. However, its hardware implementation is challenging because of the large number of coefficients with different required precisions for fixed-point representation. A VLSI hardware architecture is proposed for the calculation of a global projective model between input and reference images and for refining false matches using the random sample consensus (RANSAC) algorithm. To make the hardware implementation feasible, it is proved that the calculation of the projective model can be divided into four submodels comprising two translations, an affine model and a simpler projective mapping. This approach makes the hardware implementation feasible and considerably reduces the required number of bits for fixed-point representation of model coefficients and intermediate variables. The proposed hardware architecture for the calculation of a global projective model using the RANSAC algorithm was implemented in the Verilog hardware description language, and the functionality of the design was validated through several experiments. The proposed architecture was synthesized using an application-specific integrated circuit digital design flow utilizing 180-nm CMOS technology, as well as on a Virtex-6 field programmable gate array. Experimental results confirm the efficiency of the proposed hardware architecture in comparison with a software implementation.
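
    The paper implements RANSAC in hardware for a projective model; the consensus loop itself is easiest to see in software. Below is a minimal, hypothetical NumPy sketch for a line model rather than a homography: fit minimal samples, count inliers, keep the model with the largest consensus, then refit on its inlier set.

```python
import numpy as np

def ransac_line(points, n_iters=200, threshold=0.1, rng=None):
    """Fit y = a*x + b by RANSAC: fit minimal 2-point samples, keep the
    model with the largest inlier consensus, then refit on the inliers."""
    rng = np.random.default_rng(rng)
    best = np.zeros(len(points), dtype=bool)
    for _ in range(n_iters):
        i, j = rng.choice(len(points), size=2, replace=False)
        (x1, y1), (x2, y2) = points[i], points[j]
        if x1 == x2:                       # degenerate minimal sample
            continue
        a = (y2 - y1) / (x2 - x1)
        b = y1 - a * x1
        inliers = np.abs(points[:, 1] - (a * points[:, 0] + b)) < threshold
        if inliers.sum() > best.sum():
            best = inliers
    a, b = np.polyfit(points[best, 0], points[best, 1], 1)   # least-squares refit
    return a, b, best

x = np.linspace(0.0, 1.0, 100)
pts = np.column_stack([x, 2.0 * x + 1.0])
pts[::10, 1] += 5.0                        # contaminate every 10th point
a, b, inliers = ransac_line(pts, rng=1)
# the gross outliers are rejected and (a, b) recovers (2, 1)
```

    The same structure (hypothesize from a minimal sample, score by consensus) carries over to the projective case, where the minimal sample is four point correspondences instead of two points.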

  8. Accelerating Markov chain Monte Carlo simulation by differential evolution with self-adaptive randomized subspace sampling

    Energy Technology Data Exchange (ETDEWEB)

    Vrugt, Jasper A [Los Alamos National Laboratory; Hyman, James M [Los Alamos National Laboratory; Robinson, Bruce A [Los Alamos National Laboratory; Higdon, Dave [Los Alamos National Laboratory; Ter Braak, Cajo J F [NETHERLANDS; Diks, Cees G H [UNIV OF AMSTERDAM

    2008-01-01

    Markov chain Monte Carlo (MCMC) methods have found widespread use in many fields of study to estimate the average properties of complex systems, and for posterior inference in a Bayesian framework. Existing theory and experiments prove convergence of well-constructed MCMC schemes to the appropriate limiting distribution under a variety of different conditions. In practice, however, this convergence is often observed to be disturbingly slow. This is frequently caused by an inappropriate selection of the proposal distribution used to generate trial moves in the Markov chain. Here we show that significant improvements to the efficiency of MCMC simulation can be made by using a self-adaptive Differential Evolution learning strategy within a population-based evolutionary framework. This scheme, entitled DiffeRential Evolution Adaptive Metropolis or DREAM, runs multiple different chains simultaneously for global exploration, and automatically tunes the scale and orientation of the proposal distribution in randomized subspaces during the search. Ergodicity of the algorithm is proved, and various examples involving nonlinearity, high-dimensionality, and multimodality show that DREAM is generally superior to other adaptive MCMC sampling approaches. The DREAM scheme significantly enhances the applicability of MCMC simulation to complex, multi-modal search problems.
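
    DREAM adds randomized subspace sampling, multiple chain pairs and outlier correction on top of its ancestor, Differential Evolution MCMC (ter Braak, 2006). The sketch below shows only that simpler ancestor — each chain proposes a jump along the difference of two other chains — and is not the DREAM algorithm itself.

```python
import numpy as np

def de_mc(log_post, n_chains, n_dims, n_steps, eps=1e-6, rng=None):
    """Differential Evolution MCMC (ter Braak, 2006): each chain proposes
    a jump along the difference of two other randomly chosen chains, so
    the population itself adapts the scale and orientation of proposals."""
    rng = np.random.default_rng(rng)
    gamma = 2.38 / np.sqrt(2 * n_dims)          # near-optimal jump scale
    x = rng.normal(size=(n_chains, n_dims))
    logp = np.array([log_post(xi) for xi in x])
    out = np.empty((n_steps, n_chains, n_dims))
    for t in range(n_steps):
        for i in range(n_chains):
            r1, r2 = rng.choice([j for j in range(n_chains) if j != i],
                                size=2, replace=False)
            prop = x[i] + gamma * (x[r1] - x[r2]) + eps * rng.normal(size=n_dims)
            lp = log_post(prop)
            if np.log(rng.random()) < lp - logp[i]:     # Metropolis accept/reject
                x[i], logp[i] = prop, lp
        out[t] = x
    return out

# target: standard normal in 2-D
chains = de_mc(lambda z: -0.5 * z @ z, n_chains=8, n_dims=2, n_steps=2000, rng=0)
post = chains[500:].reshape(-1, 2)              # discard burn-in
```

    Because the proposal is built from the current population, its scale and orientation track the target distribution automatically — the property DREAM extends with per-step randomized subspaces.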

  9. Discriminative motif discovery via simulated evolution and random under-sampling.

    Directory of Open Access Journals (Sweden)

    Tao Song

    Full Text Available Conserved motifs in biological sequences are closely related to their structure and functions. Recently, discriminative motif discovery methods have attracted more and more attention. However, little attention has been devoted to the data imbalance problem, which is one of the main reasons affecting the performance of the discriminative models. In this article, a simulated evolution method is applied to solve the multi-class imbalance problem at the stage of data preprocessing, and at the stage of Hidden Markov Models (HMMs) training, a random under-sampling method is introduced for the imbalance between the positive and negative datasets. It is shown that, in the task of discovering targeting motifs of nine subcellular compartments, the motifs found by our method are more conserved than those found by methods that do not consider the data imbalance problem, and recover the most known targeting motifs from Minimotif Miner and InterPro. Meanwhile, we use the found motifs to predict protein subcellular localization and achieve higher prediction precision and recall for the minority classes.

  10. Discriminative motif discovery via simulated evolution and random under-sampling.

    Science.gov (United States)

    Song, Tao; Gu, Hong

    2014-01-01

    Conserved motifs in biological sequences are closely related to their structure and functions. Recently, discriminative motif discovery methods have attracted more and more attention. However, little attention has been devoted to the data imbalance problem, which is one of the main reasons affecting the performance of the discriminative models. In this article, a simulated evolution method is applied to solve the multi-class imbalance problem at the stage of data preprocessing, and at the stage of Hidden Markov Models (HMMs) training, a random under-sampling method is introduced for the imbalance between the positive and negative datasets. It is shown that, in the task of discovering targeting motifs of nine subcellular compartments, the motifs found by our method are more conserved than the methods without considering data imbalance problem and recover the most known targeting motifs from Minimotif Miner and InterPro. Meanwhile, we use the found motifs to predict protein subcellular localization and achieve higher prediction precision and recall for the minority classes.
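
    Random under-sampling of the majority class, used here at the HMM-training stage, is simple to state precisely. The following is a generic NumPy sketch of the resampling step, not the authors' pipeline:

```python
import numpy as np

def random_under_sample(X, y, rng=None):
    """Balance a dataset by randomly down-sampling every class to the
    size of the minority class, without replacement."""
    rng = np.random.default_rng(rng)
    classes, counts = np.unique(y, return_counts=True)
    n_min = counts.min()
    keep = np.concatenate([
        rng.choice(np.flatnonzero(y == c), size=n_min, replace=False)
        for c in classes
    ])
    keep = rng.permutation(keep)        # shuffle so classes are interleaved
    return X[keep], y[keep]

X = np.arange(100).reshape(-1, 1)       # toy features
y = np.array([0] * 90 + [1] * 10)       # 9:1 imbalance
Xb, yb = random_under_sample(X, y, rng=0)
# yb now contains 10 samples of each class
```

    Discarding majority-class examples trades some information for a balanced training objective, which is why it is typically applied only at the model-training stage, as in the article.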

  11. Solving mercury (Hg) speciation in soil samples by synchrotron X-ray microspectroscopic techniques.

    Science.gov (United States)

    Terzano, Roberto; Santoro, Anna; Spagnuolo, Matteo; Vekemans, Bart; Medici, Luca; Janssens, Koen; Göttlicher, Jörg; Denecke, Melissa A; Mangold, Stefan; Ruggiero, Pacifico

    2010-08-01

    Direct mercury (Hg) speciation was assessed for soil samples with Hg concentrations ranging from 7 up to 240 mg kg(-1). Hg chemical forms were identified and quantified by sequential extractions and by bulk- and micro-analytical techniques exploiting synchrotron-generated X-rays. In particular, microspectroscopic techniques such as mu-XRF, mu-XRD and mu-XANES were necessary to solve bulk Hg speciation. The Hg species identified in the soil samples were metacinnabar (beta-HgS), cinnabar (alpha-HgS), corderoite (Hg(3)S(2)Cl(2)), and an amorphous phase containing Hg bound to chlorine and sulfur. The amount of the metacinnabar and amorphous phases increased in the finer soil fractions, where a close association with other soil components was observed. All the observed Hg species originated from the slow weathering of an inert Hg-containing waste material (K106, U.S. EPA) dumped in the area several years ago, which is changing into a relatively more dangerous source of pollution. Copyright 2010 Elsevier Ltd. All rights reserved.

  12. Sample preparation technique for transmission electron microscopy anodized Al-Li-SiC metal matrix composite

    International Nuclear Information System (INIS)

    Shahid, M.; Thomson, G.E.

    1997-01-01

    Along with improved mechanical properties, metal matrix composites (MMC) have the disadvantage of enhanced corrosion susceptibility in aggressive environments. Recent studies on the corrosion behaviour of an Al-alloy 8090/SiC MMC revealed considerably high corrosion rates of the MMC in near-neutral solutions containing chloride ions. Anodizing is one of the potential surface treatments for the MMC to provide a protective coating against corrosion. The surface and cross section of the anodized MMC can easily be observed using a scanning electron microscope. The anodizing behaviour of the MMC can be understood further if the anodized cross section is examined under a transmission electron microscope (TEM). However, it is relatively difficult to prepare small (3 mm diameter) electron-transparent specimens of the MMC supporting an anodic film. In the present study a technique has been developed for preparing thin electron-transparent specimens of the anodized MMC. This technique employed the conventional ion beam thinning process, but the preparation of the small discs was a problem. An MMC consisting of Al-alloy 8090 with 20 % (by weight) SiC particulate, with an average size of 5 μm, was anodized and observed in the TEM after preparing the samples using the above-mentioned technique. (author)

  13. The Random-Map Technique: Enhancing Mind-Mapping with a Conceptual Combination Technique to Foster Creative Potential

    Science.gov (United States)

    Malycha, Charlotte P.; Maier, Günter W.

    2017-01-01

    Although creativity techniques are highly recommended in working environments, their effects have been scarcely investigated. Two cognitive processes are often considered to foster creative potential and are, therefore, taken as a basis for creativity techniques: knowledge activation and conceptual combination. In this study, both processes were…

  14. Endodontic pathogens causing deep neck space infections: clinical impact of different sampling techniques and antibiotic susceptibility.

    Science.gov (United States)

    Poeschl, Paul W; Crepaz, Valentina; Russmueller, Guenter; Seemann, Rudolf; Hirschl, Alexander M; Ewers, Rolf

    2011-09-01

    The aims of the present study were to compare microbial populations in patients suffering from deep neck space abscesses caused by primary endodontic infections by sampling the infections with aspiration or swabbing techniques and to determine the susceptibility rates of the isolated bacteria to commonly used antibiotics. A total of 89 patients with deep neck space abscesses caused by primary endodontic infections requiring extraoral incision and drainage under general anesthesia were included. Either aspiration or swabbing was used to sample microbial pus specimens. The culture of the microbial specimens and susceptibility testing were performed following standard procedures. A total of 142 strains were recovered from 76 patients. In 13 patients, no bacteria were found. The predominant bacteria observed were streptococci (36%), staphylococci (13%), Prevotella (8%), and Peptostreptococcus (6%). A statistically significant greater number of obligate anaerobes were found in the aspiration group. The majority of patients presented a mixed aerobic-anaerobic population of bacterial flora (62%). The antibiotic resistance rates for the predominant bacteria were 10% for penicillin G, 9% for amoxicillin, 0% for amoxicillin clavulanate, 24% for clindamycin, and 24% for erythromycin. The results of our study indicated that a greater number of anaerobes were found when sampling using the aspiration technique. Penicillin G and aminopenicillins alone are not always sufficient for the treatment of severe deep neck space abscesses; beta-lactamase inhibitor combinations are more effective. Bacteria showed significant resistance rates to clindamycin. Thus, its single use in penicillin-allergic patients has to be carefully considered. Copyright © 2011 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.

  15. Application of neutron activation analysis technique in elemental determination of lichen samples

    International Nuclear Information System (INIS)

    Djoko Prakoso Dwi Atmodjo; Syukria Kurniawati; Woro Yatu Niken Syahfitri; Nana Suherman; Dadang Supriatna

    2010-01-01

    Lichen is one of the biological materials used as a pollution monitor that can give information about the level, direction, and history of various pollutants in the environment. Sample weights are small and the elemental content of lichens is on the order of ppm, so its characterization requires advanced analytical techniques with high sensitivity that are capable of analyzing samples weighing about 25 mg, such as neutron activation analysis. In this research, determination of elements was done in lichen samples obtained from the Kiaracondong and Holis areas in Bandung city, to understand the difference in industrial exposure level on the surrounding environment. Samples were irradiated in RSG GA Siwabessy, Serpong, at 15 MW for 1-2 minutes (short irradiation) and 60 minutes (long irradiation). The samples were then counted using an HPGe detector with GENIE 2000 software. The levels of the elements Co, Cr, Cs, Fe, Mg, Mn, Sb, Sc, and V in lichen for the Kiaracondong area were in the ranges of 0.55-0.86, 1.47-2.57, 0.87-1.19, 540-1005, 949-1674, 34.91-45.94, 0.08-0.14, 0.16-0.31, and ≤ 2.33 mg/kg, respectively, while for the Holis area they were 1.04-2.37, 4.41-10.36, 0.41-0.89, 3166-709, 1131-1422, 40.97-72.51, 0.33-0.50, 0.98-2.18, and 5.30-13.05 mg/kg, respectively. From these results, it is known that pollution exposure in the semi-industrial Holis area has a greater influence than in the semi-industrial Kiaracondong area. (author)

  16. Randomization-Based Inference about Latent Variables from Complex Samples: The Case of Two-Stage Sampling

    Science.gov (United States)

    Li, Tiandong

    2012-01-01

    In large-scale assessments, such as the National Assessment of Educational Progress (NAEP), plausible values based on Multiple Imputations (MI) have been used to estimate population characteristics for latent constructs under complex sample designs. Mislevy (1991) derived a closed-form analytic solution for a fixed-effect model in creating…

  17. Trace uranium analysis in Indian coal samples using the fission track technique

    International Nuclear Information System (INIS)

    Jojo, P.J.; Rawat, A.; Kumar, Ashavani; Prasad, Rajendra

    1993-01-01

    The ever-growing demand for energy has resulted in the extensive use of fossil fuels, especially coal, for power generation. Coal and its by-products often contain significant amounts of radionuclides, including uranium, which is the ultimate source of the radioactive gas Radon-222. The present study gives the concentration of uranium in coal samples of different collieries in India, collected from various thermal power plants in the state of Uttar Pradesh. The estimates were made using the fission track technique. Latent damage tracks were not found to be uniformly distributed but showed sun bursts and clusters. Non-uniform distributions of trace elements are a very common phenomenon in rocks. The levels of uranium in the coal samples were found to vary from 2.0 to 4.9 ppm in uniform distributions and from 21.3 to 41.0 ppm in non-uniform distributions. Measurements were also made on fly ash samples where the average uranium concentration was found to be 8.4 and 49.3 ppm in uniform and non-uniform distributions, respectively. (author)

  18. Performance evaluation of an importance sampling technique in a Jackson network

    Science.gov (United States)

    Mahdipour, Ebrahim; Rahmani, Amir Masoud; Setayeshi, Saeed

    2014-03-01

    Importance sampling is a technique that is commonly used to speed up Monte Carlo simulation of rare events. However, little is known regarding the design of efficient importance sampling algorithms in the context of queueing networks. The standard approach, which simulates the system using an a priori fixed change of measure suggested by large deviation analysis, has been shown to fail in even the simplest network settings. Estimating probabilities associated with rare events has been a topic of great importance in queueing theory, and in applied probability at large. In this article, we analyse the performance of an importance sampling estimator for a rare-event probability in a Jackson network. The article applies strict deadlines to a two-node Jackson network with feedback whose arrival and service rates are modulated by an exogenous finite-state Markov process. We have estimated the probability of network blocking for various sets of parameters, as well as the probability of customers missing their deadline for different loads and deadlines. We have finally shown that the probability of total population overflow may be affected by various deadline values, service rates and arrival rates.
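
    The general mechanism of importance sampling — simulate under a change of measure, then reweight by the likelihood ratio — can be shown on a textbook example. The sketch below estimates the Gaussian tail probability P(X > 4) with an exponentially tilted proposal; it illustrates the technique only and is not the Jackson-network estimator analysed in the article.

```python
import numpy as np

rng = np.random.default_rng(0)
a, n = 4.0, 10_000

# naive Monte Carlo would almost never see the event {X > 4}
x = rng.normal(loc=a, size=n)            # sample from the tilted proposal N(a, 1)
w = np.exp(-a * x + a**2 / 2)            # likelihood ratio N(0,1) / N(a,1)
est = float(np.mean((x > a) * w))
# est ≈ 1 - Φ(4) ≈ 3.17e-5, with far fewer samples than naive MC would need
```

    Shifting the proposal mean to the rare-event boundary makes the event common under simulation; the likelihood-ratio weights undo the bias. The failure mode discussed in the article arises when a fixed change of measure like this is a poor match for the network's actual most likely path to the rare event.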

  19. Determination of organically bound Tritium in environmental samples by application of the oxidizing plasma technique

    International Nuclear Information System (INIS)

    Strack, S.; Koenig, L.A.

    1981-12-01

    The low-temperature oxidizing plasma technique with a suitable system for trapping the water formed in the oxidation process can be used to determine T bound organically in low-level samples. First, the samples are freeze-dried and the tissue water obtained in this way is measured, after distillation, in a liquid scintillation spectrometer. The residual dry matter is ashed in the reactor chamber of the plasma system. Oxidation takes place at temperatures not exceeding 200 °C in an oxygen flow of about 40 ml/min. The water of oxidation is collected in a cold trap installed behind the reactor chamber. A volume of about 10 ml of water is sufficient to measure the tritium activity without enrichment. The oxidation behavior of various organic materials has been tested. Some first results of T concentrations in tissue water and the organic dry matter from food and plant samples collected in the vicinity of the Nuclear Research Center are presented. The method has the advantage that a commercially available instrument can be used requiring only little additional equipment. Handling is much less dangerous and contamination effects by atmospheric T can be easily kept at a minimum. (orig./HP)

  20. Techniques for sampling nuclear waste tank contents and in situ measurement of activity

    International Nuclear Information System (INIS)

    Lawrence, R.C.

    1978-04-01

    A study was conducted to develop suitable sampling equipment and techniques for characterizing the mechanical properties of nuclear wastes; identifying effective means of measuring radiation levels, temperatures, and neutron fluxes in situ in wastes; and developing a waste core sampler. A portable, stainless steel probe was developed which is placed in the tank through a riser. This probe is built for the insertion of instrumentation that can measure the contents of the tank at any level and take temperature, radiation, and neutron activation readings with reliable accuracy. A simple and reliable instrument for the in situ extraction of waste materials ranging from liquid to concrete-like substances was also developed. This portable, stainless steel waste core sampler can remove up to one liter of radioactive waste from tanks for transportation to hot cell laboratories for analysis of hardness, chemical form, and isotopic content. A cask for transporting the waste samples from the tanks to the laboratory under radiation-protected conditions was also fabricated. This cask was designed with a ''boot'' or inner-seal liner to contain any radioactive wastes that might remain on the outside of the waste core sampling device

  1. Comparative study of modern extraction techniques for the determination of environmental samples (M14)

    International Nuclear Information System (INIS)

    Gfrerer, M.; Lankmayr, E.

    2002-01-01

    Full text: The aim of any extraction method in analytical chemistry is to effectively separate the analytes from the matrix. The whole step should be fast and quantitative, with minimal solvent and time required. The classical Soxhlet extraction usually requires large volumes (up to 200 ml) of solvent to be refluxed through the solid samples for several hours. Therefore, in the last decades, alternatives to this extraction method have been presented and investigated, such as ultrasonic extraction (UE), supercritical fluid extraction (SFE), accelerated solvent extraction (ASE), microwave-assisted extraction (MAE) and fluidized-bed extraction (FBE). The actual choice for an analytical application is frequently governed by the initial capital cost, operating costs, simplicity of operation, amount of organic solvent required and sample throughput. Since sample preparation is a critical step in the analytical cycle, special care has to be taken in the choice and optimization of extraction techniques and clean-up procedures. Therefore, MAE and FBE were investigated for their influential extraction parameters, and these parameters were optimized for the extraction of organochlorine biocides, polychlorinated biphenyls and polycyclic aromatic hydrocarbons from environmental matrices like soil, sediment and sewage sludge. The extraction yields were compared with those obtained by Soxhlet extraction performed following DIN methods. Finally, the optimized modern methods were validated by systematic experiments with certified reference materials. Refs. 3 (author)

  2. At convenience and systematic random sampling: effects on the prognostic value of nuclear area assessments in breast cancer patients.

    Science.gov (United States)

    Jannink, I; Bennen, J N; Blaauw, J; van Diest, P J; Baak, J P

    1995-01-01

    This study compares the influence of two different nuclear sampling methods on the prognostic value of assessments of mean and standard deviation of nuclear area (MNA, SDNA) in 191 consecutive invasive breast cancer patients with long term follow up. The first sampling method used was 'at convenience' sampling (ACS); the second, systematic random sampling (SRS). Both sampling methods were tested with a sample size of 50 nuclei (ACS-50 and SRS-50). To determine whether, besides the sampling methods, sample size had impact on prognostic value as well, the SRS method was also tested using a sample size of 100 nuclei (SRS-100). SDNA values were systematically lower for ACS, obviously due to (unconsciously) not including small and large nuclei. Testing prognostic value of a series of cut off points, MNA and SDNA values assessed by the SRS method were prognostically significantly stronger than the values obtained by the ACS method. This was confirmed in Cox regression analysis. For the MNA, the Mantel-Cox p-values from SRS-50 and SRS-100 measurements were not significantly different. However, for the SDNA, SRS-100 yielded significantly lower p-values than SRS-50. In conclusion, compared with the 'at convenience' nuclear sampling method, systematic random sampling of nuclei is not only superior with respect to reproducibility of results, but also provides a better prognostic value in patients with invasive breast cancer.
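
    The difference between the two schemes is easy to state: convenience sampling lets the observer choose which nuclei to measure, while systematic random sampling fixes a random start and a constant interval over the whole sampling frame. Below is a generic sketch of the systematic scheme over synthetic nuclear areas (illustrative data, not the study's measurements):

```python
import numpy as np

def systematic_random_sample(frame, n, rng=None):
    """Systematic random sampling: one random start inside the first
    interval, then every k-th item of the sampling frame."""
    rng = np.random.default_rng(rng)
    k = len(frame) // n                  # sampling interval
    start = int(rng.integers(k))         # random start, 0 <= start < k
    return np.asarray(frame)[start + k * np.arange(n)]

# hypothetical frame: 500 nuclear areas, ordered by size
areas = np.sort(np.random.default_rng(1).lognormal(mean=4, sigma=0.5, size=500))
sample = systematic_random_sample(areas, 50, rng=0)
# unlike convenience selection, the sample spans small and large nuclei alike
```

    Because every stretch of the frame is represented, the sample cannot systematically miss the extreme nuclei — which is exactly the bias the study observed in the at-convenience SDNA values.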

  3. Boosting the FM-Index on the GPU: Effective Techniques to Mitigate Random Memory Access.

    Science.gov (United States)

    Chacón, Alejandro; Marco-Sola, Santiago; Espinosa, Antonio; Ribeca, Paolo; Moure, Juan Carlos

    2015-01-01

    The recent advent of high-throughput sequencing machines producing big amounts of short reads has boosted the interest in efficient string searching techniques. As of today, many mainstream sequence alignment software tools rely on a special data structure, called the FM-index, which allows for fast exact searches in large genomic references. However, such searches translate into a pseudo-random memory access pattern, thus making memory access the limiting factor of all computation-efficient implementations, both on CPUs and GPUs. Here, we show that several strategies can be put in place to remove the memory bottleneck on the GPU: more compact indexes can be implemented by having more threads work cooperatively on larger memory blocks, and a k-step FM-index can be used to further reduce the number of memory accesses. The combination of those and other optimisations yields an implementation that is able to process about two Gbases of queries per second on our test platform, being about 8 × faster than a comparable multi-core CPU version, and about 3 × to 5 × faster than the FM-index implementation on the GPU provided by the recently announced Nvidia NVBIO bioinformatics library.
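
    The FM-index search the paper accelerates is the classic backward search over the Burrows-Wheeler transform. The toy Python sketch below builds a full occurrence table for clarity; real implementations (including the GPU versions discussed) sample Occ and pack the BWT, which is precisely what creates the pseudo-random memory accesses being optimized.

```python
from collections import Counter

def fm_index(text):
    """Toy FM-index: BWT of text + '$', first-column starts C, and a
    full per-position occurrence table Occ (real indexes sample it)."""
    text += "$"
    sa = sorted(range(len(text)), key=lambda i: text[i:])   # suffix array
    bwt = "".join(text[i - 1] for i in sa)                  # Burrows-Wheeler transform
    C, total = {}, 0
    for ch in sorted(set(bwt)):                             # C[ch] = #symbols < ch
        C[ch], total = total, total + bwt.count(ch)
    occ = [Counter()]
    for ch in bwt:                                          # occ[i][ch] = #ch in bwt[:i]
        occ.append(occ[-1] + Counter(ch))
    return C, occ

def backward_search(pattern, C, occ):
    """Count exact matches of pattern: one C lookup and two Occ lookups
    per pattern symbol, processed right to left."""
    lo, hi = 0, len(occ) - 1
    for ch in reversed(pattern):
        if ch not in C:
            return 0
        lo = C[ch] + occ[lo][ch]
        hi = C[ch] + occ[hi][ch]
        if lo >= hi:
            return 0
    return hi - lo

C, occ = fm_index("abracadabra")
# backward_search("abra", C, occ) → 2
```

    Each pattern symbol triggers Occ lookups at essentially unpredictable positions — the pseudo-random access pattern the paper's compact indexes and k-step FM-index are designed to mitigate.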

  4. Duodenum preserving pancreatectomy in chronic pancreatitis: Design of a randomized controlled trial comparing two surgical techniques [ISRCTN50638764

    Directory of Open Access Journals (Sweden)

    Reidel Margot A

    2006-05-01

    Full Text Available Abstract Background Chronic pancreatitis is an inflammatory disease which is characterized by an irreversible conversion of pancreatic parenchyma to fibrous tissue. Beside obstructive jaundice and pseudocyst formation, about half of the patients need surgical intervention due to untreatable chronic pain during the course of the disease. In most of the patients with chronic pancreatitis, the head of the pancreas is the trigger of the chronic inflammatory process. Therefore, resection of pancreatic head tissue must be the central part of any surgical intervention. However, it is unclear to which extent the surgical procedure must be radical in order to obtain a favourable outcome for the patients. Design A single centre randomized controlled, superiority trial to compare two techniques of duodenum preserving pancreatic head resection. Sample size: 65 patients will be included and randomized intraoperatively. Eligibility criteria: All patients with chronic pancreatitis and indication for surgical resection and signed informed consent. Cumulative primary endpoint (hierarchical model): duration of surgical procedure, quality of life after one year, duration of intensive care unit stay, duration of hospital stay. Reference treatment: Resection of the pancreatic head with dissection of the pancreas from the portal vein and transsection of the gland (Beger procedure). Intervention: Partial resection of the pancreatic head without transsection of the organ and visualization of the portal vein (Berne procedure). Duration: September 2003-October 2007. Organisation/responsibility The trial is conducted in compliance with the protocol and in accordance with the moral, ethical, regulatory and scientific principles governing clinical research as set out in the Declaration of Helsinki (1989) and the Good Clinical Practice guideline (GCP). The Center for Clinical Studies of the Department of Surgery Heidelberg is responsible for planning, conducting and final

  5. A cosmetic evaluation of breast cancer treatment: A randomized study of radiotherapy boost technique

    International Nuclear Information System (INIS)

    Vass, Sylvie; Bairati, Isabelle

    2005-01-01

    Purpose: To compare cosmetic results of two different radiotherapy (RT) boost techniques used in the treatment of breast cancer after whole breast radiotherapy and to identify factors affecting cosmetic outcomes. Methods and Materials: Between 1996 and 1998, 142 patients with Stage I and II breast cancer were treated with breast conservative surgery and adjuvant RT. Patients were then randomly assigned to receive a boost dose of 15 Gy delivered to the tumor bed either by iridium-192, or by a combination of photons and electrons. Cosmetic evaluations were done on a 6-month basis, with a final evaluation at 36 months after RT. The evaluations were done using a panel of global and specific subjective scores, a digitized scoring system using the breast retraction assessment (BRA) measurement, and a patient's self-assessment evaluation. As cosmetic results were graded according to severity, the comparison of boost techniques was done using the ordinal logistic regression model. Adjusted odds ratios (OR) and their 95% confidence intervals (CI) are presented. Results: At 36 months of follow-up, there was no significant difference between the two groups with respect to the global subjective cosmetic outcome (OR = 1.40; 95%CI = 0.69-2.85, p = 0.35). Good to excellent scores were observed in 65% of implant patients and 62% of photon/electron patients. At 24 months and beyond, telangiectasia was more severe in the implant group with an OR of 9.64 (95%CI = 4.05-22.92, p < 0.0001) at 36 months. The only variable associated with a worse global cosmetic outcome was the presence of concomitant chemotherapy (OR = 3.87; 95%CI = 1.74-8.62). The BRA value, once adjusted for age, concomitant chemotherapy, and boost volume, showed a positive association with the boost technique. The BRA value was significantly greater in the implant group (p = 0.03). There was no difference in the patient's final self-assessment score between the two groups. Three variables were statistically associated with

  6. Randomized controlled trial of yogic meditation techniques for patients with obsessive-compulsive disorder.

    Science.gov (United States)

    Shannahoff-Khalsa, D S; Ray, L E; Levine, S; Gallen, C C; Schwartz, B J; Sidorowich, J J

    1999-12-01

    The objective of this study was to compare efficacy of two meditation protocols for treating patients with obsessive-compulsive disorder (OCD). Patients were randomized to two groups-matched for sex, age, and medication status-and blinded to the comparison protocol. They were told the trial would last for 12 months, unless one protocol proved to be more efficacious. If so, groups would merge, and the group that received the less efficacious treatment would also be afforded 12 months of the more effective one. The study was conducted at Children's Hospital, San Diego, Calif. Patients were selected according to Diagnostic and Statistical Manual of Mental Disorders, Third Edition-Revised (DSM-III-R) criteria and recruited by advertisements and referral. At baseline, Group 1 included 11 adults and 1 adolescent, and Group 2 included 10 adults. Group 1 employed a kundalini yoga meditation protocol and Group 2 employed the Relaxation Response plus Mindfulness Meditation technique. Baseline and 3-month interval testing was conducted using the Yale-Brown Obsessive Compulsive Scale (Y-BOCS), Symptoms Checklist-90-Revised Obsessive Compulsive (SCL-90-R OC) and Global Severity Index (SCL-90-R GSI) scales, Profile of Moods scale (POMS), Perceived Stress Scale (PSS), and Purpose in Life (PIL) test. Seven adults in each group completed 3 months of therapy. At 3 months, Group 1 demonstrated greater improvements (Student's independent groups t-test) on the Y-BOCS, SCL-90-R OC and GSI scales, and POMS, and greater but nonsignificant improvements on the PSS and PIL test. An intent-to-treat analysis (Y-BOCS) for the baseline and 3-month tests showed that only Group 1 improved. Within-group statistics (Student's paired t-tests) showed that Group 1 significantly improved on all six scales, but Group 2 had no improvements. Groups were merged for an additional year using Group 1 techniques. At 15 months, the final group (N=11) improved 71%, 62%, 66%, 74%, 39%, and 23%, respectively, on

  7. Benefits of expressive writing in reducing test anxiety: A randomized controlled trial in Chinese samples.

    Science.gov (United States)

    Shen, Lujun; Yang, Lei; Zhang, Jing; Zhang, Meng

    2018-01-01

To explore the effect of expressive writing of positive emotions on test anxiety among senior-high-school students. The Test Anxiety Scale (TAS) was used to assess the anxiety level of 200 senior-high-school students. Seventy-five students with high anxiety were recruited and divided randomly into experimental and control groups. Each day for 30 days, the experimental group engaged in 20 minutes of expressive writing of positive emotions, while the control group was asked to merely write down their daily events. A second test was given after the month-long experiment to analyze whether there had been a reduction in anxiety among the sample. Quantitative data were obtained from TAS scores. The NVivo 10.0 software program was used to examine the frequency of particular word categories used in participants' writing manuscripts. Senior-high-school students indicated moderate to high test anxiety. There was a significant difference in post-test results (P < 0.05). Students' writing manuscripts were mainly coded into five categories: cause, anxiety manifestation, positive emotion, insight and evaluation. There was a negative relation between the positive emotion and insight codes and test anxiety. There were significant differences in the positive emotion, anxiety manifestation, and insight code categories between the first 10 days' manuscripts and the last 10 days' ones. Long-term expressive writing of positive emotions appears to help reduce test anxiety through the use of insight and positive-emotion words for Chinese students. Efficient and effective intervention programs to ease test anxiety can be designed based on this study.

  8. Benefits of expressive writing in reducing test anxiety: A randomized controlled trial in Chinese samples.

    Directory of Open Access Journals (Sweden)

    Lujun Shen

Full Text Available To explore the effect of expressive writing of positive emotions on test anxiety among senior-high-school students. The Test Anxiety Scale (TAS) was used to assess the anxiety level of 200 senior-high-school students. Seventy-five students with high anxiety were recruited and divided randomly into experimental and control groups. Each day for 30 days, the experimental group engaged in 20 minutes of expressive writing of positive emotions, while the control group was asked to merely write down their daily events. A second test was given after the month-long experiment to analyze whether there had been a reduction in anxiety among the sample. Quantitative data were obtained from TAS scores. The NVivo 10.0 software program was used to examine the frequency of particular word categories used in participants' writing manuscripts. Senior-high-school students indicated moderate to high test anxiety. There was a significant difference in post-test results (P < 0.05). Students' writing manuscripts were mainly coded into five categories: cause, anxiety manifestation, positive emotion, insight and evaluation. There was a negative relation between the positive emotion and insight codes and test anxiety. There were significant differences in the positive emotion, anxiety manifestation, and insight code categories between the first 10 days' manuscripts and the last 10 days' ones. Long-term expressive writing of positive emotions appears to help reduce test anxiety through the use of insight and positive-emotion words for Chinese students. Efficient and effective intervention programs to ease test anxiety can be designed based on this study.

  9. Benefits of expressive writing in reducing test anxiety: A randomized controlled trial in Chinese samples

    Science.gov (United States)

    Zhang, Jing; Zhang, Meng

    2018-01-01

Purpose To explore the effect of expressive writing of positive emotions on test anxiety among senior-high-school students. Methods The Test Anxiety Scale (TAS) was used to assess the anxiety level of 200 senior-high-school students. Seventy-five students with high anxiety were recruited and divided randomly into experimental and control groups. Each day for 30 days, the experimental group engaged in 20 minutes of expressive writing of positive emotions, while the control group was asked to merely write down their daily events. A second test was given after the month-long experiment to analyze whether there had been a reduction in anxiety among the sample. Quantitative data were obtained from TAS scores. The NVivo 10.0 software program was used to examine the frequency of particular word categories used in participants' writing manuscripts. Results Senior-high-school students indicated moderate to high test anxiety. There was a significant difference in post-test results (P < 0.05). Students' writing manuscripts were mainly coded into five categories: cause, anxiety manifestation, positive emotion, insight and evaluation. There was a negative relation between the positive emotion and insight codes and test anxiety. There were significant differences in the positive emotion, anxiety manifestation, and insight code categories between the first 10 days' manuscripts and the last 10 days' ones. Conclusions Long-term expressive writing of positive emotions appears to help reduce test anxiety through the use of insight and positive-emotion words for Chinese students. Efficient and effective intervention programs to ease test anxiety can be designed based on this study. PMID:29401473

  10. Whole arm manipulation planning based on feedback velocity fields and sampling-based techniques.

    Science.gov (United States)

    Talaei, B; Abdollahi, F; Talebi, H A; Omidi Karkani, E

    2013-09-01

Changing the configuration of a cooperative whole arm manipulator while enclosing an object is not easy, mainly because of the risk of jamming caused by kinematic constraints. To reduce this risk, this paper proposes a feedback manipulation planning algorithm that takes grasp kinematics into account. The idea is based on a vector field that imposes a perturbation along object-motion-inducing directions whenever the movement lies largely along the manipulator's redundant directions. The obstacle avoidance problem is then addressed by combining the algorithm with sampling-based techniques. As experimental results confirm, the proposed algorithm is effective in avoiding both jamming and obstacles for a 6-DOF dual-arm whole arm manipulator. Copyright © 2013 ISA. Published by Elsevier Ltd. All rights reserved.

  11. Yield strength, shear stress and toughness of YBCO samples textured by Bridgman technique

    International Nuclear Information System (INIS)

    Roa, J J; Capdevila, X G; Martinez, M; Segarra, M; Jimenez-Pique, E

    2008-01-01

Mechanical properties of the orthorhombic phase of YBa2Cu3O7-δ (Y-123) at room temperature have been investigated at different applied loads using the nanoindentation technique. The study was carried out on several monodomains on the (001) planes of Bridgman-textured samples with dispersed Y2BaCuO5 (Y-211) particles as pinning centers. The yield strength (σ_ys), shear stress (τ_m) and toughness (K_IC) of the Y-123/Y-211 composite were determined at different applied loads. The first two properties were calculated through the Hertz equations and the last through the equations of Lawn et al. Finally, the ultra-low-load imprints obtained by nanoindentation were correlated with parameters obtained by Field Emission Scanning Electron Microscopy (FE-SEM)

  12. Importance sampling techniques and treatment of electron transport in MCNP 4A

    International Nuclear Information System (INIS)

    Ueki, K.

    1994-01-01

The continuous-energy Monte Carlo code MCNP was developed by the Radiation Transport Group at Los Alamos National Laboratory, and version 4A is now available. MCNP 4A can perform coupled neutron-secondary gamma-ray-electron-bremsstrahlung calculations. The calculated results, such as energy spectra, tally fluctuation charts, and geometrical input data, can be displayed on a workstation. The MCNP 4A documentation describes none of the subroutines except a few, such as 'SOURCE' and 'TALLYX'. However, to improve the MCNP Monte Carlo sampling techniques and obtain more accurate or efficient results for some problems, certain subroutines need to be revised. Three subroutines have been revised and built into the MCNP 4A code. (author)
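MCNP's variance-reduction machinery is far richer than any toy, but the core idea of importance sampling can be sketched briefly: draw from a biased distribution, then correct each score with a likelihood-ratio weight. The Python below is an illustrative sketch only (a Gaussian tail probability rather than a transport problem; all names are ours, not MCNP's):

```python
import math
import random

def norm_pdf(x, mu=0.0):
    """Unit-width normal density centred at mu."""
    return math.exp(-0.5 * (x - mu) ** 2) / math.sqrt(2.0 * math.pi)

def importance_estimate(threshold=4.0, shift=4.0, n=100_000, seed=1):
    """Estimate P(X > threshold) for X ~ N(0,1) by sampling from N(shift, 1)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.gauss(shift, 1.0)                      # biased draw from the proposal
        if x > threshold:
            total += norm_pdf(x) / norm_pdf(x, shift)  # likelihood-ratio weight
    return total / n
```

With the proposal centred on the threshold, 100,000 draws recover the roughly 3.2e-5 tail probability that analog (unbiased) sampling would almost never score; this weight-correction step is the same principle behind source biasing in Monte Carlo transport codes.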

  13. Simulative Investigation on Spectral Efficiency of Unipolar Codes based OCDMA System using Importance Sampling Technique

    Science.gov (United States)

    Farhat, A.; Menif, M.; Rezig, H.

    2013-09-01

This paper analyses the spectral efficiency of an Optical Code Division Multiple Access (OCDMA) system using the Importance Sampling (IS) technique. We consider three configurations of the OCDMA system, namely Direct Sequence (DS), Spectral Amplitude Coding (SAC) and Fast Frequency Hopping (FFH), which exploit Fiber Bragg Grating (FBG) based encoders/decoders. We evaluate the spectral efficiency of the considered system by taking into account the effect of different families of unipolar codes for both coherent and incoherent sources. The results show that the spectral efficiency of the OCDMA system with a coherent source is higher than in the incoherent case. We also demonstrate that DS-OCDMA outperforms the other two configurations in terms of spectral efficiency under all conditions.

  14. Calibrated Phase-Shifting Digital Holographic Microscope Using a Sampling Moiré Technique

    Directory of Open Access Journals (Sweden)

    Peng Xia

    2018-05-01

    Full Text Available A calibrated phase-shifting digital holographic microscope system capable of improving the quality of reconstructed images is proposed. Phase-shifting errors are introduced in phase-shifted holograms for numerous reasons, such as the non-linearity of piezoelectric transducers (PZTs, wavelength fluctuations in lasers, and environmental disturbances, leading to poor-quality reconstructions. In our system, in addition to the camera used to record object information, an extra camera is used to record interferograms, which are used to analyze phase-shifting errors using a sampling Moiré technique. The quality of the reconstructed object images can be improved by the phase-shifting error compensation algorithm. Both the numerical simulation and experiment demonstrate the effectiveness of the proposed system.

  15. Probabilistic techniques using Monte Carlo sampling for multi- component system diagnostics

    International Nuclear Information System (INIS)

    Aumeier, S.E.; Lee, J.C.; Akcasu, A.Z.

    1995-01-01

We outline the structure of a new approach to multi-component system fault diagnostics which utilizes detailed system simulation models, uncertain system observation data, statistical knowledge of system parameters, expert opinion, and component reliability data in an effort to identify incipient component performance degradations of arbitrary number and magnitude. The technique involves the use of multiple adaptive Kalman filters for fault estimation, the results of which are screened using standard hypothesis-testing procedures to define a set of component events that could have transpired. Latin Hypercube sampling is then used to sample each of these feasible component events in terms of uncertain component reliability data and filter estimates. The capabilities of the procedure are demonstrated through the analysis of a simulated small-magnitude binary component fault in a boiling water reactor balance of plant. The results show that the procedure has the potential to be a very effective tool for incipient component fault diagnosis.
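Latin Hypercube sampling, which the abstract uses to explore the uncertain reliability data, stratifies each input dimension into n equal-probability bins and places exactly one sample in each bin per dimension, then shuffles the bins independently across dimensions. A minimal illustrative sketch in Python (not the authors' code):

```python
import random

def latin_hypercube(n, dims, seed=0):
    """Return n samples in [0,1)^dims with exactly one sample per stratum per dimension."""
    rng = random.Random(seed)
    samples = [[0.0] * dims for _ in range(n)]
    for d in range(dims):
        # one uniform point inside each of the n equal-width strata of [0,1)
        points = [(i + rng.random()) / n for i in range(n)]
        # random pairing of strata across dimensions
        rng.shuffle(points)
        for i in range(n):
            samples[i][d] = points[i]
    return samples
```

Mapping each coordinate through the inverse CDF of the corresponding input distribution then yields a stratified sample of the (non-uniform) parameter space with far better marginal coverage than plain random sampling at the same n.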

  16. Comparison of mobile and stationary spore-sampling techniques for estimating virulence frequencies in aerial barley powdery mildew populations

    DEFF Research Database (Denmark)

    Hovmøller, M.S.; Munk, L.; Østergård, Hanne

    1995-01-01

    Gene frequencies in samples of aerial populations of barley powdery mildew (Erysiphe graminis f.sp. hordei), which were collected in adjacent barley areas and in successive periods of time, were compared using mobile and stationary sampling techniques. Stationary samples were collected from trap ...

  17. Tracer techniques for urine volume determination and urine collection and sampling back-up system

    Science.gov (United States)

    Ramirez, R. V.

    1971-01-01

The feasibility, functionality, and overall accuracy of using lithium as a chemical tracer in urine were investigated as a means of indirectly determining total urine volume by atomic absorption spectrophotometry. Experiments were conducted to investigate the parameters of instrumentation, tracer concentration, mixing times, and methods for incorporating the tracer material in the urine collection bag, and to refine and optimize the urine tracer technique to comply with the Skylab scheme and operational parameters of ±2% volume error and ±1% accuracy in the amount of tracer added to each container. In addition, a back-up method for the urine collection and sampling system was developed and evaluated. This back-up method incorporates the tracer technique for volume determination in the event of failure of the primary urine collection and preservation system. One chemical preservative was selected and evaluated as a contingency preservative for the storage of urine in the event of failure of the urine cooling system.

  18. Synchrotron X-ray measurement techniques for thermal barrier coated cylindrical samples under thermal gradients

    Energy Technology Data Exchange (ETDEWEB)

    Siddiqui, Sanna F.; Knipe, Kevin; Manero, Albert; Raghavan, Seetha [Department of Mechanical and Aerospace Engineering, University of Central Florida, Orlando, Florida 32816 (United States); Meid, Carla; Wischek, Janine; Bartsch, Marion [German Aerospace Center (DLR), Institute of Materials Research, 51147 Cologne (Germany); Okasinski, John; Almer, Jonathan [X-Ray Science Division, Advanced Photon Source, Argonne National Laboratory, Argonne, Illinois 60439 (United States); Karlsson, Anette M. [Cleveland State University, 2121 Euclid Avenue, Cleveland, Ohio 44115 (United States)

    2013-08-15

Measurement techniques to obtain accurate in situ synchrotron strain measurements of thermal barrier coating systems (TBCs) applied to hollow cylindrical specimens are presented in this work. The Electron Beam Physical Vapor Deposition coated specimens with internal cooling were designed to achieve realistic temperature gradients over the TBC-coated material, such as those occurring in the turbine blades of aeroengines. Effects of the circular cross section on the X-ray diffraction (XRD) measurements in the various layers, including the thermally grown oxide, are investigated using high-energy synchrotron X-rays. Multiple approaches for beam penetration and collection, tangential and normal to the layers, along with variations in collection parameters, are compared for their ability to attain high-resolution XRD data from the internal layers. This study demonstrates the ability to monitor, in situ, the response of the internal layers within the TBC while a thermal gradient is imposed across the thickness of the coated sample. The thermal setup maintained coating surface temperatures in the range of operating conditions while monitoring the substrate cooling, for a controlled thermal gradient. Through variation in measurement location and beam parameters, sufficient intensities are obtained from the internal layers, which can be used for depth-resolved strain measurements. The results establish various techniques for obtaining XRD measurements through multi-layered coating systems, and their outcomes will pave the way toward realistic in situ testing of these coatings.

  19. The development and use of parametric sampling techniques for probabilistic risk assessment

    International Nuclear Information System (INIS)

    Dalrymple, G.J.; Broyd, T.W.

    1987-01-01

In order to enable evaluation of proposals for the underground disposal of low- and intermediate-level radioactive wastes in the United Kingdom, the Department of the Environment (DoE) research programme includes the development of computer-based methods for use in a multistage assessment process. To test the adequacy of the various methods of data acquisition and radiological assessment, a mock assessment exercise is currently being conducted by the Department. This paper outlines the proposed methodology, which provides for the use of probabilistic modelling based upon the Atomic Energy of Canada Ltd SYVAC variability-analysis approach, using new models (SYVAC 'A') and data appropriate to UK conditions for a deep horizontal tunnel repository concept. This chapter describes the choice of a suitable technique for sampling the data input to the SYVAC 'A' model, together with techniques for analysing the predictions of dose and risk made by the model. The sensitivity of the model predictions (risk and dose to man) to the input parameters was compared for four different methods. All four methods identified the same geological parameters as the most important. (author)
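The abstract does not name the four sensitivity methods that were compared; one technique commonly paired with SYVAC-style probabilistic sampling is ranking input parameters by the Spearman rank correlation between the sampled inputs and the predicted dose. A minimal sketch (illustrative only; ties in the data are not handled):

```python
def rankdata(values):
    """Rank of each value, 1 = smallest (assumes no ties)."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0] * len(values)
    for r, i in enumerate(order):
        ranks[i] = r + 1
    return ranks

def spearman(x, y):
    """Spearman rank correlation via the classic no-ties formula."""
    rx, ry = rankdata(x), rankdata(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1.0 - 6.0 * d2 / (n * (n * n - 1))
```

Computing `spearman(samples_of_parameter_i, predicted_doses)` for each input and sorting by absolute value gives a simple importance ranking that is robust to monotonic nonlinearity in the model response.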

  20. Improved technique for measuring the size distribution of black carbon particles in rainwater and snow samples

    Science.gov (United States)

    Mori, T.; Moteki, N.; Ohata, S.; Koike, M.; Azuma, K. G.; Miyazaki, Y.; Kondo, Y.

    2015-12-01

Black carbon (BC) is the strongest contributor to sunlight absorption among atmospheric aerosols. Quantitative understanding of the wet deposition of BC, which strongly affects its spatial distribution, is important for improving our understanding of climate change. We have devised a technique for measuring the masses of individual BC particles in rainwater and snow samples by combining a nebulizer with a single-particle soot photometer (SP2) (Ohata et al. 2011, 2013; Schwarz et al. 2012; Mori et al. 2014). We report two important improvements to this technique: 1) we extended the upper limit of detectable BC particle diameter from 0.9 μm to about 4.0 μm by modifying the photodetector that measures the laser-induced incandescence signal, and 2) we introduced a pneumatic nebulizer, the Marin-5 (Cetac Technologies Inc., Omaha, NE, USA), and experimentally confirmed its high extraction efficiency (~50%), independent of particle diameter up to 2.0 μm. Using the improved system, we simultaneously measured the size distributions of BC particles in air and rainwater in Tokyo. The size distribution of BC in rainwater was shifted toward larger diameters than that in air, indicating that large BC particles were effectively removed by precipitation. We also observed BC particles with diameters larger than 1.0 μm, indicating that further studies of the wet deposition of BC will require the modified SP2.

  1. Solving mercury (Hg) speciation in soil samples by synchrotron X-ray microspectroscopic techniques

    International Nuclear Information System (INIS)

    Terzano, Roberto; Santoro, Anna; Spagnuolo, Matteo; Vekemans, Bart; Medici, Luca; Janssens, Koen; Goettlicher, Joerg; Denecke, Melissa A.; Mangold, Stefan; Ruggiero, Pacifico

    2010-01-01

Direct mercury (Hg) speciation was assessed for soil samples with Hg concentrations ranging from 7 up to 240 mg kg⁻¹. Hg chemical forms were identified and quantified by sequential extractions and by bulk- and micro-analytical techniques exploiting synchrotron-generated X-rays. In particular, microspectroscopic techniques such as μ-XRF, μ-XRD and μ-XANES were necessary to solve bulk Hg speciation in both soil fractions; the species identified included metacinnabar, corderoite (Hg3S2Cl2), and an amorphous phase containing Hg bound to chlorine and sulfur. The amount of the metacinnabar and amorphous phases increased in the fraction <2 μm. No interaction among Hg species and soil components was observed. All the observed Hg species originated from the slow weathering of an inert Hg-containing waste material (K106, U.S. EPA) dumped in the area several years ago, which is changing into a relatively more dangerous source of pollution. - Direct mercury (Hg) speciation in chlor-alkali plant contaminated soils enabled the identification of potentially dangerous Hg-S/Cl amorphous species.

  2. Effect of postmortem sampling technique on the clinical significance of autopsy blood cultures.

    Science.gov (United States)

    Hove, M; Pencil, S D

    1998-02-01

Our objective was to investigate the value of postmortem autopsy blood cultures performed with an iodine-subclavian technique relative to the classical method of atrial heat searing and to antemortem blood cultures. The study consisted of a prospective autopsy series, with each case serving as its own control relative to subsequent testing, and a retrospective survey of patients coming to autopsy who had both autopsy blood cultures and premortem blood cultures. A busy academic autopsy service (600 cases per year) at University of Texas Medical Branch Hospitals, Galveston, Texas, served as the setting for this work. The incidence of non-clinically relevant (false-positive) culture results was compared using different methods for collecting blood samples in a prospective series of 38 adult autopsy specimens. One hundred eleven adult autopsy specimens in which both postmortem and antemortem blood cultures were obtained were studied retrospectively. For both studies, positive culture results were scored as either clinically relevant or false positives based on analysis of the autopsy findings and the clinical summary. The rate of false-positive culture results obtained by the iodine-subclavian technique from blood drawn soon after death was significantly lower (13%) than with the classical method of obtaining blood through the atrium after heat searing at the time of the autopsy (34%) in the same set of autopsy subjects. When autopsy results were compared with subjects' antemortem blood culture results, there was no significant difference in the rate of non-clinically relevant culture results in a paired retrospective series of antemortem and postmortem blood cultures using the iodine-subclavian postmortem method (11.7% vs 13.5%). The results indicate that autopsy blood cultures obtained using the iodine-subclavian technique have reliability equivalent to that of antemortem blood cultures.

  3. Computer code ENDSAM for random sampling and validation of the resonance parameters covariance matrices of some major nuclear data libraries

    International Nuclear Information System (INIS)

    Plevnik, Lucijan; Žerovnik, Gašper

    2016-01-01

    Highlights: • Methods for random sampling of correlated parameters. • Link to open-source code for sampling of resonance parameters in ENDF-6 format. • Validation of the code on realistic and artificial data. • Validation of covariances in three major contemporary nuclear data libraries. - Abstract: Methods for random sampling of correlated parameters are presented. The methods are implemented for sampling of resonance parameters in ENDF-6 format and a link to the open-source code ENDSAM is given. The code has been validated on realistic data. Additionally, consistency of covariances of resonance parameters of three major contemporary nuclear data libraries (JEFF-3.2, ENDF/B-VII.1 and JENDL-4.0u2) has been checked.
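The standard recipe behind random sampling of correlated parameters, which ENDSAM applies to ENDF-6 resonance data, is to factor the covariance matrix with a Cholesky decomposition and transform independent standard normal draws. A self-contained sketch, assuming a symmetric positive-definite covariance (ENDSAM's actual treatment of ENDF-6 records is more involved):

```python
import math
import random

def cholesky(cov):
    """Lower-triangular L with L L^T = cov, for a symmetric positive-definite matrix."""
    n = len(cov)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            if i == j:
                L[i][j] = math.sqrt(cov[i][i] - s)
            else:
                L[i][j] = (cov[i][j] - s) / L[j][j]
    return L

def sample_correlated(mean, cov, n, seed=42):
    """Draw n correlated normal vectors: x = mean + L z, with z ~ N(0, I)."""
    rng = random.Random(seed)
    L = cholesky(cov)
    dim = len(mean)
    out = []
    for _ in range(n):
        z = [rng.gauss(0.0, 1.0) for _ in range(dim)]
        out.append([mean[i] + sum(L[i][k] * z[k] for k in range(i + 1))
                    for i in range(dim)])
    return out
```

The sample covariance of the draws converges to the input matrix, so the recipe propagates the library's stated parameter correlations directly into the perturbed evaluations.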

  4. Randomized Comparison of Two Vaginal Self-Sampling Methods for Human Papillomavirus Detection: Dry Swab versus FTA Cartridge

    OpenAIRE

Catarino, Rosa; Vassilakos, Pierre; Bilancioni, Aline; Vanden Eynde, Mathieu; Meyer-Hamme, Ulrike; Menoud, Pierre-Alain; Guerry, Frédéric; Petignat, Patrick

    2015-01-01

    Background Human papillomavirus (HPV) self-sampling (self-HPV) is valuable in cervical cancer screening. HPV testing is usually performed on physician-collected cervical smears stored in liquid-based medium. Dry filters and swabs are an alternative. We evaluated the adequacy of self-HPV using two dry storage and transport devices, the FTA cartridge and swab. Methods A total of 130 women performed two consecutive self-HPV samples. Randomization determined which of the two tests was performed f...

  5. General principles governing sampling and measurement techniques for monitoring radioactive effluents from nuclear facilities

    International Nuclear Information System (INIS)

    Fitoussi, L.

    1978-01-01

    An explanation is given of the need to monitor the release of radioactive gases and liquid effluents from nuclear facilities, with particular emphasis on the ICRP recommendations and on the interest in this problem shown by the larger international organizations. This is followed by a description of the classes of radionuclides that are normally monitored in this way. The characteristics of monitoring 'in line' and 'by sample taking' are described; the disadvantages of in line monitoring and the problem of sample representativity are discussed. There follows an account of the general principles for measuring gaseous and liquid effluents that are applied in the techniques normally employed at nuclear facilities. Standards relating to the specifications for monitoring instruments are at present being devised by the International Electrotechnical Commission, and there are still major differences in national practices, at least as far as measurement thresholds are concerned. In conclusion, it is shown that harmonization of practices and standardization of equipment would probably help to make international relations in the field more productive. (author)

  6. Determination of elemental in soil samples from Gebeng area using NAA technique

    International Nuclear Information System (INIS)

    Md Suhaimi Elias; Wo, Y.M.; Mohd Suhaimi Hamzah

    2016-01-01

Rapid development and urbanization will increase the number of residential and industrial areas. Without proper management and pollution control, these will adversely affect the environment and human life. The objective of this study was to identify and quantify the key contaminants entering the environment of the Gebeng area as a result of industrial and human activities. The Gebeng area is gazetted as one of the industrial estates in Pahang state. Elemental pollution in the soil of the Gebeng area was assessed on the basis of concentration levels, enrichment factors and the geo-accumulation index. The enrichment factors (EFs) were determined by the elemental rationing method, whilst the geo-accumulation index (Igeo) was obtained by comparing the current concentration of each element to its continental crustal average. Twenty-seven soil samples were collected from the Gebeng area and analysed using the Neutron Activation Analysis (NAA) technique. The data showed a higher concentration of iron (Fe) than of the other elements, owing to its natural abundance in soil. The enrichment factors showed that the Gebeng area is enriched in As, Br, Hf, Sb, Th and U. Based on the geo-accumulation index (Igeo) classification, the soil quality of the Gebeng area ranges from Class 0 (uncontaminated) to Class 3 (moderately to heavily contaminated). (author)
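The two indices used in the abstract have compact standard definitions: the enrichment factor normalizes an element against a conservative reference element (often Fe or Al) in both the sample and the crust, and the geo-accumulation index is Igeo = log2(Cn / (1.5 Bn)), with Mueller's classes running from 0 to 6. A sketch with illustrative values (not the study's data):

```python
import math

def enrichment_factor(c_sample, ref_sample, c_crust, ref_crust):
    """EF = (C_x / C_ref)_sample / (C_x / C_ref)_crust."""
    return (c_sample / ref_sample) / (c_crust / ref_crust)

def geoaccumulation_index(c_sample, c_background):
    """Igeo = log2(C_n / (1.5 B_n)); the factor 1.5 buffers natural background variation."""
    return math.log2(c_sample / (1.5 * c_background))

def igeo_class(igeo):
    """Mueller's classes: 0 (uncontaminated) up to 6 (extremely contaminated)."""
    for cls, bound in enumerate([0, 1, 2, 3, 4, 5]):
        if igeo <= bound:
            return cls
    return 6
```

For example, a measured concentration of 12 against a crustal background of 1 gives Igeo = log2(8) = 3, which falls in Class 3 (moderately to heavily contaminated), matching the worst class reported for the Gebeng soils.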

  7. Determination of the uranium concentration in water samples by the technique of fission track recording

    International Nuclear Information System (INIS)

    Geraldo, L.P.

    1979-01-01

The technique of fission track registration was developed for the determination of microgram quantities of uranium. Makrofol KG, a synthetic plastic made by Bayer, was used as the detector, and the wet method was employed. The detector calibration curve allows the determination of the uranium concentration in a sample within an interval from 8.0 down to 0.4 μg U/L, with the total error ranging from 3.3% to 29.0%, respectively. The method was used to determine the uranium content in various water samples obtained from sources such as rivers and the sea in the state of Sao Paulo, Brazil. Results were compared with those obtained by other authors using different methods. The average concentration found in sea water (3.27 ± 9.12 μg U/l) by this method is compatible with the internationally accepted average value of 3.3 μg U/l, irrespective of site and depth. The determination of the uranium content by fission track counting has proved to be very convenient. (Author) [pt
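The detector calibration curve in the abstract maps observed track density to uranium concentration. Assuming a linear calibration over the working range (the abstract does not state the functional form), reading off an unknown sample reduces to an ordinary least-squares fit and its inversion:

```python
def linear_fit(x, y):
    """Ordinary least-squares slope and intercept for a calibration line y = a*x + b."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    a = sxy / sxx
    b = my - a * mx
    return a, b

def concentration_from_tracks(track_density, a, b):
    """Invert the calibration line to read an unknown concentration off the curve."""
    return (track_density - b) / a
```

In practice the calibration standards would be solutions of known uranium concentration irradiated alongside the samples; the fit is illustrative and the variable names are ours.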

  8. Neural tension technique is no different from random passive movements in reducing spasticity in patients with traumatic brain injury

    DEFF Research Database (Denmark)

    Lorentzen, Jakob; Nielsen, Dorthe; Holm, Karl

    2012-01-01

    Purpose: Neural tension technique (NTT) is a therapy believed to reduce spasticity and to increase range of motion (ROM). This study compared the ability of NTT and random passive movements (RPMs) to reduce spasticity in the knee flexors in 10 spastic patients with brain injury. Methods: An RCT...

  9. Albumin to creatinine ratio in a random urine sample: Correlation with severity of preeclampsia

    Directory of Open Access Journals (Sweden)

    Fady S. Moiety

    2014-06-01

    Conclusions: Random urine ACR may be a reliable method for prediction and assessment of severity of preeclampsia. Using the estimated cut-off may add to the predictive value of such a simple quick test.

  10. Enhancing trunk stability in acute poststroke subjects using physioball exercise and proprioceptive neuromuscular facilitation technique: A pilot randomized controlled trial

    Directory of Open Access Journals (Sweden)

    Ravichandran Hariharasudhan

    2016-01-01

    Background: Stroke is one of the leading causes of death and disability worldwide. Poststroke, most survivors experience trunk control impairment and instability. Previous work on exercise on an unstable surface to improve trunk stability in the nonstroke population has proven effective. Thus, physioball exercises (PBEs) in poststroke subjects may be useful in the recovery of trunk stability and thereby reduce disability. We hypothesize that PBE is feasible and effective in enhancing trunk stability. Aims: To test the feasibility of conducting a randomized controlled study to assess the clinical effectiveness of PBE and the proprioceptive neuromuscular facilitation (PNF) technique in enhancing trunk control in poststroke subjects. Methods: This study was conducted in the stroke unit of Global Hospitals and Health City, Chennai, India. Thirty patients with first-onset stroke within 40 days of stroke duration, a lesion to one side, and the ability to sit independently with or without arm support for 15 days were recruited. All thirty poststroke subjects were randomized into either the PBE group or the PNF group, and the outcome assessors involved in the trial were blinded to allocation. The PBE group performed task-oriented activities on an unstable surface, and the PNF group was treated with a PNF-specific trunk stability exercise program for 4 weeks (30 min/day, 5 times/week). The Trunk Impairment Scale (TIS) was used as the main outcome measure. Results: Data were analyzed using the Wilcoxon signed rank sum test and the Mann-Whitney U-test for intra- and inter-group comparison. The baseline characteristics of the two groups were statistically nonsignificant. Within groups, there were significant improvements between baseline and 4 weeks in the measure of TIS. In addition, the PBE group showed a significantly greater increase in trunk control (mean 2.33, 95% confidence interval 1.14-3.52, P = 0.002) than the PNF group. Conclusion: This pilot randomized controlled trial
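
    The between-group comparison above uses the Mann-Whitney U-test; a minimal pure-Python sketch of the rank-based U statistic (the score changes below are hypothetical, not the trial's data, and no tie-corrected p-value is computed):

```python
def mann_whitney_u(a, b):
    """Mann-Whitney U statistic for two independent samples.

    Assigns mid-ranks to ties and returns min(U1, U2); illustrative only
    (no normal approximation or exact p-value).
    """
    combined = sorted((v, g) for g, xs in enumerate((a, b)) for v in xs)
    n = len(combined)
    rank_of = [0.0] * n
    i = 0
    while i < n:
        j = i
        while j + 1 < n and combined[j + 1][0] == combined[i][0]:
            j += 1
        mid = (i + j) / 2 + 1  # average of 1-based ranks i+1..j+1
        for k in range(i, j + 1):
            rank_of[k] = mid
        i = j + 1
    r_a = sum(r for r, (_, g) in zip(rank_of, combined) if g == 0)
    n1, n2 = len(a), len(b)
    u_a = r_a - n1 * (n1 + 1) / 2
    return min(u_a, n1 * n2 - u_a)

# Hypothetical trunk-impairment score changes for two small groups
pbe = [5, 6, 7, 8, 6]
pnf = [3, 4, 5, 4, 3]
u = mann_whitney_u(pbe, pnf)
```

    A small U relative to n1·n2 indicates strong separation between the groups' rank distributions.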

  11. A Novel Randomized Search Technique for Multiple Mobile Robot Paths Planning In Repetitive Dynamic Environment

    Directory of Open Access Journals (Sweden)

    Vahid Behravesh

    2012-08-01

    This article studies path planning for multiple robots. The presented approach combines priority assignment with a robust method for path finding in repetitive dynamic environments. The model is generally implementable and usable: we assume no restriction on the number of degrees of freedom of the robots, and robots of diverse kinds can be used at the same time. We propose a random search with hill-climbing in the space of priority plans, which is used to find a solution to a given trajectory-planning problem and to reduce the total track length. The method plans trajectories for individual robots in configuration-time space, taking into account the intervals occupied by static objects as well as other robots, and the lengths of the tracks traversed. To measure the risk of robots colliding with each other, a method based on the probability of robot movements is applied. The proposed method was implemented and evaluated both on real robots and in simulation, with successful results. We performed a sequence of 100 tests with 8 robots for comparison with a coordination method; current performance is effective, although further optimization is still possible. Performance was measured on a 3 GHz Intel Pentium IV under Windows, compiled with GCC 3.4. We used our PCGA robot for all experiments. For a large environment of 19×15 m², where we performed 40 tests, the model is able to plan high-quality paths in a very short time (less than a second). Moreover, lookup tables are used to store the costs incurred by previously planned robots, so increasing the number of robots does not increase computation time.
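
    The priority-based random search can be sketched abstractly: assign the robots a priority ordering, plan them in that order, and hill-climb over random swaps of priorities to reduce total track length. A toy sketch with a hypothetical cost model (the base lengths and conflict penalties below are invented for illustration, not the paper's planner):

```python
import random

def hill_climb_priorities(cost, n, iters=1000, seed=1):
    """Randomized hill-climbing over robot priority orderings:
    propose a random swap of two priorities and keep it only if the
    total cost decreases."""
    rng = random.Random(seed)
    order = list(range(n))
    best = cost(order)
    for _ in range(iters):
        i, j = rng.sample(range(n), 2)
        order[i], order[j] = order[j], order[i]
        c = cost(order)
        if c < best:
            best = c
        else:
            order[i], order[j] = order[j], order[i]  # undo the swap
    return order, best

# Hypothetical setup: each robot has a base path length, and detour[(a, b)]
# is the extra cost robot b pays when robot a is planned first.
base = [4, 3, 5, 2]
detour = {(0, 1): 6, (1, 0): 2, (2, 3): 5, (3, 2): 1}

def total_cost(order):
    pos = {r: p for p, r in enumerate(order)}
    return sum(base) + sum(pen for (a, b), pen in detour.items()
                           if pos[a] < pos[b])

order, best = hill_climb_priorities(total_cost, 4)
```

    In the article's setting the cost of an ordering would come from running the single-robot planner in priority order; here a fixed penalty table stands in for that step.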

  12. EFT (Emotional Freedom Techniques) and Resiliency in Veterans at Risk for PTSD: A Randomized Controlled Trial.

    Science.gov (United States)

    Church, Dawson; Sparks, Terry; Clond, Morgan

    2016-01-01

    Prior research indicates elevated but subclinical posttraumatic stress disorder (PTSD) symptoms as a risk factor for a later diagnosis of PTSD. This study examined the progression of symptoms in 21 subclinical veterans. Participants were randomized into a treatment as usual (TAU) wait-list group and an experimental group, which received TAU plus six sessions of clinical emotional freedom techniques (EFT). Symptoms were assessed using the PCL-M (Posttraumatic Checklist-Military) on which a score of 35 or higher indicates increased risk for PTSD. The mean pretreatment score of participants was 39 ± 8.7, with no significant difference between groups. No change was found in the TAU group during the wait period. Afterward, the TAU group received an identical clinical EFT protocol. Posttreatment groups were combined for analysis. Scores declined to a mean of 25 (-64%, P < .0001). Participants maintained their gains, with mean three-month and six-month follow-up PCL-M scores of 27 (P < .0001). Similar reductions were noted in the depth and breadth of psychological conditions such as anxiety. A Cohen's d = 1.99 indicates a large treatment effect. Reductions in traumatic brain injury symptoms (P = .045) and insomnia (P = .004) were also noted. Symptom improvements were similar to those assessed in studies of PTSD-positive veterans. EFT may thus be protective against an increase in symptoms and a later PTSD diagnosis. As a simple and quickly learned self-help method, EFT may be a clinically useful element of a resiliency program for veterans and active-duty warriors. Copyright © 2016 Elsevier Inc. All rights reserved.
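
    The reported effect size (Cohen's d = 1.99) follows the standard pooled-standard-deviation formula; a minimal sketch of the computation on hypothetical PCL-M-like scores (not the study's data):

```python
from statistics import mean, stdev

def cohens_d(x, y):
    """Cohen's d using the pooled sample standard deviation."""
    nx, ny = len(x), len(y)
    pooled = (((nx - 1) * stdev(x) ** 2 + (ny - 1) * stdev(y) ** 2)
              / (nx + ny - 2)) ** 0.5
    return (mean(x) - mean(y)) / pooled

pre = [38, 40, 42, 36, 44]   # hypothetical PCL-M scores before treatment
post = [24, 26, 28, 22, 30]  # hypothetical scores after treatment
d = cohens_d(pre, post)
```

    Values of d near 0.8 are conventionally "large"; the trial's d = 1.99 is well beyond that threshold.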

  13. A Pilot Randomized Controlled Trial of Novel Dressing and Securement Techniques in 101 Pediatric Patients.

    Science.gov (United States)

    Kleidon, Tricia M; Ullman, Amanda J; Gibson, Victoria; Chaseling, Brett; Schoutrop, Jason; Mihala, Gabor; Rickard, Claire M

    2017-11-01

    To evaluate feasibility of an efficacy trial comparing peripherally inserted central catheter (PICC) dressing and securement techniques to prevent complications and failure. This pilot, 3-armed, randomized controlled trial was undertaken at Royal Children's Hospital and Lady Cilento Children's Hospital, Brisbane, Australia, between April 2014 and September 2015. Pediatric participants (N = 101; age range, 0-18 y) were assigned to standard care (bordered polyurethane [BPU] dressing, sutureless securement device), tissue adhesive (TA) (plus BPU dressing), or integrated securement dressings (ISDs). Average PICC dwell time was 8.1 days (range, 0.2-27.7 d). Primary outcome was trial feasibility including PICC failure. Secondary outcomes were PICC complications, dressing performance, and parent and staff satisfaction. Protocol feasibility was established. PICC failure was 6% (2/32) with standard care, 6% (2/31) with ISD, and 3% (1/32) with TA. PICC complications were 16% across all groups. TA provided immediate postoperative hemostasis, prolonging the first dressing change until 5.5 days compared with 3.5 days and 2.5 days with standard care and ISD respectively. Bleeding was the most common reason for first dressing change: standard care (n = 18; 75%), ISD (n = 11; 69%), TA (n = 4; 27%). Parental satisfaction (median 9.7/10; P = .006) and staff feedback (9.2/10; P = .002) were most positive for ISD. This research suggests safety and acceptability of different securement dressings compared with standard care; securement dressings may also reduce dressing changes after insertion. Further research is required to confirm clinically cost-effective methods to prevent PICC failure. Copyright © 2017 SIR. Published by Elsevier Inc. All rights reserved.

  14. Alexander Technique Lessons or Acupuncture Sessions for Persons With Chronic Neck Pain: A Randomized Trial.

    Science.gov (United States)

    MacPherson, Hugh; Tilbrook, Helen; Richmond, Stewart; Woodman, Julia; Ballard, Kathleen; Atkin, Karl; Bland, Martin; Eldred, Janet; Essex, Holly; Hewitt, Catherine; Hopton, Ann; Keding, Ada; Lansdown, Harriet; Parrott, Steve; Torgerson, David; Wenham, Aniela; Watt, Ian

    2015-11-03

    Management of chronic neck pain may benefit from additional active self-care-oriented approaches. To evaluate clinical effectiveness of Alexander Technique lessons or acupuncture versus usual care for persons with chronic, nonspecific neck pain. Three-group randomized, controlled trial. (Current Controlled Trials: ISRCTN15186354). U.K. primary care. Persons with neck pain lasting at least 3 months, a score of at least 28% on the Northwick Park Questionnaire (NPQ) for neck pain and associated disability, and no serious underlying pathology. 12 acupuncture sessions or 20 one-to-one Alexander lessons (both 600 minutes total) plus usual care versus usual care alone. NPQ score (primary outcome) at 0, 3, 6, and 12 months (primary end point) and Chronic Pain Self-Efficacy Scale score, quality of life, and adverse events (secondary outcomes). 517 patients were recruited, and the median duration of neck pain was 6 years. Mean attendance was 10 acupuncture sessions and 14 Alexander lessons. Between-group reductions in NPQ score at 12 months versus usual care were 3.92 percentage points for acupuncture (95% CI, 0.97 to 6.87 percentage points) (P = 0.009) and 3.79 percentage points for Alexander lessons (CI, 0.91 to 6.66 percentage points) (P = 0.010). The 12-month reductions in NPQ score from baseline were 32% for acupuncture and 31% for Alexander lessons. Participant self-efficacy improved for both interventions versus usual care at 6 months. Both acupuncture sessions and Alexander Technique lessons led to significant reductions in neck pain and associated disability compared with usual care at 12 months. Enhanced self-efficacy may partially explain why longer-term benefits were sustained. Arthritis Research UK.

  15. Evaluation of a Class of Simple and Effective Uncertainty Methods for Sparse Samples of Random Variables and Functions

    Energy Technology Data Exchange (ETDEWEB)

    Romero, Vicente [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bonney, Matthew [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Schroeder, Benjamin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Weirs, V. Gregory [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-11-01

    When very few samples of a random quantity are available from a source distribution of unknown shape, it is usually not possible to accurately infer the exact distribution from which the data samples come. Under-estimation of important quantities such as response variance and failure probabilities can result. For many engineering purposes, including design and risk analysis, we attempt to avoid under-estimation with a strategy to conservatively estimate (bound) these types of quantities -- without being overly conservative -- when only a few samples of a random quantity are available from model predictions or replicate experiments. This report examines a class of related sparse-data uncertainty representation and inference approaches that are relatively simple, inexpensive, and effective. Tradeoffs between the methods' conservatism, reliability, and risk versus number of data samples (cost) are quantified with multi-attribute metrics used to assess method performance for conservative estimation of two representative quantities: the central 95% of response; and the 10^-4 probability of exceeding a response threshold in a tail of the distribution. Each method's performance is characterized with 10,000 random trials on a large number of diverse and challenging distributions. The best method and number of samples to use in a given circumstance depend on the uncertainty quantity to be estimated, the PDF character, and the desired reliability of bounding the true value. On the basis of this large database and study, a strategy is proposed for selecting the method and number of samples for attaining reasonable credibility levels in bounding these types of quantities when sparse samples of random variables or functions are available from experiments or simulations.
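
    The under-estimation problem described above is easy to reproduce: with only a handful of samples, even the full sample range covers far less than the central 95% of the source distribution. A hedged Monte Carlo illustration, assuming a standard normal source rather than the report's test distributions:

```python
import random
from bisect import bisect_left

def minmax_coverage(n_samples, seed=42):
    """Average fraction of a standard normal distribution covered by the
    [min, max] interval of n_samples draws, estimated by Monte Carlo."""
    rng = random.Random(seed)
    # Empirical CDF built from a large probe sample (keeps everything stdlib).
    probe = sorted(rng.gauss(0, 1) for _ in range(20000))
    cdf = lambda x: bisect_left(probe, x) / len(probe)
    reps, cov = 2000, 0.0
    for _ in range(reps):
        draws = [rng.gauss(0, 1) for _ in range(n_samples)]
        cov += cdf(max(draws)) - cdf(min(draws))
    return cov / reps

# Expected coverage of the sample range is (n-1)/(n+1), i.e. 2/3 for n = 5
c5 = minmax_coverage(5)
```

    With n = 5 the naive min-max interval covers roughly two-thirds of the distribution, far short of the central 95% it is often taken to represent; this is the gap the report's conservative estimation methods are designed to close.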

  16. Randomization tests

    CERN Document Server

    Edgington, Eugene

    2007-01-01

    Statistical Tests That Do Not Require Random Sampling; Randomization Tests; Numerical Examples; Randomization Tests and Nonrandom Samples; The Prevalence of Nonrandom Samples in Experiments; The Irrelevance of Random Samples for the Typical Experiment; Generalizing from Nonrandom Samples; Intelligibility; Respect for the Validity of Randomization Tests; Versatility; Practicality; Precursors of Randomization Tests; Other Applications of Permutation Tests; Questions and Exercises; Notes; References; Randomized Experiments; Unique Benefits of Experiments; Experimentation without Mani
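
    As the contents suggest, a randomization (permutation) test derives its validity from the random assignment itself rather than from random sampling: under the null hypothesis, every reassignment of group labels is equally likely. A minimal exact two-sample test in Python (toy data, not an example from the book):

```python
from itertools import combinations

def randomization_test(a, b):
    """Exact two-sample randomization test: the p-value is the fraction of
    label reassignments whose |mean difference| is at least the observed."""
    pooled = a + b
    n = len(a)
    obs = abs(sum(a) / len(a) - sum(b) / len(b))
    total = sum(pooled)
    count = hits = 0
    for idx in combinations(range(len(pooled)), n):
        sa = sum(pooled[i] for i in idx)
        diff = abs(sa / n - (total - sa) / (len(pooled) - n))
        count += 1
        if diff >= obs - 1e-12:  # tolerance for float comparison
            hits += 1
    return hits / count

# Hypothetical treatment vs. control scores from a small randomized experiment
treat = [12, 14, 15]
ctrl = [8, 9, 10]
p = randomization_test(treat, ctrl)
```

    With C(6, 3) = 20 equally likely reassignments, only the observed split and its mirror reach the observed mean difference, giving p = 0.1; for larger samples one would sample reassignments at random instead of enumerating them.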

  17. The particle analysis based on FT-TIMS technique for swipe sample under the frame of nuclear safeguard

    International Nuclear Information System (INIS)

    Yang Tianli; Liu Xuemei; Liu Zhao; Tang Lei; Long Kaiming

    2008-06-01

    Under the framework of nuclear safeguards, particle analysis of swipe samples is an advanced means of detecting undeclared uranium enrichment facilities and undeclared uranium enrichment activities. A particle analysis technique based on fission track-thermal ionization mass spectrometry (FT-TIMS) for swipe samples has been established. The reliability of, and the experimental background for, selecting uranium-bearing particles from swipe samples by the FT method have been verified. In addition, the utilization coefficient of particles on the surface of the swipe sample has also been tested. This work provides technical support for applications in the area of nuclear verification. (authors)

  18. Effectiveness of a Treatment Involving Soft Tissue Techniques and/or Neural Mobilization Techniques in the Management of Tension-Type Headache: A Randomized Controlled Trial.

    Science.gov (United States)

    Ferragut-Garcías, Alejandro; Plaza-Manzano, Gustavo; Rodríguez-Blanco, Cleofás; Velasco-Roldán, Olga; Pecos-Martín, Daniel; Oliva-Pascual-Vaca, Jesús; Llabrés-Bennasar, Bartomeu; Oliva-Pascual-Vaca, Ángel

    2017-02-01

    To evaluate the effects of a protocol involving soft tissue techniques and/or neural mobilization techniques in the management of patients with frequent episodic tension-type headache (FETTH) and those with chronic tension-type headache (CTTH). Randomized, double-blind, placebo-controlled before and after trial. Rehabilitation area of the local hospital and a private physiotherapy center. Patients (N=97; 78 women, 19 men) diagnosed with FETTH or CTTH were randomly assigned to groups A, B, C, or D. (A) Placebo superficial massage; (B) soft tissue techniques; (C) neural mobilization techniques; (D) a combination of soft tissue and neural mobilization techniques. The pressure pain threshold (PPT) in the temporal muscles (points 1 and 2) and supraorbital region (point 3), the frequency and maximal intensity of pain crisis, and the score in the Headache Impact Test-6 (HIT-6) were evaluated. All variables were assessed before the intervention, at the end of the intervention, and 15 and 30 days after the intervention. Groups B, C, and D had an increase in PPT and a reduction in frequency, maximal intensity, and HIT-6 values in all time points after the intervention as compared with baseline and group A (P<.001 for all cases). Group D had the highest PPT values and the lowest frequency and HIT-6 values after the intervention. The application of soft tissue and neural mobilization techniques to patients with FETTH or CTTH induces significant changes in PPT, the characteristics of pain crisis, and its effect on activities of daily living as compared with the application of these techniques as isolated interventions. Copyright © 2016 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.

  19. CT-Guided Transgluteal Biopsy for Systematic Random Sampling of the Prostate in Patients Without Rectal Access.

    Science.gov (United States)

    Goenka, Ajit H; Remer, Erick M; Veniero, Joseph C; Thupili, Chakradhar R; Klein, Eric A

    2015-09-01

    The objective of our study was to review our experience with CT-guided transgluteal prostate biopsy in patients without rectal access. Twenty-one CT-guided transgluteal prostate biopsy procedures were performed under conscious sedation in 16 men (mean age, 68 years; age range, 60-78 years). The mean prostate-specific antigen (PSA) value was 11.4 ng/mL (range, 2.3-39.4 ng/mL). Six patients had previously undergone a total of seven unsuccessful transperineal or transurethral biopsies. Biopsy results, complications, sedation time, and radiation dose were recorded. The mean PSA values and numbers of core specimens were compared between patients with malignant results and patients with nonmalignant results using the Student t test. The average procedural sedation time was 50.6 minutes (range, 15-90 minutes) (n = 20), and the mean effective radiation dose was 8.2 mSv (median, 6.6 mSv; range, 3.6-19.3 mSv) (n = 13). Twenty of the 21 (95%) procedures were technically successful. The only complication was a single episode of gross hematuria and penile pain in one patient, which resolved spontaneously. Of the 20 successful biopsies, 8 (40%) yielded adenocarcinoma (Gleason score: mean, 8; range, 7-9). Twelve biopsies (60%) yielded nonmalignant results: high-grade prostatic intraepithelial neoplasia (n = 3) or benign prostatic tissue with or without inflammation (n = 9). Three patients had carcinoma diagnosed on subsequent biopsies (second biopsy, n = 2 patients; third biopsy, n = 1 patient). A malignant biopsy result was not significantly associated with the number of core specimens (p = 0.3) or the mean PSA value (p = 0.1). CT-guided transgluteal prostate biopsy is a safe and reliable technique for systematic random sampling of the prostate in patients without rectal access. In patients with initial negative biopsy results, repeat biopsy should be considered if there is a persistent rise in the PSA value.

  20. Comparison of randomization techniques for clinical trials with data from the HOMERUS-trial

    NARCIS (Netherlands)

    Verberk, W. J.; Kroon, A. A.; Kessels, A. G. H.; Nelemans, P. J.; van Ree, J. W.; Lenders, J. W. M.; Thien, T.; Bakx, J. C.; van Montfrans, G. A.; Smit, A. J.; Beltman, F. W.; de Leeuw, P. W.

    2005-01-01

    Background. Several methods of randomization are available to create comparable intervention groups in a study. In the HOMERUS-trial, we compared the minimization procedure with a stratified and a non-stratified method of randomization in order to test which one is most appropriate for use in

  2. Quantifying ruminal nitrogen metabolism using the omasal sampling technique in cattle--a meta-analysis.

    Science.gov (United States)

    Broderick, G A; Huhtanen, P; Ahvenjärvi, S; Reynal, S M; Shingfield, K J

    2010-07-01

    Mixed model analysis of data from 32 studies (122 diets) was used to evaluate the precision and accuracy of the omasal sampling technique for quantifying ruminal-N metabolism and to assess the relationships between nonammonia-N flow at the omasal canal and milk protein yield. Data were derived from experiments in cattle fed North American diets (n=36) based on alfalfa silage, corn silage, and corn grain and Northern European diets (n=86) composed of grass silage and barley-based concentrates. In all studies, digesta flow was quantified using a triple-marker approach. Linear regressions were used to predict microbial-N flow to the omasum from intake of dry matter (DM), organic matter (OM), or total digestible nutrients. Efficiency of microbial-N synthesis increased with DM intake, and there were trends for increased efficiency with elevated dietary concentrations of crude protein (CP) and rumen-degraded protein (RDP), but these effects were small. Regression of omasal rumen-undegraded protein (RUP) flow on CP intake indicated that an average of 32% of dietary CP escaped and 68% was degraded in the rumen. The slope from regression of observed omasal flows of RUP on flows predicted by the National Research Council (2001) model indicated that NRC predicted greater RUP supply. Measured microbial-N flow was, on average, 26% greater than that predicted by the NRC model. Zero ruminal N-balance (omasal CP flow = CP intake) was obtained at dietary CP and RDP concentrations of 147 and 106 g/kg of DM, corresponding to ruminal ammonia-N and milk urea N concentrations of 7.1 and 8.3 mg/100 mL, respectively. Milk protein yield was positively related to the efficiency of microbial-N synthesis and the measured RUP concentration. Improved efficiency of microbial-N synthesis and reduced ruminal CP degradability were positively associated with the efficiency of capture of dietary N as milk N. In conclusion, the results of this study indicate that the omasal sampling technique yields valuable estimates
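
    The regression step described above (predicting microbial-N flow from intake) can be sketched with ordinary least squares; the intake and flow values below are hypothetical, not data from the meta-analysis:

```python
def linfit(x, y):
    """Ordinary least-squares slope and intercept for one predictor."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
             / sum((a - mx) ** 2 for a in x))
    return slope, my - slope * mx

# Hypothetical DM intake (kg/d) vs. omasal microbial-N flow (g/d)
dmi = [15, 18, 20, 22, 25]
mnf = [210, 260, 290, 320, 370]
slope, intercept = linfit(dmi, mnf)
pred = slope * 20 + intercept  # predicted flow at 20 kg/d intake
```

    A positive slope corresponds to the study's finding that efficiency of microbial-N synthesis increases with DM intake; mixed-model regression as used in the paper additionally accounts for between-study effects.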

  3. The novel programmable riometer for in-depth ionospheric and magnetospheric observations (PRIAMOS) using direct sampling DSP techniques

    OpenAIRE

    Dekoulis, G.; Honary, F.

    2005-01-01

    This paper describes the feasibility study and simulation results for the unique multi-frequency, multi-bandwidth, Programmable Riometer for in-depth Ionospheric And Magnetospheric ObservationS (PRIAMOS) based on direct sampling digital signal processing (DSP) techniques. This novel architecture is based on sampling the cosmic noise wavefront at the antenna. It eliminates the usage of any intermediate frequency (IF) mixer stages (-6 dB) and the noise balancing technique (-3 dB), providing a m...

  4. Application of BAT assessments for sampling and monitoring techniques in the UK EPR

    International Nuclear Information System (INIS)

    Kerdaniel, Erwan du Fou de; Colin, Julie; Moyano, Florian; Ardon, Audrey

    2014-01-01

    EDF Energy intends to build two EPR units at the Hinkley Point C site and has initiated studies to build two more EPR units at Sizewell C. The project is led by NNB GenCo, a subsidiary of EDF Energy. NNB GenCo, identified as the future operator and intelligent customer, relies upon EDF SA's expertise as responsible designer. UK EPR design substantiation has been carried out during the Generic Design Assessment (GDA), with two key reports, namely the Pre-Construction Environmental Report and the Pre-Construction Safety Report. These reports led to specific enquiries initiated by the Regulators through 'Assessment Findings' in all areas, including Chemistry. During the GDA process, the Environment Agency (EA) authorised chemical and radiochemical effluent discharges, requiring NNB GenCo to demonstrate that Best Available Techniques (BAT) are used to sample and monitor the discharges and the generation of solid wastes. BAT demonstration deliverables are formalised via 'Commitment Plans' and detailed within 'Environmental Optimisation Studies (EOS)' reports, in line with NNB GenCo's management arrangements. NNB GenCo manages the production of the EOS reports. The EDF CEIDRE Laboratory department (CEIDRE/DLAB) has been identified as a contributor to the BAT demonstration regarding sampling and monitoring of chemical effluents, active effluents and solid wastes. As required by the definition of BAT in the UK context, demonstrations are based on: the UK regulatory context and associated specific requirements; analysis of international feedback from European, EPRI and IAEA documents; the French fleet framework (benchmarks with the operator, analysis of technical and operational management documentation); the Sizewell B framework (benchmarks with the operator, analysis of technical and operational management documents); and international, British and French standards. A close partnership between the different EDF SA departments and a clear

  5. Analyses of archaeological pottery samples using X-ray fluorescence technique for provenance study

    International Nuclear Information System (INIS)

    Tamilarasu, S.; Swain, K.K.; Singhal, R.K; Reddy, A.V.R.; Acharya, R.; Velraj, G.

    2015-01-01

    Archaeological artifacts reveal information on past human activities, artifact preparation technology, art, and possible trade. Ceramics are the most stable and abundant material in archaeological contexts, and pottery sherds are among the most abundant tracers in archaeological excavations. Compared with major elements, elements present at trace concentration levels are source specific, and they maintain the same concentration levels in the source clay as in the finished products, e.g., fired clay potteries. As it is difficult to identify an exact source or origin, a provenance study is carried out first to establish whether the objects under study are from the same or different sources/origins. Various analytical techniques, such as instrumental neutron activation analysis (INAA), ion beam analysis (IBA) and X-ray fluorescence (XRF), have been used for obtaining elemental concentrations in archaeological potteries. Portable X-ray fluorescence (pXRF) spectrometry provides a non-destructive means for elemental characterization of a wide range of archaeological materials. Ten archaeological pottery samples were collected from Kottapuram, Kerala, under the supervision of the Archaeological Survey of India. Portable X-ray fluorescence (pXRF) spectrometry using a handheld Olympus Innov-X Delta XRF device, ACD BARC, was used for chemical characterization of the pottery samples. The instrument is equipped with a Delta Rhodium (Rh) anode X-ray tube and uses a Silicon Drift Detector (resolution <200 eV at 5.95 keV Mn Kα X-ray). NIST 2781 SRM was analyzed for quality control purposes. Ten elements, namely Fe, Ti, Mn, Co, Cu, Zn, Pb, Zr, Mo and Se, were chosen for cluster analysis, and their concentration values were utilized for multivariate statistical analysis using WinSTAT 9.0.
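
    The cluster analysis mentioned above groups sherds by similarity of their element concentrations; a minimal single-linkage agglomerative sketch in pure Python, on hypothetical concentration vectors (not the study's measurements):

```python
def single_linkage(points, n_clusters):
    """Agglomerative clustering with single linkage on Euclidean distance."""
    clusters = [[i] for i in range(len(points))]

    def dist(a, b):
        # single linkage: minimum pairwise distance between two clusters
        return min(sum((points[i][k] - points[j][k]) ** 2
                       for k in range(len(points[0]))) ** 0.5
                   for i in a for j in b)

    while len(clusters) > n_clusters:
        _, i, j = min((dist(clusters[i], clusters[j]), i, j)
                      for i in range(len(clusters))
                      for j in range(i + 1, len(clusters)))
        clusters[i] += clusters.pop(j)  # merge the closest pair
    return [sorted(c) for c in clusters]

# Hypothetical Fe/Ti/Mn concentrations (wt%) for six sherds from two sources
sherds = [(5.1, 0.60, 0.09), (5.0, 0.62, 0.08), (5.2, 0.58, 0.10),
          (8.9, 1.10, 0.20), (9.1, 1.05, 0.22), (8.8, 1.12, 0.19)]
groups = sorted(single_linkage(sherds, 2))
```

    Sherds with similar trace-element profiles merge first, recovering the two assumed sources; in practice concentrations would be standardized before clustering so no single element dominates the distance.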

  6. Relationship between accuracy and number of samples on statistical quantity and contour map of environmental gamma-ray dose rate. Example of random sampling

    International Nuclear Information System (INIS)

    Matsuda, Hideharu; Minato, Susumu

    2002-01-01

    The accuracy of statistical quantities such as the mean value, and of contour maps, obtained by measurement of the environmental gamma-ray dose rate was evaluated by random sampling from 5 different model distribution maps generated using the mean slope (-1.3) of power spectra calculated from actually measured values. The values were derived from 58 natural gamma dose rate data sets reported worldwide, with means in the range 10-100 nGy/h and areas of 10^-3 to 10^7 km^2. The accuracy of the mean value was found to be about ±7% even for 60 or 80 samplings (the most frequent number), and the standard deviation had an accuracy better than 1/4-1/3 of the mean. The correlation coefficient of the frequency distribution was 0.860 or more for 200-400 samplings (the most frequent number), but that of the contour map was 0.502-0.770. (K.H.)
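
    The paper's sampling experiment can be mimicked on a synthetic field: draw random sample points from a 2-D "dose rate" map and track how the error of the sample mean shrinks as the number of samples grows. A sketch under assumed simplifications (a smooth synthetic field rather than the paper's power-spectrum model):

```python
import math
import random

def sample_mean_error(n_samples, trials=500, seed=7):
    """Mean relative error (%) of a random-sampling estimate of the
    grid-average of a synthetic 2-D 'dose rate' field."""
    rng = random.Random(seed)
    size = 64
    field = [[50 + 20 * math.sin(0.3 * x) * math.cos(0.2 * y)
              for y in range(size)] for x in range(size)]
    true_mean = sum(map(sum, field)) / size ** 2
    err = 0.0
    for _ in range(trials):
        s = sum(field[rng.randrange(size)][rng.randrange(size)]
                for _ in range(n_samples))
        err += abs(s / n_samples - true_mean) / true_mean
    return 100 * err / trials

e10, e80 = sample_mean_error(10), sample_mean_error(80)
```

    The error falls roughly as 1/sqrt(n), consistent with the paper's observation that even 60-80 samples leave a residual uncertainty of several percent in the mean.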

  7. Validation of the k-filtering technique for a signal composed of random-phase plane waves and non-random coherent structures

    Directory of Open Access Journals (Sweden)

    O. W. Roberts

    2014-12-01

    Recent observations of astrophysical magnetic fields have shown the presence of fluctuations that are wave-like (propagating in the plasma frame) and those described as structure-like (advected by the plasma bulk velocity). With single-spacecraft missions it is typically impossible to differentiate between these two types of fluctuation, due to the inherent spatio-temporal ambiguity associated with a single-point measurement. However, missions such as Cluster, which contain multiple spacecraft, have allowed temporal and spatial changes to be resolved using techniques such as k-filtering. While this technique does not assume Taylor's hypothesis, it requires both weak stationarity of the time series and that the fluctuations can be described by a superposition of plane waves with random phases. In this paper we test whether the method can cope with a synthetic signal that is composed of a combination of non-random-phase coherent structures with a mean radius d and a mean separation λ, as well as plane waves with random phase.

  8. Development of novel separation techniques for biological samples in capillary electrophoresis

    Energy Technology Data Exchange (ETDEWEB)

    Chang, Huan -Tsung [Iowa State Univ., Ames, IA (United States)

    1994-07-27

    This dissertation covers three topics: a general introduction to capillary electrophoresis (CE); gradients in CE and CE in biological separations; and capillary gel electrophoresis (CGE) for DNA separation. Factors affecting separation performance, such as temperature, viscosity, pH, and the surface of the capillary walls, are demonstrated. A pH gradient between 3.0 and 5.2 is useful for improving the resolution among eight different organic acids. A flow gradient due to a change in the concentration of surfactant, which can coat the capillary wall to change the flow rate and its direction, is also shown to be a good way to improve the resolution of organic compounds. A temperature gradient caused by Joule heating is shown, via voltage programming, to enhance the resolution and shorten the separation time for several phenolic compounds. The author also shows that self-regulating dynamic control of electroosmotic flow in CE, achieved simply by running separations at different concentrations of surfactant, reduces the matrix effect on separation performance. One of the most important demonstrations in this dissertation is the proposal of on-column reaction, which offers several advantages, including the use of a small amount of sample, low risk of contamination, time savings, and access to kinetic information. The author uses this idea, with laser-induced fluorescence (LIF) as the detection mode, to detect the on-column digestion of sub-nanogram amounts of protein. This technique is also applied to single-cell analysis in the group.

  9. Techniques for the detection of pathogenic Cryptococcus species in wood decay substrata and the evaluation of viability in stored samples

    Directory of Open Access Journals (Sweden)

    Christian Alvarez

    2013-02-01

    In this study, we evaluated several techniques for the detection of the yeast form of Cryptococcus in decaying wood and measured the viability of these fungi in environmental samples stored in the laboratory. Samples were collected from a tree known to be positive for Cryptococcus, and each was inoculated on 10 Niger seed agar (NSA) plates. The conventional technique (CT) yielded a greater number of positive samples and indicated a higher fungal density, in colony-forming units per gram of wood (CFU/g), compared with the humid swab technique (ST). However, the difference in positive and false-negative results between the CT and ST was not significant. The threshold of detection for the CT was 0.05 × 10³ CFU/g, while the threshold for the ST was greater than 0.1 × 10³ CFU/g. No colonies were recovered using the dry swab technique. We also determined the viability of Cryptococcus in wood samples stored for 45 days at 25°C using the CT and ST and found that samples not only continued to yield a positive response, but also exhibited an increase in CFU/g, suggesting that Cryptococcus is able to grow in stored environmental samples. The ST.1, in which samples collected with swabs were immediately plated on NSA medium, was more efficient and less laborious than either the CT or ST and required approximately 10 min to perform; however, additional studies are needed to validate this technique.

  10. Efficacy and complications associated with a modified inferior alveolar nerve block technique. A randomized, triple-blind clinical trial.

    Science.gov (United States)

    Montserrat-Bosch, Marta; Figueiredo, Rui; Nogueira-Magalhães, Pedro; Arnabat-Dominguez, Josep; Valmaseda-Castellón, Eduard; Gay-Escoda, Cosme

    2014-07-01

    To compare the efficacy and complication rates of two different techniques for inferior alveolar nerve block (IANB), a randomized, triple-blind clinical trial comprising 109 patients who required lower third molar removal was performed. In the control group, all patients received an IANB using the conventional Halsted technique, whereas in the experimental group a modified technique using a more inferior injection point was performed. A total of 100 patients were randomized. The modified technique group showed a significantly longer onset time in the lower lip and chin area, and was frequently associated with a lingual electric discharge sensation. Three failures were recorded, 2 of them in the experimental group. No relevant local or systemic complications were registered. Both IANB techniques used in this trial are suitable for lower third molar removal. However, performing an inferior alveolar nerve block in a more inferior position (modified technique) extends the onset time, does not seem to reduce the risk of intravascular injection and might increase the risk of lingual nerve injury.

  11. Contributions from the data samples in NOC technique on the extracting of the Sq variation

    Science.gov (United States)

    Wu, Yingyan; Xu, Wenyao

    2015-04-01

    The solar quiet daily variation, Sq, a rather regular variation, is usually observed at mid-low latitudes on magnetically quiet or less-disturbed days. It results mainly from the dynamo currents in the ionospheric E region, which are driven by the atmospheric tidal wind and other processes and flow as two current whorls, one in each of the northern and southern hemispheres[1]. Sq exhibits a conspicuous day-to-day (DTD) variability in daily range (or strength), shape (or phase) and current focus. This variability is mainly attributed to changes in the ionospheric conductivity and tidal winds, which vary with solar radiation and ionospheric conditions. Furthermore, it presents a seasonal variation and a solar cycle variation[2-4]. In general, Sq is expressed as the average value of the five international magnetically quiet days. Using data from global magnetic stations, an equivalent current system of the daily variation can be constructed to reveal characteristics of the currents[5]. In addition, using the differences of the H component at two stations on the north and south sides of the Sq current focus, Sq can be extracted much better[6]. Recently, the method of Natural Orthogonal Components (NOC) has been used to decompose the magnetic daily variation and express it as a summation of eigenmodes, identifying the first NOC eigenmode as the solar quiet daily variation and the second as the disturbance daily variation[7-9]. The NOC technique can help reveal simpler patterns within a complex set of variables, without pre-designed basis functions such as those of the FFT technique. However, the physical interpretation of the NOC eigenmodes depends greatly on the number of data samples and their regular quality. Using the NOC method, we focus our present study on the analysis of the hourly means of the H component at BMT observatory in China from 2001 to 2008. The contributions of the number and the regular quality of the data samples to which eigenmode corresponds to Sq are analyzed.
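    Since the NOC decomposition described here is essentially a principal-component (eigenmode) analysis of the daily variation curves, the idea can be sketched in stdlib Python with power iteration on the hour-by-hour covariance matrix. The 24-hour sine pattern below is purely illustrative synthetic data, not BMT observatory measurements:

```python
import math

def first_eigenmode(days, n_iter=200):
    """Leading natural orthogonal component (eigenmode) of a set of daily
    variation curves, via power iteration on the hour-by-hour covariance
    matrix. Each entry of `days` is one day's list of 24 hourly means."""
    h = len(days[0])
    mean = [sum(day[k] for day in days) / len(days) for k in range(h)]
    centred = [[day[k] - mean[k] for k in range(h)] for day in days]
    cov = [[sum(row[i] * row[j] for row in centred) / len(centred)
            for j in range(h)] for i in range(h)]
    v = [float(i + 1) for i in range(h)]      # arbitrary non-degenerate start
    for _ in range(n_iter):
        w = [sum(cov[i][j] * v[j] for j in range(h)) for i in range(h)]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    return v

# Synthetic check: days that are scaled copies of one diurnal pattern should
# return that pattern (up to sign) as the first eigenmode.
pattern = [math.sin(2 * math.pi * k / 24) for k in range(24)]
days = [[(1 + 0.1 * d) * p for p in pattern] for d in range(10)]
mode = first_eigenmode(days)
```

    With real observatory data the separation of Sq from disturbance variation is far less clean, which is precisely why the record stresses the dependence on sample count and data quality.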

  12. Sample Size Calculation: Inaccurate A Priori Assumptions for Nuisance Parameters Can Greatly Affect the Power of a Randomized Controlled Trial.

    Directory of Open Access Journals (Sweden)

    Elsa Tavernier

    Full Text Available We aimed to examine the extent to which inaccurate assumptions for nuisance parameters used to calculate sample size can affect the power of a randomized controlled trial (RCT). In a simulation study, we separately considered an RCT with continuous, dichotomous or time-to-event outcomes, with associated nuisance parameters of standard deviation, success rate in the control group and survival rate in the control group at some time point, respectively. For each type of outcome, we calculated a required sample size N for a hypothesized treatment effect, an assumed nuisance parameter and a nominal power of 80%. We then assumed a nuisance parameter associated with a relative error at the design stage. For each type of outcome, we randomly drew 10,000 relative errors of the associated nuisance parameter (from empirical distributions derived from a previously published review). Then, retro-fitting the sample size formula, we derived, for the pre-calculated sample size N, the real power of the RCT, taking into account the relative error for the nuisance parameter. In total, 23%, 0% and 18% of RCTs with continuous, binary and time-to-event outcomes, respectively, were underpowered (i.e., the real power fell short of the nominal 80%). Even with proper calculation of sample size, a substantial number of trials are underpowered or overpowered because of imprecise knowledge of nuisance parameters. Such findings raise questions about how sample size for RCTs should be determined.
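    The retro-fitting idea can be sketched for the continuous-outcome case: plan a two-arm trial with the usual normal-approximation formula, then recompute the power actually achieved when the true standard deviation differs from the one assumed at the design stage. The effect size and SD values below are illustrative, not taken from the study:

```python
from math import ceil, sqrt
from statistics import NormalDist

norm = NormalDist()

def n_per_arm(delta, sigma, alpha=0.05, power=0.80):
    """Normal-approximation sample size per arm for a two-sample comparison
    of means: n = 2 * ((z_{1-a/2} + z_power) * sigma / delta) ** 2."""
    za = norm.inv_cdf(1 - alpha / 2)
    zb = norm.inv_cdf(power)
    return ceil(2 * ((za + zb) * sigma / delta) ** 2)

def real_power(n, delta, sigma_true, alpha=0.05):
    """Power actually achieved when the true SD differs from the assumed one."""
    za = norm.inv_cdf(1 - alpha / 2)
    return norm.cdf(delta / (sigma_true * sqrt(2 / n)) - za)

# Plan for a mean difference of 5 assuming SD = 10 and nominal power 80%,
# then see what happens if the true SD is 20% larger than assumed.
n = n_per_arm(delta=5.0, sigma=10.0)
```

    Here a 20% underestimate of the SD drops the real power from roughly 80% to roughly 65%, which is the kind of silent power loss the record quantifies at scale.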

  13. An application of reactor noise techniques to neutron transport problems in a random medium

    International Nuclear Information System (INIS)

    Sahni, D.C.

    1989-01-01

    Neutron transport problems in a random medium are considered by defining a joint Markov process describing the fluctuations of one neutron population and the random changes in the medium. Backward Chapman-Kolmogorov equations are derived which yield an adjoint transport equation for the average neutron density. It is shown that this average density also satisfies the direct transport equation as given by the phenomenological model. (author)

  14. Efficient sampling techniques for uncertainty quantification in history matching using nonlinear error models and ensemble level upscaling techniques

    KAUST Repository

    Efendiev, Y.

    2009-11-01

    The Markov chain Monte Carlo (MCMC) is a rigorous sampling method to quantify uncertainty in subsurface characterization. However, the MCMC usually requires many flow and transport simulations in evaluating the posterior distribution and can be computationally expensive for fine-scale geological models. We propose a methodology that combines coarse- and fine-scale information to improve the efficiency of MCMC methods. The proposed method employs off-line computations for modeling the relation between coarse- and fine-scale error responses. This relation is modeled using nonlinear functions with prescribed error precisions which are used in efficient sampling within the MCMC framework. We propose a two-stage MCMC where inexpensive coarse-scale simulations are performed to determine whether or not to run the fine-scale (resolved) simulations. The latter is determined on the basis of a statistical model developed off line. The proposed method is an extension of the approaches considered earlier where linear relations are used for modeling the response between coarse-scale and fine-scale models. The approach considered here does not rely on the proximity of approximate and resolved models and can employ much coarser and more inexpensive models to guide the fine-scale simulations. Numerical results for three-phase flow and transport demonstrate the advantages, efficiency, and utility of the method for uncertainty assessment in the history matching. Copyright 2009 by the American Geophysical Union.
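    The two-stage screening described above can be illustrated generically: a candidate must first pass a Metropolis test against a cheap coarse-scale posterior before the expensive fine-scale model is run, and the second-stage test corrects for the screen so the chain still targets the fine-scale posterior exactly (the delayed-acceptance construction). The Gaussian toy densities below stand in for the flow-and-transport simulations and are purely illustrative:

```python
import math
import random

def two_stage_mh(x0, propose, log_post_coarse, log_post_fine, n_iter=1000):
    """Two-stage Metropolis-Hastings with a symmetric proposal: stage 1
    screens candidates with the coarse posterior only; stage 2 runs the
    fine-scale evaluation for survivors and corrects the acceptance so
    the chain samples the fine-scale posterior."""
    x = x0
    lc, lf = log_post_coarse(x), log_post_fine(x)
    chain = [x]
    for _ in range(n_iter):
        y = propose(x)
        lcy = log_post_coarse(y)
        # Stage 1: cheap coarse-scale accept/reject.
        if math.log(random.random()) < lcy - lc:
            lfy = log_post_fine(y)   # expensive solve, run only for survivors
            # Stage 2: fine-scale correction (coarse terms cancel out).
            if math.log(random.random()) < (lfy - lf) - (lcy - lc):
                x, lc, lf = y, lcy, lfy
        chain.append(x)
    return chain

random.seed(42)
fine = lambda x: -(x - 2.0) ** 2 / 2      # "fine" target: N(2, 1)
coarse = lambda x: -(x - 1.8) ** 2 / 2    # cheap, slightly biased screen
chain = two_stage_mh(0.0, lambda x: x + random.gauss(0.0, 1.0),
                     coarse, fine, n_iter=20000)
```

    Even though the coarse screen is biased, the chain still recovers the fine-scale posterior mean; the gain is that rejected proposals never trigger the fine-scale solve.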

  15. Sampling

    CERN Document Server

    Thompson, Steven K

    2012-01-01

    Praise for the Second Edition "This book has never had a competitor. It is the only book that takes a broad approach to sampling . . . any good personal statistics library should include a copy of this book." —Technometrics "Well-written . . . an excellent book on an important subject. Highly recommended." —Choice "An ideal reference for scientific researchers and other professionals who use sampling." —Zentralblatt Math Features new developments in the field combined with all aspects of obtaining, interpreting, and using sample data Sampling provides an up-to-date treat

  16. Characteristics of men with substance use disorder consequent to illicit drug use: comparison of a random sample and volunteers.

    Science.gov (United States)

    Reynolds, Maureen D; Tarter, Ralph E; Kirisci, Levent

    2004-09-06

    Men qualifying for substance use disorder (SUD) consequent to consumption of an illicit drug were compared according to recruitment method. It was hypothesized that volunteers would be more self-disclosing and exhibit more severe disturbances compared to randomly recruited subjects. Personal, demographic, family, social, substance use, psychiatric, and SUD characteristics of volunteers (N = 146) were compared to randomly recruited (N = 102) subjects. Volunteers had lower socioeconomic status, were more likely to be African American, and had lower IQ than randomly recruited subjects. Volunteers also evidenced greater social and family maladjustment and more frequently had received treatment for substance abuse. In addition, lower social desirability response bias was observed in the volunteers. SUD was not more severe in the volunteers; however, they reported a higher lifetime rate of opiate, diet, depressant, and analgesic drug use. Volunteers and randomly recruited subjects qualifying for SUD consequent to illicit drug use are similar in SUD severity but differ in terms of severity of psychosocial disturbance and history of drug involvement. The factors discriminating volunteers and randomly recruited subjects are well known to impact outcome, hence they need to be considered in research design, especially when selecting a sampling strategy in treatment research.

  17. Randomization of grab-sampling strategies for estimating the annual exposure of U miners to Rn daughters.

    Science.gov (United States)

    Borak, T B

    1986-04-01

    Periodic grab sampling in combination with time-of-occupancy surveys has been the accepted procedure for estimating the annual exposure of underground U miners to Rn daughters. Temporal variations in the concentration of potential alpha energy in the mine generate uncertainties in this process. A system to randomize the selection of locations for measurement is described which can reduce uncertainties and eliminate systematic biases in the data. In general, a sample frequency of 50 measurements per year is sufficient to satisfy the criteria that the annual exposure be determined in working level months to within +/- 50% of the true value with a 95% level of confidence. Suggestions for implementing this randomization scheme are presented.
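    A toy version of the randomized grab-sampling estimator might look like the following sketch. The synthetic log-normal day-to-day concentration series stands in for real mine measurements, and the 170 h per working-level month and roughly 2000 h/yr occupancy figures are the usual working-level-month conventions, not values from the record:

```python
import random
import statistics

random.seed(1)

# Hypothetical "true" working-level (WL) concentration for each of 250
# working days; the log-normal day-to-day spread mimics mine-air variability.
true_wl = [random.lognormvariate(-1.0, 0.5) for _ in range(250)]

HOURS_PER_YEAR = 2000   # approximate annual occupancy underground
HOURS_PER_WLM = 170     # 1 WLM = exposure to 1 WL for 170 h

def grab_sample_estimate(series, n=50):
    """Estimate annual exposure (WLM) from n randomly chosen sampling days."""
    sampled = random.sample(series, n)
    return statistics.mean(sampled) * HOURS_PER_YEAR / HOURS_PER_WLM

true_wlm = statistics.mean(true_wl) * HOURS_PER_YEAR / HOURS_PER_WLM
estimate = grab_sample_estimate(true_wl)
```

    With 50 random sampling days the estimate typically lands well within the ±50% criterion quoted in the record; the randomization is what removes systematic bias from always sampling the same shifts or locations.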

  18. Comparison between ultrasound guided technique and digital palpation technique for radial artery cannulation in adult patients: An updated meta-analysis of randomized controlled trials.

    Science.gov (United States)

    Bhattacharjee, Sulagna; Maitra, Souvik; Baidya, Dalim K

    2018-03-22

    Possible advantages and risks associated with ultrasound guided radial artery cannulation in comparison to the digital palpation guided method in adult patients are not fully known. We have compared ultrasound guided radial artery cannulation with the digital palpation technique in this meta-analysis. Meta-analysis of randomized controlled trials. Trials conducted in the operating room, emergency department and cardiac catheterization laboratory. PubMed and the Cochrane Central Register of Controlled Trials (CENTRAL) were searched (from 1946 to 20th November 2017) to identify prospective randomized controlled trials in adult patients. Two-dimensional ultrasound guided radial artery catheterization versus digital palpation guided radial artery cannulation. Overall cannulation success rate, first attempt success rate, time to cannulation and mean number of attempts to successful cannulation. Odds ratio (OR) and standardized mean difference (SMD) or mean difference (MD) with 95% confidence interval (CI) were calculated for categorical and continuous variables respectively. Data of 1895 patients from 10 studies have been included in this meta-analysis. Overall cannulation success rate was similar between the ultrasound guided technique and digital palpation [OR (95% CI) 2.01 (1.00, 4.06); p = 0.05]. Ultrasound guidance was associated with a higher first attempt success rate of radial artery cannulation in comparison to digital palpation [OR (95% CI) 2.76 (1.86, 4.10)]. Radial artery cannulation by ultrasound guidance may increase the first attempt success rate but not the overall cannulation success when compared to the digital palpation technique. However, results of this meta-analysis should be interpreted with caution due to the presence of heterogeneity. Copyright © 2018. Published by Elsevier Inc.

  19. Evaluation of alternative macroinvertebrate sampling techniques for use in a new tropical freshwater bioassessment scheme

    OpenAIRE

    Isabel Eleanor Moore; Kevin Joseph Murphy

    2015-01-01

    Aim: The study aimed to determine the effectiveness of benthic macroinvertebrate dredge net sampling procedures as an alternative method to kick net sampling in tropical freshwater systems, specifically as an evaluation of sampling methods used in the Zambian Invertebrate Scoring System (ZISS) river bioassessment scheme. Tropical freshwater ecosystems are sometimes dangerous or inaccessible to sampling teams using traditional kick-sampling methods, so identifying an alternative procedure that...

  20. Effect of novel inhaler technique reminder labels on the retention of inhaler technique skills in asthma: a single-blind randomized controlled trial.

    Science.gov (United States)

    Basheti, Iman A; Obeidat, Nathir M; Reddel, Helen K

    2017-02-09

    Inhaler technique can be corrected with training, but skills drop off quickly without repeated training. The aim of our study was to explore the effect of novel inhaler technique labels on the retention of correct inhaler technique. In this single-blind randomized parallel-group active-controlled study, clinical pharmacists enrolled asthma patients using controller medication by Accuhaler [Diskus] or Turbuhaler. Inhaler technique was assessed using published checklists (score 0-9). Symptom control was assessed by asthma control test. Patients were randomized into active (ACCa; THa) and control (ACCc; THc) groups. All patients received a "Show-and-Tell" inhaler technique counseling service. Active patients also received inhaler labels highlighting their initial errors. Baseline data were available for 95 patients, 68% females, mean age 44.9 (SD 15.2) years. Mean inhaler scores were ACCa:5.3 ± 1.0; THa:4.7 ± 0.9, ACCc:5.5 ± 1.1; THc:4.2 ± 1.0. Asthma was poorly controlled (mean ACT scores ACCa:13.9 ± 4.3; THa:12.1 ± 3.9; ACCc:12.7 ± 3.3; THc:14.3 ± 3.7). After training, all patients had correct technique (score 9/9). After 3 months, there was significantly less decline in inhaler technique scores for active than control groups (mean difference: Accuhaler -1.04 (95% confidence interval -1.92, -0.16, P = 0.022); Turbuhaler -1.61 (-2.63, -0.59, P = 0.003). Symptom control improved significantly, with no significant difference between active and control patients, but active patients used less reliever medication (active 2.19 (SD 1.78) vs. control 3.42 (1.83) puffs/day, P = 0.002). After inhaler training, novel inhaler technique labels improve retention of correct inhaler technique skills with dry powder inhalers. Inhaler technique labels represent a simple, scalable intervention that has the potential to extend the benefit of inhaler training on asthma outcomes. REMINDER LABELS IMPROVE INHALER TECHNIQUE: Personalized

  1. Random Sampling with Interspike-Intervals of the Exponential Integrate and Fire Neuron: A Computational Interpretation of UP-States.

    Directory of Open Access Journals (Sweden)

    Andreas Steimer

    Full Text Available Oscillations between high and low values of the membrane potential (UP and DOWN states, respectively) are a ubiquitous feature of cortical neurons during slow wave sleep and anesthesia. Nevertheless, surprisingly few quantitative studies have dealt with this phenomenon's implications for computation. Here we present a novel theory that explains on a detailed mathematical level the computational benefits of UP states. The theory is based on random sampling by means of interspike intervals (ISIs) of the exponential integrate and fire (EIF) model neuron, such that each spike is considered a sample whose analog value corresponds to the spike's preceding ISI. As we show, the EIF's exponential sodium current, which kicks in when a noisy membrane potential is balanced around values close to the firing threshold, leads to a particularly simple, approximative relationship between the neuron's ISI distribution and its input current. Approximation quality depends on the frequency spectrum of the current and improves as the voltage baseline is raised towards threshold. Thus, the conceptually simpler leaky integrate and fire neuron, which lacks such an additional current boost, performs consistently worse than the EIF and does not improve when the voltage baseline is increased. For the EIF, in contrast, the presented mechanism is particularly effective in the high-conductance regime, which is a hallmark feature of UP-states. Our theoretical results are confirmed by accompanying simulations, which were conducted for input currents of varying spectral composition. Moreover, we provide analytical estimations of the range of ISI distributions the EIF neuron can sample from at a given approximation level. Such samples may be considered by any algorithmic procedure that is based on random sampling, such as Markov Chain Monte Carlo or message-passing methods. Finally, we explain how spike-based random sampling relates to existing

  2. Random Sampling with Interspike-Intervals of the Exponential Integrate and Fire Neuron: A Computational Interpretation of UP-States.

    Science.gov (United States)

    Steimer, Andreas; Schindler, Kaspar

    2015-01-01

    Oscillations between high and low values of the membrane potential (UP and DOWN states, respectively) are a ubiquitous feature of cortical neurons during slow wave sleep and anesthesia. Nevertheless, surprisingly few quantitative studies have dealt with this phenomenon's implications for computation. Here we present a novel theory that explains on a detailed mathematical level the computational benefits of UP states. The theory is based on random sampling by means of interspike intervals (ISIs) of the exponential integrate and fire (EIF) model neuron, such that each spike is considered a sample whose analog value corresponds to the spike's preceding ISI. As we show, the EIF's exponential sodium current, which kicks in when a noisy membrane potential is balanced around values close to the firing threshold, leads to a particularly simple, approximative relationship between the neuron's ISI distribution and its input current. Approximation quality depends on the frequency spectrum of the current and improves as the voltage baseline is raised towards threshold. Thus, the conceptually simpler leaky integrate and fire neuron, which lacks such an additional current boost, performs consistently worse than the EIF and does not improve when the voltage baseline is increased. For the EIF, in contrast, the presented mechanism is particularly effective in the high-conductance regime, which is a hallmark feature of UP-states. Our theoretical results are confirmed by accompanying simulations, which were conducted for input currents of varying spectral composition. Moreover, we provide analytical estimations of the range of ISI distributions the EIF neuron can sample from at a given approximation level. Such samples may be considered by any algorithmic procedure that is based on random sampling, such as Markov Chain Monte Carlo or message-passing methods. Finally, we explain how spike-based random sampling relates to existing computational
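    A minimal simulation of the EIF neuron whose ISIs serve as samples can be sketched with Euler-Maruyama integration. All parameter values below are generic textbook choices (20 ms membrane time constant, 2 mV spike slope factor, etc.), not those of the paper, and the input drive is expressed as a voltage (R·I):

```python
import math
import random

def eif_isis(I, t_max=5.0, dt=1e-4, seed=0, tau=0.02, EL=-65e-3,
             VT=-50e-3, dT=2e-3, v_peak=-30e-3, v_reset=-65e-3, sigma=2e-3):
    """Euler-Maruyama simulation of a noise-driven exponential
    integrate-and-fire neuron; returns the list of interspike intervals.
    `I` is a constant input drive expressed in volts (R*I)."""
    rng = random.Random(seed)
    v, t, last_spike, isis = EL, 0.0, None, []
    for _ in range(int(t_max / dt)):
        # EIF drift: leak + exponential sodium term + input drive.
        dv = (-(v - EL) + dT * math.exp((v - VT) / dT) + I) / tau
        v += dv * dt + sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0)
        t += dt
        if v >= v_peak:            # spike detected: record the ISI and reset
            if last_spike is not None:
                isis.append(t - last_spike)
            last_spike, v = t, v_reset
    return isis

isis = eif_isis(I=16e-3)   # drive strong enough to cross threshold repeatedly
```

    Each recorded ISI is one "sample" in the sense of the theory; varying the drive reshapes the ISI distribution the neuron effectively samples from.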

  3. Parameters, test criteria and fault assessment in random sampling of waste barrels from non-qualified processes

    International Nuclear Information System (INIS)

    Martens, B.R.

    1989-01-01

    In the context of random sampling tests, the parameters to be checked on the waste barrels and the criteria on which these tests are based are given. It is also shown how faulty data on the properties of the waste, or faulty waste barrels, should be treated. To decide the extent of testing, the properties of the waste relevant to final storage are determined based on the conditioning process used. (DG) [de]

  4. A Randomized Controlled Trial of Mastication with Complete Dentures Made by a Conventional or an Abbreviated Technique.

    Science.gov (United States)

    Mengatto, Cristiane Machado; Gameiro, Gustavo Hauber; Brondani, Mario; Owen, C Peter; MacEntee, Michael I

    The aim of this randomized clinical trial was to test the hypothesis that there are no statistically significant differences after 3 and 6 months in masticatory performance or chewing ability of people with new complete dentures made by an abbreviated or a conventional technique. The trial included 20 edentulous participants at a dental school in Brazil, assigned randomly to receive dentures made by either a conventional technique involving six clinical sessions or by an abbreviated technique involving three clinical sessions. At baseline with old dentures and at 3 and 6 months with new dentures, masticatory performance was measured by counting the number of chewing strokes and the time before participants had an urge to swallow and by calculating the medium particle size of a silicone material after 20 chewing strokes and at the urge to swallow. On each occasion, the participants recorded on visual analog scales their ability to chew five food textures. Statistical significance (P ≤ .05) of changes in masticatory performance and chewing ability during the trial was analyzed with generalized estimating equations. Both techniques improved masticatory performance between baseline and 6 months and the ability to bite and chew all foods apart from hard apples. There were no significant differences in masticatory performance or chewing ability after 6 months between complete dentures made by a conventional or an abbreviated technique.

  5. Random or systematic sampling to detect a localised microbial contamination within a batch of food

    NARCIS (Netherlands)

    Jongenburger, I.; Reij, M.W.; Boer, E.P.J.; Gorris, L.G.M.; Zwietering, M.H.

    2011-01-01

    Pathogenic microorganisms are known to be distributed heterogeneously in food products that are solid, semi-solid or powdered, such as peanut butter, cereals, or powdered milk. This complicates effective detection of the pathogens by sampling. Two-class sampling plans, which are deployed
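    The random-versus-systematic question for a localised contamination can be explored with a small Monte-Carlo sketch in which a contiguous contaminated cluster is hidden at a random position in a batch of units; all sizes below are illustrative, not taken from the record:

```python
import random

def detection_prob(n_units=1000, cluster=50, n_samples=10,
                   trials=2000, systematic=False, seed=0):
    """Monte-Carlo probability that a sampling plan hits a contiguous
    contaminated cluster hidden at a random position within the batch."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        start = rng.randrange(n_units - cluster)   # cluster = [start, start+cluster)
        if systematic:
            step = n_units // n_samples            # evenly spaced, random offset
            picks = range(rng.randrange(step), n_units, step)
        else:
            picks = rng.sample(range(n_units), n_samples)
        hits += any(start <= i < start + cluster for i in picks)
    return hits / trials
```

    For a cluster of 50 units in a batch of 1000 with 10 samples, the evenly spaced plan detects the cluster whenever its span covers one of the sampling positions (probability 50/100 = 0.5), whereas simple random sampling detects it only about 40% of the time, which is the kind of contrast the study quantifies for real batches.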

  6. Conditional estimation of exponential random graph models from snowball sampling designs

    NARCIS (Netherlands)

    Pattison, Philippa E.; Robins, Garry L.; Snijders, Tom A. B.; Wang, Peng

    2013-01-01

    A complete survey of a network in a large population may be prohibitively difficult and costly. So it is important to estimate models for networks using data from various network sampling designs, such as link-tracing designs. We focus here on snowball sampling designs, designs in which the members

  7. Inferences about Variance Components and Reliability-Generalizability Coefficients in the Absence of Random Sampling.

    Science.gov (United States)

    Kane, Michael

    2002-01-01

    Reviews the criticisms of sampling assumptions in generalizability theory (and in reliability theory) and examines the feasibility of using representative sampling, stratification, homogeneity assumptions, and replications to address these criticisms. Suggests some general outlines for the conduct of generalizability theory studies. (SLD)

  8. Comparison of the efficacy of two anesthetic techniques of mandibular primary first molar: A randomized clinical trial

    Directory of Open Access Journals (Sweden)

    Davood Ghasemi Tudeshchoie

    2013-01-01

    Full Text Available Background: The most common technique to anesthetize mandibular primary teeth is the inferior alveolar (I.A) nerve block injection, which induces a relatively sustained anesthesia and in turn may traumatize soft tissues. Therefore, an alternative technique of anesthesia with a shorter duration but the same efficacy would be desirable. The aim of this study was to compare the efficacy of two anesthetic techniques for the mandibular primary first molar. Materials and Methods: In this randomized crossover clinical trial, 40 children aged from 5 to 8 years whose mandibular primary first molars were eligible for pulpotomy were selected and divided randomly into two groups. The right and left mandibular first molars of group A were anesthetized with infiltration and I.A nerve block techniques in the first and second sessions, respectively. The left and right mandibular first molars of group B were anesthetized with I.A nerve block and infiltration techniques in the first and second sessions, respectively. The severity of pain was measured and recorded according to the sound-eye-motor scale by the same examiner. Data were analyzed using Wilcoxon Signed Rank and Mann-Whitney U tests (P < 0.05). Results: The severity of pain was lower with the infiltration technique than with the I.A nerve block. There were no significant differences in the severity of pain on pulpal exposure between the two techniques. Conclusion: It seems that the infiltration technique is more favorable for anesthetizing the mandibular primary first molar compared to the I.A nerve block.

  9. Effect of apical clearing technique on the treatment outcome of teeth with asymptomatic apical periodontitis: A randomized clinical trial

    OpenAIRE

    Priya Mittal; Ajay Logani; Naseem Shah; R M Pandey

    2016-01-01

    Aim: This study aims to compare the periapical healing of teeth with asymptomatic apical periodontitis treated either by conventional apical preparation (CAP) or the apical clearing technique (ACT). Materials and Methods: Twenty subjects with bilateral nonvital similar teeth exhibiting comparable periapical index (PAI) scores were enrolled and randomly allocated. Group I (CAP, n = 20): Apical preparation three sizes greater (master apical file [MAF]) than the first binding file at the establis...

  10. Gamma self-shielding correction factors calculation for aqueous bulk sample analysis by PGNAA technique

    International Nuclear Information System (INIS)

    Nasrabadi, M.N.; Mohammadi, A.; Jalali, M.

    2009-01-01

    In this paper bulk sample prompt gamma neutron activation analysis (BSPGNAA) was applied to aqueous sample analysis using a relative method. For elemental analysis of an unknown bulk sample, gamma self-shielding coefficient was required. Gamma self-shielding coefficient of unknown samples was estimated by an experimental method and also by MCNP code calculation. The proposed methodology can be used for the determination of the elemental concentration of unknown aqueous samples by BSPGNAA where knowledge of the gamma self-shielding within the sample volume is required.
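    A common first-order model for the gamma self-shielding of a uniform slab-shaped sample is the factor f = (1 − e^(−μt))/(μt), the flux-averaged transmission through the sample thickness. The sketch below implements this textbook approximation, not the experimental or MCNP-based estimates used in the paper:

```python
import math

def slab_self_shielding(mu, thickness):
    """Mean gamma self-shielding (self-absorption) factor for a uniform
    slab sample: f = (1 - exp(-mu * t)) / (mu * t), where mu is the
    linear attenuation coefficient (cm^-1) and t the thickness (cm)."""
    x = mu * thickness
    return (1.0 - math.exp(-x)) / x if x > 0 else 1.0
```

    The factor tends to 1 for thin or weakly absorbing samples and falls toward 1/(μt) for optically thick ones, which is why the correction matters most for bulk aqueous samples.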

  11. Laser-Assisted Sampling Techniques in Combination with ICP-MS: A Novel Approach for Particle Analysis at the IAEA Environmental Samples Laboratory

    International Nuclear Information System (INIS)

    Dzigal, N.; Chinea-Cano, E.

    2015-01-01

    Researchers have found many applications for lasers. About two decades ago, scientists started using lasers as sample introduction instruments for mass spectrometry measurements. Similarly, lasers have been increasingly in demand as micro-dissection tools in the fields of life sciences, materials science, forensics, etc. This presentation deals with the application of these laser-assisted techniques to the field of particle analysis. Historically, nanosecond lasers have been used in materials science to ablate material. Recently, it has been proven that in the analysis of particulate materials the disadvantages associated with the use of nanosecond lasers, such as overheating and melting of the sample, are suppressed when using femtosecond lasers. Further, due to the short duration of a single laser shot, fs-LA allows a more controlled ablation, so the sample plasma is more homogeneous and fewer mass-fractionation events are detected. The use of laser micro-dissection devices enables the physical segmentation of micro-sized artefacts, previously performed by a laborious manual procedure. By combining the precision of laser cutting inherent to the LMD technique with a particle identification methodology, one can increase the efficiency of single particle isolation. Further, besides increasing the throughput of analyses, this combination enhances the signal-to-noise ratio by effectively removing matrix particles. Specifically, this contribution describes the use of an Olympus+MMI laser microdissection device to improve the sample preparation of environmental swipe samples, and the installation of an Applied Spectra J200 fs-LA/LIBS (laser ablation/laser induced breakdown spectroscopy) system as a sample introduction device for a quadrupole mass spectrometer, the iCap Q from Thermo Fisher Scientific, at the IAEA Environmental Samples Laboratory. Preliminary results of the ongoing efforts for the

  12. Estimation of the Coefficient of Restitution of Rocking Systems by the Random Decrement Technique

    DEFF Research Database (Denmark)

    Brincker, Rune; Demosthenous, M.; Manos, G. C.

    The aim of this paper is to investigate the possibility of estimating an average damping parameter for a rocking system due to impact, the so-called coefficient of restitution, from the random response, i.e. when the loads are random and unknown, and the response is measured. The objective is to ...... of freedom system loaded by white noise, estimating the coefficient of restitution as explained, and comparing the estimates with the value used in the simulations. Several estimates for the coefficient of restitution are considered, and reasonable results are achieved....
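    At its core, the random decrement technique reduces to triggered averaging of the measured response: segments starting wherever the signal satisfies a triggering condition are averaged, so the random part cancels and the free-decay signature remains. A minimal sketch under a simple level-up-crossing trigger (one of several standard triggering conditions) is:

```python
import random

def random_decrement(x, trigger, length):
    """Random decrement signature: the average of all response segments
    that start where the signal up-crosses the trigger level."""
    segments = [x[i:i + length] for i in range(1, len(x) - length)
                if x[i - 1] < trigger <= x[i]]
    n = len(segments)
    signature = [sum(seg[k] for seg in segments) / n for k in range(length)]
    return signature, n

# Demonstration on seeded white noise: the signature starts at (or just
# above) the trigger level and averages toward zero at later lags.
random.seed(3)
noise = [random.gauss(0.0, 1.0) for _ in range(5000)]
signature, n_segments = random_decrement(noise, trigger=1.0, length=20)
```

    For a white-noise-driven structure the same procedure applied to the measured response yields a decaying free-vibration curve, from which damping parameters such as the coefficient of restitution can then be estimated.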

  13. Sample-interpolation timing: an optimized technique for the digital measurement of time of flight for γ rays and neutrons at relatively low sampling rates

    International Nuclear Information System (INIS)

    Aspinall, M D; Joyce, M J; Mackin, R O; Jarrah, Z; Boston, A J; Nolan, P J; Peyton, A J; Hawkes, N P

    2009-01-01

    A unique, digital time pick-off method, known as sample-interpolation timing (SIT), is described. This method demonstrates the possibility of improved timing resolution for the digital measurement of time of flight compared with digital replica-analogue time pick-off methods for signals sampled at relatively low rates. Three analogue timing methods have been replicated in the digital domain (leading-edge, crossover and constant-fraction timing) for pulse data sampled at 8 GSa s⁻¹. Events arising from the ⁷Li(p,n)⁷Be reaction have been detected with an EJ-301 organic liquid scintillator and recorded with a fast digital sampling oscilloscope. Sample-interpolation timing was developed solely for the digital domain and thus performs more efficiently on digital signals compared with analogue time pick-off methods replicated digitally, especially for fast signals that are sampled at rates that current affordable and portable devices can achieve. Sample interpolation can be applied to any analogue timing method replicated digitally and thus also has the potential to exploit the generic capabilities of analogue techniques with the benefits of operating in the digital domain. A threshold in sampling rate with respect to the signal pulse width is observed beyond which further improvements in timing resolution are not attained. This advance is relevant to many applications in which time-of-flight measurement is essential
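    The basic idea of interpolating between samples for sub-sample time resolution can be illustrated with a leading-edge pick-off: linearly interpolate between the two samples that bracket the threshold crossing. This is a generic sketch in the spirit of the method, not the authors' SIT algorithm itself:

```python
def crossing_time(samples, threshold, dt):
    """Sub-sample leading-edge time pick-off: linearly interpolate between
    the two samples that bracket the first upward threshold crossing.
    `dt` is the sampling interval; returns None if never crossed."""
    for i in range(1, len(samples)):
        a, b = samples[i - 1], samples[i]
        if a < threshold <= b:
            return (i - 1 + (threshold - a) / (b - a)) * dt
    return None   # pulse never reached the threshold
```

    For example, with samples [0, 1, 3, 5], a threshold of 2 and dt = 1, the crossing is placed halfway between the second and third samples, at t = 1.5, rather than being quantized to a whole sample index.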

  14. Determination of structure of oriented samples using two-dimensional solid state NMR techniques

    International Nuclear Information System (INIS)

    Jin Hong; Harbison, G.S.

    1990-01-01

    One-dimensional and two-dimensional MAS techniques can give detailed information about the structure and dynamics of oriented systems. We describe the application of such techniques to the liquid-crystalline polymer poly(p-phenylene terephthalamide) (PPTA), and thence deduce the solid-state structure of the material. (author). 9 refs.; 6 figs

  15. ANALYTICAL TECHNIQUES FOR THE DETERMINATION OF MELOXICAM IN PHARMACEUTICAL FORMULATIONS AND BIOLOGICAL SAMPLES

    Directory of Open Access Journals (Sweden)

    Aisha Noreen

    2016-06-01

    Meloxicam (MX) belongs to the family of oxicams, the most important group of non-steroidal anti-inflammatory drugs (NSAIDs), and is widely used for its analgesic and antipyretic activities. It inhibits both COX-I and COX-II enzymes with less gastric and local tissue irritation. A number of analytical techniques have been used for the determination of MX in pharmaceutical formulations as well as in biological fluids. These techniques include titrimetry, spectrometry, chromatography, flow injection spectrometry, fluorescence spectrometry, capillary zone electrophoresis and electrochemical techniques. Many of these techniques have also been used for the simultaneous determination of MX with other compounds. A comprehensive review of these analytical techniques is presented, which could be useful for analytical chemists and quality control pharmacists.

  16. Improvement of Frequency Domain Output Only Modal Identification from the Application of the Random Decrement Technique

    DEFF Research Database (Denmark)

    Rodrigues, J.; Brincker, Rune; Andersen, P.

    2004-01-01

    This paper explores the idea of estimating the spectral densities as the Fourier transform of the random decrement functions for the application of frequency domain output-only modal identification methods. The gains in relation to the usual procedure of computing the spectral densities directly...
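As a rough illustration of the idea, the random decrement function is an average of response segments starting at a triggering condition, and its Fourier transform then serves as the spectral estimate. A sketch on synthetic data (the up-crossing trigger choice and the signal model are assumptions, not the paper's):

```python
import numpy as np

def random_decrement(x, trigger_level, length):
    """Average all segments of the response that begin at an up-crossing
    of the trigger level: the random decrement signature."""
    starts = np.where((x[:-1] < trigger_level) & (x[1:] >= trigger_level))[0] + 1
    starts = starts[starts + length <= len(x)]
    return np.stack([x[i:i + length] for i in starts]).mean(axis=0)

rng = np.random.default_rng(2)
# A lightly damped oscillatory response to white noise (illustrative surrogate):
x = np.convolve(rng.normal(size=20000),
                np.exp(-np.arange(300) / 60.0) * np.cos(0.3 * np.arange(300)),
                mode="full")[:20000]
signature = random_decrement(x, trigger_level=x.std(), length=256)
spectrum = np.abs(np.fft.rfft(signature))  # spectral density estimate via FFT
```

The payoff discussed in the paper is that `spectrum` can be fed to frequency-domain output-only identification in place of a directly computed spectral density.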

  17. Photon event distribution sampling: an image formation technique for scanning microscopes that permits tracking of sub-diffraction particles with high spatial and temporal resolutions.

    Science.gov (United States)

    Larkin, J D; Publicover, N G; Sutko, J L

    2011-01-01

    In photon event distribution sampling, an image formation technique for scanning microscopes, the maximum likelihood position of origin of each detected photon is acquired as a data set rather than binning photons in pixels. Subsequently, an intensity-related probability density function describing the uncertainty associated with the photon position measurement is applied to each position and individual photon intensity distributions are summed to form an image. Compared to pixel-based images, photon event distribution sampling images exhibit increased signal-to-noise and comparable spatial resolution. Photon event distribution sampling is superior to pixel-based image formation in recognizing the presence of structured (non-random) photon distributions at low photon counts and permits use of non-raster scanning patterns. A photon event distribution sampling based method for localizing single particles derived from a multi-variate normal distribution is more precise than statistical (Gaussian) fitting to pixel-based images. Using the multi-variate normal distribution method, non-raster scanning and a typical confocal microscope, localizations with 8 nm precision were achieved at 10 ms sampling rates with acquisition of ~200 photons per frame. Single nanometre precision was obtained with a greater number of photons per frame. In summary, photon event distribution sampling provides an efficient way to form images when low numbers of photons are involved and permits particle tracking with confocal point-scanning microscopes with nanometre precision deep within specimens. © 2010 The Authors Journal of Microscopy © 2010 The Royal Microscopical Society.
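The kernel-summing idea can be sketched with a Gaussian uncertainty kernel and synthetic photon positions (an illustrative toy, not the authors' intensity-related probability density function):

```python
import numpy as np

def ped_image(xs, ys, grid, sigma):
    """Sum a Gaussian position-uncertainty kernel at each detected photon's
    maximum-likelihood position instead of binning photons into pixels."""
    gx, gy = np.meshgrid(grid, grid)
    img = np.zeros_like(gx)
    for x, y in zip(xs, ys):
        img += np.exp(-((gx - x) ** 2 + (gy - y) ** 2) / (2.0 * sigma ** 2))
    return img

rng = np.random.default_rng(0)
xs = rng.normal(0.0, 0.05, 200)  # ~200 photons from one sub-diffraction emitter
ys = rng.normal(0.0, 0.05, 200)
img = ped_image(xs, ys, np.linspace(-0.5, 0.5, 101), sigma=0.05)
# The centroid of the photon positions localizes the emitter far more
# precisely than the kernel width (multi-variate normal style estimate):
est_x, est_y = xs.mean(), ys.mean()
```

Because each photon carries its own position, no raster pixel grid is needed at acquisition time, which is what frees the method to use non-raster scanning patterns.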

  18. Sample application of sensitivity/uncertainty analysis techniques to a groundwater transport problem. National Low-Level Waste Management Program

    International Nuclear Information System (INIS)

    Seitz, R.R.; Rood, A.S.; Harris, G.A.; Maheras, S.J.; Kotecki, M.

    1991-06-01

    The primary objective of this document is to provide sample applications of selected sensitivity and uncertainty analysis techniques within the context of the radiological performance assessment process. These applications were drawn from the companion document Guidelines for Sensitivity and Uncertainty Analyses of Low-Level Radioactive Waste Performance Assessment Computer Codes (S. Maheras and M. Kotecki, DOE/LLW-100, 1990). Three techniques are illustrated in this document: one-factor-at-a-time (OFAT) analysis, fractional factorial design, and Latin hypercube sampling. The report also illustrates the differences in sensitivity and uncertainty analysis at the early and latter stages of the performance assessment process, and potential pitfalls that can be encountered when applying the techniques. The emphasis is on application of the techniques as opposed to the actual results, since the results are hypothetical and are not based on site-specific conditions
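Of the three techniques named, Latin hypercube sampling is the most compact to demonstrate; a minimal sketch (independent of the report's codes) that places exactly one point in each stratum of every input dimension:

```python
import numpy as np

def latin_hypercube(n_samples, n_dims, rng):
    """Latin hypercube sample on [0, 1)^d: each dimension is divided into
    n_samples equal strata, and each stratum is sampled exactly once."""
    jitter = rng.random((n_samples, n_dims))  # random position within a stratum
    strata = np.column_stack([rng.permutation(n_samples) for _ in range(n_dims)])
    return (strata + jitter) / n_samples

lhs = latin_hypercube(10, 3, np.random.default_rng(1))
# Every column hits each of the 10 strata [k/10, (k+1)/10) exactly once.
```

The stratification guarantees full coverage of each input's range with far fewer runs than a factorial design, which is why it suits expensive performance-assessment codes.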

  19. Random Walks on Directed Networks: Inference and Respondent-Driven Sampling

    Directory of Open Access Journals (Sweden)

    Malmros Jens

    2016-06-01

    Respondent-driven sampling (RDS) is often used to estimate population properties (e.g., sexual risk behavior) in hard-to-reach populations. In RDS, already sampled individuals recruit population members to the sample from their social contacts in an efficient snowball-like sampling procedure. By assuming a Markov model for the recruitment of individuals, asymptotically unbiased estimates of population characteristics can be obtained. Current RDS estimation methodology assumes that the social network is undirected, that is, that all edges are reciprocal. However, empirical social networks in general also include a substantial number of nonreciprocal edges. In this article, we develop an estimation method for RDS in populations connected by social networks that include reciprocal and nonreciprocal edges. We derive estimators of the selection probabilities of individuals as a function of the number of outgoing edges of sampled individuals. The proposed estimators are evaluated on artificial and empirical networks and are shown to generally perform better than existing estimators. This is the case in particular when the fraction of directed edges in the network is large.
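The core of RDS inference is reweighting by selection probability, which on an undirected network is proportional to degree. A toy Hansen-Hurwitz-style sketch with made-up values (the article's estimators refine the weights to use outgoing edges on directed networks):

```python
def rds_mean_estimate(values, degrees):
    """Estimate a population mean from an RDS sample by weighting each
    respondent inversely to their degree, since a random walk on a network
    visits nodes roughly in proportion to degree."""
    weights = [1.0 / d for d in degrees]
    return sum(w * v for w, v in zip(weights, values)) / sum(weights)

# Hypothetical binary trait values and reported numbers of social contacts:
est = rds_mean_estimate([1, 0, 1, 1, 0], [10, 2, 5, 8, 4])
# The unweighted mean (0.6) overstates the trait here because high-degree,
# trait-positive respondents are oversampled; the weighted estimate is lower.
```

Replacing the degree with the out-degree in the weights is the natural starting point for the directed-network setting the article develops.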

  20. Comparison of coarse coal dust sampling techniques in a laboratory-simulated longwall section.

    Science.gov (United States)

    Patts, Justin R; Barone, Teresa L

    2017-05-01

    Airborne coal dust generated during mining can deposit and accumulate on mine surfaces, presenting a dust explosion hazard. When assessing dust hazard mitigation strategies for airborne dust reduction, sampling is done in high-velocity ventilation air, which is used to purge the mining face and gallery tunnel. In this environment, the sampler inlet velocity should be matched to the air stream velocity (isokinetic sampling) to prevent oversampling of coarse dust at low sampler-to-air velocity ratios. Low velocity ratios are often encountered when using low flow rate, personal sampling pumps commonly used in underground mines. In this study, with a goal of employing mine-ready equipment, a personal sampler was adapted for area sampling of coarse coal dust in high-velocity ventilation air. This was done by adapting an isokinetic nozzle to the inlet of an Institute of Occupational Medicine (Edinburgh, Scotland) sampling cassette (IOM). Collected dust masses were compared for the modified IOM isokinetic sampler (IOM-MOD), the IOM without the isokinetic nozzle, and a conventional dust sampling cassette without the cyclone on the inlet. All samplers were operated at a flow rate typical of personal sampling pumps: 2 L/min. To ensure differences between collected masses that could be attributed to sampler design and were not influenced by artifacts from dust concentration gradients, relatively uniform and repeatable dust concentrations were demonstrated in the sampling zone of the National Institute for Occupational Safety and Health experimental mine gallery. Consistent with isokinetic theory, greater differences between isokinetic and non-isokinetic sampled masses were found for larger dust volume-size distributions and higher ventilation air velocities. 
Since isokinetic sampling is conventionally used to determine total dust concentration, and isokinetic sampling made a difference in collected masses, the results suggest when sampling for coarse coal dust the IOM-MOD may
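Isokinetic matching fixes the nozzle bore once the pump flow and duct velocity are known: the inlet area must equal the volumetric flow divided by the air velocity. A sketch of that sizing arithmetic (the numbers are illustrative; the study's actual nozzle dimensions are not given here):

```python
import math

def isokinetic_nozzle_diameter(flow_lpm, air_velocity):
    """Nozzle inner diameter (m) that matches sampler inlet velocity to the
    duct air velocity (m/s) for a pump drawing flow_lpm litres per minute."""
    q = flow_lpm * 1e-3 / 60.0   # volumetric flow, m^3/s
    area = q / air_velocity      # required inlet area, m^2
    return math.sqrt(4.0 * area / math.pi)

d = isokinetic_nozzle_diameter(2.0, 2.0)  # 2 L/min personal pump in 2 m/s air
# d is roughly 4.6 mm; higher ventilation velocities call for smaller nozzles.
```

This is why a fixed 2 L/min personal pump needs a purpose-sized inlet nozzle before it can sample coarse dust representatively in high-velocity ventilation air.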

  1. Random sampling of the Central European bat fauna reveals the existence of numerous hitherto unknown adenoviruses.

    Science.gov (United States)

    Vidovszky, Márton; Kohl, Claudia; Boldogh, Sándor; Görföl, Tamás; Wibbelt, Gudrun; Kurth, Andreas; Harrach, Balázs

    2015-12-01

    From over 1250 extant species of the order Chiroptera, 25 and 28 are known to occur in Germany and Hungary, respectively. Close to 350 samples originating from 28 bat species (17 from Germany, 27 from Hungary) were screened for the presence of adenoviruses (AdVs) using a nested PCR that targets the DNA polymerase gene of AdVs. An additional PCR was designed and applied to amplify a fragment from the gene encoding the IVa2 protein of mastadenoviruses. All German samples originated from organs of bats found moribund or dead. The Hungarian samples were excrements collected from colonies of known bat species, throat or rectal swab samples, taken from live individuals that had been captured for faunistic surveys and migration studies, as well as internal organs of dead specimens. Overall, 51 samples (14.73%) were found positive. We detected 28 seemingly novel and six previously described bat AdVs by sequencing the PCR products. The positivity rate was the highest among the guano samples of bat colonies. In phylogeny reconstructions, the AdVs detected in bats clustered roughly, but not perfectly, according to the hosts' families (Vespertilionidae, Rhinolophidae, Hipposideridae, Phyllostomidae and Pteropodidae). In a few cases, identical sequences were derived from animals of closely related species. On the other hand, some bat species proved to harbour more than one type of AdV. The high prevalence of infection and the large number of chiropteran species worldwide make us hypothesise that hundreds of different yet unknown AdV types might circulate in bats.

  2. Associations Among Religiousness and Community Volunteerism in National Random Samples of American Adults.

    Science.gov (United States)

    Haggard, Megan C; Kang, Linda L; Rowatt, Wade C; Shen, Megan Johnson

    2015-01-01

    The connection between religiousness and volunteering for the community can be explained through two distinct features of religion. First, religious organizations are social groups that encourage members to help others through planned opportunities. Second, helping others is regarded as an important value for members in religious organizations to uphold. We examined the relationship between religiousness and self-reported community volunteering in two independent national random surveys of American adults (i.e., the 2005 and 2007 waves of the Baylor Religion Survey). In both waves, frequency of religious service attendance was associated with an increase in likelihood that individuals would volunteer, whether through their religious organization or not, whereas frequency of reading sacred texts outside of religious services was associated with an increase in likelihood of volunteering only for or through their religious organization. The role of religion in community volunteering is discussed in light of these findings.

  3. Experimental-calculation technique for Ksub(IC) determination using the samples of decreased dimensions

    International Nuclear Information System (INIS)

    Vinokurov, V.A.; Dymshits, A.V.; Pirusskij, M.V.; Ovsyannikov, B.M.; Kononov, V.V.

    1981-01-01

    A possibility to decrease the size of samples necessary for the reliable determination of fracture toughness Ksub(IC) is established. The dependences of crack-resistance characteristics on the sample dimensions are determined experimentally. The static bending tests are made using the 1251 model of the ''Instron'' installation with a specially designed device. Samples of 20KhNMF steel have been tested. It is shown that the Ksub(IC) value determined for the samples with the largest net cross-section (50x100 mm) is considerably lower than the Ksub(IC) values determined for the samples of decreased sizes. It is shown that the developed experimental-calculation method of Ksub(IC) determination can be practically used for samples of decreased sizes with the introduction of a corresponding correction coefficient [ru

  4. Electromembrane extraction as a rapid and selective miniaturized sample preparation technique for biological fluids

    DEFF Research Database (Denmark)

    Gjelstad, Astrid; Pedersen-Bjergaard, Stig; Seip, Knut Fredrik

    2015-01-01

    This special report discusses the sample preparation method electromembrane extraction, which was introduced in 2006 as a rapid and selective miniaturized extraction method. The extraction principle is based on isolation of charged analytes extracted from an aqueous sample, across a thin film....... Technical aspects of electromembrane extraction, important extraction parameters as well as a handful of examples of applications from different biological samples and bioanalytical areas are discussed in the paper....

  5. Demonstrating Reliable High Level Waste Slurry Sampling Techniques to Support Hanford Waste Processing

    Energy Technology Data Exchange (ETDEWEB)

    Kelly, Steven E.

    2013-11-11

    The Hanford Tank Operations Contractor (TOC) and the Hanford Waste Treatment and Immobilization Plant (WTP) contractor are both engaged in demonstrating mixing, sampling, and transfer system capability using simulated Hanford High-Level Waste (HLW) formulations. This work represents one of the remaining technical issues with the high-level waste treatment mission at Hanford. The TOC must demonstrate the ability to adequately mix and sample high-level waste feed to meet the WTP Waste Acceptance Criteria and Data Quality Objectives. The sampling method employed must support both TOC and WTP requirements. To facilitate information transfer between the two facilities the mixing and sampling demonstrations are led by the One System Integrated Project Team. The One System team, Waste Feed Delivery Mixing and Sampling Program, has developed a full scale sampling loop to demonstrate sampler capability. This paper discusses the full scale sampling loop's ability to meet precision and accuracy requirements, including lessons learned during testing. Results of the testing showed that the Isolok(R) sampler chosen for implementation provides precise, repeatable results. The Isolok(R) sampler accuracy as tested did not meet test success criteria. Review of test data and the test platform following testing by a sampling expert identified several issues regarding the sampler used to provide reference material used to judge the Isolok's accuracy. Recommendations were made to obtain new data to evaluate the sampler's accuracy utilizing a reference sampler that follows good sampling protocol.

  6. Demonstrating Reliable High Level Waste Slurry Sampling Techniques to Support Hanford Waste Processing

    International Nuclear Information System (INIS)

    Kelly, Steven E.

    2013-01-01

    The Hanford Tank Operations Contractor (TOC) and the Hanford Waste Treatment and Immobilization Plant (WTP) contractor are both engaged in demonstrating mixing, sampling, and transfer system capability using simulated Hanford High-Level Waste (HLW) formulations. This work represents one of the remaining technical issues with the high-level waste treatment mission at Hanford. The TOC must demonstrate the ability to adequately mix and sample high-level waste feed to meet the WTP Waste Acceptance Criteria and Data Quality Objectives. The sampling method employed must support both TOC and WTP requirements. To facilitate information transfer between the two facilities the mixing and sampling demonstrations are led by the One System Integrated Project Team. The One System team, Waste Feed Delivery Mixing and Sampling Program, has developed a full scale sampling loop to demonstrate sampler capability. This paper discusses the full scale sampling loop's ability to meet precision and accuracy requirements, including lessons learned during testing. Results of the testing showed that the Isolok(R) sampler chosen for implementation provides precise, repeatable results. The Isolok(R) sampler accuracy as tested did not meet test success criteria. Review of test data and the test platform following testing by a sampling expert identified several issues regarding the sampler used to provide reference material used to judge the Isolok's accuracy. Recommendations were made to obtain new data to evaluate the sampler's accuracy utilizing a reference sampler that follows good sampling protocol

  7. Re-estimating sample size in cluster randomized trials with active recruitment within clusters

    NARCIS (Netherlands)

    van Schie, Sander; Moerbeek, Mirjam

    2014-01-01

    Often only a limited number of clusters can be obtained in cluster randomised trials, although many potential participants can be recruited within each cluster. Thus, active recruitment is feasible within the clusters. To obtain an efficient sample size in a cluster randomised trial, the cluster

  8. Computer simulation of stochastic processes through model-sampling (Monte Carlo) techniques.

    Science.gov (United States)

    Sheppard, C W.

    1969-03-01

    A simple Monte Carlo simulation program is outlined which can be used for the investigation of random-walk problems, for example in diffusion, or the movement of tracers in the blood circulation. The results given by the simulation are compared with those predicted by well-established theory, and it is shown how the model can be expanded to deal with drift, and with reflexion from or adsorption at a boundary.
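A sketch of such a simulation, an unbiased one-dimensional walk checked against the standard theoretical mean squared displacement (illustrative only; the original program also handled drift and boundary conditions):

```python
import random

def random_walk_msd(n_walkers, n_steps, step=1.0, seed=42):
    """Simulate 1-D unbiased random walks and return the mean squared
    displacement, which theory predicts to equal n_steps * step**2."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_walkers):
        x = 0.0
        for _ in range(n_steps):
            x += step if rng.random() < 0.5 else -step
        total += x * x
    return total / n_walkers

msd = random_walk_msd(5000, 100)  # should lie close to the theoretical 100
```

Drift is added by biasing the step probability away from 0.5, and a reflecting boundary by clamping `x`, exactly the extensions the abstract mentions.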

  9. FPRandom: Randomizing core browser objects to break advanced device fingerprinting techniques

    OpenAIRE

    Laperdrix , Pierre; Baudry , Benoit; Mishra , Vikas

    2017-01-01

    The rich programming interfaces (APIs) provided by web browsers can be diverted to collect a browser fingerprint. A small number of queries on these interfaces are sufficient to build a fingerprint that is statistically unique and very stable over time. Consequently, the fingerprint can be used to track users. Our work aims at mitigating the risk of browser fingerprinting for users' privacy by 'breaking' the stability of a fingerprint over time. We add randomness in the...

  10. A unified development of several techniques for the representation of random vectors and data sets

    Science.gov (United States)

    Bundick, W. T.

    1973-01-01

    Linear vector space theory is used to develop a general representation of a set of data vectors or random vectors by linear combinations of orthonormal vectors such that the mean squared error of the representation is minimized. The orthonormal vectors are shown to be the eigenvectors of an operator. The general representation is applied to several specific problems involving the use of the Karhunen-Loeve expansion, principal component analysis, and empirical orthogonal functions; and the common properties of these representations are developed.
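In the finite-dimensional case the operator is the sample covariance matrix, and the optimal orthonormal vectors are its leading eigenvectors. A minimal sketch of this Karhunen-Loeve/principal-component representation (the data and the helper name are assumptions for illustration):

```python
import numpy as np

def kl_basis(data, k):
    """Return the k orthonormal eigenvectors of the sample covariance that
    minimize mean squared reconstruction error (Karhunen-Loeve basis)."""
    centered = data - data.mean(axis=0)
    cov = centered.T @ centered / len(data)
    evals, evecs = np.linalg.eigh(cov)       # eigenvalues in ascending order
    order = np.argsort(evals)[::-1]
    return evecs[:, order[:k]]

rng = np.random.default_rng(0)
# Data concentrated along one direction; the first basis vector recovers it:
data = (rng.normal(size=(500, 1)) @ np.array([[3.0, 1.0, 0.0]])
        + 0.1 * rng.normal(size=(500, 3)))
basis = kl_basis(data, 1)
```

Principal component analysis and empirical orthogonal functions are this same eigendecomposition applied to data matrices and to spatio-temporal fields, respectively, which is the common property the paper develops.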

  11. Behavioural sampling techniques and activity pattern of Indian Pangolin Manis crassicaudata (Mammalia: Manidae in captivity

    Directory of Open Access Journals (Sweden)

    R.K. Mohapatra

    2013-12-01

    The study presents data on six Indian Pangolins Manis crassicaudata observed in captivity at the Pangolin Conservation Breeding Centre, Nandankanan, Odisha, India, over 1377 hours of video recordings for each pangolin between 1500hr and 0800hr on 81 consecutive observational days. Video recordings were made through digital systems assisted by infrared-enabled CCTV cameras. The data highlight patterns relating to 12 different behaviours and enclosure utilization. Different interval periods for sampling of instantaneous behaviour from video recordings have been evaluated to develop optimal study methods for the future. The activity budgets of pangolins displayed natural patterns of nocturnal activity with a peak between 20:00-21:00 hr. When out of their burrow, they spent about 59% of the time walking in the enclosure, and 14% of the time feeding. The repeatability of the behaviours has a significant negative correlation with the mean time spent in that behaviour. Focal behavioural samples significantly correlated with instantaneous samples up to a 15-minute interval. The correlation values gradually decreased with the increase in sampling interval. The results indicate that estimates obtained from focal sampling and instantaneous sampling with relatively short intervals (≤5 minutes) are about equally reliable. The study suggests use of focal sampling, instead of instantaneous sampling, to record behaviour relating to social interactions.
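The trade-off between focal (continuous) records and instantaneous scans at increasing intervals can be sketched on a synthetic behavioural record (the bout structure below is hypothetical, not the study's data):

```python
import random

def instantaneous_estimate(record, interval_s):
    """Estimate a behaviour's time budget by reading the continuous
    (per-second) record once every interval_s seconds (scan sampling)."""
    scans = record[::interval_s]
    return sum(scans) / len(scans)

rng = random.Random(3)
# Hypothetical 8-hour record: 1 while walking, 0 otherwise, in bouts of
# random length (illustrative data only).
record, state = [], 0
while len(record) < 8 * 3600:
    record += [state] * rng.randint(60, 600)
    state = 1 - state
record = record[:8 * 3600]
true_prop = sum(record) / len(record)   # the focal (continuous) time budget
est = instantaneous_estimate(record, 300)  # 5-minute scans
```

Shorter scan intervals track the focal proportion closely, and the agreement degrades as the interval grows, which mirrors the correlation pattern the study reports.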

  12. Analysis of techniques of sample attack for soil and mineral analysis

    International Nuclear Information System (INIS)

    Dean, J.R.; Chiu, N.W.

    1985-05-01

    Four methods of sample attack were evaluated in the laboratory for use in the determination of uranium, radium-226, thorium-232, thorium-230, thorium-228, and lead-210. The methods evaluated were (1) KF/pyrosulfate fusion; (2) Sodium carbonate fusion; (3) Nitric, perchloric, hydrofluoric acid digestion; and, (4) combination nitric, perchloric, hydrofluoric acid/pyrosulfate fusion. Five samples were chosen for evaluation; two were mine tailings from Bancroft, Ontario and Beaverlodge, Saskatchewan, one was a synthetic uranium ore-silica mixture and two were soil samples supplied by AECB. The KF/pyrosulfate dissolution procedure was found to be the fastest and, overall, most accurate dissolution method for the analysis of 1-20 samples. For larger numbers of samples the three acid/pyrosulfate fusion combination was shown to have some merit

  13. Uranium, radium and radon exhalation study in some soil samples using track etch technique

    International Nuclear Information System (INIS)

    Harmanjit, Singh; Joga, Singh; Surinder, Singh; Bajwa, B.S.

    2006-01-01

    Full text of publication follows: Uranium and radium concentrations and radon exhalation rates have been determined in soil samples collected from some areas of Punjab using LR-115 nuclear track detectors. Radium concentration in these samples has been found to vary from 0.80 to 5.34 Bq kg⁻¹. The radon exhalation rate in these samples has been found to vary from 0.99 to 6.60 mBq kg⁻¹ h⁻¹ (32.82 to 218.49 mBq m⁻² h⁻¹). A good correlation has been observed between the radon exhalation rate and the radium concentration in the soil samples. The determination of the uranium concentration in all these samples is being carried out and the other correlations will also be established. (authors)

  14. Evaluation of oxidation techniques for preparing bioassay and environmental samples for liquid scintillation counting

    International Nuclear Information System (INIS)

    Miller, H.H.

    1979-10-01

    In environmental and biological monitoring for carbon-14 and tritium, the presence of color and chemical quenching agents in the samples can degrade the efficiency of liquid scintillation counting. A series of experiments was performed to evaluate the usefulness, under routine conditions, of first oxidizing the samples to improve the counting by removing the color and quenching agents. The scintillation counter was calibrated for the effects of quenching agents on its counting efficiency. Oxidizing apparatus was tested for its ability to accurately recover the ¹⁴C and ³H in the samples. Scintillation counting efficiencies were compared for a variety of oxidized and unoxidized environmental and bioassay samples. The overall conclusion was that, for routine counting, oxidation of such samples is advantageous when they are highly quenched or in solid form

  15. Improving oral hygiene skills by computer-based training: a randomized controlled comparison of the modified Bass and the Fones techniques.

    Directory of Open Access Journals (Sweden)

    Daniela Harnacke

    BACKGROUND: Gingivitis and other plaque-associated diseases have a high prevalence in western communities even though the majority of adults report daily oral hygiene. This indicates a lack of oral hygiene skills. Currently, there is no clear evidence as to which brushing technique would bring about the best oral hygiene skills. While the modified Bass technique is often recommended by dentists and in textbooks, the Fones technique is often recommended in patient brochures. Still, standardized comparisons of the effectiveness of teaching these techniques are lacking. METHODOLOGY/PRINCIPAL FINDINGS: In a final sample of n = 56 students, this multidisciplinary, randomized, examiner-blinded, controlled study compared the effects of parallel and standardized interactive computer presentations teaching either the Fones or the modified Bass technique. A control group was taught the basics of tooth brushing alone. Oral hygiene skills (remaining plaque after thorough oral hygiene) and gingivitis were assessed at baseline and 6, 12, and 28 weeks after the intervention. We found a significant group×time interaction for gingivitis (F(4/102) = 3.267; p = 0.016; ε = 0.957; η² = 0.114) and a significant main effect of group for oral hygiene skills (F(2/51) = 7.088; p = 0.002; η² = 0.218). Fones was superior to Bass; Bass did not differ from the control group. Group differences were most prominent after 6 and 12 weeks. CONCLUSIONS/SIGNIFICANCE: The present trial indicates an advantage of teaching the Fones as compared to the modified Bass technique with respect to oral hygiene skills and gingivitis. Future studies are needed to analyze whether the disadvantage of teaching the Bass technique observed here is restricted to the teaching method employed. TRIAL REGISTRATION: German Clinical Trials Register DRKS00003488.

  16. Percutaneous CT-guided sacroiliac joint sampling for infection: aspiration, biopsy, and technique.

    Science.gov (United States)

    Knipp, David; Simeone, F Joseph; Nelson, Sandra B; Huang, Ambrose J; Chang, Connie Y

    2018-04-01

    To evaluate methods of CT-guided sacroiliac joint sampling in patients with suspected infection. All CT-guided sacroiliac joint sampling procedures for suspected infection were reviewed for sampling type (aspiration, lavage aspiration, biopsy), microbiology results, and clinical and imaging follow-up. The primary gold standard was anatomic pathology. If pathology was not available, then positive blood culture with the same organism as SIJ sampling, imaging and clinical follow-up, or clinical follow-up only were used. Anterior and posterior joint distention was evaluated by MRI within 7 days of the procedure. A total of 34 patients (age 39 ± 20 (range, 6-75) years; 21 F, 13 M) were included. Aspiration samples only were obtained in 13/34 (38%) cases, biopsy samples only in 9/34 (26%) cases, and both samples in 12/34 (35%) cases. There was an overall 54% sensitivity and 86% specificity. For the aspiration samples, sensitivity and specificity were 60 and 81%, respectively, compared to 45 and 90% for the biopsy samples. In cases with both samples, biopsy did not add additional microbial information. Seventeen (17/34, 50%) patients had an MRI. The anterior joint was more distended than the posterior joint in 15/17 (88%) of patients, and this difference was significant (P = 0.0003). All of these 17 patients had an attempted aspiration by a posterior approach; 6/17 (35%) resulted in a successful aspiration. Aspiration of the sacroiliac joint has a higher sensitivity than biopsy and should always be attempted first. MRI may be helpful for procedure planning.

  17. Percutaneous CT-guided sacroiliac joint sampling for infection. Aspiration, biopsy, and technique

    Energy Technology Data Exchange (ETDEWEB)

    Knipp, David; Simeone, F.J.; Huang, Ambrose J.; Chang, Connie Y. [Massachusetts General Hospital, Division of Musculoskeletal Imaging and Intervention, Department of Radiology, Boston, MA (United States); Nelson, Sandra B. [Massachusetts General Hospital, Infectious Disease Unit, Department of Medicine, Boston, MA (United States)

    2018-04-15

    To evaluate methods of CT-guided sacroiliac joint sampling in patients with suspected infection. All CT-guided sacroiliac joint sampling procedures for suspected infection were reviewed for sampling type (aspiration, lavage aspiration, biopsy), microbiology results, and clinical and imaging follow-up. The primary gold standard was anatomic pathology. If pathology was not available, then positive blood culture with the same organism as SIJ sampling, imaging and clinical follow-up, or clinical follow-up only were used. Anterior and posterior joint distention was evaluated by MRI within 7 days of the procedure. A total of 34 patients (age 39 ± 20 (range, 6-75) years; 21 F, 13 M) were included. Aspiration samples only were obtained in 13/34 (38%) cases, biopsy samples only in 9/34 (26%) cases, and both samples in 12/34 (35%) cases. There was an overall 54% sensitivity and 86% specificity. For the aspiration samples, sensitivity and specificity were 60 and 81%, respectively, compared to 45 and 90% for the biopsy samples. In cases with both samples, biopsy did not add additional microbial information. Seventeen (17/34, 50%) patients had an MRI. The anterior joint was more distended than the posterior joint in 15/17 (88%) of patients, and this difference was significant (P = 0.0003). All of these 17 patients had an attempted aspiration by a posterior approach; 6/17 (35%) resulted in a successful aspiration. Aspiration of the sacroiliac joint has a higher sensitivity than biopsy and should always be attempted first. MRI may be helpful for procedure planning. (orig.)

  18. Random Evolutionary Dynamics Driven by Fitness and House-of-Cards Mutations: Sampling Formulae

    Science.gov (United States)

    Huillet, Thierry E.

    2017-07-01

    We first revisit the multi-allelic mutation-fitness balance problem, especially when mutations obey a house of cards condition, where the discrete-time deterministic evolutionary dynamics of the allelic frequencies derives from a Shahshahani potential. We then consider multi-allelic Wright-Fisher stochastic models whose deviation to neutrality is from the Shahshahani mutation/selection potential. We next focus on the weak selection, weak mutation cases and, making use of a Gamma calculus, we compute the normalizing partition functions of the invariant probability densities appearing in their Wright-Fisher diffusive approximations. Using these results, generalized Ewens sampling formulae (ESF) from the equilibrium distributions are derived. We start treating the ESF in the mixed mutation/selection potential case and then we restrict ourselves to the ESF in the simpler house-of-cards mutations only situation. We also address some issues concerning sampling problems from infinitely-many alleles weak limits.
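The classical Ewens sampling formula that these generalizations extend gives the probability of an allelic configuration under neutrality; a direct transcription (the basic neutral-model ESF only, not the paper's mutation/selection versions):

```python
from math import factorial, prod

def ewens_probability(a, theta):
    """Ewens sampling formula: probability that a sample of n genes shows
    a_j allelic types represented exactly j times, with a = (a_1, ..., a_n)."""
    n = sum(j * aj for j, aj in enumerate(a, start=1))
    rising = prod(theta + k for k in range(n))  # theta(theta+1)...(theta+n-1)
    config = prod((theta / j) ** aj / factorial(aj)
                  for j, aj in enumerate(a, start=1))
    return factorial(n) * config / rising

# For n = 3 and theta = 1 the three configurations exhaust the sample space:
p = [ewens_probability(c, 1.0) for c in ([3, 0, 0], [1, 1, 0], [0, 0, 1])]
# p = [1/6, 1/2, 1/3], which sums to 1.
```

The generalized formulae of the paper replace the neutral equilibrium distribution underlying this expression with one derived from the mutation/selection potential.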

  19. Dorsal onlay (Barbagli technique) versus dorsal inlay (Asopa technique) buccal mucosal graft urethroplasty for anterior urethral stricture: a prospective randomized study.

    Science.gov (United States)

    Aldaqadossi, Hussein; El Gamal, Samir; El-Nadey, Mohamed; El Gamal, Osama; Radwan, Mohamed; Gaber, Mohamed

    2014-02-01

    To compare both the dorsal onlay technique of Barbagli and the dorsal inlay technique of Asopa for the management of long anterior urethral stricture. From January 2010 to May 2012, a total of 47 patients with long anterior urethral strictures were randomized into two groups. The first group included 25 patients who were managed by dorsal onlay buccal mucosal graft urethroplasty. The second group included 22 patients who were managed by dorsal inlay buccal mucosal graft urethroplasty. Different clinical parameters, postoperative complications and success rates were compared between both groups. The overall success rate in the dorsal onlay group was 88%, whereas in the dorsal inlay group the success rate was 86.4% during the follow-up period. The mean operative time was significantly longer in the dorsal onlay urethroplasty group (205 ± 19.63 min) than in the dorsal inlay urethroplasty group (128 ± 4.9 min, P-value <0.0001). The average blood loss was significantly higher in the dorsal onlay urethroplasty group (228 ± 5.32 mL) than in the dorsal inlay urethroplasty group (105 ± 12.05 mL, P-value <0.0001). The dorsal onlay technique of Barbagli and the dorsal inlay technique of Asopa buccal mucosal graft urethroplasty provide similar success rates. The Asopa technique is easy to carry out, provides shorter operative time and less blood loss, and it is associated with fewer complications for anterior urethral stricture repair. © 2013 The Japanese Urological Association.

  20. New Equipment and Techniques for Remote Sampling of Stored Radioactive Waste

    International Nuclear Information System (INIS)

    Nance, T.A.

    2001-01-01

Radioactive waste is stored at the Savannah River Site (SRS), part of the Department of Energy (DOE) complex. This radioactive waste is stored in buried tanks, and management of the waste requires several processes, including material addition, heating, cooling, mixing, and transfer from tank to tank. During waste processing, it is necessary to know the chemical components and their characteristics to determine the steps necessary to maintain the waste form or to manipulate the waste into the form desired. Waste characterization begins by obtaining a sample for analysis. High-level radioactive waste sampling is routinely done with simple, standard samplers such as a dip sampler. Other sampling is non-routine or specialized, with unique requirements, such as sampling remote areas that are difficult to reach, materials with unknown characteristics, materials that must be gathered to obtain an adequate sample, or materials that must be broken up or forcibly separated from the tank. The samplers described in this paper are specialized samplers. These samplers include the Dip Filter Sampler, Soft Core Sampler, Hard Core Sampler, Circle Scrape Sampler, Small Scrape Sampler, Suction and Strain Sampler, and Vial Snapper Sampler. The Dip Filter Sampler is used to sample floating particulate matter or floating organic matter. The Soft Core Sampler and Hard Core Sampler are used to obtain samples of solids from the tank floor. The Soft Core Sampler is used on soft solids such as sludge and saltcake and the Hard Core Sampler on hardened solid deposits. The Circle Scrape Sampler is used to obtain solid samples through a small entry riser and out from under the riser. The Small Scrape Sampler enters a small entry riser and is used to scrape a sample from the tank wall. The Suction and Strain Sampler is used to gather a remote submerged sample or filter a solid sample from supernate. The Vial Snapper Grab Sampler is

  1. Seroincidence of non-typhoid Salmonella infections: convenience vs. random community-based sampling.

    Science.gov (United States)

    Emborg, H-D; Simonsen, J; Jørgensen, C S; Harritshøj, L H; Krogfelt, K A; Linneberg, A; Mølbak, K

    2016-01-01

The incidence of reported infections of non-typhoid Salmonella is affected by biases inherent to passive laboratory surveillance, whereas analysis of blood sera may provide a less biased alternative for estimating the force of Salmonella transmission in humans. We developed a mathematical model that enabled a back-calculation of the annual seroincidence of Salmonella based on measurements of specific antibodies. The aim of the present study was to determine the seroincidence in two convenience samples from 2012 (Danish blood donors, n = 500, and pregnant women, n = 637) and a community-based sample of healthy individuals from 2006 to 2007 (n = 1780). The lowest antibody levels were measured in the samples from the community cohort and the highest in pregnant women. The annual Salmonella seroincidences were 319 infections/1000 pregnant women [90% credibility interval (CrI) 210-441], 182/1000 in blood donors (90% CrI 85-298) and 77/1000 in the community cohort (90% CrI 45-114). Although the differences between study populations decreased when accounting for different age distributions, the estimates depend on the study population. It is important to be aware of this issue and to define a certain population under surveillance in order to obtain consistent results in an application of serological measures for public health purposes.

  2. A sero-survey of rinderpest in nomadic pastoral systems in central and southern Somalia from 2002 to 2003, using a spatially integrated random sampling approach.

    Science.gov (United States)

    Tempia, S; Salman, M D; Keefe, T; Morley, P; Freier, J E; DeMartini, J C; Wamwayi, H M; Njeumi, F; Soumaré, B; Abdi, A M

    2010-12-01

    A cross-sectional sero-survey, using a two-stage cluster sampling design, was conducted between 2002 and 2003 in ten administrative regions of central and southern Somalia, to estimate the seroprevalence and geographic distribution of rinderpest (RP) in the study area, as well as to identify potential risk factors for the observed seroprevalence distribution. The study was also used to test the feasibility of the spatially integrated investigation technique in nomadic and semi-nomadic pastoral systems. In the absence of a systematic list of livestock holdings, the primary sampling units were selected by generating random map coordinates. A total of 9,216 serum samples were collected from cattle aged 12 to 36 months at 562 sampling sites. Two apparent clusters of RP seroprevalence were detected. Four potential risk factors associated with the observed seroprevalence were identified: the mobility of cattle herds, the cattle population density, the proximity of cattle herds to cattle trade routes and cattle herd size. Risk maps were then generated to assist in designing more targeted surveillance strategies. The observed seroprevalence in these areas declined over time. In subsequent years, similar seroprevalence studies in neighbouring areas of Kenya and Ethiopia also showed a very low seroprevalence of RP or the absence of antibodies against RP. The progressive decline in RP antibody prevalence is consistent with virus extinction. Verification of freedom from RP infection in the Somali ecosystem is currently in progress.
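Selecting primary sampling units by generating random map coordinates, as described above, can be sketched as follows. The bounding box below is illustrative, not the study's actual sampling frame; latitude is drawn uniform in sin(latitude) so the points are uniform in area rather than over-representing high-latitude bands:

```python
import math
import random

def random_sites(n, lat_min, lat_max, lon_min, lon_max, seed=0):
    """Draw n random map coordinates, uniform in area over a lat/lon box.
    Longitude is uniform; latitude is drawn uniform in sin(latitude)."""
    rng = random.Random(seed)
    s_min = math.sin(math.radians(lat_min))
    s_max = math.sin(math.radians(lat_max))
    sites = []
    for _ in range(n):
        lat = math.degrees(math.asin(rng.uniform(s_min, s_max)))
        lon = rng.uniform(lon_min, lon_max)
        sites.append((lat, lon))
    return sites

# 562 candidate sampling sites over a rough bounding box for
# central/southern Somalia (box coordinates are illustrative only)
sites = random_sites(562, -1.0, 5.0, 41.0, 48.0)
```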

  3. Self-reference and random sampling approach for label-free identification of DNA composition using plasmonic nanomaterials.

    Science.gov (United States)

    Freeman, Lindsay M; Pang, Lin; Fainman, Yeshaiahu

    2018-05-09

    The analysis of DNA has led to revolutionary advancements in the fields of medical diagnostics, genomics, prenatal screening, and forensic science, with the global DNA testing market expected to reach revenues of USD 10.04 billion per year by 2020. However, the current methods for DNA analysis remain dependent on the necessity for fluorophores or conjugated proteins, leading to high costs associated with consumable materials and manual labor. Here, we demonstrate a potential label-free DNA composition detection method using surface-enhanced Raman spectroscopy (SERS) in which we identify the composition of cytosine and adenine within single strands of DNA. This approach depends on the fact that there is one phosphate backbone per nucleotide, which we use as a reference to compensate for systematic measurement variations. We utilize plasmonic nanomaterials with random Raman sampling to perform label-free detection of the nucleotide composition within DNA strands, generating a calibration curve from standard samples of DNA and demonstrating the capability of resolving the nucleotide composition. The work represents an innovative way for detection of the DNA composition within DNA strands without the necessity of attached labels, offering a highly sensitive and reproducible method that factors in random sampling to minimize error.

  4. The quality of the reported sample size calculations in randomized controlled trials indexed in PubMed.

    Science.gov (United States)

    Lee, Paul H; Tse, Andy C Y

    2017-05-01

There are limited data on the quality of reporting of information essential for replication of the sample size calculation, as well as on the accuracy of the calculation itself. We examined the quality of reporting of the sample size calculation in randomized controlled trials (RCTs) published in PubMed, and the variation in reporting across study design, study characteristics, and journal impact factor. We also reviewed the targeted sample sizes reported in trial registries. We reviewed and analyzed all RCTs published in December 2014 in journals indexed in PubMed. The 2014 Impact Factors for the journals were used as proxies for their quality. Of the 451 analyzed papers, 58.1% reported an a priori sample size calculation. Nearly all papers provided the level of significance (97.7%) and desired power (96.6%), and most of the papers reported the minimum clinically important effect size (73.3%). The median percentage difference between the reported and recalculated sample sizes was 0.0% (IQR -4.6%; 3.0%). The accuracy of the reported sample size was better for studies published in journals that endorsed the CONSORT statement and journals with an impact factor. A total of 98 papers provided a targeted sample size in trial registries, and about two-thirds of these papers (n=62) reported a sample size calculation, but only 25 (40.3%) had no discrepancy with the number reported in the trial registries. The reporting of the sample size calculation in RCTs published in PubMed-indexed journals and trial registries was poor. The CONSORT statement should be more widely endorsed. Copyright © 2016 European Federation of Internal Medicine. Published by Elsevier B.V. All rights reserved.
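The a priori calculation the authors looked for typically combines exactly the three reported ingredients: significance level, desired power, and minimum clinically important effect size. A minimal sketch using the standard normal-approximation formula for comparing two means (the parameter values below are illustrative, not drawn from any reviewed trial):

```python
from math import ceil
from statistics import NormalDist

def n_per_group(delta, sd, alpha=0.05, power=0.80):
    """Sample size per arm for a two-sample comparison of means
    (normal approximation): n = 2 * (z_{1-alpha/2} + z_{power})^2 * (sd/delta)^2."""
    z = NormalDist().inv_cdf
    return ceil(2 * (z(1 - alpha / 2) + z(power)) ** 2 * (sd / delta) ** 2)

# Minimum clinically important difference of 0.5 SD, two-sided alpha 5%, power 80%
print(n_per_group(delta=0.5, sd=1.0))  # 63 per group
```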

  5. Combined-stent covered technique and single covered stent technique for transjugular intrahepatic porto-systemic shunt: a prospective randomized controlled study

    International Nuclear Information System (INIS)

    Wang Changming; Li Xuan; Fu Jun; Lu Xianjun; Luan Jingyuan; Li Tianrun; Zhao Jun; Dong Guoxiang

    2014-01-01

Objective: To compare the technique of combined stents with that of a single stent-graft for the construction of transjugular intrahepatic porto-systemic shunt (TIPS) and to discuss their clinical outcomes. Methods: During the period from April 2011 to Dec. 2012, a total of 30 patients with upper gastrointestinal bleeding due to portal hypertension were admitted to the hospital. The TIPS procedure was carried out in all 30 patients. The patients were randomly allocated into either the combined-stent group (n=17) or the stent-graft group (control group, n=13) on the basis of a computer-generated randomization sequence. The PSG quality control, the patency rate of portal vein branches and the shunts, the incidence of encephalopathy and the survival rate of the two groups were documented and analyzed. Results: Technical success rate was 100%. After the treatment, in both groups the PSG became significantly lower than the PSG determined before the treatment (P<0.001). According to the quality control chart, the technique of combined stents was superior to the stent-graft technique in effectively controlling PSG. Five days after TIPS, the ammonia level in the combined-stent group was significantly decreased (P=0.029), while in the control group the ammonia level showed no significant changes (P=0.065). One patient died shortly after TIPS. The median follow-up time was 181 days. During the follow-up period death occurred in 3 cases in each group. The difference in survival rate between the two groups was not significant (P=0.906). Of the three patients who developed encephalopathy, two were in the combined-stent group and one was in the control group. After medication the symptoms of encephalopathy were relieved. The patency rate of the shunts was 100% and the re-bleeding rate was 0% for both groups. The patency rate of portal vein branches was significantly higher in the combined-stent group than in the control group (P=0.039). Conclusion: For the construction of TIPS, the technique of

  6. Active Learning Not Associated with Student Learning in a Random Sample of College Biology Courses

    Science.gov (United States)

    Andrews, T. M.; Leonard, M. J.; Colgrove, C. A.; Kalinowski, S. T.

    2011-01-01

    Previous research has suggested that adding active learning to traditional college science lectures substantially improves student learning. However, this research predominantly studied courses taught by science education researchers, who are likely to have exceptional teaching expertise. The present study investigated introductory biology courses randomly selected from a list of prominent colleges and universities to include instructors representing a broader population. We examined the relationship between active learning and student learning in the subject area of natural selection. We found no association between student learning gains and the use of active-learning instruction. Although active learning has the potential to substantially improve student learning, this research suggests that active learning, as used by typical college biology instructors, is not associated with greater learning gains. We contend that most instructors lack the rich and nuanced understanding of teaching and learning that science education researchers have developed. Therefore, active learning as designed and implemented by typical college biology instructors may superficially resemble active learning used by education researchers, but lacks the constructivist elements necessary for improving learning. PMID:22135373

  7. Advanced sampling techniques for hand-held FT-IR instrumentation

    Science.gov (United States)

    Arnó, Josep; Frunzi, Michael; Weber, Chris; Levy, Dustin

    2013-05-01

FT-IR spectroscopy is the technology of choice to identify solid and liquid phase unknown samples. The challenging ConOps in emergency response and military field applications require a significant redesign of the stationary FT-IR bench-top instruments typically used in laboratories. Specifically, field portable units require high levels of resistance against mechanical shock and chemical attack, ease of use in restrictive gear, extreme reliability, quick and easy interpretation of results, and reduced size. In the last 20 years, FT-IR instruments have been re-engineered to fit in small suitcases for field portable use and recently further miniaturized for handheld operation. This article introduces the HazMatID™ Elite, an FT-IR instrument designed to balance the portability advantages of a handheld device with the performance challenges associated with miniaturization. In this paper, special focus will be given to the HazMatID Elite's sampling interfaces optimized to collect and interrogate different types of samples: accumulated material using the on-board ATR press, dispersed powders using the ClearSampler™ tool, and the touch-to-sample sensor for direct liquid sampling. The application of the novel sample swipe accessory (ClearSampler) to collect material from surfaces will be discussed in some detail. The accessory was tested and evaluated for the detection of explosive residues before and after detonation. Experimental results derived from these investigations will be described in an effort to outline the advantages of this technology over existing sampling methods.

8. Method and apparatus for determining random coincidence count rate in a scintillation counter utilizing the coincidence technique

    International Nuclear Information System (INIS)

    Horrocks, D.L.

    1980-01-01

A method and apparatus are described for the reliable determination of a random coincidence count attributable to chance coincidences of single-photon events which are each detected in only a single detector of a scintillation counter utilizing two detectors in a coincidence counting technique. A first delay device is employed to delay output pulses from one detector, and the delayed signal is then compared with the undelayed signal from the other detector in a coincidence circuit to obtain an approximate random coincidence count. The output of the coincidence circuit is applied to an anti-coincidence circuit, where it is corrected by the elimination of pulses coincident with, and attributable to, conventionally detected real coincidences, and by the elimination of pulses coincident with, and attributable to, real coincidences that have been delayed by a second delay device having the same time parameter as the first. 8 claims
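The delayed-channel trick works because delaying one detector's pulses by much more than the coincidence window destroys all real coincidences while leaving the chance-coincidence rate unchanged; for small windows that rate is approximately 2·tau·N1·N2. A simulation sketch with illustrative rates (none of these parameters come from the patent):

```python
import random
from bisect import bisect_left

def poisson_train(rate, duration, rng):
    """Poisson arrival times on [0, duration) with the given mean rate (events/s)."""
    t, out = 0.0, []
    while True:
        t += rng.expovariate(rate)
        if t >= duration:
            return out
        out.append(t)

def coincidences(a, b, tau):
    """Count pulses in sorted train a with at least one pulse of sorted train b
    within +/- tau."""
    n = 0
    for t in a:
        i = bisect_left(b, t - tau)
        if i < len(b) and b[i] <= t + tau:
            n += 1
    return n

rng = random.Random(42)
r1 = r2 = 1000.0                   # singles rates in each detector (counts/s)
tau, T, delay = 1e-6, 100.0, 1e-3  # coincidence window, run time, delay >> tau

d1 = poisson_train(r1, T, rng)
d2 = [t + delay for t in poisson_train(r2, T, rng)]  # delayed channel

measured = coincidences(d1, d2, tau)  # only chance coincidences survive the delay
predicted = 2 * tau * r1 * r2 * T     # expected chance coincidences (about 200)
```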

  9. Effect of DNA extraction methods and sampling techniques on the apparent structure of cow and sheep rumen microbial communities.

    Directory of Open Access Journals (Sweden)

    Gemma Henderson

    Full Text Available Molecular microbial ecology techniques are widely used to study the composition of the rumen microbiota and to increase understanding of the roles they play. Therefore, sampling and DNA extraction methods that result in adequate yields of microbial DNA that also accurately represents the microbial community are crucial. Fifteen different methods were used to extract DNA from cow and sheep rumen samples. The DNA yield and quality, and its suitability for downstream PCR amplifications varied considerably, depending on the DNA extraction method used. DNA extracts from nine extraction methods that passed these first quality criteria were evaluated further by quantitative PCR enumeration of microbial marker loci. Absolute microbial numbers, determined on the same rumen samples, differed by more than 100-fold, depending on the DNA extraction method used. The apparent compositions of the archaeal, bacterial, ciliate protozoal, and fungal communities in identical rumen samples were assessed using 454 Titanium pyrosequencing. Significant differences in microbial community composition were observed between extraction methods, for example in the relative abundances of members of the phyla Bacteroidetes and Firmicutes. Microbial communities in parallel samples collected from cows by oral stomach-tubing or through a rumen fistula, and in liquid and solid rumen digesta fractions, were compared using one of the DNA extraction methods. Community representations were generally similar, regardless of the rumen sampling technique used, but significant differences in the abundances of some microbial taxa such as the Clostridiales and the Methanobrevibacter ruminantium clade were observed. The apparent microbial community composition differed between rumen sample fractions, and Prevotellaceae were most abundant in the liquid fraction. DNA extraction methods that involved phenol-chloroform extraction and mechanical lysis steps tended to be more comparable. However

  10. Evaluation of alternative macroinvertebrate sampling techniques for use in a new tropical freshwater bioassessment scheme

    Directory of Open Access Journals (Sweden)

    Isabel Eleanor Moore

    2015-06-01

Full Text Available Aim: The study aimed to determine the effectiveness of benthic macroinvertebrate dredge net sampling procedures as an alternative method to kick net sampling in tropical freshwater systems, specifically as an evaluation of sampling methods used in the Zambian Invertebrate Scoring System (ZISS) river bioassessment scheme. Tropical freshwater ecosystems are sometimes dangerous or inaccessible to sampling teams using traditional kick-sampling methods, so identifying an alternative procedure that produces similar results is necessary in order to collect data from a wide variety of habitats. Methods: Both kick and dredge nets were used to collect macroinvertebrate samples at 16 riverine sites in Zambia, ranging from backwaters and floodplain lagoons to fast flowing streams and rivers. The data were used to calculate ZISS, diversity (S: number of taxa present), and Average Score Per Taxon (ASPT) scores per site, using the two sampling methods to compare their sampling effectiveness. Environmental parameters, namely pH, conductivity, underwater photosynthetically active radiation (PAR), temperature, alkalinity, flow, and altitude, were also recorded and used in statistical analysis. Invertebrate communities present at the sample sites were determined using multivariate procedures. Results: Analysis of the invertebrate community and environmental data suggested that the testing exercise was undertaken in four distinct macroinvertebrate community types, supporting at least two quite different macroinvertebrate assemblages, and showing significant differences in habitat conditions. Significant correlations were found for all three bioassessment score variables between results acquired using the two methods, with dredge-sampling normally producing lower scores than did the kick net procedures. Linear regression models were produced in order to correct each biological variable score collected by a dredge net to a score similar to that of one collected by kick net
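The correction step described above amounts to regressing paired kick-net scores on dredge-net scores and applying the fitted line to new dredge scores. A sketch on synthetic paired data (the slope, intercept, and noise level are hypothetical, not the study's fitted values):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical paired scores at 16 sites: dredge systematically scores lower
kick = rng.uniform(4.0, 8.0, size=16)                    # kick-net ASPT-style scores
dredge = 0.8 * kick - 0.3 + rng.normal(0.0, 0.2, size=16)

# Fit kick ~ dredge, then use the fitted line to convert dredge scores
slope, intercept = np.polyfit(dredge, kick, 1)
corrected = slope * dredge + intercept
```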

  11. Measuring gas-residence times in large municipal incinerators, by means of a pseudo-random binary signal tracer technique

    International Nuclear Information System (INIS)

    Nasserzadeh, V.; Swithenbank, J.; Jones, B.

    1995-01-01

    The problem of measuring gas-residence time in large incinerators was studied by the pseudo-random binary sequence (PRBS) stimulus tracer response technique at the Sheffield municipal solid-waste incinerator (35 MW plant). The steady-state system was disturbed by the superimposition of small fluctuations in the form of a pseudo-random binary sequence of methane pulses, and the response of the incinerator was determined from the CO 2 concentration in flue gases at the boiler exit, measured with a specially developed optical gas analyser with a high-frequency response. For data acquisition, an on-line PC computer was used together with the LAB Windows software system; the output response was then cross-correlated with the perturbation signal to give the impulse response of the incinerator. There was very good agreement between the gas-residence time for the Sheffield MSW incinerator as calculated by computational fluid dynamics (FLUENT Model) and gas-residence time at the plant as measured by the PRBS tracer technique. The results obtained from this research programme clearly demonstrate that the PRBS stimulus tracer response technique can be successfully and economically used to measure gas-residence times in large incinerator plants. It also suggests that the common commercial practice of characterising the incinerator operation by a single-residence-time parameter may lead to a misrepresentation of the complexities involved in describing the operation of the incineration system. (author)
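The PRBS cross-correlation idea can be sketched numerically: a maximal-length ±1 sequence has a nearly ideal impulse-like autocorrelation, so cross-correlating the measured output with the injected PRBS recovers the system's impulse response, and hence the residence-time distribution. A toy sketch (the transport delay and mixing constant are illustrative, not the Sheffield plant's values):

```python
import numpy as np

def mls(nbits, taps):
    """Maximal-length +/-1 PRBS of length 2**nbits - 1 from a Fibonacci LFSR."""
    state = [1] * nbits
    seq = []
    for _ in range(2 ** nbits - 1):
        seq.append(state[-1])
        fb = 0
        for t in taps:
            fb ^= state[t]
        state = [fb] + state[:-1]
    return np.array([1.0 if b else -1.0 for b in seq])

# 127-step m-sequence (taps for the primitive recurrence a_t = a_{t-6} XOR a_{t-7})
x = mls(7, taps=[5, 6])
L = x.size

# Toy "incinerator": transport delay of 10 steps plus exponential mixing decay
h = np.zeros(L)
h[10:40] = np.exp(-np.arange(30) / 8.0)

# Output = circular convolution of the PRBS input with the impulse response
y = np.real(np.fft.ifft(np.fft.fft(x) * np.fft.fft(h)))

# Cross-correlating the output with the input recovers the impulse response,
# since the m-sequence autocorrelation is L at lag 0 and -1 at all other lags
h_est = np.real(np.fft.ifft(np.fft.fft(y) * np.conj(np.fft.fft(x)))) / L
```

The lag of the recovered peak (here step 10) estimates the transport delay, and the first moment of `h_est` estimates the mean gas-residence time.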

  12. Dosimetric characterization of BeO samples in alpha, beta and X radiation beams using luminescent techniques

    International Nuclear Information System (INIS)

    Groppo, Daniela Piai

    2013-01-01

    In the medical field, the ionizing radiation is used both for therapeutic and diagnostic purposes, in a wide range of radiation doses. In order to ensure that the objective is achieved in practice, detailed studies of detectors and devices in different types of radiations beams are necessary. In this work a dosimetric characterization of BeO samples was performed using the techniques of thermoluminescence (TL) and optically stimulated luminescence (OSL) by a comparison of their response for alpha, beta and X radiations and the establishment of an appropriated system for use in monitoring of these radiations beams. The main results are: the high sensitivity to beta radiation for both techniques, good reproducibility of TL and OSL response (coefficients of variation lower than 5%), maximum energy dependence of the X radiation of 28% for the TL technique, and only 7% for the OSL technique, within the studied energy range. The dosimetric characteristics obtained in this work show the possibility of applying BeO samples to dosimetry of alpha, beta and X radiations, considering the studied dose ranges, using the TL and OSL techniques. From the results obtained, the samples of BeO showed their potential use for beam dosimetry in diagnostic radiology and radiotherapy. (author)

13. An evaluation of sampling methods and supporting techniques for tackling lead in drinking water in Alberta Province

    Science.gov (United States)

    A collaborative project commenced in August 2013 with the aim of demonstrating a range of techniques that can be used in tackling the problems of lead in drinking water. The main project was completed in March 2014, with supplementary sampling exercises in mid-2014. It involved t...

14. The alkali halide disk technique in infra-red spectrometry: Anomalous behaviour of some samples dispersed in alkali halide disks

    NARCIS (Netherlands)

    Tolk, A.

    1961-01-01

    Some difficulties encountered in the application of the alkali halide disk technique in infra-red spectrometry are discussed. Complications due to interaction of the sample with the alkali halide have been studied experimentally. It was found that the anomalous behaviour of benzoic acid, succinic

  15. Comparative Study of Radon Concentration with Two Techniques and Elemental Analysis in Drinking Water Samples of the Jammu District, Jammu and Kashmir, India.

    Science.gov (United States)

    Kumar, Ajay; Kaur, Manpreet; Mehra, Rohit; Sharma, Dinesh Kumar; Mishra, Rosaline

    2017-10-01

The level of radon concentration has been assessed using the Advanced SMART RnDuo technique in 30 drinking water samples from Jammu district, Jammu and Kashmir, India. The water samples were collected from wells, hand pumps, submersible pumps, and stored waters. The 14 randomly obtained values of radon concentration in water sources using the SMART RnDuo technique have been compared and cross-checked by a RAD7 device. A good positive correlation (R = 0.88) has been observed between the two techniques. The overall value of radon concentration in the various water sources ranged from 2.45 to 18.43 Bq L⁻¹, with a mean value of 8.24 ± 4.04 Bq L⁻¹, and it agreed well with the recommended limit suggested by the European Commission and UNSCEAR. However, higher mean radon concentrations were found in groundwater drawn from wells, hand pumps and submersible pumps than in stored water. The total annual effective dose due to radon inhalation and ingestion ranged from 6.69 to 50.31 μSv y⁻¹, with a mean value of 22.48 ± 11.03 μSv y⁻¹. The total annual effective dose was found to lie within the safe limit (100 μSv y⁻¹) suggested by WHO. Heavy metal analysis was also carried out in the various water sources using an atomic absorption spectrophotometer (AAS), and the highest values of heavy metals were found mostly in groundwater samples. The obtained results were compared with the limits of Indian and international organizations such as WHO and the EU Council. In all the samples, the elemental concentrations did not exceed the permissible limits.

  16. Green Aspects of Techniques for the Determination of Currently Used Pesticides in Environmental Samples

    OpenAIRE

    Maciej Tankiewicz; Jacek Namieśnik; Jolanta Stocka; Marek Biziuk

    2011-01-01

    Pesticides are among the most dangerous environmental pollutants because of their stability, mobility and long-term effects on living organisms. Their presence in the environment is a particular danger. It is therefore crucial to monitor pesticide residues using all available analytical methods. The analysis of environmental samples for the presence of pesticides is very difficult: the processes involved in sample preparation are labor-intensive and time-consuming. To date, it has been standa...

  17. Sample Preparation for Determination of Biological Thiols by Liquid Chromatography and Electromigration Techniques

    OpenAIRE

    Bald, Edward

    2004-01-01

The majority of bioanalytical or environmental methods do not use just one chromatographic or electrophoretic step, but rather involve several sample-pretreatment steps which simplify the matrix, and often preconcentrate and chemically modify the analytes. This work surveys typical procedures for sample preparation for the most commonly analyzed biofluids, with particular emphasis placed on chemical derivatization of su...

  18. A novel fluorescent in situ hybridization technique for detection of Rickettsia spp. in archival samples

    DEFF Research Database (Denmark)

    Svendsen, Claus Bo; Boye, Mette; Struve, Carsten

    2009-01-01

A novel, sensitive and specific method for detecting Rickettsia spp. in archival samples is described. The method involves the use of fluorescently marked oligonucleotide probes for in situ hybridization. Specific hybridization of Rickettsia was found without problems of cross-reactions with bact...

  19. Internal Stress Distribution Measurement of TIG Welded SUS304 Samples Using Neutron Diffraction Technique

    Science.gov (United States)

    Muslih, M. Refai; Sumirat, I.; Sairun; Purwanta

    2008-03-01

The distribution of residual stress in SUS304 samples that underwent TIG welding with four different electric currents has been measured. The welding was done in the middle part of the samples, which had previously been grooved with a milling machine. Before welding, the samples were annealed at 650 degrees Celsius for one hour. The annealing was done to eliminate the residual stress generated by the grooving process, so that the residual stress within the samples was produced merely by the welding process. The calculation of the distribution of residual stress was carried out by measuring the strains within the Fe(220) crystal planes of SUS304. The strain, Young's modulus, and Poisson ratio of Fe(220) SUS304 were measured using the DN1-M neutron diffractometer. Young's modulus and the Poisson ratio of the Fe(220) SUS304 sample were measured in situ. The calculations showed that the distribution of residual stress in SUS304 in the vicinity of the welded area is influenced both by the treatments given during sample preparation and by the electric current used during the welding process.

  20. Temporal and social contexts of heroin-using populations. An illustration of the snowball sampling technique.

    Science.gov (United States)

    Kaplan, C D; Korf, D; Sterk, C

    1987-09-01

    Snowball sampling is a method that has been used in the social sciences to study sensitive topics, rare traits, personal networks, and social relationships. The method involves the selection of samples utilizing "insider" knowledge and referral chains among subjects who possess common traits that are of research interest. It is especially useful in generating samples for which clinical sampling frames may be difficult to obtain or are biased in some way. In this paper, snowball samples of heroin users in two Dutch cities have been analyzed for the purpose of providing descriptions and limited inferences about the temporal and social contexts of their lifestyles. Two distinct heroin-using populations have been discovered who are distinguished by their life cycle stage. Significant contextual explanations have been found involving the passage from adolescent peer group to criminal occupation, the functioning of network "knots" and "outcroppings," and the frequency of social contact. It is suggested that the snowball sampling method may have utility in studying the temporal and social contexts of other populations of clinical interest.
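
The referral-chain procedure described in this abstract can be sketched in a few lines. This is a minimal illustration, not the authors' protocol: the network, seed subjects, wave count and per-subject referral cap are all hypothetical choices.

```python
import random

def snowball_sample(seeds, get_referrals, waves=3, max_per_subject=2, rng=None):
    """Collect a snowball sample: start from seed subjects and follow
    referral chains for a fixed number of waves, deduplicating subjects."""
    rng = rng or random.Random(0)
    sampled = list(seeds)
    seen = set(seeds)
    frontier = list(seeds)
    for _ in range(waves):
        next_frontier = []
        for subject in frontier:
            referrals = [r for r in get_referrals(subject) if r not in seen]
            rng.shuffle(referrals)  # pick referrals in random order
            for r in referrals[:max_per_subject]:
                seen.add(r)
                sampled.append(r)
                next_frontier.append(r)
        frontier = next_frontier
    return sampled

# Toy referral network (hypothetical): each subject names acquaintances.
network = {
    "s1": ["a", "b", "c"], "s2": ["c", "d"],
    "a": ["e"], "b": [], "c": ["f", "g"], "d": [], "e": [],
    "f": [], "g": [],
}
sample = snowball_sample(["s1", "s2"], lambda s: network.get(s, []))
```

Note that such a sample is not a probability sample: its composition depends on the seeds and on who refers whom, which is why the paper draws only limited inferences from it.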

  1. Detection of Chlamydophila psittaci from feral pigeons in environmental samples: problems with currently available techniques.

    Science.gov (United States)

    Geigenfeind, Ila; Haag-Wackernagel, Daniel

    2010-03-01

    Chlamydophila psittaci (Lillie, 1930) Everett et al., 1999, the pathogenic agent of human ornithosis, is widespread in feral pigeon populations, and many cases of transmission from feral pigeons to humans have been reported. The aim of the present study was to detect C. psittaci in environmental samples to find out more about possible transmission routes and, therefore, to assess the zoonotic risk for humans. Fecal samples were collected from nest boxes in a feral pigeon loft. Additionally, samples were taken from the feather dust film covering the water surface of public fountains where pigeons regularly bathe. The samples were tested for the presence of chlamydial antigen using an antigen enzyme-linked immunosorbent assay to prove shedding of C. psittaci by feral pigeons. This test detects a genus-specific lipopolysaccharide in the outer membrane of the chlamydial bacteria. Samples were tested using the IDEIA PCE Chlamydia Test kit (DakoCytomation) and positive results were verified with IDEIA Chlamydia Blocking Reagents (DakoCytomation). The IDEIA PCE Chlamydia Test yielded a high proportion of positive results. However, when the IDEIA Chlamydia blocking test was performed, most of the positive results turned out to be negative or could not be interpreted. We conclude that antigen enzyme-linked immunosorbent assay tests are not suitable for detecting C. psittaci in environmental samples. Previous publications where no blocking test was used should be reconsidered critically. © 2010 ISZS, Blackwell Publishing and IOZ/CAS.

  2. Hypoalgesic effects of three different manual therapy techniques on cervical spine and psychological interaction: A randomized clinical trial.

    Science.gov (United States)

    Alonso-Perez, Jose Luis; Lopez-Lopez, Almudena; La Touche, Roy; Lerma-Lara, Sergio; Suarez, Emilio; Rojas, Javier; Bishop, Mark D; Villafañe, Jorge Hugo; Fernández-Carnero, Josué

    2017-10-01

    The purpose of this study was to evaluate the extent to which psychological factors interact with a particular manual therapy (MT) technique to induce hypoalgesia in healthy subjects. Seventy-five healthy volunteers (36 female, 39 male) were recruited in this double-blind, controlled and parallel study. Subjects were randomly assigned to receive a high velocity low amplitude technique (HVLA), joint mobilization, or cervical lateral glide mobilization (CLGM). Pressure pain thresholds (PPT) were measured over C7 unilaterally and over the trapezius muscle and lateral epicondyle bilaterally, immediately before and after a single MT technique was applied. Pain catastrophizing, depression, anxiety and kinesiophobia were evaluated before treatment. The results indicate that hypoalgesia was observed in all groups after treatment in the neck and elbow region (P < 0.05), but mobilization induced greater hypoalgesic effects. Catastrophizing interacted with the change in PPT over time at C7 and in the manipulation group. All the MT techniques studied produced local and segmental hypoalgesic effects, supporting the results of previous studies of the individual interventions. The interaction between catastrophizing and the HVLA technique suggests that when the catastrophizing level is low or medium the chance of success is high, but high levels of catastrophizing may result in a poor outcome after HVLA intervention. ClinicalTrials.gov Registration Number: NCT02782585. Copyright © 2016 Elsevier Ltd. All rights reserved.

  3. An orientation-space super sampling technique for six-dimensional diffraction contrast tomography

    NARCIS (Netherlands)

    N.R. Viganò (Nicola); K.J. Batenburg (Joost); W. Ludwig (Wolfgang)

    2016-01-01

    Diffraction contrast tomography (DCT) is an X-ray full-field imaging technique that allows for the non-destructive three-dimensional investigation of polycrystalline materials and the determination of the physical and morphological properties of their crystallographic domains, called ...

  4. Effects of pushing techniques during the second stage of labor: A randomized controlled trial

    Directory of Open Access Journals (Sweden)

    Refika Genç Koyucu

    2017-10-01

    Conclusions: Although the duration of the second stage of labor was longer than with the Valsalva pushing technique, women were able to give birth without requiring any verbal or visual instruction, without exceeding the two-hour limit, and without adverse effects on fetal wellness or neonatal results.

  5. Direct Trace Element Analysis of Liquid Blood Samples by In-Air Ion Beam Analytical Techniques (PIXE-PIGE).

    Science.gov (United States)

    Huszank, Robert; Csedreki, László; Török, Zsófia

    2017-02-07

    There are various liquid materials whose elemental composition is of interest in many fields of science and technology. In many cases, sample preparation or extraction can be complicated, or it would destroy the original environment before the analysis (for example, in the case of biological samples). However, multielement direct analysis of liquid samples can be realized with an external PIXE-PIGE measurement system. Particle-induced X-ray and gamma-ray emission spectroscopy (PIXE, PIGE) techniques were applied in an external (in-air) microbeam configuration for trace and main element determination in liquid samples. The direct analysis of standard solutions of several metal salts and of human blood samples (whole blood, blood serum, blood plasma, and formed elements) was realized. From the blood samples, Na, P, S, Cl, K, Ca, Fe, Cu, Zn, and Br elemental concentrations were determined. The focused and scanned ion beam makes it possible to analyze very small sample volumes (∼10 μL). As the sample matrix consists of light elements, analysis is possible at the ppm level. Using this external beam setup, it was found that the elemental composition of small-volume liquid samples can be determined routinely; the liquid samples do not require any preparation and can thus be analyzed directly. At lower concentrations the method is also suitable for analysis (down to the ∼1 ppm level), but with less accuracy and longer measurement times.

  6. Incorporating covariance estimation uncertainty in spatial sampling design for prediction with trans-Gaussian random fields

    Directory of Open Access Journals (Sweden)

    Gunter Spöck

    2015-05-01

    Recently, Spöck and Pilz [38] demonstrated that the spatial sampling design problem for the Bayesian linear kriging predictor can be transformed to an equivalent experimental design problem for a linear regression model with stochastic regression coefficients and uncorrelated errors. The stochastic regression coefficients derive from the polar spectral approximation of the residual process. Thus, standard optimal convex experimental design theory can be used to calculate optimal spatial sampling designs. The design functionals considered in Spöck and Pilz [38] did not take into account the fact that kriging is actually a plug-in predictor which uses the estimated covariance function. The resulting optimal designs were close to space-filling configurations, because the design criterion did not consider the uncertainty of the covariance function. In this paper we also assume that the covariance function is estimated, e.g., by restricted maximum likelihood (REML). We then develop a design criterion that fully takes account of the covariance uncertainty. The resulting designs are less regular and space-filling compared to those ignoring covariance uncertainty. The new designs, however, also require some closely spaced samples in order to improve the estimate of the covariance function. We also relax the assumption of Gaussian observations and assume that the data are transformed to Gaussianity by means of the Box-Cox transformation. The resulting prediction method is known as trans-Gaussian kriging. We apply the Smith and Zhu [37] approach to this kriging method and show that the resulting optimal designs also depend on the available data. We illustrate our results with a data set of monthly rainfall measurements from Upper Austria.
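
The Box-Cox transformation mentioned in this abstract maps positive data toward Gaussianity before kriging. A minimal sketch of the forward and inverse transforms follows; the rainfall values and the choice of the transformation parameter lambda are hypothetical (in practice lambda is estimated from the data, e.g. jointly with the covariance parameters).

```python
import math

def boxcox(x, lam):
    """Box-Cox transform for positive x; lam = 0 reduces to the log transform."""
    if lam == 0:
        return math.log(x)
    return (x ** lam - 1.0) / lam

def inv_boxcox(y, lam):
    """Inverse transform, mapping Gaussian-scale values back to the data scale."""
    if lam == 0:
        return math.exp(y)
    return (lam * y + 1.0) ** (1.0 / lam)

# Hypothetical monthly rainfall values transformed with lam = 0.5, which is
# close to the square-root transform often applied to rainfall data.
rain = [12.0, 45.5, 3.2, 88.1]
lam = 0.5
z = [boxcox(r, lam) for r in rain]          # kriging would operate on z
back = [inv_boxcox(v, lam) for v in z]      # predictions are back-transformed
```

In trans-Gaussian kriging, prediction is carried out on the transformed scale and the back-transformation (with a bias correction not shown here) returns predictions on the original scale.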

  7. Determination of multi-element in marine sediment samples collected in Angola by the k0-NAA technique

    International Nuclear Information System (INIS)

    Teixeira, M.C.P.; Ho Manh Dung; Cao Dong Vu; Nguyen Thi Sy; Nguyen Thanh Binh; Vuong Huu Tan

    2006-01-01

    The marine sediment samples were collected in Angola for a marine environmental pollution study. The k 0 -standardization method of neutron activation analysis (k 0 -NAA) at the Dalat research reactor has been developed to determine multiple elements in the Angola marine sediment samples. The samples were irradiated in cell 7-1 for short- and middle-lived nuclides and in the rotary specimen rack for long-lived nuclides. The irradiation facilities were characterized for neutron spectrum parameters, and the activated samples were measured on calibrated gamma-ray spectrometers using HPGe detectors. The analytical results for 9 marine sediment samples covering 27 elements: Al, As, Br, Ca, Ce, Cl, Co, Cs, Dy, Fe, Hf, I, K, Mg, Mn, Na, Rb, Sb, Sc, Se, Sm, Th, Ti, U, V and Zn, in terms of mean concentration, standard deviation and content range, are shown in the report. Analytical quality assurance was done by analysis of a Japanese certified reference material, marine sediment NMIJ-CRM-7302a. These preliminary results revealed that the k 0 -NAA technique at the Dalat research reactor is a good analytical technique for the determination of multiple elements in marine sediment samples. Some heavy metals and trace elements determined in this work are possibly connected to human activities at the sampling region. (author)

  8. Extraction of Plutonium From Spiked INEEL Soil Samples Using the Ligand-Assisted Supercritical Fluid Extraction (LA-SFE) Technique

    International Nuclear Information System (INIS)

    Fox, R.V.; Mincher, B.J.; Holmes, R.G.G.

    1999-01-01

    In order to investigate the effectiveness of ligand-assisted supercritical fluid extraction for the removal of transuranic contamination from soils, an Idaho National Engineering and Environmental Laboratory (INEEL) silty-clay soil sample was obtained from near the Radioactive Waste Management Complex area and subjected to three different chemical preparations before being spiked with plutonium. The spiked INEEL soil samples were subjected to a sequential aqueous extraction procedure to determine radionuclide partitioning in each sample. Results from those extractions demonstrate that plutonium consistently partitioned into the residual fraction across all three INEEL soil preparations, whereas americium partitioned 73% into the iron/manganese fraction for soil preparation A, with the balance partitioning into the residual fraction. Plutonium and americium were extracted from the INEEL soil samples using a ligand-assisted supercritical fluid extraction technique. Initial supercritical fluid extraction runs produced plutonium extraction efficiencies ranging from 14% to 19%. After a second round in which the initial extraction parameters were changed, the plutonium extraction efficiencies increased to 60%, and as high as 80%, with the americium level in the post-extracted soil samples dropping to near the detection limits. The third round of experiments is currently underway. These results demonstrate that the ligand-assisted supercritical fluid extraction technique can effectively extract plutonium from the spiked INEEL soil preparations

  9. Provenance validation of polished rice samples using nuclear and isotopic analytical techniques

    International Nuclear Information System (INIS)

    Pabroa, P.C.B.; Sucgang, R.J.; Mendoza, N.D.S.; Ebihara, M.; Peña, M.

    2015-01-01

    Rice (Oryza sativa) has been considered the best staple food among all cereals and is the staple food for over 3 billion people, constituting over half of the world's population. Elemental and isotopic analysis revealed variance between Philippine and Japanese rice. Rice samples collected in Japan and in the Philippines (market survey samples from Metro Manila, and farm harvests from Aklan province and Central Luzon) were washed, dried and ground to fine powder. Elemental analyses of the samples were carried out using instrumental neutron activation analysis (INAA), while isotopic signatures of the samples were determined using isotope ratio mass spectrometry (IRMS). Results show that, compared with the unpolished rice standard NIES CRM10b, the polished Japanese and Philippine rice samples show concentrations reduced by as much as 1/10, 1/4, 1/5 and 1/3 for Mg, Mn, K and Na, respectively. Levels of Ca and Zn are not greatly affected. Arsenic, probably introduced from fertilizers used in rice fields, was found in all the Japanese rice tested at an average concentration of 0.103 μg/g and in three out of four of the Philippine rice samples at an average concentration of 0.70 μg/g. Higher levels of Br seen in two of the Philippine rice samples, at 14 and 34 μg/g, indicated a probable contamination source from the pesticide methyl bromide used during quarantine. Good correlation of isotopic signatures with geographical location for polished, but not for unpolished, rice samples from Central Luzon and Aklan indicated that provenance studies are best done on polished rice samples. The δ13C isotopic signatures are those of a C3 plant, with a possibly narrow distinguishing range: Japanese rice falls within -27.5 to -28.5 while Philippine rice falls within -29 to -30. Rice provenance can be ascertained using elemental analysis and isotopic abundance determination, as shown by this study. (author)
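
The δ13C values reported in this abstract use the standard delta notation: the parts-per-thousand deviation of the sample's 13C/12C ratio from a reference standard. A minimal sketch follows; the reference ratio shown is one commonly quoted value for the VPDB standard, and the example ratio is hypothetical.

```python
# A commonly quoted 13C/12C ratio for the VPDB reference standard
# (exact values vary slightly between calibrations).
R_VPDB = 0.011180

def delta13c(r_sample, r_std=R_VPDB):
    """Delta notation: per-mil deviation of the sample 13C/12C ratio
    from the standard ratio. Negative values mean depleted in 13C."""
    return (r_sample / r_std - 1.0) * 1000.0

# A C3 plant such as rice is depleted in 13C relative to the standard,
# so its delta value is negative; here we construct a ratio that
# corresponds to -28 per mil, within the range reported for Japanese rice.
r = R_VPDB * (1 - 28.0 / 1000.0)
d = delta13c(r)
```

Because C3 and C4 plants discriminate against 13C to different degrees during photosynthesis, a food's δ13C range can hint at both its botanical source and, as here, its growing region.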

  10. Trace uranium analysis in geological sample by isotope dilution-alpha spectrometry and comparison with other techniques

    International Nuclear Information System (INIS)

    Shihomatsu, H.M.; Iyer, S.S.

    1988-12-01

    The determination of uranium in geological samples by the alpha spectrometric isotope dilution technique using a 233 U tracer is described in the present work. The various steps involved in the method, namely preparation of the sample, electrodeposition, alpha spectrometry, isotope dilution, calculation of the concentration and error statistics, are discussed in detail. The experimental parameters for the electrodeposition of uranium, such as current density, pH, concentration of the electrolyte solution, deposition time and electrode distance, were all optimised based on the efficiency of the deposition. The total accuracy and precision of IDAS using the 233 U tracer in the determination of uranium in mineral and granite samples were of the order of 1 to 2% for the concentration range of 50-1500 ppm U. Our results are compared with those obtained by other workers using similar and different techniques. (author) [pt
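
The isotope dilution principle behind this method can be sketched with a simple calculation: activity A = λN, so a known number of 233U tracer atoms and the measured ratio of alpha-peak counts give the number of analyte atoms. This is an illustrative sketch, not the paper's procedure; the count numbers are hypothetical, and it assumes equal detection efficiency for the two alpha peaks.

```python
import math

T12_U233 = 1.592e5   # half-life of 233U in years
T12_U238 = 4.468e9   # half-life of 238U in years

def atoms_from_dilution(n_tracer, counts_analyte, counts_tracer):
    """Isotope dilution: counts are proportional to activity A = lambda * N,
    so N_238 = N_233 * (counts_238 / counts_233) * (lambda_233 / lambda_238)."""
    lam_233 = math.log(2) / T12_U233
    lam_238 = math.log(2) / T12_U238
    return n_tracer * (counts_analyte / counts_tracer) * (lam_233 / lam_238)

# Hypothetical measurement: 1e12 tracer atoms spiked in, equal peak areas.
# Because 238U is far longer-lived (lower specific activity), equal count
# rates imply vastly more 238U atoms than tracer atoms.
n_238 = atoms_from_dilution(1e12, 5000, 5000)
```

The appeal of the method is that chemical yield and detector efficiency cancel in the ratio, since tracer and analyte are chemically identical and are processed together.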

  11. Clinical application of microsampling versus conventional sampling techniques in the quantitative bioanalysis of antibiotics: a systematic review.

    Science.gov (United States)

    Guerra Valero, Yarmarly C; Wallis, Steven C; Lipman, Jeffrey; Stove, Christophe; Roberts, Jason A; Parker, Suzanne L

    2018-03-01

    Conventional sampling techniques for clinical pharmacokinetic studies often require the removal of large blood volumes from patients. This can result in a physiological or emotional burden, particularly for neonates or pediatric patients. Antibiotic pharmacokinetic studies are typically performed on healthy adults or general ward patients. These may not account for alterations to a patient's pathophysiology and can lead to suboptimal treatment. Microsampling offers an important opportunity for clinical pharmacokinetic studies in vulnerable patient populations, where smaller sample volumes can be collected. This systematic review provides a description of currently available microsampling techniques and an overview of studies reporting the quantitation and validation of antibiotics using microsampling. A comparison of microsampling to conventional sampling in clinical studies is included.

  12. Sampling technique for collection of expired CO2 in studies using naturally labelled 13C in calves

    International Nuclear Information System (INIS)

    Chevalier, R.; Pelletier, G.; Gagnon, M.

    1984-01-01

    Natural isotopic labelling of animal metabolic processes presents great advantages over the use of artificially enriched compounds. These advantages include the use of natural diets, low cost, and ease of handling, sampling and disposal. Changes in the 13C/12C ratio of expired CO2 after ingestion of a test meal containing a variable proportion of C3 and/or C4 plant-derived products permit measurement of the immediate oxidation of dietary carbohydrates. A breathing device for the collection of expired CO2 and purification techniques are described. The breath-sampling device is composed of a polypropylene enclosure and a rubber bag. The rubber bag is blown up by the animal and a sample of the expired gas is transferred to a sampling glass holder in which a vacuum had previously been achieved. Purification of the expired CO2 sample is done by differential freezing

  13. Elemental analysis of water and soil environmental samples in Tabuk area by neutron capture gamma-ray spectroscopy techniques

    International Nuclear Information System (INIS)

    Al-Aseery, Sh.M.; Alamoudi, Z.; Hassan, A.M.

    2006-01-01

    The prompt and delayed gamma-rays due to neutron capture in the nuclei of the constituent elements of three soil samples and one drinking water sample have been measured. The 252 Cf and 226 Ra/Be isotopic neutron sources were used for neutron irradiation, together with a hyper-pure germanium detection system. The soil samples were from the Astra, Tadco and El-Gammaz farms, while the water sample was taken from Tabuk city. In the case of prompt gamma-ray analysis, a total of 16 elements were identified and the concentration percentage values by weight were calculated for the elements C, Na, Mg, Al, Si, S, Cl, Ca, Ti, Cr, Mn, Fe, Co, Zn, Sr and Pb. A comparative study between the results obtained in this work and the results obtained by ICP-MS and EDX-ray techniques for the same samples is given

  14. Introduction to Body Composition Assessment Using the Deuterium Dilution Technique with Analysis of Urine Samples by Isotope Ratio Mass Spectrometry

    International Nuclear Information System (INIS)

    2010-01-01

    The IAEA has fostered the more widespread use of a stable isotope technique to assess body composition in different population groups to address priority areas in public health nutrition in Member States. It has done this by supporting national and regional nutrition projects through its technical cooperation programme and coordinated research projects over many years. This publication was developed by an international group of experts to provide practical hands-on guidance in the use of this technique in settings where analysis of stable isotope ratios in biological samples is to be made by isotope ratio mass spectrometry. The publication is targeted at new users of this technique, for example nutritionists, analytical chemists and other professionals. More detailed information on the theoretical background and the practical applications of state of the art methodologies to monitor changes in body composition can be found in IAEA Human Health Series No. 3, Assessment of Body Composition and Total Energy Expenditure in Humans by Stable Isotope Techniques

  15. Seasonal comparison of moss bag technique against vertical snow samples for monitoring atmospheric pollution.

    Science.gov (United States)

    Salo, Hanna; Berisha, Anna-Kaisa; Mäkinen, Joni

    2016-03-01

    This is the first study seasonally applying Sphagnum papillosum moss bags and vertical snow samples for monitoring atmospheric pollution. Moss bags, exposed in January, were collected together with snow samples by early March 2012 near the Harjavalta Industrial Park in southwest Finland. Magnetic, chemical, scanning electron microscopy-energy dispersive X-ray spectroscopy (SEM-EDX), K-means clustering, and Tomlinson pollution load index (PLI) data showed parallel spatial trends of pollution dispersal for both materials. The results strengthen previous findings that concentrate- and slag-handling activities were important (dust) emission sources, while the impact from the Cu-Ni smelter's pipe remained secondary at closer distances. Statistically significant correlations existed between the variables of snow and moss bags. In summary, both methods work well for sampling and are efficient pollutant accumulators. Moss bags can also be used in winter conditions, and they provide a more homogeneous and better controlled sampling method than snow samples. Copyright © 2015. Published by Elsevier B.V.
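
The Tomlinson pollution load index used in this study is the geometric mean of contamination factors, each being a measured concentration divided by a background level. A minimal sketch follows; the metal concentrations and background values are hypothetical, not data from the paper.

```python
import math

def pollution_load_index(concentrations, baselines):
    """Tomlinson PLI: the n-th root of the product of contamination
    factors CF_i = C_i / C_background_i. PLI > 1 indicates pollution."""
    cfs = [c / b for c, b in zip(concentrations, baselines)]
    return math.prod(cfs) ** (1.0 / len(cfs))

# Hypothetical metal concentrations (e.g. in a moss bag) vs. background levels.
measured = {"Cu": 120.0, "Ni": 35.0, "Zn": 240.0, "Pb": 18.0}
background = {"Cu": 30.0, "Ni": 14.0, "Zn": 120.0, "Pb": 12.0}
pli = pollution_load_index(list(measured.values()), list(background.values()))
```

Using the geometric rather than arithmetic mean keeps a single extreme element from dominating the index, which makes PLI maps comparable across sites with different element mixes.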

  16. Techniques of lumbar-sacral spine fusion in spondylosis: systematic literature review and meta-analysis of randomized clinical trials.

    Science.gov (United States)

    Umeta, Ricardo S G; Avanzi, Osmar

    2011-07-01

    Spine fusions can be performed through different techniques and are used to treat a number of vertebral pathologies. However, there seems to be no consensus regarding which technique of fusion is best suited to treat each distinct spinal disease or group of diseases. To study the effectiveness and complications of the different techniques used for spinal fusion in patients with lumbar spondylosis. Systematic literature review and meta-analysis. Randomized clinical studies comparing the most commonly performed surgical techniques for spine fusion in lumbar-sacral spondylosis, as well as those reporting patient outcome, were selected. The aim was to identify which technique, if any, presents the best clinical, functional, and radiographic outcome. Systematic literature review and meta-analysis based on scientific articles published and indexed in the following databases: PubMed (1966-2009), Cochrane Collaboration-CENTRAL, EMBASE (1980-2009), and LILACS (1982-2009). The general search strategy focused on the surgical treatment of patients with lumbar-sacral spondylosis. Eight studies met the inclusion criteria and were selected, with a total of 1,136 patients. Meta-analysis showed that patients who underwent interbody fusion presented a significantly smaller blood loss (p=.001) and a greater rate of bone fusion (p=.02). Patients who underwent fusion via the posterolateral approach had a significantly shorter operative time (p=.007) and fewer perioperative complications (p=.03). No statistically significant difference was found for the other studied variables (pain, functional impairment, and return to work). The most commonly used techniques for lumbar spine fusion in patients with spondylosis were interbody fusion and the posterolateral approach. Both techniques were comparable in final outcome, but the former presented better fusion rates and the latter fewer complications. Copyright © 2011 Elsevier Inc. All rights reserved.

  17. Fast neutron and gamma-ray transmission technique in mixed samples. MCNP calculations

    International Nuclear Information System (INIS)

    Perez, N.; Padron, I.

    2001-01-01

    In this paper the moisture content of sand and the sulfur content of toluene have been determined using the simultaneous fast neutron/gamma transmission technique (FNGT). Monte Carlo calculations show that it is possible to apply this technique with accelerator-based and isotopic neutron sources for on-line analysis in product quality control, specifically in the building materials and petroleum industries. Particles from a 14 MeV neutron generator and from an Am-Be neutron source were used. The estima