WorldWideScience

Sample records for stratified random-sampling technique

  1. MUP, CEC-DES, STRADE. Codes for uncertainty propagation, experimental design and stratified random sampling techniques

    International Nuclear Information System (INIS)

    Amendola, A.; Astolfi, M.; Lisanti, B.

    1983-01-01

    The report describes how to use the codes: MUP (Monte Carlo Uncertainty Propagation) for uncertainty analysis by Monte Carlo simulation, including correlation analysis, extreme-value identification and the study of selected ranges of the variable space; CEC-DES (Central Composite Design) for building experimental matrices according to the requirements of Central Composite and Factorial experimental designs; and STRADE (Stratified Random Design) for experimental designs based on Latin Hypercube Sampling techniques. Application fields of the codes are probabilistic risk assessment, experimental design, sensitivity analysis and system identification problems.
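
    As background to what a code like MUP automates, here is a minimal sketch (not the MUP implementation itself) of Monte Carlo uncertainty propagation through a user-supplied model, with a crude input-output correlation analysis; the model function and input distributions are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)

def model(x1, x2):
    """Hypothetical response function standing in for the user's code."""
    return x1**2 + 0.5 * x1 * x2

N = 10_000
x1 = rng.normal(1.0, 0.1, N)        # assumed input distributions
x2 = rng.lognormal(0.0, 0.2, N)

y = model(x1, x2)

print("mean =", y.mean(), "std =", y.std(ddof=1))
print("95% range:", np.percentile(y, [2.5, 97.5]))
# crude sensitivity/correlation analysis between each input and the output
for name, x in [("x1", x1), ("x2", x2)]:
    print(f"corr({name}, y) = {np.corrcoef(x, y)[0, 1]:.3f}")
```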

  2. Stratified source-sampling techniques for Monte Carlo eigenvalue analysis

    International Nuclear Information System (INIS)

    Mohamed, A.

    1998-01-01

    In 1995, at a conference on criticality safety, a special session was devoted to the Monte Carlo 'Eigenvalue of the World' problem. Argonne presented a paper at that session in which the anomalies originally observed in that problem were reproduced in a much simplified model-problem configuration, and removed by a version of stratified source-sampling. In this paper, stratified source-sampling techniques are generalized and applied to three different Eigenvalue of the World configurations which take into account real-world statistical noise sources not included in the model problem, but which differ in the amount of neutronic coupling among the constituents of each configuration. It is concluded that, in Monte Carlo eigenvalue analysis of loosely coupled arrays, the use of stratified source-sampling reduces the probability of encountering an anomalous result relative to conventional source-sampling methods. However, this gain in reliability is substantially less than that observed in the model-problem results.

  3. Monitoring oil persistence on beaches : SCAT versus stratified random sampling designs

    International Nuclear Information System (INIS)

    Short, J.W.; Lindeberg, M.R.; Harris, P.M.; Maselko, J.M.; Pella, J.J.; Rice, S.D.

    2003-01-01

    In the event of a coastal oil spill, shoreline clean-up assessment teams (SCAT) commonly rely on visual inspection of the entire affected area to monitor the persistence of the oil on beaches. Occasionally, pits are excavated to evaluate the persistence of subsurface oil. This approach is practical for directing clean-up efforts directly following a spill. However, sampling of the 1989 Exxon Valdez oil spill in Prince William Sound 12 years later has shown that visual inspection combined with pit excavation does not offer estimates of contaminated beach area or stranded oil volumes. This information is needed to statistically evaluate the significance of change with time. Assumptions regarding the correlation of visually evident surface oil and cryptic subsurface oil are usually not evaluated as part of the SCAT mandate. Stratified random sampling can avoid such problems and could produce precise estimates of oiled area and volume that allow for statistical assessment of major temporal trends and the extent of the impact. The 2001 sampling of the shoreline of Prince William Sound showed that 15 per cent of surface oil occurrences were associated with subsurface oil. This study demonstrates the usefulness of the stratified random sampling method and shows how sampling design parameters impact the statistical outcome. Power analysis based on the study results indicates that optimum power is derived when unnecessary stratification is avoided. It is emphasized that sampling effort should be balanced between choosing sufficient beaches for sampling and the intensity of sampling at each beach.

  4. Application of a stratified random sampling technique to the estimation and minimization of respirable quartz exposure to underground miners

    International Nuclear Information System (INIS)

    Makepeace, C.E.; Horvath, F.J.; Stocker, H.

    1981-11-01

    The aim of a stratified random sampling plan is to provide the best estimate (in the absence of full-shift personal gravimetric sampling) of personal exposure to respirable quartz among underground miners. One also gains information about the exposure distribution of all the miners at the same time. Three variables (or strata) are considered in the present scheme: locations, occupations and times of sampling. Random sampling within each stratum ensures that each location, occupation and time of sampling has an equal opportunity of being selected without bias. Following implementation of the plan and analysis of the collected data, one can determine the individual exposures and the mean. This information can then be used to identify those groups whose exposure contributes significantly to the collective exposure. In turn, this identification, along with other considerations, allows the mine operator to carry out a cost-benefit optimization and eventual implementation of engineering controls for these groups. This optimization and engineering control procedure, together with the random sampling plan, can then be used in an iterative manner to minimize the mean value of the distribution and the collective exposures.
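
    A minimal sketch of the kind of selection such a plan requires: drawing sampling times at random within each location x occupation stratum, so every stratum is represented without bias. The strata labels and sample counts are illustrative assumptions, not values from the report.

```python
import itertools
import random

random.seed(1)

# Hypothetical strata; a real plan would take these from the mine survey.
locations = ["stope A", "stope B", "haulage drift"]
occupations = ["driller", "mucker", "timberman"]
weeks = list(range(1, 13))       # sampling times: weeks 1..12

n_per_stratum = 2                # assumed, chosen for the required precision

plan = []
for loc, occ in itertools.product(locations, occupations):
    # random sampling times within each location x occupation stratum
    for week in random.sample(weeks, n_per_stratum):
        plan.append((loc, occ, week))

for entry in sorted(plan):
    print(entry)
```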

  5. Sample size estimation and sampling techniques for selecting a representative sample

    Directory of Open Access Journals (Sweden)

    Aamir Omair

    2014-01-01

    Introduction: The purpose of this article is to provide a general understanding of the concepts of sampling as applied to health-related research. Sample Size Estimation: It is important to select a representative sample in quantitative research in order to be able to generalize the results to the target population. The sample should be of the required sample size and must be selected using an appropriate probability sampling technique. There are many hidden biases which can adversely affect the outcome of the study. Important factors to consider in estimating the sample size include the size of the study population, the confidence level, the expected proportion of the outcome variable (for categorical variables) or the standard deviation of the outcome variable (for numerical variables), and the required precision (margin of accuracy) of the study. The greater the precision required, the larger the required sample size. Sampling Techniques: The probability sampling techniques applied in health-related research include simple random sampling, systematic random sampling, stratified random sampling, cluster sampling, and multistage sampling. These are recommended over the nonprobability sampling techniques, because the results of the study can then be generalized to the target population.
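
    As an illustration of the sample-size reasoning described above, here is a small sketch of the standard formula for estimating a proportion, n = z^2 p(1-p)/d^2, with an optional finite-population correction; the numbers are made up for the example.

```python
import math

def sample_size_proportion(p, d, z=1.96, population=None):
    """n = z^2 * p * (1 - p) / d^2, optionally corrected for a finite population."""
    n = z**2 * p * (1 - p) / d**2
    if population is not None:                  # finite-population correction
        n = n / (1 + (n - 1) / population)
    return math.ceil(n)

# expected prevalence 30%, margin of accuracy +/-5%, 95% confidence
print(sample_size_proportion(p=0.30, d=0.05))                   # 323
print(sample_size_proportion(p=0.30, d=0.05, population=2000))  # smaller n
```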

  6. Fixed-location hydroacoustic monitoring designs for estimating fish passage using stratified random and systematic sampling

    International Nuclear Information System (INIS)

    Skalski, J.R.; Hoffman, A.; Ransom, B.H.; Steig, T.W.

    1993-01-01

    Five alternate sampling designs are compared using 15 d of 24-h continuous hydroacoustic data to identify the most favorable approach to fixed-location hydroacoustic monitoring of salmonid outmigrants. Four alternative approaches to systematic sampling are compared among themselves and with stratified random sampling (STRS). Stratifying systematic sampling (STSYS) on a daily basis is found to reduce sampling error in multiday monitoring studies. Although sampling precision was predictable with varying levels of effort in STRS, neither the magnitude nor the direction of change in precision was predictable when effort was varied in systematic sampling (SYS). Furthermore, modifying systematic sampling to include replicated (e.g., nested) sampling (RSYS) is shown to provide unbiased point and variance estimates, as does STRS. Numerous short sampling intervals (e.g., 12 samples of 1-min duration per hour) must be monitored hourly using RSYS to provide efficient, unbiased point and interval estimates. For equal levels of effort, STRS outperformed all variations of SYS examined. Parametric approaches to confidence interval estimates are found to be superior to nonparametric interval estimates (i.e., bootstrap and jackknife) in estimating total fish passage. 10 refs., 1 fig., 8 tabs
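
    For reference, a minimal sketch of the stratified-random estimator such a design relies on: the estimated total is the sum over strata of N_h * ybar_h, with variance sum of N_h^2 (1 - n_h/N_h) s_h^2 / n_h. The hourly counts below are fabricated, with each hour treated as a stratum of 60 possible one-minute samples.

```python
import numpy as np

# One day split into hourly strata; within each hour, n_h one-minute
# hydroacoustic counts are drawn at random out of N_h = 60 possible minutes.
rng = np.random.default_rng(0)
N_h = 60
strata = {h: rng.poisson(5 + h % 6, size=12) for h in range(24)}  # fabricated

total, var = 0.0, 0.0
for counts in strata.values():
    n_h = len(counts)
    total += N_h * counts.mean()
    # per-stratum variance with finite-population correction
    var += N_h**2 * (1 - n_h / N_h) * counts.var(ddof=1) / n_h

se = var**0.5
print(f"estimated passage: {total:.0f} +/- {1.96 * se:.0f} (95% CI)")
```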

  7. BWIP-RANDOM-SAMPLING, Random Sample Generation for Nuclear Waste Disposal

    International Nuclear Information System (INIS)

    Sagar, B.

    1989-01-01

    1 - Description of program or function: Random samples for different distribution types are generated. The distribution types required for performance assessment modeling of geologic nuclear waste disposal are provided. These are: - Uniform, - Log-uniform (base 10 or natural), - Normal, - Lognormal (base 10 or natural), - Exponential, - Bernoulli, - User-defined continuous distribution. 2 - Method of solution: A linear congruential generator is used for uniform random numbers. A set of functions is used to transform the uniform distribution to the other distributions. Stratified, rather than purely random, sampling can be chosen. Truncation limits can be specified for many distributions whose usual definition has infinite support. 3 - Restrictions on the complexity of the problem: Generation of correlated random variables is not included.
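
    A minimal sketch of the two mechanisms the program description mentions: a linear congruential generator for uniform variates, and an inverse-transform step mapping them onto another distribution (exponential here). The LCG constants are standard textbook values, not necessarily those used in BWIP-RANDOM-SAMPLING.

```python
import math

def lcg(seed, a=1664525, c=1013904223, m=2**32):
    """Linear congruential generator yielding uniforms on (0, 1)."""
    x = seed
    while True:
        x = (a * x + c) % m
        yield (x + 0.5) / m          # shift avoids exact 0 and 1

def exponential(u, rate=1.0):
    """Inverse transform: map a uniform variate to an exponential one."""
    return -math.log(1.0 - u) / rate

gen = lcg(seed=12345)
print([round(exponential(next(gen), rate=0.5), 3) for _ in range(5)])
```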

  8. Evaluating effectiveness of down-sampling for stratified designs and unbalanced prevalence in Random Forest models of tree species distributions in Nevada

    Science.gov (United States)

    Elizabeth A. Freeman; Gretchen G. Moisen; Tracy S. Frescino

    2012-01-01

    Random Forests is frequently used to model species distributions over large geographic areas. Complications arise when data used to train the models have been collected in stratified designs that involve different sampling intensity per stratum. The modeling process is further complicated if some of the target species are relatively rare on the landscape leading to an...

  9. Stratified random sampling plans designed to assist in the determination of radon and radon daughter concentrations in underground uranium mine atmosphere

    International Nuclear Information System (INIS)

    Makepeace, C.E.

    1981-01-01

    Sampling strategies for the monitoring of deleterious agents present in uranium mine air in underground and surface mining areas are described. These methods are designed to prevent overexposure of the lining of the respiratory system of uranium miners to ionizing radiation from radon and radon daughters, and whole-body overexposure to external gamma radiation. A detailed description is provided of the stratified random sampling monitoring methodology used for obtaining baseline data, which serve as a reference for subsequent compliance assessment.

  10. Estimation of Finite Population Mean in Multivariate Stratified Sampling under Cost Function Using Goal Programming

    Directory of Open Access Journals (Sweden)

    Atta Ullah

    2014-01-01

    In the practical use of a stratified random sampling scheme, the investigator faces the problem of selecting a sample that maximizes the precision of a finite population mean under a cost constraint. The allocation of sample size becomes complicated when more than one characteristic is observed on each selected unit in a sample. In many real-life situations, a cost function linear in the stratum sample sizes n_h is not a good approximation to the actual cost of a sample survey when the travelling cost between selected units in a stratum is significant. In this paper, the sample allocation problem in multivariate stratified random sampling with the proposed cost function is formulated as an integer nonlinear multiobjective mathematical programming problem. A solution procedure is proposed using an extended lexicographic goal programming approach. A numerical example is presented to illustrate the computational details and to compare the efficiency of the proposed compromise allocation.
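
    Not the paper's goal-programming formulation, but a sketch of the classical single-variable baseline it generalizes: optimal allocation under a linear cost budget, where n_h is proportional to N_h S_h / sqrt(c_h). All numbers are invented.

```python
import numpy as np

N = np.array([400, 300, 300])      # stratum sizes (invented)
S = np.array([10.0, 20.0, 40.0])   # stratum standard deviations (invented)
c = np.array([1.0, 4.0, 9.0])      # per-unit cost of sampling each stratum
C = 300.0                          # total budget

# Minimize Var(total) = sum(N_h^2 S_h^2 / n_h) subject to sum(c_h n_h) = C:
# the Lagrange solution is n_h = C * (N_h S_h / sqrt(c_h)) / sum(N_k S_k sqrt(c_k))
n = C * (N * S / np.sqrt(c)) / np.sum(N * S * np.sqrt(c))
n = np.maximum(2, np.round(n)).astype(int)   # integer, at least 2 per stratum

print(n, "total cost:", np.sum(c * n))
```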

  11. Effects of unstratified and centre-stratified randomization in multi-centre clinical trials.

    Science.gov (United States)

    Anisimov, Vladimir V

    2011-01-01

    This paper deals with the analysis of randomization effects in multi-centre clinical trials. The two randomization schemes most often used in clinical trials are considered: unstratified and centre-stratified block-permuted randomization. The prediction of the number of patients randomized to different treatment arms in different regions during the recruitment period accounting for the stochastic nature of the recruitment and effects of multiple centres is investigated. A new analytic approach using a Poisson-gamma patient recruitment model (patients arrive at different centres according to Poisson processes with rates sampled from a gamma distributed population) and its further extensions is proposed. Closed-form expressions for corresponding distributions of the predicted number of the patients randomized in different regions are derived. In the case of two treatments, the properties of the total imbalance in the number of patients on treatment arms caused by using centre-stratified randomization are investigated and for a large number of centres a normal approximation of imbalance is proved. The impact of imbalance on the power of the study is considered. It is shown that the loss of statistical power is practically negligible and can be compensated by a minor increase in sample size. The influence of patient dropout is also investigated. The impact of randomization on predicted drug supply overage is discussed. Copyright © 2010 John Wiley & Sons, Ltd.
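
    A small simulation sketch of the recruitment model described above: centre rates drawn from a gamma distribution, Poisson arrivals per centre, block-permuted randomization within each centre, and the resulting total treatment imbalance. All parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(7)

n_centres, T = 50, 12.0                # centres, recruitment period (months)
rates = rng.gamma(shape=2.0, scale=0.5, size=n_centres)  # per-centre rates
patients = rng.poisson(rates * T)                        # arrivals per centre

def centre_imbalance(n, block=4):
    """Imbalance A - B from block-permuted randomization in one centre."""
    _, rem = divmod(n, block)
    if rem == 0:
        return 0                       # complete blocks are perfectly balanced
    arms = rng.permutation([1] * (block // 2) + [-1] * (block // 2))[:rem]
    return int(arms.sum())             # only the last, incomplete block counts

imbalance = sum(centre_imbalance(int(n)) for n in patients)
print("patients:", patients.sum(), "total imbalance (A - B):", imbalance)
```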

  12. Stereo imaging and random array stratified imaging for cargo radiation inspecting

    International Nuclear Information System (INIS)

    Wang Jingjin; Zeng Yu

    2003-01-01

    This paper presents stereo imaging and random-array stratified imaging for cargo-container radiation inspection. By using a dual-line vertical detector array scan, a stereo image of the inspected cargo can be obtained and viewed as a virtual-reality scene. The random detector array has only one row of detectors, but they are distributed randomly over a certain horizontal extent. Scanning a cargo container with this random array yields a 'defocused' image. By 'anti-random focusing', one layer of the image can be focused against the background of the defocused images from all other layers. A stratified X-ray image of overlapped bicycle wheels is presented.

  13. A random spatial sampling method in a rural developing nation

    Science.gov (United States)

    Michelle C. Kondo; Kent D.W. Bream; Frances K. Barg; Charles C. Branas

    2014-01-01

    Nonrandom sampling of populations in developing nations has limitations and can inaccurately estimate health phenomena, especially among hard-to-reach populations such as rural residents. However, random sampling of rural populations in developing nations can be challenged by incomplete enumeration of the base population. We describe a stratified random sampling method...

  14. Data splitting for artificial neural networks using SOM-based stratified sampling.

    Science.gov (United States)

    May, R J; Maier, H R; Dandy, G C

    2010-03-01

    Data splitting is an important consideration during artificial neural network (ANN) development where hold-out cross-validation is commonly employed to ensure generalization. Even for a moderate sample size, the sampling methodology used for data splitting can have a significant effect on the quality of the subsets used for training, testing and validating an ANN. Poor data splitting can result in inaccurate and highly variable model performance; however, the choice of sampling methodology is rarely given due consideration by ANN modellers. Increased confidence in the sampling is of paramount importance, since the hold-out sampling is generally performed only once during ANN development. This paper considers the variability in the quality of subsets that are obtained using different data splitting approaches. A novel approach to stratified sampling, based on Neyman sampling of the self-organizing map (SOM), is developed, with several guidelines identified for setting the SOM size and sample allocation in order to minimize the bias and variance in the datasets. Using an example ANN function approximation task, the SOM-based approach is evaluated in comparison to random sampling, DUPLEX, systematic stratified sampling, and trial-and-error sampling to minimize the statistical differences between data sets. Of these approaches, DUPLEX is found to provide benchmark performance with good model performance, with no variability. The results show that the SOM-based approach also reliably generates high-quality samples and can therefore be used with greater confidence than other approaches, especially in the case of non-uniform datasets, with the benefit of scalability to perform data splitting on large datasets. Copyright 2009 Elsevier Ltd. All rights reserved.
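
    The paper's approach stratifies on a self-organizing map; as a simplified stand-in, the sketch below stratifies on equal-width bins of the response and applies Neyman allocation (n_h proportional to N_h * S_h) to draw a training subset. This conveys the allocation idea without the SOM; the data and bin count are invented.

```python
import numpy as np

rng = np.random.default_rng(4)

x = rng.uniform(-3, 3, size=2_000)
y = np.sin(x) + 0.1 * rng.standard_normal(x.size)   # toy regression data

# stratify on equal-width bins of y (stand-in for SOM clusters)
n_bins, n_train = 8, 400
edges = np.linspace(y.min(), y.max(), n_bins + 1)[1:-1]
bins = np.digitize(y, edges)

# Neyman allocation: n_h proportional to N_h * S_h
sizes = np.array([np.sum(bins == h) for h in range(n_bins)])
stds = np.array([y[bins == h].std() if np.sum(bins == h) > 1 else 0.0
                 for h in range(n_bins)])
alloc = np.round(n_train * sizes * stds / np.sum(sizes * stds)).astype(int)

train_idx = np.concatenate([
    rng.choice(np.flatnonzero(bins == h), size=min(a, int(sizes[h])),
               replace=False)
    for h, a in enumerate(alloc) if a > 0
])
print("training subset size:", train_idx.size)
```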

  15. Distribution-Preserving Stratified Sampling for Learning Problems.

    Science.gov (United States)

    Cervellera, Cristiano; Maccio, Danilo

    2017-06-09

    The need for extracting a small sample from a large amount of real data, possibly streaming, arises routinely in learning problems, e.g., for storage, to cope with computational limitations, obtain good training/test/validation sets, and select minibatches for stochastic gradient neural network training. Unless we have reasons to select the samples in an active way dictated by the specific task and/or model at hand, it is important that the distribution of the selected points is as similar as possible to the original data. This is obvious for unsupervised learning problems, where the goal is to gain insights on the distribution of the data, but it is also relevant for supervised problems, where the theory explains how the training set distribution influences the generalization error. In this paper, we analyze the technique of stratified sampling from the point of view of distances between probabilities. This allows us to introduce an algorithm, based on recursive binary partition of the input space, aimed at obtaining samples that are distributed as much as possible as the original data. A theoretical analysis is proposed, proving the (greedy) optimality of the procedure together with explicit error bounds. An adaptive version of the algorithm is also introduced to cope with streaming data. Simulation tests on various data sets and different learning tasks are also provided.

  16. An improved algorithm of image processing technique for film thickness measurement in a horizontal stratified gas-liquid two-phase flow

    Energy Technology Data Exchange (ETDEWEB)

    Kuntoro, Hadiyan Yusuf, E-mail: hadiyan.y.kuntoro@mail.ugm.ac.id; Majid, Akmal Irfan; Deendarlianto, E-mail: deendarlianto@ugm.ac.id [Center for Energy Studies, Gadjah Mada University, Sekip K-1A Kampus UGM, Yogyakarta 55281 (Indonesia); Department of Mechanical and Industrial Engineering, Faculty of Engineering, Gadjah Mada University, Jalan Grafika 2, Yogyakarta 55281 (Indonesia); Hudaya, Akhmad Zidni; Dinaryanto, Okto [Department of Mechanical and Industrial Engineering, Faculty of Engineering, Gadjah Mada University, Jalan Grafika 2, Yogyakarta 55281 (Indonesia)

    2016-06-03

    Due to the importance of two-phase flow research for industrial safety analysis, many researchers have developed various methods and techniques to study two-phase flow phenomena in industrial settings, such as the chemical, petroleum and nuclear industries. One of the developing methods and techniques is the image processing technique. This technique is widely used in two-phase flow research because of its non-intrusive capability to process large amounts of visualization data containing many complexities. Moreover, this technique makes it possible to capture direct visual information about the flow that is difficult to obtain by other methods and techniques. The main objective of this paper is to present an improved algorithm of the image processing technique, developed from a preceding algorithm, for stratified flow cases. The present algorithm can measure the film thickness (h_L) of stratified flow as well as the geometrical properties of the interfacial waves, with lower processing time and random-access memory (RAM) usage than the preceding algorithm. The measurement results are also intended to build up a high-quality database of stratified flow, which is currently scarce. In the present work, the measurement results showed satisfactory agreement with previous works.

  17. [Study of spatial stratified sampling strategy of Oncomelania hupensis snail survey based on plant abundance].

    Science.gov (United States)

    Xun-Ping, W; An, Z

    2017-07-27

    Objective To optimize and simplify the survey method for Oncomelania hupensis snails in marshland regions endemic for schistosomiasis, so as to improve the precision, efficiency and economy of the snail survey. Methods A snail sampling strategy (Spatial Sampling Scenario of Oncomelania based on Plant Abundance, SOPA), which took plant abundance as an auxiliary variable, was explored in an experimental study in a 50 m x 50 m plot in a marshland in the Poyang Lake region. Firstly, the push-broom survey data were stratified into 5 layers by the plant abundance data; then, the required numbers of optimal sampling points for each layer were calculated through the Hammond McCullagh equation; thirdly, every sample point was pinpointed in line with the Multiple Directional Interpolation (MDI) placement scheme; and finally, a comparison study was performed among the outcomes of the spatial random sampling strategy, the traditional systematic sampling method, the spatial stratified sampling method, Sandwich spatial sampling and inference, and SOPA. Results The method (SOPA) proposed in this study had the minimal absolute error of 0.2138; the traditional systematic sampling method had the largest estimate, with an absolute error of 0.9244. Conclusion The snail sampling strategy (SOPA) proposed in this study obtains higher estimation accuracy than the other four methods.

  18. An unbiased estimator of the variance of simple random sampling using mixed random-systematic sampling

    OpenAIRE

    Padilla, Alberto

    2009-01-01

    Systematic sampling is a commonly used technique due to its simplicity and ease of implementation. The drawback of this simplicity is that it is not possible to estimate the design variance without bias. There are several ways to circumvent this problem. One method is to suppose that the variable of interest has a random order in the population, so the sample variance of simple random sampling without replacement is used. By means of a mixed random - systematic sample, an unbiased estimator o...

  19. Independent random sampling methods

    CERN Document Server

    Martino, Luca; Míguez, Joaquín

    2018-01-01

    This book systematically addresses the design and analysis of efficient techniques for independent random sampling. Both general-purpose approaches, which can be used to generate samples from arbitrary probability distributions, and tailored techniques, designed to efficiently address common real-world practical problems, are introduced and discussed in detail. In turn, the monograph presents fundamental results and methodologies in the field, elaborating and developing them into the latest techniques. The theory and methods are illustrated with a varied collection of examples, which are discussed in detail in the text and supplemented with ready-to-run computer code. The main problem addressed in the book is how to generate independent random samples from an arbitrary probability distribution with the weakest possible constraints or assumptions in a form suitable for practical implementation. The authors review the fundamental results and methods in the field, address the latest methods, and emphasize the li...

  20. Random forcing of geostrophic motion in rotating stratified turbulence

    Science.gov (United States)

    Waite, Michael L.

    2017-12-01

    Random forcing of geostrophic motion is a common approach in idealized simulations of rotating stratified turbulence. Such forcing represents the injection of energy into large-scale balanced motion, and the resulting breakdown of quasi-geostrophic turbulence into inertia-gravity waves and stratified turbulence can shed light on the turbulent cascade processes of the atmospheric mesoscale. White noise forcing is commonly employed, which excites all frequencies equally, including frequencies much higher than the natural frequencies of large-scale vortices. In this paper, the effects of these high frequencies in the forcing are investigated. Geostrophic motion is randomly forced with red noise over a range of decorrelation time scales τ, from a few time steps to twice the large-scale vortex time scale. It is found that short τ (i.e., nearly white noise) results in about 46% more gravity wave energy than longer τ, despite the fact that waves are not directly forced. We argue that this effect is due to wave-vortex interactions, through which the high frequencies in the forcing are able to excite waves at their natural frequencies. It is concluded that white noise forcing should be avoided, even if it is only applied to the geostrophic motion, when a careful investigation of spontaneous wave generation is needed.
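
    A sketch of how red-noise forcing with a prescribed decorrelation time tau is typically generated, via an AR(1) (Ornstein-Uhlenbeck) update; this is a generic construction, not the paper's exact forcing scheme, and the parameter values are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(3)

dt, tau, sigma = 0.01, 1.0, 1.0     # time step, decorrelation time, amplitude
n_steps = 10_000

a = np.exp(-dt / tau)               # AR(1) coefficient
f = np.zeros(n_steps)
for n in range(1, n_steps):
    # Ornstein-Uhlenbeck update: correlated over ~tau, stationary variance sigma^2
    f[n] = a * f[n - 1] + sigma * np.sqrt(1 - a**2) * rng.standard_normal()

# empirical check: autocorrelation at lag tau should be close to exp(-1)
lag = int(tau / dt)
r = np.corrcoef(f[:-lag], f[lag:])[0, 1]
print(f"autocorrelation at lag tau: {r:.2f} (theory: {np.exp(-1):.2f})")
```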

  1. Temporally stratified sampling programs for estimation of fish impingement

    International Nuclear Information System (INIS)

    Kumar, K.D.; Griffith, J.S.

    1977-01-01

    Impingement monitoring programs often expend valuable and limited resources and fail to provide a dependable estimate of either total annual impingement or the biological and physicochemical factors affecting impingement. In situations where initial monitoring has identified 'problem' fish species and the periodicity of their impingement, intensive sampling during periods of high impingement will maximize the information obtained. We use data gathered at two nuclear generating facilities in the southeastern United States to discuss techniques for designing such temporally stratified monitoring programs and their benefits and drawbacks. Of the possible temporal patterns in environmental factors within a calendar year, differences among seasons are most influential in the impingement of freshwater fishes in the Southeast. Data on the threadfin shad (Dorosoma petenense) and the role of seasonal temperature changes are utilized as an example to demonstrate ways of most efficiently and accurately estimating impingement of the species.

  2. Stratified sampling design based on data mining.

    Science.gov (United States)

    Kim, Yeonkook J; Oh, Yoonhwan; Park, Sunghoon; Cho, Sungzoon; Park, Hayoung

    2013-09-01

    To explore classification rules based on data mining methodologies which are to be used in defining strata in stratified sampling of healthcare providers, with improved sampling efficiency. We performed k-means clustering to group providers with similar characteristics and then constructed decision trees on the cluster labels to generate stratification rules. We assessed the variance explained by the stratification proposed in this study and by conventional stratification to evaluate the performance of the sampling design. We constructed a study database from health insurance claims data and providers' profile data made available to this study by the Health Insurance Review and Assessment Service of South Korea, and population data from Statistics Korea. From our database, we used the data for single-specialty clinics or hospitals in two specialties, general surgery and ophthalmology, for the year 2011. Data mining resulted in five strata in general surgery with two stratification variables, the number of inpatients per specialist and the population density of the provider location, and five strata in ophthalmology with two stratification variables, the number of inpatients per specialist and the number of beds. The percentages of variance in annual changes in the productivity of specialists explained by the stratification in general surgery and ophthalmology were 22% and 8%, respectively, whereas conventional stratification by the type of provider location and number of beds explained 2% and 0.2% of the variance, respectively. This study demonstrated that data mining methods can be used in designing efficient stratified sampling with variables readily available to the insurer and government; it offers an alternative to the existing stratification method that is widely used in healthcare provider surveys in South Korea.
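
    A compact sketch of the two-step idea (cluster, then derive interpretable stratification rules from a tree fitted to the cluster labels), using scikit-learn; the provider features here are synthetic stand-ins for the claims variables used in the study.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(0)

# synthetic provider profiles: inpatients per specialist, beds, pop. density
X = np.column_stack([
    rng.lognormal(3.0, 0.8, 500),
    rng.integers(0, 100, 500).astype(float),
    rng.lognormal(7.0, 1.0, 500),
])

labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(X)

# a shallow tree on the cluster labels yields human-readable strata rules
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, labels)
print(export_text(tree, feature_names=["inpatients", "beds", "density"]))
```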

  3. Generation and Analysis of Constrained Random Sampling Patterns

    DEFF Research Database (Denmark)

    Pierzchlewski, Jacek; Arildsen, Thomas

    2016-01-01

    Random sampling is a technique for signal acquisition which is gaining popularity in practical signal processing systems. Nowadays, event-driven analog-to-digital converters make random sampling feasible in practical applications. A process of random sampling is defined by a sampling pattern, which indicates the signal sampling points in time. Practical random sampling patterns are constrained by ADC characteristics and application requirements. In this paper, we introduce statistical methods which evaluate random sampling pattern generators, with emphasis on practical applications. Furthermore, we propose a new generator whose algorithm produces random sampling patterns dedicated to event-driven ADCs better than existing sampling pattern generators. Finally, implementation issues of random sampling patterns are discussed.
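
    A toy sketch of one common constraint type: drawing a pattern of sampling instants on a time grid subject to a minimum spacing between consecutive points (e.g., an ADC's minimum conversion interval). The constraint values are arbitrary, and this is a generic construction rather than the paper's generator.

```python
import random

random.seed(0)

def constrained_pattern(n_points, grid_len, min_gap):
    """Uniformly random pattern of n_points grid indices in [0, grid_len)
    with consecutive points at least min_gap positions apart."""
    reduced = grid_len - (n_points - 1) * (min_gap - 1)
    if reduced < n_points:
        raise ValueError("constraint infeasible for this grid length")
    u = sorted(random.sample(range(reduced), n_points))
    # re-inflate: shifting the i-th point by i*(min_gap-1) restores the gaps
    return [ui + i * (min_gap - 1) for i, ui in enumerate(u)]

print(constrained_pattern(n_points=10, grid_len=100, min_gap=5))
```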

  4. Bayesian stratified sampling to assess corpus utility

    Energy Technology Data Exchange (ETDEWEB)

    Hochberg, J.; Scovel, C.; Thomas, T.; Hall, S.

    1998-12-01

    This paper describes a method for asking statistical questions about a large text corpus. The authors exemplify the method by addressing the question, "What percentage of Federal Register documents are real documents, of possible interest to a text researcher or analyst?" They estimate an answer to this question by evaluating 200 documents selected from a corpus of 45,820 Federal Register documents. Bayesian analysis and stratified sampling are used to reduce the sampling uncertainty of the estimate from over 3,100 documents to fewer than 1,000. A possible application of the method is to establish baseline statistics used to estimate recall rates for information retrieval systems.
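
    A minimal sketch of the underlying computation: per-stratum Beta posteriors for the proportion of "real" documents, combined into a corpus-level estimate by stratum weights. The strata counts are invented and a uniform Beta(1, 1) prior is assumed; this illustrates the idea, not the paper's exact analysis.

```python
import numpy as np

rng = np.random.default_rng(1)

# (documents in stratum, audited sample size, 'real' docs found) - invented
strata = [(30_000, 120, 95), (10_000, 50, 20), (5_820, 30, 5)]

total = sum(N for N, _, _ in strata)
draws = np.zeros((len(strata), 100_000))
for i, (N, n, k) in enumerate(strata):
    # Beta(1 + k, 1 + n - k) posterior under a uniform prior
    draws[i] = rng.beta(1 + k, 1 + n - k, size=draws.shape[1])

weights = np.array([N / total for N, _, _ in strata])
corpus_p = weights @ draws       # posterior of the corpus-wide proportion

lo, hi = np.percentile(corpus_p, [2.5, 97.5])
print(f"real documents: {corpus_p.mean():.1%} (95% CrI {lo:.1%}-{hi:.1%})")
```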

  5. Monte Carlo stratified source-sampling

    International Nuclear Information System (INIS)

    Blomquist, R.N.; Gelbard, E.M.

    1997-01-01

    In 1995, at a conference on criticality safety, a special session was devoted to the Monte Carlo 'eigenvalue of the world' problem. Argonne presented a paper at that session in which the anomalies originally observed in that problem were reproduced in a much simplified model-problem configuration, and removed by a version of stratified source-sampling. The original test problem was treated by a special code designed specifically for that purpose. Recently ANL started work on a method for dealing with more realistic eigenvalue of the world configurations, and has been incorporating this method into VIM. The original method has been modified to take into account real-world statistical noise sources not included in the model problem. This paper constitutes a status report on work still in progress.

  6. A stratified random survey of the proportion of poor quality oral artesunate sold at medicine outlets in the Lao PDR – implications for therapeutic failure and drug resistance

    Directory of Open Access Journals (Sweden)

    Vongsack Latsamy

    2009-07-01

    Background Counterfeit oral artesunate has been a major public health problem in mainland SE Asia, impeding malaria control. A countrywide stratified random survey was performed to determine the availability and quality of oral artesunate in pharmacies and outlets (shops selling medicines) in the Lao PDR (Laos). Methods In 2003, 'mystery' shoppers were asked to buy artesunate tablets from 180 outlets in 12 of the 18 Lao provinces. Outlets were selected using stratified random sampling by investigators not involved in sampling. Samples were analysed for packaging characteristics, and by the Fast Red Dye test, high-performance liquid chromatography (HPLC), mass spectrometry (MS), X-ray diffractometry and pollen analysis. Results Of 180 outlets sampled, 25 (13.9%) sold oral artesunate. Outlets selling artesunate were more commonly found in the more malarious southern Laos. Of the 25 outlets, 22 (88%; 95% CI 68-97%) sold counterfeit artesunate, as defined by packaging and chemistry. No artesunate was detected in the counterfeits by any of the chemical analysis techniques, and analysis of the packaging demonstrated seven different counterfeit types. There was complete agreement between the Fast Red Dye test, HPLC and MS analysis. A wide variety of wrong active ingredients were found by MS. Of great concern, 4/27 (14.8%) fakes contained detectable amounts of artemisinin (0.26-115.7 mg/tablet). Conclusion This random survey confirms results from previous convenience surveys that counterfeit artesunate is a severe public health problem. The presence of artemisinin in counterfeits may encourage malaria resistance to artemisinin derivatives. With increasing accessibility of artemisinin-derivative combination therapy (ACT) in Laos, the removal of artesunate monotherapy from pharmacies may be an effective intervention.

  7. Unit Stratified Sampling as a Tool for Approximation of Stochastic Optimization Problems

    Czech Academy of Sciences Publication Activity Database

    Šmíd, Martin

    2012-01-01

    Roč. 19, č. 30 (2012), s. 153-169 ISSN 1212-074X R&D Projects: GA ČR GAP402/11/0150; GA ČR GAP402/10/0956; GA ČR GA402/09/0965 Institutional research plan: CEZ:AV0Z10750506 Institutional support: RVO:67985556 Keywords : Stochastic programming * approximation * stratified sampling Subject RIV: BB - Applied Statistics, Operational Research http://library.utia.cas.cz/separaty/2013/E/smid-unit stratified sampling as a tool for approximation of stochastic optimization problems.pdf

  8. The study of combining Latin Hypercube Sampling method and LU decomposition method (LULHS method) for constructing spatial random field

    Science.gov (United States)

    WANG, P. T.

    2015-12-01

    Groundwater modeling requires assigning hydrogeological properties to every numerical grid cell. Due to the lack of detailed information and the inherent spatial heterogeneity, geological properties can be treated as random variables. A hydrogeological property is assumed to follow a multivariate distribution with spatial correlations. By sampling random numbers from a given statistical distribution and assigning a value to each grid cell, a random field for modeling can be completed. Therefore, statistical sampling plays an important role in the efficiency of the modeling procedure. Latin Hypercube Sampling (LHS) is a stratified random sampling procedure that provides an efficient way to sample variables from their multivariate distributions. This study combines the stratified random procedure from LHS with simulation by LU decomposition to form LULHS. Both conditional and unconditional simulations with LULHS were developed. The simulation efficiency and spatial correlation of LULHS are compared to those of three other simulation methods. The results show that, for both conditional and unconditional simulation, the LULHS method is more efficient in terms of computational effort. Fewer realizations are required to achieve the required statistical accuracy and spatial correlation.

  9. A user's guide to LHS: Sandia's Latin Hypercube Sampling Software

    Energy Technology Data Exchange (ETDEWEB)

    Wyss, G.D.; Jorgensen, K.H. [Sandia National Labs., Albuquerque, NM (United States). Risk Assessment and Systems Modeling Dept.

    1998-02-01

    This document is a reference guide for LHS, Sandia's Latin Hypercube Sampling Software. This software has been developed to generate either Latin hypercube or random multivariate samples. The Latin hypercube technique employs a constrained sampling scheme, whereas random sampling corresponds to a simple Monte Carlo technique. The present program replaces the previous Latin hypercube sampling program developed at Sandia National Laboratories (SAND83-2365). This manual covers the theory behind stratified sampling as well as use of the LHS code both with the Windows graphical user interface and in the stand-alone mode.
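
    A minimal sketch of the core Latin hypercube construction (one point per equal-probability stratum in each dimension, with strata independently permuted across dimensions), not the Sandia code itself; mapping through a normal inverse CDF is shown as one example of imposing a target marginal.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(5)

def latin_hypercube(n_samples, n_dims):
    """One point per equal-probability stratum in each dimension,
    with the strata randomly permuted across dimensions."""
    strata = np.tile(np.arange(n_samples), (n_dims, 1))
    strata = rng.permuted(strata, axis=1).T          # shape (n_samples, n_dims)
    return (strata + rng.uniform(size=strata.shape)) / n_samples

u = latin_hypercube(100, 2)
x = norm.ppf(u, loc=0.0, scale=1.0)   # map uniforms to normal marginals
print(x.mean(axis=0), x.std(axis=0))  # close to (0, 1) even for modest n
```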

  10. A direct observation method for auditing large urban centers using stratified sampling, mobile GIS technology and virtual environments.

    Science.gov (United States)

    Lafontaine, Sean J V; Sawada, M; Kristjansson, Elizabeth

    2017-02-16

    With the expansion and growth of research on neighbourhood characteristics, there is an increased need for direct observational field audits. Herein, we introduce a novel direct observational audit method and systematic social observation instrument (SSOI) for efficiently assessing neighbourhood aesthetics over large urban areas. Our audit method uses spatial random sampling stratified by residential zoning and incorporates both mobile geographic information systems technology and virtual environments. The reliability of our method was tested in two ways: first, in 15 Ottawa neighbourhoods, we compared results at audited locations over two subsequent years; and second, we audited every residential block (167 blocks) in one neighbourhood and compared the distribution of SSOI aesthetics index scores with results from the randomly audited locations. Finally, we present interrater reliability and consistency results for all observed items. The observed neighbourhood average aesthetics index score estimated from four or five stratified random audit locations is sufficient to characterize the average neighbourhood aesthetics. The SSOI was internally consistent and demonstrated good to excellent interrater reliability. At the neighbourhood level, aesthetics is positively related to SES and physical activity and negatively correlated with BMI. The proposed approach to direct neighbourhood auditing performs sufficiently well and has the advantage of financial and temporal efficiency when auditing a large city.

  11. Estimating HIES Data through Ratio and Regression Methods for Different Sampling Designs

    Directory of Open Access Journals (Sweden)

    Faqir Muhammad

    2007-01-01

    In this study, a comparison has been made of different sampling designs, using the HIES data of North West Frontier Province (NWFP) for 2001-02 and 1998-99, collected from the Federal Bureau of Statistics, Statistical Division, Government of Pakistan, Islamabad. The performance of the estimators has also been considered using the bootstrap and the jackknife. A two-stage stratified random sample design is adopted by HIES. In the first stage, enumeration blocks and villages are treated as the first-stage Primary Sampling Units (PSU). The sample PSUs are selected with probability proportional to size. Secondary Sampling Units (SSU), i.e. households, are selected by systematic sampling with a random start. HIES used a single study variable. We have compared the HIES technique with some other designs, namely: stratified simple random sampling, stratified systematic sampling, stratified ranked set sampling, and stratified two-phase sampling. Ratio and regression methods were applied with two study variables: income (y) and household size (x). The jackknife and bootstrap were used for variance replication. Simple random sampling with sample sizes of 462 to 561 gave moderate variances both by jackknife and bootstrap. Applying systematic sampling, we obtained moderate variance with a sample size of 467. In the jackknife with systematic sampling, the variance of the regression estimator was greater than that of the ratio estimator for sample sizes of 467 to 631. At a sample size of 952, the variance of the ratio estimator becomes greater than that of the regression estimator. The most efficient design turns out to be ranked set sampling compared with the other designs. Ranked set sampling with jackknife and bootstrap gives minimum variance even with the smallest sample size (467). Two-phase sampling gave poor performance. The multi-stage sampling applied by HIES gave large variances, especially if used with a single study variable.
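
    For reference, a small sketch of the two estimators compared in the study, using an auxiliary variable x (household size) with known population mean to estimate the mean of y (income); the data are simulated, not HIES values.

```python
import numpy as np

rng = np.random.default_rng(9)

# simulated population: income y roughly proportional to household size x
N = 10_000
x = rng.integers(1, 10, size=N).astype(float)
y = 800.0 * x + rng.normal(0.0, 500.0, size=N)
X_bar = x.mean()                               # known population mean of x

idx = rng.choice(N, size=500, replace=False)   # simple random sample
xs, ys = x[idx], y[idx]

ratio_est = ys.mean() / xs.mean() * X_bar              # ratio estimator
b = np.cov(xs, ys, ddof=1)[0, 1] / xs.var(ddof=1)      # regression slope
regression_est = ys.mean() + b * (X_bar - xs.mean())   # regression estimator

print(f"true mean {y.mean():.1f}  ratio {ratio_est:.1f}  "
      f"regression {regression_est:.1f}")
```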

  12. Occupational position and its relation to mental distress in a random sample of Danish residents

    DEFF Research Database (Denmark)

    Rugulies, Reiner Ernst; Madsen, Ida E H; Nielsen, Maj Britt D

    2010-01-01

    PURPOSE: To analyze the distribution of depressive, anxiety, and somatization symptoms across different occupational positions in a random sample of Danish residents. METHODS: The study sample consisted of 591 Danish residents (50% women), aged 20-65, drawn from an age- and gender-stratified random...... sample of the Danish population. Participants filled out a survey that included the 92 item version of the Hopkins Symptom Checklist (SCL-92). We categorized occupational position into seven groups: high- and low-grade non-manual workers, skilled and unskilled manual workers, high- and low-grade self...

  13. Urine sampling techniques in symptomatic primary-care patients

    DEFF Research Database (Denmark)

    Holm, Anne; Aabenhus, Rune

    2016-01-01

    Background: Choice of urine sampling technique in urinary tract infection may impact diagnostic accuracy and thus lead to possible over- or undertreatment. Currently no evidence-based consensus exists regarding correct sampling technique of urine from women with symptoms of urinary tract infection. ... a randomized or paired design to compare the result of urine culture obtained with two or more collection techniques in adult, female, non-pregnant patients with symptoms of urinary tract infection. We evaluated quality of the studies and compared accuracy based on dichotomized outcomes. Results: We included ... in infection rate between mid-stream-clean-catch, mid-stream-urine and random samples. Conclusions: At present, no evidence suggests that sampling technique affects the accuracy of the microbiological diagnosis in non-pregnant women with symptoms of urinary tract infection in primary care. However ...

  14. Methylmercury speciation in the dissolved phase of a stratified lake using the diffusive gradient in thin film technique

    Energy Technology Data Exchange (ETDEWEB)

    Clarisse, Olivier [Trent University, Department of Chemistry, 1600 West Bank Drive, Peterborough, Ontario K9J 7B8 (Canada)], E-mail: olivier.clarisse@umoncton.ca; Foucher, Delphine; Hintelmann, Holger [Trent University, Department of Chemistry, 1600 West Bank Drive, Peterborough, Ontario K9J 7B8 (Canada)

    2009-03-15

    The diffusive gradient in thin film (DGT) technique was successfully used to monitor methylmercury (MeHg) speciation in the dissolved phase of a stratified boreal lake, Lake 658 of the Experimental Lakes Area (ELA) in Ontario, Canada. Water samples were conventionally analysed for MeHg, sulfides, and dissolved organic matter (DOM). MeHg accumulated by DGT devices was compared to MeHg concentration measured conventionally in water samples to establish MeHg speciation. In the epilimnion, MeHg was almost entirely bound to DOM. In the top of the hypolimnion an additional labile fraction was identified, and at the bottom of the lake a significant fraction of MeHg was potentially associated to colloidal material. As part of the METAALICUS project, isotope enriched inorganic mercury was applied to Lake 658 and its watershed for several years to establish the relationship between atmospheric Hg deposition and Hg in fish. Little or no difference in MeHg speciation in the dissolved phase was detected between ambient and spike MeHg. - Methylmercury speciation was determined in the dissolved phase of a stratified lake using the diffusive gradient in thin film technique.

  15. Methylmercury speciation in the dissolved phase of a stratified lake using the diffusive gradient in thin film technique

    International Nuclear Information System (INIS)

    Clarisse, Olivier; Foucher, Delphine; Hintelmann, Holger

    2009-01-01

    The diffusive gradient in thin film (DGT) technique was successfully used to monitor methylmercury (MeHg) speciation in the dissolved phase of a stratified boreal lake, Lake 658 of the Experimental Lakes Area (ELA) in Ontario, Canada. Water samples were conventionally analysed for MeHg, sulfides, and dissolved organic matter (DOM). MeHg accumulated by DGT devices was compared to MeHg concentration measured conventionally in water samples to establish MeHg speciation. In the epilimnion, MeHg was almost entirely bound to DOM. In the top of the hypolimnion an additional labile fraction was identified, and at the bottom of the lake a significant fraction of MeHg was potentially associated to colloidal material. As part of the METAALICUS project, isotope enriched inorganic mercury was applied to Lake 658 and its watershed for several years to establish the relationship between atmospheric Hg deposition and Hg in fish. Little or no difference in MeHg speciation in the dissolved phase was detected between ambient and spike MeHg. - Methylmercury speciation was determined in the dissolved phase of a stratified lake using the diffusive gradient in thin film technique

  16. A stratified two-stage sampling design for digital soil mapping in a Mediterranean basin

    Science.gov (United States)

    Blaschek, Michael; Duttmann, Rainer

    2015-04-01

    The quality of environmental modelling results often depends on reliable soil information. In order to obtain soil data in an efficient manner, several sampling strategies are at hand, depending on the level of prior knowledge and the overall objective of the planned survey. This study focuses on the collection of soil samples considering available continuous secondary information in an undulating, 16 km² river catchment near Ussana in southern Sardinia (Italy). A design-based, stratified, two-stage sampling design has been applied, aiming at the spatial prediction of soil property values at individual locations. The stratification was based on quantiles from density functions of two land-surface parameters - topographic wetness index and potential incoming solar radiation - derived from a digital elevation model. Combined with four main geological units, the applied procedure led to 30 different classes in the given test site. Up to six polygons of each available class were selected randomly, excluding areas smaller than 1 ha to avoid incorrect location of the points in the field. Further exclusion rules were applied before polygon selection, masking out roads and buildings using a 20 m buffer. The selection procedure was repeated ten times, and the set of polygons with the best geographical spread was chosen. Finally, exact point locations were selected randomly from inside the chosen polygon features. A second selection based on the same stratification and following the same methodology (selecting one polygon instead of six) was made in order to create an appropriate validation set. Supplementary samples were obtained during a second survey focusing on polygons that had either not been considered at all during the first phase or were not adequately represented with respect to feature size. In total, both field campaigns produced an interpolation set of 156 samples and a validation set of 41 points. The selection of sample point locations has been done using

  17. Comparison of randomization techniques for clinical trials with data from the HOMERUS-trial

    NARCIS (Netherlands)

    Verberk, W. J.; Kroon, A. A.; Kessels, A. G. H.; Nelemans, P. J.; van Ree, J. W.; Lenders, J. W. M.; Thien, T.; Bakx, J. C.; van Montfrans, G. A.; Smit, A. J.; Beltman, F. W.; de Leeuw, P. W.

    2005-01-01

    Background. Several methods of randomization are available to create comparable intervention groups in a study. In the HOMERUS-trial, we compared the minimization procedure with a stratified and a non-stratified method of randomization in order to test which one is most appropriate for use in

  18. Comparison of randomization techniques for clinical trials with data from the HOMERUS-trial.

    NARCIS (Netherlands)

    Verberk, W.J.; Kroon, A.A.; Kessels, A.G.H.; Nelemans, P.J.; Ree, J.W. van; Lenders, J.W.M.; Thien, Th.; Bakx, J.C.; Montfrans, G.A. van; Smit, A.J.; Beltman, F.W.; Leeuw, P.W. de

    2005-01-01

    BACKGROUND: Several methods of randomization are available to create comparable intervention groups in a study. In the HOMERUS-trial, we compared the minimization procedure with a stratified and a non-stratified method of randomization in order to test which one is most appropriate for use in

  19. Pedestrian-vehicle crashes and analytical techniques for stratified contingency tables.

    Science.gov (United States)

    Al-Ghamdi, Ali S

    2002-03-01

    In 1999 there were 450 fatalities due to road crashes in Riyadh, the capital of Saudi Arabia, of which 130 were pedestrians. Hence, every fourth person killed on the roads is a pedestrian. The aim of this study is to investigate pedestrian-vehicle crashes in this fast-growing city with two objectives in mind: to analyze pedestrian collisions with regard to their causes, characteristics, location of injury on the victim's body, and most common patterns; and to determine the potential for use of the odds ratio technique in the analysis of stratified contingency tables. Data from 638 pedestrian-vehicle crashes reported by police during the period 1997-1999 were used. A systematic sampling technique was followed in which every third record was used. The analysis showed that the pedestrian fatality rate per 10^5 population is 2.8. The rates were relatively high within the childhood (1-9 years) and young adult (10-19 years) groups, and the old-age groups (60 to >80 years), which indicates that young as well as elderly people in this city are more likely to be involved in fatal accidents of this type than those in other age groups. The analysis revealed that 77.1% of pedestrians were probably struck while crossing a roadway either not in a crosswalk or where no crosswalk existed. In addition, the distribution of injuries on the victims' bodies was determined from hospital records. More than one-third of the fatal injuries were located on the head and chest. An attempt was made to conduct an association analysis between crash severity (i.e. injury or fatal) and some of the study variables using chi-square and odds ratio techniques. The categorical nature of the data helped in using these analytical techniques.
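
    A short sketch of the odds-ratio machinery for stratified 2x2 contingency tables that such an association analysis relies on: per-stratum odds ratios plus the Mantel-Haenszel combined estimate. The counts are invented, with strata standing in for, e.g., age groups.

```python
# Each stratum is a 2x2 table [[a, b], [c, d]]:
# rows = exposure (e.g., crossing outside a crosswalk: yes/no),
# columns = crash severity (fatal/injury). Counts are invented.
strata = [
    [[30, 70], [10, 90]],
    [[25, 55], [15, 105]],
    [[12, 28], [8, 52]],
]

num = den = 0.0
for (a, b), (c, d) in strata:
    n = a + b + c + d
    print(f"stratum OR = {(a * d) / (b * c):.2f}")
    num += a * d / n            # Mantel-Haenszel weights
    den += b * c / n

print(f"Mantel-Haenszel combined OR = {num / den:.2f}")
```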

  20. Multiple sensitive estimation and optimal sample size allocation in the item sum technique.

    Science.gov (United States)

    Perri, Pier Francesco; Rueda García, María Del Mar; Cobo Rodríguez, Beatriz

    2018-01-01

    For surveys of sensitive issues in the life sciences, statistical procedures can be used to reduce nonresponse and social desirability response bias. Both of these phenomena provoke nonsampling errors that are difficult to deal with and can seriously flaw the validity of the analyses. The item sum technique (IST) is a very recent indirect questioning method, derived from the item count technique, that seeks to procure more reliable responses on quantitative items than direct questioning while preserving respondents' anonymity. This article addresses two important questions concerning the IST: (i) its implementation when two or more sensitive variables are investigated and efficient estimates of their unknown population means are required; and (ii) the determination of the optimal sample size to achieve minimum-variance estimates. These aspects are of great relevance for survey practitioners engaged in sensitive research and, to the best of our knowledge, had not been studied before. In this article, theoretical results for multiple estimation and optimal allocation are obtained under a generic sampling design and then particularized to simple random sampling and stratified sampling designs. The theoretical considerations are integrated with a number of simulation studies based on data from two real surveys, conducted to ascertain the efficiency gain derived from optimal allocation in different situations. One of the surveys concerns cannabis consumption among university students. Our findings highlight some methodological advances that can be obtained in life sciences IST surveys when optimal allocation is achieved. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
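
    A sketch of the basic single-variable IST estimator that the multi-variable theory builds on: one subsample answers a long list (innocuous items plus the sensitive item), the other a short list (innocuous items only), and the difference of the mean reported sums estimates the sensitive mean. The data are simulated, not from the surveys cited.

```python
import numpy as np

rng = np.random.default_rng(11)

# simulated truth: a sensitive count variable plus three innocuous count items
n = 2_000
sensitive = rng.poisson(2.0, size=n)
innocuous = rng.poisson(6.0, size=(n, 3))

# split the simple random sample into long-list and short-list subsamples
long_idx = rng.choice(n, size=n // 2, replace=False)
short_mask = np.ones(n, dtype=bool)
short_mask[long_idx] = False

long_sum = innocuous[long_idx].sum(axis=1) + sensitive[long_idx]
short_sum = innocuous[short_mask].sum(axis=1)

# IST estimator: difference of mean reported totals
est = long_sum.mean() - short_sum.mean()
print(f"estimated sensitive mean {est:.2f} (truth {sensitive.mean():.2f})")
```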

  1. Reliability of impingement sampling designs: An example from the Indian Point station

    International Nuclear Information System (INIS)

    Mattson, M.T.; Waxman, J.B.; Watson, D.A.

    1988-01-01

    A 4-year data base (1976-1979) of daily fish impingement counts at the Indian Point electric power station on the Hudson River was used to compare the precision and reliability of three random-sampling designs: (1) simple random, (2) seasonally stratified, and (3) empirically stratified. The precision of daily impingement estimates improved logarithmically for each design as more days in the year were sampled. Simple random sampling was the least, and empirically stratified sampling was the most precise design, and the difference in precision between the two stratified designs was small. Computer-simulated sampling was used to estimate the reliability of the two stratified-random-sampling designs. A seasonally stratified sampling design was selected as the most appropriate reduced-sampling program for Indian Point station because: (1) reasonably precise and reliable impingement estimates were obtained using this design for all species combined and for eight common Hudson River fish by sampling only 30% of the days in a year (110 d); and (2) seasonal strata may be more precise and reliable than empirical strata if future changes in annual impingement patterns occur. The seasonally stratified design applied to the 1976-1983 Indian Point impingement data showed that selection of sampling dates based on daily species-specific impingement variability gave results that were more precise, but not more consistently reliable, than sampling allocations based on the variability of all fish species combined. 14 refs., 1 fig., 6 tabs

  2. Characterisation of the suspended particulate matter in a stratified estuarine environment employing complementary techniques

    Science.gov (United States)

    Thomas, Luis P.; Marino, Beatriz M.; Szupiany, Ricardo N.; Gallo, Marcos N.

    2017-09-01

    The ability to predict the sediment and nutrient circulation within estuarine waters is of significant economic and ecological importance. In these complex systems, flocculation is a dynamically active process that is directly affected by the prevalent environmental conditions. Consequently, the floc properties continuously change, which greatly complicates the characterisation of the suspended particle matter (SPM). In the present study, three different techniques are combined in a stratified estuary under quiet weather conditions and with a low river discharge to search for a solution to this problem. The challenge is to obtain the concentration, size and flux of suspended elements through selected cross-sections using the method based on the simultaneous backscatter records of 1200 and 600 kHz ADCPs, isokinetic sampling data and LISST-25X measurements. The two-ADCP method is highly effective for determining the SPM size distributions in a non-intrusive way. The isokinetic sampling and the LISST-25X diffractometer offer point measurements at specific depths, which are especially useful for calibrating the ADCP backscatter intensity as a function of the SPM concentration and size, and providing complementary information on the sites where acoustic records are not available. Limitations and potentials of the techniques applied are discussed.

  3. Emerging Techniques in Stratified Designs and Continuous Gradients for Tissue Engineering of Interfaces

    Science.gov (United States)

    Dormer, Nathan H.; Berkland, Cory J.; Detamore, Michael S.

    2013-01-01

    Interfacial tissue engineering is an emerging branch of regenerative medicine, where engineers are faced with developing methods for the repair of one or many functional tissue systems simultaneously. Early and recent solutions for complex tissue formation have utilized stratified designs, where scaffold formulations are segregated into two or more layers, with discrete changes in physical or chemical properties, mimicking a corresponding number of interfacing tissue types. This method has brought forth promising results, along with a myriad of regenerative techniques. The latest designs, however, are employing “continuous gradients” in properties, where there is no discrete segregation between scaffold layers. This review compares the methods and applications of recent stratified approaches to emerging continuously graded methods. PMID:20411333

  4. Assessment of fracture risk: value of random population-based samples--the Geelong Osteoporosis Study.

    Science.gov (United States)

    Henry, M J; Pasco, J A; Seeman, E; Nicholson, G C; Sanders, K M; Kotowicz, M A

    2001-01-01

    Fracture risk is determined by bone mineral density (BMD). The T-score, a measure of fracture risk, is the position of an individual's BMD in relation to a reference range. The aim of this study was to determine the magnitude of change in the T-score when different sampling techniques were used to produce the reference range. Reference ranges were derived from three samples, drawn from the same region: (1) an age-stratified population-based random sample, (2) unselected volunteers, and (3) a selected healthy subset of the population-based sample with no diseases or drugs known to affect bone. T-scores were calculated using the three reference ranges for a cohort of women who had sustained a fracture and as a group had a low mean BMD (ages 35-72 yr; n = 484). For most comparisons, the T-scores for the fracture cohort were more negative using the population reference range. The difference in T-scores reached 1.0 SD. The proportion of the fracture cohort classified as having osteoporosis at the spine was 26, 14, and 23% when the population, volunteer, and healthy reference ranges were applied, respectively. The use of inappropriate reference ranges results in substantial changes to T-scores and may lead to inappropriate management.
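
    The T-score computation behind these comparisons is just the individual's BMD expressed in standard-deviation units of the chosen reference range. A small illustration with invented reference means and SDs (not the study's values), showing how the choice of reference range can flip the classification:

    ```python
    # T-score: (BMD - reference mean) / reference SD; <= -2.5 labels osteoporosis.
    # The reference means (g/cm^2) and SDs below are hypothetical.
    references = {
        "population-based": (1.06, 0.11),
        "volunteers":       (1.00, 0.12),
        "healthy subset":   (1.02, 0.12),
    }
    bmd = 0.75  # hypothetical patient spine BMD

    for name, (mean, sd) in references.items():
        t = (bmd - mean) / sd
        label = "osteoporosis" if t <= -2.5 else "not osteoporotic"
        print(f"{name:16s} T = {t:+.2f} -> {label}")
    ```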

  5. Comparison of sampling techniques for use in SYVAC

    International Nuclear Information System (INIS)

    Dalrymple, G.J.

    1984-01-01

    The Stephen Howe review (reference TR-STH-1) recommended the use of a deterministic generator (DG) sampling technique for sampling the input values to the SYVAC (SYstems Variability Analysis Code) program. This technique was compared with Monte Carlo simple random sampling (MC) by taking a 1000-run case of SYVAC using MC as the reference case. The results show that DG appears relatively inaccurate for most values of consequence when used with 11 sample intervals. If 22 sample intervals are used, then DG generates cumulative distribution functions that are statistically similar to the reference distribution. 400 runs of DG or MC are adequate to generate a representative cumulative distribution function. The MC technique appears to perform better than DG for the same number of runs. However, DG predicts higher doses and, in view of the importance of generating data in the high-dose region, this sampling technique with 22 sample intervals is recommended for use in SYVAC. (author)

  6. Sampling Methods in Cardiovascular Nursing Research: An Overview.

    Science.gov (United States)

    Kandola, Damanpreet; Banner, Davina; O'Keefe-McCarthy, Sheila; Jassal, Debbie

    2014-01-01

    Cardiovascular nursing research covers a wide array of topics, from health services to psychosocial patient experiences. The selection of specific participant samples is an important part of the research design and process. The sampling strategy employed is of utmost importance to ensure that a representative sample of participants is chosen. There are two main categories of sampling methods: probability and non-probability. Probability sampling is the random selection of elements from the population, where each element of the population has an equal and independent chance of being included in the sample. There are five main types of probability sampling: simple random sampling, systematic sampling, stratified sampling, cluster sampling, and multi-stage sampling. Non-probability sampling methods are those in which elements are chosen through non-random methods for inclusion into the research study, and include convenience sampling, purposive sampling, and snowball sampling. Each approach offers distinct advantages and disadvantages and must be considered critically. In this research column, we provide an introduction to these key sampling techniques and draw on examples from cardiovascular research. Understanding the differences in sampling techniques may aid nurses in effective appraisal of research literature and provide a reference point for nurses who engage in cardiovascular research.

  7. National Coral Reef Monitoring Program: Benthic Images Collected from Stratified Random Sites (StRS) across American Samoa in 2015

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The data described here are benthic habitat imagery that result from benthic photo-quadrat surveys conducted along transects at stratified random sites across...

  8. Prototypic Features of Loneliness in a Stratified Sample of Adolescents

    Directory of Open Access Journals (Sweden)

    Mathias Lasgaard

    2009-06-01

    Full Text Available Dominant theoretical approaches in loneliness research emphasize the value of personality characteristics in explaining loneliness. The present study examines whether dysfunctional social strategies and attributions in lonely adolescents can be explained by personality characteristics. A questionnaire survey was conducted with 379 Danish Grade 8 students (M = 14.1 years, SD = 0.4) from 22 geographically stratified and randomly selected schools. Hierarchical linear regression analysis showed that network orientation, success expectation and avoidance in affiliative situations predicted loneliness independently of personality characteristics, demographics and social desirability. The study indicates that dysfunctional strategies and attributions in affiliative situations are directly related to loneliness in adolescence. These strategies and attributions may preclude lonely adolescents from guidance and intervention. Thus, professionals need to be knowledgeable about prototypic features of loneliness in addition to employing a pro-active approach when assisting adolescents who display prototypic features.

  9. Randomized branch sampling to estimate fruit production in Pecan trees cv. ‘Barton’

    Directory of Open Access Journals (Sweden)

    Filemom Manoel Mokochinski

    Full Text Available ABSTRACT: Sampling techniques to quantify the production of fruits are still very scarce and create a gap in crop development research. This study was conducted on a rural property in the county of Cachoeira do Sul - RS to estimate the efficiency of randomized branch sampling (RBS) in quantifying the production of pecan fruit at three different ages (5, 7 and 10 years). Two selection techniques were tested: the probability proportional to diameter (PPD) and the uniform probability (UP) techniques, which were performed on nine trees, three from each age, randomly chosen. The RBS underestimated fruit production for all ages, and its main drawback was the high sampling error (125.17% - PPD and 111.04% - UP). The UP was regarded as more efficient than the PPD, though both techniques estimated similar production and similar experimental errors. In conclusion, we report that branch sampling was inaccurate for this case study, requiring new studies to produce estimates with smaller sampling error.

  10. Sampling designs and methods for estimating fish-impingement losses at cooling-water intakes

    International Nuclear Information System (INIS)

    Murarka, I.P.; Bodeau, D.J.

    1977-01-01

    Several systems for estimating fish impingement at power plant cooling-water intakes are compared to determine the most statistically efficient sampling designs and methods. Compared to a simple random sampling scheme the stratified systematic random sampling scheme, the systematic random sampling scheme, and the stratified random sampling scheme yield higher efficiencies and better estimators for the parameters in two models of fish impingement as a time-series process. Mathematical results and illustrative examples of the applications of the sampling schemes to simulated and real data are given. Some sampling designs applicable to fish-impingement studies are presented in appendixes

  11. Brachytherapy dose-volume histogram computations using optimized stratified sampling methods

    International Nuclear Information System (INIS)

    Karouzakis, K.; Lahanas, M.; Milickovic, N.; Giannouli, S.; Baltas, D.; Zamboglou, N.

    2002-01-01

    A stratified sampling method for the efficient repeated computation of dose-volume histograms (DVHs) in brachytherapy is presented, as used for anatomy-based brachytherapy optimization methods. The aim of the method is to reduce the number of sampling points required for the calculation of DVHs for the body and the PTV. Quantities such as the conformity index (COIN) and COIN integrals are derived from the DVHs. This is achieved by using partially uniform distributed sampling points, with a density in each region obtained from a survey of the gradients or the variance of the dose distribution in that region. The shape of the sampling regions is adapted to the patient anatomy and to the shape and size of the implant. The application of this method requires a single preprocessing step, which takes only a few seconds. Ten clinical implants were used to study the appropriate number of sampling points, given a required accuracy for quantities such as cumulative DVHs, COIN indices and COIN integrals. We found that DVHs of very large tissue volumes surrounding the PTV, and also COIN distributions, can be obtained using 5-10 times fewer sampling points than with uniformly distributed points

  12. Geospatial techniques for developing a sampling frame of watersheds across a region

    Science.gov (United States)

    Gresswell, Robert E.; Bateman, Douglas S.; Lienkaemper, George; Guy, T.J.

    2004-01-01

    Current land-management decisions that affect the persistence of native salmonids are often influenced by studies of individual sites that are selected based on judgment and convenience. Although this approach is useful for some purposes, extrapolating results to areas that were not sampled is statistically inappropriate because the sampling design is usually biased. Therefore, in recent investigations of coastal cutthroat trout (Oncorhynchus clarki clarki) located above natural barriers to anadromous salmonids, we used a methodology for extending the statistical scope of inference. The purpose of this paper is to apply geospatial tools to identify a population of watersheds and develop a probability-based sampling design for coastal cutthroat trout in western Oregon, USA. The population of mid-size watersheds (500-5800 ha) west of the Cascade Range divide was derived from watershed delineations based on digital elevation models. Because a database with locations of isolated populations of coastal cutthroat trout did not exist, a sampling frame of isolated watersheds containing cutthroat trout had to be developed. After the sampling frame of watersheds was established, isolated watersheds with coastal cutthroat trout were stratified by ecoregion and erosion potential based on dominant bedrock lithology (i.e., sedimentary and igneous). A stratified random sample of 60 watersheds was selected with proportional allocation in each stratum. By comparing watershed drainage areas of streams in the general population to those in the sampling frame and the resulting sample (n = 60), we were able to evaluate how representative the subset of watersheds was of the full population of watersheds. Geospatial tools provided a relatively inexpensive means to generate the information necessary to develop a statistically robust, probability-based sampling design.
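
    A sketch of the final selection step, stratified random sampling with proportional allocation, assuming a hypothetical frame of watersheds keyed by an (ecoregion, lithology) stratum (names and counts invented):

    ```python
    import random

    random.seed(1)
    # Hypothetical sampling frame: watershed id -> (ecoregion, lithology) stratum.
    frame = {f"ws_{i}": (random.choice(["coastal", "interior"]),
                         random.choice(["sedimentary", "igneous"]))
             for i in range(500)}
    n_total = 60

    by_stratum = {}
    for ws, stratum in frame.items():
        by_stratum.setdefault(stratum, []).append(ws)

    sample = []
    for stratum, members in by_stratum.items():
        n_h = round(n_total * len(members) / len(frame))  # proportional allocation
        sample += random.sample(members, n_h)
    print(f"{len(sample)} watersheds drawn from {len(by_stratum)} strata")
    ```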

  13. A Bayesian Justification for Random Sampling in Sample Survey

    Directory of Open Access Journals (Sweden)

    Glen Meeden

    2012-07-01

    Full Text Available In the usual Bayesian approach to survey sampling, the sampling design plays a minimal role, at best. Although a close relationship between exchangeable prior distributions and simple random sampling has been noted, how to formally integrate simple random sampling into the Bayesian paradigm is not clear. Recently it has been argued that the sampling design can be thought of as part of a Bayesian's prior distribution. We show here that under this scenario simple random sampling can be given a Bayesian justification in survey sampling.

  14. Stratified Sampling to Define Levels of Petrographic Variation in Coal Beds: Examples from Indonesia and New Zealand

    Directory of Open Access Journals (Sweden)

    Tim A. Moore

    2016-01-01

    Full Text Available DOI: 10.17014/ijog.3.1.29-51 Stratified sampling of coal seams for petrographic analysis using block samples is a viable alternative to standard methods of channel sampling and particulate pellet mounts. Although petrographic analysis of particulate pellets is employed widely, it is time-consuming and does not allow variation within sampling units to be assessed - an important measure in any study, whether for paleoenvironmental reconstruction or for obtaining estimates of industrial attributes. Also, samples taken as intact blocks provide additional information, such as texture and botanical affinity, that cannot be gained using particulate pellets. Stratified sampling can be employed on both ‘fine’ and ‘coarse’ grained coal units. Fine-grained coals are defined as those coal intervals that do not contain vitrain bands greater than approximately 1 mm in thickness (as measured perpendicular to bedding). In fine-grained coal seams, a reasonably sized block sample (with a polished surface area of ~3 cm2) can be taken that encapsulates the macroscopic variability. However, for coarse-grained coals (vitrain bands >1 mm) a different system has to be employed in order to accurately account for the larger particles. Macroscopic point counting of vitrain bands can accurately account for those particles >1 mm within a coal interval. This point counting is conducted using something as simple as string on a coal face, with marked intervals greater than the largest particle expected to be encountered (although new technologies are being developed to capture this type of information digitally). Comparative analyses of particulate pellets and blocks on the same interval show less than 6% variation between the two sample types when blocks are recalculated to include macroscopic counts of vitrain. Therefore, even in coarse-grained coals, stratified sampling can be used effectively and representatively.

  15. Modelling of ground penetrating radar data in stratified media using the reflectivity technique

    International Nuclear Information System (INIS)

    Sena, Armando R; Sen, Mrinal K; Stoffa, Paul L

    2008-01-01

    Horizontally layered media are often encountered in shallow exploration geophysics. Ground penetrating radar (GPR) data in these environments can be modelled by techniques that are more efficient than finite difference (FD) or finite element (FE) schemes because the lateral homogeneity of the media allows us to reduce the dependence on the horizontal spatial variables through Fourier transforms on these coordinates. We adapt and implement the invariant embedding or reflectivity technique used to model elastic waves in layered media to model GPR data. The results obtained with the reflectivity and FDTD modelling techniques are in excellent agreement and the effects of the air–soil interface on the radiation pattern are correctly taken into account by the reflectivity technique. Comparison with real wide-angle GPR data shows that the reflectivity technique can satisfactorily reproduce the real GPR data. These results and the computationally efficient characteristics of the reflectivity technique (compared to FD or FE) demonstrate its usefulness in interpretation and possible model-based inversion schemes of GPR data in stratified media

  16. Landslide Susceptibility Assessment Using Frequency Ratio Technique with Iterative Random Sampling

    Directory of Open Access Journals (Sweden)

    Hyun-Joo Oh

    2017-01-01

    Full Text Available This paper assesses the performance of landslide susceptibility analysis using the frequency ratio (FR) with iterative random sampling. A pair of before-and-after digital aerial photographs with 50 cm spatial resolution was used to detect landslide occurrences in the Yongin area, Korea. Iterative random sampling was run ten times in total, and each time it was applied to the training and validation datasets. Thirteen landslide causative factors were derived from the topographic, soil, forest, and geological maps. The FR scores were calculated from the causative factors and training occurrences repeatedly, ten times. The ten landslide susceptibility maps were obtained from the integration of the causative factors with their assigned FR scores. The landslide susceptibility maps were validated by using each validation dataset. The FR method achieved susceptibility accuracies from 89.48% to 93.21%, i.e. consistently above 89%. Moreover, the ten-times-iterated FR modeling may contribute to a better understanding of a regularized relationship between the causative factors and landslide susceptibility. This makes it possible to incorporate knowledge-driven considerations of the causative factors into the landslide susceptibility analysis, and the approach can also be applied extensively to other areas.
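
    The frequency ratio itself is simply the share of landslide cells falling in a factor class divided by the share of all cells in that class (FR > 1 suggests higher susceptibility). A minimal sketch on synthetic rasters, not the Yongin data:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    # Synthetic rasters: a slope class (1-4) per cell and landslide occurrence (0/1),
    # with steeper classes made more landslide-prone.
    slope_class = rng.integers(1, 5, size=10_000)
    landslide = (rng.random(10_000) < 0.02 * slope_class).astype(int)

    def frequency_ratio(factor, events):
        fr = {}
        for c in np.unique(factor):
            in_class = factor == c
            pct_events = events[in_class].sum() / events.sum()
            pct_area = in_class.sum() / factor.size
            fr[int(c)] = pct_events / pct_area
        return fr

    print(frequency_ratio(slope_class, landslide))  # FR rises with slope class
    ```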

  17. Sampling Strategies for Evaluating the Rate of Adventitious Transgene Presence in Non-Genetically Modified Crop Fields.

    Science.gov (United States)

    Makowski, David; Bancal, Rémi; Bensadoun, Arnaud; Monod, Hervé; Messéan, Antoine

    2017-09-01

    According to E.U. regulations, the maximum allowable rate of adventitious transgene presence in non-genetically modified (GM) crops is 0.9%. We compared four sampling methods for the detection of transgenic material in agricultural non-GM maize fields: random sampling, stratified sampling, random sampling + ratio reweighting, random sampling + regression reweighting. Random sampling involves simply sampling maize grains from different locations selected at random from the field concerned. The stratified and reweighting sampling methods make use of an auxiliary variable corresponding to the output of a gene-flow model (a zero-inflated Poisson model) simulating cross-pollination as a function of wind speed, wind direction, and distance to the closest GM maize field. With the stratified sampling method, an auxiliary variable is used to define several strata with contrasting transgene presence rates, and grains are then sampled at random from each stratum. With the two methods involving reweighting, grains are first sampled at random from various locations within the field, and the observations are then reweighted according to the auxiliary variable. Data collected from three maize fields were used to compare the four sampling methods, and the results were used to determine the extent to which transgene presence rate estimation was improved by the use of stratified and reweighting sampling methods. We found that transgene rate estimates were more accurate and that substantially smaller samples could be used with sampling strategies based on an auxiliary variable derived from a gene-flow model. © 2017 Society for Risk Analysis.
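
    As a rough illustration of the ratio-reweighting idea, the sample mean of the observed rate y is rescaled by the population mean of the model-based auxiliary variable x. The data below are synthetic, not from the studied maize fields:

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    # Hypothetical field of 2000 locations: x = gene-flow-model-predicted rate,
    # y = true transgene presence rate, correlated with x.
    x = rng.gamma(shape=0.5, scale=0.004, size=2000)
    y = np.clip(0.8 * x + rng.normal(0.0, 0.0005, size=2000), 0.0, None)

    idx = rng.choice(2000, size=100, replace=False)            # random sample of locations
    srs_estimate = y[idx].mean()                               # plain random-sampling estimate
    ratio_estimate = x.mean() * y[idx].mean() / x[idx].mean()  # ratio reweighting
    print(f"true {y.mean():.5f}  SRS {srs_estimate:.5f}  ratio {ratio_estimate:.5f}")
    ```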

  18. [Comparison study on sampling methods of Oncomelania hupensis snail survey in marshland schistosomiasis epidemic areas in China].

    Science.gov (United States)

    An, Zhao; Wen-Xin, Zhang; Zhong, Yao; Yu-Kuan, Ma; Qing, Liu; Hou-Lang, Duan; Yi-di, Shang

    2016-06-29

    To optimize and simplify the survey method for Oncomelania hupensis snails in marshland endemic regions of schistosomiasis, and to increase the precision, efficiency and economy of the snail survey. A 50 m × 50 m quadrat experimental field was selected in the Chayegang marshland near Henghu farm in the Poyang Lake region, and a whole-coverage method was adopted to survey the snails. The simple random sampling, systematic sampling and stratified random sampling methods were applied to calculate the minimum sample size, relative sampling error and absolute sampling error. The minimum sample sizes of the simple random sampling, systematic sampling and stratified random sampling methods were 300, 300 and 225, respectively. The relative sampling errors of the three methods were all less than 15%. The absolute sampling errors were 0.2217, 0.3024 and 0.0478, respectively. Spatial stratified sampling with altitude as the stratum variable is an efficient approach of lower cost and higher precision for the snail survey.

  19. Design and simulation of stratified probability digital receiver with application to the multipath communication

    Science.gov (United States)

    Deal, J. H.

    1975-01-01

    One approach to the problem of simplifying complex nonlinear filtering algorithms is the use of stratified probability approximations, in which the continuous probability density functions of certain random variables are represented by discrete mass approximations. This technique is developed in this paper and is used to simplify the filtering algorithms for the optimum receiver for signals corrupted by both additive and multiplicative noise.

  20. National Coral Reef Monitoring Program: Benthic Images Collected from Stratified Random Sites (StRS) across American Samoa in 2015 (NCEI Accession 0159168)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The data described here are benthic habitat imagery that result from benthic photo-quadrat surveys conducted along transects at stratified random sites across...

  1. Exploring pseudo- and chaotic random Monte Carlo simulations

    Science.gov (United States)

    Blais, J. A. Rod; Zhang, Zhan

    2011-07-01

    Computer simulations are an increasingly important area of geoscience research and development. At the core of stochastic or Monte Carlo simulations are the random number sequences that are assumed to be distributed with specific characteristics. Computer-generated random numbers, uniformly distributed on (0, 1), can be very different depending on the selection of pseudo-random number (PRN) or chaotic random number (CRN) generators. In the evaluation of some definite integrals, the resulting error variances can even be of different orders of magnitude. Furthermore, practical techniques for variance reduction such as importance sampling and stratified sampling can be applied in most Monte Carlo simulations and significantly improve the results. A comparative analysis of these strategies has been carried out for computational applications in planar and spatial contexts. Based on these experiments, and on some practical examples of geodetic direct and inverse problems, conclusions and recommendations concerning their performance and general applicability are included.
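
    The variance reduction from stratified sampling noted here is easy to demonstrate on a one-dimensional definite integral: partition (0, 1) into equal strata and draw the same number of points from each. A sketch with an arbitrarily chosen integrand:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    f = lambda u: np.exp(-u) * np.cos(u)   # integral over (0, 1) is ~0.5554

    def plain_mc(n):
        return f(rng.random(n)).mean()

    def stratified_mc(n, strata=32):
        m = n // strata
        # one block of uniform draws per stratum [k/strata, (k+1)/strata)
        u = (np.arange(strata)[:, None] + rng.random((strata, m))) / strata
        return f(u).mean()

    plain = [plain_mc(1024) for _ in range(500)]
    strat = [stratified_mc(1024) for _ in range(500)]
    print(f"plain MC sd {np.std(plain):.2e}  stratified sd {np.std(strat):.2e}")
    ```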

  2. Systematic random sampling of the comet assay.

    Science.gov (United States)

    McArt, Darragh G; Wasson, Gillian R; McKerr, George; Saetzler, Kurt; Reed, Matt; Howard, C Vyvyan

    2009-07-01

    The comet assay is a technique used to quantify DNA damage and repair at a cellular level. In the assay, cells are embedded in agarose and the cellular content is stripped away leaving only the DNA trapped in an agarose cavity which can then be electrophoresed. The damaged DNA can enter the agarose and migrate while the undamaged DNA cannot and is retained. DNA damage is measured as the proportion of the migratory 'tail' DNA compared to the total DNA in the cell. The fundamental basis of these arbitrary values is obtained in the comet acquisition phase using fluorescence microscopy with a stoichiometric stain in tandem with image analysis software. Current methods deployed in such an acquisition are expected to be both objectively and randomly obtained. In this paper we examine the 'randomness' of the acquisition phase and suggest an alternative method that offers both objective and unbiased comet selection. In order to achieve this, we have adopted a survey sampling approach widely used in stereology, which offers a method of systematic random sampling (SRS). This is desirable as it offers an impartial and reproducible method of comet analysis that can be used both manually or automated. By making use of an unbiased sampling frame and using microscope verniers, we are able to increase the precision of estimates of DNA damage. Results obtained from a multiple-user pooled variation experiment showed that the SRS technique attained a lower variability than that of the traditional approach. The analysis of a single user with repetition experiment showed greater individual variances while not being detrimental to overall averages. This would suggest that the SRS method offers a better reflection of DNA damage for a given slide and also offers better user reproducibility.

  3. Modern survey sampling

    CERN Document Server

    Chaudhuri, Arijit

    2014-01-01

    Exposure to Sampling: Introduction; Concepts of Population, Sample, and Sampling. Initial Ramifications: Introduction; Sampling Design, Sampling Scheme; Random Numbers and Their Uses in Simple Random Sampling (SRS); Drawing Simple Random Samples with and without Replacement; Estimation of Mean, Total, Ratio of Totals/Means: Variance and Variance Estimation; Determination of Sample Sizes; Appendix to Chapter 2: More on Equal Probability Sampling, Horvitz-Thompson Estimator, Sufficiency, Likelihood, Non-Existence Theorem. More Intricacies: Introduction; Unequal Probability Sampling Strategies; PPS Sampling. Exploring Improved Ways: Introduction; Stratified Sampling; Cluster Sampling; Multi-Stage Sampling; Multi-Phase Sampling: Ratio and Regression Estimation; Controlled Sampling. Modeling: Introduction; Super-Population Modeling; Prediction Approach; Model-Assisted Approach; Bayesian Methods; Spatial Smoothing; Sampling on Successive Occasions: Panel Rotation; Non-Response and Not-at-Homes; Weighting Adj...

  4. The Dirichlet-Multinomial model for multivariate randomized response data and small samples

    NARCIS (Netherlands)

    Avetisyan, Marianna; Fox, Gerardus J.A.

    2012-01-01

    In survey sampling the randomized response (RR) technique can be used to obtain truthful answers to sensitive questions. Although the individual answers are masked due to the RR technique, individual (sensitive) response rates can be estimated when observing multivariate response data. The

  5. Misrepresenting random sampling? A systematic review of research papers in the Journal of Advanced Nursing.

    Science.gov (United States)

    Williamson, Graham R

    2003-11-01

    This paper discusses the theoretical limitations of the use of random sampling and probability theory in the production of a significance level (or P-value) in nursing research. Potential alternatives, in the form of randomization tests, are proposed. Research papers in nursing, medicine and psychology frequently misrepresent their statistical findings, as the P-values reported assume random sampling. In this systematic review of studies published between January 1995 and June 2002 in the Journal of Advanced Nursing, 89 (68%) studies broke this assumption because they used convenience samples or entire populations. As a result, some of the findings may be questionable. The key ideas of random sampling and probability theory for statistical testing (for generating a P-value) are outlined. The result of a systematic review of research papers published in the Journal of Advanced Nursing is then presented, showing how frequently random sampling appears to have been misrepresented. Useful alternative techniques that might overcome these limitations are then discussed. REVIEW LIMITATIONS: This review is limited in scope because it is applied to one journal, and so the findings cannot be generalized to other nursing journals or to nursing research in general. However, it is possible that other nursing journals are also publishing research articles based on the misrepresentation of random sampling. The review is also limited because in several of the articles the sampling method was not completely clearly stated, and in this circumstance a judgment has been made as to the sampling method employed, based on the indications given by author(s). Quantitative researchers in nursing should be very careful that the statistical techniques they use are appropriate for the design and sampling methods of their studies. If the techniques they employ are not appropriate, they run the risk of misinterpreting findings by using inappropriate, unrepresentative and biased samples.

  6. Two-phase air-water stratified flow measurement using ultrasonic techniques

    International Nuclear Information System (INIS)

    Fan, Shiwei; Yan, Tinghu; Yeung, Hoi

    2014-01-01

    In this paper, a time-resolved ultrasound system was developed for investigating two-phase air-water stratified flow. The hardware of the system includes a pulsed-wave transducer, a pulser/receiver, and a digital oscilloscope. The time-domain cross-correlation method is used to calculate the velocity profile along the ultrasonic beam. The system is able to provide velocities with a spatial resolution of around 1 mm and a temporal resolution of 200 μs. Experiments were carried out on single-phase water flow and two-phase air-water stratified flow. For single-phase water flow, the flow rates from the ultrasound system were compared with those from an electromagnetic (EM) flow meter, which showed good agreement. The experiments were then conducted on two-phase air-water stratified flow and the results are presented. Comparison with liquid-height measurements from a conductance probe indicated that the measured velocities were explainable

  7. Systematic versus random sampling in stereological studies.

    Science.gov (United States)

    West, Mark J

    2012-12-01

    The sampling that takes place at all levels of an experimental design must be random if the estimate is to be unbiased in a statistical sense. There are two fundamental ways by which one can make a random sample of the sections and positions to be probed on the sections. Using a card-sampling analogy, one can pick any card at all out of a deck of cards. This is referred to as independent random sampling because the sampling of any one card is made without reference to the position of the other cards. The other approach to obtaining a random sample would be to pick a card within a set number of cards and others at equal intervals within the deck. Systematic sampling along one axis of many biological structures is more efficient than random sampling, because most biological structures are not randomly organized. This article discusses the merits of systematic versus random sampling in stereological studies.
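
    The two card-sampling schemes described above translate directly into code. A minimal sketch contrasting independent random picks with a systematic series started at a random offset (the section count is invented):

    ```python
    import random

    random.seed(3)
    n_sections, n_sample = 200, 10

    # Independent random sampling: every pick made without reference to the others.
    independent = sorted(random.sample(range(n_sections), n_sample))

    # Systematic random sampling: one random start within the first interval,
    # then a fixed period, guaranteeing even coverage along the axis.
    period = n_sections // n_sample
    start = random.randrange(period)
    systematic = list(range(start, n_sections, period))

    print("independent:", independent)
    print("systematic: ", systematic)
    ```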

  8. k-Means: Random Sampling Procedure

    Indian Academy of Sciences (India)

    k-Means: Random Sampling Procedure. The optimal 1-mean is approximated by the centroid of a random sample (Inaba et al.): if S is a random sample of size O(1/ε), then the centroid of S is a (1+ε)-approximate centroid of P with constant probability.

  9. Random sampling technique for ultra-fast computations of molecular opacities for exoplanet atmospheres

    Science.gov (United States)

    Min, M.

    2017-10-01

    Context. Opacities of molecules in exoplanet atmospheres rely on increasingly detailed line lists for these molecules. The line lists available today contain for many species up to several billions of lines. Computation of the spectral line profile created by pressure and temperature broadening, the Voigt profile, of all of these lines is becoming a computational challenge. Aims: We aim to create a method to compute the Voigt profile in a way that automatically focusses the computation time on the strongest lines, while still maintaining the continuum contribution of the large number of weaker lines. Methods: Here, we outline a statistical line sampling technique that samples the Voigt profile quickly and with high accuracy. The number of samples is adjusted to the strength of the line and the local spectral line density. This automatically provides high-accuracy line shapes for strong lines or lines that are spectrally isolated. The line sampling technique automatically preserves the integrated line opacity for all lines, thereby also providing the continuum opacity created by the large number of weak lines at very low computational cost. Results: The line sampling technique is tested for accuracy when computing line spectra and correlated-k tables. Extremely fast computations (~3.5 × 10^5 lines per second per core on a standard current-day desktop computer) with high accuracy (≤1% almost everywhere) are obtained. A detailed recipe on how to perform the computations is given.

  10. Comparison of sampling strategies for object-based classification of urban vegetation from Very High Resolution satellite images

    Science.gov (United States)

    Rougier, Simon; Puissant, Anne; Stumpf, André; Lachiche, Nicolas

    2016-09-01

    Vegetation monitoring is becoming a major issue in the urban environment due to the services vegetation provides, and it necessitates accurate and up-to-date mapping. Very High Resolution satellite images enable a detailed mapping of urban tree and herbaceous vegetation. Several supervised classifications with statistical learning techniques have provided good results for the detection of urban vegetation, but they require a large amount of training data. In this context, this study investigates the performance of different sampling strategies in order to reduce the number of examples needed. Two state-of-the-art window-based active learning algorithms are compared to a classical stratified random sampling, and a third strategy combining active learning and stratified sampling is proposed. The efficiency of these strategies is evaluated on two medium-size French cities, Strasbourg and Rennes, associated with different datasets. Results demonstrate that classical stratified random sampling can in some cases be just as effective as active learning methods, and that it should be used more frequently to evaluate new active learning methods. Moreover, the active learning strategies proposed in this work reduce the computational runtime by selecting multiple windows at each iteration without increasing the number of windows needed.

  11. Design of dry sand soil stratified sampler

    Science.gov (United States)

    Li, Erkang; Chen, Wei; Feng, Xiao; Liao, Hongbo; Liang, Xiaodong

    2018-04-01

    This paper presents the design of a stratified sampler for dry sandy soil, which can be used for stratified sampling of loose sand under certain conditions. Our group designed the mechanical structure of a portable, single-person, dry sandy soil stratified sampler. We have also set up a mathematical model for the sampler. This lays the foundation for further design research and development.

  12. The Dirichlet-Multinomial Model for Multivariate Randomized Response Data and Small Samples

    Science.gov (United States)

    Avetisyan, Marianna; Fox, Jean-Paul

    2012-01-01

    In survey sampling the randomized response (RR) technique can be used to obtain truthful answers to sensitive questions. Although the individual answers are masked due to the RR technique, individual (sensitive) response rates can be estimated when observing multivariate response data. The beta-binomial model for binary RR data will be generalized…

  13. Statistical Theory of the Vector Random Decrement Technique

    DEFF Research Database (Denmark)

    Asmussen, J. C.; Brincker, Rune; Ibrahim, S. R.

    1999-01-01

    decays. Due to the speed and/or accuracy of the Vector Random Decrement technique, it was introduced as an attractive alternative to the Random Decrement technique. In this paper, the theory of the Vector Random Decrement technique is extended by applying a statistical description of the stochastic...

  14. Sampling problems for randomly broken sticks

    Energy Technology Data Exchange (ETDEWEB)

    Huillet, Thierry [Laboratoire de Physique Theorique et Modelisation, CNRS-UMR 8089 et Universite de Cergy-Pontoise, 5 mail Gay-Lussac, 95031, Neuville sur Oise (France)

    2003-04-11

    Consider the random partitioning model of a population (represented by a stick of length 1) into n species (fragments) with identically distributed random weights (sizes). Upon ranking the fragments' weights according to ascending sizes, let S_{m:n} be the size of the mth smallest fragment. Assume that some observer is sampling such populations as follows: drop at random k points (the sample size) onto this stick and record the corresponding numbers of visited fragments. We shall investigate the following sampling problems: (1) what is the sample size if the sampling is carried out until the first visit of the smallest fragment (size S_{1:n})? (2) For a given sample size, have all the fragments of the stick been visited at least once or not? This question is related to Feller's random coupon collector problem. (3) In what order are new fragments being discovered, and what is the random number of samples separating the discovery of consecutive new fragments until exhaustion of the list? For this problem, the distribution of the size-biased permutation of the species' weights (i.e., the sequence of their weights in their order of appearance) is needed and studied.

  15. The status of dental caries and related factors in a sample of Iranian adolescents

    DEFF Research Database (Denmark)

    Pakpour, Amir H.; Hidarnia, Alireza; Hajizadeh, Ebrahim

    2011-01-01

    Objective: To describe the status of dental caries in a sample of Iranian adolescents aged 14 to 18 years in Qazvin, and to identify caries-related factors affecting this group. Study design: Qazvin was divided into three zones according to socio-economic status. The sampling procedure used was a stratified cluster sampling technique, incorporating 3 stratified zones, for each of which a cluster of school children was recruited from randomly selected high schools. The adolescents agreed to participate in the study and to complete a questionnaire. Dental caries status was assessed in terms of decayed... their teeth on a regular basis. Although the incidence of caries was found to be moderate, it was influenced by demographic factors such as age and gender in addition to socio-behavioral variables such as family income, the level of education attained by parents, the frequency of dental brushing and flossing...

  16. Properties of the endogenous post-stratified estimator using a random forests model

    Science.gov (United States)

    John Tipton; Jean Opsomer; Gretchen G. Moisen

    2012-01-01

    Post-stratification is used in survey statistics as a method to improve variance estimates. In traditional post-stratification methods, the variable on which the data is being stratified must be known at the population level. In many cases this is not possible, but it is possible to use a model to predict values using covariates, and then stratify on these predicted...

  17. Nitrogen transformations in stratified aquatic microbial ecosystems

    DEFF Research Database (Denmark)

    Revsbech, Niels Peter; Risgaard-Petersen, N.; Schramm, Andreas

    2006-01-01

    New analytical methods such as advanced molecular techniques and microsensors have resulted in new insights about how nitrogen transformations in stratified microbial systems such as sediments and biofilms are regulated at a µm-mm scale. A large and ever-expanding knowledge base about nitrogen fixation, nitrification, denitrification, and dissimilatory reduction of nitrate to ammonium, and about the microorganisms performing these processes, has been produced by use of these techniques. During the last decade, the discovery of anammox bacteria and of migrating, nitrate-accumulating bacteria performing dissimilatory reduction of nitrate to ammonium has given new dimensions to the understanding of nitrogen cycling in nature, and the occurrence of these organisms and processes in stratified microbial communities will be described in detail.

  18. Development and Assessment of an E-Learning Course on Breast Imaging for Radiographers: A Stratified Randomized Controlled Trial

    Science.gov (United States)

    Ventura, Sandra Rua; Ramos, Isabel; Rodrigues, Pedro Pereira

    2015-01-01

    Background Mammography is considered the best imaging technique for breast cancer screening, and the radiographer plays an important role in its performance. Therefore, continuing education is critical to improving the performance of these professionals and thus providing better health care services. Objective Our goal was to develop an e-learning course on breast imaging for radiographers, assessing its efficacy, effectiveness, and user satisfaction. Methods A stratified randomized controlled trial was performed with radiographers and radiology students who already had mammography training, using pre- and post-knowledge tests and satisfaction questionnaires. The primary outcome was the improvement in test results (percentage of correct answers), using intention-to-treat and per-protocol analysis. Results A total of 54 participants were assigned to the intervention (20 students plus 34 radiographers) with 53 controls (19+34). The intervention was completed by 40 participants (11+29), with 4 (2+2) discontinued interventions, and 10 (7+3) lost to follow-up. Differences in the primary outcome were found between intervention and control: 21 versus 4 percentage points (pp). The e-learning course is effective, especially for radiographers, which highlights the need for continuing education. PMID:25560547

  19. Application of bias factor method using random sampling technique for prediction accuracy improvement of critical eigenvalue of BWR

    International Nuclear Information System (INIS)

    Ito, Motohiro; Endo, Tomohiro; Yamamoto, Akio; Kuroda, Yusuke; Yoshii, Takashi

    2017-01-01

    The bias factor method based on the random sampling technique is applied to the benchmark problem of Peach Bottom Unit 2. The validity and applicability of the present method, i.e., its ability to correct calculation results and to reduce their uncertainty, are confirmed, and its features and performance are examined. In the present study, core characteristics in cycle 3 are corrected with the proposed method using predicted and 'measured' critical eigenvalues in cycles 1 and 2. As the source of uncertainty, the variance-covariance of the cross sections is considered. The calculation results indicate that the bias between predicted and measured results, and the uncertainty owing to the cross sections, can both be reduced. Extension to other uncertainties such as thermal-hydraulics properties will be a future task. (author)

  20. STATISTICAL LANDMARKS AND PRACTICAL ISSUES REGARDING THE USE OF SIMPLE RANDOM SAMPLING IN MARKET RESEARCHES

    Directory of Open Access Journals (Sweden)

    CODRUŢA DURA

    2010-01-01

    Full Text Available The sample represents a particular segment of the statistical population chosen to represent it as a whole. The representativeness of the sample determines the accuracy for estimations made on the basis of calculating the research indicators and the inferential statistics. The method of random sampling is part of probabilistic methods which can be used within marketing research, and it is characterized by the fact that it imposes the requirement that each unit belonging to the statistical population should have an equal chance of being selected for the sampling process. When simple random sampling is meant to be rigorously put into practice, it is recommended to use the technique of random number tables in order to configure the sample which will provide the information that the marketer needs. The paper also details the practical procedure implemented in order to create a sample for a marketing research by generating random numbers using the facilities offered by Microsoft Excel.
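
    The random-number procedure described translates into a few lines of code; the sketch below uses Python's random module in place of the Microsoft Excel facilities mentioned in the paper, with a hypothetical frame of 1200 units:

    ```python
    import random

    random.seed(2024)
    population = list(range(1, 1201))   # hypothetical frame: units numbered 1..1200
    n = 100

    # Programmatic analogue of a random-number table: draw n distinct unit numbers,
    # each unit having an equal chance of selection (simple random sampling).
    sample_ids = random.sample(population, n)
    print(sorted(sample_ids)[:10], "...")
    ```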

  1. National Coral Reef Monitoring Program: Benthic Cover Derived from Analysis of Benthic Images Collected during Stratified Random Surveys (StRS) across American Samoa in 2015

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The data described here result from benthic photo-quadrat surveys conducted along transects at stratified random sites across American Samoa in 2015 as a part of...

  2. Improved importance sampling technique for efficient simulation of digital communication systems

    Science.gov (United States)

    Lu, Dingqing; Yao, Kung

    1988-01-01

    A new, improved importance sampling (IIS) approach to simulation is considered. Some basic concepts of IS are introduced, and detailed derivations of simulation estimation variances for Monte Carlo (MC) and IS simulations are given. The general results obtained from these derivations are applied to the specific previously known conventional importance sampling (CIS) technique and the new IIS technique. The derivation for a linear system with no signal random memory is considered in some detail. For the CIS technique, the optimum input scaling parameter is found, while for the IIS technique, the optimum translation parameter is found. The results are generalized to a linear system with memory and signals. Specific numerical and simulation results are given which show the advantages of CIS over MC and IIS over CIS for simulations of digital communications systems.
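
    A minimal sketch of the translation idea behind IIS, applied to a bare Gaussian tail probability rather than a full communication-system simulation (threshold and sample size invented): the sampling density is shifted so that rare events become common, and the bias is removed by the likelihood ratio.

    ```python
    import numpy as np
    from math import erfc, sqrt

    rng = np.random.default_rng(1)
    t, n = 4.0, 100_000              # tail threshold, number of simulation runs

    def phi(x, mu=0.0):
        """Normal(mu, 1) density."""
        return np.exp(-0.5 * (x - mu) ** 2) / np.sqrt(2.0 * np.pi)

    # Plain Monte Carlo: events beyond t are so rare the estimate is unstable.
    mc = (rng.standard_normal(n) > t).mean()

    # Translation-type importance sampling: shift the mean to t, then reweight.
    y = rng.standard_normal(n) + t
    iis = ((y > t) * phi(y) / phi(y, mu=t)).mean()

    exact = 0.5 * erfc(t / sqrt(2.0))
    print(f"exact {exact:.3e}  plain MC {mc:.3e}  IS {iis:.3e}")
    ```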

  3. Mahalanobis' Contributions to Sample Surveys

    Indian Academy of Sciences (India)

    Sample Survey started its operations in October 1950 under the ... and adopted random cuts for estimating the acreage under jute ... demographic factors relating to indebtedness, unemployment, ... traffic surveys, demand for currency coins and average life of .... Mahalanobis derived the optimum allocation in stratified.

  4. Estimation of Sensitive Proportion by Randomized Response Data in Successive Sampling

    Directory of Open Access Journals (Sweden)

    Bo Yu

    2015-01-01

    Full Text Available This paper considers the problem of estimating binomial proportions of sensitive or stigmatizing attributes in the population of interest. Randomized response techniques are suggested for protecting the privacy of respondents and reducing the response bias while eliciting information on sensitive attributes. In many sensitive-question surveys, the same population is often sampled repeatedly on each occasion. In this paper, we apply a successive sampling scheme to improve the estimation of the sensitive proportion on the current occasion.
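
    For concreteness, a sketch of one classical randomized response design, Warner's model, which may differ from the specific RR technique used in the paper (all numbers invented). Each respondent answers the sensitive statement with probability p and its negation otherwise, and the observed "yes" rate is unmasked algebraically:

    ```python
    import random

    random.seed(5)
    true_pi, p, n = 0.15, 0.7, 5000   # hypothetical prevalence, design prob., sample size

    yes = 0
    for _ in range(n):
        sensitive = random.random() < true_pi      # respondent's true status
        asks_direct = random.random() < p          # spinner outcome, hidden from interviewer
        yes += sensitive if asks_direct else not sensitive

    lam = yes / n                                  # observed "yes" proportion
    pi_hat = (lam - (1 - p)) / (2 * p - 1)         # Warner (1965) moment estimator
    print(f"estimated prevalence {pi_hat:.3f} (true {true_pi})")
    ```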

  5. The electron transport problem sampling by Monte Carlo individual collision technique

    International Nuclear Information System (INIS)

    Androsenko, P.A.; Belousov, V.I.

    2005-01-01

    The problem of electron transport is of great interest in all fields of modern science, and Monte Carlo sampling has to be used to solve it. Electron transport is characterized by a large number of individual interactions. To simulate electron transport, the 'condensed history' technique may be used, where a large number of collisions are grouped into a single step to be sampled randomly. Another kind of Monte Carlo sampling is the individual collision technique, which has incontestable advantages over the condensed-history technique. For example, one does not need to specify the parameters required by the condensed-history technique, such as the upper limit for electron energy, the resolution, the number of sub-steps, etc. The condensed-history technique may also lose some very important electron tracks, because particle movement is constrained by the step parameters and because of weaknesses in its algorithms, for example the energy-indexing algorithm. The individual collision technique has none of these disadvantages. This report presents some sampling algorithms of the new version of the BRAND code, where the above-mentioned technique is used. All information on electrons was taken from ENDF-6 files, which form an important part of BRAND; these files have not been pre-processed but are used directly as the electron data source. Four kinds of interaction were considered: elastic interaction, bremsstrahlung, atomic excitation, and atomic electro-ionization. Some sampling results are presented and compared with analogous methods; for example, the endovascular radiotherapy problem (P2) of QUADOS 2002 is presented in comparison with other commonly used techniques. (authors)

  6. Gambling problems in the family – A stratified probability sample study of prevalence and reported consequences

    Directory of Open Access Journals (Sweden)

    Øren Anita

    2008-12-01

    Full Text Available Abstract Background Prior studies on the impact of problem gambling in the family mainly include help-seeking populations with small numbers of participants. The objective of the present stratified probability sample study was to explore the epidemiology of problem gambling in the family in the general population. Methods Men and women 16–74 years old randomly selected from the Norwegian national population database received an invitation to participate in this postal questionnaire study. The response rate was 36.1% (3,483/9,638). Given the lack of validated criteria, two survey questions ("Have you ever noticed that a close relative spent more and more money on gambling?" and "Have you ever experienced that a close relative lied to you about how much he/she gambles?") were extrapolated from the Lie/Bet Screen for pathological gambling. Respondents answering "yes" to both questions were defined as Concerned Significant Others (CSOs). Results Overall, 2.0% of the study population was defined as CSOs. Young age, female gender, and divorced marital status were factors positively associated with being a CSO. CSOs often reported having experienced conflicts in the family related to gambling, worsening of the family's financial situation, and impaired mental and physical health. Conclusion Problematic gambling behaviour not only affects the gambling individual but also has a strong impact on the quality of life of family members.

  7. Statistical sampling applied to the radiological characterization of historical waste

    Directory of Open Access Journals (Sweden)

    Zaffora Biagio

    2016-01-01

    Full Text Available The evaluation of the activity of radionuclides in radioactive waste is required for its disposal in final repositories. Easy-to-measure nuclides, like γ-emitters and high-energy X-rays, can be measured via non-destructive nuclear techniques from outside a waste package. Some radionuclides are difficult-to-measure (DTM from outside a package because they are α- or β-emitters. The present article discusses the application of linear regression, scaling factors (SF and the so-called “mean activity method” to estimate the activity of DTM nuclides on metallic waste produced at the European Organization for Nuclear Research (CERN. Various statistical sampling techniques including simple random sampling, systematic sampling, stratified and authoritative sampling are described and applied to 2 waste populations of activated copper cables. The bootstrap is introduced as a tool to estimate average activities and standard errors in waste characterization. The analysis of the DTM Ni-63 is used as an example. Experimental and theoretical values of SFs are calculated and compared. Guidelines for sampling historical waste using probabilistic and non-probabilistic sampling are finally given.

  8. Health service accreditation as a predictor of clinical and organisational performance: a blinded, random, stratified study.

    Science.gov (United States)

    Braithwaite, Jeffrey; Greenfield, David; Westbrook, Johanna; Pawsey, Marjorie; Westbrook, Mary; Gibberd, Robert; Naylor, Justine; Nathan, Sally; Robinson, Maureen; Runciman, Bill; Jackson, Margaret; Travaglia, Joanne; Johnston, Brian; Yen, Desmond; McDonald, Heather; Low, Lena; Redman, Sally; Johnson, Betty; Corbett, Angus; Hennessy, Darlene; Clark, John; Lancaster, Judie

    2010-02-01

    Despite the widespread use of accreditation in many countries, and prevailing beliefs that accreditation is associated with variables contributing to clinical care and organisational outcomes, little systematic research has been conducted to examine its validity as a predictor of healthcare performance. Objective: To determine whether accreditation performance is associated with self-reported clinical performance and independent ratings of four aspects of organisational performance. Design: Independent blinded assessment of these variables in a random, stratified sample of health service organisations. Setting: Acute care: large, medium and small health-service organisations in Australia. Study participants: Nineteen health service organisations employing 16 448 staff treating 321 289 inpatients and 1 971 087 non-inpatient services annually, representing approximately 5% of the Australian acute care health system. Main measures: Correlations of accreditation performance with organisational culture, organisational climate, consumer involvement, leadership and clinical performance. Results: Accreditation performance was significantly positively correlated with organisational culture (rho=0.618, p=0.005) and leadership (rho=0.616, p=0.005). There was a trend between accreditation and clinical performance (rho=0.450, p=0.080). Accreditation was unrelated to organisational climate (rho=0.378, p=0.110) and consumer involvement (rho=0.215, p=0.377). Conclusions: Accreditation results predict leadership behaviours and cultural characteristics of healthcare organisations but not organisational climate or consumer participation, and a positive trend between accreditation and clinical performance is noted.

  9. Random On-Board Pixel Sampling (ROPS) X-Ray Camera

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Zhehui [Los Alamos; Iaroshenko, O. [Los Alamos; Li, S. [Los Alamos; Liu, T. [Fermilab; Parab, N. [Argonne (main); Chen, W. W. [Purdue U.; Chu, P. [Los Alamos; Kenyon, G. [Los Alamos; Lipton, R. [Fermilab; Sun, K.-X. [Nevada U., Las Vegas

    2017-09-25

    Recent advances in compressed sensing theory and algorithms offer new possibilities for high-speed X-ray camera design. In many CMOS cameras, each pixel has an independent on-board circuit that includes an amplifier, noise rejection, signal shaper, an analog-to-digital converter (ADC), and optional in-pixel storage. When X-ray images are sparse, i.e., when one of the following cases is true: (a.) The number of pixels with true X-ray hits is much smaller than the total number of pixels; (b.) The X-ray information is redundant; or (c.) Some prior knowledge about the X-ray images exists, sparse sampling may be allowed. Here we first illustrate the feasibility of random on-board pixel sampling (ROPS) using an existing set of X-ray images, followed by a discussion about signal to noise as a function of pixel size. Next, we describe a possible circuit architecture to achieve random pixel access and in-pixel storage. The combination of a multilayer architecture, sparse on-chip sampling, and computational image techniques, is expected to facilitate the development and applications of high-speed X-ray camera technology.

  10. Towards Cost-efficient Sampling Methods

    OpenAIRE

    Peng, Luo; Yongli, Li; Chong, Wu

    2014-01-01

    Sampling methods have received much attention in the field of complex networks in general and statistical physics in particular. This paper presents two new sampling methods based on the observation that a small fraction of vertices with high node degree can carry most of the structural information of a network. The two proposed sampling methods are efficient in sampling the nodes with high degree. The first new sampling method is improved on the basis of the stratified random sampling method and...

  11. Monitoring and identification of spatiotemporal landscape changes in multiple remote sensing images by using a stratified conditional Latin hypercube sampling approach and geostatistical simulation.

    Science.gov (United States)

    Lin, Yu-Pin; Chu, Hone-Jay; Huang, Yu-Long; Tang, Chia-Hsi; Rouhani, Shahrokh

    2011-06-01

    This study develops a stratified conditional Latin hypercube sampling (scLHS) approach for multiple, remotely sensed, normalized difference vegetation index (NDVI) images. The objective is to sample, monitor, and delineate spatiotemporal landscape changes, including spatial heterogeneity and variability, in a given area. The scLHS approach, which is based on the variance quadtree technique (VQT) and the conditional Latin hypercube sampling (cLHS) method, selects samples in order to delineate landscape changes from multiple NDVI images. The images are then mapped for calibration and validation by using sequential Gaussian simulation (SGS) with the scLHS selected samples. Spatial statistical results indicate that in terms of their statistical distribution, spatial distribution, and spatial variation, the statistics and variograms of the scLHS samples resemble those of multiple NDVI images more closely than those of cLHS and VQT samples. Moreover, the accuracy of simulated NDVI images based on SGS with scLHS samples is significantly better than that of simulated NDVI images based on SGS with cLHS samples or with VQT samples. Overall, the proposed approach efficiently monitors the spatial characteristics of landscape changes, including the statistics, spatial variability, and heterogeneity of NDVI images. In addition, SGS with the scLHS samples effectively reproduces spatial patterns and landscape changes in multiple NDVI images.
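
    The conditional and stratified machinery of scLHS is beyond a short example, but the Latin hypercube idea underlying it is easy to show: every equal-probability stratum of each variable is sampled exactly once. A minimal sketch (plain LHS only, with invented sizes, not the full scLHS/VQT pipeline):

    import numpy as np

    rng = np.random.default_rng(1)

    def latin_hypercube(n, d):
        """Plain Latin hypercube sample of n points in [0, 1]^d.

        Each of the n equal-probability strata per dimension is hit exactly once.
        """
        # One random point inside each of the n strata, per dimension
        u = (rng.random((n, d)) + np.arange(n)[:, None]) / n
        # Independently permute the strata in each dimension
        for j in range(d):
            u[:, j] = rng.permutation(u[:, j])
        return u

    samples = latin_hypercube(n=10, d=3)
    print(samples.round(3))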

  12. Impacts of Sample Design for Validation Data on the Accuracy of Feedforward Neural Network Classification

    Directory of Open Access Journals (Sweden)

    Giles M. Foody

    2017-08-01

    Validation data are often used to evaluate the performance of a trained neural network and used in the selection of a network deemed optimal for the task at hand. Optimality is commonly assessed with a measure, such as overall classification accuracy. The latter is often calculated directly from a confusion matrix showing the counts of cases in the validation set with particular labelling properties. The sample design used to form the validation set can, however, influence the estimated magnitude of the accuracy. Commonly, the validation set is formed with a stratified sample to give balanced classes, but also via random sampling, which reflects class abundance. It is suggested that if the ultimate aim is to accurately classify a dataset in which the classes do vary in abundance, a validation set formed via random, rather than stratified, sampling is preferred. This is illustrated with the classification of simulated and remotely-sensed datasets. With both datasets, statistically significant differences in the accuracy with which the data could be classified arose from the use of validation sets formed via random and stratified sampling (z = 2.7 and 1.9 for the simulated and real datasets, respectively; both p < 0.05). The accuracy of the classifications that used a stratified sample in validation was smaller, a result of cases of an abundant class being commissioned into a rarer class. Simple means to address the issue are suggested.
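
    The effect described above is easy to reproduce. The hedged sketch below invents a toy population in which one class is abundant and better classified; a balanced (stratified) validation set then under-estimates the accuracy achieved on the full dataset, while a simple random validation set does not. All numbers are illustrative.

    import numpy as np

    rng = np.random.default_rng(2)

    # Hypothetical population: class 0 is abundant (90%), class 1 is rare (10%);
    # a toy classifier labels class 0 well (95% correct) and class 1 poorly (60%).
    N = 100_000
    labels = (rng.random(N) < 0.10).astype(int)
    correct = np.where(labels == 0, rng.random(N) < 0.95, rng.random(N) < 0.60)

    def accuracy_random(n):
        idx = rng.choice(N, size=n, replace=False)
        return correct[idx].mean()

    def accuracy_stratified(n):
        # Balanced classes: n/2 cases from each class, regardless of abundance
        idx0 = rng.choice(np.flatnonzero(labels == 0), size=n // 2, replace=False)
        idx1 = rng.choice(np.flatnonzero(labels == 1), size=n // 2, replace=False)
        return correct[np.concatenate([idx0, idx1])].mean()

    print("population accuracy  :", correct.mean())            # ~0.915
    print("random validation    :", accuracy_random(1000))      # near population value
    print("stratified (balanced):", accuracy_stratified(1000))  # biased toward ~0.775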

  13. Correcting Classifiers for Sample Selection Bias in Two-Phase Case-Control Studies

    Science.gov (United States)

    Theis, Fabian J.

    2017-01-01

    Epidemiological studies often utilize stratified data in which rare outcomes or exposures are artificially enriched. This design can increase precision in association tests but distorts predictions when applying classifiers on nonstratified data. Several methods correct for this so-called sample selection bias, but their performance remains unclear, especially for machine learning classifiers. With an emphasis on two-phase case-control studies, we aim to assess which corrections to perform in which setting and to obtain methods suitable for machine learning techniques, especially the random forest. We propose two new resampling-based methods to resemble the original data and covariance structure: stochastic inverse-probability oversampling and parametric inverse-probability bagging. We compare all techniques for the random forest and other classifiers, both theoretically and on simulated and real data. Empirical results show that the random forest profits only from the parametric inverse-probability bagging proposed by us. For other classifiers, correction is mostly advantageous, and methods perform uniformly. We discuss consequences of inappropriate distribution assumptions and reasons for the different behaviors of the random forest and other classifiers. In conclusion, we provide guidance for choosing correction methods when training classifiers on biased samples. For random forests, our method outperforms state-of-the-art procedures if distribution assumptions are roughly fulfilled. We provide our implementation in the R package sambia. PMID:29312464
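
    As a rough illustration of the inverse-probability idea (not the sambia implementation), the sketch below resamples a biased two-phase sample with weights proportional to the reciprocal of each case's known inclusion probability; the sampling probabilities and data are invented for the example.

    import numpy as np

    rng = np.random.default_rng(3)

    def ip_bootstrap(X, y, sampling_prob):
        """Inverse-probability bootstrap: resample a biased sample so that it
        mimics the source population.

        sampling_prob[i] is the (known, design-based) probability that case i
        was included in the biased sample; weights are proportional to 1/p.
        """
        w = 1.0 / np.asarray(sampling_prob)
        p = w / w.sum()
        idx = rng.choice(len(y), size=len(y), replace=True, p=p)
        return X[idx], y[idx]

    # Toy two-phase design: cases (y=1) kept with probability 1.0,
    # controls (y=0) subsampled with probability 0.2.
    X = rng.normal(size=(500, 3))
    y = (rng.random(500) < 0.5).astype(int)
    probs = np.where(y == 1, 1.0, 0.2)
    Xb, yb = ip_bootstrap(X, y, probs)
    print("case fraction before:", y.mean(), "after reweighting:", yb.mean())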

  14. An efficient method of randomly sampling the coherent angular scatter distribution

    International Nuclear Information System (INIS)

    Williamson, J.F.; Morin, R.L.

    1983-01-01

    Monte Carlo simulations of photon transport phenomena require random selection of an interaction process at each collision site along the photon track. Possible choices are usually limited to photoelectric absorption and incoherent scatter as approximated by the Klein-Nishina distribution. A technique is described for sampling the coherent angular scatter distribution, for the benefit of workers in medical physics. (U.K.)
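
    The paper's sampling scheme itself is not reproduced here, but a generic rejection sampler for an angular distribution conveys the flavour. The sketch uses the bare Thomson term as a stand-in target density and omits the atomic form factor, so it is an assumption-level illustration only.

    import numpy as np

    rng = np.random.default_rng(4)

    def sample_angles(n, pdf, theta_max=np.pi, envelope=1.1):
        """Rejection-sample n polar angles theta from an unnormalized pdf on
        [0, theta_max] that is bounded above by `envelope`."""
        out = []
        while len(out) < n:
            theta = rng.uniform(0.0, theta_max, size=n)
            u = rng.uniform(0.0, envelope, size=n)
            out.extend(theta[u < pdf(theta)])
        return np.array(out[:n])

    # Simplified stand-in for the coherent angular distribution: the Thomson
    # term (1 + cos^2 theta) * sin(theta); a realistic sampler would also fold
    # in the atomic form factor F(q, Z).
    thomson = lambda t: (1.0 + np.cos(t) ** 2) * np.sin(t)  # max ~1.089 on [0, pi]
    angles = sample_angles(10_000, thomson)
    print("mean scattering angle:", angles.mean())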

  15. The electron transport problem sampling by Monte Carlo individual collision technique

    Energy Technology Data Exchange (ETDEWEB)

    Androsenko, P.A.; Belousov, V.I. [Obninsk State Technical Univ. of Nuclear Power Engineering, Kaluga region (Russian Federation)

    2005-07-01

    The problem of electron transport is of great interest in many fields of modern science, and Monte Carlo sampling can be used to solve it. Electron transport is characterized by a large number of individual interactions. To simulate electron transport, the 'condensed history' technique may be used, in which a large number of collisions are grouped into a single step to be sampled randomly. Another kind of Monte Carlo sampling is the individual collision technique, which has incontestable advantages over the condensed history technique. For example, one does not need to supply the parameters required by the condensed history technique, such as the upper limit for electron energy, the resolution, or the number of sub-steps. The condensed history technique may also lose some very important electron tracks, because its nature is limited by the step parameters of particle movement and by weaknesses in its algorithms, for example the energy indexing algorithm. The individual collision technique has none of these disadvantages. This report presents some sampling algorithms of the new version of the BRAND code, in which the above-mentioned technique is used. All information on electrons was taken from ENDF-6 files, which form an important part of BRAND; these files were not pre-processed but were used directly as the electron data source. Four kinds of interaction were considered: elastic interaction, bremsstrahlung, atomic excitation and atomic electro-ionization. Some sampling results are presented and compared with analogous methods; for example, the endovascular radiotherapy problem (P2) of QUADOS 2002 is compared with other commonly used techniques. (authors)
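
    The core step of individual-collision sampling, choosing an interaction channel with probability proportional to its cross section, can be sketched in a few lines. The cross-section values below are invented placeholders; a production code such as BRAND would interpolate them from ENDF-6 electron data.

    import numpy as np

    rng = np.random.default_rng(5)

    # Hypothetical per-channel cross sections (barns) at one electron energy
    cross_sections = {
        "elastic": 3.2,
        "bremsstrahlung": 0.4,
        "excitation": 1.1,
        "electro-ionization": 2.3,
    }

    def sample_interaction(xs):
        """Pick one interaction channel with probability proportional to its
        cross section -- the core step of individual-collision sampling."""
        names = list(xs)
        sigma = np.array([xs[k] for k in names])
        return names[rng.choice(len(names), p=sigma / sigma.sum())]

    counts = {k: 0 for k in cross_sections}
    for _ in range(10_000):
        counts[sample_interaction(cross_sections)] += 1
    print(counts)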

  16. National Coral Reef Monitoring Program: Benthic Images Collected from Stratified Random Sites (StRS) across Wake Island from 2014-03-16 to 2014-03-20 (NCEI Accession 0159157)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The data described here are benthic habitat imagery that result from benthic photo-quadrat surveys conducted along transects at stratified random sites across Wake...

  17. Two sampling techniques for game meat

    OpenAIRE

    van der Merwe, Maretha; Jooste, Piet J.; Hoffman, Louw C.; Calitz, Frikkie J.

    2013-01-01

    A study was conducted to compare the excision sampling technique used by the export market and the sampling technique preferred by European countries, namely the biotrace cattle and swine test. The measuring unit for the excision sampling was grams (g) and square centimetres (cm2) for the swabbing technique. The two techniques were compared after a pilot test was conducted on spiked approved beef carcasses (n = 12) that statistically proved the two measuring units correlated. The two sampling...

  18. ANALYSIS OF MONTE CARLO SIMULATION SAMPLING TECHNIQUES ON SMALL SIGNAL STABILITY OF WIND GENERATOR-CONNECTED POWER SYSTEM

    Directory of Open Access Journals (Sweden)

    TEMITOPE RAPHAEL AYODELE

    2016-04-01

    Monte Carlo simulation using the Simple Random Sampling (SRS) technique is popularly known for its ability to handle complex uncertainty problems. However, to produce a reasonable result it requires a huge sample size, which makes it computationally expensive, time consuming and unfit for online power system applications. In this article, the performance of the Latin Hypercube Sampling (LHS) technique is explored and compared with SRS in terms of accuracy, robustness and speed for small signal stability application in a wind generator-connected power system. The analysis is performed using probabilistic techniques via eigenvalue analysis on two standard networks (the Single Machine Infinite Bus and the IEEE 16-machine 68-bus test system). The accuracy of the two sampling techniques is determined by comparing their results at different sample sizes with the IDEAL (conventional) values. The robustness is determined from the variance reduction observed when the experiment is repeated 100 times with different sample sizes using the two sampling techniques in turn. The results show that sample sizes generated from LHS for small signal stability application produce the same result as the IDEAL values starting from a sample size of 100. This shows that about 100 samples of the random variable generated using the LHS method are good enough to produce reasonable results for practical purposes in small signal stability application. It is also revealed that LHS has the least variance when the experiment is repeated 100 times, which signifies the robustness of LHS over the SRS technique. A sample size of 100 from LHS produces the same result as the conventional method with a sample size of 50 000. The reduced sample size required by LHS gives it a computational speed advantage (about six times) over the conventional method.
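
    The variance-reduction behaviour reported above can be illustrated with a toy experiment: estimate the mean of a smooth response with SRS and with LHS, repeat many times, and compare the spread of the estimates. The response function below is an invented stand-in for the eigenvalue-based stability index.

    import numpy as np

    rng = np.random.default_rng(6)

    def srs(n):
        return rng.random(n)

    def lhs(n):
        # One point per stratum of width 1/n, then shuffle the strata
        return rng.permutation((rng.random(n) + np.arange(n)) / n)

    # Toy response surface standing in for a stability index
    g = lambda x: np.sin(2 * np.pi * x) + x ** 2

    def spread(sampler, n, reps=100):
        """Standard deviation of the mean estimate over repeated experiments."""
        return np.std([g(sampler(n)).mean() for _ in range(reps)])

    for n in (100, 1000):
        print(f"n={n}: SRS spread={spread(srs, n):.4f}  LHS spread={spread(lhs, n):.4f}")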

  19. Evaluation of sampling strategies to estimate crown biomass

    Directory of Open Access Journals (Sweden)

    Krishna P Poudel

    2015-01-01

    Background: Depending on tree and site characteristics, crown biomass accounts for a significant portion of the total aboveground biomass in the tree. Crown biomass estimation is useful for different purposes including evaluating the economic feasibility of crown utilization for energy production or forest products, fuel load assessments and fire management strategies, and wildfire modeling. However, crown biomass is difficult to predict because of the variability within and among species and sites. Thus the allometric equations used for predicting crown biomass should be based on data collected with precise and unbiased sampling strategies. In this study, we evaluate the performance of different sampling strategies to estimate crown biomass and evaluate the effect of sample size in estimating crown biomass. Methods: Using data collected from 20 destructively sampled trees, we evaluated 11 different sampling strategies using six evaluation statistics: bias, relative bias, root mean square error (RMSE), relative RMSE, amount of biomass sampled, and relative biomass sampled. We also evaluated the performance of the selected sampling strategies when different numbers of branches (3, 6, 9, and 12) are selected from each tree. A tree-specific log-linear model with branch diameter and branch length as covariates was used to obtain individual branch biomass. Results: Compared to all other methods, stratified sampling with the probability-proportional-to-size estimation technique produced better results when three or six branches per tree were sampled. However, systematic sampling with the ratio estimation technique was the best when at least nine branches per tree were sampled. Under the stratified sampling strategy, selecting an unequal number of branches per stratum produced approximately similar results to simple random sampling, but further decreased RMSE when information on branch diameter was used in the design and estimation phases. Conclusions: Use of

  20. Estimation of Correlation Functions by the Random DEC Technique

    DEFF Research Database (Denmark)

    Brincker, Rune; Krenk, Steen; Jensen, Jakob Laigaard

    The Random Dec Technique is a versatile technique for characterization of random signals in the time domain. In this paper a short review of the most important properties of the technique is given. The review is mainly based on recently achieved results that are still unpublished, or that have just...

  1. A census-weighted, spatially-stratified household sampling strategy for urban malaria epidemiology

    Directory of Open Access Journals (Sweden)

    Slutsker Laurence

    2008-02-01

    Background: Urban malaria is likely to become increasingly important as a consequence of the growing proportion of Africans living in cities. A novel sampling strategy was developed for urban areas to generate a sample simultaneously representative of population and inhabited environments. Such a strategy should facilitate analysis of important epidemiological relationships in this ecological context. Methods: Census maps and summary data for Kisumu, Kenya, were used to create a pseudo-sampling frame using the geographic coordinates of census-sampled structures. For every enumeration area (EA) designated as urban by the census (n = 535), a sample of structures equal to one-tenth the number of households was selected. In EAs designated as rural (n = 32), a geographically random sample totalling one-tenth the number of households was selected from a grid of points at 100 m intervals. The selected samples were cross-referenced to a geographic information system, and coordinates transferred to handheld global positioning units. Interviewers found the closest eligible household to the sampling point and interviewed the caregiver of an age-eligible child. Results: 4,336 interviews were completed in 473 of the 567 study area EAs from June 2002 through February 2003. EAs without completed interviews were randomly distributed, and non-response was approximately 2%. Mean distance from the assigned sampling point to the completed interview was 74.6 m, and was significantly less in urban than rural EAs, even when controlling for number of households. The selected sample had significantly more children and females of childbearing age than the general population, and fewer older individuals. Conclusion: This method selected a sample that was simultaneously population-representative and inclusive of important environmental variation. The use of a pseudo-sampling frame and pre-programmed handheld GPS units is more efficient and may yield a more complete sample than

  2. National Coral Reef Monitoring Program: Benthic Cover Derived from Analysis of Benthic Images Collected during Stratified Random Surveys (StRS) across American Samoa in 2015 (NCEI Accession 0157752)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The data described here result from benthic photo-quadrat surveys conducted along transects at stratified random sites across American Samoa in 2015 as a part of...

  3. Estimation of Correlation Functions by the Random Decrement Technique

    DEFF Research Database (Denmark)

    Brincker, Rune; Krenk, Steen; Jensen, Jakob Laigaard

    The Random Decrement (RDD) Technique is a versatile technique for characterization of random signals in the time domain. In this paper a short review of the theoretical basis is given, and the technique is illustrated by estimating auto-correlation functions and cross-correlation functions on modal responses simulated by two SDOF ARMA models loaded by the same bandlimited white noise. The speed and the accuracy of the RDD technique are compared to the Fast Fourier Transform (FFT) technique. The RDD technique does not involve multiplications, but only additions. Therefore, the technique is very fast.
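
    A minimal sketch of the random decrement idea follows: average fixed-length segments of the response that start at each up-crossing of a trigger level. The AR(2) recursion standing in for the SDOF ARMA models, and all parameter values, are assumptions for illustration.

    import numpy as np

    rng = np.random.default_rng(7)

    def random_decrement(x, level, seg_len):
        """Random decrement signature of signal x: average the seg_len-long
        segments that start wherever x up-crosses the trigger level.
        Only additions are involved, which is why the method is fast.
        """
        triggers = np.flatnonzero((x[:-1] < level) & (x[1:] >= level))
        triggers = triggers[triggers + seg_len < x.size]
        segments = np.stack([x[t:t + seg_len] for t in triggers])
        return segments.mean(axis=0)

    # Toy response: lightly damped SDOF oscillator driven by white noise,
    # simulated with a simple AR(2) recursion (a stand-in for the ARMA models)
    n = 100_000
    x = np.zeros(n)
    for i in range(2, n):
        x[i] = 1.95 * x[i - 1] - 0.96 * x[i - 2] + rng.normal()
    signature = random_decrement(x, level=x.std(), seg_len=200)
    print(signature[:5])  # proportional to the auto-correlation function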

  4. Estimation of Correlation Functions by the Random Decrement Technique

    DEFF Research Database (Denmark)

    Brincker, Rune; Krenk, Steen; Jensen, Jacob Laigaard

    1991-01-01

    The Random Decrement (RDD) Technique is a versatile technique for characterization of random signals in the time domain. In this paper a short review of the theoretical basis is given, and the technique is illustrated by estimating auto-correlation functions and cross-correlation functions on modal responses simulated by two SDOF ARMA models loaded by the same band-limited white noise. The speed and the accuracy of the RDD technique are compared to the Fast Fourier Transform (FFT) technique. The RDD technique does not involve multiplications, but only additions. Therefore, the technique is very fast.

  5. Estimation of Correlation Functions by the Random Decrement Technique

    DEFF Research Database (Denmark)

    Brincker, Rune; Krenk, Steen; Jensen, Jakob Laigaard

    1992-01-01

    The Random Decrement (RDD) Technique is a versatile technique for characterization of random signals in the time domain. In this paper a short review of the theoretical basis is given, and the technique is illustrated by estimating auto-correlation functions and cross-correlation functions on modal responses simulated by two SDOF ARMA models loaded by the same bandlimited white noise. The speed and the accuracy of the RDD technique are compared to the Fast Fourier Transform (FFT) technique. The RDD technique does not involve multiplications, but only additions. Therefore, the technique is very fast.

  6. Two sampling techniques for game meat

    Directory of Open Access Journals (Sweden)

    Maretha van der Merwe

    2013-03-01

    A study was conducted to compare the excision sampling technique used by the export market and the sampling technique preferred by European countries, namely the biotrace cattle and swine test. The measuring unit for the excision sampling was grams (g) and square centimetres (cm2) for the swabbing technique. The two techniques were compared after a pilot test was conducted on spiked approved beef carcasses (n = 12) that statistically proved the two measuring units correlated. The two sampling techniques were conducted on the same game carcasses (n = 13) and analyses performed for aerobic plate count (APC), Escherichia coli and Staphylococcus aureus, for both techniques. A more representative result was obtained by swabbing and no damage was caused to the carcass. Conversely, the excision technique yielded fewer organisms and caused minor damage to the carcass. The recovery ratio from the sampling technique improved 5.4 times for APC, 108.0 times for E. coli and 3.4 times for S. aureus over the results obtained from the excision technique. It was concluded that the sampling methods of excision and swabbing can be used to obtain bacterial profiles from both export and local carcasses and could be used to indicate whether game carcasses intended for the local market are possibly on par with game carcasses intended for the export market and therefore safe for human consumption.

  7. Two sampling techniques for game meat.

    Science.gov (United States)

    van der Merwe, Maretha; Jooste, Piet J; Hoffman, Louw C; Calitz, Frikkie J

    2013-03-20

    A study was conducted to compare the excision sampling technique used by the export market and the sampling technique preferred by European countries, namely the biotrace cattle and swine test. The measuring unit for the excision sampling was grams (g) and square centimetres (cm2) for the swabbing technique. The two techniques were compared after a pilot test was conducted on spiked approved beef carcasses (n = 12) that statistically proved the two measuring units correlated. The two sampling techniques were conducted on the same game carcasses (n = 13) and analyses performed for aerobic plate count (APC), Escherichia coli and Staphylococcus aureus, for both techniques. A more representative result was obtained by swabbing and no damage was caused to the carcass. Conversely, the excision technique yielded fewer organisms and caused minor damage to the carcass. The recovery ratio from the sampling technique improved 5.4 times for APC, 108.0 times for E. coli and 3.4 times for S. aureus over the results obtained from the excision technique. It was concluded that the sampling methods of excision and swabbing can be used to obtain bacterial profiles from both export and local carcasses and could be used to indicate whether game carcasses intended for the local market are possibly on par with game carcasses intended for the export market and therefore safe for human consumption.

  8. Methodology Series Module 5: Sampling Strategies.

    Science.gov (United States)

    Setia, Maninder Singh

    2016-01-01

    Once the research question and the research design have been finalised, it is important to select the appropriate sample for the study. The method by which the researcher selects the sample is the 'Sampling Method'. There are essentially two types of sampling methods: 1) probability sampling - based on chance events (such as random numbers, flipping a coin, etc.); and 2) non-probability sampling - based on the researcher's choice and on the population that is accessible and available. Some of the non-probability sampling methods are: purposive sampling, convenience sampling, or quota sampling. Random sampling methods (such as the simple random sample or the stratified random sample) are forms of probability sampling. It is important to understand the different sampling methods used in clinical studies and mention the method clearly in the manuscript. The researcher should not misrepresent the sampling method in the manuscript (such as using the term 'random sample' when the researcher has used a convenience sample). The sampling method will depend on the research question. For instance, the researcher may want to understand an issue in greater detail for one particular population rather than worry about the 'generalizability' of these results. In such a scenario, the researcher may want to use 'purposive sampling' for the study.
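
    For concreteness, a small sketch of proportionate stratified random sampling, one of the probability methods named above, is given here; the patient records and the sampling fraction are invented.

    import random

    random.seed(8)

    def stratified_random_sample(population, strata_key, fraction):
        """Proportionate stratified random sampling: draw the same fraction
        from every stratum by simple random sampling without replacement."""
        strata = {}
        for unit in population:
            strata.setdefault(strata_key(unit), []).append(unit)
        sample = []
        for units in strata.values():
            k = max(1, round(fraction * len(units)))
            sample.extend(random.sample(units, k))
        return sample

    # Hypothetical clinic population stratified by sex
    patients = [{"id": i, "sex": "F" if i % 3 else "M"} for i in range(300)]
    sample = stratified_random_sample(patients, lambda p: p["sex"], fraction=0.1)
    print(len(sample), "sampled;", sum(p["sex"] == "M" for p in sample), "male")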

  9. Methodology series module 5: Sampling strategies

    Directory of Open Access Journals (Sweden)

    Maninder Singh Setia

    2016-01-01

    Once the research question and the research design have been finalised, it is important to select the appropriate sample for the study. The method by which the researcher selects the sample is the 'Sampling Method'. There are essentially two types of sampling methods: 1) probability sampling – based on chance events (such as random numbers, flipping a coin, etc.); and 2) non-probability sampling – based on the researcher's choice and on the population that is accessible and available. Some of the non-probability sampling methods are: purposive sampling, convenience sampling, or quota sampling. Random sampling methods (such as the simple random sample or the stratified random sample) are forms of probability sampling. It is important to understand the different sampling methods used in clinical studies and mention the method clearly in the manuscript. The researcher should not misrepresent the sampling method in the manuscript (such as using the term 'random sample' when the researcher has used a convenience sample). The sampling method will depend on the research question. For instance, the researcher may want to understand an issue in greater detail for one particular population rather than worry about the 'generalizability' of these results. In such a scenario, the researcher may want to use 'purposive sampling' for the study.

  10. Methodology Series Module 5: Sampling Strategies

    Science.gov (United States)

    Setia, Maninder Singh

    2016-01-01

    Once the research question and the research design have been finalised, it is important to select the appropriate sample for the study. The method by which the researcher selects the sample is the ‘Sampling Method’. There are essentially two types of sampling methods: 1) probability sampling – based on chance events (such as random numbers, flipping a coin, etc.); and 2) non-probability sampling – based on the researcher's choice and on the population that is accessible and available. Some of the non-probability sampling methods are: purposive sampling, convenience sampling, or quota sampling. Random sampling methods (such as the simple random sample or the stratified random sample) are forms of probability sampling. It is important to understand the different sampling methods used in clinical studies and mention the method clearly in the manuscript. The researcher should not misrepresent the sampling method in the manuscript (such as using the term ‘random sample’ when the researcher has used a convenience sample). The sampling method will depend on the research question. For instance, the researcher may want to understand an issue in greater detail for one particular population rather than worry about the ‘generalizability’ of these results. In such a scenario, the researcher may want to use ‘purposive sampling’ for the study. PMID:27688438

  11. National Coral Reef Monitoring Program: benthic images collected from stratified random sites (StRS) across the Hawaiian Archipelago from 2016-07-13 to 2016-09-27 (NCEI Accession 0164293)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The data described here are benthic habitat imagery that result from benthic photo-quadrat surveys conducted along transects at stratified random sites across the...

  12. National Coral Reef Monitoring Program: Benthic Images Collected from Stratified Random Sites (StRS) across the Hawaiian Archipelago from 2013-05-01 to 2013-10-31 (NCEI Accession 0159144)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The data described here are benthic habitat imagery that result from benthic photo-quadrat surveys conducted along transects at stratified random sites across the...

  13. National Coral Reef Monitoring Program: Benthic Images Collected from Stratified Random Sites (StRS) across the Mariana Archipelago from 2014-03-25 to 2014-05-07 (NCEI Accession 0159142)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The data described here are benthic habitat imagery that result from benthic photo-quadrat surveys conducted along transects at stratified random sites across the...

  14. Prevalence and correlates of problematic smartphone use in a large random sample of Chinese undergraduates.

    Science.gov (United States)

    Long, Jiang; Liu, Tie-Qiao; Liao, Yan-Hui; Qi, Chang; He, Hao-Yu; Chen, Shu-Bao; Billieux, Joël

    2016-11-17

    Smartphones are becoming a daily necessity for most undergraduates in Mainland China. Because the present scenario of problematic smartphone use (PSU) is largely unexplored, in the current study we aimed to estimate the prevalence of PSU and to screen suitable predictors for PSU among Chinese undergraduates in the framework of the stress-coping theory. A sample of 1062 undergraduate smartphone users was recruited by means of the stratified cluster random sampling strategy between April and May 2015. The Problematic Cellular Phone Use Questionnaire was used to identify PSU. We evaluated five candidate risk factors for PSU by using logistic regression analysis while controlling for demographic characteristics and specific features of smartphone use. The prevalence of PSU among Chinese undergraduates was estimated to be 21.3%. The risk factors for PSU were majoring in the humanities, high monthly income from the family (≥1500 RMB), serious emotional symptoms, high perceived stress, and perfectionism-related factors (high doubts about actions, high parental expectations). PSU among undergraduates appears to be ubiquitous and thus constitutes a public health issue in Mainland China. Although further longitudinal studies are required to test whether PSU is a transient phenomenon or a chronic and progressive condition, our study successfully identified socio-demographic and psychological risk factors for PSU. These results, obtained from a random and thus representative sample of undergraduates, open up new avenues in terms of prevention and regulation policies.

  15. National Coral Reef Monitoring Program: Benthic Cover Derived from Analysis of Benthic Images Collected during Stratified Random Surveys (StRS) across the Hawaiian Archipelago in 2013 (NCEI Accession 0159140)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The data described here result from benthic photo-quadrat surveys conducted along transects at stratified random sites across the Hawaiian archipelago in 2013 as a...

  16. Spatial Sampling of Weather Data for Regional Crop Yield Simulations

    Science.gov (United States)

    Van Bussel, Lenny G. J.; Ewert, Frank; Zhao, Gang; Hoffmann, Holger; Enders, Andreas; Wallach, Daniel; Asseng, Senthold; Baigorria, Guillermo A.; Basso, Bruno; Biernath, Christian; hide

    2016-01-01

    Field-scale crop models are increasingly applied at spatio-temporal scales that range from regions to the globe and from decades up to 100 years. Sufficiently detailed data to capture the prevailing spatio-temporal heterogeneity in weather, soil, and management conditions as needed by crop models are rarely available. Effective sampling may overcome the problem of missing data but has rarely been investigated. In this study the effect of sampling weather data has been evaluated for simulating yields of winter wheat in a region in Germany over a 30-year period (1982-2011) using 12 process-based crop models. A stratified sampling was applied to compare the effect of different sizes of spatially sampled weather data (10, 30, 50, 100, 500, 1000 and full coverage of 34,078 sampling points) on simulated wheat yields. Stratified sampling was further compared with random sampling. Possible interactions between sample size and crop model were evaluated. The results showed differences in simulated yields among crop models but all models reproduced well the pattern of the stratification. Importantly, the regional mean of simulated yields based on full coverage could already be reproduced by a small sample of 10 points. This was also true for reproducing the temporal variability in simulated yields but more sampling points (about 100) were required to accurately reproduce spatial yield variability. The number of sampling points can be smaller when a stratified sampling is applied as compared to a random sampling. However, differences between crop models were observed including some interaction between the effect of sampling on simulated yields and the model used. We concluded that stratified sampling can considerably reduce the number of required simulations. But, differences between crop models must be considered as the choice for a specific model can have larger effects on simulated yields than the sampling strategy. Assessing the impact of sampling soil and crop management

  17. Sampling large random knots in a confined space

    International Nuclear Information System (INIS)

    Arsuaga, J; Blackstone, T; Diao, Y; Hinson, K; Karadayi, E; Saito, M

    2007-01-01

    DNA knots formed under extreme conditions of condensation, as in bacteriophage P4, are difficult to analyze experimentally and theoretically. In this paper, we propose to use the uniform random polygon model as a supplementary method to the existing methods for generating random knots in confinement. The uniform random polygon model allows us to sample knots with large crossing numbers and also to generate large diagrammatically prime knot diagrams. We show numerically that uniform random polygons sample knots with large minimum crossing numbers and certain complicated knot invariants (as those observed experimentally). We do this in terms of the knot determinants or colorings. Our numerical results suggest that the average determinant of a uniform random polygon of n vertices grows faster than O(e^{n^2}). We also investigate the complexity of prime knot diagrams. We show rigorously that the probability that a randomly selected 2D uniform random polygon of n vertices is almost diagrammatically prime goes to 1 as n goes to infinity. Furthermore, the average number of crossings in such a diagram is at the order of O(n^2). Therefore, the two-dimensional uniform random polygons offer an effective way of sampling large (prime) knots, which can be useful in various applications
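
    Generating a realization of the uniform random polygon model is straightforward, as the sketch below shows: the n vertices are independent uniform points in a confining cube and are joined in order. Computing knot invariants such as the determinant is a separate, harder step not attempted here.

    import numpy as np

    rng = np.random.default_rng(9)

    def uniform_random_polygon(n):
        """Closed polygon whose n vertices are independent uniform points in the
        unit cube -- the uniform random polygon model for confined random knots."""
        vertices = rng.random((n, 3))
        return np.vstack([vertices, vertices[:1]])  # repeat first vertex to close

    poly = uniform_random_polygon(50)
    edge_lengths = np.linalg.norm(np.diff(poly, axis=0), axis=1)
    print(f"{poly.shape[0] - 1} edges, mean edge length {edge_lengths.mean():.3f}")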

  18. Sampling large random knots in a confined space

    Science.gov (United States)

    Arsuaga, J.; Blackstone, T.; Diao, Y.; Hinson, K.; Karadayi, E.; Saito, M.

    2007-09-01

    DNA knots formed under extreme conditions of condensation, as in bacteriophage P4, are difficult to analyze experimentally and theoretically. In this paper, we propose to use the uniform random polygon model as a supplementary method to the existing methods for generating random knots in confinement. The uniform random polygon model allows us to sample knots with large crossing numbers and also to generate large diagrammatically prime knot diagrams. We show numerically that uniform random polygons sample knots with large minimum crossing numbers and certain complicated knot invariants (as those observed experimentally). We do this in terms of the knot determinants or colorings. Our numerical results suggest that the average determinant of a uniform random polygon of n vertices grows faster than O(e^{n^2}). We also investigate the complexity of prime knot diagrams. We show rigorously that the probability that a randomly selected 2D uniform random polygon of n vertices is almost diagrammatically prime goes to 1 as n goes to infinity. Furthermore, the average number of crossings in such a diagram is at the order of O(n^2). Therefore, the two-dimensional uniform random polygons offer an effective way of sampling large (prime) knots, which can be useful in various applications.

  19. Sampling large random knots in a confined space

    Energy Technology Data Exchange (ETDEWEB)

    Arsuaga, J [Department of Mathematics, San Francisco State University, 1600 Holloway Ave, San Francisco, CA 94132 (United States); Blackstone, T [Department of Computer Science, San Francisco State University, 1600 Holloway Ave., San Francisco, CA 94132 (United States); Diao, Y [Department of Mathematics and Statistics, University of North Carolina at Charlotte, Charlotte, NC 28223 (United States); Hinson, K [Department of Mathematics and Statistics, University of North Carolina at Charlotte, Charlotte, NC 28223 (United States); Karadayi, E [Department of Mathematics, University of South Florida, 4202 E Fowler Avenue, Tampa, FL 33620 (United States); Saito, M [Department of Mathematics, University of South Florida, 4202 E Fowler Avenue, Tampa, FL 33620 (United States)

    2007-09-28

    DNA knots formed under extreme conditions of condensation, as in bacteriophage P4, are difficult to analyze experimentally and theoretically. In this paper, we propose to use the uniform random polygon model as a supplementary method to the existing methods for generating random knots in confinement. The uniform random polygon model allows us to sample knots with large crossing numbers and also to generate large diagrammatically prime knot diagrams. We show numerically that uniform random polygons sample knots with large minimum crossing numbers and certain complicated knot invariants (as those observed experimentally). We do this in terms of the knot determinants or colorings. Our numerical results suggest that the average determinant of a uniform random polygon of n vertices grows faster than O(e^{n^2}). We also investigate the complexity of prime knot diagrams. We show rigorously that the probability that a randomly selected 2D uniform random polygon of n vertices is almost diagrammatically prime goes to 1 as n goes to infinity. Furthermore, the average number of crossings in such a diagram is at the order of O(n^2). Therefore, the two-dimensional uniform random polygons offer an effective way of sampling large (prime) knots, which can be useful in various applications.

  20. Grain distinct stratified nanolayers in aluminium alloys

    Energy Technology Data Exchange (ETDEWEB)

    Donatus, U., E-mail: uyimedonatus@yahoo.com [School of Materials, The University of Manchester, Manchester, M13 9PL, England (United Kingdom); Thompson, G.E.; Zhou, X.; Alias, J. [School of Materials, The University of Manchester, Manchester, M13 9PL, England (United Kingdom); Tsai, I.-L. [Oxford Instruments NanoAnalysis, HP12 2SE, High Wycombe (United Kingdom)

    2017-02-15

    The grains of aluminium alloys have stratified nanolayers which determine their mechanical and chemical responses. In this study, the nanolayers were revealed in the grains of AA6082 (T6 and T7 conditions), AA5083-O and AA2024-T3 alloys by etching the alloys in a solution comprising 20 g Cr₂O₃ + 30 ml HPO₃ in 1 L H₂O. Microstructural examination was conducted on selected grains of interest using scanning electron microscopy and the electron backscatter diffraction technique. It was observed that the nanolayers are orientation dependent and are parallel to the {100} planes. They have ordered and repeated tunnel squares that are flawed at the sides, which are aligned in the <100> directions. These flawed tunnel squares dictate the tunnelling corrosion morphology and appear to affect the arrangement and sizes of the precipitation-hardening particles. The inclination of the stratified nanolayers, their interspacing, and the groove sizes have a significant influence on the corrosion behaviour, and an apparent influence on the strengthening mechanism, of the investigated aluminium alloys. - Highlights: • Stratified nanolayers in aluminium alloy grains. • Relationship of the stratified nanolayers with grain orientation. • Influence of the inclinations of the stratified nanolayers on corrosion. • Influence of the nanolayer interspacing and groove sizes on hardness and corrosion.

  1. Development and assessment of an e-learning course on breast imaging for radiographers: a stratified randomized controlled trial.

    Science.gov (United States)

    Moreira, Inês C; Ventura, Sandra Rua; Ramos, Isabel; Rodrigues, Pedro Pereira

    2015-01-05

    Mammography is considered the best imaging technique for breast cancer screening, and the radiographer plays an important role in its performance. Therefore, continuing education is critical to improving the performance of these professionals and thus providing better health care services. Our goal was to develop an e-learning course on breast imaging for radiographers, assessing its efficacy, effectiveness, and user satisfaction. A stratified randomized controlled trial was performed with radiographers and radiology students who already had mammography training, using pre- and post-knowledge tests and satisfaction questionnaires. The primary outcome was the improvement in test results (percentage of correct answers), using intention-to-treat and per-protocol analysis. A total of 54 participants were assigned to the intervention (20 students plus 34 radiographers) with 53 controls (19+34). The intervention was completed by 40 participants (11+29), with 4 (2+2) discontinued interventions, and 10 (7+3) lost to follow-up. Differences in the primary outcome were found between intervention and control: 21 versus 4 percentage points (pp), with a higher effect in radiographers (23 pp vs 4 pp; P=.004) but an unclear effect in students (18 pp vs 5 pp; P=.098). Nonetheless, differences in students' post-test results were found (88% vs 63%; P=.003), which were absent in the pretest (63% vs 63%; P=.106). The per-protocol analysis showed a higher effect (26 pp vs 2 pp). The e-learning course is effective, especially for radiographers, which highlights the need for continuing education.

  2. Path integral methods for primordial density perturbations - sampling of constrained Gaussian random fields

    International Nuclear Information System (INIS)

    Bertschinger, E.

    1987-01-01

    Path integrals may be used to describe the statistical properties of a random field such as the primordial density perturbation field. In this framework the probability distribution is given for a Gaussian random field subjected to constraints such as the presence of a protovoid or supercluster at a specific location in the initial conditions. An algorithm has been constructed for generating samples of a constrained Gaussian random field on a lattice using Monte Carlo techniques. The method makes possible a systematic study of the density field around peaks or other constrained regions in the biased galaxy formation scenario, and it is effective for generating initial conditions for N-body simulations with rare objects in the computational volume. 21 references
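
    For a flavour of constrained Gaussian sampling on a lattice, the sketch below uses the direct conditional construction often attributed to Hoffman and Ribak, rather than the paper's Monte Carlo path-integral algorithm: draw an unconstrained realization, then correct it so a linear constraint holds exactly. The covariance model and the single peak constraint are assumptions for illustration.

    import numpy as np

    rng = np.random.default_rng(10)

    # Small 1D "lattice" with a smooth squared-exponential covariance (an
    # assumption here, not the cosmological power spectrum used in the paper)
    n = 64
    x = np.arange(n)
    C = np.exp(-0.5 * ((x[:, None] - x[None, :]) / 5.0) ** 2)

    # Linear constraint A f = c: fix the field to a peak value of 3 at site 32
    A = np.zeros((1, n)); A[0, 32] = 1.0
    c = np.array([3.0])

    # 1. Draw an unconstrained realization f_tilde ~ N(0, C)
    L = np.linalg.cholesky(C + 1e-8 * np.eye(n))
    f_tilde = L @ rng.normal(size=n)

    # 2. Correct it so the constraint holds exactly while preserving the
    #    conditional covariance: f = f_tilde + C A^T (A C A^T)^-1 (c - A f_tilde)
    CAt = C @ A.T
    f = f_tilde + (CAt @ np.linalg.solve(A @ CAt, c - A @ f_tilde)).ravel()
    print("constrained value at site 32:", f[32])  # 3.0 up to floating point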

  3. Randomized clinical trial comparing control of maxillary anchorage with 2 retraction techniques.

    Science.gov (United States)

    Xu, Tian-Min; Zhang, Xiaoyun; Oh, Hee Soo; Boyd, Robert L; Korn, Edward L; Baumrind, Sheldon

    2010-11-01

    The objective of this pilot randomized clinical trial was to investigate the relative effectiveness of anchorage conservation of en-masse and 2-step retraction techniques during maximum anchorage treatment in patients with Angle Class I and Class II malocclusions. Sixty-four growing subjects (25 boys, 39 girls; 10.2-15.9 years old) who required maximum anchorage were randomized to 2 treatment techniques: en-masse retraction (n = 32) and 2-step retraction (n = 32); the groups were stratified by sex and starting age. Each patient was treated by a full-time clinic instructor experienced in the use of both retraction techniques at the orthodontic clinic of Peking University School of Stomatology in China. All patients used headgear, and most had transpalatal appliances. Lateral cephalograms taken before treatment and at the end of treatment were used to evaluate treatment-associated changes. Differences in maxillary molar mesial displacement and maxillary incisor retraction were measured with the before and after treatment tracings superimposed on the anatomic best fit of the palatal structures. Differences in mesial displacement of the maxillary first molar were compared between the 2 treatment techniques, between sexes, and between different starting-age groups. Average mesial displacement of the maxillary first molar was slightly less in the en-masse group than in the 2-step group (mean, -0.36 mm; 95% CI, -1.42 to 0.71 mm). The average mesial displacement of the maxillary first molar for both treatment groups pooled (n = 63, because 1 patient was lost to follow-up) was 4.3 ± 2.1 mm (mean ± standard deviation). Boys had significantly more mesial displacement than girls (mean difference, 1.3 mm; P <0.03). Younger adolescents had significantly more mesial displacement than older adolescents (mean difference, 1.3 mm; P <0.02). Average mesial displacement of the maxillary first molar with 2-step retraction was slightly greater than that for en-masse retraction, but the

  4. An Assessment of Polynomial Regression Techniques for the Relative Radiometric Normalization (RRN) of High-Resolution Multi-Temporal Airborne Thermal Infrared (TIR) Imagery

    Directory of Open Access Journals (Sweden)

    Mir Mustafizur Rahman

    2014-11-01

    Thermal Infrared (TIR) remote sensing images of urban environments are increasingly available from airborne and satellite platforms. However, limited access to high-spatial resolution (H-res: ~1 m) TIR satellite images requires the use of TIR airborne sensors for mapping large complex urban surfaces, especially at micro-scales. A critical limitation of such H-res mapping is the need to acquire a large scene composed of multiple flight lines and mosaic them together. This results in the same scene components (e.g., roads, buildings, green space and water) exhibiting different temperatures in different flight lines. To mitigate these effects, linear relative radiometric normalization (RRN) techniques are often applied. However, the Earth’s surface is composed of features whose thermal behaviour is characterized by complexity and non-linearity. Therefore, we hypothesize that non-linear RRN techniques should demonstrate increased radiometric agreement over similar linear techniques. To test this hypothesis, this paper evaluates four (linear and non-linear) RRN techniques, including: (i) histogram matching (HM); (ii) pseudo-invariant feature-based polynomial regression (PIF_Poly); (iii) no-change stratified random sample-based linear regression (NCSRS_Lin); and (iv) no-change stratified random sample-based polynomial regression (NCSRS_Poly); two of which, (ii) and (iv), are newly proposed non-linear techniques. When applied over two adjacent flight lines (~70 km2) of TABI-1800 airborne data, visual and statistical results show that both new non-linear techniques improved radiometric agreement over the previously evaluated linear techniques, with the new fully-automated method, NCSRS-based polynomial regression, providing the highest improvement in radiometric agreement between the master and the slave images, at ~56%. This is ~5% higher than the best previously evaluated linear technique (NCSRS-based linear regression).
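
    The heart of the NCSRS_Poly idea, fitting a polynomial from slave-image to master-image radiometry over no-change sample pixels, can be sketched briefly. The simulated flight lines, the simple random (rather than stratified) choice of no-change pixels, and the degree-2 polynomial are all assumptions of this example.

    import numpy as np

    rng = np.random.default_rng(11)

    def polynomial_rrn(slave, master, sample_idx, degree=2):
        """Fit a polynomial mapping slave-image radiometry onto the master image
        using a set of (assumed) no-change sample pixels, then normalize the
        whole slave image with it."""
        coeffs = np.polyfit(slave.ravel()[sample_idx], master.ravel()[sample_idx], degree)
        return np.polyval(coeffs, slave)

    # Toy flight lines: the slave differs by a mild non-linear radiometric shift
    master = rng.uniform(10, 40, size=(200, 200))           # "temperatures"
    slave = 0.9 * master + 0.002 * master ** 2 + 1.5
    sample_idx = rng.choice(master.size, size=500, replace=False)
    normalized = polynomial_rrn(slave, master, sample_idx)
    print("RMS difference before:", np.sqrt(((slave - master) ** 2).mean()).round(3))
    print("RMS difference after :", np.sqrt(((normalized - master) ** 2).mean()).round(3))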

  5. Stratified B-trees and versioning dictionaries

    OpenAIRE

    Twigg, Andy; Byde, Andrew; Milos, Grzegorz; Moreton, Tim; Wilkes, John; Wilkie, Tom

    2011-01-01

    A classic versioned data structure in storage and computer science is the copy-on-write (CoW) B-tree -- it underlies many of today's file systems and databases, including WAFL, ZFS, Btrfs and more. Unfortunately, it doesn't inherit the B-tree's optimality properties; it has poor space utilization, cannot offer fast updates, and relies on random IO to scale. Yet, nothing better has been developed since. We describe the `stratified B-tree', which beats all known semi-external memory versioned B...

  6. Spectral Estimation by the Random Dec Technique

    DEFF Research Database (Denmark)

    Brincker, Rune; Jensen, Jacob L.; Krenk, Steen

    1990-01-01

    This paper contains an empirical study of the accuracy of the Random Dec (RDD) technique. Realizations of the response from a single-degree-of-freedom system loaded by white noise are simulated using an ARMA model. The Autocorrelation function is estimated using the RDD technique and the estimated...

  7. Spectral Estimation by the Random DEC Technique

    DEFF Research Database (Denmark)

    Brincker, Rune; Jensen, J. Laigaard; Krenk, S.

    This paper contains an empirical study of the accuracy of the Random Dec (RDD) technique. Realizations of the response from a single-degree-of-freedom system loaded by white noise are simulated using an ARMA model. The Autocorrelation function is estimated using the RDD technique and the estimated...

  8. Radioisotope Sample Measurement Techniques in Medicine and Biology. Proceedings of the Symposium on Radioisotope Sample Measurement Techniques

    International Nuclear Information System (INIS)

    1965-01-01

    The medical and biological applications of radioisotopes depend on two basically different types of measurements, those on living subjects in vivo and those on samples in vitro. The International Atomic Energy Agency has in the past held several meetings on in vivo measurement techniques, notably whole-body counting and radioisotope scanning. The present volume contains the Proceedings of the first Symposium the Agency has organized to discuss the various aspects of techniques for sample measurement in vitro. The range of these sample measurement techniques is very wide. The sample may weigh a few milligrams or several hundred grams, and may be in the gaseous, liquid or solid state. Its radioactive content may consist of a single, known radioisotope or several unknown ones. The concentration of radioactivity may be low, medium or high. The measurements may be made manually or automatically and any one of the many radiation detectors now available may be used. The 53 papers presented at the Symposium illustrate the great variety of methods now in use for radioactive-sample measurements. The first topic discussed is gamma-ray spectrometry, which finds an increasing number of applications in sample measurements. Other sections of the Proceedings deal with: the use of computers in gamma-ray spectrometry and multiple tracer techniques; recent developments in activation analysis where both gamma-ray spectrometry and computing techniques are applied; thin-layer and paper radiochromatographic techniques for use with low-energy beta-ray emitters; various aspects of liquid scintillation counting techniques in the measurement of alpha- and beta-ray emitters, including chemical and colour quenching; autoradiographic techniques; calibration of equipment; and standardization of radioisotopes. Finally, some applications of solid-state detectors are presented; this section may be regarded as a preview of important future developments. The meeting was attended by 203 participants

  9. Sampling Polya-Gamma random variates: alternate and approximate techniques

    OpenAIRE

    Windle, Jesse; Polson, Nicholas G.; Scott, James G.

    2014-01-01

    Efficiently sampling from the Pólya-Gamma distribution, PG(b,z), is an essential element of Pólya-Gamma data augmentation. Polson et al. (2013) show how to efficiently sample from the PG(1,z) distribution. We build two new samplers that offer improved performance when sampling from the PG(b,z) distribution and b is not unity.
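
    A naive (and much slower) alternative to the samplers discussed above is to truncate the infinite sum-of-gammas representation of the Pólya-Gamma distribution. The sketch below does exactly that and checks the sample mean against the known value E[PG(1,0)] = 1/4; the 200-term truncation is an arbitrary choice.

    import numpy as np

    rng = np.random.default_rng(12)

    def pg_truncated(b, z, n_terms=200):
        """Approximate PG(b, z) draw via the infinite-convolution representation
        X = (1 / (2 pi^2)) * sum_k g_k / ((k - 1/2)^2 + z^2 / (4 pi^2)),
        with g_k ~ Gamma(b, 1), truncated after n_terms terms. This is the
        naive sampler; the exact Polson-Scott-Windle sampler is far faster.
        """
        k = np.arange(1, n_terms + 1)
        g = rng.gamma(shape=b, scale=1.0, size=n_terms)
        return (g / ((k - 0.5) ** 2 + z ** 2 / (4 * np.pi ** 2))).sum() / (2 * np.pi ** 2)

    draws = np.array([pg_truncated(1.0, 0.0) for _ in range(20_000)])
    print("sample mean:", draws.mean().round(4), "(theory: E[PG(1,0)] = 0.25)")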

  10. Efficient sampling of complex network with modified random walk strategies

    Science.gov (United States)

    Xie, Yunya; Chang, Shuhua; Zhang, Zhipeng; Zhang, Mi; Yang, Lei

    2018-02-01

    We present two novel random walk strategies, choosing seed node (CSN) random walk and no-retracing (NR) random walk. Different from the classical random walk sampling, the CSN and NR strategies focus on the influences of the seed node choice and path overlap, respectively. Three random walk samplings are applied in the Erdös-Rényi (ER), Barabási-Albert (BA), Watts-Strogatz (WS), and the weighted USAir networks, respectively. Then, the major properties of sampled subnets, such as sampling efficiency, degree distributions, average degree and average clustering coefficient, are studied. The similar conclusions can be reached with these three random walk strategies. Firstly, the networks with small scales and simple structures are conducive to the sampling. Secondly, the average degree and the average clustering coefficient of the sampled subnet tend to the corresponding values of original networks with limited steps. And thirdly, all the degree distributions of the subnets are slightly biased to the high degree side. However, the NR strategy performs better for the average clustering coefficient of the subnet. In the real weighted USAir networks, some obvious characters like the larger clustering coefficient and the fluctuation of degree distribution are reproduced well by these random walk strategies.
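
    The no-retracing rule is simple to state in code: at each step, exclude the node the walker just came from. The sketch below applies it to an invented ring-with-chords graph standing in for the ER/BA/WS networks of the study.

    import random

    random.seed(13)

    def nr_random_walk(adj, seed_node, n_steps):
        """No-retracing (NR) random walk: at each step, choose uniformly among
        the current node's neighbours except the node we just came from."""
        visited = {seed_node}
        prev, current = None, seed_node
        for _ in range(n_steps):
            choices = [v for v in adj[current] if v != prev]
            if not choices:                # dead end: allow retracing
                choices = adj[current]
            prev, current = current, random.choice(choices)
            visited.add(current)
        return visited

    # Toy ring-with-chords graph as a stand-in for an ER/BA/WS network
    n = 100
    adj = {i: [(i - 1) % n, (i + 1) % n, (i + 7) % n, (i - 7) % n] for i in range(n)}
    subnet = nr_random_walk(adj, seed_node=0, n_steps=200)
    print(f"sampled {len(subnet)} of {n} nodes")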

  11. Nitrogen transformations in stratified aquatic microbial ecosystems

    DEFF Research Database (Denmark)

    Revsbech, N. P.; Risgaard-Petersen, N.; Schramm, A.

    2006-01-01

    New analytical methods such as advanced molecular techniques and microsensors have resulted in new insights about how nitrogen transformations in stratified microbial systems such as sediments and biofilms are regulated at a µm-mm scale. A large and ever-expanding knowledge base about n...

  12. Optimization of refueling-shuffling scheme in PWR core by random search strategy

    International Nuclear Information System (INIS)

    Wu Yuan

    1991-11-01

    A random method for optimizing refueling management in a pressurized water reactor (PWR) core is described. The main purpose of the optimization was to select the 'best' refueling arrangement scheme, the one that would produce maximum economic benefit under certain imposed conditions. To fulfill this goal, an effective optimization strategy, the two-stage random search method, was developed. First, the search is made in a manner similar to the stratified sampling technique, and a local optimum can be reached by comparison of successive results. Then, further random trials are carried out between different strata to try to find the global optimum. In general, the method can be used as a practical tool for conventional fuel management schemes, but it can also be used in studies on optimization of low-leakage fuel management. Some calculations were done for a typical PWR core on a CYBER-180/830 computer. The results show that the proposed method can reach a satisfactory approximation at reasonably low computational cost
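
    A toy version of a two-stage random search conveys the structure described above: a first stage of local, stratified-style moves followed by a second stage of cross-stratum trials aimed at the global optimum. The permutation objective below is an invented stand-in for the core-physics figure of merit.

    import random

    random.seed(14)

    # Toy objective standing in for the core-physics figure of merit: score a
    # permutation (a "refueling arrangement") of assemblies; higher is better.
    def score(arrangement):
        return -sum(abs(a - i) for i, a in enumerate(arrangement))

    def two_stage_random_search(n_positions, n_local=200, n_global=50):
        """Stage 1: local moves (swap adjacent positions) keep the best-so-far
        arrangement. Stage 2: occasional cross-stratum swaps try to escape
        the local optimum."""
        best = list(range(n_positions)); random.shuffle(best)
        for _ in range(n_local):
            cand = best[:]
            i = random.randrange(n_positions - 1)
            cand[i], cand[i + 1] = cand[i + 1], cand[i]   # local swap
            if score(cand) > score(best):
                best = cand
        for _ in range(n_global):
            cand = best[:]
            i, j = random.sample(range(n_positions), 2)   # cross-stratum swap
            cand[i], cand[j] = cand[j], cand[i]
            if score(cand) > score(best):
                best = cand
        return best, score(best)

    arrangement, s = two_stage_random_search(20)
    print("best score:", s)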

  13. RandomSpot: A web-based tool for systematic random sampling of virtual slides.

    Science.gov (United States)

    Wright, Alexander I; Grabsch, Heike I; Treanor, Darren E

    2015-01-01

    This paper describes work presented at the Nordic Symposium on Digital Pathology 2014, Linköping, Sweden. Systematic random sampling (SRS) is a stereological tool, which provides a framework to quickly build an accurate estimation of the distribution of objects or classes within an image, whilst minimizing the number of observations required. RandomSpot is a web-based tool for SRS in stereology, which systematically places equidistant points within a given region of interest on a virtual slide. Each point can then be visually inspected by a pathologist in order to generate an unbiased sample of the distribution of classes within the tissue. Further measurements can then be derived from the distribution, such as the ratio of tumor to stroma. RandomSpot replicates the fundamental principle of traditional light microscope grid-shaped graticules, with the added benefits associated with virtual slides, such as facilitated collaboration and automated navigation between points. Once the sample points have been added to the region(s) of interest, users can download the annotations and view them locally using their virtual slide viewing software. Since its introduction, RandomSpot has been used extensively for international collaborative projects, clinical trials and independent research projects. So far, the system has been used to generate over 21,000 sample sets, and has been used to generate data for use in multiple publications, identifying significant new prognostic markers in colorectal, upper gastro-intestinal and breast cancer. Data generated using RandomSpot also has significant value for training image analysis algorithms using sample point coordinates and pathologist classifications.
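
    The underlying placement rule of systematic random sampling is compact: lay a regular grid over the region of interest and shift the whole grid by a single uniform random offset. A sketch with invented region dimensions and spacing (this is not RandomSpot's code):

    import random

    random.seed(15)

    def systematic_random_points(width, height, spacing):
        """Systematic random sampling: a regular grid of equidistant points with
        one uniformly random offset, as used by stereological graticules."""
        ox = random.uniform(0, spacing)
        oy = random.uniform(0, spacing)
        points = []
        y = oy
        while y < height:
            x = ox
            while x < width:
                points.append((x, y))
                x += spacing
            y += spacing
        return points

    # Hypothetical 10,000 x 8,000 pixel region of interest, 500-pixel spacing
    pts = systematic_random_points(10_000, 8_000, 500)
    print(len(pts), "sample points; first:", pts[0])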

  14. Types, Magnitude, Predictors and Controlling Mechanisms of ...

    African Journals Online (AJOL)

    Multi-stage sampling that involves simple random and stratified sampling techniques was used to select student participants. Accidental sampling techniques were employed to select teacher participants. Questionnaire that contained items on socio-demographic variables, scales on aggression, scales on parenting styles ...

  15. Effects of a random spatial variation of the plasma density on the mode conversion in cold, unmagnetized, and stratified plasmas

    Energy Technology Data Exchange (ETDEWEB)

    Jung Yu, Dae [School of Space Research, Kyung Hee University, Yongin 446-701 (Korea, Republic of); Kim, Kihong [Department of Energy Systems Research, Ajou University, Suwon 443-749 (Korea, Republic of)

    2013-12-15

    We study the effects of a random spatial variation of the plasma density on the mode conversion of electromagnetic waves into electrostatic oscillations in cold, unmagnetized, and stratified plasmas. Using the invariant imbedding method, we calculate precisely the electromagnetic field distribution and the mode conversion coefficient, which is defined to be the fraction of the incident wave power converted into electrostatic oscillations, for the configuration where a numerically generated random density variation is added to the background linear density profile. We repeat similar calculations for a large number of random configurations and take an average of the results. We obtain a peculiar nonmonotonic dependence of the mode conversion coefficient on the strength of randomness. As the disorder increases from zero, the maximum value of the mode conversion coefficient decreases initially, then increases to a maximum, and finally decreases towards zero. The range of the incident angle in which mode conversion occurs increases monotonically as the disorder increases. We present numerical results suggesting that the decrease of mode conversion mainly results from the increased reflection due to the Anderson localization effect originating from disorder, whereas the increase of mode conversion of the intermediate disorder regime comes from the appearance of many resonance points and the enhanced tunneling between the resonance points and the cutoff point. We also find a very large local enhancement of the magnetic field intensity for particular random configurations. In order to obtain high mode conversion efficiency, it is desirable to restrict the randomness close to the resonance region.

  16. Effects of a random spatial variation of the plasma density on the mode conversion in cold, unmagnetized, and stratified plasmas

    International Nuclear Information System (INIS)

    Jung Yu, Dae; Kim, Kihong

    2013-01-01

    We study the effects of a random spatial variation of the plasma density on the mode conversion of electromagnetic waves into electrostatic oscillations in cold, unmagnetized, and stratified plasmas. Using the invariant imbedding method, we calculate precisely the electromagnetic field distribution and the mode conversion coefficient, which is defined to be the fraction of the incident wave power converted into electrostatic oscillations, for the configuration where a numerically generated random density variation is added to the background linear density profile. We repeat similar calculations for a large number of random configurations and take an average of the results. We obtain a peculiar nonmonotonic dependence of the mode conversion coefficient on the strength of randomness. As the disorder increases from zero, the maximum value of the mode conversion coefficient decreases initially, then increases to a maximum, and finally decreases towards zero. The range of the incident angle in which mode conversion occurs increases monotonically as the disorder increases. We present numerical results suggesting that the decrease of mode conversion mainly results from the increased reflection due to the Anderson localization effect originating from disorder, whereas the increase of mode conversion of the intermediate disorder regime comes from the appearance of many resonance points and the enhanced tunneling between the resonance points and the cutoff point. We also find a very large local enhancement of the magnetic field intensity for particular random configurations. In order to obtain high mode conversion efficiency, it is desirable to restrict the randomness close to the resonance region

  17. Acceptance sampling using judgmental and randomly selected samples

    Energy Technology Data Exchange (ETDEWEB)

    Sego, Landon H.; Shulman, Stanley A.; Anderson, Kevin K.; Wilson, John E.; Pulsipher, Brent A.; Sieber, W. Karl

    2010-09-01

    We present a Bayesian model for acceptance sampling where the population consists of two groups, each with different levels of risk of containing unacceptable items. Expert opinion, or judgment, may be required to distinguish between the high- and low-risk groups. Hence, high-risk items are likely to be identified (and sampled) using expert judgment, while the remaining low-risk items are sampled randomly. We focus on the situation where all observed samples must be acceptable. Consequently, the objective of the statistical inference is to quantify the probability that a large percentage of the unsampled items in the population are also acceptable. We demonstrate that traditional (frequentist) acceptance sampling and simpler Bayesian formulations of the problem are essentially special cases of the proposed model. We explore the properties of the model in detail, and discuss the conditions necessary to ensure that required sample sizes are a non-decreasing function of the population size. The method is applicable to a variety of acceptance sampling problems, and, in particular, to environmental sampling where the objective is to demonstrate the safety of reoccupying a remediated facility that has been contaminated with a lethal agent.

  18. Dynamic measurement of liquid film thickness in stratified flow by using ultrasonic echo technique

    International Nuclear Information System (INIS)

    Serizawa, A.; Nagane, K.; Kamei, T.; Kawara, Z.; Ebisu, T.; Torikoshi, K.

    2004-01-01

    We developed a technique to measure the time-dependent local film thickness in stratified air-water flow over a horizontal plate by using the time of flight of an ultrasonic transmission. The ultrasonic echoes reflected at the liquid/air interfaces are detected by conventional ultrasonic instrumentation, and the signals are analyzed by a personal computer, after being digitized by an A/D converter, to give the time of flight for the ultrasonic waves to travel a distance of twice the film thickness. A 3.8 mm diameter probe-type ultrasonic transducer, which transmits and receives 10 MHz ultrasonic waves, was used in the present work. The estimated spatial resolution with this arrangement is 0.075 mm in film thickness for water. The time resolution, which depends on both the A/D converter and the memory capacity, was up to several tens of Hz. We also discussed the sensitivity of the method to the inclination angle of the interfaces. (author)
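
    The conversion at the heart of such a pulse-echo measurement is a one-liner: the pulse crosses the film twice, so the thickness is d = c·t/2. A small illustrative computation follows; the sound speed and the sample time of flight below are assumed round numbers, not values from the paper.

    ```python
    # Film thickness from ultrasonic pulse-echo time of flight: the pulse
    # crosses the film twice, so d = c * t / 2.  Values are illustrative.
    C_WATER = 1480.0  # approximate speed of sound in water, m/s

    def film_thickness_mm(time_of_flight_s, c=C_WATER):
        return c * time_of_flight_s / 2.0 * 1e3  # metres -> millimetres

    # A 2.7 microsecond round trip corresponds to roughly a 2 mm water film:
    print(f"{film_thickness_mm(2.7e-6):.2f} mm")
    ```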

  19. Critical evaluation of sample pretreatment techniques.

    Science.gov (United States)

    Hyötyläinen, Tuulia

    2009-06-01

    Sample preparation before chromatographic separation is the most time-consuming and error-prone part of the analytical procedure. Therefore, selecting and optimizing an appropriate sample preparation scheme is a key factor in the final success of the analysis, and the judicious choice of an appropriate procedure greatly influences the reliability and accuracy of a given analysis. The main objective of this review is to critically evaluate the applicability, disadvantages, and advantages of various sample preparation techniques. Particular emphasis is placed on extraction techniques suitable for both liquid and solid samples.

  20. Development of a sampling strategy and sample size calculation to estimate the distribution of mammographic breast density in Korean women.

    Science.gov (United States)

    Jun, Jae Kwan; Kim, Mi Jin; Choi, Kui Son; Suh, Mina; Jung, Kyu-Won

    2012-01-01

    Mammographic breast density is a known risk factor for breast cancer. To conduct a survey to estimate the distribution of mammographic breast density in Korean women, appropriate sampling strategies for representative and efficient sampling design were evaluated through simulation. Using the target population from the National Cancer Screening Programme (NCSP) for breast cancer in 2009, we verified the distribution estimate by repeating the simulation 1,000 times using stratified random sampling to investigate the distribution of breast density of 1,340,362 women. According to the simulation results, using a sampling design stratifying the nation into three groups (metropolitan, urban, and rural), with a total sample size of 4,000, we estimated the distribution of breast density in Korean women at a level of 0.01% tolerance. Based on the results of our study, a nationwide survey for estimating the distribution of mammographic breast density among Korean women can be conducted efficiently.
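
    A stratified design of this sort is easy to prototype. The sketch below is illustrative only: the stratum sizes, prevalences, and the proportional-allocation rule are assumptions, not the NCSP data. It repeatedly draws a stratified random sample from a synthetic three-stratum population and reports the spread of the resulting estimates.

    ```python
    import random
    import statistics

    def stratified_estimate(strata, n_total, rng):
        """Estimate a population proportion by stratified random sampling with
        proportional allocation.  Each stratum is a list of 0/1 indicators
        (e.g., 'dense breast tissue' yes/no)."""
        N = sum(len(s) for s in strata)
        est = 0.0
        for s in strata:
            n_h = max(1, round(n_total * len(s) / N))  # proportional allocation
            sample = rng.sample(s, min(n_h, len(s)))
            est += (len(s) / N) * (sum(sample) / len(sample))
        return est

    # Toy population: metropolitan / urban / rural strata with different
    # prevalences (sizes and prevalences are invented for illustration).
    rng = random.Random(0)
    strata = [[int(rng.random() < p) for _ in range(size)]
              for p, size in [(0.55, 700_000), (0.50, 450_000), (0.45, 190_000)]]
    estimates = [stratified_estimate(strata, 4000, rng) for _ in range(100)]
    print(statistics.mean(estimates), statistics.pstdev(estimates))
    ```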

  1. FDTD scattered field formulation for scatterers in stratified dispersive media.

    Science.gov (United States)

    Olkkonen, Juuso

    2010-03-01

    We introduce a simple scattered field (SF) technique that enables finite difference time domain (FDTD) modeling of light scattering from dispersive objects residing in stratified dispersive media. The introduced SF technique is verified against the total field scattered field (TFSF) technique. As an application example, we study surface plasmon polariton enhanced light transmission through a 100 nm wide slit in a silver film.

  2. Event-triggered synchronization for reaction-diffusion complex networks via random sampling

    Science.gov (United States)

    Dong, Tao; Wang, Aijuan; Zhu, Huiyun; Liao, Xiaofeng

    2018-04-01

    In this paper, the synchronization problem of reaction-diffusion complex networks (RDCNs) with Dirichlet boundary conditions is considered, where the data is sampled randomly. An event-triggered controller based on the sampled data is proposed, which can reduce the number of controller updates and the communication load. Under this strategy, the synchronization problem of the diffusion complex network is equivalently converted to the stability problem of a reaction-diffusion complex dynamical system with time delay. By using the matrix inequality technique and the Lyapunov method, synchronization conditions for the RDCNs are derived that are dependent on the diffusion term. Moreover, it is found that the proposed control strategy naturally avoids Zeno behavior. Finally, a numerical example is given to verify the obtained results.

  3. Newly introduced sample preparation techniques: towards miniaturization.

    Science.gov (United States)

    Costa, Rosaria

    2014-01-01

    Sampling and sample preparation are of crucial importance in an analytical procedure, representing quite often a source of errors. The technique chosen for the isolation of analytes greatly affects the success of a chemical determination. On the other hand, growing concerns about environmental and human safety, along with the introduction of international regulations for quality control, have moved the interest of scientists towards specific needs. Newly introduced sample preparation techniques are challenged to meet new criteria: (i) miniaturization, (ii) higher sensitivity and selectivity, and (iii) automation. In this survey, the most recent techniques introduced in the field of sample preparation will be described and discussed, along with many examples of applications.

  4. Detection and monitoring of invasive exotic plants: a comparison of four sampling methods

    Science.gov (United States)

    Cynthia D. Huebner

    2007-01-01

    The ability to detect and monitor exotic invasive plants is likely to vary depending on the sampling method employed. Methods with strong qualitative thoroughness for species detection often lack the intensity necessary to monitor vegetation change. Four sampling methods (systematic plot, stratified-random plot, modified Whittaker, and timed meander) in hemlock and red...

  5. The Risk-Stratified Osteoporosis Strategy Evaluation study (ROSE)

    DEFF Research Database (Denmark)

    Rubin, Katrine Hass; Holmberg, Teresa; Rothmann, Mette Juel

    2015-01-01

    The risk-stratified osteoporosis strategy evaluation study (ROSE) is a randomized prospective population-based study investigating the effectiveness of a two-step screening program for osteoporosis in women. This paper reports the study design and baseline characteristics of the study population. A total of 35,000 women aged 65-80 years were selected at random from the population in the Region of Southern Denmark and, before inclusion, randomized to either a screening group or a control group. As the first step, a self-administered questionnaire regarding risk factors for osteoporosis based on FRAX® was issued to both groups. As the second step, subjects in the screening group with a 10-year probability of major osteoporotic fractures ≥15% were offered a DXA scan. Patients diagnosed with osteoporosis from the DXA scan were advised to see their GP and discuss pharmaceutical treatment according to Danish

  6. A new simple technique for improving the random properties of chaos-based cryptosystems

    Science.gov (United States)

    Garcia-Bosque, M.; Pérez-Resa, A.; Sánchez-Azqueta, C.; Celma, S.

    2018-03-01

    A new technique for improving the security of chaos-based stream ciphers has been proposed and tested experimentally. This technique improves the randomness properties of the generated keystream by preventing the system from falling into the short period cycles caused by digitization. In order to test this technique, a stream cipher based on a Skew Tent Map algorithm has been implemented on a Virtex 7 FPGA. The randomness of the keystream generated by this system has been compared to the randomness of the keystream generated by the same system with the proposed randomness-enhancement technique. By subjecting both keystreams to the National Institute of Standards and Technology (NIST) tests, we have proved that our method can considerably improve the randomness of the generated keystreams. Only 41 extra slices were needed to incorporate the randomness-enhancement technique, proving that, apart from being effective, this method is also efficient in terms of area and hardware resources.

  7. Toward cost-efficient sampling methods

    Science.gov (United States)

    Luo, Peng; Li, Yongli; Wu, Chong; Zhang, Guijie

    2015-09-01

    Sampling methods have received much attention in the field of complex networks in general and statistical physics in particular. This paper proposes two new sampling methods based on the idea that a small set of high-degree vertices can carry most of the structural information of a complex network. Both proposed methods are efficient at sampling high-degree nodes, so they remain useful even when the sampling rate is low, which makes them cost-efficient. The first new sampling method builds on the widely used stratified random sampling (SRS) method, and the second improves the well-known snowball sampling (SBS) method. In order to demonstrate the validity and accuracy of the two new sampling methods, we compare them with existing sampling methods in three commonly used simulated networks (scale-free, random, and small-world) and in two real networks. The experimental results illustrate that the two proposed sampling methods perform much better than the existing ones in recovering the true network structure characteristics reflected by the clustering coefficient, Bonacich centrality and average path length, especially when the sampling rate is low.
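
    As an illustration of the degree-stratified idea, a hedged sketch follows; it is not the authors' algorithm, and the stratification cut-off and budget split below are arbitrary assumptions. The top fraction of nodes by degree forms a "hub" stratum that receives a fixed share of the sample budget.

    ```python
    import random

    def degree_stratified_sample(adj, n, top_frac=0.2, top_share=0.5, seed=0):
        """Degree-based stratified sampling sketch: put the top `top_frac` of
        nodes by degree in one stratum and give it `top_share` of the budget,
        reflecting the idea that hubs carry most structural information.

        adj: dict mapping node -> list of neighbours.
        """
        rng = random.Random(seed)
        nodes = sorted(adj, key=lambda v: len(adj[v]), reverse=True)
        cut = max(1, int(top_frac * len(nodes)))
        hubs, rest = nodes[:cut], nodes[cut:]
        n_hub = min(len(hubs), int(top_share * n))
        return rng.sample(hubs, n_hub) + rng.sample(rest, min(len(rest), n - n_hub))

    # Toy adjacency: a ring where node 0 gets extra chords, making it a hub.
    adj = {i: [(i - 1) % 50, (i + 1) % 50] for i in range(50)}
    for t in (10, 20, 30, 40):
        adj[0].append(t); adj[t].append(0)
    print(degree_stratified_sample(adj, n=10))
    ```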

  8. Visualization of mole fraction distribution of slow jet forming stably stratified field

    International Nuclear Information System (INIS)

    Fumizawa, Motoo; Hishida, Makoto

    1990-01-01

    An experimental study has been performed to investigate the behavior of flow and mass transfer in a gaseous slow jet in which the buoyancy force opposed the flow, forming a stably stratified field. The study aimed at understanding the basic features of air ingress phenomena in a pipe rupture accident of the high temperature gas-cooled reactor. A displacement fringe technique was adopted in a Mach-Zehnder interferometer to visualize the mole fraction distribution. As a result, the following was found: (1) stably stratified fields were formed in the vicinity of the outlet of the slow jet, and the penetration distance of the stably stratified fields increased with Froude number; (2) mass fraction distributions in the stably stratified fields were well correlated with the present model using the ramp mole velocity profile. (author)

  9. Experimental and numerical investigation of stratified gas-liquid flow in inclined circular pipes

    International Nuclear Information System (INIS)

    Faccini, J.L.H.; Sampaio, P.A.B. de; Botelho, M.H.D.S.; Cunha, M.V.; Cunha Filho, J.S.; Su, J.

    2012-01-01

    In this paper, stratified gas-liquid flow is investigated experimentally and numerically. Two measurement techniques are applied on an inclined circular test section: an ultrasonic technique using a fast single-transducer pulse-echo method, and a visualization technique using a high-speed camera. A numerical model is employed to simulate the stratified gas-liquid flow, formed by a system of non-linear differential equations consisting of the Reynolds-averaged Navier-Stokes equations with the κ-ω turbulence model. The test section used in this work consists mainly of a transparent circular pipe with an inner diameter of 1 inch and inclination angles varying from -2.5 to -10.0 degrees. Numerical solutions are obtained for the liquid height as a function of inclination angle and compared with our own experimental data. (author)

  10. National Coral Reef Monitoring Program: Benthic Cover Derived from Analysis of Benthic Images Collected during Stratified Random Surveys (StRS) across the Mariana Archipelago from 2014-03-25 to 2014-05-07 (NCEI Accession 0159148)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The data described here result from benthic photo-quadrat surveys conducted along transects at stratified random sites across the Mariana archipelago in 2014 as a...

  11. National Coral Reef Monitoring Program: benthic cover derived from analysis of benthic images collected during stratified random surveys (StRS) across the Hawaiian Archipelago from 2016-07-13 to 2016-09-27 (NCEI Accession 0164295)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The data described here result from benthic photo-quadrat surveys conducted along transects at stratified random sites across the Hawaiian archipelago in 2016 as a...

  12. Department of Geography, Ibra

    African Journals Online (AJOL)

    USER

    2016-09-07

    Sep 7, 2016 ... sample unit was determined using stratified sampling method and simple random technique ... demands of the population, (WWAP, ... the national average of 1.8%. .... Optimal Access ... updated figured was used to calculate.

  13. Proteomic Challenges: Sample Preparation Techniques for Microgram-Quantity Protein Analysis from Biological Samples

    Directory of Open Access Journals (Sweden)

    Peter Feist

    2015-02-01

    Proteins regulate many cellular functions and analyzing the presence and abundance of proteins in biological samples are central focuses in proteomics. The discovery and validation of biomarkers, pathways, and drug targets for various diseases can be accomplished using mass spectrometry-based proteomics. However, with mass-limited samples like tumor biopsies, it can be challenging to obtain sufficient amounts of proteins to generate high-quality mass spectrometric data. Techniques developed for macroscale quantities recover sufficient amounts of protein from milligram quantities of starting material, but sample losses become crippling with these techniques when only microgram amounts of material are available. To combat this challenge, proteomicists have developed micro-scale techniques that are compatible with decreased sample size (100 μg or lower) and still enable excellent proteome coverage. Extraction, contaminant removal, protein quantitation, and sample handling techniques for the microgram protein range are reviewed here, with an emphasis on liquid chromatography and bottom-up mass spectrometry-compatible techniques. Also, a range of biological specimens, including mammalian tissues and model cell culture systems, are discussed.

  14. Proteomic Challenges: Sample Preparation Techniques for Microgram-Quantity Protein Analysis from Biological Samples

    Science.gov (United States)

    Feist, Peter; Hummon, Amanda B.

    2015-01-01

    Proteins regulate many cellular functions and analyzing the presence and abundance of proteins in biological samples are central focuses in proteomics. The discovery and validation of biomarkers, pathways, and drug targets for various diseases can be accomplished using mass spectrometry-based proteomics. However, with mass-limited samples like tumor biopsies, it can be challenging to obtain sufficient amounts of proteins to generate high-quality mass spectrometric data. Techniques developed for macroscale quantities recover sufficient amounts of protein from milligram quantities of starting material, but sample losses become crippling with these techniques when only microgram amounts of material are available. To combat this challenge, proteomicists have developed micro-scale techniques that are compatible with decreased sample size (100 μg or lower) and still enable excellent proteome coverage. Extraction, contaminant removal, protein quantitation, and sample handling techniques for the microgram protein range are reviewed here, with an emphasis on liquid chromatography and bottom-up mass spectrometry-compatible techniques. Also, a range of biological specimens, including mammalian tissues and model cell culture systems, are discussed. PMID:25664860

  15. Proteomic challenges: sample preparation techniques for microgram-quantity protein analysis from biological samples.

    Science.gov (United States)

    Feist, Peter; Hummon, Amanda B

    2015-02-05

    Proteins regulate many cellular functions and analyzing the presence and abundance of proteins in biological samples are central focuses in proteomics. The discovery and validation of biomarkers, pathways, and drug targets for various diseases can be accomplished using mass spectrometry-based proteomics. However, with mass-limited samples like tumor biopsies, it can be challenging to obtain sufficient amounts of proteins to generate high-quality mass spectrometric data. Techniques developed for macroscale quantities recover sufficient amounts of protein from milligram quantities of starting material, but sample losses become crippling with these techniques when only microgram amounts of material are available. To combat this challenge, proteomicists have developed micro-scale techniques that are compatible with decreased sample size (100 μg or lower) and still enable excellent proteome coverage. Extraction, contaminant removal, protein quantitation, and sample handling techniques for the microgram protein range are reviewed here, with an emphasis on liquid chromatography and bottom-up mass spectrometry-compatible techniques. Also, a range of biological specimens, including mammalian tissues and model cell culture systems, are discussed.

  16. The contribution of simple random sampling to observed variations in faecal egg counts.

    Science.gov (United States)

    Torgerson, Paul R; Paul, Michaela; Lewis, Fraser I

    2012-09-10

    It has been over 100 years since the classical paper published by Gosset in 1907, under the pseudonym "Student", demonstrated that yeast cells suspended in a fluid and measured by a haemocytometer conform to a Poisson process. Similarly, parasite eggs in a faecal suspension also conform to a Poisson process. Despite this, there are common misconceptions about how to analyse or interpret observations from the McMaster or similar quantitative parasitic diagnostic techniques, widely used for evaluating parasite eggs in faeces. The McMaster technique can easily be shown, from a theoretical perspective, to give variable results that inevitably arise from the random distribution of parasite eggs in a well-mixed faecal sample. The Poisson processes that lead to this variability are described, with illustrative examples of the potentially large confidence intervals that can arise from faecal egg counts calculated from the observations on a McMaster slide. Attempts to modify the McMaster technique, or indeed other quantitative techniques, to ensure uniform egg counts are doomed to failure and belie ignorance of Poisson processes. A simple method to immediately identify excess variation/poor sampling from replicate counts is provided. Copyright © 2012 Elsevier B.V. All rights reserved.
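
    Because a raw McMaster count is a single Poisson observation, an exact confidence interval for the eggs-per-gram (EPG) value follows directly from chi-squared quantiles. A small illustrative sketch follows; the multiplication factor of 50 is a common McMaster convention assumed here, not a value taken from the paper.

    ```python
    from scipy.stats import chi2

    def epg_confidence_interval(eggs_counted, multiplier=50, conf=0.95):
        """Exact Poisson confidence interval for a McMaster-style egg count.

        eggs_counted: raw eggs seen on the slide (a Poisson observation).
        multiplier:   eggs-per-gram represented by each egg counted
                      (50 is a common McMaster factor; an assumption here).
        """
        a = 1 - conf
        lo = 0.0 if eggs_counted == 0 else chi2.ppf(a / 2, 2 * eggs_counted) / 2
        hi = chi2.ppf(1 - a / 2, 2 * (eggs_counted + 1)) / 2
        return lo * multiplier, hi * multiplier

    # 4 eggs on the slide -> nominal 200 EPG, but the 95% interval is wide:
    print(epg_confidence_interval(4))  # roughly (54, 512) EPG
    ```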

  17. NAIL SAMPLING TECHNIQUE AND ITS INTERPRETATION

    Directory of Open Access Journals (Sweden)

    TZAR MN

    2011-01-01

    The clinical suspicion of onychomycosis, based on the appearance of the nails, requires culture for confirmation. This is because treatment requires prolonged use of systemic agents, which may cause side effects. One of the common problems encountered is improper nail sampling technique, which results in the loss of essential information. The unfamiliar terminology used in reporting culture results may intimidate physicians, resulting in misinterpretation and hampering treatment decisions. This article provides a simple guide to nail sampling technique and the interpretation of culture results.

  18. Identification of System Parameters by the Random Decrement Technique

    DEFF Research Database (Denmark)

    Brincker, Rune; Kirkegaard, Poul Henning; Rytter, Anders

    1991-01-01

    The aim of this paper is to investigate and illustrate the possibilities of using correlation functions estimated by the Random Decrement Technique as a basis for parameter identification. A two-stage system identification approach is used: first, the correlation functions are estimated by the Random Decrement Technique, and then the system parameters are identified from the correlation function estimates. Three different techniques are used in the parameter identification process: a simple non-parametric method, estimation of an Auto Regressive (AR) model by solving an overdetermined set of Yule-Walker equations and, finally, least-squares fitting of the theoretical correlation function. The results are compared to the results of fitting an Auto Regressive Moving Average (ARMA) model directly to the system output from a single-degree-of-freedom system loaded by white noise.
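
    The Random Decrement signature itself is simple to compute: the segments of the response that follow each crossing of a trigger level are averaged, and for a stationary Gaussian response the average is proportional to the correlation function. Below is a minimal NumPy sketch under these assumptions; the trigger rule, segment length, and the toy oscillator are illustrative choices, not the paper's setup.

    ```python
    import numpy as np

    def random_decrement(x, trigger, seg_len):
        """Random Decrement signature: average the seg_len-sample segments that
        follow each up-crossing of the trigger level; for a stationary Gaussian
        response this average is proportional to the correlation function."""
        starts = np.where((x[:-1] < trigger) & (x[1:] >= trigger))[0] + 1
        starts = starts[starts + seg_len <= len(x)]
        if len(starts) == 0:
            raise ValueError("no trigger crossings found")
        return np.stack([x[s:s + seg_len] for s in starts]).mean(axis=0)

    # Demo: simulate a lightly damped SDOF oscillator driven by white noise
    # (simple Euler scheme; parameters are arbitrary illustrative choices).
    rng = np.random.default_rng(0)
    dt, wn, zeta = 0.01, 2 * np.pi, 0.02
    x = np.zeros(100_000)
    v = 0.0
    for i in range(1, len(x)):
        a = -2 * zeta * wn * v - wn ** 2 * x[i - 1] + rng.standard_normal() / np.sqrt(dt)
        v += a * dt
        x[i] = x[i - 1] + v * dt
    signature = random_decrement(x, trigger=x.std(), seg_len=400)
    print(signature[:5])  # a decaying oscillation, like the correlation function
    ```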

  19. Random vs. systematic sampling from administrative databases involving human subjects.

    Science.gov (United States)

    Hagino, C; Lo, R J

    1998-09-01

    Two sampling techniques, simple random sampling (SRS) and systematic sampling (SS), were compared to determine whether they yield similar and accurate distributions for the following four factors: age, gender, geographic location and years in practice. Any point estimate within 7 yr or 7 percentage points of its reference standard (SRS or the entire data set, i.e., the target population) was considered "acceptably similar" to the reference standard. The sampling frame was the entire membership database of the Canadian Chiropractic Association. The two sampling methods were tested using eight different sample sizes (n = 50, 100, 150, 200, 250, 300, 500, 800). From the profile/characteristics summaries of the four known factors [gender, average age, number (%) of chiropractors in each province, and years in practice], between- and within-method chi-square tests and unpaired t-tests were performed to determine whether any of the differences [descriptively greater than 7% or 7 yr] were also statistically significant. The strengths of the agreements between the provincial distributions were quantified by calculating the percent agreement for each (provincial pairwise-comparison method). Any percent agreement less than 70% was judged unacceptable. Our assessments of the two sampling methods (SRS and SS) for the different sample sizes tested suggest that SRS and SS yield acceptably similar results. Both methods started to yield "correct" sample profiles at approximately the same sample size (n > 200). SS is not only convenient, it can be recommended for sampling from large databases in which the data are listed without any inherent order bias other than alphabetical listing by surname.
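
    The two designs are easy to contrast in a few lines. A hedged sketch follows; the toy frame and the summary statistic are invented for illustration, whereas the study's actual frame was the association's membership database.

    ```python
    import random

    def simple_random_sample(frame, n, seed=None):
        """SRS: every subset of size n is equally likely."""
        return random.Random(seed).sample(frame, n)

    def systematic_sample(frame, n, seed=None):
        """Systematic sampling: random start in the first interval, then every
        k-th record from the (e.g., alphabetically ordered) frame."""
        k = len(frame) // n
        start = random.Random(seed).randrange(k)
        return frame[start::k][:n]

    # Toy frame of (age, years_in_practice) records:
    rng = random.Random(1)
    frame = [(rng.gauss(45, 10), rng.gauss(15, 8)) for _ in range(6000)]
    for method in (simple_random_sample, systematic_sample):
        s = method(frame, 250, seed=7)
        print(method.__name__, sum(a for a, _ in s) / len(s))  # mean age
    ```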

  20. Development of sampling techniques for ITER Type B radwaste

    International Nuclear Information System (INIS)

    Hong, Kwon Pyo; Kim, Sung Geun; Jung, Sang Hee; Oh, Wan Ho; Park, Myung Chul; Kim, Hee Moon; Ahn, Sang Bok

    2016-01-01

    There are several difficulties and limitations in the sampling activities. As the Type B radwaste components are mostly metallic (typically stainless steel) and bulky (∼1 m in size and ∼100 mm in thickness), it is difficult to take samples from the surface of Type B radwaste by remote operation. Moreover, sampling should be performed without the use of any liquid coolant, to avoid the spread of contamination, and all sampling procedures are carried out in the hot cell red zone by remote operation. Three kinds of sampling techniques are being developed: core sampling, chip sampling, and wedge sampling. These are the candidate sampling techniques to be applied to the ITER hot cell. The object materials for sampling are stainless steel and Cu alloy blocks that simulate ITER Type B radwaste. The best of the three sampling techniques for ITER Type B radwaste will be suggested in several months, after the related experiment is finished

  1. Development of sampling techniques for ITER Type B radwaste

    Energy Technology Data Exchange (ETDEWEB)

    Hong, Kwon Pyo; Kim, Sung Geun; Jung, Sang Hee; Oh, Wan Ho; Park, Myung Chul; Kim, Hee Moon; Ahn, Sang Bok [KAERI, Daejeon (Korea, Republic of)

    2016-05-15

    There are several difficulties and limitations in the sampling activities. As the Type B radwaste components are mostly metallic (typically stainless steel) and bulky (∼1 m in size and ∼100 mm in thickness), it is difficult to take samples from the surface of Type B radwaste by remote operation. Moreover, sampling should be performed without the use of any liquid coolant, to avoid the spread of contamination, and all sampling procedures are carried out in the hot cell red zone by remote operation. Three kinds of sampling techniques are being developed: core sampling, chip sampling, and wedge sampling. These are the candidate sampling techniques to be applied to the ITER hot cell. The object materials for sampling are stainless steel and Cu alloy blocks that simulate ITER Type B radwaste. The best of the three sampling techniques for ITER Type B radwaste will be suggested in several months, after the related experiment is finished.

  2. Rationale and Design of Khuzestan Vitamin D Deficiency Screening Program in Pregnancy: A Stratified Randomized Vitamin D Supplementation Controlled Trial.

    Science.gov (United States)

    Rostami, Maryam; Ramezani Tehrani, Fahimeh; Simbar, Masoumeh; Hosseinpanah, Farhad; Alavi Majd, Hamid

    2017-04-07

    Although there have been marked improvements in our understanding of vitamin D functions in different diseases, gaps in our knowledge of its role during pregnancy remain. Because there is no consensus on the most accurate marker of vitamin D deficiency during pregnancy, or on the optimal level of 25-hydroxyvitamin D, 25(OH)D, for its definition, assessing vitamin D deficiency during pregnancy is a complicated process. In addition, the optimal protocol for treating hypovitaminosis D and its effect on maternal and neonatal outcomes are still unclear. The aim of our study was to estimate the prevalence of vitamin D deficiency in the first trimester of pregnancy and to compare a vitamin D screening strategy with no screening. We also intended to compare the effectiveness of various treatment regimens on maternal and neonatal outcomes in the Masjed-Soleyman and Shushtar cities of Khuzestan province, Iran. This was a two-phase study. First, a population-based cross-sectional study was conducted, recruiting 1600 and 900 first-trimester pregnant women from health centers of Masjed-Soleyman and Shushtar, respectively, using stratified multistage cluster sampling with the probability proportional to size (PPS) method. Second, to assess the effect of the screening strategy on maternal and neonatal outcomes, Masjed-Soleyman participants were assigned to a screening program while Shushtar participants became the non-screening arm. Within the framework of the screening regimen, an 8-arm blind randomized clinical trial was undertaken to compare the effects of various treatment protocols. A total of 800 pregnant women with vitamin D deficiency were selected, using simple random sampling from the 1600 Masjed-Soleyman individuals, as the intervention groups. Serum concentrations of 25(OH)D were classified as severely deficient, moderately deficient, or normal (>20 ng/ml). Those with severe and moderate deficiency were randomly divided into 4 subgroups and received vitamin D3 according to protocol and were followed until delivery. Data was analyzed

  3. Boat sampling technique for assessment of ageing of components

    International Nuclear Information System (INIS)

    Kumar, Kundan; Shyam, T.V.; Kayal, J.N.; Rupani, B.B.

    2006-01-01

    Boat sampling technique (BST) is a surface sampling technique developed for obtaining, in situ, metal samples from the surface of an operating component without affecting its operating service life. The BST is non-destructive, and the sample is obtained without plastic deformation or thermal degradation of the parent material. The shape and size of the sample depend upon the shape of the cutter and the surface geometry of the parent material. Miniature test specimens are generated from the sample and subjected to various tests, viz. metallurgical evaluation, metallographic evaluation, micro-hardness evaluation, sensitisation testing, small punch testing, etc., to confirm the integrity and assess the safe operating life of the component. This paper highlights the design objectives of the boat sampling technique; the description of the sampling module, the sampling cutter and its performance evaluation; the cutting process; the boat samples; the operational sequence of the sampling module; the qualification of the sampling module, the sampling technique and the scooped region of the parent material; the sample retrieval system; and the inspection, testing and examination to be carried out on the boat samples and the scooped region. (author)

  4. Unwilling or Unable to Cheat? Evidence from a Randomized Tax Audit Experiment in Denmark

    OpenAIRE

    Henrik J. Kleven; Martin B. Knudsen; Claus T. Kreiner; Søren Pedersen; Emmanuel Saez

    2010-01-01

    This paper analyzes a randomized tax enforcement experiment in Denmark. In the base year, a stratified and representative sample of over 40,000 individual income tax filers was selected for the experiment. Half of the tax filers were randomly selected to be thoroughly audited, while the rest were deliberately not audited. The following year, "threat-of-audit" letters were randomly assigned and sent to tax filers in both groups. Using comprehensive administrative tax data, we present four main...

  5. SOME SYSTEMATIC SAMPLING STRATEGIES USING MULTIPLE RANDOM STARTS

    OpenAIRE

    Sampath Sundaram; Ammani Sivaraman

    2010-01-01

    In this paper an attempt is made to extend linear systematic sampling using multiple random starts due to Gautschi (1957) to various types of systematic sampling schemes available in the literature, namely (i) Balanced Systematic Sampling (BSS) of Sethi (1965) and (ii) Modified Systematic Sampling (MSS) of Singh, Jindal, and Garg (1968). Further, the proposed methods were compared with the Yates corrected estimator developed with reference to Gautschi's linear systematic samplin...

  6. Identification of System Parameters by the Random Decrement Technique

    DEFF Research Database (Denmark)

    Brincker, Rune; Kirkegaard, Poul Henning; Rytter, Anders

    The aim of this paper is to investigate and illustrate the possibilities of using correlation functions estimated by the Random Decrement technique as a basis for parameter identification. A two-stage system identification method is used: first the correlation functions are estimated by the Random Decrement technique and then the system parameters are identified from the correlation function estimates. Three different techniques are used in the parameter identification process: a simple non-parametric method, estimation of an Auto Regressive (AR) model by solving an overdetermined set of Yule-Walker equations and finally least-squares fitting of the theoretical correlation function. The results are compared to the results of fitting an Auto Regressive Moving Average (ARMA) model directly to the system output. All investigations are performed on the simulated output from a single-degree-of-freedom system

  7. National Coral Reef Monitoring Program: Benthic Cover Derived from Analysis of Benthic Images Collected during Stratified Random Surveys (StRS) across the Pacific Remote Island Areas from 2015-01-26 to 2015-04-28 (NCEI Accession 0159165)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The data described here result from benthic photo-quadrat surveys conducted along transects at stratified random sites across the Pacific Remote Island Areas since...

  8. Assessment of the effect of population and diary sampling methods on estimation of school-age children exposure to fine particles.

    Science.gov (United States)

    Che, W W; Frey, H Christopher; Lau, Alexis K H

    2014-12-01

    Population and diary sampling methods are employed in exposure models to sample simulated individuals and their daily activity on each simulation day. Different sampling methods may lead to variations in estimated human exposure. In this study, two population sampling methods (stratified-random and random-random) and three diary sampling methods (random resampling, diversity and autocorrelation, and Markov-chain cluster [MCC]) are evaluated. Their impacts on estimated children's exposure to ambient fine particulate matter (PM2.5) are quantified via case studies for children in Wake County, NC for July 2002. The estimated mean daily average exposure is 12.9 μg/m³ for simulated children using the stratified population sampling method, and 12.2 μg/m³ using the random sampling method. These minor differences are caused by the random sampling among ages within census tracts. Among the three diary sampling methods, there are differences in the estimated number of individuals with multiple days of exposures exceeding a benchmark of concern of 25 μg/m³ due to differences in how multiday longitudinal diaries are estimated. The MCC method is relatively more conservative. In case studies evaluated here, the MCC method led to 10% higher estimation of the number of individuals with repeated exposures exceeding the benchmark. The comparisons help to identify and contrast the capabilities of each method and to offer insight regarding implications of method choice. Exposure simulation results are robust to the two population sampling methods evaluated, and are sensitive to the choice of method for simulating longitudinal diaries, particularly when analyzing results for specific microenvironments or for exposures exceeding a benchmark of concern. © 2014 Society for Risk Analysis.

  9. Non-terminal blood sampling techniques in guinea pigs.

    Science.gov (United States)

    Birck, Malene M; Tveden-Nyborg, Pernille; Lindblad, Maiken M; Lykkesfeldt, Jens

    2014-10-11

    Guinea pigs possess several biological similarities to humans and are validated experimental animal models (1-3). However, the use of guinea pigs currently represents a relatively narrow area of research, and descriptive data on specific methodology are correspondingly scarce. The anatomical features of guinea pigs are slightly different from those of other rodent models; hence, modulation of sampling techniques to accommodate species-specific differences, e.g., compared to mice and rats, is necessary to obtain sufficient and high-quality samples. As both long- and short-term in vivo studies often require repeated blood sampling, the choice of technique should be well considered in order to reduce stress and discomfort in the animals, but also to ensure survival as well as compliance with requirements of sample size and accessibility. Venous blood samples can be obtained at a number of sites in guinea pigs, e.g., the saphenous and jugular veins, each technique having both advantages and disadvantages (4,5). Here, we present four different blood sampling techniques for either conscious or anaesthetized guinea pigs. The procedures are all non-terminal procedures, provided that sample volumes and number of samples do not exceed guidelines for blood collection in laboratory animals (6). All the described methods have been thoroughly tested and applied for repeated in vivo blood sampling in studies within our research facility.

  10. Stratifying empiric risk of schizophrenia among first degree relatives using multiple predictors in two independent Indian samples.

    Science.gov (United States)

    Bhatia, Triptish; Gettig, Elizabeth A; Gottesman, Irving I; Berliner, Jonathan; Mishra, N N; Nimgaonkar, Vishwajit L; Deshpande, Smita N

    2016-12-01

    Schizophrenia (SZ) has an estimated heritability of 64-88%, with the higher values based on twin studies. Conventionally, family history of psychosis is the best individual-level predictor of risk, but reliable risk estimates are unavailable for Indian populations. Genetic, environmental, and epigenetic factors are equally important and should be considered when predicting risk in 'at risk' individuals. Our aim was to estimate risk based on an Indian schizophrenia participant's family history combined with selected demographic factors. To incorporate variables in addition to family history, and to stratify risk, we constructed a regression equation that included demographic variables in addition to family history. The equation was tested in two independent Indian samples: (i) an initial sample of SZ participants (N=128) with one sibling or offspring; (ii) a second, independent sample consisting of multiply affected families (N=138 families, with two or more sibs/offspring affected with SZ). In the initial sample, the overall estimated risk was 4.31±0.27 (mean±standard deviation), with 19 (14.8%) individuals in the high-risk group, 75 (58.6%) in the moderate-risk group, and 34 (26.6%) in the above-average-risk group. In the validation sample, risks were distributed as: high (45%), moderate (38%) and above average (17%). Consistent risk estimates were obtained from both samples using the regression equation. Familial risk can be combined with demographic factors to estimate risk for SZ in India. If replicated, the proposed stratification of risk may be easier and more realistic for family members. Copyright © 2016. Published by Elsevier B.V.

  11. Screen Space Ambient Occlusion Based Multiple Importance Sampling for Real-Time Rendering

    Science.gov (United States)

    Zerari, Abd El Mouméne; Babahenini, Mohamed Chaouki

    2018-03-01

    We propose a new approximation technique for accelerating the Global Illumination algorithm for real-time rendering. The proposed approach is based on the Screen-Space Ambient Occlusion (SSAO) method, which approximates the global illumination for large, fully dynamic scenes at interactive frame rates. Current algorithms that are based on the SSAO method suffer from difficulties due to the large number of samples that are required. In this paper, we propose an improvement to the SSAO technique by integrating it with a Multiple Importance Sampling technique that combines a stratified sampling method with an importance sampling method, with the objective of reducing the number of samples. Experimental evaluation demonstrates that our technique can produce high-quality images in real time and is significantly faster than traditional techniques.
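
    The combination strategy the paper relies on, multiple importance sampling, can be shown in miniature outside any renderer. The sketch below is a generic balance-heuristic estimator for a 1-D integral; the integrand and the two sampling strategies are invented for illustration and are unrelated to the SSAO kernel. Each draw is weighted by its own density divided by the sum of both densities.

    ```python
    import math
    import random

    def mis_estimate(f, pdf_a, sample_a, pdf_b, sample_b, n, seed=0):
        """Multiple importance sampling with the balance heuristic: draw n
        samples from each of two strategies and weight each draw by
        pdf_own / (pdf_a + pdf_b), which keeps the combined estimator unbiased."""
        rng = random.Random(seed)
        total = 0.0
        for sampler, pdf in ((sample_a, pdf_a), (sample_b, pdf_b)):
            for _ in range(n):
                x = sampler(rng)
                w = pdf(x) / (pdf_a(x) + pdf_b(x))  # balance heuristic weight
                total += w * f(x) / pdf(x)
        return total / n

    # Integrate f(x) = x^4 on [0, 1) by combining a uniform sampler with an
    # importance sampler whose density 2x favors the right half of the domain.
    f = lambda x: x ** 4
    uni_pdf, uni_smp = (lambda x: 1.0), (lambda r: r.random())
    lin_pdf, lin_smp = (lambda x: 2 * x), (lambda r: math.sqrt(r.random()))
    print(mis_estimate(f, uni_pdf, uni_smp, lin_pdf, lin_smp, 10_000))  # ~0.2
    ```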

  12. Effectiveness of a two-step population-based osteoporosis screening program using FRAX: the randomized Risk-stratified Osteoporosis Strategy Evaluation (ROSE) study.

    Science.gov (United States)

    Rubin, K H; Rothmann, M J; Holmberg, T; Høiberg, M; Möller, S; Barkmann, R; Glüer, C C; Hermann, A P; Bech, M; Gram, J; Brixen, K

    2018-03-01

    The Risk-stratified Osteoporosis Strategy Evaluation (ROSE) study investigated the effectiveness of a two-step screening program for osteoporosis in women. We found no overall reduction in fractures from systematic screening compared to the current case-finding strategy. The group of moderate- to high-risk women, who accepted the invitation to DXA, seemed to benefit from the program. The purpose of the ROSE study was to investigate the effectiveness of a two-step population-based osteoporosis screening program using the Fracture Risk Assessment Tool (FRAX) derived from a self-administered questionnaire to select women for DXA scan. After the scanning, standard osteoporosis management according to Danish national guidelines was followed. Participants were randomized to either screening or control group, and randomization was stratified according to age and area of residence. Inclusion took place from February 2010 to November 2011. Participants received a self-administered questionnaire, and women in the screening group with a FRAX score ≥ 15% (major osteoporotic fractures) were invited to a DXA scan. Primary outcome was incident clinical fractures. Intention-to-treat analysis and two per-protocol analyses were performed. A total of 3416 fractures were observed during a median follow-up of 5 years. No significant differences were found in the intention-to-treat analyses with 34,229 women included aged 65-80 years. The per-protocol analyses showed a risk reduction in the group that underwent DXA scanning compared to women in the control group with a FRAX ≥ 15%, in regard to major osteoporotic fractures, hip fractures, and all fractures. The risk reduction was most pronounced for hip fractures (adjusted SHR 0.741, p = 0.007). Compared to an office-based case-finding strategy, the two-step systematic screening strategy had no overall effect on fracture incidence. The two-step strategy seemed, however, to be beneficial in the group of women who were

  13. Short-term outcome after Onstep versus Lichtenstein technique for inguinal hernia repair

    DEFF Research Database (Denmark)

    Andresen, K; Burcharth, J; Fonnes, S

    2015-01-01

    The aim of this study was to investigate whether there were differences in early postoperative pain during the first 10 days between the Onstep and the Lichtenstein techniques. METHODS: This was a double-blinded, randomized clinical trial conducted in five surgical departments in Denmark, from April 2013 to June 2014. Eligible participants for this study were male patients, >18 years, with a primary inguinal hernia. The experimental treatment in this study was the Onstep technique, which was compared with the Lichtenstein repair. The primary outcome was postoperative pain during the first 10 days following surgery. Secondary outcomes included duration of surgery, period for return to normal daily activities (days), and recurrence. Randomization was done in blocks and stratified on centers. Participants and study personnel handling questionnaires and analysis were blinded to the allocation. RESULTS: In total, 290 participants were randomized. We found

  14. Interactive Fuzzy Goal Programming approach in multi-response stratified sample surveys

    Directory of Open Access Journals (Sweden)

    Gupta Neha

    2016-01-01

    In this paper, we apply an Interactive Fuzzy Goal Programming (IFGP) approach with linear, exponential and hyperbolic membership functions, which focuses on maximizing the minimum membership values, to determine the preferred compromise solution for the multi-response stratified surveys problem. The problem is formulated as a Multi-Objective Non-Linear Programming Problem (MONLPP) and, by linearizing the nonlinear objective functions at their individual optimum solutions, approximated by an Integer Linear Programming Problem (ILPP). A numerical example based on real data is given, and a comparison with some existing allocations, viz. Cochran's compromise allocation, Chatterjee's compromise allocation and Khowaja's compromise allocation, is made to demonstrate the utility of the approach.
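
    For context, the simplest classical baseline that compromise allocations compete with can be sketched in a few lines: compute the Neyman allocation separately for each response and average them. This is a crude illustration only, not the IFGP method of the paper, and the stratum sizes and standard deviations below are invented.

    ```python
    def neyman_allocation(N_h, S_h, n):
        """Neyman allocation for one response: n_h proportional to N_h * S_h."""
        w = [N * S for N, S in zip(N_h, S_h)]
        return [n * x / sum(w) for x in w]

    def average_compromise(N_h, S_h_per_response, n):
        """A simple compromise allocation: average the per-response Neyman
        allocations (one crude alternative to the IFGP formulation)."""
        allocs = [neyman_allocation(N_h, S_h, n) for S_h in S_h_per_response]
        return [round(sum(col) / len(allocs)) for col in zip(*allocs)]

    # Three strata, two responses with different stratum standard deviations:
    print(average_compromise([400, 300, 300], [[5, 2, 1], [1, 4, 3]], n=120))
    ```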

  15. Health plan auditing: 100-percent-of-claims vs. random-sample audits.

    Science.gov (United States)

    Sillup, George P; Klimberg, Ronald K

    2011-01-01

    The objective of this study was to examine the relative efficacy of two different methodologies for auditing self-funded medical claim expenses: 100-percent-of-claims auditing versus random-sampling auditing. Multiple data sets of claim errors or 'exceptions' from two Fortune-100 corporations were analysed and compared to 100 simulated audits of 300- and 400-claim random samples. Random-sample simulations failed to identify a significant number and amount of the errors that ranged from $200,000 to $750,000. These results suggest that health plan expenses of corporations could be significantly reduced if they audited 100% of claims and embraced a zero-defect approach.

  16. Is the Role of Teacher Performance Appraisal in Ethiopia Rhetoric or ...

    African Journals Online (AJOL)

    The study was conducted on eight randomly selected full cycle primary schools selected using urban and rural stratification. Teachers, school administrative committee, students, and parents were participants of the study. Proportionate stratified random sampling technique was employed to select teachers. On the other ...

  17. Effectiveness of three different oral hygiene techniques on Viridans streptococci: A randomized controlled trial

    Directory of Open Access Journals (Sweden)

    N Naveen

    2016-01-01

    Introduction: Tongue cleaning is an important aspect of oral hygiene maintenance, along with other mechanical and chemical aids. These methods have an influence on the microorganism count in saliva. Aim: To assess the effectiveness of three different oral hygiene techniques on Viridans streptococci. Materials and Methods: This was a randomized controlled trial with 45 study subjects aged between 14 and 16 years who were randomly allocated into three groups: Group A - plastic tongue scraper, Group B - chlorhexidine mouthwash along with plastic tongue scraper, and Group C - chlorhexidine mouthwash. Unstimulated salivary samples were collected on the 1st, 7th, and 15th day before routine oral hygiene practices. Saliva samples were incubated for 48 h on Mitis Salivarius (MS) agar, and Streptococcus mitis, Streptococcus mutans, and Streptococcus salivarius were counted. Data were analyzed using descriptive and inferential statistics. Results: The change in the mean counts of S. mitis, S. mutans, and S. salivarius for Groups A, B, and C was found to be significant (P < 0.001) when compared between the 1st, 7th, and 15th day. Between-group comparisons revealed a significant difference between Groups A and C, and between Groups B and C (P < 0.001). Conclusion: There was a significant reduction in bacterial count in all the participants, indicating that all three methods are useful in improving oral hygiene. The combination technique was found to be most effective.

  18. Revisiting random walk based sampling in networks: evasion of burn-in period and frequent regenerations.

    Science.gov (United States)

    Avrachenkov, Konstantin; Borkar, Vivek S; Kadavankandy, Arun; Sreedharan, Jithin K

    2018-01-01

    In the framework of network sampling, random walk (RW) based estimation techniques provide many pragmatic solutions while uncovering the unknown network as little as possible. Despite several theoretical advances in this area, RW based sampling techniques usually make a strong assumption that the samples are in stationary regime, and hence are impelled to leave out the samples collected during the burn-in period. This work proposes two sampling schemes without burn-in time constraint to estimate the average of an arbitrary function defined on the network nodes, for example, the average age of users in a social network. The central idea of the algorithms lies in exploiting regeneration of RWs at revisits to an aggregated super-node or to a set of nodes, and in strategies to enhance the frequency of such regenerations either by contracting the graph or by making the hitting set larger. Our first algorithm, which is based on reinforcement learning (RL), uses stochastic approximation to derive an estimator. This method can be seen as intermediate between purely stochastic Markov chain Monte Carlo iterations and deterministic relative value iterations. The second algorithm, which we call the Ratio with Tours (RT)-estimator, is a modified form of respondent-driven sampling (RDS) that accommodates the idea of regeneration. We study the methods via simulations on real networks. We observe that the trajectories of RL-estimator are much more stable than those of standard random walk based estimation procedures, and its error performance is comparable to that of respondent-driven sampling (RDS) which has a smaller asymptotic variance than many other estimators. Simulation studies also show that the mean squared error of RT-estimator decays much faster than that of RDS with time. The newly developed RW based estimators (RL- and RT-estimators) allow to avoid burn-in period, provide better control of stability along the sample path, and overall reduce the estimation time. Our
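
    The regeneration idea is easy to demonstrate with the simplest member of this family: treat every return of the walk to a fixed anchor node as a tour boundary, and correct the walk's degree bias with the usual 1/degree reweighting. The sketch below is a simplified tour-based ratio estimator in the spirit of the RT-estimator, not the paper's RL- or RT-estimator itself; the graph and node function are toys.

    ```python
    import random

    def rw_tour_average(adj, anchor, n_tours, g, seed=0):
        """Estimate the average of g over nodes with a random walk, using
        returns to `anchor` as regeneration points (tour boundaries) and
        1/degree reweighting to undo the walk's degree bias."""
        rng = random.Random(seed)
        num = den = 0.0
        v, tours = anchor, 0
        while tours < n_tours:
            num += g(v) / len(adj[v])
            den += 1.0 / len(adj[v])
            v = rng.choice(adj[v])
            if v == anchor:
                tours += 1  # a regeneration: the walk starts afresh
        return num / den

    # Toy graph: a ring of 100 nodes with a few chords; estimate the mean label.
    adj = {i: [(i - 1) % 100, (i + 1) % 100] for i in range(100)}
    for a, b in [(0, 50), (10, 60), (25, 75)]:
        adj[a].append(b); adj[b].append(a)
    print(rw_tour_average(adj, anchor=0, n_tours=2000, g=lambda v: v))  # ~49.5
    ```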

  19. Differences in sampling techniques on total post-mortem tryptase.

    Science.gov (United States)

    Tse, R; Garland, J; Kesha, K; Elstub, H; Cala, A D; Ahn, Y; Stables, S; Palmiere, C

    2017-11-20

    The measurement of mast cell tryptase is commonly used to support the diagnosis of anaphylaxis. In the post-mortem setting, the literature recommends sampling from peripheral blood sources (femoral blood) but does not specify the exact sampling technique. Sampling techniques vary between pathologists, and it is unclear whether different sampling techniques have any impact on post-mortem tryptase levels. The aim of this study is to compare the difference in femoral total post-mortem tryptase levels between two sampling techniques. A 6-month retrospective study compared femoral total post-mortem tryptase levels between (1) aspirating femoral vessels with a needle and syringe prior to evisceration and (2) femoral vein cut-down during evisceration. Twenty cases were identified, with three cases excluded from analysis. There was a statistically significant difference (paired t-test) in femoral total post-mortem tryptase levels between the two sampling methods. The clinical significance of this finding, and what factors may contribute to it, are unclear. When requesting post-mortem tryptase, the pathologist should consider documenting the exact blood collection site and the method used for collection. In addition, blood samples acquired by different techniques should not be mixed together and should be analyzed separately if possible.

  20. Microextraction sample preparation techniques in biomedical analysis.

    Science.gov (United States)

    Szultka, Malgorzata; Pomastowski, Pawel; Railean-Plugaru, Viorica; Buszewski, Boguslaw

    2014-11-01

    Biologically active compounds are found in biological samples at relatively low concentration levels. The preparation of target compounds from biological, pharmaceutical, environmental, and food matrices is one of the most time-consuming steps in the analytical procedure, and microextraction techniques are dominant here. Metabolomic studies also require the application of a proper analytical technique for the determination of endogenous metabolites present in the biological matrix at trace concentration levels. Owing to the reproducibility of data, precision, the relatively low cost of the appropriate analysis, the simplicity of the determination, and the possibility of direct combination of these techniques with other methods (both on-line and off-line), they have become the most widespread in routine determinations. Additionally, sample pretreatment procedures have to be more selective, cheap, quick, and environmentally friendly. This review summarizes the current achievements and applications of microextraction techniques. The main aim is to deal with the utilization of different types of sorbents for microextraction and to emphasize the use of newly synthesized sorbents, as well as to bring together studies concerning a systematic approach to method development. This review is dedicated to the description of microextraction techniques and their application in biomedical analysis. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. Study of the therapeutic effects of a hippotherapy simulator in children with cerebral palsy: a stratified single-blind randomized controlled trial.

    Science.gov (United States)

    Herrero, Pablo; Gómez-Trullén, Eva M; Asensio, Angel; García, Elena; Casas, Roberto; Monserrat, Esther; Pandyan, Anand

    2012-12-01

    To investigate whether hippotherapy (when applied by a simulator) improves postural control and balance in children with cerebral palsy. Stratified single-blind randomized controlled trial with an independent assessor. Stratification was made by Gross Motor Function Classification System levels, and allocation was concealed. Children between 4 and 18 years old with cerebral palsy. Participants were randomized to an intervention (simulator ON) or control (simulator OFF) group after informed consent had been obtained. Treatment was provided once a week (15 minutes) for 10 weeks. The Gross Motor Function Measure (dimension B for balance and the total score) and the Sitting Assessment Scale were carried out at baseline (prior to randomization), at the end of the intervention and 12 weeks after completing the intervention. Thirty-eight children participated. The groups were balanced at baseline. Sitting balance (measured by dimension B of the Gross Motor Function Measure) improved significantly in the treatment group (effect size = 0.36; 95% CI 0.01-0.71), and the effect size was greater in the severely disabled group (effect size = 0.80; 95% CI 0.13-1.47). The improvements in sitting balance were not maintained over the follow-up period. Changes in the total score of the Gross Motor Function Measure and the Sitting Assessment Scale were not significant. Hippotherapy with a simulator can improve sitting balance in children with cerebral palsy who have higher levels of disability. However, this did not lead to a change in the overall function of these children (Gross Motor Function Classification System level V).

  2. Information content of household-stratified epidemics

    Directory of Open Access Journals (Sweden)

    T.M. Kinyanjui

    2016-09-01

    Full Text Available Household structure is a key driver of many infectious diseases, as well as a natural target for interventions such as vaccination programs. Many theoretical and conceptual advances on household-stratified epidemic models are relatively recent, but have successfully managed to increase the applicability of such models to practical problems. To be of maximum realism and hence benefit, they require parameterisation from epidemiological data, and while household-stratified final size data has been the traditional source, increasingly time-series infection data from households are becoming available. This paper is concerned with the design of studies aimed at collecting time-series epidemic data in order to maximize the amount of information available to calibrate household models. A design decision involves a trade-off between the number of households to enrol and the sampling frequency. Two commonly used epidemiological study designs are considered: cross-sectional, where different households are sampled at every time point, and cohort, where the same households are followed over the course of the study period. The search for an optimal design uses Bayesian computationally intensive methods to explore the joint parameter-design space combined with the Shannon entropy of the posteriors to estimate the amount of information in each design. For the cross-sectional design, the amount of information increases with the sampling intensity, i.e., the designs with the highest number of time points have the most information. On the other hand, the cohort design often exhibits a trade-off between the number of households sampled and the intensity of follow-up. Our results broadly support the choices made in existing epidemiological data collection studies. Prospective problem-specific use of our computational methods can bring significant benefits in guiding future study designs.

  3. Information content of household-stratified epidemics.

    Science.gov (United States)

    Kinyanjui, T M; Pellis, L; House, T

    2016-09-01

    Household structure is a key driver of many infectious diseases, as well as a natural target for interventions such as vaccination programs. Many theoretical and conceptual advances on household-stratified epidemic models are relatively recent, but have successfully managed to increase the applicability of such models to practical problems. To be of maximum realism and hence benefit, they require parameterisation from epidemiological data, and while household-stratified final size data has been the traditional source, increasingly time-series infection data from households are becoming available. This paper is concerned with the design of studies aimed at collecting time-series epidemic data in order to maximize the amount of information available to calibrate household models. A design decision involves a trade-off between the number of households to enrol and the sampling frequency. Two commonly used epidemiological study designs are considered: cross-sectional, where different households are sampled at every time point, and cohort, where the same households are followed over the course of the study period. The search for an optimal design uses Bayesian computationally intensive methods to explore the joint parameter-design space combined with the Shannon entropy of the posteriors to estimate the amount of information in each design. For the cross-sectional design, the amount of information increases with the sampling intensity, i.e., the designs with the highest number of time points have the most information. On the other hand, the cohort design often exhibits a trade-off between the number of households sampled and the intensity of follow-up. Our results broadly support the choices made in existing epidemiological data collection studies. Prospective problem-specific use of our computational methods can bring significant benefits in guiding future study designs. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.
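
    Both versions of this record score candidate designs by the Shannon entropy of the resulting posterior: the lower the entropy, the more informative the design. A rough sketch of that scoring step (the histogram-based entropy estimate is our assumption, not necessarily the authors' exact computation):

```python
import numpy as np

def posterior_entropy(samples, bins=30):
    """Differential Shannon entropy (nats) of a 1-D posterior sample,
    estimated by histogram binning: discrete entropy + log(bin width).
    Lower entropy = more concentrated posterior = more informative design."""
    counts, edges = np.histogram(samples, bins=bins)
    p = counts / counts.sum()
    p = p[p > 0]                      # 0 * log 0 = 0 by convention
    return -(p * np.log(p)).sum() + np.log(edges[1] - edges[0])

# Compare two hypothetical designs via the posteriors they produce
# for a transmission parameter (synthetic stand-ins).
rng = np.random.default_rng(0)
post_tight = rng.normal(0.3, 0.05, 5000)   # intensive follow-up
post_loose = rng.normal(0.3, 0.15, 5000)   # sparse sampling
print(posterior_entropy(post_tight) < posterior_entropy(post_loose))  # True
```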

  4. A cost-saving statistically based screening technique for focused sampling of a lead-contaminated site

    International Nuclear Information System (INIS)

    Moscati, A.F. Jr.; Hediger, E.M.; Rupp, M.J.

    1986-01-01

    High concentrations of lead in soils along an abandoned railroad line prompted a remedial investigation to characterize the extent of contamination across a 7-acre site. Contamination was thought to be spotty across the site reflecting its past use in battery recycling operations at discrete locations. A screening technique was employed to delineate the more highly contaminated areas by testing a statistically determined minimum number of random samples from each of seven discrete site areas. The approach not only quickly identified those site areas which would require more extensive grid sampling, but also provided a statistically defensible basis for excluding other site areas from further consideration, thus saving the cost of additional sample collection and analysis. The reduction in the number of samples collected in ''clean'' areas of the site ranged from 45 to 60%
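
    The record does not state the formula behind its "statistically determined minimum number of random samples", but a common choice for this kind of screening is the discovery-sampling size: the smallest n such that, if at least a fraction p of an area is contaminated, a random sample hits contamination with a chosen confidence. A sketch under that assumption:

```python
import math

def screening_sample_size(p_contaminated, confidence=0.95):
    """Minimum number of random samples so that the probability of
    hitting at least one contaminated spot is >= confidence when a
    fraction p_contaminated of the area is contaminated.
    P(all n samples miss) = (1 - p)^n <= 1 - confidence."""
    return math.ceil(math.log(1 - confidence) /
                     math.log(1 - p_contaminated))

# E.g., to detect 10% spotty contamination with 95% confidence:
print(screening_sample_size(0.10))  # -> 29 samples per site area
```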

  5. New materials for sample preparation techniques in bioanalysis.

    Science.gov (United States)

    Nazario, Carlos Eduardo Domingues; Fumes, Bruno Henrique; da Silva, Meire Ribeiro; Lanças, Fernando Mauro

    2017-02-01

    The analysis of biological samples is a complex and difficult task owing to two basic and complementary issues: the high complexity of most biological matrices and the need to determine minute quantities of active substances and contaminants in such complex samples. To succeed in this endeavor, samples are usually subjected to three steps of a comprehensive analytical methodological approach: sample preparation, analyte isolation (usually utilizing a chromatographic technique) and qualitative/quantitative analysis (usually with the aid of mass spectrometric tools). Owing to the complex nature of bio-samples and the very low concentration of the target analytes to be determined, selective sample preparation techniques are mandatory in order to overcome the difficulties imposed by these two constraints. During the last decade, new chemical synthesis approaches have been developed and optimized, such as sol-gel and molecular imprinting technologies, allowing the preparation of novel materials for sample preparation including graphene and its derivatives, magnetic materials, ionic liquids, molecularly imprinted polymers, and more. In this contribution we review these novel techniques and materials, as well as their application to the bioanalysis niche. Copyright © 2016 Elsevier B.V. All rights reserved.

  6. Vegetation structure and composition across different land use in a semi-arid savanna of southern Zimbabwe

    NARCIS (Netherlands)

    Zisadza-Gandiwa, P.; Mango, L.; Gandiwa, E.; Goza, D.; Parakasingwa, C.; Chinoitezvi, E.; Shimbani, J.; Muvengwi, J.

    2013-01-01

    We compared the structure and composition of vegetation communities across different land uses in the northern Gonarezhou National Park and adjacent areas, southeast Zimbabwe. Vegetation data were collected from 60 sample plots using a stratified random sampling technique from April to May 2012.

  7. Comparison of sampling strategies for tobacco retailer inspections to maximize coverage in vulnerable areas and minimize cost.

    Science.gov (United States)

    Lee, Joseph G L; Shook-Sa, Bonnie E; Bowling, J Michael; Ribisl, Kurt M

    2017-06-23

    In the United States, tens of thousands of inspections of tobacco retailers are conducted each year. Various sampling choices can reduce travel costs, emphasize enforcement in areas with greater non-compliance, and allow for comparability between states and over time. We sought to develop a model sampling strategy for state tobacco retailer inspections. Using a 2014 list of 10,161 North Carolina tobacco retailers, we compared results from simple random sampling; stratified sampling clustered at the ZIP-code level; and stratified sampling clustered at the census-tract level. We conducted a simulation of repeated sampling and compared the approaches on their level of precision, coverage, and retailer dispersion. While maintaining an adequate design effect and statistical precision appropriate for a public health enforcement program, both the ZIP- and tract-based stratified, clustered approaches were feasible. Both strategies yielded improvements over simple random sampling, with relative improvements, respectively, in average distance between retailers (reduced 5.0% and 1.9%), percent Black residents in sampled neighborhoods (increased 17.2% and 32.6%), percent Hispanic residents in sampled neighborhoods (reduced 2.2% and increased 18.3%), percentage of sampled retailers located near schools (increased 61.3% and 37.5%), and poverty rate in sampled neighborhoods (increased 14.0% and 38.2%). States can make retailer inspections more efficient and targeted with stratified, clustered sampling. Statistically appropriate sampling strategies like these should be considered by states, researchers, and the Food and Drug Administration (FDA) to improve program impact and allow for comparisons over time and across states. The authors present a model tobacco retailer sampling strategy for promoting compliance and reducing costs that could be used by U.S. states and the FDA. The design is feasible to implement in North Carolina.
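
    A rough sketch of the two-stage design described above (illustrative only; the 'stratum'/'zip' record schema and the stage sizes are assumptions, not the paper's implementation):

```python
import random
from collections import defaultdict

def stratified_cluster_sample(retailers, clusters_per_stratum=5,
                              retailers_per_cluster=4, seed=1):
    """Stage 1: within each stratum, draw ZIP-code clusters at random.
    Stage 2: draw retailers within each selected cluster. `retailers`
    is a list of dicts with 'stratum' and 'zip' keys."""
    rng = random.Random(seed)
    strata = defaultdict(lambda: defaultdict(list))
    for r in retailers:
        strata[r['stratum']][r['zip']].append(r)
    sample = []
    for clusters in strata.values():
        chosen_zips = rng.sample(sorted(clusters),
                                 min(clusters_per_stratum, len(clusters)))
        for z in chosen_zips:
            units = clusters[z]
            sample.extend(rng.sample(units,
                                     min(retailers_per_cluster, len(units))))
    return sample

# Toy usage with a handful of fake retailers.
toy = [{'stratum': s, 'zip': z, 'id': i}
       for i, (s, z) in enumerate([('west', '27501'), ('west', '27501'),
                                   ('west', '27502'), ('east', '27601'),
                                   ('east', '27602'), ('east', '27602')])]
print(len(stratified_cluster_sample(toy)))
```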

  8. Estimates of Inequality Indices Based on Simple Random, Ranked Set, and Systematic Sampling

    OpenAIRE

    Bansal, Pooja; Arora, Sangeeta; Mahajan, Kalpana K.

    2013-01-01

    The Gini index, Bonferroni index, and Absolute Lorenz index are some popular indices of inequality, showing different features of inequality measurement. In general, a simple random sampling procedure is commonly used to estimate the inequality indices and for their related inference. The key condition that the samples must be drawn via a simple random sampling procedure makes calculations much simpler, but this assumption is often violated in practice, as the data does not always yield simple random ...
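
    For reference, the plug-in Gini estimator typically computed from a simple random sample (the standard textbook form, not the ranked-set or systematic variants the paper studies) is short:

```python
import numpy as np

def gini_srs(y):
    """Plug-in Gini index from a simple random sample:
    G = sum_i (2i - n - 1) * y_(i) / (n * sum_i y_i), y sorted ascending."""
    y = np.sort(np.asarray(y, dtype=float))
    n = y.size
    i = np.arange(1, n + 1)
    return ((2 * i - n - 1) * y).sum() / (n * y.sum())

print(gini_srs([1, 1, 1, 1]))             # 0.0, perfect equality
print(round(gini_srs([0, 0, 0, 10]), 2))  # 0.75, highly unequal
```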

  9. Employment status, inflation and suicidal behaviour: an analysis of a stratified sample in Italy.

    Science.gov (United States)

    Solano, Paola; Pizzorno, Enrico; Gallina, Anna M; Mattei, Chiara; Gabrielli, Filippo; Kayman, Joshua

    2012-09-01

    There is abundant empirical evidence of an excess risk of suicide among the unemployed, although few studies have investigated the influence of economic downturns on suicidal behaviours in an employment-status-stratified sample. We investigated how economic inflation affected suicidal behaviours according to employment status in Italy from 2001 to 2008. Data concerning economically active people were provided by the Italian Institute for Statistical Analysis and by the International Monetary Fund. The association between inflation and completed versus attempted suicide with respect to employment status was investigated in every year and quarter-year of the study time frame. We considered three occupational categories: the employed, the unemployed who were previously employed (PE) and the unemployed who had never worked (NE). The unemployed are at higher suicide risk than the employed. Among the PE, a significant association between inflation and suicide attempts was found, whereas no association was reported concerning completed suicides. No association with inflation was found for completed and attempted suicides among the employed and the NE. Completed suicide in females is significantly associated with unemployment in every quarter-year. The reported vulnerability to suicidal behaviours among the PE as inflation rises underlines the need for effective support strategies for both genders in times of economic downturns.

  10. An alternative procedure for estimating the population mean in simple random sampling

    Directory of Open Access Journals (Sweden)

    Housila P. Singh

    2012-03-01

    Full Text Available This paper deals with the problem of estimating the finite population mean using auxiliary information in simple random sampling. First, we suggest a correction to the mean squared error of the estimator proposed by Gupta and Shabbir [On improvement in estimating the population mean in simple random sampling. Jour. Appl. Statist. 35(5) (2008), pp. 559-566]. We then propose a ratio-type estimator and study its properties in simple random sampling. Numerically, we show that the proposed class of estimators is more efficient than different known estimators, including the Gupta and Shabbir (2008) estimator.
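
    For background, the classical ratio estimator of a population mean with a known auxiliary mean (the textbook baseline that estimators in this line of work improve upon, not the proposed class itself) can be sketched as:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical finite population where y is roughly proportional to a
# known auxiliary variable x, the setting where ratio estimation pays off.
N = 10_000
x = rng.gamma(4.0, 2.0, N)
y = 3.0 * x + rng.normal(0.0, 2.0, N)
X_bar = x.mean()                        # population mean of x is known

n = 100
idx = rng.choice(N, n, replace=False)   # simple random sample
y_bar, x_bar = y[idx].mean(), x[idx].mean()

mean_srs = y_bar                        # plain sample-mean estimator
mean_ratio = y_bar * (X_bar / x_bar)    # classical ratio estimator

print(f"true {y.mean():.3f}  SRS {mean_srs:.3f}  ratio {mean_ratio:.3f}")
```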

  11. Correlated random sampling for multivariate normal and log-normal distributions

    International Nuclear Information System (INIS)

    Žerovnik, Gašper; Trkov, Andrej; Kodeli, Ivan A.

    2012-01-01

    A method for correlated random sampling is presented. Representative samples for multivariate normal or log-normal distributions can be produced. Furthermore, any combination of normally and log-normally distributed correlated variables may be sampled to any requested accuracy. Possible applications of the method include the sampling of resonance parameters which are used for reactor calculations.
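
    A minimal sketch of one standard way to do this, assuming a Cholesky factorization of the covariance and exponentiation of the components flagged as log-normal (so the requested correlations are imposed on the underlying normals); the function and its arguments are illustrative, not the report's code:

```python
import numpy as np

def correlated_samples(mean, cov, n, lognormal_mask=None, seed=0):
    """Draw n samples of correlated variables: generate a joint
    multivariate normal via Cholesky, then exponentiate the components
    flagged in lognormal_mask to obtain correlated log-normals."""
    rng = np.random.default_rng(seed)
    L = np.linalg.cholesky(np.asarray(cov, dtype=float))
    z = rng.standard_normal((n, len(mean)))
    samples = np.asarray(mean, dtype=float) + z @ L.T
    if lognormal_mask is not None:
        mask = np.asarray(lognormal_mask)
        samples[:, mask] = np.exp(samples[:, mask])
    return samples

# Two resonance-like parameters, correlation 0.8, second one log-normal.
s = correlated_samples(mean=[1.0, 0.0],
                       cov=[[0.04, 0.016], [0.016, 0.01]],
                       n=100_000, lognormal_mask=[False, True])
print(np.corrcoef(s[:, 0], np.log(s[:, 1]))[0, 1])  # close to 0.8
```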

  12. Prevalence of Childhood Physical Abuse in a Representative Sample of College Students in Samsun, Turkey

    Science.gov (United States)

    Turla, Ahmet; Dundar, Cihad; Ozkanli, Caglar

    2010-01-01

    The main objective of this article is to obtain the prevalence of childhood physical abuse experiences in college students. This cross-sectional study was performed on a gender-stratified random sample of 988 participants studying at Ondokuz Mayis University, with self-reported anonymous questionnaires. It included questions on physical abuse in…

  13. Determination of metals in air samples using X-Ray fluorescence associated the APDC preconcentration technique

    Energy Technology Data Exchange (ETDEWEB)

    Nardes, Raysa C.; Santos, Ramon S.; Sanches, Francis A.C.R.A.; Gama Filho, Hamilton S.; Oliveira, Davi F.; Anjos, Marcelino J., E-mail: rc.nardes@gmail.com, E-mail: ramonziosp@yahoo.com.br, E-mail: francissanches@gmail.com, E-mail: hamiltongamafilho@hotmail.com, E-mail: davi.oliveira@uerj.br, E-mail: marcelin@uerj.br [Universidade do Estado do Rio de Janeiro (UERJ), Rio de Janeiro, RJ (Brazil). Instituto de Fisica. Departamento de Fisica Aplicada e Termodinamica

    2015-07-01

    Air pollution has become one of the leading factors degrading the quality of life of people in large urban centers. Studies indicate that the suspended particulate matter in the atmosphere is directly associated with risks to public health; in addition, it can cause damage to fauna, flora and public and cultural patrimonies. Inhalable particulate materials can cause the emergence and/or worsening of chronic diseases related to the respiratory system and other ailments, such as reduced physical strength. In this study, we propose a new method to measure the concentration of total suspended particulate matter (TSP) in the air using an impinger as an air cleaning apparatus, preconcentration with APDC, and the Total Reflection X-ray Fluorescence (TXRF) technique to analyze the heavy metals present in the air. The samples were collected from five random points in the city of Rio de Janeiro/Brazil. The TXRF analyses were performed at the Brazilian Synchrotron Light Laboratory (LNLS). The technique proved viable because it was able to detect five metallic elements important to environmental studies: Cr, Fe, Ni, Cu and Zn. The technique showed substantial efficiency in determining the elemental concentration of air pollutants, in addition to low cost. It can be concluded that this metals analysis technique for air samples, using an impinger as the sample collection instrument together with a complexing agent (APDC), is viable because it is low-cost and, moreover, made it possible to detect five metallic elements important in environmental studies associated with industrial emissions and urban traffic. (author)

  14. Determination of metals in air samples using X-Ray fluorescence associated the APDC preconcentration technique

    International Nuclear Information System (INIS)

    Nardes, Raysa C.; Santos, Ramon S.; Sanches, Francis A.C.R.A.; Gama Filho, Hamilton S.; Oliveira, Davi F.; Anjos, Marcelino J.

    2015-01-01

    Air pollution has become one of the leading factors degrading the quality of life of people in large urban centers. Studies indicate that the suspended particulate matter in the atmosphere is directly associated with risks to public health; in addition, it can cause damage to fauna, flora and public and cultural patrimonies. Inhalable particulate materials can cause the emergence and/or worsening of chronic diseases related to the respiratory system and other ailments, such as reduced physical strength. In this study, we propose a new method to measure the concentration of total suspended particulate matter (TSP) in the air using an impinger as an air cleaning apparatus, preconcentration with APDC, and the Total Reflection X-ray Fluorescence (TXRF) technique to analyze the heavy metals present in the air. The samples were collected from five random points in the city of Rio de Janeiro/Brazil. The TXRF analyses were performed at the Brazilian Synchrotron Light Laboratory (LNLS). The technique proved viable because it was able to detect five metallic elements important to environmental studies: Cr, Fe, Ni, Cu and Zn. The technique showed substantial efficiency in determining the elemental concentration of air pollutants, in addition to low cost. It can be concluded that this metals analysis technique for air samples, using an impinger as the sample collection instrument together with a complexing agent (APDC), is viable because it is low-cost and, moreover, made it possible to detect five metallic elements important in environmental studies associated with industrial emissions and urban traffic. (author)

  15. A tale of two "forests": random forest machine learning AIDS tropical forest carbon mapping.

    Science.gov (United States)

    Mascaro, Joseph; Asner, Gregory P; Knapp, David E; Kennedy-Bowdoin, Ty; Martin, Roberta E; Anderson, Christopher; Higgins, Mark; Chadwick, K Dana

    2014-01-01

    Accurate and spatially-explicit maps of tropical forest carbon stocks are needed to implement carbon offset mechanisms such as REDD+ (Reduced Deforestation and Degradation Plus). The Random Forest machine learning algorithm may aid carbon mapping applications using remotely-sensed data. However, Random Forest has never been compared to traditional and potentially more reliable techniques such as regionally stratified sampling and upscaling, and it has rarely been employed with spatial data. Here, we evaluated the performance of Random Forest in upscaling airborne LiDAR (Light Detection and Ranging)-based carbon estimates compared to the stratification approach over a 16-million hectare focal area of the Western Amazon. We considered two runs of Random Forest, both with and without spatial contextual modeling, by including, in the latter case, x and y position directly in the model. In each case, we set aside 8 million hectares (i.e., half of the focal area) for validation; this rigorous test of Random Forest went above and beyond the internal validation normally compiled by the algorithm (the so-called "out-of-bag" validation), which proved insufficient for this spatial application. In this heterogeneous region of Northern Peru, the model with spatial context was the best performing run of Random Forest, and explained 59% of LiDAR-based carbon estimates within the validation area, compared to 37% for stratification or 43% for Random Forest without spatial context. With the 60% improvement in explained variation, the RMSE against validation LiDAR samples improved from 33 to 26 Mg C ha(-1) when using Random Forest with spatial context. Our results suggest that spatial context should be considered when using Random Forest, and that doing so may result in substantially improved carbon stock modeling for purposes of climate change mitigation.
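
    A minimal sketch of the "spatial context" idea, i.e., appending map coordinates to the predictor matrix of a Random Forest regression and validating on held-out data (synthetic data; scikit-learn is our choice of library, not necessarily the study's toolchain):

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_squared_error, r2_score

# Synthetic stand-in for LiDAR carbon samples: environmental predictor
# layers plus map coordinates; the target has a spatial trend.
rng = np.random.default_rng(7)
n = 20_000
env = rng.normal(size=(n, 4))                 # e.g. elevation, NDVI, ...
xy = rng.uniform(0, 100, size=(n, 2))         # spatial context
carbon = 30 + 10 * env[:, 0] + 0.2 * xy[:, 0] + rng.normal(0, 5, n)

X = np.hstack([env, xy])                      # "with spatial context"
half = n // 2  # (the study held out a contiguous half-area; a random
               #  row split keeps this sketch short)
rf = RandomForestRegressor(n_estimators=200, n_jobs=-1, random_state=0)
rf.fit(X[:half], carbon[:half])
pred = rf.predict(X[half:])
rmse = mean_squared_error(carbon[half:], pred) ** 0.5
print(f"R2 = {r2_score(carbon[half:], pred):.2f}, RMSE = {rmse:.1f} Mg C/ha")
```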

  16. A tale of two "forests": random forest machine learning AIDS tropical forest carbon mapping.

    Directory of Open Access Journals (Sweden)

    Joseph Mascaro

    Full Text Available Accurate and spatially-explicit maps of tropical forest carbon stocks are needed to implement carbon offset mechanisms such as REDD+ (Reduced Deforestation and Degradation Plus). The Random Forest machine learning algorithm may aid carbon mapping applications using remotely-sensed data. However, Random Forest has never been compared to traditional and potentially more reliable techniques such as regionally stratified sampling and upscaling, and it has rarely been employed with spatial data. Here, we evaluated the performance of Random Forest in upscaling airborne LiDAR (Light Detection and Ranging)-based carbon estimates compared to the stratification approach over a 16-million hectare focal area of the Western Amazon. We considered two runs of Random Forest, both with and without spatial contextual modeling, by including, in the latter case, x and y position directly in the model. In each case, we set aside 8 million hectares (i.e., half of the focal area) for validation; this rigorous test of Random Forest went above and beyond the internal validation normally compiled by the algorithm (the so-called "out-of-bag" validation), which proved insufficient for this spatial application. In this heterogeneous region of Northern Peru, the model with spatial context was the best performing run of Random Forest, and explained 59% of LiDAR-based carbon estimates within the validation area, compared to 37% for stratification or 43% for Random Forest without spatial context. With the 60% improvement in explained variation, the RMSE against validation LiDAR samples improved from 33 to 26 Mg C ha(-1) when using Random Forest with spatial context. Our results suggest that spatial context should be considered when using Random Forest, and that doing so may result in substantially improved carbon stock modeling for purposes of climate change mitigation.

  17. Comparative Effects of Circuit Training Programme on Speed and ...

    African Journals Online (AJOL)

    Stratified random sampling technique was used to select 40 premenarcheal and 40 postmenarcheal girls, who were later randomly assigned to experimental and control groups. At the end of the training programme, 40 subjects completed the post-training measurements, so there were 10 subjects in each of the four study ...

  18. A comparative study of sampling techniques for monitoring carcass contamination

    NARCIS (Netherlands)

    Snijders, J.M.A.; Janssen, M.H.W.; Gerats, G.E.; Corstiaensen, G.P.

    1984-01-01

    Four bacteriological sampling techniques, i.e. the excision, double swab, agar contact and modified agar contact techniques, were compared by sampling pig carcasses before and after chilling. As well as assessing the advantages and disadvantages of the techniques, particular attention was paid to

  19. Multiuser Random Coding Techniques for Mismatched Decoding

    OpenAIRE

    Scarlett, Jonathan; Martinez, Alfonso; Guillén i Fàbregas, Albert

    2016-01-01

    This paper studies multiuser random coding techniques for channel coding with a given (possibly suboptimal) decoding rule. For the mismatched discrete memoryless multiple-access channel, an error exponent is obtained that is tight with respect to the ensemble average, and positive within the interior of Lapidoth's achievable rate region. This exponent proves the ensemble tightness of the exponent of Liu and Hughes in the case of maximum-likelihood decoding. An equivalent dual form of Lapidoth...

  20. Sample preparation for special PIE-techniques at ITU

    International Nuclear Information System (INIS)

    Toscano, E.H.; Manzel, R.

    2002-01-01

    Several sample preparation techniques were developed and installed in hot cells. The techniques were conceived to evaluate the performance of highly burnt fuel rods and include: (a) a device for the removal of the fuel, (b) a method for the preparation of the specimen ends for the welding of new end caps and for the careful cleaning of samples for Transmission Electron Microscopy and Glow Discharge Mass Spectroscopy, (c) a sample pressurisation device for long term creep tests, and (d) a diameter measuring device for creep or burst samples. Examples of the determination of the mechanical properties, the behaviour under transient conditions and for the assessment of the corrosion behaviour of high burnup cladding materials are presented. (author)

  1. New trends in sample preparation techniques for environmental analysis.

    Science.gov (United States)

    Ribeiro, Cláudia; Ribeiro, Ana Rita; Maia, Alexandra S; Gonçalves, Virgínia M F; Tiritan, Maria Elizabeth

    2014-01-01

    Environmental samples include a wide variety of complex matrices, with low concentrations of analytes and the presence of several interferences. Sample preparation is a critical step and the main source of uncertainties in the analysis of environmental samples, and it is usually laborious, costly, time consuming, and polluting. In this context, there is increasing interest in developing faster, cost-effective, and environmentally friendly sample preparation techniques. Recently, new methods have been developed and optimized in order to miniaturize extraction steps, to reduce solvent consumption or become solventless, and to automate systems. This review attempts to present an overview of the fundamentals, procedures, and applications of the most recently developed sample preparation techniques for the extraction, cleanup, and concentration of organic pollutants from environmental samples. These techniques include: solid phase microextraction, on-line solid phase extraction, microextraction by packed sorbent, dispersive liquid-liquid microextraction, and QuEChERS (Quick, Easy, Cheap, Effective, Rugged and Safe).

  2. Prevalence and factors affecting work-related injury among workers ...

    African Journals Online (AJOL)

    EPI INFO version 6.04 statistical software was used to calculate sample size. ... Stratified random sampling technique was applied to get the desired ... The quality of data was ensured through the training of data collectors and ... The bivariate analysis indicated that more ... Multivariate logistic regression analysis: stepwise.

  3. Family Cohesion and Level of Communication Between Parents and ...

    African Journals Online (AJOL)

    This study investigated the level of communication between parents and their adolescent children and how such communication affects family cohesion. A sample of 200 subjects made up of adolescents and parents were selected through cluster, stratified and random sampling techniques from ten Local Government Areas ...

  4. Random sampling of quantum states: a survey of methods and some issues regarding the Overparametrized Method

    International Nuclear Information System (INIS)

    Maziero, Jonas

    2015-01-01

    The numerical generation of random quantum states (RQS) is an important procedure for investigations in quantum information science. Here, we review some methods that may be used for performing that task. We start by presenting a simple procedure for generating random state vectors, for which the main tool is the random sampling of unbiased discrete probability distributions (DPD). Afterwards, the creation of random density matrices is addressed. In this context, we first present the standard method, which consists in using the spectral decomposition of a quantum state for getting RQS from random DPDs and random unitary matrices. Next, the Bloch vector parametrization method is described. This approach, despite being useful in several instances, is not in general convenient for RQS generation. In the last part of the article, we consider the overparametrized method (OPM) and the related Ginibre and Bures techniques. The OPM can be used to create random positive semidefinite matrices with unit trace from randomly produced general complex matrices, in a simple way that is friendly for numerical implementations. We consider a physically relevant issue related to the possible domains that may be used for the real and imaginary parts of the elements of such general complex matrices. Subsequently, an overly fast concentration of measure in the quantum state space that appears in this parametrization is noted. (author)
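
    The Ginibre construction mentioned in the record is compact enough to sketch directly (a minimal NumPy version; the function name is ours):

```python
import numpy as np

def random_density_matrix(d, seed=None):
    """Random density matrix via the Ginibre ensemble:
    rho = G G^dagger / Tr(G G^dagger), with G a complex d x d matrix of
    i.i.d. standard normal entries. The result is Hermitian, positive
    semidefinite and has unit trace by construction."""
    rng = np.random.default_rng(seed)
    G = rng.standard_normal((d, d)) + 1j * rng.standard_normal((d, d))
    M = G @ G.conj().T
    return M / np.trace(M).real

rho = random_density_matrix(4, seed=1)
print(np.allclose(rho, rho.conj().T))               # Hermitian
print(np.isclose(np.trace(rho).real, 1.0))          # unit trace
print(np.all(np.linalg.eigvalsh(rho) >= -1e-12))    # positive semidefinite
```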

  5. Mode of Supervision and Teacher Productivity | Akinwumi | Nigerian ...

    African Journals Online (AJOL)

    This paper investigated the impact of principal supervisory techniques on teacher productivity in Oyo State Secondary Schools. An ex-post -facto research design was adopted for the study. The stratified random sampling techniques were used to select 85 schools from among 318 public secondary schools and 15 private ...

  6. Random sampling of evolution time space and Fourier transform processing

    International Nuclear Information System (INIS)

    Kazimierczuk, Krzysztof; Zawadzka, Anna; Kozminski, Wiktor; Zhukov, Igor

    2006-01-01

    An application of the Fourier transform for processing 3D NMR spectra with random sampling of the evolution time space is presented. The 2D FT is calculated for pairs of frequencies, instead of the conventional sequence of one-dimensional transforms. Signal-to-noise ratios and linewidths for different random distributions were investigated by simulations and experiments. The experimental examples include 3D HNCA, HNCACB and 15N-edited NOESY-HSQC spectra of a 13C,15N-labeled ubiquitin sample. The obtained results revealed the general applicability of the proposed method and a significant improvement in resolution in comparison with conventional spectra recorded in the same time

  7. Non-terminal blood sampling techniques in Guinea pigs

    DEFF Research Database (Denmark)

    Birck, Malene Muusfeldt; Tveden-Nyborg, Pernille; Lindblad, Maiken Marie

    2014-01-01

    Guinea pigs possess several biological similarities to humans and are validated experimental animal models (1-3). However, the use of guinea pigs currently represents a relatively narrow area of research, and descriptive data on specific methodology are correspondingly scarce. The anatomical features of guinea pigs are slightly different from those of other rodent models, hence modulation of sampling techniques to accommodate species-specific differences, e.g., compared to mice and rats, is necessary to obtain sufficient and high quality samples. As both long and short term in vivo studies often require repeated blood sampling, the choice of technique should be well considered in order to reduce stress and discomfort in the animals, but also to ensure survival as well as compliance with requirements of sample size and accessibility. Venous blood samples can be obtained at a number of sites in guinea pigs, e.g., ...

  8. A random sampling procedure for anisotropic distributions

    International Nuclear Information System (INIS)

    Nagrajan, P.S.; Sethulakshmi, P.; Raghavendran, C.P.; Bhatia, D.P.

    1975-01-01

    A procedure is described for sampling the scattering angle of neutrons as per specified angular distribution data. The cosine of the scattering angle is written as a double Legendre expansion in the incident neutron energy and a random number. The coefficients of the expansion are given for C, N, O, Si, Ca, Fe and Pb and these elements are of interest in dosimetry and shielding. (author)

  9. Diagnostic accuracy of the STRATIFY clinical prediction rule for falls: A systematic review and meta-analysis

    LENUS (Irish Health Repository)

    Billington, Jennifer

    2012-08-07

    Background: The STRATIFY score is a clinical prediction rule (CPR) derived to assist clinicians to identify patients at risk of falling. The purpose of this systematic review and meta-analysis is to determine the overall diagnostic accuracy of the STRATIFY rule across a variety of clinical settings. Methods: A literature search was performed to identify all studies that validated the STRATIFY rule. The methodological quality of the studies was assessed using the Quality Assessment of Diagnostic Accuracy Studies tool. A STRATIFY score of ≥2 points was used to identify individuals at higher risk of falling. All included studies were combined using a bivariate random effects model to generate pooled sensitivity and specificity of STRATIFY at ≥2 points. Heterogeneity was assessed using the variance of logit-transformed sensitivity and specificity. Results: Seventeen studies were included in our meta-analysis, incorporating 11,378 patients. At a score ≥2 points, the STRATIFY rule is more useful at ruling out falls in those classified as low risk, with a greater pooled sensitivity estimate (0.67, 95% CI 0.52–0.80) than specificity (0.57, 95% CI 0.45–0.69). The sensitivity analysis, which examined the performance of the rule in different settings and subgroups, also showed broadly comparable results, indicating that the STRATIFY rule performs in a similar manner across a variety of different 'at risk' patient groups in different clinical settings. Conclusion: This systematic review shows that the diagnostic accuracy of the STRATIFY rule is limited and it should not be used in isolation for identifying individuals at high risk of falls in clinical practice.

  10. The assessment of female students' perceptions, practices and ...

    African Journals Online (AJOL)

    The main purpose of this study was to assess perceptions, practices and challenges of ... Stratified sampling followed by simple random sampling (lottery method) technique ... The gathered information was analyzed using both quantitative and ...

  11. Gray bootstrap method for estimating frequency-varying random vibration signals with small samples

    Directory of Open Access Journals (Sweden)

    Wang Yanqing

    2014-04-01

    Full Text Available During environmental testing, the estimation of random vibration signals (RVS) is an important technique for airborne platform safety and reliability. However, the available methods, including the extreme value envelope method (EVEM), the statistical tolerances method (STM) and the improved statistical tolerance method (ISTM), require large samples and a typical probability distribution. Moreover, the frequency-varying characteristic of RVS is usually not taken into account. The gray bootstrap method (GBM) is proposed to solve the problem of estimating frequency-varying RVS with small samples. First, the estimated indexes are obtained, including the estimated interval, the estimated uncertainty, the estimated value, the estimated error and the estimated reliability. In addition, GBM is applied to the estimation of a single flight test of a certain aircraft. Finally, in order to evaluate the estimation performance, GBM is compared with the bootstrap method (BM) and the gray method (GM) in a testing analysis. The result shows that GBM is superior for estimating dynamic signals with small samples, and the estimated reliability is shown to be 100% at the given confidence level.
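
    For orientation, the plain percentile bootstrap, the baseline BM that GBM is compared against, can be sketched as follows (the grey-model forecasting step that distinguishes GBM is deliberately not reproduced here):

```python
import numpy as np

def bootstrap_interval(sample, stat=np.mean, n_boot=5000, alpha=0.05, seed=0):
    """Percentile bootstrap estimate and interval for a statistic of a
    small sample: resample with replacement, recompute the statistic,
    and take empirical quantiles."""
    rng = np.random.default_rng(seed)
    sample = np.asarray(sample, dtype=float)
    stats = np.array([stat(rng.choice(sample, sample.size, replace=True))
                      for _ in range(n_boot)])
    lo, hi = np.quantile(stats, [alpha / 2, 1 - alpha / 2])
    return stat(sample), (lo, hi)

# Small sample of vibration amplitudes (illustrative numbers).
amplitudes = [1.8, 2.1, 2.4, 1.9, 2.6, 2.2, 2.0]
print(bootstrap_interval(amplitudes))
```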

  12. Statistical sampling techniques as applied to OSE inspections

    International Nuclear Information System (INIS)

    Davis, J.J.; Cote, R.W.

    1987-01-01

    The need has been recognized for statistically valid methods for gathering information during OSE inspections and for the interpretation of results, both from performance testing and from records reviews, interviews, etc. Battelle Columbus Division, under contract to DOE OSE, has performed and is continuing to perform work in the area of statistical methodology for OSE inspections. This paper presents some of the sampling methodology currently being developed for use during OSE inspections. Topics include population definition, sample size requirements, level of confidence and practical logistical constraints associated with the conduct of an inspection based on random sampling. Sequential sampling schemes and sampling from finite populations are also discussed. The methods described are applicable to various data gathering activities, ranging from the sampling and examination of classified documents to the sampling of Protective Force security inspectors for skill testing

  13. A study on the representative sampling survey for the inspection of the clearance level for the radioisotope waste

    International Nuclear Information System (INIS)

    Hong Joo Ahn; Se Chul Sohn; Kwang Yong Jee; Ju Youl Kim; In Koo Lee

    2007-01-01

    Utilization facilities for radioisotopes (RI) are increasing annually in South Korea, and their total number was 2,723 as of December 31, 2005. The inspection of a clearance level is a very important problem in order to ensure public confidence when releasing radioactive materials to the environment. Korean regulations for such a clearance are described in Notice No. 2001-30 of the Ministry of Science and Technology (MOST) and Notice No. 2002-67 of the Ministry of Commerce, Industry and Energy (MOCIE). Most unsealed sources in RI waste drums at a storage facility are low-level beta-emitters with short half-lives, so it is impossible to measure their inventories by nondestructive analysis. Furthermore, RI wastes generated by hospitals, educational and research institutes and industry form a heterogeneous, varied, irregular, and small-quantity waste stream. This study addresses a representative (master) sampling survey and analysis plan for RI wastes, because a complete enumeration of waste drums is impossible and not desirable in terms of cost and efficiency. Existing approaches to representative sampling include judgmental, simple random, stratified random, systematic grid, systematic random, composite, and adaptive sampling. A representative sampling plan may combine two or more of the above sampling approaches depending on the type and distribution of a waste stream. Stratified random sampling (constrained randomization) is proven to be adequate for a sampling design for RI waste with regard to half-life, surface dose, time of transfer to a storage facility, and type of waste. The developed sampling protocol includes estimating the number of drums within a waste stream, estimating the number of samples, and a confirmation of the required number of samples. The statistical process control for the quality assurance plan includes control charts and an upper control limit (UCL) of 95% to determine whether a clearance level is met or not. (authors)
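
    The final acceptance step in such a plan can be sketched as a one-sided 95% upper confidence limit (UCL) test on the sampled activity concentrations (a standard t-based UCL; the numbers below are illustrative, not regulatory values):

```python
import numpy as np
from scipy import stats

def ucl95(measurements):
    """One-sided 95% UCL on the mean activity concentration:
    UCL = xbar + t(0.95, n-1) * s / sqrt(n)."""
    x = np.asarray(measurements, dtype=float)
    n = x.size
    return x.mean() + stats.t.ppf(0.95, n - 1) * x.std(ddof=1) / np.sqrt(n)

activity = [0.08, 0.12, 0.10, 0.09, 0.11, 0.07]   # Bq/g, illustrative
clearance_level = 0.1                             # Bq/g, illustrative
# Clearance is granted only if the UCL stays below the clearance level.
print(ucl95(activity), ucl95(activity) <= clearance_level)
```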

  14. Random sampling of elementary flux modes in large-scale metabolic networks.

    Science.gov (United States)

    Machado, Daniel; Soons, Zita; Patil, Kiran Raosaheb; Ferreira, Eugénio C; Rocha, Isabel

    2012-09-15

    The description of a metabolic network in terms of elementary (flux) modes (EMs) provides an important framework for metabolic pathway analysis. However, their application to large networks has been hampered by the combinatorial explosion in the number of modes. In this work, we develop a method for generating random samples of EMs without computing the whole set. Our algorithm is an adaptation of the canonical basis approach, where we add an additional filtering step which, at each iteration, selects a random subset of the new combinations of modes. In order to obtain an unbiased sample, all candidates are assigned the same probability of getting selected. This approach avoids the exponential growth of the number of modes during computation, thus generating a random sample of the complete set of EMs within reasonable time. We generated samples of different sizes for a metabolic network of Escherichia coli, and observed that they preserve several properties of the full EM set. It is also shown that EM sampling can be used for rational strain design. A well distributed sample, that is representative of the complete set of EMs, should be suitable to most EM-based methods for analysis and optimization of metabolic networks. Source code for a cross-platform implementation in Python is freely available at http://code.google.com/p/emsampler. dmachado@deb.uminho.pt Supplementary data are available at Bioinformatics online.

  15. Application of digital sampling techniques to particle identification

    International Nuclear Information System (INIS)

    Bardelli, L.; Poggi, G.; Bini, M.; Carraresi, L.; Pasquali, G.; Taccetti, N.

    2003-01-01

    An application of digital sampling techniques is presented which can greatly simplify experiments involving sub-nanosecond time-mark determinations and energy measurements with nuclear detectors, used for Pulse Shape Analysis and Time of Flight measurements in heavy-ion experiments. In this work a 100 MSample/s, 12-bit analog-to-digital converter has been used: examples of this technique applied to Silicon and CsI(Tl) detectors in heavy-ion experiments involving particle identification via Pulse Shape Analysis and Time of Flight measurements are presented. The system is suited for applications to large detector arrays and to different kinds of detectors. Some preliminary results regarding the simulation of current signals in Silicon detectors are also discussed. (authors)

  16. Is a 'convenience' sample useful for estimating immunization coverage in a small population?

    Science.gov (United States)

    Weir, Jean E; Jones, Carrie

    2008-01-01

    Rapid survey methodologies are widely used for assessing immunization coverage in developing countries, approximating true stratified random sampling. Non-random ('convenience') sampling is not considered appropriate for estimating immunization coverage rates but has the advantages of low cost and expediency. We assessed the validity of a convenience sample of children presenting to a travelling clinic by comparing the coverage rate in the convenience sample to the true coverage established by surveying each child in three villages in rural Papua New Guinea. The rate of DTP immunization coverage as estimated by the convenience sample was within 10% of the true coverage when the proportion of children in the sample was two-thirds or when only children over the age of one year were counted, but differed by 11% when the sample included only 53% of the children and when all eligible children were included. The convenience sample may be sufficiently accurate for reporting purposes and is useful for identifying areas of low coverage.

  17. A modified random decrement technique for modal identification from nonstationary ambient response data only

    International Nuclear Information System (INIS)

    Lin, Chang Sheng; Chiang, Dar Yun

    2012-01-01

    Modal identification is considered from response data of a structural system under nonstationary ambient vibration. In a previous paper, we showed that by assuming the ambient excitation to be nonstationary white noise in the form of a product model, the nonstationary response signals can be converted into free-vibration data via the correlation technique. In the present paper, if the ambient excitation can be modeled as nonstationary white noise in the form of a product model, then the nonstationary cross random decrement signatures of the structural response evaluated at any fixed time instant are shown theoretically to be proportional to the nonstationary cross-correlation functions. The practical problem of insufficient data samples available for evaluating nonstationary random decrement signatures can be approximately resolved by first extracting the amplitude-modulating function from the response and then transforming the nonstationary responses into stationary ones. Modal-parameter identification can then be performed using the Ibrahim time-domain technique, which is effective at identifying closely spaced modes. The theory proposed can be further extended by using the filtering concept to cover the case of nonstationary colored excitations. Numerical simulations confirm the validity of the proposed method for the identification of modal parameters from nonstationary ambient response data
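
    For contrast with the modified technique above, the classical (stationary) random decrement signature, an average of fixed-length response segments triggered at level up-crossings, can be sketched as follows (toy signal; the paper's nonstationary modification, which first extracts the amplitude-modulating function, is not reproduced here):

```python
import numpy as np

def random_decrement(x, threshold, length):
    """Average fixed-length segments of a response record starting at
    each up-crossing of `threshold`; the random content averages out,
    leaving a free-decay-like signature for modal identification."""
    x = np.asarray(x, dtype=float)
    starts = np.where((x[:-1] < threshold) & (x[1:] >= threshold))[0] + 1
    starts = starts[starts + length <= x.size]
    if starts.size == 0:
        raise ValueError("no trigger points found")
    return np.mean([x[s:s + length] for s in starts], axis=0)

# Lightly damped 2 Hz oscillator driven by white noise (stationary toy).
fs = 200
t = np.arange(0, 60.0, 1 / fs)
rng = np.random.default_rng(3)
impulse = np.exp(-0.5 * t[:400]) * np.sin(2 * np.pi * 2.0 * t[:400])
x = np.convolve(rng.standard_normal(t.size), impulse, mode="same")
signature = random_decrement(x, threshold=x.std(), length=2 * fs)
print(signature.shape)   # (400,) free-decay-like signature
```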

  18. Determination of Initial Conditions for the Safety Analysis by Random Sampling of Operating Parameters

    International Nuclear Information System (INIS)

    Jeong, Hae-Yong; Park, Moon-Ghu

    2015-01-01

    In most existing evaluation methodologies, which follow a conservative approach, the most conservative initial conditions are searched for each transient scenario through tremendous assessment over wide operating windows or the limiting conditions for operation (LCO) allowed by the operating guidelines. In this procedure, a user effect could be involved, and considerable time and human resources are consumed. In the present study, we investigated a more effective statistical method for the selection of the most conservative initial condition by the use of random sampling of the operating parameters affecting the initial conditions. A method for the determination of initial conditions based on random sampling of plant design parameters is proposed. This method is expected to be applied for the selection of the most conservative initial plant conditions in safety analysis using a conservative evaluation methodology. In the method, it is suggested that the initial conditions of reactor coolant flow rate, pressurizer level, pressurizer pressure, and SG level are adjusted by controlling the pump rated flow and the setpoints of the PLCS, PPCS, and FWCS, respectively. The proposed technique is expected to contribute to eliminating the human factors introduced in the conventional safety analysis procedure and also to reducing the human resources invested in the safety evaluation of nuclear power plants

  19. Key-space analysis of double random phase encryption technique

    Science.gov (United States)

    Monaghan, David S.; Gopinathan, Unnikrishnan; Naughton, Thomas J.; Sheridan, John T.

    2007-09-01

    We perform a numerical analysis on the double random phase encryption/decryption technique. The key-space of an encryption technique is the set of possible keys that can be used to encode data using that technique. In the case of a strong encryption scheme, many keys must be tried in any brute-force attack on that technique. Traditionally, designers of optical image encryption systems demonstrate only how a small number of arbitrary keys cannot decrypt a chosen encrypted image in their system. However, this type of demonstration does not discuss the properties of the key-space nor refute the feasibility of an efficient brute-force attack. To clarify these issues we present a key-space analysis of the technique. For a range of problem instances we plot the distribution of decryption errors in the key-space indicating the lack of feasibility of a simple brute-force attack.
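
    The scheme under analysis is compact enough to sketch (a minimal NumPy version of classical Fourier-plane DRPE; the mask sizes and toy image are assumptions):

```python
import numpy as np

def drpe_encrypt(img, phase1, phase2):
    """Double random phase encryption: random phase mask in the image
    plane, Fourier transform, second random mask in the frequency
    plane, inverse transform."""
    spectrum = np.fft.fft2(img * np.exp(1j * phase1))
    return np.fft.ifft2(spectrum * np.exp(1j * phase2))

def drpe_decrypt(cipher, phase1, phase2):
    """Undo the two phase masks (the keys) in reverse order."""
    spectrum = np.fft.fft2(cipher) * np.exp(-1j * phase2)
    return np.fft.ifft2(spectrum) * np.exp(-1j * phase1)

rng = np.random.default_rng(0)
img = rng.random((64, 64))                        # stand-in for an image
p1, p2 = (2 * np.pi * rng.random((64, 64)) for _ in range(2))
cipher = drpe_encrypt(img, p1, p2)
print(np.allclose(drpe_decrypt(cipher, p1, p2).real, img))  # True
```

    A brute-force attack must search over both phase masks, which is why the structure of the key-space, rather than a handful of wrong-key demonstrations, is what the record's analysis examines.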

  20. Fast egg collection method greatly improves randomness of egg sampling in Drosophila melanogaster

    DEFF Research Database (Denmark)

    Schou, Mads Fristrup

    2013-01-01

    When obtaining samples for population genetic studies, it is essential that the sampling is random. For Drosophila, one of the crucial steps in sampling experimental flies is the collection of eggs. Here an egg collection method is presented, which randomizes the eggs in a water column and diminishes environmental variance. This method was compared with a traditional egg collection method where eggs are collected directly from the medium. Within each method the observed and expected standard deviations of egg-to-adult viability were compared, whereby the difference in the randomness ... and to obtain a representative collection of genotypes, the method presented here is strongly recommended when collecting eggs from Drosophila.

  1. Knowledge of HIV/AIDS and Risk Behaviour among Students of ...

    African Journals Online (AJOL)

    This study examined the knowledge and risk behaviours on HIV/AIDS of students in colleges of Education in Osun State. The study sampled 1600 students (male and female) from two colleges of Education. A descriptive survey was adopted for the study using stratified random sampling techniques. A self- developed ...

  2. Efficiency of resource-use in Cassava production in Edo state, Nigeria

    African Journals Online (AJOL)

    This study employed the use of the Maximum Likelihood Estimation Technique in estimating the efficiency of resource-use in cassava production in Edo State. Data used for the study were sourced through the cost-route method of data collection, based on a stratified random sampling technique. The average farm size of ...

  3. The concentration of heavy metals: zinc, cadmium, lead, copper, mercury, iron and calcium in head hair of a randomly selected sample of Kenyan people

    International Nuclear Information System (INIS)

    Wandiga, S.O.; Jumba, I.O.

    1982-01-01

    An intercomparative analysis of the concentration of the heavy metals zinc, cadmium, lead, copper, mercury, iron and calcium in the head hair of a randomly selected sample of Kenyan people, using the techniques of atomic absorption spectrophotometry (AAS) and differential pulse anodic stripping voltammetry (DPAS), has been undertaken. The percent relative standard deviation for each sample analysed using either of the techniques shows good sensitivity and correlation between the techniques. The DPAS was found to be slightly more sensitive than the AAS instrument used. The recalculated body burden ratios of Cd to Zn and Pb to Fe reveal no unusual health impairment symptoms and suggest a relatively clean environment in Kenya. (author)

  4. CONSISTENCY UNDER SAMPLING OF EXPONENTIAL RANDOM GRAPH MODELS.

    Science.gov (United States)

    Shalizi, Cosma Rohilla; Rinaldo, Alessandro

    2013-04-01

    The growing availability of network data and of scientific interest in distributed systems has led to the rapid development of statistical models of network structure. Typically, however, these are models for the entire network, while the data consists only of a sampled sub-network. Parameters for the whole network, which is what is of interest, are estimated by applying the model to the sub-network. This assumes that the model is consistent under sampling, or, in terms of the theory of stochastic processes, that it defines a projective family. Focusing on the popular class of exponential random graph models (ERGMs), we show that this apparently trivial condition is in fact violated by many popular and scientifically appealing models, and that satisfying it drastically limits ERGMs' expressive power. These results are actually special cases of more general results about exponential families of dependent random variables, which we also prove. Using such results, we offer easily checked conditions for the consistency of maximum likelihood estimation in ERGMs, and discuss some possible constructive responses.

  5. Sampling techniques for thrips (Thysanoptera: Thripidae) in preflowering tomato.

    Science.gov (United States)

    Joost, P Houston; Riley, David G

    2004-08-01

    Sampling techniques for thrips (Thysanoptera: Thripidae) were compared in preflowering tomato plants at the Coastal Plain Experiment Station in Tifton, GA, in 2000 and 2003, to determine the most effective method of determining abundance of thrips on tomato foliage early in the growing season. Three relative sampling techniques, including a standard insect aspirator, a 946-ml beat cup, and an insect vacuum device, were compared for accuracy to an absolute method and to themselves for precision and efficiency of sampling thrips. Thrips counts of all relative sampling methods were highly correlated (R > 0.92) to the absolute method. The aspirator method was the most accurate compared with the absolute sample according to regression analysis in 2000. In 2003, all sampling methods were considered accurate according to Dunnett's test, but thrips numbers were lower and sample variation was greater than in 2000. In 2000, the beat cup method had the lowest relative variation (RV) or best precision, at 1 and 8 d after transplant (DAT). Only the beat cup method had RV values <25 for all sampling dates. In 2003, the beat cup method had the lowest RV value at 15 and 21 DAT. The beat cup method also was the most efficient method for all sample dates in both years. Frankliniella fusca (Pergande) was the most abundant thrips species on the foliage of preflowering tomato in both years of study at this location. Overall, the best thrips sampling technique tested was the beat cup method in terms of precision and sampling efficiency.

  6. Sampling strategy for a large scale indoor radiation survey - a pilot project

    International Nuclear Information System (INIS)

    Strand, T.; Stranden, E.

    1986-01-01

    Optimisation of a stratified random sampling strategy for large scale indoor radiation surveys is discussed. It is based on the results from a small scale pilot project where variances in dose rates within different categories of houses were assessed. By selecting a predetermined precision level for the mean dose rate in a given region, the number of measurements needed can be optimised. The results of a pilot project in Norway are presented together with the development of the final sampling strategy for a planned large scale survey. (author)
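
    Given pilot-study standard deviations per house category, the textbook way to optimise a fixed measurement budget is Neyman allocation (consistent in spirit with, though not necessarily identical to, the authors' optimisation; all numbers are illustrative):

```python
def neyman_allocation(N_h, S_h, n_total):
    """Neyman allocation for stratified sampling: allocate n_total
    measurements proportionally to N_h * S_h (stratum size times
    stratum standard deviation), which minimizes the variance of the
    estimated overall mean for a fixed total sample size."""
    weights = [N * S for N, S in zip(N_h, S_h)]
    total = sum(weights)
    return [round(n_total * w / total) for w in weights]

# Three house categories: sizes and pilot dose-rate SDs (illustrative).
print(neyman_allocation(N_h=[5000, 3000, 800],
                        S_h=[20, 35, 60], n_total=400))  # [158, 166, 76]
```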

  7. Using maximum entropy modeling for optimal selection of sampling sites for monitoring networks

    Science.gov (United States)

    Stohlgren, Thomas J.; Kumar, Sunil; Barnett, David T.; Evangelista, Paul H.

    2011-01-01

    Environmental monitoring programs must efficiently describe state shifts. We propose using maximum entropy modeling to select dissimilar sampling sites to capture environmental variability at low cost, and demonstrate a specific application: sample site selection for the Central Plains domain (453,490 km2) of the National Ecological Observatory Network (NEON). We relied on four environmental factors: mean annual temperature and precipitation, elevation, and vegetation type. A “sample site” was defined as a 20 km × 20 km area (equal to NEON’s airborne observation platform [AOP] footprint), within which each 1 km2 cell was evaluated for each environmental factor. After each model run, the most environmentally dissimilar site was selected from all potential sample sites. The iterative selection of eight sites captured approximately 80% of the environmental envelope of the domain, an improvement over stratified random sampling and simple random designs for sample site selection. This approach can be widely used for cost-efficient selection of survey and monitoring sites.

  8. Characterization of electron microscopes with binary pseudo-random multilayer test samples

    Science.gov (United States)

    Yashchuk, Valeriy V.; Conley, Raymond; Anderson, Erik H.; Barber, Samuel K.; Bouet, Nathalie; McKinney, Wayne R.; Takacs, Peter Z.; Voronov, Dmitriy L.

    2011-09-01

    Verification of the reliability of metrology data from high quality X-ray optics requires that adequate methods for test and calibration of the instruments be developed. For such verification for optical surface profilometers in the spatial frequency domain, a modulation transfer function (MTF) calibration method based on binary pseudo-random (BPR) gratings and arrays has been suggested [1,2] and proven to be an effective calibration method for a number of interferometric microscopes, a phase shifting Fizeau interferometer, and a scatterometer [5]. Here we describe the details of development of binary pseudo-random multilayer (BPRML) test samples suitable for characterization of scanning (SEM) and transmission (TEM) electron microscopes. We discuss the results of TEM measurements with the BPRML test samples fabricated from a WSi2/Si multilayer coating with pseudo-randomly distributed layers. In particular, we demonstrate that significant information about the metrological reliability of the TEM measurements can be extracted even when the fundamental frequency of the BPRML sample is smaller than the Nyquist frequency of the measurements. The measurements demonstrate a number of problems related to the interpretation of the SEM and TEM data. Note that similar BPRML test samples can be used to characterize X-ray microscopes. Corresponding work with X-ray microscopes is in progress.

  9. Characterization of electron microscopes with binary pseudo-random multilayer test samples

    International Nuclear Information System (INIS)

    Yashchuk, Valeriy V.; Conley, Raymond; Anderson, Erik H.; Barber, Samuel K.; Bouet, Nathalie; McKinney, Wayne R.; Takacs, Peter Z.; Voronov, Dmitriy L.

    2011-01-01

    Verification of the reliability of metrology data from high quality X-ray optics requires that adequate methods for test and calibration of the instruments be developed. For such verification for optical surface profilometers in the spatial frequency domain, a modulation transfer function (MTF) calibration method based on binary pseudo-random (BPR) gratings and arrays has been suggested and proven to be an effective calibration method for a number of interferometric microscopes, a phase shifting Fizeau interferometer, and a scatterometer [5]. Here we describe the details of development of binary pseudo-random multilayer (BPRML) test samples suitable for characterization of scanning (SEM) and transmission (TEM) electron microscopes. We discuss the results of TEM measurements with the BPRML test samples fabricated from a WSi2/Si multilayer coating with pseudo-randomly distributed layers. In particular, we demonstrate that significant information about the metrological reliability of the TEM measurements can be extracted even when the fundamental frequency of the BPRML sample is smaller than the Nyquist frequency of the measurements. The measurements demonstrate a number of problems related to the interpretation of the SEM and TEM data. Note that similar BPRML test samples can be used to characterize X-ray microscopes. Corresponding work with X-ray microscopes is in progress.

  10. LOD score exclusion analyses for candidate genes using random population samples.

    Science.gov (United States)

    Deng, H W; Li, J; Recker, R R

    2001-05-01

    While extensive analyses have been conducted to test for, no formal analyses have been conducted to test against, the importance of candidate genes with random population samples. We develop a LOD score approach for exclusion analyses of candidate genes with random population samples. Under this approach, specific genetic effects and inheritance models at candidate genes can be analysed and if a LOD score is ≤ -2.0, the locus can be excluded from having an effect larger than that specified. Computer simulations show that, with sample sizes often employed in association studies, this approach has high power to exclude a gene from having moderate genetic effects. In contrast to regular association analyses, population admixture will not affect the robustness of our analyses; in fact, it renders our analyses more conservative and thus any significant exclusion result is robust. Our exclusion analysis complements association analysis for candidate genes in random population samples and is parallel to the exclusion mapping analyses that may be conducted in linkage analyses with pedigrees or relative pairs. The usefulness of the approach is demonstrated by an application to test the importance of vitamin D receptor and estrogen receptor genes underlying the differential risk to osteoporotic fractures.
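
    A toy version of the exclusion logic: compute the log10 likelihood ratio of a specified additive genetic effect against no effect in a random population sample, and exclude an effect of that size when the LOD is ≤ -2.0. The additive normal model and all numbers below are illustrative assumptions, not the authors' formulation:

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    # Hypothetical random population sample: genotypes (0/1/2 minor alleles)
    # and a quantitative phenotype simulated with no true genetic effect.
    n = 500
    geno = rng.binomial(2, 0.3, size=n)
    pheno = rng.normal(0.0, 1.0, size=n)

    def exclusion_lod(geno, pheno, effect):
        """log10 likelihood ratio of a specified additive effect vs no effect;
        LOD <= -2.0 excludes the locus from having an effect this large."""
        resid = pheno - pheno.mean()
        ll_effect = stats.norm.logpdf(resid - effect * (geno - geno.mean()),
                                      scale=pheno.std()).sum()
        ll_null = stats.norm.logpdf(resid, scale=pheno.std()).sum()
        return (ll_effect - ll_null) / np.log(10)

    lod = exclusion_lod(geno, pheno, effect=0.5)
    print(f"LOD = {lod:.1f}; excluded: {lod <= -2.0}")
    ```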

  11. Generating Random Samples of a Given Size Using Social Security Numbers.

    Science.gov (United States)

    Erickson, Richard C.; Brauchle, Paul E.

    1984-01-01

    The purposes of this article are (1) to present a method by which social security numbers may be used to draw cluster samples of a predetermined size and (2) to describe procedures used to validate this method of drawing random samples. (JOW)
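
    The article's procedure is only summarized here, but the general idea of digit-based cluster sampling can be sketched: choose random two-digit SSN endings, each of which pulls in roughly 1% of the population, until the expected sample size matches the target. The helper below is a hypothetical illustration, not the authors' validated method:

    ```python
    import random

    def ssn_cluster_sample(ssns, target_size, seed=42):
        """Cluster sample via randomly chosen two-digit SSN endings; each
        ending selects ~1% of the population, so the number of endings is
        derived from the target sample size."""
        rng = random.Random(seed)
        k = max(1, round(100 * target_size / len(ssns)))
        endings = set(rng.sample([f"{i:02d}" for i in range(100)], k))
        return [s for s in ssns if s[-2:] in endings]

    # Hypothetical population of 9-digit SSN strings.
    population = [f"{random.Random(i).randrange(10**9):09d}" for i in range(5000)]
    print(len(ssn_cluster_sample(population, target_size=250)))  # ~250
    ```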

  12. The Effect of Service Quality on Customer Satisfaction at PT. Bank Central Asia (BCA) Tbk, Undaan Branch, Surabaya

    Directory of Open Access Journals (Sweden)

    Yulian Belinda Rahmawati

    2014-10-01

    This study aims to determine the effect of service quality on customer satisfaction at PT. Bank Central Asia (BCA), Undaan Branch, Surabaya. Sampling in this study used stratified random sampling. The analysis technique used is multiple linear regression. The results show that service quality has a simultaneous effect on customer satisfaction at PT. Bank Central Asia (BCA), Undaan Branch, Surabaya, and that the service quality variables Responsiveness, Tangibles, Empathy, Assurance, and Reliability each have a partial effect on customer satisfaction at the same branch.

  13. Educational attainment, formal employment and contraceptives ...

    African Journals Online (AJOL)

    Based on this, the study examines educational attainment, formal employment and contraceptive practices among working women in Lagos State University. Survey design was adopted for the study. Using stratified and simple random sampling techniques, quantitative data were gathered through the administration of ...

  14. An Investigation into Communication Climate and Staff Efficiency in ...

    African Journals Online (AJOL)

    This study examined the relationship between communication climate and staff efficiency in selected tertiary institutions in south-western Nigeria. Using the stratified random sampling technique, 1500 workers were drawn from public and private tertiary institutions (i.e., universities, polytechnics and colleges of education).

  15. Efficient Proportion Ratio Estimators for the Population Mean in Stratified Random Sampling

    OpenAIRE

    Maulana, Devri; Adnan, Arisman; Sirait, Haposan

    2014-01-01

    In this article we review three proportion ratio estimators for the population mean under stratified random sampling, i.e. the traditional proportion ratio estimator, the proportion ratio estimator using the coefficient of regression, and the proportion ratio estimator using the coefficient of regression and kurtosis as discussed by Singh and Audu [5]. The three estimators are biased estimators, so the mean square error of each estimator is determined. Furthermore, these mean square errors are compa...

  16. Strategies for Coping with the Challenges of Incarceration among Nigerian Prison Inmates

    Science.gov (United States)

    Agbakwuru, Chikwe; Awujo, Grace C.

    2016-01-01

    This paper investigated the strategies for coping with the challenges of incarceration among inmates of Port Harcourt Prison, Nigeria. The population was 2,997 inmates of the prison while the sample was 250 inmates drawn through stratified random sampling technique from the same Port Harcourt prison. Six research questions were posed and data for…

  17. Universal shift of the Brewster angle and disorder-enhanced delocalization of p waves in stratified random media.

    Science.gov (United States)

    Lee, Kwang Jin; Kim, Kihong

    2011-10-10

    We study theoretically the propagation and the Anderson localization of p-polarized electromagnetic waves incident obliquely on randomly stratified dielectric media with weak uncorrelated Gaussian disorder. Using the invariant imbedding method, we calculate the localization length and the disorder-averaged transmittance in a numerically precise manner. We find that the localization length takes an extremely large maximum value at some critical incident angle, which we call the generalized Brewster angle. The disorder-averaged transmittance also takes a maximum very close to one at the same incident angle. Even in the presence of an arbitrarily weak disorder, the generalized Brewster angle is found to be substantially different from the ordinary Brewster angle in uniform media. It is a rapidly increasing function of the average dielectric permittivity and approaches 90° when the average relative dielectric permittivity is slightly larger than two. We make a remarkable observation that the dependence of the generalized Brewster angle on the average dielectric permittivity is universal in the sense that it is independent of the strength of disorder. We also find, surprisingly, that when the average relative dielectric permittivity is less than one and the incident angle is larger than the generalized Brewster angle, both the localization length and the disorder-averaged transmittance increase substantially as the strength of disorder increases in a wide range of the disorder parameter. In other words, the Anderson localization of incident p waves can be weakened by disorder in a certain parameter regime.

  18. Effect of joint mobilization techniques for primary total knee arthroplasty: Study protocol for a randomized controlled trial.

    Science.gov (United States)

    Xu, Jiao; Zhang, Juan; Wang, Xue-Qiang; Wang, Xuan-Lin; Wu, Ya; Chen, Chan-Cheng; Zhang, Han-Yu; Zhang, Zhi-Wan; Fan, Kai-Yi; Zhu, Qiang; Deng, Zhi-Wei

    2017-12-01

    Total knee arthroplasty (TKA) has become the procedure most preferred by patients for the relief of pain caused by knee osteoarthritis. TKA patients hope for a speedy recovery after the surgery. Joint mobilization techniques for rehabilitation have been widely used to relieve pain and improve joint mobility. However, relevant randomized controlled trials showing the curative effect of these techniques remain lacking to date. Accordingly, this study aims to investigate whether joint mobilization techniques are valid for primary TKA. We will conduct a single-blind, prospective, randomized, controlled trial of 120 patients with unilateral TKA. Patients will be randomized into an intervention group, a physical modality therapy group, and a usual care group. The intervention group will undergo joint mobilization manipulation treatment once a day and regular training twice a day for a month. The physical modality therapy group will undergo physical therapy once a day and regular training twice a day for a month. The usual care group will perform regular training twice a day for a month. Primary outcome measures will be based on the visual analog scale, the knee joint Hospital for Special Surgery score, range of motion, surrounded degree, and adverse effects. Secondary indicators will include manual muscle testing, the 36-Item Short Form Health Survey, Berg Balance Scale function evaluation, the Pittsburgh Sleep Quality Index, proprioception, and muscle morphology. We will perform intention-to-treat analysis if a subject withdraws from the trial. The important features of this trial of joint mobilization techniques in primary TKA are its randomization procedures, single blinding, large sample size, and standardized protocol. This study aims to investigate whether joint mobilization techniques are effective for early TKA patients. The result of this study may serve as a guide for TKA patients, medical personnel, and healthcare decision makers. It has been registered at http

  19. Use of nuclear technique in samples for agricultural purposes

    Energy Technology Data Exchange (ETDEWEB)

    Oliveira, Kerley A. P. de; Sperling, Eduardo Von, E-mail: kerley@ufmg.br, E-mail: kerleyfisica@yahoo.com.br [Department of Sanitary and Environmental Engineering Federal University of Minas Gerais, Belo Horizonte (Brazil); Menezes, Maria Angela B. C.; Jacomino, Vanusa M.F. [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil)

    2013-01-15

    Concern for the environment is growing, and with it the need to determine chemical elements over a large range of concentrations. The neutron activation analysis (NAA) technique determines the elemental composition of a sample by measuring the artificial radioactivity induced when the sample is submitted to a neutron flux. NAA is a sensitive and accurate technique with low detection limits. An example of the application of NAA was the measurement of concentrations of rare earth elements (REE) in waste samples of phosphogypsum (PG) and cerrado soil samples (clayey and sandy soils). Additionally, a soil reference material of the International Atomic Energy Agency (IAEA) was also analyzed. The REE concentration in PG samples was two times higher than that found in national fertilizers (a total of 4,000 mg kg⁻¹), 154 times greater than the value found in the sandy soil (26 mg kg⁻¹) and 14 times greater than in the clayey soil (280 mg kg⁻¹). The experimental results for the reference material were within the uncertainty of the certified values, confirming the accuracy of the method (95%). The determination of La, Ce, Pr, Nd, Pm, Sm, Eu, Tb, Dy, Ho, Er, Tm, Yb and Lu in the samples and reference material confirmed the versatility of the technique for REE determination in soil and phosphogypsum samples, which are matrices of agricultural interest. (author)

  20. Use of nuclear technique in samples for agricultural purposes

    International Nuclear Information System (INIS)

    Oliveira, Kerley A. P. de; Sperling, Eduardo Von; Menezes, Maria Angela B. C.; Jacomino, Vanusa M.F.

    2013-01-01

    Concern for the environment is growing, and with it the need to determine chemical elements over a large range of concentrations. The neutron activation analysis (NAA) technique determines the elemental composition of a sample by measuring the artificial radioactivity induced when the sample is submitted to a neutron flux. NAA is a sensitive and accurate technique with low detection limits. An example of the application of NAA was the measurement of concentrations of rare earth elements (REE) in waste samples of phosphogypsum (PG) and cerrado soil samples (clayey and sandy soils). Additionally, a soil reference material of the International Atomic Energy Agency (IAEA) was also analyzed. The REE concentration in PG samples was two times higher than that found in national fertilizers (a total of 4,000 mg kg⁻¹), 154 times greater than the value found in the sandy soil (26 mg kg⁻¹) and 14 times greater than in the clayey soil (280 mg kg⁻¹). The experimental results for the reference material were within the uncertainty of the certified values, confirming the accuracy of the method (95%). The determination of La, Ce, Pr, Nd, Pm, Sm, Eu, Tb, Dy, Ho, Er, Tm, Yb and Lu in the samples and reference material confirmed the versatility of the technique for REE determination in soil and phosphogypsum samples, which are matrices of agricultural interest. (author)

  1. Development of analytical techniques for safeguards environmental samples at JAEA

    International Nuclear Information System (INIS)

    Sakurai, Satoshi; Magara, Masaaki; Usuda, Shigekazu; Watanabe, Kazuo; Esaka, Fumitaka; Hirayama, Fumio; Lee, Chi-Gyu; Yasuda, Kenichiro; Inagawa, Jun; Suzuki, Daisuke; Iguchi, Kazunari; Kokubu, Yoko S.; Miyamoto, Yutaka; Ohzu, Akira

    2007-01-01

    JAEA has been developing, under the auspices of the Ministry of Education, Culture, Sports, Science and Technology of Japan, analytical techniques for ultra-trace amounts of nuclear materials in environmental samples in order to contribute to the strengthened safeguards system. Essential techniques for bulk analysis, particle analysis and screening of environmental swipe samples have been established as ultra-trace analytical methods for uranium and plutonium. In January 2003, JAEA, including its quality control system, was qualified as a member of the IAEA network of analytical laboratories for environmental samples. Since 2004, JAEA has conducted the analysis of domestic and IAEA samples, through which JAEA's analytical capability has been verified and improved. In parallel, advanced techniques have been developed in order to expand the applicability to samples of various elemental compositions and impurities and to improve analytical accuracy and efficiency. This paper traces the technical development of environmental sample analysis at JAEA and reviews recent trends of research and development in this field. (author)

  2. The optimal injection technique for the osteoarthritic ankle: A randomized, cross-over trial

    NARCIS (Netherlands)

    Witteveen, Angelique G. H.; Kok, Aimee; Sierevelt, Inger N.; Kerkhoffs, Gino M. M. J.; van Dijk, C. Niek

    2013-01-01

    Background: To optimize the injection technique for the osteoarthritic ankle in order to enhance the effect of intra-articular injections and minimize adverse events. Methods: Randomized cross-over trial. Comparing two injection techniques in patients with symptomatic ankle osteoarthritis. Patients

  3. Water sampling techniques for continuous monitoring of pesticides in water

    Directory of Open Access Journals (Sweden)

    Šunjka Dragana

    2017-01-01

    Good ecological and chemical status of water is the most important aim of the Water Framework Directive 2000/60/EC, which implies respect of water quality standards at the level of the entire river basin (2008/105/EC and 2013/39/EC). This especially refers to the control of pesticide residues in surface waters. In order to achieve the set goals, a continuous monitoring program that provides a comprehensive and interrelated overview of water status should be implemented. However, it demands the use of appropriate analysis techniques. Until now, the procedure for sampling and quantification of residual pesticide quantities in the aquatic environment was based on traditional sampling techniques that imply periodic collection of individual samples. However, this type of sampling provides only a snapshot of the situation with regard to the presence of pollutants in water. As an alternative, the technique of passive sampling of pollutants in water, including pesticides, has been introduced. Different samplers are available for pesticide sampling in surface water, depending on the compounds. The technique itself is based on keeping a device in water over a longer period of time, which varies from several days to several weeks depending on the kind of compound. In this manner, the average concentrations of pollutants dissolved in water during a time period (time-weighted average concentrations, TWA) are obtained, which enables monitoring of trends in areal and seasonal variations. The use of these techniques also leads to an increase in the sensitivity of analytical methods, considering that pre-concentration of analytes takes place within the sorption medium. However, the use of these techniques for determination of pesticide concentrations in real water environments requires calibration studies for the estimation of sampling rates (Rs). Rs is a volume of water per time, calculated as the product of overall mass transfer coefficient and area of
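
    The time-weighted average concentration follows directly from the accumulated analyte mass and the calibrated sampling rate, C_TWA = m/(Rs·t). A minimal sketch with hypothetical numbers:

    ```python
    def twa_concentration(mass_ng, sampling_rate_l_per_day, days):
        """C_TWA = m / (Rs * t): average dissolved concentration (ng/L) over
        the deployment, from the analyte mass absorbed by a passive sampler."""
        return mass_ng / (sampling_rate_l_per_day * days)

    # e.g. 120 ng of a pesticide accumulated over 14 days at Rs = 0.2 L/day
    print(twa_concentration(120.0, 0.2, 14))   # ~42.9 ng/L
    ```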

  4. Magnetic separation techniques in sample preparation for biological analysis: a review.

    Science.gov (United States)

    He, Jincan; Huang, Meiying; Wang, Dongmei; Zhang, Zhuomin; Li, Gongke

    2014-12-01

    Sample preparation is a fundamental and essential step in almost all the analytical procedures, especially for the analysis of complex samples like biological and environmental samples. In past decades, with advantages of superparamagnetic property, good biocompatibility and high binding capacity, functionalized magnetic materials have been widely applied in various processes of sample preparation for biological analysis. In this paper, the recent advancements of magnetic separation techniques based on magnetic materials in the field of sample preparation for biological analysis were reviewed. The strategy of magnetic separation techniques was summarized. The synthesis, stabilization and bio-functionalization of magnetic nanoparticles were reviewed in detail. Characterization of magnetic materials was also summarized. Moreover, the applications of magnetic separation techniques for the enrichment of protein, nucleic acid, cell, bioactive compound and immobilization of enzyme were described. Finally, the existed problems and possible trends of magnetic separation techniques for biological analysis in the future were proposed. Copyright © 2014 Elsevier B.V. All rights reserved.

  5. Smoothing the redshift distributions of random samples for the baryon acoustic oscillations: applications to the SDSS-III BOSS DR12 and QPM mock samples

    Science.gov (United States)

    Wang, Shao-Jiang; Guo, Qi; Cai, Rong-Gen

    2017-12-01

    We investigate the impact of different redshift distributions of random samples on the baryon acoustic oscillations (BAO) measurements of D_V(z) r_d^fid / r_d from the two-point correlation functions of galaxies in the Data Release 12 of the Baryon Oscillation Spectroscopic Survey (BOSS). Big surveys, such as BOSS, usually assign redshifts to the random samples by randomly drawing values from the measured redshift distributions of the data, which would necessarily introduce fiducial signals of fluctuations into the random samples, weakening the signals of BAO, if the cosmic variance cannot be ignored. We propose a smooth function of redshift distribution that fits the data well to populate the random galaxy samples. The resulting cosmological parameters match the input parameters of the mock catalogue very well. The significance of BAO signals has been improved by 0.33σ for a low-redshift sample and by 0.03σ for a constant-stellar-mass sample, though the absolute values do not change significantly. Given the precision of current cosmological parameter measurements, such improvements will be valuable for future measurements of galaxy clustering.
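
    The proposed remedy, drawing random-catalogue redshifts from a smooth fit to the measured distribution instead of from the raw histogram, can be sketched as follows. The polynomial fit and inverse-CDF draw below are illustrative stand-ins, not the authors' actual fitting function:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    z_data = rng.normal(0.5, 0.08, size=100_000)     # stand-in galaxy redshifts

    # Fit a smooth n(z) to the binned counts (here a simple polynomial).
    hist, edges = np.histogram(z_data, bins=60, density=True)
    centers = 0.5 * (edges[1:] + edges[:-1])
    nz_smooth = np.clip(np.polyval(np.polyfit(centers, hist, deg=6), centers),
                        0.0, None)

    # Draw random-catalogue redshifts by inverse-CDF sampling the smooth n(z).
    cdf = np.cumsum(nz_smooth)
    cdf /= cdf[-1]
    z_random = np.interp(rng.uniform(size=500_000), cdf, centers)
    print(z_random.mean(), z_random.std())
    ```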

  6. National-scale vegetation change across Britain; an analysis of sample-based surveillance data from the Countryside Surveys of 1990 and 1998

    NARCIS (Netherlands)

    Smart, S.M.; Clarke, R.T.; Poll, van de H.M.; Robertson, E.J.; Shield, E.R.; Bunce, R.G.H.; Maskell, L.C.

    2003-01-01

    Patterns of vegetation across Great Britain (GB) between 1990 and 1998 were quantified based on an analysis of plant species data from a total of 9596 fixed plots. Plots were established on a stratified random basis within 501 1-km sample squares located as part of the Countryside Survey of GB.

  7. Solid Phase Microextraction and Related Techniques for Drugs in Biological Samples

    OpenAIRE

    Moein, Mohammad Mahdi; Said, Rana; Bassyouni, Fatma; Abdel-Rehim, Mohamed

    2014-01-01

    In drug discovery and development, the quantification of drugs in biological samples is an important task for the determination of the physiological performance of the investigated drugs. After sampling, the next step in the analytical process is sample preparation. Because of the low concentration levels of drug in plasma and the variety of the metabolites, the selected extraction technique should be virtually exhaustive. Recent developments of sample handling techniques are directed, from o...

  8. The effect of surfactant on stratified and stratifying gas-liquid flows

    Science.gov (United States)

    Heiles, Baptiste; Zadrazil, Ivan; Matar, Omar

    2013-11-01

    We consider the dynamics of a stratified/stratifying gas-liquid flow in horizontal tubes. This flow regime is characterised by the thin liquid films that drain under gravity along the pipe interior, forming a pool at the bottom of the tube, and the formation of large-amplitude waves at the gas-liquid interface. This regime is also accompanied by the detachment of droplets from the interface and their entrainment into the gas phase. We carry out an experimental study involving axial- and radial-view photography of the flow, in the presence and absence of surfactant. We show that the effect of surfactant is to reduce significantly the average diameter of the entrained droplets, through a tip-streaming mechanism. We also highlight the influence of surfactant on the characteristics of the interfacial waves, and the pressure gradient that drives the flow. EPSRC Programme Grant EP/K003976/1.

  9. Electromagnetic waves in stratified media

    CERN Document Server

    Wait, James R; Fock, V A; Wait, J R

    2013-01-01

    International Series of Monographs in Electromagnetic Waves, Volume 3: Electromagnetic Waves in Stratified Media provides information pertinent to the electromagnetic waves in media whose properties differ in one particular direction. This book discusses the important feature of the waves that enables communications at global distances. Organized into 13 chapters, this volume begins with an overview of the general analysis for the electromagnetic response of a plane stratified medium comprising of any number of parallel homogeneous layers. This text then explains the reflection of electromagne

  10. Piezoelectric Versus Conventional Rotary Techniques for Impacted Third Molar Extraction: A Meta-analysis of Randomized Controlled Trials.

    Science.gov (United States)

    Jiang, Qian; Qiu, Yating; Yang, Chi; Yang, Jingyun; Chen, Minjie; Zhang, Zhiyuan

    2015-10-01

    Impacted third molars are frequently encountered in clinical work. Surgical removal of impacted third molars is often required to prevent clinical symptoms. Traditional rotary cutting instruments are potentially injurious, and piezosurgery, as a new osteotomy technique, has been introduced in oral and maxillofacial surgery. No consistent conclusion has been reached regarding whether this new technique is associated with fewer or less severe postoperative sequelae after third molar extraction. The aim of this study was to compare piezosurgery with rotary osteotomy techniques, with regard to surgery time and the severity of postoperative sequelae, including pain, swelling, and trismus. We conducted a systematic literature search in the Cochrane Library, PubMed, Embase, and Google Scholar. The eligibility criteria of this study included the following: the patients were clearly diagnosed as having impacted mandibular third molars; the patients underwent piezosurgery osteotomy, and in the control group rotary osteotomy techniques, for removing impacted third molars; the outcomes of interest include surgery time, trismus, swelling or pain; the studies are randomized controlled trials. We used random-effects models to calculate the difference in the outcomes, and the corresponding 95% confidence interval. We calculated the weighted mean difference if the trials used the same measurement, and a standardized mean difference if otherwise. A total of seven studies met the eligibility criteria and were included in our analysis. Compared with rotary osteotomy, patients undergoing piezosurgery experienced longer surgery time (mean difference 4.13 minutes, 95% confidence interval 2.75-5.52, P ... piezosurgery groups. The number of included randomized controlled trials and the sample size of each trial were relatively small, double blinding was not possible, and cost analysis was unavailable due to a lack of data. Our meta-analysis indicates that although patients undergoing piezosurgery

  11. Manipulation of biological samples using micro and nano techniques.

    Science.gov (United States)

    Castillo, Jaime; Dimaki, Maria; Svendsen, Winnie Edith

    2009-01-01

    The constant interest in handling, integrating and understanding biological systems of interest for the biomedical field, the pharmaceutical industry and the biomaterial researchers demands the use of techniques that allow the manipulation of biological samples causing minimal or no damage to their natural structure. Thanks to the advances in micro- and nanofabrication during the last decades several manipulation techniques offer us the possibility to image, characterize and manipulate biological material in a controlled way. Using these techniques the integration of biomaterials with remarkable properties with physical transducers has been possible, giving rise to new and highly sensitive biosensing devices. This article reviews the different techniques available to manipulate and integrate biological materials in a controlled manner, either by sliding them along a surface (2-D manipulation), by grasping them and moving them to a new position (3-D manipulation), or by manipulating and relocating them applying external forces. The advantages and drawbacks are mentioned together with examples that reflect the state of the art of manipulation techniques for biological samples (171 references).

  12. Experimental analysis of an oblique turbulent flame front propagating in a stratified flow

    Energy Technology Data Exchange (ETDEWEB)

    Galizzi, C.; Escudie, D. [Universite de Lyon, CNRS, CETHIL, INSA-Lyon, UMR5008, F-69621 Cedex (France)

    2010-12-15

    This paper details the experimental study of a turbulent V-shaped flame expanding in a nonhomogeneous premixed flow. Its aim is to characterize the effects of stratification on turbulent flame characteristics. The setup consists of a stationary V-shaped flame stabilized on a rod and expanding freely in a lean premixed methane-air flow. One of the two oblique fronts interacts with a stratified slice, which has an equivalence ratio close to one and a thickness greater than that of the flame front. Several techniques such as PIV and CH* chemiluminescence are used to investigate the instantaneous fields, while laser Doppler anemometry and thermocouples are combined with a concentration probe to provide information on the mean fields. First, in order to provide a reference, the homogeneous turbulent case is studied. Next, the stratified turbulent premixed flame is investigated. Results show significant modifications of the whole flame and of the velocity field upstream of the flame front. The analysis of the geometric properties of the stratified flame indicates an increase in flame brush thickness, closely related to the local equivalence ratio. (author)

  13. Geostatistical Sampling Methods for Efficient Uncertainty Analysis in Flow and Transport Problems

    Science.gov (United States)

    Liodakis, Stylianos; Kyriakidis, Phaedon; Gaganis, Petros

    2015-04-01

    In hydrogeological applications involving flow and transport in heterogeneous porous media, the spatial distribution of hydraulic conductivity is often parameterized in terms of a lognormal random field based on a histogram and variogram model inferred from data and/or synthesized from relevant knowledge. Realizations of simulated conductivity fields are then generated using geostatistical simulation involving simple random (SR) sampling and are subsequently used as inputs to physically-based simulators of flow and transport in a Monte Carlo framework for evaluating the uncertainty in the spatial distribution of solute concentration due to the uncertainty in the spatial distribution of hydraulic conductivity [1]. Realistic uncertainty analysis, however, calls for a large number of simulated concentration fields; hence, it can become expensive in terms of both time and computer resources. A more efficient alternative to SR sampling is Latin hypercube (LH) sampling, a special case of stratified random sampling, which yields a more representative distribution of simulated attribute values with fewer realizations [2]. Here, the term representative implies realizations spanning efficiently the range of possible conductivity values corresponding to the lognormal random field. In this work we investigate the efficiency of alternative methods to classical LH sampling within the context of simulation of flow and transport in a heterogeneous porous medium. More precisely, we consider the stratified likelihood (SL) sampling method of [3], in which attribute realizations are generated using the polar simulation method by exploring the geometrical properties of the multivariate Gaussian distribution function. In addition, we propose a more efficient version of the above method, here termed minimum energy (ME) sampling, whereby a set of N representative conductivity realizations at M locations is constructed by: (i) generating a representative set of N points distributed on the
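
    The contrast between simple random (SR) and Latin hypercube (LH) sampling is easiest to see in one dimension: LH stratifies the probability axis into n equal slices and draws exactly once per slice, so every part of the conductivity distribution is represented. A minimal sketch with assumed log-conductivity parameters:

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(7)
    n = 20                      # number of Monte Carlo realizations
    mu, sigma = -9.0, 1.5       # assumed log10 hydraulic conductivity ~ N(mu, sigma)

    # Simple random (SR) sampling: n independent draws.
    sr = rng.normal(mu, sigma, size=n)

    # Latin hypercube (LH) sampling: one uniform draw inside each of n
    # equal-probability strata, mapped through the inverse normal CDF and
    # shuffled so realization order carries no pattern.
    u = (np.arange(n) + rng.uniform(size=n)) / n
    lh = mu + sigma * stats.norm.ppf(rng.permutation(u))

    # LH fills the quartiles of the distribution exactly evenly; SR need not.
    edges = stats.norm.ppf([0.25, 0.5, 0.75], mu, sigma)
    print("SR quartile counts:", np.bincount(np.searchsorted(edges, sr), minlength=4))
    print("LH quartile counts:", np.bincount(np.searchsorted(edges, lh), minlength=4))
    ```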

  14. NAIL SAMPLING TECHNIQUE AND ITS INTERPRETATION

    OpenAIRE

    TZAR MN; LEELAVATHI M

    2011-01-01

    The clinical suspicion of onychomycosis, based on the appearance of the nails, requires culture for confirmation. This is because treatment requires prolonged use of systemic agents which may cause side effects. One of the common problems encountered is improper nail sampling technique, which results in loss of essential information. The unfamiliar terminologies used in reporting culture results may intimidate physicians, resulting in misinterpretation and hampered treatment decisions. This article prov...

  15. Effects of Green River Project on Cassava Farmers Production in ...

    African Journals Online (AJOL)

    This paper examined the effects of Green River project on cassava farmers' production in Ogba/Egbema/ Ndoni LGA of Rivers State. Purposive and stratified random sampling techniques were used to select the locations of Green River project, cooperative societies and respondents. Using structured questionnaire, a field ...

  16. Exercise participation and diet monitoring in pursuit of healthy aging ...

    African Journals Online (AJOL)

    This study examined the level of exercise participation and diet monitoring in pursuit of healthy aging. Descriptive survey research design and self-structured questionnaire was used to elicit information from the respondents. Proportionate stratified and simple random sampling techniques were used to select two hundred ...

  17. Test anxiety, attitude to schooling, parental influence, and peer ...

    African Journals Online (AJOL)

    This study investigated test anxiety, attitude to schooling, parental influence, and peer pressure as predictors of cheating tendencies in examination among secondary school students in Edo State, Nigeria. Ex-post facto research design was adopted for the study. Using stratified random sampling technique, 1200 senior ...

  18. Download this PDF file

    African Journals Online (AJOL)

    Methodology: Four centres were selected using a stratified random sampling technique. ... globally. Ageing is the commonest cause of cataract. Exponential growth of the world population and life expectancy results in an increase in the elderly population, resulting in a ... A. Audit of Outcome of an Extracapsular Cataract.

  19. Factors influencing user ability to retrieve information from the ...

    African Journals Online (AJOL)

    A sample size of 500 users and 12 librarians was selected from a study population of 5012 using stratified/simple random sampling techniques. Two four-point Likert-type questionnaires measuring three (3) variables that influence user ability to retrieve information were developed, validated and administered.

  20. Gender differences in attitude towards mathematics in Nigerian ...

    African Journals Online (AJOL)

    This paper examined the gender differences in attitude towards mathematics in Nigerian secondary schools. A descriptive survey method was adopted for the study. Stratified random sampling technique was used to select twenty secondary schools in Makurdi Metropolis of Benue State. Three hundred and seventy-five ...

  1. Sources of marital stress experienced by married people as ...

    African Journals Online (AJOL)

    The study investigated sources of marital stress experienced by married people as perceived by lecturers of College of Education. Respondents were stratified into different strata of gender, age group, educational qualification and number of children, after which simple random sampling technique was used for selecting 20 ...

  2. 40 CFR 761.308 - Sample selection by random number generation on any two-dimensional square grid.

    Science.gov (United States)

    2010-07-01

    § 761.79(b)(3), § 761.308 Sample selection by random number generation on any two-dimensional square ... area created in accordance with paragraph (a) of this section, select two random numbers: one each for...
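
    The selection step the rule describes, generating one random number per axis to fix a point on a two-dimensional square grid, is compact to sketch (grid dimensions hypothetical):

    ```python
    import random

    def select_grid_point(n_cells_x, n_cells_y, seed=None):
        """Pick one cell of a two-dimensional square grid by generating one
        random number per axis, as in the selection step of 761.308."""
        rng = random.Random(seed)
        return rng.randrange(n_cells_x), rng.randrange(n_cells_y)

    print(select_grid_point(10, 10, seed=2010))
    ```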

  3. Sample preparation techniques for (p, X) spectrometry

    International Nuclear Information System (INIS)

    Whitehead, N.E.

    1985-01-01

    Samples are ashed at low temperature using oxygen plasma; a rotary evaporator and freeze drying speeded up the ashing. The new design of apparatus manufactured was only 10 watt but was as efficient as a 200 watt commercial machine; a circuit diagram is included. Samples of hair and biopsy samples of skin were analysed by the technique. A wool standard was prepared for interlaboratory comparison exercises. It was based on New Zealand merino sheep wool and was 2.9 kg in weight. A washing protocol was developed which preserves most of the trace element content. The wool was ground in liquid nitrogen using a plastic pestle and beaker, driven by a rotary drill press. (author)

  4. Development and enrolee satisfaction with basic medical insurance in China: A systematic review and stratified cluster sampling survey.

    Science.gov (United States)

    Jing, Limei; Chen, Ru; Jing, Lisa; Qiao, Yun; Lou, Jiquan; Xu, Jing; Wang, Junwei; Chen, Wen; Sun, Xiaoming

    2017-07-01

    Basic Medical Insurance (BMI) has changed remarkably over time in China because of health reforms that aim to achieve universal coverage and better health care through increased subsidies, reimbursement, and benefits. In this paper, we present the development of BMI, including financing and operation, with a systematic review. Meanwhile, Pudong New Area in Shanghai was chosen as a typical BMI sample for its coverage and management; a stratified cluster sampling survey together with an ordinary logistic regression model was used for the analysis. Enrolee satisfaction and the factors associated with enrolee satisfaction with BMI were analysed. We found that the re-enrolling rate superficially improved BMI coverage and nearly achieved universal coverage. However, BMI funds still faced the dual problems of fund deficits and under-compensation of the insured, and a long-term strategy is needed to realize the integration of BMI schemes with more homogeneous coverage and benefits. Moreover, Urban Resident Basic Medical Insurance participants reported a higher rate of dissatisfaction than other participants. The key predictors of enrolee satisfaction were awareness of the premium and compensation, affordability of out-of-pocket costs, and the proportion of reimbursement. These results highlight the importance of the Chinese government taking measures, such as strengthening BMI fund management, exploring mixed payment methods, and regulating sequential medical orders, to develop an integrated medical insurance system of universal coverage and vertical equity while simultaneously improving enrolee satisfaction. Copyright © 2017 John Wiley & Sons, Ltd.

  5. Validation of the k-filtering technique for a signal composed of random-phase plane waves and non-random coherent structures

    Directory of Open Access Journals (Sweden)

    O. W. Roberts

    2014-12-01

    Recent observations of astrophysical magnetic fields have shown the presence of fluctuations being wave-like (propagating in the plasma frame) and those described as being structure-like (advected by the plasma bulk velocity). Typically with single-spacecraft missions it is impossible to differentiate between these two fluctuations, due to the inherent spatio-temporal ambiguity associated with a single point measurement. However missions such as Cluster which contain multiple spacecraft have allowed for temporal and spatial changes to be resolved, using techniques such as k filtering. While this technique does not assume Taylor's hypothesis it requires both weak stationarity of the time series and that the fluctuations can be described by a superposition of plane waves with random phases. In this paper we test whether the method can cope with a synthetic signal which is composed of a combination of non-random-phase coherent structures with a mean radius d and a mean separation λ, as well as plane waves with random phase.

  6. Analysis of Turbulent Combustion in Simplified Stratified Charge Conditions

    Science.gov (United States)

    Moriyoshi, Yasuo; Morikawa, Hideaki; Komatsu, Eiji

    The stratified charge combustion system has been widely studied due to the significant potentials for low fuel consumption rate and low exhaust gas emissions. The fuel-air mixture formation process in a direct-injection stratified charge engine is influenced by various parameters, such as atomization, evaporation, and in-cylinder gas motion at high temperature and high pressure conditions. It is difficult to observe the in-cylinder phenomena in such conditions and also challenging to analyze the following stratified charge combustion. Therefore, the combustion phenomena in simplified stratified charge conditions aiming to analyze the fundamental stratified charge combustion are examined. That is, an experimental apparatus which can control the mixture distribution and the gas motion at ignition timing was developed, and the effects of turbulence intensity, mixture concentration distribution, and mixture composition on stratified charge combustion were examined. As a result, the effects of fuel, charge stratification, and turbulence on combustion characteristics were clarified.

  7. [A comparison of convenience sampling and purposive sampling].

    Science.gov (United States)

    Suen, Lee-Jen Wu; Huang, Hui-Man; Lee, Hao-Hsien

    2014-06-01

    Convenience sampling and purposive sampling are two different sampling methods. This article first explains sampling terms such as target population, accessible population, simple random sampling, intended sample, actual sample, and statistical power analysis. These terms are then used to explain the difference between "convenience sampling" and "purposive sampling." Convenience sampling is a non-probabilistic sampling technique applicable to qualitative or quantitative studies, although it is most frequently used in quantitative studies. In convenience samples, subjects more readily accessible to the researcher are more likely to be included. Thus, in quantitative studies, opportunity to participate is not equal for all qualified individuals in the target population and study results are not necessarily generalizable to this population. As in all quantitative studies, increasing the sample size increases the statistical power of the convenience sample. In contrast, purposive sampling is typically used in qualitative studies. Researchers who use this technique carefully select subjects based on study purpose with the expectation that each participant will provide unique and rich information of value to the study. As a result, members of the accessible population are not interchangeable and sample size is determined by data saturation not by statistical power analysis.

  8. The Impact of Home Environment Factors on Academic Performance of Senior Secondary School Students in Garki Area District, Abuja - Nigeria

    OpenAIRE

    L. T. Dzever

    2015-01-01

    The study examined the impact of home environment factors on the academic performance of public secondary school students in Garki Area District, Abuja, Nigeria. The stratified sampling technique was used to select 300 students from six public schools, while the simple random sampling technique was used to administer the questionnaire. The study utilized a descriptive survey research design for the study. Also, data on student’s academic performance was obtained from student’s scores in four ...

  9. Are most samples of animals systematically biased? Consistent individual trait differences bias samples despite random sampling.

    Science.gov (United States)

    Biro, Peter A

    2013-02-01

    Sampling animals from the wild for study is something nearly every biologist has done, but despite our best efforts to obtain random samples of animals, 'hidden' trait biases may still exist. For example, consistent behavioral traits can affect trappability/catchability, independent of obvious factors such as size and gender, and these traits are often correlated with other repeatable physiological and/or life history traits. If so, systematic sampling bias may exist for any of these traits. The extent to which this is a problem, of course, depends on the magnitude of bias, which is presently unknown because the underlying trait distributions in populations are usually unknown, or unknowable. Indeed, our present knowledge about sampling bias comes from samples (not complete population censuses), which can possess bias to begin with. I had the unique opportunity to create naturalized populations of fish by seeding each of four small fishless lakes with equal densities of slow-, intermediate-, and fast-growing fish. Using sampling methods that are not size-selective, I observed that fast-growing fish were up to two times more likely to be sampled than slower-growing fish. This indicates substantial and systematic bias with respect to an important life history trait (growth rate). If correlations between behavioral, physiological and life-history traits are as widespread as the literature suggests, then many animal samples may be systematically biased with respect to these traits (e.g., when collecting animals for laboratory use), and affect our inferences about population structure and abundance. I conclude with a discussion on ways to minimize sampling bias for particular physiological/behavioral/life-history types within animal populations.
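
    The mechanism is easy to reproduce in a toy simulation: if catchability increases with growth rate, the sampled mean overestimates the population mean even though each capture event is random. All numbers below are hypothetical:

    ```python
    import numpy as np

    rng = np.random.default_rng(11)
    growth = rng.normal(1.0, 0.3, size=10_000)                 # true trait values
    catchability = np.clip(0.1 + 0.2 * (growth - 1.0), 0.01, 1.0)
    caught = rng.uniform(size=growth.size) < catchability      # one trapping pass

    print("population mean growth:", round(float(growth.mean()), 3))
    print("sampled mean growth:   ", round(float(growth[caught].mean()), 3))
    ```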

  10. Application of random amplified polymorphic DNA (RAPD) markers ...

    African Journals Online (AJOL)

    SAM

    2014-06-11

    ... variety share an identical genome. In this field one of the most successful techniques is random ... To each minced sample, 350 µL of the same extraction buffer was added and the samples were ... using fingerprints produced by random primers. J. Hort. Sci. 69:123-130. Levi A, Rowland LJ, Hartung JS ...

  11. The relationships between sixteen perfluorinated compound concentrations in blood serum and food, and other parameters, in the general population of South Korea with proportionate stratified sampling method.

    Science.gov (United States)

    Kim, Hee-Young; Kim, Seung-Kyu; Kang, Dong-Mug; Hwang, Yong-Sik; Oh, Jeong-Eun

    2014-02-01

    Serum samples were collected from volunteers of various ages and both genders using a proportionate stratified sampling method, to assess the exposure of the general population in Busan, South Korea, to perfluorinated compounds (PFCs). 16 PFCs were investigated in serum samples from 306 adults (124 males and 182 females) and one-day composite diet samples (breakfast, lunch, and dinner) from 20 of the serum donors, to investigate the relationship between food and serum PFC concentrations. Perfluorooctanoic acid and perfluorooctanesulfonic acid were the dominant PFCs in the serum samples, with mean concentrations of 8.4 and 13 ng/mL, respectively. Perfluorotridecanoic acid was the dominant PFC in the composite food samples, ranging from ... We confirmed from the relationships between questionnaire results and the PFC concentrations in the serum samples that food is one of the important contributing factors to human exposure to PFCs. However, there were no correlations between the PFC concentrations in the one-day composite diet samples and the serum samples, because a one-day composite diet sample is not necessarily representative of a person's long-term diet and because of the small number of samples taken. Copyright © 2013 Elsevier B.V. All rights reserved.
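
    Proportionate stratified sampling fixes each stratum's share of the sample at its share of the population, n_h = n·N_h/N. A minimal sketch with hypothetical age-gender strata:

    ```python
    def proportionate_allocation(stratum_sizes, n):
        """n_h = n * N_h / N (rounded): proportionate stratified allocation."""
        N = sum(stratum_sizes.values())
        return {k: round(n * N_h / N) for k, N_h in stratum_sizes.items()}

    # Hypothetical population counts by age group and gender
    strata = {"M 20-39": 52000, "M 40-59": 48000,
              "F 20-39": 55000, "F 40-59": 50000}
    print(proportionate_allocation(strata, n=306))
    ```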

  12. Development of material balance evaluation technique(2)

    International Nuclear Information System (INIS)

    Lee, Byung Doo

    2000-06-01

    The IAEA considers the evaluation of material balance to be one of the important activities for detecting the diversion of nuclear materials, as well as measurement uncertainties and measurement bias. Nuclear material accounting reports, the results of DA and NDA, and summarized lists of material stratified by the inspector are necessary for the material balance evaluation. In this report, the concepts and methods of material balance evaluation, such as the estimation of random and systematic errors and of the MUF, D and MUF-D statistics, are described. In conclusion, it is possible for a national inspectorate to evaluate the material balance by applying the evaluation methods of the IAEA, such as error estimation using operator-inspector paired data and inspector MUF (IMUF) evaluation.
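
    The statistics named here have simple accounting definitions: MUF is the book-minus-physical inventory difference, D is the operator-inspector difference over paired verification measurements, and MUF-D combines the two. A sketch with illustrative numbers only:

    ```python
    def muf(beginning_inventory, receipts, shipments, ending_inventory):
        """Material unaccounted for: book inventory minus physical inventory."""
        return beginning_inventory + receipts - shipments - ending_inventory

    def d_statistic(operator, inspector):
        """Operator-inspector difference, summed over paired verification data."""
        return sum(o - i for o, i in zip(operator, inspector))

    # Hypothetical balance period (kg of nuclear material)
    m = muf(beginning_inventory=102.0, receipts=40.0, shipments=38.5,
            ending_inventory=103.2)
    d = d_statistic(operator=[10.1, 9.8, 20.3], inspector=[10.0, 9.9, 20.1])
    print(m, d, m - d)   # MUF, D, MUF-D
    ```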

  13. Assessing Principals' Quality Assurance Strategies in Osun State Secondary Schools, Nigeria

    Science.gov (United States)

    Fasasi, Yunus Adebunmi; Oyeniran, Saheed

    2014-01-01

    This paper examined principals' quality assurance strategies in secondary schools in Osun State, Nigeria. The study adopted a descriptive survey research design. Stratified random sampling technique was used to select 10 male and 10 female principals, and 190 male and 190 female teachers. "Secondary School Principal Quality Assurance…

  14. Communication in marital homes and work performance among ...

    African Journals Online (AJOL)

    This study investigated the influence of communication in marital homes on secondary school teachers work performance in Akwa Ibom State. One research question and one hypothesis were formulated to guide the study. The ex-post facto research design was used in the study. Using stratified random sampling technique, ...

  15. optimal allocation of flows (water) within the volta basin system of ...

    African Journals Online (AJOL)

    Sir Onassis

    As an ex-post facto and descriptive survey, the study population embraced all the 141 secondary ... by multi-stage and stratified random sampling techniques. ... Table 4: Average Class-size in Secondary Schools in Ekiti State, Nigeria. Years ... Ekiti State Government (1997) Approval Estimates 1997/98; Ado-Ekiti: Ministry of.

  16. A simple and efficient alternative to implementing systematic random sampling in stereological designs without a motorized microscope stage.

    Science.gov (United States)

    Melvin, Neal R; Poda, Daniel; Sutherland, Robert J

    2007-10-01

    When properly applied, stereology is a very robust and efficient method to quantify a variety of parameters from biological material. A common sampling strategy in stereology is systematic random sampling, which involves choosing a random start point outside the structure of interest and sampling relevant objects at sites placed at pre-determined, equidistant intervals. This has proven to be a very efficient sampling strategy and is used widely in stereological designs. At the microscopic level, this is most often achieved through the use of a motorized stage that facilitates systematic random stepping across the structure of interest. Here, we report a simple, precise and cost-effective software-based alternative for accomplishing systematic random sampling under the microscope. We believe that this approach will facilitate the use of stereological designs that employ systematic random sampling in laboratories that lack the resources to acquire costly, fully automated systems.
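
    Systematic random sampling itself is compact: one random start within the first interval, then equidistant steps across the material. A minimal sketch:

    ```python
    import random

    def systematic_sample(items, interval, seed=None):
        """Random start in [0, interval), then every interval-th item."""
        rng = random.Random(seed)
        start = rng.randrange(interval)
        return items[start::interval]

    sections = list(range(1, 121))              # e.g. 120 serial sections
    print(systematic_sample(sections, interval=10, seed=5))
    ```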

  17. Accounting for Sampling Error in Genetic Eigenvalues Using Random Matrix Theory.

    Science.gov (United States)

    Sztepanacz, Jacqueline L; Blows, Mark W

    2017-07-01

    The distribution of genetic variance in multivariate phenotypes is characterized by the empirical spectral distribution of the eigenvalues of the genetic covariance matrix. Empirical estimates of genetic eigenvalues from random effects linear models are known to be overdispersed by sampling error, where large eigenvalues are biased upward, and small eigenvalues are biased downward. The overdispersion of the leading eigenvalues of sample covariance matrices has been demonstrated to conform to the Tracy-Widom (TW) distribution. Here we show that genetic eigenvalues estimated using restricted maximum likelihood (REML) in a multivariate random effects model with an unconstrained genetic covariance structure will also conform to the TW distribution after empirical scaling and centering. However, where estimation procedures using either REML or MCMC impose boundary constraints, the resulting genetic eigenvalues tend not to be TW distributed. We show how using confidence intervals from sampling distributions of genetic eigenvalues without reference to the TW distribution is insufficient protection against mistaking sampling error for genetic variance, particularly when eigenvalues are small. By scaling such sampling distributions to the appropriate TW distribution, the critical value of the TW statistic can be used to determine whether the magnitude of a genetic eigenvalue exceeds the sampling error for each eigenvalue in the spectral distribution of a given genetic covariance matrix. Copyright © 2017 by the Genetics Society of America.

  18. PREDOMINANTLY LOW METALLICITIES MEASURED IN A STRATIFIED SAMPLE OF LYMAN LIMIT SYSTEMS AT Z = 3.7

    Energy Technology Data Exchange (ETDEWEB)

    Glidden, Ana; Cooper, Thomas J.; Simcoe, Robert A. [Massachusetts Institute of Technology, 77 Massachusetts Ave, Cambridge, MA 02139 (United States); Cooksey, Kathy L. [Department of Physics and Astronomy, University of Hawai‘i at Hilo, 200 West Kāwili Street, Hilo, HI 96720 (United States); O’Meara, John M., E-mail: aglidden@mit.edu, E-mail: tjcooper@mit.edu, E-mail: simcoe@space.mit.edu, E-mail: kcooksey@hawaii.edu, E-mail: jomeara@smcvt.edu [Department of Physics, Saint Michael’s College, One Winooski Park, Colchester, VT 05439 (United States)

    2016-12-20

    We measured metallicities for 33 z = 3.4–4.2 absorption line systems drawn from a sample of H I-selected Lyman limit systems (LLSs) identified in Sloan Digital Sky Survey (SDSS) quasar spectra and stratified based on metal line features. We obtained higher-resolution spectra with the Keck Echellette Spectrograph and Imager, selecting targets according to our stratification scheme in an effort to fully sample the LLS population metallicity distribution. We established a plausible range of H I column densities and measured column densities (or limits) for ions of carbon, silicon, and aluminum, finding ionization-corrected metallicities or upper limits. Interestingly, our ionization models were better constrained with enhanced α-to-aluminum abundances, with a median abundance ratio of [α/Al] = 0.3. Measured metallicities were generally low, ranging from [M/H] = −3 to −1.68, with even lower metallicities likely for some systems with upper limits. Using survival statistics to incorporate limits, we constructed the cumulative distribution function (CDF) for LLS metallicities. Recent models of galaxy evolution propose that galaxies replenish their gas from the low-metallicity intergalactic medium (IGM) via high-density H I “flows” and eject enriched interstellar gas via outflows. Thus, there has been some expectation that LLSs at the peak of cosmic star formation (z ≈ 3) might have a bimodal metallicity distribution. We modeled our CDF as a mix of two Gaussian distributions, one reflecting the metallicity of the IGM and the other representative of the interstellar medium of star-forming galaxies. This bimodal distribution yielded a poor fit. A single Gaussian distribution better represented the sample with a low mean metallicity of [M/H] ≈ −2.5.

  19. Ibuprofen Versus Fennel for the Relief of Postpartum Pain: A Randomized Controlled Trial

    Directory of Open Access Journals (Sweden)

    Parvin Asti

    2011-06-01

    Full Text Available Objective: The present study aimed to compare the value of ibuprofen and fennel for postpartum pain relief in women with normal vaginal delivery. Materials and methods: In this randomized clinical trial we studied 90 women referred to the obstetrics ward for Normal Vaginal Delivery (NVD) in Assali hospital in Khoramabad. Women were randomly allocated to receive either oral ibuprofen or oral fennel by a stratified random sampling technique. All women were asked to rate their pain on a visual analogue scale before and at 1, 2, 3 and 4 hours after treatment. Results: The difference between the fennel and ibuprofen groups in pain severity before treatment was not significant (P=0.22), nor was the difference in mean pain severity one hour after treatment (P=0.57). However, mean pain severity differed significantly between the two groups at two (p<0.023), three (p<0.001) and four (p<0.001) hours after treatment. Conclusion: Both ibuprofen and fennel relieved postpartum pain without notable side effects, but ibuprofen was generally more effective. More studies are needed to confirm the efficacy of fennel for pain relief, especially in postpartum women, compared with a no-treatment control group.

  20. Application of the Sampling Selection Technique in Approaching Financial Audit

    Directory of Open Access Journals (Sweden)

    Victor Munteanu

    2018-03-01

    Full Text Available In his professional approach, the financial auditor has a wide range of working techniques at his disposal, including selection techniques. They are applied depending on the nature of the information available to the financial auditor, the manner in which it is presented - paper or electronic format - and, last but not least, the time available. Several techniques are applied, successively or in parallel, to increase confidence in the opinion expressed and to provide the audit report with a solid basis of information. Sampling is used in the phase of control or clarification of the identified error. The main purpose is to corroborate or measure the degree of risk detected following a pertinent analysis. Since the auditor has neither the time nor the means to rebuild the information thoroughly, the sampling technique can provide an effective response to the need for valorization.

  1. A random sampling approach for robust estimation of tissue-to-plasma ratio from extremely sparse data.

    Science.gov (United States)

    Chu, Hui-May; Ette, Ene I

    2005-09-02

    This study was performed to develop a new nonparametric approach for the estimation of robust tissue-to-plasma ratio from extremely sparsely sampled paired data (ie, one sample each from plasma and tissue per subject). Tissue-to-plasma ratio was estimated from paired/unpaired experimental data using the independent time points approach, area under the curve (AUC) values calculated with the naive data averaging approach, and AUC values calculated using sampling-based approaches (eg, the pseudoprofile-based bootstrap [PpbB] approach and the random sampling approach [our proposed approach]). The random sampling approach involves the use of a 2-phase algorithm. The convergence of the sampling/resampling approaches was investigated, as well as the robustness of the estimates produced by the different approaches. To evaluate the latter, new data sets were generated by introducing outlier(s) into the real data set. One to two concentration values were inflated by 10% to 40% from their original values to produce the outliers. Tissue-to-plasma ratios computed using the independent time points approach varied between 0 and 50 across time points. The ratio obtained from AUC values acquired using the naive data averaging approach was not associated with any measure of uncertainty or variability. Calculating the ratio without regard to pairing yielded poorer estimates. The random sampling and pseudoprofile-based bootstrap approaches yielded tissue-to-plasma ratios with uncertainty and variability. However, the random sampling approach, because of the 2-phase nature of its algorithm, yielded more robust estimates and required fewer replications. Therefore, a 2-phase random sampling approach is proposed for the robust estimation of tissue-to-plasma ratio from extremely sparsely sampled data.
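
    To make the resampling idea concrete, the sketch below bootstraps a tissue-to-plasma AUC ratio from one-sample-per-subject data by assembling pseudoprofiles, drawing one concentration at random per time point. This illustrates the general pseudoprofile idea only; the paper's 2-phase random sampling algorithm is not detailed in the abstract, and all names here are hypothetical:

```python
import numpy as np

def auc_trapz(t, c):
    """Trapezoidal AUC."""
    t, c = np.asarray(t, float), np.asarray(c, float)
    return np.sum(0.5 * (c[1:] + c[:-1]) * np.diff(t))

def pseudoprofile_ratio(times, plasma, tissue, n_boot=1000, seed=None):
    """Bootstrap replicates of AUC(tissue) / AUC(plasma).

    times  : sorted sampling times
    plasma : dict mapping each time to the observed plasma concentrations
    tissue : dict mapping each time to the observed tissue concentrations
    """
    rng = np.random.default_rng(seed)
    ratios = np.empty(n_boot)
    for b in range(n_boot):
        cp = [rng.choice(plasma[t]) for t in times]  # one draw per time point
        ct = [rng.choice(tissue[t]) for t in times]
        ratios[b] = auc_trapz(times, ct) / auc_trapz(times, cp)
    return ratios            # summarize with, e.g., median and percentiles
```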

  2. SCRAED - Simple and Complex Random Assignment in Experimental Designs

    OpenAIRE

    Alferes, Valentim R.

    2009-01-01

    SCRAED is a package of 37 self-contained SPSS syntax files that performs simple and complex random assignment in experimental designs. For between-subjects designs, SCRAED includes simple random assignment (no restrictions, forced equal sizes, forced unequal sizes, and unequal probabilities), block random assignment (simple and generalized blocks), and stratified random assignment (no restrictions, forced equal sizes, forced unequal sizes, and unequal probabilities). For within-subject...

  3. Should Torsion Balance Technique Continue to be Taught to Pharmacy Students?

    Science.gov (United States)

    Bilger, Rhonda; Chereson, Rasma; Salama, Noha Nabil

    2017-06-01

    Objective. To determine the types of balances used in compounding pharmacies: torsion or digital. Methods. A survey was mailed to the pharmacist-in-charge at 698 pharmacies, representing 47% of the pharmacies in Missouri as of July 2013. The pharmacies were randomly selected and stratified into eight regions to ensure a representative sample. Information was gathered regarding the type and use of balances and pharmacists' perspectives on the need to teach torsion balance technique to pharmacy students. Results. The response rate for the survey was 53.3%. Of the total responses received, 46.8% of pharmacies had a torsion balance, 27.4% a digital balance, and 11.8% both. About 68.3% of respondents compound prescriptions. The study showed that 52% of compounding pharmacies use torsion balances in their practice. Of those with a balance in their pharmacy, 65.6% favored continuation of torsion balance instruction. Conclusions. Digital balances have become increasingly popular and have replaced torsion balances in some pharmacies, especially those that compound a significant number of prescriptions. The results of this study indicate that torsion balances remain integral to compounding practice. Therefore, students should continue being taught torsion balance technique at the college.

  4. The stratified H-index makes scientific impact transparent

    DEFF Research Database (Denmark)

    Würtz, Morten; Schmidt, Morten

    2017-01-01

    The H-index is widely used to quantify and standardize researchers' scientific impact. However, the H-index does not account for the fact that co-authors rarely contribute equally to a paper. Accordingly, we propose the use of a stratified H-index to measure scientific impact. The stratified H-index supplements the conventional H-index with three separate H-indices: one for first authorships, one for second authorships and one for last authorships. The stratified H-index takes scientific output, quality and individual author contribution into account.
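
    A minimal sketch of the proposal, assuming only that the stratified index is the classical H-index computed separately over first-, second- and last-author papers (the paper defines no code; names are illustrative):

```python
def h_index(citations):
    """Classical H-index: largest h such that h papers have >= h citations."""
    cites = sorted(citations, reverse=True)
    return sum(1 for rank, c in enumerate(cites, start=1) if c >= rank)

def stratified_h_index(papers):
    """papers: list of (citations, author_position) tuples, where
    author_position is 'first', 'second', 'last' or 'other'.
    Returns the conventional H-index plus one H-index per stratum."""
    overall = h_index([c for c, _ in papers])
    strata = {pos: h_index([c for c, p in papers if p == pos])
              for pos in ("first", "second", "last")}
    return overall, strata

print(stratified_h_index([(10, "first"), (7, "last"), (3, "first"),
                          (2, "second"), (1, "other")]))
```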

  5. Conflict Resolution Strategies in Non-Government Secondary Schools in Benue State, Nigeria

    Science.gov (United States)

    Oboegbulem, Angie; Alfa, Idoko Alphonusu

    2013-01-01

    This study investigated perceived CRSs (conflict resolution strategies) for the resolution of conflicts in non-government secondary schools in Benue State, Nigeria. Three research questions and three hypotheses guided this study. Proportionate stratified random sampling technique was used in drawing 15% of the population which gave a total of 500…

  6. Analysis of the Implementation of Child Rights Law in Nigeria | Udoh ...

    African Journals Online (AJOL)

    The aim of the study was to analyse the implementation of Child Rights Law in Nigeria so far. To accomplish this: three research questions and three hypotheses were formulated to guide the investigation. Descriptive survey research was employed carrying out the study. Stratified random sampling technique was used to ...

  7. Floristic Composition and Vegetation Structure of The KNUST ...

    African Journals Online (AJOL)

    The diversity, relative importance, canopy height and cover of plant species in the Kwame Nkrumah University of Science and Technology (KNUST) Botanic Garden were evaluated in five 1-ha plots using a stratified random sampling technique in order to build an understanding of its floristic composition and structure in two ...

  8. Kindergarten Teachers' Experience with Reporting Child Abuse in Taiwan

    Science.gov (United States)

    Feng, Jui-Ying; Huang, Tzu-Yi; Wang, Chi-Jen

    2010-01-01

    Objective: The objectives were to examine factors associated with reporting child abuse among kindergarten teachers in Taiwan based on the Theory of Planned Behavior (TPB). Method: A stratified quota sampling technique was used to randomly select kindergarten teachers in Taiwan. The Child Abuse Intention Report Scale, which includes demographics,…

  9. Nuclear analytical techniques and their application to environmental samples

    International Nuclear Information System (INIS)

    Lieser, K.H.

    1986-01-01

    A survey is given on nuclear analytical techniques and their application to environmental samples. Measurement of the inherent radioactivity of elements or radionuclides allows determination of natural radioelements (e.g. Ra), man-made radioelements (e.g. Pu) and radionuclides in the environment. Activation analysis, in particular instrumental neutron activation analysis, is a very reliable and sensitive method for determination of a great number of trace elements in environmental samples, because the most abundant main constituents are not activated. Tracer techniques are very useful for studies of the behaviour and of chemical reactions of trace elements and compounds in the environment. Radioactive sources are mainly applied for excitation of characteristic X-rays (X-ray fluorescence analysis). (author)

  10. Estimation of the Coefficient of Restitution of Rocking Systems by the Random Decrement Technique

    DEFF Research Database (Denmark)

    Brincker, Rune; Demosthenous, Milton; Manos, George C.

    1994-01-01

    The aim of this paper is to investigate the possibility of estimating an average damping parameter for a rocking system due to impact, the so-called coefficient of restitution, from the random response, i.e. when the loads are random and unknown, and the response is measured. The objective is to obtain an estimate of the free rocking response from the measured random response using the Random Decrement (RDD) Technique, and then estimate the coefficient of restitution from this free response estimate. In the paper this approach is investigated by simulating the response of a single degree
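
    The essence of the Random Decrement technique is that averaging many response segments which all start from the same triggering condition cancels the random part and leaves an estimate of the free decay. A minimal level-crossing sketch; the trigger level and segment length are user choices here, not values from the paper:

```python
import numpy as np

def random_decrement(y, trigger_level, seg_len):
    """Level-crossing Random Decrement signature of a measured response y.

    Collects a segment of length seg_len starting at each upward crossing
    of trigger_level and averages them; the average approximates the free
    decay (RDD signature) when the excitation is zero-mean random noise.
    """
    idx = np.where((y[:-1] < trigger_level) & (y[1:] >= trigger_level))[0] + 1
    idx = idx[idx + seg_len <= len(y)]          # keep complete segments only
    if idx.size == 0:
        raise ValueError("no trigger points found")
    return np.mean([y[i:i + seg_len] for i in idx], axis=0)
```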

  11. Random Sampling of Correlated Parameters – a Consistent Solution for Unfavourable Conditions

    Energy Technology Data Exchange (ETDEWEB)

    Žerovnik, G., E-mail: gasper.zerovnik@ijs.si [Jožef Stefan Institute, Jamova cesta 39, SI-1000 Ljubljana (Slovenia); Trkov, A. [Jožef Stefan Institute, Jamova cesta 39, SI-1000 Ljubljana (Slovenia); International Atomic Energy Agency, PO Box 100, A-1400 Vienna (Austria); Kodeli, I.A. [Jožef Stefan Institute, Jamova cesta 39, SI-1000 Ljubljana (Slovenia); Capote, R. [International Atomic Energy Agency, PO Box 100, A-1400 Vienna (Austria); Smith, D.L. [Argonne National Laboratory, 1710 Avenida del Mundo, Coronado, CA 92118-3073 (United States)

    2015-01-15

    Two methods for random sampling according to a multivariate lognormal distribution – the correlated sampling method and the method of transformation of correlation coefficients – are briefly presented. The methods are mathematically exact and enable consistent sampling of correlated inherently positive parameters with given information on the first two distribution moments. Furthermore, a weighted sampling method to accelerate the convergence of parameters with extremely large relative uncertainties is described. However, the method is efficient only for a limited number of correlated parameters.
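
    The "transformation of correlation coefficients" mentioned in the abstract is closely related to the standard moment transformation between a lognormal vector and its underlying normal. A sketch under that assumption; it matches the textbook transformation and is not guaranteed to reproduce every detail of the paper's two methods:

```python
import numpy as np

def sample_correlated_lognormal(mean, cov, size, seed=None):
    """Draw inherently positive samples whose first two moments match a
    given mean vector and covariance matrix (lognormal marginals).

    Moment transformation to the underlying normal:
        S_ij = ln(1 + C_ij / (m_i m_j)),   mu_i = ln m_i - S_ii / 2
    Requires 1 + C_ij / (m_i m_j) > 0 for all pairs.
    """
    rng = np.random.default_rng(seed)
    mean = np.asarray(mean, float)
    S = np.log1p(cov / np.outer(mean, mean))  # underlying normal covariance
    mu = np.log(mean) - 0.5 * np.diag(S)      # underlying normal mean
    return np.exp(rng.multivariate_normal(mu, S, size=size))

x = sample_correlated_lognormal([2.0, 5.0],
                                [[0.4, 0.3], [0.3, 1.0]], size=100_000)
print(x.mean(axis=0), np.cov(x.T))            # close to the requested moments
```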

  12. The Bootstrap, the Jackknife, and the Randomization Test: A Sampling Taxonomy.

    Science.gov (United States)

    Rodgers, J L

    1999-10-01

    A simple sampling taxonomy is defined that shows the differences between and relationships among the bootstrap, the jackknife, and the randomization test. Each method has as its goal the creation of an empirical sampling distribution that can be used to test statistical hypotheses, estimate standard errors, and/or create confidence intervals. Distinctions between the methods can be made based on the sampling approach (with replacement versus without replacement) and the sample size (replacing the whole original sample versus replacing a subset of the original sample). The taxonomy is useful for teaching the goals and purposes of resampling schemes. An extension of the taxonomy implies other possible resampling approaches that have not previously been considered. Univariate and multivariate examples are presented.
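
    The taxonomy's two axes - with versus without replacement, whole sample versus subset - can be seen side by side in a few lines (illustrative data only):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=30)

# Bootstrap: resample n values WITH replacement, replacing the whole sample.
boot_means = [np.mean(rng.choice(x, size=x.size, replace=True))
              for _ in range(2000)]

# Jackknife: resample WITHOUT replacement, replacing a subset (leave one out).
jack_means = [np.mean(np.delete(x, i)) for i in range(x.size)]

# Randomization test: permute group labels (without replacement, whole sample)
# to build a null distribution for a two-group mean difference.
y = rng.normal(loc=0.5, size=30)
pooled = np.concatenate([x, y])
obs = np.mean(y) - np.mean(x)
perm = [np.mean(p[30:]) - np.mean(p[:30])
        for p in (rng.permutation(pooled) for _ in range(2000))]
p_value = np.mean(np.abs(perm) >= abs(obs))
print(np.std(boot_means), np.std(jack_means), p_value)
```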

  13. Survey research with a random digit dial national mobile phone sample in Ghana: Methods and sample quality

    Science.gov (United States)

    Sefa, Eunice; Adimazoya, Edward Akolgo; Yartey, Emmanuel; Lenzi, Rachel; Tarpo, Cindy; Heward-Mills, Nii Lante; Lew, Katherine; Ampeh, Yvonne

    2018-01-01

    Introduction Generating a nationally representative sample in low and middle income countries typically requires resource-intensive household level sampling with door-to-door data collection. High mobile phone penetration rates in developing countries provide new opportunities for alternative sampling and data collection methods, but there is limited information about response rates and sample biases in coverage and nonresponse using these methods. We utilized data from an interactive voice response, random-digit dial, national mobile phone survey in Ghana to calculate standardized response rates and assess representativeness of the obtained sample. Materials and methods The survey methodology was piloted in two rounds of data collection. The final survey included 18 demographic, media exposure, and health behavior questions. Call outcomes and response rates were calculated according to the American Association of Public Opinion Research guidelines. Sample characteristics, productivity, and costs per interview were calculated. Representativeness was assessed by comparing data to the Ghana Demographic and Health Survey and the National Population and Housing Census. Results The survey was fielded during a 27-day period in February-March 2017. There were 9,469 completed interviews and 3,547 partial interviews. Response, cooperation, refusal, and contact rates were 31%, 81%, 7%, and 39% respectively. Twenty-three calls were dialed to produce an eligible contact: nonresponse was substantial due to the automated calling system and dialing of many unassigned or non-working numbers. Younger, urban, better educated, and male respondents were overrepresented in the sample. Conclusions The innovative mobile phone data collection methodology yielded a large sample in a relatively short period. Response rates were comparable to other surveys, although substantial coverage bias resulted from fewer women, rural, and older residents completing the mobile phone survey in

  14. Survey research with a random digit dial national mobile phone sample in Ghana: Methods and sample quality.

    Science.gov (United States)

    L'Engle, Kelly; Sefa, Eunice; Adimazoya, Edward Akolgo; Yartey, Emmanuel; Lenzi, Rachel; Tarpo, Cindy; Heward-Mills, Nii Lante; Lew, Katherine; Ampeh, Yvonne

    2018-01-01

    Generating a nationally representative sample in low and middle income countries typically requires resource-intensive household level sampling with door-to-door data collection. High mobile phone penetration rates in developing countries provide new opportunities for alternative sampling and data collection methods, but there is limited information about response rates and sample biases in coverage and nonresponse using these methods. We utilized data from an interactive voice response, random-digit dial, national mobile phone survey in Ghana to calculate standardized response rates and assess representativeness of the obtained sample. The survey methodology was piloted in two rounds of data collection. The final survey included 18 demographic, media exposure, and health behavior questions. Call outcomes and response rates were calculated according to the American Association of Public Opinion Research guidelines. Sample characteristics, productivity, and costs per interview were calculated. Representativeness was assessed by comparing data to the Ghana Demographic and Health Survey and the National Population and Housing Census. The survey was fielded during a 27-day period in February-March 2017. There were 9,469 completed interviews and 3,547 partial interviews. Response, cooperation, refusal, and contact rates were 31%, 81%, 7%, and 39% respectively. Twenty-three calls were dialed to produce an eligible contact: nonresponse was substantial due to the automated calling system and dialing of many unassigned or non-working numbers. Younger, urban, better educated, and male respondents were overrepresented in the sample. The innovative mobile phone data collection methodology yielded a large sample in a relatively short period. Response rates were comparable to other surveys, although substantial coverage bias resulted from fewer women, rural, and older residents completing the mobile phone survey in comparison to household surveys. Random digit dialing of mobile

  15. Survey research with a random digit dial national mobile phone sample in Ghana: Methods and sample quality.

    Directory of Open Access Journals (Sweden)

    Kelly L'Engle

    Full Text Available Generating a nationally representative sample in low and middle income countries typically requires resource-intensive household level sampling with door-to-door data collection. High mobile phone penetration rates in developing countries provide new opportunities for alternative sampling and data collection methods, but there is limited information about response rates and sample biases in coverage and nonresponse using these methods. We utilized data from an interactive voice response, random-digit dial, national mobile phone survey in Ghana to calculate standardized response rates and assess representativeness of the obtained sample. The survey methodology was piloted in two rounds of data collection. The final survey included 18 demographic, media exposure, and health behavior questions. Call outcomes and response rates were calculated according to the American Association of Public Opinion Research guidelines. Sample characteristics, productivity, and costs per interview were calculated. Representativeness was assessed by comparing data to the Ghana Demographic and Health Survey and the National Population and Housing Census. The survey was fielded during a 27-day period in February-March 2017. There were 9,469 completed interviews and 3,547 partial interviews. Response, cooperation, refusal, and contact rates were 31%, 81%, 7%, and 39% respectively. Twenty-three calls were dialed to produce an eligible contact: nonresponse was substantial due to the automated calling system and dialing of many unassigned or non-working numbers. Younger, urban, better educated, and male respondents were overrepresented in the sample. The innovative mobile phone data collection methodology yielded a large sample in a relatively short period. Response rates were comparable to other surveys, although substantial coverage bias resulted from fewer women, rural, and older residents completing the mobile phone survey in comparison to household surveys. Random digit

  16. An experimental comparison of triggered and random pulse train uncertainties

    International Nuclear Information System (INIS)

    Henzlova, Daniela; Menlove, Howard O.; Swinhoe, Martyn T.

    2010-01-01

    In this paper we present an experimental comparison of signal-triggered and randomly triggered analysis algorithms for neutron multiplicity data. Traditional shift-register-type signal-triggered multiplicity analysis of singles, doubles and triples rates is compared with analysis using randomly triggered gates. Two methods of random gate generation are explored - non-overlapping gates (Feynman approach) and periodic overlapping gates (fast accidentals). Using californium sources with low, medium and high rates in combination with AmLi sources (as a surrogate for plutonium), we investigate the relative standard deviation (RSD) of the data in order to determine if there are parameter spaces in which one of the measurement methods should be preferred. Neutron correlation analysis is a commonly used NDA technique to assay plutonium mass. The data can be collected in two distinct ways: using signal-triggered or randomly triggered counting gates. Analysis algorithms were developed for both approaches to determine singles (S), doubles (D) and triples (T) rates from the measured sample. Currently the most commonly implemented technique to collect neutron coincidence data utilizes shift register based electronics. The shift register uses signal-triggered counting gates to generate a foreground multiplicity distribution of correlated+accidental events and a random gate (opened after a predefined long delay following the signal trigger) to generate a background multiplicity distribution of accidental events. Modern shift registers include a fast accidental option to sample data with a fixed clock frequency. This way a set of overlapping gates is used to generate background multiplicity distributions in order to improve the measurement precision. In parallel to the shift register approach, the Feynman variance technique is frequently used, which utilizes a set of consecutive non-overlapping gates. In general, different user communities (e.g. safeguards, nuclear material accountancy, emergency
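
    As a concrete example of the non-overlapping random-gate (Feynman) side of the comparison, the variance-to-mean statistic of gate counts can be computed as below; Y is near zero for an uncorrelated (Poisson) pulse train and rises above zero when correlated fission chains are present. A minimal sketch, not the authors' analysis code:

```python
import numpy as np

def feynman_y(event_times, gate_width):
    """Variance-to-mean statistic from consecutive non-overlapping gates.

    Y = var/mean - 1 vanishes for a Poisson (uncorrelated) pulse train.
    """
    t = np.sort(np.asarray(event_times, float))
    edges = np.arange(0.0, t[-1] + gate_width, gate_width)
    counts, _ = np.histogram(t, bins=edges)
    return counts.var(ddof=1) / counts.mean() - 1.0

# Synthetic Poisson train (mean rate 10 kHz) -> Y should be ~0.
rng = np.random.default_rng(0)
poisson_train = np.cumsum(rng.exponential(1e-4, size=100_000))
print(feynman_y(poisson_train, gate_width=1e-3))
```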

  17. Micro and Nano Techniques for the Handling of Biological Samples

    DEFF Research Database (Denmark)

    Micro and Nano Techniques for the Handling of Biological Samples reviews the different techniques available to manipulate and integrate biological materials in a controlled manner, either by sliding them along a surface (2-D manipulation), or by gripping and moving them to a new position (3-D...

  18. Differences in Mathematics Teachers' Perceived Preparedness to Demonstrate Competence in Secondary School Mathematics Content by Teacher Characteristics

    Science.gov (United States)

    Ng'eno, J. K.; Chesimet, M. C.

    2016-01-01

    A sample of 300 mathematics teachers drawn from a population of 1500 participated in this study. The participants were selected using systematic random sampling and stratified random sampling (stratified by qualification and gender). The data was collected using self-report questionnaires for mathematics teachers. One tool was used to collect…

  19. Application of radial basis function in densitometry of stratified regime of liquid-gas two phase flows

    International Nuclear Information System (INIS)

    Roshani, G.H.; Nazemi, E.; Roshani, M.M.

    2017-01-01

    In this paper, a novel method is proposed for predicting the density of the liquid phase in the stratified regime of liquid-gas two phase flows by utilizing the dual modality densitometry technique and an artificial neural network (ANN) model of the radial basis function (RBF) type. The detection system includes a ¹³⁷Cs radioactive source and two NaI(Tl) detectors for registering transmitted and scattered photons. As a first step, a Monte Carlo simulation model was utilized to obtain the optimum position for the scattering detector in the dual modality densitometry configuration. Next, an experimental setup was designed based on the optimum detector position obtained from simulation in order to generate the required data for training and testing the ANN. The results show that the proposed approach could be successfully applied for predicting the density of the liquid phase in the stratified regime of gas-liquid two phase flows with a mean relative error (MRE) of less than 0.701. - Highlights: • Density of liquid phase in stratified regime of two phase flows was predicted. • Combination of dual modality densitometry technique and ANN was utilized. • Detection system includes a ¹³⁷Cs radioactive source and two NaI(Tl) detectors. • MCNP simulation was done to obtain the optimum position for the scattering detector. • An experimental setup was designed to generate the required data for training the ANN.
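
    The prediction step - mapping the two registered count rates (transmitted, scattered) to a liquid density through a radial-basis-function model - can be imitated with SciPy's RBF interpolator on synthetic calibration data. This is a stand-in for the paper's trained RBF neural network; the calibration numbers below are invented for illustration:

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(0)
# Hypothetical calibration runs: count rates measured at known densities.
density = np.linspace(0.60, 1.00, 25)                         # g/cm^3
transmitted = np.exp(-3.0 * density) * (1 + rng.normal(0, 0.002, 25))
scattered = 0.5 * density * (1 + rng.normal(0, 0.002, 25))
X = np.column_stack([transmitted, scattered])

model = RBFInterpolator(X, density)     # radial basis function regression
print(model(X[:3]), density[:3])        # predictions vs. known densities
```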

  20. Women in University Management: The Nigerian Experience

    Science.gov (United States)

    Abiodun-Oyebanji, Olayemi; Olaleye, F.

    2011-01-01

    This study examined women in university management in Nigeria. It was a descriptive research of the survey type. The population of the study comprised all the public universities in southwest Nigeria, out of which three were selected through the stratified random sampling technique. Three hundred respondents who were in management positions were…

  1. Methodological integrative review of the work sampling technique used in nursing workload research.

    Science.gov (United States)

    Blay, Nicole; Duffield, Christine M; Gallagher, Robyn; Roche, Michael

    2014-11-01

    To critically review the work sampling technique used in nursing workload research. Work sampling is a technique frequently used by researchers and managers to explore and measure nursing activities. However, the work sampling methods used are diverse, making comparison of results between studies difficult. Methodological integrative review. Four electronic databases were systematically searched for peer-reviewed articles published between 2002 and 2012. Manual scanning of reference lists and Rich Site Summary feeds from contemporary nursing journals were other sources of data. Articles published in the English language between 2002 and 2012 reporting on research which used work sampling to examine nursing workload. Eighteen articles were reviewed. The review identified that the work sampling technique lacks a standardized approach, which may have an impact on the sharing or comparison of results. Specific areas needing a shared understanding included the training of observers and subjects who self-report, standardization of the techniques used to assess observer inter-rater reliability, sampling methods and reporting of outcomes. Work sampling is a technique that can be used to explore the many facets of nursing work. Standardized reporting measures would enable greater comparison between studies and contribute to knowledge more effectively. Author suggestions for the reporting of results may act as guidelines for researchers considering work sampling as a research method. © 2014 John Wiley & Sons Ltd.

  2. Designing Wood Supply Scenarios from Forest Inventories with Stratified Predictions

    Directory of Open Access Journals (Sweden)

    Philipp Kilham

    2018-02-01

    Full Text Available Forest growth and wood supply projections are increasingly used to estimate the future availability of woody biomass and the correlated effects on forests and climate. This research parameterizes an inventory-based business-as-usual wood supply scenario with a stratified prediction, focusing on southwest Germany and the period 2002–2012. First, the Classification and Regression Trees (CART) algorithm groups the inventory plots into strata with corresponding harvest probabilities. Second, Random Forest algorithms generate individual harvest probabilities for the plots of each stratum. Third, the plots with the highest individual probabilities are selected as harvested until the harvest probability of the stratum is fulfilled. Fourth, the harvested volume of these plots is predicted with a linear regression model trained on harvested plots only. To illustrate the pros and cons of this method, it is compared to a direct harvested-volume prediction with linear regression, and to a combination of logistic regression and linear regression. Direct harvested-volume regression predicts comparable volume figures, but generates these volumes in a way that differs from business-as-usual. The logistic model achieves higher overall classification accuracies, but results in underestimation or overestimation of harvest shares for several subsets of the data. The stratified prediction method balances this shortcoming, and can be of general use for forest growth and timber supply projections from large-scale forest inventories.
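
    The four-step procedure can be sketched with scikit-learn estimators standing in for the paper's CART, Random Forest and linear-regression components; all inputs (plot features X, 0/1 harvest flags, per-plot harvested volumes) and hyperparameters are hypothetical placeholders:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LinearRegression

def stratified_harvest_prediction(X, harvested, volume, X_new):
    """X: plot features; harvested: 0/1 flags; volume: harvested volume
    per plot (0 where not harvested); X_new: plots to project."""
    # 1. CART groups plots into strata with observed harvest shares.
    cart = DecisionTreeClassifier(max_leaf_nodes=8).fit(X, harvested)
    strata = cart.apply(X_new)
    share = {s: harvested[cart.apply(X) == s].mean() for s in np.unique(strata)}
    # 2. Random Forest gives each plot an individual harvest probability.
    rf = RandomForestClassifier(n_estimators=200).fit(X, harvested)
    prob = rf.predict_proba(X_new)[:, 1]
    # 3. Flag the highest-probability plots per stratum as harvested
    #    until the stratum's harvest share is met.
    flag = np.zeros(len(X_new), bool)
    for s in np.unique(strata):
        idx = np.where(strata == s)[0]
        k = int(round(share[s] * idx.size))
        flag[idx[np.argsort(prob[idx])[::-1][:k]]] = True
    # 4. Linear regression trained on harvested plots predicts volume.
    reg = LinearRegression().fit(X[harvested == 1], volume[harvested == 1])
    return flag, np.where(flag, reg.predict(X_new), 0.0)
```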

  3. Systematic comparison of static and dynamic headspace sampling techniques for gas chromatography.

    Science.gov (United States)

    Kremser, Andreas; Jochmann, Maik A; Schmidt, Torsten C

    2016-09-01

    Six automated, headspace-based sample preparation techniques were used to extract volatile analytes from water with the goal of establishing a systematic comparison between commonly available instrumental alternatives. To that end, these six techniques were used in conjunction with the same gas chromatography instrument for analysis of a common set of volatile organic carbon (VOC) analytes. The methods were thereby divided into three classes: static sampling (by syringe or loop), static enrichment (SPME and PAL SPME Arrow), and dynamic enrichment (ITEX and trap sampling). For PAL SPME Arrow, different sorption phase materials were also included in the evaluation. To enable an effective comparison, method detection limits (MDLs), relative standard deviations (RSDs), and extraction yields were determined and are discussed for all techniques. While static sampling techniques exhibited sufficient extraction yields (approx. 10-20%) to be reliably used down to approx. 100 ng L⁻¹, enrichment techniques displayed extraction yields of up to 80%, resulting in MDLs down to the picogram per liter range. RSDs for all techniques were below 27%. The choice of one of the different instrumental modes of operation (aforementioned classes) was thereby the most influential parameter in terms of extraction yields and MDLs. Individual methods inside each class showed smaller deviations, and the least influence was observed when evaluating different sorption phase materials for the individual enrichment techniques. The option of selecting specialized sorption phase materials may, however, be more important when analyzing analytes with different properties such as high polarity or the capability of specific molecular interactions. Graphical Abstract PAL SPME Arrow during the extraction of volatile analytes from the headspace of an aqueous sample.

  4. Theory of sampling and its application in tissue based diagnosis

    Directory of Open Access Journals (Sweden)

    Kayser Gian

    2009-02-01

    Full Text Available Abstract Background A general theory of sampling and its application in tissue based diagnosis is presented. Sampling is defined as extraction of information from certain limited spaces and its transformation into a statement or measure that is valid for the entire (reference) space. The procedure should be reproducible in time and space, i.e. give the same results when applied under similar circumstances. Sampling includes two different aspects, the procedure of sample selection and the efficiency of its performance. The practical performance of sample selection focuses on the search for localization of specific compartments within the basic space, and the search for presence of specific compartments. Methods When a sampling procedure is applied in diagnostic processes, two different procedures can be distinguished: (I) the evaluation of the diagnostic significance of a certain object, which is the probability that the object can be grouped into a certain diagnosis, and (II) the probability to detect these basic units. Sampling can be performed without or with external knowledge, such as size of searched objects, neighbourhood conditions, spatial distribution of objects, etc. If the sample size is much larger than the object size, the application of a translation invariant transformation results in Krige's formula, which is widely used in search for ores. Usually, sampling is performed in a series of area (space) selections of identical size. The size can be defined in relation to the reference space or according to interspatial relationship. The first method is called random sampling, the second stratified sampling. Results Random sampling does not require knowledge about the reference space, and is used to estimate the number and size of objects. Estimated features include area (volume) fraction, numerical, boundary and surface densities. Stratified sampling requires the knowledge of objects (and their features) and evaluates spatial features in relation to

  5. RF Sub-sampling Receiver Architecture based on Milieu Adapting Techniques

    DEFF Research Database (Denmark)

    Behjou, Nastaran; Larsen, Torben; Jensen, Ole Kiel

    2012-01-01

    A novel sub-sampling based architecture is proposed which has the ability to reduce the problem of image distortion and to improve the signal-to-noise ratio significantly. The technique is based on sensing the environment and adapting the sampling rate of the receiver to the best possible

  6. Adaptive importance sampling for probabilistic validation of advanced driver assistance systems

    NARCIS (Netherlands)

    Gietelink, O.J.; Schutter, B. de; Verhaegen, M.

    2006-01-01

    We present an approach for validation of advanced driver assistance systems, based on randomized algorithms. The new method consists of an iterative randomized simulation using adaptive importance sampling. The randomized algorithm is more efficient than conventional simulation techniques. The

  7. Cone penetrometer tests and HydroPunch sampling: A screening technique for plume definition

    International Nuclear Information System (INIS)

    Smolley, M.; Kappmeyer, J.C.

    1991-01-01

    Cone penetrometer tests and HydroPunch sampling were used to define the extent of volatile organic compounds in ground water. The investigation indicated that the combination of these techniques is effective for obtaining ground water samples for preliminary plume definition. HydroPunch samples can be collected in unconsolidated sediments and the analytical results obtained from these samples are comparable to those obtained from adjacent monitoring wells. This sampling method is a rapid and cost-effective screening technique for characterizing the extent of contaminant plumes in soft sediment environments. Use of this screening technique allowed monitoring wells to be located at the plume boundary, thereby reducing the number of wells installed and the overall cost of the plume definition program

  8. Financing Adult Education: How Adequate Are Current Sources in Facilitating Access and Participation in Centres in Murang'a South Sub-County, Murang'a County, Kenya?

    Science.gov (United States)

    Maina, Ndonga James; Orodho, John Aluko

    2016-01-01

    The thrust of this study was to examine the level of adequacy of current sources in facilitating access and participation in adult education centres in Murang'a South Sub-County, Murang'a County, Kenya. The study adopted the descriptive survey design. Combinations of purposive and stratified random sampling techniques were used to select 82…

  9. Chance constrained problems: penalty reformulation and performance of sample approximation technique

    Czech Academy of Sciences Publication Activity Database

    Branda, Martin

    2012-01-01

    Roč. 48, č. 1 (2012), s. 105-122 ISSN 0023-5954 R&D Projects: GA ČR(CZ) GBP402/12/G097 Institutional research plan: CEZ:AV0Z10750506 Keywords : chance constrained problems * penalty functions * asymptotic equivalence * sample approximation technique * investment problem Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.619, year: 2012 http://library.utia.cas.cz/separaty/2012/E/branda-chance constrained problems penalty reformulation and performance of sample approximation technique.pdf

  10. Adaptive importance sampling of random walks on continuous state spaces

    International Nuclear Information System (INIS)

    Baggerly, K.; Cox, D.; Picard, R.

    1998-01-01

    The authors consider adaptive importance sampling for a random walk with scoring in a general state space. Conditions under which exponential convergence occurs to the zero-variance solution are reviewed. These results generalize previous work for finite, discrete state spaces in Kollman (1993) and in Kollman, Baggerly, Cox, and Picard (1996). This paper is intended for nonstatisticians and includes considerable explanatory material

  11. Executive control resources and frequency of fatty food consumption: findings from an age-stratified community sample.

    Science.gov (United States)

    Hall, Peter A

    2012-03-01

    Fatty foods are regarded as highly appetitive, and self-control is often required to resist consumption. Executive control resources (ECRs) are potentially facilitative of self-control efforts, and therefore could predict success in the domain of dietary self-restraint. It is not currently known whether stronger ECRs facilitate resistance to fatty food consumption, and moreover, it is unknown whether such an effect would be stronger in some age groups than others. The purpose of the present study was to examine the association between ECRs and consumption of fatty foods among healthy community-dwelling adults across the adult life span. An age-stratified sample of individuals between 18 and 89 years of age attended two laboratory sessions. During the first session they completed two computer-administered tests of ECRs (Stroop and Go-NoGo) and a test of general cognitive function (Wechsler Abbreviated Scale of Intelligence); participants completed two consecutive 1-week recall measures to assess frequency of fatty and nonfatty food consumption. Regression analyses revealed that stronger ECRs were associated with lower frequency of fatty food consumption over the 2-week interval. This association was observed for both measures of ECR and a composite measure. The effect remained significant after adjustment for demographic variables (age, gender, socioeconomic status), general cognitive function, and body mass index. The observed effect of ECRs on fatty food consumption frequency was invariant across age group, and did not generalize to nonfatty food consumption. ECRs may be potentially important, though understudied, determinants of dietary behavior in adults across the life span.

  12. Toward a Principled Sampling Theory for Quasi-Orders.

    Science.gov (United States)

    Ünlü, Ali; Schrepp, Martin

    2016-01-01

    Quasi-orders, that is, reflexive and transitive binary relations, have numerous applications. In educational theories, the dependencies of mastery among the problems of a test can be modeled by quasi-orders. Methods such as item tree or Boolean analysis that mine for quasi-orders in empirical data are sensitive to the underlying quasi-order structure. These data mining techniques have to be compared based on extensive simulation studies, with unbiased samples of randomly generated quasi-orders at their basis. In this paper, we develop techniques that can provide the required quasi-order samples. We introduce a discrete doubly inductive procedure for incrementally constructing the set of all quasi-orders on a finite item set. A randomization of this deterministic procedure allows us to generate representative samples of random quasi-orders. With an outer level inductive algorithm, we consider the uniform random extensions of the trace quasi-orders to higher dimension. This is combined with an inner level inductive algorithm to correct the extensions that violate the transitivity property. The inner level correction step entails sampling biases. We propose three algorithms for bias correction and investigate them in simulation. It is evident that, on even up to 50 items, the new algorithms create close to representative quasi-order samples within acceptable computing time. Hence, the principled approach is a significant improvement to existing methods that are used to draw quasi-orders uniformly at random but cannot cope with reasonably large item sets.

  13. Toward a Principled Sampling Theory for Quasi-Orders

    Science.gov (United States)

    Ünlü, Ali; Schrepp, Martin

    2016-01-01

    Quasi-orders, that is, reflexive and transitive binary relations, have numerous applications. In educational theories, the dependencies of mastery among the problems of a test can be modeled by quasi-orders. Methods such as item tree or Boolean analysis that mine for quasi-orders in empirical data are sensitive to the underlying quasi-order structure. These data mining techniques have to be compared based on extensive simulation studies, with unbiased samples of randomly generated quasi-orders at their basis. In this paper, we develop techniques that can provide the required quasi-order samples. We introduce a discrete doubly inductive procedure for incrementally constructing the set of all quasi-orders on a finite item set. A randomization of this deterministic procedure allows us to generate representative samples of random quasi-orders. With an outer level inductive algorithm, we consider the uniform random extensions of the trace quasi-orders to higher dimension. This is combined with an inner level inductive algorithm to correct the extensions that violate the transitivity property. The inner level correction step entails sampling biases. We propose three algorithms for bias correction and investigate them in simulation. It is evident that, on even up to 50 items, the new algorithms create close to representative quasi-order samples within acceptable computing time. Hence, the principled approach is a significant improvement to existing methods that are used to draw quasi-orders uniformly at random but cannot cope with reasonably large item sets. PMID:27965601
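
    For contrast with the paper's de-biased inductive samplers, the naive baseline - draw a random relation, force reflexivity, and repair transitivity with a Warshall-style closure - is easy to write down but yields exactly the kind of biased samples the authors work to avoid:

```python
import numpy as np

def naive_random_quasi_order(n, p=0.2, seed=None):
    """Naive baseline sampler (BIASED): random relation + transitive closure.

    The transitive closure step over-represents dense quasi-orders, which is
    precisely the sampling bias the paper's inductive algorithms correct.
    """
    rng = np.random.default_rng(seed)
    R = rng.random((n, n)) < p
    np.fill_diagonal(R, True)                  # reflexive
    for k in range(n):                         # Warshall transitive closure
        R |= np.outer(R[:, k], R[k, :])
    return R

print(naive_random_quasi_order(5, seed=0).astype(int))
```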

  14. Sampling methods and non-destructive examination techniques for large radioactive waste packages

    International Nuclear Information System (INIS)

    Green, T.H.; Smith, D.L.; Burgoyne, K.E.; Maxwell, D.J.; Norris, G.H.; Billington, D.M.; Pipe, R.G.; Smith, J.E.; Inman, C.M.

    1992-01-01

    Progress is reported on work undertaken to evaluate quality checking methods for radioactive wastes. A sampling rig was designed, fabricated and used to develop techniques for the destructive sampling of cemented simulant waste using remotely operated equipment. An engineered system for the containment of cooling water was designed and manufactured and successfully demonstrated with the drum and coring equipment mounted in both vertical and horizontal orientations. The preferred in-cell orientation was found to be with the drum and coring machinery mounted in a horizontal position. Small powdered samples can be taken from cemented homogeneous waste cores using a hollow drill/vacuum section technique with the preferred subsampling technique being to discard the outer 10 mm layer to obtain a representative sample of the cement core. Cement blends can be dissolved using fusion techniques and the resulting solutions are stable to gelling for periods in excess of one year. Although hydrochloric acid and nitric acid are promising solvents for dissolution of cement blends, the resultant solutions tend to form silicic acid gels. An estimate of the beta-emitter content of cemented waste packages can be obtained by a combination of non-destructive and destructive techniques. The errors will probably be in excess of +/-60 % at the 95 % confidence level. Real-time X-ray video-imaging techniques have been used to analyse drums of uncompressed, hand-compressed, in-drum compacted and high-force compacted (i.e. supercompacted) simulant waste. The results have confirmed the applicability of this technique for NDT of low-level waste. 8 refs., 12 figs., 3 tabs

  15. Precision of systematic and random sampling in clustered populations: habitat patches and aggregating organisms.

    Science.gov (United States)

    McGarvey, Richard; Burch, Paul; Matthews, Janet M

    2016-01-01

    Natural populations of plants and animals spatially cluster because (1) suitable habitat is patchy, and (2) within suitable habitat, individuals aggregate further into clusters of higher density. We compare the precision of random and systematic field sampling survey designs under these two processes of species clustering. Second, we evaluate the performance of 13 estimators for the variance of the sample mean from a systematic survey. Replicated simulated surveys, as counts from 100 transects allocated either randomly or systematically within the study region, were used to estimate population density in six spatial point populations, including habitat patches and Matérn circular clustered aggregations of organisms, separately and in combination. The standard one-start aligned systematic survey design, a uniform 10 x 10 grid of transects, was much more precise. Variances of the 10 000 replicated systematic survey mean densities were one-third to one-fifth of those from randomly allocated transects, implying that transect sample sizes giving equivalent precision by random survey would need to be three to five times larger. Organisms being restricted to patches of habitat was alone sufficient to yield this precision advantage for the systematic design. But this improved precision for systematic sampling in clustered populations is underestimated by the standard variance estimators used to compute confidence intervals. True variance for the survey sample mean was computed from the variance of 10 000 simulated survey mean estimates. Testing 10 published and three newly proposed variance estimators, the two estimators that corrected for inter-transect correlation (ν₈ and ν_W) were the most accurate and also the most precise in clustered populations. These greatly outperformed the two "post-stratification" variance estimators (ν₂ and ν₃) that are now more commonly applied in systematic surveys. Similar variance estimator performance rankings were found with
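
    The headline comparison - an aligned systematic grid versus the same number of randomly placed transects in a patchy population - is easy to reproduce by simulation. A sketch with an invented patchy density field (all parameters hypothetical):

```python
import numpy as np

rng = np.random.default_rng(1)
xx, yy = np.meshgrid(np.arange(100), np.arange(100), indexing="ij")
field = np.zeros((100, 100))
for _ in range(12):                        # hypothetical habitat patches
    cx, cy = rng.integers(0, 100, 2)
    field += 50 * np.exp(-((xx - cx) ** 2 + (yy - cy) ** 2) / 20.0)

def survey_mean(systematic):
    """Mean density from 100 transect cells, by either design."""
    if systematic:                         # one-start aligned 10 x 10 grid
        oi, oj = rng.integers(0, 10, 2)
        i, j = np.meshgrid(np.arange(10) * 10 + oi,
                           np.arange(10) * 10 + oj, indexing="ij")
        return field[i.ravel(), j.ravel()].mean()
    i = rng.integers(0, 100, 100)          # 100 random transect cells
    j = rng.integers(0, 100, 100)
    return field[i, j].mean()

var_rand = np.var([survey_mean(False) for _ in range(2000)])
var_syst = np.var([survey_mean(True) for _ in range(2000)])
print(var_rand / var_syst)                 # typically well above 1
```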

  16. A Comparison of Soil-Water Sampling Techniques

    Science.gov (United States)

    Tindall, J. A.; Figueroa-Johnson, M.; Friedel, M. J.

    2007-12-01

    The representativeness of soil pore water extracted by suction lysimeters in ground-water monitoring studies is a problem that often confounds interpretation of measured data. Current soil water sampling techniques cannot identify the soil volume from which a pore water sample is extracted, whether macroscopic, microscopic, or preferential flowpath. This research was undertaken to compare values of suction lysimeter samples extracted from intact soil cores with samples obtained by direct extraction methods, to determine what portion of soil pore water is sampled by each method. Intact soil cores (30 centimeter (cm) diameter by 40 cm height) were extracted from two different sites - a sandy soil near Altamonte Springs, Florida and a clayey soil near Centralia in Boone County, Missouri. Isotopically labeled water (¹⁸O, analyzed by mass spectrometry) and bromide concentrations (KBr⁻, measured using ion chromatography) from water samples taken by suction lysimeters were compared with samples obtained by the direct extraction methods of centrifugation and azeotropic distillation. Water samples collected by direct extraction were about 0.25‰ more negative (depleted) than suction lysimeter values from the sandy soil and about 2-7‰ more negative from the well structured clayey soil. Results indicate that the majority of soil water in well-structured soil is strongly bound to soil grain surfaces and is not easily sampled by suction lysimeters. In cases where a sufficient volume of water has passed through the soil profile and displaced previous pore water, suction lysimeters will collect a representative sample of soil pore water from the sampled depth interval. It is suggested that for stable isotope studies monitoring precipitation and soil water, suction lysimeters should be installed at shallow depths (10 cm). Samples should also be coordinated with precipitation events. The data also indicate that each extraction method may be used to sample a different

  17. Can groundwater sampling techniques used in monitoring wells influence methane concentrations and isotopes?

    Science.gov (United States)

    Rivard, Christine; Bordeleau, Geneviève; Lavoie, Denis; Lefebvre, René; Malet, Xavier

    2018-03-06

    Methane concentrations and isotopic composition in groundwater are the focus of a growing number of studies. However, concerns are often expressed regarding the integrity of samples, as methane is very volatile and may partially exsolve during sample lifting in the well and transfer to sampling containers. While issues concerning bottle-filling techniques have already been documented, this paper documents a comparison of methane concentration and isotopic composition obtained with three devices commonly used to retrieve water samples from dedicated observation wells. This work lies within the framework of a larger project carried out in the Saint-Édouard area (southern Québec, Canada), whose objective was to assess the risk to shallow groundwater quality related to potential shale gas exploitation. The selected sampling devices, which were tested on ten wells during three sampling campaigns, consist of an impeller pump, a bladder pump, and disposable sampling bags (HydraSleeve). The sampling bags were used both before and after pumping, to verify the appropriateness of a no-purge approach, compared to the low-flow approach involving pumping until stabilization of field physicochemical parameters. Results show that methane concentrations obtained with the selected sampling techniques are usually similar and that there is no systematic bias related to a specific technique. Nonetheless, concentrations can sometimes vary quite significantly (up to 3.5 times) for a given well and sampling event. Methane isotopic composition obtained with all sampling techniques is very similar, except in some cases where sampling bags were used before pumping (no-purge approach), in wells where multiple groundwater sources enter the borehole.

  18. Development of environmental sample analysis techniques for safeguards

    International Nuclear Information System (INIS)

    Magara, Masaaki; Hanzawa, Yukiko; Esaka, Fumitaka

    1999-01-01

    JAERI has been developing environmental sample analysis techniques for safeguards and preparing a clean chemistry laboratory with clean rooms. Methods to be developed are a bulk analysis and a particle analysis. In the bulk analysis, Inductively-Coupled Plasma Mass Spectrometer or Thermal Ionization Mass Spectrometer are used to measure nuclear materials after chemical treatment of sample. In the particle analysis, Electron Probe Micro Analyzer and Secondary Ion Mass Spectrometer are used for elemental analysis and isotopic analysis, respectively. The design of the clean chemistry laboratory has been carried out and construction will be completed by the end of March, 2001. (author)

  19. Exploring the role of wave drag in the stable stratified oceanic and atmospheric bottom boundary layer in the cnrs-toulouse (cnrm-game) large stratified water flume

    NARCIS (Netherlands)

    Kleczek, M.; Steeneveld, G.J.; Paci, A.; Calmer, R.; Belleudy, A.; Canonici, J.C.; Murguet, F.; Valette, V.

    2014-01-01

    This paper reports on a laboratory experiment in the CNRM-GAME (Toulouse) stratified water flume of a stably stratified boundary layer, in order to quantify the momentum transfer due to orographically induced gravity waves by gently undulating hills in a boundary layer flow. In a stratified fluid, a

  20. Unwilling or Unable to Cheat? Evidence from a Randomized Tax Audit Experiment in Denmark

    DEFF Research Database (Denmark)

    Kleven, Henrik Jacobsen; Knudsen, Martin B.; Kreiner, Claus Thustrup

    2010-01-01

    This paper analyzes a randomized tax enforcement experiment in Denmark. In the base year, a stratified and representative sample of over 40,000 individual income tax filers was selected for the experiment. Half of the tax filers were randomly selected to be thoroughly audited, while the rest were deliberately not audited. The following year, "threat-of-audit" letters were randomly assigned and sent to tax filers in both groups. Using comprehensive administrative tax data, we present four main findings. First, we find that the tax evasion rate is very small (0.3%) for income subject to third-party reporting ... impact on tax evasion, but that this effect is small in comparison to avoidance responses. Third, we find that prior audits substantially increase self-reported income, implying that individuals update their beliefs about detection probability based on experiencing an audit. Fourth, threat-of-audit

  1. A Table-Based Random Sampling Simulation for Bioluminescence Tomography

    Directory of Open Access Journals (Sweden)

    Xiaomeng Zhang

    2006-01-01

    Full Text Available As a popular simulation of photon propagation in turbid media, the main problem of the Monte Carlo (MC) method is its cumbersome computation. In this work a table-based random sampling simulation (TBRS) is proposed. The key idea of TBRS is to simplify multiple steps of scattering into a single-step process through random table querying, thus greatly reducing the computing complexity of the conventional MC algorithm and expediting the computation. The TBRS is a fast counterpart of the conventional MC simulation of photon propagation. It retains the merits of flexibility and accuracy of the conventional MC method and adapts well to complex geometric media and various source shapes. Both MC simulations were conducted in a homogeneous medium in our work. We also present a reconstruction approach to estimate the position of the fluorescent source, based on trial and error, as a validation of the TBRS algorithm. Good agreement is found between the conventional MC simulation and the TBRS simulation.
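
    The core trick of a table-based sampler is to precompute a lookup table (here an inverse-CDF table) so that each draw costs a single random table query instead of a per-step transcendental evaluation. A generic sketch of that idea; the paper's specific multi-step-to-single-step scattering table is not reproduced here, and mu_t is a hypothetical interaction coefficient:

```python
import numpy as np

def build_inverse_cdf_table(mu_t=10.0, table_size=4096):
    """Tabulate free-path lengths s = -ln(1-u)/mu_t on a uniform grid of u.

    Any distribution with an invertible CDF can be tabulated the same way.
    """
    u = (np.arange(table_size) + 0.5) / table_size
    return -np.log1p(-u) / mu_t

def sample_steps(table, n, seed=None):
    rng = np.random.default_rng(seed)
    return table[rng.integers(0, len(table), size=n)]  # one query per draw

table = build_inverse_cdf_table()
steps = sample_steps(table, 100_000, seed=0)
print(steps.mean())          # ~1/mu_t = 0.1 for this exponential example
```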

  2. Standardization of proton-induced x-ray emission technique for analysis of thick samples

    Science.gov (United States)

    Ali, Shad; Zeb, Johar; Ahad, Abdul; Ahmad, Ishfaq; Haneef, M.; Akbar, Jehan

    2015-09-01

    This paper describes the standardization of the proton-induced x-ray emission (PIXE) technique for finding the elemental composition of thick samples. For the standardization, three different samples of standard reference materials (SRMs) were analyzed using this technique and the data were compared with the already known data of these certified SRMs. These samples were selected in order to cover the maximum range of elements in the periodic table. Each sample was irradiated for three different values of collected beam charge at three different times. A proton beam of 2.57 MeV obtained using a 5UDH-II Pelletron accelerator was used for excitation of x-rays from the sample. The acquired experimental data were analyzed using the GUPIXWIN software. The results show that the SRM data and the data obtained using the PIXE technique are in good agreement.

  3. Sampling from the normal and exponential distributions

    International Nuclear Information System (INIS)

    Chaplin, K.R.; Wills, C.A.

    1982-01-01

    Methods for generating random numbers from the normal and exponential distributions are described. These involve dividing each function into subregions, and for each of these developing a method of sampling usually based on an acceptance-rejection technique. When sampling from the normal or exponential distribution, each subregion provides the required random value with probability equal to the ratio of its area to the total area. Procedures written in FORTRAN for the CYBER 175/CDC 6600 system are provided to implement the two algorithms
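
    As an example of the acceptance-rejection building block the abstract describes, the standard normal can be sampled from a unit-exponential envelope: a half-normal proposal x is accepted with probability exp(-(x-1)²/2) and then given a random sign. A minimal sketch, not the report's FORTRAN subregion scheme:

```python
import numpy as np

def sample_normal_ar(n, seed=None):
    """Acceptance-rejection sampling of N(0,1) via an exponential envelope.

    For the half-normal target f and envelope g(x)=exp(-x), the ratio
    f/(M*g) simplifies to exp(-(x-1)^2/2) with M = sqrt(2e/pi).
    """
    rng = np.random.default_rng(seed)
    out = np.empty(n)
    k = 0
    while k < n:
        x = rng.exponential()                       # proposal from envelope
        if rng.random() <= np.exp(-0.5 * (x - 1.0) ** 2):   # accept?
            out[k] = x if rng.random() < 0.5 else -x        # random sign
            k += 1
    return out

z = sample_normal_ar(100_000, seed=0)
print(z.mean(), z.std())     # ~0 and ~1
```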

  4. 40 CFR 761.306 - Sampling 1 meter square surfaces by random selection of halves.

    Science.gov (United States)

    2010-07-01

    § 761.306 Sampling 1 meter square surfaces by random selection of halves. (a) Divide each 1 meter square portion where it is necessary to collect a surface wipe test sample into two equal (or as...

  5. Review of sample preparation techniques for the analysis of pesticide residues in soil.

    Science.gov (United States)

    Tadeo, José L; Pérez, Rosa Ana; Albero, Beatriz; García-Valcárcel, Ana I; Sánchez-Brunete, Consuelo

    2012-01-01

    This paper reviews the sample preparation techniques used for the analysis of pesticides in soil. The present status and recent advances made during the last 5 years in these methods are discussed. The analysis of pesticide residues in soil requires the extraction of analytes from this matrix, followed by a cleanup procedure, when necessary, prior to their instrumental determination. The optimization of sample preparation is a very important part of the method development that can reduce the analysis time, the amount of solvent, and the size of samples. This review considers all aspects of sample preparation, including extraction and cleanup. Classical extraction techniques, such as shaking, Soxhlet, and ultrasonic-assisted extraction, and modern techniques like pressurized liquid extraction, microwave-assisted extraction, solid-phase microextraction and QuEChERS (Quick, Easy, Cheap, Effective, Rugged, and Safe) are reviewed. The different cleanup strategies applied for the purification of soil extracts are also discussed. In addition, the application of these techniques to environmental studies is considered.

  6. A Nationwide Random Sampling Survey of Potential Complicated Grief in Japan

    Science.gov (United States)

    Mizuno, Yasunao; Kishimoto, Junji; Asukai, Nozomu

    2012-01-01

    To investigate the prevalence of significant loss, potential complicated grief (CG), and its contributing factors, we conducted a nationwide random sampling survey of Japanese adults aged 18 or older (N = 1,343) using a self-rating Japanese-language version of the Complicated Grief Brief Screen. Among them, 37.0% experienced their most significant…

  7. Reinforcing Sampling Distributions through a Randomization-Based Activity for Introducing ANOVA

    Science.gov (United States)

    Taylor, Laura; Doehler, Kirsten

    2015-01-01

    This paper examines the use of a randomization-based activity to introduce the ANOVA F-test to students. The two main goals of this activity are to successfully teach students to comprehend ANOVA F-tests and to increase student comprehension of sampling distributions. Four sections of students in an advanced introductory statistics course…
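
    A compact version of such a randomization activity, sketched in Python with synthetic data: the group labels are shuffled many times and the observed F statistic is compared against the resulting randomization distribution:

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(0)
      groups = [rng.normal(loc, 1.0, size=12) for loc in (0.0, 0.3, 0.8)]

      observed = stats.f_oneway(*groups).statistic
      pooled = np.concatenate(groups)
      sizes = np.cumsum([len(g) for g in groups])[:-1]

      # Randomization distribution: shuffle group labels many times.
      n_perm, count = 5000, 0
      for _ in range(n_perm):
          rng.shuffle(pooled)
          if stats.f_oneway(*np.split(pooled, sizes)).statistic >= observed:
              count += 1

      print(f"observed F = {observed:.2f}, randomization p = {count / n_perm:.4f}")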

  8. Laboratory techniques for safe encapsulation of α-emitting powder samples

    International Nuclear Information System (INIS)

    Chamberlain, H.E.; Pottinger, J.S.

    1984-01-01

    Plutonium oxide powder samples can be encapsulated in thin plastic film to prevent spread of contamination in counting and X-ray diffraction equipment. The film has to be thin enough to transmit X-rays and α-particles. Techniques are described for the wrapping process and the precautions necessary to keep the sample processing line free of significant contamination. (author)

  9. Aligning the Economic Value of Companion Diagnostics and Stratified Medicines

    Directory of Open Access Journals (Sweden)

    Edward D. Blair

    2012-11-01

    The twin forces of payors seeking fair pricing and the rising costs of developing new medicines have driven a closer relationship between pharmaceutical companies and diagnostics companies, because stratified medicines, guided by companion diagnostics, offer better commercial, as well as clinical, outcomes. Stratified medicines have created clinical success and provided rapid product approvals, particularly in oncology, and indeed have changed the dynamic between drug and diagnostic developers. The commercial payback for such partnerships offered by stratified medicines has been less well articulated, but this has shifted as the benefits in risk management, pricing and value creation for all stakeholders become clearer. In this larger healthcare setting, stratified medicine provides both physicians and patients with greater insight on the disease and provides a rationale for providers to understand the cost-effectiveness of treatment. This article considers how the economic value of stratified medicine relationships can be recognized and translated into better outcomes for all healthcare stakeholders.

  10. Randomization tests

    CERN Document Server

    Edgington, Eugene

    2007-01-01

    Contents: Statistical Tests That Do Not Require Random Sampling; Randomization Tests; Numerical Examples; Randomization Tests and Nonrandom Samples; The Prevalence of Nonrandom Samples in Experiments; The Irrelevance of Random Samples for the Typical Experiment; Generalizing from Nonrandom Samples; Intelligibility; Respect for the Validity of Randomization Tests; Versatility; Practicality; Precursors of Randomization Tests; Other Applications of Permutation Tests; Questions and Exercises; Notes; References; Randomized Experiments; Unique Benefits of Experiments; Experimentation without Mani...

  11. Large eddy simulation of stably stratified turbulence

    International Nuclear Information System (INIS)

    Shen Zhi; Zhang Zhaoshun; Cui Guixiang; Xu Chunxiao

    2011-01-01

    Stably stratified turbulence is a common phenomenon in the atmosphere and ocean. In this paper large eddy simulation is utilized to investigate homogeneous stably stratified turbulence numerically at Reynolds number Re = uL/ν = 10²∼10³ and Froude number Fr = u/(NL) = 10⁻²∼10⁰, in which u is the root mean square of the velocity fluctuations, L is the integral scale and N is the Brunt-Väisälä frequency. Three sets of computation cases are designed with different initial conditions, namely isotropic turbulence, Taylor-Green vortex and internal waves, to investigate the statistical properties arising from different origins. The computed horizontal and vertical energy spectra are consistent with observations in the atmosphere and ocean when the composite parameter ReFr² is greater than O(1). It is also found that stratified turbulence can develop under different initial velocity conditions and that internal wave energy dominates in the developed stably stratified turbulence.

  12. Yield and quality of ground water from stratified-drift aquifers, Taunton River basin, Massachusetts : executive summary

    Science.gov (United States)

    Lapham, Wayne W.; Olimpio, Julio C.

    1989-01-01

    Water shortages are a chronic problem in parts of the Taunton River basin and are caused by a combination of factors. Water use in this part of the Boston metropolitan area is likely to increase during the next decade. The Massachusetts Division of Water Resources projects that about 50% of the cities and towns within and on the perimeter of the basin may have water supply deficits by 1990 if water management projects are not pursued throughout the 1980s. Estimates of the long-term yield of the 26 regional aquifers indicate that the yields of the two most productive aquifers equal or exceed 11.9 and 11.3 cu ft/sec, 90% of the time, respectively, if minimum stream discharge is maintained at 99.5% flow duration. Eighteen of the 26 aquifers were pumped for public water supply during 1983. Further analysis of the yield characteristics of these 18 aquifers indicates that the 1983 pumping rate of each of these 18 aquifers can be sustained at least 70% of the time. Selected physical properties and concentrations of major chemical constituents in groundwater from the stratified-drift aquifers at 80 sampling sites were used to characterize general water quality in aquifers throughout the basin. The pH of the groundwater ranged from 5.4 to 7.0. Natural elevated concentrations of Fe and Mn in water in the stratified-drift aquifers are present locally in the basin. Natural concentrations of these two metals commonly exceed the limits of 0.3 mg/L for Fe and 0.05 mg/L for Mn recommended for drinking water. Fifty-one analyses of selected trace metals in groundwater samples from stratified-drift aquifers throughout the basin were used to characterize trace metal concentrations in the groundwater. Of the 10 constituents sampled that have US EPA limits recommended for drinking water, only the Pb concentration in water at one site (60 micrograms/L) exceeded the recommended limit of 50 micrograms/L. Analyses of selected organic compounds in water in the stratified-drift aquifers at 74

  13. Creating ensembles of decision trees through sampling

    Science.gov (United States)

    Kamath, Chandrika; Cantu-Paz, Erick

    2005-08-30

    A system for decision tree ensembles that includes a module to read the data, a module to sort the data, a module to evaluate a potential split of the data according to some criterion using a random sample of the data, a module to split the data, and a module to combine multiple decision trees in ensembles. The decision tree method is based on statistical sampling techniques and includes the steps of reading the data; sorting the data; evaluating a potential split according to some criterion using a random sample of the data, splitting the data, and combining multiple decision trees in ensembles.
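
    A rough sketch of the sampling-based split evaluation the abstract describes, under our own simplifying assumptions (one numeric feature, binary labels, Gini criterion):

      import numpy as np

      rng = np.random.default_rng(1)

      def best_split(x, y, sample_frac=0.1):
          """Pick the threshold minimizing Gini impurity, scoring candidate
          thresholds on a random subsample of the rows rather than on the
          full sorted column."""
          idx = rng.choice(len(x), size=max(2, int(sample_frac * len(x))), replace=False)
          xs, ys = x[idx], y[idx]
          best_t, best_g = None, np.inf
          for t in np.unique(xs)[:-1]:
              left, right = ys[xs <= t], ys[xs > t]
              g = sum(len(p) / len(ys) *
                      (1.0 - ((np.bincount(p, minlength=2) / len(p)) ** 2).sum())
                      for p in (left, right))
              if g < best_g:
                  best_t, best_g = t, g
          return best_t

      x = rng.normal(size=2000)
      y = (x > 0.2).astype(int)      # toy binary labels
      print(best_split(x, y))        # threshold near 0.2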

  14. Sampling phased array a new technique for signal processing and ultrasonic imaging

    OpenAIRE

    Bulavinov, A.; Joneit, D.; Kröning, M.; Bernus, L.; Dalichow, M.H.; Reddy, K.M.

    2006-01-01

    Different signal processing and image reconstruction techniques are applied in ultrasonic non-destructive material evaluation. In recent years, rapid development in the fields of microelectronics and computer engineering has led to wide application of phased array systems. A new phased array technique, called "Sampling Phased Array", has been developed at the Fraunhofer Institute for non-destructive testing. It realizes a unique approach to the measurement and processing of ultrasonic signals. The sampling...

  15. The role of graphene-based sorbents in modern sample preparation techniques.

    Science.gov (United States)

    de Toffoli, Ana Lúcia; Maciel, Edvaldo Vasconcelos Soares; Fumes, Bruno Henrique; Lanças, Fernando Mauro

    2018-01-01

    The application of graphene-based sorbents in sample preparation techniques has increased significantly since 2011. These materials have good physicochemical properties for use as sorbents and have shown excellent results in different sample preparation techniques. Graphene and its precursor graphene oxide have been considered good candidates to improve the extraction and concentration of different classes of target compounds (e.g., parabens, polycyclic aromatic hydrocarbons, pyrethroids, triazines, and so on) present in complex matrices. They have been employed in the analysis of different matrices (e.g., environmental, biological and food). In this review, we highlight the most important characteristics of graphene-based materials, their properties, synthesis routes, and the most important applications in both off-line and on-line sample preparation techniques. The discussion of the off-line approaches includes methods derived from conventional solid-phase extraction, focusing on the miniaturized magnetic and dispersive modes. The microextraction techniques called stir bar sorptive extraction, solid-phase microextraction, and microextraction by packed sorbent are also discussed. The on-line approaches focus on the use of graphene-based materials mainly in on-line solid-phase extraction, its variation called in-tube solid-phase microextraction, and on-line microdialysis systems.

  16. Stratified medicine and reimbursement issues

    NARCIS (Netherlands)

    Fugel, Hans-Joerg; Nuijten, Mark; Postma, Maarten

    2012-01-01

    Stratified Medicine (SM) has the potential to target patient populations who will most benefit from a therapy while reducing unnecessary health interventions associated with side effects. The link between clinical biomarkers/diagnostics and therapies provides new opportunities for value creation to

  17. LOD score exclusion analyses for candidate QTLs using random population samples.

    Science.gov (United States)

    Deng, Hong-Wen

    2003-11-01

    While extensive analyses have been conducted to test for, no formal analyses have been conducted to test against, the importance of candidate genes as putative QTLs using random population samples. Previously, we developed an LOD score exclusion mapping approach for candidate genes for complex diseases. Here, we extend this LOD score approach for exclusion analyses of candidate genes for quantitative traits. Under this approach, specific genetic effects (as reflected by heritability) and inheritance models at candidate QTLs can be analyzed, and if an LOD score is ≤ -2.0, the locus can be excluded from having a heritability larger than that specified. Simulations show that this approach has high power to exclude a candidate gene from having moderate genetic effects if it is not a QTL and is robust to population admixture. Our exclusion analysis complements association analysis for candidate genes as putative QTLs in random population samples. The approach is applied to test the importance of the Vitamin D receptor (VDR) gene as a potential QTL underlying the variation of bone mass, an important determinant of osteoporosis.

  18. Fringe biasing: A variance reduction technique for optically thick meshes

    Energy Technology Data Exchange (ETDEWEB)

    Smedley-Stevenson, R. P. [AWE PLC, Aldermaston Reading, Berkshire, RG7 4PR (United Kingdom)

    2013-07-01

    Fringe biasing is a stratified sampling scheme applicable to Monte Carlo thermal radiation transport codes. The thermal emission source in optically thick cells is partitioned into separate contributions from the cell interiors (where the likelihood of the particles escaping the cells is virtually zero) and the 'fringe' regions close to the cell boundaries. Thermal emission in the cell interiors can now be modelled with fewer particles, the remaining particles being concentrated in the fringes so that they are more likely to contribute to the energy exchange between cells. Unlike other techniques for improving the efficiency in optically thick regions (such as random walk and discrete diffusion treatments), fringe biasing has the benefit of simplicity, as the associated changes are restricted to the sourcing routines with the particle tracking routines being unaffected. This paper presents an analysis of the potential for variance reduction achieved from employing the fringe biasing technique. The aim of this analysis is to guide the implementation of this technique in Monte Carlo thermal radiation codes, specifically in order to aid the choice of the fringe width and the proportion of particles allocated to the fringe (which are interrelated) in multi-dimensional simulations, and to confirm that the significant levels of variance reduction achieved in simulations can be understood by studying the behaviour for simple test cases. The variance reduction properties are studied for a single cell in a slab geometry purely absorbing medium, investigating the accuracy of the scalar flux and current tallies on one of the interfaces with the surrounding medium. (authors)
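
    A minimal 1-D sketch of the stratified-emission idea (geometry, fringe width and particle split are hypothetical): particles are allocated unevenly between the interior and fringe strata, and their statistical weights are set so that each stratum still carries its correct share of the emitted energy:

      import numpy as np

      rng = np.random.default_rng(4)

      # One optically thick 1-D cell [0, L]: emission is uniform in space,
      # but particles born deeper than the fringe width rarely escape.
      L, fringe = 1.0, 0.1
      E_total = 1.0                          # total thermal emission energy
      n_particles, frac_fringe = 1000, 0.8   # bias most particles to the fringes

      n_f = int(frac_fringe * n_particles)
      n_i = n_particles - n_f
      vol_f = 2 * fringe / L                 # fringe fraction of the cell volume

      # Sample positions stratum by stratum; weights preserve total energy.
      x_f = np.where(rng.random(n_f) < 0.5,
                     rng.uniform(0, fringe, n_f),
                     rng.uniform(L - fringe, L, n_f))
      w_f = np.full(n_f, E_total * vol_f / n_f)
      x_i = rng.uniform(fringe, L - fringe, n_i)
      w_i = np.full(n_i, E_total * (1 - vol_f) / n_i)

      print(f"energy check: {w_f.sum() + w_i.sum():.3f}")   # equals E_total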

  20. Classification of Phishing Email Using Random Forest Machine Learning Technique

    OpenAIRE

    Akinyelu, Andronicus A.; Adewumi, Aderemi O.

    2013-01-01

    Phishing is one of the major challenges faced by the world of e-commerce today. Through phishing attacks, billions of dollars have been lost by many companies and individuals. In 2012, an online report put the loss due to phishing attacks at about $1.5 billion. The global impact of phishing attacks will continue to increase and thus requires more efficient phishing detection techniques to curb the menace. This paper investigates and reports the use of the random forest machine learnin...
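
    The paper's features and dataset are not reproduced here; the sketch below shows generic random-forest classification in the same spirit, using scikit-learn and a synthetic stand-in feature matrix:

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.model_selection import train_test_split

      # Stand-in features: rows are emails, columns are binary indicators of
      # the kind typically used (IP-based URL, suspicious keyword, ...);
      # label 1 = phishing, 0 = legitimate (toy rule, for illustration only).
      rng = np.random.default_rng(7)
      X = rng.integers(0, 2, size=(1000, 10))
      y = (X[:, :3].sum(axis=1) >= 2).astype(int)

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
      clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
      print(f"hold-out accuracy: {clf.score(X_te, y_te):.3f}")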

  1. Sampling Strategies and Processing of Biobank Tissue Samples from Porcine Biomedical Models.

    Science.gov (United States)

    Blutke, Andreas; Wanke, Rüdiger

    2018-03-06

    In translational medical research, porcine models have steadily become more popular. Considering the high value of individual animals, particularly of genetically modified pig models, and the often-limited number of available animals of these models, establishment of (biobank) collections of adequately processed tissue samples suited for a broad spectrum of subsequent analysis methods, including analyses not specified at the time point of sampling, represents a meaningful approach to take full advantage of the translational value of the model. With respect to the peculiarities of porcine anatomy, comprehensive guidelines have recently been established for standardized generation of representative, high-quality samples from different porcine organs and tissues. These guidelines are essential prerequisites for the reproducibility of results and their comparability between different studies and investigators. The recording of basic data, such as organ weights and volumes, the determination of the sampling locations and of the numbers of tissue samples to be generated, as well as their orientation, size, processing and trimming directions, are relevant factors determining the generalizability and usability of the specimens for molecular, qualitative, and quantitative morphological analyses. Here, an illustrative, practical, step-by-step demonstration of the most important techniques for generation of representative, multi-purpose biobank specimens from porcine tissues is presented. The methods described here include determination of organ/tissue volumes and densities, the application of a volume-weighted systematic random sampling procedure for parenchymal organs by point-counting, determination of the extent of tissue shrinkage related to histological embedding of samples, and generation of randomly oriented samples for quantitative stereological analyses, such as isotropic uniform random (IUR) sections generated by the "Orientator" and "Isector" methods, and vertical uniform random (VUR) sections.
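
    A minimal sketch of the systematic uniform random sampling step underlying such volume-weighted designs (interval and counts are hypothetical): a single random start makes every position equally likely to be chosen while keeping the sampled positions evenly spread:

      import random

      def systematic_random_sample(n_items, interval):
          """Choose a random start within the first interval, then take
          every `interval`-th position; each item has the same inclusion
          probability 1/interval."""
          start = random.randrange(interval)
          return list(range(start, n_items, interval))

      random.seed(3)
      print(systematic_random_sample(40, 7))   # e.g. tissue-slab indices to sample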

  2. Techniques of sample attack used in soil and mineral analysis. Phase I

    International Nuclear Information System (INIS)

    Chiu, N.W.; Dean, J.R.; Sill, C.W.

    1984-07-01

    Several techniques of sample attack for the determination of radioisotopes are reviewed. These techniques include: 1) digestion with nitric or hydrochloric acid in a Parr digestion bomb, 2) digestion with a mixture of nitric and hydrochloric acids, 3) digestion with a mixture of hydrofluoric, nitric and perchloric acids, and 4) fusion with sodium carbonate, potassium fluoride or alkali pyrosulfates. The effectiveness of these techniques in decomposing various soils and minerals containing radioisotopes such as lead-210, uranium, thorium and radium-226 is discussed. The combined procedure of potassium fluoride fusion followed by alkali pyrosulfate fusion is recommended for radium-226, uranium and thorium analysis. This technique guarantees the complete dissolution of samples containing refractory materials such as silica, silicates, carbides, oxides and sulfates. For lead-210 analysis, the procedure of digestion with a mixture of hydrofluoric, nitric and perchloric acids followed by fusion with alkali pyrosulfate is recommended. These two procedures are detailed. Schemes for the sequential separation of the radioisotopes from a dissolved sample solution are outlined. Procedures for radiochemical analysis are suggested

  3. Assessment of Natural Radioactivity in TENORM Samples Using Different Techniques

    International Nuclear Information System (INIS)

    Salman, Kh.A.; Shahein, A.Y.

    2009-01-01

    In petroleum oil industries, technologically-enhanced naturally occurring radioactive materials (TENORM) are produced. The presence of TENORM constitutes a significant radiological human health hazard. In the present work, the liquid scintillation counting technique was used to determine both ²²²Rn and ²²⁶Ra concentrations in TENORM samples, by measuring ²²²Rn concentrations in the sample at different intervals of time after preparation. The radiation doses from the TENORM samples were estimated using a thermoluminescent detector (TLD-4000). The estimated radiation doses were found to be proportional to both the measured radiation doses in situ and the natural activity concentrations in the samples measured with LSC

  4. Analysis of the Factors Influencing In-Migration to Denpasar City

    OpenAIRE

    Trendyari, A.A. Tara; Yasa, I Nyoman Mahaendra

    2014-01-01

    This study aimed to analyze the influence of socio-economic variables such as income, employment, investment, access to educational services, and access to health services on in-migration to Denpasar City. The research was conducted in Denpasar City with a sample of 100 respondents, obtained by a stratified random sampling method with strata based on the districts of Denpasar City. The data were collected using questionnaires and interviews, while the analysis techniques used in ...

  5. The Effect of Organizational Citizenship Behavior on Performance with Service Quality, Satisfaction and Behavior Intention as Antecedents

    OpenAIRE

    Joko Suyono; Sinto Sunaryo

    2015-01-01

    The purpose of the study is to observe the influence of organizational citizenship behavior on performance. Organizational citizenship behavior is affected by three variables, namely satisfaction, service quality and behavior intention. The study was conducted on 12 nurses, 128 patients, and 10 nursing supervisors at a private hospital in Surakarta, Jawa Tengah. A stratified random sampling technique was applied to determine the sample. The results were based on structural equation modeling (SEM)...

  7. The Stratified Legitimacy of Abortions.

    Science.gov (United States)

    Kimport, Katrina; Weitz, Tracy A; Freedman, Lori

    2016-12-01

    Roe v. Wade was heralded as an end to unequal access to abortion care in the United States. However, today, despite being common and safe, abortion is performed only selectively in hospitals and private practices. Drawing on 61 interviews with obstetrician-gynecologists in these settings, we examine how they determine which abortions to perform. We find that they distinguish between more and less legitimate abortions, producing a narrative of stratified legitimacy that privileges abortions for intended pregnancies, when the fetus is unhealthy, and when women perform normative gendered sexuality, including distress about the abortion, guilt about failure to contracept, and desire for motherhood. This stratified legitimacy can perpetuate socially-inflected inequality of access and normative gendered sexuality. Additionally, we argue that the practice by physicians of distinguishing among abortions can legitimate legislative practices that regulate and restrict some kinds of abortion, further constraining abortion access.

  8. Stratified charge rotary engine for general aviation

    Science.gov (United States)

    Mount, R. E.; Parente, A. M.; Hady, W. F.

    1986-01-01

    A development history, a current development status assessment, and a design feature and performance capabilities account are given for stratified-charge rotary engines applicable to aircraft propulsion. Such engines are capable of operating on Jet-A fuel with substantial cost savings, improved altitude capability, and lower fuel consumption by comparison with gas turbine powerplants. Attention is given to the current development program of a 400-hp engine scheduled for initial operations in early 1990. Stratified charge rotary engines are also applicable to ground power units, airborne APUs, shipboard generators, and vehicular engines.

  9. Work Sampling Study of an Engineering Professor during a Regular Contract Period

    Science.gov (United States)

    Brink, Jan; McDonald, Dale B.

    2015-01-01

    Work sampling is a technique that has been employed in industry and fields such as healthcare for some time. It is a powerful technique, and an alternative to conventional stop-watch time studies, in which industrial engineers draw inferences from work observations made at random times. This study applies work sampling to the duties performed by an individual…
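
    As a sketch of the mechanics (all numbers hypothetical): a work-sampling plan fixes a set of random observation instants in advance, and the share of observations falling on an activity estimates the share of time spent on it:

      import random

      random.seed(5)

      # Twelve random observation instants per day over a 5-day week of
      # 8-hour days (minutes from the start of each day).
      observations = sorted((day, random.randrange(8 * 60))
                            for day in range(5) for _ in range(12))

      # In the field each instant is tagged with the activity observed;
      # here the tags are simulated for illustration.
      tags = [random.choice(["teaching", "research", "service"])
              for _ in observations]
      print(f"estimated share of time teaching: {tags.count('teaching') / len(tags):.2f}")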

  10. Attenuation of species abundance distributions by sampling

    Science.gov (United States)

    Shimadzu, Hideyasu; Darnell, Ross

    2015-01-01

    Quantifying biodiversity aspects such as species presence/absence, richness and abundance is an important challenge to answer scientific and resource management questions. In practice, biodiversity can only be assessed from biological material taken by surveys, a difficult task given limited time and resources. A type of random sampling, often called sub-sampling, is a commonly used technique to reduce the amount of time and effort needed to investigate large quantities of biological samples. However, it is not immediately clear how (sub-)sampling affects estimates of biodiversity from a quantitative perspective. This paper specifies the effect of (sub-)sampling as attenuation of the species abundance distribution (SAD), and articulates how sampling bias is induced in the SAD by random sampling. The framework presented also reveals some confusion in previous theoretical studies. PMID:26064626
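
    The attenuation effect is easy to reproduce numerically: if each individual is retained independently with probability q, a species of true abundance n is observed Binomial(n, q) times and rare species drop out. A small illustration (the community model is invented):

      import numpy as np

      rng = np.random.default_rng(11)

      # "True" community: skewed abundances for 200 species.
      abundance = rng.geometric(p=0.02, size=200)

      # Sub-sample individuals with probability q: each observed count is
      # Binomial(n_i, q), the attenuated version of the SAD.
      q = 0.05
      observed = rng.binomial(abundance, q)

      print("true richness:    ", (abundance > 0).sum())
      print("observed richness:", (observed > 0).sum())   # rare species vanish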

  11. Experimental study of unsteady thermally stratified flow

    International Nuclear Information System (INIS)

    Lee, Sang Jun; Chung, Myung Kyoon

    1985-01-01

    Unsteady thermally stratified flow caused by two-dimensional surface discharge of warm water into an oblong channel was investigated. The experimental study focused on the rapidly developing thermal diffusion at small Richardson number. The basic objectives were to study the interfacial mixing between a flowing layer of warm water and an underlying body of cold water and to accumulate experimental data to test computational turbulence models. Mean velocity field measurements were carried out using NMR-CT (Nuclear Magnetic Resonance-Computerized Tomography), which captures quantitative flow images of any desired section in any direction of the flow in a short time. Results show that at small Richardson number the warm layer rapidly penetrates into the cold layer because of strong turbulent mixing and instability between the two layers. It is found that the transfer of heat across the interface is more vigorous than that of momentum. It is also proved that the NMR-CT technique is a very valuable tool for measuring unsteady three-dimensional flow fields. (Author)

  12. Effect of manual therapy techniques on headache disability in patients with tension-type headache. Randomized controlled trial.

    Science.gov (United States)

    Espí-López, G V; Rodríguez-Blanco, C; Oliva-Pascual-Vaca, A; Benítez-Martínez, J C; Lluch, E; Falla, D

    2014-12-01

    Tension-type headache (TTH) is the most common type of primary headache; however, there is no clear evidence as to which specific treatment is most effective or whether combined treatment is more effective than individual treatments. To assess the effectiveness of manual therapy techniques, applied to the suboccipital region, on aspects of disability in a sample of patients with tension-type headache. Randomized controlled trial. Specialized centre for headache treatment. Seventy-six (62 women) patients (age: 39.9 ± 10.9 years) with episodic chronic TTH. Patients were randomly divided into four treatment groups: 1) suboccipital soft tissue inhibition; 2) occiput-atlas-axis manipulation; 3) combined treatment of both techniques; 4) control. Four sessions were applied over 4 weeks and disability was assessed before and after treatment using the Headache Disability Inventory (HDI). Headache frequency, severity and the functional and emotional subscales of the questionnaire were assessed. Photophobia, phonophobia and pericranial tenderness were also monitored. Headache frequency was significantly reduced with the manipulative and the combined treatment; severity improved in the treatment groups; treatment also reduced the score on the emotional subscale of the HDI. When treatments were combined, effectiveness was noted for all aspects of disability and other symptoms including photophobia, phonophobia and pericranial tenderness. Although individual manual therapy treatments showed a positive change in headache features, measures of photophobia, phonophobia and pericranial tenderness only improved in the group that received the combined treatment, suggesting that combined treatment is the most appropriate for symptomatic relief of TTH.

  13. Blocked Randomization with Randomly Selected Block Sizes

    Directory of Open Access Journals (Sweden)

    Jimmy Efird

    2010-12-01

    When planning a randomized clinical trial, careful consideration must be given to how participants are selected for various arms of a study. Selection and accidental bias may occur when participants are not assigned to study groups with equal probability. A simple random allocation scheme is a process by which each participant has equal likelihood of being assigned to treatment versus referent groups. However, by chance an unequal number of individuals may be assigned to each arm of the study and thus decrease the power to detect statistically significant differences between groups. Block randomization is a commonly used technique in clinical trial design to reduce bias and achieve balance in the allocation of participants to treatment arms, especially when the sample size is small. This method increases the probability that each arm will contain an equal number of individuals by sequencing participant assignments by block. Yet still, the allocation process may be predictable, for example, when the investigator is not blind and the block size is fixed. This paper provides an overview of blocked randomization and illustrates how to avoid selection bias by using random block sizes.
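
    A minimal sketch of the scheme described, with hypothetical arm labels and block sizes: each block is internally balanced, but its size is drawn at random so the allocation sequence stays hard to predict:

      import random

      def blocked_randomization(n, block_sizes=(2, 4, 6), arms=("T", "C")):
          """Assign n participants to arms in blocks of randomly chosen
          size; every block contains each arm equally often."""
          seq = []
          while len(seq) < n:
              b = random.choice(block_sizes)          # random block size
              block = list(arms) * (b // len(arms))   # balanced within block
              random.shuffle(block)
              seq.extend(block)
          return seq[:n]

      random.seed(1)
      print("".join(blocked_randomization(20)))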

  14. Review of online coupling of sample preparation techniques with liquid chromatography.

    Science.gov (United States)

    Pan, Jialiang; Zhang, Chengjiang; Zhang, Zhuomin; Li, Gongke

    2014-03-07

    Sample preparation is still considered the bottleneck of the whole analytical procedure, and efforts have been directed towards automation, improvement of sensitivity and accuracy, and low consumption of organic solvents. Development of online sample preparation (SP) techniques coupled with liquid chromatography (LC) is a promising way to achieve these goals, and has attracted great attention. This article reviews recent advances in online SP-LC techniques. Various online SP techniques are described and summarized, including solid-phase-based extraction, liquid-phase-based extraction assisted with membranes, microwave-assisted extraction, ultrasonic-assisted extraction, accelerated solvent extraction and supercritical fluid extraction. In particular, the coupling approaches of online SP-LC systems and the corresponding interfaces, such as online injectors, autosamplers combined with transport units, desorption chambers and column switching, are discussed and reviewed in detail. Typical applications of the online SP-LC techniques are summarized. Finally, the problems and expected trends in this field are discussed, in order to encourage the further development of online SP-LC techniques.

  15. Cleaning and Cleanliness Verification Techniques for Mars Returned Sample Handling

    Science.gov (United States)

    Mickelson, E. T.; Lindstrom, D. J.; Allton, J. H.; Hittle, J. D.

    2002-01-01

    Precision cleaning and cleanliness verification techniques are examined as a subset of a comprehensive contamination control strategy for a Mars sample return mission. Additional information is contained in the original extended abstract.

  16. Additive non-uniform random sampling in superimposed fiber Bragg grating strain gauge

    Science.gov (United States)

    Ma, Y. C.; Liu, H. Y.; Yan, S. B.; Yang, Y. H.; Yang, M. W.; Li, J. M.; Tang, J.

    2013-05-01

    This paper demonstrates an additive non-uniform random sampling and interrogation method for dynamic and/or static strain gauge using a reflection spectrum from two superimposed fiber Bragg gratings (FBGs). The superimposed FBGs are designed to generate non-equidistant space of a sensing pulse train in the time domain during dynamic strain gauge. By combining centroid finding with smooth filtering methods, both the interrogation speed and accuracy are improved. A 1.9 kHz dynamic strain is measured by generating an additive non-uniform randomly distributed 2 kHz optical sensing pulse train from a mean 500 Hz triangular periodically changing scanning frequency.

  17. Development and Evaluation of a Model-Supported Scientific Inquiry Training Program for Elementary Teachers in Indonesia

    OpenAIRE

    Chandra Ertikanto; Herpratiwi; Tina Yunarti; Post-graduate School of Mathematics Education, Faculty of Teacher Training and Education, University of Lampung, Indonesia,

    2017-01-01

    A teacher training program, named Model-Supported Scientific Inquiry Training Program (MSSITP) has been successfully developed to improve the inquiry skills of Indonesian elementary teachers. The skills enhanced by MSSITP are defining problems, formulating hypotheses, planning and doing investigations, drawing conclusions, and communicating the results. This teacher training program was evaluated by 48 teachers selected by stratified random sampling technique from 48 element...

  18. Elemental analyses of groundwater: demonstrated advantage of low-flow sampling and trace-metal clean techniques over standard techniques

    Science.gov (United States)

    Creasey, C. L.; Flegal, A. R.

    The combined use of both (1) low-flow purging and sampling and (2) trace-metal clean techniques provides more representative measurements of trace-element concentrations in groundwater than results derived with standard techniques. The use of low-flow purging and sampling provides relatively undisturbed groundwater samples that are more representative of in situ conditions, and the use of trace-element clean techniques limits the inadvertent introduction of contaminants during sampling, storage, and analysis. When these techniques are applied, resultant trace-element concentrations are likely to be markedly lower than results based on standard sampling techniques. In a comparison of data derived from contaminated and control groundwater wells at a site in California, USA, trace-element concentrations from this study were 2-1000 times lower than those determined by the conventional techniques used in sampling of the same wells prior to (5 months) and subsequent to (1 month) the collections for this study. Specifically, the cadmium and chromium concentrations derived using standard sampling techniques exceed the California Maximum Contaminant Levels (MCL), whereas in this investigation concentrations of both of those elements are substantially below their MCLs. Consequently, the combined use of low-flow and trace-metal clean techniques may preclude erroneous reports of trace-element contamination in groundwater.

  19. Dental Students' Perceptions of Digital and Conventional Impression Techniques: A Randomized Controlled Trial.

    Science.gov (United States)

    Zitzmann, Nicola U; Kovaltschuk, Irina; Lenherr, Patrik; Dedem, Philipp; Joda, Tim

    2017-10-01

    The aim of this randomized controlled trial was to analyze inexperienced dental students' perceptions of the difficulty and applicability of digital and conventional implant impressions and their preferences including performance. Fifty undergraduate dental students at a dental school in Switzerland were randomly divided into two groups (2×25). Group A first took digital impressions in a standardized phantom model and then conventional impressions, while the procedures were reversed for Group B. Participants were asked to complete a VAS questionnaire (0-100) on the level of difficulty and applicability (user/patient-friendliness) of both techniques. They were asked which technique they preferred and perceived to be more efficient. A quotient of "effective scan time per software-recorded time" (TRIOS) was calculated as an objective quality indicator for intraoral optical scanning (IOS). The majority of students perceived IOS as easier than the conventional technique. Most (72%) preferred the digital approach using IOS to take the implant impression to the conventional method (12%) or had no preference (12%). Although total work was similar for males and females, the TRIOS quotient indicated that male students tended to use their time more efficiently. In this study, dental students with no clinical experience were very capable of acquiring digital tools, indicating that digital impression techniques can be included early in the dental curriculum to help them catch up with ongoing development in computer-assisted technologies used in oral rehabilitation.

  20. A secure cyclic steganographic technique for color images using randomization

    International Nuclear Information System (INIS)

    Muhammad, K.; Ahmad, J.; Rehman, N.U.

    2014-01-01

    Information security is a major concern in today's modern era. Almost all communicating bodies want the security, confidentiality and integrity of their personal data. But this security goal cannot be achieved easily when we are using an open network like the internet. Steganography provides one of the best solutions to this problem. This paper presents a new Cyclic Steganographic Technique (CST) based on the Least Significant Bit (LSB) for true color (RGB) images. The proposed method hides the secret data in the LSBs of cover image pixels in a randomized cyclic manner. The proposed technique is evaluated using both subjective and objective analysis using histogram changeability, Peak Signal-to-Noise Ratio (PSNR) and Mean Square Error (MSE). Experimentally it is found that the proposed method gives promising results in terms of security, imperceptibility and robustness as compared to some existing methods, vindicating the new algorithm. (author)
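
    The paper's exact cyclic traversal is not reproduced here; the Python sketch below shows the generic ingredient it builds on, LSB embedding at key-seeded pseudorandom pixel positions (function names and the fixed key are hypothetical):

      import numpy as np

      def embed_lsb(cover, bits, key=1234):
          """Hide a bit string in the LSBs of an RGB image array, visiting
          channel values in a key-seeded pseudorandom order."""
          stego = cover.copy().ravel()
          order = np.random.default_rng(key).permutation(stego.size)[:len(bits)]
          stego[order] = (stego[order] & 0xFE) | np.array(bits, dtype=np.uint8)
          return stego.reshape(cover.shape)

      def extract_lsb(stego, n_bits, key=1234):
          flat = stego.ravel()
          order = np.random.default_rng(key).permutation(flat.size)[:n_bits]
          return [int(b) for b in flat[order] & 1]

      cover = np.random.default_rng(0).integers(0, 256, (8, 8, 3), dtype=np.uint8)
      secret = [1, 0, 1, 1, 0, 0, 1, 0]
      assert extract_lsb(embed_lsb(cover, secret), len(secret)) == secret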

  1. THE STUDY OF HEAVY METAL FROM ENVIRONMENTAL SAMPLES BY ATOMIC TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Ion V. POPESCU

    2011-05-01

    Using the Atomic Absorption Spectrometry (AAS) and Energy Dispersive X-ray Fluorescence (EDXRF) techniques we analyzed the contents of heavy metals (Cd, Cr, Ni, Pb, Ti, Sr, Co, Bi) in eight wild mushroom species and their soil substrates (48 samples of eight fungal species and 32 underlying soil samples), collected from ten forest sites of Dâmbovița County, Romania. It was determined that the elements, especially heavy metals, in soil were characteristic of the acidic soils of Romanian forest lands and are influenced by industrial pollution. The analytical capabilities of the AAS and EDXRF techniques have been compared and the heavy metal transfer from substrate to mushrooms has been studied. The coefficient of accumulation of essential and heavy metals has been calculated as well. Heavy metal contents of all analyzed mushrooms were generally higher than previously reported in the literature.

  2. Comparison between correlated sampling and the perturbation technique of MCNP5 for fixed-source problems

    International Nuclear Information System (INIS)

    He Tao; Su Bingjing

    2011-01-01

    Highlights: → The performance of the MCNP differential operator perturbation technique is compared with that of the MCNP correlated sampling method for three types of fixed-source problems. → In terms of precision, the MCNP perturbation technique outperforms correlated sampling for one type of problem but performs comparably with or even underperforms correlated sampling for the other two types of problems. → In terms of accuracy, the MCNP perturbation calculations may predict inaccurate results for some of the test problems. However, the accuracy can be improved if the midpoint correction technique is used. - Abstract: Correlated sampling and the differential operator perturbation technique are two methods that enable MCNP (Monte Carlo N-Particle) to simulate small response changes between an original system and a perturbed system. In this work the performance of the MCNP differential operator perturbation technique is compared with that of the MCNP correlated sampling method for three types of fixed-source problems. In terms of precision of predicted response changes, the MCNP perturbation technique outperforms correlated sampling for the problem involving variation of nuclide concentrations in the same direction but performs comparably with or even underperforms correlated sampling for the other two types of problems that involve void or variation of nuclide concentrations in opposite directions. In terms of accuracy, the MCNP differential operator perturbation calculations may predict inaccurate results that deviate from the benchmarks well beyond their uncertainty ranges for some of the test problems. However, the accuracy of the MCNP differential operator perturbation can be improved if the midpoint correction technique is used.

  3. Probability sampling design in ethnobotanical surveys of medicinal plants

    Directory of Open Access Journals (Sweden)

    Mariano Martinez Espinosa

    2012-07-01

    Non-probability sampling designs can be used in ethnobotanical surveys of medicinal plants. However, such methods do not allow statistical inferences to be made from the data generated. The aim of this paper is to present a probability sampling design that is applicable in ethnobotanical studies of medicinal plants. The sampling design employed in the research titled "Ethnobotanical knowledge of medicinal plants used by traditional communities of Nossa Senhora Aparecida do Chumbo district (NSACD), Poconé, Mato Grosso, Brazil" was used as a case study. Probability sampling methods (simple random and stratified sampling) were used in this study. In order to determine the sample size, the following data were considered: population size (N) of 1179 families; confidence coefficient, 95%; sampling error (d), 0.05; and a proportion (p), 0.5. The application of this sampling method resulted in a sample size (n) of at least 290 families in the district. The present study concludes that probability sampling methods necessarily have to be employed in ethnobotanical studies of medicinal plants, particularly where statistical inferences have to be made using the data obtained. This can be achieved by applying different existing probability sampling methods, or better still, a combination of such methods.
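
    The stated figures follow from the standard finite-population formula for estimating a proportion, n = N z² p(1-p) / (d² (N-1) + z² p(1-p)); a small Python check (the formula choice is our reading, but it reproduces the numbers given):

      import math

      def sample_size(N, d=0.05, p=0.5, z=1.96):
          """Finite-population sample size for estimating a proportion."""
          num = N * z**2 * p * (1 - p)
          den = d**2 * (N - 1) + z**2 * p * (1 - p)
          return math.ceil(num / den)

      print(sample_size(1179))   # -> 290, matching the study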

  4. A technique for extracting blood samples from mice in fire toxicity tests

    Science.gov (United States)

    Bucci, T. J.; Hilado, C. J.; Lopez, M. T.

    1976-01-01

    The extraction of adequate blood samples from moribund and dead mice has been a problem because of the small quantity of blood in each animal and the short time available between the animals' death and coagulation of the blood. These difficulties are particularly critical in fire toxicity tests because removal of the test animals while observing proper safety precautions for personnel is time-consuming. Techniques for extracting blood samples from mice were evaluated, and a technique was developed to obtain up to 0.8 ml of blood from a single mouse after death. The technique involves rapid exposure and cutting of the posterior vena cava and accumulation of blood in the peritoneal space. Blood samples of 0.5 ml or more from individual mice have been consistently obtained as much as 16 minutes after apparent death. Results of carboxyhemoglobin analyses of blood appeared reproducible and consistent with carbon monoxide concentrations in the exposure chamber.

  5. A stochastic learning algorithm for layered neural networks

    International Nuclear Information System (INIS)

    Bartlett, E.B.; Uhrig, R.E.

    1992-01-01

    The random optimization method typically uses a Gaussian probability density function (PDF) to generate a random search vector. In this paper the random search technique is applied to the neural network training problem and is modified to dynamically seek out the optimal probability density function (OPDF) from which to select the search vector. The dynamic OPDF search process, combined with an auto-adaptive stratified sampling technique and a dynamic node architecture (DNA) learning scheme, completes the modifications of the basic method. The DNA technique determines the appropriate number of hidden nodes needed for a given training problem. By using DNA, researchers do not have to set the neural network architectures before training is initiated. The approach is applied to networks of generalized, fully interconnected, continuous perceptrons. Computer simulation results are given
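
    For orientation, here is a minimal sketch of the baseline random-optimization step the paper builds on: a fixed Gaussian search PDF, with none of the OPDF adaptation, stratified sampling or DNA extensions. The toy network and data are invented for illustration:

      import numpy as np

      rng = np.random.default_rng(2)

      # Toy data and a one-hidden-layer network; weights live in one flat vector.
      X = rng.normal(size=(64, 3))
      y = (X.sum(axis=1) > 0).astype(float)
      HIDDEN = 5

      def loss(w):
          W1 = w[:3 * HIDDEN].reshape(3, HIDDEN)
          W2 = w[3 * HIDDEN:].reshape(HIDDEN, 1)
          p = 1.0 / (1.0 + np.exp(-(np.tanh(X @ W1) @ W2))).ravel()
          return np.mean((p - y) ** 2)

      # Random optimization: perturb the weights with a Gaussian search
      # vector and keep the move only if the loss improves.
      w = rng.normal(scale=0.1, size=3 * HIDDEN + HIDDEN)
      best = loss(w)
      for _ in range(2000):
          trial = w + rng.normal(scale=0.5, size=w.size)
          trial_loss = loss(trial)
          if trial_loss < best:
              w, best = trial, trial_loss
      print(f"final loss: {best:.4f}")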

  6. A scatter-corrected list-mode reconstruction and a practical scatter/random approximation technique for dynamic PET imaging

    International Nuclear Information System (INIS)

    Cheng, J-C; Rahmim, Arman; Blinder, Stephan; Camborde, Marie-Laure; Raywood, Kelvin; Sossi, Vesna

    2007-01-01

    We describe an ordinary Poisson list-mode expectation maximization (OP-LMEM) algorithm with a sinogram-based scatter correction method based on the single scatter simulation (SSS) technique and a random correction method based on the variance-reduced delayed-coincidence technique. We also describe a practical approximate scatter and random-estimation approach for dynamic PET studies based on a time-averaged scatter and random estimate followed by scaling according to the global numbers of true coincidences and randoms for each temporal frame. The quantitative accuracy achieved using OP-LMEM was compared to that obtained using the histogram-mode 3D ordinary Poisson ordered subset expectation maximization (3D-OP) algorithm with similar scatter and random correction methods, and they showed excellent agreement. The accuracy of the approximated scatter and random estimates was tested by comparing time activity curves (TACs) as well as the spatial scatter distribution from dynamic non-human primate studies obtained from the conventional (frame-based) approach and those obtained from the approximate approach. An excellent agreement was found, and the time required for the calculation of scatter and random estimates in the dynamic studies became much less dependent on the number of frames (we achieved a nearly four times faster performance on the scatter and random estimates by applying the proposed method). The precision of the scatter fraction was also demonstrated for the conventional and the approximate approach using phantom studies

  7. Comparison of sampling techniques for Rift Valley Fever virus ...

    African Journals Online (AJOL)

    We investigated mosquito sampling techniques with two types of traps and attractants at different time for trapping potential vectors for Rift Valley Fever virus. The study was conducted in six villages in Ngorongoro district in Tanzania from September to October 2012. A total of 1814 mosquitoes were collected, of which 738 ...

  8. A line-based vegetation sampling technique and its application in ...

    African Journals Online (AJOL)

    ... percentage cover, density and intercept frequency) and also provides plant size distributions, yet requires no more sampling effort than the line-intercept method. A field test of the three techniques in succulent karoo showed that the discriminating ...

  9. Comparison of correlation analysis techniques for irregularly sampled time series

    Directory of Open Access Journals (Sweden)

    K. Rehfeld

    2011-06-01

    Geoscientific measurements often provide time series with irregular time sampling, requiring either data reconstruction (interpolation) or sophisticated methods to handle irregular sampling. We compare the linear interpolation technique and different approaches for analyzing the correlation functions and persistence of irregularly sampled time series, such as Lomb-Scargle Fourier transformation and kernel-based methods. In a thorough benchmark test we investigate the performance of these techniques.

    All methods have comparable root mean square errors (RMSEs) for low skewness of the inter-observation time distribution. For high skewness, i.e. very irregular data, interpolation bias and RMSE increase strongly. We find a 40% lower RMSE for the lag-1 autocorrelation function (ACF) for the Gaussian kernel method vs. the linear interpolation scheme in the analysis of highly irregular time series. For the cross correlation function (CCF) the RMSE is then lower by 60%. The application of the Lomb-Scargle technique gave results comparable to the kernel methods for the univariate, but poorer results in the bivariate case. Especially the high-frequency components of the signal, where classical methods show a strong bias in ACF and CCF magnitude, are preserved when using the kernel methods.

    We illustrate the performances of interpolation vs. Gaussian kernel method by applying both to paleo-data from four locations, reflecting late Holocene Asian monsoon variability as derived from speleothem δ18O measurements. Cross correlation results are similar for both methods, which we attribute to the long time scales of the common variability. The persistence time (memory is strongly overestimated when using the standard, interpolation-based, approach. Hence, the Gaussian kernel is a reliable and more robust estimator with significant advantages compared to other techniques and suitable for large scale application to paleo-data.
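
    A condensed sketch of the kernel idea for the cross correlation of two irregularly sampled series (synthetic data, a Gaussian kernel and a hand-rolled normalization; not the authors' code):

      import numpy as np

      rng = np.random.default_rng(8)

      # Two series observed at different, irregular random times; the
      # second lags the first by 2 time units.
      tx = np.sort(rng.uniform(0, 100, 300))
      ty = np.sort(rng.uniform(0, 100, 300))
      x = np.sin(0.3 * tx) + 0.3 * rng.normal(size=tx.size)
      y = np.sin(0.3 * (ty - 2.0)) + 0.3 * rng.normal(size=ty.size)

      def kernel_xcf(lag, h=1.0):
          """Gaussian-kernel CCF at one lag: every observation pair
          contributes, weighted by how close its time difference is to
          the lag; no interpolation is involved."""
          xc, yc = x - x.mean(), y - y.mean()
          dt = ty[None, :] - tx[:, None]           # all pairwise time gaps
          w = np.exp(-0.5 * ((dt - lag) / h) ** 2)
          return (w * np.outer(xc, yc)).sum() / (w.sum() * x.std() * y.std())

      lags = np.arange(-10, 11)
      ccf = [kernel_xcf(lag) for lag in lags]
      print(lags[int(np.argmax(ccf))])             # peak near the true offset 2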

  10. Application of nuclear and allied techniques for the characterisation of forensic samples

    International Nuclear Information System (INIS)

    Sudersanan, M.; Kayasth, S.R.; Pant, D.R.; Chattopadhyay, N.; Bhattacharyya, C.N.

    2002-01-01

    Forensic science deals with the application of various techniques from physics, chemistry and biology to crime investigation. The legal implications of such analyses place considerable restrictions on the choice of analytical techniques. Moreover, the unknown nature of the materials, the limited availability of samples and the large number of elements to be analysed put considerable strain on the analytical chemist in selecting the appropriate technique. The availability of nuclear techniques has considerably enhanced the scope of forensic analysis. This paper deals with recent results on the use of nuclear and allied analytical techniques for forensic applications. One important type of sample of forensic importance pertains to the identification of gunshot residues. The use of nuclear techniques has considerably simplified the interpretation of results through the use of appropriate elements like Ba, Cu, Sb, Zn, As and Sn. The combination with non-nuclear techniques for elements like Pb and Ni, which are not easily amenable to analysis by NAA, and the use of appropriate separation procedures has led to the use of this method as a valid and versatile analytical procedure. In view of the presence of large amounts of extraneous materials like cloth and body tissues in these samples, and the limited availability of materials, the procedures for sample collection, dissolution and analysis have been standardized. Analysis of unknown materials like powders and metallic pieces for the possible presence of nuclear materials, or as materials in illicit trafficking, is becoming important in recent years. The use of a multi-technique approach is important in this case. Use of non-destructive techniques like XRF and radioactive counting enables the preliminary identification of materials and the detection of radioactivity. Subsequent analysis by NAA or other appropriate analytical methods allows the characterization of the materials. Such

  12. The Relationship between Knowledge and Fast-Food Consumption Habits and Nutritional Status in Adolescents

    OpenAIRE

    Hanum, T. Syarifah Latifah; Dewi, Ari Pristiana; ', Erwin '

    2015-01-01

    The aim of this research is to identify the correlation between knowledge and the habit of consuming fast food with nutritional status in adolescents. This research used a cross-sectional design. The sample comprised 83 adolescents, selected using the proportionate stratified random sampling technique. The measuring instrument was a questionnaire designed by the researcher herself and checked with validity and reliability tests. The analyses of this research were...
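
    Proportionate stratified random sampling allocates the total sample to strata (e.g., school grades) in proportion to stratum size, then draws a simple random sample within each stratum. A minimal sketch of the allocation step, with invented stratum sizes (the study's actual strata are not given in this record):

        import math

        def proportionate_allocation(strata_sizes, n):
            """Allocate total sample size n across strata in proportion to
            stratum population sizes, using largest-remainder rounding."""
            total = sum(strata_sizes.values())
            raw = {s: n * size / total for s, size in strata_sizes.items()}
            alloc = {s: math.floor(v) for s, v in raw.items()}
            # hand the remaining units to the largest fractional parts
            leftover = n - sum(alloc.values())
            for s in sorted(raw, key=lambda s: raw[s] - alloc[s], reverse=True)[:leftover]:
                alloc[s] += 1
            return alloc

        # e.g., three grades as strata, total sample of 83 adolescents
        print(proportionate_allocation({"grade7": 120, "grade8": 100, "grade9": 90}, 83))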

  13. Obesity among adolescents in five Arab countries: relative to gender and age

    OpenAIRE

    A.O. Musaiger; M. Al-Mannai; O. Al-Lalla; S. Saghir; I. Halahleh; M. M. Benhamed; F. Kalam; E.Y.A. Ali

    2013-01-01

    Objective: To determine the prevalence of overweight and obesity among adolescents in five Arab countries, relative to age and sex. Methods: A multistage stratified random sampling technique was used to select the secondary school students from five Arab countries (Kuwait, Libya, Palestine, Syria and United Arab Emirates). The total sample was 3,302 (1,584 males, 1,718 females). Weight and height were measured, and body mass index was used to calculate the proportion of overweight and obesity...

  14. Rare event techniques applied in the Rasmussen study

    International Nuclear Information System (INIS)

    Vesely, W.E.

    1977-01-01

    The Rasmussen Study estimated public risks from commercial nuclear power plant accidents, and therefore the statistics of rare events had to be treated. Two types of rare events were specifically handled: those which were probabilistically rare and those which were statistically rare. Four techniques were used to estimate probabilities of rare events: aggregating data samples, discretizing "continuous" events, extrapolating from minor to catastrophic severities, and decomposing events using event trees and fault trees. In aggregating or combining data, the goal was to enlarge the data sample so that the rare event was no longer rare, i.e., so that the enlarged data sample contained one or more occurrences of the event of interest. This aggregation gave rise to random variable treatments of failure rates, occurrence frequencies, and other characteristics estimated from data. This random variable treatment can be interpreted as comparable to an empirical Bayes technique or a Bayesian technique. In the discretizing technique, events of a detailed nature were grouped together into a grosser event for purposes of analysis as well as data collection. The treatment of data characteristics as random variables helped to account for the uncertainties arising from this discretizing. In the severity extrapolation technique, a severity variable was associated with each event occurrence for the purpose of predicting probabilities of catastrophic occurrences. Tail behaviors of distributions therefore needed to be considered. Finally, event trees and fault trees were used to express accident occurrences and system failures in terms of more basic events for which data existed. Common mode failures and general dependencies therefore needed to be treated. 2 figures

  15. Determination of palladium in biological samples applying nuclear analytical techniques

    International Nuclear Information System (INIS)

    Cavalcante, Cassio Q.; Sato, Ivone M.; Salvador, Vera L. R.; Saiki, Mitiko

    2008-01-01

    This study presents Pd determinations in bovine tissue samples containing palladium prepared in the laboratory, and CCQM-P63 automotive catalyst materials of the Proficiency Test, using instrumental thermal and epithermal neutron activation analysis and energy dispersive X-ray fluorescence techniques. Solvent extraction and solid phase extraction procedures were also applied to separate Pd from interfering elements before the irradiation in the nuclear reactor. The results obtained by different techniques were compared against each other to examine sensitivity, precision and accuracy. (author)

  16. Experimental investigation and CFD simulation of horizontal stratified two-phase flow phenomena

    International Nuclear Information System (INIS)

    Vallee, Christophe; Hoehne, Thomas; Prasser, Horst-Michael; Suehnel, Tobias

    2008-01-01

    For the investigation of stratified two-phase flow, two horizontal channels with rectangular cross-section were built at Forschungszentrum Dresden-Rossendorf (FZD). The channels allow the investigation of air/water co-current flows, especially the slug behaviour, at atmospheric pressure and room temperature. The test-sections are made of acrylic glass, so that optical techniques, like high-speed video observation or particle image velocimetry (PIV), can be applied for measurements. The rectangular cross-section was chosen to provide better observation possibilities. Moreover, dynamic pressure measurements were performed and synchronised with the high-speed camera system. CFD post-test simulations of stratified flows were performed using the code ANSYS CFX. The Euler-Euler two fluid model with the free surface option was applied on grids of minimum 4 x 10^5 control volumes. The turbulence was modelled separately for each phase using the k-ω-based shear stress transport (SST) turbulence model. The results compare very well in terms of slug formation, velocity, and breaking. The qualitative agreement between calculation and experiment is encouraging and shows that CFD can be a useful tool in studying horizontal two-phase flow

  17. Experimental investigation and CFD simulation of horizontal stratified two-phase flow phenomena

    International Nuclear Information System (INIS)

    Vallee, Christophe; Hohne, Thomas; Prasser, Horst-Michael; Suhnel, Tobias

    2007-01-01

    For the investigation of stratified two-phase flow, two horizontal channels with rectangular cross-section were built at Forschungszentrum Rossendorf. The channels allow the investigation of air/water co-current flows, especially the slug behaviour, at atmospheric pressure and room temperature. The test-sections are made of acrylic glass, so that optical techniques, like high-speed video observation or particle image velocimetry (PIV), can be applied for measurements. The rectangular cross-section was chosen to provide better observation possibilities. Moreover, dynamic pressure measurements were performed and synchronized with the high-speed camera system. CFD post-test simulations of stratified flows were performed using the code ANSYS CFX. The Euler-Euler two fluid model with the free surface option was applied on grids of minimum 4 x 10^5 control volumes. The turbulence was modelled separately for each phase using the k-ω-based shear stress transport (SST) turbulence model. The results compare very well in terms of slug formation, velocity, and breaking. The qualitative agreement between calculation and experiment is encouraging and shows that CFD can be a useful tool in studying horizontal two-phase flow. (authors)

  18. Experimental investigation and CFD simulation of horizontal stratified two-phase flow phenomena

    Energy Technology Data Exchange (ETDEWEB)

    Vallee, Christophe [Forschungszentrum Dresden-Rossendorf e.V., Dresden (Germany)], E-mail: c.vallee@fzd.de; Hoehne, Thomas; Prasser, Horst-Michael; Suehnel, Tobias [Forschungszentrum Dresden-Rossendorf e.V., Dresden (Germany)

    2008-03-15

    For the investigation of stratified two-phase flow, two horizontal channels with rectangular cross-section were built at Forschungszentrum Dresden-Rossendorf (FZD). The channels allow the investigation of air/water co-current flows, especially the slug behaviour, at atmospheric pressure and room temperature. The test-sections are made of acrylic glass, so that optical techniques, like high-speed video observation or particle image velocimetry (PIV), can be applied for measurements. The rectangular cross-section was chosen to provide better observation possibilities. Moreover, dynamic pressure measurements were performed and synchronised with the high-speed camera system. CFD post-test simulations of stratified flows were performed using the code ANSYS CFX. The Euler-Euler two fluid model with the free surface option was applied on grids of minimum 4 x 10^5 control volumes. The turbulence was modelled separately for each phase using the k-ω-based shear stress transport (SST) turbulence model. The results compare very well in terms of slug formation, velocity, and breaking. The qualitative agreement between calculation and experiment is encouraging and shows that CFD can be a useful tool in studying horizontal two-phase flow.

  19. A new sampling technique for surface exposure dating using a portable electric rock cutter

    Directory of Open Access Journals (Sweden)

    Yusuke Suganuma

    2012-07-01

    Full Text Available Surface exposure dating using in situ cosmogenic nuclides has contributed to our understanding of Earth-surface processes. The precision of the ages estimated by this method is affected by the sample geometry; therefore, high-accuracy measurement of the thickness and shape of the rock sample is crucial. However, it is sometimes difficult to meet these requirements with conventional sampling methods using a hammer and chisel. Here, we propose a new sampling technique using a portable electric rock cutter. This sampling technique is faster, produces more precisely shaped samples, and allows for a more precise age interpretation. A simple theoretical model demonstrates that the age error due to defective sample geometry increases as the total sample thickness increases, indicating the importance of precise sampling for surface exposure dating.

  20. Mantle biopsy: a technique for nondestructive tissue-sampling of freshwater mussels

    Science.gov (United States)

    David J. Berg; Wendell R. Haag; Sheldon I. Guttman; James B. Sickel

    1995-01-01

    Mantle biopsy is a means of obtaining tissue samples for genetic, physiological, and contaminant studies of bivalves; but the effects of this biopsy on survival have not been determined. We describe a simple technique for obtaining such samples from unionacean bivalves and how we compared survival among biopsied and control organisms in field experiments. Survival was...

  1. A Monte Carlo Sampling Technique for Multi-phonon Processes

    Energy Technology Data Exchange (ETDEWEB)

    Hoegberg, Thure

    1961-12-15

    A sampling technique for selecting scattering angle and energy gain in Monte Carlo calculations of neutron thermalization is described. It is supposed that the scattering is separated into processes involving different numbers of phonons. The number of phonons involved is first determined. Scattering angle and energy gain are then chosen by using special properties of the multi-phonon term.

  2. Simulation model of stratified thermal energy storage tank using finite difference method

    Science.gov (United States)

    Waluyo, Joko

    2016-06-01

    A stratified TES tank is normally used in cogeneration plants. Stratified TES tanks are simple, low cost, and equal or superior in thermal performance. The advantage of a TES tank is that it enables shifting of energy usage from off-peak to on-peak demand periods. To increase energy utilization in a stratified TES tank, a simulation model is required that can simulate the charging phenomenon in the tank precisely. This paper aims to develop a novel model addressing this problem. The model incorporates the chiller into the charging of the stratified TES tank in a closed system. It is one-dimensional and accounts for heat transfer effects, covering the main factors contributing to the degradation of the temperature distribution, namely conduction through the tank wall, conduction between cool and warm water, the mixing effect at the initial stage of charging, and heat loss to the surroundings. The simulation model is based on the finite difference method, utilizes the buffer concept, and is solved explicitly. Validation of the simulation model is carried out using observed data obtained from an operating stratified TES tank in a cogeneration plant. The temperature distribution of the model is capable of representing the S-curve pattern as well as simulating the decreased charging temperature after reaching the full condition. The coefficients of determination between the observed data and the model were higher than 0.88, meaning that the model is capable of simulating the charging phenomenon in the stratified TES tank. The model not only generates the temperature distribution but can also be extended to represent transient conditions during charging. This model can be used to address the temperature limitation that occurs when charging the stratified TES tank with an absorption chiller. Further, the stratified TES tank can be
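
    The core of such a layer model is compact. The following is a simplified, explicit one-dimensional finite-difference sketch of our own (not the paper's code): upwind advection carries charged water through the layer stack while axial conduction and a lumped loss term degrade stratification; the paper's wall conduction and inlet mixing terms are omitted, and all parameter values are invented.

        import numpy as np

        def step(T, dt, dz, v, T_in, alpha=1.43e-7, k_loss=0.0, T_amb=20.0):
            """One explicit time step for the layer temperatures T:
            axial conduction (diffusivity alpha), upwind advection at
            charging velocity v with the inlet at node 0, lumped loss."""
            Tn = T.copy()
            # explicit central-difference conduction on interior nodes
            Tn[1:-1] += alpha * dt / dz**2 * (T[2:] - 2 * T[1:-1] + T[:-2])
            # upwind advection of the charging flow
            Tn[1:] -= v * dt / dz * (T[1:] - T[:-1])
            Tn[0] = T_in
            # lumped heat exchange with the surroundings
            Tn += k_loss * dt * (T_amb - Tn)
            return Tn

        # 2 m tank split into 100 layers, charged with 5 degC water for 1 h
        dz, dt = 0.02, 1.0          # dt respects dt <= dz**2 / (2 * alpha)
        T = np.full(100, 12.0)
        for _ in range(3600):
            T = step(T, dt, dz, v=1e-4, T_in=5.0)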

  3. Evaluation of primary immunization coverage of infants under universal immunization programme in an urban area of bangalore city using cluster sampling and lot quality assurance sampling techniques.

    Science.gov (United States)

    K, Punith; K, Lalitha; G, Suman; Bs, Pradeep; Kumar K, Jayanth

    2008-07-01

    Research question: Is the LQAS technique better than the cluster sampling technique in terms of resources to evaluate immunization coverage in an urban area? Objective: To assess and compare lot quality assurance sampling against cluster sampling in the evaluation of primary immunization coverage. Study design: Population-based cross-sectional study. Study setting: Areas under Mathikere Urban Health Center. Study subjects: Children aged 12 months to 23 months. Sample size: 220 in cluster sampling, 76 in lot quality assurance sampling. Statistical analysis: Percentages and proportions, chi-square test. Results: (1) Using cluster sampling, the percentages of completely immunized, partially immunized and unimmunized children were 84.09%, 14.09% and 1.82%, respectively. With lot quality assurance sampling, they were 92.11%, 6.58% and 1.31%, respectively. (2) Immunization coverage levels as evaluated by the cluster sampling technique were not statistically different from the coverage values obtained by the lot quality assurance sampling technique. Considering the time and resources required, lot quality assurance sampling was found to be the better technique for evaluating primary immunization coverage in an urban area.
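
    The economy of LQAS comes from a simple binomial decision rule: sample a small lot and "accept" it only if the number of unimmunized children stays at or below a preset threshold. A sketch of the operating characteristics of the classic 19/6 rule follows; the 80%/50% coverage thresholds are illustrative defaults, not figures from this study.

        from scipy.stats import binom

        def lqas_errors(n=19, d=6, p_good=0.80, p_bad=0.50):
            """Error rates of the LQAS rule 'accept the lot if at most d of
            n sampled children are unimmunized': alpha = rejecting a lot
            with true coverage p_good, beta = accepting one with p_bad."""
            alpha = 1 - binom.cdf(d, n, 1 - p_good)
            beta = binom.cdf(d, n, 1 - p_bad)
            return alpha, beta

        print(lqas_errors())  # both error rates come out near 10% for 19/6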

  4. TRAN-STAT: statistics for environmental studies, Number 22. Comparison of soil-sampling techniques for plutonium at Rocky Flats

    International Nuclear Information System (INIS)

    Gilbert, R.O.; Bernhardt, D.E.; Hahn, P.B.

    1983-01-01

    A summary of a field soil sampling study conducted around the Rocky Flats, Colorado plant in May 1977 is presented. Several different soil sampling techniques that had been used in the area were applied at four different sites. One objective was to compare the average 239-240Pu concentration values obtained by the various soil sampling techniques used. There was also interest in determining whether there are differences in the reproducibility of the various techniques, and how the techniques compared with the proposed EPA technique of sampling to 1 cm depth. Statistically significant differences in average concentrations between the techniques were found. The differences could be largely related to the differences in sampling depth, the primary physical variable between the techniques. The reproducibility of the techniques was evaluated by comparing coefficients of variation. Differences between coefficients of variation were not statistically significant. Average (median) coefficients ranged from 21 to 42 percent for the five sampling techniques. A laboratory study indicated that various sample treatment and particle sizing techniques could increase the concentration of plutonium in the less than 10 micrometer size fraction by up to a factor of about 4 compared to the 2 mm size fraction

  5. Free Falling in Stratified Fluids

    Science.gov (United States)

    Lam, Try; Vincent, Lionel; Kanso, Eva

    2017-11-01

    Leaves falling in air and discs falling in water are examples of unsteady descents due to complex interactions between gravitational and aerodynamic forces. Understanding these descent modes is relevant to many branches of engineering and science, from estimating the behavior of re-entry space vehicles to studying the biomechanics of seed dispersal. For regularly shaped objects falling in homogeneous fluids, the motion is relatively well understood. However, less is known about how density stratification of the fluid medium affects the falling behavior. Here, we experimentally investigate the descent of discs in both pure water and in stable linearly stratified fluids for Froude numbers Fr 1 and Reynolds numbers Re between 1000 and 2000. We found that stable stratification (1) enhances the radial dispersion of the disc at landing, (2) increases the descent time, (3) decreases the inclination (or nutation) angle, and (4) decreases the fluttering amplitude while falling. We conclude by commenting on how the corresponding information can be used as a predictive model for objects free falling in stratified fluids.

  6. Evaluation of primary immunization coverage of infants under universal immunization programme in an urban area of Bangalore city using cluster sampling and lot quality assurance sampling techniques

    Directory of Open Access Journals (Sweden)

    Punith K

    2008-01-01

    Full Text Available Research Question: Is LQAS technique better than cluster sampling technique in terms of resources to evaluate the immunization coverage in an urban area? Objective: To assess and compare the lot quality assurance sampling against cluster sampling in the evaluation of primary immunization coverage. Study Design: Population-based cross-sectional study. Study Setting: Areas under Mathikere Urban Health Center. Study Subjects: Children aged 12 months to 23 months. Sample Size: 220 in cluster sampling, 76 in lot quality assurance sampling. Statistical Analysis: Percentages and Proportions, Chi square Test. Results: (1) Using cluster sampling, the percentage of completely immunized, partially immunized and unimmunized children were 84.09%, 14.09% and 1.82%, respectively. With lot quality assurance sampling, it was 92.11%, 6.58% and 1.31%, respectively. (2) Immunization coverage levels as evaluated by cluster sampling technique were not statistically different from the coverage value as obtained by lot quality assurance sampling techniques. Considering the time and resources required, it was found that lot quality assurance sampling is a better technique in evaluating the primary immunization coverage in an urban area.

  7. Monte Carlo and Quasi-Monte Carlo Sampling

    CERN Document Server

    Lemieux, Christiane

    2009-01-01

    Presents essential tools for using quasi-Monte Carlo sampling in practice. This book focuses on issues related to Monte Carlo methods - uniform and non-uniform random number generation, variance reduction techniques. It covers several aspects of quasi-Monte Carlo methods.

  8. Parameterizing Spatial Models of Infectious Disease Transmission that Incorporate Infection Time Uncertainty Using Sampling-Based Likelihood Approximations.

    Directory of Open Access Journals (Sweden)

    Rajat Malik

    Full Text Available A class of discrete-time models of infectious disease spread, referred to as individual-level models (ILMs), are typically fitted in a Bayesian Markov chain Monte Carlo (MCMC) framework. These models quantify probabilistic outcomes regarding the risk of infection of susceptible individuals due to various susceptibility and transmissibility factors, including their spatial distance from infectious individuals. The infectious pressure from infected individuals exerted on susceptible individuals is intrinsic to these ILMs. Unfortunately, quantifying this infectious pressure for data sets containing many individuals can be computationally burdensome, leading to a time-consuming likelihood calculation and, thus, computationally prohibitive MCMC-based analysis. This problem worsens when using data augmentation to allow for uncertainty in infection times. In this paper, we develop sampling methods that can be used to calculate a fast, approximate likelihood when fitting such disease models. A simple random sampling approach is initially considered, followed by various spatially stratified schemes. We test and compare the performance of our methods with both simulated data and data from the 2001 foot-and-mouth disease (FMD) epidemic in the U.K. Our results indicate that substantial computation savings can be obtained, albeit with some information loss, suggesting that such techniques may be of use in the analysis of very large epidemic data sets.
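
    The bottleneck being attacked is the sum of a spatial kernel over every infectious individual for every susceptible one. A minimal sketch of the simple-random-sampling variant, using a generic power-law kernel of our own choosing (the paper's ILM kernels, parameters, and spatially stratified refinements differ):

        import numpy as np

        rng = np.random.default_rng(1)

        def sampled_pressure(sus_xy, inf_xy, m, beta=1.0, alpha=2.0):
            """Approximate each susceptible's total infectious pressure by
            summing beta * d**(-alpha) over a simple random sample of m
            infectious individuals, scaled up by n_inf / m."""
            n_inf = len(inf_xy)
            idx = rng.choice(n_inf, size=min(m, n_inf), replace=False)
            d = np.linalg.norm(sus_xy[:, None, :] - inf_xy[idx][None, :, :], axis=2)
            return (n_inf / len(idx)) * beta * np.sum(d ** -alpha, axis=1)

        # 5000 infectious and 2000 susceptible individuals; sample only 200
        inf_xy = rng.uniform(0, 100, (5000, 2))
        sus_xy = rng.uniform(0, 100, (2000, 2))
        pressure_hat = sampled_pressure(sus_xy, inf_xy, m=200)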

  9. Analytical techniques for measurement of 99Tc in environmental samples

    International Nuclear Information System (INIS)

    Anon.

    1979-01-01

    Three new methods have been developed for measuring 99Tc in environmental samples. The most sensitive method is isotope dilution mass spectrometry, which allows measurement of about 1 x 10^-12 grams of 99Tc. Results on analysis of five samples by this method compare very well with values obtained by a second independent method, which involves counting of beta particles from 99Tc and internal conversion electrons from 97mTc. A third method involving electrothermal atomic absorption has also been developed. Although this method is not as sensitive as the first two techniques, the cost per analysis is expected to be considerably less for certain types of samples

  10. Determination of some trace elements in biological samples using XRF and TXRF techniques

    International Nuclear Information System (INIS)

    Khuder, A.; Karjou, J.; Sawan, M. K.

    2006-07-01

    XRF and TXRF techniques were successfully used for the multi-element determination of trace elements in whole blood and human head hair samples. This was achieved by direct analysis using the XRF technique with different collimation units, and by optimized chemical procedures for TXRF analysis. The light elements S and P were preferably determined by XRF with primary X-ray excitation, while K, Ca, Fe and Br were determined with very good accuracy and precision using XRF with Cu- and Mo-secondary targets. The chemical procedure based on the preconcentration of trace elements by APDC proved superior for the determination of traces of Ni and Pb, in the ranges of 1.0-1.7 μg/dl and 11-23 μg/dl respectively, in whole blood samples by the TXRF technique; the determination of other elements such as Cu and Zn was also achievable using this approach. Rb in whole blood samples was determined directly by TXRF analysis after digestion of the samples in a PTFE bomb. (author)

  11. Recent advances in sample preparation techniques and methods of sulfonamides detection - A review.

    Science.gov (United States)

    Dmitrienko, Stanislava G; Kochuk, Elena V; Apyari, Vladimir V; Tolmacheva, Veronika V; Zolotov, Yury A

    2014-11-19

    Sulfonamides (SAs) have been the most widely used antimicrobial drugs for more than 70 years, and their residues in foodstuffs and environmental samples pose serious health hazards. For this reason, sensitive and specific methods for the quantification of these compounds in numerous matrices have been developed. This review intends to provide an updated overview of the recent trends over the past five years in sample preparation techniques and methods for detecting SAs. Examples of the sample preparation techniques, including liquid-liquid and solid-phase extraction, dispersive liquid-liquid microextraction and QuEChERS, are given. Different methods of detecting the SAs present in food and feed and in environmental, pharmaceutical and biological samples are discussed. Copyright © 2014 Elsevier B.V. All rights reserved.

  12. Impressions of the turbulence variability in a weakly stratified, flat-bottom deep-sea ‘boundary layer’

    NARCIS (Netherlands)

    van Haren, H.

    2015-01-01

    The character of turbulent overturns in a weakly stratified deep-sea is investigated in some detail using 144 high-resolution temperature sensors at 0.7 m intervals, starting 5 m above the bottom. A 9-day, 1 Hz sampled record from the 912 m depth flat-bottom (<0.5% bottom-slope) mooring site in the

  13. IMAGE SEGMENTATION BASED ON MARKOV RANDOM FIELD AND WATERSHED TECHNIQUES

    Institute of Scientific and Technical Information of China (English)

    纳瑟; 刘重庆

    2002-01-01

    This paper presents a method that incorporates Markov Random Field (MRF), watershed segmentation and merging techniques for performing image segmentation and edge detection tasks. MRF is used to obtain an initial estimate of the regions in the image under processing; in the MRF model, the gray level at pixel location i in an image X depends on the gray levels of neighboring pixels. The process needs an initial segmented result, which is obtained using K-means clustering and the minimum-distance rule; the region process is then modeled by MRF to obtain an image containing distinct intensity regions. Starting from this, the gradient of that image is calculated and a watershed technique is employed. The MRF stage yields an image with distinct intensity regions carrying all the edge and region information; the watershed algorithm then improves the segmentation by superimposing a closed and accurate boundary on each region. After all pixels of the segmented regions have been processed, a map of primitive regions with edges is generated. Finally, a merge process based on averaged mean values is employed. The final segmentation and edge detection result is one closed boundary per actual region in the image.
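
    A compressed sketch of the same pipeline built from off-the-shelf tools, with K-means standing in for the MRF relaxation and the final merge step omitted (scikit-image and scikit-learn are our choice of libraries, not the authors'):

        from scipy import ndimage as ndi
        from skimage import data, filters, segmentation, util
        from sklearn.cluster import KMeans

        # K-means on gray levels provides the initial region estimate
        img = util.img_as_float(data.coins())
        labels0 = KMeans(n_clusters=2, n_init=10).fit_predict(
            img.reshape(-1, 1)).reshape(img.shape)

        # connected components of the initial regions act as watershed markers
        markers, _ = ndi.label(labels0)

        # watershed on the image gradient closes a boundary around each region
        segmented = segmentation.watershed(filters.sobel(img), markers)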

  14. Efficacy and complications associated with a modified inferior alveolar nerve block technique. A randomized, triple-blind clinical trial.

    Science.gov (United States)

    Montserrat-Bosch, Marta; Figueiredo, Rui; Nogueira-Magalhães, Pedro; Arnabat-Dominguez, Josep; Valmaseda-Castellón, Eduard; Gay-Escoda, Cosme

    2014-07-01

    To compare the efficacy and complication rates of two different techniques for inferior alveolar nerve blocks (IANB). A randomized, triple-blind clinical trial comprising 109 patients who required lower third molar removal was performed. In the control group, all patients received an IANB using the conventional Halsted technique, whereas in the experimental group, a modified technique using a more inferior injection point was performed. A total of 100 patients were randomized. The modified technique group showed a significantly higher onset time in the lower lip and chin area, and was frequently associated with a lingual electric discharge sensation. Three failures were recorded, 2 of them in the experimental group. No relevant local or systemic complications were registered. Both IANB techniques used in this trial are suitable for lower third molar removal. However, performing an inferior alveolar nerve block in a more inferior position (modified technique) extends the onset time, does not seem to reduce the risk of intravascular injections and might increase the risk of lingual nerve injuries.

  15. Cost-effective sampling of 137Cs-derived net soil redistribution: part 1 – estimating the spatial mean across scales of variation

    International Nuclear Information System (INIS)

    Li, Y.; Chappell, A.; Nyamdavaa, B.; Yu, H.; Davaasuren, D.; Zoljargal, K.

    2015-01-01

    The 137Cs technique for estimating net time-integrated soil redistribution is valuable for understanding the factors controlling soil redistribution by all processes. The literature on this technique is dominated by studies of individual fields and describes its typically time-consuming nature. We contend that the community making these studies has inappropriately assumed that many 137Cs measurements are required, and hence that estimates of net soil redistribution can only be made at the field scale. Here, we support future studies of 137Cs-derived net soil redistribution in applying their often limited resources across scales of variation (field, catchment, region etc.) without compromising the quality of the estimates at any scale. We describe a hybrid, design-based and model-based, stratified random sampling design with composites to estimate the sampling variance, and a cost model for fieldwork and laboratory measurements. Geostatistical mapping of net (1954–2012) soil redistribution as a case study on the Chinese Loess Plateau is compared with estimates for several other sampling designs popular in the literature. We demonstrate the cost-effectiveness of the hybrid design for spatial estimation of net soil redistribution. To demonstrate the limitations of current sampling approaches to cut across scales of variation, we extrapolate our estimate of net soil redistribution across the region and show that, for the same resources, estimates from many fields could have been provided, which would elucidate the cause of differences within and between regional estimates. We recommend that future studies evaluate carefully the sampling design to consider the opportunity to investigate 137Cs-derived net soil redistribution across scales of variation. - Highlights: • The 137Cs technique estimates net time-integrated soil redistribution by all processes. • It is time-consuming and dominated by studies of individual fields. • We use limited resources to estimate soil
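
    The design-based half of such a scheme rests on the textbook stratified estimator of a spatial mean and its sampling variance, which is what makes the precision-for-cost tradeoff explicit. A minimal sketch (the stratum weights and 137Cs-derived values below are invented for illustration):

        import numpy as np

        def stratified_mean(samples, weights):
            """Stratified estimate of the spatial mean and the variance of
            that estimate; samples[h] holds the measurements in stratum h,
            weights[h] its share W_h of the total area."""
            mean = sum(w * np.mean(y) for y, w in zip(samples, weights))
            var = sum(w**2 * np.var(y, ddof=1) / len(y)
                      for y, w in zip(samples, weights))
            return mean, var

        strata = [np.array([2.1, 1.8, 2.4, 2.0]),
                  np.array([3.5, 3.1, 2.9, 3.8, 3.3, 3.0]),
                  np.array([1.2, 0.9, 1.1, 1.4, 1.0])]
        print(stratified_mean(strata, weights=[0.5, 0.3, 0.2]))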

  16. Turbulent transport of passive scalar behind line sources in an unstably stratified open channel flow

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Chun-Ho [The Hong Kong Polytechnic University, Kowloon (Hong Kong). Department of Building and Real Estate; Leung, Dennis Y.C. [The University of Hong Kong (Hong Kong). Department of Mechanical Engineering

    2006-11-15

    This study employs a direct numerical simulation (DNS) technique to study the flow, turbulence structure, and passive scalar plume transport behind line sources in an unstably stratified open channel flow. The scalar transport behaviors for five emission heights (z_s = 0, 0.25H, 0.5H, 0.75H, and H, where H is the channel height) at a Reynolds number of 3000, a Prandtl number and a Schmidt number of 0.72, and a Richardson number of -0.2 are investigated. The vertically meandering mean plume heights and dispersion coefficients calculated by the current DNS model agree well with laboratory results and field measurements in the literature. It is found that the plume meandering is due to the movement of the positive and negative vertical turbulent scalar fluxes above and below the mean plume heights, respectively. These findings help explain the plume meandering mechanism in the unstably stratified atmospheric boundary layer. (author)

  17. Symbol synchronization and sampling frequency synchronization techniques in real-time DDO-OFDM systems

    Science.gov (United States)

    Chen, Ming; He, Jing; Cao, Zizheng; Tang, Jin; Chen, Lin; Wu, Xian

    2014-09-01

    In this paper, we propose and experimentally demonstrate symbol synchronization and sampling frequency synchronization techniques in a real-time direct-detection optical orthogonal frequency division multiplexing (DDO-OFDM) system, over 100-km standard single mode fiber (SSMF), using a cost-effective directly modulated distributed feedback (DFB) laser. The experimental results show that the proposed symbol synchronization based on a training sequence (TS) has low complexity and high accuracy even at a sampling frequency offset (SFO) of 5000 ppm. Meanwhile, the proposed pilot-assisted sampling frequency synchronization between the digital-to-analog converter (DAC) and the analog-to-digital converter (ADC) is capable of estimating SFOs accurately; the technique can also compensate SFO effects within the small residual SFO caused by deviations of the SFO estimate and a low-precision or unstable clock source. The two synchronization techniques are suitable for high-speed DDO-OFDM transmission systems.

  18. Prevalence and Risk Factors of Dengue Infection in Khanh Hoa Province, Viet Nam: A Stratified Cluster Sampling Survey.

    Science.gov (United States)

    Mai, Vien Quang; Mai, Trịnh Thị Xuan; Tam, Ngo Le Minh; Nghia, Le Trung; Komada, Kenichi; Murakami, Hitoshi

    2018-05-19

    Dengue is a clinically important arthropod-borne viral disease with increasing global incidence. Here we aimed to estimate the prevalence of dengue infections in Khanh Hoa Province, central Viet Nam, and to identify risk factors for infection. We performed a stratified cluster sampling survey including residents of 3-60 years of age in Nha Trang City, Ninh Hoa District and Dien Khanh District, Khanh Hoa Province, in October 2011. Immunoglobulin G (IgG) and immunoglobulin M (IgM) against dengue were analyzed using a rapid test kit. Participants completed a questionnaire exploring clinical dengue incidence, socio-economic status, and individual behavior. A household checklist was used to examine environment, mosquito larvae presence, and exposure to public health interventions. IgG positivity was 20.5% (urban, 16.3%; rural, 23.0%), IgM positivity was 6.7% (urban, 6.4%; rural, 6.9%), and incidence of clinically compatible dengue during the prior 3 months was 2.8 per 1,000 persons (urban, 1.7; rural, 3.4). For IgG positivity, the adjusted odds ratio (AOR) was 2.68 (95% confidence interval [CI], 1.24-5.81) for mosquito larvae presence in water pooled in old tires and was 3.09 (95% CI, 1.75-5.46) for proximity to a densely inhabited area. For IgM positivity, the AOR was 3.06 (95% CI, 1.50-6.23) for proximity to a densely inhabited area. Our results indicated rural penetration of dengue infections. Control measures should target densely inhabited areas, and may include clean-up of discarded tires and water-collecting waste.

  19. Clinical research in small genomically stratified patient populations.

    Science.gov (United States)

    Martin-Liberal, J; Rodon, J

    2017-07-01

    The paradigm of early drug development in cancer is shifting from 'histology-oriented' to 'molecularly oriented' clinical trials. This change can be attributed to the vast amount of tumour biology knowledge generated by large international research initiatives such as The Cancer Genome Atlas (TCGA), and to the next generation sequencing (NGS) techniques developed in recent years. However, targeting infrequent molecular alterations entails a series of special challenges. The optimal molecular profiling method, the lack of standardised biological thresholds, inter- and intra-tumour heterogeneity, availability of sufficient tumour material, correct clinical trial design, attrition rate, logistics and costs are only some of the issues that need to be taken into consideration in clinical research in small genomically stratified patient populations. This article examines the most relevant challenges inherent to clinical research in these populations. Moreover, perspectives from the academic point of view are reviewed, as well as initiatives to be taken in forthcoming years. Copyright © 2017 Elsevier Ltd. All rights reserved.

  20. Thermal stratification built up in hot water tank with different inlet stratifiers

    DEFF Research Database (Denmark)

    Dragsted, Janne; Furbo, Simon; Dannemand, Mark

    2017-01-01

    Thermal stratification in a water storage tank can strongly increase the thermal performance of solar heating systems. Thermal stratification can be built up in a storage tank during charge, if the heated water enters through an inlet stratifier. Experiments with a test tank have been carried out...... in order to elucidate how well thermal stratification is established in the tank with differently designed inlet stratifiers under different controlled laboratory conditions. The investigated inlet stratifiers are from Solvis GmbH & Co KG and EyeCular Technologies ApS. The inlet stratifier from Solvis Gmb...... for Solvis GmbH & Co KG had a better performance at 4 l/min. In the intermediate charge test the stratifier from EyeCular Technologies ApS had a better performance in terms of maintaining the thermal stratification in the storage tank while charging with a relative low temperature. [All rights reserved...

  1. Modified emission-transmission method for determining trace elements in solid samples using the XRF techniques

    International Nuclear Information System (INIS)

    Poblete, V.; Alvarez, M.; Hermosilla, M.

    2000-01-01

    This is a study of the analysis of trace elements in medium-thick solid samples by the modified emission-transmission method, using the energy-dispersive X-ray fluorescence (EDXRF) technique. Absorption and enhancement effects are the main disadvantages of the EDXRF technique for the quantitative analysis of major and trace elements in solid samples. The implementation of this method and its application to a variety of samples was carried out using an infinitely thick multi-element white sample, from which the correction factors accounting for the absorption of all the analytes in the sample are calculated. The discontinuities in the mass absorption coefficient versus energy relationship for each element, for medium-thick and homogeneous samples, are analyzed and corrected. The different theoretical and experimental variables are thoroughly verified using real samples, including certified material with known concentrations. The simplicity of the calculation method and the results obtained demonstrate the method's precision, with potential for the non-destructive routine analysis of different solid samples using the EDXRF technique (author)

  2. Large eddy simulation of turbulent and stably-stratified flows

    International Nuclear Information System (INIS)

    Fallon, Benoit

    1994-01-01

    The unsteady turbulent flow over a backward-facing step is studied by means of Large Eddy Simulation with a structure-function subgrid model, in both isothermal and stably-stratified configurations. Without stratification, the flow develops highly distorted Kelvin-Helmholtz billows undergoing helical pairing, with Λ-shaped vortices shed downstream. We show that the forcing injected by recirculation fluctuations governs the development of these oblique-mode instabilities. The statistical results show good agreement with the experimental measurements. In stably-stratified configurations, the flow remains more two-dimensional. We show how, with increasing stratification, the shear layer growth is frozen by inhibition of the pairing process and then of the Kelvin-Helmholtz instabilities themselves, and how gravity waves or stable density interfaces develop. The eddy structures of the flow present striking analogies with the stratified mixing layer. Additional computations show the development of secondary Kelvin-Helmholtz instabilities on the vorticity layers between two primary structures. This important mechanism, based on baroclinic effects (horizontal density gradients), constitutes an additional part of the turbulent mixing process. Finally, the feasibility of Large Eddy Simulation for industrial flows is demonstrated by studying a complex stratified cavity. Temperature fluctuations are compared with experimental measurements. We also develop three-dimensional unsteady animations in order to understand and visualize turbulent interactions. (author) [fr

  3. Random assay in radioimmunoassay: Feasibility and application compared with batch assay

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jung Min; Lee, Hwan Hee; Park, Sohyun; Kim, Tae Sung; Kim, Seok Ki [Dept. of Nuclear MedicineNational Cancer Center, Goyang (Korea, Republic of)

    2016-12-15

    The batch assay has been conventionally used for radioimmunoassay (RIA) because of its technical robustness and practical convenience. However, it has limitations in terms of the relative lag in report time, due to the necessity of multiple assays on a small number of samples, compared with the random assay technique. In this study, we aimed to verify whether the random assay technique can be applied to RIA and is feasible in daily practice. The coefficients of variation (CVs) of eight standard curves within a single kit were calculated in a CA-125 immunoradiometric assay (IRMA) as a reference for the practically ideal CV of the CA-125 kit. Ten standard curves of 10 kits from 2 prospectively collected lots (pLot) and 85 standard curves of 85 kits from 3 retrospectively collected lots (Lot) were obtained. Additionally, the raw measurement data of both 170 control references and 1123 patients' sera were collected retrospectively between December 2015 and January 2016. The standard curve of the first kit of each lot was used as a master standard curve for the random assay. The inter-kit CVs were analyzed for each lot. All raw measurements were normalized by decay and radioactivity. The CA-125 values from control samples and patients' sera were compared between the original batch assay and the random assay. In the standard curve analysis, the inter-kit CVs in pLots and Lots were comparable to those within a single kit. The CVs from the random assay with normalization were similar to those from the batch assay in the control samples (CVs % of low/high concentration: Lot1 2.71/1.91, Lot2 2.35/1.83, Lot3 2.83/2.08 vs. Lot1 2.05/1.21, Lot2 1.66/1.48, Lot3 2.41/2.14). The ICCs between the batch assay and random assay using patients' sera were satisfactory (Lot1 1.00, Lot2 0.999, Lot3 1.00). The random assay technique could be successfully applied to the conventional CA-125 IRMA kits. The random assay showed strong agreement with the batch assay. The

  4. Recent Trends in Microextraction Techniques Employed in Analytical and Bioanalytical Sample Preparation

    Directory of Open Access Journals (Sweden)

    Abuzar Kabir

    2017-12-01

    Full Text Available Sample preparation has been recognized as a major step in the chemical analysis workflow. As such, substantial efforts have been made in recent years to simplify the overall sample preparation process. Major focuses of these efforts have included miniaturization of the extraction device; minimizing/eliminating toxic and hazardous organic solvent consumption; eliminating sample pre-treatment and post-treatment steps; reducing the sample volume requirement; reducing extraction equilibrium time; and maximizing extraction efficiency. All these improved attributes are congruent with the Green Analytical Chemistry (GAC) principles. Classical sample preparation techniques such as solid phase extraction (SPE) and liquid-liquid extraction (LLE) are being rapidly replaced with emerging miniaturized and environmentally friendly techniques such as Solid Phase Micro Extraction (SPME), Stir bar Sorptive Extraction (SBSE), Micro Extraction by Packed Sorbent (MEPS), Fabric Phase Sorptive Extraction (FPSE), and Dispersive Liquid-Liquid Micro Extraction (DLLME). In addition to the development of many new generic extraction sorbents in recent years, a large number of molecularly imprinted polymers (MIPs) created using different template molecules have also enriched the large cache of microextraction sorbents. Application of nanoparticles as high-performance extraction sorbents has undoubtedly elevated the extraction efficiency and method sensitivity of modern chromatographic analyses to a new level. Combining magnetic nanoparticles with many microextraction sorbents has opened up new possibilities to extract target analytes from sample matrices containing high volumes of matrix interferents. The aim of the current review is to critically audit the progress of microextraction techniques in recent years, which has indisputably transformed analytical chemistry practices, from biological and therapeutic drug monitoring to the environmental field; from foods to phyto

  5. Development of core sampling technique for ITER Type B radwaste

    Energy Technology Data Exchange (ETDEWEB)

    Kim, S. G.; Hong, K. P.; Oh, W. H.; Park, M. C.; Jung, S. H.; Ahn, S. B. [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2016-10-15

    Type B radwaste (intermediate-level and long-lived radioactive waste) imported from the ITER vacuum vessel is to be treated and stored in the basement of the hot cell building. The Type B radwaste treatment process is composed of buffer storage, cutting, sampling/tritium measurement, tritium removal, characterization, pre-packaging, inspection/decontamination, storage, etc. The cut slices of Type B radwaste components generated by the cutting process undergo the sampling process before and after the tritium removal process. The purpose of sampling is to obtain small pieces of samples in order to investigate the tritium content and concentration of the Type B radwaste. Core sampling, one of the candidate sampling techniques to be applied in the ITER hot cell, is suitable for metal that is not thick (less than 50 mm) and requires no coolant. The materials tested were SS316L and CuCrZr, chosen to simulate ITER Type B radwaste. In core sampling, substantial secondary waste in the form of cutting chips is unavoidably produced; thus, the core sampling machine will have to be equipped with a disposal system such as suction equipment. Core sampling is considered unfavorable in terms of tool wear compared to conventional drilling.

  6. The Influence of Organizational Commitment, Intrinsic Motivation, and Extrinsic Motivation on Physician Performance (A Study of Physicians at Ulin Regional General Hospital, Banjarmasin)

    OpenAIRE

    Agustina Rahmah; Ahmad Alim Bachri; Anna Nur Faidah

    2016-01-01

    This research aimed to identify and analyse the influence of organizational commitment, intrinsic motivation, and extrinsic motivation, both partially and simultaneously, on physicians' performance at the Ulin Regional General Hospital, Banjarmasin. The study is explanatory research. The sampling technique used was proportionate stratified random sampling, which yielded a sample of 112 respondents from a population of 157. Data anal...

  7. A simple sample size formula for analysis of covariance in cluster randomized trials.

    NARCIS (Netherlands)

    Teerenstra, S.; Eldridge, S.; Graff, M.J.; Hoop, E. de; Borm, G.F.

    2012-01-01

    For cluster randomized trials with a continuous outcome, the sample size is often calculated as if an analysis of the outcomes at the end of the treatment period (follow-up scores) would be performed. However, often a baseline measurement of the outcome is available or feasible to obtain. An
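
    The flavour of the calculation: the individually randomized sample size is inflated by the cluster design effect 1 + (m - 1)·ICC and, when a baseline measurement is used in an ANCOVA, deflated by a factor of the form 1 - r². The sketch below is a generic illustration, not the paper's exact formula; in particular, the paper derives the precise form of r from the cluster- and subject-level correlations.

        import math
        from scipy.stats import norm

        def clusters_per_arm(delta, sd, m, icc, r=0.0, alpha=0.05, power=0.8):
            """Clusters per arm for a continuous outcome: the standard
            two-sample size, times the design effect 1 + (m-1)*icc, reduced
            by (1 - r**2) for an ANCOVA-style baseline adjustment
            (r = baseline/follow-up correlation), divided by cluster size m."""
            z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
            n_ind = 2 * (z * sd / delta) ** 2          # individuals per arm
            deff = (1 + (m - 1) * icc) * (1 - r ** 2)  # cluster + baseline
            return math.ceil(n_ind * deff / m)

        print(clusters_per_arm(delta=0.3, sd=1.0, m=20, icc=0.05, r=0.5))  # 13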

  8. 3D-Laser-Scanning Technique Applied to Bulk Density Measurements of Apollo Lunar Samples

    Science.gov (United States)

    Macke, R. J.; Kent, J. J.; Kiefer, W. S.; Britt, D. T.

    2015-01-01

    In order to better interpret gravimetric data from orbiters such as GRAIL and LRO to understand the subsurface composition and structure of the lunar crust, it is important to have a reliable database of the density and porosity of lunar materials. To this end, we have been surveying these physical properties in both lunar meteorites and Apollo lunar samples. To measure porosity, both grain density and bulk density are required. For bulk density, our group has historically made extensive use of sub-mm bead immersion techniques, though several factors have made this technique problematic for our work with Apollo samples. Samples allocated for measurement are often smaller than optimal for the technique, leading to large error bars. Also, for some samples we were required to use pure alumina beads instead of our usual glass beads. The alumina beads were subject to undesirable static effects, producing unreliable results. Other investigators have tested the use of 3D laser scanners on meteorites for measuring bulk volumes. Early work, though promising, was plagued with difficulties including poor response on dark or reflective surfaces, difficulty reproducing sharp edges, and long processing times for producing shape models. Due to progress in technology, however, laser scanners have improved considerably in recent years. We tested this technique on 27 lunar samples in the Apollo collection using a scanner at NASA Johnson Space Center. We found it to be reliable and more precise than beads, with the added benefit that it involves no direct contact with the sample, enabling the study of particularly friable samples for which bead immersion is not possible
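
    Once the scanner delivers a closed, consistently oriented triangulated shape model, the bulk volume (and, with the sample mass, the bulk density) follows from the divergence theorem: the mesh volume is the sum of the signed volumes of the tetrahedra formed by each face and the origin. A minimal sketch, checked here on a unit cube rather than on an actual scan:

        import numpy as np

        def mesh_volume(vertices, faces):
            """Volume of a closed triangle mesh as the sum of signed
            tetrahedron volumes v0 . (v1 x v2) / 6 over all faces."""
            v0, v1, v2 = (vertices[faces[:, i]] for i in range(3))
            return abs(np.einsum('ij,ij->i', v0, np.cross(v1, v2)).sum()) / 6.0

        # unit cube (8 vertices, 12 outward-oriented triangles) -> 1.0
        verts = np.array([[x, y, z] for x in (0, 1) for y in (0, 1) for z in (0, 1)],
                         dtype=float)
        faces = np.array([[0, 1, 3], [0, 3, 2], [4, 6, 7], [4, 7, 5],
                          [0, 4, 5], [0, 5, 1], [2, 3, 7], [2, 7, 6],
                          [0, 2, 6], [0, 6, 4], [1, 5, 7], [1, 7, 3]])
        print(mesh_volume(verts, faces))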

  9. Analysis of pure and malachite green doped polysulfone sample using FT-IR technique

    Science.gov (United States)

    Nayak, Rashmi J.; Khare, P. K.; Nayak, J. G.

    2018-05-01

    Samples of pure and malachite-green-doped polysulfone in the form of foils were prepared by the isothermal immersion technique. For the preparation of the pure sample, 4 g of polysulfone was dissolved in 50 ml of dimethylformamide (DMF) solvent, while for the preparation of the doped samples, 10 mg, 50 mg and 100 mg of malachite green were mixed with 4 g of polysulfone, respectively. For the structural characterization of these pure and doped samples, the Fourier transform infrared spectroscopy (FT-IR) technique was used. This study shows that the intensity of transmittance decreases as the doping ratio in pure polysulfone increases. The reduction in the intensity of transmittance is clearly apparent in the present case; moreover, the bands were broader, which indicates charge-transfer interaction between the donor and acceptor molecules.

  10. Association between Spouse/Child Separation and Migration-Related Stress among a Random Sample of Rural-to-Urban Migrants in Wuhan, China.

    Directory of Open Access Journals (Sweden)

    Yan Guo

    Full Text Available Millions of people move from rural areas to urban areas in China to pursue new opportunities while leaving their spouses and children at rural homes. Little is known about the impact of migration-related separation on the mental health of these rural migrants in urban China. Survey data from a random sample of rural-to-urban migrants (n = 1113, aged 18-45) from Wuhan were analyzed. The Domestic Migration Stress Questionnaire (DMSQ), an instrument with four subconstructs, was used to measure migration-related stress. The relationship between spouse/child separation and stress was assessed using survey estimation methods to account for the multi-level sampling design. 16.46% of couples were separated from their spouses (spouse separation only), and 25.81% of parents were separated from their children (child separation only). Among the participants who were married and had children, 5.97% were separated from both their spouses and children (double separation). Participants with spouse separation only or double separation did not score significantly higher on the DMSQ than those with no separation. Compared to parents without child separation, parents with child separation scored significantly higher on the DMSQ (mean score = 2.88, 95% CI: [2.81, 2.95] vs. 2.60 [2.53, 2.67], p < .05). Stratified analysis by separation type and by gender indicated that the association was stronger for child separation only and for female participants. Child separation is an important source of migration-related stress, and the effect is particularly strong for migrant women. Public policies and intervention programs should consider these factors to encourage and facilitate the co-migration of parents with their children to mitigate migration-related stress.

  11. Rationale, design, methodology and sample characteristics for the Vietnam pre-conceptual micronutrient supplementation trial (PRECONCEPT: a randomized controlled study

    Directory of Open Access Journals (Sweden)

    Nguyen Phuong H

    2012-10-01

    Full Text Available Abstract Background Low birth weight and maternal anemia remain intractable problems in many developing countries. The adequacy of the current strategy of providing iron-folic acid (IFA) supplements only during pregnancy has been questioned, given that many women enter pregnancy with poor iron stores, that maternal and fetal tissues place substantial micronutrient demands, and that programmatic issues related to the timing and coverage of prenatal care remain. Weekly IFA supplementation for women of reproductive age (WRA) improves iron status and reduces the burden of anemia in the short term, but few studies have evaluated subsequent pregnancy and birth outcomes. The PRECONCEPT trial aims to determine whether pre-pregnancy weekly IFA or multiple micronutrient (MM) supplementation will improve birth outcomes and maternal and infant iron status compared to the current practice of prenatal IFA supplementation only. This paper provides an overview of the study design, methodology and sample characteristics from baseline survey data, and key lessons learned. Methods/design We have recruited 5011 WRA in a double-blind stratified randomized controlled trial in rural Vietnam and randomly assigned them to receive weekly supplements containing either: (1) 2800 μg folic acid; (2) 60 mg iron and 2800 μg folic acid; or (3) MM. Women who become pregnant receive daily IFA and are being followed through pregnancy, delivery, and up to three months post-partum. Study outcomes include birth outcomes and maternal and infant iron status. Data are being collected on household characteristics, maternal diet and mental health, anthropometry, infant feeding practices, morbidity and compliance. Discussion The study is timely and responds to the WHO Global Expert Consultation, which identified the need to evaluate the long-term benefits of weekly IFA and MM supplementation in WRA. Findings will generate new information to help guide policy and programs designed to reduce the burden of anemia in women and

  12. Presence of psychoactive substances in oral fluid from randomly selected drivers in Denmark

    DEFF Research Database (Denmark)

    Simonsen, K. Wiese; Steentoft, A.; Hels, Tove

    2012-01-01

    This roadside study is the Danish part of the EU project DRUID (Driving Under the Influence of Drugs, Alcohol, and Medicines) and included three representative regions in Denmark. Oral fluid samples (n = 3002) were collected randomly from drivers using a sampling scheme stratified by time, season, and road type. The oral fluid samples were screened for 29 illegal and legal psychoactive substances and metabolites as well as ethanol. Fourteen (0.5%) drivers were positive for ethanol (alone or in combination with drugs) at concentrations above 0.53 g/l (0.5 mg/g), which is the Danish legal limit. The percentage of drivers positive for medicinal drugs above the Danish legal concentration limit was 0.4%, while 0.3% of the drivers tested positive for one or more illicit drugs at concentrations exceeding the Danish legal limit. Tetrahydrocannabinol, cocaine, and amphetamine were the most frequent illicit...

  13. Comparison of different anesthesia techniques during esophagogastroduodenoscopy in children: a randomized trial.

    Science.gov (United States)

    Patino, Mario; Glynn, Susan; Soberano, Mark; Putnam, Philip; Hossain, Md Monir; Hoffmann, Clifford; Samuels, Paul; Kibelbek, Michael J; Gunter, Joel

    2015-10-01

    Esophagogastroduodenoscopy (EGD) in children is usually performed under general anesthesia. Anesthetic goals include minimization of airway complications while maximizing operating room (OR) efficiency. Currently, there is no consensus on which anesthetic technique best meets these goals. We performed a prospective randomized study comparing three different anesthetic techniques. To evaluate the incidence of respiratory complications (primary aim) and institutional efficiency (secondary aim) among three different anesthetic techniques in children undergoing EGD. Subjects received a standardized inhalation induction of anesthesia followed by randomization to one of three groups: Group intubated, sevoflurane (IS); Group intubated, propofol (IP); and Group native airway, nonintubated, propofol (NA). Respiratory complications included minor desaturation (SpO2 between 94% and 85%), severe desaturation (SpO2 < 85%), apnea, airway obstruction/laryngospasm, aspiration, and/or inadequate anesthesia during the endoscopy. Evaluation of institutional efficiency was determined by examining the time spent during the different phases of care (anesthesia preparation, procedure, OR stay, recovery, and total perioperative care). One hundred and seventy-nine children aged 1-12 years (median 7 years; 4.0, 10.0) were enrolled (Group IS N = 60, Group IP N = 59, Group NA N = 61). The incidence of respiratory complications was higher in Group NA (0.459) vs Group IS (0.033) or Group IP (0.086) (P < 0.0001). The most commonly observed complications were desaturation, inadequate anesthesia, and apnea. There were no differences in institutional efficiency among the three groups. Respiratory complications were more common in Group NA. The use of a native airway with propofol maintenance during EGD does not offer advantages with respect to respiratory complications or institutional efficiency. © 2015 John Wiley & Sons Ltd.

  14. Comparison of sampling designs for estimating deforestation from Landsat TM and MODIS imagery: a case study in Mato Grosso, Brazil.

    Science.gov (United States)

    Zhu, Shanyou; Zhang, Hailong; Liu, Ronggao; Cao, Yun; Zhang, Guixin

    2014-01-01

    Sampling designs are commonly used to estimate deforestation over large areas, but comparisons between different sampling strategies are required. Using PRODES deforestation data as a reference, deforestation in the state of Mato Grosso in Brazil from 2005 to 2006 is evaluated using Landsat imagery and a nearly synchronous MODIS dataset. The MODIS-derived deforestation is used to assist in sampling and extrapolation. Three sampling designs are compared according to the estimated deforestation of the entire study area based on simple extrapolation and linear regression models. The results show that stratified sampling for strata construction and sample allocation using the MODIS-derived deforestation hotspots provided more precise estimations than simple random and systematic sampling. Moreover, the relationship between the MODIS-derived and TM-derived deforestation provides a precise estimate of the total deforestation area as well as the distribution of deforestation in each block.
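
    As an editorial illustration of why the stratified design wins here, the following sketch (not the authors' code; the landscape, the strength of the MODIS-TM correlation, and the sample sizes are all invented) compares the spread of total-deforestation estimates under simple random sampling and under stratification on a cheap auxiliary signal:

```python
import numpy as np

rng = np.random.default_rng(42)

# Invented landscape: per-block deforested area (the TM-like truth, unknown in
# practice) correlated with a cheap auxiliary signal (MODIS-like hotspots).
n_blocks = 2000
aux = rng.gamma(shape=2.0, scale=1.0, size=n_blocks)              # hotspot signal
true = np.clip(5.0 * aux + rng.normal(0.0, 2.0, n_blocks), 0, None)

def srs_total(n):
    """Simple random sample of n blocks, expanded to a total."""
    idx = rng.choice(n_blocks, size=n, replace=False)
    return n_blocks * true[idx].mean()

def stratified_total(n, n_strata=4):
    """Equal-frequency strata on the auxiliary signal, proportional allocation."""
    edges = np.quantile(aux, np.linspace(0.0, 1.0, n_strata + 1))
    strata = np.clip(np.searchsorted(edges, aux, side="right") - 1, 0, n_strata - 1)
    total = 0.0
    for s in range(n_strata):
        members = np.flatnonzero(strata == s)
        n_s = max(1, round(n * len(members) / n_blocks))
        idx = rng.choice(members, size=min(n_s, len(members)), replace=False)
        total += len(members) * true[idx].mean()
    return total

reps = 2000
srs = [srs_total(100) for _ in range(reps)]
strat = [stratified_total(100) for _ in range(reps)]
print(f"true total       {true.sum():9.0f}")
print(f"SRS estimate     {np.mean(srs):9.0f} +/- {np.std(srs):.0f}")
print(f"stratified est.  {np.mean(strat):9.0f} +/- {np.std(strat):.0f}")
```

    With strata that track the auxiliary signal, most of the between-block variance is absorbed by the strata, so the stratified estimator's standard deviation comes out well below the simple-random one.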

  15. Stratified Medicine and Reimbursement Issues

    Directory of Open Access Journals (Sweden)

    Hans-Joerg Fugel

    2012-10-01

    Full Text Available Stratified Medicine (SM) has the potential to target patient populations who will most benefit from a therapy while reducing unnecessary health interventions associated with side effects. The link between clinical biomarkers/diagnostics and therapies provides new opportunities for value creation to strengthen the value proposition to pricing and reimbursement (P&R) authorities. However, the introduction of SM challenges current reimbursement schemes in many EU countries and the US, as different P&R policies have been adopted for drugs and diagnostics. Also, there is a lack of a consistent process for value assessment of more complex diagnostics in these markets. New, innovative approaches and more flexible P&R systems are needed to reflect the added value of diagnostic tests and to stimulate investments in new technologies. Yet, the framework for access to diagnostic-based therapies still requires further development, along with setting the right incentives and appropriately aligning stakeholders' interests to realize long-term patient benefits. This article addresses the reimbursement challenges of SM approaches in several EU countries and the US, outlining some options to overcome existing reimbursement barriers for stratified medicine.

  16. Evaluation of a Class of Simple and Effective Uncertainty Methods for Sparse Samples of Random Variables and Functions

    Energy Technology Data Exchange (ETDEWEB)

    Romero, Vicente [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bonney, Matthew [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Schroeder, Benjamin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Weirs, V. Gregory [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-11-01

    When very few samples of a random quantity are available from a source distribution of unknown shape, it is usually not possible to accurately infer the exact distribution from which the data samples come. Under-estimation of important quantities such as response variance and failure probabilities can result. For many engineering purposes, including design and risk analysis, we attempt to avoid under-estimation with a strategy to conservatively estimate (bound) these types of quantities -- without being overly conservative -- when only a few samples of a random quantity are available from model predictions or replicate experiments. This report examines a class of related sparse-data uncertainty representation and inference approaches that are relatively simple, inexpensive, and effective. Tradeoffs between the methods' conservatism, reliability, and risk versus number of data samples (cost) are quantified with multi-attribute metrics used to assess method performance for conservative estimation of two representative quantities: central 95% of response; and 10^-4 probability of exceeding a response threshold in a tail of the distribution. Each method's performance is characterized with 10,000 random trials on a large number of diverse and challenging distributions. The best method and number of samples to use in a given circumstance depends on the uncertainty quantity to be estimated, the PDF character, and the desired reliability of bounding the true value. On the basis of this large database and study, a strategy is proposed for selecting the method and number of samples for attaining reasonable credibility levels in bounding these types of quantities when sparse samples of random variables or functions are available from experiments or simulations.
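
    The report's specific estimators are not reproduced here, but its evaluation protocol (many random trials, scoring how often a method bounds the true value) is easy to sketch. In the toy version below, the lognormal response, the sample sizes, and the fixed padding factor are all invented stand-ins:

```python
import numpy as np

rng = np.random.default_rng(0)
sigma, z = 0.5, 1.959964
# True central-95% width of a lognormal(0, 0.5) "response".
true_width = np.exp(sigma * z) - np.exp(-sigma * z)

def naive_width(x):
    return np.quantile(x, 0.975) - np.quantile(x, 0.025)

def padded_width(x, k=1.5):
    # Invented conservative rule: inflate the naive estimate by a fixed factor.
    return k * naive_width(x)

trials = 10_000
for n in (5, 10, 30):
    samples = rng.lognormal(0.0, sigma, size=(trials, n))
    naive = np.array([naive_width(s) for s in samples])
    padded = np.array([padded_width(s) for s in samples])
    print(f"n={n:2d}  P(naive bounds truth) = {np.mean(naive >= true_width):.2f}  "
          f"P(padded bounds truth) = {np.mean(padded >= true_width):.2f}")
```

    The naive sample quantiles systematically under-cover at small n, which is exactly the under-estimation risk the report targets; any conservative rule trades some over-estimation for a higher bounding probability.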

  17. Convergence analysis for Latin-hypercube lattice-sample selection strategies for 3D correlated random hydraulic-conductivity fields

    OpenAIRE

    Simuta-Champo, R.; Herrera-Zamarrón, G. S.

    2010-01-01

    The Monte Carlo technique provides a natural method for evaluating uncertainties. The uncertainty is represented by a probability distribution or by related quantities such as statistical moments. When the groundwater flow and transport governing equations are solved and the hydraulic conductivity field is treated as a random spatial function, the hydraulic head, velocities and concentrations also become random spatial functions. When that is the case, for the stochastic simulation of groundw...
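
    The record is truncated, but the Latin hypercube construction it rests on is standard: split each input dimension into n equal-probability bins, sample once in every bin, and randomly permute bins across dimensions. A minimal sketch, with an invented smooth test function standing in for the flow-and-transport model:

```python
import numpy as np

def latin_hypercube(n_samples, n_dims, rng):
    """One LHS draw on [0,1)^d: each dimension is split into n_samples equal
    bins; every bin is hit exactly once, in random order per dimension."""
    u = (rng.random((n_samples, n_dims)) + np.arange(n_samples)[:, None]) / n_samples
    for d in range(n_dims):
        u[:, d] = rng.permutation(u[:, d])
    return u

rng = np.random.default_rng(1)
# Example: mean of a smooth function of a 3D "conductivity-like" input,
# LHS versus plain Monte Carlo at the same sample size.
f = lambda x: np.exp(-np.sum(x**2, axis=1))
est_lhs = [f(latin_hypercube(50, 3, rng)).mean() for _ in range(500)]
est_mc = [f(rng.random((50, 3))).mean() for _ in range(500)]
print(f"LHS sd {np.std(est_lhs):.4f}  MC sd {np.std(est_mc):.4f}")
```

    For smooth integrands the stratification in each dimension typically cuts the estimator variance relative to unstratified Monte Carlo at equal cost, which is the motivation for the convergence comparisons the paper describes.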

  18. Separation Techniques for Quantification of Radionuclides in Environmental Samples

    Directory of Open Access Journals (Sweden)

    Dusan Galanda

    2009-01-01

    Full Text Available The reliable and quantitative measurement of radionuclides is important in order to determine environmental quality and radiation safety, and to monitor regulatory compliance. We examined soil samples from Podunajske Biskupice, near the city of Bratislava in the Slovak Republic, for the presence of several natural (238U, 232Th, 40K and anthropogenic (137Cs, 90Sr, 239Pu, 240Pu, 241Am radionuclides. The area is adjacent to a refinery and hazardous waste processing center, as well as the municipal incinerator plant, and so might possess an unusually high level of ecotoxic metals. We found that the levels of both naturally occurring and anthropogenic radionuclides fell within the expected ranges, indicating that these facilities pose no radiological threat to the local environment. During the course of our analysis, we modified existing techniques in order to allow us to handle the unusually large and complex samples that were needed to determine the levels of 239Pu, 240Pu, and 241Am activity. We also rated three commercial techniques for the separation of 90Sr from aqueous solutions and found that two of them, AnaLig Sr-01 and Empore Extraction Disks, were suitable for the quantitative and reliable separation of 90Sr, while the third, Sr-Spec Resin, was less so. The main criterion in evaluating these methods was the chemical recovery of 90Sr, which was less than we had expected. We also considered speed of separation and additional steps needed to prepare the sample for separation.

  19. Unsteady natural convection flow past an accelerated vertical plate in a thermally stratified fluid

    Directory of Open Access Journals (Sweden)

    Deka Rudra Kt.

    2009-01-01

    Full Text Available An exact solution to one-dimensional unsteady natural convection flow past an infinite vertical accelerated plate, immersed in a viscous thermally stratified fluid, is investigated. The pressure work term and the vertical temperature advection are considered in the thermodynamic energy equation. The dimensionless governing equations are solved by Laplace transform techniques for a Prandtl number of unity. The velocity and temperature profiles, as well as the skin friction and the rate of heat transfer, are presented graphically, and the effects of the Grashof number Gr and the stratification parameter S at various times t are discussed.

  20. Effectiveness of a Treatment Involving Soft Tissue Techniques and/or Neural Mobilization Techniques in the Management of Tension-Type Headache: A Randomized Controlled Trial.

    Science.gov (United States)

    Ferragut-Garcías, Alejandro; Plaza-Manzano, Gustavo; Rodríguez-Blanco, Cleofás; Velasco-Roldán, Olga; Pecos-Martín, Daniel; Oliva-Pascual-Vaca, Jesús; Llabrés-Bennasar, Bartomeu; Oliva-Pascual-Vaca, Ángel

    2017-02-01

    To evaluate the effects of a protocol involving soft tissue techniques and/or neural mobilization techniques in the management of patients with frequent episodic tension-type headache (FETTH) and those with chronic tension-type headache (CTTH). Randomized, double-blind, placebo-controlled before and after trial. Rehabilitation area of the local hospital and a private physiotherapy center. Patients (N=97; 78 women, 19 men) diagnosed with FETTH or CTTH were randomly assigned to groups A, B, C, or D. (A) Placebo superficial massage; (B) soft tissue techniques; (C) neural mobilization techniques; (D) a combination of soft tissue and neural mobilization techniques. The pressure pain threshold (PPT) in the temporal muscles (points 1 and 2) and supraorbital region (point 3), the frequency and maximal intensity of pain crisis, and the score in the Headache Impact Test-6 (HIT-6) were evaluated. All variables were assessed before the intervention, at the end of the intervention, and 15 and 30 days after the intervention. Groups B, C, and D had an increase in PPT and a reduction in frequency, maximal intensity, and HIT-6 values in all time points after the intervention as compared with baseline and group A (P<.001 for all cases). Group D had the highest PPT values and the lowest frequency and HIT-6 values after the intervention. The application of soft tissue and neural mobilization techniques to patients with FETTH or CTTH induces significant changes in PPT, the characteristics of pain crisis, and its effect on activities of daily living as compared with the application of these techniques as isolated interventions. Copyright © 2016 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.

  1. Numerical simulation of stratified flows with different k-ε turbulence models

    International Nuclear Information System (INIS)

    Dagestad, S.

    1991-01-01

    The thesis comprises the numerical simulation of stratified flows with different k-ε models. When using the k-ε model, two equations are solved to describe the turbulence. The k-equation represents the turbulent kinetic energy and the ε-equation the turbulent dissipation. Different k-ε models predict stratified flows differently. The standard k-ε model leads to higher turbulent mixing than the low-Reynolds model does. For lower Froude numbers, F_0, this effect is enhanced. Buoyancy extension of the k-ε model also leads to less vertical mixing in cases with strong stratification. As the stratification increases, the buoyancy extension has a larger influence. The turbulent Prandtl number effects have a large impact on the transport of heat and the development of the flow. Two different formulae which express the turbulent Prandtl effects have been tested. For unstably stratified flows, the rapid mixing and three-dimensionality of the flow can in fact be computed using a k-ε model when a buoyancy-extended model is employed. The turbulent heat transfer, and thus turbulent production, in unstably stratified flows depends strongly upon the turbulent Prandtl number effect. The main conclusions are: stably stratified flows should be computed with a buoyancy-extended low-Reynolds k-ε model; unstably stratified flows should be computed with a buoyancy-extended standard k-ε model; the turbulent Prandtl number effects should be included in the computations; buoyancy extension has led to a more accurate description of the physics for all of the investigated flows. 78 refs., 128 figs., 17 tabs

  2. Accelerated Solvent Extraction: An Innovative Sample Extraction Technique for Natural Products

    International Nuclear Information System (INIS)

    Hazlina Ahmad Hassali; Azfar Hanif Abd Aziz; Rosniza Razali

    2015-01-01

    Accelerated solvent extraction (ASE) is one of the novel techniques that have been developed for the extraction of phytochemicals from plants in order to shorten the extraction time, decrease the solvent consumption, increase the extraction yield and enhance the quality of extracts. This technique combines elevated temperatures and pressure with liquid solvents. This paper gives a brief overview of accelerated solvent extraction technique for sample preparation and its application to the extraction of natural products. Through practical examples, the effects of operational parameters such as temperature, volume of solvent used, extraction time and extraction yields on the performance of ASE are discussed. It is demonstrated that ASE technique allows reduced solvent consumption and shorter extraction time, while the extraction yields are even higher than those obtained with conventional methods. (author)

  3. Actual distribution of Cronobacter spp. in industrial batches of powdered infant formula and consequences for performance of sampling strategies.

    Science.gov (United States)

    Jongenburger, I; Reij, M W; Boer, E P J; Gorris, L G M; Zwietering, M H

    2011-11-15

    The actual spatial distribution of microorganisms within a batch of food influences the results of sampling for microbiological testing when this distribution is non-homogeneous. In the case of pathogens being non-homogeneously distributed, it markedly influences public health risk. This study investigated the spatial distribution of Cronobacter spp. in powdered infant formula (PIF) on an industrial batch scale for both a recalled batch as well as a reference batch. Additionally, the local spatial occurrence of clusters of Cronobacter cells was assessed, as well as the performance of typical sampling strategies to determine the presence of the microorganisms. The concentration of Cronobacter spp. was assessed in the course of the filling time of each batch, by taking samples of 333 g using the most probable number (MPN) enrichment technique. The occurrence of clusters of Cronobacter spp. cells was investigated by plate counting. From the recalled batch, 415 MPN samples were drawn. The expected heterogeneous distribution of Cronobacter spp. could be quantified from these samples, which showed no detectable level (detection limit of -2.52 log CFU/g) in 58% of samples, whilst in the remainder concentrations were found to be between -2.52 and 2.75 log CFU/g. The estimated average concentration in the recalled batch was -2.78 log CFU/g, with a standard deviation of 1.10 log CFU/g. The estimated average concentration in the reference batch was -4.41 log CFU/g, with 99% of the 93 samples being below the detection limit. In the recalled batch, clusters of cells occurred sporadically in 8 out of 2290 samples of 1 g taken. The two largest clusters contained 123 (2.09 log CFU/g) and 560 (2.75 log CFU/g) cells. Various sampling strategies were evaluated for the recalled batch. Taking more and smaller samples and keeping the total sampling weight constant considerably improved the performance of the sampling plans to detect such a type of contaminated batch. Compared to random sampling
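
    A hedged simulation of the sampling-plan comparison: the batch statistics below are the ones reported for the recalled batch, while the Poisson-lognormal sampling model is an assumption made for illustration. It reproduces the qualitative finding that splitting the same total analytical mass into more, smaller samples raises the detection probability:

```python
import numpy as np

rng = np.random.default_rng(7)
mu, sd = -2.78, 1.10    # log10 CFU/g, figures reported for the recalled batch

def detection_prob(n_samples, grams_each, reps=20_000):
    """P(at least one sample yields >= 1 CFU) in a heterogeneous batch:
    each sampled location gets its own concentration draw (assumption)."""
    logc = rng.normal(mu, sd, size=(reps, n_samples))
    lam = (10.0 ** logc) * grams_each           # expected cells per sample
    detected = rng.poisson(lam) >= 1
    return detected.any(axis=1).mean()

# Same total analytical mass (300 g), split differently:
for n, g in [(1, 300), (10, 30), (30, 10), (100, 3)]:
    print(f"{n:3d} x {g:3d} g  ->  P(detect) = {detection_prob(n, g):.3f}")
```

    More, smaller samples probe more independent locations, so at least one is likely to hit a high-concentration pocket even though each individual sample contains fewer grams.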

  4. Practical aspects of the resin bead technique for mass spectrometric sample loading

    International Nuclear Information System (INIS)

    Walker, R.L.; Pritchard, C.A.; Carter, J.A.; Smith, D.H.

    1976-07-01

    Using an anion resin bead as a loading vehicle for uranium and plutonium samples which are to be analyzed isotopically in a mass spectrometer has many advantages over conventional techniques. It is applicable to any laboratory routinely performing such analyses, but should be particularly relevant for Safeguards' purposes. Because the techniques required differ markedly from those of conventional methods, this report has been written to describe them in detail to enable those unfamiliar with the technique to master it with a minimum of trouble

  5. Neurofeedback Against Binge Eating: A Randomized Controlled Trial in a Female Subclinical Threshold Sample.

    Science.gov (United States)

    Schmidt, Jennifer; Martin, Alexandra

    2016-09-01

    Brain-directed treatment techniques, such as neurofeedback, have recently been proposed as adjuncts in the treatment of eating disorders to improve therapeutic outcomes. In line with this recommendation, a cue exposure EEG-neurofeedback protocol was developed. The present study aimed at the evaluation of the specific efficacy of neurofeedback to reduce subjective binge eating in a female subthreshold sample. A total of 75 subjects were randomized to EEG-neurofeedback, mental imagery with a comparable treatment set-up or a waitlist group. At post-treatment, only EEG-neurofeedback led to a reduced frequency of binge eating (p = .015, g = 0.65). The effects remained stable to a 3-month follow-up. EEG-neurofeedback further showed particular beneficial effects on perceived stress and dietary self-efficacy. Differences in outcomes did not arise from divergent treatment expectations. Because EEG-neurofeedback showed a specific efficacy, it may be a promising brain-directed approach that should be tested as a treatment adjunct in clinical groups with binge eating. Copyright © 2016 John Wiley & Sons, Ltd and Eating Disorders Association.

  6. Machine-learning techniques for family demography: an application of random forests to the analysis of divorce determinants in Germany

    OpenAIRE

    Arpino, Bruno; Le Moglie, Marco; Mencarini, Letizia

    2018-01-01

    Demographers often analyze the determinants of life-course events with parametric regression-type approaches. Here, we present a class of nonparametric approaches, broadly defined as machine learning (ML) techniques, and discuss advantages and disadvantages of a popular type known as random forest. We argue that random forests can be useful either as a substitute, or a complement, to more standard parametric regression modeling. Our discussion of random forests is intuitive and...
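
    The record is truncated, but the comparison it describes can be sketched on synthetic data (the covariates and the risk function below are invented, not those of the German divorce analysis): a random forest picks up a thresholded, interactive effect that a plain logistic regression on the raw covariates largely misses:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
n = 5000
# Synthetic life-course covariates with a non-linear, interactive "risk":
age_at_marriage = rng.normal(28, 5, n)
n_children = rng.poisson(1.2, n)
both_employed = rng.integers(0, 2, n)
risk = 0.8 * (age_at_marriage < 22) + 0.5 * both_employed * (n_children == 0)
y = rng.random(n) < 1.0 / (1.0 + np.exp(-(risk - 1.0)))   # event indicator

X = np.column_stack([age_at_marriage, n_children, both_employed])
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

rf = RandomForestClassifier(n_estimators=300, random_state=0).fit(Xtr, ytr)
lr = LogisticRegression().fit(Xtr, ytr)
print(f"RF AUC {roc_auc_score(yte, rf.predict_proba(Xte)[:, 1]):.3f}")
print(f"LR AUC {roc_auc_score(yte, lr.predict_proba(Xte)[:, 1]):.3f}")
print("RF feature importances:", rf.feature_importances_.round(2))
```

    The feature importances illustrate the "complement" role the authors argue for: even when a parametric model is the end product, the forest can flag which covariates and interactions deserve explicit terms.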

  7. Microbiological analysis after complete or partial removal of carious dentin using two different techniques in primary teeth: A randomized clinical trial

    Science.gov (United States)

    Singhal, Deepak Kumar; Acharya, Shashidhar; Thakur, Arun Singh

    2016-01-01

    Background: The management of deep carious lesions can be done by various techniques, but the residual caries dilemma persists, and bacterial reduction in cavities treated by either partial or complete caries removal techniques is debatable. The objective of the present randomized clinical trial was to compare microbial counts in cavities submitted to complete caries removal and partial caries removal using either hand instruments or burs, before and after 3 weeks of restoration. Materials and Methods: Primary molars with acute carious lesions in the inner half of dentine and vital pulp were randomly divided into three groups of 14 each: Group A: partial caries removal using hand instruments (atraumatic restorative treatment [ART]) only; Group B: partial caries removal using bur; Group C: complete caries removal using bur and caries detector dye. Dentine samples obtained after caries removal and 3 weeks after restoration were subjected to microbial culture and counting (colony-forming units [CFU]/mg of dentine) for total viable bacterial count, Streptococcus spp., mutans streptococci, and Lactobacillus spp. Results: All three techniques of caries removal showed significant (P < 0.05) reduction in all microorganisms studied after 3 weeks of evaluation, but there was no statistically significant difference in the percentage reduction of microbial count among the three groups. Conclusion: The results suggest the use of partial caries removal in a single session, as compared to complete caries removal, as part of the treatment of deep lesions in deciduous teeth in order to reduce the risk of pulp exposure. Partial caries removal using ART can be preferred in community settings as a public health procedure for caries management. PMID:26962313

  8. Teaching Techniques, Types of Personality, and English Listening Skill

    Directory of Open Access Journals (Sweden)

    Ni Made Ratminingsih

    2013-01-01

    Full Text Available Abstract: Teaching Techniques, Types of Personality, and English Listening Skill. This study investigated the effect of teaching techniques and types of personality on English listening skill. This experimental study involved 88 students, who were determined randomly through a multi-stage random sampling technique. The results of the research indicate that there is an interaction effect between the teaching techniques and types of personality on the English listening skill; there is no significant difference in the listening skill between the group of students who learn using the game technique and those who learn using the song technique; the listening skill of students having extrovert personality is better than that of those having introvert personality; the listening skill of students having extrovert personality who learn using the game technique is lower than that of those who learn using the song technique; and the listening skill of students having introvert personality who learn using the game technique is higher than that of those who learn using the song technique.
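
    For readers unfamiliar with the design, multi-stage random sampling draws units hierarchically rather than from a single list. A minimal sketch with an invented frame (schools, then classes, then students; the stage sizes are chosen only so that 88 students result, as in the study):

```python
import random

random.seed(11)

# Hypothetical frame: 12 schools, each with several classes of 30 students.
frame = {f"school_{s}": {f"class_{c}": [f"s{s}c{c}i{i}" for i in range(30)]
                         for c in range(random.randint(3, 6))}
         for s in range(12)}

# Stage 1: randomly select schools; Stage 2: one class per selected school;
# Stage 3: random students within each selected class.
schools = random.sample(sorted(frame), k=4)
sample = []
for school in schools:
    cls = random.choice(sorted(frame[school]))
    sample.extend(random.sample(frame[school][cls], k=22))

print(len(sample), "students drawn from", schools)   # 4 x 22 = 88 students
```

    The practical appeal is that only the selected schools and classes ever need a full student list, which is why multi-stage designs are common when no complete sampling frame exists.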

  9. A fully blanketed early B star LTE model atmosphere using an opacity sampling technique

    International Nuclear Information System (INIS)

    Phillips, A.P.; Wright, S.L.

    1980-01-01

    A fully blanketed LTE model of a stellar atmosphere with T_e = 21914 K (θ_e = 0.23), log g = 4 is presented. The model includes an explicit representation of the opacity due to the strongest lines, and uses a statistical opacity sampling technique to represent the weaker line opacity. The sampling technique is subjected to several tests and the model is compared with an atmosphere calculated using the line-distribution function method. The limitations of the distribution function method and the particular opacity sampling method used here are discussed in the light of the results obtained. (author)
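
    Opacity sampling replaces a full frequency integration of the line opacity with a statistical sample of frequency points. A toy sketch of the idea follows; the spectrum, the line count, and the line widths are invented, and real implementations sample within frequency subintervals at every depth point rather than globally:

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy spectrum: flat continuum plus 500 narrow Gaussian "lines" at random positions.
nu = np.linspace(0.0, 1.0, 50_000)
kappa = np.ones_like(nu)
for c in rng.random(500):
    kappa += 40.0 * np.exp(-0.5 * ((nu - c) / 2e-4) ** 2)

full_mean = kappa.mean()                    # "exact" frequency-averaged opacity
for n in (100, 1_000, 10_000):
    idx = rng.integers(0, nu.size, size=n)  # randomly sampled frequency points
    print(f"n={n:6d}  sampled mean {kappa[idx].mean():7.3f}  vs full {full_mean:7.3f}")
```

    The sampled average converges statistically to the full integration at a fraction of the cost, which is what makes the technique attractive for model atmospheres with millions of lines.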

  10. Sample preparation techniques based on combustion reactions in closed vessels - A brief overview and recent applications

    International Nuclear Information System (INIS)

    Flores, Erico M.M.; Barin, Juliano S.; Mesko, Marcia F.; Knapp, Guenter

    2007-01-01

    In this review, a general discussion of sample preparation techniques based on combustion reactions in closed vessels is presented. Applications for several kinds of samples are described, taking into account the literature data reported in the last 25 years. The operational conditions as well as the main characteristics and drawbacks are discussed for bomb combustion, oxygen flask and microwave-induced combustion (MIC) techniques. Recent applications of MIC techniques are discussed with special concern for samples not well digested by conventional microwave-assisted wet digestion as, for example, coal and also for subsequent determination of halogens

  11. MC3D modelling of stratified explosion

    International Nuclear Information System (INIS)

    Picchi, S.; Berthoud, G.

    1999-01-01

    It is known that a steam explosion can occur in a stratified geometry and that the observed yields are lower than in the case of explosion in a premixture configuration. However, very few models are available to quantify the amount of melt which can be involved and the pressure peak that can be developed. In the stratified application of the MC3D code, mixing and fragmentation of the melt are explained by the growth of Kelvin Helmholtz instabilities due to the shear flow of the two phase coolant above the melt. Such a model is then used to recalculate the Frost-Ciccarelli tin-water experiment. Pressure peak, speed of propagation, bubble shape and erosion height are well reproduced as well as the influence of the inertial constraint (height of the water pool). (author)

  12. MC3D modelling of stratified explosion

    Energy Technology Data Exchange (ETDEWEB)

    Picchi, S.; Berthoud, G. [DTP/SMTH/LM2, CEA, 38 - Grenoble (France)

    1999-07-01

    It is known that a steam explosion can occur in a stratified geometry and that the observed yields are lower than in the case of explosion in a premixture configuration. However, very few models are available to quantify the amount of melt which can be involved and the pressure peak that can be developed. In the stratified application of the MC3D code, mixing and fragmentation of the melt are explained by the growth of Kelvin Helmholtz instabilities due to the shear flow of the two phase coolant above the melt. Such a model is then used to recalculate the Frost-Ciccarelli tin-water experiment. Pressure peak, speed of propagation, bubble shape and erosion height are well reproduced as well as the influence of the inertial constraint (height of the water pool). (author)

  13. Discrete element method (DEM) simulations of stratified sampling during solid dosage form manufacturing.

    Science.gov (United States)

    Hancock, Bruno C; Ketterhagen, William R

    2011-10-14

    Discrete element model (DEM) simulations of the discharge of powders from hoppers under gravity were analyzed to provide estimates of dosage form content uniformity during the manufacture of solid dosage forms (tablets and capsules). For a system that exhibits moderate segregation the effects of sample size, number, and location within the batch were determined. The various sampling approaches were compared to current best-practices for sampling described in the Product Quality Research Institute (PQRI) Blend Uniformity Working Group (BUWG) guidelines. Sampling uniformly across the discharge process gave the most accurate results with respect to identifying segregation trends. Sigmoidal sampling (as recommended in the PQRI BUWG guidelines) tended to overestimate potential segregation issues, whereas truncated sampling (common in industrial practice) tended to underestimate them. The size of the sample had a major effect on the absolute potency RSD. The number of sampling locations (10 vs. 20) had very little effect on the trends in the data, and the number of samples analyzed at each location (1 vs. 3 vs. 7) had only a small effect for the sampling conditions examined. The results of this work provide greater understanding of the effect of different sampling approaches on the measured content uniformity of real dosage forms, and can help to guide the choice of appropriate sampling protocols. Copyright © 2011 Elsevier B.V. All rights reserved.
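
    The three sampling schemes compared in the paper can be illustrated with a toy segregation profile (the potency drift and assay noise below are invented). Truncated sampling misses the segregated ends of the discharge and underestimates the spread, while sigmoidal sampling concentrates on the ends and overestimates it:

```python
import numpy as np

rng = np.random.default_rng(9)

def potency(t):
    """Toy segregation profile: potency drifts upward late in discharge, t in [0, 1]."""
    return 100.0 + 6.0 * t**3

def locations(scheme, k=10):
    if scheme == "uniform":      # even coverage of the whole discharge
        return np.linspace(0.0, 1.0, k)
    if scheme == "truncated":    # skips the beginning and end of the batch
        return np.linspace(0.15, 0.85, k)
    if scheme == "sigmoidal":    # clusters points at the two ends
        return 1.0 / (1.0 + np.exp(-np.linspace(-6.0, 6.0, k)))
    raise ValueError(scheme)

for scheme in ("uniform", "truncated", "sigmoidal"):
    t = locations(scheme)
    assay = potency(t) + rng.normal(0.0, 0.5, t.size)   # assay noise
    rsd = 100.0 * assay.std(ddof=1) / assay.mean()
    print(f"{scheme:9s}  potency RSD = {rsd:4.2f}%")
```

    This mirrors the paper's conclusion that uniform coverage of the discharge tracks segregation trends most faithfully.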

  14. The effect of existing turbulence on stratified shear instability

    Science.gov (United States)

    Kaminski, Alexis; Smyth, William

    2017-11-01

    Ocean turbulence is an essential process governing, for example, heat uptake by the ocean. In the stably-stratified ocean interior, this turbulence occurs in discrete events driven by vertical variations of the horizontal velocity. Typically, these events have been modelled by assuming an initially laminar stratified shear flow which develops wavelike instabilities, becomes fully turbulent, and then relaminarizes into a stable state. However, in the real ocean there is always some level of turbulence left over from previous events, and it is not yet understood how this turbulence impacts the evolution of future mixing events. Here, we perform a series of direct numerical simulations of turbulent events developing in stratified shear flows that are already at least weakly turbulent. We do so by varying the amplitude of the initial perturbations, and examine the subsequent development of the instability and the impact on the resulting turbulent fluxes. This work is supported by NSF Grant OCE1537173.

  15. Effects of pushing techniques in birth on mother and fetus: a randomized study.

    Science.gov (United States)

    Yildirim, Gulay; Beji, Nezihe Kizilkaya

    2008-03-01

    The Valsalva pushing technique is used routinely in the second stage of labor in many countries, and it is accepted as standard obstetric management in Turkey. The purpose of this study was to determine the effects of pushing techniques on mother and fetus in birth in this setting. This randomized study was conducted between July 2003 and June 2004 in Bakirkoy Maternity and Children's Teaching Hospital in Istanbul, Turkey. One hundred low-risk primiparas between 38 and 42 weeks' gestation, who expected a spontaneous vaginal delivery, were randomized to either a spontaneous pushing group or a Valsalva-type pushing group. Spontaneous pushing women were informed during the first stage of labor about spontaneous pushing technique (open glottis pushing while breathing out) and were supported in pushing spontaneously in the second stage of labor. Similarly, Valsalva pushing women were informed during the first stage of labor about the Valsalva pushing technique (closed glottis pushing while holding their breath) and were supported in using Valsalva pushing in the second stage of labor. Perineal tears, postpartum hemorrhage, and hemoglobin levels were evaluated in mothers; and umbilical artery pH, Po(2) (mmHg), and Pco(2) (mmHg) levels and Apgar scores at 1 and 5 minutes were evaluated in newborns in both groups. No significant differences were found between the two groups in their demographics, incidence of nonreassuring fetal surveillance patterns, or use of oxytocin. The second stage of labor and duration of the expulsion phase were significantly longer with Valsalva-type pushing. Differences in the incidence of episiotomy, perineal tears, or postpartum hemorrhage were not significant between the groups. The baby fared better with spontaneous pushing, with higher 1- and 5-minute Apgar scores, and higher umbilical cord pH and Po(2) levels. After the birth, women expressed greater satisfaction with spontaneous pushing. Educating women about the spontaneous pushing

  16. The application of statistical and/or non-statistical sampling techniques by internal audit functions in the South African banking industry

    Directory of Open Access Journals (Sweden)

    D.P. van der Nest

    2015-03-01

    Full Text Available This article explores the use by internal audit functions of audit sampling techniques in order to test the effectiveness of controls in the banking sector. The article focuses specifically on the use of statistical and/or non-statistical sampling techniques by internal auditors. The focus of the research for this article was internal audit functions in the banking sector of South Africa. The results discussed in the article indicate that audit sampling is still used frequently as an audit evidence-gathering technique. Non-statistical sampling techniques are used more frequently than statistical sampling techniques for the evaluation of the sample. In addition, both techniques are regarded as important for the determination of the sample size and the selection of the sample items
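
    For context, the canonical statistical technique in this setting is attribute sampling for tests of controls. A minimal sketch of the zero-expected-deviation sample-size rule, a textbook formula rather than anything taken from the article:

```python
import math

def attribute_sample_size(confidence, tolerable_rate):
    # Zero-expected-deviation attribute sampling: choose n so that, if the true
    # deviation rate equalled the tolerable rate, seeing zero deviations would
    # be unlikely: (1 - p)^n <= 1 - C  =>  n >= ln(1 - C) / ln(1 - p).
    return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - tolerable_rate))

for c in (0.90, 0.95):
    for p in (0.05, 0.10):
        n = attribute_sample_size(c, p)
        print(f"confidence {c:.0%}, tolerable deviation rate {p:.0%}: n = {n}")
```

    The familiar table values (e.g. n = 59 at 95% confidence and a 5% tolerable rate) fall out directly; non-statistical sampling, by contrast, sets such sizes by judgment.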

  17. Approximation of Quantities of Interest in Stochastic PDEs by the Random Discrete L^2 Projection on Polynomial Spaces

    KAUST Repository

    Migliorati, G.

    2013-05-30

    In this work we consider the random discrete L^2 projection on polynomial spaces (hereafter RDP) for the approximation of scalar quantities of interest (QOIs) related to the solution of a partial differential equation model with random input parameters. In the RDP technique the QOI is first computed for independent samples of the random input parameters, as in a standard Monte Carlo approach, and then the QOI is approximated by a multivariate polynomial function of the input parameters using a discrete least squares approach. We consider several examples including the Darcy equations with random permeability, the linear elasticity equations with random elastic coefficient, and the Navier-Stokes equations in random geometries and with random fluid viscosity. We show that the RDP technique is well suited to QOIs that depend smoothly on a moderate number of random parameters. Our numerical tests confirm the theoretical findings in [G. Migliorati, F. Nobile, E. von Schwerin, and R. Tempone, Analysis of the Discrete $L^2$ Projection on Polynomial Spaces with Random Evaluations, MOX report 46-2011, Politecnico di Milano, Milano, Italy, submitted], which have shown that, in the case of a single uniformly distributed random parameter, the RDP technique is stable and optimally convergent if the number of sampling points is proportional to the square of the dimension of the polynomial space. Here optimality means that the weighted $L^2$ norm of the RDP error is bounded from above by the best $L^\infty$ error achievable in the given polynomial space, up to logarithmic factors. In the case of several random input parameters, the numerical evidence indicates that the condition on quadratic growth of the number of sampling points could be relaxed to a linear growth and still achieve stable and optimal convergence. This makes the RDP technique very promising for moderately high dimensional uncertainty quantification.
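
    A one-parameter sketch of the RDP recipe: evaluate the QOI at random draws of the input, then fit by discrete least squares on a polynomial space, growing the number of samples quadratically with the polynomial degree as the cited analysis prescribes. The QOI below is an invented smooth function of one uniform parameter:

```python
import numpy as np

rng = np.random.default_rng(2)
qoi = lambda y: np.exp(y) * np.sin(3 * y)   # smooth QOI of one uniform parameter

for deg in (2, 4, 8):
    m = 2 * deg**2 + 1                      # samples ~ quadratic in the dimension
    y = rng.uniform(-1.0, 1.0, m)           # random evaluations of the input
    coeffs = np.polynomial.legendre.legfit(y, qoi(y), deg)   # discrete least squares
    t = np.linspace(-1.0, 1.0, 2001)
    err = np.max(np.abs(np.polynomial.legendre.legval(t, coeffs) - qoi(t)))
    print(f"degree {deg}, samples {m:3d}, max error {err:.2e}")
```

    With the quadratic oversampling the least-squares fit stays stable and the error decays rapidly with the degree, the behaviour the paper reports for smooth QOIs.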

  18. Phenomenology of two-dimensional stably stratified turbulence under large-scale forcing

    KAUST Repository

    Kumar, Abhishek; Verma, Mahendra K.; Sukhatme, Jai

    2017-01-01

    In this paper, we characterise the scaling of energy spectra, and the interscale transfer of energy and enstrophy, for strongly, moderately and weakly stably stratified two-dimensional (2D) turbulence, restricted in a vertical plane, under large-scale random forcing. In the strongly stratified case, a large-scale vertically sheared horizontal flow (VSHF) coexists with small scale turbulence. The VSHF consists of internal gravity waves, and the turbulent flow has a kinetic energy (KE) spectrum that follows an approximate k−3 scaling with zero KE flux and a robust positive enstrophy flux. The spectrum of the turbulent potential energy (PE) also approximately follows a k−3 power-law and its flux is directed to small scales. For moderate stratification, there is no VSHF and the KE of the turbulent flow exhibits Bolgiano–Obukhov scaling that transitions from a shallow k−11/5 form at large scales, to a steeper approximate k−3 scaling at small scales. The entire range of scales shows a strong forward enstrophy flux, and interestingly, large (small) scales show an inverse (forward) KE flux. The PE flux in this regime is directed to small scales, and the PE spectrum is characterised by an approximate k−1.64 scaling. Finally, for weak stratification, KE is transferred upscale and its spectrum closely follows a k−2.5 scaling, while PE exhibits a forward transfer and its spectrum shows an approximate k−1.6 power-law. For all stratification strengths, the total energy always flows from large to small scales and almost all the spectral indices are well explained by accounting for the scale-dependent nature of the corresponding flux.

  19. Phenomenology of two-dimensional stably stratified turbulence under large-scale forcing

    KAUST Repository

    Kumar, Abhishek

    2017-01-11

    In this paper, we characterise the scaling of energy spectra, and the interscale transfer of energy and enstrophy, for strongly, moderately and weakly stably stratified two-dimensional (2D) turbulence, restricted in a vertical plane, under large-scale random forcing. In the strongly stratified case, a large-scale vertically sheared horizontal flow (VSHF) coexists with small scale turbulence. The VSHF consists of internal gravity waves, and the turbulent flow has a kinetic energy (KE) spectrum that follows an approximate k−3 scaling with zero KE flux and a robust positive enstrophy flux. The spectrum of the turbulent potential energy (PE) also approximately follows a k−3 power-law and its flux is directed to small scales. For moderate stratification, there is no VSHF and the KE of the turbulent flow exhibits Bolgiano–Obukhov scaling that transitions from a shallow k−11/5 form at large scales, to a steeper approximate k−3 scaling at small scales. The entire range of scales shows a strong forward enstrophy flux, and interestingly, large (small) scales show an inverse (forward) KE flux. The PE flux in this regime is directed to small scales, and the PE spectrum is characterised by an approximate k−1.64 scaling. Finally, for weak stratification, KE is transferred upscale and its spectrum closely follows a k−2.5 scaling, while PE exhibits a forward transfer and its spectrum shows an approximate k−1.6 power-law. For all stratification strengths, the total energy always flows from large to small scales and almost all the spectral indices are well explained by accounting for the scale-dependent nature of the corresponding flux.

  20. A sub-sampled approach to extremely low-dose STEM

    Energy Technology Data Exchange (ETDEWEB)

    Stevens, A. [OptimalSensing, Southlake, Texas 76092, USA; Duke University, ECE, Durham, North Carolina 27708, USA; Luzi, L. [Rice University, ECE, Houston, Texas 77005, USA; Yang, H. [Lawrence Berkeley National Laboratory, Berkeley, California 94720, USA; Kovarik, L. [Pacific NW National Laboratory, Richland, Washington 99354, USA; Mehdi, B. L. [Pacific NW National Laboratory, Richland, Washington 99354, USA; University of Liverpool, Materials Engineering, Liverpool L69 3GH, United Kingdom; Liyu, A. [Pacific NW National Laboratory, Richland, Washington 99354, USA; Gehm, M. E. [Duke University, ECE, Durham, North Carolina 27708, USA; Browning, N. D. [Pacific NW National Laboratory, Richland, Washington 99354, USA; University of Liverpool, Materials Engineering, Liverpool L69 3GH, United Kingdom

    2018-01-22

    The inpainting of randomly sub-sampled images acquired by scanning transmission electron microscopy (STEM) is an attractive method for imaging under low-dose conditions (≤ 1 e⁻ Å⁻²) without changing either the operation of the microscope or the physics of the imaging process. We show that 1) adaptive sub-sampling increases acquisition speed, resolution, and sensitivity; and 2) random (non-adaptive) sub-sampling is equivalent to, but faster than, traditional low-dose techniques. Adaptive sub-sampling opens numerous possibilities for the analysis of beam sensitive materials and in-situ dynamic processes at the resolution limit of the aberration corrected microscope and is demonstrated here for the analysis of the node distribution in metal-organic frameworks (MOFs).

  1. Comparison of the efficacy of two anesthetic techniques of mandibular primary first molar: A randomized clinical trial

    Directory of Open Access Journals (Sweden)

    Davood Ghasemi Tudeshchoie

    2013-01-01

    Full Text Available Background: The most common technique to anesthetize mandibular primary teeth is the inferior alveolar (I.A.) nerve block injection, which induces relatively sustained anesthesia and in turn may potentially traumatize soft tissues. Therefore, there is a reasonable need for an alternative anesthetic technique with a shorter duration but the same efficacy. The aim of this study was a comparison of the efficacy of two anesthetic techniques for the mandibular primary first molar. Materials and Methods: In this randomized crossover clinical trial, 40 children aged 5 to 8 years, whose mandibular primary first molars were eligible for pulpotomy, were selected and divided randomly into two groups. The right and left mandibular first molars of group A were anesthetized with infiltration and I.A. nerve block techniques in the first and second sessions, respectively. The left and right mandibular first molars of group B were anesthetized with I.A. nerve block and infiltration techniques in the first and second sessions, respectively. The severity of pain was measured and recorded according to the sound-eye-motor scale by a single examiner. Data were analyzed using Wilcoxon Signed Rank and Mann-Whitney U tests (P < 0.05). Results: The severity of pain was lower with the infiltration technique than with the I.A. nerve block. There were no significant differences between the severities of pain on pulpal exposure for the two techniques. Conclusion: It seems that the infiltration technique is more favorable for anesthetizing the mandibular primary first molar compared to the I.A. nerve block.

  2. X-ray spectrometry and X-ray microtomography techniques for soil and geological samples analysis

    International Nuclear Information System (INIS)

    Kubala-Kukuś, A.; Banaś, D.; Braziewicz, J.; Dziadowicz, M.; Kopeć, E.; Majewska, U.; Mazurek, M.; Pajek, M.; Sobisz, M.; Stabrawa, I.; Wudarczyk-Moćko, J.; Góźdź, S.

    2015-01-01

    A particular subject of X-ray fluorescence analysis is its application in studies of the multielemental composition of samples over a wide range of concentrations, including samples with different matrices, inhomogeneous samples, and samples characterized by different grain sizes. Typical examples of these kinds of samples are soil and geological samples, for which XRF elemental analysis may be difficult due to XRF disturbing effects. In this paper the WDXRF technique was applied in the elemental analysis of different soil and geological samples (therapeutic mud, floral soil, brown soil, sandy soil, calcium aluminum cement). The sample morphology was analyzed using the X-ray microtomography technique. The paper discusses the differences between the compositions of the samples, the influence of sample-preparation procedures on sample morphology and, finally, a quantitative analysis. The results of the studies were statistically tested (one-way ANOVA and correlation coefficients). For the determination of lead concentration in samples of sandy soil and cement-like matrix, a WDXRF spectrometer calibration was performed. The elemental analysis of the samples was complemented with knowledge of the chemical composition obtained by X-ray powder diffraction.

  3. X-ray spectrometry and X-ray microtomography techniques for soil and geological samples analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kubala-Kukuś, A.; Banaś, D.; Braziewicz, J. [Institute of Physics, Jan Kochanowski University, ul. Świetokrzyska 15, 25-406 Kielce (Poland); Holycross Cancer Center, ul. Artwińskiego 3, 25-734 Kielce (Poland); Dziadowicz, M.; Kopeć, E. [Institute of Physics, Jan Kochanowski University, ul. Świetokrzyska 15, 25-406 Kielce (Poland); Majewska, U. [Institute of Physics, Jan Kochanowski University, ul. Świetokrzyska 15, 25-406 Kielce (Poland); Holycross Cancer Center, ul. Artwińskiego 3, 25-734 Kielce (Poland); Mazurek, M.; Pajek, M.; Sobisz, M.; Stabrawa, I. [Institute of Physics, Jan Kochanowski University, ul. Świetokrzyska 15, 25-406 Kielce (Poland); Wudarczyk-Moćko, J. [Holycross Cancer Center, ul. Artwińskiego 3, 25-734 Kielce (Poland); Góźdź, S. [Holycross Cancer Center, ul. Artwińskiego 3, 25-734 Kielce (Poland); Institute of Public Health, Jan Kochanowski University, IX Wieków Kielc 19, 25-317 Kielce (Poland)

    2015-12-01

    A particular subject of X-ray fluorescence analysis is its application in studies of the multielemental composition of samples over a wide range of concentrations, including samples with different matrices, inhomogeneous samples, and samples characterized by different grain sizes. Typical examples of these kinds of samples are soil and geological samples, for which XRF elemental analysis may be difficult due to XRF disturbing effects. In this paper the WDXRF technique was applied in the elemental analysis of different soil and geological samples (therapeutic mud, floral soil, brown soil, sandy soil, calcium aluminum cement). The sample morphology was analyzed using the X-ray microtomography technique. The paper discusses the differences between the compositions of the samples, the influence of sample-preparation procedures on sample morphology and, finally, a quantitative analysis. The results of the studies were statistically tested (one-way ANOVA and correlation coefficients). For the determination of lead concentration in samples of sandy soil and cement-like matrix, a WDXRF spectrometer calibration was performed. The elemental analysis of the samples was complemented with knowledge of the chemical composition obtained by X-ray powder diffraction.

  4. Sampling methods for rumen microbial counts by Real-Time PCR techniques

    Directory of Open Access Journals (Sweden)

    S. Puppo

    2010-02-01

    Full Text Available Fresh rumen samples were withdrawn from 4 cannulated buffalo females fed a fibrous diet in order to quantify bacteria concentration in the rumen by Real-Time PCR techniques. To obtain DNA of a good quality from whole rumen fluid, eight (M1-M8) different pre-filtration methods (cheese cloths, glass-fibre and nylon filters) in combination with various centrifugation speeds (1000, 5000 and 14,000 rpm) were tested. Genomic DNA extraction was performed either on fresh or frozen samples (-20°C). The quantitative bacteria analysis was realized according to the Real-Time PCR procedure for Butyrivibrio fibrisolvens reported in the literature. M5 proved to be the best sampling procedure, allowing a suitable genomic DNA to be obtained. No differences were revealed between fresh and frozen samples.

  5. Randomization of grab-sampling strategies for estimating the annual exposure of U miners to Rn daughters.

    Science.gov (United States)

    Borak, T B

    1986-04-01

    Periodic grab sampling in combination with time-of-occupancy surveys has been the accepted procedure for estimating the annual exposure of underground U miners to Rn daughters. Temporal variations in the concentration of potential alpha energy in the mine generate uncertainties in this process. A system to randomize the selection of locations for measurement is described which can reduce uncertainties and eliminate systematic biases in the data. In general, a sample frequency of 50 measurements per year is sufficient to satisfy the criteria that the annual exposure be determined in working level months to within +/- 50% of the true value with a 95% level of confidence. Suggestions for implementing this randomization scheme are presented.
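
    A toy version of the randomized grab-sampling estimate (the mine's concentration statistics and occupied hours are invented; the conversion uses the standard definition that 1 WLM corresponds to 170 working hours at 1 WL):

```python
import numpy as np

rng = np.random.default_rng(4)

# Invented mine: working-level (WL) concentration for each occupied hour of the
# year, with lognormal temporal variability.
hours = 2000                                   # occupied hours per year (assumed)
wl = rng.lognormal(mean=np.log(0.3), sigma=0.6, size=hours)
true_wlm = wl.sum() / 170.0                    # 1 WLM = 170 h at 1 WL

def grab_estimate(n):
    picks = rng.choice(hours, size=n, replace=False)   # randomized sampling times
    return wl[picks].mean() * hours / 170.0

est = np.array([grab_estimate(50) for _ in range(10_000)])
within = np.mean(np.abs(est - true_wlm) / true_wlm <= 0.5)
print(f"true exposure {true_wlm:.2f} WLM; "
      f"50 random grabs land within ±50% in {within:.1%} of trials")
```

    Randomizing the measurement times is what removes the systematic bias; the abstract's criterion (±50% with 95% confidence at about 50 measurements per year) then becomes a question of the temporal variance alone.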

  6. A systematic examination of a random sampling strategy for source apportionment calculations.

    Science.gov (United States)

    Andersson, August

    2011-12-15

    Estimating the relative contributions from multiple potential sources of a specific component in a mixed environmental matrix is a general challenge in diverse fields such as atmospheric, environmental and earth sciences. Perhaps the most common strategy for tackling such problems is by setting up a system of linear equations for the fractional influence of different sources. Even though an algebraic solution of this approach is possible for the common situation with N+1 sources and N source markers, such methodology introduces a bias, since it is implicitly assumed that the calculated fractions and the corresponding uncertainties are independent of the variability of the source distributions. Here, a random sampling (RS) strategy for accounting for such statistical bias is examined by investigating rationally designed synthetic data sets. This random sampling methodology is found to be robust and accurate with respect to reproducibility and predictability. This method is also compared to a numerical integration solution for a two-source situation where source variability also is included. A general observation from this examination is that the variability of the source profiles not only affects the calculated precision but also the mean/median source contributions. Copyright © 2011 Elsevier B.V. All rights reserved.
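
    A minimal sketch of the random-sampling (RS) strategy for the simplest mixing problem, two sources and one marker, with invented source signatures: the algebraic solution is evaluated over random draws of the source profiles, so source variability propagates into both the spread and the central estimate of the fractions, which is the bias the paper highlights:

```python
import numpy as np

rng = np.random.default_rng(8)

# Two sources, one marker (e.g. an isotopic signature): f1*d1 + (1-f1)*d2 = d_mix.
d1_mu, d1_sd = -27.0, 1.0     # source 1 signature, mean and spread (invented)
d2_mu, d2_sd = -19.0, 1.5     # source 2 signature (invented)
d_mix = -23.0                 # measured mixture value (invented)

draws = 100_000
d1 = rng.normal(d1_mu, d1_sd, draws)
d2 = rng.normal(d2_mu, d2_sd, draws)
f1 = (d_mix - d2) / (d1 - d2)            # algebraic solution, one draw at a time
f1 = f1[(f1 >= 0.0) & (f1 <= 1.0)]       # keep physically meaningful fractions
print(f"fraction from source 1: median {np.median(f1):.2f}, "
      f"central 95% [{np.quantile(f1, 0.025):.2f}, {np.quantile(f1, 0.975):.2f}]")
```

    Because the solution is a non-linear function of the source signatures, the median of the sampled fractions can shift away from the single-point algebraic answer, illustrating why the variability affects the mean and not just the precision.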

  7. Simulation of steam explosion in stratified melt-coolant configuration

    International Nuclear Information System (INIS)

    Leskovar, Matjaž; Centrih, Vasilij; Uršič, Mitja

    2016-01-01

    Highlights: • Strong steam explosions may develop spontaneously in stratified configurations. • Considerable melt-coolant premixed layer formed in subcooled water with hot melts. • Analysis with MC3D code provided insight into stratified steam explosion phenomenon. • Up to 25% of poured melt was mixed with water and available for steam explosion. • Better instrumented experiments needed to determine dominant mixing process. - Abstract: A steam explosion is an energetic fuel coolant interaction process, which may occur during a severe reactor accident when the molten core comes into contact with the coolant water. In nuclear reactor safety analyses steam explosions are primarily considered in melt jet-coolant pool configurations where sufficiently deep coolant pool conditions provide complete jet breakup and efficient premixture formation. Stratified melt-coolant configurations, i.e. a molten melt layer below a coolant layer, were until now believed to be unable to generate strong explosive interactions. Based on the hypothesis that there are no interfacial instabilities in a stratified configuration it was assumed that the amount of melt in the premixture is insufficient to produce strong explosions. However, the recently performed experiments in the PULiMS and SES (KTH, Sweden) facilities with oxidic corium simulants revealed that strong steam explosions may develop spontaneously also in stratified melt-coolant configurations, where with high temperature melts and subcooled water conditions a considerable melt-coolant premixed layer is formed. In the article, the performed study of steam explosions in a stratified melt-coolant configuration in PULiMS-like conditions is presented. The goal of this analytical work is to supplement the experimental activities within the PULiMS research program by addressing the key questions, especially regarding the explosivity of the formed premixed layer and the mechanisms responsible for the melt-water mixing. To

  8. Characterization of electron microscopes with binary pseudo-random multilayer test samples

    International Nuclear Information System (INIS)

    Yashchuk, Valeriy V.; Conley, Raymond; Anderson, Erik H.; Barber, Samuel K.; Bouet, Nathalie; McKinney, Wayne R.; Takacs, Peter Z.; Voronov, Dmitriy L.

    2010-01-01

    We discuss the results of SEM and TEM measurements with the BPRML test samples fabricated from a BPRML (WSi2/Si with fundamental layer thickness of 3 nm) with a Dual Beam FIB (focused ion beam)/SEM technique. In particular, we demonstrate that significant information about the metrological reliability of the TEM measurements can be extracted even when the fundamental frequency of the BPRML sample is smaller than the Nyquist frequency of the measurements. The measurements demonstrate a number of problems related to the interpretation of the SEM and TEM data. Note that similar BPRML test samples can be used to characterize x-ray microscopes. Corresponding work with x-ray microscopes is in progress.

  9. Automated interpretable computational biology in the clinic: a framework to predict disease severity and stratify patients from clinical data

    Directory of Open Access Journals (Sweden)

    Soumya Banerjee

    2017-10-01

    Full Text Available We outline an automated computational and machine learning framework that predicts disease severity and stratifies patients. We apply our framework to available clinical data. Our algorithm automatically generates insights and predicts disease severity with minimal operator intervention. The computational framework presented here can be used to stratify patients, predict disease severity and propose novel biomarkers for disease. Insights from machine learning algorithms coupled with clinical data may help guide therapy, personalize treatment and help clinicians understand the change in disease over time. Computational techniques like these can be used in translational medicine in close collaboration with clinicians and healthcare providers. Our models are also interpretable, allowing clinicians with minimal machine learning experience to engage in model building. This work is a step towards automated machine learning in the clinic.

  10. A smart rotary technique versus conventional pulpectomy for primary teeth: A randomized controlled clinical study.

    Science.gov (United States)

    Mokhtari, Negar; Shirazi, Alireza-Sarraf; Ebrahimi, Masoumeh

    2017-11-01

    Techniques that offer adequate accuracy of working length determination along with a shorter duration of treatment seem essential for pulpectomy procedures in pediatric dentistry. The aim of the present study was to evaluate the accuracy of root canal length measurement with the Root ZX II apex locator and a rotary system in pulpectomy of primary teeth. In this randomized controlled clinical trial, complete pulpectomy was performed on 80 mandibular primary molars in 80 children aged 4-6 years. The study population was randomly divided into case and control groups. In the control group conventional pulpectomy was performed, and in the case group the working length was determined by the Root ZX II electronic apex locator and the canals were instrumented with Mtwo rotary files. Statistical evaluation was performed using Mann-Whitney and Chi-Square tests (P < 0.05). There were no significant differences between the Root ZX II electronic apex locator and the conventional method in the accuracy of root canal length determination. However, significantly less time was needed for instrumentation with rotary files (P = 0.000). Considering the comparable accuracy of root canal length determination and the considerably shorter instrumentation time, the Root ZX II apex locator and rotary system may be suggested for pulpectomy in primary molar teeth. Key words: Rotary technique, conventional technique, pulpectomy, primary teeth.

  11. Sample preparation techniques in trace element analysis by X-ray emission spectroscopy

    International Nuclear Information System (INIS)

    Valkovic, V.

    1983-11-01

    The report, written under a research contract with the IAEA, contains a detailed presentation of the most difficult problem encountered in the trace element analysis by methods of the X-ray emission spectroscopy, namely the sample preparation techniques. The following items are covered. Sampling - with specific consideration of aerosols, water, soil, biological materials, petroleum and its products, storage of samples and their handling. Pretreatment of samples - preconcentration, ashing, solvent extraction, ion exchange and electrodeposition. Sample preparations for PIXE - analysis - backings, target uniformity and homogeneity, effects of irradiation, internal standards and specific examples of preparation (aqueous, biological, blood serum and solid samples). Sample preparations for radioactive sources or tube excitation - with specific examples (water, liquid and solid samples, soil, geological, plants and tissue samples). Finally, the problem of standards and reference materials, as well as that of interlaboratory comparisons, is discussed

  12. Mapping aboveground woody biomass using forest inventory, remote sensing and geostatistical techniques.

    Science.gov (United States)

    Yadav, Bechu K V; Nandy, S

    2015-05-01

    Mapping forest biomass is fundamental for estimating CO₂ emissions, and planning and monitoring of forests and ecosystem productivity. The present study attempted to map aboveground woody biomass (AGWB) integrating forest inventory, remote sensing and geostatistical techniques, viz., direct radiometric relationships (DRR), k-nearest neighbours (k-NN) and cokriging (CoK) and to evaluate their accuracy. A part of the Timli Forest Range of Kalsi Soil and Water Conservation Division, Uttarakhand, India was selected for the present study. Stratified random sampling was used to collect biophysical data from 36 sample plots of 0.1 ha (31.62 m × 31.62 m) size. Species-specific volumetric equations were used for calculating volume and multiplied by specific gravity to get biomass. Three forest-type density classes, viz. 10-40, 40-70 and >70% of Shorea robusta forest and four non-forest classes were delineated using on-screen visual interpretation of IRS P6 LISS-III data of December 2012. The volume in different strata of forest-type density ranged from 189.84 to 484.36 m(3) ha(-1). The total growing stock of the forest was found to be 2,024,652.88 m(3). The AGWB ranged from 143 to 421 Mgha(-1). Spectral bands and vegetation indices were used as independent variables and biomass as dependent variable for DRR, k-NN and CoK. After validation and comparison, k-NN method of Mahalanobis distance (root mean square error (RMSE) = 42.25 Mgha(-1)) was found to be the best method followed by fuzzy distance and Euclidean distance with RMSE of 44.23 and 45.13 Mgha(-1) respectively. DRR was found to be the least accurate method with RMSE of 67.17 Mgha(-1). The study highlighted the potential of integrating of forest inventory, remote sensing and geostatistical techniques for forest biomass mapping.
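
    A sketch of the k-NN step with the Mahalanobis metric that the study found most accurate. The plot data below are synthetic stand-ins; the real predictors were Landsat bands and vegetation indices, and the real response was plot biomass:

```python
import numpy as np

rng = np.random.default_rng(6)

# Synthetic stand-ins: spectral features for 36 field plots with known biomass,
# plus a few target pixels to predict.
X_train = rng.normal(size=(36, 4))                    # bands / vegetation indices
biomass = 250 + 60 * X_train[:, 0] - 30 * X_train[:, 2] + rng.normal(0, 20, 36)
X_new = rng.normal(size=(5, 4))

VI = np.linalg.inv(np.cov(X_train, rowvar=False))     # inverse feature covariance

def knn_mahalanobis(x, k=5):
    diff = X_train - x
    d2 = np.einsum("ij,jk,ik->i", diff, VI, diff)     # squared Mahalanobis distance
    nn = np.argsort(d2)[:k]
    w = 1.0 / (np.sqrt(d2[nn]) + 1e-9)                # inverse-distance weights
    return np.average(biomass[nn], weights=w)

print([round(knn_mahalanobis(x), 1) for x in X_new])  # predicted biomass, Mg/ha
```

    The Mahalanobis metric whitens the feature space before measuring nearness, so correlated and differently scaled bands do not dominate the neighbour search, a plausible reason it edged out the Euclidean and fuzzy variants in the reported RMSEs.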

  13. Network Sampling with Memory: A proposal for more efficient sampling from social networks

    Science.gov (United States)

    Mouw, Ted; Verdery, Ashton M.

    2013-01-01

    Techniques for sampling from networks have grown into an important area of research across several fields. For sociologists, the possibility of sampling from a network is appealing for two reasons: (1) A network sample can yield substantively interesting data about network structures and social interactions, and (2) it is useful in situations where study populations are difficult or impossible to survey with traditional sampling approaches because of the lack of a sampling frame. Despite its appeal, methodological concerns about the precision and accuracy of network-based sampling methods remain. In particular, recent research has shown that sampling from a network using a random walk based approach such as Respondent Driven Sampling (RDS) can result in high design effects (DE)—the ratio of the sampling variance to the sampling variance of simple random sampling (SRS). A high design effect means that more cases must be collected to achieve the same level of precision as SRS. In this paper we propose an alternative strategy, Network Sampling with Memory (NSM), which collects network data from respondents in order to reduce design effects and, correspondingly, the number of interviews needed to achieve a given level of statistical power. NSM combines a “List” mode, where all individuals on the revealed network list are sampled with the same cumulative probability, with a “Search” mode, which gives priority to bridge nodes connecting the current sample to unexplored parts of the network. We test the relative efficiency of NSM compared to RDS and SRS on 162 school and university networks from Add Health and Facebook that range in size from 110 to 16,278 nodes. The results show that the average design effect for NSM on these 162 networks is 1.16, which is very close to the efficiency of a simple random sample (DE=1), and 98.5% lower than the average DE we observed for RDS. PMID:24159246
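
    Since the design effect drives the whole comparison, it helps to see how small a DE computation really is: it is the variance of the design's replicated estimates divided by the variance a simple random sample of the same size would have. A minimal sketch with a synthetic binary node attribute (all numbers hypothetical); for SRS replicates the ratio comes out near 1, matching the paper's benchmark.

    ```python
    import numpy as np

    def design_effect(estimates, population, n):
        """DE = Var(estimator under the design) / Var(sample mean under SRS)."""
        var_design = np.var(estimates, ddof=1)
        N = len(population)
        # SRS variance of a mean, with finite-population correction
        var_srs = (np.var(population, ddof=1) / n) * (1 - n / N)
        return var_design / var_srs

    rng = np.random.default_rng(1)
    pop = rng.binomial(1, 0.3, size=5000)        # a binary node attribute
    reps = [pop[rng.choice(5000, 300, replace=False)].mean() for _ in range(1000)]
    print(round(design_effect(np.array(reps), pop, 300), 2))   # ≈ 1 for SRS
    ```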

  14. Comparison of Sampling Designs for Estimating Deforestation from Landsat TM and MODIS Imagery: A Case Study in Mato Grosso, Brazil

    Directory of Open Access Journals (Sweden)

    Shanyou Zhu

    2014-01-01

    Full Text Available Sampling designs are commonly used to estimate deforestation over large areas, but comparisons between different sampling strategies are required. Using PRODES deforestation data as a reference, deforestation in the state of Mato Grosso in Brazil from 2005 to 2006 is evaluated using Landsat imagery and a nearly synchronous MODIS dataset. The MODIS-derived deforestation is used to assist in sampling and extrapolation. Three sampling designs are compared according to the estimated deforestation of the entire study area based on simple extrapolation and linear regression models. The results show that stratified sampling for strata construction and sample allocation using the MODIS-derived deforestation hotspots provided more precise estimations than simple random and systematic sampling. Moreover, the relationship between the MODIS-derived and TM-derived deforestation provides a precise estimate of the total deforestation area as well as the distribution of deforestation in each block.
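
    The stratified estimator that wins this comparison is worth spelling out: blocks are grouped into strata by MODIS-derived deforestation intensity, a few blocks are sampled per stratum, and the stratum totals are summed with the usual finite-population variance. A minimal sketch with invented strata and block values:

    ```python
    import numpy as np

    # Hypothetical strata from MODIS-derived deforestation hotspots:
    # stratum -> (number of blocks N_h, TM-derived deforestation of sampled blocks, km^2)
    strata = {
        "high":   (120, np.array([8.2, 7.5, 9.1, 6.8])),
        "medium": (300, np.array([2.1, 1.8, 2.6, 2.0])),
        "low":    (580, np.array([0.3, 0.2, 0.4, 0.1])),
    }

    total, var = 0.0, 0.0
    for N_h, y_h in strata.values():
        n_h = len(y_h)
        total += N_h * y_h.mean()                                   # stratum total
        var += N_h**2 * (1 - n_h / N_h) * y_h.var(ddof=1) / n_h     # its variance
    print(f"estimated total = {total:.0f} km^2, SE = {var**0.5:.0f} km^2")
    ```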

  15. RADIAL STABILITY IN STRATIFIED STARS

    International Nuclear Information System (INIS)

    Pereira, Jonas P.; Rueda, Jorge A.

    2015-01-01

    We formulate within a generalized distributional approach the treatment of the stability against radial perturbations for both neutral and charged stratified stars in Newtonian and Einstein's gravity. We obtain from this approach the boundary conditions connecting any two phases within a star and underline its relevance for realistic models of compact stars with phase transitions, owing to the modification of the star's set of eigenmodes with respect to the continuous case

  16. A Systematic Review of Surgical Randomized Controlled Trials: Part 2. Funding Source, Conflict of Interest, and Sample Size in Plastic Surgery.

    Science.gov (United States)

    Voineskos, Sophocles H; Coroneos, Christopher J; Ziolkowski, Natalia I; Kaur, Manraj N; Banfield, Laura; Meade, Maureen O; Chung, Kevin C; Thoma, Achilleas; Bhandari, Mohit

    2016-02-01

    The authors examined industry support, conflict of interest, and sample size in plastic surgery randomized controlled trials that compared surgical interventions. They hypothesized that industry-funded trials demonstrate statistically significant outcomes more often, and randomized controlled trials with small sample sizes report statistically significant results more frequently. An electronic search identified randomized controlled trials published between 2000 and 2013. Independent reviewers assessed manuscripts and performed data extraction. Funding source, conflict of interest, primary outcome direction, and sample size were examined. Chi-squared and independent-samples t tests were used in the analysis. The search identified 173 randomized controlled trials, of which 100 (58 percent) did not acknowledge funding status. A relationship between funding source and trial outcome direction was not observed. Both funding status and conflict of interest reporting improved over time. Only 24 percent (six of 25) of industry-funded randomized controlled trials reported authors to have independent control of data and manuscript contents. The mean number of patients randomized was 73 per trial (median, 43; minimum, 3; maximum, 936). Small trials were not found to be positive more often than large trials (p = 0.87). Randomized controlled trials with small sample size were common; however, this provides great opportunity for the field to engage in further collaboration and produce larger, more definitive trials. Reporting of trial funding and conflict of interest is historically poor, but it greatly improved over the study period. Underreporting at author and journal levels remains a limitation when assessing the relationship between funding source and trial outcomes. Improved reporting and manuscript control should be goals that both authors and journals can actively achieve.

  17. Large-volume constant-concentration sampling technique coupling with surface-enhanced Raman spectroscopy for rapid on-site gas analysis

    Science.gov (United States)

    Zhang, Zhuomin; Zhan, Yisen; Huang, Yichun; Li, Gongke

    2017-08-01

    In this work, a portable large-volume constant-concentration (LVCC) sampling technique coupled with surface-enhanced Raman spectroscopy (SERS) was developed for rapid on-site gas analysis based on suitable derivatization methods. The LVCC sampling technique mainly consisted of a specially designed sampling cell, including a rigid sample container and a flexible sampling bag, and an absorption-derivatization module with a portable pump and a gas flowmeter. The LVCC sampling technique allowed a large, alterable and well-controlled sampling volume, which kept the concentration of the gas target in the headspace phase constant during the entire sampling process and made the sampling result more representative. Moreover, absorption and derivatization of the gas target during the LVCC sampling process were efficiently merged in one step using the bromine-thiourea and OPA-NH₄⁺ strategies for ethylene and SO₂ respectively, which made the LVCC sampling technique conveniently adaptable to subsequent SERS analysis. Finally, a new LVCC sampling-SERS method was developed and successfully applied for rapid analysis of trace ethylene and SO₂ from fruits. Trace ethylene and SO₂ from real fruit samples could be accurately quantified by this method. The concentration fluctuations of ethylene and SO₂ during the entire LVCC sampling process were proved to be minor, and recoveries from real samples were achieved in the range of 95.0-101% and 97.0-104% respectively. It is expected that the portable LVCC sampling technique will pave the way for rapid on-site analysis of accurate concentrations of trace gas targets from real samples by SERS.

  18. Distributed fiber sparse-wideband vibration sensing by sub-Nyquist additive random sampling

    Science.gov (United States)

    Zhang, Jingdong; Zheng, Hua; Zhu, Tao; Yin, Guolu; Liu, Min; Bai, Yongzhong; Qu, Dingrong; Qiu, Feng; Huang, Xianbing

    2018-05-01

    The round trip time of the light pulse limits the maximum detectable vibration frequency response range of phase-sensitive optical time domain reflectometry (φ-OTDR). Unlike the uniform laser pulse interval in conventional φ-OTDR, we randomly modulate the pulse interval, so that an equivalent sub-Nyquist additive random sampling (sNARS) is realized for every sensing point of the long interrogation fiber. For a φ-OTDR system with 10 km sensing length, the sNARS method is optimized by theoretical analysis and Monte Carlo simulation, and the experimental results verify that a wideband sparse signal can be identified and reconstructed. Such a method can broaden the vibration frequency response range of φ-OTDR, which is of great significance in sparse-wideband-frequency vibration signal detection, such as rail track monitoring and metal defect detection.
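
    The essence of additive random sampling is that each sampling instant is the previous one plus a nominal interval and a random increment, which decorrelates aliases and lets a sparse spectrum be identified above the nominal Nyquist limit. The toy sketch below illustrates this with a Lomb-Scargle periodogram standing in for the paper's reconstruction; the interval, jitter range and test frequency are all hypothetical, not the φ-OTDR system's parameters.

    ```python
    import numpy as np
    from scipy.signal import lombscargle

    rng = np.random.default_rng(2)
    T = 1e-3                                   # nominal pulse interval (1 kHz rate)
    jitter = rng.uniform(0.0, 0.5 * T, size=2000)
    t = np.cumsum(T + jitter)                  # additive random sampling instants

    f_sig = 2200.0                             # well above the ~400 Hz mean Nyquist limit
    y = np.sin(2 * np.pi * f_sig * t)

    freqs = np.linspace(100.0, 4000.0, 8000)   # trial frequencies, Hz
    pgram = lombscargle(t, y, 2 * np.pi * freqs)
    print(freqs[np.argmax(pgram)])             # peaks near 2200 Hz: no alias ambiguity
    ```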

  19. Comparative Study of Radon Concentration with Two Techniques and Elemental Analysis in Drinking Water Samples of the Jammu District, Jammu and Kashmir, India.

    Science.gov (United States)

    Kumar, Ajay; Kaur, Manpreet; Mehra, Rohit; Sharma, Dinesh Kumar; Mishra, Rosaline

    2017-10-01

    The level of radon concentration has been assessed using the Advanced SMART RnDuo technique in 30 drinking water samples from Jammu district, Jammu and Kashmir, India. The water samples were collected from wells, hand pumps, submersible pumps, and stored waters. The randomly obtained 14 values of radon concentration in water sources using the SMART RnDuo technique have been compared and cross-checked by a RAD7 device. A good positive correlation (R = 0.88) has been observed between the two techniques. The overall value of radon concentration in various water sources ranged from 2.45 to 18.43 Bq L⁻¹, with a mean value of 8.24 ± 4.04 Bq L⁻¹, and it agreed well with the recommended limit suggested by the European Commission and UNSCEAR. However, higher mean radon concentrations were found in groundwater drawn from wells, hand pumps and submersible pumps as compared to stored water. The total annual effective dose due to radon inhalation and ingestion ranged from 6.69 to 50.31 μSv y⁻¹ with a mean value of 22.48 ± 11.03 μSv y⁻¹. The total annual effective dose was found to lie within the safe limit (100 μSv y⁻¹) suggested by WHO. Heavy metal analysis was also carried out in various water sources by using an atomic absorption spectrophotometer (AAS), and the highest values of heavy metals were found mostly in groundwater samples. The obtained results were compared with the limits recommended by Indian and international organizations such as the WHO and the EU Council. In all of the samples, the elemental concentrations did not exceed the permissible limits.

  20. Soil mixing of stratified contaminated sands.

    Science.gov (United States)

    Al-Tabba, A; Ayotamuno, M J; Martin, R J

    2000-02-01

    Validation of soil mixing for the treatment of contaminated ground is needed in a wide range of site conditions to widen the application of the technology and to understand the mechanisms involved. Since very limited work has been carried out in heterogeneous ground conditions, this paper investigates the effectiveness of soil mixing in stratified sands using laboratory-scale augers. This enabled a low cost investigation of factors such as grout type and form, auger design, installation procedure, mixing mode, curing period, thickness of soil layers and natural moisture content on the unconfined compressive strength, leachability and leachate pH of the soil-grout mixes. The results showed that the auger design plays a very important part in the mixing process in heterogeneous sands. The variability of the properties measured in the stratified soils and the measurable variations caused by the various factors considered highlighted the importance of duplicating appropriate in situ conditions, the usefulness of laboratory-scale modelling of in situ conditions and the importance of modelling soil and contaminant heterogeneities at the treatability study stage.

  1. Experimental technique to measure thoron generation rate of building material samples using RAD7 detector

    International Nuclear Information System (INIS)

    Csige, I.; Szabó, Zs.; Szabó, Cs.

    2013-01-01

    Thoron (²²⁰Rn) is the second most abundant radon isotope in our living environment. In some dwellings it is present in significant amounts, which calls for its identification and remediation. Indoor thoron originates mainly from building materials. In this work we have developed and tested an experimental technique to measure the thoron generation rate in building material samples using the RAD7 radon-thoron detector. The mathematical model of the measurement technique provides the thoron concentration response of the RAD7 as a function of the sample thickness. For experimental validation of the technique, an adobe building material sample was selected and the thoron concentration was measured at nineteen different sample thicknesses. By fitting the parameters of the model to the measurement results, both the generation rate and the diffusion length of thoron were estimated. We have also determined the optimal sample thickness for estimating the thoron generation rate from a single measurement. -- Highlights: • RAD7 is used for the determination of thoron generation rate (emanation). • The described model takes into account the thoron decay and attenuation. • The model describes well the experimental results. • A single point measurement method is offered at a determined sample thickness
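
    As a rough illustration of the fitting step, the sketch below assumes a simple one-dimensional diffusion form in which the detector response saturates as A·tanh(d/L) with sample thickness d; the functional form, the data points and the resulting parameter values are assumptions for illustration, not the paper's actual model or measurements.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def response(d, A, L):
        # Assumed 1-D diffusion form: the signal saturates once the sample is
        # thicker than a few thoron diffusion lengths L.
        return A * np.tanh(d / L)

    # Hypothetical thickness (cm) vs RAD7 thoron concentration (Bq/m^3)
    d = np.array([0.5, 1, 2, 3, 4, 5, 7, 10])
    c = np.array([210, 390, 640, 760, 810, 840, 855, 860])

    (A, L), _ = curve_fit(response, d, c, p0=(800, 2))
    print(f"saturation response A = {A:.0f} Bq/m^3, diffusion length L = {L:.1f} cm")
    # A sensible single-measurement thickness is where the curve nears
    # saturation, e.g. d ≈ 3 L.
    ```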

  2. Stratified charge rotary aircraft engine technology enablement program

    Science.gov (United States)

    Badgley, P. R.; Irion, C. E.; Myers, D. M.

    1985-01-01

    The multifuel stratified charge rotary engine is discussed. A single rotor, 0.7 L/40 cu in displacement, research rig engine was tested. The research rig engine was designed for operation at high speeds and pressures, with combustion chamber peak pressure providing margin for speed and load excursions above the design requirement for a highly advanced aircraft engine. It is indicated that the single rotor research rig engine is capable of meeting the established design requirements of 120 kW, 8,000 RPM, 1,379 kPa BMEP. The research rig engine, when fully developed, will be a valuable tool for investigating advanced and highly advanced technology components, and for providing an understanding of the stratified charge rotary engine combustion process.

  3. A review of recent developments on turbulent entrainment in stratified flows

    International Nuclear Information System (INIS)

    Cotel, Aline J

    2010-01-01

    Stratified interfaces are present in many geophysical flow situations, and transport across such an interface is an essential factor for correctly evaluating the physical processes taking place at many spatial and temporal scales in such flows. In order to accurately evaluate vertical and lateral transport occurring when a turbulent flow impinges on a stratified interface, the turbulent entrainment and vorticity generation mechanisms near the interface must be understood and quantified. Laboratory experiments were performed for three flow configurations: a vertical thermal, a sloping gravity current and a vertical turbulent jet with various tilt angles and precession speeds. All three flows impinged on an interface separating a two-layer stably stratified environment. The entrainment rate is quantified for each flow using laser-induced fluorescence and compared to predictions of Cotel and Breidenthal (1997 Appl. Sci. Res. 57 349-66). The possible applications of transport across stratified interfaces include the contribution of hydrothermal plumes to the global ocean energy budget, turbidity currents on the ocean floor, the design of lake de-stratification systems, modeling gas leaks from storage reservoirs, weather forecasting and global climate change.

  4. Peyton's four-step approach for teaching complex spinal manipulation techniques - a prospective randomized trial.

    Science.gov (United States)

    Gradl-Dietsch, Gertraud; Lübke, Cavan; Horst, Klemens; Simon, Melanie; Modabber, Ali; Sönmez, Tolga T; Münker, Ralf; Nebelung, Sven; Knobe, Matthias

    2016-11-03

    The objectives of this prospective randomized trial were to assess the impact of Peyton's four-step approach on the acquisition of complex psychomotor skills and to examine the influence of gender on learning outcomes. We randomly assigned 95 third to fifth year medical students to an intervention group which received instructions according to Peyton (PG) or a control group, which received conventional teaching (CG). Both groups attended four sessions on the principles of manual therapy and specific manipulative and diagnostic techniques for the spine. We assessed differences in theoretical knowledge (multiple choice (MC) exam) and practical skills (Objective Structured Practical Examination (OSPE)) with respect to type of intervention and gender. Participants took a second OSPE 6 months after completion of the course. There were no differences between groups with respect to the MC exam. Students in the PG group scored significantly higher in the OSPE. Gender had no additional impact. Results of the second OSPE showed a significant decline in competency regardless of gender and type of intervention. Peyton's approach is superior to standard instruction for teaching complex spinal manipulation skills regardless of gender. Skills retention was equally low for both techniques.

  5. Geostatistics for Mapping Leaf Area Index over a Cropland Landscape: Efficiency Sampling Assessment

    Directory of Open Access Journals (Sweden)

    Javier Garcia-Haro

    2010-11-01

    Full Text Available This paper evaluates the performance of spatial methods to estimate leaf area index (LAI) fields from ground-based measurements at high spatial resolution over a cropland landscape. Three geostatistical variants of the kriging technique, ordinary kriging (OK), collocated cokriging (CKC) and kriging with an external drift (KED), are used. The study focused on the influence of the spatial sampling protocol, auxiliary information, and spatial resolution on the estimates. The main advantage of these models lies in the possibility of considering the spatial dependence of the data and, in the case of the KED and CKC, the auxiliary information for each location used for prediction purposes. A high-resolution NDVI image computed from SPOT TOA reflectance data is used as an auxiliary variable in LAI predictions. The CKC and KED predictions have proven the relevance of the auxiliary information to reproduce the spatial pattern at local scales, the KED model proving to be the best estimator when a non-stationary trend is observed. Advantages and limitations of the methods in LAI field predictions for two systematic and two stratified spatial samplings are discussed for high (20 m), medium (300 m) and coarse (1 km) spatial scales. The KED has exhibited the best observed local accuracy for all the spatial samplings. Meanwhile, the OK model provides comparable results when a sampling scheme well stratified by land cover is considered.
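
    For readers unfamiliar with the baseline method: ordinary kriging predicts the value at an unsampled location as a weighted sum of the observations, with weights solved from a variogram model of spatial dependence plus an unbiasedness constraint. The sketch below is a minimal self-contained OK predictor; the spherical variogram with its sill and range is a hypothetical choice, and an operational analysis would fit the variogram to the plot data (and, for CKC/KED, bring in the NDVI covariate).

    ```python
    import numpy as np

    def ordinary_kriging(xy, z, xy0, sill=1.0, rng_=50.0):
        """Minimal OK predictor at one location xy0, assuming a spherical
        variogram (sill and range are illustrative, not fitted)."""
        def gamma(h):                        # spherical variogram model
            h = np.minimum(h / rng_, 1.0)
            return sill * (1.5 * h - 0.5 * h**3)

        n = len(xy)
        H = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
        A = np.ones((n + 1, n + 1)); A[:n, :n] = gamma(H); A[n, n] = 0.0
        b = np.ones(n + 1); b[:n] = gamma(np.linalg.norm(xy - xy0, axis=1))
        w = np.linalg.solve(A, b)            # kriging weights + Lagrange multiplier
        return w[:n] @ z

    rng = np.random.default_rng(3)
    pts = rng.uniform(0, 100, size=(40, 2))      # LAI plot coordinates (m)
    lai = 2 + 0.02 * pts[:, 0] + rng.normal(0, 0.2, 40)
    print(ordinary_kriging(pts, lai, np.array([50.0, 50.0])))
    ```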

  6. Sampling phased array - a new technique for ultrasonic signal processing and imaging

    OpenAIRE

    Verkooijen, J.; Boulavinov, A.

    2008-01-01

    Over the past 10 years, the improvement in the field of microelectronics and computer engineering has led to significant advances in ultrasonic signal processing and image construction techniques that are currently being applied to non-destructive material evaluation. A new phased array technique, called 'Sampling Phased Array', has been developed in the Fraunhofer Institute for Non-Destructive Testing([1]). It realises a unique approach of measurement and processing of ultrasonic signals. Th...

  7. Non-parametric adaptive importance sampling for the probability estimation of a launcher impact position

    International Nuclear Information System (INIS)

    Morio, Jerome

    2011-01-01

    Importance sampling (IS) is a useful simulation technique to estimate critical probabilities with better accuracy than Monte Carlo methods. It consists of generating weighted random samples from an auxiliary distribution rather than from the distribution of interest. The crucial part of this algorithm is the choice of an efficient auxiliary PDF, one able to generate the rare events of interest much more frequently. The optimisation of this auxiliary distribution is often very difficult in practice. In this article, we propose to approach the optimal IS auxiliary density with non-parametric adaptive importance sampling (NAIS). We apply this technique to the probability estimation of spatial launcher impact position, which has become an increasingly important issue in the field of aeronautics.
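
    To make the idea concrete, the sketch below estimates a Gaussian tail probability by sampling from an auxiliary density shifted toward the rare region and reweighting each sample by the likelihood ratio. The shifted-Gaussian auxiliary is a deliberately simple parametric stand-in: the paper's NAIS instead builds the auxiliary density non-parametrically and adaptively.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(4)
    threshold = 4.0                    # rare event: X > 4 for X ~ N(0, 1)
    n = 100_000

    # Crude Monte Carlo: almost no hits at this level.
    x = rng.standard_normal(n)
    print("MC :", np.mean(x > threshold))

    # IS with a shifted auxiliary density q = N(threshold, 1):
    y = rng.normal(threshold, 1.0, n)
    w = stats.norm.pdf(y) / stats.norm.pdf(y, loc=threshold)   # likelihood ratio
    est = np.mean((y > threshold) * w)
    print("IS :", est, "exact:", 1 - stats.norm.cdf(threshold))
    ```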

  8. Zooplankton structure and vertical migration: Using acoustics and biomass to compare stratified and mixed fjord systems

    Science.gov (United States)

    Díaz-Astudillo, Macarena; Cáceres, Mario A.; Landaeta, Mauricio F.

    2017-09-01

    The patterns of abundance, composition, biomass and vertical migration of zooplankton at short time scales were studied in a stratified and a mixed fjord system. An ADCP device mounted on the hull of a ship was used to obtain vertical profiles of current velocity data and the intensity of the backscattered acoustic signal, which was used to study the migratory strategies and to relate the echo intensity with zooplankton biomass. Repeated vertical profiles of temperature, salinity and density were obtained with a CTD instrument to describe the density patterns during both experiments. Zooplankton were sampled every 3 h using a Bongo net to determine abundance, composition and biomass. Migrations were diel in the stratified station, semi-diel in the mixed station, and controlled by light in both locations, with large and significant differences in zooplankton abundance and biomass between day and night samples. No migration pattern associated with the effect of tides was found. The depth of maximum backscatter strength showed differences of approximately 30 m between stations and was deeper in the mixed station. The relation between mean volume backscattering strength (dB) computed from echo intensity and log10 of total dry weight (mg m⁻³) of zooplankton biomass was moderate but significant in both locations. Biomass estimated from biological samples was higher in the mixed station and determined by euphausiids. Copepods were the most abundant group in both stations. Acoustic methods were a useful technique to understand the detailed patterns of migratory strategies of zooplankton and to help estimate zooplankton biomass and abundance in the inner waters of southern Chile.

  9. Two-compartment, two-sample technique for accurate estimation of effective renal plasma flow: Theoretical development and comparison with other methods

    International Nuclear Information System (INIS)

    Lear, J.L.; Feyerabend, A.; Gregory, C.

    1989-01-01

    Discordance between effective renal plasma flow (ERPF) measurements from radionuclide techniques that use single versus multiple plasma samples was investigated. In particular, the authors determined whether effects of variations in distribution volume (Vd) of iodine-131 iodohippurate on measurement of ERPF could be ignored, an assumption implicit in the single-sample technique. The influence of Vd on ERPF was found to be significant, a factor indicating an important and previously unappreciated source of error in the single-sample technique. Therefore, a new two-compartment, two-plasma-sample technique was developed on the basis of the observations that while variations in Vd occur from patient to patient, the relationship between intravascular and extravascular components of Vd and the rate of iodohippurate exchange between the components are stable throughout a wide range of physiologic and pathologic conditions. The new technique was applied in a series of 30 studies in 19 patients. Results were compared with those achieved with the reference, single-sample, and slope-intercept techniques. The new two-compartment, two-sample technique yielded estimates of ERPF that more closely agreed with the reference multiple-sample method than either the single-sample or slope-intercept techniques

  10. A Randomized Controlled Trial of Mastication with Complete Dentures Made by a Conventional or an Abbreviated Technique.

    Science.gov (United States)

    Mengatto, Cristiane Machado; Gameiro, Gustavo Hauber; Brondani, Mario; Owen, C Peter; MacEntee, Michael I

    The aim of this randomized clinical trial was to test the hypothesis that there are no statistically significant differences after 3 and 6 months in masticatory performance or chewing ability of people with new complete dentures made by an abbreviated or a conventional technique. The trial included 20 edentulous participants at a dental school in Brazil assigned randomly to receive dentures made by either a conventional technique involving six clinical sessions or by an abbreviated technique involving three clinical sessions. At baseline with old dentures and at 3 and 6 months with new dentures, masticatory performance was measured by counting the number of chewing strokes and the time before participants had an urge to swallow and by calculating the median particle size of a silicone material after 20 chewing strokes and at the urge to swallow. On each occasion, the participants recorded on visual analog scales their ability to chew five food textures. The statistical significance (P ≤ .05) of changes in masticatory performance and chewing ability during the trial was analyzed with generalized estimating equations. Both techniques improved masticatory performance between baseline and 6 months and the ability to bite and chew all foods apart from hard apples. There were no significant differences in masticatory performance or chewing ability after 6 months between complete dentures made by a conventional or an abbreviated technique.

  11. Stratified turbulent Bunsen flames : flame surface analysis and flame surface density modelling

    NARCIS (Netherlands)

    Ramaekers, W.J.S.; Oijen, van J.A.; Goey, de L.P.H.

    2012-01-01

    In this paper it is investigated whether the Flame Surface Density (FSD) model, developed for turbulent premixed combustion, is also applicable to stratified flames. Direct Numerical Simulations (DNS) of turbulent stratified Bunsen flames have been carried out, using the Flamelet Generated Manifold technique.

  12. THE EFFECT OF EXTERNAL AND INTERNAL ENVIRONMENT ON BUSINESS STRATEGIES AND THEIR IMPACT ON BUSINESS PERFORMANCE

    Directory of Open Access Journals (Sweden)

    Seo A.Y.

    2018-01-01

    Full Text Available This study was done to examine the effect of the external and internal environment on business strategies and on the performance of micro, small and medium enterprises. The population of this study consisted of business owners in Bajawa Regency, Nusa Tenggara Timur Province, Indonesia, from which 122 respondents were drawn as the sample. The sample was chosen using proportionate stratified random sampling. The data of this study were then analyzed using the Partial Least Squares technique. The results show that the external and internal environment have significant effects on business strategies and on business performance, with business strategies acting as a mediator.

  13. Power distribution system reliability evaluation using dagger-sampling Monte Carlo simulation

    Energy Technology Data Exchange (ETDEWEB)

    Hu, Y.; Zhao, S.; Ma, Y. [North China Electric Power Univ., Hebei (China). Dept. of Electrical Engineering

    2009-03-11

    A dagger-sampling Monte Carlo simulation method was used to evaluate power distribution system reliability. The dagger-sampling technique was used to record the failure of a component as an incident and to determine its occurrence probability by generating incident samples using random numbers. The dagger sampling technique was combined with the direct sequential Monte Carlo method to calculate average values of load point indices and system indices. Results of the 2 methods with simulation times of up to 100,000 years were then compared. The comparative evaluation showed that less computing time was required using the dagger-sampling technique due to its higher convergence speed. When simulation times were 1000 years, the dagger-sampling method required 0.05 seconds to accomplish an evaluation, while the direct method required 0.27 seconds. 12 refs., 3 tabs., 4 figs.
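
    The variance-reduction trick behind dagger sampling is that one uniform random number drives a whole block of m = ⌊1/p⌋ trials: the unit interval is split into m sub-intervals of width p, and the sub-interval the number falls into marks which trial of the block records a failure, yielding negatively correlated (and therefore lower-variance) failure indicators at one random number per block. A minimal sketch, with a hypothetical component failure probability:

    ```python
    import numpy as np

    def dagger_failures(p, n_trials, rng):
        """Failure indicators for a component with failure probability p,
        using one uniform random number per block of m = floor(1/p) trials."""
        m = int((1 + 1e-9) / p)           # floor(1/p), guarded against float error
        out = np.zeros(n_trials, dtype=bool)
        for start in range(0, n_trials, m):
            u = rng.random()
            j = int(u // p)               # sub-interval [j*p, (j+1)*p) that u hit
            if j < m and start + j < n_trials:
                out[start + j] = True     # that one trial in the block fails
        return out

    rng = np.random.default_rng(5)
    f = dagger_failures(0.05, 200_000, rng)
    print(f.mean())   # ≈ 0.05; indicators within a block are negatively correlated
    ```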

  14. Investigations on flow reversal in stratified horizontal flow

    International Nuclear Information System (INIS)

    Staebler, T.; Meyer, L.; Schulenberg, T.; Laurien, E.

    2005-01-01

    The phenomena of flow reversal in stratified flows are investigated in a horizontal channel with application to the Emergency Core Cooling System (ECCS) in Pressurized Water Reactors (PWR). In case of a Loss-of-Coolant-Accident (LOCA), coolant can be injected through a secondary pipe within the feeding line of the primary circuit, the so-called hot leg, counter-currently to the steam flow. It is essential that the coolant reaches the reactor core to prevent overheating. Due to high temperatures in such accident scenarios, steam is generated in the core, which escapes from the reactor vessel through the hot leg. In case of sufficiently high steam flow rates, only a reduced amount of coolant or even no coolant will be delivered to the reactor core. The WENKA test facility at the Institute for Nuclear and Energy Technologies (IKET) at Forschungszentrum Karlsruhe is capable of investigating the fluid dynamics of two-phase flows in such scenarios. Water and air flow counter-currently in a horizontal channel made of clear acrylic glass to allow full optical access. Flow rates of water and air can be varied independently within a wide range. Once flow reversal sets in, a strong hysteresis effect must be taken into account. This was quantified during the present investigations. Local experimental data are needed to expand appropriate models on flow reversal in horizontal two-phase flow and to include them into numerical codes. Investigations are carried out by means of Particle Image Velocimetry (PIV) to obtain local flow velocities without disturbing the flow. Due to the wavy character of the flow, strong reflections at the interfacial area must be taken into account. Using fluorescent particles and an optical filter allows eliminating the reflections and recording only the signals of the particles. The challenges in conducting local investigations in stratified wavy flows by applying optical measurement techniques are discussed. Results are presented and discussed allowing

  15. Neutron activation analysis technique and X-ray fluorescence in bovine liver sample

    International Nuclear Information System (INIS)

    Maihara, V.A.; Favaro, D.I.T.; Vasconcellos, M.B.A.; Sato, I.M.; Salvador, V.L.

    2002-01-01

    Many analytical techniques have been used in food and diet analysis in order to determine a great number of nutritional elements, ranging from percentage to ng g⁻¹ levels, with high sensitivity and accuracy. Instrumental Neutron Activation Analysis (INAA) has been employed to certify many trace elements in biological reference materials. More recently, X-Ray Fluorescence (WD-XRF) has also been used to determine some essential elements in food samples. INAA has been applied in nutrition studies in our laboratory at IPEN since the 1980s. For the development of analytical methodologies, the use of reference materials with the same characteristics as the analyzed sample is essential. Several Brazilian laboratories cannot use these materials due to their high cost. In this paper, preliminary results of commercial bovine liver sample analyses obtained by the INAA and WD-XRF methods are presented. This sample was prepared to be a Brazilian candidate reference material for a group of laboratories participating in a research project sponsored by FAPESP. The concentrations of some elements like Cl, K, Na, P and S and of the trace elements Br, Ca, Co, Cu, Fe, Mg, Mn, Mo, Rb, Se and Zn were determined by INAA and WD-XRF. For methodology validation of both techniques, the NIST SRM 1577b Bovine Liver reference material was analyzed and the detection limits were calculated. The concentrations of elements determined by both analytical techniques were compared using Student's t-test, and for Cl, Cu, Fe, K, Mg, Na, Rb and Zn the results show no statistical difference at the 95% significance level. (author)

  16. Tobacco smoking surveillance: is quota sampling an efficient tool for monitoring national trends? A comparison with a random cross-sectional survey.

    Directory of Open Access Journals (Sweden)

    Romain Guignard

    Full Text Available OBJECTIVES: It is crucial for policy makers to monitor the evolution of tobacco smoking prevalence. In France, this monitoring is based on a series of cross-sectional general population surveys, the Health Barometers, conducted every five years and based on random samples. A methodological study has been carried out to assess the reliability of a monitoring system based on regular quota sampling surveys for smoking prevalence. DESIGN / OUTCOME MEASURES: In 2010, current and daily tobacco smoking prevalences obtained in a quota survey on 8,018 people were compared with those of the 2010 Health Barometer carried out on 27,653 people. Prevalences were assessed separately according to the telephone equipment of the interviewee (landline phone owner vs "mobile-only"), and logistic regressions were conducted in the pooled database to assess the impact of the telephone equipment and of the survey mode on the prevalences found. Finally, logistic regressions adjusted for sociodemographic characteristics were conducted in the random sample in order to determine the impact of the number of calls needed to interview "hard-to-reach" people on the prevalence found. RESULTS: Current and daily prevalences were higher in the random sample (respectively 33.9% and 27.5% among 15-75 year-olds) than in the quota sample (respectively 30.2% and 25.3%). In both surveys, current and daily prevalences were lower among landline phone owners (respectively 31.8% and 25.5% in the random sample and 28.9% and 24.0% in the quota survey). The required number of calls was slightly related to smoking status after adjustment for sociodemographic characteristics. CONCLUSION: Random sampling appears to be more effective than quota sampling, mainly by making it possible to interview hard-to-reach populations.

  17. Stratified Entomological Sampling in Preparation for an Area-Wide Integrated Pest Management Program: The Example of Glossina palpalis gambiensis (Diptera: Glossinidae) in the Niayes of Senegal

    International Nuclear Information System (INIS)

    Bouyer, Jeremy; Seck, Momar Talla; Guerrini, Laure; Sall, Baba; Ndiaye, Elhadji Youssou; Vreysen, Marc J.B.

    2010-01-01

    The riverine tsetse species Glossina palpalis gambiensis Vanderplank 1949 (Diptera: Glossinidae) inhabits riparian forests along river systems in West Africa. The government of Senegal has embarked on a project to eliminate this tsetse species, and African animal trypanosomoses, from the Niayes area using an area-wide integrated pest management approach. A stratified entomological sampling strategy was therefore developed using spatial analytical tools and mathematical modeling. A preliminary phytosociological census identified eight types of suitable habitat, which could be discriminated from Landsat 7 ETM satellite images and were denominated wet areas. At the end of March 2009, 683 unbaited Vavoua traps had been deployed, and the observed infested area in the Niayes was 525 km². In the remaining area, a mathematical model was used to assess the risk that flies were present despite a sequence of zero catches. The analysis showed that this risk was above 0.05 in 19% of this area, which will be considered as infested during the control operations. The remote sensing analysis that identified the wet areas allowed a restriction of the area to be surveyed to 4% of the total surface area (7,150 km²), whereas the mathematical model provided an efficient method to improve the accuracy and the robustness of the sampling protocol. The final size of the control area will be decided based on the entomological collection data. This entomological sampling procedure might be used for other vector or pest control scenarios. (Authors)

  18. Effect of novel inhaler technique reminder labels on the retention of inhaler technique skills in asthma: a single-blind randomized controlled trial.

    Science.gov (United States)

    Basheti, Iman A; Obeidat, Nathir M; Reddel, Helen K

    2017-02-09

    Inhaler technique can be corrected with training, but skills drop off quickly without repeated training. The aim of our study was to explore the effect of novel inhaler technique labels on the retention of correct inhaler technique. In this single-blind randomized parallel-group active-controlled study, clinical pharmacists enrolled asthma patients using controller medication by Accuhaler [Diskus] or Turbuhaler. Inhaler technique was assessed using published checklists (score 0-9). Symptom control was assessed by the Asthma Control Test (ACT). Patients were randomized into active (ACCa; THa) and control (ACCc; THc) groups. All patients received a "Show-and-Tell" inhaler technique counseling service. Active patients also received inhaler labels highlighting their initial errors. Baseline data were available for 95 patients, 68% females, mean age 44.9 (SD 15.2) years. Mean inhaler scores were ACCa: 5.3 ± 1.0; THa: 4.7 ± 0.9; ACCc: 5.5 ± 1.1; THc: 4.2 ± 1.0. Asthma was poorly controlled (mean ACT scores ACCa: 13.9 ± 4.3; THa: 12.1 ± 3.9; ACCc: 12.7 ± 3.3; THc: 14.3 ± 3.7). After training, all patients had correct technique (score 9/9). After 3 months, there was significantly less decline in inhaler technique scores for the active than the control groups (mean difference: Accuhaler -1.04 (95% confidence interval -1.92, -0.16; P = 0.022); Turbuhaler -1.61 (-2.63, -0.59; P = 0.003)). Symptom control improved significantly, with no significant difference between active and control patients, but active patients used less reliever medication (active 2.19 (SD 1.78) vs. control 3.42 (1.83) puffs/day, P = 0.002). After inhaler training, novel inhaler technique labels improve retention of correct inhaler technique skills with dry powder inhalers. Inhaler technique labels represent a simple, scalable intervention that has the potential to extend the benefit of inhaler training on asthma outcomes. REMINDER LABELS IMPROVE INHALER TECHNIQUE: Personalized

  19. Improved mesh based photon sampling techniques for neutron activation analysis

    International Nuclear Information System (INIS)

    Relson, E.; Wilson, P. P. H.; Biondo, E. D.

    2013-01-01

    The design of fusion power systems requires analysis of neutron activation of large, complex volumes, and the resulting particles emitted from these volumes. Structured mesh-based discretization of these problems allows for improved modeling in these activation analysis problems. Finer discretization of these problems results in large computational costs, which drives the investigation of more efficient methods. Within an ad hoc subroutine of the Monte Carlo transport code MCNP, we implement sampling of voxels and photon energies for volumetric sources using the alias method. The alias method enables efficient sampling of a discrete probability distribution, and operates in O(1) time, whereas the simpler direct discrete method requires O(log(n)) time. By using the alias method, voxel sampling becomes a viable alternative to sampling space with the O(1) approach of uniformly sampling the problem volume. Additionally, with voxel sampling it is straightforward to introduce biasing of volumetric sources, and we implement this biasing of voxels as an additional variance reduction technique that can be applied. We verify our implementation and compare the alias method, with and without biasing, to direct discrete sampling of voxels, and to uniform sampling. We study the behavior of source biasing in a second set of tests and find trends between improvements and source shape, material, and material density. Overall, however, the magnitude of improvements from source biasing appears to be limited. Future work will benefit from the implementation of efficient voxel sampling - particularly with conformal unstructured meshes where the uniform sampling approach cannot be applied. (authors)
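
    The alias method referenced above precomputes, for each of the n outcomes, a probability and an "alias" outcome, after which every draw costs one uniform index plus one biased coin flip. Below is a minimal sketch of Vose's construction in Python; the voxel weights are hypothetical stand-ins for the volumetric source strengths.

    ```python
    import random

    def build_alias(probs):
        """Vose's O(n) table construction; sampling is then O(1) per draw."""
        n = len(probs)
        scaled = [p * n for p in probs]
        small = [i for i, s in enumerate(scaled) if s < 1.0]
        large = [i for i, s in enumerate(scaled) if s >= 1.0]
        prob, alias = [0.0] * n, [0] * n
        while small and large:
            s, l = small.pop(), large.pop()
            prob[s], alias[s] = scaled[s], l      # s donates its deficit to l
            scaled[l] -= 1.0 - scaled[s]
            (small if scaled[l] < 1.0 else large).append(l)
        for i in small + large:                   # leftovers are exactly full columns
            prob[i] = 1.0
        return prob, alias

    def alias_draw(prob, alias):
        i = random.randrange(len(prob))           # pick a column uniformly
        return i if random.random() < prob[i] else alias[i]   # keep it or take its alias

    # Hypothetical voxel source strengths, normalized to probabilities
    weights = [5.0, 1.0, 0.5, 2.5, 1.0]
    probs = [w / sum(weights) for w in weights]
    table = build_alias(probs)
    draws = [alias_draw(*table) for _ in range(100_000)]
    print([round(draws.count(i) / 1e5, 3) for i in range(5)])  # ≈ probs
    ```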

  1. EMPLOYEE PERFORMANCE EFFECTIVENESS AND SERVICE QUALITY FROM THE PERSPECTIVE OF ORGANIZATIONAL CLIMATE AND WORK ENVIRONMENT AT THE BAPPEDA OFFICE OF NGAWI REGENCY

    Directory of Open Access Journals (Sweden)

    A.R. Djoko Purwito

    2016-09-01

    Full Text Available This study aimed to analyze the influence of the independent variable group, consisting of organizational climate and work environment, partially and simultaneously, on the dependent variable group, consisting of the effectiveness of employee performance and the quality of service, at the office of the Regional Development Planning Board (Bappeda) of Ngawi Regency. Sampling used proportionate stratified random sampling with 45 samples. Data were collected using a questionnaire with 51 question items. The data were analyzed using canonical analysis. The results showed that organizational climate and work environment, partially and simultaneously, significantly influence the effectiveness of employee performance and the quality of service at the office of the Regional Development Planning Board of Ngawi Regency.
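
    Proportionate stratified random sampling, as used here, allocates the sample to each stratum in proportion to its share of the population (n_h = n · N_h / N) and then draws a simple random sample within each stratum. A minimal sketch with hypothetical staff strata:

    ```python
    import random

    def proportional_allocation(strata_sizes, n):
        """n_h = n * N_h / N, rounded; in practice adjust so allocations sum to n."""
        N = sum(strata_sizes.values())
        return {h: round(n * N_h / N) for h, N_h in strata_sizes.items()}

    # Hypothetical staff counts per organizational level
    strata = {"managers": 12, "section heads": 28, "staff": 85}
    alloc = proportional_allocation(strata, 45)
    print(alloc)   # {'managers': 4, 'section heads': 10, 'staff': 31}

    # Then draw a simple random sample within each stratum:
    frame = {h: [f"{h}-{i}" for i in range(sz)] for h, sz in strata.items()}
    sample = {h: random.sample(frame[h], alloc[h]) for h in strata}
    ```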

  2. Spinning phenomena and energetics of spherically pulsating patterns in stratified fluids

    International Nuclear Information System (INIS)

    Ibragimov, Ranis N; Dameron, Michael

    2011-01-01

    The nonlinear solutions of the two-dimensional Boussinesq equations describing internal waves in rotating stratified fluids were obtained as group invariant solutions. These nonlinear solutions correspond to the rotation transformation preserving the form of the original nonlinear equations of motion. It is shown that the obtained class of exact solutions can be associated with the spherically pulsating patterns observed in uniformly stratified fluids. It is also shown that the obtained rotationally symmetric solutions are bounded functions that can be visualized as spinning patterns in stratified fluids. Furthermore, the rotational transformation provides the energy conservation law, together with other conservation laws, for which the spinning phenomenon is observed. The effects of nonlinearity and the Earth's rotation on this phenomenon are also discussed.

  3. Hydrogeology and water quality of the stratified-drift aquifer in the Pony Hollow Creek Valley, Tompkins County, New York

    Science.gov (United States)

    Bugliosi, Edward F.; Miller, Todd S.; Reynolds, Richard J.

    2014-01-01

    The lithology, areal extent, and the water-table configuration in stratified-drift aquifers in the northern part of the Pony Hollow Creek valley in the Town of Newfield, New York, were mapped as part of an ongoing aquifer mapping program in Tompkins County. Surficial geologic and soil maps, well and test-boring records, light detection and ranging (lidar) data, water-level measurements, and passive-seismic surveys were used to map the aquifer geometry, construct geologic sections, and determine the depth to bedrock at selected locations throughout the valley. Additionally, water-quality samples were collected from selected streams and wells to characterize the quality of surface and groundwater in the study area. Sedimentary bedrock underlies the study area and is overlain by unstratified drift (till), stratified drift (glaciolacustrine and glaciofluvial deposits), and recent postglacial alluvium. The major type of unconsolidated, water-yielding material in the study area is stratified drift, which consists of glaciofluvial sand and gravel, and is present in sufficient amounts in most places to form an extensive unconfined aquifer throughout the study area, which is the source of water for most residents, farms, and businesses in the valleys. A map of the water table in the unconfined aquifer was constructed by using (1) measurements made from the mid-1960s through 2010, (2) control on the altitudes of perennial streams at 10-foot contour intervals from lidar data collected by Tompkins County, and (3) water surfaces of ponds and wetlands that are hydraulically connected to the unconfined aquifer. Water-table contours indicate that the direction of groundwater flow within the stratified-drift aquifer is predominantly from the valley walls toward the streams and ponds in the central part of the valley where groundwater then flows southwestward (down valley) toward the confluence with the Cayuta Creek valley. Locally, the direction of groundwater flow is radially

  4. Iterative algorithm of discrete Fourier transform for processing randomly sampled NMR data sets

    International Nuclear Information System (INIS)

    Stanek, Jan; Kozminski, Wiktor

    2010-01-01

    Spectra obtained by application of multidimensional Fourier Transformation (MFT) to sparsely sampled nD NMR signals are usually corrupted due to missing data. In the present paper this phenomenon is investigated on simulations and experiments. An effective iterative algorithm for artifact suppression for sparse on-grid NMR data sets is discussed in detail. It includes automated peak recognition based on statistical methods. The results enable one to study NMR spectra of high dynamic range of peak intensities, preserving the benefits of random sampling, namely the superior resolution in indirectly measured dimensions. Experimental examples include 3D ¹⁵N- and ¹³C-edited NOESY-HSQC spectra of human ubiquitin.
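
    A common form of such an iterative artifact-suppression scheme, of which the paper's algorithm is a more refined variant, works like CLEAN: Fourier transform the sparsely sampled signal, pick the strongest peak, subtract that component together with its sampling artifacts from the time-domain data, and repeat until the residual looks like noise. The 1-D toy sketch below illustrates the idea; the signal model, sampling density and stopping threshold are all hypothetical.

    ```python
    import numpy as np

    rng = np.random.default_rng(6)
    N = 512
    mask = np.zeros(N)
    mask[rng.choice(N, N // 4, replace=False)] = 1.0   # 25% random on-grid sampling

    t = np.arange(N)
    fid = np.exp(2j*np.pi*56*t/N) + 0.2*np.exp(2j*np.pi*160*t/N)  # two "resonances"
    data = fid * mask                                   # sparsely sampled signal

    residual, model = data.copy(), np.zeros(N, complex)
    for _ in range(50):
        spec = np.fft.fft(residual)
        k = np.argmax(np.abs(spec))
        if np.abs(spec[k]) < 0.05 * mask.sum():         # residual is noise-like: stop
            break
        amp = spec[k] / mask.sum()                      # correct peak for sampling density
        comp = amp * np.exp(2j*np.pi*k*t/N)
        residual -= comp * mask                         # remove the peak AND its artifacts
        model += comp

    recovered = np.abs(np.fft.fft(model)) / N
    print(np.round(np.sort(recovered)[-2:], 2))         # ≈ [0.2, 1.0]
    ```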

  5. Application of digital sampling techniques to particle identification in scintillation detectors

    International Nuclear Information System (INIS)

    Bardelli, L.; Bini, M.; Poggi, G.; Taccetti, N.

    2002-01-01

    In this paper, the use of a fast digitizing system for identification of fast charged particles with scintillation detectors is discussed. The three-layer phoswich detectors developed in the framework of the FIASCO experiment for the detection of light charged particles (LCP) and intermediate mass fragments (IMF) emitted in heavy-ion collisions at Fermi energies are briefly discussed. The standard analog electronics treatment of the signals for particle identification is illustrated. After a description of the digitizer designed to perform a fast digital sampling of the phoswich signals, the feasibility of particle identification on the sampled data is demonstrated. The results obtained with two different pulse shape discrimination analyses based on the digitally sampled data are compared with the standard analog signal treatment. The obtained results suggest, for the present application, the replacement of the analog methods with the digital sampling technique

  6. Uranium content measurement in drinking water samples using track etch technique

    International Nuclear Information System (INIS)

    Kumar, Mukesh; Kumar, Ajay; Singh, Surinder; Mahajan, R.K.; Walia, T.P.S.

    2003-01-01

    The concentration of uranium has been assessed in drinking water samples collected from different locations in Bathinda district, Punjab, India. The water samples were taken from hand pumps and tube wells. Uranium was determined using the fission track technique. The uranium concentration in the water samples varies from 1.65±0.06 to 74.98±0.38 μg/l. These values are compared with the safe limit values recommended for drinking water. Most of the water samples are found to have uranium concentrations above the safe limit. Analysis of some heavy metals (Zn, Cd, Pb and Cu) in the water was also done in order to see if some correlation exists between the concentration of uranium and these heavy metals. A weak positive correlation has been observed between the concentration of uranium and the heavy metals Pb, Cd and Cu.

  7. Modeling the Conducting Stably-Stratified Layer of the Earth's Core

    Science.gov (United States)

    Petitdemange, L.; Philidet, J.; Gissinger, C.

    2017-12-01

    Observations of the Earth's magnetic field as well as recent theoretical works tend to show that the Earth's outer liquid core consists mostly of a convective zone, in which the Earth's magnetic field is generated (likely by dynamo action), but also features a thin, stably stratified layer at the top of the core. We carry out direct numerical simulations by modeling this thin layer as an axisymmetric spherical Couette flow for a stably stratified fluid embedded in a dipolar magnetic field. The dynamo region is modeled by a conducting inner core rotating slightly faster than the insulating mantle due to magnetic torques acting on it, such that a weak differential rotation (low Rossby limit) can develop in the stably stratified layer. In the case of a non-stratified fluid, the combined action of the differential rotation and the magnetic field leads to the well-known regime of 'super-rotation', in which the fluid rotates faster than the inner core. Whereas in the classical case this super-rotation is known to vanish in the magnetostrophic limit, we show here that the fluid stratification significantly extends the magnitude of the super-rotation, keeping this phenomenon relevant for the Earth's core. Finally, we study how the shear layers generated by this new state might give birth to magnetohydrodynamic instabilities or waves impacting the secular variations or jerks of the Earth's magnetic field.

  8. Attempts to develop a new nuclear measurement technique of β-glucuronidase levels in biological samples

    International Nuclear Information System (INIS)

    Unak, T.; Avcibasi, U.; Yildirim, Y.; Cetinkaya, B.

    2003-01-01

    β-Glucuronidase is one of the most important hydrolytic enzymes in living systems and plays an essential role in the detoxification pathway of toxic materials incorporated into the metabolism. Some organs, especially the liver, and some tumour tissues have high levels of β-glucuronidase activity. As a result of the enzymatic activity of some kinds of tumour cells, the radiolabelled glucuronide conjugates of cytotoxic, as well as radiotoxic, compounds have potentially very valuable diagnostic and therapeutic applications in cancer research. For this reason, a sensitive measurement of β-glucuronidase levels in normal and tumour tissues is a very important step for these kinds of applications. According to the classical measurement method of β-glucuronidase activity, in general, the quantity of phenolphthalein liberated from its glucuronide conjugate, i.e. phenolphthalein-glucuronide, by β-glucuronidase has been measured by use of the spectrophotometric technique. The lower detection limit of phenolphthalein by the spectrophotometric technique is about 1-3 mg. This means that β-glucuronidase levels could not be detected in biological samples having lower levels of β-glucuronidase activity, and therefore the applications of the spectrophotometric technique in cancer research are very seriously limited. Starting from this consideration, we recently attempted to develop a new nuclear technique to measure much lower concentrations of β-glucuronidase in biological samples. To improve the detection limit, phenolphthalein-glucuronide and also phenyl-N-glucuronide were radioiodinated with ¹³¹I and their radioactivity was measured by use of the counting technique. Therefore, the quantity of phenolphthalein or aniline radioiodinated with ¹³¹I and liberated by the deglucuronidation reactivity of β-glucuronidase was used in an attempt to measure levels lower than those accessible to the spectrophotometric measurement technique. The results obtained clearly verified that 0.01 pg level of

  9. Random-Access Technique for Self-Organization of 5G Millimeter-Wave Cellular Communications

    Directory of Open Access Journals (Sweden)

    Jasper Meynard Arana

    2016-01-01

    Full Text Available The random-access (RA) technique is a key procedure in cellular networks and self-organizing networks (SONs), but the overall processing time of this technique in millimeter-wave (mm-wave) cellular systems with directional beams is very long because RA preambles (RAPs) should be transmitted in all directions of the Tx and Rx beams. In this paper, two different types of preambles (RAP-1 and RAP-2) are proposed to reduce the processing time in the RA stage. After analyzing the correlation property, false-alarm probability, and detection probability of the proposed RAPs, we perform simulations to show that RAP-2 is suitable for RA in mm-wave cellular systems with directional beams because of its smaller processing time and high detection probability in multiuser environments.

  10. Comparison between ultrasound guided technique and digital palpation technique for radial artery cannulation in adult patients: An updated meta-analysis of randomized controlled trials.

    Science.gov (United States)

    Bhattacharjee, Sulagna; Maitra, Souvik; Baidya, Dalim K

    2018-03-22

    Possible advantages and risks associated with ultrasound guided radial artery cannulation in comparison to the digital palpation guided method in adult patients are not fully known. We have compared ultrasound guided radial artery cannulation with the digital palpation technique in this meta-analysis. Meta-analysis of randomized controlled trials. Trials conducted in the operating room, emergency department, or cardiac catheterization laboratory. PubMed and the Cochrane Central Register of Controlled Trials (CENTRAL) were searched (from 1946 to 20th November 2017) to identify prospective randomized controlled trials in adult patients. Two-dimensional ultrasound guided radial artery catheterization versus digital palpation guided radial artery cannulation. Overall cannulation success rate, first attempt success rate, time to cannulation and mean number of attempts to successful cannulation. Odds ratio (OR) and standardized mean difference (SMD) or mean difference (MD) with 95% confidence interval (CI) were calculated for categorical and continuous variables respectively. Data of 1895 patients from 10 studies have been included in this meta-analysis. The overall cannulation success rate was similar between the ultrasound guided technique and digital palpation [OR (95% CI) 2.01 (1.00, 4.06); p = 0.05]. Ultrasound guided radial artery cannulation is associated with a higher first attempt success rate in comparison to digital palpation [OR (95% CI) 2.76 (1.86, 4.10); p < 0.05]. Radial artery cannulation by ultrasound guidance may increase the first attempt success rate but not the overall cannulation success when compared to the digital palpation technique. However, the results of this meta-analysis should be interpreted with caution due to the presence of heterogeneity. Copyright © 2018. Published by Elsevier Inc.

  11. A Fast MHD Code for Gravitationally Stratified Media using Graphical Processing Units: SMAUG

    Science.gov (United States)

    Griffiths, M. K.; Fedun, V.; Erdélyi, R.

    2015-03-01

    Parallelization techniques have been exploited most successfully by the gaming/graphics industry with the adoption of graphical processing units (GPUs), possessing hundreds of processor cores. The opportunity has been recognized by the computational sciences and engineering communities, who have recently harnessed successfully the numerical performance of GPUs. For example, parallel magnetohydrodynamic (MHD) algorithms are important for numerical modelling of highly inhomogeneous solar, astrophysical and geophysical plasmas. Here, we describe the implementation of SMAUG, the Sheffield Magnetohydrodynamics Algorithm Using GPUs. SMAUG is a 1-3D MHD code capable of modelling magnetized and gravitationally stratified plasma. The objective of this paper is to present the numerical methods and techniques used for porting the code to this novel and highly parallel compute architecture. The methods employed are justified by the performance benchmarks and validation results demonstrating that the code successfully simulates the physics for a range of test scenarios including a full 3D realistic model of wave propagation in the solar atmosphere.

  12. Compressive Sampling based Image Coding for Resource-deficient Visual Communication.

    Science.gov (United States)

    Liu, Xianming; Zhai, Deming; Zhou, Jiantao; Zhang, Xinfeng; Zhao, Debin; Gao, Wen

    2016-04-14

    In this paper, a new compressive sampling based image coding scheme is developed to achieve competitive coding efficiency at lower encoder computational complexity, while supporting error resilience. This technique is particularly suitable for visual communication with resource-deficient devices. At the encoder, a compact image representation is produced: a polyphase down-sampled version of the input image, in which the conventional low-pass filter prior to down-sampling is replaced by a local random binary convolution kernel. The pixels of the resulting down-sampled pre-filtered image are local random measurements placed in the original spatial configuration. The advantages of local random measurements are twofold: (1) they preserve high-frequency image features that would otherwise be discarded by low-pass filtering; (2) the result remains a conventional image and can therefore be coded by any standardized codec to remove statistical redundancy at larger scales. Moreover, measurements generated by different kernels can be considered multiple descriptions of the original image, so the proposed scheme also has the advantage of multiple description coding. At the decoder, a unified sparsity-based soft-decoding technique is developed to recover the original image from the received measurements in a compressive sensing framework. Experimental results demonstrate that the proposed scheme is competitive with existing methods, with a unique strength in recovering fine details and sharp edges at low bit-rates.
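
    The encoder front end lends itself to a few lines of code. The sketch below is a minimal interpretation of the description above: it filters with a local random binary kernel instead of a low-pass filter and then down-samples on a regular grid. Kernel size, seed, and normalization are our assumptions, not the paper's parameters.

      import numpy as np
      from scipy.signal import convolve2d

      def cs_downsample(image: np.ndarray, factor: int = 2, seed: int = 0) -> np.ndarray:
          rng = np.random.default_rng(seed)
          k = rng.integers(0, 2, size=(3, 3)).astype(float)   # random binary kernel
          if k.sum() == 0:
              k[1, 1] = 1.0                 # avoid the degenerate all-zero kernel
          k /= k.sum()                      # keep measurements in the pixel range
          filtered = convolve2d(image, k, mode="same", boundary="symm")
          return filtered[::factor, ::factor]                 # polyphase down-sampling

      img = np.random.default_rng(1).random((8, 8))
      print(cs_downsample(img).shape)       # (4, 4): still an ordinary image

    Because the output is an ordinary image, it can be handed to any standardized codec, while different seeds yield the multiple descriptions mentioned above.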

  13. Development of Large Sample Neutron Activation Technique for New Applications in Thailand

    International Nuclear Information System (INIS)

    Laoharojanaphand, S.; Tippayakul, C.; Wonglee, S.; Channuie, J.

    2018-01-01

    The development of Large Sample Neutron Activation Analysis (LSNAA) in Thailand is presented in this paper. The technique was first developed with rice as the test subject, using the Thai Research Reactor-1/Modification 1 (TRR-1/M1) as the neutron source. The first step was to select and characterize an appropriate irradiation facility. An out-core irradiation facility (A4 position) was attempted first, and the results obtained there guided the subsequent experiments with the thermal column facility. The thermal column was characterized with Cu wire to determine the spatial flux distribution with and without a rice sample. The flux depression without the rice sample was observed to be less than 30%, while with the rice sample it increased to about 60%. Flux monitors placed inside the rice sample were used to determine the average flux over the sample. The gamma self-shielding effect during gamma measurement was corrected using Monte Carlo simulation: the ratio between the efficiencies of the volume source and the point source at each energy point was calculated with the MCNPX code. The research team adopted the k0-NAA methodology to calculate element concentrations. The k0-NAA program developed by the IAEA was set up to simulate the conditions of the irradiation and measurement facilities used in this research. The element concentrations in the bulk rice sample were then calculated taking into account the flux depression and gamma efficiency corrections. At the moment, the results still show large discrepancies with the reference values; further validation work will be performed to identify the sources of error. The LSNAA technique was also applied to the activation analysis of the IAEA archaeological mock-up, and those results are provided in this report. (author)
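
    The gamma-efficiency correction can be pictured with a short sketch: simulated volume-to-point efficiency ratios are interpolated at each gamma line and applied to the point-source calibration. All numbers below are invented placeholders, not values from the study.

      import numpy as np

      # energies (keV) and hypothetical simulated ratios eff_volume / eff_point
      e_grid = np.array([122.0, 344.0, 662.0, 1173.0, 1332.0])
      ratio  = np.array([0.55, 0.63, 0.71, 0.78, 0.80])

      def volume_efficiency(energy_kev: float, point_eff: float) -> float:
          # Correct a point-source efficiency to a bulk-sample efficiency.
          return point_eff * np.interp(energy_kev, e_grid, ratio)

      # e.g. a 662 keV line measured with a 1.2% point-source efficiency
      print(volume_efficiency(662.0, 0.012))    # -> 0.00852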

  14. Efficient sampling algorithms for Monte Carlo based treatment planning

    International Nuclear Information System (INIS)

    DeMarco, J.J.; Solberg, T.D.; Chetty, I.; Smathers, J.B.

    1998-01-01

    Efficient sampling algorithms are necessary for producing a fast Monte Carlo based treatment planning code. This study evaluates several aspects of a photon-based tracking scheme and the effect of optimal sampling algorithms on the efficiency of the code. Four areas were tested: pseudo-random number generation, generalized sampling of a discrete distribution, sampling from the exponential distribution, and delta scattering as applied to photon transport through a heterogeneous simulation geometry. Generalized sampling of a discrete distribution using the cutpoint method can produce speedup gains of one order of magnitude versus conventional sequential sampling. Photon transport modifications based upon the delta scattering method were implemented and compared with a conventional boundary and collision checking algorithm. The delta scattering algorithm is faster by a factor of six versus the conventional algorithm for a boundary size of 5 mm within a heterogeneous geometry. A comparison of portable pseudo-random number algorithms and exponential sampling techniques is also discussed
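
    The cutpoint method mentioned above is easy to sketch: precompute, for each of m equally spaced probability levels, where that level falls in the CDF, so each lookup jumps near the answer instead of scanning from index zero. The probabilities and table size below are arbitrary.

      import numpy as np

      def build_cutpoints(p, m=None):
          p = np.asarray(p, dtype=float)
          cdf = np.cumsum(p / p.sum())
          m = m or 8 * len(p)
          # cut[k] = number of CDF entries <= k/m, a safe starting index
          cut = np.searchsorted(cdf, np.arange(m) / m, side="right")
          return cdf, cut, m

      def sample(cdf, cut, m, rng):
          u = rng.random()
          i = cut[int(u * m)]                      # jump close to the answer...
          while i < len(cdf) - 1 and cdf[i] < u:   # ...finish with a short scan
              i += 1
          return i

      rng = np.random.default_rng(0)
      cdf, cut, m = build_cutpoints([0.10, 0.05, 0.60, 0.20, 0.05])
      draws = [sample(cdf, cut, m, rng) for _ in range(10000)]
      print(np.bincount(draws, minlength=5) / 10000)   # ~ input probabilities

    With enough cutpoints the expected scan length approaches one step, which is the source of the order-of-magnitude speedup over sequential search reported above.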

  15. Women’s perspectives and experiences on screening for osteoporosis (Risk-stratified Osteoporosis Strategy Evaluation, ROSE)

    DEFF Research Database (Denmark)

    Rothmann, Mette Juel; Huniche, Lotte; Ammentorp, Jette

    2014-01-01

    This study aimed to investigate women's perspectives on and experiences with screening for osteoporosis. Focus groups and individual interviews were conducted. Three main themes emerged: knowledge about osteoporosis, psychological aspects of screening, and moral duty. The women viewed the program in the context of their everyday life and life trajectories; age, lifestyle, and knowledge about osteoporosis were important to how women ascribed meaning to the program. Generally, screening was accepted due to life experiences, self-perceived risk, and the preventive nature of screening. PURPOSE: The risk-stratified osteoporosis strategy evaluation (ROSE) study is a randomized prospective population-based trial investigating the efficacy of a screening program to prevent fractures in women aged 65…

  16. Background stratified Poisson regression analysis of cohort data.

    Science.gov (United States)

    Richardson, David B; Langholz, Bryan

    2012-03-01

    Background stratified Poisson regression is an approach that has been used in the analysis of data derived from a variety of epidemiologically important studies of radiation-exposed populations, including uranium miners, nuclear industry workers, and atomic bomb survivors. We describe a novel approach to fit Poisson regression models that adjust for a set of covariates through background stratification while directly estimating the radiation-disease association of primary interest. The approach makes use of an expression for the Poisson likelihood that treats the coefficients for stratum-specific indicator variables as 'nuisance' variables and avoids the need to explicitly estimate the coefficients for these stratum-specific parameters. Log-linear models, as well as other general relative rate models, are accommodated. This approach is illustrated using data from the Life Span Study of Japanese atomic bomb survivors and data from a study of underground uranium miners. The point estimate and confidence interval obtained from this 'conditional' regression approach are identical to the values obtained using unconditional Poisson regression with model terms for each background stratum. Moreover, it is shown that the proposed approach allows estimation of background stratified Poisson regression models of non-standard form, such as models that parameterize latency effects, as well as regression models in which the number of strata is large, thereby overcoming the limitations of previously available statistical software for fitting background stratified Poisson regression models.
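
    The key computational trick is that, for any fixed dose coefficient, each stratum intercept has a closed-form maximizer, so it never needs to be estimated explicitly. Below is a hedged sketch of this profile-likelihood idea on synthetic data, for the log-linear case only; the data and variable names are ours, not the study's.

      import numpy as np
      from scipy.optimize import minimize_scalar

      rng = np.random.default_rng(0)
      strata = rng.integers(0, 20, size=500)        # background stratum per cell
      dose = rng.gamma(2.0, 0.5, size=500)
      pt = rng.uniform(50.0, 150.0, size=500)       # person-time
      alpha = rng.normal(-5.0, 0.3, size=20)        # true stratum intercepts
      obs = rng.poisson(pt * np.exp(alpha[strata] + 0.3 * dose))

      def neg_profile_loglik(beta):
          # Substituting exp(alpha_s) = O_s / sum_{i in s} PT_i exp(beta*d_i)
          # leaves a likelihood in beta alone (beta-independent terms dropped).
          mu = pt * np.exp(beta * dose)
          ll = np.sum(obs * beta * dose)
          for s in np.unique(strata):
              in_s = strata == s
              ll -= obs[in_s].sum() * np.log(mu[in_s].sum())
          return -ll

      fit = minimize_scalar(neg_profile_loglik, bounds=(-2, 2), method="bounded")
      print(f"estimated beta = {fit.x:.3f} (true 0.3)")

    As the abstract notes, the resulting point estimate matches unconditional Poisson regression with explicit stratum indicator terms, but without fitting the potentially very large set of nuisance parameters.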

  17. Integrated sampling vs ion chromatography: Mathematical considerations

    International Nuclear Information System (INIS)

    Sundberg, L.L.

    1992-01-01

    This paper presents some general purpose considerations that can be utilized when comparisons are made between the results of integrated sampling over several hours or days and ion chromatography, where sample collection times are measured in minutes. The discussion is geared toward the measurement of soluble transition metal ions in BWR feedwater. Under steady-state conditions, the concentrations reported by both techniques should be in reasonable agreement. Transient operations affect both types of measurements. A simplistic model, applicable to both sampling techniques, is presented that demonstrates the effect of transients which occur during the acquisition of a steady-state sample. For a common set of conditions, the integrated concentration is proportional to the concentration and duration of the transient, and inversely proportional to the sample collection time. Adjusting the collection period during a known transient allows the peak transient concentration to be estimated. Though the probability of sampling a random transient with the integrated sampling technique is very high, the magnitude is severely diluted with long integration times. Transient concentrations are magnified with ion chromatography, but the probability of sampling a transient is significantly lower under normal ion chromatography operations. Various data averaging techniques are discussed for integrated sampling and IC determinations. The use of time-weighted averages appears to offer advantages over arithmetic and geometric means for integrated sampling when the collection period is variable. For replicate steady-state ion chromatography determinations which bracket a transient sample, it may be advantageous to forgo the calculation of averages and report the data as trending information only.
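
    The dilution effect follows directly from the simple model described: a transient of concentration C_t lasting t_t within a collection period T adds C_t*t_t/T to the steady-state reading. A worked example (all values invented):

      # steady state 0.10 ppb; 50 ppb transient lasting 5 minutes
      c_ss, c_t, t_t = 0.10, 50.0, 5.0

      for T in (15.0, 240.0, 1440.0):   # IC-scale, multi-hour, daily collection
          print(f"T = {T:6.0f} min -> measured {c_ss + c_t * t_t / T:.2f} ppb")
      # 15 min: 16.77 ppb; 4 h: 1.14 ppb; 24 h: 0.27 ppb -- the same transient
      # nearly vanishes in a day-long integrated sample.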

  18. Invited Review. Combustion instability in spray-guided stratified-charge engines. A review

    Energy Technology Data Exchange (ETDEWEB)

    Fansler, Todd D. [Univ. of Wisconsin, Madison, WI (United States); Reuss, D. L. [Univ. of Michigan, Ann Arbor, MI (United States); Sandia National Lab. (SNL-CA), Livermore, CA (United States); Sick, V. [Univ. of Michigan, Ann Arbor, MI (United States); Dahms, R. N. [Sandia National Lab. (SNL-CA), Livermore, CA (United States)

    2015-02-02

    Our article reviews systematic research on combustion instabilities (principally rare, random misfires and partial burns) in spray-guided stratified-charge (SGSC) engines operated at part load with highly stratified fuel-air-residual mixtures. Results from high-speed optical imaging diagnostics and numerical simulation provide a conceptual framework and quantify the sensitivity of ignition and flame propagation to strong, cyclically varying temporal and spatial gradients in the flow field and in the fuel-air-residual distribution. For SGSC engines using multi-hole injectors, spark stretching and locally rich ignition are beneficial. Moreover, combustion instability is dominated by convective flow fluctuations that impede motion of the spark or flame kernel toward the bulk of the fuel, coupled with low flame speeds due to locally lean mixtures surrounding the kernel. In SGSC engines using outwardly opening piezo-electric injectors, ignition and early flame growth are strongly influenced by the spray's characteristic recirculation vortex. For both injection systems, the spray and the intake/compression-generated flow field influence each other. Factors underlying the benefits of multi-pulse injection are identified. Finally, some unresolved questions include (1) the extent to which piezo-SGSC misfires are caused by failure to form a flame kernel rather than by flame-kernel extinction (as in multi-hole SGSC engines); (2) the relative contributions of partially premixed flame propagation and mixing-controlled combustion under the exceptionally late-injection conditions that permit SGSC operation on E85-like fuels with very low NOx and soot emissions; and (3) the effects of flow-field variability on later combustion, where fuel-air-residual mixing within the piston bowl becomes important.

  19. Contributions to sampling statistics

    CERN Document Server

    Conti, Pier; Ranalli, Maria

    2014-01-01

    This book contains a selection of the papers presented at the ITACOSM 2013 Conference, held in Milan in June 2013. ITACOSM is the biennial meeting of the Survey Sampling Group S2G of the Italian Statistical Society, intended as an international forum of scientific discussion on developments in the theory and application of survey sampling methodologies in the human and natural sciences. The book gathers research papers carefully selected from both invited and contributed sessions of the conference. As a whole, the book is a relevant contribution to key aspects of sampling methodology and technique; it deals with hot topics in sampling theory, such as calibration, quantile-regression and multiple frame surveys, and with innovative methodologies in important areas of both sampling theory and applications. Contributions cut across current sampling methodologies such as interval estimation for complex samples, randomized responses, bootstrap, weighting, modeling, imputati...

  20. Exploring Various Monte Carlo Simulations for Geoscience Applications

    Science.gov (United States)

    Blais, R.

    2010-12-01

    Computer simulations are increasingly important in geoscience research and development. At the core of stochastic or Monte Carlo simulations are the random number sequences that are assumed to be distributed with specific characteristics. Computer-generated random numbers, uniformly distributed on (0, 1), can be very different depending on the selection of pseudo-random number (PRN) or chaotic random number (CRN) generators. Equidistributed quasi-random numbers (QRNs) can also be used in Monte Carlo simulations. In the evaluation of some definite integrals, the resulting error variances can even differ by orders of magnitude. Furthermore, practical techniques for variance reduction such as Importance Sampling and Stratified Sampling can be implemented to significantly improve the results. A comparative analysis of these strategies has been carried out for computational applications in planar and spatial contexts. Based on these experiments, and on examples of geodetic applications of gravimetric terrain corrections and gravity inversion, conclusions and recommendations concerning their performance and general applicability are included.
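
    A compact illustration of one of these strategies: stratified sampling of a one-dimensional definite integral, compared with crude Monte Carlo at equal cost. The integrand, sample sizes, and seed are arbitrary choices for the sketch.

      import numpy as np

      f = lambda x: np.exp(-x) * np.sin(10 * x)    # integrand on [0, 1]
      n, k = 10_000, 100                           # total samples, strata
      rng = np.random.default_rng(0)

      crude = f(rng.random(n)).mean()              # crude Monte Carlo

      # stratified: n/k uniform draws inside each of k equal-width strata
      edges = np.arange(k)[:, None] / k
      strat = f((edges + rng.random((k, n // k)) / k).ravel()).mean()

      exact = (10 - np.exp(-1.0) * (np.sin(10) + 10 * np.cos(10))) / 101
      print(f"exact {exact:.5f}  crude {crude:.5f}  stratified {strat:.5f}")

    Because the integrand varies much less within each narrow stratum than over the whole interval, the stratified estimator's variance drops sharply, which is exactly the effect exploited in the geodetic applications above.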